There seems to be great documentation on multicasting live streams, but I can't find anything about multicasting recorded streams. Is there documentation, or any other source of information, on multicasting recorded streams?
I am testing FMS 4 Update 1 RTMFP multicast streams. After 10 minutes I get this message: "RTMFP Multicast stream has exceeded max duration allowed; closing stream." But I do not use IP multicast.
We have a video broadcasting platform and we use FMS 4.5 to broadcast a multicast stream. My question is: can we do QoS tagging of this multicast stream?
I'm developing an FMS 4 application that reads an external stream and then republishes it as multicast:
1. First of all, I open a NetConnection to the remote application and associate it with a new Stream created in the application. Then I have the stream available in my application.
nc = new NetConnection();
nc.connect(REMOTE_APPLICATION);
nc.onStatus = function(info)
[Code].....
But I don't know what the client would be in the registerStream method. Which reference should I pass there? Is it possible this way?
I made another script that republishes the stream on localhost using NetConnection.publish(localhost/sameapplication). It works properly, but I would like to be able to manage it the other way around.
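For what it's worth, here is a minimal server-side sketch of the pulling approach; all names (the remote URI, "remoteStream", "republished") are placeholders. The fifth argument of the server-side Stream.play() is a NetConnection, which lets the local stream pull its source over that connection instead of needing a client reference:

    // Server-side ActionScript (main.asc); names are hypothetical.
    var nc;
    application.onAppStart = function() {
        nc = new NetConnection();
        nc.onStatus = function(info) {
            if (info.code == "NetConnection.Connect.Success") {
                // Stream.play()'s fifth argument is the NetConnection to
                // pull the source stream over; start=-1 requests live data.
                var s = Stream.get("republished");
                if (s) {
                    s.play("remoteStream", -1, -1, true, nc);
                }
            }
        };
        nc.connect("rtmp://remote-host/REMOTE_APPLICATION");
    };

Clients of this application can then subscribe to "republished". Pushing it onward into an RTMFP multicast group from the server side is the part FMS 4 does not appear to expose, as far as I can tell.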
I use FMLE 3 for streaming and recording (DVR) to FMS 3.5, with the sample DVR scripts on FMS for recording the live stream, but the recorded stream is not playable (neither from the remote FMS with FLVPlayback nor from my local development FMS with the Adobe Video Player). The recorded file is corrupted.
I'm streaming live video through FMS 3 and passing the streams to another FMS application to be recorded. While recording, I inject some custom metadata into the stream. Once recording is finished, I move the flv to another location and update my database with the metadata contained in the stream for later reference. All this succeeds without a problem.
However, only sometimes, the default 'duration' metadata is completely wrong (note: I never touch the duration metadata, I merely insert my own fields that are required for other reasons).
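As a side note on the injection step, here is a minimal server-side sketch of the technique, assuming Stream.send() with the "@setDataFrame" handler behaves like the equivalent client-side NetStream.send() call; the stream and field names are made up. Keeping custom fields in their own data keyframe, rather than rewriting onMetaData, may avoid colliding with the duration field FMS maintains itself:

    // Server-side ActionScript; all names are placeholders.
    var s = Stream.get("recordedStream");
    if (s) {
        s.record();
        s.play("liveSource", -1, -1, true);
        // Write custom fields into a separate data keyframe ("onCustomData")
        // instead of overwriting the stream's own onMetaData frame.
        s.send("@setDataFrame", "onCustomData", {operatorId: 42, sessionId: "abc"});
    }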
NetStream.play(streamName, -1) seems to be working incorrectly. If I have recorded an FLV on the server using FMS and FMLE, with only audio, named "myaudio", and then try to play a live stream using NetStream.play("myaudio", -1), it plays the recorded stream. I believe the documentation says it should start a live stream instead of playing the recorded stream, since the second argument is -1. Is this a bug in the NetStream.play method?
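For reference, the documented semantics of the second (start) argument, in a minimal client-side AS3 sketch; the connection URI is a placeholder:

    import flash.net.NetConnection;
    import flash.net.NetStream;
    import flash.events.NetStatusEvent;

    var nc:NetConnection = new NetConnection();
    nc.addEventListener(NetStatusEvent.NET_STATUS, onConnect);
    nc.connect("rtmp://localhost/live");  // placeholder application URI

    function onConnect(e:NetStatusEvent):void {
        if (e.info.code == "NetConnection.Connect.Success") {
            var ns:NetStream = new NetStream(nc);
            ns.client = {};
            ns.play("myaudio", -1);    // -1: live stream only, per the docs
            // ns.play("myaudio", 0);  // 0 or positive: recorded stream only
            // ns.play("myaudio", -2); // default: live if found, else recorded
        }
    }

If the recorded file plays with start=-1, that does contradict the documented behavior as quoted above.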
I'm having a very frustrating problem with FMS: a stream recorded on the server side won't play until I restart the server.
The user flow I'm currently working with is:
1. record webcam stream
2. close the stream
3. call an FMS function to post-process the stream
4. FMS joins the recorded webcam stream with two pre-recorded videos and stores the result as a new stream
Then, knowing the new stream name, I try to play it. However, I always see only its last frame and it doesn't play. Surprisingly, restarting the server and reconnecting helps; then I can play the stream.
So I wonder whether this is the right approach (server-side):
var stream = Stream.get("streamName");
stream.record();
stream.play(...);
stream.play(...);
stream.play(...);
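In case it helps, a hedged server-side sketch of that join approach, with placeholder stream names. One plausible cause of the "plays only after a restart" symptom is that the recording is never closed: until record(false) and play(false) are called, FMS can keep the target file open and unplayable.

    // Server-side ActionScript; stream names are placeholders.
    var out = Stream.get("joinedStream");
    var parts = 3;      // intro + webcam recording + outro
    var stopped = 0;
    if (out) {
        out.onStatus = function(info) {
            // Play.Stop fires once per playlist item; close after the last.
            if (info.code == "NetStream.Play.Stop" && ++stopped == parts) {
                out.record(false);  // stop recording and release the file
                out.play(false);    // shut the playlist down
            }
        };
        out.record();                       // start writing joinedStream.flv
        out.play("intro",  0, -1, true);    // reset=true starts a fresh playlist
        out.play("webcam", 0, -1, false);   // reset=false appends to the playlist
        out.play("outro",  0, -1, false);
    }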
I have successfully written some client-side ActionScript that lets me record my webcam and mic data to my FMS server. The .flv files get saved to a directory under the application's directory. However, I need to process those .flv files for a web application and would like them saved to the web app's directory, outside the FMS application directory. I have tried changing the <Streams> tag in the Application.xml file that I placed in the FMS application directory, but that simply did not work.

My goal is simple: to save the recorded stream data outside the FMS application's directory. I have been going crazy trying to do this and would appreciate any pointers. Can this be done? If not, how can I execute a server-side PHP script after the stream is done recording? I know about the exec() function in PHP, but am unsure how to trigger that script via client-side ActionScript.

Here is the path to my FMS server (Linux) installation and the application directory (the application name is "ngale"):
/opt/adobe/fms/applications/ngale
Here is the path where my client-side ActionScript is placing the .flv files of the recorded streams:
/opt/adobe/fms/applications/ngale/streams/recordings/
Here is the path to my web application:
/home/user/public_html/dev.ngale.net/public/
Here is the path where I would like the recorded .flv files to be saved:
/home/user/public_html/dev.ngale.net/public/audio/
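For reference, the virtual-directory form of the <Streams> tag, as I understand it from the FMS documentation; the "recordings" prefix is arbitrary, and the mapping syntax is prefix;absolute-path:

    <Application>
        <StreamManager>
            <VirtualDirectory>
                <!-- Maps the virtual prefix "recordings" to a directory
                     outside the application folder (prefix;absolute-path). -->
                <Streams>recordings;/home/user/public_html/dev.ngale.net/public/audio</Streams>
            </VirtualDirectory>
        </StreamManager>
    </Application>

With a mapping like this, the client would record to the virtual name, e.g. ns.publish("recordings/take1", "record"), and FMS would write take1.flv into the mapped directory. Also worth checking: the FMS process user needs write permission on that directory, which is a common reason such a mapping appears to "simply not work".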
Is it possible to limit the length of a live recorded video stream to N minutes/megabytes via SSAS in FMS? If the aggregate size of the stream exceeds the limit, can one either always keep the recorded file as the last N minutes/megabytes of the live show, or remove the video file and record from the beginning upon reaching N minutes/megabytes?
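I'm not aware of a single built-in rolling-window option, but capping the duration from SSAS is straightforward with a timer. A minimal sketch, assuming the standard onPublish/onUnpublish callbacks; the 10-minute limit is a placeholder. A rolling "last N minutes" file would have to be approximated by recording in alternating segments and deleting old ones yourself:

    // Server-side ActionScript (main.asc); the limit is a placeholder.
    var MAX_MINUTES = 10;
    var timers = {};

    application.onPublish = function(client, stream) {
        stream.record();  // record the incoming live stream
        timers[stream.name] = setInterval(function(s) {
            clearInterval(timers[s.name]);
            delete timers[s.name];
            s.record(false);  // stop recording once the limit is reached
        }, MAX_MINUTES * 60 * 1000, stream);
    };

    application.onUnpublish = function(client, stream) {
        if (timers[stream.name] != undefined) {
            clearInterval(timers[stream.name]);
            delete timers[stream.name];
        }
    };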
I have started a stream from Flash Media Live Encoder to Flash Media Server. According to my encoder, it should be located at "rtmp://localhost/live" and is called "livestream". In Flash, I try to set the source of the FLVPlayback component to "rtmp://localhost/live/livestream", and I get this error: "NetStream.Play.StreamNotFound: Adobe Flash tried to play a live or recorded stream that does not exist. Source can't be found." Both the server and the encoder are running, and I am encoding.
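One way to narrow this down is to bypass FLVPlayback and play the stream with a bare NetConnection/NetStream; if this also reports StreamNotFound, the stream name or application URI doesn't match what FMLE is actually publishing. A minimal AS3 sketch:

    import flash.net.NetConnection;
    import flash.net.NetStream;
    import flash.events.NetStatusEvent;
    import flash.media.Video;

    var nc:NetConnection = new NetConnection();
    nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
    nc.connect("rtmp://localhost/live");

    function onStatus(e:NetStatusEvent):void {
        trace(e.info.code);  // watch for NetStream.Play.StreamNotFound here
        if (e.info.code == "NetConnection.Connect.Success") {
            var ns:NetStream = new NetStream(nc);
            ns.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
            ns.client = {};
            var video:Video = new Video();
            video.attachNetStream(ns);
            addChild(video);
            ns.play("livestream");  // must match the FMLE stream name exactly
        }
    }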
As titled, what is the way to record video/audio files using Flash Media Server through RTMP, and allow users to access the recorded files through HTTP? What I am trying to do is record a user's microphone input and save it to the server. Afterwards, I would like other users to be able to access the recorded files and manipulate the audio data with computeSpectrum() to do some visualization of the audio. As I understand it, computeSpectrum() cannot work on streamed files, so I think I need to access the recorded files over HTTP instead of RTMP. Is that true?
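That matches my understanding: computeSpectrum() cannot tap RTMP-delivered audio, but it can sample a progressively downloaded FLV, because SoundMixer.computeSpectrum() reads the global sound mixer. A hedged AS3 sketch; the URL is a placeholder, and the file must be served from an allowed domain:

    import flash.net.NetConnection;
    import flash.net.NetStream;
    import flash.media.SoundMixer;
    import flash.utils.ByteArray;
    import flash.events.Event;

    var nc:NetConnection = new NetConnection();
    nc.connect(null);   // null connection = progressive HTTP playback
    var ns:NetStream = new NetStream(nc);
    ns.client = {};
    ns.play("http://dev.ngale.net/audio/recording.flv");  // placeholder URL

    addEventListener(Event.ENTER_FRAME, function(e:Event):void {
        var bytes:ByteArray = new ByteArray();
        // 256 values per channel of whatever is currently playing in the mixer
        SoundMixer.computeSpectrum(bytes, true);  // true = FFT mode
        // ... draw the visualization from "bytes" ...
    });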
I was wondering if anyone could clarify something about IP multicast. From what I understand, for IP multicast to work the server needs to publish to the router. So how can this be achieved if the FMS server is not on the same network as the router? Would it require a second FMS server, placed on the same network as the router, that connects to the main FMS server?
We are testing IP multicast with Flash Media Server 4.0. We set up the encoder and generated the manifest file. The multicast starts, but we see a lot of buffering and the video running fast. After about 2 minutes, the encoder stops publishing to the server. We are using Digital Rapids as the encoder.
I am testing the P2P function of Flash Media Server 4. I have set up two environments for testing, but only one succeeds in using P2P; I found that there is some problem with the network connection. Here is my procedure to create the peer-to-peer multicast:
1. Use the Multicast Config Tool (tools\multicast\configurator) to create the "Peer to Peer" type (generate the manifest.f4m and copy the publisher stream name to the Flash encoder):
   --> multicast type: Peer to Peer
   --> RTMFP URL: rtmfp://<true ip>/multicast
   --> stream name: livestream
   --> publish password: password
   --> group name: fms.multicast.example
   --> IP multicast address and port: 224.0.0.254:30000
2. Use the Flash encoder with the publisher stream name to connect to Flash Media Server.
3. On the client side, use the example player (tools\multicast\multicastplayer).
First environment (A) - does not work. 1. Flash Media Server (data center): the data center reported that all ports (TCP/UDP) have been opened.
I'm in the process of developing a realtime video chat application where multiple users can send video streams simultaneously. The number of users receiving the streams can be very large, e.g. 10 broadcasters and 500 receivers, where each receiver should get all streams.
I use RTMFP connections to an FMS, and streams are published into P2P multicast groups by passing the groupspec to the NetStream constructor. Currently I'm having problems with audio/video synchronization and with video stream 'jumps' (playback is not continuous). From what I read in other threads, this is related to there not being enough upstream bandwidth for sending the streams. So my questions are:
How do I calculate the required upstream bandwidth on every peer for the given example of 10 broadcasters and 500 receivers (is it 10 times the bandwidth of one stream)? What settings (on NetStream, Camera, Microphone, etc.) should be used for best results, and how should they be adapted based on the number of broadcasters? (A publishing sketch follows below.)
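For context, a minimal AS3 sketch of the publishing side. The group name, URI, and camera settings are placeholders; with P2P multicast, each receiving peer also relays data for the streams it receives, so the exact upstream math depends on the group topology, and modest per-stream bitrates matter more as the broadcaster count grows:

    import flash.net.NetConnection;
    import flash.net.NetStream;
    import flash.net.GroupSpecifier;
    import flash.events.NetStatusEvent;
    import flash.media.Camera;
    import flash.media.Microphone;

    var nc:NetConnection = new NetConnection();
    nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
    nc.connect("rtmfp://localhost/videochat");  // placeholder URI

    function onStatus(e:NetStatusEvent):void {
        if (e.info.code == "NetConnection.Connect.Success") {
            var gs:GroupSpecifier = new GroupSpecifier("fms.multicast.example");
            gs.serverChannelEnabled = true;  // let FMS introduce peers
            gs.multicastEnabled = true;      // enable stream multicast in the group

            // Strictly, wait for NetStream.Connect.Success before publishing.
            var ns:NetStream = new NetStream(nc, gs.groupspecWithAuthorizations());
            var cam:Camera = Camera.getCamera();
            cam.setMode(320, 240, 15);       // modest resolution and frame rate
            cam.setQuality(200000 / 8, 0);   // ~200 kbps cap, variable quality
            ns.attachCamera(cam);
            ns.attachAudio(Microphone.getMicrophone());
            ns.publish("livestream");        // placeholder stream name
        }
    }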
We have a problem enabling multicast on an Amazon FMS 4.0.3 server. On the server we have opened all ports to make sure everything works, but when we use http://public-dns/multicast/multicastplayer/multicastplayer.html we get this message: "The connection attempt to FMS failed." In the server administration console we can see the Flash Media Live Encoder session, but the multicast player cannot connect to rtmfp://public-dns/multicas
I know this question comes up from time to time, but it's completely unaddressed in the documentation, and I'd like there to exist a more authoritative treatment of the subject. Like many other developers, we're trying to record multicast RTMFP streams. We see three options:
1. Broadcasting clients open two outgoing NetStreams: one to the multicast group through RTMFP, and another directly to FMS through RTMP for recording (see the sketch below). One downside is that the user now has two outgoing streams, so the available outbound bandwidth of any broadcasting client could easily become a constraint on video quality.
2. One can stream to FMS 4 using RTMP and have FMS-side code that records and rebroadcasts the stream to the multicast group using RTMFP (recommended by JayCharles at http:url...). I haven't tried this yet, but my guess is that it will have worse video quality than a pure RTMFP solution. Is my concern justified?
3. Super-hacky solution: one could have a Flash Player running in some kind of virtual environment on a server. This player could subscribe to the multicast stream and record the video at the system level. Has anyone tried a hack as daring as this?
What solution is recommended by Adobe? It's incredible to me that FMS 4 can't act as an RTMFP consumer, thereby both acting as a multicast node and also recording the video. Can anyone at Adobe comment on this omission in functionality?
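To make option 1 concrete, a hedged AS3 sketch of attaching one camera to two NetStreams; connection setup is elided, and URIs, group name, and stream names are placeholders:

    import flash.net.NetConnection;
    import flash.net.NetStream;
    import flash.net.GroupSpecifier;
    import flash.media.Camera;

    // Assumes both connections have already reported Connect.Success.
    var rtmfpConnection:NetConnection = new NetConnection(); // connect("rtmfp://...") first
    var rtmpConnection:NetConnection  = new NetConnection(); // connect("rtmp://...") first

    var cam:Camera = Camera.getCamera();

    // Leg 1: publish into the RTMFP multicast group.
    var gs:GroupSpecifier = new GroupSpecifier("fms.multicast.example");
    gs.serverChannelEnabled = true;
    gs.multicastEnabled = true;
    var groupStream:NetStream = new NetStream(rtmfpConnection, gs.groupspecWithAuthorizations());
    groupStream.attachCamera(cam);
    groupStream.publish("livestream");

    // Leg 2: publish the same camera to FMS over RTMP for recording;
    // "record" makes FMS write livestream.flv on the server.
    var recordStream:NetStream = new NetStream(rtmpConnection);
    recordStream.attachCamera(cam);
    recordStream.publish("livestream", "record");

As noted above, the cost of this option is roughly double the upstream usage on the broadcasting client.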
I work for a carrier and am looking at implementing IP multicast on an MPLS network to support enterprise executive webcasts. As you're probably aware, there are multiple multicast methods available.
Anyway, it's down to two multicast methods: PIM-SM (which our engineering group will probably reject due to complexity) and PIM-SSM (which not every application supports). Can anyone advise on whether Adobe Flash Media Server (and the player) support PIM-SSM multicast?
I am setting up multiple multicast sessions. I generated configurations for each session using a separate stream name, IP address, and port.
1) Is it possible to run multiple multicast sessions? From the configuration it looks like we can, but I need confirmation. 2) How can I test these sessions using the default player, or any sample player?
I'm using the server-side Stream.play() method to play back a recorded stream. It plays back fine until it hits the buffer limit (say I set the buffer to 5 seconds: it plays back fine for 5 seconds), but then it freezes and playback becomes very stuttery (1 frame every 2-3 seconds). Is this a known issue? I'm using Windows Server 2008. I've tried a few things to resolve this but had no luck. The server is running the dev license and has no load.
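For anyone comparing notes, a minimal server-side sketch of the setup described, with placeholder names; if the stutter starts exactly when the buffer drains, varying the setBufferTime() value is a quick way to confirm the correlation:

    // Server-side ActionScript; stream and file names are placeholders.
    var s = Stream.get("playbackStream");
    if (s) {
        s.setBufferTime(5);                   // seconds of server-side buffering
        s.play("recordedFile", 0, -1, true);  // play the recorded FLV from the start
    }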