Media Server :: Stream.getOnMetaData() Not Returning Anything?
Jan 22, 2009
I'm currently trying to return the metadata of recorded streams using Stream.getOnMetaData(), but it never returns an object. If I trace server-side, the stream is found, as I can get the size and length of it, but no metadata object comes back to my client. Does anyone have a sample using this method properly?
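Not an authoritative answer, but a rough server-side sketch of how getOnMetaData() is usually wired up, assuming it is an instance method on the Stream returned by Stream.get() and that the result is handed back through a client call; the stream name "myRecording" and the method name getMeta are placeholders.

// main.asc -- rough sketch only; "myRecording" and getMeta() are examples
application.onConnect = function(client) {
    client.getMeta = function(streamName) {
        var s = Stream.get(streamName);               // e.g. "myRecording", a recorded stream in this app
        if (s) {
            trace("recorded length: " + Stream.length(streamName) + " seconds");
            return s.getOnMetaData();                 // assumed instance method; return value goes to the client's Responder
        }
        return null;
    };
    return true;
};

On the client, something like nc.call("getMeta", new Responder(onMetaResult), "myRecording") would then receive whatever object the server returned.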
I have Flash Media Streaming Server 3.5 (not Interactive) running on RHEL 5.5 x86_64 Linux. All is working well; however, how do I prevent unauthorized clients from connecting and streaming live content to the server? How can I set up the server to require a username and password before someone can publish live media? I am new to this product and have been reading some documentation, but I have not found a clear-cut answer on how to require a username and password just for publishing live content. I am using the Adobe FMS Apache install; what files need changing? [code] I want to stop a person on the public internet from connecting to the server and starting a live stream. Can this be done with a username and password?
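I don't know of a single config switch for this, so here is a minimal, hedged main.asc sketch of the usual approach: check credentials in application.onConnect and only grant write (publish) access when they match. The user/password values, and passing them as extra connect() arguments, are placeholders; an encoder such as FMLE typically has to carry them in the RTMP URL instead, in which case you would read them out of client.uri.

// main.asc for the live application -- illustration only; replace the
// hard-coded credentials with your own lookup.
application.onConnect = function(client, username, password) {
    if (username == "publisher" && password == "s3cret") {   // placeholder credentials
        client.writeAccess = "/";      // allowed to publish live streams
    } else {
        client.writeAccess = "";       // connection accepted, but publishing is denied
    }
    client.readAccess = "/";           // everyone may still play streams
    return true;
};

Rejecting the connection outright (return false, or application.rejectConnection(client)) is the stricter option if unauthenticated users shouldn't be able to connect at all.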
When I try to live stream with FMS, I can stream video with Flash Media Live Encoder to the server, but when I create the player to receive the live stream from the server, I can't receive it. Can anyone give me a step-by-step tutorial on how to do it?
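For what it's worth, a bare-bones AS3 subscriber looks something like the sketch below; the server address, the application name "live", and the stream name "livestream" are assumptions and must match what FMLE is actually publishing.

import flash.events.NetStatusEvent;
import flash.media.Video;
import flash.net.NetConnection;
import flash.net.NetStream;

var video:Video = new Video(640, 360);
addChild(video);

var nc:NetConnection = new NetConnection();
nc.client = { onBWDone: function():void {} };        // avoid async errors from server callbacks
nc.addEventListener(NetStatusEvent.NET_STATUS, onNetStatus);
nc.connect("rtmp://yourserver/live");                // the same app FMLE publishes to

function onNetStatus(e:NetStatusEvent):void {
    trace(e.info.code);
    if (e.info.code == "NetConnection.Connect.Success") {
        var ns:NetStream = new NetStream(nc);
        ns.client = { onMetaData: function(md:Object):void {} };
        video.attachNetStream(ns);
        ns.play("livestream");                       // must match FMLE's "Stream" field exactly
    }
}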
Only just getting started on this whole domain of learning, so go easy! If I set up a P2P video/audio chat (similar to the sample VideoPhone thing on the Cirrus site), can I get the stream from both parties to send to a server at the same time so that I can record it? If so, would I have to use an FMS to stream it to and perform the recording (and if so, which version could I get away with)? Are there any (preferably free, or at least tutorialised) solutions for the recording side of things?
Currently it seems like the only option for doing the P2P thing is to use Stratus/Cirrus unless I use FMS4 Enterprise.
And how effective would this kind of setup be, in terms of the quality of the stream and the recording? Does any of this make sense?
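It does make sense in principle. One hedged sketch of the recording half: while each party publishes to the other peer over RTMFP, it can also open an ordinary RTMP connection to an FMS application and publish the same camera and microphone with the "record" flag so the server writes the file. The app name "chatrecorder" and the stream name are invented for the example.

import flash.events.NetStatusEvent;
import flash.media.Camera;
import flash.media.Microphone;
import flash.net.NetConnection;
import flash.net.NetStream;

var cam:Camera = Camera.getCamera();
var mic:Microphone = Microphone.getMicrophone();

var recNC:NetConnection = new NetConnection();
recNC.addEventListener(NetStatusEvent.NET_STATUS, function(e:NetStatusEvent):void {
    if (e.info.code == "NetConnection.Connect.Success") {
        var recNS:NetStream = new NetStream(recNC);
        recNS.attachCamera(cam);                     // the same devices already feeding the P2P stream
        recNS.attachAudio(mic);
        recNS.publish("mySideOfTheCall", "record");  // server records the stream to disk
    }
});
recNC.connect("rtmp://yourserver/chatrecorder");     // hypothetical recording application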
I've had FMS running on my local machine for a while and have had a little experience writing FMS apps, but I've just tried recording audio for the first time using the standard vod application and I keep getting a "Write access denied for stream" error. My AS3 code is copied and pasted from various examples, and I'm confident that it works.
I'm running Windows XP service pack 3 & FMIS 3.5.
I've had a look at the vod/media directory, and under Windows -> Properties the read-only attribute is ticked. Every time I un-tick it, it reverts back to being ticked. I've googled this, and MS say that most programs ignore the read-only attribute and that it only really applies to files. I've also tried the MS fix for setting the read-only attribute via cmd and still no joy (it doesn't fix the read-only attribute, and FMS still won't record the audio after setting it via cmd).
I've also tried our dev server install of FMS (running under Linux) and I'm getting the same results.
Here's my AS3 code...
private function initApp(event:Event):void {
    removeEventListener(Event.ADDED_TO_STAGE, initApp);
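For comparison, this is the bare minimum I'd expect an audio-recording client to need, assuming it targets an application that actually permits publishing (the app name "audiorecorder" is made up; the stock vod app is normally used for playback):

import flash.events.NetStatusEvent;
import flash.media.Microphone;
import flash.net.NetConnection;
import flash.net.NetStream;

var nc:NetConnection = new NetConnection();
nc.addEventListener(NetStatusEvent.NET_STATUS, function(e:NetStatusEvent):void {
    trace(e.info.code);
    if (e.info.code == "NetConnection.Connect.Success") {
        var ns:NetStream = new NetStream(nc);
        ns.attachAudio(Microphone.getMicrophone());
        ns.publish("myAudio", "record");     // server writes myAudio.flv into the app's streams folder
    }
});
nc.connect("rtmp://localhost/audiorecorder");    // hypothetical app that allows recording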
I'm testing RTMFP multicast streams on FMS 4 Update 1. After 10 minutes I get this message: "RTMFP Multicast stream has exceeded max duration allowed; closing stream." But I'm not using IP multicast.
I built a client-side application containing only an FLVPlayback 2.5 component and a short AS3 script.
[Code]....
My encoder is set up with three streams: Vid 500 kbps / Audio 48 kbps, Vid 800 kbps / Audio 48 kbps, and Vid 1500 kbps / Audio 48 kbps. I start the encoder and everything looks fine in the log. In my browser (Safari or Firefox) I go to my HTML page and the stream starts after 6-8 seconds, but always at the lowest bitrate (548 kbps), and nothing suggests the stream ever switches to another bitrate. I tried it with the SMIL playlist and the result is the same: only the lowest bitrate is published.
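As a sanity check outside FLVPlayback, switching can also be driven by hand with NetStream.play2(); the sketch below assumes the three FMLE streams are named mystream_500, mystream_800, and mystream_1500, which is purely an example naming scheme.

import flash.net.NetStream;
import flash.net.NetStreamPlayOptions;
import flash.net.NetStreamPlayTransitions;

// Switch an already-playing NetStream to another bitrate rendition.
function switchBitrate(ns:NetStream, newName:String):void {
    var opts:NetStreamPlayOptions = new NetStreamPlayOptions();
    opts.streamName = newName;                        // e.g. "mystream_800"
    opts.transition = NetStreamPlayTransitions.SWITCH;
    ns.play2(opts);                                   // FMS switches at the next keyframe
}

Watching for the NetStream.Play.Transition status event then tells you whether the server ever attempts a switch at all.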
I have recently installed FMIS 3.5.3. In checking the access logs, I find data in both logs that display the same stream-stop and stream-play time. I'm not sure why the time is the same (00:19:27 in the example below). Videos play fine when testing from work (T3 connection); however, there is occasionally a very slight hesitation when playing video from home (I have a cable connection). [code]...
I'm having a problem with recording a live webcam stream: the last few seconds of the stream are getting cut off. The recording is stopped with the following piece of code:
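Purely as an illustration (this is not the missing snippet): the pattern usually recommended for this symptom is to detach the devices first and let the publishing stream's send buffer drain before calling close(), roughly like this:

import flash.events.TimerEvent;
import flash.net.NetStream;
import flash.utils.Timer;

// "ns" is the publishing NetStream; a non-zero ns.bufferTime is assumed to have
// been set before publish() so that outgoing data is buffered and measurable.
function stopRecording(ns:NetStream):void {
    ns.attachCamera(null);                 // stop feeding new video
    ns.attachAudio(null);                  // stop feeding new audio
    var t:Timer = new Timer(250);
    t.addEventListener(TimerEvent.TIMER, function(e:TimerEvent):void {
        if (ns.bufferLength == 0) {        // everything queued has reached the server
            t.stop();
            ns.close();                    // now the tail end should be in the recording
        }
    });
    t.start();
}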
I'm trying to stream HDS live multi-bitrate content. It seems to push to the FMS, but my player doesn't display the stream. Are these settings and files correct? The documentation is confusing about which files need to be edited and/or created.
Encoder settings: Bit Rate: 150,500,700 FMS URL: rtmp://myserver/livepkgr Stream: liveevent%i?adbe-live-event?liveevent
FMS 4.5
I see the following directories being created when I start encoding, and each directory has a single file with a .stream extension in it. Are these correct? C:\FMS-HOME\applications\livepkgr\events\_definst_\liveevent1 [code].....
I'm new to Adobe Flash and I'm just trying out Flash Media Server 4.5 to stream live video on a local machine. I'm using FMLE 3.2 to capture the video from my digital camera, which works fine, and I can connect to the server with no worries as well. But I have issues streaming the content in the sample video player that comes with the server.
[Code]....
I've also tried playing some sample videos on it; it even refuses to play those. What could be the problem?
I'm currently working as part of a team of developers and we've run into an issue with our FMS project. In our site the user is able to record video of themselves, which is turned into FLVs and stored by the FMS. These videos are played back at a later time. The site works fine but intermittently stops recording new videos or streaming previously recorded videos. The only solution at this point is to restart the server. The server logs show that the onDisconnect event handler runs when users terminate their sessions, and we can see that the number of active connections does not appear to be exceeding the limit on the server. There are no runtime errors to indicate that anything has gone wrong. From looking at output from the app, it just seems like the publish and unpublish event handlers stop running, but nothing actually breaks. The user doesn't realize anything is wrong until they try to watch their videos, only to get a blank screen.
We're wondering if perhaps it is a garbage collection issue: either the garbage collector in FMS is running and taking a really long time to reload the app, or perhaps it isn't running at all and memory is filling up somewhere.
One server is the streaming server, running Flash Media Interactive Server 3.5 and hosting the application with the .asc files in the FMS application directory. The second server is the IIS web server that hosts the HTML, ASPX, SWF, etc. files. So basically I have a SWF file on one server that has to connect to a live stream via RTMP on a different server with a different IP address. I did not find any clear explanation of this cross-domain RTMP issue; as far as I can tell, only HTTP can use a crossdomain.xml policy file, not RTMP. So what security/policy procedure needs to be done to enable a SWF file hosted on one server to connect to and show a live RTMP webcam broadcast from another server?
I am trying to create a Flash player to stream an online Internet radio station using FMS. I chose FMS after being told that I need an RTMP server to extract the metadata. Our radio audio is encoded using a DJ application called SAM Broadcaster, but for some reason it does not have an option to send the stream directly to a Flash Media Server; the only options are IceCast or SHOUTcast. How should I set up the stream? Should it be stream encoder >> IceCast/SHOUTcast server >> Flash Media Server >> Flash player client? Or should I set up stream encoder >> Flash Media Server >> Flash player client? Shouldn't Flash Media Server be an alternative to IceCast or SHOUTcast? If that is the case, how do I send the stream from SAM Broadcaster directly to Flash Media Server without restreaming through another streaming server?
'URL missing from Media tag' on FMS 4.5 HTTP live stream playback. I have spent a whole day trying in vain to publish F4M live HTTP streams from FMLE using FMS 4.5, and almost everything I try results in the error "The F4M document contains errors: URL missing from Media tag", in both Flash Media Playback and the FMS videoPlayer app. Other people seem to have seen this and solved it, but I've found nothing that helps. I have tried publishing a single stream and also multiple bitrates. For the latter, I followed the Adobe video tutorial as follows:
Multiple bitrate live: I'm publishing 3 streams from FMLE, using FMS URL rtmp://[serveraddress]/livepkgr [code]..........
I am trying to connect to an FMS application without Flash (no SWF). To the best of my knowledge, the only way to accomplish this is through the admin APIs; is that correct? If so, how do I return meaningful values from the admin API? I can call functions using the broadcastMsg API call, but I can't seem to get back any values; all that is returned is a useless "Success"-type message. I have tried using shared objects and the getSharedObjects API call, but the call doesn't return any info on the contents of the shared objects, just their names and some info on their persistence.
Is there another way of accomplishing my goal outside of the Admin API? If not, how do I return meaningful variables from the Admin API? Is there a way to hack or extend the Admin API?
What would be some top reasons an FLV will not stream with Flash Media Streaming Server 3.5? I ran FLVCheck.exe on the FLV and it passed; however, the FLV will not stream. If I could just get some common reasons that would keep an FLV from streaming, I think that would give me a good starting point for troubleshooting.
Besides Flash Media Encoder, can I use a live stream from Windows Media Services? We already have live streaming with Windows Media Server, but now we also want to serve mobile devices, and we don't want to re-encode all the streams (TV channels) again. It would be great if we could take a stream from Windows Media Server and tunnel it into Flash Media Server.
Is there a way to stream data instead of media files? In particular, I'm interested in streaming SWF files that will be dynamically loaded by another SWF client. I've seen an interesting post describing how to do this on Wowza, but I'd prefer to find a solution in FMS as well so that I can use the SWF Verification feature.
I have FLV files streaming fine from the media folder in the applications folder. I want to split up the videos into folders to make them more manageable. If I create another folder in the applications folder at the same level as the media folder and put a video in there, the video does not play.
I have an *.flv file on an FMS. When I play it on the client side the video plays just fine, but when I call Stream.play(filename, 0, -1, false) on the server side the video turns out really choppy. In both cases I use NetConnection to connect over RTMP and NetStream to play the stream, but in one case I connect to a stream and ask the server to play my file into that stream. Apparently that doesn't work with files? It works just fine for live streams.
Somebody I know can't watch because they are behind a proxy server. The nc.connect() call to get the live stream fails because it goes directly from AS3 to FMS and doesn't go through the proxy server.
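A hedged sketch of the usual fallback: try plain RTMP first and, when that fails behind the proxy, retry with RTMPT on port 80, which tunnels RTMP inside ordinary HTTP requests that the proxy will pass. The server address and application name are examples.

import flash.events.NetStatusEvent;
import flash.net.NetConnection;

var urls:Array = [ "rtmp://yourserver/live", "rtmpt://yourserver:80/live" ];
var attempt:int = 0;

var nc:NetConnection = new NetConnection();
nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
nc.connect(urls[attempt]);

function onStatus(e:NetStatusEvent):void {
    if (e.info.code == "NetConnection.Connect.Failed" && attempt < urls.length - 1) {
        attempt++;
        nc.connect(urls[attempt]);         // fall back to the HTTP-tunneled protocol
    }
}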
I seem to recall reading somewhere (but for the life of me I can't find the article) that with Adobe FMS you can store content on one machine and run the actual server on a separate machine. Is this correct? If so, how would you go about doing it?
I understand how clients use bandwidth detection etc to dynamically switch streams via client calls with ns.play2( ... ), but I was wondering if it's possible to only ever use 1 initial ns.play( ... ) call on the client side, but let FMS server side logic that I write dictate which client sees what content. For example, I have 3 clients connected to my FMS server, all watching a live stream. I then decide I want clientA to see 'recordedMovieA.flv', clientB to continue seeing the live stream, and clientC to watch 'recordedMovieB.flv'.
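This is possible with server-side streams, at least in outline. A rough main.asc sketch, in which every client plays one fixed per-client stream name and the server decides what feeds it by re-pointing a server-side Stream; the "feed_" prefix and the showToClient() helper are invented for the example, not an existing API.

// main.asc -- sketch only
application.onConnect = function(client) {
    client.feedName = "feed_" + client.id;                       // the only name this client ever plays
    client.getFeedName = function() { return this.feedName; };   // client asks for it via nc.call()
    client.feed = Stream.get(client.feedName);
    client.feed.play("livestream", -1, -1, true);                // everyone starts on the live stream
    return true;
};

// Invented helper: called by whatever control logic you write to retarget one viewer.
function showToClient(client, source, isLive) {
    if (isLive) {
        client.feed.play(source, -1, -1, true);                  // e.g. back to "livestream"
    } else {
        client.feed.play(source, 0, -1, true);                   // e.g. "recordedMovieA"
    }
}

The client keeps its single ns.play(feedName) call; from its point of view the content simply changes.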
I am recording a video, and while recording I issue some NetStream.send("doSomething", params) commands from the client side. When I play back this video I receive the doSomething events on the client side; no problems so far. Can I receive those events on the server side? I want to handle those events on the server side, not the client side.
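As far as I know, data messages sent with NetStream.send() on a published stream are written into the recording and delivered to subscribers, but are not surfaced to main.asc directly, so the usual workaround is a parallel NetConnection.call(). The sketch below keeps the doSomething name from the post; everything else is illustrative.

// Client side (AS3), next to the existing send():
// ns.send("doSomething", params);        // still embedded in the recorded stream
// nc.call("doSomething", null, params);  // additionally notifies the server script

// Server side (main.asc):
application.onConnect = function(client) {
    client.doSomething = function(params) {
        trace("doSomething received server-side: " + params);
        // handle the event here (log it, stamp it with the current stream time, etc.)
    };
    return true;
};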