Media Server :: Playback F4v Files Recorded In FMLE?
Feb 25, 2010
My name is Göran and I know this is probably the wrong place to post my question, but as far as I can see there's no forum for FMLE here. I'm using the beta version for Mac. I was broadcasting to Justin.tv last night and decided to try out the record feature in FMLE. It looked like it worked OK and a 125 MB F4V landed on my hard drive. There's no way I can play it back, though, or for that matter convert it to MOV etc. What am I doing wrong? Do I need another special app for playback or converting? It seems a bit strange that this feature is on board in FMLE but you can't use it.
As titled, what is the way to record video/audio files using Flash Media Server through RTMP, and allow users to access the recorded files through HTTP? What I am trying to do is record a user's microphone input and save it to the server. Afterwards, I would like other users to be able to access the recorded files and manipulate the audio data with computeSpectrum() to do some visualization of the audio. As far as I know, computeSpectrum() cannot work on streaming files, so I think I need to access the recorded files over HTTP instead of RTMP. Is that true?
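In case it helps, here is a minimal client-side sketch (AS3) of the HTTP approach, assuming the recorded file has been copied somewhere a web server can reach (the URL below is made up) and that a crossdomain policy allows the read. SoundMixer.computeSpectrum() reads whatever is going through the global sound mixer, which is why a progressive-download NetStream over HTTP is used here rather than RTMP:

    // AS3 frame-script sketch; the example.com URL is hypothetical.
    import flash.net.NetConnection;
    import flash.net.NetStream;
    import flash.media.SoundMixer;
    import flash.utils.ByteArray;
    import flash.events.Event;

    var nc:NetConnection = new NetConnection();
    nc.connect(null);                        // null connect = progressive download over HTTP
    var ns:NetStream = new NetStream(nc);
    ns.client = { onMetaData: function(info:Object):void {} };
    ns.play("http://example.com/recordings/micTake.flv");

    var spectrum:ByteArray = new ByteArray();
    addEventListener(Event.ENTER_FRAME, function(e:Event):void {
        // Reads 512 values (256 left + 256 right channel) from whatever is
        // currently playing through the global mixer; RTMP audio would throw
        // a SecurityError here, HTTP progressive audio does not.
        SoundMixer.computeSpectrum(spectrum, true);   // true = FFT mode
        // ...draw the visualization from `spectrum`...
    });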
I'm using the server-side Stream.play() method to play back a recorded stream. It plays back fine until it hits the buffer limit (say I set the buffer to 5 seconds, it will play back fine for 5 seconds), but then it freezes and playback is very stuttery (1 frame every 2-3 seconds). Is this a known issue? I'm using Windows Server 2008. I've tried a few things to resolve this but no luck. The server is running the dev license and has no load.
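For reference, a minimal main.asc sketch of the pattern being described, with made-up stream names; setBufferTime() controls how far ahead the server buffers the recording it is re-playing:

    // main.asc sketch; "rebroadcast" and "archive" are made-up names.
    application.onAppStart = function() {
        var s = Stream.get("rebroadcast");   // the stream clients subscribe to
        if (s) {
            s.setBufferTime(5);              // seconds the server buffers ahead
            s.play("archive", 0, -1, true);  // play the recorded archive.flv from the start
        }
    };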
On page 3 of his article "Working with metadata for live Flash video streaming" (http:url....), Jens Loeffler says the following: "This code defines the function sendDataEvent on the server side, which utilizes the NetStream.send() function to inject the event, including the associated data synchronized with the current timecode of the video."
Note: If you record the video on the server side (a functionality of FMIS), the injected events are also being triggered by the recorded file.
Now, I've got both my own project and his sample project sending/receiving injected plain-text data events via the whole "call('sendDataEvent', data) -> send('onDataEvent', data) -> onDataEvent(data)" relationship, just as expected and explained in his article. The note that Jens makes, however, I cannot seem to validate, and unfortunately that's the linchpin piece of my project. Whenever I capture a stream with injected data events and play it back (via the same 'player' mechanism used with the live stream), the captured video plays back, but I do not see the injected events being triggered by the recorded file.
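For comparison, here is a stripped-down sketch of the injection path (the names follow the ones quoted above, everything else is assumed), plus the one client-side detail that is easy to miss on recorded playback, namely that the playback NetStream's client object has to expose the same handler name.

Server side (main.asc):

    // Invoked from the operator client with nc.call("sendDataEvent", null, data)
    Client.prototype.sendDataEvent = function(data) {
        var live = Stream.get("livestream");          // assumed live stream name
        if (live) live.send("onDataEvent", data);     // injected at the current timecode
    };

Client side (AS3), for playing back the recorded file:

    var playbackNS:NetStream = new NetStream(nc);     // nc = an already-connected NetConnection
    playbackNS.client = {
        onDataEvent: function(data:Object):void { trace("data event:", data); },
        onMetaData:  function(info:Object):void {}
    };
    playbackNS.play("recordedStreamName");            // name of the recorded file, assumed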
I am using Adobe DVRCast to record my live streams. The problem I am having is that when I stop the encoder during a live stream, a new file is not created; the existing stream is written over. How can I prevent this from occurring? Is there a way to have all the VOD files that are in one folder roll over to the next file during playback?
There are some recorded FLV files. The guy in the video is showing numbers with his hands (1.flv, 2.flv, 3.flv, etc.). And there are live idle videos; in these idle videos the guy doesn't do anything.
I need to be able, as the operator, to click "1" and have viewers see the guy make "1" in the video. Actually, I have accomplished this so far; it works. I do this:
If the operator selects a video, it adds the video to the playlist: stream.play(filename, -2, -1, false);
I also have this event handler to play a random idle video when there are no more action videos lined up:
    function onStreamStatus(obj) {
        if (obj.code != "NetStream.Play.Stop") return;
        switch (obj.code) {
[Code].....
So it is working so far. The problem is: as time goes by, a delay appears between when I issue the play command on the stream and when I actually see it in the video on the client side.
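For what it's worth, here is a rough main.asc sketch of the setup described above, with made-up stream and file names; playing an operator-selected clip with reset=true (instead of queuing it behind whatever idle clips are still pending) is one way to keep the delay from accumulating:

    // main.asc sketch; "showStream" and the idle clip names are made up.
    var s;

    application.onAppStart = function() {
        s = Stream.get("showStream");        // the one stream viewers subscribe to
        s.onStatus = onStreamStatus;
        playIdle();
    };

    // Invoked from the operator client with nc.call("playAction", null, "1")
    Client.prototype.playAction = function(filename) {
        // reset=true flushes any idle clips still queued, which keeps the
        // operator's click and what viewers see from drifting apart over time
        s.play(filename, 0, -1, true);
    };

    function playIdle() {
        var idle = "idle" + (Math.floor(Math.random() * 3) + 1);   // idle1..idle3
        s.play(idle, 0, -1, false);          // queue behind the current clip
    }

    function onStreamStatus(info) {
        if (info.code == "NetStream.Play.Stop") playIdle();
    }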
I wanted to know if there is any facility to use software switchers along with FMIS. The situation: I am streaming multiple streams using FMLE from one place, and at the receiver's end the client receives only one of the streams at a time. So I want to apply a switcher so that I can stream only one stream from FMLE at a time. I have explained the scenario in the attached image.
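One software-only way to do this is server-side switching in main.asc: every FMLE stream keeps publishing, clients subscribe to a single output stream, and an operator call switches which source feeds it. A rough sketch, with made-up names:

    // main.asc sketch: viewers only ever play "program"; an operator client
    // calls switchTo() to pick which live FMLE stream feeds it.
    var program;

    application.onAppStart = function() {
        program = Stream.get("program");
    };

    Client.prototype.switchTo = function(sourceName) {
        // reset=true cuts straight over to the selected live source
        if (program) program.play(sourceName, -1, -1, true);
    };

An operator client would invoke it with nc.call("switchTo", null, "camera2") or similar; authenticating that caller is left out of the sketch.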
My problem: I enabled SWFVerification on the server and saw that FMLE no longer connects. I looked for some explanation and saw that if SWFVerification is enabled, you must also add an exception to the UserAgentExceptions tag, <Exception from="" to=""/>. At this point I added this change, <Exception From="FME/0.0" to="FME/4.0"/>, but FMLE still does not connect, so this did not solve the problem.
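For what it's worth, FMLE reports a user agent that starts with "FMLE" (e.g. "FMLE/3.0 (compatible; FMSc/1.0)"), so an exception written against "FME" will not match it. The UserAgentExceptions block inside the SWFVerification section of Application.xml would look something like this (the version range is a guess; adjust it to the encoder actually in use):

    <UserAgentExceptions>
        <!-- FMLE's user agent begins with "FMLE", not "FME" -->
        <Exception from="FMLE/0.0" to="FMLE/4.0"/>
    </UserAgentExceptions>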
1) I am trying to make a connection to Flash Media Server to publish a live video stream. I am sitting behind a proxy and firewall and am not able to connect with RTMP, which is why I use RTMPT to connect to the media server and publish. But in this case, after publishing for some time (i.e. 5 to 10 minutes), the connection breaks down and streaming stops. The same problem occurs when I use Flash Media Live Encoder 3 to publish a live stream.
2) I am making my connection like this: "rtmpt://xx.yy.zz.kk:80/Training", where xx.yy.zz.kk represents the IP of the live system where Flash Media Server is running.
3) After analyzing the problem, I observed that while publishing the stream, the number of HTTP requests from the publishing system keeps increasing, and at some point it crosses the limit of the proxy server (our proxy can handle 600 HTTP requests per minute from any client PC), so streaming stops.
4) Then I attended an Adobe Connect Pro trial classroom session sitting behind the same proxy and firewall, but in that case my live video streaming did not break down. As per my understanding, Adobe Connect Pro also uses Flash Media Server for streaming.
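Not a fix for the proxy's request limit, but one client-side mitigation, sketched below (AS3, stream name made up, the rtmpt address taken from the post above), is to reconnect and re-publish whenever the tunneled connection drops:

    import flash.net.NetConnection;
    import flash.net.NetStream;
    import flash.events.NetStatusEvent;
    import flash.media.Camera;
    import flash.media.Microphone;
    import flash.utils.setTimeout;

    var nc:NetConnection;
    var ns:NetStream;

    function connect():void {
        nc = new NetConnection();
        nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
        nc.connect("rtmpt://xx.yy.zz.kk:80/Training");
    }

    function onStatus(e:NetStatusEvent):void {
        switch (e.info.code) {
            case "NetConnection.Connect.Success":
                ns = new NetStream(nc);
                ns.attachCamera(Camera.getCamera());        // null-checks omitted
                ns.attachAudio(Microphone.getMicrophone());
                ns.publish("livestream", "live");           // hypothetical stream name
                break;
            case "NetConnection.Connect.Closed":
            case "NetConnection.Connect.Failed":
                setTimeout(connect, 5000);                  // retry in 5 seconds
                break;
        }
    }

    connect();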
I have been trying to block incoming connections with FMS 4 Enterprise for the live application. I set allowedSWFdomains.txt and allowedHTMLdomains.txt to [URL] so that I block every domain. I don't want any connection to come in, and setting the domain to [URL] blocks everything when I try to load. Except that FMLE is allowed through no matter what you set in allowedSWFdomains or allowedHTMLdomains. What kind of security is this? Does that mean anyone with FMLE can connect to anyone's server? How do I actually block connections?
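The allowedSWFdomains/allowedHTMLdomains files only gate browser-based SWF/HTML clients, which is why FMLE walks straight past them. One common addition, sketched below, is to inspect Client.agent in application.onConnect and reject anything unexpected (the check shown is deliberately crude):

    // main.asc sketch: encoders are filtered here instead of by the domain files.
    application.onConnect = function(client) {
        var agent = String(client.agent);
        if (agent.indexOf("FMLE") == 0 || agent.indexOf("FME") == 0) {
            application.rejectConnection(client, {message: "Encoders are not allowed"});
            return;
        }
        application.acceptConnection(client);
    };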
So I have FMLE and FMS, to record and stream video at the same time. On the server side I have a "main" application:

    application.onPublish = function(client, potok) {
        potok.onStatus = function(info) {
            for (var i in info) {
                trace(i + "=" + info[i]);
[Code] .....
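A fuller sketch of the record-while-streaming pattern (separate from the onStatus tracing above; the "rec_" prefix is made up): the incoming FMLE stream stays live under its own name while a second server-side Stream copies it to disk:

    application.onPublish = function(client, potok) {
        var rec = Stream.get("rec_" + potok.name);
        if (rec) {
            rec.record();
            rec.play(potok.name, -1, -1);    // copy the live stream into the recording
        }
    };

    application.onUnpublish = function(client, potok) {
        var rec = Stream.get("rec_" + potok.name);
        if (rec) rec.record(false);          // stop recording when the encoder stops
    };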
I need to take a stream which is coming from FireWire on a Mac, transcode/encode it to particular bitrates, and then stream it. I think getting from FireWire into FMLE is straightforward. How can I stream it? How can I give my clients (say on mobile, or in any player, as an HTTP streaming URL) the stream? Do I need to use Flash Media Server, and if so, can anybody point out the steps to set it up?
I am using FMLE and FMS 4 Enterprise. When using FMLE to broadcast a webcam without audio, it is really fast. I am on a 1 Gbps connection with nothing else going to the server at the moment. I tested download speeds to and from the server and they are around 700 Mbps. I have tried all the different audio sample and bit rate settings; all have the same result. Once audio is turned on, there is a 15-second delay. What am I doing wrong?
I read that it is best to de-interlace video "at the hardware level" before sending it to FMLE 3.2 for encoding. I have looked and looked for software or a solution but can't find anything that works for me. Is it best to de-interlace before sending the video to FMLE? I am streaming live video that needs de-interlacing before being sent to our site. I know there is a "de-interlace" option in FMLE 3.2, but I read that it is better to de-interlace the signal before sending it to FMLE. I found something called DScaler but don't know how to get the video from DScaler into FMLE 3.2.
I modified the main.asc script in livepkgr. It used to call s.play(streamObj.name), which I assume is the code that plays the stream coming from FMLE; I changed it to s.play("FileName"), which is an FLV video file in my application. Unfortunately, when I start the FMLE stream to FMS and watch the stream with the FMS sample video player, I was expecting to see the video content from the FLV, but instead the camera footage is playing in the player.
I am very confused now. I thought s.play() was the only code that controls what plays in the application. Or, when FMLE is connected to the server, must the server play the footage from FMLE via some hidden code, no matter what s.play() is told to play?
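As far as I understand it, a Stream object created in main.asc is its own, separately named stream: playing a file into it does not replace the stream FMLE is publishing, and the sample player keeps showing the camera because it subscribes to the published name directly. A generic sketch (not livepkgr's actual code; names made up):

    application.onPublish = function(client, stream) {
        var out = Stream.get("switched");        // a separate, server-created stream
        if (out) {
            out.play("FileName", 0, -1, true);   // the FLV from the question
        }
    };
    // A player that does ns.play(<the FMLE publish name>) keeps showing the
    // camera feed; it has to call ns.play("switched") to see the file above.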
As seen in the tutorial [URL].., I have successfully implemented the steps. However, the problem comes when the stream is disconnected and reconnected again due to network problems. Once the stream is reconnected, the video stops coming; even though all 3 streams are successfully publishing to Flash Media Server, on the client side, where I'm using the OSMF media player, the stream stops. Is there any setting that I need to implement in the OSMF player to resolve this issue?
I now have a Windows PC, and I purchase the Adobe Flash Media Server 3.5 software and install it. I buy server space on a managed server arrangement. I upload my pre-recorded Flash clips to the server, and my website and playlist to my host. Basically the website is channelled via HTTP and the clips via RTMP. The key question that no one inside Adobe seems to be able to grasp (or answer) is: will it work if I stream my own clips, or is it inferior to, say, that of a professional streaming company? The streaming companies charge anything from $125 to $250 a month, a big cost for someone like me, streaming around 1000 MB a month. So does anyone out there have direct knowledge of this software, and can you stream via rented server space without using a streaming company?
The auth_addin works as expected when using Flash Media Live Encoder, requiring a username and password before publishing a stream. However, tools like Wirecast, and even the sample broadcasting SWF that comes on the FMS introduction page, can publish a stream without providing any credentials!
I have developed an application for recording movies in MP4 format. I am able to record an MP4 file, but my recorded file does not play in Windows Media Player.
So, if anybody has an idea or suggestion about playing a pre-recorded MP4 file in Windows Media Player, you are most welcome.
I wrote a simple web app that records webcam capture to FMS. Now I'm trying to find out how I can fetch the recorded file and give the user the option to download it to a local computer.
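One approach, sketched below (AS3; the URL and downloadButton are assumed to exist in the app): expose the application's streams folder, or a copy of it, through a web server, and hand the file to the browser with FileReference.download(), which has to be called from a user-initiated event such as a click:

    import flash.net.FileReference;
    import flash.net.URLRequest;
    import flash.events.MouseEvent;

    var fileRef:FileReference = new FileReference();

    downloadButton.addEventListener(MouseEvent.CLICK, function(e:MouseEvent):void {
        var url:String = "http://example.com/recordings/webcamTake1.flv";     // hypothetical
        fileRef.download(new URLRequest(url), "webcamTake1.flv");             // must run in a click handler
    });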
I have a problem with a recorded stream: the metadata is only partially correct. Most of the information is correct, but the two most important pieces for me, the width and height of the video, are always 320 x 240 for any recorded stream, although the streams are published at different dimensions. I have tried to get my head around this but can't seem to figure out where the problem is. FMLE is used to publish the stream, using the H.264 / MP3 codecs. Here is the server-side code used to record the stream: [code]....
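One commonly suggested workaround, sketched below with placeholder values (the "rec_" prefix and the 640x480 dimensions are made up), is to write a data keyframe carrying the real dimensions into the recording before copying the live stream into it; whether it fits the existing recording code depends on how that code is structured:

    application.onPublish = function(client, stream) {
        var rec = Stream.get("rec_" + stream.name);
        if (rec) {
            rec.record();
            rec.send("@setDataFrame", "onMetaData", {width: 640, height: 480});
            rec.play(stream.name, -1, -1);
        }
    };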
I'm creating a Flex application that enables users to communicate in a kind of video chat room. I only allow two live streams at any given moment. I'm using FMS to record the videos as FLVs. What I need to do is post-process those videos and create a single video with everything that happened in the video room. That means I need to combine the 2 videos and reduce the size of one of them so it appears over the other one; in other words, one video will use the whole space and the other will appear at the upper left corner (over the other video).
I need a server-side solution for this. I created another post here and someone suggested a screen-capture solution. That would work perfectly, but since this application will be used by several people at any time, I want the process to be automatic, not manual. So I'm thinking I can do it with FFmpeg, but I need a way to identify in which order I need to compose the videos, something like adding a timestamp to each FLV in FMS (when they are recorded). How can I add custom metadata to the FLV?
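One way to get a timestamp into each file, sketched below with made-up stream and handler names: if the FLVs are recorded server-side through a Stream object, a data message sent at record time ends up in the file as a script-data tag, which the post-processing step can then read to compute the offset between the two recordings:

    application.onPublish = function(client, stream) {
        var rec = Stream.get("room_" + stream.name);
        if (rec) {
            rec.record();
            rec.send("onRecordingStart", {startTime: (new Date()).getTime()});
            rec.play(stream.name, -1, -1);
        }
    };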
There seems to be great documentation on multicasting live streams, but I can't find anything about multicasting recorded streams. Is there documentation, or any other source of info, on multicasting recorded streams?
I am trying to record a video chat being done through FMS. I have made my program start recording only when the second stream publishes. Then I stop recording when either stream stops publishing. In my last test I recorded a 45-minute chat. The resulting FLV files show that there is a 13-second difference between them, and the video gets more out of sync as it goes along. My first guess is that it has something to do with dropped frames. Is there any way to force FMS to fill in dropped frames? I'm posting my code for starting and stopping recording:
First, I am working on a project to create a website with streaming video. The connection to the host is bad, and I need to know if Flash Media Server can reconnect if the connection is lost during playback. Second, is there some kind of plugin for subtitles? This is very important. If there are positive answers to those two questions, the third would be: where can I buy it and how?