I want to set the bit rate of audio and video (48, 96, 128 kbps) at the time of live streaming. I'm not using Flash Media Live Encoder because I have to build this as a web application. Can anyone tell me how to set the bit rate?
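In case it helps, here is a minimal ActionScript 3 sketch of how the web client itself can cap the outgoing bit rates before publishing; the values and stream name are placeholders, and note that Camera.setQuality() takes bytes per second, so 128 kbps works out to 16,000 bytes/sec.

    import flash.media.Camera;
    import flash.media.Microphone;
    import flash.media.SoundCodec;

    var cam:Camera = Camera.getCamera();
    cam.setMode(320, 240, 15);              // capture size and frame rate
    cam.setQuality(128 * 1000 / 8, 0);      // cap video at ~128 kbps (bandwidth is in bytes/sec), let quality vary

    var mic:Microphone = Microphone.getMicrophone();
    mic.codec = SoundCodec.SPEEX;           // Speex exposes a quality/bit-rate trade-off
    mic.encodeQuality = 6;                  // mid-range audio bit rate
    mic.rate = 22;                          // 22 kHz sampling

    // later, once the NetConnection is up:
    // ns.attachCamera(cam); ns.attachAudio(mic); ns.publish("livestream");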
I'm trying to make software which sends video and audio data to a Flash Media Server using the RTMP protocol. Currently, my program can communicate with a Flash Media Server correctly. The RTMP specification does not describe the raw data inside video/audio messages, so I muxed raw H.264 and AAC data into video/audio messages and sent them to the server. The server seems to accept them, but a video player cannot play back the stream sent from the server; the player just says "Loading..." As a test, I sniffed the network packets between Wirecast and the Flash Media Server and extracted only the video and audio data. Then I muxed those data into video/audio messages and sent them to the Flash Media Server. In this case, the video player connected to the server can play back the stream correctly.
I checked the stream sent from Wirecast; it does not seem to be raw H.264 data, because the payload starts with 0x17 instead of an H.264 start code. Given this situation, I am wondering what kind of container format I should use for the H.264/AAC data I send to the Flash Media Server.
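For what it's worth, RTMP video/audio message bodies carry FLV-tag-style payloads rather than raw elementary streams; the sketch below shows the byte layout in ActionScript 3 (the helper name is my own, and the NAL units must be in AVCC length-prefixed form, not Annex B start codes).

    import flash.utils.ByteArray;

    // Body of one RTMP video message carrying an H.264 access unit.
    function buildAvcBody(nalus:ByteArray, isKeyframe:Boolean, compositionTime:int):ByteArray {
        var body:ByteArray = new ByteArray();
        body.writeByte(isKeyframe ? 0x17 : 0x27);       // frame type (1=key, 2=inter) + codec id 7 (AVC)
        body.writeByte(0x01);                           // AVCPacketType 1 = NALU
        body.writeByte((compositionTime >> 16) & 0xFF); // 24-bit composition time offset
        body.writeByte((compositionTime >> 8) & 0xFF);
        body.writeByte(compositionTime & 0xFF);
        body.writeBytes(nalus);                         // length-prefixed NAL units (AVCC)
        return body;
    }

    // Before any frames, send a sequence header: 0x17 0x00 0x00 0x00 0x00 followed by the
    // AVCDecoderConfigurationRecord (SPS/PPS). Audio bodies start with 0xAF, then an
    // AACPacketType byte: 0 = AudioSpecificConfig, 1 = raw AAC frame (no ADTS header).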
I've searched quite a few threads: [code] But all of them are talking about playing multiple-bit-rate live video. My question is: how can I publish multiple-bit-rate live video in the first place?
I ran a simple live video streaming application for the first time with actual users and ran into a couple of serious performance issues that had not turned up during testing. In this instance there was one video stream from a live webcam, encoded with FMLE at 150 kbps using VP6 and MP3 @ 22 kHz. There were 16 clients and everything worked pretty well for about 30 minutes (although some clients said their audio and video were out of sync by up to 3 seconds).
Then individual clients would have either the video freeze, or the video would continue and the audio would stop. These clients had to "disconnect" and then "connect" again to the application. This happened to all of the clients at one time or another over several minutes. I stopped and restarted FMLE with progressively lower bandwidth settings, down to 75 kbps, but clients were still having the same issue.
I eventually stopped FMLE and used the application's built-in publisher at 45 kbps, and that seemed to eliminate the freezing/dropping issue. But of course the video quality was very poor, and some clients still reported that the audio was out of sync with the video. The server hosting the FMS application is a quad-processor Dell with lots of memory and network connectivity. The Flash Media Admin Console performance graph showed the total bandwidth as 3 Mbps at maximum.
As titled, what is the way to record video/audio files using Flash Media Server through RTMP, and then allow users to access the recorded files through HTTP? What I am trying to do is record a user's microphone input and save it to the server. Afterwards, I would like other users to be able to access the recorded files and manipulate the audio data with computeSpectrum() to do some visualization of the audio. As far as I know, computeSpectrum() cannot work on streaming files, so I think I need to access the recorded files using HTTP instead of RTMP. Is that true?
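As a rough sketch of the recording half (the application and stream names are just examples), publishing with the "record" flag makes FMS write the microphone input to an FLV on the server; how that file is later exposed over HTTP is a separate web-server concern.

    import flash.events.NetStatusEvent;
    import flash.media.Microphone;
    import flash.net.NetConnection;
    import flash.net.NetStream;

    var nc:NetConnection = new NetConnection();
    nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
    nc.connect("rtmp://myserver/audiorecorder");   // hypothetical FMS application name

    function onStatus(e:NetStatusEvent):void {
        if (e.info.code == "NetConnection.Connect.Success") {
            var ns:NetStream = new NetStream(nc);
            ns.attachAudio(Microphone.getMicrophone());
            ns.publish("userclip", "record");      // written as userclip.flv in the app's streams folder
        }
    }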
We use a Flash component to allow a user on our web site to record from a webcam onto our own Flash Media Server. The problem we are having is that the video in a 30-second FLV freezes at the 7th second, but the audio continues. The video unfreezes after a couple of seconds but never catches up with the audio. At the very end there's a "fast-forwarding" of the video for the last few seconds, so that at literally the last moment everything's in sync. This happens for almost all of our recordings. Has anyone experienced this type of behavior?
I have been running FMIS 3.5 since April of this year. In August, we started recording conferences, but I'm not sure that this is the issue. We are under very low load at the moment. We never have more than one conference going simultaneously. Our server is spec'd with a 64-bit operating system (MS08), 24 GB RAM, and 24 processors. I notice that in some of our conferences, the video and audio freeze for 5 to 10 seconds. Then they recover and resume where they left off. When this occurs, it recurs every one or two minutes, which is very disruptive. People have to constantly repeat themselves.
FMIS is configured with the defaults out of the box, with the exception that we are recording live video and audio. How do I solve a problem like this? Is it Internet latency? Is it our internal network? Is it FMIS 3.5? I'm not sure where to start. Now I've inherited much of the Flex 3 code that is used in the stream and player Flash components. I notice in the player component that the original programmer is using a timer to monitor the mx.events.VideoEvent.STATE_CHANGE event. As long as the event fires, he resets the timer. But if 10 seconds expire and the STATE_CHANGE event has not fired, he restarts the player. Is this a valid methodology? Here is the code fragment.
<mx:Script><![CDATA[
    var lastUpdate : Number = 0;
    private var pulse : Timer = null;

    private function onInit() : void {
        viewVideo.addEventListener(VideoEvent.STATE_CHANGE, function() : void {
            if (!lastUpdate) {
                pulse.start();
            }
[code]....
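For comparison, here is how I read the watchdog pattern described above as a self-contained sketch (viewVideo comes from the question; restartPlayer() is an assumed helper standing in for whatever the inherited code does to restart playback):

    import flash.events.TimerEvent;
    import flash.utils.Timer;
    import mx.events.VideoEvent;

    private var pulse:Timer = new Timer(10000, 1);   // 10-second watchdog

    private function onInit():void {
        pulse.addEventListener(TimerEvent.TIMER_COMPLETE, onWatchdogExpired);
        viewVideo.addEventListener(VideoEvent.STATE_CHANGE, onStateChange);
        pulse.start();
    }

    private function onStateChange(event:VideoEvent):void {
        pulse.reset();     // any state change proves the player is still alive
        pulse.start();
    }

    private function onWatchdogExpired(event:TimerEvent):void {
        restartPlayer();   // assumed helper: stop and replay the stream
    }

As a watchdog the pattern itself is a common construction; whether restarting on a 10-second silence is the right recovery action is a separate question.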
I'm currently able to record audio and video (from a webcam) to a Flash Media Server. However, in some cases users have a webcam with no built-in microphone. In that case the Flash client uses the default microphone via 'Microphone.getMicrophone();' and possibly selects the PC's microphone.
A delay between audio and video occurs in cases with a separate webcam and microphone. There isn't much delay on an internal network (e.g. a LAN); however, there is a very large delay between audio and video on an external network (e.g. a WAN).
I have a site where a consumer can hold a live meeting with a beauty consultant. The problem I'm facing is during the video chat: there is a delay of 5 seconds between video and audio when I access the site from outside my network, but when I access it within my network (VPN) it works fine.
I'm creating a video chat application, and no matter what combination of Camera/Microphone/NetStream properties and functions I use, I cannot get high quality video/audio. I get occasional audio latency, pixelated video, occasional frozen video and the degree of each depends on the combination of properties/functions I set/call.
Others such as TokBox, TinyChat, Chat Roulette, etc. have achieved great video/audio quality with FMS; what is the secret? At least point me in the right direction, because right now I'm not impressed with FMS's ability to provide a good video/audio experience. BTW, I'm using a P2P mesh with a GroupSpecifier, not NetStream.DIRECT_CONNECTIONS.
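For reference, these are the client-side knobs I would start from; the specific values below are guesses to tune against your own mesh, not known-good settings from any of those services.

    import flash.media.Camera;
    import flash.media.Microphone;
    import flash.media.SoundCodec;

    var cam:Camera = Camera.getCamera();
    cam.setMode(320, 240, 20);          // keep the capture size modest for a multi-peer mesh
    cam.setQuality(0, 80);              // no bandwidth cap, hold picture quality at 80
    cam.setKeyFrameInterval(15);

    var mic:Microphone = Microphone.getMicrophone();
    mic.codec = SoundCodec.SPEEX;
    mic.encodeQuality = 8;
    mic.setSilenceLevel(0);             // keep audio flowing so it doesn't click in and out
    mic.setUseEchoSuppression(true);

    // On each receiving NetStream, a small or zero buffer keeps conversational latency down:
    // incomingStream.bufferTime = 0;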
I'm new to the Flash Media Server family. I need a server that can capture user-recorded video/audio (through a Flash recorder) and save it as an FLV file on the server. Which member of the server family is right for me? I'm currently doing this with Red5, but I would like to try out Flash Media Server as well.
We are using FMS 3.5 for VOD streaming. We have noticed that when using Flash Player version WIN 10.0.42.34/10,0,45,2, the video picture hangs but the audio keeps playing after our commercial break point. We first pause the video content and then use the same object to play the in-stream ad. After that, the video resumes when the in-stream ad completes.
We have one piece of functionality in which one user can communicate with another user at the same time through Flash Media Server 3.5. In this functionality we capture the stream from both ends through the camera and pass it to Flash Media Server. On the FMS side these streams get stored, and we also display the recorded FLV files side by side to the corresponding users so that they can communicate with each other. In this situation we are facing audio and video latency of up to 1.5 to 2 seconds.
I have this sample code from the e-book "Learning Flash Media Server 3". The purpose of the code is to create an FLV. The code is not running as it should: when I click the 'Record' button, the label is supposed to change to 'Recording', but that doesn't seem to happen, and when I click the 'Stop Record' button, I get the following error:
"TypeError: Error #1009: Cannot access a property or method of a null object reference.at MinRecord/stopRecord()"
[Code]....
My deduction is that the 'if' statement in the 'startRecord' function is not resolving to 'true', and hence the label is not changing to 'Recording'. I also suspect that the assignment to the variable 'ns' of type NetStream is not being made, and hence in the 'stopRecord' function I get the above-mentioned error message when I click the stop button. How do I rectify these problems?
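Based on that deduction, the usual shape of the fix is to create ns only after the NetConnection has actually connected and to guard stopRecord() against a null stream. A rough sketch follows; the nc, ns and recordButton names are assumed to match the book's MinRecord example.

    private function startRecord():void {
        if (nc.connected) {                       // only record once the NetConnection is up
            ns = new NetStream(nc);
            ns.attachCamera(Camera.getCamera());
            ns.attachAudio(Microphone.getMicrophone());
            ns.publish("minrecord", "record");
            recordButton.label = "Recording";
        }
    }

    private function stopRecord():void {
        if (ns != null) {                         // avoids TypeError #1009 when nothing was published
            ns.close();
            ns = null;
            recordButton.label = "Record";
        }
    }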
I'm trying to record a stream from a webcam using the FMS 4 developer edition. The streams are recorded on a Linux box in .flv files. These files have problems, though.
1 - The video quality is terrible. Fuzzy/blocky. Any kind of motion looks awful. Is there any way to improve this?
2 - The sound jumps/skips and goes out of time with the video.
As it is now, this is useless. I am trying to build a system where a user records a short video from the camera and then I upload it to YouTube from my server. If you use YouTube's webcam page, the video quality is quite acceptable and the audio is obviously in sync too, so they must know something I don't. [URL]..
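Two things I would try, sketched below with starting-point values rather than tested ones: set the capture quality explicitly instead of relying on defaults, and let the outgoing publish buffer drain before closing the stream so the tail of the audio/video isn't dropped.

    import flash.utils.setTimeout;

    // cam, mic and ns are the existing Camera, Microphone and publishing NetStream
    cam.setMode(640, 480, 25);
    cam.setQuality(0, 85);              // no bandwidth cap; hold picture quality at 85
    cam.setKeyFrameInterval(25);

    ns.bufferTime = 20;                 // buffer outgoing data if the uplink can't keep up
    ns.attachCamera(cam);
    ns.attachAudio(mic);
    ns.publish("clip", "record");

    // When the user hits stop, wait for the publish buffer to empty before closing:
    function stopWhenFlushed():void {
        if (ns.bufferLength > 0) {
            setTimeout(stopWhenFlushed, 250);
        } else {
            ns.close();
        }
    }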
I have read in the FMIS 4 "new features" that "absolute timecode" allows switching audio tracks while playing a video (managing sync). Is there any example showing how to use this functionality (server config, Flash Player ActionScript example)?
I did a live stream last week using 282, 482, 832, and 1500 Kbps streams. What would cause the audio to get out of sync with the live video stream? I'm trying to determine whether it was bandwidth related, a CPU/memory issue on the FMIS 4.5 server, or an issue with the encoding PC exceeding its limits.
I'm recording a webcam to FMS 3.5, but when I play the recorded video from FMS it's choppy. I have set camera.fps to 30, but when I trace out the camera's currentFPS it varies and usually falls between 20 and 30. However, when I play the recorded video, the NetStream's currentFPS returns a lower value, somewhere closer to 10. So my questions are: 1) Why is the NetStream's currentFPS not the same as the Camera's? 2) What's the actual frame rate of the recorded video (not the NetStream's, but the FLV's)?
I currently have two connections with two separate streams. They both hit the same FMS 3.5 server. One connection transfers live audio and video; the other one is used for remote objects. Sometimes, when viewing the audio and video stream on a slower internet connection, the stream for the shared objects disconnects. I think it is a bandwidth issue. Is there any way to set the priority of the streams? I think this would allow me to set a higher priority for the shared object connection so it won't disconnect.
When the number of concurrent users increases, my A/V content delivery to the clients becomes very slow or even drops, and there are breaks in the session. But when I check my server resources, they are almost free. One more observation: when I listen to the recorded stream, everything seems to be fine, as if nothing is going wrong in the live session.
Is there something missing for concurrent users in the FMS config file?
I found an interesting article: [URL] I read the second article, named "Chapter 5: Two-way audio-video communications" [URL] and I have copied the server-side code and the client-side code out of the e-book ("A Better Two-Way Chat Application").
I am planning to upgrade to FMIS v4. Currently, I am using v2 and encoding separate files for each bit rate (i.e. 100K, 400K, 700K). Can I encode all audiences into one file, as with Windows Media Player, have FMIS stream that file, and have the local Flash Player detect the correct version and stream it?
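As far as I know, FMS dynamic streaming still keeps one file per bit rate, and the switching happens client-side via NetStream.play2(); a minimal sketch of the player side is below (the stream names are made up, and the logic that decides when to switch is left out).

    import flash.net.NetStream;
    import flash.net.NetStreamPlayOptions;
    import flash.net.NetStreamPlayTransitions;

    var currentStream:String = "mp4:movie_100k";   // rendition currently playing on ns

    function switchTo(ns:NetStream, streamName:String):void {
        var opts:NetStreamPlayOptions = new NetStreamPlayOptions();
        opts.oldStreamName = currentStream;
        opts.streamName = streamName;                       // e.g. "mp4:movie_400k"
        opts.transition = NetStreamPlayTransitions.SWITCH;  // server switches without restarting playback
        ns.play2(opts);
        currentStream = streamName;
    }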
When I use the f4fpackager command-line tool, I supply the --bitrate value, but I know it in bps (say 500,000 bps), and f4fpackager wants it in kbps. So, should I divide by 1024 or 1000? Does that matter as far as the switching point for the client logic is concerned?
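For a sense of scale, and assuming the usual decimal convention for bit rates:

    500,000 bps / 1000 = 500 kbps
    500,000 bps / 1024 ≈ 488 kbps

The roughly 2% gap is small compared with the spacing between typical renditions, so it is unlikely to change which bit rate the client switches to.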