Media Server :: Multiple Audio Streams Mixed With A Video Stream
Apr 20, 2011
I have read in the FMIS 4 "new features" that "absolute timecode" allows switching audio tracks while playing a video (managing sync). Is there any example showing how to use this functionality (server config, Flash Player ActionScript example)?
I need to merge multiple live audio streams into a single stream so that I can pass this stream as input to VoIP through a softphone. For this I tried the following approach: I created a new stream (str1) on FMS in onAppStart and recorded the live streams (sent through the microphone) into that new stream.
Below is the code:

    application.onAppStart = function() {
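A sketch of what the complete handler might look like in Server-Side ActionScript, assuming the microphone feed is published under the hypothetical name "micstream". One caveat worth stating up front: Stream.play() chains sources one after another, so FMS records them sequentially rather than mixing simultaneous audio into one track; a true mix would need to happen before the audio reaches FMS.

    application.onAppStart = function() {
        // Create a server-side stream and record everything played into it.
        var str1 = Stream.get("str1");
        if (str1) {
            str1.record();                        // writes str1.flv into the streams folder
            str1.play("micstream", -1, -1, true); // start = -1 requests the live feed
        }
    };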
I did a live stream last week using 282, 482, 832, and 1500 Kbps streams. What would cause the audio to get out of sync with the live video stream? I'm trying to determine whether it was bandwidth related, a CPU/memory issue on the FMIS 4.5 server, or an issue with the encoding PC exceeding its limits.
I currently have two connections with two separate streams. They both hit the same FMS 3.5 server. One connection carries live audio and video; the other is used for remote shared objects. Sometimes when viewing the audio and video stream over a slower internet connection, the shared object connection disconnects. I think it is a bandwidth issue. Is there any way to set the priority of the streams? That should let me give the shared object connection a higher priority so it won't disconnect.
I am using Flash Media Server 4.5, and I read in the tutorial that if I want to stream a live feed, I may need to use Flash Media Live Encoder. But what I found is that in the encoder I have to set everything up manually, and it only supports camera devices. In my case, multiple video files keep arriving from another program and are placed on the server's file system. My goal is to use Flash Media Server to perform a live broadcast with these video files one by one: when clients watch the live stream, they should not notice that the server is playing mov1, then mov2, then mov3, then mov4, and so on.
You can imagine I am trying to broadcast live footage for, say, 60 seconds, but the video file is not recorded in its entirety after 60 seconds; instead, every 10 seconds I save a new video file. So when a client watches the live stream over HLS [URL]: when the time reaches 10 seconds, a mov1 video file is available and FMS should broadcast this video on live123; when the time reaches 20 seconds, a mov2 video file is available and FMS should immediately follow the mov1 broadcast on live123; and so on. Also, can FMS dynamically create a new streaming session (invoked by code), so that when client A uploads some video files to the server, FMS opens a new streaming session that streams only client A's video files? The broadcast configuration, like screen size, bit rate, etc., should be pre-defined on the server. [URL]
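A minimal Server-Side ActionScript sketch of this kind of chained playback, assuming the files land in the application's streams folder; the file and stream names are placeholders. Calling Stream.play() with reset set to false appends a source to the server-side stream instead of replacing it, so subscribers to "live123" see one continuous feed.

    application.onAppStart = function() {
        var live = Stream.get("live123");
        if (live) {
            live.play("mp4:mov1.mp4", 0, -1, true);  // start the feed with the first file
            live.play("mp4:mov2.mp4", 0, -1, false); // reset=false queues the next segment
            live.play("mp4:mov3.mp4", 0, -1, false);
        }
    };

For files that arrive over time, the same play() call with reset=false can be issued from a server-side method that the uploader triggers via NetConnection.call(), and a per-client stream name would give each uploader their own session.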
I've had FMS running on my local machine for a while and have a little experience writing FMS apps, but I've just tried recording audio for the first time using the standard vod application and I keep getting a "Write access denied for stream" error. My AS3 code is copied and pasted from various examples, and I am confident that it works.
I'm running Windows XP Service Pack 3 and FMIS 3.5.
I've had a look at the vod/media directory, and under Windows -> Properties the read-only attribute is ticked. Every time I un-tick it, it reverts to being ticked. I've googled this, and Microsoft says that most programs ignore the read-only attribute and that it only really applies to files. I've also tried the Microsoft fix for clearing the read-only attribute via cmd and still no joy (it doesn't clear the read-only attribute, and FMS still won't record the audio after running it).

I've also tried our dev server install of FMS (running under Linux) and am getting the same results.
Here's my AS3 code...
    private function initApp(event:Event):void {
        removeEventListener(Event.ADDED_TO_STAGE, initApp);
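For comparison, a minimal AS3 recording sketch, assuming a microphone-only publish; the application and stream names are placeholders. One thing worth noting: the stock vod application is a playback-only service, which by itself would explain a "Write access denied for stream" error, so recording normally targets an application that allows publishing.

    import flash.events.NetStatusEvent;
    import flash.media.Microphone;
    import flash.net.NetConnection;
    import flash.net.NetStream;

    var nc:NetConnection = new NetConnection();
    nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
    nc.connect("rtmp://localhost/myRecorder"); // an app that permits publishing

    function onStatus(e:NetStatusEvent):void {
        if (e.info.code == "NetConnection.Connect.Success") {
            var ns:NetStream = new NetStream(nc);
            ns.attachAudio(Microphone.getMicrophone());
            ns.publish("myRecording", "record"); // server writes myRecording.flv
        }
    }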
I was wondering: is it possible to manipulate the video streams in any way on the server side? For example, to take two streams coming from two clients and mix them into one stream, so that a third client (or more) can play just one stream per client instead of two?
Can Flash Media Server 3.5 do the following? Can it take multiple live streams? Is it possible to control the IP streams by using APIs in the Media Server?
I am developing an application where I need musicians to play together, and I want to have an audience listen to it. I have created my publisher clients and they can create the streams, but now I want to take those multiple streams and combine them so that you can hear the guitar, singer, etc., all together.
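FMS itself does not mix streams, so one approach is to do the mixing in the audience player: subscribe to every musician's stream and let Flash Player sum the decoded audio. A minimal sketch, with the application and stream names as placeholders:

    import flash.events.NetStatusEvent;
    import flash.net.NetConnection;
    import flash.net.NetStream;

    var nc:NetConnection = new NetConnection();
    nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
    nc.connect("rtmp://yourserver/bandapp");

    function onStatus(e:NetStatusEvent):void {
        if (e.info.code == "NetConnection.Connect.Success") {
            // One NetStream per musician; their audio mixes in the player.
            for each (var name:String in ["guitar", "vocals", "drums"]) {
                var ns:NetStream = new NetStream(nc);
                ns.client = { onMetaData: function(md:Object):void {} };
                ns.play(name);
            }
        }
    }

The trade-off is that each listener downloads every stream; delivering a single combined stream would require mixing before the audio reaches FMS, for example at an encoder fed by a hardware or software mixer.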
My CDN is running FMS, and my customers and I stream a lot of live traffic through it. We've got FMS configured to auto-archive our live streams, but that only works for streams using VP6 video. When I look at the archived files captured from H.264 streams, I get audio but no video.

Is it possible to get FMS to auto-archive H.264 streams? Or is this planned for a future release of FMS?
I'm trying to set up an audio-only, on-demand HLS stream in FMS 3.5. I have no problems streaming the sample f4v files via HLS, nor do I have any issues streaming the mp3 files via RTMP to a Flash client. However, when I try to stream a sample mp3 via HLS (the mp3 file is located in the same directory as the sample f4vs), I get a 404 error. I can't find anything in the documentation about streaming audio via HLS on demand.
I'm trying to make software that sends video and audio data to a Flash Media Server using the RTMP protocol. Currently, my program can communicate with a Flash Media Server correctly. The RTMP specification does not describe the raw data in video/audio messages, so I muxed raw H.264 and AAC data into video/audio messages and sent them to the server. The server seems to accept them, but a video player cannot play back the stream sent from the server; the player just says "Loading..." For test purposes, I sniffed the network packets between Wirecast and the Flash Media Server and ripped out only the video and audio data. Then I muxed those data into video/audio messages and sent them to the Flash Media Server. In this case, the video player connected to the server can play back the stream correctly.

I checked the stream sent by Wirecast, and it does not seem to be raw H.264 data, because the payloads begin with 0x17 rather than an H.264 start code. Given this situation, I am wondering what kind of container format I should use for H.264/AAC data sent to the Flash Media Server.
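What Wirecast sends matches the FLV tag body layout, which is also what RTMP video and audio messages carry; that leading 0x17 is the FLV VideoTagHeader byte (frame type 1 = keyframe in the high nibble, codec ID 7 = AVC in the low nibble). A sketch of that layout in AS3 ByteArray form, offered as a description of the format rather than a tested muxer:

    import flash.utils.ByteArray;

    // Build an H.264 video message body per the FLV spec.
    function buildAvcBody(nalus:ByteArray, keyframe:Boolean, ctsMs:int):ByteArray {
        var body:ByteArray = new ByteArray();
        body.writeByte(keyframe ? 0x17 : 0x27); // frame type nibble + codec ID 7 (AVC)
        body.writeByte(1);                      // AVCPacketType: 1 = NALUs, 0 = sequence header
        body.writeByte((ctsMs >> 16) & 0xFF);   // 24-bit composition time offset
        body.writeByte((ctsMs >> 8) & 0xFF);
        body.writeByte(ctsMs & 0xFF);
        body.writeBytes(nalus);                 // AVCC form: length-prefixed NALUs, no start codes
        return body;
    }

Before any frames, the stream needs a sequence-header message (first bytes 0x17 0x00) carrying the AVCDecoderConfigurationRecord built from the SPS/PPS. AAC works the same way: an 0xAF 0x00 message carrying the AudioSpecificConfig, followed by 0xAF 0x01 messages carrying raw AAC frames.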
I have Streaming Server 3.5.3. The sample player page for dynamic streaming is here: [URL]. Click on the dynamic sample, pause, and wait a few minutes to be disconnected from the server. When you hit play, it starts from that spot but also starts the audio from the beginning in the background, while showing the orange buffering circle the whole time.

My users are students watching long lectures; they pause all the time. When they come back, it is a mess.
I'm trying to live stream with FMS. I can stream video with Flash Media Live Encoder to the server, but when I create the player to receive the live stream from the server, I cannot receive it. Can anyone give me a step-by-step tutorial on how to do it?
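A minimal frame-script-style playback sketch for the subscriber side, assuming FMLE is publishing to the default live application under the stream name "livestream" (both names must match whatever was entered in FMLE's FMS URL and Stream fields):

    import flash.events.NetStatusEvent;
    import flash.media.Video;
    import flash.net.NetConnection;
    import flash.net.NetStream;

    var video:Video = new Video(640, 480);
    addChild(video);

    var nc:NetConnection = new NetConnection();
    nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
    nc.connect("rtmp://yourserver/live");

    function onStatus(e:NetStatusEvent):void {
        if (e.info.code == "NetConnection.Connect.Success") {
            var ns:NetStream = new NetStream(nc);
            ns.client = { onMetaData: function(md:Object):void {} }; // absorb metadata callbacks
            video.attachNetStream(ns);
            ns.play("livestream"); // must match FMLE's Stream field
        }
    }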
"To serve streams over a cellular network, one of the streams must be audio-only. For more information, see HTTP Live Streaming Overview.To publish an audio-only stream, enter the following in the Flash Media Encoder Stream field:livestream%i?adbe-live-event=liveevent&adbe-audio-stream-name=livestream1_audio_only&adbe-audio-stream-src=livestream1If the encoder specifies individual query strings for each stream, use individual stream names instead of the variable %i:livestream1?adbe-live-event=liveevent&adbe-audio-stream-name=livestream1_audio_onlylivestream2?adbe-live-event=liveevent&adbe-audio-stream-name=livestream2_audio_onlyTo generate a set-level variant playlist when using an audio-only stream, specify the audio codec of the audio-nly stream. Specify the audio and the video codec of the streams that contain audio and video.For more information about using the Set-level F4M/M3U8 File Generator, see Publish and play live multi-bitrate streams over HTTP.
I'm trying to build some kind of online radio, so I'll be streaming audio (mp3) in real time. I have a playlist on my server, and the idea is for this playlist to play all day long, while on the client side the user can only play and stop the music.

The problem is that when one song stops and the next one starts, I need the server to send not only the new audio but also all the info about the file: duration, author, and song name. This way my player can show play/stop controls and all the info about the song currently playing. Until now I've only come across live video streaming, and very little information on how to do a live transmission of audio.

PS: the only info on streaming audio that I've found so far is about audio on demand, but the way I want things to happen, the client can't choose the music.
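A Server-Side ActionScript sketch of this kind of radio, as a rough illustration rather than a tested app. It assumes the mp3 files sit in the application's streams folder, that their durations are known ahead of time, and that the player's NetStream.client defines an onTrackInfo() handler to receive the metadata pushed with Stream.send():

    var tracks = [
        {file: "mp3:song1", title: "Song One", author: "Artist A", duration: 215},
        {file: "mp3:song2", title: "Song Two", author: "Artist B", duration: 180}
    ];
    var radio;
    var current = 0;
    var timer = null;

    application.onAppStart = function() {
        radio = Stream.get("radio");
        if (radio) playTrack(0);
    };

    function playTrack(i) {
        current = i % tracks.length;            // loop the playlist all day
        var t = tracks[current];
        radio.play(t.file, 0, t.duration, true);
        radio.send("onTrackInfo", t);           // subscribers implement onTrackInfo()
        if (timer != null) clearInterval(timer);
        timer = setInterval(function() { playTrack(current + 1); }, t.duration * 1000);
    }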
I ran a simple live video streaming application for the first time with actual users and ran into a couple of serious performance issues that had not turned up during testing. In this instance there was one video stream from a live webcam, using FMLE at 150 kbps with VP6 and MP3 at 22 kHz. There were 16 clients, and everything worked pretty well for about 30 minutes (although some clients said their audio and video were out of sync by up to 3 seconds).

Then individual clients would have either the video freeze, or the video would continue and the audio would stop. These clients had to disconnect and then reconnect to the application. This happened to all of the clients at one time or another over several minutes. I stopped and restarted FMLE with progressively lower bandwidth settings, down to 75 kbps, but clients were still having the same issue.

I eventually stopped FMLE and used the application's built-in publisher at 45 kbps, and that seemed to eliminate the freezing/dropping issue. But of course the video quality was very poor, and some clients still reported that the audio was out of sync with the video. The server hosting the FMS application is a quad-processor Dell with lots of memory and network connectivity. The Flash Media Admin Console performance graph showed the total bandwidth as 3 Mbps at maximum.
I need to improve the uptime of a 24/7 live audio stream. Periodically, the connection between FMLE and FMS (3.0.1) is broken and FMLE cannot reconnect; FMLE shows a NetStream.Publish.BadName error. How do I prevent this error and get FMLE to reconnect automatically every time? What other measures can be taken to improve the uptime of a live audio stream published from FMLE 3.1 (Windows) to FMS 3.0.1? Also, are there any improvements in FMS 3.5 that will help with uptime?
I am using FMLE and FMS 4 Enterprise. When I use FMLE to broadcast a webcam without audio, it goes really fast. I am on a 1 Gbps connection with nothing else hitting the server at the moment; I tested download speeds to and from the server and they are around 700 Mbps. I have tried all the different audio sample and bit rate settings, all with the same result: once audio is turned on, there is a 15-second delay. What am I doing wrong?
I have read that you can stream protected audio/video to iOS-based devices with the new FMS 4.5. Is it also possible to stream protected A/V to an HTML5/JS-based player?
Hi, I installed Flash Media Server 3.0.4 on a Windows 2003 server. I am using the default live application and broadcasting to it using FME version 3.0. Inside FME I chose MP3 as the audio codec, stereo, at 128 kbps. When I watch the stream, everything works fine, but after a minute (sometimes two) the sound disappears and I can only see the video run. If I refresh the page that contains the SWF that plays the stream, the sound works again, but then after another minute the sound stops and only the video continues to play. It is important to say that this only happens when I play a live stream from a server located outside of my local LAN. If I play a stream from my development FMS server in my local LAN, this problem doesn't happen, using the same SWF and the same settings in FME. Moreover, if I use the Nellymoser audio codec, everything works fine and this problem doesn't happen either. The thing is that I want to broadcast a live event, and it is very important that the audio sounds good (stereo at 192 kbps), and if I use the Nellymoser codec the results aren't that good.
I'd like to capture the audio data from an RTMFP stream to which the client is subscribed (so I get a ByteArray of audio samples). The presence of the audioSampleAccess property on the NetStream class certainly makes that sound possible: "For RTMFP connections, specifies whether peer-to-peer subscribers on this NetStream are allowed to capture the audio stream. When FALSE, subscriber attempts to capture the audio stream show permission errors." [code] But in the case of audio, I don't know how to address the audio data to get it into a ByteArray. My instinct said this wasn't possible, but the presence of the audioSampleAccess property makes me think it might be.
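One reading of what audioSampleAccess unlocks, offered as an assumption to verify rather than documented behavior: once the publisher sets it to true, a subscriber can capture the audio with SoundMixer.computeSpectrum(), which fills a ByteArray with samples and is otherwise blocked for protected streams. A sketch:

    import flash.media.SoundMixer;
    import flash.utils.ByteArray;

    var samples:ByteArray = new ByteArray();
    // false = raw time-domain samples rather than an FFT;
    // fills 512 floats (256 left channel, then 256 right).
    SoundMixer.computeSpectrum(samples, false);
    trace(samples.readFloat()); // first left-channel sample, in [-1, 1]

The limitation is that computeSpectrum samples the whole mixer output, so anything else playing at the same time ends up in the capture.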
I currently use FMS 4.5 to stream live technical videos.

Hooked into Flash Media Live Encoder is a Roland VR-5, which lets me switch between multiple cameras.
I also want to broadcast a few live audio-only streams. I could easily use the VR-5, which in turn has a Behringer audio mixer attached to allow up to 16 audio devices to be mixed in, and then just turn off the video option in Flash Media Live Encoder, but:

I don't want to have to put a video window on the web page. I just want an audio controller.

So how can I use Flash Media Server to stream just a live audio feed?
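One sketch of the player side, under the assumption that an FMLE publish with video unchecked otherwise works as usual: a NetStream that is never attached to a Video object plays just the sound, so the page only needs play/stop controls. The names below are placeholders.

    import flash.events.NetStatusEvent;
    import flash.net.NetConnection;
    import flash.net.NetStream;

    var nc:NetConnection = new NetConnection();
    nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
    nc.connect("rtmp://yourserver/live");

    var ns:NetStream;

    function onStatus(e:NetStatusEvent):void {
        if (e.info.code == "NetConnection.Connect.Success") {
            ns = new NetStream(nc);
            ns.client = { onMetaData: function(md:Object):void {} };
            ns.play("audiofeed"); // no Video attached, so audio only
        }
    }

    function stopAudio():void { // wire this to the page's stop button
        if (ns) ns.close();
    }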
Hi, I'm trying to use NetStreamPlayTransitions.SWITCH to create a multi-angle view that switches between video streams. The issue I'm having is that NetStream.Play.TransitionComplete fires only after the buffer of the previous video is used up (this makes sense when using SWITCH to move between bandwidths, but that's not what I'm using it for). Is there a way to force the switch before the buffer of the previous video is used up?

I've looked into SWAP, but I can't really find any documentation on it. What I would ideally have happen is that the next video in the array is triggered, that video is buffered, and when there is enough to play, the stream switches to it. SWITCH works really nicely because there is no jump when it plays, but I just don't want the buffer to play out before the switch. Is there a way of perhaps clearing the buffer of the playing video before I call SWITCH, so it transitions quickly?
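For reference, a sketch of what a SWAP call looks like via play2(), with the angle names as placeholders. Adobe describes SWAP as replacing the old stream's not-yet-played buffer contents, which is the immediate-transition behavior being asked about, though it is documented mainly for streams queued in a playlist, so it is worth testing against a currently playing stream:

    import flash.net.NetStream;
    import flash.net.NetStreamPlayOptions;
    import flash.net.NetStreamPlayTransitions;

    function swapTo(ns:NetStream, fromAngle:String, toAngle:String):void {
        var opts:NetStreamPlayOptions = new NetStreamPlayOptions();
        opts.oldStreamName = fromAngle;            // stream currently playing
        opts.streamName = toAngle;                 // stream to cut to
        opts.transition = NetStreamPlayTransitions.SWAP;
        ns.play2(opts);
    }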
As titled, what is the way to record video/audio files using Flash Media Server through RTMP, and then allow users to access the recorded files through HTTP? What I am trying to do is record a user's microphone input and save it to the server. Afterwards, I would like other users to be able to access the recorded files and manipulate the audio data with computeSpectrum() to do some visualization of the audio. As I understand it, computeSpectrum() cannot work on streaming files, so I think I need to access the recorded files using HTTP instead of RTMP. Is that true?
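A sketch of the HTTP half, under two labeled assumptions: that the FLV recorded by FMS has been converted to an mp3 in an external step (FMS records FLV/F4V, which the Sound class cannot load), and that the web server hosts a crossdomain.xml permitting access. computeSpectrum() then works, because HTTP-loaded Sound objects are not blocked the way RTMP NetStream audio is by default.

    import flash.media.Sound;
    import flash.media.SoundMixer;
    import flash.net.URLRequest;
    import flash.utils.ByteArray;

    // Hypothetical URL for the converted recording.
    var snd:Sound = new Sound(new URLRequest("http://yourserver/recordings/take1.mp3"));
    snd.play();

    function sampleSpectrum():void { // call on ENTER_FRAME while playing
        var bytes:ByteArray = new ByteArray();
        SoundMixer.computeSpectrum(bytes, true); // true = FFT magnitudes
        // bytes now holds 512 floats: 256 per channel, ready to draw
    }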