Our company has a Flex application (3.0.2) that is having a problem with audio dropping out in Windows 7. The problem appears worse with a wireless connection, but it can't be consistently reproduced there, either. The only thing known for sure is that the problem has not been seen on any other version of Windows.
The application thinks the audio is playing, because the event listeners that fire when the audio completes and when audio cue points are hit are still being triggered. The majority of our audio is streamed from Flash Media Server (3.0.2.201), but the problem has also been seen with embedded sound effects. In one instance, several clicks of a button triggered an error saying the sound effect could not be loaded, and then about a minute later all the sound effects played at once and the user could continue with the application.
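For reference, a minimal sketch of the two listener paths involved, assuming a loaded Sound for the effects and an FMS NetStream for the streamed audio; the file name, application URI, and stream name are illustrative:

    import flash.events.Event;
    import flash.events.NetStatusEvent;
    import flash.media.Sound;
    import flash.media.SoundChannel;
    import flash.net.NetConnection;
    import flash.net.NetStream;
    import flash.net.URLRequest;

    // Loaded sound effect: SOUND_COMPLETE fires on the channel when playback ends.
    var effect:Sound = new Sound(new URLRequest("click.mp3"));          // illustrative file name
    var channel:SoundChannel = effect.play();
    channel.addEventListener(Event.SOUND_COMPLETE, function(e:Event):void {
        trace("sound effect finished");
    });

    // Streamed audio from FMS: wait for the connection, then listen for cue points.
    var nc:NetConnection = new NetConnection();
    nc.addEventListener(NetStatusEvent.NET_STATUS, function(e:NetStatusEvent):void {
        if (e.info.code == "NetConnection.Connect.Success") {
            var ns:NetStream = new NetStream(nc);
            ns.client = { onCuePoint: function(info:Object):void {
                trace("cue point", info.name, "at", info.time);
            }};
            ns.play("narration");                                       // illustrative stream name
        }
    });
    nc.connect("rtmp://fms.example.com/audio");                         // illustrative application URI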
I ran a simple live video streaming application for the first time with actual users and ran into a couple of serious performance issues that had not turned up during testing. In this instance there was one video stream from a live webcam, published with FMLE at 150 kbps using VP6 video and MP3 audio at 22 kHz. There were 16 clients and everything worked pretty well for about 30 minutes (although some clients said their audio and video were out of sync by up to 3 seconds).
Then individual clients would have either the video freeze, or the video would continue while the audio stopped. These clients had to "disconnect" and then "connect" again to the application. This happened to all of the clients at one time or another over several minutes. I stopped and restarted FMLE with progressively lower bandwidth settings, down to 75 kbps, but clients were still having the same issue.
I eventually stopped FMLE and used the application's built-in publisher at 45 kbps, and that seemed to eliminate the freezing/dropping issue. But of course the video quality was very poor, and some clients still reported that the audio was out of sync with the video. The server hosting the FMS application is a quad-processor Dell with plenty of memory and network connectivity. The Flash Media Admin Console performance graph showed total bandwidth peaking at 3 Mbps.
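For reference, a minimal subscriber sketch of the client-side mitigations usually tried for this symptom: a few seconds of NetStream buffer plus an automatic reconnect on the relevant NetStatus codes, instead of asking users to disconnect and connect by hand. This is only a sketch, not the application's actual code; the URI and stream name are illustrative:

    import flash.events.NetStatusEvent;
    import flash.media.Video;
    import flash.net.NetConnection;
    import flash.net.NetStream;

    var appUri:String = "rtmp://fms.example.com/live";   // illustrative application URI
    var nc:NetConnection = new NetConnection();
    var ns:NetStream;
    var video:Video = new Video(320, 240);
    addChild(video);                                      // assumes a DisplayObjectContainer context

    nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
    nc.connect(appUri);

    function onStatus(e:NetStatusEvent):void {
        switch (e.info.code) {
            case "NetConnection.Connect.Success":
                ns = new NetStream(nc);
                ns.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
                ns.bufferTime = 4;                        // a few seconds of buffer absorbs jitter
                video.attachNetStream(ns);
                ns.play("livestream");                    // illustrative stream name
                break;
            case "NetConnection.Connect.Closed":
            case "NetConnection.Connect.Failed":
                nc.connect(appUri);                       // crude automatic reconnect
                break;
        }
    }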
We have set up a brand new FMS 4.5 for HTTP streaming to iOS devices. We are feeding the source via RTMP to the livepkgr application. When consuming the iOS feed via the URL below, no audio comes through. If I check the stream over the RTMP protocol, the audio is there. [URL]..
I am developing an application with Flex 4.5 and Flash Media Server. I need a visualization corresponding to the playing track. This is possible with SoundMixer.computeSpectrum(bytes, true, 0) in the case of progressive downloading, but it does not work with RTMP streaming. I also need an audio waveform for the track, which likewise only works with progressive downloading using the Sound object.
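For comparison, a minimal sketch of the progressive-download case that does work (the file name is illustrative). Over RTMP, as far as I can tell, computeSpectrum() additionally requires the server to grant sample access (Client.audioSampleAccess in a server-side script) before it will read the stream, which may be why it appears not to work there:

    import flash.display.Sprite;
    import flash.events.Event;
    import flash.media.Sound;
    import flash.media.SoundMixer;
    import flash.net.URLRequest;
    import flash.utils.ByteArray;

    var track:Sound = new Sound(new URLRequest("track.mp3"));   // illustrative file name
    track.play();

    var canvas:Sprite = new Sprite();
    addChild(canvas);                                           // assumes a DisplayObjectContainer context

    addEventListener(Event.ENTER_FRAME, function(e:Event):void {
        var bytes:ByteArray = new ByteArray();
        SoundMixer.computeSpectrum(bytes, true, 0);             // FFT mode
        canvas.graphics.clear();
        canvas.graphics.lineStyle(1, 0x33CCFF);
        canvas.graphics.moveTo(0, 100);
        for (var i:int = 0; i < 256; i++) {                     // first 256 floats = left channel
            canvas.graphics.lineTo(i, 100 - bytes.readFloat() * 100);
        }
    });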
I want to implement an internet chat application. That is, I want a front end that connects to a webcam (plus mic and speakers), so I can send video and audio to a back-end server (Flash streaming? I am not sure what good product is out there for this), and also send both A/V to another computer on the internet. In other words, I need a front end that sends the local A/V from the webcam (and mic) to the back end and receives the remote A/V from the back end for playback (video and speaker), plus a back-end streaming server.
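A minimal front-end sketch of that flow in ActionScript 3 against an RTMP back end such as Flash Media Server (Red5 and Wowza speak the same protocol); the URI and stream names are illustrative. One NetStream publishes the local camera and microphone, a second one plays the remote party's stream:

    import flash.events.NetStatusEvent;
    import flash.media.Camera;
    import flash.media.Microphone;
    import flash.media.Video;
    import flash.net.NetConnection;
    import flash.net.NetStream;

    var nc:NetConnection = new NetConnection();
    nc.addEventListener(NetStatusEvent.NET_STATUS, function(e:NetStatusEvent):void {
        if (e.info.code != "NetConnection.Connect.Success") return;

        // Outgoing: publish the local camera and microphone to the server.
        var outStream:NetStream = new NetStream(nc);
        outStream.attachCamera(Camera.getCamera());
        outStream.attachAudio(Microphone.getMicrophone());
        outStream.publish("alice");                      // illustrative local stream name

        // Incoming: play the remote party's published stream.
        var inStream:NetStream = new NetStream(nc);
        var remoteVideo:Video = new Video(320, 240);
        remoteVideo.attachNetStream(inStream);
        addChild(remoteVideo);                           // assumes a DisplayObjectContainer context
        inStream.play("bob");                            // illustrative remote stream name
    });
    nc.connect("rtmp://server.example.com/chat");        // illustrative application URI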
I have this sample code from the e-book "Learning Flash Media Server 3". The purpose of the code is to create an FLV. The code is not running as it should. When I click the 'Record' button, the label is supposed to change to 'Recording', but that doesn't seem to happen, and when I click the 'Stop Record' button I get the following error:
"TypeError: Error #1009: Cannot access a property or method of a null object reference.at MinRecord/stopRecord()"
[Code]....
My deduction is that the 'if' statement in the 'startRecord' function is not resolving to 'true', and hence the label is not changing to 'Recording'. I also suspect that the assignment to the variable 'ns' of type NetStream is never made, and hence in the 'stopRecord' function I get the above-mentioned error message when I click the stop button. How do I rectify these problems?
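The book's listing isn't reproduced here, but this symptom usually means the NetConnection had not finished connecting when startRecord ran, so the guard fails and ns is never assigned. A minimal sketch of that pattern with the guard made explicit; this is not the book's code, and the application name, stream name, and label variable are illustrative:

    import flash.events.MouseEvent;
    import flash.events.NetStatusEvent;
    import flash.media.Camera;
    import flash.media.Microphone;
    import flash.net.NetConnection;
    import flash.net.NetStream;

    var nc:NetConnection = new NetConnection();
    var ns:NetStream;
    var connected:Boolean = false;
    var recordLabel:String = "Record";                   // stands in for the button's label property

    nc.addEventListener(NetStatusEvent.NET_STATUS, function(e:NetStatusEvent):void {
        connected = (e.info.code == "NetConnection.Connect.Success");
    });
    nc.connect("rtmp://localhost/minrecord");            // illustrative application name

    function startRecord(e:MouseEvent):void {            // wire to the 'Record' button
        if (!connected) {
            trace("not connected yet");                  // if this branch runs, ns stays null and the label never changes
            return;
        }
        ns = new NetStream(nc);
        ns.attachCamera(Camera.getCamera());
        ns.attachAudio(Microphone.getMicrophone());
        ns.publish("myRecording", "record");             // writes myRecording.flv on the server
        recordLabel = "Recording";
    }

    function stopRecord(e:MouseEvent):void {             // wire to the 'Stop Record' button
        if (ns == null) return;                          // guards against the #1009 null-reference error
        ns.close();
        recordLabel = "Record";
    }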
I have a virtual directory (on a Storage Area Network) on the 'C' drive as well as in the "webroot" folder on the Flash Streaming Server. What do I need to do to make RTMP videos work from the SAN directory on the Flash Streaming Server? It works fine over HTTP, and RTMP from the vod application folder works fine. I have done a lot of research and found out that we can use virtual directories for streaming videos, but I am unable to find steps on how to set one up.
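For what it's worth, the mapping I have seen documented is a virtual-directory entry in the vhost (or vod application) configuration, roughly as below; the URI prefix and SAN path are illustrative, the exact file (Vhost.xml versus the vod application's Application.xml under StreamManager) should be checked against the FMS documentation, and FMS needs a restart for the change to take effect:

    <!-- Vhost.xml (illustrative values) -->
    <VirtualHost>
      <VirtualDirectory>
        <!-- stream URIs starting with "san" resolve to the SAN path -->
        <Streams>san;C:\SANVideos</Streams>
      </VirtualDirectory>
    </VirtualHost>

With that in place, a client request such as ns.play("san/intro") should resolve to C:\SANVideos\intro.flv over RTMP (names again illustrative).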
I am new to ActionScript 3.0. I am trying to build an audio player that will stream audio (MP3, etc.) from a Wowza media server. The player needs to have a play button, a stop/pause button, a play progress bar, and a volume scroll bar. I am using a NetStream object to play the audio.
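A minimal sketch of that player core, assuming a Wowza VOD application; the URI and the mp3: prefix convention are illustrative and should be checked against the Wowza setup. The progress bar can be driven from ns.time against the duration reported in onMetaData:

    import flash.events.NetStatusEvent;
    import flash.media.SoundTransform;
    import flash.net.NetConnection;
    import flash.net.NetStream;

    var nc:NetConnection = new NetConnection();
    var ns:NetStream;

    nc.addEventListener(NetStatusEvent.NET_STATUS, function(e:NetStatusEvent):void {
        if (e.info.code != "NetConnection.Connect.Success") return;
        ns = new NetStream(nc);
        ns.client = { onMetaData: function(info:Object):void {
            trace("duration:", info.duration);           // combine with ns.time for the progress bar
        }};
        ns.play("mp3:mytrack");                          // illustrative track name
    });
    nc.connect("rtmp://wowza.example.com/vod");          // illustrative application URI

    function togglePauseTrack():void { if (ns) ns.togglePause(); }          // play/pause button
    function stopTrack():void        { if (ns) ns.close(); }                // stop button
    function setTrackVolume(v:Number):void {                                // volume scroll bar, v in 0..1
        if (ns) ns.soundTransform = new SoundTransform(v);
    }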
Can Adobe Flash Media Streaming Server 3.5 run on a dual-socket, quad-core AMD Athlon? I just requested a Dell server to be added to our farm to run as a media server, and to my surprise, while reading the requirements for FMS, it states the following: 3.2GHz Intel® Pentium® 4 processor (dual Intel Xeon® or faster recommended)
I have Flash Media Streaming Server 3.5 (not Interactive) running on RHEL 5.5 x86_64 Linux. All is working well; however, how do I prevent unauthorized clients from connecting and streaming live content to the server? How can I set up the server to require a username and password in order to publish live media? I am new to this product and have been reading the documentation, but I have not found a clear-cut answer on how to force a username and password for connecting to the server to stream live content only. I am using the Adobe FMS Apache install; what files need changing? [code] I want to stop a person on the public internet from connecting to the server and starting a live stream. Can this be done with a username and password?
We (the university I work for) want to add an IP camera to the top of a building to stream a live view of the quad (it has to be IP; we can't put a machine up there, so USB is out of the question). The problem is, I can't find a way to stream any IP camera through the Flash Media Live Encoder. I tried a camera by Axis, but their capture driver only supports MJPEG, which the Live Encoder does not.
I'm running Flash Media Streaming Server and have only been serving VOD up until now. I had my network administrator open up port 1935 to the outside world during the setup process and now I can't remember if that was actually required for streaming VOD to clients. Most documentation I've read says that this port should be open, but I seem to recall reading something at one point that suggested it wasn't necessary.
I've just started messing around with publishing live streams using Flash Media Live Encoder to the Flash Media Streaming Server. I have that working without issue, but I was surprised to find that no authentication is required before a client running the live encoder can publish a stream to the Flash Media Streaming Server. An authentication module is available; however, it only works with Flash Media Interactive Server and Flash Media Development Server.
If I leave port 1935 open to the outside world, there would be nothing to stop anybody anywhere from streaming video via my server. Anyone else running a default install of Flash Media Streaming Server and with port 1935 open to the outside should see that this is true of their setup as well. I'm wondering if I can safely close port 1935 without limiting the functionality of the server or if there's some way I can require authentication prior to publishing a live stream even though I'm not on the four-and-a-half-times-more-expensive edition of the product.
I'm trying to write software that sends video and audio data to a Flash Media Server using the RTMP protocol. Currently, my program can communicate with a Flash Media Server correctly. The RTMP specification does not describe the format of the raw data inside video/audio messages, so I muxed raw H.264 and AAC data into video/audio messages and sent them to the server. The server seems to accept them, but a video player cannot play back the stream coming from the server; the player just says "Loading..." For test purposes, I sniffed the network packets between Wirecast and the Flash Media Server and extracted only the video and audio data. Then I muxed that data into video/audio messages and sent them to the Flash Media Server. In that case, the video player connected to the server plays the stream back correctly.
I checked the stream sent by Wirecast, and it does not appear to be raw H.264 data, because the payload starts with 0x17 rather than an H.264 start code. Given this, I am wondering what kind of container format I should use for the H.264/AAC data sent to the Flash Media Server.
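For what it's worth, the payload of RTMP video and audio messages is the FLV tag body format. For H.264 that means sending an AVC sequence header (the AVCDecoderConfigurationRecord) first and then NAL units in length-prefixed AVCC form rather than Annex-B start codes; for AAC it means an AudioSpecificConfig header first and then raw frames without ADTS headers. The leading 0x17 you saw from Wirecast is the FLV frame-type/codec-id byte, not H.264 data. A byte-layout sketch (written in ActionScript purely for illustration; the layout is language-independent):

    import flash.utils.ByteArray;

    // Video message body (FLV VIDEODATA): [frame type/codec id][AVC packet type][composition time][data]
    function buildAvcPayload(isKeyframe:Boolean, packetType:int, compositionTime:int, data:ByteArray):ByteArray {
        var body:ByteArray = new ByteArray();
        body.writeByte(isKeyframe ? 0x17 : 0x27);        // 1/2 = key/inter frame, 7 = AVC codec id
        body.writeByte(packetType);                      // 0 = AVCDecoderConfigurationRecord, 1 = NALUs
        body.writeByte((compositionTime >> 16) & 0xFF);  // 24-bit composition time offset
        body.writeByte((compositionTime >> 8) & 0xFF);
        body.writeByte(compositionTime & 0xFF);
        body.writeBytes(data);                           // config record, or 4-byte length-prefixed NALUs
        return body;
    }

    // Audio message body (FLV AUDIODATA): [0xAF][AAC packet type][data]
    function buildAacPayload(packetType:int, data:ByteArray):ByteArray {
        var body:ByteArray = new ByteArray();
        body.writeByte(0xAF);                            // 10 = AAC; rate/size/channel bits are fixed for AAC
        body.writeByte(packetType);                      // 0 = AudioSpecificConfig, 1 = raw AAC frame
        body.writeBytes(data);
        return body;
    }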
We are streaming a one-hour F4V from Flash Media Streaming Server 3.5.2, and for some reason it is reporting our one-hour video as being 10 hours long. We have tons of other videos and never ran into this problem with any of the other files. This is the only file that exceeds one hour. This occurs in the default player from the streaming server.
I installed Adobe Flash Media Server 3.5 on a machine. I have the Flash Media Administration Console, but I do not know how to configure the machine to act as a Flash Media Streaming Server. I have a TV card in the machine, and I want to use the Flash Media Encoder to stream the TV signal from that card to a web page. I think I need to send the signal, encoded with the Flash Media Encoder, to a Flash Media Streaming Server; is that correct?
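That is the usual pipeline: FMLE encodes the TV card's signal and pushes it over RTMP to an application on FMS, and a small SWF embedded in the page subscribes to the same stream name. A minimal playback sketch (host, application, and stream name are illustrative):

    import flash.events.NetStatusEvent;
    import flash.media.Video;
    import flash.net.NetConnection;
    import flash.net.NetStream;

    var nc:NetConnection = new NetConnection();
    nc.addEventListener(NetStatusEvent.NET_STATUS, function(e:NetStatusEvent):void {
        if (e.info.code != "NetConnection.Connect.Success") return;
        var ns:NetStream = new NetStream(nc);
        var video:Video = new Video(640, 480);
        video.attachNetStream(ns);
        addChild(video);                                 // assumes a DisplayObjectContainer context
        ns.play("tvchannel");                            // must match the stream name configured in FMLE (illustrative)
    });
    nc.connect("rtmp://fms.example.com/live");           // illustrative host and application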
I would like to ask whether Adobe Flash Media Streaming Server (not Interactive) supports the following: Stream.play (server-side streaming), though not necessarily server-side playlists, which I know is an FMIS feature; and unsigned server-side ActionScript (main.asc). The difference is in the price, of course: one is $1000, the other is $5000, so if the two above are supported, I will buy FMSS.
I have FMS 3.5 (Streaming Server) installed, and I am trying to have it serve media from locations other than the default. Here is my setup.
[Code]...
Basically I am trying to serve media specific to different departments (test, test1). So "Department Test" media files are under [E:\Content\FMS\applications\vod\test] and "Department Test1" media files are under [E:\Content\FMS\applications\vod\test1].
I am using Flash Media Server 3.5 to stream live video to a web page with a Flex GUI embedded in it. For this I have used the Adobe Flash Media Server start screen, and I am able to stream video correctly. I want to know how long it can keep streaming like this, i.e., suppose I have to stream like this continuously for 3 or 6 months.
I have a licensed FMS 4.5 set up and running fine; now I want to stream live to iOS devices, but no luck. I have a web page for test purposes with the video src tag pointing to URL..., and on the server I have a livestream.m3u8 pointing to the same URL.... The encoder has the AAC plugin and all the presets set up as they should be, but when I go to view it on my iPad 2, this is how it looks.
I'm building a video conferencing application for a portal, but now, when considering which version of Flash Media Server to buy, I have run into a problem. Can anyone help me compare these two versions, Flash Media Streaming Server and Flash Media Interactive Server? For example, if I use Flash Media Streaming Server, will some functions like NetStream.pause() not work, or something like that? I'm not sure if this is a foolish question, but please let me know.
I am a beginner with FMS. I have a licensed version of FMS and I want to stream live video from the localhost or from another machine with a camera, and then broadcast it to many clients. I don't know how to begin.
I have a Windows Server 2008 R2 box with Adobe Flash Media Server 4.5 installed on it. I installed Flash Media Server, entered the serial, and all that fun stuff, and all ports have been forwarded properly. Once the server is up and running, I go to the admin console just to make sure all is running well. Then I turn to my Media Encoder and enter my FMS URL as well as the stream (livestream?adbe-live-event=liveevent). I select my input device, and it connects and says "streaming to primary". Yesterday I did this and it was streaming correctly: I checked with my iPad and the livepkgr was working, and I had live video on my iPad. I then went to my website to check that all was working there, and sure enough my live event was being streamed there as well. Sweet, right? No, wrong. The server was shut down after this test trial, assuming everything was working.
When I turned it on today, the Flash Media Encoder still connected and streamed fine, and when I check the admin console it appears to be working and connected correctly. But when I go to my website I just get a black screen with nothing playing on it, and on my iPad, for some reason, I get the last 30 seconds of the live clip we were streaming last night and then it stops. I have no clue what is wrong and it is driving me up the wall. My hope is to be able to stream from my Mac to my Media Server and then embed it into my website, as well as have it playable on iPad/Android (I haven't worked on the Android part yet because of this whole problem).
The company I work for installed FMS on a server running CentOS for a customer who needed streaming media capabilities. It worked fine for 3 months, until today the customer called us and informed us that their client program can no longer connect to the streaming media server. No one has done anything to the server to cause this. When attempting to access the admin console to diagnose the problem, the streaming server returned a 500 Internal Server Error. So I tried another page; same result. Reboots of both the hardware server and FMS have changed nothing.
As titled, what is the way to record video/audio files using Flash Media Server through RTMP, and then allow users to access the recorded files through HTTP? What I am trying to do is record a user's microphone input and save it to the server. Afterwards, I would like other users to be able to access the recorded files and manipulate the audio data, via computeSpectrum(), to do some visualization of the audio. As I understand it, computeSpectrum() cannot work on streaming files, so I think I need to access the recorded files using HTTP instead of RTMP. Is that true?
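For the recording half, a minimal sketch, assuming an AS3 client and an FMS application that allows recording; the recorded FLV lands in the application's streams folder on the server (applications/<app>/streams/_definst_/ by default), which an ordinary web server could then expose over HTTP. Names are illustrative:

    import flash.events.NetStatusEvent;
    import flash.media.Microphone;
    import flash.net.NetConnection;
    import flash.net.NetStream;

    var nc:NetConnection = new NetConnection();
    nc.addEventListener(NetStatusEvent.NET_STATUS, function(e:NetStatusEvent):void {
        if (e.info.code != "NetConnection.Connect.Success") return;
        var ns:NetStream = new NetStream(nc);
        ns.attachAudio(Microphone.getMicrophone());      // microphone only, no camera
        ns.publish("userAudio", "record");               // records userAudio.flv under the app's streams folder
    });
    nc.connect("rtmp://fms.example.com/recorder");       // illustrative application name

One caveat: the microphone is recorded as Nellymoser or Speex audio inside an FLV, and the Sound class only loads MP3, so even an HTTP-served recording would be played back through a NetStream (progressive download) rather than a Sound object. As far as I know, computeSpectrum can read such a progressive stream when it comes from the same domain or a permitting policy file, whereas over RTMP the server has to grant audio sample access.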