Media Server :: RTMP And Streaming Not Working For Start Screen
Apr 30, 2009
I've freshly installed the FMS demo on a Red Hat Linux box and have everything working. From the start screen running on localhost, I click "Play Video (HTTP)" and that video of a train shows up. Cool. But when I click the link above it ("Play Video (RTMP)") I get an error message: "Connection Error. Press Play to try again," and no matter how many times I hit Play, I get that same message. The Dynamic Stream doesn't work either, and the Interactive one successfully displays webcam feeds but doesn't show the "Play Live Stream" button. Is there anything special you have to do to get the RTMP stuff working? Some special command or server you have to run?
Would firewalls interfere with things (I'm pretty sure there isn't one on the machine, but I'm flailing wildly here), or would permissions mess things up? I'm completely lost ^_^;; I should also add that there don't seem to be any log files. I'm looking under the server install directory, and there isn't even a "log" folder. There isn't one under Apache, either. It confuses me. So far all I can find on the internet are instructions to look at the log files... but if they aren't there... Am I just looking in the wrong places, or are they just not being generated yet? I did a packet capture with Wireshark, and the web app IS hitting port 1935 (for RTMP), but the packets are failing the checksum and are refusing to be reassembled because of that. Is this making sense to ANYBODY?
I'm running Flash Media Streaming Server and have only been serving VOD up until now. I had my network administrator open up port 1935 to the outside world during the setup process and now I can't remember if that was actually required for streaming VOD to clients. Most documentation I've read says that this port should be open, but I seem to recall reading something at one point that suggested it wasn't necessary.
I've just started messing around with publishing live streams from Flash Media Live Encoder to the Flash Media Streaming Server. I have that working without issue, but was surprised to find that no authentication is required before a client running the live encoder can publish a stream to the Flash Media Streaming Server. An authentication module is available; however, it only works with Flash Media Interactive Server and Flash Media Development Server.
If I leave port 1935 open to the outside world, there would be nothing to stop anybody anywhere from streaming video via my server. Anyone else running a default install of Flash Media Streaming Server with port 1935 open to the outside should see that this is true of their setup as well. I'm wondering if I can safely close port 1935 without limiting the functionality of the server, or if there's some way I can require authentication prior to publishing a live stream even though I'm not on the four-and-a-half-times-more-expensive edition of the product.
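One caveat worth stating up front: the Streaming edition does not run custom server-side scripts, so the sketch below only applies to the Interactive/Development editions. There, publish rights can be restricted in main.asc via Client.writeAccess; the "encoder"/"s3cret" credentials and the idea of passing them as extra connect() arguments are placeholders for illustration, not a built-in mechanism:

    // main.asc sketch: only a known encoder account may publish
    application.onConnect = function (client, username, password) {
        if (username == "encoder" && password == "s3cret") {
            client.writeAccess = "/";   // may publish streams and write shared objects
        } else {
            client.writeAccess = "";    // play-only: can subscribe but not publish
        }
        return true;                    // accept the connection either way
    };

On the Streaming edition itself, the practical options are mostly at the network level, i.e. restricting which source addresses are allowed to reach port 1935.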
Currently, we are streaming using RTMP, but I want to switch to HLS to allow iOS devices to stream our video. I do not want to encode two separate streams, though, because we are operating with limited bandwidth on mobile data cards. If I stream using the livepkgr, the delay to iOS devices is about 12 seconds, which is OK. However, on desktop clients we have live scoring integrated with our video, so we cannot have a delay that long. My question is: if I connect to the same stream in a Flash client on the desktop, will the delay still be that long, or will it stream in real time to Flash clients, with the delay only for iOS devices using HLS? Also, am I correct in assuming that I can do this with one stream, or do I need to set up separate streams for RTMP and HLS?
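My understanding is that the packaging delay only affects the HTTP/HLS side; a Flash client can subscribe to the same stream over RTMP and keep latency close to real time. A rough AS3 sketch of the desktop player, assuming the encoder publishes a stream named "livestream" into the livepkgr application (host and names are placeholders):

    import flash.net.NetConnection;
    import flash.net.NetStream;
    import flash.events.NetStatusEvent;
    import flash.media.Video;

    var nc:NetConnection = new NetConnection();
    nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
    nc.connect("rtmp://myserver/livepkgr");        // placeholder host

    function onStatus(e:NetStatusEvent):void {
        if (e.info.code == "NetConnection.Connect.Success") {
            var ns:NetStream = new NetStream(nc);
            ns.client = { onMetaData: function (o:Object):void {} };
            ns.bufferTime = 2;                     // small buffer keeps RTMP latency low
            var vid:Video = new Video(640, 360);
            addChild(vid);
            vid.attachNetStream(ns);
            ns.play("livestream");                 // same stream name the encoder publishes
        }
    }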
I have some questions regarding 4.5 and deciding which version we must license (FMIS or FMS). Obviously there is a huge price difference, and as this is within a local network we don't require encryption or multicast. Here is the scenario: we need to stream live DVR to Mac and Windows browsers AND to iPhone/iPad and devices running Android 2.2 (Froyo)+. I need someone from Adobe to help us decide.
1. I see DVR is now listed as a feature on the Flash Media Server page. Does that mean it is no longer exclusive to Flash Media Interactive Server, for both HTTP streaming and RTMP?
2. Previously, in order to stream live with Adobe HTTP Dynamic Streaming, Flash Media Interactive Server was required because a live segmenter had to be used. Is this still the case, or can we just launch Flash Media Live Encoder (FMLE) and stream to Flash Media Server?
3. Related to #2: can we switch between serving streams via RTMP or HTTP Dynamic Streaming without changing the format of the material on disk?
4. I was at a webcast for 4.x and remember being told that the latency for HTTP live dynamic streaming was 20 seconds because of the repackaging that had to take place. Is this still true? What is the latency for iOS playback from ingest to playback?
5. For the iOS playback feature, do the streams have to be in Adobe dynamic streaming format, or can we just stream via FMLE RTMP and trust it to do the right thing?
6. Does the iOS playback support DVR?
7. Does the Android plugin have good support for RTMP live streaming, including DVR?
I am developing an application with Flex 4.5 and Flash Media Server. I need a visualization corresponding to the playing track. It is possible with SoundMixer.computeSpectrum(bytes, true, 0) in the case of progressive downloading, but it is not working with RTMP streaming. I also need an audio waveform corresponding to the track, which likewise works with progressive downloading using the Sound object.
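For comparison, this is roughly the pattern that works with a progressively downloaded MP3 (frame-script style; the file name is a placeholder). Over RTMP the audio arrives through a NetStream rather than a Sound object, and computeSpectrum can additionally be blocked by security/access restrictions, which is consistent with what you are seeing:

    import flash.media.Sound;
    import flash.media.SoundMixer;
    import flash.net.URLRequest;
    import flash.utils.ByteArray;
    import flash.events.Event;

    var snd:Sound = new Sound(new URLRequest("track.mp3"));   // progressive download
    snd.play();

    addEventListener(Event.ENTER_FRAME, draw);

    function draw(e:Event):void {
        var bytes:ByteArray = new ByteArray();
        SoundMixer.computeSpectrum(bytes, true, 0);    // FFT mode, 256 floats per channel
        graphics.clear();
        graphics.lineStyle(1, 0x00FF00);
        for (var i:int = 0; i < 256; i++) {
            var v:Number = bytes.readFloat();          // left channel only
            graphics.drawRect(i, 200, 1, -v * 100);    // simple spectrum bars
        }
    }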
I am developing an application in Flash that runs locally and uses FMS 4.0.1, also locally. I have been using Adobe FMS 4.0.1 for months with no problem. Today I cannot connect to my server, and I cannot even play the sample video on the Flash Media Server Start Screen. The HTTP sample video plays after I changed the permissions for Flash, but the RTMP one does not. I have reinstalled the server (3.5, 4.0 and 4.0.1); none of them will connect over RTMP. I get an error on the Flash Media Server Start Screen that says "the connection timed out".
I have some .mov files I want to stream to Flash Media Server with FFmpeg. I have already tried streaming a single .mov with an FFmpeg command in the terminal and it works; FMS displays what I am streaming live. Now I want to stream multiple files as one source. I tried running the same command once per file, but Flash Media Server seems to stop the stream when file1 is finished and then start a new stream for file2. That makes the player stop when file1 ends, and I have to refresh the web page in order to continue with file2. I am calling the FFmpeg command from a C program on Linux. Is there any method to prevent FMS from stopping the stream when I switch the file source in FFmpeg, or is it possible to have FFmpeg deliver the stream continuously from multiple source files without stopping when a file finishes?
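If the files are also visible to FMS itself, one workaround is to leave FFmpeg out of the switching entirely and build a server-side playlist in main.asc: the server publishes a single continuous stream and appends the clips to it, so the player never sees an unpublish between files. A rough Server-Side ActionScript sketch (stream and file names are placeholders):

    // main.asc for the application the players connect to
    application.onAppStart = function () {
        this.playlist = Stream.get("playlist");        // clients play "playlist"
        if (this.playlist) {
            // reset = true starts the playlist with the first clip,
            // reset = false appends, so playback continues without a gap
            this.playlist.play("mp4:file1.mov", 0, -1, true);
            this.playlist.play("mp4:file2.mov", 0, -1, false);
        }
    };

The clips would have to live in the application's streams folder (or a virtual directory) for this to work.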
I have installed an FMS 4 Developer server on my CentOS box. I can connect to my FMS with a client over RTMP, but over RTMFP, FMS doesn't reply. I looked at the FMS logs and saw this in edge.00.log:
2010-09-30 13:11:53 4984 (e)2631504 RTMFP could not start on edge process for _defaultRoot__edge1. Address family not supported by protocol ("Famille d'adresses non supportée par le protocole") -
I've got a page laid out that hosts a nightly live web stream of music at a venue. Every night (except Sunday) the stream runs from 8pm to 2am Eastern Standard Time. Some nights I notice there is nobody watching the stream, yet the stream plays every day no matter what. In the interest of saving bandwidth per month, I'm trying to figure out how to only stream if somebody is watching. So basically the camera would be recording locally, and when someone connects to the website, something tells the server to start broadcasting. If the person leaves, the server stops broadcasting. Has anyone ever tried doing this? I'm using Influxis for my media streaming and they've assured me I can execute server-side scripts.
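Since server-side scripts are available, one way to sketch this is to count viewers in main.asc and have the server tell the camera machine's client when to start and stop publishing. The "encoder" role flag and the startBroadcast/stopBroadcast methods below are hypothetical names; the camera-side SWF would have to pass the flag on connect and expose those methods on its NetConnection client object:

    // main.asc sketch: publish only while at least one viewer is connected
    var encoderClient = null;   // the connection from the camera machine
    var viewers = 0;

    application.onConnect = function (client, role) {
        if (role == "encoder") {
            encoderClient = client;
        } else {
            viewers++;
            if (viewers == 1 && encoderClient) {
                encoderClient.call("startBroadcast");   // hypothetical client-side method
            }
        }
        return true;
    };

    application.onDisconnect = function (client) {
        if (client == encoderClient) {
            encoderClient = null;
        } else {
            viewers--;
            if (viewers == 0 && encoderClient) {
                encoderClient.call("stopBroadcast");    // hypothetical client-side method
            }
        }
    };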
I have installed Flash Media Server 3.5 on Windows Vista and am using Flash 10. Right now, on the Flash Media Server page, I am only able to play the HTTP sample; the RTMP file is not working. I checked the samples folder and opened HelloWorld, which shows this error: Error #2044: Unhandled NetStatusEvent: level=error, code=NetConnection.Connect.Failed at HelloWorld/connectHandler(). What should I do now? I need urgent help.
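As a sanity check, a stripped-down connect test along these lines narrows it down (the application name is a placeholder; the built-in vod application is a reasonable first target). If this also reports NetConnection.Connect.Failed, the problem is the FMS service or port 1935 rather than the HelloWorld sample itself:

    import flash.net.NetConnection;
    import flash.events.NetStatusEvent;

    var nc:NetConnection = new NetConnection();
    nc.addEventListener(NetStatusEvent.NET_STATUS, connectHandler);
    nc.connect("rtmp://localhost/vod");   // placeholder application

    function connectHandler(event:NetStatusEvent):void {
        // Connect.Success means RTMP is reachable on 1935;
        // Connect.Failed usually means FMS is not listening on that port
        // or something else (firewall, another process) is in the way
        trace(event.info.code);
    }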
While streaming a live worship service in Flash format, I'll need to archive only the sermon in Flash. I do not want to handle this in post/editing. What will I need to make this happen?
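One hedged sketch of how this can be done with Server-Side ActionScript: the live service keeps streaming as-is, and an operator calls a pair of methods that copy the live stream into a second server-side stream and record it only while the sermon is on. The "service" stream name and the startSermon/stopSermon method names are placeholders, not built-in API:

    // main.asc sketch: record only a portion of the incoming live stream
    var archive;

    Client.prototype.startSermon = function () {
        archive = Stream.get("sermon");          // recorded file is named after this stream
        if (archive) {
            archive.play("service", -1, -1);     // start = -1 subscribes to the live stream
            archive.record();                    // begin writing to disk
        }
    };

    Client.prototype.stopSermon = function () {
        if (archive) {
            archive.record(false);               // stop recording
            archive.play(false);                 // stop the server-side stream
        }
    };

An operator SWF would trigger these with NetConnection.call("startSermon") and NetConnection.call("stopSermon").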
Can you access the FMS Start Screen to verify installation remotely? We have a new FMS 3.5.4 installation set up by our IS team, but cannot remote directly into the box to verify the installation per the installation guide. Is there a URL that can be used to access the FMS Start Screen if FMS was installed with Apache?
We installed FMS on a new server - a cloud environment - and set the fms.ini for the correct dedicated IP but we're unable to see the start screen or access any videos.where the IP would need to be added to point to the wwwroot directory?
Just like the title says, I am using Flash Media Live Encoder and I have three streams. The encoder is sending the streams just fine, but I am not sure how to deliver them to the client. The only one that shows up is "livestream1"; "livestream2" and "livestream3" won't even come up.
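Assuming all three are published into the same application, each one is simply its own NetStream on the client side; a quick test for the missing ones looks roughly like this (nc is a NetConnection already connected to that application, and the stream name must match what was typed into FMLE exactly, including case):

    import flash.net.NetStream;
    import flash.media.Video;

    // assuming nc is already connected to the application FMLE publishes into
    var ns2:NetStream = new NetStream(nc);
    ns2.client = { onMetaData: function (o:Object):void {} };
    var vid2:Video = new Video(320, 240);
    vid2.x = 330;                       // place it beside the first player
    addChild(vid2);
    vid2.attachNetStream(ns2);
    ns2.play("livestream2");            // repeat with "livestream3" for the third stream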
We have FMS 3.5.3 running on a production server: hardware with two quad-core Xeon E5345 CPUs and the Windows Server 2008 R2 64-bit OS.
The FMS service streams live streams to hundreds of simultaneous Adobe Flash Player 10 clients through RTMP at port 1935. With a higher number of simultaneous clients connected (around 1,...), one can see a rising number of clients of the loaded (single) application (never diminishing) through the Administration Console. It appears that the clients "linger", i.e. the number only counts up, never down, from then on.
As if this were not sign of problems enough in itself, the bandwidth graph drops to near zero, i.e. it appears that streaming stops working too. Restarting the application helps. The logs do not show anything until AFTER an attempt is made to unload the application via the console. There are messages of various importance in the event log, among them the weird warning "Asynchronous I/O operation failed (Failed to attach to completion port: The parameter is incorrect. 87)." There is more too, which I can provide on demand.
The application is pretty rudimentary. Of the entire API, we only use Application, Client and SharedObject classes. There is a single persistent shared object, and that's it.
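Purely for illustration (this is not our actual code), the whole thing has roughly this shape:

    // main.asc: one persistent shared object, nothing else
    application.onAppStart = function () {
        this.state_so = SharedObject.get("state", true);   // true = persistent
    };

    application.onConnect = function (client) {
        // clients update the shared state through a simple RPC
        client.setValue = function (key, val) {
            application.state_so.setProperty(key, val);
        };
        return true;
    };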
We are using a clean install. Only Microsoft-supplied and FMS processes are running. Windows Firewall is on, though, and has rules for letting in connections on ports 1935 and 1111. We need a stable server, and I have to restart it manually every day, sometimes twice in an hour, which frustrates me.
The first event I streamed tonight worked fine for an hour or so, then it stopped being accessible from the web. I had no problem viewing the stream on mobile devices, though. The log file shows these errors:
[Code]....
I can see the three stream directories get created, but only one of them has all the files (bootstrap, control, meta, f4f, f4x). The other two stream directories only contain a control file. The event streams perfectly on iOS devices but not on PCs. The stream starts for a second, then stops and displays the message 'Unable to connect to the content you've requested'.
I'm trying to make a piece of software that sends video and audio data to a Flash Media Server using the RTMP protocol. Currently, my program can communicate with a Flash Media Server correctly. The RTMP specification does not describe the raw data format inside video/audio messages, so I muxed raw H.264 and AAC data into video/audio messages and sent them to the server. The server seems to accept them, but a video player cannot play back the stream coming from the server; the player just says "Loading..." For test purposes, I sniffed the network packets between Wirecast and the Flash Media Server and ripped out only the video and audio data. Then I muxed that data into video/audio messages and sent it to the Flash Media Server. In this case, the video player connected to the server can play back the stream correctly.
I checked the stream sent from Wirecast, and the data does not seem to be raw H.264, because it starts with 0x17 rather than an H.264 start code. Given this situation, I am wondering what kind of container format I should use for sending H.264/AAC data to the Flash Media Server.
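In case it helps anyone hitting the same wall: the bodies of RTMP video/audio messages are the same tag bodies the FLV container uses, not raw elementary streams, which is why the Wirecast capture starts with 0x17 (keyframe + AVC codec id) rather than an H.264 start code. For video that means a one-byte frame/codec header, an AVCPacketType byte (0 for the AVCDecoderConfigurationRecord sequence header, 1 for NAL units in AVCC length-prefixed form) and a 3-byte composition time offset before the data; AAC audio analogously uses 0xAF followed by an AACPacketType (0 for the AudioSpecificConfig, 1 for raw frames). A small AS3-style sketch of the video payload layout, just for illustration:

    import flash.utils.ByteArray;

    // Build the body of one RTMP video message carrying H.264 data.
    // "data" must already be in AVCC form: either the
    // AVCDecoderConfigurationRecord (sequence header) or
    // 4-byte length-prefixed NAL units, not Annex B start codes.
    function buildAvcVideoPayload(data:ByteArray, isSequenceHeader:Boolean,
                                  isKeyframe:Boolean):ByteArray {
        var body:ByteArray = new ByteArray();
        body.writeByte(isKeyframe ? 0x17 : 0x27);       // frame type (1/2) + codec id 7 (AVC)
        body.writeByte(isSequenceHeader ? 0x00 : 0x01); // AVCPacketType: 0 = config, 1 = NALUs
        body.writeByte(0x00);                           // composition time offset,
        body.writeByte(0x00);                           // 24-bit big-endian
        body.writeByte(0x00);                           // (0 when PTS == DTS)
        body.writeBytes(data);
        return body;
    }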
I've been experimenting with Dynamic Streaming while in the process of writing a tutorial. The documentation is quite clear on a few points: the keyframe interval in the various encodings should be short; the buffer length should be at least 2x the keyframe interval; and the player should sense a bandwidth change, by default, within the 4-second sampling interval and call for a switch. Then the switch could take as long as 2x the keyframe interval after that. What I'm finding is wildly different behavior than this. It takes anywhere from 10-15 seconds for the player to notice the change and call for a switch, then another 20-40 seconds for the switch to happen. When switching up to a higher bitrate stream, this just means the user gets low-bitrate video for longer than they ought to. But when switching down due to falling bandwidth, the buffer runs out and the user stares at the rebuffering sign for a lengthy time - long enough to give up on watching the video, for sure.
I've encoded an H.264 MP4 file at 64, 384, and 768 kbps, at 30fps and an "every 60 frames" keyframe interval. I've streamed it over RTMP via two different CDNs that use FMS 3.5, into two different Flash video players (JW Player and Flowplayer). I've restricted my bandwidth on Windows XP with Netlimiter 2.0 and on the Mac with 'ipfw'. I've set the buffer length between 4 and 10 seconds. I've tested switching up and switching down. For up, I start with a 200kbps bandwidth limit; the video starts OK with the correct stream, then at 5 seconds I open up the bandwidth to unrestricted. For testing down, I do the opposite: start at unrestricted and then at 00:05 restrict to 200kbps.
My test page, with both players and sample code, is at [URL] n-flash-bitrate-switching/ I also have a couple of screen recordings there showing the behavior of the whole process, both switching up and switching down. I thought I'd done everything right here - paid attention to every documented detail - but it works rather poorly. Can someone explain whether this is expected behavior, whether the players have implemented dynamic switching poorly, or whether I'm doing something wrong?
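For anyone comparing notes on the mechanics: the switch itself is requested with NetStream.play2() and a NetStreamPlayOptions set to the SWITCH transition, roughly as below (stream names are placeholders). The timing issues described above are about when a player's heuristics decide to make this call and how much buffer they hold, not about the call itself:

    import flash.net.NetStream;
    import flash.net.NetStreamPlayOptions;
    import flash.net.NetStreamPlayTransitions;

    // ns is a NetStream currently playing the lower-bitrate rendition
    function switchTo(ns:NetStream, oldName:String, newName:String):void {
        var opts:NetStreamPlayOptions = new NetStreamPlayOptions();
        opts.oldStreamName = oldName;                  // e.g. "mp4:video_384"
        opts.streamName = newName;                     // e.g. "mp4:video_768"
        opts.transition = NetStreamPlayTransitions.SWITCH;
        ns.play2(opts);
        // the server completes the switch at a keyframe of the new stream and the
        // client is told via onPlayStatus ("NetStream.Play.TransitionComplete")
    }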
I'm running FMS 3.5. My live streams and recorded streams work fine. Whenever I am in the Admin Console, though, nothing displays or reports. These screenshots show the FMS server activity while several videos are being streamed around Europe:
We had FMS 2 installed before, and the paths to all our videos are like rtmp://ServerName/sites/.... (the default path on FMS 2). Now we have upgraded to FMS 4, and we would like to keep these paths the same because we have many HTML pages that reference these videos. However, the default path on FMS 4 is rtmp://ServerName/vod/... Is there a way to change "vod" to "sites"?
I tried to change VOD_COMMON_DIR in fms.ini from /install_dir/webroot/vod to /install_dir/webroot/sites, and also changed the document root in httpd.conf, but rtmp://ServerName/sites/ is still not working.
I have a virtual directory on a Storage Area Network, mapped on the 'C' drive as well as in the "webroot" folder, on Flash Streaming Server. What do I need to do to make RTMP videos work from the SAN directory on Flash Streaming Server? It works fine for HTTP, and RTMP from the vod application folder works fine. I have done a lot of research and found out that we can use virtual directories for streaming videos, but I am unable to find steps on how to use them.
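If it helps, the relevant setting appears to be the <VirtualDirectory> mapping in the vhost's Vhost.xml: a <Streams> entry maps the beginning of a stream name (not the application name) to a physical folder, and FMS needs a restart after the edit. A sketch, with the "san" prefix and the SAN path as placeholders:

    <VirtualDirectory>
        <!-- a stream played as "san/sample" is looked up as C:\SAN\videos\sample.flv -->
        <Streams>san;C:\SAN\videos</Streams>
    </VirtualDirectory>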
I am trying to publish a video to an RTMP server, but it doesn't publish. It might be a pre-release bug. I am able to play a NetStream but not able to publish one. Is there a sandbox issue? I'm not sure, because the FMS RTMP server will not let a client connect unless the SWF has been downloaded from the same host as the server (something like a sandbox condition).
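For reference, a bare-bones publisher looks roughly like this (application and stream names are placeholders). One thing worth ruling out on the sandbox front: a SWF opened straight from the local file system runs in the local-with-filesystem sandbox and cannot make network connections at all, so the same code can fail locally and work once it is served over HTTP:

    import flash.net.NetConnection;
    import flash.net.NetStream;
    import flash.events.NetStatusEvent;
    import flash.media.Camera;
    import flash.media.Microphone;

    var nc:NetConnection = new NetConnection();
    nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
    nc.connect("rtmp://myserver/live");            // placeholder application

    function onStatus(e:NetStatusEvent):void {
        trace(e.info.code);                        // watch for NetConnection.Connect.Success
        if (e.info.code == "NetConnection.Connect.Success") {
            var ns:NetStream = new NetStream(nc);
            ns.attachCamera(Camera.getCamera());
            ns.attachAudio(Microphone.getMicrophone());
            ns.publish("mystream", "live");        // "live", "record" or "append"
        }
    }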
We are attempting to transcode some video to a format suitable for our Flash Streaming Server. In the past we have used Adobe Media Encoder (AME) CS4, but we are working with *.VOB files this time, and AME CS4 does not support them. We are attempting to use Handbrake now, as it has great support for H.264 and MP4 files. Unfortunately, while the files play wonderfully on our local machines, they do not stream from the Flash Streaming Server over RTMP. The files come out of Handbrake with a *.m4v extension. Our old files out of AME CS4 are *.MP4s with H.264 video and AAC audio. I can see no differences in video or audio codecs, which leads me to believe it may be a container format problem. We are attempting to use Handbrake because it does an excellent job of batch encoding and produces good quality files.
My setup is: (1) an EC2 instance with an SWF on it - this SWF plays streaming video, i.e. it is a video player like JWPlayer; (2) a streaming video distribution set up via CloudFront.
If I stream the video via RTMP from CloudFront to the SWF (which is on EC2), would I incur charges for data transfer into the server (i.e. for data being read by the SWF) and out of the server (i.e. for data being displayed by the SWF to the user) on account of streaming the video to users (assuming that data transfer into and out of the server is charged for)?
Can Adobe Flash Media Streaming Server 3.5 run on a dual-socket, quad-core AMD Athlon? I just requested a Dell server to be added to our farm to run as a media server, and to my surprise, while reading the requirements for FMS, it states the following: 3.2GHz Intel® Pentium® 4 processor (dual Intel Xeon® or faster recommended).
I have Flash Media Streaming Server 3.5 (not Interactive) running on RHEL 5.5 x86_64 Linux. All is working well; however, how do I prevent unauthorized access to connecting to the live stream and streaming content? How can I set up the server to require a user name and password in order to stream live media to the server? I am new to this product, and I have been reading some documentation, but I have not found a clear-cut answer on how to force a username and password to connect to the server to stream live content only. I am using the Adobe FMS Apache install; what files need changing? [code] I want to lock down a person from connecting to the server on the public internet and starting a live stream. Can this be done with a user name and password?
I have a small LAN of about 8 computers, all of which are running Windows 7. I have installed FMS and the XAMPP web server on one of the machines. I want to stream live from one PC to all the other PCs on the LAN. I have a webpage with JW Player embedded in it on my XAMPP web server that is able to see the live stream when I start it locally; that is, the live stream works fine on the machine with the servers on it. But when I try to view the live stream from another machine in the LAN by accessing that webpage, JW Player returns a "server not found: rtmp://192.168.10.1/live" error. I was thinking that maybe a firewall is blocking port 1935, but I have turned off the firewall on every PC on the LAN and uninstalled any antivirus program on all the PCs, yet I still get the same error when I try to access the live stream from another PC on the LAN. When I run netstat -a -n | find ":1935" I get 192.168.10.2:49184 192.168.10.1:1935 SYN_SENT, so I think the request for the stream is sent but the connection is rejected.
This is the code for the webpage with JW Player embedded in it; maybe it is the problem:
    <html>
    <head>
    <title>JW FLV Media Player</title>
    <script type="text/javascript" src="swfobject.js"></script>