Media Server :: HTTP Dynamic Streaming And Corporate Firewalls?
Nov 10, 2011
Will the use of HTTP Dynamic Streaming in place of the RTMP protocol alleviate the connectivity problems experienced by users who are sitting behind restrictive corporate firewalls?
I have been using streaming Flash video but just found out that it can't be seen in many places, because Flash streams over ports 1935, 443, and 80, which are often closed by firewalls. Progressive download only needs port 80, so it's more likely to get through a firewall.
When using DVR with RTMP streaming, the stream could/would be saved as a single big file on the server, but with HTTP streaming the stream consists of numerous chunked files. In that case, how does the stream get saved? Will FMS 4 assemble the chunks (unlikely), or will the chunks of the stream be saved as they are? If the latter, I wonder which tool could be used to access them and make VODs from them. What is the exact behaviour of FMS 4 when storing a DVR HTTP stream?
I am a video producer for a non-profit, and in the next couple of weeks we are launching an online training course, half of which requires watching multiple video clips (around 50 different online videos for this particular training). I'm a bit experienced with encoding video for the web (FLV, H.264) but would love to know more about HTTP Dynamic Streaming. As of now we just use progressive download playback.
We recently hired a web dev who knows servers a bit, but he didn't seem to know anything about HTTP Dynamic Streaming; he said maybe we should search the internet for an auto-detecting Flash player. Anyway, is this something I should be looking into for our organization if we do a lot of online video? If we purchased Flash Media Server, would that be all we need? Is there a Dummies-type book on this subject?
We recently purchased FMS4 and I'm in the midst of setting it up. However, not that it's a HUGE deal, but....
I see conflicting answers in the forums about whether or not it is possible to stream media in the HTML5 <video> tag (going back to early 2010).
Ideally, I'd like to stream using Flash most of the time, but use HTML5 video tags for the specific browsers and devices that support them (Chrome, Firefox, IE9, and select iOS devices).
If so, how? I gave it a shot using the above code (minus the "blah, blah, blah"), and the video doesn't play. If it's just a syntax error, then please be so kind as to provide the correct syntax.
However, if it isn't possible without downloading something else, like the File Packager for HTTP Streaming or the Origin module for HTTP Dynamic Streaming, please let me know.
Does it matter what HTTP server I use? I'm in the process of upgrading from FMS 3.0 on Windows using IIS 5 to FMS 4.0. Will HTTP Dynamic Streaming work from a Windows server?
We have set up a brand new FMS 4.5 for HTTP streaming to iOS devices. We are feeding the source via RTMP to the livepkgr application. When playing the iOS feed via the URL below, no audio comes through. If I check the stream over the RTMP protocol, the audio is there. [URL]..
I just installed 2 FMS 4.5 servers and HTTP streaming is not working. I tried using OSMF with HDS live and it failed; I tried Safari with HLS live and it failed too. When tested on my trial server, everything works fine. The open ports are 1935 and 80. Do I need to restart the server to make it work?
The documentation is for FMS "4.5.1". Is PHDS for a live HTTP stream also supported in FMS 4.5? I tried setting Application.xml as instructed in the guide, and when I viewed the .f4m from the live stream, I did not see any indication that it was encrypted.
I am trying to configure a custom encryption key for different pieces of content at the streaming level using jit.conf, for on-demand Apple HTTP Live Streaming. I tried setting "HLSEncryptionScope content" in httpd.conf and placed jit.conf in ../webroot/vod, but without success; the generated m3u8 does not contain EXT-X-KEY. Also, I got nothing in the Access plug-in log file while HTTP-streaming an m3u8 file. Is it possible to log HTTP Live Streaming using the plug-in? Can the plug-in receive any event triggered by HTTP streaming?
I welcome anyone finding faults in my reasoning, or expanding the discussion further. We are benchmarking FMS 3.5.x for live dynamic streaming and we have run across an issue. When throttling the client from a high bandwidth (1500 kbps) to a low bandwidth (325 kbps) via a bandwidth shaper (a physical firewall), it takes a very long (real) time for the client to see the new stream quality.
During the investigation of this issue we have narrowed this down to:
1. When a transition is requested by the client, the client-side buffer is about half full, i.e. 4-5 seconds.
2. If a transition is requested on a client with unlimited bandwidth, it takes about 6 seconds for the server to process it, find an acceptable switching position, and send a "transition.complete" event.
3. On the throttled client, however, this event takes much longer.
As of Flash 10.1, Adobe has added the ability to push bytes into the NetStream object via the appendBytes method [URL]... The main reason for this addition is that Adobe is finally supporting HTTP streaming of video. This is great, but it seems you need to use the Adobe Media Streaming Server ([URL]...) to create the correct video chunks from your existing video to allow for smooth streaming. I have tried a hacked version of HTTP streaming in the past where I swap out NetStream objects ([URL]...), but there is always a momentary pause between the chunks. With the new appendBytes, I tried a quick mock-up with the two sections of video from the preceding site, but even then the skip remains. Does anyone know how two consecutive .FLV files need to be formatted in order for the appendBytes method on the NetStream object to produce smooth video without a noticeable skip between the segments?
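For reference, this is roughly the Data Generation Mode setup I'm using, as a minimal sketch; how the segment ByteArrays are downloaded is not shown, and the variable names are my own:

import flash.media.Video;
import flash.net.NetConnection;
import flash.net.NetStream;
import flash.net.NetStreamAppendBytesAction;
import flash.utils.ByteArray;

var video:Video = new Video(640, 360);
addChild(video);

var nc:NetConnection = new NetConnection();
nc.connect(null);                       // no server connection needed for appendBytes playback

var ns:NetStream = new NetStream(nc);
ns.client = { onMetaData: function(o:Object):void {} };
video.attachNetStream(ns);
ns.play(null);                          // puts the NetStream into Data Generation Mode

// Call this once per downloaded segment, in order.
function appendSegment(bytes:ByteArray, isFirst:Boolean):void {
    if (isFirst) {
        // Only the very first segment may carry the FLV file header.
        ns.appendBytesAction(NetStreamAppendBytesAction.RESET_BEGIN);
    }
    // Later segments must contain FLV tags only, with timestamps that continue
    // from the previous segment; a repeated header or timestamps restarting at
    // zero is what produces the visible skip at the seam.
    ns.appendBytes(bytes);
}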
As seen in the tutorial [URL].., I have successfully implemented the steps. However, the problem comes when the stream is disconnected and then reconnected due to network problems. Once the stream is reconnected, the video stops coming; even though all 3 streams are successfully publishing to Flash Media Server, on the client side, where I'm using the OSMF media player, the stream stops. Is there any setting I need to implement in the OSMF player to resolve this issue?
Apologies if I repeat this IIS topic, but I couldn't find the answer to my problem anywhere on the internet. I installed FMS Dev 3.5 on Windows Server 2003 with IIS 6.0 enabled. I don't have any issue with port 80 listening; I used the IP address 192.168.0.21 for my web application (IIS) and 192.168.0.22 for FMS (I only have one network card, and port 1935 is open in the firewall). I can play the sample videos (RTMP, HTTP, and Dynamic Stream) using the Flash Media Server start screen (or from the location C:\Program Files\Adobe\Flash Media Server 3.5\webroot\index.html) without any problem.
I then modified the IIS default website to point at the "webroot" folder (C:\Program Files\Adobe\Flash Media Server 3.5\webroot). From IE, I can access the default website by entering http://192.168.0.21/index.html. The website loads correctly, and the RTMP video plays perfectly. However, if I click Play Video (HTTP) or the Dynamic Stream tab, I receive "Connection Error. Please press Play to try again." I looked at the log file (access.01.log) and see the error entry "Session disconnect".
I'm finally getting the hang of the Dynamic Streaming of live video content via FMS 3.5.2 and FMLE, and to a lesser extent, the DVR functionality that has newly been made available.
Inquiring minds want to know... is it possible to add DVR functionality to live Dynamically Streamed video content? In other words, I want to be able to provide DVR capability to our live videos that are being streamed at 3 different bitrates.
Is that currently possible? Jodi, can you ask David Hassoun, who seems to be the leading authority at the moment, if this is possible, and if so, if there's a tutorial we can access?
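In the meantime, here is a minimal client-side sketch of how I imagine the two features could be combined, assuming the live encodes are published in record mode to a DVR-enabled application; the stream names are placeholders, 'ns' stands for an existing NetStream on a connected NetConnection, and whether this works end to end still depends on the server-side application:

import flash.net.NetStreamPlayOptions;
import flash.net.NetStreamPlayTransitions;

// Start on the lowest-bitrate rendition, playing the recorded (DVR) portion
// from the beginning and continuing into the live point.
var opts:NetStreamPlayOptions = new NetStreamPlayOptions();
opts.streamName = "livestream_300";   // placeholder name for the 300 kbps encode
opts.start = 0;                       // 0 = recorded content from the start (DVR view)
opts.len = -1;                        // -1 = keep playing up to and into the live edge
opts.transition = NetStreamPlayTransitions.RESET;
ns.play2(opts);

// Later, when bandwidth allows, switch renditions without losing the DVR position.
var switchOpts:NetStreamPlayOptions = new NetStreamPlayOptions();
switchOpts.oldStreamName = "livestream_300";
switchOpts.streamName = "livestream_800";   // placeholder name for the 800 kbps encode
switchOpts.transition = NetStreamPlayTransitions.SWITCH;
ns.play2(switchOpts);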
I recently installed FMS 3.5 and loaded 3 copies of the same video with different bitrates into my vod folder. What's the easiest way to set up a dynamic stream for these videos?
When using Flash Media Live Encoder 3 to dynamically stream live video, the subscribed stream is "jumpy", for lack of a better term. I have no issues swapping between bitrate streams on pre-recorded video, but when I use live feeds, one of the streams seems corrupted. I've tried various bitrate settings, and different people experience the video differently, but about half the time one of the streams is all tiling and stuttering.
Example: if I publish 3 streams (200 kb/s, 600 kb/s, and 1200 kb/s) and the bandwidth drops, causing a downshift from 1200 to 600, then after the transition the 600 is in a weird stuttering one-second loop and continues to do this until the dynamic stream requests a switch to one of the other streams. I've tried reinstalling FMIS and FMLE, and no matter what I do there seems to be a 50/50 chance that a viewer will experience this. What's so weird to me is that I'll watch it in 2 separate player instances on the same machine, and one of them gets the stuttering on the 600 stream while the other one doesn't.
I developed a simple video player based on OSMF with dynamic streaming. It works fine with the examples from Akamai, but with my live streams the first stream loads, and as soon as the stream should switch to a higher bandwidth I get the error: ERROR: error ID=15 message="Playback failed" detail="Failed to play (stream ID: 1)". Does anyone have an idea what I am doing wrong, or what might be wrong with the videos or the streaming?
I'm working with dynamic streaming transitions and I would like to know how to stop a manual switch transition. Occasionally I want to let the user change the quality of the stream while another request is still in progress. The actual result is that the requests complete in cascade, and it would be nice to be able to stop or cancel an in-flight change.
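I'm not aware of a documented way to cancel a SWITCH once it has been sent to the server, so the workaround I'd sketch is simply not to issue overlapping requests: track the pending transition and only accept the next manual change after the server reports completion. This assumes 'ns' is the existing NetStream; the helper and variable names are made up for the example:

import flash.net.NetStreamPlayOptions;
import flash.net.NetStreamPlayTransitions;

var switchPending:Boolean = false;
var currentStream:String = "stream_700";   // placeholder for the stream currently playing

// The server reports a finished transition through onPlayStatus on the client object.
ns.client = {
    onMetaData: function(o:Object):void {},
    onPlayStatus: function(info:Object):void {
        if (info.code == "NetStream.Play.TransitionComplete") {
            switchPending = false;         // now it is safe to accept the next manual switch
        }
    }
};

function requestQuality(newStream:String):void {
    if (switchPending || newStream == currentStream) {
        return;                            // ignore (or queue) requests while one is in flight
    }
    switchPending = true;
    var opts:NetStreamPlayOptions = new NetStreamPlayOptions();
    opts.oldStreamName = currentStream;
    opts.streamName = newStream;
    opts.transition = NetStreamPlayTransitions.SWITCH;
    currentStream = newStream;
    ns.play2(opts);
}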
I installed the latest version of Apache before FMS 4. After the installation completed, I copied webroot, applications, and other files to the website's home directory. I also updated the fms.ini file in the conf folder. Because I am not familiar with FMS, I don't know whether the change is correct or not.
I need to find out how to lay out my bitrates and widths in a way that will: make the switching between two levels (video qualities) as smooth as possible; support a wide range of bandwidth (users with high bandwidth should be fed a high-quality video and users with low bandwidth a low-quality video); and support both player sizes, 416x240px and 640x340px, without doing two different bitrate/width layouts. I have tested the following bitrate/width layout: Q5-Q8 is meant for fullscreen view, Q1-Q2 is meant for our small player, and Q3-Q4 is meant for our large player. It works, but I'm far from sure it is the optimal layout. For instance, is it a problem that two videos with different widths have the same bitrate settings, or bitrates that lie close to one another? As far as I know, the player (JW Player) will not choose a video that is more than 20% wider than the player's width.
I read all the information above and all related posts. Now my question/problem is: can we use DynamicStream to access and handle a live stream from a user's web camera instead of a normal NetStream? Our application is like video conferencing. In our application the user's stream is not stored on the server; it is just published as live. When I publish the user's webcam stream as "record" (ns.publish("stream_name", "record")), I am able to get the stream. But when I publish the stream as "live" (ns.publish("stream_name", "live")), the stream is not retrieved. Sometimes I even get an exception in the "DynamicStream.as" class provided by Adobe.
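One way to narrow this down is to take DynamicStream.as out of the picture and test the raw publish/subscribe calls for a purely live stream; if this works, the problem is likely in how the helper class handles live content. A minimal sketch, with placeholder stream names and assuming 'nc' is an already connected NetConnection:

import flash.media.Camera;
import flash.media.Microphone;
import flash.media.Video;
import flash.net.NetStream;

// Publisher side: "live" means nothing is written to disk on the server.
var cam:Camera = Camera.getCamera();
var mic:Microphone = Microphone.getMicrophone();
var pubNS:NetStream = new NetStream(nc);
pubNS.attachCamera(cam);
pubNS.attachAudio(mic);
pubNS.publish("conference_user1", "live");

// Subscriber side: -1 subscribes to the live stream only (no recorded data expected).
var video:Video = new Video(320, 240);
addChild(video);
var subNS:NetStream = new NetStream(nc);
subNS.client = { onMetaData: function(o:Object):void {} };
video.attachNetStream(subNS);
subNS.play("conference_user1", -1);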
I am using the new Dynamic Streaming classes with AS3 and FMS 3.5. I am storing four different streams in different locations on FMS. Because I can only identify one URI, how can I pass the different paths for each of the F4Vs?
Example in brief:
private var _nc:NetConnection;
private var _ds:DynamicStream;
private var _vidDisp:Video;
private var _connClient:ConnectionClient;

private var _uri:String = "rtmp://myserver.com/vod";
private var streamName:String = "MyString";

var dsi:DynamicStreamItem = new DynamicStreamItem();
_ds = new DynamicStream( _nc );
_ds.client = _connClient;
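If I remember the sample classes correctly, DynamicStreamItem takes one application URI plus an addStream() call per rendition, and the stream names themselves can carry a sub-path relative to the application's streams folder, which is how you reach F4Vs stored in different locations under that one URI. Please verify the property and method names against your copy of DynamicStreamItem.as; the folder and file names below are placeholders:

var dsi:DynamicStreamItem = new DynamicStreamItem();
dsi.uri = _uri;                                     // rtmp://myserver.com/vod
dsi.addStream("mp4:low/MyString_300.f4v", 300);     // second argument = bitrate in kbps
dsi.addStream("mp4:med/MyString_700.f4v", 700);
dsi.addStream("mp4:high/MyString_1500.f4v", 1500);
dsi.addStream("mp4:hd/MyString_3000.f4v", 3000);
_ds.play(dsi);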
I have a video player with a playlist written in AS3, along with an XML settings file to edit the playlist. I can successfully play rtmp:// streams from my FMS server in the video player. I would like to use dynamic streaming with the dynamicStream.smil file in my existing video player/playlist. Lastly, I have been unsuccessful taking the snippet code from the FMS Dynamic Streaming sample page and using it within an HTML document. I assumed all I would need to change were the paths to where the rtmp:// video file, the .smil file, and the video player files are located on the FMS.
We are successfully using Captivate 4 to create SWF files leveraging a Flash Streaming Media Server hosted on a CDN. My question is: does anyone know how to leverage the dynamic streaming / adaptive bitrate feature in FMS 3.5 with a Captivate file? I have spent an entire day searching all over the Adobe forums and am surprised to see this is not documented ANYWHERE. Given that Captivate REQUIRES FMS to stream, it would be incredibly poor design if it cannot leverage the best feature FMS has to offer!
We publish to Flash Player 10, and while FP 10 IS compatible with dynamic streaming, my challenge is that Captivate only allows one URI path and one stream name. To deliver dynamic streaming you have to have several files with different encode rates available. But if Captivate only allows one stream name, how do you point it to a stream that finds all the alternative files? I've explored this with 3 different CDNs (Limelight, CDNetworks, and Internap) and they don't know how to do it.
The reason I'm so interested is that we are delivering a Moodle-hosted eLearning course to users around the globe with VERY different internet speeds. We've followed all the best practices in encoding, and this is the next step in making our course accessible to ALL users. For those interested, our files are mostly less than 10 MB, though some are as much as 50 MB, all encoded at 500 kbps.
I've been experimenting with Dynamic Streaming while in the process of writing a tutorial. The documentation is quite clear on a few points: the keyframe interval in the various encodings should be short; the bufferLength should be at least 2x the keyframe interval; and the player should sense a bandwidth change, by default, within the 4-second sampling interval and call for a switch, after which the switch could take as long as 2x the keyframe interval. What I'm finding is wildly different behavior than this. It takes anywhere from 10-15 seconds for the player to notice the change and call for a switch, then another 20-40 seconds for the switch to happen. When switching up to a higher bitrate stream, this just means the user gets low-bitrate video for longer than they ought to. But when switching down due to falling bandwidth, the buffer runs out and the user stares at the rebuffering sign for a lengthy time - long enough to give up on watching the video, for sure.
I've encoded an H.264 MP4 file at 64, 384, and 768 kbps, at 30 fps with a keyframe every 60 frames. I've streamed it over RTMP via two different CDNs that use FMS 3.5, into two different Flash video players (JW Player and Flowplayer). I've restricted my bandwidth on Windows XP with NetLimiter 2.0, and on the Mac with 'ipfw'. I've set bufferLength between 4 and 10 seconds. I've tested switching up and switching down. For up, I start with a 200 kbps bandwidth limit; the video starts OK with the correct stream, then at 5 seconds I open the bandwidth up to unrestricted. For testing down, I do the opposite: start unrestricted and then at 00:05 restrict to 200 kbps.
My test page, with both players and sample code, is at [URL] n-flash-bitrate-switching/. I also have a couple of screen recordings there showing the behavior of the whole process, both switching up and switching down. I thought I'd done everything right here and paid attention to every documented detail, but it works rather poorly. Can someone explain whether this is expected behavior, whether the players have implemented dynamic switching poorly, or whether I'm doing something wrong?
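In case it helps anyone reproduce this, here is the kind of diagnostic polling I've been adding to see whether the detection or the actual switch is the slow part. It's a rough sketch; 'ns' stands in for whatever NetStream the player exposes (both JW Player and Flowplayer ultimately wrap one, though how you reach it differs):

import flash.events.TimerEvent;
import flash.net.NetStreamInfo;
import flash.utils.Timer;

// Poll the stream's QoS info once a second and trace it, so the moment the
// measured bandwidth reacts to the throttle can be compared with the moment
// the player actually asks for (and receives) the switch.
var qosTimer:Timer = new Timer(1000);
qosTimer.addEventListener(TimerEvent.TIMER, function(e:TimerEvent):void {
    var info:NetStreamInfo = ns.info;
    trace("maxBytesPerSecond=" + Math.round(info.maxBytesPerSecond)
        + " bufferLength=" + ns.bufferLength.toFixed(1)
        + " droppedFrames=" + info.droppedFrames);
});
qosTimer.start();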
I have Streaming Server 3.5.3. The sample player page for dynamic streaming is here: [URL]. Click on the dynamic sample, pause, and wait a few minutes for the disconnect from the server. When you hit play, it resumes from that spot but also starts the audio again from the beginning in the background, while showing the orange buffering circle the whole time.
My users are students watching long lectures; they pause all the time, and when they come back, it is a mess.
I have a virtual directory (on a Storage Area Network) on the C: drive as well as in the "webroot" folder on Flash Streaming Server. What do I need to do to make RTMP videos work from the SAN directory on Flash Streaming Server? It works fine for HTTP, and RTMP from the vod application folder works fine. I have done a lot of research and found that we can use virtual directories for streaming videos, but I am unable to find steps on how to set this up.
Why is dynamic streaming taking so much time to switch the video from a lower bitrate to a higher bitrate and vice versa? I am doing dynamic streaming in the following way:
var param:NetStreamPlayOptions = new NetStreamPlayOptions();
param.oldStreamName = oldStream();
param.streamName = newStream();
[code]....
I am using dual buffering: 3 seconds when the video starts and 10 seconds on "NetStream.Buffer.Full". The video takes approximately 30-50 seconds to switch when I call the above code.
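For context, here is the complete call plus the buffer handling, written out as a rough sketch; oldStream()/newStream() are my own helpers that return the current and requested stream names, and 'ns' is the playing NetStream. As far as I understand it, the visible switch cannot happen sooner than the already-buffered data plus a keyframe interval, so a 10-second steady-state buffer and long GOPs would account for much of the 30-50 seconds:

import flash.events.NetStatusEvent;
import flash.net.NetStreamPlayOptions;
import flash.net.NetStreamPlayTransitions;

function switchStream():void {
    var param:NetStreamPlayOptions = new NetStreamPlayOptions();
    param.oldStreamName = oldStream();
    param.streamName = newStream();
    param.transition = NetStreamPlayTransitions.SWITCH;   // switch in place rather than reset playback
    ns.play2(param);
}

// Dual buffering: start small so playback begins quickly, grow once full,
// and drop back to the small value whenever the buffer empties.
ns.addEventListener(NetStatusEvent.NET_STATUS, function(e:NetStatusEvent):void {
    switch (e.info.code) {
        case "NetStream.Buffer.Full":
            ns.bufferTime = 10;
            break;
        case "NetStream.Buffer.Empty":
            ns.bufferTime = 3;
            break;
    }
});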