I'm stuck trying to get a Stream.play() call on a recorded video file (.flv) to propagate from origin to edge servers; I don't even know if it's possible. If I call Stream.play() on a server and connect a viewer directly to that server, the viewer sees the stream play. If I call Stream.play() on an origin server and connect a viewer to an edge server, the stream does not play.
Is there any way to configure things so that the play() call propagates to the edge servers? According to the FMS 3.5 docs on the Stream class, the following is the case:
"When you call Stream.play() on the server, the server becomes the publisher."
See [URL].
If that server becomes a publisher, and let's say it is an origin server on a CDN, shouldn't the other machines in the configuration receive the published stream? Somehow, when you publish a live stream to an origin server it makes its way to the edge servers. Does that analogy not apply to the scenario where you call Stream.play() on a recorded file?
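For reference, the server-side call is roughly the following; this is only a minimal sketch, where "serverStream" and "sample" (an FLV assumed to sit in the application's streams folder) are placeholder names:

[code]
// main.asc - rough sketch; "serverStream" and "sample" are placeholder names
application.onAppStart = function()
{
    var s = Stream.get("serverStream");
    if (s)
    {
        // start=0, len=-1: play the whole recorded file from the beginning.
        // The server now publishes "serverStream".
        s.play("sample", 0, -1, true);
    }
};
[/code]

A viewer connected directly to this server can subscribe to "serverStream" and it plays; the same subscription made through an edge does not.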
Is it possible to use an origin/edge setup when streaming using HDS? I have successfully tested it using RTMP, but when I try the HDS URL I get the error "we are having problems with playback". I can view the stream directly if I connect to the origin server. I have one origin server and one edge server.
I have a multi-user application running on FMS. I need to cluster it using edge/origin with origin failover. I went through http:[url]... My questions are: 1) Which software load balancer can I use? Is there a possibility of configuring the FMS server to be a reverse proxy? http:[url]... In the link below it is mentioned that we can configure edges in four ways: client auto-discovery proxy, server auto-discovery (reverse proxy), explicit URI, and implicit URI (recommended). What configuration do I need to do for each? I was successful only with the explicit URI way, by configuring Vhost.xml. How would I do it for the others?
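For what it's worth, my understanding of the explicit vs. implicit URI difference on the client side is roughly the following; the hostnames are made up and the exact explicit syntax is an assumption on my part:

[code]
// Client-side ActionScript 3 - hypothetical hostnames
import flash.net.NetConnection;

var nc:NetConnection = new NetConnection();

// Explicit URI: the client names both the edge and the origin it should route to.
nc.connect("rtmp://edge.example.com/?rtmp://origin.example.com/myApp");

// Implicit URI (recommended): the client only names the edge; the edge's
// Vhost.xml (Proxy mode "remote" plus a RouteEntry) decides where to route.
//nc.connect("rtmp://edge.example.com/myApp");
[/code]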
Where can I find a step-by-step guide for clustering FMS? url... but it is not clear what exactly to do; it just says:
1. Install and configure the first Flash Media Server.
2. Use the same serial number and license file each time you install Flash Media Server. (NOTE: A special cluster license file is required. For more information, contact your Macromedia representative.)
3. Confirm that this Flash Media Server instance is working correctly.
4. Configure the Flash Media Server instance as an origin server.
5. Install and configure the next Flash Media Server in the cluster.
6. Configure this Flash Media Server instance as an edge server.
7. Make sure this edge server points to the origin server.
8. Repeat steps 5 through 7 for each edge server that your license file allows.
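If it helps, step 7 ("make sure this edge server points to the origin server") seems to come down to the edge's Vhost.xml. A sketch, with a made-up origin hostname and the file path assuming the default adaptor/vhost:

[code]
<!-- Vhost.xml on the edge (typically conf/_defaultRoot_/_defaultVHost_/Vhost.xml) -->
<!-- origin.example.com is a placeholder for the real origin address -->
<VirtualHost>
    <Proxy>
        <!-- "remote" makes this vhost act as an edge/proxy instead of an origin -->
        <Mode>remote</Mode>
        <RouteTable protocol="">
            <!-- Route any incoming host:port to the origin on port 1935 -->
            <RouteEntry>*:*;origin.example.com:1935</RouteEntry>
        </RouteTable>
    </Proxy>
</VirtualHost>
[/code]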
I just came across the edge/origin configuration. I was wondering if an edge/origin configuration would be good for live applications, like live video chat or sharing a drawing board. I understand that it will be useful in case of failures. But unlike video-on-demand applications like streaming, where edges help because they cache the content, a live application gets no caching benefit. So for a live application, apart from failover, are there any other advantages to having an edge/origin setup?
We have an origin server where the video materials are stored and one edge server that clients connect to.
If I allow clients to connect directly to the core server, it easily handles 1000 users without any problems. If clients connect through the edge, after about 100 users the player starts buffering the video every few seconds.
On the other hand, if I connect FMLE to the core (same video bitrate as the materials already encoded), clients can connect to the edge with no problem (tested with 1000 users) and no buffering issues appear.
There is no I/O wait problem on the core, the edge is configured not to use any cache, and there is no connectivity issue between the master and the edge.
We are currently deciding what to use: Wowza (not so likely, since they are undergoing the lawsuit) or Adobe Flash Media Server. The problem is that I couldn't find any information on how to configure the load balancer with FMS. I tested with Wowza, which has a built-in load balancer, but FMS does not. I forgot to mention that we need it mostly for live events using the RTMP protocol. My current configuration is one origin server plus two edge servers (trial developer edition). I can manage to play the live stream through the edge-origin setup using the edge URL. How can I configure things so that when I connect to the origin server it redirects me to the least loaded edge server?
I tried setting the connection limit on the origin server to 2 connections (one for FMLE, one for the edge) and connecting with 5-10 clients, but the origin server didn't redirect them to the edge.
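In case it's useful, the closest thing I can think of in plain Server-Side ActionScript is to have the origin reject viewer connections and hand back an edge address for the client to reconnect to. This is only a sketch: the edge list, the round-robin choice, and the "redirect" property name are all my own, and it does not actually measure edge load:

[code]
// main.asc on the origin - rough sketch, not a real least-loaded balancer
var edges = ["rtmp://edge1.example.com/live", "rtmp://edge2.example.com/live"];
var next = 0;

application.onConnect = function(client)
{
    // Pick the next edge in round-robin order and bounce the viewer to it.
    var target = edges[next];
    next = (next + 1) % edges.length;
    // The object passed here comes back to the client as info.application
    application.rejectConnection(client, {redirect: target});
};
[/code]

[code]
// Client side (ActionScript 3): reconnect to whatever the origin handed back.
// nc is the NetConnection this listener was added to.
function onStatus(e:NetStatusEvent):void
{
    if (e.info.code == "NetConnection.Connect.Rejected"
        && e.info.application && e.info.application.redirect)
    {
        nc.connect(e.info.application.redirect);
    }
}
[/code]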
I have set up an origin and an edge server to offload live and VOD content. However, my end users are pointing explicitly at the edge server. Is there any way to configure the setup so the edge server (proxy) is transparent and the end users only know about the origin server? I am trying to configure video split streaming with FMS.
I've been trying for a few days now to configure an edge server that would connect to our working origin server. I have not changed anything on the origin server and media is served properly; I can easily play all media from the videoPlayer.html sample app distributed with the server.
I was able to set up an FMS edge-origin cluster successfully, but origin-only clustering is what we want for our live applications. The only place I found it covered was 'Large Scale Deployment with FMIS'. I guess for origin-only clustering we need to have our own caching system. What gets cached in a live AV chat? How do I propagate it to other origins? What kind of plug-ins would I need for cache management? What should such a plug-in do?
In the attached origin-only architecture, what do the primary origin and the secondary origin do? I thought the origin is the place where the application actually runs and the edges just proxy. Is it the same here in origin-only clustering? Are there any tools that can measure the lag in AV (like performance-measuring or load-testing tools for FMS)? What kind of software load balancer can we use for FMS clustering (like Nginx, Zeus, or something similar)?
I have an origin server, two edge servers, and a load balancer.
First off, I'm confused: on each edge server I set the proxy mode to remote and the protocol to RTMP, and changed the route entry to <RouteEntry>*:*;96.44.***.***:1935</RouteEntry> on both servers.
The origin server is just local.
My question is: I'm streaming to the origin and it's getting sent to one edge, while the other edge is doing nothing, no traffic. I set the stream URL this way:
'rtmp://173.254.***.***/live/live_test',
which is the one edge that's working. If I set up another channel pointing to the other edge server's IP address, it doesn't work. What am I doing wrong? Isn't all of this supposed to be load balanced, or am I missing steps... or did I do it wrong in the Flash player config?
I am trying to set up my video site to stream from the FMS server, and the instructions stated that I should create my own new folder in the applications directory on the FMS server. I have created my directory, which I called "brigma_streams", and I have copied the files from {FMS-Install-Dir}/sample/Application/vod to my new folder.
I am trying to create a Flash player to stream an online Internet radio station using FMS. I chose FMS after being told that I need an RTMP server to extract the metadata. Our radio audio is encoded using a DJ interface called SAM Broadcaster, but for some reason it does not have an option to send the stream directly to a Flash Media Server; the only options are IceCast or SHOUTcast. How should I set up the stream? Should it be stream encoder >> IceCast/SHOUTcast server >> Flash Media Server >> Flash player client, or should I set up stream encoder >> Flash Media Server >> Flash player client? Shouldn't Flash Media Server be an alternative to IceCast or SHOUTcast? If that is the case, how do I send the stream from SAM Broadcaster directly to Flash Media Server without restreaming through another streaming server?
I have a question regarding FMS 3.5 and live streaming with failover protection. Currently we are running a single server with FMS 3.5 for our VOD and live streaming events, and it's been working great. During our last live video stream, however, the server decided to die and naturally the live stream was lost. To prevent such a scenario from happening again, and given that I am on a limited budget, what would be the best method of prevention? I saw there is limited documentation on setting up origin-to-edge, or two origin servers (more likely for us, since we can maybe purchase one more server at this time), but I do not have access to make network/router changes, etc. I do have full access to configure my Windows 2003 servers, however.
I have recently installed FMIS 3.5.3. In checking the access logs, I find data in both logs that display the same stream stop and stream play time. I'm not sure why the time is the same (00:19:27 in the example below). Videos play fine when testing from work (T3 connection); however, there is occasionally a very slight hesitation when playing video from home (I have a cable connection). [code]...
I have a bunch of live streams coming from FMLE, say "FMLE_channel1", "FMLE_channel2", "FMLE_channel3". On the server side, I created corresponding republished streams called "channel1", "channel2", "channel3".
On a periodic basis, we call Stream.get("channel1").play("FMLE_channel1", -1, 10, true) every 10 seconds, and similarly for the second and third channels. Soon after the Stream.get("channel1").play() call, I should get the following events in sequence:
info: NetStream.Unpublish.Success
info: NetStream.Publish.Start
info: NetStream.Play.Reset
info: NetStream.Play.Start
In that case everything is happy: clients can view channel1, channel2, and channel3 fine. But after a while, one of the three channels (in most cases channel1) is no longer viewable.
With the server trace info, I found that after the Stream.get("channel1").play() call, only the following two events exist:
info: NetStream.Unpublish.Success
info: NetStream.Publish.Start
i.e., the Play.Reset and Play.Start events were missing. I further checked and confirmed that FMLE was publishing all three channels fine to the server. I was able to view "FMLE_channel1" from Flash clients, but not the republished "channel1". The version is FMS 3.5.0.
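For context, the server-side code is essentially the following simplified sketch; the interval plumbing is mine, while the stream names and play() arguments are the ones described above:

[code]
// main.asc - simplified version of the republish loop described above
var channels = ["channel1", "channel2", "channel3"];

function refreshChannels()
{
    for (var i = 0; i < channels.length; i++)
    {
        var s = Stream.get(channels[i]);
        if (s)
        {
            // Re-point the republished stream at its FMLE source.
            // (-1, 10, true) are the start/len/reset arguments from the
            // original call; -1 asks for live data only.
            s.play("FMLE_" + channels[i], -1, 10, true);
        }
    }
}

application.onAppStart = function()
{
    setInterval(refreshChannels, 10000); // every 10 seconds
};
[/code]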
I have an *.flv file on an FMS. When I play it on the client side the video plays just fine, but when I call Stream.play(filename, 0, -1, false) on the server side the video turns out really choppy. In both cases I use NetConnection to connect over RTMP and NetStream to play the stream, but in the second case I connect to a stream and ask the server to play my file on that stream. Apparently that doesn't work with files? It works just fine for live streams.
Using SSAS on FMS 3.5, is there a way to get the stream name of a play event? In my case I am trying to run additional steps when a stream is played, and I need the stream name to do so.
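As far as I can tell, SSAS does not expose a play callback directly (the C++ Authorization plug-in does, via its E_PLAY event), so the only workaround I can think of is having the client announce the stream name itself before calling play(). A sketch, where "announcePlay" is a made-up method name:

[code]
// main.asc - workaround sketch: the client tells us what it is about to play
application.onConnect = function(client)
{
    client.announcePlay = function(streamName)
    {
        trace("about to play: " + streamName);
        // ...run the additional per-stream steps here...
    };
    application.acceptConnection(client);
};
[/code]

[code]
// Client side (ActionScript 3), before starting playback:
// nc is the connected NetConnection, ns is a NetStream created on it
nc.call("announcePlay", null, "someStream");
ns.play("someStream");
[/code]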
I have started the vhost setup. I have cPanel on the server; I added the domain and it resolves to the hosting account just fine. Pulling up www.domain.com goes to the server.
I added the www.domain.com vhost. I pointed the application dir to /home/domain/public_html/applications/
I copied the live application over to the applications directory and set up a user and password for the virtual host login.
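If it helps, the applications directory for the vhost should correspond to the AppsDir element in that vhost's Vhost.xml. A sketch, with the file path assuming the default adaptor:

[code]
<!-- conf/_defaultRoot_/www.domain.com/Vhost.xml (path assumes the default adaptor) -->
<VirtualHost>
    <!-- Applications for this vhost are looked up under this directory,
         so the "live" app lives in /home/domain/public_html/applications/live -->
    <AppsDir>/home/domain/public_html/applications/</AppsDir>
</VirtualHost>
[/code]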
I'm trying to get Flex Builder 3 to play a live video stream with Flash Media Server and Flash Media Live Encoder.
I'm able to stream pre-recorded (VOD) FLVs in Flex from Flash Media Server, and I'm able to stream live video using Flash/FMS/Flash Media Live Encoder, but not with Flex.
This code streams pre-recorded video-on-demand FLVs, but not live streams. If I change the source to "rtmp://localhost/live/livesream.flv", I get nothing.
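For reference, the plain NetConnection/NetStream route in the Flex app would look roughly like this. It is only a sketch that assumes the FMLE stream is published as "livestream" to the "live" application (live streams are referenced without the .flv extension):

[code]
// Inside the Flex app's <mx:Script> block - rough sketch; call init() on creationComplete
import flash.net.NetConnection;
import flash.net.NetStream;
import flash.events.NetStatusEvent;
import flash.media.Video;
import mx.core.UIComponent;

private var nc:NetConnection;

private function init():void
{
    nc = new NetConnection();
    nc.addEventListener(NetStatusEvent.NET_STATUS, onNCStatus);
    nc.connect("rtmp://localhost/live");
}

private function onNCStatus(e:NetStatusEvent):void
{
    if (e.info.code == "NetConnection.Connect.Success")
    {
        var ns:NetStream = new NetStream(nc);
        ns.client = { onMetaData: function(md:Object):void {} }; // avoid callback errors
        ns.play("livestream"); // stream name as published by FMLE, no extension

        var video:Video = new Video(640, 480);
        video.attachNetStream(ns);

        // Flex containers only accept UIComponents, so wrap the raw Video first
        var holder:UIComponent = new UIComponent();
        holder.width = 640;
        holder.height = 480;
        holder.addChild(video);
        addChild(holder);
    }
}
[/code]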
I've noticed an issue while creating a one-way (A publishes, B plays) stream over RTMFP. When I try to create a connection between computers behind the same NAT (same IP), stream playing is not fired at all; on the other hand, the same code works on remote computers (different IPs). This issue appeared after the final release of Flash Player; in the beta release everything worked fine. We checked the NAT settings and there are no differences between these cases.
I modified the main.asc script in livepkgr. It used to call s.play(streamObj.name), which I assume is the code that plays the stream coming from FMLE; I changed it to s.play("FileName"), which is an FLV video file in my application. Unfortunately, when I start the FMLE stream to the FMS and watch the stream with the FMS sample video player, I was expecting to see the video content from the FLV, but instead the camera footage plays in the player.
I am very confused now. I thought s.play() was the only code controlling what should be playing in the application. Or, when FMLE is connected to the server, does the server play the footage from FMLE through some hidden code, no matter what s.play() wants to play?
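To make sure I understand the two cases, this is roughly how I picture it; a minimal sketch, not the real livepkgr logic, where "republished" and "FileName" are placeholders:

[code]
// main.asc - minimal sketch of the two cases, NOT the stock livepkgr script
application.onPublish = function(client, streamObj)
{
    var s = Stream.get("republished");
    if (s)
    {
        // Case 1: mirror the incoming FMLE footage onto "republished"
        //s.play(streamObj.name, -1, -1, true);

        // Case 2: play a recorded FLV from this application's streams folder
        s.play("FileName", 0, -1, true);
    }
};
[/code]

Either way, s.play() only controls what that server-side stream ("republished" here) carries; a player that subscribes directly to the name FMLE is publishing would still get the camera footage, which may be what the sample player is doing.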
NetStream.play(streamName, -1); this seems to be working incorrectly. If I have recorded an audio-only FLV named "myaudio" on the server using FMS and FMLE, and afterwards I try to play a live stream using NetStream.play("myaudio", -1), it plays the recorded stream. I believe the documentation says it should start a live stream instead of playing the recorded stream, since the second argument is -1. Is this a bug in the NetStream.play() method?