Media Server :: Multipoint Publish - UnPublish The Streams?
Feb 10, 2012
I have an FMLE source streaming to FMS #1, and FMS #1 is multipoint publishing to FMS #2. It works great, but when I stop the FMLE source, the multipoint publishing from FMS #1 to FMS #2 doesn't stop (unpublish), and an empty (blank black) stream continues between the FMS servers. So how do I unpublish the multipoint publishing streams on FMS #1? Here is my code in the live application on FMS #1:
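A minimal sketch (not the original code) of how this pattern is often wired in main.asc, assuming FMS 3.5's application.onPublish/onUnpublish callbacks and a server-side NetStream; the remote URI and stream names are placeholders:

// main.asc on FMS #1 - sketch: forward an incoming live stream to FMS #2 and
// tear the forward down when the encoder stops publishing.
var remoteConn;     // NetConnection to FMS #2
var remoteStream;   // server-side NetStream used to republish

application.onPublish = function(client, stream) {
    remoteConn = new NetConnection();
    remoteConn.onStatus = function(info) {
        if (info.code == "NetConnection.Connect.Success") {
            remoteStream = new NetStream(remoteConn);
            // Republish the incoming stream under the same name on FMS #2.
            remoteStream.attach(Stream.get(stream.name));
            remoteStream.publish(stream.name, "live");
        }
    };
    remoteConn.connect("rtmp://fms2.example.com/live");   // placeholder URI
};

application.onUnpublish = function(client, stream) {
    // Closing the connection to FMS #2 stops the republished stream there,
    // so no empty stream is left running between the servers.
    if (remoteConn) {
        remoteConn.close();
        remoteConn = null;
        remoteStream = null;
    }
};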
I am facing a strange issue with a pair of FMIS servers set up with Multipoint Publishing. Configuration is as follows: Encoder >>>> FMIS1 >>>> FMIS2
When freshly started, I am able to view live streams from both servers; however, periodically FMIS2 unpublishes the streams, as seen in the log below. When this happens, live streams are accessible from FMIS1 but not FMIS2. In order to resume live streams on FMIS2, FMIS1 must be restarted.
Logs:
2010-09-01 14:31:00 9244 (s)2641173 NGA_1 is unpublishing
2010-09-01 14:31:00 9244 (s)2641173 FTV_1 is unpublishing
2010-09-01 14:31:00 9244 (s)2641173 BBCWN_1 is unpublishing
2010-09-01 14:31:00 9244 (s)2641173 CTI_1 is unpublishing
2010-09-01 14:31:00 9244 (s)2641173 CNN_1 is unpublishing
.....
I'm using FMIS 3.5.2 (Windows XP) and was trying to get this example working: URL... When my Flash Media Live Encoder connects to the 'livestreams/localnews' publishing point, I get the following in the logs:
CSAAACPI is connected
Sending error message: Method not found (releaseStream).
Sending error message: Method not found (FCPublish).
localnews is publishing into application livestreams/_definst_
Republishing the stream into "livestreams/anotherinstance"
Stream Status: NetStream.Publish.Start
The stream is now publishing.
When I try to connect to "livestreams/localnews" with a Flash media player I can see the stream from my webcam, but when I try to connect to the republished stream "livestreams/anotherinstance" I get nothing. The player says "loading..." and shows nothing.
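For reference, the two "Method not found" warnings generally just mean that the releaseStream and FCPublish calls FMLE makes have no handlers defined in main.asc; a commonly seen sketch of no-op handlers looks like this (the response objects are assumptions taken from typical examples, not from this setup):

// main.asc - no-op handlers for the calls FMLE makes around publishing.
Client.prototype.releaseStream = function(streamName) {
    // FMLE calls this to release any earlier instance of the stream name.
};
Client.prototype.FCPublish = function(streamName) {
    // Acknowledge so FMLE proceeds with the publish.
    this.call("onFCPublish", null, {code: "NetStream.Publish.Start", description: streamName});
};
Client.prototype.FCUnpublish = function(streamName) {
    this.call("onFCUnpublish", null, {code: "NetStream.Unpublish.Success", description: streamName});
};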
I am sending a live stream over the web. I'm testing its performance and reliability, and I've noticed that after unpublishing and publishing the stream again, the video becomes very laggy - about 1-2 fps. It happens when the player has a buffer bigger than 0. When the buffer is set to 0, everything is OK, and after a short break (the moment when the stream is unpublished), the video plays normally, as before the break (~25 fps).
I want to record a stream which is published with Flash Media Live Encoder to FMS 3.5, but split the recording into files with a predefined length. For example, if a stream 'webcam' is published, I want to record it in chunks of 10 minutes: 'webcam1.flv', 'webcam2.flv' ... From what I can tell there's no facility to work with timers. The only solution I could think of was using stream.record() with a time limit parameter, but that seems like a hack because it triggers Stream.Record.DiskQuotaExceeded on the stream when the recording should stop and start recording another chunk. Has anyone done something similar?
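A sketch of that rotation idea, assuming the maxDuration parameter of Stream.record() that the post refers to (FMS 3.5) and the application.onPublish/onUnpublish callbacks; the chunk names and the exact status-code string are assumptions to verify against the docs:

// main.asc - sketch: record the live "webcam" stream in 10-minute chunks.
var chunkIndex = 0;
var chunkStream = null;

function startChunk(sourceName) {
    chunkIndex++;
    var s = Stream.get(sourceName + chunkIndex);   // writes webcam1.flv, webcam2.flv, ...
    if (!s) return;
    s.onStatus = function(info) {
        // Roll over to the next file when the time limit is hit
        // (verify the exact code string against the FMS docs).
        if (info.code == "NetStream.Record.DiskQuotaExceeded") {
            this.record(false);
            startChunk(sourceName);
        }
    };
    s.play(sourceName, -1, -1);     // follow the live stream
    s.record("record", 600);        // assumed maxDuration of 600 seconds
    chunkStream = s;
}

application.onPublish = function(client, stream) {
    if (stream.name == "webcam") startChunk(stream.name);
};

application.onUnpublish = function(client, stream) {
    if (stream.name == "webcam" && chunkStream) chunkStream.record(false);
};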
I am developing a video chat system; one side is a model, the other side is a customer. I want the model's stream sent to FMS, and then FMS to make two streams: one regular and one compressed. The compressed stream is just for the administrator, so it should be small.
I'm using the server-side Stream.play() method to play back a recorded stream. It plays back fine until it hits the buffer limit (say I set the buffer to 5 seconds, it will play back fine for 5 seconds). But then it freezes and playback is very stuttery (1 frame every 2-3 seconds). Is this a known issue? I'm using Windows 2008 Server. I've tried a few things to resolve this but no luck. The server is running the dev license and has no load.
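For context, a minimal sketch of the kind of setup being described, assuming a server-side Stream created in main.asc; the stream and file names are placeholders:

// main.asc - sketch: replay a recorded file through a server-side stream
// with a 5-second buffer (the buffer that triggers the stutter above).
application.onAppStart = function() {
    var s = Stream.get("playback");            // placeholder outgoing stream name
    if (s) {
        s.setBufferTime(5);
        s.play("recordedFile", 0, -1, true);   // placeholder recorded file name
    }
};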
I have configured FMS on Amazon EC2. I am trying to capture data from my webcam and push it to the server. For storing the webcam stream I created a new folder, but as is recommended in other forums, "You need to replace the signed Live application (main.far) with main.asc in [FMS-Install-Dir]/samples/applications/live/ (for more info, refer to the documentation & live/readme.txt)". I am not able to locate the samples folder from which I can get my main.asc file.
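If the samples folder can't be located, it may help to know that main.asc is just the application's server-side script; a bare-bones sketch that simply accepts connections (not Adobe's signed sample, which contains additional logic) looks like this:

// main.asc - minimal application script: accept every client and log
// publish/unpublish events. Add your own authentication as needed.
application.onConnect = function(client) {
    return true;
};

application.onPublish = function(client, stream) {
    trace("Publishing: " + stream.name);
};

application.onUnpublish = function(client, stream) {
    trace("Unpublishing: " + stream.name);
};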
I was wondering: is it possible to manipulate the video streams in any way on the server side? For example, to have two streams coming from two clients and mix them into one stream, so that a third client (or more) can play just one stream per client instead of two?
We just upgraded our FMIS from 3 to 3.5. All our VOD streams were working prior to the upgrade. Now the streams do not play. When logged in to the admin console and checking the Server Log, this is the message. I assume this is correct. I have ****'d out the address.
[Code]...
Is there anything more that I can check to see where this is failing?
I'm trying to figure out whether or not it's possible to have two incoming streams to the same FMS server. Every time I try to search for this information, I find stuff on having multiple outgoing streams, but nothing on multiple incoming streams.
The two streams will have the same content, but will use different video / audio codecs, and be set to different resolutions / bitrates.
I don't want to spend the money on another server, so it'd be great to know if FMS can do this.
Does anyone know if it's possible with any of the Adobe products to automate and manage streams?
We've got a project that will be streaming pre-recorded video and music. Due to the number and frequency of these streams, we need a way to automate and manage them through our own CMS.
We will ultimately be pushing these streams out to our CDN provider, so no end-user streaming is actually required.
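As one data point, pre-recorded content can be republished from main.asc as a server-side playlist that behaves like a live stream, which a CMS could drive; a sketch with placeholder file and stream names:

// main.asc - sketch: chain recorded files into one outgoing stream that a
// CDN or player can pull. A real CMS would supply and refresh the list.
var playlist = ["show1", "music1", "show2"];   // recorded files in the streams folder

application.onAppStart = function() {
    var out = Stream.get("channel1");          // placeholder outgoing stream name
    if (out) {
        for (var i = 0; i < playlist.length; i++) {
            // reset=false appends each item instead of replacing the playlist.
            out.play(playlist[i], 0, -1, false);
        }
    }
};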
Do I need a wireless connection from the server? I am thinking of installing an access point to test the server. The only challenge is I don't know how to get the client to receive streams.
I have a problem with a recorded stream: the metadata is only partially correct. Most of the information is correct, but the two most important values for me, the video width and height, are always 320 x 240 for any recorded stream, although the streams are published at different dimensions. I have tried to get my head around this but can't seem to figure out where the problem is. FMLE is used to publish the stream, using the H.264 / MP3 codecs. Here is the server-side code used to record the stream: [code]....
I've been using FMS for a few weeks, and feel relatively comfortable with it, but am currently having some trouble connecting to a live stream in a video conferencing-type application. I suspect I'm leaving some small step out, but am having trouble seeing what that step is.
I have one client application open the video camera, connect to the FMS, and publish its live camera stream to the server. Something like this:
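For reference, a typical client-side publish in ActionScript 3 looks roughly like this (a sketch with placeholder application and stream names, not the poster's code):

// Client side (AS3) - sketch: open the camera and publish it as a live stream.
import flash.net.NetConnection;
import flash.net.NetStream;
import flash.events.NetStatusEvent;
import flash.media.Camera;

var nc:NetConnection = new NetConnection();
nc.addEventListener(NetStatusEvent.NET_STATUS, onNetStatus);
nc.connect("rtmp://yourserver/videochat");   // placeholder application URI

function onNetStatus(e:NetStatusEvent):void {
    if (e.info.code == "NetConnection.Connect.Success") {
        var ns:NetStream = new NetStream(nc);
        ns.attachCamera(Camera.getCamera());
        ns.publish("myCameraStream", "live");  // placeholder stream name
    }
}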
I am publishing 2 live streams from a computer with 2 video capture cards in it, and I get a lag every 30 seconds or so on the subscribers' side. I have tried adjusting the camera quality and setMode properties, but still the lag persists inside and outside the LAN. Is there a way to create a buffer on the server, or adjust the way the live stream is received on the subscribers' side, so there is no noticeable lag? I saw something about editing Application.xml to adjust the queue and a suggested bitrate, but I'm not sure if this is applicable; here is the link:
[URL]
Here is my setup:
The publishing computer:
2 PCI-e x1 cards, one takes S-Video (480i) and the other DVI (720p); Windows 7 64-bit; Intel i7; 6 GB RAM; GB NIC and switch
From the switch it is one hop to a GB router and out to a 10 MB pipe which leads to our datacenter 30 miles away. The swf on this side just gets the 2 cameras, sets the quality to (0,80) and mode to (640,480,25,false) (I have played with these settings a little) and creates 2 live streams on the FMS.
The FMS: I am running Flash Media Interactive Server 3.5 on my own server with two 3.6 GHz dual-core Xeon processors and 4 GB RAM. This server resides in a datacenter and has a 100 MB burstable pipe. From the FMS administration console I am barely using 4 MB total bandwidth, 1% CPU usage, 2% RAM.
The subscribing computer: I have used many different types of hardwired PCs within the same LAN and outside the LAN; results are the same. The swf on this side just creates 2 new video instances and attaches the 2 netstreams to them. They are placed side by side, and the height and width of these videos are undefined.
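For what it's worth, the queue settings usually pointed at for this kind of periodic lag live in the application's Application.xml; a sketch with example values to experiment with (the defaults are commonly cited as 500 ms and 4096 bytes, and element names should be checked against your FMS version):

<!-- Application.xml (sketch) - flush queued live messages to subscribers more often. -->
<Application>
    <Client>
        <MsgQueue>
            <Live>
                <MaxQueueDelayMsec>100</MaxQueueDelayMsec>
                <MaxQueueSize>2048</MaxQueueSize>
            </Live>
        </MsgQueue>
    </Client>
</Application>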
Can Flash Media Server 3.5 do the following? Can it take multiple live streams? Is it possible to control the IP streams by using APIs into the Media Server?
After solving an important issue with the audio conference, I have another question. When a user enters the conference, how can I get the streams currently open on the server, in order to automatically let the attendee hear the presenter?
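One common approach is to track the published stream names in main.asc and expose the list to clients; a sketch, where getPublishedStreams is a made-up method name (and application.onPublish/onUnpublish assume FMS 3.5 or later):

// main.asc - keep a list of currently published stream names.
var publishedStreams = [];

application.onPublish = function(client, stream) {
    publishedStreams.push(stream.name);
};

application.onUnpublish = function(client, stream) {
    for (var i = 0; i < publishedStreams.length; i++) {
        if (publishedStreams[i] == stream.name) {
            publishedStreams.splice(i, 1);
            break;
        }
    }
};

// Clients call this over NetConnection.call() and then play the names returned.
Client.prototype.getPublishedStreams = function() {
    return publishedStreams;
};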
"Flash Media Interactive Server 3.5 or later supports recording of streams using the H.264 codec, directly on the server. Archive high-quality live streams so you can quickly deploy the content on demand after your live event is over, and use the free F4V flattener utility to prepare your file for video editing using Adobe Premiere Pro CS5."
There is no problem recording the stream in FLV format, but I can't find a way to record using H.264 compression.
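A sketch of the usual server-side approach, where prefixing the recording stream's name with "mp4:" makes FMS write an F4V/MP4 container instead of FLV; the names are placeholders:

// main.asc - sketch: record an incoming H.264 live stream server-side.
application.onPublish = function(client, stream) {
    var rec = Stream.get("mp4:" + stream.name + ".f4v");   // F4V container
    if (rec) {
        rec.play(stream.name, -1, -1);   // follow the live stream
        rec.record();                    // writes into the application's streams folder
    }
};

application.onUnpublish = function(client, stream) {
    var rec = Stream.get("mp4:" + stream.name + ".f4v");
    if (rec) rec.record(false);
};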
When I publish the streams on a Linux server, the .flv files have permissions like this: -rw-rw----. And I don't have permission to access these files from a client browser... Currently, I can only change the permissions manually using "chmod".
We have FMS 3 on a server I'll call "Streamer". We have a public webserver running IIS on a server I'll call "Web Server". We host our website on Web Server, and both Streamer and Web Server are behind our firewall.
I made a page on Web Server with a Flash file that has a player which points to [url]...
When we navigate to www.domain.com/test.html (which has the embedded video) from inside the network, we get the video streamed to us. However, when we get a laptop with an aircard connection up (to simulate a request from the web), the user gets a blank flash file with no video whatsoever.
On Streamer, the log file shows the IP address of my PC (the PC that made the request for the video from inside the network), so that makes sense. But it shows no other request being made (like when the laptop/aircard made the request from the web).
We used Wireshark, and sure enough it showed that there was traffic from Web Server to Streamer when it captured the internal request. But there was no traffic when the public request was made.
I own a site like Ustream/Justin where users can register and broadcast their channels using FMLE. Sometimes I need to ban broadcasters after I receive a complaint against a channel. I do that on the website; there is no issue with that. But how do I do it on the server side in FMIS? I need to do it because users who were already watching the channel page before it got banned can still watch the stream until they refresh. So I need something that can stop the live stream of that particular channel on the FMS server too. How can I supply stream names to main.asc that should be banned, or not allowed to publish, without restarting the server each time?
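One approach is to keep a runtime ban list in main.asc and enforce it in onPublish; a sketch, where banStream and publishedName are made-up names, the admin call would need your own authentication, and application.onPublish assumes FMS 3.5 or later:

// main.asc - sketch: ban stream names at runtime without restarting the server.
var bannedStreams = {};

// Called from an admin tool over a normal NetConnection (add authentication!).
Client.prototype.banStream = function(name) {
    bannedStreams[name] = true;
    // Kick anyone currently publishing under that name.
    for (var i = 0; i < application.clients.length; i++) {
        var c = application.clients[i];
        if (c.publishedName == name) application.disconnect(c);
    }
};

application.onPublish = function(client, stream) {
    if (bannedStreams[stream.name]) {
        application.disconnect(client);       // refuse banned stream names
    } else {
        client.publishedName = stream.name;   // remember for later bans
    }
};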
For a number of reasons (which I won't bore you with) we may have to interface our FMS servers over an NFS mesh to share disks. For example, we have 6 core sites around the world and we want all core sites to see each other's primary content store. Whilst we sort out the commercials for Aspera or similar to handle file replication properly, I had the idea of using <streams> to remotely mount each other's disks (the disk systems here present themselves as NAS via NFS). So before you say "No, don't do it", I'm not sure I have a lot of choice at present. It's a large global core/edge deployment and the edges are implicitly configured to their origin. If I DID want to do it, how many <streams> can I define before the server ignores them?
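For reference, a sketch of what such mappings can look like as <VirtualDirectory>/<Streams> entries (shown here as they would appear in Vhost.xml); the virtual prefixes and NFS paths are placeholders, and the element names and any limit on the number of entries should be verified against your FMS version:

<!-- Vhost.xml (sketch) - map virtual prefixes onto remote NFS mount points. -->
<VirtualDirectory>
    <Streams>/site2;/mnt/nfs/site2/streams</Streams>
    <Streams>/site3;/mnt/nfs/site3/streams</Streams>
    <Streams>/site4;/mnt/nfs/site4/streams</Streams>
</VirtualDirectory>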
I am trying to record a video chat being done through FMS. I have made my program start recording only when the second stream publishes. Then I stop recording when either stream stops publishing. In my last test I recorded a 45-minute chat. The resulting FLV files show that there is a 13-second difference between them. The video gets more out of sync as it goes along. My first guess is that it has something to do with dropped frames. Is there any way to force FMS to fill in dropped frames? I'm posting my code for starting and stopping recording:
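A rough sketch (not the original code) of starting both recordings only when the second stream publishes, and stopping them together, assuming FMS 3.5's application.onPublish/onUnpublish; stream names are placeholders:

// main.asc - sketch: start both recordings in the same tick so their start
// times stay as close together as possible, and stop both when either ends.
var recorders = {};   // source name -> recording Stream
var liveCount = 0;

application.onPublish = function(client, stream) {
    recorders[stream.name] = null;
    liveCount++;
    if (liveCount == 2) {
        for (var name in recorders) {
            var rec = Stream.get("rec_" + name);
            if (rec) {
                rec.play(name, -1, -1);
                rec.record();
                recorders[name] = rec;
            }
        }
    }
};

application.onUnpublish = function(client, stream) {
    liveCount--;
    // Either participant leaving stops both recordings.
    for (var name in recorders) {
        if (recorders[name]) {
            recorders[name].record(false);
            recorders[name] = null;
        }
    }
};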
We are streaming 24/7 live from a number of FMLEs to an FMS Interactive server with a cloned livepkgr app, and to our clients we stream both to Flash via RTMP and to iOS via HLS. Since we want to introduce another server (to be able to serve more clients), the question is: how do we republish from one server to another so that both servers can stream RTMP and HLS to clients? We are already using the backup URL in FMLE to stream to a backup server; this we do not want to change, since the backup server is just that - a backup, if the main server fails.