Media Server :: Configure FMS To Send Only Intra Frames?
Mar 21, 2010
I need to display a large number of streams in one Flash player. To minimize CPU load I want to display only the intra frames of the H.264 video. I have tried the receiveVideoFPS method of NetStream with values 1 and 2, but it is not giving the correct result.
On Wowza server I tried receiveVideoFPS with value -3 and it gives the result I want, but after some time the Flash player crashes in all browsers, and in AIR too.
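Here is roughly what I am doing on the player side (a trimmed AS3 sketch; the server address, application name and stream name are only placeholders):

import flash.net.NetConnection;
import flash.net.NetStream;
import flash.media.Video;
import flash.events.NetStatusEvent;

var nc:NetConnection = new NetConnection();
nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
nc.connect("rtmp://myserver/myapp");        // placeholder FMS application

function onStatus(e:NetStatusEvent):void {
    if (e.info.code == "NetConnection.Connect.Success") {
        var ns:NetStream = new NetStream(nc);
        var vid:Video = new Video();
        vid.attachNetStream(ns);
        addChild(vid);
        ns.play("myStream");                // placeholder stream name
        // I tried 1 and 2 here; neither limits playback to intra frames the way I want.
        ns.receiveVideoFPS(1);
    }
}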
I installed Adobe Flash Media Server 3.5 on a machine. I have the Flash Media Administration Console, but I do not know how to configure the machine to be a Flash Media streaming server. I have a TV card on the machine and I want to use Flash Media Encoder to stream the TV signal from that card to a web page. I think I need to send the signal, encoded with Flash Media Encoder, to a Flash Media streaming server, correct?
What would be the crucial things to optimize when you have hundreds of people connected to your application, with most of them being in a video conference with RTMFP direct_connections, and also some using RTMP connections?
Our app seems to go down under heavy load: RTMFP stops working (no one can connect to the service with an RTMFP NetConnection) and RTMP NetStreams don't work either; however, text chat still works.
The server we use has a limit of 500 connections and a 2.5 Mbps speed limit per connection (a video conference with 4 people uses less than that, so that seems fine).
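For context, the conference part of the app is set up roughly like this on each client (a trimmed AS3 sketch; the server URI, stream names and the peer ID exchange are placeholders, not our real values):

import flash.net.NetConnection;
import flash.net.NetStream;
import flash.media.Camera;
import flash.media.Microphone;
import flash.events.NetStatusEvent;

var nc:NetConnection = new NetConnection();
nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
nc.connect("rtmfp://myserver/videochat");            // placeholder RTMFP URI

function onStatus(e:NetStatusEvent):void {
    if (e.info.code == "NetConnection.Connect.Success") {
        // Publish our own camera/mic for direct peer-to-peer delivery.
        var outStream:NetStream = new NetStream(nc, NetStream.DIRECT_CONNECTIONS);
        outStream.attachCamera(Camera.getCamera());
        outStream.attachAudio(Microphone.getMicrophone());
        outStream.publish("user1");                  // placeholder stream name

        // Subscribe to another participant by their peer ID (exchanged through the server).
        var farPeerID:String = "PEER_ID_FROM_SERVER"; // placeholder
        var inStream:NetStream = new NetStream(nc, farPeerID);
        inStream.play("user2");                      // placeholder stream name
    }
}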
I have been running a test server using Apache as the web server. The problem is I need to be able to run FMS on an IIS machine, and there is a conflict with each trying to bind to port 80. I have read that a solution to this problem is to have two IP addresses, with FMS bound to one and IIS bound to the other. This is not a requirement for Apache and FMS, so why is it a requirement for IIS and FMS? This question may show my lack of knowledge of the difference between the two. But is it possible to configure IIS or FMS so that they share port 80 the way FMS and Apache do? How do Apache and FMS work together, and why can't IIS?
I'm looking for a way to configure WAMP with FMS 4.5. I installed FMS 4.5 without the Apache server that comes with it. From the way it looks, FMS seems to be running; I was able to run the sample files that came with it. I really need to know how to configure these two together.
Flash Media Interactive Server 3.5. We currently have an Edge/Origin configuration. We have another origin server to be used for fail-over. How do you set up a back-up origin server?
My problem: I enabled SWFVerification on the server and saw that FMLE no longer connects. I looked for an explanation and read that if SWFVerification is enabled you must also change the UserAgentExceptions tag, <Exception from="" to=""/>. At that point I added this change, <Exception From="FME/0.0" to="FME/4.0"/>, but FMLE still does not connect. How can I solve this problem?
I have installed Flash Media Server 3.5, and I have put my VOD files under 'C:\Program Files\Adobe\Flash Media Server 3.5\applications\vod\media'.
The sample file FMS provides in this folder, 'sample.flv', plays fine (it has VP6 as the video codec and MP3 audio) when I request:
"rtmp://my_SYSTEM_IP/vod/sampl"
In the same folder I have kept my own file called 09.flv, which has H.263 video and MP3 audio, and another file, 09_1.flv, which has H.264 video and MP3 audio. But neither of these files streams. When I try to play them over VLC it says file not found, yet the sample video plays. Is there any way to get my VOD files to stream?
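For what it's worth, this is roughly how the player requests the files (AS3 sketch; the address matches the one above, and I am requesting FLV files by name without the extension, the same way the sample is requested):

import flash.net.NetConnection;
import flash.net.NetStream;
import flash.media.Video;
import flash.events.NetStatusEvent;

var nc:NetConnection = new NetConnection();
nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
nc.connect("rtmp://my_SYSTEM_IP/vod");

function onStatus(e:NetStatusEvent):void {
    if (e.info.code == "NetConnection.Connect.Success") {
        var ns:NetStream = new NetStream(nc);
        var vid:Video = new Video();
        vid.attachNetStream(ns);
        addChild(vid);
        ns.play("09");    // also tried "09_1"; neither streams, while "sample" plays fine
    }
}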
I carefully read the phpMyAdmin forums before posting here and none really had the answer. I am able to install phpMyAdmin on stand-alone versions of Apache, but unable to do so on the FMS version. I previously read this post [URL] and it doesn't seem to answer the question, although it looks like part of the question may have been answered offline. The problem is that phpMyAdmin installs, and I know that PHP is running fine because I can run a couple of test scripts with no problem. However, when I go to log in to phpMyAdmin, I get the dialog box asking me "do you want to open or save this file" and the script itself never executes. I checked the server log, and here's what's going on:
[Code]...
Based on the post noted above, I'm assuming this may have something to do with either POST variables or maybe proxying unknown requests, however the forum post didn't appear to resolve the problem. I also posted to that thread, but I wasn't sure it was still being monitored.
We have a pool of FMS 3.5 servers behind a HAProxy load balancer. They are working fine with RTMP and with HTTP download through the FMS proxy. We now need to get RTMPT working for some clients behind a strict firewall.
1) We started by configuring the HAProxy load balancer to work in HTTP (layer 7) mode, and we changed the FMS Adaptor.xml configuration file on all servers with the tag below:
I'm wondering if anybody knows of a guide or a step-by-step process for configuring a VBrick H.264 encoder to send video to my FMS for live broadcast. I'm having trouble configuring the Transmitters in the VBrick admin panel and am unsure how to get FMS to recognize the video stream so it can publish it. Has anybody used a VBrick encoder with FMS?
I'm trying to make a piece of software that sends video and audio data to a Flash Media Server using the RTMP protocol. Currently, my program can communicate with a Flash Media Server correctly. The RTMP specification does not describe the raw data inside video/audio messages, so I muxed raw H.264 and AAC data into video/audio messages and sent them to the server. The server seems to accept them, but a video player cannot play back the stream coming from the server; the player just says "Loading..." For test purposes, I sniffed the network packets between Wirecast and the Flash Media Server and ripped out only the video and audio data. Then I muxed that data into video/audio messages and sent it to the Flash Media Server. In this case, the video player connected to the server can play back the stream correctly.
I checked the stream sent from Wirecast, and it does not seem to be raw H.264 data, because the payload starts with 0x17 instead of an H.264 start code. Given this situation, I am wondering what kind of container format I should use for the H.264/AAC data I send to the Flash Media Server.
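From inspecting the Wirecast capture, the video message bodies look like FLV video tag bodies rather than bare NAL units. This is a rough sketch of how I am now assembling the payload of one video message (AS3 ByteArray used purely for illustration; the field layout is my reading of the FLV format, so treat it as an assumption):

import flash.utils.ByteArray;

// nalu is assumed to hold one H.264 NAL unit without an Annex-B start code;
// compositionTime is the composition offset in milliseconds.
function buildAvcNaluMessage(nalu:ByteArray, keyframe:Boolean, compositionTime:int):ByteArray {
    var body:ByteArray = new ByteArray();
    body.writeByte(keyframe ? 0x17 : 0x27);  // FrameType (1 key / 2 inter) << 4 | CodecID 7 (AVC)
    body.writeByte(0x01);                    // AVCPacketType: 1 = NALU (0 = sequence header)
    body.writeByte((compositionTime >> 16) & 0xFF);  // 24-bit composition time offset
    body.writeByte((compositionTime >> 8) & 0xFF);
    body.writeByte(compositionTime & 0xFF);
    body.writeUnsignedInt(nalu.length);      // NAL units are length-prefixed, not start-code-prefixed
    body.writeBytes(nalu);
    return body;
}

// The first video message would instead carry AVCPacketType 0 with the
// AVCDecoderConfigurationRecord (SPS/PPS), and each AAC audio message starts
// with 0xAF followed by an AACPacketType byte (0 = AudioSpecificConfig, 1 = raw frame).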
I am recording a video, and while recording I issue some NetStream.send("doSomething", params) commands from the client side. When I play back this video I receive the doSomething events on the client side. No problems so far. Can I receive those events on the server side? I want to handle those events on the server side, not the client side.
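This is what I have working now (AS3 sketch; the stream name, handler name and params are just examples), and what I am missing is the equivalent hook on the FMS side:

import flash.net.NetConnection;
import flash.net.NetStream;

function onConnected(nc:NetConnection):void {
    // called once NetConnection.Connect.Success has fired for nc

    // Publisher: inject a data event into the recorded stream.
    var pubStream:NetStream = new NetStream(nc);
    pubStream.publish("myRecording", "record");
    pubStream.send("doSomething", {foo: "bar"});   // example params

    // Playback client: the handler is resolved on the stream's client object.
    var playStream:NetStream = new NetStream(nc);
    playStream.client = {
        doSomething: function(params:Object):void {
            trace("doSomething received on the client: " + params.foo);
        }
    };
    playStream.play("myRecording");
}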
I am working on a Flash chat in which I need font type, size and color for the typed text. How do I send the HTML data from the client to the server and get it back to the clients?
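What I am picturing is something like this (a sketch with made-up method names; the client sends the htmlText of the input field to the server, and the server relays it back to every connected client):

// Client side (AS3)
import flash.net.NetConnection;
import flash.text.TextField;

var inputField:TextField = new TextField();   // where the user types
var chatField:TextField = new TextField();    // chat history

function onConnected(nc:NetConnection):void {
    // called once the NetConnection to the chat application has connected
    nc.client = {
        receiveMessage: function(html:String):void {
            chatField.htmlText += html;       // formatted text coming back from the server
        }
    };
    nc.call("sendMessage", null, inputField.htmlText);   // push the formatted text to the server
}

// Server side (main.asc): relay the formatted text to all connected clients.
Client.prototype.sendMessage = function(html) {
    for (var i = 0; i < application.clients.length; i++) {
        application.clients[i].call("receiveMessage", null, html);
    }
};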
We just bought two Decklink Studio cards to use with our live streaming servers. At first, everything works fine: we have high-quality Flash live streaming using SDI input with embedded audio. But after approximately 10 hours of streaming, the quality comes down, and the frame rate too; it looks like the encoder starts dropping frames. Then if we try to stop the live encoder, it freezes. The only way to close the program is to force it with CTRL+ALT+DEL. At that point, if we restart FMLE, everything works fine again, for more or less another 10 hours.
I have been conducting some tests and am totally lost as to why Flash-based broadcasting drops frames. The general formula for calculating bitrate is: bitrate = Width x Height x FPS
Now, if this value is lower than Camera.bandwidth, why should Flash still drop frames? I have also checked the FMS performance tab: bytes in is well below Camera.bandwidth, but Flash still drops frames. Why?
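For reference, the camera in my tests is set up roughly like this (AS3 sketch; the resolution, frame rate and bandwidth cap are just the values I happen to be testing with):

import flash.media.Camera;

var cam:Camera = Camera.getCamera();
cam.setMode(320, 240, 15);                    // width x height x fps, as in the formula above
cam.setQuality(320 * 240 * 15, 0);            // cap bandwidth at that many bytes/sec, let quality vary
trace("requested bandwidth: " + cam.bandwidth);
trace("requested fps: " + cam.fps + ", actual fps: " + cam.currentFPS);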
I am attempting to send XML data to a PHP script from my server-side app using the following code:
var my_xml = new XML("<highscore><name>Ernie</name><score>13045</score></highscore>");
my_xml.contentType = "text/xml";
my_xml.send("http://www.server.com/temp/fms_post.php");
I am logging everything that the outside server receives in $_REQUEST, $_POST, and $_GET. I can see that my server-side app is reaching out to the external server, but the data is always blank.
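For debugging I am thinking of switching to sendAndLoad so I can at least see what the PHP script answers; something like this (based on my reading of the server-side XML class, so the exact behaviour here is an assumption, and the URL is the same one as above):

var my_xml = new XML("<highscore><name>Ernie</name><score>13045</score></highscore>");
my_xml.contentType = "text/xml";
var reply_xml = new XML();
reply_xml.onLoad = function(success) {
    // Log whatever the PHP script returns so I can tell whether the POST body arrived.
    trace("reply loaded: " + success + " body: " + reply_xml.toString());
};
my_xml.sendAndLoad("http://www.server.com/temp/fms_post.php", reply_xml);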
I have an application that uses the send functionality of NetStream on the server side. When I connect to the app using RTMP I am able to see the send come through, but if the app uses RTMFP it does not. I have gone through all the examples on setting up a multicast app, and I know the app works because when I publish a video out, everyone in the group sees the video. The only thing not working is send. I've also set dataReliable to true.
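To be more concrete, the relevant pieces look roughly like this (a trimmed sketch; the stream, group and handler names are placeholders, and I am not certain this is a complete multicast setup):

// Server side (main.asc): the data event is injected into the published stream.
var s = Stream.get("multicastStream");        // placeholder stream name
s.send("onChatMessage", "hello");             // RTMP subscribers see this, RTMFP ones do not

// Client side (AS3): subscriber joining the multicast group over RTMFP.
import flash.net.NetConnection;
import flash.net.NetStream;
import flash.net.GroupSpecifier;

function onConnected(nc:NetConnection):void {
    // called once the rtmfp:// NetConnection has connected
    var spec:GroupSpecifier = new GroupSpecifier("myGroup");   // placeholder group name
    spec.multicastEnabled = true;
    spec.serverChannelEnabled = true;

    var playStream:NetStream = new NetStream(nc, spec.groupspecWithAuthorizations());
    playStream.dataReliable = true;            // as mentioned above
    playStream.client = {
        onChatMessage: function(msg:String):void { trace("got: " + msg); }
    };
    playStream.play("multicastStream");
}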