I found the article titled 'Capturing live video' quite useful. It's linked off the FMS start web page, the one that opens immediately after installing. [URL] The tutorial is better than most I've found because it has a simple, working example. I was able to enter the code into a frame script in Flash CS5, and using my own FMS address I could connect both to localhost and to a remote server.
The sad part is that the video and audio round-trip to and from the FMS correctly when compiled (Ctrl+Enter) inside Flash CS5, but when I publish this to a web page I get audio and no video, neither echoing straight from the camera nor subscribing from the FMS. I've tried accessing the web page from my local file system and from a web server. In both cases I get audio (great news) but not video (bad news). I'm wondering why this might be; I checked for warnings and errors but I'm not getting any. This solution is so simple and would pretty much solve my problems.
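For context, a minimal frame script for this kind of camera round trip might look like the sketch below. This is not the tutorial's actual code; the application name "live", the stream name "roundtrip", and the localhost URL are placeholders.

// Sketch of a frame script: publish the webcam/mic to FMS and play it back.
import flash.media.Camera;
import flash.media.Microphone;
import flash.media.Video;
import flash.net.NetConnection;
import flash.net.NetStream;
import flash.events.NetStatusEvent;

var nc:NetConnection = new NetConnection();
nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
nc.connect("rtmp://localhost/live"); // placeholder FMS URL

function onStatus(e:NetStatusEvent):void {
    if (e.info.code == "NetConnection.Connect.Success") {
        // Outgoing stream: attach camera and microphone, then publish live.
        var outStream:NetStream = new NetStream(nc);
        outStream.attachCamera(Camera.getCamera());
        outStream.attachAudio(Microphone.getMicrophone());
        outStream.publish("roundtrip", "live");

        // Incoming stream: subscribe to the same name and show it on stage.
        var inStream:NetStream = new NetStream(nc);
        inStream.client = { onMetaData: function(md:Object):void {} };
        var video:Video = new Video(320, 240);
        video.attachNetStream(inStream);
        addChild(video);
        inStream.play("roundtrip");
    }
}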
I'm new to the Flash Media Server family. I need a server that can capture user-recorded video/audio (through a Flash recorder) and save it as an FLV file on the server. Which member of the server family is right for me? I'm currently doing this with Red5, but I would like to try out Flash Media Server as well.
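For reference, on FMS the recording itself is normally triggered from the client by publishing with the "record" flag; a minimal sketch, assuming a hypothetical application named "recorder" and stream name "userClip":

// Sketch: publishing with "record" asks FMS to write the stream to disk
// as an FLV in the application's streams folder.
import flash.media.Camera;
import flash.media.Microphone;
import flash.net.NetConnection;
import flash.net.NetStream;
import flash.events.NetStatusEvent;

var nc:NetConnection = new NetConnection();
nc.addEventListener(NetStatusEvent.NET_STATUS, function(e:NetStatusEvent):void {
    if (e.info.code == "NetConnection.Connect.Success") {
        var ns:NetStream = new NetStream(nc);
        ns.attachCamera(Camera.getCamera());
        ns.attachAudio(Microphone.getMicrophone());
        ns.publish("userClip", "record"); // "record" writes userClip.flv server-side
    }
});
nc.connect("rtmp://localhost/recorder"); // placeholder application name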
When I try to live stream with FMS, I can stream video with Flash Media Live Encoder to the server, but when I create the player to receive the live stream from the server, I can't receive it. Can anyone give me a step-by-step tutorial on how to do it?
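For context, a minimal subscriber might look like the following sketch; it assumes the default "live" application and that FMLE is publishing a stream named "livestream", so adjust both to the actual setup.

// Sketch: play an FMLE live stream in an AS3 frame script.
import flash.media.Video;
import flash.net.NetConnection;
import flash.net.NetStream;
import flash.events.NetStatusEvent;

var video:Video = new Video(640, 480);
addChild(video);

var nc:NetConnection = new NetConnection();
nc.addEventListener(NetStatusEvent.NET_STATUS, function(e:NetStatusEvent):void {
    if (e.info.code == "NetConnection.Connect.Success") {
        var ns:NetStream = new NetStream(nc);
        ns.client = { onMetaData: function(md:Object):void {} }; // silence metadata errors
        ns.bufferTime = 2;
        video.attachNetStream(ns);
        ns.play("livestream", -1); // -1 = live stream only, no file extension
    }
});
nc.connect("rtmp://yourserver/live"); // placeholder server address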
I am looking to broadcast live video and need a camera that is better than a simple webcam. It looks like manufacturers used to make video cameras that would register as webcams, but stopped making them. If you wanted to broadcast an event live, what would be a good camera to use?
I've tested an exciting tutorial about streaming live video with Flash Media Server 3.5. Everything went well and I could see my webcam broadcast from my machine through my web site, but unfortunately I can't see it from any other machine. I'm using Microsoft Windows XP SP2 and Flash Media Live Encoder 3, but my web server runs UNIX. Is this a problem?
I am trying to broadcast live video using a webcam with the developer version of FMS 3.0. I actually have working webcam-broadcasting code that I used before, but now that I am using the free version of FMS 3.0, the following error shows up when I try to connect:
I'm using FMLE 3.1 to stream live video encoded in H.264 to FMS 3, with Flash Player 10 + AS3 to connect to the stream. When I connect to the stream, only the audio is played. I even get the metadata information about the video, but I still receive just the audio. I have already tried a few things:
1. "Flash 10 won't play live stream H.264 after iTunes install" [URL]. I tested it in a complete different environment than mine, but the same result.
2. I've tried a few formats for the play method, but those only apply to recorded files: ns.play("mp4:sample.f4v"); ns.play("mp4:sample");
3. I also read "How do you watch and record a live h.264" [URL], but I can't even get the stream to play in the first place. This is the code I'm using:
import flash.media.Video;
import flash.net.NetConnection;
import flash.events.NetStatusEvent;

var video:Video = new Video(720, 480);
var ncVideo:NetConnection = new NetConnection();
ncVideo.addEventListener(NetStatusEvent.NET_STATUS, onNetStatus);
[Code] .....
Another detail: when I record the video to a file, for instance "sample.f4v", I put that video on the FMS, but when I connect to the stream I receive the "FileStructureInvalid" error message. I went from changing the extension to .flv [URL] to the suggestion to flatten the files [URL], but that shouldn't be the issue here because I'm using FMS itself to stream the recorded video.
I am a wedding/events photographer/videographer. One of my clients wants his events live streamed over the internet. I am quite new to this type of job, so what software and hardware do I have to purchase for live streaming of video over the internet, and how will it be encoded in real time?
I have Flash Media Server, Flash Media Live Encoder, and Flash CS5. I'm going to be using Flash Media Live Encoder to stream my desktop and webcam to my website. My current host (HostGator) doesn't have RTMP enabled on their web hosting plan, only on their dedicated and VPS hosting. My question is: do I need to switch to a host that has RTMP enabled so I can stream to my website for people to view? Also, can I embed this stream into another part of my site?
I'm very new to FMS and have been experimenting with it as part of my job. I already have the encoder, FMS, and my AIR application all talking, so that portion is going well. The problem I want to tackle next is having the FMS server record the live stream to its hard drive. All of the guides I've found talk about how to make a DVR stream, but that's not what I want. I want to have the live stream, and then separately I would like to have it recorded.
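One possible approach, sketched here as a rough server-side script rather than production code, is to record a copy of the incoming live stream from the application's main.asc (Server-Side ActionScript, not AS3); the "rec_" prefix is an arbitrary naming choice.

// main.asc sketch: when a client publishes a live stream, pipe it into a
// second server-side stream that is recording to disk.
application.onPublish = function (client, stream) {
    var copy = Stream.get("rec_" + stream.name);
    if (copy) {
        copy.record();                    // start writing rec_<name>.flv
        copy.play(stream.name, -1, -1);   // -1/-1 = follow the live stream
        client.recordedCopy = copy;
    }
};

application.onUnpublish = function (client, stream) {
    if (client.recordedCopy) {
        client.recordedCopy.record(false); // stop recording
        client.recordedCopy.play(false);   // stop pulling from the live stream
        client.recordedCopy = null;
    }
};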
We're planning to set up a live streaming service: one person live streaming a video to multiple users, so no more than one channel uploading at a time, and at the beginning it won't be an open broadcast service (maybe later). We initially don't know how many users there will be. I've read that there's theoretically no limit to the maximum number of users connected to an FMS; is this also valid for live streaming, or are there limitations specific to live? Is it necessary to have our own FMS, or is it possible (and cheaper) to use a hosting provider?
I'm trying to publish video streaming over the internet and I want to use the H.264 codec with FMS. Is there a built-in property or setting to use H.264, or do I have to use a library for that? I would also like to know which of VP6, Sorenson, and H.264 is better to use.
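For context, FMS itself doesn't encode; it relays whatever codec the publisher sends, so the H.264 choice is made on the encoder side (in FMLE it's simply the format dropdown). If the publisher is a Flash Player client, H.264 camera encoding needs Flash Player 11+; a sketch only, with placeholder names:

// Sketch: set H.264 on the outgoing NetStream (Flash Player 11+ only; older
// players always encode the camera as Sorenson Spark).
import flash.media.Camera;
import flash.media.H264Level;
import flash.media.H264Profile;
import flash.media.H264VideoStreamSettings;
import flash.net.NetConnection;
import flash.net.NetStream;
import flash.events.NetStatusEvent;

var nc:NetConnection = new NetConnection();
nc.addEventListener(NetStatusEvent.NET_STATUS, function(e:NetStatusEvent):void {
    if (e.info.code != "NetConnection.Connect.Success") return;
    var ns:NetStream = new NetStream(nc);
    var h264:H264VideoStreamSettings = new H264VideoStreamSettings();
    h264.setProfileLevel(H264Profile.BASELINE, H264Level.LEVEL_3_1);
    ns.videoStreamSettings = h264;     // applies to the published stream
    ns.attachCamera(Camera.getCamera());
    ns.publish("livestream", "live");  // placeholder stream name
});
nc.connect("rtmp://yourserver/live");  // placeholder server address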
I've searched quite a few threads: [code] But all of them talk about playing multiple-bit-rate live video; my question is how I can publish multiple-bit-rate live streams in the first place?
I have made a sample application for publishing video through FMS using AS3. In it I used the same code as given in the sample provided by Adobe:
m_nc = new NetConnection();
m_nc.connect("rtmpt://localhost:1935/Test");

private function onNetStatus(event:NetStatusEvent):void {
    switch (event.info.code) {
        case "NetConnection.Connect.
I'm trying to get Flex Builder 3 to play a live video stream with Flash Media Server and Flash Media Live Encoder.
I'm able to stream pre-recorded (VOD) FLVs in Flex from Flash Media Server, and I'm able to stream live video using Flash/FMS/Flash Media Live Encoder, but not with Flex.
This code streams pre-recorded video-on-demand FLVs, but not live streams: if I change the source to "rtmp://localhost/live/livestream.flv", I get nothing.
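One workable sketch for Flex 3 is to bypass VideoDisplay entirely and drive a plain Video object with a NetStream inside a UIComponent; the "live" application and "livestream" name here are assumptions matching the FMLE defaults, not a confirmed fix.

<?xml version="1.0" encoding="utf-8"?>
<!-- Sketch: play an FMLE live stream in Flex 3 with a raw Video + NetStream. -->
<mx:Application xmlns:mx="http://www.adobe.com/2006/mxml" creationComplete="init()">
    <mx:Script>
        <![CDATA[
            import flash.events.NetStatusEvent;
            import flash.media.Video;
            import flash.net.NetConnection;
            import flash.net.NetStream;
            import mx.core.UIComponent;

            private var nc:NetConnection;

            private function init():void {
                nc = new NetConnection();
                nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
                nc.connect("rtmp://localhost/live"); // assumed application
            }

            private function onStatus(e:NetStatusEvent):void {
                if (e.info.code != "NetConnection.Connect.Success") return;
                var ns:NetStream = new NetStream(nc);
                ns.client = { onMetaData: function(md:Object):void {} };
                var video:Video = new Video(640, 480);
                video.attachNetStream(ns);
                var holder:UIComponent = new UIComponent();
                holder.width = 640;
                holder.height = 480;
                holder.addChild(video);
                addChild(holder);
                ns.play("livestream", -1); // live stream: no .flv extension, -1 = live only
            }
        ]]>
    </mx:Script>
</mx:Application>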
I am trying to encode and stream live video. I have downloaded both the Flash Media Encoder and Flash Media Server. The problem I have now is that I can't connect to Flash Media Server. The FMS URL is the default: rtmp://localhost/live. Do I have to purchase an FMS URL? How do I make sure that my FMS is running and activated?
I read that it is best to de-interlace video "at the hardware level" before sending it to FMLE 3.2 for encoding. I have looked and looked for software or a solution but can't find anything that works for me. Is it best to de-interlace before sending the video to FMLE? I am streaming live video that needs de-interlacing before it goes to our site. I know there is a "de-interlace" option in FMLE 3.2, but I read that it is better to de-interlace the signal before it reaches FMLE. I found something called DScaler but don't know how to get the video from DScaler into FMLE 3.2.
We have installed FMS on our GoDaddy server and are using it for live video streaming services. We are having problems with the video quality: it is choppy, slow, and pauses continuously during live broadcast and motion. We can't figure out what is wrong; the video needs to be flawless and high quality. The GoDaddy server is running:
Red Hat Fedora Core 7, Intel Core 2 Duo 2.13 GHz, 2 GB RAM, 250 GB total disk space
I am facing the problem described below regarding video broadcasting with Adobe Flash Media Interactive Server (developer version). I am using DVRCast for recording video from a webcam and broadcasting it live. The video settings are: width 320, height 240, fps 15, quality 90, bandwidth 150, buffer time 0.01.
Our problem is that the videos are not playing smoothly in the application. The motion pauses and is interrupted during the live broadcast. I downloaded the recorded FLV files from the server and found that the recorded files have the same problem, so the issue appears while streaming the video from the client computer to the FMS server. The problem does not appear when I test the application on localhost.
My code for live streaming is below:
private function publishCamera():void {
    cam = Camera.getCamera();
    cam.setMode(cameraSettings.width, cameraSettings.height, cameraSettings.fps);
    cam.setQuality(cameraSettings.bandwidth, cameraSettings.quality);
[Code] .....
2 x Intel(R) Xeon(R) CPU L5410 @ 2.33 GHz, 16 GB RAM, 2 Gbit Ethernet channel, OS: Linux CentOS 5.5 x86_64, FMS 4 Interactive
Live stream parameters:
320×240, qua_=87, bw_=200000, kf_=5, fps_=18
So, after approximately 600 connections to one stream, the video becomes choppy (it periodically freezes and plays in slow motion). CPU usage at this point is 100-120% (the maximum is 800%, 100% per core) and network usage is 500 Mbit/sec. A second live stream (with few connections) on the same server looks fine simultaneously with the 600 connections on the first stream. Experiments with the recommendations in the "Configuring performance features" documentation chapter (enabling/disabling aggregate messages and configuring the size of stream chunks) did not help.
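For reference, the two settings mentioned are set in the application's Application.xml under the Client section; this is roughly the shape of what was experimented with (the values are examples only, not a recommendation):

<!-- Application.xml fragment (sketch): the tuning knobs referenced above. -->
<Application>
    <Client>
        <!-- Larger chunks mean fewer writes per outgoing message. -->
        <OutChunkSize>4096</OutChunkSize>
        <!-- Combine several messages into one aggregate to cut per-message overhead. -->
        <AggregateMessages enabled="true">
            <MaxAggMsgSize>65536</MaxAggMsgSize>
        </AggregateMessages>
    </Client>
</Application>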
We have an application that uses FMS to share slides (SWF, JPG, etc.) with another client. We added video as an option. We would like to verify the bandwidth quality of the connection as often as possible in order to suggest that the user close the video if we notice bandwidth is not optimal.
We've been experimenting doing a bandwidth detection every 60 seconds, but I am wondering if there are specific guidelines to do this for live video, like how many times one should verify bandwidth? Is it ok to take one reading or should we take a few readings and then average out?
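One lightweight alternative to repeated full bandwidth checks is to sample the subscriber's NetStream.info on a timer and keep a short average; a sketch, assuming Flash Player 10+ and an existing subscribing NetStream named ns, with an arbitrary threshold:

// Sketch: sample NetStream.info every 10 seconds and keep a short moving average.
import flash.events.TimerEvent;
import flash.utils.Timer;

var samples:Array = [];
var monitor:Timer = new Timer(10000); // every 10 seconds (arbitrary interval)

monitor.addEventListener(TimerEvent.TIMER, function(e:TimerEvent):void {
    // currentBytesPerSecond reflects what this client is actually receiving.
    samples.push(ns.info.currentBytesPerSecond);
    if (samples.length > 6) samples.shift(); // keep roughly the last minute

    var sum:Number = 0;
    for each (var s:Number in samples) sum += s;
    var avgKbps:Number = (sum / samples.length) * 8 / 1024;

    // Placeholder threshold: suggest closing video if we average under ~300 kbps.
    if (samples.length >= 3 && avgKbps < 300) {
        trace("Bandwidth looks low (" + avgKbps.toFixed(0) + " kbps): suggest disabling video");
    }
});
monitor.start();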
I did a live stream last week using 282, 482, 832, and 1500 Kbps streams. What would cause the audio to get out of sync with the live video stream? I'm trying to determine whether it was bandwidth related, a CPU/memory issue on the FMIS 4.5 server, or an issue with the encoding PC exceeding its limits.
I currently have two connections with two separate streams, both hitting the same FMS 3.5 server. One connection transfers live audio and video; the other is used for remote objects. Sometimes, when viewing the audio and video stream on a slower internet connection, the stream for the shared objects disconnects. I think it is a bandwidth issue. Is there any way to set the priority of the streams? That should let me set a higher priority for the shared object connection so it won't disconnect.