Media Server :: Dynamically Set Publishing Webcam Feed?
Jan 10, 2011
There's just one thing I still don't get about the sample app: as a client, you always have to check the checkboxes (send/receive audio/video) in order to define who is sending/receiving which kind of stream. How is it possible to manage the feeds from the FMS server instead, so that the clients can't set the checkboxes and the server determines who is sending and who is receiving?
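If the goal is to keep that decision entirely on the server, one approach is to have main.asc call back into the clients and tell them what to do. A minimal sketch, assuming the client exposes startPublishing/startPlaying handlers on its NetConnection.client object (those names are made up here):

application.onConnect = function(client, userName) {
    application.acceptConnection(client);
    if (userName == "presenter") {
        // Tell this client to attach its camera and publish under a fixed name.
        client.call("startPublishing", null, "presenterFeed");
    } else {
        // Everyone else is told to subscribe only.
        client.call("startPlaying", null, "presenterFeed");
    }
};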
I'm trying to use Flash Media Server (on Amazon) to read in a video feed from a user's webcam, process it via my own executable, and return processed data (not video) to the user. For simplicity, let's say my executable finds the darkest pixel in the image and returns its coordinates, and I want the client to draw a cross at this location.
The client-side is working fine. It can send the video stream, and receive vertex data to draw. The server is half-working. It can accept connections, and send fake vertex data to the client. However I can't find any way to access the webcam stream.
Ideally, I would pass each frame from the stream into my executable for processing, with no saving to disk (or perhaps minimal saving to disk as a cache for P-frames). I've got the impression that this is impossible though, so another acceptable solution is to save the stream (live) to disk, and read it (live) from my executable.
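For the save-to-disk route, the server can archive the incoming live feed as it arrives. A minimal main.asc sketch, assuming the standard server-side Stream API (the "archive/" prefix is just an example; the FLV lands in the application's streams folder):

application.onPublish = function(client, myStream) {
    var s = Stream.get("archive/" + myStream.name);
    if (s) {
        s.record();                     // open the server-side stream in record mode
        s.play(myStream.name, -1, -1);  // pull the live webcam feed into it
    }
};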
Also, somewhat relatedly, I would like to secure the video feed as much as possible. Eventually I'll use SSL to secure it in transit, but for now I have the more serious problem of FMS sharing every stream it receives. I'm sure there's an option to disable this somewhere, but I just can't find it.
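One way to stop FMS from handing every stream to every client is to restrict access per connection in main.asc. A sketch, assuming a hypothetical "processor" role flag passed by the backend client when it connects:

application.onConnect = function(client, role) {
    if (role == "processor") {
        client.readAccess = "/";    // the backend may subscribe to all streams
        client.writeAccess = "";    // ...but may not publish
    } else {
        client.readAccess = "";     // ordinary clients may not subscribe to anything
        client.writeAccess = "/";   // ...but may publish their webcam
    }
    application.acceptConnection(client);
};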
I have installed Flash Media Server, Flash CS5, and Flash Media Live Encoder to be able to view my webcam feed via my website. In addition to these, I started streaming to localhost, and installed a DNS updater which updates a name server according to my current IP. Then I wrote this DNS address in the feed's source in Flash CS5 as rtmp://thednsname.somedns.net/live/livestream
So far so good. I also forwarded port 1935, incoming and outgoing, from the modem to my computer (because 1935 is said to be the default port for Media Server). The firewall is turned off (just in case), and my internet provider doesn't block ports. I started the media server (with admin privileges on Windows 7), started the Live Encoder, connected to localhost, and started the live feed.
NetStream.Play.StreamNotFound : Adobe Flash tried to play a live or recorded stream that does not exist. Source can't be found.
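For reference, the subscriber side that the StreamNotFound error comes from would normally look something like this sketch, assuming the built-in "live" application and the stream name "livestream" entered in the encoder:

import flash.events.NetStatusEvent;
import flash.media.Video;
import flash.net.NetConnection;
import flash.net.NetStream;

var nc:NetConnection = new NetConnection();
nc.addEventListener(NetStatusEvent.NET_STATUS, function(e:NetStatusEvent):void {
    if (e.info.code == "NetConnection.Connect.Success") {
        var ns:NetStream = new NetStream(nc);
        ns.client = { onMetaData: function(md:Object):void {} };  // avoid async errors
        ns.play("livestream");          // must match the encoder's stream name exactly
        var video:Video = new Video(320, 240);
        video.attachNetStream(ns);
        addChild(video);
    }
});
nc.connect("rtmp://thednsname.somedns.net/live");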
I'm trying to create a simple application that streams the webcam to FMS 4.5 (running on a Windows 7 machine), and then displays the published stream locally. Here's my code:
All traces in the event handlers show the NetConnection connecting successfully and the NetStream publishing successfully. However, I get no video output from this. I've confirmed this using the sample video player in FMS 4.5 as well. There is also a weird traffic pattern reported in the FMS Admin Console: there is some traffic (~4K) when the NetStream first publishes, but after that there is no activity.
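In case it helps to compare, here is a minimal self-contained sketch (the application name "live" and stream name "mycamera" are placeholders) that publishes the camera and plays the same stream back over a second NetStream; the most common cause of the no-video symptom is attaching the camera after publish(), or not at all:

import flash.events.NetStatusEvent;
import flash.media.Camera;
import flash.media.Video;
import flash.net.NetConnection;
import flash.net.NetStream;

var nc:NetConnection = new NetConnection();
nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
nc.connect("rtmp://localhost/live");

function onStatus(e:NetStatusEvent):void {
    if (e.info.code != "NetConnection.Connect.Success") return;

    var cam:Camera = Camera.getCamera();
    var outStream:NetStream = new NetStream(nc);
    outStream.attachCamera(cam);                  // attach BEFORE publish
    outStream.publish("mycamera", "live");

    var inStream:NetStream = new NetStream(nc);
    inStream.client = { onMetaData: function(md:Object):void {} };
    inStream.play("mycamera");
    var video:Video = new Video(320, 240);
    video.attachNetStream(inStream);
    addChild(video);
}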
I have been working on a Flex application that sends a feed from my webcam to the Flash Media Server. The application connects to the server fine, but for some reason the camera is not sending anything to the Media Server. I am pretty sure that the answer to my question is really simple, but I need another set of eyes to look at my code and tell me what I am doing wrong.
How do I publish a live webcam feed in H.264 format (not VP6) with FMS 3.5.2, without using Flash Media Live Encoder, and how do I set all the parameters to get good quality, smooth video without interruption? I have a server with 50 Mbit of output bandwidth, which should be enough for one publisher and 10 clients. I've been trying to get this working for months, but the quality is still ugly.
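H.264 encoding from the Camera object is only available from Flash Player 11 onwards; older players can only send Sorenson Spark from the camera, which is why FMLE is usually used. A sketch of what the Flash Player 11 publish path looks like (the stream name and numbers are only examples, and nc is assumed to be an already-connected NetConnection):

import flash.media.Camera;
import flash.media.H264Level;
import flash.media.H264Profile;
import flash.media.H264VideoStreamSettings;
import flash.net.NetStream;

var cam:Camera = Camera.getCamera();
cam.setMode(640, 480, 25);               // capture resolution and frame rate
cam.setQuality(500 * 1024 / 8, 0);       // cap bandwidth at roughly 500 kbps, let quality vary
cam.setKeyFrameInterval(50);             // one keyframe every 2 seconds at 25 fps

var h264:H264VideoStreamSettings = new H264VideoStreamSettings();
h264.setProfileLevel(H264Profile.BASELINE, H264Level.LEVEL_3_1);

var ns:NetStream = new NetStream(nc);    // nc: an already-connected NetConnection
ns.videoStreamSettings = h264;           // must be set before publish()
ns.attachCamera(cam);
ns.publish("livestream", "live");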
Now I have installed FMS 4.5, but I can't find good docs on how to set something like this up. In the Learning section I saw a lesson, but that one isn't ready. So I want to publish using Flash, and the subscribers to the live stream can be iPhones or browsers.
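FMS 4.5 ships with a "livepkgr" application that packages a live stream for both RTMP and HTTP Live Streaming. A rough sketch of the Flash publisher side, assuming that stock application (note that for iPhone playback the stream has to be H.264/AAC, so a Flash Player 11 publisher or FMLE is needed):

// after connecting nc to rtmp://yourserver/livepkgr
var ns:NetStream = new NetStream(nc);
ns.attachCamera(Camera.getCamera());
ns.publish("livestream?adbe-live-event=liveevent", "live");

// Browsers (RTMP):  rtmp://yourserver/livepkgr/livestream
// iPhones (HLS):    http://yourserver/hls-live/livepkgr/_definst_/liveevent/livestream.m3u8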
I use Windows Media Server, but I'd like to migrate to FMS. On WMS I set a login and password for the encoder to use. With FMS, how can I set a login and/or password to be used by Flash Media Encoder?
I have FMS 3 on a Linux server and I have a ViewCast Niagara Pro II encoder. I can publish to the FMS 3 no problem, but how do I limit who can publish to the server? Currently I can use the free Adobe Media Encoder to connect to the server and publish with no authorization, which tells me anybody could do the same if they just input the RTMP URL and feed name.
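Adobe provides an Authentication Add-in for FMLE-style encoders; a rougher do-it-yourself alternative is to decide in main.asc who is allowed to publish at all. A sketch that only grants write access to known encoder IP addresses (the addresses are placeholders):

var allowedPublishers = ["192.0.2.10", "192.0.2.11"];   // example encoder IPs

application.onConnect = function(client) {
    client.writeAccess = "";                    // by default nobody may publish
    for (var i = 0; i < allowedPublishers.length; i++) {
        if (client.ip == allowedPublishers[i]) {
            client.writeAccess = "/";           // this encoder may publish into the app
            break;
        }
    }
    application.acceptConnection(client);
};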
I'm not sure if this is the appropriate place to post this, but I was wondering if it's possible to build your own player in Flash? The situation is: I have an external digital camera, and I need to capture its contents and display it in a player/window. This will all be on a local machine. I don't need to record anything, or stream it to other machines... I just need the player to be able to display whatever the camera is displaying. Kind of like the Live Encoder (input and output views). With that I could hopefully embed the player into other applications.
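Yes; for a purely local preview no server is needed at all, just a Video object with the camera attached. A minimal sketch:

import flash.media.Camera;
import flash.media.Video;

var cam:Camera = Camera.getCamera();   // pass an index string to pick a specific device
if (cam != null) {
    cam.setMode(640, 480, 30);
    var video:Video = new Video(640, 480);
    video.attachCamera(cam);           // local preview only; nothing is recorded or streamed
    addChild(video);
}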
I'm using FMIS 3.5.2 (Windows XP) and was trying to make this example work: URL... When my Flash Media Live Encoder connects to the 'livestreams/localnews' publishing point I get this in the logs:

CSAAACPI is connected
Sending error message: Method not found (releaseStream).
Sending error message: Method not found (FCPublish).
localnews is publishing into application livestreams/_definst_
Republishing the stream into "livestreams/anotherinstance"
Stream Status: NetStream.Publish.Start
The stream is now publishing.

When I try to connect to "livestreams/localnews" with a Flash media player I can see the stream from my webcam, but when I try to connect to the republished stream "livestreams/anotherinstance" I get nothing. The player says "loading..." and shows nothing.
I have followed this link to republish my local stream to another local application instance: [URL]
I have used this script code in main.asc and it is working:

// Called when the client publishes
application.onPublish = function(client, myStream) {
    trace(myStream.name + " is publishing into application " + application.name);
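The snippet above is cut off; going by the multipoint-publishing pattern in the FMS documentation, the republishing part that the log messages refer to would look roughly like this sketch (instance and stream names taken from the post; this is not the poster's actual file):

application.onPublish = function(client, myStream) {
    trace(myStream.name + " is publishing into application " + application.name);
    trace("Republishing the stream into \"livestreams/anotherinstance\"");

    var nc = new NetConnection();
    nc.connect("rtmp://localhost/livestreams/anotherinstance");

    var ns = new NetStream(nc);
    ns.setBufferTime(2);
    ns.onStatus = function(info) {
        trace("Stream Status: " + info.code);
        if (info.code == "NetStream.Publish.Start") {
            trace("The stream is now publishing.");
        }
    };
    ns.attach(myStream);                 // the incoming stream is the source
    ns.publish(myStream.name, "live");   // republish under the same name

    client.repubNC = nc;                 // keep references alive for the session
    client.repubNS = ns;
};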
I'm currently having an issue with FMS 4 developer edition on both Linux (x64) and Windows XP (x86). When I stream a webcam using ns.publish("foobar", "live") I can watch the live stream on another client; however, when I use ns.publish("foobar", "record"), neither broadcasting nor recording works. Using the "live" parameter the client appears in the log files and in the administration console as "publishing"; using "record" the client appears as "idle". Is there anything I need to configure besides LIVE_DIR in fms.ini?
Is this a restriction in the developer edition? Here is the relevant part of the code (condensed):
--------------------------------------------------------------------
var camLive:Camera = Camera.getCamera();
var nc:NetConnection = new NetConnection();
[code]............
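For comparison, a complete record-mode publish might look like this sketch (the application name "myapp" and stream name "foobar" follow the snippet above; everything else is assumed):

import flash.events.NetStatusEvent;
import flash.media.Camera;
import flash.media.Microphone;
import flash.net.NetConnection;
import flash.net.NetStream;

var camLive:Camera = Camera.getCamera();
var nc:NetConnection = new NetConnection();
nc.addEventListener(NetStatusEvent.NET_STATUS, function(e:NetStatusEvent):void {
    if (e.info.code == "NetConnection.Connect.Success") {
        var ns:NetStream = new NetStream(nc);
        ns.attachCamera(camLive);
        ns.attachAudio(Microphone.getMicrophone());
        ns.publish("foobar", "record");   // "record" asks FMS to write foobar.flv on the server
    }
});
nc.connect("rtmp://localhost/myapp");     // "myapp" is a placeholder application name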
1) I currently have multi-point publishing set up to rebroadcast a stream between several servers. I'm running into an issue where, even if it's a live stream, when a client connects it starts from the beginning and you have to seek to the very end for it to go live. I've attached my main.asc file.
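On the subscriber side, the start parameter of NetStream.play() controls this: the default (-2) falls back to recorded/buffered data, while -1 requests live data only, so the player joins at the live point instead of the beginning. A sketch:

var ns:NetStream = new NetStream(nc);   // nc: a connected NetConnection
ns.play("myLiveStream", -1, -1);        // -1 start = live only; the stream name is a placeholder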
I have this video chat (video, text and audio) program written in ActionScript, using FMS as the media server, that I would like to test out on different computers. I don't have enough webcams to go around, and not all of the computers have a built-in webcam. Is there any way I can use some software to record the screen activity as a video and then stream it over? I am aware that there is software that can record screen activity, but how do I stream it over after that, while still in my video chat application?
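Many screen-recording tools install a virtual webcam driver, which Flash simply sees as another capture device, so the chat application can pick it by index without any other changes. A sketch (the device name is hypothetical):

import flash.media.Camera;

trace(Camera.names);                                      // list every capture device Flash sees
var idx:int = Camera.names.indexOf("Some Virtual Cam");   // hypothetical device name
var cam:Camera = Camera.getCamera(idx >= 0 ? String(idx) : null);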
I have just started using Flash Media Server, so this could be a really easy one to answer, or not even possible. I have Adobe Flash Media Live Encoder 3 publishing to Adobe Flash Media Server 3.5, which then publishes this live stream to two CDN providers. I want to publish the stream with a different name to each CDN. Does anyone know how I can do this without having FMLE 3 publish a backup stream and FMS having to process two differently named streams?
I.E. FMLE publishes a stream called bob. This is then published to CDN1 as bob, and I then want to publish it to CDN2 as bob2. Below is the main.asc for the application, but it ignores the second stream name and publishes to CDN2 with bob.
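Since the main.asc isn't shown here, the following is only a sketch of one way to push the same incoming stream out under two names, using the server-side multipoint-publishing NetStream (the CDN URLs are placeholders):

application.onPublish = function(client, myStream) {
    myStream.push1 = pushTo("rtmp://cdn1.example.com/ingest", myStream, myStream.name);        // bob
    myStream.push2 = pushTo("rtmp://cdn2.example.com/ingest", myStream, myStream.name + "2");  // bob2
};

function pushTo(url, sourceStream, publishName) {
    var nc = new NetConnection();
    nc.connect(url);
    var ns = new NetStream(nc);
    ns.setBufferTime(2);
    ns.attach(sourceStream);           // the incoming live stream is the source
    ns.publish(publishName, "live");   // each CDN gets its own stream name
    return ns;
}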
I learned from a previous post that the default encoding format used when you publish the camera from a Flash player is Sorenson. I'm wondering whether it is possible to select another codec such as VP6 or H.264?
I am publishing screen data to FMS 3.5.3 using ffmpeg with FLV. Everything is fine at 1024 x 768. When I try to publish 1600 x 1200, the connection closes and I get an entry in core.00.log: 1458 (e)2631029 Bad network data; terminating connection.
I have made a sample application for publishing video through FMS using AS3. In it I used the same code as given in the sample provided by Adobe:
m_nc = new NetConnection();
m_nc.addEventListener(NetStatusEvent.NET_STATUS, onNetStatus);
m_nc.connect("rtmpt://localhost:1935/Test");

private function onNetStatus(event:NetStatusEvent):void {
    switch (event.info.code) {
        case "NetConnection.Connect.
I'm developing an FMS 4 application that reads an external stream and then republishes it in multicast:
1. First of all, I open a NetConnection to the remote application, and I associate it with a new Stream created in the application. Then I have the stream available in my application.
nc = new NetConnection();
nc.connect(REMOTE_APPLICATION);
nc.onStatus = function(info)
[Code].....
But, I don't know who would be the client in the registerStream method. Which reference should I add there? Is it possible in this way?
I made another, different script that republishes that stream into localhost using NetConnection.publish(localhost/sameapplication). It works properly. But I would like to be able to manage it the other way around.
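To pull instead of push, the server-side Stream.play() call accepts a NetConnection as its fifth argument, so the local application can fetch the remote stream over the connection opened above. A sketch (the stream names are placeholders):

nc.onStatus = function(info) {
    if (info.code == "NetConnection.Connect.Success") {
        var localCopy = Stream.get("localcopy");
        if (localCopy) {
            // last argument: the NetConnection to pull the remote stream through
            localCopy.play("remotestream", -1, -1, true, nc);
        }
    }
};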
Can I use different versions of FMS (3.0, 3.5, 4.0 and 4.5) for multi-point publishing? I'd keep the existing old ones for most streams, and use 4.5 for iOS apps.
The answer is, I'm guessing, probably yes, as I was able to multi-point publish to another big company's newer FMS (4?) from one of our FMS 3.0 machines. But I just want to make sure before I make the investment. Also, an AIR 3 SDK based iOS app can now connect to FMS 4.5 and play H.264 live video on iOS devices over the App Store approved HTTP Live Streaming protocol, right? Just to be clear.
I'm trying to stream a webcam from one server to many clients. What I really need to know is:
1. How do I adjust the video bitrate for the streaming bandwidth? Is it enough to just use Camera.setQuality()?
2. If I set Camera.setMode() to a large resolution, for example 1024x768, does that increase the bandwidth the clients need too? In other words, does setting the resolution small (e.g. 160x120) versus large (e.g. 1024x768) affect the bandwidth?
3. Same as number 2, but for video = new Video(w, h). Does setting the video size affect the bandwidth?
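As a sketch of how the two Camera calls interact (the numbers are only examples): setMode() picks what is captured, setQuality() caps what is sent, and the Video object's size only affects on-screen scaling, not bandwidth.

import flash.media.Camera;
import flash.media.Video;

var cam:Camera = Camera.getCamera();

cam.setMode(320, 240, 15);             // larger resolutions need more bits for the same picture quality
cam.setQuality(250 * 1024 / 8, 0);     // hard cap around 250 kbps; quality varies to fit
// or:
cam.setQuality(0, 80);                 // fixed picture quality; bandwidth varies with the content

var video:Video = new Video(640, 480); // display size only; what goes over the wire is set above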
When I click record_btn, Flash asks me for permission to access my webcam. If I allow it in that dialog, I am able to see live video. But I want to store the video and replay it.
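A sketch of the record-then-replay flow against FMS (the names are placeholders, and nc is assumed to be an already-connected NetConnection):

var cam:Camera = Camera.getCamera();
var pubStream:NetStream = new NetStream(nc);
pubStream.attachCamera(cam);
pubStream.publish("myRecording", "record");   // FMS writes myRecording.flv on the server

// later, when recording should stop:
pubStream.close();

// replay the recorded file:
var playStream:NetStream = new NetStream(nc);
playStream.client = { onMetaData: function(md:Object):void {} };
playStream.play("myRecording");               // plays the recorded file, not the live feed
var video:Video = new Video(320, 240);
video.attachNetStream(playStream);
addChild(video);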
I managed to record the webcam stream by using the FMS developer version locally on my computer. But if I try to use a remote FMS (FMS Stream EDTN 3.5 ALP), my code doesn't work anymore!
import mx.rpc.http.HTTPService;
import flash.net.ObjectEncoding;

private var nc:NetConnection;
I know that it is possible to make a P2P webcam stream with Cirrus/Stratus or with Adobe LiveCycle, but the first is not for commercial use and the second uses an Adobe server. Is it possible to set up my own server with Flash Media Server 4 and then create an application to do what I want? I would like to create a 1->n webcam stream using P2P. I can't find any examples or documentation about that, or about how the RTMFP protocol of FMS 4 works. If I was not clear, ask me any questions.
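RTMFP groups are the piece that makes a 1->n P2P (multicast) stream possible without Cirrus; an FMS 4 edition that includes RTMFP/peer introduction can play the introducer role. A publisher-side sketch, with the application and group names as placeholders:

import flash.events.NetStatusEvent;
import flash.media.Camera;
import flash.net.GroupSpecifier;
import flash.net.NetConnection;
import flash.net.NetStream;

var nc:NetConnection = new NetConnection();
nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
nc.connect("rtmfp://yourserver/multicastapp");

function onStatus(e:NetStatusEvent):void {
    if (e.info.code != "NetConnection.Connect.Success") return;

    var spec:GroupSpecifier = new GroupSpecifier("webcamGroup");
    spec.serverChannelEnabled = true;    // let the server introduce peers to each other
    spec.multicastEnabled = true;        // enable P2P multicast inside the group

    var ns:NetStream = new NetStream(nc, spec.groupspecWithAuthorizations());
    ns.attachCamera(Camera.getCamera());
    ns.publish("webcam");                // one publisher; subscribers join the same group and play "webcam"
}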