I have succeeded in publishing a live stream and viewing it on a different PC. The application I built works like this: on the publishing side I connect with nc.connect("rtmp://10.8.4.56:1935/live"); and publish the stream with ns.publish("mycamera", "live"); on the other PC I run the viewer, which connects to the server with nc.connect("rtmp://10.8.4.56:1935/live"); and plays the live stream with ns.play("mycamera"); vid.attachNetStream(ns);
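For anyone reproducing this, here is a fuller sketch of both sides, assuming the default "live" application and the address above (the Video object vid and the use of the default camera and microphone are assumptions):

    import flash.net.NetConnection;
    import flash.net.NetStream;
    import flash.events.NetStatusEvent;
    import flash.media.Camera;
    import flash.media.Microphone;

    // Publisher (sketch): connect, then publish camera and mic as "mycamera".
    var nc:NetConnection = new NetConnection();
    nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
    nc.connect("rtmp://10.8.4.56:1935/live");

    function onStatus(e:NetStatusEvent):void {
        // Publish only after the connection succeeds.
        if (e.info.code == "NetConnection.Connect.Success") {
            var ns:NetStream = new NetStream(nc);
            ns.attachCamera(Camera.getCamera());
            ns.attachAudio(Microphone.getMicrophone());
            ns.publish("mycamera", "live");
        }
    }

    // Viewer (sketch, on the other PC): same connection handling, then play.
    function onViewerStatus(e:NetStatusEvent):void {
        if (e.info.code == "NetConnection.Connect.Success") {
            var ns:NetStream = new NetStream(nc);
            ns.play("mycamera");
            vid.attachNetStream(ns);   // vid is a Video object on the stage
        }
    }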
That covers publishing and viewing the live stream, but I want a little more than that: I want to record the live stream at the publish end simultaneously. On the publish side I have tried: nc.connect("rtmp://10.8.4.56:1935/live"); ns.publish("mycamera", "live"); ns.publish("mycamera", "record"); but it gives me the error NetStream.Record.NoAccess. When I change the connection point to "rtmp://10.8.4.56:1935/dvr", it gives me a result, but then it affects the live telecast.
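NetStream.Record.NoAccess typically means the application you connected to does not permit recorded streams (the stock live application does not). One workaround, sketched here under the assumption of a second, hypothetical application recordapp whose Application.xml allows recording, is to publish the same camera on a second NetStream with the "record" flag while the first keeps the live telecast going:

    // Hedged sketch: a second connection records while the first stays live.
    // "recordapp" is a hypothetical application configured to allow recording;
    // cam and mic are the same Camera/Microphone objects feeding the live stream.
    var recNC:NetConnection = new NetConnection();
    recNC.addEventListener(NetStatusEvent.NET_STATUS, onRecStatus);
    recNC.connect("rtmp://10.8.4.56:1935/recordapp");

    function onRecStatus(e:NetStatusEvent):void {
        if (e.info.code == "NetConnection.Connect.Success") {
            var recNS:NetStream = new NetStream(recNC);
            recNS.attachCamera(cam);
            recNS.attachAudio(mic);
            recNS.publish("mycamera", "record");   // writes mycamera.flv server-side
        }
    }

Note that this doubles the publisher's upstream bandwidth; recording the copy on the server side instead (with a server-side Stream object) avoids that cost.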
I am developing a live audio/video application using FMS 3.5. What are the best settings for playing a live NetStream on the receiver side, e.g. what buffer time should I use?
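There is no single Adobe-mandated value, but a commonly used pattern for live playback is a small startup buffer that grows once filled; a sketch (the 0.1 s and 2 s values are illustrative, and nc/vid are assumed to exist):

    // Common live-playback buffering pattern: tiny buffer for fast startup,
    // larger buffer once full for smoother playback under jitter.
    var ns:NetStream = new NetStream(nc);   // nc: a connected NetConnection
    ns.addEventListener(NetStatusEvent.NET_STATUS, onNSStatus);
    ns.bufferTime = 0.1;           // near-instant startup for live
    ns.play("mycamera");
    vid.attachNetStream(ns);       // vid: a Video object on the stage

    function onNSStatus(e:NetStatusEvent):void {
        if (e.info.code == "NetStream.Buffer.Full") {
            ns.bufferTime = 2;     // add headroom against network jitter
        } else if (e.info.code == "NetStream.Buffer.Empty") {
            ns.bufferTime = 0.1;   // shrink again to catch back up
        }
    }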
I'm very new to FMS and have been experimenting with it as part of my job. I already have the encoder, FMS, and my AIR application all talking, so that portion is going well. The problem I want to tackle next is having the FMS server record the live stream to its hard drive. All of the guides I've found explain how to make a DVR stream, but that's not what I want: I want to have the live stream and, separately, have it recorded.
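On editions with server-side scripting, a small main.asc in a custom application can do exactly that: record a copy of each live stream to disk as it is published, with no DVR involved. A minimal sketch (the rec_ file-name prefix is an assumption):

    // main.asc -- hedged sketch: record each live stream as it is published.
    application.onPublish = function(client, stream) {
        // Record a copy under a different name so the live stream is untouched.
        var rec = Stream.get("rec_" + stream.name);   // writes rec_<name>.flv
        if (rec) {
            rec.record();
            rec.play(stream.name, -1, -1);   // -1 = follow the live stream
            client.rec = rec;                // remember it for onUnpublish
        }
    };
    application.onUnpublish = function(client, stream) {
        if (client.rec) {
            client.rec.record(false);        // close the file cleanly
            client.rec.play(false);
        }
    };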
My purpose is to record live streams on the server side and play the recorded files later. What I have done is:
1. Copied all files of applications/live to some safe location.
2. Copied all files from samples/applications/live to the applications/live folder (and deleted main.far from the live folder).
3. Restricted SWF and HTML access to mydomain in these files: allowedHTMLdomains.txt and allowedSWFdomains.txt.
4. In main.asc I added these lines at the end: var mystream; var intervalID; [Code] .....
5. Then I restarted FMS and tried streaming using Flash Media Encoder. I was able to stream live, and then I stopped it.
6. An FLV file was recorded in the applications/live folder with the stream name that I used in the encoder.
I am able to view the live video on my domain as well as on some other domain, which means allowedHTMLdomains.txt and allowedSWFdomains.txt did not work. Another issue is that I am not able to view the recorded video after stopping the encoder, although I was able to see the live video before stopping. I am using JW Player to view the video, with the flashvars streamer (rtmp://xx.xx.xx.xxx/live) and file (abc) to view the live and recorded video.
I know from previous posts that the stream can only be modified at the publishing side. I was thinking of using server-side shared objects, but they are not supported by Flash Lite.
I'm having a problem with recording a live webcam stream: the last few seconds of the stream get cut off when I stop the recording from my code.
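A frequent cause is closing the NetStream while buffered data is still being uploaded. One commonly suggested stop routine (a sketch; the 250 ms poll interval is arbitrary) detaches the sources and waits for the publish buffer to drain before closing:

    import flash.net.NetStream;
    import flash.utils.Timer;
    import flash.events.TimerEvent;

    // Hedged sketch: stop feeding new frames, then wait until everything
    // buffered locally has reached FMS before closing the stream.
    function stopRecording(ns:NetStream):void {
        ns.attachCamera(null);
        ns.attachAudio(null);
        var t:Timer = new Timer(250);   // poll every 250 ms
        t.addEventListener(TimerEvent.TIMER, function(e:TimerEvent):void {
            if (ns.bufferLength == 0) { // all buffered data has been sent
                t.stop();
                ns.close();             // now safe to close
            }
        });
        t.start();
    }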
I am looking to broadcast live video and need a camera that is better than a simple webcam. It looks like manufacturers used to make video cameras that would register as webcams, but they stopped making them. If you wanted to broadcast an event live, what would be a good camera to use?
I have a video capture device on a machine and have written a Windows application to capture video from it. While my application is capturing, if I try to broadcast using Flash, Flash does not connect. If I'm already broadcasting and then try to capture, my Windows application cannot get the device's capture pin, so Flash seems to be holding it. Is there any way to broadcast using the device's preview pin so that I can capture with my application? Or is there a way to split the capture pin and give one leg to my app and one to the broadcaster?
As titled: what is the way to record video/audio files with Flash Media Server through RTMP, and then allow users to access the recorded files through HTTP? What I am trying to do is record a user's microphone input and save it to the server. Afterwards, I would like other users to be able to access the recorded files and manipulate the audio data with computeSpectrum() to do some visualization of the audio. As far as I know, computeSpectrum() cannot work on streaming files, so I think I need to access the recorded files over HTTP instead of RTMP. Is that true?
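In practice, yes: SoundMixer.computeSpectrum() raises a security error for RTMP streams unless the server grants audio sample access (on scriptable editions, setting client.audioSampleAccess in main.asc reportedly lifts this), whereas a recorded FLV served by an ordinary web server plays back progressively and can be analyzed. A sketch, with a hypothetical URL and an assumed Video object vid:

    import flash.net.NetConnection;
    import flash.net.NetStream;
    import flash.media.SoundMixer;
    import flash.utils.ByteArray;
    import flash.events.Event;

    // Hedged sketch: progressive HTTP playback plus computeSpectrum().
    // The URL is hypothetical; the FLV must be exposed by a web server.
    var nc:NetConnection = new NetConnection();
    nc.connect(null);   // null connection = progressive (HTTP) playback
    var ns:NetStream = new NetStream(nc);
    ns.client = { onMetaData: function(md:Object):void {} };
    ns.play("http://myserver.example.com/recordings/abc.flv");
    vid.attachNetStream(ns);

    addEventListener(Event.ENTER_FRAME, function(e:Event):void {
        var bytes:ByteArray = new ByteArray();
        SoundMixer.computeSpectrum(bytes);   // 512 values: 256 left, 256 right
        // ... draw the visualization from `bytes` here ...
    });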
I would like to know if there is a solution that allows me to plug 2 HDV cameras into a laptop and live-switch between them while uploading the live stream via Adobe Flash Media Live Encoder to the Adobe streaming server. The scenario: I want to offer live streaming of corporate seminars that I normally shoot with 2 cameras.
I want to have a website where users can record short video clips (using a webcam) and then have those videos saved on the FMS for viewing later. I've seen a lot about streaming video, but I actually want to save the video.
I'm trying to develop a P2P (RTMFP) webcam chat application with AS3 and FMS. For test purposes I've been using "rtmfp://p2p.rtmfp.net", and now I'll use FMS on Amazon AWS. What I couldn't understand is which application type I should use on FMS. There are four preconfigured application types on FMS: live, livepkgr, multicast, and vod. I guess live or multicast, but I cannot choose between them; I'm a little bit confused. Or maybe I don't have to set up any application? My application scenario is: a user will connect to FMS and a web service, will get the other person's farID from the web service, and will be in audio/video chat with the remote side as he sends a stream (with one person), pretty much like the sample Cirrus application. I also have another issue: I want to publish one person's (a consultant's, in my case) video to several people (customers) at the same time (one-to-many). Customers won't publish any video or audio; they'll only watch and hear the consultant. Can I use RTMFP and P2P for that? I guess it will insanely increase the publisher's bandwidth needs, so should I use streaming on FMS and let customers get the stream from FMS?
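For the one-to-one case, the peers exchange media directly once they are both connected, so no heavyweight application type is needed. A sketch of the Cirrus-style flow (the application name chatapp is hypothetical, and farID is assumed to come from your web service):

    // Hedged sketch of one-to-one RTMFP chat.
    var nc:NetConnection = new NetConnection();
    nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
    nc.connect("rtmfp://myfms.example.com/chatapp");

    function onStatus(e:NetStatusEvent):void {
        if (e.info.code == "NetConnection.Connect.Success") {
            // Publish our own media for direct peer connections.
            var pub:NetStream = new NetStream(nc, NetStream.DIRECT_CONNECTIONS);
            pub.attachCamera(Camera.getCamera());
            pub.attachAudio(Microphone.getMicrophone());
            pub.publish("media");

            // Subscribe directly to the remote peer's stream by farID.
            var sub:NetStream = new NetStream(nc, farID);
            sub.play("media");
        }
    }

For the one-to-many consultant case, pure P2P direct connections would indeed multiply the consultant's upstream bandwidth by the number of customers, so letting FMS fan the stream out (an ordinary live application) or using an RTMFP multicast group are the usual alternatives.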
I have Flash Media Streaming Server 3.5 (not Interactive) running on RHEL 5.5 x86_64 Linux. All is working well; however, how do I prevent unauthorized access to connecting to the live stream and streaming content? How can I set up the server to require a username and password to stream live media to the server? I am new to this product, and although I have been reading the documentation, I have not found a clear-cut answer on how to require a username and password just to connect to the server and stream live content. I am using the Adobe FMS Apache install; what files need changing? I want to prevent a person from connecting to the server over the public internet and starting a live stream. Can this be done with a username and password?
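With the Streaming edition specifically there is no server-side script hook, which is why the documentation is vague; on the Interactive or Development editions the standard answer is an onConnect check in the application's main.asc. A minimal sketch (the credentials are illustrative):

    // main.asc -- hedged sketch of publisher authentication. Requires an
    // FMS edition that runs server-side scripts (not the Streaming edition).
    application.onConnect = function(client, user, pass) {
        if (user == "encoder" && pass == "secret") {   // replace with a real check
            this.acceptConnection(client);
        } else {
            this.rejectConnection(client);
        }
    };

Flash Media Live Encoder does not pass extra connect arguments on its own, so a common workaround is to embed a token in the connection URL (e.g. rtmp://server/live?token=abc) and validate client.uri inside onConnect instead.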
I'm running Flash Media Streaming Server and have only been serving VOD up until now. I had my network administrator open up port 1935 to the outside world during the setup process and now I can't remember if that was actually required for streaming VOD to clients. Most documentation I've read says that this port should be open, but I seem to recall reading something at one point that suggested it wasn't necessary.
I've just started messing around with publishing live streams using Flash Media Live Encoder to the Flash Media Streaming Server. I have that working without issue but was surprised to find that no authentication is required before a client running the live encoder can publish a stream to the Flash Media Streaming Server. An authentication module is available however it only works with Flash Media Interactive Server and Flash Media Development Server.
If I leave port 1935 open to the outside world, there would be nothing to stop anybody anywhere from streaming video via my server. Anyone else running a default install of Flash Media Streaming Server and with port 1935 open to the outside should see that this is true of their setup as well. I'm wondering if I can safely close port 1935 without limiting the functionality of the server or if there's some way I can require authentication prior to publishing a live stream even though I'm not on the four-and-a-half-times-more-expensive edition of the product.
I am trying to live stream with FMS. I can stream video to the server with Flash Media Live Encoder, but when I create the player to receive the live stream from the server, I cannot receive it. Can anyone give me a step-by-step tutorial on how to do this?
We have multiple live streams (games) in our application, and one of our requirements is to record each game when it starts and end the recording after it stops. We have written server-side code for this based on guidelines provided in the forums.
We have written a PHP page which receives the Game Start and Game Stop events and calls the FMS startRecord and stopRecord functions accordingly. We tested this recording manually by passing the values to the PHP page, and recording works perfectly. Our problem arises when we automate this recording: every minute about 10 games are created, and hence the PHP page calls FMS 10 times a minute to startRecord and stopRecord. Some of the recorded FLVs are inconsistent, and the recording is never complete.
Can FMS take such sequential requests? We are stuck with this because the recording is sometimes partial. We are using FMS 4 with Red Hat. Any other information will be provided.
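FMS itself can handle that request rate; partial files more often point to the start/stop handlers sharing state between games. For comparison, a minimal server-side pair (a sketch only; the rec_ naming and per-game bookkeeping are illustrative) that keeps one recording Stream per game:

    // main.asc -- hedged sketch of remotely callable recording controls,
    // with one recording Stream per game name.
    var recorders = {};

    Client.prototype.startRecord = function(gameName) {
        if (!recorders[gameName]) {
            var s = Stream.get("rec_" + gameName);   // file rec_<game>.flv
            if (s) {
                s.record();                 // open the file for recording
                s.play(gameName, -1, -1);   // copy the live stream into it
                recorders[gameName] = s;
            }
        }
    };

    Client.prototype.stopRecord = function(gameName) {
        var s = recorders[gameName];
        if (s) {
            s.record(false);   // close the file cleanly
            s.play(false);
            delete recorders[gameName];
        }
    };

If a single shared variable holds the current recording Stream, ten games a minute will overwrite it and recordings will be cut short, which matches the symptom described.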
If I set EncryptionScope to "server" in httpd.conf (<fms>/Apache2.2/conf/httpd.conf), how or where do I specify the certificates for Flash Access if I want to use "ProtectionScheme FlashAccessV2"?
I have FMS 4.5 (licensed) set up and running fine. Now I want to stream live to iOS devices, but no luck. For testing I have a web page with a video src tag pointing to URL..., and on the server I have a livestream.m3u8 pointing to the same URL... The encoder has the AAC plugin and all the presets as it should, but when I go to view it on my iPad 2, this is how it looks.
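For reference, with the stock livepkgr application, Adobe's HTTP Live Streaming examples use playlist URLs of this general shape (8134 is the default port of FMS 4.5's bundled Apache, and liveevent is the default event name; the host name here is hypothetical):

    http://myfmsserver.example.com:8134/hls-live/livepkgr/_definst_/liveevent/livestream.m3u8

with the encoder publishing a stream name such as livestream?adbe-live-event=liveevent. If the event name in the m3u8 path and the encoder's stream name don't match, the iPad shows only a blank player.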
Live Stream Encoder ====> My FMS (my own server, republishes to) ====> CDN ====> Viewing Client
I want to archive the live stream at the first stop, my FMS. The Adobe Live Docs say that if I want the recorded file to be in F4V (H.264) format, the stream name I publish with must be of the form mp4:streamname.f4v. My CDN, unfortunately, requires the stream name it receives to be plain "streamname".
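One way around the conflict, sketched under the assumption of an edition with server-side scripting, is to leave the published name as plain streamname for the CDN and record a renamed F4V copy in main.asc:

    // main.asc -- hedged sketch: archive an H.264 copy without renaming
    // the stream the CDN sees. Names are illustrative.
    application.onPublish = function(client, stream) {
        var rec = Stream.get("mp4:" + stream.name + ".f4v");   // e.g. mp4:streamname.f4v
        if (rec) {
            rec.record();                    // archive locally as F4V
            rec.play(stream.name, -1, -1);   // copy the incoming live stream
            client.rec = rec;                // remember it for onUnpublish
        }
    };
    application.onUnpublish = function(client, stream) {
        if (client.rec) {
            client.rec.record(false);
            client.rec.play(false);
        }
    };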
I'm trying to record an FLV using server-side scripting, and it's working as expected. I'm using record('append') to add streams to the same file. What I want now is to record the streams in "real time": I want to keep the same append behavior, but I also want to "record" the time during which no stream is available. In other words, I want to end up with a single FLV in which the streams are separated by black frames wherever no stream was available: if the gap between one stream and the next is 5 minutes, I want the FLV to play 5 minutes of black.
I am a beginner with FMS. I have a licensed version of FMS, and I want to stream live video from localhost or from another machine with a camera, and then broadcast it to many clients. I don't know how to begin.
I'm new to Adobe Flash, and I'm just trying out Flash Media Server 4.5 to stream live video on a local machine. I'm using FMLE 3.2 to capture the video from my digital camera, which works fine, and I can connect to the server with no worries as well. But I have issues streaming the content in the sample video player that comes with the server.
[Code]....
I've also tried playing some sample videos in it; it refuses to play even those. What could be the problem?
One server is the streaming server, running Flash Media Interactive Server 3.5 and hosting the application with its .asc files in the FMS application directory. The second server is an IIS web server hosting the HTML, ASPX, SWF, etc. files. So basically I have a SWF file on one server that has to connect to live streaming via RTMP on a different server with a different IP address. I did not find any clear explanation of this cross-domain RTMP issue. As far as I know, only HTTP can use the crossdomain.xml policy file, not RTMP. So what security policy procedure is needed to enable a SWF hosted on one server to connect to, and show, an RTMP live camera broadcast from another server?