Does anyone know how to show a loading progress bar with RTMP? I'm making a player that uses the RTMP protocol because it allows seeking to any position even if the video data is not cached, but I haven't solved how to display how much data has loaded in the progress bar. I can do this over HTTP using NetStream.bytesLoaded and NetStream.bytesTotal. I know RTMP is streaming video, but when you pause a YouTube video you can see the buffer length in light gray, which indicates how much of the video is cached. I want to know whether they use the RTMP protocol or something else for that.
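Over RTMP, bytesLoaded/bytesTotal aren't meaningful because the server only sends a few seconds ahead of the playhead. What you can show is the playback position plus NetStream.bufferLength against the duration reported in onMetaData, which gives the YouTube-style light-gray segment. A minimal AS3 sketch, assuming an already-connected NetConnection nc, a server-side stream called "myVideo", and two bar shapes playedBar/grayBar of total width barWidth that you draw yourself:

import flash.events.TimerEvent;
import flash.net.NetStream;
import flash.utils.Timer;

// Sketch only: nc is an already-connected NetConnection; playedBar, grayBar
// and barWidth are things you define yourself for the progress bar graphics.
var ns:NetStream = new NetStream(nc);
ns.bufferTime = 5;                 // seconds to keep buffered ahead of the playhead
var duration:Number = 0;
ns.client = {
    onMetaData: function(info:Object):void {
        duration = info.duration;  // total length in seconds, from the stream metadata
    }
};
ns.play("myVideo");                // placeholder stream name

var poll:Timer = new Timer(250);
poll.addEventListener(TimerEvent.TIMER, function(e:TimerEvent):void {
    if (duration > 0) {
        var played:Number   = ns.time / duration;                      // played (dark) part
        var buffered:Number = (ns.time + ns.bufferLength) / duration;  // light-gray part
        playedBar.width = barWidth * Math.min(played, 1);
        grayBar.width   = barWidth * Math.min(buffered, 1);
    }
});
poll.start();

If you want a longer gray segment, raise bufferTime; the server will then send more data ahead of the playhead, which is roughly what the YouTube-style players do.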
I'm trying to make software that sends video and audio data to a Flash Media Server using the RTMP protocol. Currently, my program can communicate with the Flash Media Server correctly. The RTMP specification does not describe the raw data inside video/audio messages, so I muxed raw H.264 and AAC data into video/audio messages and sent them to the server. The server seems to accept them, but a video player cannot play back the stream sent from the server; the player just says "Loading..." For test purposes, I sniffed the network packets between Wirecast and the Flash Media Server and ripped out only the video and audio data. Then I muxed those data into video/audio messages and sent them to the Flash Media Server. In this case, the video player connected to the server can play back the stream correctly.
I checked the stream sent from Wirecast: it does not seem to be raw H.264, because the data start with 0x17 rather than an H.264 start code. Given this situation, I am wondering what kind of container format I should use for the H.264/AAC data I send to the Flash Media Server.
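The 0x17 you're seeing is the FLV VideoTagHeader byte: frame type 1 (keyframe) in the high nibble and codec ID 7 (AVC) in the low nibble. RTMP video/audio message payloads are FLV tag bodies, so H.264 has to be wrapped as an AVC video packet (an AVCDecoderConfigurationRecord "sequence header" first, then length-prefixed NAL units instead of Annex-B start codes), and AAC likewise gets a two-byte header (0xAF 0x00 plus the AudioSpecificConfig first, then 0xAF 0x01 plus raw AAC frames). The sequence-header message has to go out before any NALU messages, which is likely why the player sits at "Loading..." with raw H.264. A rough AS3 ByteArray sketch of the video body layout, where avcConfig and nalu are whatever your encoder produces:

import flash.utils.ByteArray;

// Layout sketch of the RTMP video message bodies; avcConfig is the
// AVCDecoderConfigurationRecord (SPS/PPS) and nalu is one H.264 NAL unit.
function buildAvcSequenceHeader(avcConfig:ByteArray):ByteArray {
    var body:ByteArray = new ByteArray();
    body.writeByte(0x17);                       // frame type 1 (keyframe) | codec id 7 (AVC)
    body.writeByte(0x00);                       // AVCPacketType 0 = sequence header
    body.writeByte(0); body.writeByte(0); body.writeByte(0);  // composition time = 0
    body.writeBytes(avcConfig);                 // AVCDecoderConfigurationRecord
    return body;
}

function buildAvcNalu(nalu:ByteArray, keyframe:Boolean):ByteArray {
    var body:ByteArray = new ByteArray();
    body.writeByte(keyframe ? 0x17 : 0x27);     // 0x27 = inter frame, still codec 7
    body.writeByte(0x01);                       // AVCPacketType 1 = NALU
    body.writeByte(0); body.writeByte(0); body.writeByte(0);  // composition time offset
    body.writeUnsignedInt(nalu.length);         // 4-byte length prefix, no start code
    body.writeBytes(nalu);
    return body;
}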
I'm running Flash Media Streaming Server and have only been serving VOD up until now. I had my network administrator open up port 1935 to the outside world during the setup process and now I can't remember if that was actually required for streaming VOD to clients. Most documentation I've read says that this port should be open, but I seem to recall reading something at one point that suggested it wasn't necessary.
I've just started messing around with publishing live streams from Flash Media Live Encoder to the Flash Media Streaming Server. I have that working without issue, but was surprised to find that no authentication is required before a client running the live encoder can publish a stream to the Flash Media Streaming Server. An authentication module is available; however, it only works with Flash Media Interactive Server and Flash Media Development Server.
If I leave port 1935 open to the outside world, there would be nothing to stop anybody anywhere from streaming video via my server. Anyone else running a default install of Flash Media Streaming Server and with port 1935 open to the outside should see that this is true of their setup as well. I'm wondering if I can safely close port 1935 without limiting the functionality of the server or if there's some way I can require authentication prior to publishing a live stream even though I'm not on the four-and-a-half-times-more-expensive edition of the product.
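For what it's worth, the other way to gate publishing, a server-side script, is also limited to the Interactive/Development editions, since the Streaming edition doesn't run main.asc at all. If you were on one of those editions, a minimal check would look something like the sketch below; "publishKey" is a made-up credential that a custom publishing client would pass as an extra argument to NetConnection.connect (FMLE can't pass connect arguments directly, so with FMLE you'd have to embed a key in the connection URL and check it on the server instead).

// main.asc sketch -- Interactive/Development editions only. "publishKey" is a
// hypothetical extra argument sent by a custom client via NetConnection.connect().
application.onConnect = function(client, publishKey) {
    if (publishKey == "my-secret-key") {
        this.acceptConnection(client);
    } else {
        this.rejectConnection(client);
    }
};

Closing 1935 outright would also block outside viewers, since RTMP playback uses the same port.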
We had FMS2 installed before, and the paths to all our videos look like rtmp://ServerName/sites/.... (the default path on FMS2). Now we have upgraded to FMS4 and would like to keep these paths the same, because we have many HTML pages that reference these videos. However, the default path on FMS4 is rtmp://ServerName/vod/... Is there a way to change "vod" to "sites"?
I tried to change VOD_COMMON_DIR in fms.ini from /install_dir/webroot/vod to /install_dir/webroot/sites, and also changed the document root in httpd.conf, but rtmp://ServerName/sites/ is still not working.
In our company we upgraded to FMS 4 and installed the Apache that comes bundled with the installation. We have it configured, and now we can play videos in HTML5 using the HTTP protocol. The question I have is: can we use the HTTPS protocol instead of RTMPS? I've been doing a lot of research, and I found documentation that says to put a minus sign in front of port 443 (-443) in the ADAPTOR.HOSTPORT line of the fms.ini file, but it also says that with this configuration port 443 will only receive RTMPS connections. My next step is to put the minus sign in front of the port number, restart the server, and try to establish a connection over HTTPS to see if it works.
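For reference, the line being described would look something like this in fms.ini (the other ports are just the common defaults; yours may differ). Per the documentation quoted above, the minus sign dedicates 443 to RTMPS, so plain HTTPS requests on that port would not be answered by FMS itself:

# fms.ini -- adaptor listening ports; "-443" marks port 443 as RTMPS-only
ADAPTOR.HOSTPORT = :1935,80,-443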
I am publishing my live stream with the RTMPT protocol on port 80 to get through network proxies. But after 10 to 15 minutes of the telecast, the video gets stuck and the connection is lost. I don't have this problem when I use the RTMP protocol on port 1935.
I'm developing an online meeting system with audio/video sharing, using Adobe Flex 4 and Flash Media Server 4. I'm using the RTMFP protocol for the audio/video transmission, which considerably improves performance. The trouble now is that I must record the audio/video being transmitted, but I found out that when using RTMFP the media doesn't flow through FMS's channels. So how could I make FMS record these channels?
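One common workaround, sketched below, is to have each publishing client also push its camera and microphone over an ordinary RTMP NetConnection, purely so the server has something to record; the RTMFP path stays as is for the live meeting. The application URL and stream name here are placeholders.

import flash.events.NetStatusEvent;
import flash.media.Camera;
import flash.media.Microphone;
import flash.net.NetConnection;
import flash.net.NetStream;

// Hypothetical second connection used only so the server can record;
// the RTMFP connection for the meeting itself is untouched.
var recNC:NetConnection = new NetConnection();
recNC.addEventListener(NetStatusEvent.NET_STATUS, function(e:NetStatusEvent):void {
    if (e.info.code == "NetConnection.Connect.Success") {
        var recNS:NetStream = new NetStream(recNC);
        recNS.attachCamera(Camera.getCamera());
        recNS.attachAudio(Microphone.getMicrophone());
        recNS.publish("user1_stream", "record");   // FMS writes the file server-side
    }
});
recNC.connect("rtmp://yourfms/meetingapp");        // placeholder application URL

The cost is that each publisher uploads twice: once to the RTMFP group and once to the server.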
I want to use a customised encoder to publish live streams to an FMS 4.5 server. Is it possible to connect the encoder and the FMS 4.5 server over the HTTP protocol? I have read in other articles that FMLE can connect to FMS over the RTMP protocol.
I have a small LAN of about 8 computers, all of which are running Windows 7. I have installed FMS and the XAMPP web server on one of the machines. I want to stream live from one PC to all the other PCs on the LAN. I have a webpage with JW Player embedded in it on my XAMPP web server that is able to see the live stream when I start it locally; that is, the live stream works fine on the machine with the servers on it. But when I try to view the live stream from another machine on the LAN by accessing the webpage that has the JW Player, it returns a "server not found: rtmp://192.168.10.1/live" error. I was thinking that maybe a firewall is blocking port 1935, but I have turned off the firewall on every PC on the LAN and uninstalled every antivirus program on all the PCs, and I still get the same error when I try to access the live stream from another PC on the LAN. When I run netstat -a -n | find ":1935" I get 192.168.10.2:49184 192.168.10.1:1935 SYN_SENT, so I think the request for the stream is sent but the connection is rejected.
This is the code for the webpage with JW Player embedded in it, in case it's relevant: <html> <head> <title>JW FLV Media Player</title> <script type="text/javascript" src="swfobject.js"></script>
Somebody I know can't watch because they are behind a proxy server. The nc.connect() call to get the live stream fails because it goes directly from AS3 to FMS and doesn't go through the proxy server.
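A common workaround is to fall back to RTMPT on port 80 (and optionally RTMPS on 443) when the plain RTMP connect fails, since RTMPT tunnels over HTTP and usually makes it through proxies. A sketch, with a placeholder host and application name:

import flash.events.NetStatusEvent;
import flash.net.NetConnection;

// Try plain RTMP first, then tunnelled fallbacks; host and app are placeholders.
var urls:Array = [
    "rtmp://example.com/live",
    "rtmpt://example.com:80/live",
    "rtmps://example.com:443/live"
];
var index:int = 0;
var nc:NetConnection = new NetConnection();
nc.addEventListener(NetStatusEvent.NET_STATUS, function(e:NetStatusEvent):void {
    if (e.info.code == "NetConnection.Connect.Success") {
        // connected: create the NetStream and start playback here
    } else if (e.info.code == "NetConnection.Connect.Failed" && ++index < urls.length) {
        nc.connect(urls[index]);   // try the next protocol in the list
    }
});
nc.connect(urls[index]);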
I am new to streaming and Flash server. When we try to use RTMP over HTTP, the outside client gets the internal IP address of the FMS server instead of the NAT/public IP address. How can we solve this?
I was previously using FMS 3.5 to record from Flash Media Live Encoder using the DVR app. This worked well, as I was able to seek into a live stream (all of it, not just a window), and I was using RTMP for playback (not HTTP).
Now I would like to upgrade to 4.5 and keep using RTMP for DVR, BUT add support for iOS. So my question is, is this possible? I've looked over the documentation many times and cannot figure it out. I see the livepackager, and that appears to support HLS, but it also requires using Adobe HTTP Dynamic Streaming, and it says nothing about true DVR (i.e., actually recording the stream).
So is the DVR app now obsolete, and if not, how can I add iPhone support to it? I already have a custom player and really hope I don't need to rewrite it to use Adobe HTTP Dynamic Streaming just for this.
I am trying to encode and stream live video. I have downloaded both the Flash Media Encoder and Flash Media Server. To complete the process of streaming video, it appears that I need to obtain a "flash media server URL address" which is called "RTMP" on the encoder page. Where can I find this? Is this something that can be downloaded, or do I have to purchase this from a partner like Level 3 communications, AT&T, etc.?
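To be clear about what goes in that field: the RTMP URL is not something you download or purchase; it is just the address of whichever Flash Media Server you publish to. If you run FMS yourself, a typical FMLE setup against the stock "live" application looks like the example below (the host name and stream name are placeholders); a provider such as Level 3 or AT&T only enters the picture if you want someone else to host the server for you.

FMS URL: rtmp://your-server-hostname-or-ip/live
Stream:  livestream

Players then connect to rtmp://your-server-hostname-or-ip/live and play the stream named "livestream".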
I am having a serious issue finding my RTMP address, which may be something very simple to you all out there. I just installed Flash Media Server 3.5 at C:Program Files.... I have purchased a video Flash chat program and it needs to know the RTMP address, which I'm not sure of. I named the Flash server EazyFlash. I'm not sure if I should install the Flash server somewhere else on the server so the website can see it. The Flash console and sample videos are working great, but it's all on the C: drive. Would it be RTMP://EazyFlash? The software came with a Flash application, a folder called videoflashchat, which I copied to the applications folder.
I need to re-stream one RTMP stream from server1 to 5 or more FMS servers and make sure they are always connected; if they lose the connection for some reason, it should keep trying to reconnect. Please note I'm not a coder. I need a simple example of how I can do this without hiring someone to do it.
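Fair warning that this does involve a small server-side script (main.asc) on server1, which in turn needs the Interactive or Development edition. A rough sketch of the FMS "multipoint publishing" pattern for a single target server follows; the target URL and stream name are made up, and for five servers you would repeat the same thing over an array of target URLs.

// main.asc sketch on server1; target URL and stream name are placeholders.
var targetUrl = "rtmp://target1/live";
var nc, ns;

function pushStream() {
    nc = new NetConnection();
    nc.onStatus = function(info) {
        if (info.code == "NetConnection.Connect.Success") {
            ns = new NetStream(nc);
            ns.attach(Stream.get("livestream"));   // local stream being republished
            ns.publish("livestream", "live");      // push it to the target server
        } else if (info.code == "NetConnection.Connect.Closed"
                || info.code == "NetConnection.Connect.Failed") {
            // retry the connection after 5 seconds
            var id = setInterval(function() { clearInterval(id); pushStream(); }, 5000);
        }
    };
    nc.connect(targetUrl);
}

application.onPublish = function(client, stream) {
    if (stream.name == "livestream") pushStream();
};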
Currently, we are streaming using RTMP, but I want to switch to HLS to allow iOS devices to stream our video. I do not want to encode two separate streams though because we are operating with limited bandwidth on mobile data cards. If I stream using the livepkgr, the delay to iOS devices is about 12 seconds which is OK. However, on desktop clients, we have live scoring integrated with our video, so we cannot have a delay that long. My question is, if I connect to the same stream in a Flash client for the desktop, will the delay still be that long? Or will it stream in real time to flash clients with the delay only for iOS devices using HLS? Also, am I correct in assuming that I can do this on 1 stream, or do I need to setup a separate stream for RTMP and HLS?
I need to configure the FMS server so that it will listen only for RTMP requests and disable other services like HDS and PLD. What configuration is required for this arrangement?
We have RTMP and HLS live and vod streams running on FMS interactive 4.5. Given that FMS documentation on security and content protection is very scarce I have the following questions:
1. How do I protect my live and vod RTMP streams from embedding in other webpages?
2. How do I protect my live and vod RTMP streams from capturing with special tools like rtmpdump?
3. How do I protect my live and vod HLS streams from embedding in other webpages?
4. How do I protect my live and vod HLS streams from capturing with special tools?
As far as I understand, RTMPE is flawed by design because it can be captured with rtmpdump. SWFVerification also fails, because someone can use exactly the same SWF player on their own site. Can I somehow tell my FMS server to allow only the SWF player served from my domain?
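On that last question: one lightweight option on the Interactive edition is to check client.referrer (the URL the connecting SWF was loaded from) in application.onConnect and reject anything not served from your pages, as in the sketch below (the domain is a placeholder). This deters casual embedding on other sites, but not tools like rtmpdump, which can spoof the SWF URL, and it does nothing for HLS, which is served over plain HTTP.

// main.asc sketch: accept only connections whose SWF was loaded from our domain.
// client.referrer is the URL of the SWF file that issued the connection.
application.onConnect = function(client) {
    var ref = String(client.referrer);
    if (ref.indexOf("http://www.example.com/") == 0 ||
        ref.indexOf("https://www.example.com/") == 0) {
        this.acceptConnection(client);
    } else {
        this.rejectConnection(client);
    }
};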
We are using Influxis FMS Connect. We need to upload videos and audio using a PHP script, and I don't know what exactly to do with this FMS Connect. I need an answer to the following query so that I can get into it: how can I upload audio or video files to the RTMP server using a PHP script, since my PHP scripts are on one server and the FMS is on another RTMP server?
My SWF is playing one video using NetConnection, NetStream and Video objects. If I want to stream one more video simultaneously in the same SWF, I have a few problems. It works when I create more NetConnection, NetStream and Video objects, but is that necessary? The code rapidly becomes complex to handle.
Is there an easier way, like perhaps sharing the NetConnection or something (same FMS server)?
Question 2
The two videos on the stage are supposed to have different sizes, placement, etc. Still, the last one created inherits the properties of the first video display. It also starts playing for an annoying couple of seconds before the first one. How can I avoid that (the inheritance and the delay)?
var ns1:NetStream;
var ns2:NetStream;
var nc1:NetConnection = new NetConnection();
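To the first question: a single NetConnection to the same FMS server can carry any number of NetStreams, so you only need one connection. What each video does need is its own NetStream and its own Video object with its own size and position (the "inherited" look usually means the same Video instance, or its properties, got reused). A sketch under those assumptions, with a placeholder application URL and stream names; the Video objects are created and sized up front, and both streams are started only after Connect.Success, which also removes the head start one of them currently gets:

import flash.events.NetStatusEvent;
import flash.media.Video;
import flash.net.NetConnection;
import flash.net.NetStream;

// Each video gets its own Video object with its own size and position.
var video1:Video = new Video(480, 270);
video1.x = 20;  video1.y = 20;
addChild(video1);

var video2:Video = new Video(240, 135);
video2.x = 520; video2.y = 20;
addChild(video2);

var nc1:NetConnection = new NetConnection();
nc1.addEventListener(NetStatusEvent.NET_STATUS, function(e:NetStatusEvent):void {
    if (e.info.code != "NetConnection.Connect.Success") return;

    var meta:Object = { onMetaData: function(info:Object):void {} };

    // Two NetStreams sharing the single connection.
    var ns1:NetStream = new NetStream(nc1);
    ns1.client = meta;
    var ns2:NetStream = new NetStream(nc1);
    ns2.client = meta;

    video1.attachNetStream(ns1);
    video2.attachNetStream(ns2);

    // Starting both only after the connection succeeds keeps one stream
    // from playing for seconds before the other.
    ns1.play("video1");   // placeholder stream names
    ns2.play("video2");
});
nc1.connect("rtmp://yourfms/vod");   // placeholder application URL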
I have FMS 3.5 installed on a Windows 2003 server, service pack 2. I access the server via remote desktop on my computer and when I use the Sample Video Player, I can get the sample rtmp videos working on the server and also on my own personal computer, but when I try and play the samples in the video player on any other computer, I get a Connection Error. I've verified the ports are all working. I've also checked the admin console and when I connect with both the server and my own computer I get 2 connections. When I try from an outside computer, no connection is shown. So it's not even getting to FMS to even register. What else might I be missing?
Our company is developing a product that includes an RTMP restreaming/transcoding server. Our goal is to restream an RTMP stream taken from an external RTMP server (probably FMS), transcoding it in real time. By transcoding I mean changing the stream resolution, quality and bitrate. Does FMS provide such functionality? Which version of the server do we have to use? Do we have to use additional software for this?
I installed the FMS 4 Developer edition on my CentOS server. I can connect to FMS from a client with RTMP, but with RTMFP, FMS doesn't reply. I looked at the FMS logs and saw this in edge.00.log:
2010-09-30 13:11:53 4984 (e)2631504 RTMFP could not start on edge process for _defaultRoot__edge1. Famille d'adresses non supportée par le protocole ("Address family not supported by protocol") -
Let's say user A connected to the FMS server with an RTMFP NetConnection, and user B connected to the server with an RTMP connection. If I want to connect them together, would it be better if I also connect A to an RTMP NetConnection to the server?
I have some questions regarding 4.5 and deciding which version we must license (FMIS or FMS). Obviously there is a huge price difference, and as this is within a local network we don't require encryption or multicast. Here is the scenario: we need to stream live DVR to Mac and Windows browsers AND to iPhone/iPad and devices running Android 2.2 (Froyo)+. I need someone from Adobe to help us decide.
1. I see DVR is now listed as a feature on the Flash Media Server page. Does that mean it's no longer exclusive to Flash Media Interactive Server, for both HTTP streaming and RTMP?
2. Previously, in order to stream live with Adobe dynamic streaming, Flash Media Interactive Server was required because a live segmenter had to be used. Is this still the case? Or can we just launch Flash Media Live Encoder (FMLE) and stream to Flash Media Server?
3. Related to #2: can we switch between serving streams via RTMP or dynamic streaming without changing the format of the material on disk?
4. I was at a webcast for 4.x and remember being told that the latency for HTTP live dynamic streaming was 20 seconds because of the repackaging that had to take place. Is this still true? What is the latency for iOS playback, from ingest to playback?
5. For the iOS playback feature, do the streams have to be in Adobe dynamic streaming format, or can we just stream via FMLE over RTMP and trust it to do the right thing?
6. Does the iOS playback support DVR?
7. Does the Android plugin have good support for RTMP live streaming, including DVR?