Good day - how do I create a support request for my new Flash Media Interactive Server? I've tried to contact our Russian Adobe support, but got a reply stating that FMS is not supported by Adobe support. By the way, how can that be? Nevertheless, I do need to open a support ticket, as I have several problems that interfere with running FMS in production.
We're trying to stream Flash 9 .F4V files without any success, so we upgraded from FMS 3.0 to 3.5, thinking that would do the trick, but they still won't stream. I see MP4 support mentioned on Adobe's site, but not a word about .F4V support.
1. I entered a URL in a video player: rtmp://10.0.0.1/vod/mp4:sample1_1500kbps.f4v
2. This video doesn't exist on the 10.0.0.1 server; it exists on 10.0.0.2.
3. Server 10.0.0.1 accepts the connection and the play request, but the resource is not found.
4. I want server 10.0.0.1 to know that this video exists on 10.0.0.2 and then redirect the whole URL to 10.0.0.2, e.g. rtmp://10.0.0.2/vod/mp4:sample1_1500kbps.f4v (see the sketch below).
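If the stock vod application can't do this on its own, would a custom server-side script along these lines be the right direction? This is only a rough sketch: streamExistsLocally and passing the stream name as a connect argument are my own assumptions, not FMS APIs; only redirectConnection and acceptConnection come from the Server-Side ActionScript reference.

// main.asc sketch for a custom application on 10.0.0.1
application.onConnect = function(client, streamName) {
    // streamExistsLocally is a hypothetical helper I would have to write myself
    if (streamName != undefined && !streamExistsLocally(streamName)) {
        // Sends NetConnection.Connect.Rejected with redirect info; a player
        // that reads info.ex.redirect can then reconnect to the second server
        // and play the same stream name.
        application.redirectConnection(client, "rtmp://10.0.0.2/vod");
        return;
    }
    application.acceptConnection(client);
};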
I see that the new Flash Media Servers have been released. Looking over the information on the product pages, is Flash Media Enterprise Server 4 the only available product that supports RTMFP connections?
We currently have Flash Media Server 3.5 and our product utilizes RTMP connections, but I've been working with your Stratus server and RTMFP and have found that the new protocol better fits our needs. However, the price of Flash Media Enterprise Server 4 is beyond our budget. Is there any possibility that RTMFP support is or will be made available in the lower tiers of Flash Media Server?
I read a blog article (http:(url......)) talking about switching between different bit-rate streams of the same video content on the fly, but I cannot find such a capability in the RTMP specification. I am wondering whether this has to be implemented by users in ActionScript?
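From what I can gather, FMS 3.5 exposes this as client-side "dynamic streaming" through NetStream.play2. A rough sketch of how I think it is used (the stream names are made up); please correct me if the switching actually happens somewhere else:

import flash.net.NetStream;
import flash.net.NetStreamPlayOptions;
import flash.net.NetStreamPlayTransitions;

// Switch an already-playing stream to another bit-rate rendition of the
// same content without restarting playback.
function switchRendition(ns:NetStream, fromName:String, toName:String):void {
    var opts:NetStreamPlayOptions = new NetStreamPlayOptions();
    opts.oldStreamName = fromName;                       // e.g. "mp4:sample_700kbps.f4v"
    opts.streamName    = toName;                         // e.g. "mp4:sample_1500kbps.f4v"
    opts.transition    = NetStreamPlayTransitions.SWITCH;
    ns.play2(opts);
}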
I use the H.264 video format at 25 fps, 800x600, 1000 Kbps. When I start encoding, CPU utilization goes up to 90-100% and plenty of video frames are dropped. To reduce CPU utilization, I want to use the hardware encoding of an NVIDIA GeForce or ATI video card. Does FMLE 3 support hardware H.264 encoding?
I'm trying to make a new Flash player to stream a video from the vod folder, but after creating the player, the video doesn't show. It only shows a blank screen with the player skin.
What is the format of the RTMPE URL to put into the source/content path?
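Here is roughly what I have so far; the server address, application name, and file name are placeholders. My understanding is that an RTMPE URL has the same shape as an RTMP one, just with the rtmpe scheme, but please correct me if the player expects something different.

import flash.events.NetStatusEvent;
import flash.media.Video;
import flash.net.NetConnection;
import flash.net.NetStream;

var nc:NetConnection = new NetConnection();
nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
nc.connect("rtmpe://myserver/vod");                     // encrypted RTMP to the vod app

function onStatus(e:NetStatusEvent):void {
    if (e.info.code == "NetConnection.Connect.Success") {
        var ns:NetStream = new NetStream(nc);
        ns.client = { onMetaData: function(info:Object):void {} };
        var video:Video = new Video();
        video.attachNetStream(ns);
        addChild(video);
        ns.play("mp4:sample1_1500kbps.f4v");            // placeholder file name
    }
}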
The documentation is for FMS "4.5.1". Is PHDS for a live HTTP stream also supported in FMS 4.5? I tried setting Application.xml as instructed in the guide, and when I viewed the .f4m from the live stream, I did not see any indication that it was encrypted.
The signal is a YUV or PCM stream, and I use FMLE 3.0 to encode it to H.264. Once encoding starts, CPU utilization goes up to 100%. I want to know which kind of CPU can support this live stream. In other words, my requirement is to encode one live stream with CPU utilization below 30%.
I would like to build a video conferencing app that supports H.264. Is there a way to use Flash to do that? I understand that Flash natively supports the Sorenson Spark video codec and that you need to use Flash Live Encoder 3 for H.264 encoding. I can't seem to find any documentation that shows me how to use FLE3 with ActionScript.
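For context, this is the publish path I have working inside Flash itself (a rough sketch with placeholder application and stream names). As far as I can tell, the player encodes the camera with Sorenson Spark here, and FMLE 3 is a separate desktop encoder rather than something driven from ActionScript, which is why I'm asking.

import flash.events.NetStatusEvent;
import flash.media.Camera;
import flash.media.Microphone;
import flash.net.NetConnection;
import flash.net.NetStream;

var nc:NetConnection = new NetConnection();
nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
nc.connect("rtmp://myserver/videochat");        // placeholder application name

function onStatus(e:NetStatusEvent):void {
    if (e.info.code == "NetConnection.Connect.Success") {
        var ns:NetStream = new NetStream(nc);
        ns.attachCamera(Camera.getCamera());    // encoded by the player, not H.264
        ns.attachAudio(Microphone.getMicrophone());
        ns.publish("caller1", "live");          // placeholder stream name
    }
}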
I have called Adobe roughly a dozen times today to speak to someone about purchasing the developer "incident support bundle." I've been told to call about a dozen different numbers, and still have not had any luck.
Another question re setting up FMS for multiple video conferences: I have 3 or 4 video feeds being generated from different locations, and those feeds are going to be consumed by others. Can I do that through the same NetConnection/FMS application, or do I need to create multiple FMS applications (i.e. one for each feed)? I currently have one feed on a NetConnection, but I'm looking to expand it.
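To make the question concrete, here is the kind of thing I'm picturing on the subscriber side: one NetConnection to one application carrying several NetStreams, one per feed. The application and feed names are placeholders.

import flash.events.NetStatusEvent;
import flash.media.Video;
import flash.net.NetConnection;
import flash.net.NetStream;

var feeds:Array = ["feed_london", "feed_paris", "feed_tokyo"]; // placeholder names
var nc:NetConnection = new NetConnection();
nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
nc.connect("rtmp://myserver/conference");                      // single application

function onStatus(e:NetStatusEvent):void {
    if (e.info.code != "NetConnection.Connect.Success") return;
    for each (var name:String in feeds) {
        var ns:NetStream = new NetStream(nc);   // one NetStream per feed
        ns.client = { onMetaData: function(info:Object):void {} };
        var v:Video = new Video();
        v.attachNetStream(ns);
        addChild(v);
        ns.play(name);
    }
}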
I downloaded the Flash streaming server and installed it, and it works fine. I want to know the following: how do I stream video using a playlist? What is the format of the playlist? Can I create a playlist from Visual Studio .NET and pass the value to the player?
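Is something like this server-side script the intended way to do it? A sketch assuming the Stream.get / Stream.play Server-Side ActionScript calls and made-up clip names; the list itself could presumably be generated by my .NET code and passed in over NetConnection.call.

// main.asc sketch: build a server-side playlist that clients play as "playlist"
application.onAppStart = function() {
    this.playlist = Stream.get("playlist");
    if (this.playlist) {
        // reset=true starts the list; reset=false appends the next clip
        this.playlist.play("mp4:clip1.f4v", 0, -1, true);
        this.playlist.play("mp4:clip2.f4v", 0, -1, false);
        this.playlist.play("mp4:clip3.f4v", 0, -1, false);
    }
};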
As stated above, I wanted to know if I can take a video playlist, say with 4 clips, and have FMIS create a separate file out of the 4 individual clips. Specifically, a user selects 4 (or any number) of clips to create their playlist, then at the end, a 'full length' file is created. If this isn't clear, let me know and I'll try to elaborate.
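The closest thing I've come up with so far is the server-side sketch below: record a stream while playing the selected clips into it back-to-back, which (if I understand Stream.record correctly) would write one combined file, though only in real time. The clip names and the buildFullLength method are my own placeholders.

// main.asc sketch: the client calls nc.call("buildFullLength", null, clipNames)
application.onConnect = function(client) {
    client.buildFullLength = function(clips) {
        var s = Stream.get("fulllength");       // placeholder output stream name
        s.record();                             // record whatever this stream plays
        for (var i = 0; i < clips.length; i++) {
            // reset=true for the first clip, append (reset=false) for the rest
            s.play(clips[i], 0, -1, i == 0);
        }
    };
    application.acceptConnection(client);
};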
I want to create a video chat application between 2 people, and I want to record the dialogue between the two people to one audio file, so both of them can play back the dialogue at a later time.
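So far the only recording I know how to do is the publish-with-record flag below, which writes one file per participant on the server rather than a single mixed audio file; is there a way to combine the two sides? The stream name is a placeholder.

// Sketch on each participant's side: the "record" flag makes FMS write the
// published audio to a file in the application's streams folder for later playback.
var ns:NetStream = new NetStream(nc);
ns.attachAudio(Microphone.getMicrophone());
ns.publish("dialogue_userA", "record");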
When using adaptive bitrates, do I have to encode multiple bitrates from my encoder AND create a manifest, or can I simply encode/send one bitrate and have the FMS server do everything else?
Create a new folder named AudioStreams at C:\Program Files\Adobe\Flash Media Server 4.5\applications. Open the AudioStreams folder and add a new folder named streams. Inside this folder, place a new folder named _definst_ and copy the four audio files from the exercise into the _definst_ folder. I am confused: when should we create the _definst_ folder, and when should we use the C:\Program Files\Adobe\Flash Media Server 4.5\applications\livemedia folder?
Can Adobe Flash Media Streaming Server 3.5 run on an AMD Athlon dual-socket quad-core? I just requested a Dell server to be added to our farm to run as a media server, and to my surprise, while reading the requirements for FMS, it states the following: 3.2 GHz Intel® Pentium® 4 processor (dual Intel Xeon® or faster recommended).
I'm trying to troubleshoot a Flash Media Server working with a little video playback application I wrote a few years ago that has suddenly stopped working. I'm using CS3/ActionScript 3. My app uses the FLVPlayback component and was working well the last time I checked. I received a report that the videos stopped working and have been looking into it. I figured I'd add a bunch of event listeners to the FLVPlayback's ncMgr.netConnection so I could get debug info on things like IO errors, net status, etc. The problem I'm running into is that the netConnection is null when I point the player at anything on my Flash Media Server, and adding any event listeners to this netConnection throws errors. Here's what I've tried so far:
Playback of a local FLV file works fine. In the FLVPlayback documentation, I found an example and stole the URL of the stream it was using, and that works fine, although it is an HTTP stream rather than RTMP. Any attempt to access FLV files on my Media Server, which has worked fine in the past, basically causes the FLVPlayback object to sit and hang in "buffering" mode and never progress beyond that point. The netConnection object in this case is null. Here's my code:[code]........
Again, the purpose of this is to troubleshoot the video streaming from the Flash Media Server, and it seems like no NetConnection to the server is being created. Does this mean that the server is not working, or is there a problem with the way I'm trying to access the content on the server? This was all working fine before, and I have set up the server-side application .ASC files and such to allow things to work on the server end.
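In case it helps to see it, this is the listener-attaching approach I'm attempting, where player is the FLVPlayback instance on the stage and the source path is a placeholder. My assumption is that ncMgr.netConnection only exists after the component reports READY, so I wait for that event first.

import fl.video.FLVPlayback;
import fl.video.VideoEvent;
import flash.events.NetStatusEvent;

player.addEventListener(VideoEvent.READY, onReady);
player.source = "rtmp://myserver/vod/mp4:sample.f4v";   // placeholder path

function onReady(e:VideoEvent):void {
    // ncMgr.netConnection is null until a connection actually exists
    if (player.ncMgr.netConnection != null) {
        player.ncMgr.netConnection.addEventListener(
            NetStatusEvent.NET_STATUS, onNetStatus);
    }
}

function onNetStatus(e:NetStatusEvent):void {
    trace("NetStatus: " + e.info.code);
}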
I'm trying to write a piece of software that sends video and audio data to a Flash Media Server using the RTMP protocol. Currently, my program can communicate with the Flash Media Server correctly. The RTMP specification does not describe the raw data inside video/audio messages, so I muxed raw H.264 and AAC data into video/audio messages and sent them to the server. The server seems to accept them, but a video player cannot play back the stream served from it; the player just says "Loading..." For test purposes, I sniffed the network packets between Wirecast and the Flash Media Server and ripped out only the video and audio data. Then I muxed that data into video/audio messages and sent it to the Flash Media Server. In this case, a video player connected to the server can play back the stream correctly.
I checked the stream sent from Wirecast; it does not appear to be raw H.264 data, because the payloads start with 0x17 rather than an H.264 start code. Given this situation, I am wondering what kind of container format I should use for the H.264/AAC data I send to the Flash Media Server.
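To check my understanding of the layout, this is how I'm now planning to wrap each frame, following the FLV tag body format, which I believe is what RTMP audio/video message payloads use. It's written as an ActionScript ByteArray sketch purely for illustration; my actual software is not in ActionScript.

import flash.utils.ByteArray;

// Video message bodies: one AVC "sequence header" first, then one body per access unit.
function avcSequenceHeader(decoderConfig:ByteArray):ByteArray {
    var body:ByteArray = new ByteArray();
    body.writeByte(0x17);            // frame type 1 (keyframe) + codec id 7 (AVC)
    body.writeByte(0x00);            // AVCPacketType 0 = decoder configuration
    body.writeByte(0); body.writeByte(0); body.writeByte(0); // composition time = 0
    body.writeBytes(decoderConfig);  // SPS/PPS packed as AVCDecoderConfigurationRecord
    return body;
}

function avcFrame(keyframe:Boolean, nalu:ByteArray):ByteArray {
    var body:ByteArray = new ByteArray();
    body.writeByte(keyframe ? 0x17 : 0x27); // 0x27 = inter frame + AVC
    body.writeByte(0x01);                   // AVCPacketType 1 = NALU
    body.writeByte(0); body.writeByte(0); body.writeByte(0); // composition time offset
    body.writeUnsignedInt(nalu.length);     // 4-byte length prefix, no Annex-B start code
    body.writeBytes(nalu);
    return body;
}

// Audio message bodies: 0xAF 0x00 + AudioSpecificConfig once,
// then 0xAF 0x01 + a raw AAC frame (no ADTS header) per message.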
I'm trying to create an FMS application that broadcasts some data across connected clients using a SharedObject. I want only the application itself to be allowed to set SharedObject properties.
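The approach I'm considering is the main.asc sketch below: deny clients write access in onConnect so only the server-side script can set properties. The shared object name "data" and the broadcast function are placeholders. Is this the recommended way, or is there a cleaner mechanism?

// main.asc sketch
application.onAppStart = function() {
    this.so = SharedObject.get("data", false);   // non-persistent remote shared object
};

application.onConnect = function(client) {
    client.writeAccess = "";                     // clients may read but not write SOs/streams
    application.acceptConnection(client);
};

// Only server-side code updates the shared object:
function broadcast(name, value) {
    application.so.setProperty(name, value);
}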