ActionScript 3.0 :: No Sound In Published Swf On Server
Jul 2, 2009
Newbie question, can't find this specific problem in the archives... I've created an app that has lots of audio. The audio is mp3 files. When I test it (CTRL+ENTER) I hear the audio. When I publish it and view it in the .html file locally (not a local hosting environment) I hear the audio. No problems there. But now it's on an actual server and I can't hear the audio. Wondering if there is a setting I've missed. I have not changed any settings related to audio. I'm actually emailing the .swf file, the .html file, and the AC_RunActiveContent.js file to an individual who has access to the server, and he put them where they need to be. I can see all of the visual content, but no sound. I assume I don't have to send all of the audio with it, correct? What do I need to do to hear this audio?
I am developing a video chat application. I am able to play the published video stream, but I am unable to hear the sound that is published along with the video. The video is visible but no sound is coming through. What is going wrong?
The quality of sound when playing my Flash file on the timeline in Flash CS4 is excellent. However, when I go to Test Movie, the sound quality becomes awful, as if it's being played in a tunnel on terrible speakers. My sound files are .wav files. Am I missing a step I must take to ensure the sound quality is carried through to the final published animation?
I have an MP3 file in Flash 5.5 that I have edited in the Properties panel of the frame where the sound is placed. I basically cut a couple of seconds off the beginning by editing the timeline. It plays fine while I'm in Flash, but when I publish the movie, the sound is published unedited.
I need to be able to play any stream published to our FMS application, without connecting to the application as regular users do. I am an administrator and the service is ours. Can I do so using the Administration API, for instance? I can certainly get the list of the streams, but I have not found a way to play these as video yet. I am building an application not unlike the Admin Console, but with more functions that we need. If the Admin Console is able to play streams, it must be possible, right?
In the article at http:[url]... we can see this under 'User lookup': RTMFP assigns a peer ID to each participant. These peer IDs are 256 bits long and are non-forgeable. When you want to subscribe to a directly published stream, you must specify the publisher's peer ID:
var receiveStream:NetStream = new NetStream(netConnection, id_of_publishing_client);
receiveStream.play("media");
In another thread, Michael said: "I believe the problem is that you are attempting to make a P2P connection to the server's peer ID; that is, something like: var ns:NetStream = new NetStream(netConnection, netConnection.farID); ns.play(...); Under the covers, this will open a new RTMFP flow to the server that will appear to the server as a new incoming client, but the initial handshake will be incorrect (the first/only command message is 'play' instead of 'connect'). I see this on Cirrus all the time."
Is it an error in the article, or is it right considering the scenario?
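For reference, a minimal sketch of the two patterns being contrasted (assuming `netConnection` is already connected to an RTMFP service such as Cirrus; the publisher's peer ID is a placeholder obtained out of band, e.g. from a lookup service):

```actionscript
// Sketch: subscribing to a directly published RTMFP stream.

// Correct: pass the *publisher's* peer ID, obtained out of band
// (hypothetical value here), not the server's own ID.
var publisherPeerID:String = "..."; // placeholder for the real 256-bit peer ID
var receiveStream:NetStream = new NetStream(netConnection, publisherPeerID);
receiveStream.play("media");

// Incorrect (the mistake Michael describes): netConnection.farID is the
// server's peer ID, so this opens a new flow to the server whose first
// command is "play" instead of "connect", and the handshake fails.
// var ns:NetStream = new NetStream(netConnection, netConnection.farID);
// ns.play("media");
```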
As a subscriber to a live stream, how do I get the absolute stream time since it was published? Consider this case: a publisher published a live stream at 0 seconds, and 100 seconds have passed since then. A new subscriber comes to FMS and subscribes to the live stream. It starts playing the stream, but the NetStream.time property will start from 0 instead of 100. Is there any way or configuration (in FMS) that allows the subscriber to get the absolute stream time?
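I am not aware of a built-in FMS setting for this, but one workaround sketch (hypothetical callback name, assuming you control the publisher) is to have the publisher periodically inject its own elapsed time into the stream, so subscribers can compute an offset against their local NetStream.time:

```actionscript
// Publisher side: broadcast absolute elapsed seconds as a stream message.
publishStream.send("onAbsoluteTime", {elapsed: (getTimer() - publishStartMs) / 1000});

// Subscriber side: receive it through the NetStream client object.
var offset:Number = 0;
subscribeStream.client = {
    onAbsoluteTime: function(info:Object):void {
        // Offset between the publisher's clock and our local playhead.
        offset = info.elapsed - subscribeStream.time;
    }
};
// Later: absolute stream time is approximately subscribeStream.time + offset.
```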
I publish the video stream with H.264 settings to FMS 3.5 as recorded, and at the receiver's end play it as live. The problem is that at the receiver's end the stream plays from 4-5 seconds behind. I want it to play from the current position of the live stream.
netstrm = new NetStream(nc);
netstrm.play("mp4:" + instanceName + ".f4v", -1);
h264Settings = new H264VideoStreamSettings();
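One common adjustment for this kind of live lag (a sketch, not a guaranteed fix) is to keep the subscriber's play buffer small, since a large bufferTime makes the player sit several seconds behind the live point while it fills:

```actionscript
// Sketch: minimize live latency on the subscriber side.
var netstrm:NetStream = new NetStream(nc);
netstrm.bufferTime = 0.1; // small buffer: stay near the live edge
// -1 asks to play only live data, skipping any recorded portion.
netstrm.play("mp4:" + instanceName + ".f4v", -1);
```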
I am building a custom administration application which has to review streams published to our FMS application. This is to ban clients publishing inappropriate content. I am wondering if I have to make a connection to the application, as normal users do, in order to play the published streams? I am getting the list of streams using another connection, to rtmp://domain:1111/admin, like outlined in Administration API, and issuing the 'getLiveStreams' call.
The reason I am asking is, firstly, that I remember the Administration Console can play streams published to the server, but I don't recall it incrementing the connection count that shows the number of user clients connected to an application. Maybe I missed it, but if I am right and it can indeed play streams without connecting to the application (which actually DOES sound a bit unlikely, given how stream names are local to an app), then maybe I can do it too? Secondly, I presently have to differentiate users and administrators in my 'onConnect' script, because they perform completely different roles. If I could relieve my admin app from connecting to the application, I could also drop the role switching.
I have created an SWF where I can record the webcam picture as H.264 video to FMS 4.5 (I am using the developer version). My code looks like this:
var h264Settings = new H264VideoStreamSettings();
h264Settings.setProfileLevel("baseline", "1.2");
[code]....
I can then replay the video from FMS just fine, but if I copy the video from the FMS application directory into a local project and try to play it with the FLVPlayback component, or with Adobe Media Player, it does not play at all. Is this to be expected? Can't I record a webcam video with FMS in H.264 and use that video later without FMS?
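One thing worth checking (a hedged suggestion, not a confirmed fix): if the publish name has no mp4: prefix, the server records into an FLV container, which other players may not handle well for H.264 content. A sketch of publishing so the recording lands in an F4V/MP4 container (file and variable names are assumptions):

```actionscript
// Sketch: publish H.264 webcam video so FMS records it as an F4V file.
// Assumes nc is a connected NetConnection and cam is a Camera instance.
var h264Settings:H264VideoStreamSettings = new H264VideoStreamSettings();
h264Settings.setProfileLevel(H264Profile.BASELINE, H264Level.LEVEL_1_2);

var ns:NetStream = new NetStream(nc);
ns.videoStreamSettings = h264Settings;
ns.attachCamera(cam);
// The "mp4:" prefix asks the server to record into an MP4/F4V container,
// which standard players can open without FMS.
ns.publish("mp4:myRecording.f4v", "record");
```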
It's easy to do on publishing clients; just use: http:[url].... How do we do it on the FMS? It doesn't appear to be possible according to these docs: http:[url]... What is the default on the FMS for the audioReliable, dataReliable and videoReliable properties?
I am using a Flash app. I can send and receive streaming data to FMS by using this app, like the chart I attached. [URL] This app uses RTMP to access FMS. I want to upgrade this app to display the sound level of each client. Is that possible using ActionScript and FMS? If so, which class should I use?
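As a sketch of one approach (hypothetical names, assuming each client publishes audio through FMS from an attached Microphone): the flash.media.Microphone class exposes activityLevel (0-100), which each client can sample and broadcast over its own stream for the others to display:

```actionscript
// Sketch: each client periodically reports its mic activity level.
var mic:Microphone = Microphone.getMicrophone();
publishStream.attachAudio(mic);

var levelTimer:Timer = new Timer(250); // sample four times per second
levelTimer.addEventListener(TimerEvent.TIMER, function(e:TimerEvent):void {
    // activityLevel is 0-100; -1 means the mic is not currently active.
    publishStream.send("onSoundLevel", {level: mic.activityLevel});
});
levelTimer.start();

// Subscribers receive it through the NetStream client callback:
subscribeStream.client = {
    onSoundLevel: function(info:Object):void {
        updateLevelMeter(info.level); // hypothetical UI update function
    }
};
```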
I have a broadcasting project where the host of the show can have multiple sound effects to use on their show. I'm wondering if the only way to go about this is for all the clients listening to the show to download the MP3 of that sound effect, or if the sound effect could come from the host, like combining his mic (for his voice) with straight audio (the sound effect) into one stream.
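Flash cannot easily mix a local MP3 into the microphone stream, so one common workaround (a sketch with hypothetical names; each listener needs the effect files available, e.g. preloaded from the web server) is to trigger playback on the listeners' side instead:

```actionscript
// Host side: instead of mixing audio into the mic stream,
// tell every subscriber which effect to play.
hostStream.send("playEffect", "applause");

// Listener side: play a preloaded effect locally, roughly in sync
// with the host's voice stream.
var effects:Object = {applause: new Sound(new URLRequest("applause.mp3"))};
listenerStream.client = {
    playEffect: function(name:String):void {
        if (effects[name]) effects[name].play();
    }
};
```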
When I try to switch or change a server-side stream, it starts lagging after 2 seconds of playing and the sound disappears. Here are the scenarios that result in that terrible lag:
1. I create server-side playlist with stream.play() with reset=false; when it is time to play the next movie in the playlist, it starts lagging after 2 seconds.
2. The same problem appears when I just switch streams. I installed the FMS Feature Explorer and tried to launch the SwitchStreams sample application: the same problem, the server stream starts lagging after I switch streams with stream.play().
I tried on different servers (local and remote) with different players (the debug player of the FMS Admin Console, the standard Flash video player component, the OSMF player, the Flex video player). I also tried all possible flv, f4v and mp4 compression options for the video files, still the same problem. I have also tried countless Application.xml settings: changing buffer, buffer ratio, etc. Any tips on where I should search for a solution?
I need to record sound (.mp3) from the microphone in Flash. I want to save it on the local system from a web page, without using a server like FMS, Red5, etc.
Is there any param value, or anything else, I can use to stop the sound in an .flv file I'm streaming from the server?
I do not have the original video the FLV was created from; otherwise this would be a no-brainer. I want to keep the video up but take away the sound.
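If the FLV is played through a NetStream, one option (a sketch of a standard client-side approach, rather than altering the file) is to zero out the stream's audio with a SoundTransform:

```actionscript
// Sketch: silence the audio of a streamed FLV without re-encoding it.
var ns:NetStream = new NetStream(nc);
ns.soundTransform = new SoundTransform(0); // volume 0 mutes the audio only
ns.play("myVideo");                        // video continues to display normally
```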
I know the way to capture the mic's sound. My idea is to make an application that records songs played by another app, like Windows Media Player: yes, just capture from the sound card.
I have created a Flash header that pulls some external sound from a server, and I am running into an interesting issue. Here is the code that I am using.
[Code]...
And here is what is happening: I have a preloader on the first frame, and after everything is loaded it goes to the second frame and begins playing. If I just preview the movie in Flash (Ctrl + Enter), everything works fine; the music plays, my text field is created (the time counts correctly), and the start/stop button works fine. If I then go up to "Simulate Download" to test the preloader, when it loads and goes to the second frame, the music will sometimes play (and cut out), the text field isn't created, the start/stop button doesn't work, and I don't know what the problem is.
I think I have pinned it down to my my_sound.onLoad() handler (which I don't think is getting run in the simulated download), but I can't figure out why it would work in a live preview and not a simulated download.
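A common cause with Simulate Download is touching the sound (or building UI that depends on it) before onLoad fires. A minimal AS2-style sketch of deferring everything until the load completes (helper names are hypothetical):

```actionscript
// Sketch (ActionScript 2): only use the sound once onLoad confirms success.
var my_sound:Sound = new Sound();
my_sound.onLoad = function(success:Boolean) {
    if (success) {
        my_sound.start();     // safe: the MP3 is fully loaded here
        createTimeDisplay();  // hypothetical: build the text field here, not earlier
    }
};
// isStreaming=false, so onLoad fires only when the whole file is ready.
my_sound.loadSound("music.mp3", false);
```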
What I want to know is whether it is possible to record a sound clip using a microphone and then upload it to the server using an SWF applet in a webpage. I'm an experienced Java/JavaScript programmer but just starting out with Flash/ActionScript. What technologies will I need to do this? Can I do this using just an SWF file and an Apache/PHP or JSP page?
I am totally new to Flash Media Server. I need to record sound via a Flash app and save it to a server, preferably without a Flash Media Server type setup. There is no requirement for live streaming. Is it possible to record the sound, hold it in memory, and then pass the data/file to the server in a request?
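Since Flash Player 10.1 the Microphone class can deliver raw samples without any media server, so here is a sketch of the record-then-upload idea (the upload.php endpoint is a hypothetical example; the captured bytes are raw PCM, which the server would still need to wrap in a WAV header or encode as MP3):

```actionscript
// Sketch: capture raw mic samples in memory, then POST them to a server.
var mic:Microphone = Microphone.getMicrophone();
mic.rate = 44; // request 44.1 kHz sampling
var recorded:ByteArray = new ByteArray();

// Adding this listener makes the mic deliver 32-bit float PCM chunks.
mic.addEventListener(SampleDataEvent.SAMPLE_DATA, function(e:SampleDataEvent):void {
    while (e.data.bytesAvailable) {
        recorded.writeFloat(e.data.readFloat());
    }
});

function upload():void {
    var req:URLRequest = new URLRequest("upload.php"); // hypothetical endpoint
    req.method = URLRequestMethod.POST;
    req.contentType = "application/octet-stream";
    req.data = recorded; // raw PCM; the server must encode or wrap it
    new URLLoader().load(req);
}
```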
I have this script in the first frame of the first scene of my movie
[Code]...
which works fine when I test the movie locally, but when I copy it to the server I get nothing. I tried using a .wav and an .aiff and still get nothing. Does anyone out there know why this is? Is it a server issue, maybe something to do with MIME types? I'm at a loss.
I am using FMS for live streaming, and Flash to publish the live video. I have one-to-one live streaming where I can see the other end user's live video and the other person can see mine. So one input stream goes to the server and one output stream comes back. The problem I am getting is that the sound echoes, i.e. I hear the sound repeated.
Flash provides the property Microphone.names, which retrieves an array of strings reflecting the names of all available sound-capture devices. Is there any property that detects the output devices attached to the system, so I can select which one plays back sound? For example: I need to play back an MP3 file in Flash, but I have more than one output device installed on my PC. The question is: is there any way for Flash to list (or detect) all audio output devices installed on my PC and then let me pick one of them to play back sound through?
I'm having an issue streaming MP3 files from Flash Media Server where the sound is very choppy. When polled every second, the newStream.time property shows crazy numbers like 1, 3, 6, 8, 10, always skipping a second or two.
I am triggering short sounds dynamically from the library for a game (specifically AIR for Android). When the user clicks a button, the sound can take up to 600 ms to actually play. I have checked for any silence before the actual sound by calling the sound like so:
[Code]...
All return the same results. I know there are threads here that talk about this, but none offer a real solution that I can find. Is there no way to cache the sound or store it in a buffer?
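A frequently tried workaround (a hedged sketch, not a guaranteed fix for AIR's audio latency on Android) is to "warm up" each sound at startup by playing it once at zero volume, so later plays skip any first-use initialization cost:

```actionscript
// Sketch: pre-play each library sound silently at app start.
// ClickSound is a hypothetical linkage class exported from the library.
var clickSound:Sound = new ClickSound();

// Play once, muted, to prime the audio pipeline for this sound.
var warmup:SoundChannel = clickSound.play(0, 0, new SoundTransform(0));

// Later, on the button click, play it normally:
function onButtonClick(e:MouseEvent):void {
    clickSound.play();
}
```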