I am using 2 sound channels to play 2 separate, externally loaded mp3 files at the same time. Right now I have the play button tied to a function that starts the first mp3, then the 2nd. The problem is - the 2nd mp3 starts a few milliseconds after the first. Is there a way that I can make sure these 2 start playing at the exact same time?
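For reference, a minimal sketch of one way to get sample-accurate sync, assuming sound1 and sound2 are the two fully loaded Sound objects: instead of two play() calls, extract and mix both into a single dynamic Sound, so both tracks share one playback clock.

var output:Sound = new Sound();
output.addEventListener(SampleDataEvent.SAMPLE_DATA, onSampleData);

function onSampleData(e:SampleDataEvent):void {
    var a:ByteArray = new ByteArray();
    var b:ByteArray = new ByteArray();
    var samplesA:Number = sound1.extract(a, 8192, e.position);
    var samplesB:Number = sound2.extract(b, 8192, e.position);
    a.position = 0;
    b.position = 0;
    // each stereo sample is two floats (left, right)
    var n:int = Math.max(samplesA, samplesB) * 2;
    for (var i:int = 0; i < n; i++) {
        var s:Number = 0;
        if (a.bytesAvailable >= 4) s += a.readFloat();
        if (b.bytesAvailable >= 4) s += b.readFloat();
        e.data.writeFloat(s * 0.5); // average to avoid clipping
    }
}

output.play();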
It appears my sound is out of sync, and the longer the movie clip plays, the further out of sync it gets. It's nothing too complicated, just some basic shooting sounds that fire every time I hit the space bar. My code is below:
The Attack() method gets called from another class that handles all keyboard controls. When it gets called, the sound plays. firesound and firesound2 are almost the same; firesound2 is pitched slightly differently to make it sound more realistic. At first the sound is pretty good, not great, but it gets terrible as time passes.
I made a new .fla project and attached the following class to it by itself, so the following is the only code in the entire .fla project, and the issue still occurs. I press the spacebar and the sound starts a day late.
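For what it's worth, a minimal sketch of the setup described, assuming FireSound is the library export class behind firesound; note that play() takes a millisecond offset, which can be used to skip any silence the mp3 encoder added at the start of the clip (a common cause of a laggy-feeling trigger):

private var firesound:Sound = new FireSound();

public function Attack():void {
    // 0 is a placeholder; raise it slightly if the mp3 has leading silence
    firesound.play(0);
}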
Referring to my old question, "AS3: Analyzing sound spectrums one by one, one channel at once": with Flash Player 11 and Sound.extract(), is it possible to extract a single channel and manipulate/visualize its spectrum? I mean, if I mix 2 sounds into one file, can I control them separately, acting on things like their soundwaves or volume?
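As a starting point, extract() always returns 44.1 kHz stereo data as interleaved floats (left, right, left, right...), so a single channel can be read by taking every other float. A minimal sketch, assuming sound is a loaded Sound:

var bytes:ByteArray = new ByteArray();
sound.extract(bytes, 8192); // pulls the next 8192 samples
bytes.position = 0;
while (bytes.bytesAvailable >= 8) {
    var left:Number = bytes.readFloat();   // channel 0
    var right:Number = bytes.readFloat();  // channel 1
    // process/visualize only left (or only right) here
}

Note that this only separates left from right; two sounds mixed down into the same channel cannot be split apart this way.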
I have an animation of a swinging sign that is accompanied by a squeaking/swinging sound that is loaded externally. What I would like to do is go a step further and synchronize the sound with the swinging sign. I've seen some tutorials about syncing imported sound with animation, but is there a way to also sync an external sound?
I am a beginner using AS3 and working on an eLearning lesson using a Flash animation as a breathing exercise. The animation works well, but I need to synchronize 2 breathing sounds with the inhale-exhale animation cycles. I can import the inhale.mp3 and exhale.mp3 into the library. But then, I am not sure how to control playback so that when the animation's inhale cycle is displaying (the circle on stage expands for a time using a variable for duration, in microseconds: _inhaleDuration), the inhale.mp3 plays until _inhaleDuration completes. After the inhaling animation, the animation cycle has a brief pause (no sound), then the animation displays the circle contracting for the time in the variable _exhaleDuration, during which time I need the exhale.mp3 to play its breathing sound. Each sound needs to be synchronized to play only as long as the animation's inhale or exhale duration.
I've been reading about sound files online and trying to figure out how to control playback duration using these 2 variables (_inhaleDuration, _exhaleDuration), but so far I have not found it. So I'm stuck, and hope someone can provide suggestions or sample AS3 script so I can add this feature to the animation's Tween function, which expands and contracts the circle, so that it also has sound.
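A minimal sketch of one way to do this, assuming Inhale is the library export class for inhale.mp3 and that _inhaleDuration is in milliseconds (which is what Timer expects):

var inhaleChannel:SoundChannel = new Inhale().play();
var inhaleTimer:Timer = new Timer(_inhaleDuration, 1); // fire once
inhaleTimer.addEventListener(TimerEvent.TIMER_COMPLETE, onInhaleDone);
inhaleTimer.start();

function onInhaleDone(e:TimerEvent):void {
    inhaleChannel.stop(); // cut the sound exactly when the inhale cycle ends
    // after the pause, start exhale.mp3 the same way using _exhaleDuration
}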
I am using an application which loads games as external swf files (10 games in total). When I click on a game's icon, it loads the game and also the sounds related to that game.
But when I close the game in that app, ideally it should also stop that game's sound. That happens for a few games, but for some games the sound still plays in the background even after the game is closed. I have used
In some games the sound is embedded in the game swf file, and in some games the sound is loaded from external mp3 files. Is there a way so that when I close a game, i.e. any game, the sound stops?
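A minimal sketch of the usual approach, assuming gameLoader is the Loader holding the current game swf:

gameLoader.unloadAndStop(true); // Flash Player 10+: stops timers, sounds and streams inside the child swf
SoundMixer.stopAll();           // also silences sounds already playing from external mp3s

Note that SoundMixer.stopAll() only stops sounds that are currently playing; it does not prevent the swf from starting new ones, which is why unloadAndStop() is the important half.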
We are working on a complex AS3.0/Flash game project where we need to preload and cache around 70 movie files with sound. But somehow the preloaded videos all seem to block a sound channel, and so some sounds are not played in the game, as we seem to exceed the 32 available sound channels. I cannot provide any technical details at the moment, but I will add them as soon as the developer in charge is back in the office.
Is it possible to have multiple SoundChannels? And can one volume control drive multiple volumes?
I have a site that uses .flv's with sound all throughout. Volume control is attached to a SoundTransform that affects the stream. I also added a piece of code to this function that creates a new SoundTransform object, adds it to a sound factory object I have in a private var, and adjusts that volume to 40% of the .flv's volume. This seems like it should work, but doesn't, so I am wondering two things: a) can I have multiple sound channels (in this case the stream and the channel I made for background music)? b) if I can, why do my sound transformations not get applied to this music track? I also tried to explicitly set the music volume to 0.4 via a SoundTransform right after I load it, which does not work either.
Here's the code:
trace("current vol is :" + currentVolume + " And 40% of that is: " + (currentVolume * 0.4));
var st:SoundTransform = new SoundTransform(currentVolume, 0);
stream.soundTransform = st;
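One thing worth checking, as a sketch (assuming musicChannel is the SoundChannel returned by the background music's play() call): a transform has to be assigned to the SoundChannel that play() returned, not to the Sound object itself, or it never takes effect.

var musicTransform:SoundTransform = new SoundTransform(currentVolume * 0.4, 0);
musicChannel.soundTransform = musicTransform;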
I want to develop a multiplayer TD game, and I'm concerned about how to sync the players. I mean, it's possible that at some times the map will be crowded with units everywhere. Is that an issue that has to be considered in order to make all players see the same thing at all times?
I'm trying to maintain a web application's state across multiple tabs whilst using ActionScript, JavaScript, and PHP. Should I use AJAX to update the database after an item has been purchased, etc., or should I prevent the game from being loaded if it's already open in another tab (and if so, how could I achieve this)?
I'm building a Facebook game; when the user buys an item and has many open tabs, it doesn't update the state in the other open tabs. The buying of an item is handled by ActionScript, and the storage of that item is dealt with using PHP.
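For the "prevent a second tab" option, a minimal sketch using LocalConnection, whose connect() throws an ArgumentError when the name is already taken by another instance (the connection name here is hypothetical):

var lc:LocalConnection = new LocalConnection();
try {
    lc.connect("_myGameSingleInstance"); // succeeds only in the first open tab
} catch (e:ArgumentError) {
    // another tab already holds the connection: show a "game already open" notice
}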
I have an mp3 audio file with some narration, and I also have a bunch of images that should replace one another on the screen depending on cue phrases in the audio. My question is what the best approach to the task would be. Is it possible to lay out sound and images in the timeline the way it was possible in Director? If so, is there any way to automatically extend the audio to the number of frames needed to cover its duration? Or do I need to convert the mp3 audio to some video format (flv) in Soundbooth, add cue points, then import the video into Flash (what should the video settings be so that Flash accepts it) and use ActionScript to trigger the "slide show"?
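A pure-ActionScript alternative that avoids the flv conversion, as a sketch: keep the cue times in an array and poll the channel's position each frame (narration, showSlide and the cue values are hypothetical):

var cues:Array = [0, 5300, 12800];   // millisecond positions of the cue phrases
var slideIndex:int = 0;
var channel:SoundChannel = narration.play();

addEventListener(Event.ENTER_FRAME, onFrame);
function onFrame(e:Event):void {
    if (slideIndex < cues.length && channel.position >= cues[slideIndex]) {
        showSlide(slideIndex++);     // swap in the image for this cue
    }
}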
With AS2, is it possible to use the current frame position of the main timeline A to control the position of two mc timelines B and C relative to the current frame position of mc D? The main and mc timelines have the same number of frames. Inverse control by D of B and C relative to A is also required!
I am building a 10-channel mixer/remixer and I'm having a problem with synchronizing the sounds. When a user clicks the "Play" button, it calls a method in the main (document) class, which runs through a "for" loop and calls a method on each of the channel objects, which play their respective sounds. Here is the method in the document class which calls the playTrack() method on each channel object:
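The original method isn't included in the post; a minimal sketch of what such a loop typically looks like, with hypothetical names:

public function playAll():void {
    for (var i:int = 0; i < channels.length; i++) {
        channels[i].playTrack(); // each channel starts its own Sound
    }
}

Even inside one loop, each play() call still starts a few milliseconds after the previous one, so sample-accurate sync generally requires mixing the tracks into one dynamic Sound (see the extract()/SampleDataEvent sketch near the top of this page).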
I have prepared a CUSTOM WHITEBOARD. This whiteboard is shared by two clients. They can BOTH draw on the board using TOOLS in the WHITEBOARD PANEL SET of the application. When someone DRAWS, I do the following actions:
1: Capture the WHITEBOARD FULL AREA/CANVAS using jpgEncoder().
2: Then I send this BYTEARRAY to the FMS server as a Remote Shared Object.
3: The other client then draws that RSO onto the WHITEBOARD.
Pretty simple though.
BUT, I am having synchronization ISSUES. The issues are as follows:
Issue 1: The image updates with no synchronization and appears zig-zag/random when they both draw at normal speed.
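For reference, a minimal sketch of steps 1 and 2 as described, assuming as3corelib's JPGEncoder, with board as the whiteboard display object and so as the connected remote SharedObject:

var snapshot:BitmapData = new BitmapData(board.width, board.height);
snapshot.draw(board);                      // step 1: capture the full canvas
var encoder:JPGEncoder = new JPGEncoder(80);
var bytes:ByteArray = encoder.encode(snapshot);
so.setProperty("boardImage", bytes);       // step 2: push the ByteArray through the RSO

The zig-zag effect is what you would expect from two clients overwriting one shared property: each full-canvas snapshot replaces the other client's latest snapshot rather than merging with it.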
What is the best way to merge two alpha channels? What I want to do is, given the channel values, use whichever value is greater.
I tried using .merge(), but it appears to only blend channels with a linear weighting between the current and destination values.
I looked into .draw(), which would work well using a blend mode like LIGHTEN, but it doesn't allow drawing per channel, so I can't target the alpha channel only. The ALPHA blend mode didn't seem to do anything when used with .draw(), and ERASE gives an unwanted hard edge.
I also looked into .copyChannel(), but it doesn't allow any blending logic; it merely copies a rectangular region from one BitmapData to another.
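When no built-in call fits, a per-pixel pass does exactly the "take the greater value" rule. A minimal sketch, assuming a and b are transparent BitmapData objects of the same size (slow for large bitmaps; getVector()/setVector() is faster in production):

var out:BitmapData = a.clone();
for (var y:int = 0; y < a.height; y++) {
    for (var x:int = 0; x < a.width; x++) {
        var alphaA:uint = a.getPixel32(x, y) >>> 24;
        var alphaB:uint = b.getPixel32(x, y) >>> 24;
        var maxAlpha:uint = Math.max(alphaA, alphaB);
        // keep out's RGB, replace only the alpha byte
        out.setPixel32(x, y, (maxAlpha << 24) | (out.getPixel32(x, y) & 0xFFFFFF));
    }
}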
So as you may or may not know, BlazeDS (open source version of LiveCycle Data Services) is a nice way to get your server-side Java and client-side Flex application to play together. Unfortunately, it does have several pitfalls that need to be corrected. I'll try to explain one of them here.
All of BlazeDS's configuration is written via XML files in the flex/ folder of your webapp. The default names are separated for clarity, such as services-config.xml, remoting-config.xml, messaging-config.xml, etc. In these configuration files (particularly services-config.xml), Channels are defined; these set up the URIs and objects used to capture and send information between the server and the client. In these config files, it is quite common to use syntax like so:
[Code]...
Unfortunately, what they don't tell you is that some of these key-in replacements (i.e. {context.root}) are not replaced dynamically upon execution but upon compilation of the WAR file you intend to distribute. Obviously not a good idea when switching domains.
So, instead I seek to dynamically define these channels. According to the documentation, that's all good and fine, but it only works if the channel already exists when the webapp is launched. I feel like that sort of defeats the point. So my question is, how do you truly create channels dynamically so that both the client and the server recognize their existence?
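On the client side, channels can be created entirely at runtime with a computed endpoint instead of a token baked in at compile time. A minimal sketch, assuming remoteObject is an existing RemoteObject and the URL pattern is adjusted to your webapp:

var url:String = "http://" + serverName + "/myapp/messagebroker/amf"; // hypothetical pattern
var channel:AMFChannel = new AMFChannel("my-amf", url);
var channelSet:ChannelSet = new ChannelSet();
channelSet.addChannel(channel);
remoteObject.channelSet = channelSet;

This only covers the client half; the server still has to expose a matching endpoint, which is presumably the part the documentation's "channel must already exist" caveat refers to.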
Are there any audio management libraries for ActionScript 3? The more tailored they are to game development, the better. I'm finding it hard to understand and work with multiple channels, which is making a lot of my audio sound screwy, cancel out, etc. I've been working on an RPG that can, at a given time, have quite a number of sounds playing, such as environmental noise (like opening doors, thunder, rain, etc.), walking enemies, inventory sounds (for events like dropping an item into your inventory), passive spell-casting sounds, voice clips, weapon swinging, etc. Alternatively, are there any good resources that explain how to use multiple sound channels strategically?
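Short of a full library, even a small hand-rolled manager helps, since keeping one SoundChannel per logical group stops sounds from stacking and cancelling each other. A minimal sketch:

public class SoundManager {
    private var channels:Object = {}; // group name -> SoundChannel

    public function playInGroup(group:String, sound:Sound, vol:Number = 1):void {
        if (channels[group]) {
            channels[group].stop(); // one active sound per group
        }
        channels[group] = sound.play(0, 0, new SoundTransform(vol));
    }
}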
Simple question here. Let's say you have a color in RGB, white: 255, 255, 255. Pixel Bender needs values between 0 and 1, not 0 and 255, so white = 1, 1, 1. What about 134, 132, 123? I don't know how to work it out.
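The conversion is just a division by 255 per component: 134/255 ≈ 0.525, 132/255 ≈ 0.518, 123/255 ≈ 0.482.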
one drawing layer (graphics object with lineTo, etc.)
one png with an alpha channel (supposed to serve only as a mask)
Now I want to be able to only draw lines within the area restricted by the png mask.
I am trying it like this:
var bitmapData:BitmapData = new BitmapData( 320, 320 );
bitmapData.draw( drawingLayer );
bitmapData.copyChannel( maskBitmapData, new Rectangle( 0, 0, 320, 320 ), new Point( 0, 0 ), BitmapDataChannel.ALPHA, BitmapDataChannel.ALPHA );
Now the borders get cut off fine, but I get a black background, since the drawing layer has alpha between the drawn lines (and it should remain like this) and the mask bitmap has alpha outside the shape. So naturally the mask's alpha replaces the drawing layer's.
I tried it with merge, copyPixels, and also with just setting the mask property on the drawing layer (I set everything to cacheAsBitmap), but to no avail.
[Embed("...")]
private const BodyMask:Class;
var maskBitmap:Bitmap = new BodyMask();
and assigned them to the mask property of an s:Group element (cacheAsBitmap=true) where the operations on the graphics object occurred. I also tried reassigning the mask after each draw operation.
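One more avenue worth trying, as a sketch: copyPixels() accepts a secondary alpha bitmap whose alpha is applied to the source pixels during the copy, so the drawing keeps its own transparency instead of having it replaced (the combination should act multiplicatively rather than as a straight overwrite):

var drawingData:BitmapData = new BitmapData(320, 320, true, 0x00000000); // transparent
drawingData.draw(drawingLayer);
var result:BitmapData = new BitmapData(320, 320, true, 0x00000000);
result.copyPixels(drawingData, drawingData.rect, new Point(0, 0),
                  maskBitmapData, new Point(0, 0), true); // mask alpha applied to the source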
I'm developing an online meeting system with audio/video sharing, using Adobe Flex 4 and Flash Media Server 4. I'm using the RTMFP protocol to transmit the audio/video, which considerably improves performance. The trouble now is that I must record the audio/video transmitted, but I figured out that when using the RTMFP protocol, FMS doesn't operate on the channels. So, how could I make FMS record these channels?
I've been trying to find the right configuration for supporting both http/https requests in a Flex app. I've read all the docs, and they allude to doing something like the following:
I'm running Tomcat 5.5.17 and Java 5. The BlazeDS docs say this is the best practice. Is there a better way? With this config, there seem to be 2-3 retries associated with each channel defined in the default-channels element, so it always takes ~20s before the my-amf channel connects via an http request. Is there a way to override the 2-3 retries to, say, 1 retry per channel?
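There doesn't appear to be a per-channel retry count, but the connect timeout can be shortened so each failed channel gives up faster. A sketch of the channel definition, assuming BlazeDS's connect-timeout-seconds channel property:

<channel-definition id="my-secure-amf" class="mx.messaging.channels.SecureAMFChannel">
    <endpoint url="https://{server.name}:{server.port}/{context.root}/messagebroker/amfsecure"
              class="flex.messaging.endpoints.SecureAMFEndpoint"/>
    <properties>
        <connect-timeout-seconds>2</connect-timeout-seconds>
    </properties>
</channel-definition>

The same setting is also reachable from ActionScript via the channel's connectTimeout property.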
I copied this from some other forum because it explains it well: "...Let's say you are loading in an external asset that has an alpha channel, such as a PNG. Now let's say that you want that asset to act like a button, with rollover and click actions. No problem. No reason you can't do that. But, actually, there is a problem. The hit area for that external asset will be its entire bounding box. Take this octy image for example: even if you rolled over the top left of the image, where it's totally transparent, it will still trigger rollover actions, which is certainly not ideal and could potentially be very confusing for a user..."
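The usual fix is to test the pixel under the mouse against an alpha threshold with BitmapData.hitTest(). A minimal sketch, assuming bmp is the loaded PNG as a Bitmap:

var local:Point = new Point(bmp.mouseX, bmp.mouseY);
// 128 = alpha threshold: only pixels at least ~50% opaque count as a hit
var overOpaquePixel:Boolean = bmp.bitmapData.hitTest(new Point(0, 0), 128, local);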
I'm trying to take a png and create two images: one that represents the color data, and the other a grayscale version of the alpha channel.
Extracting the correct data was no problem; I'm just embedding the png, creating a new instance (as a Bitmap) and then extracting the relevant channels of its bitmapData into two new BitmapData instances.
[Embed(source='/../deploy/images/alpha_test.png')]
public var image : Class;
[code]....
The problem that I'm having is that when I assign the color bitmapData to a Bitmap instance to display it, the background color shows as black, even though the default fillColor of a BitmapData instance is white. Also, how can I use the alpha channel data to create an 8-bit, grayscale image that's the reverse of the channel?
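A minimal sketch of the grayscale-plus-invert part, assuming src is the embedded png's bitmapData: copying the alpha channel into all three color channels of an opaque bitmap gives the grayscale, and a ColorTransform with -1 multipliers and 255 offsets (the classic invert trick) reverses it.

var gray:BitmapData = new BitmapData(src.width, src.height, false, 0xFFFFFF);
for each (var ch:uint in [BitmapDataChannel.RED, BitmapDataChannel.GREEN, BitmapDataChannel.BLUE]) {
    gray.copyChannel(src, src.rect, new Point(0, 0), BitmapDataChannel.ALPHA, ch);
}
// invert: each component becomes 255 - value
gray.colorTransform(gray.rect, new ColorTransform(-1, -1, -1, 1, 255, 255, 255, 0));

As for the black background: fully transparent pixels in a PNG usually carry black color data, so copying only the RGB channels makes those regions black; fill the destination with the background color you want before copying.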
I have a project that requires me to display the waveform for an uploaded sound. The sound is always an MP3, most of the time 22.05 kHz mono, with speech only. The project is written with Flex/ActionScript 3. It's meant to run in the browser, but I might also consider converting to AIR if that can help.
All the examples I've found and looked at for generating a wave either do some visualization in real time as the sound is playing, or, the most promising, as3soundeditorlib, keeps the wave already generated, but does it very slowly, seemingly taking as long as playing through the audio would have taken. Is there any way to generate the wave faster than real time?
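Yes, at least in principle: Sound.extract() decodes synchronously, so the whole file can be read much faster than real-time playback. A minimal sketch of the read loop, assuming sound is the loaded Sound (the min/max scan and drawing are left as comments):

var bytes:ByteArray = new ByteArray();
var samplesPerChunk:int = 8192;
var read:Number;
do {
    bytes.length = 0;
    read = sound.extract(bytes, samplesPerChunk); // continues where it left off
    bytes.position = 0;
    // scan this chunk for min/max amplitude and draw one waveform column
} while (read == samplesPerChunk);

For long files, spread the chunks across several frames so the UI doesn't freeze during the scan.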
I have some sounds in my library. They are all exported and have been given class names like C, D, E, etc. I have another class that represents a button. This class stores a reference to one of the Sounds in my library. I set the type for this variable to Sound. When I try to set it, I get the error: cannot convert C$ to flash.media.Sound. I tried as Sound, for example key.note = note_array[p] as Sound; the value ends up as null.
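That error suggests note_array holds the Class objects of the library symbols rather than Sound instances; a Class can't be cast to Sound, so "as Sound" yields null. A sketch of the likely fix: instantiate the class first.

var noteClass:Class = note_array[p] as Class;
key.note = new noteClass() as Sound;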
I'm trying to play a sound using a technique found here (play the sound by sampling raw sound data gathered from the original with extract()), with the difference that the mp3 sound is embedded in the swf, not loaded externally. This is my code:
var soundBytes:ByteArray = new ByteArray();
var mp3sound:Sound = Sound(new Sound1_design()); // this is the embedded sound
mp3sound.extract(soundBytes, int.MAX_VALUE);
[Code]....
This works, in a way, except that the resulting sound is distorted (it has a kind of metallic ring).
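One common cause of that metallic ring is the playback handler writing samples misaligned (for example, an odd number of floats per event, or the ByteArray position never being reset), which shifts the left/right interleaving. A minimal sketch of a playback side that keeps the floats paired, assuming soundBytes was filled by the extract() call above:

var dynamicSound:Sound = new Sound();
soundBytes.position = 0; // start reading from the beginning
dynamicSound.addEventListener(SampleDataEvent.SAMPLE_DATA, onSampleData);

function onSampleData(e:SampleDataEvent):void {
    // write up to 8192 stereo samples (two floats each) per event
    for (var i:int = 0; i < 8192 && soundBytes.bytesAvailable >= 8; i++) {
        e.data.writeFloat(soundBytes.readFloat()); // left
        e.data.writeFloat(soundBytes.readFloat()); // right
    }
}

dynamicSound.play();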