ActionScript 2.0 :: Stream Images (single Jpg) From Webcams Into A Swf?
Nov 5, 2009
I am trying to stream images (single JPGs) from webcams into a SWF. I know little about ActionScript, and what I have so far comes from what I have found in other posts around the web. My first attempt worked, but caused a bad white flash each time the image refreshed. From others I have learned that this will happen unless I use two movie containers and buffer between them? Here is the code I have, and it does not work.
[Code]...
I don't get any compiler errors. I think this is AS2, but I am not sure.
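A minimal AS3 sketch of the double-buffering idea (the URL and refresh interval are placeholders): load each new JPEG into a hidden Loader, and only swap it to the front once loading completes, so the visible image never goes blank:

    import flash.display.Loader;
    import flash.events.Event;
    import flash.events.TimerEvent;
    import flash.net.URLRequest;
    import flash.utils.Timer;

    var front:Loader = new Loader(); // currently visible image
    var back:Loader = new Loader();  // hidden buffer being refreshed
    addChild(front);

    var timer:Timer = new Timer(2000); // refresh every 2 seconds (placeholder)
    timer.addEventListener(TimerEvent.TIMER, refresh);
    timer.start();

    function refresh(e:TimerEvent):void {
        back.contentLoaderInfo.addEventListener(Event.COMPLETE, swap);
        // cache-busting query string forces a fresh frame from the camera
        back.load(new URLRequest("http://example.com/cam.jpg?t=" + new Date().time));
    }

    function swap(e:Event):void {
        back.contentLoaderInfo.removeEventListener(Event.COMPLETE, swap);
        addChild(back); // new image appears on top, no blank gap
        if (front.parent) removeChild(front);
        var tmp:Loader = front; // the old loader becomes the next buffer
        front = back;
        back = tmp;
    }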
I have an AS3 SWF from which users can upload JPG images to my EC2 instances, which sit behind an Elastic Load Balancer. The JPG images are converted into ByteArray data and sent using URLLoader.load(URLRequest). I make 2 calls when uploading: one to upload a large version, then another to upload a thumbnail version. A PHP script to which the ByteArray data is uploaded converts this to a file using file_put_contents($destination, $GLOBALS["HTTP_RAW_POST_DATA"]). Is it possible to combine these two requests into a single request which contains the ByteArray data for both the large and thumbnail images, and 'split' the HTTP_RAW_POST_DATA to create 2 files at the server? This would be better than uploading the ByteArray for the large version and then using something like ImageMagick to resize the resulting image into a thumbnail, which I realise is another option.
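One hedged approach (the length-prefix framing and the upload.php URL are assumptions, not from the original post): concatenate the two ByteArrays into one POST body, prefixed with the thumbnail's length, so the server can split HTTP_RAW_POST_DATA at a known offset:

    import flash.net.URLLoader;
    import flash.net.URLRequest;
    import flash.net.URLRequestMethod;
    import flash.utils.ByteArray;

    function uploadBoth(thumbBytes:ByteArray, largeBytes:ByteArray):void {
        var body:ByteArray = new ByteArray();
        body.writeUnsignedInt(thumbBytes.length); // 4-byte prefix: where the thumbnail ends
        body.writeBytes(thumbBytes);
        body.writeBytes(largeBytes);

        var req:URLRequest = new URLRequest("http://example.com/upload.php"); // placeholder URL
        req.method = URLRequestMethod.POST;
        req.contentType = "application/octet-stream";
        req.data = body;
        new URLLoader().load(req);
    }

On the PHP side, the script would read the first 4 bytes of HTTP_RAW_POST_DATA as a big-endian integer (writeUnsignedInt's default byte order) and substr() the remainder into the two files.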
I'm interested in making a stereoscopic player similar to YouTube's 3D player. The first strategy I tried was having two streams (one for each eye) attached to two Video objects, but I noticed that this could lead to them drifting out of sync due to separate buffering. So now I'm thinking the way to do it is to encode the two eyes side by side into one video, and do some Flash trickery whereby I take the left side and overlay it with the right side.
I suspect that this is how YouTube does it, because they ask you to upload 3D videos in a side-by-side format in the first place. I don't know how to do that Flash trickery, where I essentially take a video's left half and overlay it with its right half. I tried attaching one stream to two Video objects that were offset from each other, but only one of the streams played.
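A hedged sketch of one way to do that trickery (assumes a 1280x360 side-by-side frame and a Video object named video that is already attached to the NetStream; a single NetStream cannot drive two Video objects, but you can draw the playing Video into a BitmapData each frame and blit the two halves out):

    import flash.display.Bitmap;
    import flash.display.BitmapData;
    import flash.events.Event;
    import flash.geom.Point;
    import flash.geom.Rectangle;

    // var video:Video = ...; // assumed: already created and attached to the NetStream

    var frame:BitmapData = new BitmapData(1280, 360, false, 0); // full side-by-side frame
    var leftEye:Bitmap  = new Bitmap(new BitmapData(640, 360, false, 0));
    var rightEye:Bitmap = new Bitmap(new BitmapData(640, 360, false, 0));
    addChild(leftEye);
    addChild(rightEye); // sits on top of leftEye, ready for anaglyph/shutter processing

    addEventListener(Event.ENTER_FRAME, split);
    function split(e:Event):void {
        frame.draw(video); // snapshot the current video frame
        leftEye.bitmapData.copyPixels(frame, new Rectangle(0, 0, 640, 360), new Point(0, 0));
        rightEye.bitmapData.copyPixels(frame, new Rectangle(640, 0, 640, 360), new Point(0, 0));
    }

Note that BitmapData.draw() on streamed video can throw a SecurityError depending on where the stream comes from, so cross-domain/stream permissions may need to allow it.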
In Video 1, she shows how to stream a single file. I've done everything to the letter in this video; however, when she uses the video player to reference the livestream.f4m file, there is none. I've checked the folder location (applications/livepkgr/streams/_definst_/livestream) and there are files in there (.f4f and .f4x), but no .f4m, which is the file she references in the video player.
In Video 2, she shows how to create a multi-bitrate video stream, using the manifest packager to create a .f4m and .m3u8 package and placing it in the webroot folder. Here is the contents of my liveevent.f4m file:
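For reference only, a set-level multi-bitrate F4M produced by the manifest packager generally looks something like the following; the host, event, and stream names here are placeholders, not the actual file from this post:

    <manifest xmlns="http://ns.adobe.com/f4m/2.0">
        <baseURL>http://myserver/hds-live/livepkgr/_definst_/liveevent/</baseURL>
        <media href="livestream1.f4m" bitrate="150"/>
        <media href="livestream2.f4m" bitrate="500"/>
        <media href="livestream3.f4m" bitrate="700"/>
    </manifest>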
I need to merge multiple live audio streams into a single stream so that I can pass it as input to VOIP through a softphone. For this I tried the following approach: I created a new stream (str1) on FMS in onAppStart and recorded the live streams (sent through the microphone) into that new stream.
Below is the code: application.onAppStart = function() {
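A minimal Server-Side ActionScript sketch of that approach (stream names are placeholders); note that a server-side Stream plays one source at a time, so chaining play() calls switches sources rather than truly mixing them:

    application.onAppStart = function() {
        // create a server-side stream and start recording into it
        this.str1 = Stream.get("str1");
        if (this.str1) {
            this.str1.record();
            // pull a live source into the recorded stream
            // (start = -1 means play a live stream; reset = false appends)
            this.str1.play("livesource1", -1, -1, false);
        }
    };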
I am using Flash Media Server 4.5, and I read in the tutorial that if I want to stream a live feed, I may need to use Flash Media Live Encoder. But what I found is that in the encoder I have to set everything up manually, and it only supports camera devices. In my case, multiple video files keep arriving from another program and are placed on the file system (server). My goal is to use Flash Media Server to perform a live broadcast with these video files one by one. That means when clients watch the live stream, they will not notice the server is playing mov1, then mov2, then mov3, then mov4... and so on.
You can imagine I am trying to broadcast live footage for, say, 60 seconds, but the video file is not recorded in its entirety after 60 seconds; instead, every 10 seconds I save a new video file, so that when a client watches the live stream via HLS [URL]: when the time reaches 10 sec, a mov1 video file is available and FMS should broadcast this video on live123; when the time reaches 20 sec, a mov2 video file is available and FMS should immediately follow the mov1 broadcast on live123; and so on... Also, can FMS dynamically create a new streaming session (invoked by code), so that when client A uploads some video files to the server, FMS opens a new streaming session that only streams client A's video files? The configuration for broadcasting, like screen size, bit rate, etc., should be pre-defined on the server. [URL]
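A hedged Server-Side ActionScript sketch of the playlist-as-live idea (file and stream names are placeholders): a server-side stream can queue recorded files back to back by calling play() with reset = false, and clients subscribed to it see one continuous live stream:

    application.onAppStart = function() {
        this.live123 = Stream.get("live123");
        if (this.live123) {
            // reset = true starts a fresh playlist with mov1
            this.live123.play("mp4:mov1.mov", 0, -1, true);
            // reset = false appends the next file to the playlist
            this.live123.play("mp4:mov2.mov", 0, -1, false);
        }
    };

    // as each new file lands on disk, an upload handler could append it:
    Client.prototype.appendSegment = function(name) {
        application.live123.play("mp4:" + name, 0, -1, false);
    };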
I am developing an AS3 AIR application which aims to use multiple webcams. I have two Logitech C615 cameras hooked up to the USB ports. I can get one of the video streams, but I can't seem to be able to access the other cam's video. The problem is I just get video from one of the USB webcams, usually the last one I connected. [code] But I never get the other cameras. Is this possible?
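A hedged sketch of requesting each device explicitly (Camera.getCamera() accepts the index into Camera.names as a string; with no argument it always returns the default camera, typically the last one connected):

    import flash.media.Camera;
    import flash.media.Video;

    trace(Camera.names); // hypothetical output: USB Video Device, USB Video Device #2
    var camA:Camera = Camera.getCamera("0"); // first entry in Camera.names
    var camB:Camera = Camera.getCamera("1"); // second entry
    var vidA:Video = new Video(320, 240);
    var vidB:Video = new Video(320, 240);
    vidB.x = 330; // place the feeds side by side
    vidA.attachCamera(camA);
    vidB.attachCamera(camB);
    addChild(vidA);
    addChild(vidB);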
I have a problem with recording (or even showing) the video of 2 separate webcams which are the same type, brand, etc. Somehow one shows its stream and the other one doesn't.
ActionScript Code:
var widthPos:Number = Math.ceil(Math.sqrt(Camera.names.length)); // grid columns
var heightPos:Number = Math.ceil(Camera.names.length / widthPos); // grid rows
var tWidth:Number = 640 / widthPos;   // width of each video tile
var tHeight:Number = 480 / heightPos; // height of each video tile
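Continuing that snippet, a hedged sketch that attaches each detected camera to its own tile in the grid (each camera is requested by its Camera.names index rather than via the default device):

    for (var i:int = 0; i < Camera.names.length; i++) {
        var cam:Camera = Camera.getCamera(String(i)); // one camera per tile
        var vid:Video = new Video(tWidth, tHeight);
        vid.x = (i % widthPos) * tWidth;
        vid.y = Math.floor(i / widthPos) * tHeight;
        vid.attachCamera(cam);
        addChild(vid);
    }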
For anyone who has ever messed around with this part of AS3: I've run into an issue with detecting the resolution of the webcams hooked up to individual machines.
Now, while the documentation on the Camera class says that camera.height and camera.width are readable properties of the camera, and even uses them in its examples as a way to find the camera's resolution, it just doesn't work that way on my end. No matter what webcam and what PC I try to use for this, it always tells me that my webcam has dimensions of 160x120 pixels.
This is very frustrating, since it forces me to impose a dialog box on the (possibly not too tech-savvy) user where they can choose a resolution that is closest to their webcam's native resolution.
I've looked around for a solution to this and came up totally dry. As IQAndreas approached me about this because of a Twitter user asking for the same thing, I'm pretty sure that I'm not the only one having this issue.
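For what it's worth, a Camera reports 160x120 until a capture mode has been requested. A hedged sketch that asks for an intentionally large mode and reads back what the driver actually negotiated:

    import flash.media.Camera;

    var cam:Camera = Camera.getCamera();
    if (cam != null) {
        // request a large mode; Flash falls back to the closest
        // mode the camera natively supports
        cam.setMode(1920, 1080, 30);
        trace("negotiated capture size:", cam.width, "x", cam.height);
    }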
I have developed an application in Adobe AIR which helps users take their photographs and store them. What I need to know is: I want to use HIGH RESOLUTION cameras as webcams, preferably digicams. So does Flash/AIR support using digicams as webcams?
Or what other options do I have? I want to take very high resolution photographs in Night Mode, so I need to use digicams for that.
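A quick hedged check: Flash/AIR can only use devices the operating system exposes as capture devices, so a digicam will work only if its driver (or webcam/PC mode) registers one. A tiny sketch to see what AIR can see:

    import flash.media.Camera;

    // a digicam appears here only if the OS lists it as a capture device
    for each (var name:String in Camera.names) {
        trace("capture device:", name);
    }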
How do I create a single preloader that loads my main timeline and all external images? I'm using multiple instances of the Loader component to import JPGs.
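A hedged AS3 sketch of one way to drive a single progress readout from several loads at once (the image URLs and the trace output are placeholders): sum bytesLoaded/bytesTotal across the root SWF and every image loader:

    import flash.display.Loader;
    import flash.events.Event;
    import flash.net.URLRequest;

    var loaders:Array = [];
    for each (var url:String in ["img1.jpg", "img2.jpg", "img3.jpg"]) {
        var l:Loader = new Loader();
        l.load(new URLRequest(url));
        loaders.push(l);
    }

    addEventListener(Event.ENTER_FRAME, updateProgress);
    function updateProgress(e:Event):void {
        var loaded:Number = root.loaderInfo.bytesLoaded; // the main SWF itself
        var total:Number = root.loaderInfo.bytesTotal;
        for each (var ldr:Loader in loaders) {
            loaded += ldr.contentLoaderInfo.bytesLoaded;
            total += ldr.contentLoaderInfo.bytesTotal; // 0 until the size is known
        }
        trace("progress:", Math.round(100 * loaded / Math.max(total, 1)), "%");
    }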
I'd like to import dozens of images and have them in sequence in a single layer. I did this once before: I made a .swf out of a video game cutscene, and I recall importing the images to the stage, then asking somewhere else how to make them appear in the layers panel. Then I did something that put each image in its own layer, and I had to manually move them to make them appear in sequence.
So does anybody know how to import images and put them in a single layer in sequence, or simply import each image into its own layer?
My webcam works great with Skype and Cheese, and it used to work great with a program called FaceFlow, but now when I try to use it with FaceFlow I get the following pop-up: Adobe Flash Player can't locate a camera on my computer. I have installed via Synaptic the latest version of Adobe Flash Player, which is 11,1,102,63, but without any effect.
Would it be possible to uninstall the present version of Adobe Flash Player and use an older version, in the hope that the webcam will be recognised?
I did a quick test of FaceFlow the other day. I have two webcams for testing: one is an older model and the other a UVC-compatible device.
FaceFlow would not recognise the older model, only the UVC camera.
So now Adobe has pulled the plug on mobile Flash and is pushing more for HTML5, and Silverlight has pulled the plug too. What's next for interactive video content? If they die out (as they currently seem to be doing), how do we access webcams (or even phone cams) on websites? HTML5 Media Capture looks like it doesn't support streams, only files, so that could be out. Are there any alternatives at the moment, or in the near future?
Is it possible, via Adobe AIR, to save multiple types of data in a single file? For example, an application would allow the user to load in external images, position them on stage, and label them. This data would then be stored in a ByteArray (I guess), using BitmapData for the images and probably XML for the metadata.
I would then like to write this to a single file, with a bespoke file extension that could be associated with said AIR app.
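A hedged sketch of one way to do it (the file name, extension, and document structure are made up for illustration): AMF serialization via ByteArray.writeObject() can pack XML metadata and raw pixel bytes into one compressed file:

    import flash.display.BitmapData;
    import flash.filesystem.File;
    import flash.filesystem.FileMode;
    import flash.filesystem.FileStream;
    import flash.utils.ByteArray;

    function saveProject(meta:XML, images:Array /* of BitmapData */):void {
        var doc:Object = { meta: meta.toXMLString(), images: [] };
        for each (var bmd:BitmapData in images) {
            doc.images.push({ w: bmd.width, h: bmd.height,
                              pixels: bmd.getPixels(bmd.rect) });
        }

        var bytes:ByteArray = new ByteArray();
        bytes.writeObject(doc); // AMF handles the nested ByteArrays
        bytes.compress();

        var file:File = File.documentsDirectory.resolvePath("scene.myproj"); // placeholder name
        var fs:FileStream = new FileStream();
        fs.open(file, FileMode.WRITE);
        fs.writeBytes(bytes);
        fs.close();
    }

Loading is the mirror image: read the bytes, uncompress(), readObject(), then setPixels() each entry into a fresh BitmapData.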
The situation is, this resize code has always worked for me because I was using local images. I am now using a third-party admin tool which feeds images through as application/octet-stream.
Whereas previously I was loading the image with the Loader::load method, I am currently using Loader::loadBytes to load the binary data, because I am being fed a stream instead of an image/jpeg.
Since switching to the stream, the code now fails to get past this line:
resizedImageData.copyPixels(e.currentTarget.getChildAt(0).bitmapData, cropRect, new Point(0, 0));
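Worth noting: loadBytes() decodes asynchronously, just like load(), so the child Bitmap does not exist until the contentLoaderInfo fires COMPLETE. A hedged sketch (resizedImageData and cropRect are from the original snippet; streamBytes stands in for the ByteArray fed by the admin tool):

    import flash.display.Bitmap;
    import flash.display.Loader;
    import flash.display.LoaderInfo;
    import flash.events.Event;
    import flash.geom.Point;

    var imgLoader:Loader = new Loader();
    imgLoader.contentLoaderInfo.addEventListener(Event.COMPLETE, onDecoded);
    imgLoader.loadBytes(streamBytes); // decoding happens asynchronously

    function onDecoded(e:Event):void {
        var bmp:Bitmap = LoaderInfo(e.currentTarget).content as Bitmap;
        if (bmp != null) {
            resizedImageData.copyPixels(bmp.bitmapData, cropRect, new Point(0, 0));
        }
    }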
Only just getting started on this whole domain of learning, so go easy! If I set up a P2P video/audio chat (similar to the sample VideoPhone thing on the Cirrus site), can I get the streams from both parties to send to a server at the same time so that I can record them? If so, would I have to use an FMS to stream to and perform the recording (and if so, which version could I get away with)? Are there any (preferably free, or at least well-tutorialised) solutions for the recording side of things?
Currently it seems like the only option for doing the P2P part is to use Stratus/Cirrus, unless I use FMS 4 Enterprise.
I'm also wondering how effective this kind of setup can be in terms of the quality of the stream and recording. Does any of this make sense?
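For what it's worth, a camera can be attached to more than one NetStream at a time, so each peer could publish a second copy of its feed to an FMS for recording alongside the P2P stream. A hedged sketch (the RTMP URL, app, and stream name are placeholders):

    import flash.events.NetStatusEvent;
    import flash.media.Camera;
    import flash.media.Microphone;
    import flash.net.NetConnection;
    import flash.net.NetStream;

    var cam:Camera = Camera.getCamera();
    var mic:Microphone = Microphone.getMicrophone();

    // ...the RTMFP/Cirrus peer stream is set up elsewhere with the same cam/mic...

    var recNC:NetConnection = new NetConnection();
    recNC.addEventListener(NetStatusEvent.NET_STATUS, onRecStatus);
    recNC.connect("rtmp://myserver/chatrecorder"); // placeholder recording app

    function onRecStatus(e:NetStatusEvent):void {
        if (e.info.code == "NetConnection.Connect.Success") {
            var recNS:NetStream = new NetStream(recNC);
            recNS.attachCamera(cam);
            recNS.attachAudio(mic);
            recNS.publish("partyA", "record"); // server records, if the app allows it
        }
    }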
I've had FMS running on my local machine for a while and have had a little experience writing FMS apps, but I've just tried recording audio for the first time using the standard vod application and I keep getting a "Write access denied for stream" error. My AS3 code is copied and pasted from various examples, and I am confident that it works.
I'm running Windows XP service pack 3 & FMIS 3.5.
I've had a look at the vod/media directory, and under Windows -> Properties the read-only attribute is ticked. Every time I untick this, it reverts to being ticked. I've googled this, and MS says that most programs ignore the read-only attribute and that it only really applies to files. I've also tried the MS fix for setting the read-only attribute via cmd, and still no joy (it doesn't fix the read-only attribute, nor does FMS record the audio after setting it via cmd).
I've also tried our dev server install of FMS (running under Linux) and am getting the same results.
Here's my AS3 code...
private function initApp(event:Event):void {
    removeEventListener(Event.ADDED_TO_STAGE, initApp);
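For comparison, a minimal hedged record flow (the app and stream names are placeholders); as far as the stock configuration goes, the vod service is intended for playback, so whether publish(..., "record") is accepted depends on which application the connection targets:

    import flash.events.NetStatusEvent;
    import flash.media.Microphone;
    import flash.net.NetConnection;
    import flash.net.NetStream;

    var nc:NetConnection = new NetConnection();
    nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
    nc.connect("rtmp://localhost/myrecorder"); // placeholder app that permits recording

    function onStatus(e:NetStatusEvent):void {
        if (e.info.code == "NetConnection.Connect.Success") {
            var ns:NetStream = new NetStream(nc);
            ns.attachAudio(Microphone.getMicrophone());
            ns.publish("audiotest", "record"); // asks the server to write audiotest.flv
        }
    }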
I was testing FMS 4 Update 1 RTMFP multicast streams; after 10 minutes I get this message: "RTMFP Multicast stream has exceeded max duration allowed; closing stream." But I do not use IP multicast.
I built a client-side application where there is only a FLVPlayback 2.5 component and a short AS3 script.
[Code]....
My encoder is set up with three streams: Vid: 500 kbps / Audio: 48 kbps; Vid: 800 kbps / Audio: 48 kbps; Vid: 1500 kbps / Audio: 48 kbps. I start the encoder and everything looks fine in the log. In my browser (Safari or Firefox) I go to my HTML page and the stream starts after 6-8 seconds, but always with the lowest bitrate (548 kbps), and nothing suggests the stream is switching to another bitrate. I tried it with the SMIL playlist and the result is the same. Only the lowest bitrate is published.
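For reference, a dynamic-streaming SMIL for FLVPlayback 2.5 typically looks something like this (the server and stream names are placeholders); the player is expected to switch between the video entries based on measured bandwidth:

    <smil>
        <head>
            <meta base="rtmp://myserver/live" />
        </head>
        <body>
            <switch>
                <video src="livestream_500"  system-bitrate="548000" />
                <video src="livestream_800"  system-bitrate="848000" />
                <video src="livestream_1500" system-bitrate="1548000" />
            </switch>
        </body>
    </smil>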
I have recently installed FMIS 3.5.3. In checking the access logs, I find data in both logs that display the same stream-stop and stream-play time. I'm not sure why the time is the same (00:19:27 in the example below). Videos play fine when testing from work (T3 connection); however, there is occasionally a very slight hesitation when playing video from home (I have a cable connection). [code]...
I'm having a problem with recording a live webcam stream: the last few seconds of the stream get cut off. The recording is stopped with the following piece of code:
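A hedged sketch of a common remedy for truncated recordings (assuming ns is the publishing NetStream): detach the camera and microphone first, then wait for the send buffer to drain before calling close(), so the final seconds actually reach the server:

    import flash.events.TimerEvent;
    import flash.utils.Timer;

    function stopRecording():void {
        ns.attachCamera(null); // stop feeding new data
        ns.attachAudio(null);

        var drain:Timer = new Timer(250);
        drain.addEventListener(TimerEvent.TIMER, function(e:TimerEvent):void {
            if (ns.bufferLength == 0) { // everything buffered has been sent
                drain.stop();
                ns.close();
            }
        });
        drain.start();
    }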
I am having trouble getting audio stream metadata from an Akamai FMS stream. Everything is undefined and I'm not sure why. I am hoping maybe someone will notice something that I am overlooking. The stream is connecting and playing without a problem; I just can't figure out why all the metadata is undefined.
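One common cause, offered as a hedged guess: onMetaData is delivered to the NetStream's client object, so if none is assigned the handler never fires. A minimal sketch (nc and the stream name are placeholders):

    import flash.net.NetStream;

    var ns:NetStream = new NetStream(nc); // nc: the connected NetConnection

    var client:Object = {};
    client.onMetaData = function(info:Object):void {
        for (var key:String in info) {
            trace(key, "=", info[key]); // duration, audiodatarate, etc.
        }
    };
    ns.client = client; // must be set before play()
    ns.play("myAudioStream"); // placeholder stream name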
I have a layout with narration and a nav bar. When I click a nav button for section 2, the audio from section 1 (set to stream) continues to play over the audio for section 2. This accumulates, so if I click the buttons for sections 3, 4 and 5, I get five audio files playing on top of each other. Sections are individual movie clips with embedded audio streaming on a Sounds layer in each movie clip.
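A hedged sketch of one fix (the instance and frame names are placeholders): stop the outgoing section's clip so its stream audio halts, and clear anything already in the mixer, before showing the new section:

    import flash.events.MouseEvent;
    import flash.media.SoundMixer;

    navButton2.addEventListener(MouseEvent.CLICK, function(e:MouseEvent):void {
        SoundMixer.stopAll();        // silence whatever is currently mixing
        section1Clip.gotoAndStop(1); // a stopped timeline stops its stream audio
        gotoAndStop("section2");     // then move to the new section
    });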
I'm trying to stream an HDS live multi-bitrate stream. It seems to push to the FMS, but my player doesn't display the stream. Are these settings and files correct? The documentation is confusing on what and which files need to be edited and/or created.
Encoder settings:
Bit Rate: 150,500,700
FMS URL: rtmp://myserver/livepkgr
Stream: liveevent%i?adbe-live-event?liveevent
FMS 4.5
I see the following directories being created when I start encoding, and each directory has a single file with a .stream extension in it. Are these correct? C:\FMS-HOME\applications\livepkgr\events\_definst_\liveevent1 [code].....