Actionscript 3 :: Does The OSMF Captioning Plugin Support Audio
Dec 13, 2011
I'm setting up a simple little OSMF-based media player, and I have hooked up the org.osmf.captioning plugin found in the samples. It is working just groovy with video, but I also have audio files to play with captions, and it doesn't seem to do anything for those. On initial inspection, I can't see anything in the plugin that ties it to a specific type of media. Nor can I find any indication that the timeline metadata used for captioning is only relevant to certain types of media. Does the captioning plugin support audio playback? Might I have set it up incorrectly?
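For anyone comparing setups, here is a bare sketch of how the sample plugin gets wired up for an audio resource; nothing in this path is video-specific, which is why audio captions seem like they should work. The CaptioningPluginInfo metadata constants, "audio.mp3" and "captions.xml" below are assumptions and should be checked against the sample plugin's source.

import org.osmf.captioning.CaptioningPluginInfo;
import org.osmf.media.*;
import org.osmf.metadata.Metadata;

var factory:MediaFactory = new DefaultMediaFactory();

// load the sample captioning plugin statically (same call whether the media is video or audio)
factory.loadPlugin(new PluginInfoResource(new CaptioningPluginInfo()));

// an MP3 resource carrying the captioning metadata, exactly like the FLV case;
// ASSUMPTION: the namespace/key constant names come from the sample's CaptioningPluginInfo
var resource:URLResource = new URLResource("audio.mp3");
var captionMeta:Metadata = new Metadata();
captionMeta.addValue(CaptioningPluginInfo.CAPTIONING_METADATA_KEY_URI, "captions.xml");
resource.addMetadataValue(CaptioningPluginInfo.CAPTIONING_METADATA_NAMESPACE, captionMeta);

var player:MediaPlayerSprite = new MediaPlayerSprite();
addChild(player);
player.media = factory.createMediaElement(resource);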
I've been trying to refurbish my website using solely OSMF, but the sound part gives problems in various browsers. It works in IE6 and Firefox, but it doesn't in Chrome and Safari. Does anyone know what is happening, and what the differences are between these browsers when it comes to file loading? [URL] Check it for yourself by clicking the buttons in the various browsers: the Flash button uses the native Flash sound libraries, and the OSMF button uses the OSMF implementation.
I would like to add a very small and simple Flash audio player to my website. I have found lots of Flash MP3 players, but I can't seem to find players that will work with other types of audio files, such as aac/m4a/mp4.
I had previously been using Windows Media Server to stream WMA files on my site, but this only works for Windows users, and I need streaming to work for Mac users as well. So I think Flash is the way to go, but I cannot use mp3's because of licensing concerns.
I've been looking and looking for a Flash player that supports other audio types, but can't find one. Is it true that Flash has a native class for MP3 but doesn't have built-in support for other formats?
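For what it's worth, flash.media.Sound only loads MP3, but AAC audio in an MP4/M4A container can be played progressively through NetStream (Flash Player 9.0.115 and later). A minimal sketch, with the file path being a placeholder:

import flash.net.NetConnection;
import flash.net.NetStream;

var nc:NetConnection = new NetConnection();
nc.connect(null);                       // null = progressive download, no media server

var ns:NetStream = new NetStream(nc);
ns.client = { onMetaData: function(info:Object):void {} };   // avoid reference errors when metadata arrives
ns.play("audio/track.m4a");             // hypothetical path to an AAC/M4A file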
I'm using jPlayer in a website, which is a cross-platform/cross-browser jQuery solution for audio and video playback on a website. On my website, I will have users upload files either in MP3 or OGG format. I wonder though... in order to truly stay 100% stable on any browser or platform, wouldn't I need to support a corresponding OGG for every MP3 uploaded (and vice-versa)? I know jPlayer uses HTML5 when it can and will fall back to Flash when necessary, but I didn't know if you need to have that file in both formats to have it play everywhere. If I need both file formats then I suppose I'd have to convert the file to the other format using a server-side conversion. If I need just one format... then wouldn't that be just dandy! So my question is, do I need to convert? Or not?
I'm trying to create a client-side web app that generates music procedurally using some user-input parameters, so I'm looking for a framework (e.g. Flash, Silverlight etc.) that has the capability to play audio at a specified pitch. Whether it is playing a WAV/MP3 file, using MIDI output, or just playing beeps doesn't really matter -- I just need something that will enable me to generate arbitrary music client-side.
I've done a bit of searching, and it appears that Flash might have the ability to change pitch with the help of a third-party plugin, but I couldn't find anything similar for Silverlight.
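Flash can actually synthesize audio at an arbitrary pitch without any plugin, using the dynamic-audio API (flash.media.Sound plus SampleDataEvent, Flash Player 10+). A small sketch that plays a sine tone; the frequency value is just an example of a user-supplied parameter:

import flash.events.SampleDataEvent;
import flash.media.Sound;

var frequency:Number = 440;             // user-supplied pitch in Hz (example value)
var phase:Number = 0;

var sound:Sound = new Sound();
sound.addEventListener(SampleDataEvent.SAMPLE_DATA, onSampleData);
sound.play();

function onSampleData(e:SampleDataEvent):void
{
    for (var i:int = 0; i < 4096; i++)  // feed ~4k sample frames per request
    {
        var sample:Number = Math.sin(phase) * 0.5;
        e.data.writeFloat(sample);      // left channel
        e.data.writeFloat(sample);      // right channel
        phase += 2 * Math.PI * frequency / 44100;
    }
}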
I'm trying to implement closed captioning in my FLVs. I have successfully set up the FLVPlayback component and the FLVPlaybackCaptioning component, and they work together. That's not the issue. The issue is that once I have my SWF, FLV, and XML files (needed to successfully use CC in an FLV), I need to insert that SWF into my Captivate project.
It appears to insert/import OK, but once I publish the project the SWF does not appear; in fact the FLV icon sits on the slide rapidly blinking. The video doesn't play, and I can't even see the playback/closed caption component (SWF) that holds the FLV.
All paths are relative, so the only things I can think of that may be causing the issue are:
a) Captivate doesn't support the CS3 FLVPlayback or FLVPlaybackCaptioning components
b) Since Captivate allows importing FLV files directly, running the FLV through the component's SWF is a problem
c) The hierarchy of files needed for the closed-captioned FLV (XML, FLV, SWF) interferes with compiling the new Captivate SWF upon publishing
I've also been told that the FLVPlaybackCaptioning component only works with ActionScript 3.0, which runs on ActionScript Virtual Machine 2, while Captivate publishes SWF files using ActionScript 2.0, which runs on ActionScript Virtual Machine 1. Therefore, what I need to do will not work.
I'm creating accessible content for an HTML elearning module. I'm including video, which I have successfully imported into Flash and added captions to by linking an XML document. The issue I am having is that my styling in the XML isn't working in Flash, and my captions play in a tiny, horrible font, making it very difficult for people to read and therefore not accessible. I've used two online tutorials to create these styles in the XML, but neither is working. I'm using Flash CS5 and Dreamweaver. The XML content is as follows: (I have removed my content for the script)
I'm using the FLVPlayback and FLVPlaybackCaptioning components with a skin for some videos. It works great, but I can't figure out how to have the captions off by default and still let users turn them on with the caption button.
flvCaption.showCaptions=false;
This hides the captions, but it also disables the captionButton in the skin, so there is no way to toggle the captions back on.
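One workaround sketch, under the untested assumption that the skin's caption button stays active if showCaptions is only turned off after the FLVPlayback instance reports it is ready (instance names and the captions path are placeholders):

import fl.video.FLVPlayback;
import fl.video.FLVPlaybackCaptioning;
import fl.video.VideoEvent;

// myFlvPlayback and myCaptioning are the stage instances (placeholder names)
myCaptioning.flvPlayback = myFlvPlayback;
myCaptioning.source = "captions.xml";          // Timed Text file (placeholder path)

myFlvPlayback.addEventListener(VideoEvent.READY, onVideoReady);

function onVideoReady(e:VideoEvent):void
{
    // turn captions off only after the components are wired up,
    // so the skin's caption button can still toggle them back on
    myCaptioning.showCaptions = false;
}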
I have a simple MP3 player I'm making for timing closed captions. It plays one MP3 based on an ExternalInterface call, updates a text field with the playback position, and allows pausing and jumping the position back a bit. It will have a scrubber, but that code isn't in there yet. There is no complex code anywhere in the file, but it still gets confused about btnPlay.visible and btnPause.visible, depending on which is visible first.
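For comparison, here is a stripped-down sketch of the play/pause toggle and position readout. The instance names btnPlay, btnPause and txtPosition are placeholders, and the MP3 path is assumed to arrive through an ExternalInterface callback:

import flash.events.Event;
import flash.events.MouseEvent;
import flash.external.ExternalInterface;
import flash.media.Sound;
import flash.media.SoundChannel;
import flash.net.URLRequest;

var sound:Sound;
var channel:SoundChannel;
var pausePosition:Number = 0;

ExternalInterface.addCallback("loadTrack", loadTrack);   // called from JavaScript

function loadTrack(url:String):void
{
    sound = new Sound(new URLRequest(url));
    pausePosition = 0;
    play();
}

function play(e:MouseEvent = null):void
{
    channel = sound.play(pausePosition);
    btnPlay.visible = false;
    btnPause.visible = true;
}

function pause(e:MouseEvent = null):void
{
    pausePosition = channel.position;
    channel.stop();
    btnPlay.visible = true;
    btnPause.visible = false;
}

btnPlay.addEventListener(MouseEvent.CLICK, play);
btnPause.addEventListener(MouseEvent.CLICK, pause);

addEventListener(Event.ENTER_FRAME, function(e:Event):void
{
    if (channel) txtPosition.text = (channel.position / 1000).toFixed(1) + " s";
});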
I am using Flash Builder to create an AS3 video player which needs to support closed captioning. However, I am not using the FLVPlayback component. Is there a way of doing something like addASCuePoint() on flash.media.Video, or is my only option using Flash's Timer to do my own check?
I am looking to add closed captioning to video being played via NetStream. The cue points and text data would be pulled from an external XML file. I am tinkering with different libraries/add-ins but not having much luck. I've read that you cannot add cue points to a NetStream programmatically.
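One common approach here (and to the previous question) is to skip cue points entirely and poll NetStream.time against a list of caption timings loaded from XML. A rough sketch; the XML structure, the captions path, and the ns and captionField instances are assumptions:

import flash.events.Event;
import flash.events.TimerEvent;
import flash.net.URLLoader;
import flash.net.URLRequest;
import flash.utils.Timer;

var cues:Array = [];          // [{time:Number, text:String}, ...]
var nextCue:int = 0;

var loader:URLLoader = new URLLoader();
loader.addEventListener(Event.COMPLETE, onCuesLoaded);
loader.load(new URLRequest("captions.xml"));   // hypothetical file

function onCuesLoaded(e:Event):void
{
    var xml:XML = new XML(loader.data);
    for each (var cue:XML in xml.cue)          // assumed structure: <cue time="1.5">text</cue>
        cues.push({ time: Number(cue.@time), text: String(cue) });

    var timer:Timer = new Timer(100);          // check ten times a second
    timer.addEventListener(TimerEvent.TIMER, onTick);
    timer.start();
}

function onTick(e:TimerEvent):void
{
    // ns is the NetStream already feeding the Video object (placeholder)
    while (nextCue < cues.length && ns.time >= cues[nextCue].time)
    {
        captionField.text = cues[nextCue].text;  // captionField: a TextField on stage (placeholder)
        nextCue++;
    }
}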
I currently have a problem with captioning on a video file that I am streaming from Flash Media Server. For some reason, the first time the video is viewed, the captioning pauses right around the 6-minute mark. I've verified this happens from multiple computers, from different physical locations, and on different networks. The video continues to play, but the pause button on the playbar also loses functionality and the captioning does not progress.
I've created a total of 11 videos, all using the Flash closed-captioning plugin with a custom show/hide captions button and a minimal playbar that only includes the pause button and the mute button. This only happens in this one particular file, but they are all set up the same. From what I can tell, the XML is constructed correctly (Timed Text XML generated by a contractor).
The code I've written works fine on all the other files. The only thing I can see that's different among the files is possibly the size of the movie I'm streaming; it's the longest. If the viewer refreshes the page and watches the movie again, the pause does not happen. Is it possible that a setting may need to be changed on the Flash Media Server?
I am developing a video player using the OSMF library. The problem is that I sometimes lose the connection to the server, so I set up an object that watches the connection and, when the connection is lost, tries a limited number of times to reconnect before giving up. Everything works just fine except for the message that I get in the debugger version of the player, which states: [code] But I still get the error. The onNetStatus method gets events like NETSTREAM_BUFFER_EMPTY, NETSTREAM_BUFFER_FULL or NETSTREAM_PLAY_START, but not NETSTREAM_PLAY_STREAMNOTFOUND.
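For reference, a bare-bones version of the reconnect-watcher idea, independent of the elided code above; the server URL and retry count are placeholders:

import flash.events.NetStatusEvent;
import flash.net.NetConnection;

var retriesLeft:int = 3;                       // how many reconnect attempts before giving up
var nc:NetConnection = new NetConnection();
nc.addEventListener(NetStatusEvent.NET_STATUS, onNetStatus);
nc.connect("rtmp://example.com/app");          // placeholder server URL

function onNetStatus(e:NetStatusEvent):void
{
    switch (e.info.code)
    {
        case "NetConnection.Connect.Success":
            retriesLeft = 3;                   // reset the budget once we are back online
            break;
        case "NetConnection.Connect.Closed":
        case "NetConnection.Connect.Failed":
            if (retriesLeft-- > 0)
                nc.connect("rtmp://example.com/app");   // try again
            // otherwise give up and surface the error to the UI
            break;
    }
}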
OSMF is very thick, so I'm trying to put together a series of minimalist tutorials, and again I'm stuck on something that should be dead simple. I have a VideoElement that I added to a MediaPlayer. Now how do I set the size of the video? I'd like to just set the size on the MediaPlayer or MediaElement and not pull in 20 layout classes like the OSMF examples do.
private function handle_elementLoaded(e:MediaFactoryEvent):void {
    mediaPlayer = new MediaPlayer(e.mediaElement);
    [code]....
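If a display object is acceptable, the simplest route I know of is MediaPlayerSprite, which sizes (and letterboxes) the media for you without any of the layout classes. A small sketch; element stands in for the MediaElement created above:

import org.osmf.media.MediaPlayerSprite;

var sprite:MediaPlayerSprite = new MediaPlayerSprite();
addChild(sprite);

sprite.media  = element;   // the MediaElement created by the factory (placeholder)
sprite.width  = 640;       // scales the video into this box
sprite.height = 360;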
I'm trying to add a cue point to a video using OSMF. I built an OSMF video player, and I'd like to use this instead of the FLVPlayback component, which otherwise seems like the only way to add an ActionScript cue point. Anyhow, I created a cue point by writing this:
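The snippet itself didn't come through above. For reference, the pattern I've seen for OSMF cue points goes through TimelineMetadata and CuePoint rather than FLVPlayback; a rough, unverified sketch against the OSMF 1.x API (the namespace constant, CuePointType value and exact signatures should be checked against the OSMF docs, and videoElement stands for the player's existing VideoElement):

import org.osmf.events.TimelineMetadataEvent;
import org.osmf.metadata.CuePoint;
import org.osmf.metadata.CuePointType;
import org.osmf.metadata.TimelineMetadata;

var cuePoint:CuePoint = new CuePoint(CuePointType.ACTIONSCRIPT, 5, "myCue", null);
var timelineMetadata:TimelineMetadata = new TimelineMetadata(videoElement);
timelineMetadata.addMarker(cuePoint);
videoElement.addMetadata(CuePoint.DYNAMIC_CUEPOINTS_NAMESPACE, timelineMetadata);

timelineMetadata.addEventListener(TimelineMetadataEvent.MARKER_TIME_REACHED, onCue);

function onCue(e:TimelineMetadataEvent):void
{
    trace("cue point reached:", CuePoint(e.marker).name);
}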
The Flex 4.1 SDK ships with OSMF 1.0. For newer features, OSMF can be updated. When I add the updated osmf.swc to a Flex 4.1 project, I get error messages as soon as I add an OSMF component to my application (VideoDisplay in the screenshot). Screenshot: [URL].png (see also the OSMF 1.5 download page). According to the documentation, OSMF 1.5 should work with the Flex 4.1 SDK. Unfortunately, I have to stick to Flex 4.1 because Flash Player 10.1 needs to be supported (Flex 4.5 needs Flash Player 10.2). When I change the SDK to version 4.5, the error messages disappear and it compiles as expected. So: is the documentation wrong about supporting Flex 4.1, or am I doing something wrong?
I'm building a pretty simple player and have a buffer size set on my MediaPlayer (mediaPlayer.bufferTime = BUFFER_TIME). That's working, but I want to show a graphic whenever it starts to buffer. I have an event listener on my MediaPlayer to show and hide the graphic, but it doesn't seem to be working properly. [code]...
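Without the elided listener it's hard to say what's wrong, but the pattern I'd expect with OSMF is listening for the MediaPlayer's bufferingChange event and reading its buffering flag. A sketch; bufferingGraphic is a placeholder for whatever spinner is on stage:

import org.osmf.events.BufferEvent;

mediaPlayer.bufferTime = BUFFER_TIME;
mediaPlayer.addEventListener(BufferEvent.BUFFERING_CHANGE, onBufferingChange);

function onBufferingChange(e:BufferEvent):void
{
    // show the spinner while buffering, hide it otherwise
    bufferingGraphic.visible = e.buffering;
}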
How would one create a video or audio stream using OSMF when there must be basic authentication on the URL? Can one feed in audio/video using HTTPService to provide the header authentication?
I'm working on a local application (it's not a website or anything web-related) and I have various FLVs with a very simple encryption method for now (just adding 10 to each byte).
I can load/play them using NetStream.appendBytes() after decrypting, but that only happens after I have read all the video data; it isn't streamed.
What I really need is to stream those videos from a remote URL, decrypting while receiving the data, using an OSMF-based player that I have already built. I'm lost on how OSMF deals with FLV; otherwise, I would try to create a plugin or something like that.
I need to find a way to load a file using OSMF by passing a ByteArray value instead of a URL (below), or directions for creating an OSMF plugin to solve my problem.
videoElement.resource = new URLResource("video_url/video.flv");
This is my current code, just to play my decoded FLV byte array:
private function playBytes(bytes:ByteArray):void {
    // check the FLV header signature before appending
    if (bytes.readUTFBytes(3) != "FLV")
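To go from "decrypt everything, then play" to streaming, one option is to decrypt chunks as they arrive over HTTP and push them into the NetStream. A sketch under the stated assumptions (progressive download, the +10-per-byte scheme, Flash Player 10.1+ for appendBytes; the URL and videoDisplay are placeholders):

import flash.events.ProgressEvent;
import flash.net.NetConnection;
import flash.net.NetStream;
import flash.net.NetStreamAppendBytesAction;
import flash.net.URLRequest;
import flash.net.URLStream;
import flash.utils.ByteArray;

var nc:NetConnection = new NetConnection();
nc.connect(null);

var ns:NetStream = new NetStream(nc);
ns.client = { onMetaData: function(info:Object):void {} };
ns.play(null);                                        // data generation mode
ns.appendBytesAction(NetStreamAppendBytesAction.RESET_BEGIN);
videoDisplay.attachNetStream(ns);                     // videoDisplay: an existing flash.media.Video (placeholder)

var urlStream:URLStream = new URLStream();
urlStream.addEventListener(ProgressEvent.PROGRESS, onProgress);
urlStream.load(new URLRequest("http://example.com/encrypted.flv"));  // placeholder URL

function onProgress(e:ProgressEvent):void
{
    if (urlStream.bytesAvailable == 0)
        return;

    var chunk:ByteArray = new ByteArray();
    urlStream.readBytes(chunk);                       // grab whatever has arrived so far

    // undo the "+10 per byte" encryption in place (assumes wraparound modulo 256)
    for (var i:int = 0; i < chunk.length; i++)
        chunk[i] = (chunk[i] - 10) & 0xFF;

    ns.appendBytes(chunk);                            // feed the decrypted FLV bytes to the player
}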
I have an HTTP video player built using Adobe's OSMF, and I am experiencing strange behavior when trying to seek within a subclip. The player requests data from the server using a URL like "http:[url]...." to get the full video, and appends "?begin=123456" to request a subclip starting at a 123456-byte offset. Whenever I try to seek within a loaded subclip, the playhead just drops to the start of the subclip and the video plays from there, even though both mediaPlayer.canSeek and mediaPlayer.canSeekTo(newtime) return true.
I am looking to build a custom OSMF player such as this OSMF player sample; however, I only have Flash CS, not Flex. Is it possible to build an OSMF player without using Flex?
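OSMF itself is a pure ActionScript 3 library, so it can be used from Flash Professional by adding the OSMF SWC to the publish settings. A minimal document-class sketch; the class name and video URL are placeholders:

package
{
    import flash.display.Sprite;
    import org.osmf.media.MediaPlayerSprite;
    import org.osmf.media.URLResource;

    public class MinimalOSMFPlayer extends Sprite
    {
        public function MinimalOSMFPlayer()
        {
            var player:MediaPlayerSprite = new MediaPlayerSprite();
            addChild(player);
            player.resource = new URLResource("http://example.com/video.flv");  // placeholder URL
        }
    }
}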
I am developing a web application in Flex which has a feature for recording the runtime by taking a snapshot of each frame and then encoding the frames into a ByteArray for video playback. I am currently using NetStream.appendBytes() to play the ByteArray FLV. It is working, but I just found out about OSMF and am thinking about integrating it into my application. Is it possible to play the FLV ByteArray in OSMF?
I use OSMF's SWFElement in my project to load a SWF file into the main application, but the main app can't detect events from the child SWF at all. Code in the main app:
mediaPlayerSprite = new MediaPlayerSprite();
var swfElement:SWFElement = new SWFElement();
[code]....
Code in the child SWF, created with Flash CS3 (the code is added on the Flash timeline):
I've written a little OSMF player that streams via RTMP from Amazon CloudFront. There's a known issue: the MP3 duration is not correctly read from the metadata, and thus the seek function is not working. I know there's a workaround involving the getStreamLength call on the NetConnection, which I successfully implemented in a previous non-OSMF player, but now I don't know how and when to call it in terms of OSMF events and traits. This code is not working:
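The non-working snippet is missing above. One way the workaround could be wired into OSMF, with the caveat that the trait access is an assumption about the OSMF 1.x API (NetStreamLoadTrait exposing the underlying NetConnection) and should be verified; the element and stream names are placeholders:

import flash.net.Responder;
import org.osmf.events.LoadEvent;
import org.osmf.events.MediaElementEvent;
import org.osmf.net.NetStreamLoadTrait;
import org.osmf.traits.LoadState;
import org.osmf.traits.MediaTraitType;

// audioElement is the MediaElement created for the RTMP MP3 resource (placeholder name)
audioElement.addEventListener(MediaElementEvent.TRAIT_ADD, onTraitAdd);

function onTraitAdd(e:MediaElementEvent):void
{
    if (e.traitType != MediaTraitType.LOAD)
        return;

    var loadTrait:NetStreamLoadTrait =
        audioElement.getTrait(MediaTraitType.LOAD) as NetStreamLoadTrait;
    if (loadTrait == null)
        return;

    loadTrait.addEventListener(LoadEvent.LOAD_STATE_CHANGE, function(le:LoadEvent):void
    {
        // ASSUMPTION: NetStreamLoadTrait exposes the NetConnection once loading is ready
        if (le.loadState == LoadState.READY && loadTrait.connection)
        {
            // classic FMS/CloudFront workaround: ask the server for the stream length
            loadTrait.connection.call("getStreamLength",
                new Responder(onStreamLength), "mp3:mystream");   // placeholder stream name
        }
    });
}

function onStreamLength(result:Number):void
{
    trace("stream length in seconds:", result);
    // store this and use it for seeking instead of the bad metadata duration
}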
I have written a program to publish audio and video data to FMS. I am publishing the video data to FMS in live mode and trying to play it back via an OSMF player. When I start publishing video data, files like .bootstrap, .control and .meta get created in the livepkgr application folder of FMS. But when I try to play back via the OSMF player I get error 1009, and one interesting thing I have observed is that after 7 minutes the error no longer occurs and the OSMF player starts playing properly. What is the solution for error 1009, which occurs only for the first 7 minutes?