I need to develop a plugin that is very similar to the SMIL plugin.
Composition Plugin (SMIL) and Use of [LoadableProxyElement]
A SMIL plugin needs to parse a SMIL document and create a composite media element. However, a SMIL plugin does not know a priori which [MediaElement] needs to be created before parsing through the SMIL document. As a result, it does not know which [MediaElement] to include with its [MediaInfo] that gets registered with the Strobe [MediaFactory]. To solve this problem, we recommend using the [ProxyElement] class as its default [MediaElement]. Here's what the approach might look like:
- The SMIL plugin's [PluginInfo] class (to be precise, its mediaInfoSet getter method) will return an array containing a single [MediaInfo] object. The [MediaInfo] object will contain '[LoadableProxyElement]' as its [MediaElement], 'URLResource' as the [IMediaResource], and 'SMILLoader', which extends [LoaderBase].
- The SMILLoader class will have a 'load' method. Here's what it should do at a high level:
- Download the SMIL file specified in the URLResource object
- Parse the SMIL document
- Create a [SerialElement], a [ParallelElement], or a specific [MediaElement] (such as [VideoElement]), depending on what's at the root level of the SMIL document
- Call 'load' on its ILoadable trait
- Wait for it to be loaded
- Use the [ILoadedContext] object from the loaded media element and include it in the LOADED event.
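The bulleted steps above might be sketched roughly like this. This is only my reading of the quoted doc, not working code; parseSMIL and the exact LoaderBase/ILoadable signatures are assumptions, not the actual sprint API:

```actionscript
// Rough sketch of the SMILLoader load flow described above.
// parseSMIL() and the exact trait/loader signatures are assumptions.
public class SMILLoader extends LoaderBase
{
    override public function load(loadable:ILoadable):void
    {
        var resource:IURLResource = loadable.resource as IURLResource;

        // 1-2. Download and parse the SMIL file named by the URLResource.
        var urlLoader:URLLoader = new URLLoader();
        urlLoader.addEventListener(Event.COMPLETE,
            function(event:Event):void
            {
                var smilDocument:XML = new XML(urlLoader.data);

                // 3. Build a SerialElement, ParallelElement, or a plain
                //    element such as VideoElement from the root node.
                var composedElement:MediaElement = parseSMIL(smilDocument);

                // 4-6. Load the composed element via its ILoadable trait,
                //      wait for it to reach the loaded state, then expose
                //      its ILoadedContext in the LOADED event.
            });
        urlLoader.load(new URLRequest(String(resource.url)));
    }
}
```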
My plugin needs to take an asset id, make a request to get the FLV URL, and then create a ProxyElement to monitor the playback.
I imagine the player implementation would look something like this:
// VeohResource would append the asset id to the full service URL.
var resource:IURLResource = new URLResource(new VeohResource("v123456"));
var mediaElement:MediaElement = mediaFactory.createMediaElement(resource);
mediaPlayerWrapper.element = mediaElement;
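For what it's worth, the VeohResource I have in mind would be something like this. This is purely a sketch: I'm assuming URLResource accepts a URL subclass, and the service endpoint shown is a placeholder:

```actionscript
// Hypothetical resource that turns an asset id into a full service URL.
// The endpoint shown here is a made-up placeholder, not a real service.
public class VeohResource extends URL
{
    private static const SERVICE_URL:String = "http://services.example.com/getVideo?id=";

    public function VeohResource(assetId:String)
    {
        super(SERVICE_URL + assetId);
    }
}
```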
The closest plugin example I could find was the SMIL plugin, but after many, many hours of reading contradictory blog posts and different sprint code, I'm going to need some help.
1. Take down the old documentation for previous sprints.
2. Where is the SMIL sample project? All of the other plugins have a sample project. The unit test project was helpful but didn't provide any clues on player implementation.
So from the looks of this document, I guess you just have to load the plugin and place this code in the onPluginLoaded handler?
var resource:IURLResource = new URLResource(new URL("http://myserver/myfile.smil"));
var mediaElement:MediaElement = mediaFactory.createMediaElement(resource);
mediaPlayerWrapper.element = mediaElement;
SMIL Plugin Questions
1. I was going to ask how to call the SMILLoader.load method from the player code,
but in the latest release it has been renamed to executeLoad, which leads me to believe this method is executed
by the framework and not by the player code.
So what exactly calls SMILLoader.executeLoad, and what player code triggers that call?
2. Why does SMILLoader get created twice in SMILPluginInfo?
Ah, it looks like this may only be an issue with the sprint 9 tag.
3. I'm assuming that after loading the SMILPlugin, calling mediaFactory.createMediaElement(new URLResource(new URL("http://myserver/myfile.smil"))) would somehow magically invoke SMILLoader.executeLoad?
4. In SMILLoader.finishLoad
a) Where did this loadTrait come from?
var loadedElement:MediaElement = mediaGenerator.createMediaElement(loadTrait.resource, smilDocument, factory);
And is the resource a reference to the original resource from the player code?
this one => var resource:IURLResource = new URLResource(new URL("http://myserver/myfile.smil"));
b) So I'm trying to figure out what happens after the loadedElement var is assigned a new MediaElement.
This LoadedFromDocumentLoadTrait is new in sprint 10, and I have no idea what it does.
c) So I'm completely confused here.
It looks like the MediaElement is created, and it's now time to get it
back to the player code so it can be added to a MediaPlayer, right?
In sprint 9 I thought the player code had to listen for this "update load trait" event to get the MediaElement,
but now in sprint 10 there is a whole object and event handler.
So how the hell does the player get the MediaElement created here?
5. The example player code provided doesn't jibe with the implementation found in the trunk.
// If we define our resource here
var resource:IURLResource = new URLResource(new URL("http://myserver/myfile.smil"));
// How could the SMIL document have been loaded and parsed before we call this line?
var mediaElement:MediaElement = mediaFactory.createMediaElement(resource);
mediaPlayerWrapper.element = mediaElement;
It seems the player code needs to listen for a SMIL loaded event before doing anything with
the mediaElement, right?
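In other words, I'd expect to have to write something like this before handing the element over. The event, trait, and state names here are my guesses at the sprint API, based on the ILoadable trait and LOADED event mentioned in the quoted docs:

```actionscript
// Guesswork: wait for the element's ILoadable trait to reach LOADED
// before giving the element to the player. Event/trait/state names
// may not match the actual sprint API.
var resource:IURLResource = new URLResource(new URL("http://myserver/myfile.smil"));
var mediaElement:MediaElement = mediaFactory.createMediaElement(resource);

var loadable:ILoadable = mediaElement.getTrait(MediaTraitType.LOADABLE) as ILoadable;
loadable.addEventListener(LoadableStateChangeEvent.LOADABLE_STATE_CHANGE,
    function(event:LoadableStateChangeEvent):void
    {
        if (event.newState == LoadState.LOADED)
        {
            // The SMIL document is downloaded, parsed, and composed;
            // now it's safe to hand the element to the player.
            mediaPlayerWrapper.element = mediaElement;
        }
    });
loadable.load();
```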
Thanks in advance for any advice you can give me.