Anyone? Is there a setting here that I'm missing?
I found the following line in the NetStream reference regarding H264 on iOS:
Special considerations for H.264 video in AIR 3.0 for iOS
For H.264 video, the iOS APIs for video playback accept only a URL to a file or stream. You cannot pass in a buffer of H.264 video data to be decoded. Depending on your video source, pass the appropriate argument to NetStream.play():
- For progressive playback: Pass the URL of the file (local or remote).
- For streaming video: Pass the URL of a playlist in Apple's HTTP Live Streaming (HLS) format. This file can be hosted by any server; Flash Media Server 4.5 and higher has the advantage of being able to encode streams in HLS format.
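In ActionScript terms, the two cases above come down to what you pass to NetStream.play(); a minimal sketch (all server URLs and paths are placeholders):

```actionscript
var nc:NetConnection = new NetConnection();
nc.connect( null ); // null: progressive/HTTP playback, no media-server handshake

var ns:NetStream = new NetStream( nc );
ns.client = {}; // empty client object avoids async errors for onMetaData etc.

var video:Video = new Video();
video.attachNetStream( ns );
addChild( video );

// Progressive playback: pass the URL of the file itself (local or remote)
ns.play( "http://example.com/videos/movie.mp4" );

// Streaming on iOS: pass the URL of an HLS playlist instead
// ns.play( "http://example.com/hls/stream/playlist.m3u8" );
```

On iOS, AIR hands that URL down to the native player, which is why only these two forms are accepted.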
This seems to imply that HLS is the only supported transport protocol on iOS. However, this is never explicitly stated in Adobe's docs. Also, I am able to get H.263 video rendered just fine over RTMP/RTMFP.
Can someone from Adobe please verify whether or not H264 decoding over an RTMP/RTMFP stream from FMS is supported in Adobe Air 3.0 on iOS?
I'm also interested in this issue.
Using Wowza, I can play HLS H.264 live streams in Safari under iOS 4, but how can I do it in AIR 3? I've tried both:
<s:VideoDisplay source="http://server/rtmp/vv12/playlist.m3u8" />
<s:VideoDisplay source="rtmp://server/rtmp/vv12" />
but no video is displayed.
That's a completely different issue. RTMP and HLS are two completely different transport protocols. HLS is supported for H264 decoding/encoding on iOS, while RTMP is "officially" NOT supported on iOS for H264 decoding.
I've recently run into the same problem trying to play embedded H.264 video inside AIR for iOS. The same code works for an FLV file (using the NetStream class), but when I point to an F4V video, everything continues to work fine in the PC debugger, yet the app on the iPad doesn't display the video.
Here is a piece of the code I've used. Have you solved this problem?
Thanks in advance for your support.
AIR3.0 SDK / FLASH CS 5.5 PROJECT / FLASHDEVELOP CLASSES
The following code works well with an FLV file:
_movieVideo = new Video();
_movieVideo.smoothing = true;

var connection:NetConnection = new NetConnection();
connection.connect( null ); // required for local/progressive playback

_movieStream = new NetStream( connection );
_movieStream.addEventListener( NetStatusEvent.NET_STATUS, checkLoop );
_movieStream.bufferTime = 0;
_movieStream.inBufferSeek = true;

var client:Object = new Object();
client.onMetaData = onMetaData;
_movieStream.client = client;

_movieVideo.attachNetStream( _movieStream );

// As path I've tried "media/ap/ap.f4v", "mp4:media/ap/ap.f4v" and
// "media/ap/ap.mp4" -- but NOTHING! Those files are in the same folder
// as "media/ap/ap.flv", which WORKS WELL!
_movieStream.play( sPath );
_movieStream.seek( nSeek );

// The container is a MovieClip previously sized to 576x768;
// again, this code works well with FLV.
_movieVideo.width = this.width;
_movieVideo.height = this.height;
_movieVideo.x = -this.width / 2;
_movieVideo.y = -this.height / 2;
this.addChild( _movieVideo );
It turns out that real-time streaming of H.264 video does not render on iOS devices in Adobe AIR. Other video codecs will work (e.g., H.263), but not H.264. You'll consume the feed, but will only hear audio. I confirmed this after talking with members of the Adobe AIR development team.
Hi Craig, thanks for your answer.
Unfortunately the problem isn't limited to real-time streaming; it also affects playback of local and remote files. It's a big problem. For now I've re-encoded all my files as FLV, but the quality isn't comparable to the H.264 version and the files are a little bit larger!
Any chance that this incredible bug will be fixed in a future version? (If there will even be another update to AIR for iOS, after Adobe's recent position regarding AIR for mobile.)
Hi kerpoof_dev, this problem affects both GPU and CPU render modes. The only way to render the H.264 video is to use the StageVideo class, which is very fast but has many limitations.
You can render a locally stored H.264 file using StageVideo on iOS with no problems. The problem occurs when streaming H.264 video using a real-time streaming protocol (RTMP, RTMFP). This does not work on iOS in Adobe AIR, due to the underlying iOS SDK. It does work on Android, though. This isn't really a bug; rather, it's a product of how Adobe AIR compiles down to and uses the underlying iOS SDK, and of the fact that there's no real way within the iOS SDK to render a real-time streamed H.264 signal; only a recorded stream can be rendered.
When I say "streamed", what I mean is media streamed using RTMP or RTMFP from a media server (presumably Flash Media Server).
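For the locally stored case described above, a StageVideo setup looks roughly like this (a sketch; the file path is hypothetical, and your renderMode must permit StageVideo):

```actionscript
// Sketch: playing a locally stored H.264 file through StageVideo on iOS.
if ( stage.stageVideos.length > 0 )
{
    var sv:StageVideo = stage.stageVideos[ 0 ];
    sv.viewPort = new Rectangle( 0, 0, stage.stageWidth, stage.stageHeight ); // flash.geom.Rectangle

    var nc:NetConnection = new NetConnection();
    nc.connect( null ); // local/progressive playback

    var ns:NetStream = new NetStream( nc );
    ns.client = { onMetaData: function( info:Object ):void {} };

    sv.attachNetStream( ns );
    ns.play( "media/ap/ap.mp4" ); // hypothetical local path
}
```

Note that stage.stageVideos can be empty on some devices or render modes, so the length check is not optional.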
It IS possible to stream an H264 video using HTTP Live Streaming from an FMS to an Adobe Air mobile app on iOS. I haven't actually tested this out, but the docs all say it's supported. The problem with this approach is that HTTP Live Streaming isn't exactly "real-time", and can have delays of up to 30 seconds in the media stream.
I do hope someone from Adobe is "hearing" this, guys. The lack of RTMP-based H.264 video in AIR for iOS is a major problem, indeed.
As Fabio Sonnati mentioned in http://sonnati.wordpress.com/2011/04/26/air-2-6-for-ios-and-video-playback/, AIR for iOS does support HTTP streaming (via HLS) of H.264 videos. However, when streaming via RTMP, AIR for iOS only supports VP6 and Spark – a couple of old, retired codecs.
While HTTP streaming (HLS) seems to be a good option for those who simply want to “play a video” in iOS, I do believe it has some severe limitations, especially for live-communications. I’d like to share some of these thoughts with you.
1. HLS has ridiculously high latency for live videos (around 40 seconds), when compared to RTMP. Although this may not be a problem for on-demand videos, it sure is a great problem for anyone doing serious live-communications applications (such as webconferencing, live webcasting with audience interaction or Skype-like video chats), which require near-zero latency.
2. Perhaps someone can correct me on this (hopefully!), but as far as I know, HTTP streaming will not allow cuepoints to be read from videos. This is particularly painful for anyone doing video-triggered actions, such as slide changes (for webinar apps), subtitling or live closed captioning, etc. I read somewhere that OSMF player allows cuepoints (or "temporal metadata". See http://blogs.adobe.com/osmf/2009/11/cue_point_support_in_osmf.html), but I haven't been able to test it myself.
3. Although HLS is quite compatible with firewalls (since it flows through port 80), RTMP with tunnelling also flows through port 80 or 443, which adds great compatibility, even on very restricted networks. Our experience with very large clients proves that, hands down.
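On the cue point issue in item 2: over RTMP (or progressive FLV), cue points arrive through the NetStream client callbacks. A sketch, assuming nc is an already-connected NetConnection:

```actionscript
var ns:NetStream = new NetStream( nc ); // nc: already-connected NetConnection (assumed)
ns.client = {
    onCuePoint: function( info:Object ):void
    {
        // e.g. trigger a slide change or caption update in a webinar app
        trace( "cue point:", info.name, "at", info.time );
    },
    onMetaData: function( info:Object ):void
    {
        trace( "duration:", info.duration );
    }
};
```

As far as I can tell, nothing equivalent fires for an HLS stream played through the native iOS pipeline, which matches the limitation described in item 2.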
In other words, HTTP/HLS streaming is OK. But it simply does *not* fit into every shoe that RTMP does. We believe that RTMP remains our best option for live streaming or serious streaming-oriented *apps* (in which things more complex than “mere video playing in a window” actually happen).
That all said, I do believe we should let Adobe know about this need. The fact that RTMP streaming in AIR for iOS is limited to VP6 and Spark, which are two “dead” codecs, still puts us, Air developers, in a very fragile position in terms of what we can accomplish with video in iOS.
I’m sure some of you cheered when you heard about Flash Player 11 having H.264 video encoding. This (plus the echo-cancellation feature that came in 10.3) opened great doors for Unified-Communication applications to be developed for Flash/AIR. Now, it’s undeniable that clients want those applications running on tablets, especially the iPad.
Not being able to use H.264 via RTMP on iOS is certainly a huge step backwards. Does anyone share this opinion? What do you guys believe to be the best way to let Adobe really know about this need? Is this limitation a simple lack-of-a-feature (which can be fixed by Adobe), or is it something imposed by Apple?
Just one final note: Air for Android does *not* have the same limitation. It does allow RTMP streaming of h.264.
Thanks for your attention,
The requirement to use HTTP is an Apple one. I forget at exactly what size it becomes a requirement, but the threshold is low enough that most streaming applications would exceed it quite quickly. Here's the Apple article on it:
It seems that if your video is over 10 minutes, or it delivers over 5 MB in 5 minutes, it must be HTTP Live Streaming. Shorter clips can be progressive downloads. With those requirements it would be tough for Adobe to justify RTMP support for the sake of people with low data rate, short movies, so instead they added HTTP support to Flash Media Server.
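For reference, an HLS "stream" is just a plain-text playlist pointing at short media segments, which is why any web server can host it. A minimal illustrative playlist (segment names are made up):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
segment0.ts
#EXTINF:10.0,
segment1.ts
#EXT-X-ENDLIST
```

For a live stream the server keeps appending segments and omits #EXT-X-ENDLIST; that segmented delivery is also where the latency comes from, since the player typically buffers several full segments before starting playback.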
Thanks for the (very!) quick response!
I understand this is imposed by Apple. The problem is that HTTP Live Streaming, as I mentioned in my previous post, currently does not provide the same features as RTMP does... Again, it's fine for those doing latency-independent apps, in which the video is an entity by itself.
But let's say I need some cue points in a *live* video to trigger, for example, an image change, or any other event in the app. That is pretty straightforward with RTMP, for example.
Ok, so I'll change my question. Instead of trying to accomplish this with RTMP, my new inquiry is: how can this be accomplished with HTTP Live Streaming?
Also, something is puzzling me: we know that some real-time video applications (such as Skype and WebEx) do work on the iPad. I presume that, being real-time, those apps do not use HTTP Live Streaming. So, what do they use? How can they achieve near-zero latency, with H.264-like quality, on their respective iPad apps? Any thoughts on this?
Thanks in advance,
I add myself to the list of people that need a solution to this problem.
I need to make a videoconferencing system... latency is one of the issues, but decoding multiple H.264 streams from all connected clients is also a problem if I want to show all clients at the same time in real time.
I would also like to see H.264 via RTMP on iOS. I was checking the AIR roadmap and it doesn't say anything about it. I was happy to see some sort of quasi-threading in ActionScript coming late in 2012.
Me too. I can't even play back a VP6-encoded video delivered through RTMP on iOS 5 on my iPhone. Any hints?
There are strict rules with regard to what videos can be played over RTMP in iOS. I forget the exact numbers, but it's to do with the number of megabytes per minute, and the total running time of the video. It's quite likely that you're going to exceed the limits, so just don't even try RTMP. It's not an AIR issue, it's an iOS restriction.
So that means it's not possible to play back an RTMP live stream, even if it's not encoded with H.264? It's frustrating that it works in the simulator, but once you deploy it, it will not work...
Has anybody found a solution for this?
Is it possible to do a live HTTP stream inside an Air app for iOS for a videochat application?
So, as I understand it, there is no solution yet for playing live streams with cue points on iOS using an AIR app?
What about creating an ANE that plays the video natively on iOS and embedding it in the AIR app?