
Adobe Air 3.0 iOS streaming H264/Speex over RTMP -- No Video

Oct 25, 2011 10:39 AM

Tags: #video #flash_builder #iphone #ios #ipad #h264 #rtmp #stagevideo #h264videostreamsettings #adobe_air #h263

Problem:

H264/Speex RTMP stream from Flash Media Server doesn't display video.  Audio plays fine.

 

Conditions:

Adobe Air 3.0

iOS device (iPad 2, iOS 4.3)

 

App settings:

<renderMode>direct</renderMode>

 

Flash Builder 4.5.1 compiler settings:

-swf-version=13

-target-player=11.0.0

 

 

I have tried using both StageVideo and the regular Video object to render an H264 stream coming from Flash Media Server over RTMP, but with no luck.  I can hear the audio, but the video is never rendered.

 

I CAN see H263 video when streamed over RTMP using this setup (with just the Video object). 

I can also stream a locally stored mp4 (H264/AAC) file over RTMP from the iOS device and play it locally just fine (using a StageVideo object).

 

I have attempted this with both StageVideo (which works fine when streaming an mp4 file from the iOS device) and with the regular Video object (which handles H263 just fine streaming down from FMS over RTMP).  I've also played around with backgroundAlpha = "0" for this case, all with no luck.

 

I've followed all the online instructions and tutorials to get this working for the past week, but have not had any luck.  I'm happy to post more code, but this approach is fairly straightforward and has been described multiple times around these forums and in the Adobe blog posts.  It goes like this (a consolidated code sketch follows the steps below):

 

     - instantiate NetConnection (_nc) to FMS and wait for successful connection:

               _nc = new NetConnection();  

               _nc.connect(serverAddress);

     - instantiate NetStream (_ns) and connect using NetConnection: 

               _ns = new NetStream(_nc);

     - set up a listener for StageVideoAvailabilityEvent; if StageVideo becomes available, attach _ns: 

              _stageVideo = stage.stageVideos[0];

              _stageVideo.attachNetStream(_ns);

     - if StageVideo ISN'T available, fall back to a Video object:

               _video = new Video();

               _video.attachNetStream(_ns);

               stage.addChild(_video);

     - play the stream on the NetStream:

               _ns.play(appName);
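
Here is the consolidated sketch of those steps, a minimal version of what I'm running (the class name, server URI and stream name are placeholders, not my real values; the NetStatus handling and the StageVideo availability fallback are written out explicitly):

    package
    {
        import flash.display.Sprite;
        import flash.events.NetStatusEvent;
        import flash.events.StageVideoAvailabilityEvent;
        import flash.geom.Rectangle;
        import flash.media.StageVideo;
        import flash.media.StageVideoAvailability;
        import flash.media.Video;
        import flash.net.NetConnection;
        import flash.net.NetStream;

        public class StreamPlayer extends Sprite
        {
            // Placeholders -- substitute your own FMS URI and stream name.
            private static const SERVER:String = "rtmp://yourserver/yourapp";
            private static const STREAM:String = "yourStream";

            private var _nc:NetConnection;
            private var _ns:NetStream;
            private var _stageVideo:StageVideo;
            private var _video:Video;

            public function StreamPlayer()
            {
                _nc = new NetConnection();
                _nc.addEventListener(NetStatusEvent.NET_STATUS, onNetStatus);
                _nc.connect(SERVER);
            }

            private function onNetStatus(e:NetStatusEvent):void
            {
                if (e.info.code == "NetConnection.Connect.Success")
                {
                    _ns = new NetStream(_nc);
                    // A client object prevents "onMetaData not found" async errors.
                    _ns.client = { onMetaData: function(md:Object):void {} };

                    // Ask the runtime whether StageVideo is available; the event
                    // fires as soon as the listener is added.
                    stage.addEventListener(StageVideoAvailabilityEvent.STAGE_VIDEO_AVAILABILITY,
                                           onStageVideoAvailability);
                    _ns.play(STREAM);
                }
            }

            private function onStageVideoAvailability(e:StageVideoAvailabilityEvent):void
            {
                if (e.availability == StageVideoAvailability.AVAILABLE && stage.stageVideos.length > 0)
                {
                    _stageVideo = stage.stageVideos[0];
                    _stageVideo.viewPort = new Rectangle(0, 0, stage.stageWidth, stage.stageHeight);
                    _stageVideo.attachNetStream(_ns);   // attachNetStream() returns void
                }
                else
                {
                    // Fall back to a regular Video object on the display list.
                    _video = new Video(stage.stageWidth, stage.stageHeight);
                    _video.attachNetStream(_ns);
                    stage.addChild(_video);
                }
            }
        }
    }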

 

My renderMode is set correctly ("direct"), and from what I can tell, this isn't a backgroundAlpha issue.

 

I have seen this problem described elsewhere on these forums and on the net. 

 

I'd like to know from Adobe: are there any compatibility issues with this approach?  There doesn't seem to be any type of compatibility chart listing which video codecs and transport protocols are available on the different mobile platforms.  Since mobile deployment is so heavily fragmented, a descriptive chart or table like this would be helpful for those of us developing -- at least for the main mobile platforms predominantly in use.

 

Other forum posts similar to this problem:

 

http://forums.adobe.com/message/3981541#3981541

http://forums.adobe.com/message/3954578#3954578

 

 

Thanks in advance!

 
Replies
  • Nov 9, 2011 8:32 AM   in reply to craiglewiston

    I'm also interested in this issue.

     

    Using Wowza, I can play HLS H.264 live streams in Safari under iOS 4, but how do I do it in AIR 3?

     

    I tried

     

      <s:VideoDisplay source="http://server/rtmp/vv12/playlist.m3u8" />

     

    and also

     

              <s:VideoDisplay source="rtmp://server/rtmp/vv12" />

     

    but no video is displayed.

     
  • Dec 22, 2011 6:58 AM   in reply to craiglewiston

    Hi Craig,

     

    I've recently run into the same problem trying to play embedded H264 video inside AIR for iOS. The same code works for an FLV file (using the NetStream class), but when I try to point to an F4V video, everything keeps working fine when debugging on the PC, yet the app on the iPad doesn't display the video.

     

    Here is a piece of the code I've used. Have you solved that problem?

    Thanks in advance for your support.

     

    AIR3.0 SDK / FLASH CS 5.5 PROJECT / FLASHDEVELOP CLASSES

    The following code works well with an FLV file.

     

    _movieVideo = new Video();

    _movieVideo.smoothing = true;

               

    var connection:NetConnection = new NetConnection();

    connection.connect(null);

               

    _movieStream = new NetStream( connection );

    _movieStream.addEventListener(NetStatusEvent.NET_STATUS, checkLoop );

    _movieStream.bufferTime = 0;

    _movieStream.inBufferSeek = true;

               

    var client:Object = new Object();

    _movieStream.client = client;

    client.onMetaData = onMetaData;

               

    _movieVideo.attachNetStream( _movieStream );

               

    _movieStream.play( sPath ); // as path I've tried with "media/ap/ap.f4v" and "mp4:media/ap/ap.f4v" and "media/ap/ap.mp4" but NOTHING!
                                // those files are in the same folder as the "media/ap/ap.flv" that WORKS WELL!

    _movieStream.seek( nSeek );

               

    _movieVideo.width = this.width;   // the container is a MovieClip previously sized to 576x768 and, I repeat, this code works well with FLV

    _movieVideo.height = this.height;

    _movieVideo.x = -this.width / 2;

    _movieVideo.y = -this.height / 2;

               

    this.addChild( _movieVideo );

     
  • Jan 11, 2012 7:55 AM   in reply to craiglewiston

    Just curious, does it depend on whether renderMode is GPU or CPU?  Camera docs say they can't display output on iOS while in GPU mode, and the GPU mode notes say Adobe recommends not using GPU mode for video apps.

     
  • Jan 11, 2012 8:23 AM   in reply to craiglewiston

    Hi Craig, thanks for your answer.

    Unfortunately the problem affects real-time streaming using local or remote files. It's a big problem; for now I've rendered all my files as FLV, but the quality isn't comparable with the H264 version and the files are a little bit larger!

     

    Any chance that this incredible bug will be fixed in a future version? (if there will be another update to AIR for iOS after Adobe's recent position regarding AIR for mobile)

     

    ---------------------

     

    Hi kerpoof_dev, this problem affects both GPU and CPU render modes. The only way to render the H264 video is to use the StageVideo class, which is very fast but has many limitations.

     
  • Jan 17, 2012 9:22 AM   in reply to craiglewiston

    Hello,

     

    @craiglewiston

    Do you know if RTMFP and H264 will be supported in upcoming AIR releases, or will H264 over RTMFP never work in AIR on iOS?

     

    best regards

     
  • Feb 10, 2012 9:52 AM   in reply to craiglewiston

    I do hope someone from Adobe is "hearing" this, guys. The lack of RTMP-based H.264 video in AIR for iOS is a major problem, indeed.

     

    As Fabio Sonnati mentioned in http://sonnati.wordpress.com/2011/04/26/air-2-6-for-ios-and-video-playback/, AIR for iOS does support HTTP streaming (via HLS) of h.264 videos. However, when streaming via RTMP, AIR for iOS only supports VP6 and Spark – a couple of old, retired codecs.

     

    While HTTP streaming (HLS) seems to be a good option for those who simply want to “play a video” in iOS, I do believe it has some severe limitations, especially for live-communications. I’d like to share some of these thoughts with you.

     

    1. HLS has ridiculously high latency for live videos (around 40 seconds), when compared to RTMP. Although this may not be a problem for on-demand videos, it sure is a great problem for anyone doing serious live-communications applications (such as webconferencing, live webcasting with audience interaction or Skype-like video chats), which require near-zero latency.

     

    2. Perhaps someone can correct me on this (hopefully!), but as far as I know, HTTP streaming will not allow cue points to be read from videos. This is particularly painful for anyone doing video-triggered actions, such as slide changes (for webinar apps), subtitling or live closed captioning, etc. I read somewhere that the OSMF player allows cue points (or "temporal metadata"; see http://blogs.adobe.com/osmf/2009/11/cue_point_support_in_osmf.html), but I haven't been able to test it myself. (A minimal sketch of how cue points are normally read from an RTMP NetStream follows this list.)

     

    3. Although HLS is quite compatible with firewalls (since it flows through port 80), RTMP with tunnelling also flows through port 80 or 443, which adds great compatibility, even on very restricted networks. Our experience with very large clients proves that, hands down.
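
    To illustrate point 2, here is a minimal sketch of the kind of cue point handling I mean, read from an RTMP NetStream via its client object (the NetConnection is assumed to be already connected, and the stream name is a placeholder):

    import flash.net.NetConnection;
    import flash.net.NetStream;

    var nc:NetConnection;   // assume an already-connected NetConnection to the media server
    var ns:NetStream = new NetStream(nc);
    ns.client = {
        onMetaData: function(info:Object):void {},
        onCuePoint: function(cue:Object):void {
            // cue.name, cue.time and cue.parameters carry the data embedded (or injected
            // server-side) into the stream -- e.g. trigger a slide change or caption here.
            trace("cue point: " + cue.name + " at " + cue.time);
        }
    };
    ns.play("liveStream");   // placeholder stream name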

     

     

    In other words, HTTP/HLS streaming is OK. But it simply does *not* fit into every shoe that RTMP does. We do believe that RTMP remains our best option for live streaming or serious streaming-oriented *apps* (in which things more complex than “mere video playing in a window” actually happen).

     

    That all said, I do believe we should let Adobe know about this need. The fact that RTMP streaming in AIR for iOS is limited to VP6 and Spark, which are two “dead” codecs, still puts us, Air developers, in a very fragile position in terms of what we can accomplish with video in iOS.

     

    I’m sure some of you cheered when you heard about Flash Player 11 having h.264 video encoding. This (plus the echo cancellation feature that came in 10.3) opened great doors for great Unified-Communication applications to be developed for Flash/Air. Now, it’s undeniable that clients want those applications running on tablets, especially the iPad.

     

    Not being able to use h.264 via RTMP on iOS is certainly a huge step backwards. Does anyone share this same opinion? What do you guys believe to be the best option to let Adobe really know about this need? Is this limitation a simple lack-of-a-feature (which can be fixed by Adobe) or is this something imposed by Apple?

     

    Just one final note: Air for Android does *not* have the same limitation. It does allow RTMP streaming of h.264.

     

    Thanks for your attention,

     

    Helder Conde

     
  • Feb 10, 2012 10:05 AM   in reply to Helder Conde

    The requirement to use HTTP is an Apple one. I forget at what size it's a requirement, but it's a short enough time that most streaming applications would exceed it quite quickly. Here's the Apple article on it:

     

    https://developer.apple.com/library/ios/#documentation/networkinginternet/conceptual/streamingmediaguide/UsingHTTPLiveStreaming/UsingHTTPLiveStreaming.html

     

    It seems that if your video is over 10 minutes, or it delivers over 5 MB in 5 minutes, it must be HTTP Live Streaming. Shorter clips can be progressive downloads. With those requirements it would be tough for Adobe to justify RTMP support for the sake of people with low data rate, short movies, so instead they added HTTP support to Flash Media Server.

     
  • Feb 10, 2012 10:21 AM   in reply to Colin Holgate

    Colin,

     

    Thanks for the (very!) quick response!

     

    I understand this is imposed by Apple. The problem is that HTTP Live Streaming, as I mentioned on my previous post, currently does not provide the same features as RTMP does... Again, it's cool for those doing latency-independent apps, in which video is an entity by itself.

     

    But, let's say I need some cue points in a *live* video to trigger, for example, an image change, or any other events in the app. That is pretty straightforward with RTMP, for example.

     

    Ok, so I'll change my question. Instead of trying to accomplish this with RTMP, my new inquiry is: how can this be accomplished with HTTP Live Streaming?

     

    Also, something is puzzling me: we know that some real-time video applications (such as Skype and WebEx) do work on the iPad. I presume that, being real-time, those apps do not use HTTP Live Streaming. So, what do they use? How can they achieve near-zero latency, with H.264-like quality, on their respective iPad apps? Any thoughts on this?

     

    Thanks in advance,

     

    Helder Conde

     
  • Mar 10, 2012 6:53 AM   in reply to Helder Conde

    I add myself to the list of people that need a solution to this problem.

     

    I need to make a videoconference system... latency is one of the issues, but decoding multiple H264 streams from all the connected clients is also a problem if I want to show all clients at the same time in real time.

     
  • Mar 14, 2012 10:53 AM   in reply to Dario Tolio

    I would also like to see H.264 via RTMP on iOS. I was checking the AIR roadmap and it doesn't say anything about it on there. I was happy to see some sort of quasi-threading in ActionScript coming late in 2012.

     
  • Apr 15, 2012 10:28 AM   in reply to craiglewiston

    Me too. I can't even play back a VP6 encoded video delivered through RTMP on iOS 5 on my iPhone.  Any hints?

     
  • Apr 15, 2012 10:34 AM   in reply to AS2Xtreme

    There are strict rules with regard to what videos can be played over RTMP in iOS. I forget the exact numbers, but it's to do with the number of megabytes per minute, and the total running time of the video. It's quite likely that you're going to exceed the limits, so just don't even try RTMP. It's not an AIR issue, it's an iOS restriction.

     
  • Apr 15, 2012 10:37 AM   in reply to Colin Holgate

    So that means it's not possible to play back an RTMP live stream, even if it is not encoded with H264?  It's frustrating that it works in the simulator, but once you deploy it, it will not work...

     
  • Jul 31, 2012 11:12 AM   in reply to AS2Xtreme

    Has anybody found a solution for this?

    Is it possible to do a live HTTP stream inside an AIR app for iOS for a videochat application?

     
  • Oct 30, 2012 5:40 AM   in reply to Jan Poehland

    So as I understand it, there is no solution yet for playing a live stream with cue points on iOS using an AIR app?

    What about creating an ANE that plays video natively on iOS and embedding it into the AIR app?

     
