Please see the note at the bottom if you think this question has already been answered.
Is there a detailed document that explains which iOS APIs ActionScript/Flex apps can use? I assume it would be slow, but is it possible to use the microphone or camera to develop applications? Can I process real-time camera input for simple or complex image processing? Is it possible to build augmented reality apps, for example? Or is mobile support limited to more generic applications?
It is quite an exciting idea to be able to carry an investment in Flex over to mobile, but I have not been able to find documentation that explains what is possible and what is not.
PS: I have tried searching for various keywords using the search facility on these pages, but I get zero results, even for the exact wording of existing thread topics...
So, as it currently stands, ActionScript code cannot call native APIs directly on a mobile device (or on any machine, as far as I know).
However, the native APIs are exposed through the AIR runtime. This level of abstraction is what lets applications move easily between different platforms (desktop, Android, iOS, PlayBook, etc.).
I recommend checking out the documentation on building mobile applications with Flash Builder and Flex: http://help.adobe.com/en_US/flex/mobileapps/index.html
This link may also help: http://help.adobe.com/en_US/as3/iphone/WS789ea67d3e73a8b24b55b57a124b32b5b57-7ffe.html
I haven't delved deep into the documentation myself, but I would expect that microphone access is there, and I know camera access is there. I see no reason you wouldn't be able to build an augmented reality app.
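For example, here's a rough, untested sketch of grabbing live camera frames into a BitmapData for per-pixel processing, using the standard flash.media classes that AIR exposes. (Whether the live Camera class, as opposed to the still-capture CameraUI, is supported on your AIR-for-iOS version is something to verify in the docs above.)

    package {
        import flash.display.Bitmap;
        import flash.display.BitmapData;
        import flash.display.Sprite;
        import flash.events.Event;
        import flash.media.Camera;
        import flash.media.Video;

        public class CameraGrab extends Sprite {
            private var cam:Camera;
            private var video:Video;
            private var frame:BitmapData;

            public function CameraGrab() {
                // getCamera() returns null if no camera is available
                cam = Camera.getCamera();
                if (cam == null) return;
                cam.setMode(320, 240, 24); // width, height, fps

                video = new Video(320, 240);
                video.attachCamera(cam);
                addChild(video);

                // Copy each frame into a BitmapData so its pixels
                // can be read and processed
                frame = new BitmapData(320, 240, false);
                addEventListener(Event.ENTER_FRAME, onEnterFrame);
            }

            private function onEnterFrame(e:Event):void {
                frame.draw(video); // grab the current frame
                // ...per-pixel work here, e.g. frame.getPixel(x, y),
                // or filters for simple image processing
            }
        }
    }

Microphone access should look similar: Microphone.getMicrophone() plus the sampleData event if you need the raw audio.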
Thanks, this really helped.
I assume that every time a Flex app is deployed to iOS, the whole AIR runtime is deployed with it, since Apple does not permit runtimes on iOS, i.e., apps hosting other apps.
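From what I've read in the packaging docs, that seems right: ADT cross-compiles everything into a single standalone .ipa, so there is no shared runtime on the device. The invocation looks something like this (the certificate, profile, and app names here are just placeholders; check the docs for the exact flags):

    adt -package -target ipa-test
        -provisioning-profile myProfile.mobileprovision
        -storetype pkcs12 -keystore myCert.p12
        MyApp.ipa MyApp-app.xml MyApp.swf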
Any early feedback on the performance of apps built with Flex? Are they slower than native iOS apps?