1 Reply Latest reply on Oct 20, 2016 9:40 AM by Mylenium

    Creating a Stereo3D video for VR from an existing After Effects project...

    Karel Bata Level 1

      As an experiment I'm creating an immersive 3D video for VR from an existing After Effects project. My target platform is the Samsung Gear and similar.

       

      Obviously I go back into the AE file... and then what? LOL

       

      I can use SkyBox for stitching, but it doesn't 'create' separate L & R 360 stereo files. What I mean is that if you have the L & R cameras set up with lenses that give 360 views, and the two are separated horizontally (the interaxial, or IA), then it works OK looking forwards. But when the viewer looks to the side there is no IA separation, as the two images are overlaid (so no 3D), and looking backwards the IA is reversed, giving reversed 3D.


      So how do you render so that the IA axis is always at 90 degrees to the POV, generating the correct parallax and 3D everywhere? Is there a trick I don't know about, or will I have to render multiple cameras and stitch?

       

      Someone from the Foundry said they use a virtual slit-scan camera (not in AE) for each eye. I think that would be a formidable challenge in here....
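      For what it's worth, the slit-scan approach described there is usually called omnidirectional stereo (ODS): each output column of the panorama gets its own eye position, offset perpendicular to that column's viewing direction, so the stereo baseline stays at 90 degrees to the line of sight all the way around. A minimal sketch of that geometry (the IA value and the coordinate convention are assumptions, not anything AE or SkyBox provides):

      ```python
      # Sketch of per-column eye placement for omnidirectional stereo (ODS).
      # Each output longitude is rendered from its own "slit" camera whose
      # offset is always perpendicular to the viewing direction.
      import math

      IA = 0.064  # assumed interaxial distance in metres (typical human IPD)

      def ods_eye(longitude_rad, eye):
          """Return (eye_position_xz, ray_direction_xz) for one output column.

          eye = -1 for the left eye, +1 for the right. The eye sits on a
          circle of radius IA/2, offset at 90 degrees to the line of sight,
          so parallax is correct for every viewing direction.
          """
          # Viewing direction for this column (y-up, facing +z at 0 rad)
          dx, dz = math.sin(longitude_rad), math.cos(longitude_rad)
          # Offset: the viewing direction rotated 90 degrees in the xz plane
          ox, oz = eye * (IA / 2) * dz, -eye * (IA / 2) * dx
          return (ox, oz), (dx, dz)

      # Looking forward, the left eye is offset along -x, as expected;
      # looking 90 degrees to the side, the same eye is offset along +z,
      # so the baseline has rotated with the view instead of collapsing.
      front_left, _ = ods_eye(0.0, -1)
      side_left, _ = ods_eye(math.pi / 2, -1)
      ```

      Rendering one camera per column is obviously impractical by hand, which is why in practice people approximate it by rendering a ring of regular stereo camera pairs at different headings and stitching the results.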

       

      Below is a 2D 16:9 non-VR render of the original. It's a few years old - CS5 - and yes, it's Particular. Obviously there are a few issues - like possibly motion sickness(!!!). I'll slow it down a bit to tone down the motion, but I think it'll still be hardcore!

       

      Private video on Vimeo. Password is TTT

       

      Any ideas?

        • 1. Re: Creating a Stereo3D video for VR from an existing After Effects project...
          Mylenium Most Valuable Participant

          Proper stereo 3D on top of a VR projection would require a realtime solution. The field of view shifts around far too much to provide consistent separation and depth perception, especially near the edges, so you can't really bake these things into the actual footage. If at all, it's only possible for a fixed focus plane that causes minor differences between the channels, so it doesn't mess too much with adjacent regions that come into view as you pan around. The only other use case I can imagine without any form of realtime involvement would be some sort of linearly pre-defined path inside the VR environment, so you can pre-determine the separation at each step of the way - but even then there are serious limitations.

           

          Mylenium