8 Replies Latest reply on Jul 17, 2015 7:39 AM by Bill Engeler

    Encoding quality CPU vs GPU


      This article http://www.studio1productions.com/Articles/PremiereCS5.htm says the GPU is not used at all for encoding/decoding, but every benchmark I've seen says otherwise. What's the official word on this?


      This post http://www.precomposed.com/blog/2013/06/ame-vs-premiere/ shows that the CPU-encoded output is far worse quality-wise (pic: http://www.precomposed.com/blog/blogstuff/2013/06/ame-vs-de-quality-tests_1.jpg ). It seems clear that the GPU is used, but how is it used, and why does the output differ so much between CPU-only and GPU (CUDA on)? Can anyone explain why the CPU output is so bad?
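
      One commonly cited explanation (an assumption on my part, not confirmed by Adobe in this thread) is that the CPU and GPU render paths use different resampling filters when scaling or deinterlacing frames, and the filter choice alone can account for visible quality differences. A minimal self-contained Python sketch, using hypothetical function names, of how point-sampling versus a simple averaging filter treats fine detail:

      ```python
      # Hypothetical illustration (not Adobe's actual pipeline): how the choice
      # of resampling filter alone changes output when downscaling.

      def downscale_nearest(signal, factor):
          # Point-sampling: keep every `factor`-th sample; fine detail aliases.
          return signal[::factor]

      def downscale_box(signal, factor):
          # Box filter: average each group of `factor` samples (anti-aliased).
          return [sum(signal[i:i + factor]) / factor
                  for i in range(0, len(signal) - factor + 1, factor)]

      # A 1-D "image row" with sharp alternating detail: 0, 100, 0, 100, ...
      src = [(i % 2) * 100 for i in range(16)]

      # Nearest-neighbor lands on the even samples only: the texture collapses
      # to a flat 0 and the detail is simply lost.
      print(downscale_nearest(src, 2))  # [0, 0, 0, 0, 0, 0, 0, 0]

      # The box filter averages each pair: the detail becomes a uniform
      # mid-gray, which is the correct band-limited result.
      print(downscale_box(src, 2))     # [50.0, 50.0, ..., 50.0]
      ```

      The same source produces two very different results purely from the filter choice, which is the kind of divergence the comparison screenshots show between the CPU-only and CUDA renders.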