
    2 Different Bit Depths… Which is the real one?

    steve.kr Community Member

      I apologize, this gets confusing. Reply and I'll try to clarify as best I can. I'll even submit the example files I've been experimenting with.


      In iTunes, I have been converting WAV files into Apple Lossless .m4a files (ALAC). Recently, some of the files I have converted have shown inconsistent bit depth values. Since I work in Adobe Audition CS6, I can output various formats: Format Settings of 32- or 64-bit (Integer or Floating Point IEEE), sometimes 24-bit Integer, and Sample Types from 44.100 kHz up to 192.000 kHz and up to 32-bit (float).
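
      In case it helps, here's a small Python sketch I can use to see what an .m4a actually contains, independent of what iTunes or Audition reports. It assumes ffprobe (from FFmpeg) is installed; for ALAC, the bits_per_raw_sample entry should be the bit depth that is actually stored:

          import subprocess
          import sys

          # Ask ffprobe (assumed to be installed) about the first audio stream.
          # For ALAC, bits_per_raw_sample is the bit depth actually stored.
          def probe(path):
              result = subprocess.run(
                  ["ffprobe", "-v", "error", "-select_streams", "a:0",
                   "-show_entries", "stream=codec_name,sample_rate,bits_per_raw_sample",
                   "-of", "default=noprint_wrappers=1", path],
                  capture_output=True, text=True, check=True)
              print(result.stdout.strip())

          if __name__ == "__main__":
              probe(sys.argv[1])

      Running it on one of the suspect files should print something like codec_name=alac plus the stored depth, which can then be compared against what iTunes' Get Info window claims.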


      Contradictory? YES. It sounds weird to have Format Settings of up to 64-bit Integer/Float, but Sample Types with various sample rates and almost always 32-bit (float). Could this mismatch be the reason for the inconsistent bit depth values?
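
      For what it's worth, a 32-bit float Sample Type isn't automatically at odds with integer Format Settings: a 32-bit float carries 24 significand bits, so any 16- or 24-bit integer sample survives a trip through float untouched, and only full 32-bit integers can lose precision. A quick check in Python (pure standard library, just illustrating the arithmetic, nothing about Audition's internals):

          import struct

          # Force a value through an actual 32-bit float and back.
          def through_float32(x):
              return struct.unpack("<f", struct.pack("<f", x))[0]

          # Every 24-bit integer sample survives the round trip exactly,
          # because a 32-bit float has 24 significand bits.
          for sample in (0, 1, -(2 ** 23), 2 ** 23 - 1):
              normalized = through_float32(sample / 2 ** 23)
              assert round(normalized * 2 ** 23) == sample

          # A full-scale 32-bit integer sample does NOT always survive:
          big = 2 ** 31 - 1
          assert round(through_float32(big / 2 ** 31) * 2 ** 31) != big
          print("24-bit ints round-trip through float32; 32-bit ints need not")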


      Another problem: iTunes has the ability to convert files to ALAC, among other formats. As it turns out, iTunes can convert to "32-bit," but only under specific circumstances. Working in Adobe Audition CS6, I have two options for "32-bit": 32-bit Integer and 32-bit Floating Point (IEEE). In order for iTunes to be able to maintain that bit depth, I need to select 32-bit INTEGER under Format Settings and 32-bit under Sample Type.

      When the file has been exported, I reopen it in Adobe Audition, where Bit Depth (in the "Files" box in the top right-hand corner) is suddenly listed as "32 (float)." All I did was export it, I didn't convert it, but despite specifically choosing "Integer," I have a 32-bit Float audio file. iTunes opens it just fine, calls it "32-bit" as well, and converts it to other formats as 32-bit (probably float) just fine. However, if I export an audio file with 32-bit Floating Point (IEEE) Format Settings from Audition, iTunes fails to convert it to 32-bit and instead converts it to 16-bit.
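
      To find out what an exported WAV really claims to be, one can read its fmt chunk directly rather than trust either program's label. Here's a rough standard-library Python parser; it assumes a well-formed RIFF file, and the field layout comes from the WAVE spec, including the WAVE_FORMAT_EXTENSIBLE case that 32-bit exports may use:

          import struct
          import sys

          FORMAT_TAGS = {1: "PCM (integer)", 3: "IEEE float"}

          # Report what a WAV file's fmt chunk actually claims.
          def describe_wav(path):
              with open(path, "rb") as f:
                  riff, _, wave = struct.unpack("<4sI4s", f.read(12))
                  if riff != b"RIFF" or wave != b"WAVE":
                      raise ValueError("not a RIFF/WAVE file")
                  while True:                          # walk chunks until fmt
                      header = f.read(8)
                      if len(header) < 8:
                          raise ValueError("no fmt chunk found")
                      chunk_id, size = struct.unpack("<4sI", header)
                      if chunk_id == b"fmt ":
                          data = f.read(size)
                          break
                      f.seek(size + (size & 1), 1)     # chunks are word-aligned
              tag, channels, rate, _, _, bits = struct.unpack("<HHIIHH", data[:16])
              if tag == 0xFFFE and size >= 40:
                  # WAVE_FORMAT_EXTENSIBLE: the real tag is the first 32 bits
                  # of the SubFormat GUID in the extension.
                  tag = struct.unpack("<I", data[24:28])[0]
              kind = FORMAT_TAGS.get(tag, "unknown tag 0x%X" % tag)
              print("%s: %d-bit %s, %d Hz, %d channel(s)"
                    % (path, bits, kind, rate, channels))

          if __name__ == "__main__":
              describe_wav(sys.argv[1])

      If the file exported with 32-bit Integer settings really is integer on disk, this should print PCM (integer); if it prints IEEE float instead, then "32 (float)" in the Files panel is describing the file itself, not just Audition's internal sample type.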


      Here's what's really odd: lately, every single ALAC file opens in Adobe Audition with a Bit Depth of "32 (float)."


      So... where are the flaws? What are the real bit depths? ALAC files with the same sample rate but different bit depths (according to iTunes) still show up as "32 (float)" in Adobe Audition. Is iTunes converting them all to 32-bit without knowing it?
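
      One way to test that theory would be to batch-check a folder of the converted files along the same ffprobe lines as above: if iTunes really stored different depths, the reported values should differ from file to file. A sketch (the folder path is just a placeholder):

          import subprocess
          from pathlib import Path

          # Batch-check a folder of converted files (path is a placeholder).
          # If iTunes really stored different depths, these values should differ.
          folder = Path("~/Music/converted").expanduser()
          for m4a in sorted(folder.glob("*.m4a")):
              result = subprocess.run(
                  ["ffprobe", "-v", "error", "-select_streams", "a:0",
                   "-show_entries", "stream=bits_per_raw_sample",
                   "-of", "default=noprint_wrappers=1:nokey=1", str(m4a)],
                  capture_output=True, text=True)
              print("%s: %s bits" % (m4a.name, result.stdout.strip() or "unknown"))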