CS6 clips the histogram (black and white point) when reducing a 16-bit TIFF from 7000px wide to, say, 1280px wide. CS5 does not, using the exact same steps. Windows 7 Ultimate 64-bit, i7, 24GB RAM.
Has anybody else had this issue? Is there a setting in CS6 to change the reduction algorithm (binning, bicubic, or other) to whatever CS5 uses?
There is a dropdown at the bottom of the Image Size dialog to select Nearest Neighbor or one of the Bicubic options, if that's what you mean. This hasn't changed between CS5 and CS6 that I know of.
Exactly what steps are you using, and what sort of image is in the 7000px TIFF? Is it a black-and-white engraving or halftone image?
Are you using a different RGB-workspace color profile between CS5 and CS6?
Thanks for the quick reply. Indeed, in CS6 the dropdown defaults to Bicubic Automatic. I never noticed it before, TBH. Changing it to Bicubic (best for smooth gradients) sorted out the issue, although it reverts to Automatic the next time the dialog opens. No clipping now. In CS5 it defaults to Bicubic (best for smooth gradients), which is why it worked in one but not the other.
The CS6 factory default of Bicubic Automatic applies Bicubic Sharper when reducing pixel count and Bicubic Smoother when increasing it. Sharper tends to create hideous oversharpening halos (hence the clipping you observed in the histogram), but, hey, Adobe says that's best [rolls eyes].
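To see why a sharpening resample clips the histogram while a plain one doesn't, here's a rough sketch in Python/NumPy. This is not Adobe's actual algorithm; the box-average downscale and crude unsharp mask are stand-ins I made up to show the overshoot/undershoot effect at edges:

```python
# Illustration only: sharpening after a downscale overshoots at edges,
# pushing values past the 16-bit black/white points (0 and 65535).
import numpy as np

# A smooth 16-bit gradient with a bright band, creating hard edges.
x = np.linspace(0.2, 0.8, 4096)
img = np.tile(x * 65535.0, (64, 1))
img[:, 2000:2100] = 0.95 * 65535  # bright band = two sharp edges

def box_downscale(a, factor):
    """Plain averaging reduction (stand-in for a smooth resample)."""
    h, w = a.shape
    return a[:, : w - w % factor].reshape(h, -1, factor).mean(axis=2)

def sharpen(a, amount):
    """Crude 1-D unsharp mask along rows (stand-in for a sharpening resample)."""
    blurred = (np.roll(a, 1, axis=1) + a + np.roll(a, -1, axis=1)) / 3.0
    return a + amount * (a - blurred)

small = box_downscale(img, 8)
plain = np.clip(small, 0, 65535)
sharp = np.clip(sharpen(small, amount=4.0), 0, 65535)

# Count pixels slammed against the black or white point.
print("plain resample, clipped pixels:",
      int(((plain == 0) | (plain == 65535)).sum()))
print("sharpened resample, clipped pixels:",
      int(((sharp == 0) | (sharp == 65535)).sum()))
```

The plain downscale stays strictly inside the original tonal range (averaging can never exceed it), while the sharpened version overshoots at every edge and gets clamped, which is exactly the spike at the ends of the histogram.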
Change the default resampling to plain Bicubic in Preferences > General.
Note that CS6's Transform mode has an independent resampling control in the Options bar when a raster layer is targeted. The default for that is also Bicubic Automatic, so you probably want to set it to plain Bicubic, too.