Chris Cox wrote:
Try changing the GPU level in preferences, I'll bet the artifacts disappear in basic but show up in normal or advanced (posts 51 and 53).
Then we need to know your GPU and make sure you're using the latest driver.
Then from that we can try to reproduce the error.
That's right, the tiny transparency errors do not result in visible artefacts when the GPU level is "basic".
GPU is NVIDIA GeForce 320M in a mid-2010 Mac Mini running OS X 10.6.8 Snow Leopard with whatever driver is the most recently provided by Apple for that system.
Noel Carboni wrote:
Paulo or anyone experiencing the visibility of the transparency glitch... Does transparency otherwise seem to work normally?
It's almost hard to picture how it could, but I'm guessing it must or more major problems than this would be reported. It's almost as though the LS byte inaccuracy is getting combined into the MS byte.
Yes, transparency is normally OK. The glitch seems only to be a display artefact involving blending of the Transparency Grid with partially transparent image pixels.
It's weird that the error shows up in multiple GPU brands, but not with all driver versions.
It still seems to be in the shaders that convert color for the display, but I'll have to check with the GPU guys to see why it is so hit and miss.
Noel Carboni wrote:
... Does transparency otherwise seem to work normally?
A qualified yes! That is, nothing has presented itself that suggests any other problem with transparency. On the other hand, I have not run through a series of tests looking specifically for transparency problems. Also, I have seen no transparency issues, including the one that is the subject of this thread, in CS4 on the same hardware with the same drivers.
I may take a closer look if time permits.
FYI, I've just updated to ATI Catalyst 12.6 released drivers, and now I see the slight transparency glitch expressed without extreme enhancement.
Summarizing: Catalyst 12.2 did NOT do this, 12.6 DOES do this.
There seems to be yet another problem, or a related one, with the 16-bit mode in CS6. Rather than put this observation into a new thread, it will remain in this one, as I happened across it while dealing with the transparency issue discussed above.
I have been visualizing the problem in the transparency channel using layer blending modes rather than extreme adjustment layers - not for any good reason other than it seemed a bit easier for me to do. My recent favorite procedure to create and visualize the 16-bit interpolation artifacts, and those artifacts spilling into the transparency channel, has been to create a 100x100 pixel document and fill it with 50% gray. Using the Image Size command, the document is upsampled to 550x550 pixels, and that larger image looks normal to the unaided eye - uniform medium gray. That layer is then duplicated (Ctrl+J) and its blending mode is changed to Hard Mix. Voila, you have a dramatic rendering of the 16-bit mode problems, as shown here.
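For anyone wondering why a Hard Mix self-blend is such a sensitive probe, here's a minimal Python sketch. It assumes Hard Mix outputs white when base + blend reaches the white point and black otherwise, and that Photoshop's 16-bit/channel mode uses a 0..32768 scale - both are assumptions about internals, not confirmed behavior.

```python
# Sketch of why Hard Mix dramatizes tiny interpolation errors.
# Assumptions (not confirmed internals): Hard Mix outputs white when
# base + blend >= white point, else black; 16-bit mode uses 0..32768.
WHITE_16 = 32768

def hard_mix(base, blend, white=WHITE_16):
    """Threshold blend: white if the sum reaches the white point, else black."""
    return white if base + blend >= white else 0

mid_gray = WHITE_16 // 2                     # 16384, an exact 50% gray
print(hard_mix(mid_gray, mid_gray))          # 32768: exact gray over itself -> white
print(hard_mix(mid_gray - 1, mid_gray - 1))  # 0: a one-level resampling error -> black
```

A one-level error, utterly invisible on its own, flips the result from full white to full black, which is why the crosshatch shows up so dramatically.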
The new problem is that I do not think the layer math should be revealing the problem. If I execute the same process in 8-bit mode I get an all-white result, which is what I expect. To eliminate the interpolation and transparency issues I made a new document 550x550 pixels, filled it with 50% gray, did NO scaling, duplicated it and changed the duped layer's blend mode to Hard Mix. The result is an all-black image - again, not what one would expect. If I look at each of the three component channels they are each all white, but the composite channel view is black. What's that all about?
So the 16-bit mode in CS6 has interpolation problems, transparency channel problems and blend mode problems. I am not going to search for any other issues, nor will I just happen upon any other malfunctions, as I am now finished using 16-bit mode in CS6 until these issues are resolved, hopefully in the first update.
I think that's because the new Photoshop default for resampling is Bicubic Automatic. Worst idea Adobe ever came up with.
Try changing your default upsampling scheme to just plain Bicubic, Paulo. The problem goes away entirely.
As to why the channels show white instead of black, it's pretty clear they're using a cache level because they're so small, which causes the math to switch to 8 bit for layer compositing. If you want to eliminate THAT disparity, change the cache levels to 1 and the GPU mode to Basic, which causes all layer compositing for display to be done in the full 16 bits.
Since the problem resolves by going to 8 bit, and since I deliver my work to the client as 8 bits, it doesn't make a whit of difference. Also, printing is still 8 bit so I don't see any problems there either.
I did generate the pattern as outlined by the OP. I am running a Sapphire 7750, driver version 12.6. I can see the problem clearly just eyeballing it at 100%. The quality of the artifacts appears to be related to the Straighten angle; at 16.8° it appears as an interferometer pattern. Changing to 8 bits clears it up, and turning off the GPU in Prefs also eliminates the problem.
I know that there are still driver problems or at least the driver seems to be able to be corrupted if I use Sleep Mode. That has been fixed by reinstalling the driver and turning off Sleep. (It is a desktop, not a laptop).
There seems to be no difference between 12.2 and any other driver for this card, afaik, at this time.
I am eyeballing the black image only at this time. No tweaks to exaggerate the patterns.
Edit: FYI, I'm running a PC, Win7 64 Ultimate.
The PC consists of an Asus M4A77TD mobo, Athlon II X4 630 at stock CPU speed, 12 GB RAM.
Video Driver version atiumdag 220.127.116.11
Looks like I am the contrarian here.
Paulo, I cannot reproduce your effects. The image here is the 550 px upsample with the settings shown, superimposed over the supposedly crosshatched black, set to fill the screen.
1) 16 bit
2) Fill color set to 50 using Lab, corresponding to 100, 100, 100.
However, RGB in Color Settings is set to ProPhoto.
If I set it to sRGB and use RGB to set gray at 128, 128, 128, I now get pure white.
In fact, I get pure white with a variety of RGB settings except for 100 100 100, where it is pure black, ProPhoto or sRGB. The correspondence to L in Lab changes, depending on the Color Settings.
You can check this easily by selecting each fill layer thumbnail on the Layers palette and proceeding to change both layers to the same value. It's interesting to set each layer differently as well, but the only outcome that is pure black has 100, 100, 100 in either layer. IOW, if 100, 100, 100 is set on only one layer, the other can be anything and I get black.
The bottom line is I get either pure white or pure black. No crosshatch.
I finally got around to dealing with your posting Noel, sorry for the delay.
For the example I presented you have found a good workaround in choosing Bicubic for resampling. My goal, of course, is not to find workarounds but to generate more information to help Adobe fix any related issues. Also, if I change the example process from scaling up the mid-gray square to, let's say, transforming the square to a 45-degree rotation of itself (everything else being the same), then the problem is present even with Bicubic resampling for the transformation.
Changing the cache levels to 1 is not that good a workaround for me, as it completely turns off caching and thus affects the performance of other operations. It's a good piece of diagnostic information that you have uncovered, and I hope it quickens an Adobe repair. In the meantime I remain wary of using 16-bit mode in CS6.
Actually I don't consider the choice of Bicubic as a workaround, since I consider it a better method for resampling in general than either of the two "newer" methods (ESPECIALLY Bicubic Sharper, which adds horrendous sharpening artifacts).
I do hope Adobe gets to the bottom of this display inaccuracy issue though (which is apparently display driver-specific). It's not nice to be deceived by the document display.
Given that the AMD folks are not releasing new drivers very often any more, it'll probably be a while before we see a newer driver for ATI cards, and I want to believe that Adobe will be able to find a different way to use the GPU facilities so as to avoid these visible inaccuracies.
My intuition tells me that the document quality itself is not adversely affected by the use of 16 bits/channel mode - that this is just a display issue. But I respect your wanting to be wary of it.
I cannot reproduce this problem with the 13.0.1 code and ATI Catalyst 12.8. I'm wanting to call it fixed in all facets.
Anyone else care to try?
Never mind, motivated by your response I went back to trying it again.
I managed to reproduce it with 13.0.1. I don't know what I missed or did wrong before.
Please scratch what I said in post 81 above.
Apparently the problem still exists. But now I can create it using simple Bicubic resampling.
I did happen upon one other tidbit of information...
The "pattern" of visible transparency on the screen is ONLY visible to me if the layer with the minor transparency glitches is above nothing - i.e., it's the process that's trying to make the checkerboard show through that's in error.
It is NOT visible when rendering the layer with the slight transparency glitches above a solid layer (e.g., a solid fill layer). It only becomes visible in this case with sets of extreme curve adjustment layers added over the top, at which point the color can be seen showing through.
I gave it a try and I can see it without any extreme adjustments like in curves. It's a slight brown patterning on the black.
Now this monitor has excellent black separation so maybe I can see it w/o extracting the pattern with curves.
Also, it's a function of the actual pixel density. I tried it with a 10,000x10,000 px dimension and I could not see it at any magnification. At 600%, I suddenly get a fly-screen-looking pattern that doesn't resolve by going to 8 bit. My first try was 200x200 px, then 10,000x10,000.
I also tried it with the Graphics Processor disabled, and no patterns showed up (200x200 px).
I don't understand where Chris is coming from. There is no transparency layer when I create the test object.
I'm making a new file, 800x800, 16-bit, transparent, filling it with black and rotating with the crop tool. The GP is enabled. Bingo! Suit patterns! Disable the GP, try again, suit patterns gone.
Noel Carboni wrote:
Apparently the problem still exists. But now I can create it using simple Bicubic resampling.
Noel, closely check CS5.1 again to see whether regular Bicubic had the problem there, anyway. A specific example is resizing 256 x 256 to 400 x 400.
I see nothing using resampling.
Here are example screenshots:
1. The original document is 256 x 256 pixels, 16 bpc, Adobe RGB 1998. The one black layer (there is no Background layer) cannot be distinguished from the black window background.
2. The document is "Bicubic" resampled to 400 x 400 pixels. A network of horizontal and vertical lines where the transparency grid wrongly shows is barely visible.
3. Photoshop displaying screenshot 2 with a region given an exposure enhancement in case anyone's monitor or vision is preventing perception of the lines in screenshot 2.
OK, I simply used Background. Changing it to a Layer (Layer0) I can reproduce it ...
Yes, I suspected that you were using Background and that's why I made a point of stating that there was no Background layer in my example. Background never has any transparency and so it will always block the transparency grid (gray and white checkered display by default) from being visible.
... except Exposure does not enhance the image. No change.
My third screenshot showed an exposure increase being applied to the second screenshot. The exposure adjustment was not being applied in the actual document with the resampled layer.
But, if you change from 16 bit to 32 bit the lines disappear.
If I change mode from 16-bit to 32-bit, the lines become fainter, but they definitely do not completely disappear on my monitor.
I suspect these are artifacts and not real. Passing the eyedropper over the image showed no change from 0,0,0. So I built a grid and set its values to L=10, a=0, b=0. It looked quite close in values to the grid generated with the upsampling. Again, the eyedropper shows no change in the values as you pass over the grid. Of course, the grid doesn't print with the image if it is left on when printing, so I would suspect this may also be true of these artifacts.
Inasmuch as the image pixel sizes with which I generally work are way larger than 256 or even 400 px, and inasmuch as I see no evidence of this type of grid on a 50M image at 100%, it seems moot to me. But it became of interest when I saw that the effect shows up when the Background is changed to Layer 0 - and Layer 0 is what happens to the Background if you uncheck "Delete Cropped Pixels" - so of course I am interested.
So, hopefully this won't actually be a problem with real-world image sizes, and if it is, I'll flatten everything before final output and save it as a flattened version.
There's an explanation for this display problem in the thread. It appears that some GPUs are incorrectly blending the transparency grid with pixels which have a transparency of a tiny fraction of one percent. These pixels really should not have any transparency but tiny transparency errors are arising in bicubic resampling. The transparency is so minuscule that it shouldn't be visibly different to full opacity. The problematic GPUs are erroneously hugely magnifying these minuscule transparencies when calculating the display for your monitor.
The RGB values of pixels in the resampled layer will not reveal anything. A pixel's RGB value is independent of its transparency.
Don't worry about your documents. They will be OK.
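To put rough numbers on that explanation, here's a sketch of a *correct* "over" composite of a nearly-opaque black pixel onto a light square of the transparency grid. The 0..32768 alpha scale, the checkerboard value, and the 3-level alpha deficit are illustrative assumptions, not Photoshop's actual internals.

```python
# Correctly compositing a nearly-opaque black pixel over a light square of the
# transparency grid. Scale and values are illustrative assumptions.
ALPHA_MAX = 32768

def over(src, src_alpha, dst, alpha_max=ALPHA_MAX):
    """Standard 'over' operator: src on top of dst with normalized alpha."""
    a = src_alpha / alpha_max
    return round(src * a + dst * (1 - a))

black = 0
checker = 26214                         # ~80% gray grid square, on a 0..32768 scale
leak = over(black, ALPHA_MAX - 3, checker)
print(leak)                             # 2: the grid leaks through by ~2 levels of 32768
print(round(leak / ALPHA_MAX * 255))    # 0: invisible after 8-bit display quantization
```

Done correctly, the leak rounds to zero in 8 bits, so any visible checkerboard means the display path is hugely magnifying the error rather than merely passing it through.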
Yes, I saw the explanation, and I wanted to check it out for myself, especially as to whether they make a real-world difference, particularly in the darker values, which tend toward more noise problems.
I never take anyone's advice about what may happen to my docs. I'll make every effort to check it out myself. One learns much in doing so!
Thanks for the feedback.
Summarizing, there seem to be two problems:
1. Some resampling operations are leaving the transparency for a given layer at a value of something less than fully opaque. A few levels out of 32769.
2. Some video cards / display drivers are making that (barely) visible. A few levels out of 256.
It probably all boils down to roundoff error at two different stages, one with the 16 bit math and one with the 8 bit math used to combine things for display.
I should have thought that Photoshop would have best-in-show algorithms internally that don't do this, and in fact it used to be true. What's amazing to me is that Adobe didn't see fit to fix either of these issues in 13.0.1.
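One hypothetical failure mode consistent with those numbers (and with the earlier LS-byte/MS-byte guess) is the display path truncating 16-bit alpha to its low byte instead of rescaling it. This is pure speculation for illustration - neither Adobe's nor any driver's confirmed code:

```python
# Speculative sketch: low-byte truncation would magnify a tiny alpha deficit.
# This is an assumed failure mode, not confirmed Adobe or driver behavior.
alpha16 = 32768 - 3                      # resampling left alpha 3 levels short of opaque

correct8 = round(alpha16 / 32768 * 255)  # 255: correctly rescaled, still fully opaque
buggy8 = alpha16 & 0xFF                  # 253: low-byte truncation leaves a
                                         # visible 2-level transparency
print(correct8, buggy8)
```

That would turn "a few levels out of 32769" into "a few levels out of 256" - roughly a 128x magnification, which matches the barely-visible grid people are seeing.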
I don't think Adobe can be held responsible for video drivers making errors.
More to the point, these errors contaminate the 16-bit presentations, not 8 nor 32. So it isn't simply a matter of changing afterwards; starting with either 8 or 32 bit excludes the contamination. Further, this action is unidirectional - that is, reducing the size does not produce the artifact, but increasing it back to the original does. Also, if the multiplier is a whole number (2x, 3x, etc.) the effect isn't present.
As to large starting pixel dimensions, it appears not to be present - but increasing the magnification past 65% does show it up. Example: start at 2000x2000 px. Increase size to 3521 px. The image looks clean. Magnify to 66+% and there is the pattern.
So it is insidious! Checking sharpening at 100% after a size increase other than a whole-number multiple will show up these artifacts, possibly confusing one's judgement of real artifacts in the shadows.
Proceed to go nuts trying to correct it!
BTW, I still cannot use Exposure or Curves to amplify these artifacts. There is no change; the background remains dark unless I use the gamma slider, at which time the background changes to gray with no artifacts showing. I must be missing a step or two.