1 Reply Latest reply on Dec 24, 2013 10:09 AM by mikepucher

    Most efficient way to read accurate 16 bit color data?

    mikepucher

      I'm working on some digital imaging & color management research using Photoshop, and I'd like to optimize for both speed and accuracy. I'm aware of command-line tools like ImageMagick, but since we're working with custom profiles and our end users will be using Photoshop, I'd like to stay within the Adobe CMM ecosystem so that our testing reflects real-world usage.

       

      I'm working with high-res 16-bit RGB and Lab files. I'm interested in gathering data from images of ColorChecker SG target cards. I've tried scripting two ways to read target color information:

       

      • Color Samplers
        • Pros: Accurate 16 (aka "15+1") bit readout of color. Lab values with decimal precision.
        • Cons: Each color sampler is only a point sample. I've scripted out creating an average readout of various pixels, but it is slow.
          • Checking one point sample per square gives an inaccurate result, but takes only about 30 seconds
          • Checking/averaging a 3x3px area takes 13 seconds/square X 140 squares on the chart (~30 mins/chart)
          • Checking/averaging a 50x50px area takes 400 seconds/square X 140 squares (~15 hours/chart)
            • There is an average ∆E of 0.26 between the 3x3 and 50x50 readings, indicating that a larger sample is better, though 50x50 is probably overkill. However, I'm looking to process multiple charts daily, so even if the 3x3 were accurate enough, it would still be slower than I'd like.
      • Histogram
        • Pros: Fast. Takes 30 seconds to read 140 patches, averaging an area of 50x50px per reading
        • Cons: The histogram data is only 8 bits, even for a 16 bit original. Less accurate recording of color data.
          • Comparing the histogram data to the 50x50 color sampler data yields an average ∆E = 0.753, min ∆E = 0.56, max ∆E = 0.99
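
      For reference, the ∆E figures above are straightforward to reproduce from any two Lab readings. As a side note (outside the AppleScript workflow itself), here's a minimal sketch of the CIE76 formula in Python; the function name and sample values are my own, not from my scripts:

```python
import math

def delta_e_cie76(lab1, lab2):
    """CIE76 color difference: Euclidean distance between two Lab triples."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# e.g. a 3x3 average reading vs. a 50x50 average reading of the same patch
print(delta_e_cie76((52.0, -24.1, 18.3), (52.2, -24.0, 18.1)))
```

      (CIE76 is the plain Euclidean metric; if your workflow uses ∆E2000 instead, the formula is considerably longer, but the comparisons above behave the same way.)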

       

      So really, I'd like the speed of the histogram matched with the accuracy of the 16-bit color sampler... Any thoughts? I'm currently using AppleScript because I find it easier to get things up and running, though JavaScript is a possibility.

       

      Here's a snippet of code from the color sampler approach. I think the loop itself is pretty lean, but perhaps there's something that could speed it up?

       

          -- walk a sampleSize x sampleSize grid, summing the Lab reading at each pixel
          repeat sampleSize times
              repeat sampleSize times
                  tell current document
                      set theSampler to make new color sampler with properties {class:color sampler, position:{currentX, currentY}}
                      set theValues to color sampler color of theSampler
                      delete theSampler
                  end tell
                  set valueLsum to valueLsum + (value_L of theValues)
                  set valueAsum to valueAsum + (value_a of theValues)
                  set valueBsum to valueBsum + (value_b of theValues)
                  set currentX to currentX + 1
              end repeat
              -- rewind X and move down one row
              set currentX to currentX - sampleSize
              set currentY to currentY + 1
          end repeat

        • 1. Re: Most efficient way to read accurate 16 bit color data?
          mikepucher Level 1

          Sometimes it helps to write out the facts in order to brainstorm. I kept playing around with what I had and came up with something simpler.

           

          Instead of using the script to average out the histogram values for a selection, I'm now using the Average Blur filter to average the color within a selection. This reduces the ∆E (compared to the iterating color sampler method) to only 0.08.

           

          I feel much more confident with the results. However, I'm curious why a ∆E exists at all between using Average Blur on a 50x50px selection and averaging the 2500 point-sample readings in that same 50x50px area. Anyone have insight on this?
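
          One plausible source of that residual ∆E (an assumption on my part, not verified against Photoshop's internals): Average Blur computes the mean of the stored channel values and writes a single quantized pixel value back, while the point-sample loop averages the decimal Lab readouts. If any step of the averaging happens in an encoding that is nonlinear relative to Lab, then mean-then-convert and convert-then-mean will disagree. A toy Python sketch of that effect, using only the cube-root branch of the CIE L* formula (values are made up for illustration):

```python
def cie_lightness(y):
    """CIE L* from relative luminance Y (valid for Y above (6/29)**3)."""
    return 116.0 * y ** (1.0 / 3.0) - 16.0

pixels = [0.18, 0.22, 0.40]  # toy luminance values within one patch

# blur-style: average the channel values first, then convert once
avg_then_convert = cie_lightness(sum(pixels) / len(pixels))

# sampler-style: convert each pixel to L*, then average the readings
convert_then_avg = sum(cie_lightness(y) for y in pixels) / len(pixels)

print(avg_then_convert - convert_then_avg)  # nonzero: the two means differ
```

          Because the cube root is concave, averaging before the transform always gives an equal-or-higher L* than averaging after it; 15+1-bit rounding of the blurred pixel would add a further, smaller discrepancy.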

           

           

          Sample code using average blur:

           

                    tell application "Adobe Photoshop CS6"
                        tell current document
                            -- select a square region centered on (currentX, currentY)
                            select region {{currentX - sampleOffset, currentY - sampleOffset}, {currentX - sampleOffset, currentY + sampleOffset}, {currentX + sampleOffset, currentY + sampleOffset}, {currentX + sampleOffset, currentY - sampleOffset}}
                            -- Average Blur replaces the selection with its mean color
                            filter current layer using average
                            -- a single point sample inside the blurred selection now reads the average
                            set theSampler to make new color sampler with properties {class:color sampler, position:{currentX, currentY}}
                            set theValues to color sampler color of theSampler
                            delete theSampler
                            set valueL to value_L of theValues
                            set valueA to value_a of theValues
                            set valueB to value_b of theValues
                        end tell
                    end tell