6 Replies Latest reply: Aug 24, 2013 7:22 PM by AsterLUXman

    VertexShader - Parameter / Matrix Issues


      I'm trying to implement a simple mask shader for my 2D engine, and I'm having multiple issues getting it to work.


      Basically, the shader takes two images and two matrices, maps the sprite's vertex position back into the mask's local space, and then calculates the local uv-coordinates for the mask, but I'm stuck on some basic calculations here.
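
      (To be explicit about the math: on the CPU this would simply be the following. The function is just an illustration; the real calculation has to happen in the vertex part of the material kernel.)

      import flash.geom.Point;

      // Illustration only: map a position given in the mask's local space
      // (origin at the mask center) to 0..1 uv coordinates of the mask.
      function localMaskPosToUV(localX:Number, localY:Number, maskWidth:Number, maskHeight:Number):Point {
          var u:Number = (localX + maskWidth * 0.5) / maskWidth;
          var v:Number = (localY + maskHeight * 0.5) / maskHeight;
          return new Point(u, v);
      }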


      Bug 1: Divisions in the material-vertex kernel don't work at all. That's why I have to pass invertedMaskSize (which is 1.0 / 128.0) into the shader as a parameter (but we already know that one); see the AS-side sketch right after Bug 2.

      Bug 2: Interpolated vars have to be float4; float2 is not possible (that's a pretty old bug as well).
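
      Here's the AS-side sketch I mentioned under Bug 1. The parameterBufferHelper calls are just illustrative of how the workaround values get set:

      var maskSize:Number = 128.0;

      // Workaround for Bug 1: precompute the reciprocal on the CPU instead of
      // dividing in the kernel. Both parameters are padded to float4 because
      // float2 parameters misbehave as well (see Case 2 below).
      parameterBufferHelper.setNumberParameterByName(Context3DProgramType.VERTEX,
          "halfMaskSize", Vector.<Number>([maskSize * 0.5, maskSize * 0.5, 0.0, 0.0]));
      parameterBufferHelper.setNumberParameterByName(Context3DProgramType.VERTEX,
          "invertedMaskSize", Vector.<Number>([1.0 / maskSize, 1.0 / maskSize, 0.0, 0.0]));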


      I tried the following changes in the shader. The shader posted here just tries to display the resulting uv-coordinates; no textures are used.


      Case 1:

      interpolatedMaskUV = float4((vertexPos.x + halfMaskSize.x) * invertedMaskSize.x, (vertexPos.y + halfMaskSize.y) * invertedMaskSize.y, 0.0, 0.0);

      The output is this: http://dev.nulldesign.de/plain_uvcoords.png Just what you'd expect! Perfect, let's proceed.


      Case 2:

      Change halfMaskSize and invertedMaskSize to float2 and, of course, set the parameters as two vectors of length two in AS. The output: http://dev.nulldesign.de/float2_uvcoords.png
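
      (The AS-side calls for Case 2 look roughly like this, with the two parameters set as vectors of length two; again just a sketch:)

      // Case 2 setup (sketch): halfMaskSize and invertedMaskSize are declared
      // as float2 in the kernel, so they are set as two-component vectors here.
      parameterBufferHelper.setNumberParameterByName(Context3DProgramType.VERTEX,
          "halfMaskSize", Vector.<Number>([64.0, 64.0]));
      parameterBufferHelper.setNumberParameterByName(Context3DProgramType.VERTEX,
          "invertedMaskSize", Vector.<Number>([1.0 / 128.0, 1.0 / 128.0]));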


      Case 3:

      Masking test with matrix multiplication. First, calculating the world-space position of the vertex:


      float4 worldSpacePos = float4(vertexPos.x, vertexPos.y, 0.0, 1.0) * objectToClipSpaceTransform;


      Then mapping it back to the local space of the mask:

      float4 localMaskSpacePos = worldSpacePos * maskObjectToClipSpaceTransform;


      And calculating the uv-coords:

      interpolatedMaskUV = float4((localMaskSpacePos.x + halfMaskSize.x) * invertedMaskSize.x, (localMaskSpacePos.y + halfMaskSize.y) * invertedMaskSize.y, 0.0, 0.0);


      For testing, I set the maskObjectToClipSpaceTransform to the inverse of the objectToClipSpaceTransform. In theory and on paper, this should work.
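
      (The AS-side test setup is roughly this; clipSpaceMatrix is the matrix the sprite is normally rendered with, and the parameterBufferHelper calls are just a sketch:)

      import flash.geom.Matrix3D;

      // Case 3 test (sketch): the mask matrix is simply the inverse of the
      // object-to-clipspace matrix, so the two multiplications in the vertex
      // kernel should cancel out and give back the original vertex position.
      var maskMatrix:Matrix3D = clipSpaceMatrix.clone();
      maskMatrix.invert();

      parameterBufferHelper.setMatrixParameterByName(Context3DProgramType.VERTEX,
          "objectToClipSpaceTransform", clipSpaceMatrix, true);
      parameterBufferHelper.setMatrixParameterByName(Context3DProgramType.VERTEX,
          "maskObjectToClipSpaceTransform", maskMatrix, true);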

      But I think something gets out of order, and maybe the maskObjectToClipSpaceTransform is screwed up in the shader, just like when I set halfMaskSize and invertedMaskSize to float2. The result is this: http://dev.nulldesign.de/local_uvcoords.png and I have no idea how to fix it...


      <languageVersion : 1.0;>
      material kernel texture
      <
          namespace : "ND2D_Shader";
          vendor : "nulldesign";
          version : 1;
      >
      {
          input vertex float2 uvCoord
          <
              id : "PB3D_UV";
          >;

          input vertex float2 vertexPos
          <
              id : "PB3D_POSITION";
          >;

          parameter float2 uvOffset;
          parameter float4x4 objectToClipSpaceTransform;
          parameter float4x4 maskObjectToClipSpaceTransform;

          // if set to float2, strange things happen
          parameter float4 halfMaskSize;
          parameter float4 invertedMaskSize;

          interpolated float4 interpolatedUV;
          interpolated float4 interpolatedMaskUV;

          void evaluateVertex()
          {
              // not used in the current test ...
              interpolatedUV = float4(uvCoord.x + uvOffset.x, uvCoord.y + uvOffset.y, 0.0, 0.0);

              float4 worldSpacePos = float4(vertexPos.x, vertexPos.y, 0.0, 1.0) * objectToClipSpaceTransform;

              // doesn't work as expected
              float4 localMaskSpacePos = worldSpacePos * maskObjectToClipSpaceTransform;

              interpolatedMaskUV = float4((localMaskSpacePos.x + halfMaskSize.x) * invertedMaskSize.x,
                                          (localMaskSpacePos.y + halfMaskSize.y) * invertedMaskSize.y,
                                          0.0, 0.0);
          }

          input image4 textureImage;
          input image4 textureMaskImage;
          parameter float4 color;

          output float4 result;

          void evaluateFragment()
          {
              // just visualize the uv-coords
              result = float4(interpolatedMaskUV.x, interpolatedMaskUV.y, 0.0, 1.0);

              float4 texel = sample(textureImage, float2(interpolatedUV.x, interpolatedUV.y), PB3D_2D | PB3D_MIPNEAREST | PB3D_CLAMP);
              float4 texel2 = sample(textureMaskImage, float2(interpolatedMaskUV.x, interpolatedMaskUV.y), PB3D_2D | PB3D_MIPNEAREST | PB3D_CLAMP);

              result = float4(texel.r * color.r,
                              texel.g * color.g,
                              texel.b * color.b,
                              texel.a * color.a * texel2.a);
          }
      }

      I know that we're working with a four-month-old version of PB3D, and I hope that a new version will be out soon and that all the bugs I encountered are already solved. But if not... here's another shader to fix.

        • 1. Re: VertexShader - Parameter / Matrix Issues
          AIF Bob Community Member

          First of all, thank you for posting such an informative description of your problem. Having this much detail helps us to track down what's going on. We have fixes for some of the bugs you've encountered, and we're working on the other bugs. We'll make sure this gets added to our test suite.


          We're working on a new release which we plan to have out soon (weeks rather than months).



          • 2. Re: VertexShader - Parameter / Matrix Issues
            .lars Community Member

            Thanks, Bob, for the quick reply. I'm really curious about the next drop.


            One more thing that came to mind: I tried to implement geometry batching with Pixel Bender and ran into two issues:


            1. I can declare something like this in a vertex / fragment shader:

            parameter float4x4 objectToClipSpaceTransform[8];


            But during runtime, I get this error, even if the array size is very small:

            [Fault] exception, information=Error: Register allocator has run out of registers.


            2. And as far as I know, there's no way to fill the arrays with the vertexBufferHelper class:


            parameterBufferHelper.setMatrixParameterByName(Context3DProgramType.VERTEX, "objectToClipSpaceTransform", clipSpaceMatrix, true);


            I had to use setProgramConstantsFromMatrix, which is a bit inconsistent...
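
            (Roughly like this; batchMatrices and the start register are simplified for illustration, you'd need to know which vertex constant register PB3D assigned to the array:)

            import flash.display3D.Context3DProgramType;
            import flash.geom.Matrix3D;

            // Batching fallback (sketch): batchMatrices holds one Matrix3D per sprite
            // in the batch; each float4x4 occupies four vertex constant registers.
            // The start register (0 here) is an assumption; in practice you need to
            // know where PB3D actually placed objectToClipSpaceTransform[0].
            var startRegister:int = 0;
            for (var i:int = 0; i < batchMatrices.length; i++) {
                context3D.setProgramConstantsFromMatrix(Context3DProgramType.VERTEX,
                    startRegister + i * 4, batchMatrices[i], true);
            }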


            Are these issues that have already been addressed? I'd really like to speed up rendering by doing proper batch rendering...



            • 3. Re: VertexShader - Parameter / Matrix Issues
              AIF Bob Community Member

              One of the problems we have is that PB3D is built on top of Molehill, which has severe hardware restrictions, particularly around the number of registers that are available for use. Once you start using float4x4, particularly in conjunction with arrays, you run out of registers almost immediately (there are only 8 temporary registers available; since a float4x4 uses 4 of them, you can see that we're squeezed really tight).


              This means we haven't been focussing on array handling, so you're probably stuck with a clunky method of getting values into arrays for the next release.



              • 4. Re: VertexShader - Parameter / Matrix Issues
                AIF Bob Community Member

                I just discovered one more issue when I was working through your example. If you have a parameter that's used by the vertex kernel and a parameter that's used by the evaluateVertex function of the material kernel, they must have different names, even if they are supposed to hold the same value. The particular parameter that gave me trouble is objectToClipSpaceTransform: I'm using it in my standard vertex kernel, but it's also used in your example as an input to the evaluateVertex function.


                We haven't yet worked out what the right way to deal with this situation is; for the time being, the workaround is to make sure that the parameters have two different names.
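
                Something like this on the ActionScript side, using the same helper call you showed above. The second parameter name here is just an example; it has to match whatever name the material kernel ends up declaring:

                // Workaround sketch: push the same matrix twice under two different names,
                // one consumed by the vertex kernel, the other by the material kernel's
                // evaluateVertex(). "materialObjectToClipSpaceTransform" is a made-up name.
                parameterBufferHelper.setMatrixParameterByName(Context3DProgramType.VERTEX,
                    "objectToClipSpaceTransform", clipSpaceMatrix, true);
                parameterBufferHelper.setMatrixParameterByName(Context3DProgramType.VERTEX,
                    "materialObjectToClipSpaceTransform", clipSpaceMatrix, true);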



                • 5. Re: VertexShader - Parameter / Matrix Issues
                  .lars Community Member

                  Ah, that's interesting. I thought that after compiling the three shaders with Pixel Bender, the two vertex kernels would be merged and I could use the same parameters. This would mean I have to push objectToClipSpaceTransform twice (with a different name)? So I would have to waste four registers.


                  Here is the AGAL version of the shader; it's working as expected:



                  m44 vt0, va0, vc0                   // vertex * clipspace
                  m44 vt1, vt0, vc4                   // clipspace to local pos in mask
                  add vt1.xy, vt1.xy, vc8.xy     // add half masksize to local pos
                  div vt1.xy, vt1.xy, vc8.zw     // local pos / masksize
                  mov v0, va1                         // copy uv
                  mov v1, vt1                         // copy mask uv
                  mov op, vt0                         // output position



                  mov ft0, v0                                                   // get interpolated uv coords
                  tex ft1, ft0, fs0 <2d,clamp,linear,nomip>      // sample texture
                  mul ft1, ft1, fc0                                             // mult with color
                  mov ft2, v1                                                   // get interpolated uv coords for mask
                  tex ft3, ft2, fs1 <2d,clamp,linear,nomip>      // sample mask
                  mul ft1, ft1, ft3                                             // mult mask color with tex color
                  mov oc, ft1                                                   // output color


                  The full source code is here: https://github.com/nulldesign/nd2d/blob/master/src/de/nulldesign/nd2d/materials/Sprite2DMaskMaterial.as


                  I'll try it with a different parameter name now. Thanks

                  • 6. Re: VertexShader - Parameter / Matrix Issues
                    AsterLUXman Community Member

                    This may not be the answer to .lars' issue, and it may be totally obvious to some, but for readers who, like me, were stuck wondering why offset UVs weren't working when the offset was applied within the evaluateVertex() function of the material kernel (as opposed to the evaluateVertex() of the vertex kernel):


                    You need to pass the UV offset parameter (a.k.a. constant register) to the VERTEX shader, NOT to the FRAGMENT shader, even though you are targeting the evaluateVertex() of the material kernel.


                    In other words, use:


                    com.adobe.pixelBender3D.utils.ProgramConstantsHelper::setNumberParameterByName(  Context3DProgramType.VERTEX, "myUVOffset", Vector.<Number>( [ value1, value2 ] ) );



                    P.S. My question to Adobe: did we really need to split the vertex shader in two, one portion in the vertex kernel and the other in the material kernel? It seems like a really odd choice, making things unnecessarily more complicated.