
I am trying to improve the performance of the code I use to convert a PNG to grayscale.

I load the image via JSON, convert it to a texture, then grayscale it at runtime by going through each pixel with GetPixels32/SetPixels32. Memory use is too high for the many images I have, so I'd like to switch to the more efficient GetRawTextureData.
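
For reference, the per-pixel version I'm trying to replace looks roughly like this (simplified sketch; thisTexture is the texture created from the PNG, as shown further down):

Color32[] pixels = thisTexture.GetPixels32(); // allocates a full managed copy of the image
for (int i = 0; i < pixels.Length; i++)
{
    Color32 p = pixels[i];
    byte l = (byte)(0.3f * p.r + 0.59f * p.g + 0.11f * p.b); // luminance
    pixels[i] = new Color32(l, l, l, p.a);
}
thisTexture.SetPixels32(pixels);
thisTexture.Apply();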

However, LoadImage is returning an ARGB32 Texture2D, which seems to mess up the follow-on use of GetRawTextureData (I read the data as RGBA32).

It's confusing because a PNG loaded with LoadImage on Unity 2019.2 should come back as RGBA32, but for me it comes back as ARGB32, which is what the legacy Unity 5.3 LoadImage would have returned. I'm not sure if that's a bug or a documentation error.

Texture2D thisTexture = new Texture2D(1, 1, TextureFormat.RGBA32, false);
byte[] t = Convert.FromBase64String(imageString); // PNG image in string form
thisTexture.LoadImage(t); // using Unity 2019.2.2, the result is ARGB32

When I convert the image to grayscale I get a yellowish image, and I think that's because I read the raw data as RGBA32 (Color32) while LoadImage has produced ARGB32.
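
To sanity-check that, I dump the raw bytes of the first pixel (quick sketch; the idea that ARGB32 sits in memory as A,R,G,B is only my assumption):

var raw = thisTexture.GetRawTextureData<byte>();
// first four bytes = first pixel; the channel order is exactly what I'm unsure about
Debug.Log("first pixel bytes: " + raw[0] + ", " + raw[1] + ", " + raw[2] + ", " + raw[3]);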

How can I replicate what LoadImage does but end up with an RGBA32 texture?

        // convert texture
        graph = thisTexture;
        grayImg = new Texture2D(graph.width, graph.height, graph.format, false);

        Debug.Log("grayImg format " + grayImg.format + " graph format " + graph.format);
        Graphics.CopyTexture(graph, grayImg);
        var data = grayImg.GetRawTextureData<Color32>();

        int index = 0;
        Color32 pixel;
        for (int x = 0; x < grayImg.width; x++)
        {
            for (int y = 0; y < grayImg.height; y++)
            {
                pixel = data[index];
                // leftover channel-unpacking experiments (currently unused)
                int p = ((256 * 256 + pixel.r) * 256 + pixel.b) * 256 + pixel.g;
                int b = p % 256;
                p = Mathf.FloorToInt(p / 256);
                int g = p % 256;
                p = Mathf.FloorToInt(p / 256);
                int r = p % 256;
                // brightness value (currently unused)
                byte l = (byte)(pixel.r + pixel.g + pixel.b);
                Color32 c = new Color32(pixel.r, pixel.g, pixel.b, 1);
                data[index++] = c;
            }
        }
        // upload to the GPU
        grayImg.Apply(false);

Revised code with LoadRawTextureData:

    Texture2D thisTexture = new Texture2D(1, 1, TextureFormat.RGBA32, false);
    byte[] t = Convert.FromBase64String(imageString); // PNG image in string form
    thisTexture.LoadRawTextureData(t); // results in RGBA32
    thisTexture.Apply();

Note: whether I call thisTexture.LoadImage(t) or thisTexture.LoadRawTextureData(t) here before setting the pixels of the grayscale texture, the result still comes out yellowish.

Sergio Solorzano

1 Answer


This is not how GetRawTextureData should be used: it returns a NativeArray, and those are very slow when treated like an ordinary array. For examples of the right way, take a look at the RawTextureDataProcessingExamples repo, especially the GrayscaleRGBA32Job file:

using Unity.Mathematics;
using Unity.Collections;

[Unity.Burst.BurstCompile]
public struct GrayscaleRGBA32Job : Unity.Jobs.IJobParallelFor
{
    public NativeArray<RGBA32> data;
    void Unity.Jobs.IJobParallelFor.Execute ( int i )
    {
        var color = data[i];
        // dot product of the pixel with standard luminance weights (0.3, 0.59, 0.11)
        float product = math.mul( new float3{ x=color.R , y=color.G , z=color.B } , new float3{ x=0.3f , y=0.59f , z=0.11f } );
        byte b = (byte) product;
        data[i] = new RGBA32{ R=b , G=b , B=b , A=color.A };
    }
}
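
RGBA32 here is the small helper struct defined in that repo, one byte per channel in the same order as TextureFormat.RGBA32. If you don't want to pull in the whole repo, a minimal stand-in along these lines should do (field names chosen to match the job above):

public struct RGBA32
{
    public byte R, G, B, A; // one byte per channel, matching the RGBA32 texture layout
}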

You can swap the channel bytes here if you ever need that; there is a sketch of an ARGB32-flavoured variant after the snippet below. This is how you apply the job:

var rawdata = tex.GetRawTextureData<RGBA32>();
new GrayscaleRGBA32Job { data = rawdata }
    .Schedule( rawdata.Length , tex.width ).Complete();
tex.Apply();
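
Since LoadImage is handing you ARGB32, a channel-swizzling variant could look roughly like this. The ARGB32 struct below and the assumption that the bytes really sit in A,R,G,B order are mine, not something taken from the repo:

using Unity.Mathematics;
using Unity.Collections;

// assumed in-memory layout for TextureFormat.ARGB32 (one byte per channel, alpha first)
public struct ARGB32
{
    public byte A, R, G, B;
}

[Unity.Burst.BurstCompile]
public struct GrayscaleARGB32Job : Unity.Jobs.IJobParallelFor
{
    public NativeArray<ARGB32> data;
    void Unity.Jobs.IJobParallelFor.Execute ( int i )
    {
        var color = data[i];
        // same luminance weights, just read through the ARGB layout
        float product = math.mul( new float3{ x=color.R , y=color.G , z=color.B } , new float3{ x=0.3f , y=0.59f , z=0.11f } );
        byte b = (byte) product;
        data[i] = new ARGB32{ A=color.A , R=b , G=b , B=b };
    }
}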

And that's basically it. Building on this, you can also write a job that takes the RGBA32 rawData as input and writes directly into a grayscale texture's R8 rawData (no intermediate copy, and a smaller memory footprint):
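
A rough sketch of that two-buffer version (the job and texture names here are mine, the 0.3/0.59/0.11 weights are the same as above):

using Unity.Mathematics;
using Unity.Collections;

[Unity.Burst.BurstCompile]
public struct GrayscaleRGBA32ToR8Job : Unity.Jobs.IJobParallelFor
{
    [ReadOnly] public NativeArray<RGBA32> input; // raw data of the RGBA32 source texture
    [WriteOnly] public NativeArray<byte> output; // raw data of the R8 destination texture
    void Unity.Jobs.IJobParallelFor.Execute ( int i )
    {
        var color = input[i];
        float product = math.mul( new float3{ x=color.R , y=color.G , z=color.B } , new float3{ x=0.3f , y=0.59f , z=0.11f } );
        output[i] = (byte) product;
    }
}

And applying it, writing straight into the R8 texture's raw data:

var src = tex.GetRawTextureData<RGBA32>();
var grayTex = new Texture2D( tex.width , tex.height , TextureFormat.R8 , false );
new GrayscaleRGBA32ToR8Job { input = src , output = grayTex.GetRawTextureData<byte>() }
    .Schedule( src.Length , tex.width ).Complete();
grayTex.Apply();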

Andrew Łukasik