I'm using Unity, and I want to send a byte array to the GPU using HLSL. I know about ByteAddressBuffer, but I have no idea how to use it. I just want to know how to send "byte"s to the GPU. Essentially, I want a StructuredBuffer<byte> in my compute shader.

1 Answer

For the shader part, you can use a StructuredBuffer. I don't know whether there is a byte data type in HLSL, so I will just use integers for this example.

Shader code:

Shader "Name" {
    SubShader {
        ...
        StructuredBuffer<int> _Data;
        ...
    }
}

On the C# side, you have a Material that corresponds to your shader, let's call it mat, and your byte array bArr. Additionally, you have to create a GPU buffer that you can then bind to your shader: ComputeBuffer dataBuf = new ComputeBuffer(bArr.Length, sizeof(int)).

Finally, load your array onto the GPU with dataBuf.SetData(...) and bind the buffer to your shader with mat.SetBuffer("_Data", dataBuf). Keep in mind that SetData copies raw element data, so with a stride of sizeof(int) you will usually want to widen the byte array into an int[] first, as in the sketch below.
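
To make that concrete, here is a minimal C# sketch assuming the names mat, bArr, and _Data from above; the component name ByteUpload and the example data are my own, and widening the bytes to an int[] is just one way to match the sizeof(int) stride:

using UnityEngine;

public class ByteUpload : MonoBehaviour   // hypothetical component name
{
    public Material mat;        // material using the shader above
    private byte[] bArr;        // your source byte data
    private ComputeBuffer dataBuf;

    void Start()
    {
        bArr = new byte[] { 1, 2, 3, 4 };   // example data

        // one int element per byte (wastes 24 bits per element, see the edit below)
        int[] widened = new int[bArr.Length];
        for (int i = 0; i < bArr.Length; i++)
            widened[i] = bArr[i];

        dataBuf = new ComputeBuffer(bArr.Length, sizeof(int));
        dataBuf.SetData(widened);                 // upload to the GPU
        mat.SetBuffer("_Data", dataBuf);          // bind to the shader
    }

    void OnDestroy()
    {
        if (dataBuf != null)
            dataBuf.Release();                    // compute buffers must be released manually
    }
}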

Edit

I want to have a StructuredBuffer<byte> in my compute shader.

From what I've read, you can't. There is no byte data type in HLSL (nor in Cg, which is what Unity uses). The example above is a standard vertex/fragment shader; for compute shaders, I would refer you to my answer on your other question and adapt it to your needs. As I have already written in a comment, if you do not want to use an int for each byte and thus waste 24 bits per element, you can pack 4 bytes into 1 int with bit shifting. The shift operators should be available in shaders from shader model 4 upwards (DX10-class hardware, if I remember correctly).

An example of how to do this is as follows:

// encoding on the CPU (C#): pack 4 bytes into one int

int myInt = 0;
myInt |= myByte1;               // bits  0..7
myInt |= myByte2 << 8;          // bits  8..15
myInt |= myByte3 << 16;         // bits 16..23
myInt |= myByte4 << 24;         // bits 24..31

// decoding on the GPU (HLSL): unpack the 4 bytes again

int myByte1 =  myInt        & 0xFF;
int myByte2 = (myInt >> 8)  & 0xFF;
int myByte3 = (myInt >> 16) & 0xFF;
int myByte4 = (myInt >> 24) & 0xFF;
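
For the compute-shader route mentioned above, a minimal kernel sketch could look like the following; the kernel name UnpackBytes, the buffer names, the _Result output, and the thread group size are my own assumptions, not something from the question:

// Unpack.compute -- hypothetical file/kernel names
#pragma kernel UnpackBytes

StructuredBuffer<int> _Data;      // each int holds 4 packed bytes
RWStructuredBuffer<int> _Result;  // one unpacked byte value per element

[numthreads(64, 1, 1)]
void UnpackBytes(uint3 id : SV_DispatchThreadID)
{
    int packed = _Data[id.x];

    // decode the 4 bytes from the int
    int b0 =  packed        & 0xFF;
    int b1 = (packed >> 8)  & 0xFF;
    int b2 = (packed >> 16) & 0xFF;
    int b3 = (packed >> 24) & 0xFF;

    // write them out as individual values (4 outputs per input int)
    _Result[id.x * 4 + 0] = b0;
    _Result[id.x * 4 + 1] = b1;
    _Result[id.x * 4 + 2] = b2;
    _Result[id.x * 4 + 3] = b3;
}

On the C# side you would bind both buffers with ComputeShader.SetBuffer and run the kernel with ComputeShader.Dispatch; see the linked answer for the full setup.
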
  • I know how to use structured buffers, and I know how to send ints to the GPU. I want to know how to send "byte"s to the GPU @xyLe_ – None Nov 23 '16 at 09:11
  • there is no `byte` data type in HLSL nor in Cg (which is what Unity uses by default). In Cg you could go with `char`, but can't you just go with `int`? – nyro_0 Nov 23 '16 at 09:19
  • if you are really concerned about wasted space, then just pack 4 bytes into 1 int and use bit shifts – nyro_0 Nov 23 '16 at 09:22
  • @None I've updated my answer – nyro_0 Nov 23 '16 at 12:40