
Premise

I'm currently developing a graphical application and, due to unforeseen limitations with the framework I'm using, I need to convert my TextureCube textures into a Texture2DArray with 6 slices.

While converting from one format to the other is not really an issue, sampling the Texture2DArray with a 3D direction vector is proving to be the harder challenge.

Question

Given:

  • A 3D vector previously used to sample a TextureCube
  • A Texture2DArray representation of the same TextureCube

What is the most efficient way to write an HLSL shader function (its intended usage is sketched after this list) that returns a float3 coord, in which:

  • coord.z is the index of the correct slice within the Texture2DArray
  • coord.xy are the uv coordinates used to sample the selected slice
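To make the contract concrete, this is roughly how I would like to consume the result; the texture and sampler names are just placeholders of mine:

```hlsl
Texture2DArray gFaces   : register(t0);   // 6 slices, one per cube face
SamplerState   gSampler : register(s0);

// The function this question asks for:
// coord.xy = uv on the chosen face, coord.z = slice index (0..5).
float3 DirToArrayCoord(float3 dir);

float4 SampleCubeAsArray(float3 dir)
{
    return gFaces.Sample(gSampler, DirToArrayCoord(dir));
}
```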

Current Progress

This is my progress so far:

  • I know from this link that the vector component with the largest magnitude dictates which face to select.
  • Once a face is selected via the component with the largest magnitude, I think the remaining two components can be used as uv coordinates (assuming the vector is normalized); a rough sketch of this idea follows the list.
  • I'm not sure how to treat cases in which two, or all three, of the vector's components have the same magnitude (for instance (0.5, 0.5, 0.5)). I assume that, if the texture is reasonably continuous from face to face, picking one face rather than the other should not produce a noticeable difference.
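Here is a rough, untested sketch of the idea above. The per-face sign flips follow the standard cube-map sc/tc/ma derivation; I'm not certain they match D3D11's face layout exactly, which is part of what I'm asking:

```hlsl
float3 DirToArrayCoord(float3 v)
{
    float3 a = abs(v);
    float  ma;        // magnitude of the dominant component
    float  sc, tc;    // face-local coordinates in [-ma, ma]
    float  slice;

    if (a.x >= a.y && a.x >= a.z)               // X is dominant
    {
        ma    = a.x;
        slice = (v.x >= 0.0f) ? 0.0f : 1.0f;    // +X / -X
        sc    = (v.x >= 0.0f) ? -v.z :  v.z;
        tc    = -v.y;
    }
    else if (a.y >= a.z)                        // Y is dominant
    {
        ma    = a.y;
        slice = (v.y >= 0.0f) ? 2.0f : 3.0f;    // +Y / -Y
        sc    =  v.x;
        tc    = (v.y >= 0.0f) ?  v.z : -v.z;
    }
    else                                        // Z is dominant
    {
        ma    = a.z;
        slice = (v.z >= 0.0f) ? 4.0f : 5.0f;    // +Z / -Z
        sc    = (v.z >= 0.0f) ?  v.x : -v.x;
        tc    = -v.y;
    }

    // Remap sc/tc from [-ma, ma] to [0, 1].
    float2 uv = 0.5f * (float2(sc, tc) / ma + 1.0f);
    return float3(uv, slice);
}
```

Note that the >= comparisons mean ties simply fall through to the first matching face, which I'm assuming is acceptable per my last bullet point.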

Final Note

Please assume that the cubemap faces are ordered within the Texture2DArray using the default DX11 ordering, i.e. +X, -X, +Y, -Y, +Z, -Z. I also want to stress that efficiency is extremely important.

  • "*due to unforeseen limitations with the framework I'm using, I need to convert my TextureCube textures into a Texture2DArray with 6 slices.*" I cannot imagine what those limitations might be. In D3D11, it is perfectly possible to take a 2D array texture with 6+ layers and create a cubemap view of it. Why are you unable to do that? – Nicol Bolas Aug 04 '16 at 13:37
  • @NicolBolas I've actually first built a prototype of my project in C++/D3D11, but now I'm porting it to Unity. While on the D3D11 side everything works fine, Unity offers very poor support for TextureCube, hence my question above. I can further explain what I mean by "poor support" if you're curious... – StrG30 Aug 04 '16 at 13:47
  • Keep in mind that a Texture2DArray only works on ``D3D_FEATURE_LEVEL_10_0`` or better hardware. – Chuck Walbourn Aug 04 '16 at 21:15
  • @ChuckWalbourn Yep, good reminder. – StrG30 Aug 05 '16 at 07:41

0 Answers