I've had an issue with normal maps not behaving correctly in my custom shader and finally managed to find the cause. It turns out it was the way the UVs of my objects were mapped. In UV0 I stored a mapping to a color palette texture - those UVs were all scrambled together, as the only thing that mattered was that each one landed on a pixel with the correct color. In UV1 I stored the traditional UV unwrap, which I used to apply the normal map. To sample the normal map I used a set-up like this:
I'm doing my own light calculations so I need to transform the normal from tangent space to world space before using it.
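For context, here is a minimal sketch (my own approximation, not the node's actual internals) of what I understand the tangent-to-world transform to be doing: building a TBN matrix from the world-space tangent, bitangent, and normal, and multiplying the sampled tangent-space normal by it.

```python
import numpy as np

# Hypothetical sketch of a tangent-to-world transform, approximating
# what I assume the Transform node does with the sampled normal.
def tangent_to_world(normal_ts, tangent, bitangent, normal_ws):
    # Columns of the TBN matrix are the world-space tangent-frame vectors.
    tbn = np.column_stack((tangent, bitangent, normal_ws))
    n = tbn @ normal_ts
    return n / np.linalg.norm(n)

# A "flat" tangent-space normal (0, 0, 1) should map onto the surface normal.
t = np.array([1.0, 0.0, 0.0])
b = np.array([0.0, 1.0, 0.0])
n = np.array([0.0, 0.0, 1.0])
print(tangent_to_world(np.array([0.0, 0.0, 1.0]), t, b, n))  # -> [0. 0. 1.]
```

The point being: the result depends on the tangent and bitangent vectors, not just on the geometric normal.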
This approach was causing two issues - weird artifacts and the normals being "stuck" to the object:
The sphere on the right is upside down, and if you look at the normals they are upside down too. The artifacts are on both spheres, but only visible on the right one from this perspective.
What seems to be the cause is the way I used UV0 to map the object to a color palette. It somehow affects the tangent-to-world-space transformation done by the Transform node (I know it's this node because removing it makes the artifacts disappear). Swapping the UV channels, so that the traditional unwrap is in UV0 and the palette mapping is in UV1, fixes the issue:
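My working theory is that tangents are computed per triangle from UV0, using the standard formula that relates position deltas to UV deltas. If that's right, a sketch like this (my assumption of the method, not the engine's actual code) would explain both symptoms: a scrambled UV0 makes the UV deltas degenerate, so the tangents are garbage.

```python
import numpy as np

# Assumed per-triangle tangent computation: the tangent is the surface
# direction in which U increases, derived from positions and UVs together.
def triangle_tangent(p0, p1, p2, uv0, uv1, uv2):
    e1, e2 = p1 - p0, p2 - p0          # position edges
    du1, du2 = uv1 - uv0, uv2 - uv0    # UV edges
    det = du1[0] * du2[1] - du2[0] * du1[1]
    if abs(det) < 1e-8:
        # Degenerate UV triangle (e.g. all three verts stacked on one
        # palette pixel): the tangent is undefined.
        return None
    t = (e1 * du2[1] - e2 * du1[1]) / det
    return t / np.linalg.norm(t)

p = [np.array(v, float) for v in [(0, 0, 0), (1, 0, 0), (0, 1, 0)]]
good_uv = [np.array(v, float) for v in [(0, 0), (1, 0), (0, 1)]]
palette_uv = [np.array([0.5, 0.5]) for _ in range(3)]  # scrambled onto one pixel

print(triangle_tangent(*p, *good_uv))     # well-defined tangent
print(triangle_tangent(*p, *palette_uv))  # None: no usable tangent
```

If the tangents really come from UV0, that would also explain why the upside-down unwrap flips the normals: the tangent frame follows the UV layout, not the geometry.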
There are no artifacts and the normals aren't stuck to the object.
So why is the Transform node affected by the UV mapping? I thought it did the transformation based purely on the object's geometry. And if it does use UV maps, why is there no dropdown to select which UV channel it uses?