Say you have a trivial Unity surface shader. It doesn't use any texture at all; it simply grabs the object-space position ..
void vert (inout appdata_full v, out Input o)
{
    UNITY_INITIALIZE_OUTPUT(Input,o);
    o.localPos = v.vertex.xyz;   // pass the object-space position through to surf
}
and then draws a square ..
The quad in the example has been stretched about 3:1 using the transform.
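Just to make the setup concrete, imagine the surf function builds a checker out of that localPos, something like the sketch below (the pattern itself is made up here; the Input struct name and layout are the usual surface-shader boilerplate). Because localPos is object space, the 3:1 transform stretch turns those "squares" into 3:1 rectangles on screen.

// Hypothetical Input struct and surf body, only to illustrate the problem:
// a checker built from the object-space position passed through by vert above.
struct Input
{
    float3 localPos;
};

void surf (Input IN, inout SurfaceOutputStandard o)
{
    float2 cells = floor(IN.localPos.xy * 8.0);         // 8 cells per object-space unit
    float checker = abs(fmod(cells.x + cells.y, 2.0));  // alternating 0 / 1
    o.Albedo = checker.xxx;                             // black / white checkerboard
}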
If the shader simply knew the scaling (or even just the ratio), we could very easily draw "square squares".
This is obviously a common, everyday technique for things like billboarding, 2D images, backgrounds, and so on.
In current Unity (2018), how the heck do you simply get the object's current scaling in Cg?
This is one of those crazy Unity things that is (1) totally undocumented, (2) covered only by information that is as much as 13 years old (I mean, some of the folks involved may be deceased), and (3) has changed drastically between Unity versions, so discussion about it is often just totally wrong. Sigh.
So how do you do it? Can you?
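The closest thing I have been able to piece together is reading the column lengths of unity_ObjectToWorld in the vertex function, roughly as sketched below, but I have no idea whether that is the intended way or whether it survives version changes. (Note this gives the world/lossy scale, and it assumes there is no shear anywhere in the hierarchy.)

// Rough sketch only: recover the world scale from unity_ObjectToWorld.
// Each of the first three columns is an object axis transformed into world
// space, so its length is the scale on that axis (assuming no shear).
void vert (inout appdata_full v, out Input o)
{
    UNITY_INITIALIZE_OUTPUT(Input,o);
    o.localPos = v.vertex.xyz;

    float3 worldScale = float3(
        length(unity_ObjectToWorld._m00_m10_m20),   // scale on x
        length(unity_ObjectToWorld._m01_m11_m21),   // scale on y
        length(unity_ObjectToWorld._m02_m12_m22));  // scale on z

    // compensate so a pattern based on localPos stays square on the stretched quad
    o.localPos.x *= worldScale.x / worldScale.y;
}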
Currently I just have a trivial script pass in the value, which is OK but a bit shoddy.
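For reference, the shader side of that workaround is roughly the sketch below. _ObjectScale is just a made-up property name here; the script fills it in each frame with something like material.SetVector("_ObjectScale", transform.lossyScale).

// Sketch of the script-driven workaround (shader side only). _ObjectScale is a
// hypothetical vector property a MonoBehaviour sets from transform.lossyScale.
float4 _ObjectScale;

void vert (inout appdata_full v, out Input o)
{
    UNITY_INITIALIZE_OUTPUT(Input,o);
    o.localPos = v.vertex.xyz;
    o.localPos.x *= _ObjectScale.x / _ObjectScale.y;  // undo the 3:1 stretch
}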