I currently have objects positioned at 'real' world positions (unit = metre). The problem is that I want an object to be viewable a few kilometres away; after the perspective projection it is too small to be perceptible. I also have objects a few metres away, which would become too large if I simply scaled everything up.
So I was looking for a way of mapping objects a few kilometres away to appear closer, and those too close to appear further away. I found glDepthRange, which seemed to be what I wanted, but I now believe it's more to do with depth ordering than with the visual representation. I'm also aware of glPolygonOffset, but I doubt that will give me the result I want. I could manually transform the objects' coordinates (using polar coordinates relative to the camera) and update them whenever the view moves, but that seems an awfully heavy approach; see the sketch below for what I mean.
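For what it's worth, here is a rough sketch of that 'heavy' approach (the helper name and parameters are mine, just for illustration): move each object along the ray from the camera so its distance is clamped into a comfortable range, and scale it by the same factor so its angular size on screen is unchanged. The result would then be applied per object with glTranslatef/glScalef in fixed-function ES 1.0.

    #include <math.h>

    /* Hypothetical helper: remap an object's real-world position so it
     * is rendered at a distance within [minDist, maxDist], scaled so
     * its apparent (angular) size is unchanged. Units are metres. */
    static void remapObject(float camX, float camY, float camZ,
                            float objX, float objY, float objZ,
                            float minDist, float maxDist,
                            float *outX, float *outY, float *outZ,
                            float *outScale)
    {
        float dx = objX - camX, dy = objY - camY, dz = objZ - camZ;
        float d  = sqrtf(dx * dx + dy * dy + dz * dz);

        if (d < 1e-6f) {            /* object at the camera: leave it */
            *outX = objX; *outY = objY; *outZ = objZ;
            *outScale = 1.0f;
            return;
        }

        /* Clamp the rendered distance into the comfortable range. */
        float dNew = d;
        if (dNew > maxDist) dNew = maxDist;
        if (dNew < minDist) dNew = minDist;

        /* Moving from distance d to dNew changes the on-screen size by
         * d/dNew, so scaling by dNew/d cancels that exactly. */
        float k = dNew / d;
        *outX = camX + dx * k;
        *outY = camY + dy * k;
        *outZ = camZ + dz * k;
        *outScale = k;
    }

    /* Per frame, per object (fixed-function ES 1.0), roughly:
     *
     *     float x, y, z, s;
     *     remapObject(camX, camY, camZ, objX, objY, objZ,
     *                 2.0f, 100.0f, &x, &y, &z, &s);
     *     glPushMatrix();
     *     glTranslatef(x, y, z);
     *     glScalef(s, s, s);
     *     drawObject();
     *     glPopMatrix();
     */

The heaviness is that this has to be recomputed every frame for every remapped object as the camera moves, which is exactly what I was hoping to avoid.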
Is there a way of achieving this? Since I'm using OpenGL ES 1.0, shaders are out of the question. BTW, it's for use in Augmented Reality.
Thanks