I'm working on a bump mapping implementation in my own 3D ray tracer. To keep it short: I want this result:
I've been reading pages and pages about bump mapping, heightmaps, normal maps, and so on, and I think I understand most of the subject and the differences between all of the above. (I got Perlin-noise bump mapping to work - result.) The only thing I don't get is how to actually perturb an object's surface normal given a texture image. Most papers and pages are more than a little vague on that point.
First I figured you'd just apply an image texture to the object: for a given pixel you'd look up the RGB color from the texture, compute a lightness factor from that RGB value (something like a grayscale value), and perturb the normal with it before applying the lighting. I want(ed) to calculate the perturbed normal on the fly. The sketch right below shows what I mean by the lightness factor.
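Just to make that concrete, this is roughly the kind of thing I had in mind for the lightness factor (sampleTexture(u, v) is only a stand-in for my own UV texture lookup and returns packed ARGB; the weights are the standard luminance coefficients):

```java
// Rough sketch: turn an RGB texture sample into a scalar "lightness" in [0, 1].
// sampleTexture(u, v) is a placeholder for my own UV lookup (packed ARGB int).
double lightnessAt(double u, double v) {
    int argb = sampleTexture(u, v);
    int r = (argb >> 16) & 0xFF;
    int g = (argb >> 8)  & 0xFF;
    int b =  argb        & 0xFF;
    // standard luminance weights, scaled to [0, 1]
    return (0.299 * r + 0.587 * g + 0.114 * b) / 255.0;
}
```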
Question 1: if the above is even possible, how do I perturb the normal given that lightness factor? And if I'm on the wrong track here, any tips/links that could help me work my way up to the result above would be greatly appreciated.
Question 2: If the above is not possible with an arbitrary RGB image, could anyone explain how to perturb the normal given a heightmap, like the first image on Wikipedia? My rough current guess for the heightmap case is sketched at the bottom of this post. Thanks a lot.
P.S.: I'm doing (and have to do) the implementation in Java.
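For reference, here is my current guess for the heightmap case, in Java. Vec3, heightAt(u, v) and the vector methods (subtract, scale, normalize) are just placeholders for my own classes, and I'm not at all sure this is the right way to do it:

```java
// Guess at perturbing the normal from a heightmap via finite differences.
// Vec3 and heightAt(u, v) (height in [0, 1]) are placeholders for my own code.
Vec3 perturbNormal(Vec3 normal, Vec3 tangent, Vec3 bitangent,
                   double u, double v, double eps, double bumpStrength) {
    // central differences: how the height changes along u and v
    double dhdu = (heightAt(u + eps, v) - heightAt(u - eps, v)) / (2 * eps);
    double dhdv = (heightAt(u, v + eps) - heightAt(u, v - eps)) / (2 * eps);

    // tilt the geometric normal along the surface tangent directions,
    // proportionally to the height gradient
    Vec3 perturbed = normal
            .subtract(tangent.scale(dhdu * bumpStrength))
            .subtract(bitangent.scale(dhdv * bumpStrength));
    return perturbed.normalize();
}
```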