
I am building a ray tracer from scratch. My question is: when I change the camera coordinates, the sphere is rendered as an ellipse. I don't understand why this is happening.

Here are some images to show the artifacts:

Sphere: 1 1 -1 1.0 (center, radius)
Camera: 0 0 5 0 0 0 0 1 0 45.0 1.0 (eye position, lookat, up, fovy, aspect)

[image: correctly rendered sphere, camera 0 0 5 0 0 0 0 1 0 45.0 1.0]

But when I change the camera coordinates, the sphere looks distorted, as shown below:

Camera: -2 -2 2 0 0 0 0 1 0 45.0 1.0

[image: distorted, ellipse-like sphere]

I don't understand what is wrong. If someone can help, that would be great!

I set up my image plane as follows:

    // Compute the camera's u,v,w axes as follows:
    {
        Vector a = Normalize(eye - lookat);  // Camera_eye - Camera_lookAt
        Vector b = up;                       // camera up vector
        m_w = a;
        m_u = b.cross(m_w);
        m_u.normalize();
        m_v = m_w.cross(m_u);
    }

After that, I compute a direction for each pixel from the camera position (eye), as follows:

    // Then compute the per-pixel ray directions as follows:
    int half_w = m_width * 0.5;
    int half_h = m_height * 0.5;

    double half_fy = fovy() * 0.5;
    double angle = tan((M_PI * half_fy) / 180.0);

    for (size_t k = 0; k < pixels.size(); k++) {
        double j = pixels[k].x();   // width  (column)
        double i = pixels[k].y();   // height (row)

        double XX = aspect() * angle * ((j - half_w) / (double)half_w);
        double YY =            angle * ((half_h - i) / (double)half_h);

        Vector dir = (m_u * XX + m_v * YY) - m_w;
        directions.push_back(dir);
    }

After that:

    for each dir:
        Ray ray(eye, dir);
        int depth = 0;
        t_color += Trace(g_primitive, ray, depth);
Paul Floyd
sinner

  • Have you seen the answer to [this question](http://stackoverflow.com/questions/14074643/why-does-raytracer-render-spheres-as-ovals)? – jon hanson Jan 08 '13 at 08:36
  • Thanks jon-hanson. I played with my implementation a lot and found that if all three xyz coordinates of my camera are non-zero then the distortion appears, but if any one of the xyz coordinates is zero then it works fine. I think it could be perspective distortion, but I would like to know if there is something else that could be responsible for this problem. – sinner Jan 08 '13 at 12:03
  • Thanks for the link also. I checked the FOV parameter value and it seems fine. I changed it to 30 and 15 respectively but the result is still the same. – sinner Jan 08 '13 at 12:35
  • But I still haven't found the right answer to this question. If anybody knows, that would be great! – sinner Jan 19 '13 at 17:12
  • Try extending your program to render an array (aka grid) of spheres, e.g. 5 from left to right x 5 from bottom to top. The middle sphere would be at the middle of the view. Make each sphere's radius less than half the distance between them. If that doesn't help, post the picture. – jon hanson Jan 19 '13 at 19:10
  • Seems like I was doing everything right. This sphere distortion is caused by perspective. I will answer this question soon and add more examples to close this thread with a conclusion. Thanks a lot @jon-hanson for your help! :) – sinner Jan 23 '13 at 10:10

2 Answers


After playing around a lot, and with the help of your comments, I was able to get my ray tracer working properly. Sorry for the late answer, but I would like to close this thread with a few remarks.

So, the code above is perfectly correct. Based on my own assumptions (as mentioned in the comments above), I decided to set my camera parameters that way.

The problem I described is normal camera behaviour (as also noted in the comments).

I am getting good results now, but there are a few things to check while coding a ray tracer:

  1. Always take care of the radians-to-degrees (or vice versa) conversion while computing the FOV and aspect ratio. I did it as follows:

         double angle = tan((M_PI * 0.5 * fovy) / 180.0);
         double y = angle;
         double x = aspect * angle;

  2. While computing triangle intersections, make sure the cross product is implemented properly.

  3. While intersecting different objects, make sure to keep the intersection that is at the minimum distance from the camera.

Here's the result I got: [image: ray-traced result]

Above is a very simple model (courtesy of UC Berkeley) that I ray traced.

sinner

This is the correct behavior. Get a camera with a wide-angle lens, put the sphere near the edge of the field of view and take a picture. Then, in a photo app, draw a circle on top of the photographed sphere and you will see that its projection is not circular.

This effect will be magnified by the fact that you set aspect to 1.0 but your image is not square.

A few things to fix:

  • A direction vector is (to - from). You have (from - to), so a is pointing backward. You'll want to add m_w at the end rather than subtract it. This fix also rotates m_u and m_v by 180 degrees, which means you will also need to change (j - half_w) to (half_w - j).

  • Also, putting all the pixels and all the directions in lists is less efficient than simply looping over the x,y values.

Paul Floyd
All The Rage
  • Hello Rage, thanks for your comments, but: 1) "Vecter" is my own data type for mathematical vectors. 2) The direction is deliberately set like that for consistency with the OpenGL camera convention (in OpenGL, Z is negative down the viewing axis) and also to keep the image from flipping. 3) And since I consider the origin of the final image to be at the UPPER_LEFT corner, it's `j-half_w` and `half_h-i` – sinner Apr 01 '13 at 08:32