hi!
I've got some weird trouble with creating rays.
Let E be the eye position and L the look-at position. From these and the FOV angles I create two vectors UP and RIGHT, in such a way that the viewing screen is defined by
V = L + a*UP + b*RIGHT, |a|, |b| <= 1.
(UP + E and RIGHT + E span the plane in which the screen lies.)
Now the ray origin is E and the ray direction is V - E.
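In case it helps, here is a minimal sketch of the ray setup described above. The function name, the normalization step, and the choice of UP/RIGHT lengths from tan(fov/2) are my assumptions for illustration, not the poster's actual code.

```python
import numpy as np

def make_ray(E, L, UP, RIGHT, a, b):
    """Build a primary ray for screen coordinates a, b in [-1, 1].

    E, L, UP, RIGHT are 3-vectors as in the post; the normalization
    of the direction is an assumption, not taken from the real code.
    """
    V = L + a * UP + b * RIGHT                         # point on the viewing screen
    direction = V - E                                  # ray from eye through V
    direction = direction / np.linalg.norm(direction)  # unit length
    return E, direction

# Example: eye at (0, 0, -2) looking at the origin, 90-degree FOV.
E = np.array([0.0, 0.0, -2.0])
L = np.array([0.0, 0.0, 0.0])
# half-extent of the screen = |L - E| * tan(fov/2); here 2 * tan(45 deg) = 2
UP    = np.array([0.0, 2.0, 0.0])
RIGHT = np.array([2.0, 0.0, 0.0])

origin, d = make_ray(E, L, UP, RIGHT, 0.0, 0.0)
# the center ray (a = b = 0) should point straight along +Z
```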
I try to raytrace a few spheres lying on the OX axis, with eye position (0, 0, -2), look-at (0, 0, 0), and an FOV big enough to see them all.
What I don't understand is why the spheres get more and more deformed the further they are from (0, 0, 0). It looks as if I were getting a fish-eye-like effect.
Image:
http://img163.imageshack.us/my.php?image=testhi4.png
I am sure that the UP and RIGHT vectors are correct, and the ray-sphere intersection code was written using two different algorithms, both giving the same result.
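For reference, this is the standard quadratic ray-sphere intersection that most implementations boil down to. It is a generic sketch of the usual algorithm, not the poster's code; it assumes the ray direction is normalized.

```python
import numpy as np

def intersect_sphere(origin, direction, center, radius):
    """Standard quadratic ray-sphere test (generic sketch).
    `direction` must be a unit vector.
    Returns the nearest non-negative hit distance t, or None on a miss."""
    oc = origin - center
    b = np.dot(oc, direction)            # half of the usual b coefficient
    c = np.dot(oc, oc) - radius * radius
    disc = b * b - c                     # (half-b) discriminant
    if disc < 0.0:
        return None                      # ray misses the sphere
    sqrt_disc = np.sqrt(disc)
    t = -b - sqrt_disc                   # nearer root first
    if t < 0.0:
        t = -b + sqrt_disc               # origin inside the sphere
    return t if t >= 0.0 else None

# Unit sphere at the origin, ray from (0, 0, -2) straight along +Z:
t = intersect_sphere(np.array([0.0, 0.0, -2.0]),
                     np.array([0.0, 0.0, 1.0]),
                     np.array([0.0, 0.0, 0.0]), 1.0)
# hit point is origin + t * direction = (0, 0, -1)
```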
I think I'm making some kind of fundamental mistake, but when I looked at other people's code it was more or less the same.