
I'm using OpenCV to do gaze estimation of a person. Currently I have a gaze vector for each eye and the coordinates of each eye's center, all in 3D world coordinates.

I want to figure out approximately where the user is looking on the screen. I've tried a few approaches but the results have been inaccurate. How would I go about doing this?

user3543300
  • Do you have the coordinates of the screen? – user253751 Apr 05 '16 at 23:58
  • No, how do I get those? All I have are the 2D coordinates from my webcam video, which is 640x480. – user3543300 Apr 05 '16 at 23:59
  • I don't know how you can get those. But you could imagine that the screen might be above, below, or to the side of the webcam. And the screen might be different sizes too. That's why you need the 3D coordinates of it. – user253751 Apr 06 '16 at 00:08
  • You could try to estimate them using some calibration process -- i.e. have the user look at each of the 4 corners of the screen, or some image you display at various locations. – Dan Mašek Apr 06 '16 at 02:40
  • How would the user looking at 4 corners of the screen help me calibrate that? – user3543300 Apr 06 '16 at 03:13
  • You said you have the vector of the user's gaze and the position of the eye in 3D world coordinates. Project the vectors of the user looking at one corner from several positions, and find where the projected lines are closest together (in the ideal case they would intersect). That should give you the approximate 3D world coordinates of the screen (a rough sketch of this step is below). – Dan Mašek Apr 06 '16 at 05:06
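
A minimal numeric sketch of the calibration step Dan Mašek describes: for one screen corner, collect several gaze rays (eye center + gaze direction) taken from different head positions, then solve a small least-squares system for the point closest to all of the rays. The function name `closest_point_to_rays` and the sample coordinates are made up for illustration; only NumPy is assumed.

```python
import numpy as np

def closest_point_to_rays(origins, directions):
    """Least-squares 3D point nearest to a set of rays.

    origins:    (N, 3) ray start points (eye centers)
    directions: (N, 3) ray directions (gaze vectors), any length
    Returns the point minimizing the summed squared perpendicular
    distance to all rays.
    """
    origins = np.asarray(origins, dtype=float)
    dirs = np.asarray(directions, dtype=float)
    dirs = dirs / np.linalg.norm(dirs, axis=1, keepdims=True)

    # For each ray, (I - d d^T) projects onto the plane perpendicular
    # to its direction. Summing the normal equations over all rays
    # gives a 3x3 system A p = b.
    A = np.zeros((3, 3))
    b = np.zeros(3)
    I = np.eye(3)
    for o, d in zip(origins, dirs):
        P = I - np.outer(d, d)
        A += P
        b += P @ o
    return np.linalg.lstsq(A, b, rcond=None)[0]

# Hypothetical gaze rays recorded while the user looked at one screen
# corner from three different head positions (made-up numbers).
eye_centers = [[0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [-0.1, 0.05, 0.0]]
gaze_vectors = [[0.3, -0.2, 1.0], [0.2, -0.2, 1.0], [0.4, -0.22, 1.0]]
corner = closest_point_to_rays(eye_centers, gaze_vectors)
print("Estimated corner position:", corner)
```

Repeating this for all four corners would give an estimate of the screen plane in world coordinates; after that, the on-screen gaze point could be approximated by intersecting each gaze ray with that plane.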

0 Answers