
I am designing a system to capture images of a specific object (with known dimensions) from different distances, so that the object has a specific size in pixels in each image.

I am looking for a solution that gives me the distance when I provide the object's size in pixels in the captured image.

Although the easiest workaround would be to take the image from a farther or closer distance and then crop, down-sample, or up-sample it, I need a guideline for the exact distance that gives me the exact size in pixels in the image.

To put it another way: I take a picture of an object (with size W,H) at a distance of X cm from the camera, and the object's size in the image is w,h. If I want the same object to appear at size w1,h1, what should the distance X1 between the object and the camera be? For simplicity, I assume the object is parallel to the image plane, so it can be treated as a 2D object.

  • Not sure I understand the question. Is there a problem with the obvious answer `X1 = X * w1/w`? – Dima Chubarov Jun 18 '15 at 14:39
  • So you mean there is a linear relationship between the distance and the size in pixels? In other words, if we double the distance, the size doubles? I would guess there is an inverse relationship between them: when the object gets closer, its size gets larger. – ard24ie Jun 18 '15 at 14:44
  • Thank you for the reply. I surely mixed things up: the pixel size is inversely proportional to the distance. Let f be the focal length; then W * f = X * w, therefore X1 = X * w/w1 (see the sketch below). See also this [post in Photography SE](http://photo.stackexchange.com/q/48218) – Dima Chubarov Jun 20 '15 at 10:26
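
Following the relation in the last comment (the pinhole model gives W * f = X * w, so the product of distance and pixel size is constant for a fixed object and focal length), here is a minimal sketch of the distance computation in Python. The function name `distance_for_pixel_size` and the numbers in the example are illustrative, not part of the original question.

```python
def distance_for_pixel_size(X, w, w1):
    """Pinhole-camera model: W * f = X * w is constant for a fixed object
    and focal length, so the distance scales inversely with the pixel size."""
    return X * w / w1

# Example (illustrative numbers): the object is 120 px wide at 50 cm;
# to make it 240 px wide, move the camera to 25 cm.
print(distance_for_pixel_size(50, 120, 240))  # -> 25.0
```

The same scaling applies to the height: as long as the object stays parallel to the image plane, w and h scale by the same factor, so either dimension can be used.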

0 Answers