
I am using a GoPro HERO 4 on a drone to capture images that need to be georeferenced. Ideally I need coordinates of the captured image's corners relative to the drone.

I have the camera's:

  1. Altitude
  2. Horizontal and vertical field of view
  3. Rotation in all 3 axes

I have found a couple of solutions but I can't quite translate them to my use case. The closest one I found is here: https://photo.stackexchange.com/questions/56596/how-do-i-calculate-the-ground-footprint-of-an-aerial-camera but I can't figure out whether and how I can use it, particularly when I have to take both pitch and roll into account.

Thanks for any help I get.

Edit: I code my software in Java.

Milan Zelenka
  • What do you mean by the coordinates of the corners? Do you mean in the focal / projection plane of the camera, or the four corners on the ground as in the question? If you mean the latter it's still possible, but it may be more efficient to use a vector-based approach –  Jun 29 '16 at 15:46
  • Yes I mean the four corners on the ground defining a trapezoid. Are you implying I should be looking for the intersections of four "rays" coming from the camera with the ground? – Milan Zelenka Jun 29 '16 at 16:01
  • Yes, you should calculate those and do a standard ray-plane intersection test. If you were to try to do it using trig functions as in the link you gave... well, good luck. –  Jun 29 '16 at 16:06
  • Just making sure - what are your parameters of rotation? Pitch, yaw, roll? –  Jun 29 '16 at 18:23
  • Yes but basically I just need pitch and roll. – Milan Zelenka Jun 29 '16 at 22:02

1 Answer


If you have rotations about all three axes, you can use these matrices - http://planning.cs.uiuc.edu/node102.html - to construct the full (3x3) rotation matrix for your camera.
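Composing the three matrices from that page might look like this in Java (a minimal sketch; the `RotationMatrix` class name is illustrative, and it assumes the common yaw-pitch-roll order R = Rz(yaw)·Ry(pitch)·Rx(roll) used on the linked page, with angles in radians):

```java
// Builds R = Rz(yaw) * Ry(pitch) * Rx(roll), the composed rotation matrix
// from the linked page. All angles are in radians.
public class RotationMatrix {
    public static double[][] fromYawPitchRoll(double yaw, double pitch, double roll) {
        double ca = Math.cos(yaw),   sa = Math.sin(yaw);
        double cb = Math.cos(pitch), sb = Math.sin(pitch);
        double cg = Math.cos(roll),  sg = Math.sin(roll);
        return new double[][] {
            { ca * cb, ca * sb * sg - sa * cg, ca * sb * cg + sa * sg },
            { sa * cb, sa * sb * sg + ca * cg, sa * sb * cg - ca * sg },
            { -sb,     cb * sg,                cb * cg                }
        };
    }

    // Multiplies the 3x3 matrix r by the column vector v.
    public static double[] apply(double[][] r, double[] v) {
        return new double[] {
            r[0][0] * v[0] + r[0][1] * v[1] + r[0][2] * v[2],
            r[1][0] * v[0] + r[1][1] * v[1] + r[1][2] * v[2],
            r[2][0] * v[0] + r[2][1] * v[1] + r[2][2] * v[2]
        };
    }
}
```

With zero angles this returns the identity; a 90° yaw rotates the front axis (1, 0, 0) onto (0, 1, 0).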

Assuming that, when the rotation matrix is the identity (i.e. in the camera's own frame), you have defined the camera's axes as:

  • X axis for front
  • Y for side (left)
  • Z for up

In the camera frame, the rays have directions:

(1, ±tan(FOVh/2), ±tan(FOVv/2))

Calculate these directions and rotate them with the matrix to get the ray directions in world coordinates. Use the camera's real-world position as the ray origin.

To calculate the points on the ground: https://www.cs.princeton.edu/courses/archive/fall00/cs426/lectures/raycast/sld017.htm
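The corner rays and the ray-plane intersection can be sketched as follows (assumed names and structure; the ray directions passed to `groundIntersection` are taken to be already rotated into the world frame, and the ground is the plane z = 0 with the camera at (0, 0, altitude)):

```java
// Corner rays of the camera frustum and their intersection with the
// ground plane z = 0. Axis convention: X front, Y left, Z up.
public class Footprint {
    // Direction of one corner ray in the camera frame.
    // sy, sz in {-1, +1} select the left/right and top/bottom corners.
    static double[] cornerRay(double fovH, double fovV, int sy, int sz) {
        return new double[] { 1.0, sy * Math.tan(fovH / 2), sz * Math.tan(fovV / 2) };
    }

    // Intersects the ray origin + t * dir with the plane z = 0.
    // Returns the (x, y) ground point, or null if the ray points at or
    // above the horizon and never reaches the ground.
    static double[] groundIntersection(double[] origin, double[] dir) {
        if (dir[2] >= 0) return null;
        double t = -origin[2] / dir[2];
        return new double[] { origin[0] + t * dir[0], origin[1] + t * dir[1] };
    }
}
```

For a camera at altitude h looking straight down, the front axis maps to -Z, so a corner ray (1, ±tan(FOVh/2), ±tan(FOVv/2)) becomes (±tan(FOVv/2), ±tan(FOVh/2), -1) in the world frame and hits the ground at t = h, giving the trapezoid corners directly.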

  • Thank you so much! – Milan Zelenka Jun 30 '16 at 08:51
  • @MilanZelenka no problem. and one more thing - you may need to normalize the ray vectors –  Jun 30 '16 at 08:51
  • I have just run some tests to verify the results of my calculations and I have encountered something I am not sure about. I set the camera facing down and the rays have directions: (+-tan(FOVv/2), +-tan(FOVh/2), -1). Now if I am correct the angle between the bottom left and the top right ray should correspond to the camera's diagonal FOV but it doesn't. The camera's specs are FOVh = 64.4, FOVv = 37.2, FOVd = 73.6. The two vectors are (0.336,0.629,-1.0) and (-0.336,-0.629,-1.0). The angle between them is 71 degrees. Am I missing something here? Normalization does not change the results. – Milan Zelenka Jul 08 '16 at 12:58
  • That is to be expected because you are calculating the angle on the slanted side of the frustum, whereas the FOV is the angle of the straight cross-section. They are different, as is apparent from the sine rule. –  Jul 08 '16 at 16:38
  • @MilanZelenka - did you manage to make this work? I'm trying to do the same, and have never worked with matrix rotation, although I have been able to understand and make those initial calculations. What does one do with the rotated results in order to get the footprint ground coordinates, with a known source coordinate, altitude and angle? – ssast Nov 27 '17 at 21:44
  • Yes, I managed to make it work. You can find the source here https://github.com/zelenmi6/thesis/blob/master/src/geometry/CameraCalculator.java Look at the getBoundingPolygon(..) method and let me know if that is enough for you. – Milan Zelenka Nov 28 '17 at 08:39
  • @MilanZelenka Awesome, thank you. I'm working in Python, but think I can follow your example. I'm a little uncertain on how to work with long-lat coordinates as x,y, and the altitude in meters, but I see you set the source at (0, 0, altitude), so perhaps that's where I've been tripping up. – ssast Nov 28 '17 at 18:30
  • I've found I could get this working with my data by first converting my geographic coords to UTM so all units are in meters, then performing the ray rotation and intersection. Another thing I hadn't realized before is that yaw (and the others) are measured counter-clockwise, and should be given in radians, not degrees. – ssast Nov 28 '17 at 20:50
  • I see that CameraCalculator.getBoundingPolygon() uses constants.CameraTests.MAX_DISTANCE, which is set to 100. I assume I should set MAX_DISTANCE to a very large number larger than any altitude that I will use and recompile before using, is that right? – Tomasso Jul 25 '19 at 22:57
  • Hi @ssast, would you mind sharing the Python port of the Java code? – Luigi Pirelli Aug 16 '19 at 07:55
  • For everyone, with thanks to @MilanZelenka for sharing: I prepared a 1-to-1 port of his Java code to Python. You can find it here: https://gist.github.com/luipir/dc33864b53cf6634f9cdd2bce712d3d9 – Luigi Pirelli Aug 24 '19 at 08:26
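The diagonal-FOV question raised in the comments above can be checked numerically: the angle between the two opposite corner rays of a downward-facing camera comes out around 71°, not the 73.6° diagonal FOV from the spec sheet, confirming the explanation that the corner-to-corner angle lies on the slanted diagonal of the frustum rather than a straight cross-section (a sketch; the class name is illustrative):

```java
// Compares the angle between opposite corner rays of a downward-facing
// camera (GoPro HERO4 specs: FOVh = 64.4°, FOVv = 37.2°, FOVd = 73.6°)
// with the spec-sheet diagonal FOV.
public class CornerAngle {
    // Angle between two 3D vectors, in degrees.
    static double angleBetween(double[] a, double[] b) {
        double dot = a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
        double na = Math.sqrt(a[0] * a[0] + a[1] * a[1] + a[2] * a[2]);
        double nb = Math.sqrt(b[0] * b[0] + b[1] * b[1] + b[2] * b[2]);
        return Math.toDegrees(Math.acos(dot / (na * nb)));
    }

    public static void main(String[] args) {
        double ty = Math.tan(Math.toRadians(37.2) / 2); // ~0.336
        double tz = Math.tan(Math.toRadians(64.4) / 2); // ~0.629
        double[] topRight   = {  ty,  tz, -1.0 };
        double[] bottomLeft = { -ty, -tz, -1.0 };
        // Prints roughly 71.0 - smaller than the 73.6° diagonal FOV.
        System.out.println(angleBetween(topRight, bottomLeft));
    }
}
```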