Disclaimer for this question:
- This will probably be a case of the XY problem. I'd love to hear suggestions for either X or Y.
- I have a BS in CS, but I've forgotten much of my knowledge of linear algebra, so even rediscovering what to google would be helpful.
The main problem: I want to create a program where I can select several (x, y) coordinates on an image and enter the corresponding (theta, phi) turret angles for each of those points. How would I use these point pairs to generate a nonlinear transformation or function, so that any future (x, y) input produces a (theta, phi) output? Is this mathematically possible? Is there some magic Python library I haven't heard of that will do it for me?
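To make it concrete, here is a rough sketch of the kind of thing I'm imagining, assuming something like scipy's scattered-data interpolation (RBFInterpolator) is even the right kind of tool; the calibration numbers below are completely made up:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Calibration data: pixel coordinates I clicked on, and the turret angles
# (degrees) that actually hit those spots. These values are placeholders.
xy_points = np.array([
    [100, 200],
    [400, 210],
    [250, 350],
    [120, 400],
    [450, 380],
], dtype=float)
theta_phi = np.array([
    [-20.0,  5.0],
    [ 15.0,  4.0],
    [ -2.0, 18.0],
    [-18.0, 25.0],
    [ 17.0, 24.0],
])

# Fit one interpolator mapping (x, y) -> (theta, phi).
# RBFInterpolator accepts vector-valued outputs, so both angles fit at once.
aim = RBFInterpolator(xy_points, theta_phi, smoothing=0.0)

# Any future pixel position from the cat detector:
theta, phi = aim(np.array([[300.0, 300.0]]))[0]
print(theta, phi)
```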
For context/the X problem: I'm building a turret that detects my cats on the counter and sprays them with water. The camera position is static because I'd like to set a simple "No-Go" polygon for the cats and don't want to deal with room tracking. Cat detection, the robotics, and nearly everything else is already done. Problem Y comes in because I was thinking I could "calibrate" the turret with a target: pick several points on the image, note their (x, y) positions, and then note the corresponding (theta, phi) values at which the turret hits the target. Then, using whatever function that produces, I could pick any position in the image and have the turret hit roughly the right spot, hopefully. I understand this would be finicky in areas with large depth variations, such as the edge of the counter, but I'd hope that a few extra calibration points around those areas would make it functional enough. I've also considered using multiple cameras to generate a depth map, or using range finders, instead of this approach. I'm open to suggestions for problem X or Y.
I've looked into generating nonlinear functions/matrices from a given list of coordinate pairs, but I feel I'm lacking the terminology to find what I'm actually looking for.
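For example, the only thing I can think of is fitting a low-order polynomial with least squares, something like the sketch below (again with made-up calibration data), but I have no idea whether that's a standard approach or what the technique is called:

```python
import numpy as np

# Made-up calibration pairs: pixel coords and the turret angles that hit them.
xy = np.array([
    [100, 200], [400, 210], [250, 350], [120, 400], [450, 380], [300, 150],
], dtype=float)
angles = np.array([
    [-20.0, 5.0], [15.0, 4.0], [-2.0, 18.0], [-18.0, 25.0], [17.0, 24.0], [0.0, 1.0],
])

def design_matrix(pts):
    """Quadratic terms in pixel coordinates: 1, x, y, x^2, x*y, y^2."""
    x, y = pts[:, 0], pts[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])

# Least-squares fit of coefficients mapping (x, y) -> (theta, phi).
coeffs, *_ = np.linalg.lstsq(design_matrix(xy), angles, rcond=None)

# Predict angles for a new pixel position.
theta, phi = (design_matrix(np.array([[300.0, 300.0]])) @ coeffs)[0]
print(theta, phi)
```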