My task is to calculate the distance between a rectangle and the origin (0, 0) of a coordinate system (1 unit = 1 meter) and print a code based on that distance: if the rectangle is nearer than 100 m, print "100m"; if the distance is less than 200 m, print "101m"; and so on.
I've learned that I can use the Pythagorean theorem to get the distance between two points. I implemented this in my Python program, but I'm having trouble with the output.
Let's try an example. A rectangle with the corners (–400, 200); (–300, 200); (–300, 300); (–400, 300) is about 360 m away from the origin, because its nearest corner is (–300, 200). The correct output would be "103m".
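I haven't pasted my whole program, but the distance part looks roughly like the sketch below (the name closest_distance_to_origin and the clamping approach are just how I'd sketch it here; my real code may differ slightly):

    import math

    def closest_distance_to_origin(corners):
        # Distance from the origin (0, 0) to the nearest point of an
        # axis-aligned rectangle given by its four corners.
        xs = [x for x, _ in corners]
        ys = [y for _, y in corners]
        # Clamp the origin onto the rectangle; the clamped point is the
        # closest point of the rectangle to the origin.
        nearest_x = min(max(0, min(xs)), max(xs))
        nearest_y = min(max(0, min(ys)), max(ys))
        return math.hypot(nearest_x, nearest_y)

    rect = [(-400, 200), (-300, 200), (-300, 300), (-400, 300)]
    print(closest_distance_to_origin(rect))  # ~360.56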
Somebody asked something similar before, and the answer was to floor-divide the distance by 100 and insert the result into the string "10{}":
print("10{}m".format(distance//100))
Actually, this only works for distances below 1000 m. If the coordinates were (–4000, 2000); (–3000, 2000); (–3000, 3000); (–4000, 3000), the distance would be about 3605 m and the output should be "136m", but the line above prints "1036m" instead.
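To make the failure concrete, this is what that formatting line prints for both example distances (a quick sketch; I truncate the distance to an int before dividing so the digit comes out without a decimal point):

    for distance in (360.55, 3605.55):
        print("10{}m".format(int(distance) // 100))
    # prints 103m  -- correct for the first rectangle
    # prints 1036m -- but the second one should be 136m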
I hope my question is clear!