I have a problem with a short function to calculate the midpoint of a line when given the latitude and longitude of the points at each end. To put it simply, it works correctly when the longitude is between -90 and 90 degrees. For the other half of the planet, it returns a result that looks essentially random.
The code is a Python conversion of the JavaScript provided at http://www.movable-type.co.uk/scripts/latlong.html, and appears to conform to the corrected versions here and here. Comparing against those two Stack Overflow versions (I'll admit I don't code in C# or Java), I can't spot where my error is.
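For reference, the midpoint formula from that page, as I read it (φ = latitude, λ = longitude, Δλ = λ2 − λ1, all in radians):

    Bx = cos(φ2) · cos(Δλ)
    By = cos(φ2) · sin(Δλ)
    φm = atan2(sin(φ1) + sin(φ2), √((cos(φ1) + Bx)² + By²))
    λm = λ1 + atan2(By, cos(φ1) + Bx)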
Code is as follows:
#!/usr/bin/python
import math
def midpoint(p1, p2):
    # Points are (latitude, longitude) in degrees.
    lat1, lat2 = math.radians(p1[0]), math.radians(p2[0])
    lon1, lon2 = math.radians(p1[1]), math.radians(p2[1])
    dlon = lon2 - lon1
    dx = math.cos(lat2) * math.cos(dlon)
    dy = math.cos(lat2) * math.sin(dlon)
    lat3 = math.atan2(math.sin(lat1) + math.sin(lat2),
                      math.sqrt((math.cos(lat1) + dx) ** 2 + dy ** 2))
    lon3 = lon1 + math.atan2(dy, math.cos(lat1) + dx)
    return math.degrees(lat3), math.degrees(lon3)
p1 = (6.4, 45)
p2 = (7.3, 43.5)
print "Correct:", midpoint(p1, p2)
p1 = (95.5, 41.4)
p2 = (96.3, 41.8)
print "Wrong:", midpoint(p1, p2)
Any suggestions?