If an approximate solution is okay, you could try a simple optimization algorithm. Here's an example in Python:
import random

def opt(*points):
    # Random local search: repeatedly perturb the current best point and
    # keep the perturbation whenever it lowers the maximum Manhattan distance.
    best, dist = (0, 0), float('inf')
    for i in range(10000):
        # Propose a nearby candidate via Gaussian noise.
        new = best[0] + random.gauss(0, .5), best[1] + random.gauss(0, .5)
        # Maximum Manhattan distance from the candidate to all given points.
        dist_new = max(abs(new[0] - qx) + abs(new[1] - qy) for qx, qy in points)
        if dist_new < dist:
            best, dist = new, dist_new
            print(new, dist_new)  # log each improvement
    return best, dist
Explanation: We start with the point (0, 0), or any other random point, and perturb it ten thousand times, each time keeping the better of the new point and the previous best. Gradually, this approaches the optimum.
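For example, running it on the three points from the counter-example below should home in on the optimum (0, 10):

print(opt((0, 0), (0, 20), (10, 10)))  # roughly ((0, 10), 10) after 10000 iterations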
Note that simply picking the mean or median of the three points, or solving for x and y independently, does not work when minimizing the maximum Manhattan distance. Counter-examples: consider the points (0,0), (0,20) and (10,10), or (0,0), (0,1) and (0,100). Solving for x and y independently (taking the midpoint of the extremes in each coordinate) yields (5,10) for the first example, whose maximum Manhattan distance is 15, while the optimum (0,10) achieves 10. Taking the median yields (0,1) for the second example, with maximum distance 99 versus 50 at the optimum (0,50).
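You can check these numbers with a tiny helper (the name max_manhattan is mine) that evaluates a candidate center against the first example:

def max_manhattan(c, points):
    # Largest Manhattan distance from candidate c to any of the points.
    return max(abs(c[0] - qx) + abs(c[1] - qy) for qx, qy in points)

pts = [(0, 0), (0, 20), (10, 10)]
print(max_manhattan((5, 10), pts))  # 15 -- naive independent x/y solution
print(max_manhattan((0, 10), pts))  # 10 -- the true optimum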
Update: Looks like solving for x and y independently and taking the mean of the most distant points does in fact work, provided that one does some pre- and postprocessing, as pointed out by thiton.
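For completeness, here's a sketch of what I assume that pre- and postprocessing to be: rotating the plane by 45 degrees via (u, v) = (x + y, x - y) turns Manhattan distance into Chebyshev (max-coordinate) distance, where the coordinates really are independent; rotating back gives the exact optimum:

def opt_exact(*points):
    # Rotate 45 degrees: Manhattan distance in (x, y) equals
    # Chebyshev distance in (u, v) = (x + y, x - y).
    us = [x + y for x, y in points]
    vs = [x - y for x, y in points]
    # In Chebyshev space the coordinates are independent: the center of
    # each coordinate's range minimizes the maximum distance along it.
    u = (min(us) + max(us)) / 2.0
    v = (min(vs) + max(vs)) / 2.0
    # The optimal value is the larger of the two half-ranges.
    dist = max(max(us) - min(us), max(vs) - min(vs)) / 2.0
    # Rotate back to the original coordinates.
    return ((u + v) / 2.0, (u - v) / 2.0), dist

On the first counter-example this returns ((0.0, 10.0), 10.0), matching the result of the approximate search.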