I'll try to be as specific as I can.
I'm developing an Augmented Reality application, and I need to figure out the best range within which to show markers. For usability reasons I've set the maximum at 5 km. I would like to find the radius that contains the densest group of markers closest to my position.
For example, if there are 20 markers within 100 meters of me and another 5 markers about 1 km beyond those, I want to see the 20 markers within 100 meters in detail, so the radius should be set to 100 m. If instead I only have 5 markers between 3 and 4 km away, I want the radius set to 4 km.
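To make the expected behaviour concrete, here is a minimal sketch in Python of the input I have and the output I'm after (the variable names and exact distance values are just illustrative):

```python
# Hard usability limit: never show anything beyond 5 km.
MAX_RADIUS = 5000

# Scenario A: 20 markers within 100 m, plus 5 more about 1 km beyond them.
scenario_a = [5 * i for i in range(1, 21)] + [1100, 1150, 1200, 1250, 1300]
# Desired radius: 100 m (show the dense group of 20 in detail).

# Scenario B: only 5 markers, all between 3 and 4 km away.
scenario_b = [3100, 3300, 3500, 3700, 3900]
# Desired radius: 4000 m (otherwise nothing would be shown at all).
```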
What kind of algorithm or mathematical method (clustering? centroid? histogram peaks?) would you recommend for this purpose? I know the distance of each marker from my position.
EDIT: I'll try to be more specific by adding some info.
USE CASE Let's say the markers represent the user's friends, and the user is in an office in the morning. He will surely have a high density of markers/friends around him, and will therefore be more interested in seeing those in detail (say, within 1 km) rather than the ones at the train station (say, within a radius of 5 km). Suppose instead that the user is still in the office, but in the evening: there won't be many markers/friends within a 1 km radius as before, but he may be interested in seeing those within 5 km, to know who is leaving, who might be coming to pick him up, etc.
WHAT I'VE TRIED I divide the distances into bands of X meters, count how many markers fall into each band, find the band containing the maximum number of markers, and set the radius to that band's outer edge. But this way I lose all the markers in the neighboring bands just beyond the one chosen as the radius, even though they belong to the same "density" and are still of interest. I don't really know how to fix this.
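For reference, this is roughly what my current attempt looks like (a Python sketch; the band width and the helper name pick_radius_by_band are my own choices, not from any library):

```python
from collections import Counter

MAX_RADIUS = 5000   # usability limit: ignore anything beyond 5 km
BAND_WIDTH = 100    # X meters per band (arbitrary choice)

def pick_radius_by_band(distances, band_width=BAND_WIDTH, max_radius=MAX_RADIUS):
    """Histogram the marker distances into fixed-width bands and return
    the outer edge of the most populated band as the display radius."""
    counts = Counter()
    for d in distances:
        if d <= max_radius:
            counts[int(d // band_width)] += 1
    if not counts:
        return max_radius
    # Band with the most markers (ties broken in favor of the nearer band).
    best_band = min(counts, key=lambda b: (-counts[b], b))
    return (best_band + 1) * band_width

# Example: 20 markers within 100 m and 5 markers around 3.5 km away.
distances = [5 * i for i in range(1, 21)] + [3400, 3450, 3500, 3550, 3600]
print(pick_radius_by_band(distances))  # -> 100
```

The failure I described shows up whenever the dense group straddles the edge of the winning band (e.g. markers spread between 90 and 110 m): everything that falls just outside that band is cut off even though it belongs to the same cluster.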