I am looking to do this for a community sublet advertising website, but theoretically the algorithm would be similar for any local search.
The less populated the searched area, the higher the default search radius should be. On the other hand, areas with high population density should have a low default radius to keep results locally relevant.
This might be more of a mathematical question than a programming one, but code is very welcome. So far I have calculated the number of sublets within 15 miles of each town or village and saved this in the database as an approximation of density. I intend to use this number to figure out how far to extend the search when someone searches for that town or village.
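For the precomputed density step, one straightforward (if brute-force) approach is to count sublets within 15 miles using the haversine great-circle distance. A minimal sketch, assuming towns and sublets are plain dicts with `lat`/`lon` keys (the data layout is an assumption, not from the question):

```python
import math

EARTH_RADIUS_MILES = 3959.0

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * EARTH_RADIUS_MILES * math.asin(math.sqrt(a))

def density_for(town, sublets, radius=15.0):
    """Count sublets within `radius` miles of the town's coordinates."""
    return sum(
        1 for s in sublets
        if haversine_miles(town["lat"], town["lon"], s["lat"], s["lon"]) <= radius
    )
```

In practice a spatial index (e.g. PostGIS or a geohash column) would avoid scanning every sublet per town, but for a periodic batch job over a modest dataset the linear scan may be fine.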
To test any proposed solution, I pulled out some approximate numbers I would want the algorithm to come up with. If there are a lot of sublets within 15 miles of a point, say 30k, I would want the default search radius to be around 3 miles. If there are very few, say 1 or 2, the default radius should go up to 25 miles, or even more if there are no places around at all. A mid-range area with ~1k sublets would have a default radius of 15 miles. These are just examples; the density will of course grow or shrink with the number of things in the database.
Population -> Default search radius
0 -> very high (~60 miles or more)
1 -> 25 miles
1k -> 15 miles
30k -> 3 miles
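One way to hit those sample targets exactly is to treat them as anchor points and interpolate between them in log space, since the counts span several orders of magnitude. A rough Python sketch (the anchor values are just the examples above and would be tuned as the database grows):

```python
import math

# (sublet count, desired default radius in miles) - taken from the
# example table in the question; these are tunable assumptions.
ANCHORS = [(0, 60.0), (1, 25.0), (1000, 15.0), (30000, 3.0)]

def default_radius(count):
    """Piecewise-linear interpolation of radius in log(count + 1) space."""
    if count <= ANCHORS[0][0]:
        return ANCHORS[0][1]          # empty area: widest radius
    if count >= ANCHORS[-1][0]:
        return ANCHORS[-1][1]         # very dense area: tightest radius
    x = math.log(count + 1)
    pts = [(math.log(c + 1), r) for c, r in ANCHORS]
    for (x0, r0), (x1, r1) in zip(pts, pts[1:]):
        if x0 <= x <= x1:
            t = (x - x0) / (x1 - x0)  # position within this segment
            return r0 + t * (r1 - r0)
```

Interpolating in log space means the radius shrinks quickly over the first few hundred sublets and more gently after that, which matches the shape of the table; a single fitted curve (e.g. `radius = a - b * log(count + 1)`) would be smoother but would not pass through all four anchor points.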
Am I going in the right direction? Python or PHP would be preferred for code-centric answers.
Thank you