What I'm trying to build is a map that supports millions of markers. I have worked on everything from clustering markers to loading only the visible markers using geoquery.atLocation, and here is my problem: GeoFire takes a long time to search through millions of records, when the app doesn't crash outright. I could change the structure of the database to resolve the problem, but that would create a lot of issues for me. So is there any trick that makes GeoFire search fast enough in a very large database to support millions of markers?
- You may use Firebase Analytics or run ad hoc queries in the BigQuery console. – jagath Sep 18 '18 at 23:37
- Or you can export the data to Google Cloud and do the analytics locally too, if you have a commodity cluster. – jagath Sep 18 '18 at 23:38
1 Answer
You need to add .indexOn for field g to the Realtime Database rules:
{
  "rules": {
    ...
    "locations": {
      ".indexOn": "g"
    }
  }
}
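For context, GeoFire stores each entry as a geohash under a g child and the coordinates under an l child, which is why the index goes on g. Below is a minimal sketch of writing one marker this way, assuming the Android GeoFire library and a locations node; the key and coordinates are purely illustrative:

import com.firebase.geofire.GeoFire;
import com.firebase.geofire.GeoLocation;
import com.google.firebase.database.DatabaseReference;
import com.google.firebase.database.FirebaseDatabase;

public class MarkerWriter {
    public static void main(String[] args) {
        // Reference to the node covered by the ".indexOn": "g" rule above.
        DatabaseReference ref = FirebaseDatabase.getInstance().getReference("locations");
        GeoFire geoFire = new GeoFire(ref);

        // GeoFire writes { "g": "<geohash>", "l": [lat, lng] } under the key;
        // the index on "g" lets the server filter by geohash range instead of
        // scanning every child of "locations".
        geoFire.setLocation("marker_123", new GeoLocation(37.7832, -122.4056),
                (key, error) -> {
                    if (error != null) {
                        System.err.println("Write failed: " + error.getMessage());
                    }
                });
    }
}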
When running a GeoQuery with a circular radius, this loads a little more data than querying with rectangular boundaries would, but since rectangular boundaries are not supported, this is not really avoidable.
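On the query side, here is a rough sketch under the same assumptions (Android GeoFire library, a locations node, illustrative coordinates), keeping the radius tied to the visible map region so that only nearby keys are streamed rather than the whole data set:

import com.firebase.geofire.GeoFire;
import com.firebase.geofire.GeoLocation;
import com.firebase.geofire.GeoQuery;
import com.firebase.geofire.GeoQueryEventListener;
import com.google.firebase.database.DatabaseError;
import com.google.firebase.database.DatabaseReference;
import com.google.firebase.database.FirebaseDatabase;

public class VisibleMarkersQuery {
    public static void main(String[] args) {
        DatabaseReference ref = FirebaseDatabase.getInstance().getReference("locations");
        GeoFire geoFire = new GeoFire(ref);

        // Query a circular area; GeoFire covers it with geohash ranges, so a few
        // keys just outside the circle may be reported as well.
        GeoQuery geoQuery = geoFire.queryAtLocation(
                new GeoLocation(37.7832, -122.4056), 1.0); // radius in km

        geoQuery.addGeoQueryEventListener(new GeoQueryEventListener() {
            @Override
            public void onKeyEntered(String key, GeoLocation location) {
                // Add a marker (or feed a clusterer) for this key.
            }

            @Override
            public void onKeyExited(String key) {
                // Remove the marker when it leaves the queried area.
            }

            @Override
            public void onKeyMoved(String key, GeoLocation location) {
                // Update the marker position.
            }

            @Override
            public void onGeoQueryReady() {
                // The initial set of keys has been loaded.
            }

            @Override
            public void onGeoQueryError(DatabaseError error) {
                System.err.println("GeoQuery error: " + error.getMessage());
            }
        });

        // When the camera moves, reuse the same query instead of creating a new one:
        // geoQuery.setCenter(newCenter);
        // geoQuery.setRadius(newRadiusKm);
    }
}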

Martin Zeitler