
I'm using the Python API for Google Earth Engine and am working with a very large FeatureCollection (roughly 1.8 billion features). I am using

ee.FeatureCollection(buildings.toList(chunk,i))

to break the collection into chunks and loop over them, but as I get further into the dataset this process becomes extremely slow (roughly 10x slower, and I'm only about 100 million features in). I'm wondering how I can speed this up. The documentation for ee.FeatureCollection.toList(chunk, i) says i is "The number of elements to discard from the start. If set, (i + chunk) elements will be fetched and the first offset elements will be discarded", which means it's still pulling all of the unnecessary data before discarding it. Is there another way to chunk a FeatureCollection that doesn't do this?
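To illustrate why this slows down, here is a toy simulation in plain Python (not the Earth Engine API; the function name `fetched_per_chunk` is made up for illustration). It models the cost the documentation describes: each `toList(chunk, offset)` call fetches `offset + chunk` elements and discards the first `offset`, so the per-call cost grows with the offset and the total work grows roughly quadratically rather than linearly.

```python
# Toy model (plain Python, no Earth Engine): the fetch cost of offset-based
# chunking, where reading a chunk at `offset` requires fetching
# (offset + chunk) elements and discarding the first `offset` of them.

def fetched_per_chunk(total, chunk):
    """Elements fetched for each toList(chunk, offset) call over the collection."""
    return [min(offset + chunk, total) for offset in range(0, total, chunk)]

total, chunk = 1_000, 100
costs = fetched_per_chunk(total, chunk)
print(costs)       # each successive chunk fetches more than the last
print(sum(costs))  # total elements fetched grows ~quadratically in `total`
```

With 10 chunks of 100 over 1,000 features, the calls fetch 100, 200, ..., 1,000 elements, for 5,500 fetched in total instead of the 1,000 you actually keep; at 1.8 billion features the overhead becomes enormous, which matches the slowdown described above.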

Thank you!

0 Answers