I am an absolute beginner in PostgreSQL and PostGIS (and databases in general) but have fairly good working experience in R. I have two multi-polygon datasets of vulnerable areas of India from two different sources: one is around 12 GB in .gdb format (let's call it `mygdb`) and the other is a shapefile of around 2 GB (let's call it `myshp`). I want to compare the two sets of vulnerability maps and generate some state-wise measures of fit using the intersection (I), difference (D), and union (U) of the maps.
I would like to make use of PostGIS functionality (via R), since neither R (it crashes!) nor QGIS (too slow) is efficient at this scale. To start, I have uploaded both datasets into my PostGIS database. I used `ogr2ogr`, called from R, to upload `mygdb`; the call looked roughly like the sketch below.
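For reference, this is approximately what I ran (the connection details, paths, and database name here are placeholders, not my real ones):

```r
# Call ogr2ogr from R to push the file geodatabase into PostGIS.
# host/dbname/user and the .gdb path are placeholders.
# -nlt PROMOTE_TO_MULTI forces consistent MultiPolygon geometries;
# -lco GEOMETRY_NAME=geom names the geometry column "geom".
system(paste(
  "ogr2ogr -f PostgreSQL",
  shQuote("PG:host=localhost dbname=vuln user=postgres"),
  shQuote("path/to/mygdb.gdb"),
  "-nlt PROMOTE_TO_MULTI -lco GEOMETRY_NAME=geom -progress"
))
```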
But I am stuck at this point. My idea is to split both polygon layers by state and then apply other functions to get I, U, and D. From my search, I think I can use `sf` functions like `st_split`, `st_intersection`, `st_difference`, and `st_union`. However, even after splitting, I imagine the file sizes will still be too large for R to process, so my questions are:
- Is my approach the best way forward?
- How can I use `sf::st_` functions (e.g. `st_split`, `st_intersection`) without importing the data from the database into R? (My tentative attempt is sketched after this list.)
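For what it's worth, my current (possibly naive) understanding is that I could push the geometry work to PostGIS and pull back only a small summary table. Something like this, where the table and column names are placeholders and I am assuming the first table has a `state` column:

```r
library(DBI)
library(RPostgres)

# Placeholder connection details.
con <- dbConnect(Postgres(), dbname = "vuln",
                 host = "localhost", user = "postgres")

# Compute the state-wise intersection area entirely inside PostGIS;
# only the small summary table comes back to R.
i_by_state <- dbGetQuery(con, "
  SELECT a.state,
         SUM(ST_Area(ST_Intersection(a.geom, b.geom))) AS i_area
  FROM   mygdb a
  JOIN   myshp b
    ON   ST_Intersects(a.geom, b.geom)
  GROUP  BY a.state;
")

dbDisconnect(con)
```

Is this the right pattern, and would the same idea carry over to the difference and union?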
There are some useful answers to previous related questions, like this one for example, but I find it hard to put the steps together from the different links, and any help with a dummy example would be great. Many thanks in advance.