
I have a large table of swell period values, each associated with a lat, long, and date/time (lat, long, and date are separate columns). I have a smaller table of exact locations, with lat and long in separate columns, for which I want to extract the swell period at each date/time stamp. I am stumped on how to do this. I have provided samples of my code and tables below. My attempt only pulls out 6 of the 20 locations, even though I am confident there are matching lats and longs.

(screenshot: full spatial data table)

(screenshot: specific locations table)


library(raster)     # rasterToPoints()
library(dplyr)
library(tidyr)      # pivot_longer()
library(lubridate)  # as_datetime()
library(sf)

# target locations, rounded to 2 decimal places to match the rounded raster coordinates
x <- round(c(-121.110423, -120.562848, -120.057825, -119.717691, -119.439510, -119.041160, -118.586216,
             -118.507400, -118.087338, -117.662998, -117.437052, -117.335509, -118.160157, -119.059945,
             -119.943535, -120.515873, -119.070552, -118.295248, -119.471023, -120.584913), 2)

y <- round(c(34.560459, 34.253113, 34.348268, 34.147373, 34.256965, 33.989257, 33.913609, 33.699766, 33.576926,
             33.272809, 32.922927, 32.535688, 32.606003, 32.500511, 32.644805, 33.138259, 33.135506, 33.102819,
             33.696946, 33.703024), 2)

ta_loc <- data.frame(x, y)


# raster -> long table: one row per cell per timestamp
Ta_data <- as.data.frame(rasterToPoints(final_ras_ta)) %>%
  pivot_longer(!c(x, y), names_to = "date", values_to = "period") %>%
  mutate(date = gsub("X", "", date),   # layer names look like "X<epoch seconds>"
         date = as.numeric(date),
         date = as_datetime(date),
         x = round(x, 2),              # round cell coordinates to match ta_loc
         y = round(y, 2)) %>%
  filter(x %in% ta_loc$x,              # keep cells whose x matches some target x
         y %in% ta_loc$y)              # and whose y matches some target y
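
One thing worth checking here: `filter(x %in% ta_loc$x, y %in% ta_loc$y)` tests x and y independently, so a cell can be kept when its x matches one target and its y matches a different one, while a target is only recovered if both of its rounded coordinates land exactly on a cell. A quick paired diagnostic, assuming `Ta_data` is the long table built above:

# of the 20 target coordinate pairs, how many appear as an exact (x, y)
# pair among the rounded raster cell coordinates?
ta_loc %>%
  semi_join(distinct(Ta_data, x, y), by = c("x", "y")) %>%
  nrow()

If this returns 6, it is the exact-match-after-rounding step, not the later sf step, that drops the other locations.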
  
# raster cells as sf points
a <- Ta_data %>%
  st_as_sf(coords = c("x", "y"), crs = 4326)

# target locations as sf points
b <- ta_loc %>%
  st_as_sf(coords = c("x", "y"), crs = 4326)

# attempt: for each date, keep the raster points whose geometry equals a target geometry
loc <- a %>%
  group_by(date) %>%
  filter(geometry == b$geometry)
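
As an aside, a minimal sketch of pulling the values straight from the raster instead of matching coordinates after rasterToPoints(). This assumes final_ras_ta is a multi-layer raster with one layer per timestamp; pts, vals, and ta_extract are illustrative names:

pts  <- as.matrix(ta_loc[, c("x", "y")])   # lon/lat, same CRS as the raster
vals <- raster::extract(final_ras_ta, pts) # one row per location, one column per layer

ta_extract <- cbind(ta_loc, vals) %>%
  pivot_longer(!c(x, y), names_to = "date", values_to = "period") %>%
  mutate(date = as_datetime(as.numeric(gsub("X", "", date))))

Because extract() looks up the cell containing each point, no rounding or exact coordinate equality is needed.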
  
  • Are you `round`ing x & y to anonymize exact locations? Anyway, effects of rounding might account for that 6 of 20 result. – Chris Apr 08 '22 at 20:59
  • You may want to check out the "fuzzyjoin" package. It has a geojoin function that might do the trick here. – Dave2e Apr 08 '22 at 21:23
  • Please do not post (only) an image of code/data/errors: it breaks screen-readers and it cannot be copied or searched (ref: https://meta.stackoverflow.com/a/285557 and https://xkcd.com/2116/). Please include the code, console output, or data (e.g., `data.frame(...)` or the output from `dput(head(x))`) directly. – r2evans Apr 08 '22 at 21:24
  • I ran your code for `Ta_data` and I got this error message: `Error in match.arg(method) : 'arg' must be NULL or a character vector` – Abdur Rohman Apr 09 '22 at 00:58
  • @Chris I am rounding because there are no exact matches to the values with that many decimal places. After I rounded both data sets I manually checked each coordinate value and they were present. Yet when I string it together it only pulls 6 locations ``` check <- Ta_data %>% filter(x == -120.56, y == 34.25) ``` – Eizy Apr 10 '22 at 18:05
  • @Dave2e So it seems I can use one of the join options and specify not needing an exact match? – Eizy Apr 10 '22 at 18:06
  • I'm assuming your sampling stations are reasonably spaced, but, transform both to some common equal area, buffer sampling stations, then intersect samples with stations...instead of rounding at the beginning. – Chris Apr 10 '22 at 19:06
  • @Chris no they are not evenly spaced. They are specific locations where I want to display swell period on a heat map as "text". – Eizy Apr 11 '22 at 01:03
  • Not evenness, just sufficiently spaced that a reasonable buffer could be applied to all, but can be done individually. – Chris Apr 11 '22 at 01:17
  • Your question is too open-ended to provide a complete answer and it lacks example data. How would you like to handle the one-to-many join between table 2 and table 1? Yes, the fuzzyjoin package does not need an exact match. Or see this question for another approach of finding the closest point between 2 data frames (a rough sketch of that nearest-point idea follows these comments): https://stackoverflow.com/questions/47997906/set-static-centers-for-kmeans-in-r/48000184#48000184 – Dave2e Apr 11 '22 at 02:51
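
A rough sketch of that nearest-point idea, assuming Ta_data is the long table built above (ideally without the `%in%` filter), both layers are in lon/lat (EPSG:4326), and the packages loaded earlier are available; cells, idx, cell_x, cell_y, and loc_period are illustrative names:

# one point per raster cell (every timestamp shares the same coordinates)
cells    <- distinct(Ta_data, x, y)
cells_sf <- st_as_sf(cells,  coords = c("x", "y"), crs = 4326)
locs_sf  <- st_as_sf(ta_loc, coords = c("x", "y"), crs = 4326)

# index of the nearest raster cell for each of the 20 target locations
idx <- st_nearest_feature(locs_sf, cells_sf)

# attach the matched cell coordinates, then pull every timestamp for that cell
loc_period <- ta_loc %>%
  mutate(cell_x = cells$x[idx],
         cell_y = cells$y[idx]) %>%
  left_join(Ta_data, by = c("cell_x" = "x", "cell_y" = "y"))

The fuzzyjoin geo join suggested above is a distance-threshold alternative: it joins rows whose coordinates fall within a chosen maximum distance instead of taking the single nearest cell.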

0 Answers