I am trying to use forecast reconciliation in fable to improve forecasts at the lower, more intermittent levels of a hierarchy. However, my computer runs out of memory for anything but trivial examples.
I am basing my analysis on example code from the presentation "Tidy Time Series & Forecasting in R: 10. Forecast Reconciliation" (bit.ly/fable2020, presented at rstudio::conf 2020):
tourism %>%
aggregate_key(Purpose * (State / Region), Trips = sum(Trips)) %>%
model(ets = ETS(Trips)) %>%
reconcile(ets_adjusted = min_trace(ets)) %>%
forecast(h = 2)
This runs fine, even on my 8 GB RAM laptop.
However, our data has many more hierarchy levels and groupings than this example, and the code is never able to complete. As a reproducible example I have added three more dummy levels to the tsibble::tourism dataset and included them in the aggregate_key. This runs out of memory even on my 50 GB RAM server!
tourism %>%
mutate(Region1 = Region, Region2 = Region, Region3 = Region) %>%
aggregate_key(Purpose * (State / Region / Region1 / Region2 / Region3), Trips = sum(Trips)) %>%
model(ets = ETS(Trips)) %>%
reconcile(ets_adjusted = min_trace(ets)) %>%
forecast(h = 2)
Error: cannot allocate vector of size 929 Kb
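For context, a back-of-envelope count (assuming the standard tsibble::tourism dimensions of 4 Purposes, 8 States, and 76 Regions) shows how quickly the number of aggregated series grows, since each one gets its own ETS model and enters the reconciliation step:

purposes <- 4; states <- 8; regions <- 76
# The nested part State / Region / Region1 / Region2 / Region3 contributes
# one series per node at each nesting depth, plus the top <aggregated> level.
# Region1..Region3 are copies of Region, so each adds another 76 nodes:
nested <- 1 + states + regions * 4
# Crossing with Purpose multiplies by (purposes + 1), the +1 being the
# Purpose <aggregated> level:
total <- (purposes + 1) * nested
total  # 1565 series to model and reconcile

So this toy modification already produces 1565 series, versus 425 for the original example, and my real data is larger still.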
Question: Is there some way I can run this without reducing the number of hierarchy levels and without running out of memory? Thanks!