I'm running RStudio with the bnlearn package installed.
I want to estimate the influence of each parent node on its children (with confidence intervals if possible).
I have:
my dataset, with every variable converted to a factor (25 variables, 200 cases).
my whitelist of all legal connections between nodes (60 arcs), set up roughly as sketched below.
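For concreteness, the two inputs are built more or less like this (the data file name and the exact column handling shown here are only illustrative; arcs.csv is the actual whitelist file):

library(bnlearn)

# dataset: 200 cases x 25 variables, every column converted to a factor
fac <- read.csv("data.csv", stringsAsFactors = FALSE)  # illustrative file name
fac[] <- lapply(fac, as.factor)

# whitelist: the 60 legal arcs as a two-column "from"/"to" matrix,
# with node names matching the variable names in fac
arcs <- as.matrix(read.csv("arcs.csv", stringsAsFactors = FALSE))
colnames(arcs) <- c("from", "to")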
My problem: when I run the following code:
w <- rsmax2(fac, whitelist = arcs, blacklist = NULL, maximize = "hc",
            test = NULL, score = NULL, alpha = 0.05, B = NULL,
            optimized = TRUE, strict = FALSE, debug = TRUE)
my computer runs out of memory and stalls (memory usage climbs to about 7 GB until the machine gives in). However, if I drop the whitelist, it runs just fine.
So:
As far as I know, constraining the search with the whitelist (arcs.csv) should use less memory than running it unconstrained, but the opposite turns out to be true. Am I missing something here?
I'm absolutely not interested in bnlearn suggesting a model for me; I already know what the network should look like. I just want the dependencies between the nodes quantified, with p-values (and confidence intervals). Am I even using the right tool?
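In other words, what I'm after is something along these lines, assuming empty.graph(), bn.fit() and arc.strength() are indeed the right bnlearn functions when the structure is fixed and known (I may well be wrong about that):

library(bnlearn)

# impose the known structure instead of learning it
dag <- empty.graph(names(fac))   # the 25 variables as isolated nodes
arcs(dag) <- arcs                # add the 60 whitelisted arcs

# parameter estimates (conditional probability tables) for that structure
fitted <- bn.fit(dag, fac)

# one p-value per arc, from a conditional independence test
strengths <- arc.strength(dag, data = fac, criterion = "x2")

I don't see an obvious way to get confidence intervals out of that, which is part of why I'm unsure this is the right tool.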