
I'm running RStudio with the bnlearn package installed.

I want the influences of different parent nodes on the children (with confidence intervals if possible).

I have:
my dataset, made into factors (25 variables, 200 cases).
my whitelist of all legal connections between nodes (60 arcs).

My problem: when I run the following code:

w <- rsmax2(fac, whitelist = arcs, blacklist = NULL, maximize = "hc",
            test = NULL, score = NULL, alpha = 0.05, B = NULL,
            optimized = TRUE, strict = FALSE, debug = TRUE)

my computer stalls because it runs out of memory (the call uses about 7 GB of RAM until my PC gives in). However, if I drop my whitelist, it runs just fine.

So:

  1. As far as I know, detecting the interactions with constraints (arcs.csv) should use less memory than doing so without them, but that turns out not to be the case. Am I missing something here?

  2. I'm absolutely not interested in bnlearn suggesting a model for me; I already know what it should look like. I just want the dependencies between the nodes calculated, with p-values (and confidence intervals if possible). Am I even using the right tool?

  • Atta, if you know the model and don't need any structure learning, then you can just pass the model and learn the parameters. http://www.bnlearn.com/examples/dag/ shows how to specify a graph manually, and then you estimate parameters with `bn.fit` (see the sketch after these comments). – user2957945 May 07 '17 at 20:07
  • @user2957945 you are absolutely right. I feel like an idiot right now. Thanks! – Atta van Westreenen May 08 '17 at 08:41
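A minimal sketch of the workflow suggested in the comment above, assuming `fac` is the factor data frame from the question and `wl` is the 60-arc whitelist as a two-column (from, to) table; the object names other than the bnlearn functions are illustrative, not from the original post:

    library(bnlearn)

    ## Skip structure learning entirely: build the known DAG directly,
    ## then estimate its parameters and test each arc.

    dag <- empty.graph(nodes = names(fac))   # all 25 variables, no arcs yet
    arcs(dag) <- wl                          # set the 60 known arcs (from, to)

    # Conditional probability tables estimated from the data
    fitted <- bn.fit(dag, data = fac, method = "mle")

    # Pearson's X^2 conditional independence test for each arc,
    # returning a p-value per arc given the other parents of its target
    arc.strength(dag, data = fac, criterion = "x2")

Note that `arc.strength()` supplies the per-arc p-values asked about in the question; `bn.fit()` itself does not report confidence intervals for the fitted probabilities, so those would have to come from something like a bootstrap on top of this.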
