
I'm running into memory issues with the bnlearn package's structure learning algorithms. Specifically, I notice that the score-based methods (e.g. hc and tabu) use a lot of memory, especially when given a non-empty starting network.

Memory usage wouldn't be an issue except that it repeatedly brings down both my laptop (16 GB RAM) and a VM I'm using (128 GB RAM), even though the data set in question is a discrete BN with 41 nodes and ~250 rows (69 KB in memory). The crash happens both when running sequentially on the laptop and when running in parallel on the VM (32 GB per core).

One last bit of detail: occasionally I can get 100-200 nets with random starts to run successfully, but then one net will unpredictably grow too large and bring the system down.

My question: I'm fairly new to BNs, so is this just inherent to the method, or is it a memory management issue with the package?
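For context, here is a minimal sketch of the kind of run I'm doing; the data frame name `dat`, the number of starts, and the score are placeholders rather than my exact code:

```r
library(bnlearn)

# 'dat' stands in for the 41-node discrete data set (~250 rows)
set.seed(1)
starts <- random.graph(nodes = names(dat), num = 200, method = "melancon")

# one hill-climbing run per random starting network
fits <- lapply(starts, function(g) hc(dat, start = g, score = "bic"))
```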

John
  • John, I have not had these issues and have run structure learning on much larger datasets. Do you receive any warnings or errors? One issue I have had is that sometimes a node can have too many parents, so the CPT would have more than 2^31 entries and produce a memory error. I tend to set the maximum number of parents to limit this issue, and besides, generally favouring a sparse network is not a bad thing in my opinion. – user20650 Jun 03 '20 at 10:42
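As a concrete illustration of the suggestion in the comment above, a minimal sketch of capping the parent count with bnlearn's maxp argument; the cap of 4 and the object names (`dat`, `start.net`) are illustrative, not code from the thread:

```r
library(bnlearn)

# cap each node at 4 parents so no single CPT can approach 2^31 entries;
# both hc() and tabu() accept the 'maxp' argument
fit <- hc(dat, start = start.net, maxp = 4)
# fit <- tabu(dat, maxp = 4)   # same idea for tabu search
```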

0 Answers