I'm running into memory issues with the bnlearn package's structure learning algorithms. Specifically, I notice that the score-based methods (e.g., hc and tabu) use a LOT of memory, especially when given a non-empty starting network.
Memory usage wouldn't be an issue except that it repeatedly brings down both my laptop (16 GB RAM) and a VM I'm using (128 GB RAM), even though the data set in question is discrete, with 41 variables and ~250 rows (about 69 KB in memory). The issue occurs both when running sequentially on the laptop and when running in parallel on the VM (32 GB/core).
One last bit of detail: occasionally I can fit 100-200 networks from random starting points successfully, but then a single run will balloon in memory and bring the system down (a sketch of the loop is below).
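For reference, here is a minimal sketch of the pattern I'm running. The built-in alarm data set (37 discrete nodes) subsampled to 250 rows is just a stand-in for my real data, and the "melancon" random-graph method is one illustrative way to generate random starts; the actual data and start-generation code differ:

library(bnlearn)

# Stand-in for my data: bnlearn's built-in alarm set, cut down to ~250 rows
# to roughly match my real data's shape (41 nodes, ~250 rows, ~69 KB).
data(alarm)
df <- alarm[sample(nrow(alarm), 250), ]
nodes <- names(df)

# The loop that eventually brings the machine down: each iteration draws a
# random starting DAG and runs hill-climbing from it. Most iterations finish
# fine; occasionally one run's memory use explodes.
nets <- vector("list", 200)
for (i in seq_len(200)) {
  start <- random.graph(nodes, method = "melancon")  # random non-empty start
  nets[[i]] <- hc(df, start = start, score = "bic")
}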
My question: I'm fairly new to BNs, so is this memory behavior inherent to score-based structure learning, or is it a memory management issue in the package?