I have a discrete (3-state) network with ~2500 nodes that I am running in bnlearn via R. As it stands, it has been running for about 5 1/2 days. I am trying to estimate how long it will take. Is there any way I can calculate this? I am using a laptop with 16 GB of RAM and a 2.9 GHz processor. I also understand that using a server would be quicker, but I was having issues with our server. Thanks.
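For reference, a minimal sketch of the kind of call being described, assuming the data sit in a data frame (here called df, a hypothetical name) of ~2500 factor columns with 3 levels each; the hc call and maxp setting come from the comments below:

    library(bnlearn)

    ## df is a hypothetical placeholder for the discrete data set:
    ## ~2500 columns, each a factor with 3 levels
    system.time(
      dag <- hc(df, maxp = 5)   # hill-climbing structure search, capping parents at 5
    )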
- 2500 nodes isn't that many. How many rows does your dataframe have? Is 16 GB reasonable for the size of the data? Which algorithm are you using / which test? What parameters have you set, e.g. number of restarts, perturbations, etc.? If it is taking too long using a hill-climbing algorithm, then you could try a hybrid algorithm with a sensible test. – user20650 Jun 20 '23 at 16:00
- 6 rows. I am using the hc algorithm with maxp = 5. I haven't set any perturbations or restarts. I was just trying to see how long this one would take. – LittleBlueHeron Jun 20 '23 at 16:13
- From running similar searches (hc with > 2500 nodes, but with many more rows), this should not take 5.5 days. But with only 6 rows of data and each variable having 3 states, there will likely be a lot of zeros in the CPTs when testing during the search. I'd think this may cause the search to drag on, as the likelihood is probably quite flat. You could reduce maxp to one or two and see if it helps (5 is too many, as 6 rows of data won't have enough information). Perhaps also try the bds score. – user20650 Jun 20 '23 at 16:22
- That makes sense. Thank you. – LittleBlueHeron Jun 20 '23 at 16:24
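A rough sketch of the adjustments suggested in the comments, under the same assumptions (df is a hypothetical placeholder for the 6-row, ~2500-column data frame; the exact settings are illustrative, not a definitive fix):

    library(bnlearn)

    ## Hill-climbing with a lower parent cap, the bds score, and a few random restarts
    dag_hc <- hc(df, score = "bds", maxp = 2, restart = 5, perturb = 5)

    ## Alternatively, a hybrid search: an MMPC restriction phase followed by
    ## score-based hill-climbing on the restricted candidate set
    dag_hybrid <- mmhc(df, maximize.args = list(score = "bds", maxp = 2))

With only 6 rows, any score will be poorly informed, so these settings mainly help the search terminate in a reasonable time rather than guarantee a meaningful structure.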