I'm having problems with memory when running a Poisson regression model. With the data loaded in and ready for the model, Python is using about 650 MB of memory. As soon as I create the model,

import pymc3 as pm
import theano.tensor as t

with pm.Model() as poisson_model:
    # priors for the regression coefficients, one per predictor column
    coeffs = pm.Uniform('coeffs', -10, 10, shape=(1, predictors.shape[1]))

    # expected count per observation: exp of the linear predictor
    r = t.exp(t.sum(coeffs * predictors.values, axis=1))

    # Poisson likelihood over the observed counts
    obs = pm.Poisson('obs', r, observed=targets)

the memory usage shoots up to 3 GB. There are only 350 data points of 8-bit integers, so I have no idea what is using this amount of memory.

After playing around a bit, I've found that adding anything to a model pushes memory usage up to 3 GB, even something as simple as

with pm.Model() as poisson_model:
    test = pm.Uniform('test', -1, 1)
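
For reference, here is roughly how I'm measuring the jump around that minimal model. This is just a sketch using psutil to read the process's resident memory; psutil has nothing to do with PyMC3 or Theano, it's only a convenient way to check usage from inside the script.

import os
import psutil
import pymc3 as pm

proc = psutil.Process(os.getpid())

def rss_mb():
    # resident set size of this Python process, in MB
    return proc.memory_info().rss / 1024.0 ** 2

print('before model: %.0f MB' % rss_mb())

with pm.Model() as poisson_model:
    test = pm.Uniform('test', -1, 1)

print('after model:  %.0f MB' % rss_mb())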

Any suggestions as to what is happening or how I can look deeper? I'm using a new iMac, Python 2.7, and the latest version of PyMC3. Thanks.

Mat Leonard

1 Answer

I've tried replicating this on my system (MacBook Air, Python 2.7) but get ~80 MB of memory usage. I would try a few things:

  1. Clear the theano cache:

    theano-cache clear

  2. Try updating Theano

  3. Re-install PyMC3 from the master branch

These are all guesses, as I cannot replicate the issue, so I'm hoping one of these will do the trick.
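
If you go through steps 2 and 3, it's also worth confirming from inside Python which versions are actually being imported, and where Theano keeps the compile cache that theano-cache clear wipes. A quick sketch (nothing here is specific to this problem, just the standard __version__ attributes and Theano's base_compiledir setting):

import theano
import pymc3

# versions this interpreter is actually picking up
print(theano.__version__)
print(pymc3.__version__)

# directory holding Theano's compiled modules (what theano-cache clear empties)
print(theano.config.base_compiledir)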

Chris Fonnesbeck