
I'd like to model a distribution which is a mixture of a Normal and the constant 0.

I couldn't find a solution because in all the mixture examples I've found, the distribution class is the same for every category.

Here is some code to illustrate what I'm looking for:

with pm.Model() as model:
    x_non_zero = pm.Normal(...)
    zero_rate = pm.Uniform('zero_rate', lower=0.0, upper=1.0, testval=0.5)
    fr = pm.Bernoulli('fr', p=zero_rate)
    x = pm.???('x', pm.switch(pm.eq(fr, 0), x_non_zero, 0), observed=data['x'])

I'm interested in the rate the data is exactly zero and the parameters of the normal when it is non-zero.

Here is roughly what the data I'm modelling looks like:

[figure: mixture of a normal with a constant]
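For illustration, data of roughly this shape could be simulated like this (the parameter values are made up, not estimates):

import numpy as np

rng = np.random.RandomState(0)
n = 1000
zero_rate = 0.3                                   # fraction of exact zeros
is_zero = rng.rand(n) < zero_rate
# plays the role of data['x'] in the snippet above
x = np.where(is_zero, 0.0, rng.normal(100, 10, size=n))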

– smatting
1 Answer


One option is to try a Gaussian mixture model; we may think of a Gaussian with sd (close to) 0 as a constant value.
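For example, something along these lines should work in a PyMC3 version that provides pm.NormalMixture (the 0.01 sd for the "constant" component and the priors are placeholders, not part of the original answer):

import numpy as np
import pymc3 as pm
import theano.tensor as tt

with pm.Model() as mixture_model:
    # mixture weights: w[0] for the (near-)zero component, w[1] for the Normal
    w = pm.Dirichlet('w', a=np.array([1., 1.]))

    # parameters of the non-zero Normal component
    mean = pm.Normal('mean', mu=100, sd=10)
    sd = pm.HalfNormal('sd', sd=10)

    # component 0: a very narrow Gaussian at 0 standing in for the constant,
    # component 1: the Normal whose parameters we want to infer
    obs = pm.NormalMixture('obs', w=w,
                           mu=tt.stack([0., mean]),
                           sd=tt.stack([0.01, sd]),
                           observed=x)

    trace = pm.sample(2000)

The posterior of w[0] then estimates the proportion of (near-)zero observations.

Another option is to use a model like the following, where each observation gets an explicit category indicator: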

import numpy as np
import pymc3 as pm

with pm.Model() as model:
    # parameters of the non-zero Normal component
    mean = pm.Normal('mean', mu=100, sd=10)
    sd = pm.HalfNormal('sd', sd=10)

    # latent component indicator per observation:
    # 0 -> the (near-)constant zero component, 1 -> the Normal component
    category = pm.Categorical('category', p=[0.5, 0.5], shape=len(x))
    mu = pm.math.switch(pm.math.eq(category, 0), 0, mean)
    eps = pm.math.switch(pm.math.eq(category, 0), 0.1, sd)

    obs = pm.Normal('obs', mu=mu, sd=eps, observed=x)

    # sample the discrete indicators and the continuous parameters
    # with separate step methods
    step0 = pm.ElemwiseCategorical(vars=[category], values=[0, 1])
    step1 = pm.Metropolis(vars=[mean, sd])
    trace = pm.sample(10000, step=[step0, step1])

To find out the rate, you can compute the posterior mean of the category indicators after discarding a burn-in; this gives the proportion of points assigned to the non-zero Normal component (the zero rate is one minus this):

burnin = 100
np.mean(trace[burnin:]['category'])
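If you prefer the zero rate directly, together with point estimates for the Normal's parameters, something along these lines should work (just post-processing of the same trace; the variable names are my own):

# fraction of (data point, draw) pairs assigned to the zero component
zero_rate_est = np.mean(trace[burnin:]['category'] == 0)

# posterior means of the non-zero Normal's parameters
mean_est = trace[burnin:]['mean'].mean()
sd_est = trace[burnin:]['sd'].mean()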
– aloctavodia
  • Thanks for your answer! Do you think it's possible to extend your model with priors on `p` of `category`? – smatting May 25 '16 at 08:27
  • You are welcome. Sure, you can use a Dirichlet distribution as a prior for a categorical, for example `p = pm.Dirichlet('p', a=np.array([1.,1.]))` – aloctavodia May 25 '16 at 11:32
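For completeness, a minimal sketch of that extension (it only swaps the fixed [0.5, 0.5] weights for the Dirichlet prior suggested in the comment; everything else follows the model in the answer):

import numpy as np
import pymc3 as pm

with pm.Model() as model:
    # Dirichlet prior on the mixture weights instead of fixed [0.5, 0.5]
    p = pm.Dirichlet('p', a=np.array([1., 1.]))

    mean = pm.Normal('mean', mu=100, sd=10)
    sd = pm.HalfNormal('sd', sd=10)

    category = pm.Categorical('category', p=p, shape=len(x))
    mu = pm.math.switch(pm.math.eq(category, 0), 0, mean)
    eps = pm.math.switch(pm.math.eq(category, 0), 0.1, sd)

    obs = pm.Normal('obs', mu=mu, sd=eps, observed=x)

Sampling works as in the answer (pm.sample will assign a step method to p automatically, or you can add it to the Metropolis step); the posterior of p[0] then directly estimates the zero rate.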