I am calling the lFormula function in lme4 (version 1.1-9) to get the transpose of the sparse model matrix of a random effect (Zt). The problem is that unused factor levels are dropped from the output matrix. Is there a way to get around this?
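For concreteness, here is a minimal sketch of the behaviour (a toy setup, not the actual Gist): a grouping factor is declared with five levels but only three of them appear in the data, and the resulting Zt only has rows for the observed levels.

```r
## Minimal sketch (assumed setup, not the actual Gist): a grouping factor
## declared with five levels, only three of which appear in the data.
library(lme4)
set.seed(42)

d <- data.frame(
  y = rnorm(6),
  g = factor(rep(c("a", "b", "c"), 2),
             levels = c("a", "b", "c", "d", "e"))
)

## lFormula is the first modular step of lmer; reTrms$Zt is the transposed
## sparse random-effects model matrix.
lf <- lFormula(y ~ 1 + (1 | g), data = d)
Zt <- lf$reTrms$Zt

nlevels(d$g)  # 5 levels declared on the factor
nrow(Zt)      # only 3 rows: the unused levels "d" and "e" are dropped
```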
I realize that this question might seem awkward, so here is an example on GitHub Gist that tries to fit a multiple-membership model with a factor that has more levels than observations.