I have a large panel dataset with ~2,000 individuals and ~15,000 person-year observations. I have a set of time-varying and time-invariant variables and a binary outcome variable (0/1). I am trying to fit a multilevel discrete-time survival model with glmer() from the "lme4" package.
id = individual ID, survtime = number of years of survival before the event/censoring
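To show the structure (not real data), here is a toy sketch of the person-period layout my dataset follows, with the same variable names as in my model; the values are made up:

    # Hypothetical person-period data: one row per person per year at risk
    d <- data.frame(
      id       = c(1, 1, 1, 2, 2),   # individual ID
      survtime = c(1, 2, 3, 1, 2),   # year of follow-up
      var1     = c(0, 1, 1, 0, 0),   # discrete, time-varying
      var3     = c(1, 1, 1, 0, 0),   # dummy, time-invariant
      outcome  = c(0, 0, 1, 0, 0)    # 1 only in the year the event occurs
    )
    # Person 1 has the event in year 3; person 2 is censored after year 2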
I couldn't produce a reproducible example with such a large dataset, but here is my code:
    Modelsurv <- glmer(
      formula = outcome ~ survtime + var1 + var2 + var3 + var4 + (1 | id),
      # var1, var2: discrete, time-varying
      # var3: dummy, time-invariant
      # var4: four-level categorical, time-invariant
      family = binomial(cloglog),
      data = dataset,
      control = glmerControl(optimizer = "bobyqa",
                             optCtrl = list(maxfun = 2e5))
    )
I am trying to replicate the example here; see point 8 (multilevel discrete-time survival analysis).
I do not understand what control = glmerControl(optimizer = "bobyqa", optCtrl = list(maxfun = 2e5)) is doing, and, with data this large, how and to what values I should set it.
I tried the code above but got the following warning messages:
unable to evaluate scaled gradient
Warning in checkConv(attr(opt, "derivs"), opt$par, ctrl = control$checkConv, :
Model failed to converge: degenerate Hessian with 1 negative eigenvalues
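In case it is relevant: I was planning to check whether the convergence problem is optimizer-specific by refitting with lme4's allFit(), which reruns the same model under every available optimizer so the log-likelihoods can be compared. A sketch of what I had in mind, using a small simulated dataset rather than my real one:

    library(lme4)

    # Simulate a small person-period dataset (hypothetical, for illustration only)
    set.seed(1)
    d <- data.frame(id       = rep(1:50, each = 5),
                    survtime = rep(1:5, times = 50),
                    var1     = rbinom(250, 1, 0.5))
    d$outcome <- rbinom(250, 1, 0.1)

    # Fit the discrete-time hazard model with a cloglog link
    m <- glmer(outcome ~ survtime + var1 + (1 | id),
               family = binomial(cloglog), data = d)

    # Refit with all available optimizers and compare log-likelihoods
    af <- allFit(m)
    summary(af)$llik

Is comparing optimizers like this a reasonable way to diagnose the warning, or does the degenerate Hessian point to a problem with the model specification itself?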
Can anyone help me understand this and point me in the right direction? And will I have to adjust the number of iterations based on the number and kind of variables I add to the model?
Thank you!