TLDR:
How can I get the marginal and conditional R^2 from the Johnson, P. C. (2014) paper for a simple random slope model with an unstructured covariance type and a single random slope in SPSS?
Longer:
In R, I am using the r.squaredGLMM() function from the MuMIn package to compute the marginal and conditional R^2 for my mixed models, following Johnson, P. C. (2014). Extension of Nakagawa & Schielzeth's R2GLMM to random slopes models. Methods in Ecology and Evolution, 5(9), 944-946.
I am now teaching a class on mixed models using SPSS, and I would like to give students a tool with which they can also calculate the marginal and conditional R^2. However, SPSS does not provide these in its mixed-model output.
I found instructions by Paul Johnson on how to compute the marginal and conditional R^2 in SPSS for random intercept models. This requires three quantities: Vf (fixed-effect variance), Vr (random-effect variance), and Ve (residual variance).
In the very simple case of a random intercept model I compute these as follows:
- Vf: the variance of the predicted values based on the fixed-effect predictors only
- Vr: the variance associated with the random intercept, from the Estimates of Covariance Parameters table
- Ve: the residual variance, from the Estimates of Covariance Parameters table
and use the formula:
- R^2m = Vf / (Vf + Vr + Ve)
- R^2c = (Vf + Vr) / (Vf + Vr + Ve)
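In base R terms, the random-intercept arithmetic looks like this (the three variance values below are made-up placeholders standing in for the SPSS output, not estimates from any real data):

```r
# Placeholder variance components, as they would be read off the SPSS output:
Vf <- 1.20  # variance of the fixed-effects-only predicted values
Vr <- 0.60  # random-intercept variance (Estimates of Covariance Parameters)
Ve <- 2.10  # residual variance (Estimates of Covariance Parameters)

R2m <- Vf / (Vf + Vr + Ve)         # marginal R^2
R2c <- (Vf + Vr) / (Vf + Vr + Ve)  # conditional R^2
```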
This returns the same result as this R code:
library(lme4)    # lmer()
library(MuMIn)   # r.squaredGLMM()
library(insight) # get_variance(), used for the random slope model below
data <- read.csv("https://raw.githubusercontent.com/kekecsz/SIMM32/master/2021/Lab_4/data_bully_slope.csv")
mod1 <- lmer(sandwich_taken ~ weight + (1|class), data = data)  # random intercept only
summary(mod1)
r.squaredGLMM(mod1)  # marginal and conditional R^2
However, I do not know how to generalize this to a simple random intercept + slope case in SPSS. In R I was able to get these variance components as:
mod2 <- lmer(sandwich_taken ~ weight + (weight|class), data = data)  # random intercept + slope
Vf <- var(predict(mod2, re.form = NA))  # fixed-effect variance
Vr <- get_variance(mod2)$var.random     # random-effect variance
Ve <- sigma(mod2)^2                     # residual variance
and using these formulas:
R^2m = Vf / (Vf + Vr + Ve)
R^2c = (Vf + Vr) / (Vf + Vr + Ve)
returns the same values as
r.squaredGLMM(mod2)
I can compute Vf and Ve in SPSS the same way as stated above, but I cannot find the random-effect variance in the output. I tried to calculate it by taking the total variance of the outcome variable and subtracting Vf and Ve:
Vtotal <- var(model.response(model.frame(mod2)))  # observed variance of the outcome
Vr <- Vtotal - (Vf + Ve)
But the value I get for Vr is different from what I get from
get_variance(mod2)$var.random
and the marginal and conditional R^2 computed this way do not correspond to
r.squaredGLMM(mod2)
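For context on what get_variance(mod2)$var.random returns: per Johnson (2014), the random-effect variance in a random slope model is the mean across observations of z_i' Sigma z_i, where z_i = (1, x_i) and Sigma is the 2x2 random-effects covariance matrix. A base-R sketch with made-up covariance parameters (labelled with the UN(i,j) names SPSS uses for an unstructured covariance type; the numbers and predictor values are illustrative, not estimates from the bully data):

```r
# Made-up covariance parameters, as they would appear in the
# Estimates of Covariance Parameters table with COVTYPE(UN):
s2_int   <- 0.80  # UN(1,1): random-intercept variance
s_cov    <- 0.15  # UN(2,1): intercept-slope covariance
s2_slope <- 0.05  # UN(2,2): random-slope variance
Sigma <- matrix(c(s2_int, s_cov, s_cov, s2_slope), 2, 2)

x <- c(1.2, 2.5, 3.1, 4.0, 5.8)  # hypothetical predictor values
Z <- cbind(1, x)                 # random-effects design: intercept + slope

# Mean random-effect variance, two equivalent ways:
Vr_matrix <- mean(diag(Z %*% Sigma %*% t(Z)))
Vr_closed <- s2_int + 2 * s_cov * mean(x) + s2_slope * mean(x^2)
all.equal(Vr_matrix, Vr_closed)  # TRUE
```

This model-implied quantity need not equal the observed Vtotal minus (Vf + Ve), which may be why the subtraction above disagrees with get_variance().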
How can I get the marginal and conditional R^2 from the Johnson, P. C. (2014) paper for a simple random slope model with an unstructured covariance type and a single random slope in SPSS?
Alternatively, would the following give a reasonable approximation of the marginal and conditional R^2 for a simple random slope model (this is something I could compute in SPSS), with the caveat that it slightly over-estimates the marginal R^2 and slightly under-estimates the conditional R^2?
Vtot <- var(model.response(model.frame(mod2)))  # observed outcome variance
Vf <- var(predict(mod2, re.form = NA))          # fixed-effect variance
Ve <- sigma(mod2)^2                             # residual variance
Vr_est <- Vtot - (Vf + Ve)                      # random-effect variance by subtraction
Rm_est <- Vf / (Vf + Vr_est + Ve)               # approximate marginal R^2
Rm_est
Rc_est <- (Vf + Vr_est) / (Vf + Vr_est + Ve)    # approximate conditional R^2
Rc_est