I would like to use the IDI.INF function from the survIDINRI package to compare the performance of two Cox regression models. For this purpose, I have used the following script:
> D=subset(Dataset, select=c("time","status","Var1","Var2","Var3", "Var4", "Var5", "Var6", "Var7", "Var8"))
> D$status=as.numeric(D$status==2)
> D=D[!is.na(apply(D,1,mean)),] ; dim(D)
[1] 800 10
> head(D)
> outcome=D[,c(1,2)]
> covs1<-as.matrix(D[,c(-1,-2)])
> covs0<-as.matrix(D[,c(-1,-2, -10)])
> head(outcome)
> head(covs0)
> head(covs1)
> x <-IDI.INF(outcome, covs0, covs1, t0, npert=300,npert.rand = NULL, seed1 = NULL, alpha = 0.05)
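(Note: `t0` is not defined anywhere in the snippet above. As I understand the function, `t0` is the landmark time at which the IDI/NRI are evaluated and must be a single numeric value inside the observed follow-up range; the value below is only a hypothetical example.)

```r
# t0 must be a numeric landmark time within the follow-up window.
# The choice of 365 here is purely illustrative, assuming 'time' is in days.
t0 <- 365
x <- IDI.INF(outcome, covs0, covs1, t0, npert = 300)
```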
At the end, I get the following error:
Error in fitter(X, Y, strats, offset, init, control, weights = weights, :
NA/NaN/Inf in foreign function call (arg 6)
In addition: Warning message:
In fitter(X, Y, strats, offset, init, control, weights = weights, :
Ran out of iterations and did not converge
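The `fitter(X, Y, strats, offset, ...)` in the traceback appears to come from the internal Cox fitter in the survival package, so the underlying Cox model itself seems to be failing. A sketch of how one might check this directly, by fitting the larger model with `coxph` on the same data (assuming, as in my script, that all columns of `D` other than `time` and `status` are the candidate covariates):

```r
library(survival)

# Fit the full Cox model directly to see whether it converges on its own.
# Non-convergence or NA coefficients often point to collinear, constant,
# or perfectly separating covariates.
fit <- coxph(Surv(time, status) ~ ., data = D)
summary(fit)
```

If this fit already warns about convergence or returns NA coefficients, the problem lies in the data/model rather than in IDI.INF itself.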
Does anyone have experience with this IDI.INF function? Is there a way to tackle this problem?