I am running NbClust()
on many different data frames (i.e. different data, different dimensionality). In most cases it works fine, but for some data frames it throws errors that seem to come from the internal computation of some of the indices that NbClust()
calculates.
This is what my code looks like:
library(NbClust)
NbClust(df, distance="euclidean", min.nc=3, max.nc=5, method = "complete")
This is the error I get:
Error in if ((resCritical[ncB - min_nc + 1, 3] >= alphaBeale) &&
(!foundBeale)) { : missing value where TRUE/FALSE needed
Another error which I come across very often is the following:
Error in if ((res[ncP - min_nc + 1, 15] <= resCritical[ncP - min_nc +
: missing value where TRUE/FALSE needed
Has anyone ever encountered similar problems, or does anyone know why NbClust()
is so unreliable here? Are there any workarounds?
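One workaround I have been experimenting with is to request the indices one at a time and wrap each call in tryCatch(), so that an index that fails (apparently the Beale / pseudo-t2 checks, judging from the error messages) is simply skipped. A minimal sketch, using the df defined below and the index names listed in ?NbClust; I have not verified that every single index runs cleanly on this data:

library(NbClust)

# Index names as listed in ?NbClust; "hubert" and "dindex" are graphical
# indices without a Best.nc entry, so they are left out here.
indices <- c("kl", "ch", "hartigan", "ccc", "scott", "marriot", "trcovw",
             "tracew", "friedman", "rubin", "cindex", "db", "silhouette",
             "duda", "pseudot2", "beale", "ratkowsky", "ball", "ptbiserial",
             "gap", "frey", "mcclain", "gamma", "gplus", "tau", "dunn",
             "sdindex", "sdbw")

best_nc <- sapply(indices, function(idx) {
  res <- tryCatch(
    NbClust(df, distance = "euclidean", min.nc = 3, max.nc = 5,
            method = "complete", index = idx),
    error = function(e) NULL)                 # skip indices that error out
  if (is.null(res)) NA else unname(res$Best.nc["Number_clusters"])
})

best_nc                    # NA for the indices that failed
table(na.omit(best_nc))    # majority vote over the surviving indices

This at least tells me which indices break on a given data frame, but it feels like a hack rather than a fix.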
The data looks as follows:
df = structure(list(Rate = c(-0.161, -0.519, 1.163, -0.781, -0.755,
2.252, -0.206, -0.796, -0.803, 1.444, -0.652, -0.541, -0.759,
-0.309, 0.945, -0.202, -0.449, 0.551, -0.774, 0.993, -0.434,
-0.604, -0.571, -0.545, -0.722, -0.696, -0.678, -0.512, -0.759,
2.857, 0.145, -0.206, -0.689, 0.514, 2.373, -0.659, 0.628, 0.2,
2.746, -0.781, -0.704, 2.019, -0.826, -0.051, 0.034, -0.693,
-0.047, -0.571, -0.335, -0.073), Losses = c(-0.142, 4.327, 5.004,
-0.293, -0.293, -0.293, -0.191, -0.293, -0.293, -0.293, 1.044,
0.151, -0.276, -0.293, -0.293, 0.004, -0.024, 0.151, -0.293,
0.223, -0.293, -0.293, 0.089, -0.293, -0.27, -0.293, -0.293,
-0.293, -0.293, -0.287, -0.03, 0.26, -0.293, -0.223, -0.293,
-0.293, -0.293, 0.066, -0.293, -0.293, -0.293, -0.215, 0.086,
0.086, -0.522, -0.518, -0.497, -0.522, -0.516, -0.518)), .Names = c("Rate",
"Losses"), row.names = c(NA, -50L), class = "data.frame")
An earlier, unanswered question about the same problem can be found here