I have a data frame with columns 2:37 as predictor variables and column 1 as a binary response variable.
library(mxnet)

mx.set.seed(1234)

# Predictors in columns 2:37, binary response in column 1
train.x <- data.matrix(A3n.df[, 2:37])
train.y <- A3n.df[, 1]

# Network: 36 inputs -> 12 hidden units (ReLU) -> 1 logistic output
data <- mx.symbol.Variable("data")
fc1  <- mx.symbol.FullyConnected(data, name="fc1", num_hidden=12)
act1 <- mx.symbol.Activation(fc1, name="relu1", act_type="relu")
fc2  <- mx.symbol.FullyConnected(act1, name="fc2", num_hidden=1)
logoutput <- mx.symbol.LogisticRegressionOutput(fc2, name="logoutput")

A1.MXmodel <- mx.model.FeedForward.create(logoutput, X=train.x, y=train.y,
                                          ctx=mx.gpu(), num.round=1000,
                                          array.batch.size=100,
                                          learning.rate=0.01, momentum=0.9,
                                          eval.metric=mx.metric.accuracy,
                                          initializer=mx.init.uniform(0.07),
                                          epoch.end.callback=mx.callback.log.train.metric(100))
This leads to the error:
Error in mx.io.internal.arrayiter(as.array(data), as.array(label), unif.rnds, :
io.cc:50: Seems X, y was passed in a Row major way, MXNetR adopts a column major convention.
Please pass in transpose of X instead
Just a few days ago I used:
train.x <- t(train.x)
which fixed the error and yielded an error rate low enough to be believable, but today it's nearly 0.50, with no learning. I also tried switching array.layout between "rowmajor" and "colmajor", to no effect.
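For completeness, the retry looked roughly like this (a sketch: identical to the call above except for the transposed X and the explicit array.layout, which I swapped between the two values):

# Retry sketch: examples become columns after the transpose, and the
# layout is stated explicitly instead of letting MXNetR guess it.
train.x <- t(train.x)
A1.MXmodel <- mx.model.FeedForward.create(logoutput, X=train.x, y=train.y,
                                          ctx=mx.gpu(), num.round=1000,
                                          array.batch.size=100,
                                          array.layout="colmajor",  # also tried "rowmajor"
                                          learning.rate=0.01, momentum=0.9,
                                          eval.metric=mx.metric.accuracy,
                                          initializer=mx.init.uniform(0.07),
                                          epoch.end.callback=mx.callback.log.train.metric(100))

Either way, the training log stays pinned at the same value: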
[16] Train-accuracy=0.460714285714286
[17] Train-accuracy=0.460714285714286
[18] Train-accuracy=0.460714285714286
[19] Train-accuracy=0.460714285714286
[20] Train-accuracy=0.460714285714286
...
[993] Train-accuracy=0.460714285714286
[994] Train-accuracy=0.460714285714286
[995] Train-accuracy=0.460714285714286
[996] Train-accuracy=0.460714285714286
[997] Train-accuracy=0.460714285714286
[998] Train-accuracy=0.460714285714286
[999] Train-accuracy=0.460714285714286
[1000] Train-accuracy=0.460714285714286
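For what it's worth, the frozen accuracy looks like it could simply be one class's share of the training set, i.e. the network may be emitting a constant prediction. A quick check I can run (a sketch, assuming train.y is coded 0/1):

# If the stuck Train-accuracy matches one of these proportions, the model
# is predicting a single class for every example rather than learning.
table(train.y) / length(train.y)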