
I recently switched from MATLAB to R, and I want to run an optimization.

In MATLAB I was able to do:

options = optimset('GradObj', 'on', 'MaxIter', 400);
[theta, cost] = fminunc(@(t)(costFunction(t, X, y)), initial_theta, options);

Here is the equivalent of costFunctionReg (here I call it logisticRegressionCost):

logisticRegressionCost <- function(theta, X, y) {
    J = 0;
    theta = as.matrix(theta);
    X = as.matrix(X);
    y = as.matrix(y);   

    rows = dim(theta)[1];
    cols = dim(theta)[2];
    grad = matrix(0, rows, cols);

    predicted = sigmoid(X %*% theta);
    J = (-y) * log(predicted) - (1 - y) * log(1 - predicted);

    J = sum(J) / dim(y)[1];

    grad = t(predicted - y);
    grad = grad %*% X;
    grad = grad / dim(y)[1];

    return(list(J = J, grad = t(grad)));    
}
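Note that the function above calls `sigmoid`, which is not defined in the question; assuming the usual logistic function, a minimal definition would be:

```r
# Logistic (sigmoid) function, applied element-wise
sigmoid <- function(z) {
    1 / (1 + exp(-z))
}
```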

However, when I try to run an optimization on it like:

o = optim(theta <- matrix(0, dim(X)[2]), fn = logisticRegressionCost, X = X, y = y, method="Nelder-Mead")

I get an error because of the list return. (When I return only J, it works.)

Error:

(list) object cannot be coerced to type 'double'

Q1: Is there a way to tell `optim` which element of the returned list to use for the minimization (something like fn$J)?

Q2: Is there a way to use the gradient I calculate in logisticRegressionCost?

  • I've used `optim` to fit a perceptron some time ago. My function just accepted `w` and returned the error value. – Fernando Mar 14 '14 at 15:07

1 Answer


I don't think you can do this directly, because the documentation for `optim` says `fn` must return a scalar result.

Perhaps you could just write a helper function to optimize. Something like:

logisticRegressionCost.helper <- function(theta, X, y) {
   logisticRegressionCost(theta, X, y)$J
}

Also, you don't need the semicolons to suppress output in R. I had the same habit when I switched from MATLAB too :)
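As for Q2: `optim` accepts a separate gradient function through its `gr` argument, which the gradient-based methods (such as `"BFGS"`) use. So you can split the list return into two thin wrappers, one returning the scalar cost and one returning the gradient vector. A self-contained sketch, where the synthetic `X`, `y`, and the wrapper names are illustrative rather than from the question:

```r
sigmoid <- function(z) 1 / (1 + exp(-z))

# Cost and gradient of unregularized logistic regression, returned together
logisticRegressionCost <- function(theta, X, y) {
    m <- nrow(X)
    predicted <- sigmoid(X %*% theta)
    J <- sum(-y * log(predicted) - (1 - y) * log(1 - predicted)) / m
    grad <- t(X) %*% (predicted - y) / m
    list(J = J, grad = grad)
}

# Thin wrappers: optim wants fn to return a scalar and gr a vector
cost.fn <- function(theta, X, y) logisticRegressionCost(theta, X, y)$J
cost.gr <- function(theta, X, y) as.vector(logisticRegressionCost(theta, X, y)$grad)

# Tiny synthetic data set, purely for illustration
set.seed(1)
X <- cbind(1, matrix(rnorm(40), 20, 2))           # intercept + 2 features
y <- matrix(rbinom(20, 1, sigmoid(X[, 2] + X[, 3])), 20, 1)

o <- optim(par = rep(0, ncol(X)),
           fn = cost.fn, gr = cost.gr,
           X = X, y = y, method = "BFGS")
o$par    # fitted theta
o$value  # final cost
```

This way the gradient you already compute is reused instead of being thrown away, and BFGS avoids approximating it by finite differences.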
