optim
Define `f` as the function in the question, except we explicitly list all arguments, and `ss` as the residual sum of squares. Then minimize `ss` using an arbitrary value for `i` (since we have two equations and three unknowns). Below we show the solution for `j` and `k` (in the `par` component of the output) using `i = 10` as an input.
```r
f <- function(x, i, j, k) i * log(j * x + k)

# residual sum of squares for parameters p = c(j, k) and fixed i
ss <- function(p, i) (f(x = 0, i = i, j = p[1], k = p[2]) - 6)^2 +
  (f(x = 1, i = i, j = p[1], k = p[2]) - 12)^2

optim(1:2, ss, i = 10)
```
giving:
```
$par
[1] 1.497972 1.822113

$value
[1] 9.894421e-09

$counts
function gradient 
      59       NA 

$convergence
[1] 0

$message
NULL
```
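As a cross-check on the `optim` result, note that fixing `i` leaves two equations in two unknowns, so `j` and `k` can also be obtained in closed form: `f(0) = i * log(k) = 6` gives `k = exp(6/i)`, and `f(1) = i * log(j + k) = 12` gives `j = exp(12/i) - k`. A minimal sketch, using the same `i = 10` as above:

```r
# Closed-form solution for fixed i:
#   f(0) = i * log(k)     = 6   =>  k = exp(6 / i)
#   f(1) = i * log(j + k) = 12  =>  j = exp(12 / i) - k
i <- 10
k <- exp(6 / i)
j <- exp(12 / i) - k
round(c(j = j, k = k), 4)
# j = 1.4980, k = 1.8221
```

These agree with the `optim` estimates to roughly the tolerance at which `optim` stopped.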
nlsLM
Alternatively, we can use nonlinear least squares. This is slightly easier to specify, since we don't need to define `ss`, but it does require a package. We use `nlsLM` from the minpack.lm package instead of `nls` in core R, since `nls` does not handle zero-residual problems well.
```r
library(minpack.lm)

nlsLM(y ~ f(x, i, j, k), data = list(y = c(6, 12), x = 0:1, i = 10),
  start = c(j = 1, k = 2))
```
giving:
```
Nonlinear regression model
  model: y ~ f(x, i, j, k)
   data: list(y = c(6, 12), x = 0:1, i = 10)
   j    k 
1.50 1.82 
 residual sum-of-squares: 0

Number of iterations to convergence: 4 
Achieved convergence tolerance: 1.49e-08
```
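If the fit is saved to an object, the standard `nls`-style accessors apply. A short sketch (repeating the definition of `f` and the `nlsLM` call above so it runs on its own), where `coef` returns the estimated `j` and `k` and `fitted` confirms the zero residuals:

```r
library(minpack.lm)

f <- function(x, i, j, k) i * log(j * x + k)
fit <- nlsLM(y ~ f(x, i, j, k),
             data = list(y = c(6, 12), x = 0:1, i = 10),
             start = c(j = 1, k = 2))

coef(fit)    # named vector with the estimated j and k
fitted(fit)  # fitted values; equal to y = c(6, 12) up to tolerance
```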