
I use the Levenberg–Marquardt algorithm to fit my nonlinear function f(x,b) (x: Nx1, b: Mx1) to data X: NxK.

Now I want to estimate goodness (confidence) of solution b.

This post says that I should not try to compute R-squared in the nonlinear case. What should I do instead? Are there any reliable universal metrics at all? I could not find any answer by searching.

Anton3
  • The post that you mentioned has a suggestion at the bottom: the standard error of the regression. – Some Guy Jul 28 '16 at 19:20
  • @RandomGuy But what if `X` is such that some parameter `bi` affects `f(x,b)` very little at the solution `b`? Then S can be low, but `bi` will be estimated poorly. – Anton3 Jul 28 '16 at 19:26
  • @RandomGuy Consider this example: `f(x,b) = x1 * b1 + ...`. If `x1=0`, then `f(x,b)` can be made close to zero, but `b1` can't be estimated at all. – Anton3 Jul 28 '16 at 19:32

1 Answer


Standard errors are the square roots of the diagonal of the covariance matrix of the estimates, which is usually calculated as:

Cov(b) = sigma^2 inv(J'J)

or as

Cov(b) = sigma^2 inv(H)

where

J : Jacobian matrix
H : Hessian matrix
sigma^2 = SSE/df = sum of squared errors / (n-p) 

A confidence interval is then

b +- s.e. * t(n-p,alpha/2)

where t(n-p, alpha/2) is the critical value of the Student's t distribution with n-p degrees of freedom.
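The recipe above can be sketched in Python. This is not from the original answer; the exponential model, the noise level, and the use of `scipy.optimize.least_squares` are illustrative assumptions, but the standard-error and confidence-interval formulas are the ones given here.

```python
# Sketch: standard errors and confidence intervals from the Jacobian
# of a least-squares fit. The model y = b0 * exp(b1 * x) and the
# synthetic data are made up for illustration.
import numpy as np
from scipy.optimize import least_squares
from scipy.stats import t

rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0, 50)
b_true = np.array([2.0, 1.3])
y = b_true[0] * np.exp(b_true[1] * x) + rng.normal(0.0, 0.1, x.size)

def residuals(b):
    return b[0] * np.exp(b[1] * x) - y

# Levenberg-Marquardt fit
res = least_squares(residuals, x0=[1.0, 1.0], method="lm")
b = res.x
n, p = x.size, b.size

# sigma^2 = SSE / (n - p)
sse = np.sum(res.fun ** 2)
sigma2 = sse / (n - p)

# Covariance of the estimates: sigma^2 * inv(J'J);
# standard errors are the square roots of its diagonal.
cov = sigma2 * np.linalg.inv(res.jac.T @ res.jac)
se = np.sqrt(np.diag(cov))

# 95% confidence interval: b +- s.e. * t(n-p, alpha/2)
alpha = 0.05
tcrit = t.ppf(1.0 - alpha / 2.0, n - p)
ci = np.column_stack([b - tcrit * se, b + tcrit * se])
```

If `J'J` is (near-)singular, as in the `x1 = 0` example from the comments, the inverse blows up and the corresponding standard errors become huge, which is exactly the signal that the parameter is poorly identified.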

Erwin Kalvelagen