I am wondering how one could solve the following problem in R.

We have a vector v (of n elements) and a matrix B (of dimension m x n). E.g.:

    > v 
    [1] 2 4 3 1 5 7

    > B
         [,1] [,2] [,3] [,4] [,5] [,6]
    [1,]    2    1    5    5    3    4
    [2,]    4    5    6    3    2    5
    [3,]    3    7    5    1    7    6

I am looking for the vector u of length m such that

    sum( ( v - ( u %*% B) )^2 )

is minimized (i.e. the u that minimizes the sum of squared residuals).
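For concreteness, the objective above can also be minimized numerically, e.g. with base R's optim (a generic sketch using the v and B from the question, not necessarily the best approach):

```r
v <- c(2, 4, 3, 1, 5, 7)
B <- rbind(c(2, 1, 5, 5, 3, 4),
           c(4, 5, 6, 3, 2, 5),
           c(3, 7, 5, 1, 7, 6))

# Objective: sum of squared residuals for a candidate u
obj <- function(u) sum((v - u %*% B)^2)

# Minimize starting from the zero vector; BFGS suits a smooth quadratic
fit <- optim(rep(0, nrow(B)), obj, method = "BFGS")
fit$par  # estimated minimizer u
```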

benny

1 Answer


You are describing linear regression, which can be done with the lm function:

coefficients(lm(v~t(B)+0))
#      t(B)1      t(B)2      t(B)3 
#  0.2280676 -0.1505233  0.7431653 
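As a sanity check on the lm approach, the same u can be computed in closed form from the normal equations, u = (B Bᵀ)⁻¹ B v (a sketch assuming the v and B from the question):

```r
v <- c(2, 4, 3, 1, 5, 7)
B <- rbind(c(2, 1, 5, 5, 3, 4),
           c(4, 5, 6, 3, 2, 5),
           c(3, 7, 5, 1, 7, 6))

# Regression fit as in the answer: rows of B as predictors, no intercept
u_lm <- coefficients(lm(v ~ t(B) + 0))

# Closed-form least-squares solution via the normal equations
u_ne <- solve(B %*% t(B), B %*% v)

all.equal(as.numeric(u_lm), as.numeric(u_ne))
```

Both should agree up to numerical precision, since lm solves exactly this least-squares problem.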
josliber
  • Thank you for your quick and clear answer. And what if I am looking for vector *u* such that sum( ( V - ( B %*% diag(u) %*% t(B) ) )^2 ) is minimized (for fixed matrices V and B)? It is very similar, but I don't know whether the lm function can also solve this problem. – benny Jul 08 '15 at 14:25
  • @benny this sounds different enough that I would encourage you to ask a separate question. You can reference this question and answer if it makes it more clear. – josliber Jul 08 '15 at 15:19
  • Thank you, I asked it in a different question: http://stackoverflow.com/questions/31301694/least-square-optimization-of-matrices-in-r – benny Jul 08 '15 at 19:15