
I have three data points through which I have to fit a straight line of the form Y = m*X + C. I want the line to have a pre-determined slope 'm', but the constant 'C' can change to give the least error in the fit, using MATLAB. Can someone help me out?

2 Answers


Just do the math: for a fixed slope m, the least-squares intercept is

C = mean(Y) - m*mean(X)

assuming Y is the vector containing the y coordinates and X the x coordinates. Setting the derivative of the squared error with respect to C to zero gives exactly this expression.
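
A minimal sketch of this in MATLAB, assuming the three points are stored in X and Y (the values below are made up for illustration):

% Hypothetical data: three points and a pre-determined slope
X = [1; 2; 3];
Y = [2.1; 3.9; 6.2];
m = 2;                        % fixed slope

% Least-squares intercept for the fixed slope
C = mean(Y) - m*mean(X);

% Evaluate the fitted line at the data points
Yfit = m*X + C;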

Reference: http://hotmath.com/hotmath_help/topics/line-of-best-fit.html


If you opt to use the Curve Fitting Toolbox, the solution is as follows.

To start, generate some data:

m = 3;                          % pre-determined slope
x = (1:10).';                   % sample x values
y = m*x + 2 + randn(size(x));   % noisy line with true intercept 2

then select the model to fit and set the bounds for its coefficients:

ft = fittype('poly1');                              % linear model p1*x + p2
opts = fitoptions('Method', 'LinearLeastSquares');
opts.Lower = [m -Inf];                              % pin the slope p1 to m
opts.Upper = [m  Inf];                              % by bounding it from both sides

finally, call the fitting routine:

[fitresult, gof] = fit(x, y, ft, opts);

The intercept is stored in fitresult.p2.
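
As a quick check (a sketch assuming the data generated above), the fitted intercept can be compared against the closed-form value from the other answer; the two should agree up to solver tolerance, since both minimize the same squared error:

C_fit     = fitresult.p2;             % intercept from the constrained fit
C_formula = mean(y) - m*mean(x);      % closed-form intercept for a fixed slope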
