I'm plotting some interaction effects that stem from a regression in Stata, using Excel for convenience. The data are curvilinear, so I'm adding a polynomial trendline to maximize the fit. The problem is that the trendline function seems to assume that the x values are 1, 2, 3, and so on, although in my graph they are actually 0, 0.1, 0.2, and so on. The resulting trendline equation therefore does not make much sense.
Example data: X axis:
0
0.1
0.2
0.3
0.4
0.5
0.6
0.7
0.8
0.9
1
Y axis:
0.365794061
0.391834869
0.411805232
0.424622483
0.429572487
0.426375536
0.415212526
0.396708002
0.371872294
0.342010179
0.308607579
This gives a trendline (polynomial, order 2) of y = -0.0036x^2 + 0.0367x + 0.3336, which neatly fits the data (R^2 is over 99%). However, this equation does not allow me to calculate the maximum correctly.
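As a rough cross-check outside Excel, I also refitted the same quadratic against the actual x values (a minimal sketch using Python/numpy, which is just my tool choice here; the data are the plotted values from above):

```python
import numpy as np

# The plotted values from above: x runs 0, 0.1, ..., 1 (not 1, 2, 3, ...)
x = np.linspace(0, 1, 11)
y = np.array([0.365794061, 0.391834869, 0.411805232, 0.424622483,
              0.429572487, 0.426375536, 0.415212526, 0.396708002,
              0.371872294, 0.342010179, 0.308607579])

# Quadratic least-squares fit on the real x values; polyfit returns the
# coefficients from the highest power down
c2, c1, c0 = np.polyfit(x, y, 2)
print(f"y = {c2:.4f}x^2 + {c1:.4f}x + {c0:.4f}")
print("vertex (maximum) at x =", -c1 / (2 * c2))
```

This gives coefficients on the scale of my actual x axis, unlike the Excel trendline.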
The data themselves stem from a negative binomial regression in which the key explanatory variable x enters as both a linear and a quadratic term (creating the curvilinear shape), with b1 = 0.783 and b2 = -0.953.
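For context, my (possibly naive) understanding of where the maximum should sit: with a log link, the predicted curve is exp(b0 + b1*x + b2*x^2), which peaks where the exponent's derivative is zero, i.e. at x = -b1/(2*b2). A small sketch (the intercept b0 is left out because the peak location does not depend on it):

```python
# Peak of the quadratic exponent in the negative binomial model:
# y_hat = exp(b0 + b1*x + b2*x^2) is maximised where b1 + 2*b2*x = 0,
# regardless of the intercept b0.
b1, b2 = 0.783, -0.953
x_max = -b1 / (2 * b2)
print(f"implied maximum at x = {x_max:.3f}")  # about 0.411
```

I'd like to confirm whether this is the right way to read the coefficients, and whether it matches what the plotted curve should show.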
So two alternative questions:
1) Is there a way to fix, override, or circumvent the assumption in Excel upon which the trendlines are based?
2) Is there a better way of deriving the actual shape of the curve straight from the coefficients of the negative binomial regression?
Thanks,
Simon