I was using sklearn.linear_model.LogisticRegression, but it didn't work well.
I started with an x list and a y list. x was the integers from -10 to 10 inclusive, and y was each x value plugged into the logistic function 100 / (1 + e^-x). That means the data should have fit a logistic regression perfectly.
It worked well, but when I perturbed each y value slightly (adding +1, -1, or 0), it no longer did. When plotted with matplotlib.pyplot, the data still clearly followed a logistic curve.
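Roughly, the data were generated like this (a sketch of my setup; the exact noise values were chosen per point from -1, 0, or +1):

```python
import math
import random

# x: the integers from -10 to 10 inclusive
x = list(range(-10, 11))

# y: the logistic function 100 / (1 + e^-x) evaluated at each x
y = [100 / (1 + math.exp(-xi)) for xi in x]

# perturbed y: add -1, 0, or +1 to every value
y_noisy = [yi + random.choice([-1, 0, 1]) for yi in y]
```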
I tried

import numpy as np
from sklearn import linear_model

X = np.array(x).reshape(-1, 1)  # scikit-learn expects a 2-D feature array
reg = linear_model.LogisticRegression(C=10e4)
reg.fit(X, y)
pred = reg.predict(X)
The predicted y had half the points at the minimum (y = 0) and the other half at the maximum (y = 100). I also tried leaving C at its default value, but the same thing happened.
Why is this, and how do I get a correct regression model?
Orange is the model, blue is the actual data. This plot is a little better, but with the default C, all the orange points are above 100 or below 0.