
Given:

  • A training and test dataset with two features, x1 and x2
  • Two classes for binary classification (0 or 1)
  • Training and prediction with a Gaussian Naive Bayes classifier
  • A plot of the decision regions following the scikit-learn guide, with x1 and x2 as the axes

How can the equation of the line that separates the two decision regions be calculated and plotted?
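
For reference, the setup described above looks roughly like the following minimal sketch (synthetic data and illustrative names only; the real datasets differ):

import numpy as np
import matplotlib.pyplot as plt
from sklearn.naive_bayes import GaussianNB

# Synthetic data standing in for the real train/test sets (illustrative only)
rng = np.random.default_rng(0)
X0 = rng.normal(loc=[-1.0, -1.0], scale=0.8, size=(100, 2))  # class 0
X1 = rng.normal(loc=[1.0, 1.0], scale=0.8, size=(100, 2))    # class 1
X = np.vstack([X0, X1])
y = np.array([0] * 100 + [1] * 100)

# Fit the Gaussian Naive Bayes classifier
clf = GaussianNB().fit(X, y)

# Plot decision regions on a grid over the feature space, as in the scikit-learn examples
x1_grid, x2_grid = np.meshgrid(
    np.linspace(X[:, 0].min() - 1, X[:, 0].max() + 1, 300),
    np.linspace(X[:, 1].min() - 1, X[:, 1].max() + 1, 300),
)
Z = clf.predict(np.c_[x1_grid.ravel(), x2_grid.ravel()]).reshape(x1_grid.shape)

plt.contourf(x1_grid, x2_grid, Z, alpha=0.3)
plt.scatter(X[:, 0], X[:, 1], c=y, edgecolor="k")
plt.xlabel("x1")
plt.ylabel("x2")
plt.show()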

A least squares regression fit has already been tried, but the resulting line never matches the decision boundary.

This is the code that has already been tried:

import numpy as np
import matplotlib.pyplot as plt

# x and y are assumed to hold the two feature columns (x1 and x2)
x_mean = np.mean(x)
y_mean = np.mean(y)

# Ordinary least squares slope and intercept of y regressed on x
numer = 0
denom = 0
for i in range(len(x)):
    numer += (x[i] - x_mean) * (y[i] - y_mean)
    denom += (x[i] - x_mean) ** 2
m = numer / denom
c = y_mean - m * x_mean

# Evaluate the fitted line over the range of x
max_x = np.max(x)
min_x = np.min(x)

x_plt = np.linspace(min_x, max_x, 1000)
y_plt = c + m * x_plt

plt.plot(x_plt, y_plt)
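
Would tracing the boundary directly, as the contour where the classifier's predicted probability of class 1 equals 0.5, be a better direction than fitting a regression line? A rough sketch of that idea (reusing clf, X, y and the x1_grid/x2_grid from the sketch above; untested on the real data):

# Rough sketch: trace the boundary where P(class 1) = 0.5
# (assumes the fitted GaussianNB `clf`, data X, y and grid from the sketch above)
proba = clf.predict_proba(np.c_[x1_grid.ravel(), x2_grid.ravel()])[:, 1]
proba = proba.reshape(x1_grid.shape)

plt.contour(x1_grid, x2_grid, proba, levels=[0.5], colors="k")
plt.scatter(X[:, 0], X[:, 1], c=y, edgecolor="k")
plt.xlabel("x1")
plt.ylabel("x2")
plt.show()
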
  • This sounds like a homework question. Could you provide any code that you attempted? – Daniel me Apr 18 '22 at 16:23
  • There are many articles and guides on how to plot decision boundaries/surfaces. [Plot the decision boundaries of a VotingClassifier](https://scikit-learn.org/stable/auto_examples/ensemble/plot_voting_decision_regions.html), [Easily visualize Scikit-learn models’ decision boundaries](https://towardsdatascience.com/easily-visualize-scikit-learn-models-decision-boundaries-dd0fb3747508). Also, here is a similar question: [Plot scikit-learn (sklearn) SVM decision boundary / surface](https://stackoverflow.com/q/51297423/1609514) – Bill Apr 18 '22 at 17:13
  • Sorry, I don't know the formula for the line equation of the decision boundary of a Naive Bayes classifier. I didn't know it was a straight line. Maybe try googling that question or post a question on [Cross Validated](https://stats.stackexchange.com) if you can't find the answer there. – Bill Apr 19 '22 at 04:26

0 Answers