
First, I apologize if this question is off-topic or has been asked before. I had trouble searching for answers because "parameters" already means something in machine learning, but I mean a different kind of parameter, so I don't really know what to search for. I'm actually now wondering if scipy.optimize.curve_fit would do the job?

Here is the problem I'm trying to solve:

  • I have a small training set of x and y.
  • For each item in the set:
    • x = (x1, x2, x3, ..., xn) (floats)
    • y = true/false (binary)

I need to create a custom function f(x). Function f(x) is required for another piece of software, which will generate novel x. The software uses a sampling algorithm to generate x that lead to positive values of f(x) and to avoid x that lead to negative values.

Suffice it to say, the software will use f(x) to create x, so I'd really like it to return large positive numbers for "true" x and large negative numbers for "false" x.

f(x) = a*x1 + b*x2 + ... + n*xn

Can I use machine learning to optimize the parameters (a, b, ..., n) of f(x)?

f(x) should return a positive value for "true" x and a negative value for "false" x.

I can do advanced Python coding; I just have zero experience with machine learning, but I think it might help me parameterize f(x).
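
To make this concrete, here is a rough sketch in Python of the shape of the thing I mean (the numbers below are made-up placeholders, and the weights are exactly what I don't know how to find):

    import numpy as np

    # Toy stand-in for my training set (placeholder numbers, not real data):
    # each row of X is one x = (x1, ..., xn), y holds the matching true/false labels.
    X = np.array([[0.2, 1.5, -0.3],
                  [1.1, 0.4,  0.9],
                  [0.0, 2.2, -1.7]])
    y = np.array([True, False, True])

    def f(x, weights):
        # f(x) = a*x1 + b*x2 + ... + n*xn
        return float(np.dot(weights, x))

    # The weights (a, b, ..., n) are what I need to optimize:
    # ideally f(x) > 0 whenever y is True and f(x) < 0 whenever y is False.
    weights = np.array([1.0, -0.5, 2.0])  # placeholder guess
    for xi, yi in zip(X, y):
        print(f(xi, weights), yi)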

Thank you!

Amanda
  • You can use logistic regression for this. The implementation of this in the scikit-learn package should have everything you need. – RajeshM Mar 11 '21 at 06:46
  • @RajeshM thank you so much. I used scikit-learn logistic regression and it worked perfectly. – Amanda Mar 27 '21 at 06:27
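
For anyone who finds this later, here is a minimal sketch of the logistic-regression approach mentioned in the comments (the data, names, and settings below are illustrative placeholders, not the original poster's code). The fitted coef_ values play the role of a, b, ..., n, intercept_ is a constant offset, and decision_function gives exactly the kind of signed score the question asks for:

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Placeholder training data: replace with the real x vectors and true/false labels.
    X = np.array([[0.2, 1.5, -0.3],
                  [1.1, 0.4,  0.9],
                  [0.0, 2.2, -1.7],
                  [0.5, 0.1,  0.2]])
    y = np.array([1, 0, 1, 0])

    clf = LogisticRegression()
    clf.fit(X, y)

    # Learned linear parameters: coef_ holds a, b, ..., n and intercept_ is the offset.
    weights = clf.coef_[0]
    bias = clf.intercept_[0]

    def f(x):
        # Signed score: positive for x the model classifies as "true", negative for "false".
        # This is the same value as clf.decision_function(x.reshape(1, -1))[0].
        return float(np.dot(weights, x) + bias)

    new_x = np.array([0.3, 1.2, -0.1])
    print(f(new_x), clf.decision_function(new_x.reshape(1, -1))[0])

Since the other software only needs the linear form a*x1 + ... + n*xn, the learned weights and bias can be exported directly and f(x) evaluated without scikit-learn at generation time.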
