
I want to find the zeros of a simple function for given parameters a, b, c. I have to use the Newton-Raphson method. The problem I get when I run the code is that the x variable is not defined.

from scipy import optimize

def Zeros(a, b, c, u):
  return optimize.newton(a*x**2+b*x+c, u, 2*ax+b, args=(a, b, c))

a, b, c are constants of the function f and u is the starting point. So with this function I should be able to obtain a zero by specifying a, b, c and u. For instance:

print Zeros(1, -1, 0, 0.8)

But I obtain "global name 'x' is not defined".

Why does that happen?

mkrieger1
Berni

2 Answers


Like most programming languages, Python works with variables (the names a, b, c, u in your code) and functions (Zeros, for instance).

When a function body runs, every name it references must already be defined. In your case, x does not exist anywhere, so Python raises the NameError.
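The error can be reproduced minimally without scipy at all; nothing here is from the original code, it just shows the same undefined name (`g` is an illustrative name):

```python
def g(a):
    # 'x' is referenced here but never assigned anywhere
    return a * x

try:
    g(2)
except NameError as e:
    print(e)  # name 'x' is not defined
```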

The solution is to define functions of x for both f and its derivative, and pass those to optimize.newton:

from scipy import optimize

def Zeros(a, b, c, u):
    def f(x, a, b, c):
        return a*x**2 + b*x + c
    def fprime(x, a, b, c):
        return 2*a*x + b
    return optimize.newton(f, u, fprime=fprime, args=(a, b, c))

print(Zeros(1, -1, 0, 0.8))
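Equivalently, the two callables can be passed inline as lambdas that close over a, b, c, which avoids the `args` mechanism; this is a sketch of the same fix, and `zeros_lambda` is an illustrative name, not from the original post:

```python
from scipy import optimize

def zeros_lambda(a, b, c, u):
    # the closures capture a, b, c, so no args=(...) is needed
    return optimize.newton(lambda x: a*x**2 + b*x + c, u,
                           fprime=lambda x: 2*a*x + b)

print(zeros_lambda(1, -1, 0, 0.8))  # converges to the root x = 1
```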
Warren Weckesser
Pierre de Buyl
  • Hi @WarrenWeckesser. May I ask what edit was done on the answer? (I cannot find the difference). – Pierre de Buyl Apr 24 '17 at 14:22
  • The `return` statement for `Zeros` was on the same line as the `return` statement of `fprime`. I just added a line break so the last `return` statement was a separate line. – Warren Weckesser Apr 24 '17 at 14:28

A crude way of doing it, to see what's going on!

Define a function:

def f(x):
    return x ** 6 / 6 - 3 * x ** 4 - 2 * x ** 3 / 3 + 27 * x ** 2 / 2 \
        + 18 * x - 30

Define the differential:

def d_f(x):
    return x ** 5 - 12 * x ** 3 - 2 * x ** 2 + 27 * x + 18

Newton-Raphson:

x = 1

d = {'x': [x], 'f(x)': [f(x)], "f'(x)": [d_f(x)]}

for i in range(0, 40):
    x = x - f(x) / d_f(x)
    d['x'].append(x)
    d['f(x)'].append(f(x))
    d["f'(x)"].append(d_f(x))

import pandas as pd

# collect the iterates in a DataFrame to inspect convergence
df = pd.DataFrame(d, columns=['x', 'f(x)', "f'(x)"])
print(df)
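Dropping the pandas bookkeeping, the same iteration can be sanity-checked in plain Python; the final residual check is an addition, not part of the original answer:

```python
def f(x):
    return x**6/6 - 3*x**4 - 2*x**3/3 + 27*x**2/2 + 18*x - 30

def d_f(x):
    return x**5 - 12*x**3 - 2*x**2 + 27*x + 18

# Newton-Raphson update x <- x - f(x)/f'(x), starting from x = 1
x = 1.0
for _ in range(40):
    x -= f(x) / d_f(x)

print(x, f(x))  # f(x) is ~0 once the iteration has converged
```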
mkrieger1