
With SymPy we can set assumptions on variables:

from sympy import *
x = Symbol('x', real=True)
x, re(x), im(x)

the result is:

(x,x,0)

as expected. But if we try to do the same for a function:

from sympy import *
x = Symbol('x', real=True)
f = Function('f', real=True)(x)

f, re(f), im(f)

the result is:

(f(x),re(f(x)),im(f(x)))

but I expected something like

(f(x),f(x),0)
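(As a hedged aside: on SymPy releases that honour assumption keywords on undefined functions, roughly 1.1 and later, the same snippet does evaluate as hoped. A quick check, assuming such a version:)

```python
from sympy import Function, Symbol, re, im

x = Symbol('x', real=True)
# On SymPy versions that accept assumptions on undefined functions,
# real=True attaches to f itself.
f = Function('f', real=True)(x)

print(f.is_real)      # True on such versions
print(re(f), im(f))   # f(x) 0
```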

The question is then: is there a way to make SymPy assume the range of the function to be real, i.e. f: ℝ → ℝ?

This previous question partially answers it, but if we take the derivative we run into the same problem again:

from sympy import *
x = Symbol('x', real=True)
class f(Function):
    is_positive = True  # positive implies real in SymPy's assumption system

df = f(x).diff(x)

df, re(df), im(df)

with result:

(f(x).diff(x),re(f(x).diff(x)),im(f(x).diff(x)))

wanting it to be like

(f(x).diff(x),f(x).diff(x),0)

Is there a way?

asked by iiqof
  • I think you might have to subclass `Function` and define an `_eval_is_real` method. The docs are pretty lacking, but they do happen to have an example of defining a function whose value is real when its argument is real, and that's what the example does. – user2357112 Nov 16 '16 at 22:23
  • I would consider this to be a bug. Derivative should be smart enough to know that the derivative of a real function is real. – asmeurer Nov 17 '16 at 19:04
  • Opened https://github.com/sympy/sympy/issues/11868 – asmeurer Nov 17 '16 at 19:08
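A minimal sketch of the subclass-with-`_eval_is_real` route suggested in the first comment (names are illustrative; this handles `re`/`im` of `f(x)` itself, while the `Derivative` case is what the linked issue tracks):

```python
from sympy import Function, Symbol, re, im

class f(Function):
    def _eval_is_real(self):
        # declare f(x) real whenever its argument is real
        return self.args[0].is_real

x = Symbol('x', real=True)

print(f(x).is_real)        # True
print(re(f(x)), im(f(x)))  # f(x) 0
```

With the assumption in place, `re` and `im` simplify immediately for `f(x)`; whether it propagates through `f(x).diff(x)` depends on the SymPy version.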

0 Answers