I have an Nvidia GPU, have installed the CUDA toolkit, and am trying to make use of it with Numba.
Say I have this code:
from numba import njit, cuda, int32
# @cuda.jit                        (Attempted fix #1)
# @cuda.jit(device=True)           (Attempted fix #2)
# @cuda.jit(int32(int32, int32))   (Attempted fix #3)
@njit
def product(rho, theta):
    x = rho * theta
    return x

a = product(1, 2)
print(a)
How do I make it work with the cuda.jit decorator instead of njit?
Things I've tried:
When I switch the decorator from @njit to @cuda.jit, I get: TypingError: No conversion from int64 to none for '$0.5', defined at None.
When I switch the decorator to @cuda.jit(device=True), I get: TypeError: 'DeviceFunctionTemplate' object is not callable.
And when I specify the types for my inputs and outputs, and use the decorator @cuda.jit(int32(int32,int32)), I get: TypeError: CUDA kernel must have void return type.
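Based on those errors, my current guess (which I haven't been able to verify) is that a CUDA kernel can't return a value, so the result has to be written into an output array passed in as an argument, and that a device=True function can only be called from inside a kernel, not directly from host code. Here is a minimal sketch of what I think that would look like, assuming one thread per element and that NumPy arrays can be passed straight to the kernel launch:

from numba import cuda
import numpy as np

# Device function: my understanding is this can only be called from inside a kernel.
@cuda.jit(device=True)
def product(rho, theta):
    return rho * theta

# Kernel: must have a void return type, so it writes its result into `out`.
@cuda.jit
def product_kernel(rho, theta, out):
    i = cuda.grid(1)  # absolute index of this thread
    if i < out.size:
        out[i] = product(rho[i], theta[i])

rho = np.array([1], dtype=np.int32)
theta = np.array([2], dtype=np.int32)
out = np.zeros(1, dtype=np.int32)
product_kernel[1, 1](rho, theta, out)  # launch with 1 block of 1 thread
print(out[0])

Is that the right pattern, or is there a simpler way to run a single scalar function like this on the GPU?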