Hi, I'm following a neural net tutorial where the author seems to use shared variables everywhere. From my understanding, a shared variable in Theano is simply a block of memory that can be shared between the CPU and the GPU. Anyway, I have two matrices which I declare as shared variables, and I want to perform some operation on them using theano.function. (Question 1) I'd love it if someone could explain why theano.function is useful compared to a regular Python def function. Anyway, I'm setting up my definition like so:
import theano
import theano.tensor as T
from theano import function
import numpy as np
class Transform:
    def __init__(self, dimg):
        dimg = dimg.astype(theano.config.floatX)
        self.in_t = theano.shared(dimg, name='dimg', borrow=True)
    def rotate(self, ox, oy, radians):
        # output buffer, twice the size of the input image
        value = np.zeros((2 * self.in_t.get_value().shape[0],
                          2 * self.in_t.get_value().shape[1]),
                         dtype=theano.config.floatX)
        self.out_t = theano.shared(value,
                                   name='b',
                                   borrow=True)

        # symbolic stand-ins that the givens map onto the shared variables
        din = theano.tensor.dmatrix('a')
        dout = theano.tensor.dmatrix('b')

        def atest():
            return din + dout

        f = function(inputs=[],
                     givens={din: self.in_t,
                             dout: self.out_t},
                     outputs=atest())
        return f()
The problem is that I have no idea how to use the shared variables in a regular function-output call. I understand that I can update them via function([], ..., updates=[(shared_var_1, update_function)]). But how do I access them in my regular function?
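For context, here is roughly the updates-style call I mean, as a minimal sketch with placeholder names (state, inc), not code from the tutorial:

import theano
import theano.tensor as T
import numpy as np

# a shared variable holding some state (placeholder example)
state = theano.shared(np.zeros((2, 2), dtype=theano.config.floatX), name='state')
inc = T.scalar('inc', dtype=theano.config.floatX)

# updates=[(shared_var, new_expression)] overwrites the shared variable
# each time the compiled function is called
accumulate = theano.function(inputs=[inc],
                             outputs=state,  # value of state before the update
                             updates=[(state, state + inc)])

That update pattern I understand; what I'm missing is how to read the shared variables when computing the outputs themselves.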