I need to stack some layers of my own on top of different kinds of PyTorch models that may live on different devices. E.g. `A` is a CUDA model and `B` is a CPU model (but I don't know the device type before I receive the model). The new models are `C` and `D` respectively, where
```python
import torch

class NewModule(torch.nn.Module):
    def __init__(self, base):
        super(NewModule, self).__init__()
        self.base = base
        self.extra = my_layer()  # e.g. torch.nn.Linear()

    def forward(self, x):
        y = self.base(x)
        z = self.extra(y)
        return z

...

C = NewModule(A)  # cuda
D = NewModule(B)  # cpu
```
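For concreteness, here is a minimal sketch of the mismatch, with `my_layer` as a hypothetical stand-in for my extra layer and a plain `Linear` standing in for `A`:

```python
import torch

def my_layer():
    # hypothetical stand-in for the extra layer I want to stack
    return torch.nn.Linear(8, 8)

A = torch.nn.Linear(8, 8).to('cuda')  # stand-in for an arbitrary cuda model
C = NewModule(A)                      # C.base is on cuda, C.extra is still on cpu

x = torch.randn(1, 8, device='cuda')
C(x)  # RuntimeError: expected all tensors to be on the same device
```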
This means I must move `base` and `extra` to the same device, i.e. for `C` both `base` and `extra` should be CUDA models, and for `D` both should be CPU models. So I tried this `__init__`:
```python
def __init__(self, base):
    super(NewModule, self).__init__()
    self.base = base
    self.extra = my_layer().to(base.device)
```
Unfortunately, there is no `device` attribute on `torch.nn.Module`, so this raises an `AttributeError`.
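A quick check confirms it: tensors carry a `.device` attribute, but modules do not.

```python
import torch

m = torch.nn.Linear(4, 2)
m.weight.device  # OK: parameters are tensors and have a device
m.device         # AttributeError: 'Linear' object has no attribute 'device'
```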
What should I do to get the device of `base`? Or is there any other way to make `base` and `extra` end up on the same device automatically, even when the structure of `base` is not known in advance?
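For reference, the closest workaround I've found is to peek at the device of one of `base`'s parameters, but I'm not sure how robust it is (e.g. when `base` has no parameters, or its parameters sit on several devices):

```python
import torch

def get_module_device(module):
    # Hypothetical helper: assumes the module has at least one parameter
    # and that all its parameters live on the same device.
    return next(module.parameters()).device

base = torch.nn.Linear(4, 2)
print(get_module_device(base))  # device(type='cpu')
```

Is this the intended way, or is there something cleaner?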