When training a model on a custom dataset in PyTorch 1.4, the following error is thrown after a seemingly random number of epochs:
RuntimeError: Couldn't open shared file mapping: <torch_15324_2327643205>, error code: <1455>
The dataset is wrapped in a torch.utils.data.DataLoader with 4 workers, equal to the number of physical cores.
import pickle

from torch.utils import data


class TSNDataSet(data.Dataset):
    def __init__(self, pickle_file_paths, transforms):
        self.pickle_file_paths = pickle_file_paths  # list of file paths to pickle files
        self.transforms = transforms
        self.dataset_size = len(pickle_file_paths)

    def __getitem__(self, index):
        # Load one sample from its pickle file on demand.
        with open(self.pickle_file_paths[index], 'rb') as f:
            mffs = pickle.load(f)
        return mffs, index

    def __len__(self):
        return self.dataset_size
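Stripped of PyTorch, the access pattern in __getitem__ is simply "open one pickle file per index"; a minimal stand-alone sketch of that pattern, using synthetic sample files (the real files and their contents are assumptions, since they are not shown above):

```python
import os
import pickle
import tempfile

# Write a few synthetic pickle files standing in for the real dataset.
tmpdir = tempfile.mkdtemp()
pickle_file_paths = []
for i in range(3):
    path = os.path.join(tmpdir, f"sample_{i}.pkl")
    with open(path, "wb") as f:
        pickle.dump({"features": [float(i)] * 4}, f)
    pickle_file_paths.append(path)

def load_item(index):
    # Mirrors TSNDataSet.__getitem__: open the pickle for this index
    # and return the deserialized object together with the index.
    with open(pickle_file_paths[index], "rb") as f:
        return pickle.load(f), index

sample, idx = load_item(1)
print(idx, sample["features"])
```

Each of the 4 DataLoader workers runs this same logic in its own process, which is where the shared-memory file mapping in the error comes into play.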
It would be helpful to know what this error means and what the possible solutions are.