I am trying to run a Dataflow pipeline remotely, and the pipeline uses a pickle file. Locally, I can load the file with the code below.
    import pickle

    with open(known_args.file_path, 'rb') as fp:
        file = pickle.load(fp)  # deserialize the pickled object
However, this fails when the path points to Cloud Storage (gs://...):

    IOError: [Errno 2] No such file or directory: 'gs://.../.pkl'
I roughly understand why it does not work (the built-in open() only knows about the local filesystem), but I cannot find the right way to read the file from GCS.
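For context, my best guess so far is something along these lines, using Beam's FileSystems API. As far as I understand, FileSystems.open() dispatches on the path scheme, so the same call should work for both local paths and gs:// paths, but I am not sure this is the intended approach:

    import pickle
    from apache_beam.io.filesystems import FileSystems

    # FileSystems.open() resolves the filesystem implementation from the
    # path scheme, so it should read local files and gs:// objects alike.
    with FileSystems.open(known_args.file_path) as fp:
        file = pickle.load(fp)

Is this the recommended way to read a pickle file from Cloud Storage in a Dataflow pipeline, or is there a better approach?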