
After training a model I would like to save the training history to my bucket, or to any other location I can access later from my local machine.

When I run the code below on Google Colab, everything works fine:

history = model.fit(training_dataset, steps_per_epoch=steps_per_epoch, epochs=EPOCHS,
                    validation_data=validation_dataset, validation_steps=validation_steps)
# model.summary()
model.save(BUCKET)

# save the history dict as a pickle
with open('/trainHistoryDict', 'wb') as file_pi:
    pickle.dump(history.history, file_pi)

I can read it back later using:

history = pickle.load(open('/trainHistoryDict', "rb"))

However, when I run the code as a job on Google Cloud AI Platform (using %%writefile), I cannot retrieve the history in Google Colab with pickle.load; I get a 'no such file or directory' error. So how can I run training on AI Platform and then access the history from Google Colab? Can I save history.history in a bucket? I tried to use PACKAGE_STAGING_PATH but it didn't work.
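In other words, the save side on the AI Platform worker would look something like the following. This is a sketch, not code from the question: the function name `save_history` and the bucket URI are placeholders, and it assumes the worker can shell out to gsutil (which is available on AI Platform training workers).

```python
import pickle
import subprocess

def save_history(history_dict, local_path='history', gcs_uri=None):
    """Pickle a Keras History.history dict locally and optionally copy it to GCS."""
    with open(local_path, 'wb') as f:
        pickle.dump(history_dict, f)
    if gcs_uri:
        # gsutil is preinstalled on AI Platform training workers
        subprocess.run(['gsutil', 'cp', local_path, gcs_uri], check=True)
    return local_path

# On the worker, after model.fit (placeholder bucket path):
# save_history(history.history, gcs_uri='gs://my-bucket/history')
```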

1 Answer


Found a solution:

# after pickling history.history to a local file named 'history':
subprocess.Popen('gsutil cp history gs://bigdatapart2-storage/history1', shell=True, stdout=subprocess.PIPE)
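To read the history back in Colab, reverse the copy and unpickle. A sketch, assuming gsutil is authenticated in Colab (e.g. after google.colab.auth.authenticate_user()) and using the bucket path from the answer; the helper names are illustrative, not part of any library:

```python
import pickle
import subprocess

def load_history(local_path):
    """Unpickle a history dict from a local file."""
    with open(local_path, 'rb') as f:
        return pickle.load(f)

def fetch_history(gcs_uri, local_path='history1'):
    """Copy a pickled history dict out of GCS, then load it.

    Assumes gsutil is installed and authenticated in the Colab runtime.
    """
    subprocess.run(['gsutil', 'cp', gcs_uri, local_path], check=True)
    return load_history(local_path)

# In Colab (bucket path from the answer above):
# history_dict = fetch_history('gs://bigdatapart2-storage/history1')
# history_dict['loss']  # per-epoch losses, as in Colab-only runs
```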