I have an HDF5 file (an image dataset) of size 17GB which I need to use in Google Colab to train my model. However, when I use the following code to fetch it from my Drive, the runtime automatically gets disconnected right after the authentication step. Is this because of the size of the file, or is something else the issue? Are there any solutions for overcoming this?
The code snippet is below:
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive
from google.colab import auth
from oauth2client.client import GoogleCredentials
# Authenticate and create the PyDrive client.
auth.authenticate_user()
gauth = GoogleAuth()
gauth.credentials = GoogleCredentials.get_application_default()
drive = GoogleDrive(gauth)
# Get the file and save a local copy in the Colab filesystem
downloaded = drive.CreateFile({'id': 'my_id'})  # replace 'my_id' with the ID of the file you want to access
downloaded.GetContentFile('dataset.hdf5')
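
For what it's worth, I was also wondering whether mounting the Drive and reading the HDF5 file in place with h5py (instead of downloading a 17GB copy) would sidestep the disconnect. A rough sketch of what I had in mind is below; the path under My Drive is just a placeholder for wherever the file actually sits:

from google.colab import drive
import h5py

# Mount the whole Drive instead of downloading the file through PyDrive
drive.mount('/content/drive')

# Placeholder path -- adjust to the actual location of the .hdf5 file in My Drive
path = '/content/drive/My Drive/dataset.hdf5'

# h5py opens the file lazily, so the 17GB are not read into memory up front
with h5py.File(path, 'r') as f:
    print(list(f.keys()))  # inspect the datasets stored in the file

I haven't verified whether reading directly from the mounted Drive would be fast enough for training, though, so any advice on that approach is welcome too.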