I am a bit new to Colab and Keras. I want to work on images that are of very large size (each image of 5 gb), available on google drive. Can I read those images directly from google drive, process them on the fly and train a model with those images (batch wise)? The resultant images will be stored on either my desktop or google drive. Is it possible to do that?
1 Answer
Yes, Google Drive can be mounted as a drive in Colab. Once the drive is mounted, you can use regular Python commands to read and write files. You can try something like this:
from google.colab import drive
drive.mount('/content/drive/')
This is just one way that I know of; I believe there are other ways as well.
Here is a helpful link for getting started with Colab.
For batch processing, look at Keras's fit_generator.
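As a rough illustration, fit_generator consumes a Python generator that yields one batch at a time, so only the current batch needs to fit in memory. Below is a minimal sketch of such a generator over files in a mounted Drive folder. The directory path, batch size, and target shape are assumptions, and the loader is a placeholder returning dummy arrays; for the real 5 GB images you would replace it with a windowed/tiled read (e.g. via rasterio or a similar library) rather than loading a whole file at once.

```python
import os
import numpy as np

def batch_generator(image_dir, batch_size=4, target_shape=(256, 256)):
    """Yield (inputs, targets) batches indefinitely, loading files lazily.

    Reading one batch at a time keeps memory bounded even when the
    source images on Drive are very large.
    """
    paths = sorted(os.listdir(image_dir))
    while True:  # fit_generator expects the generator to loop forever
        np.random.shuffle(paths)
        for start in range(0, len(paths) - batch_size + 1, batch_size):
            batch_paths = paths[start:start + batch_size]
            # Placeholder loader: replace with a tiled/resized read of
            # each real image file in batch_paths.
            images = np.stack(
                [np.zeros(target_shape + (3,), dtype=np.float32)
                 for _ in batch_paths])
            # Placeholder labels; derive real targets from your data.
            labels = np.zeros(len(batch_paths), dtype=np.int32)
            yield images, labels
```

You would then pass this generator to model.fit_generator (or model.fit in newer Keras versions), with steps_per_epoch set to the number of batches per epoch. Results can be written back under /content/drive/ with ordinary file I/O so they land in your Google Drive.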

Manoj