I know this is very simple, but I need some directions.
I have a Jupyter Notebook that I used to run on my local Linux machine. The notebook contains some deep learning training code: it imports the dataset, does the preprocessing, and runs the training.
On my local machine I have my dataset located at
'/home/USERNAME/Workspace/Final Year Project/input'
This input folder has two subfolders, train and test. When I run the notebook on my local machine it runs perfectly, but my system has some hardware limitations, so I chose to use Google Colab instead.
The main issue I am facing is how to import the same dataset in Colab. I know it can be done using Google Drive, but how?
Currently I am loading my dataset into a NumPy array using the file path.
If I upload my dataset to Google Drive, how would I use this file path?
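From what I have read, I think the first step is to mount Drive inside the Colab runtime, something like the snippet below. This part is just my guess from the Colab docs; I have not confirmed the rest of the workflow.
from google.colab import drive

# Mount my Google Drive into the Colab filesystem so the files
# become accessible under /content/drive
drive.mount('/content/drive')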
For reference, to get the training data I currently use the function below, which takes the file path as a parameter:
import os
import numpy as np
import skimage.io
import skimage.transform

# Get training data
def get_X_data(path, output_shape=(None, None)):
    '''Loads images from path/{id}/images/{id}.png into a numpy array'''
    img_paths = ['{0}/{1}/images/{1}.png'.format(path, id) for id in os.listdir(path)]
    # Read each image, keep only the first 3 channels/bands, and resize to output_shape
    X_data = np.array([skimage.transform.resize(skimage.io.imread(p)[:, :, :3], output_shape=output_shape,
                                                mode='constant', preserve_range=True)
                       for p in img_paths], dtype=np.uint8)
    return X_data

X_train = get_X_data(train_path, output_shape=(img_height, img_width))
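If mounting works the way I think it does, would the call then just become something like the following? The path below is hypothetical; I am assuming I upload the whole input folder to the top level of My Drive.
# Hypothetical path, assuming the 'input' folder is at the top level of My Drive
# (the exact Drive folder name may differ, e.g. 'My Drive' instead of 'MyDrive')
train_path = '/content/drive/MyDrive/input/train'
X_train = get_X_data(train_path, output_shape=(img_height, img_width))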
Any help would be really appreciated. Thank you.