
I'm doing deep learning work. I have TensorFlow 2.0 (CPU version), and when I try to run the code below in PyCharm (or a Jupyter notebook), it uses 3 GB of memory (RAM), even though I only have 6 GB of RAM.

The dataset I'm using has 50,000 training pictures and 10,000 test pictures (as far as I remember). The code is:

import tensorflow as tf
from tensorflow import keras
import matplotlib.pyplot as plt
import numpy as np

# Load CIFAR-10: 50,000 training and 10,000 test images, 32x32 RGB each
cifar10 = keras.datasets.cifar10
(x_train, y_train), (x_test, y_test) = cifar10.load_data()

class_names = ['airplane', 'automobile', 'bird',
               'cat', 'deer', 'dog', 'frog',
               'horse', 'ship', 'truck'
               ]

# Scale pixel values from [0, 255] to [0.0, 1.0]
x_train = x_train / 255.0
x_test = x_test / 255.0

model = keras.Sequential([
    keras.layers.Flatten(input_shape=(32, 32, 3)),
    keras.layers.Dense(128, activation='relu'),
    keras.layers.Dense(10, activation='softmax'),
])

model.compile(optimizer='Adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])
model.fit(x_train, y_train, epochs=1)

image_index = 1

prediction = model.predict(x_test)
plt.grid(False)
plt.imshow(x_test[image_index])
# y_test has shape (10000, 1), so take the scalar label out of its array
plt.title('Actual Img: ' + class_names[y_test[image_index][0]])
plt.xlabel('Predicted: ' + class_names[np.argmax(prediction[image_index])])
plt.show()

It happens whenever I try to train the model.

There is no problem with the code itself, but as I said, the computer almost crashes whenever I run it. [Screenshot of the memory usage omitted.]

I hope somebody can help; just drop whatever you think the answer might be. Thanks a lot <3


2 Answers


Use batches for training; that should fix it.
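For example, you can pass batch_size to fit explicitly (Keras uses 32 by default when none is given); a minimal sketch with the model from the question:

# Train in mini-batches of 32 samples per step (this is also the Keras default),
# so only one batch at a time goes through the forward/backward pass
model.fit(x_train, y_train, epochs=1, batch_size=32)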


The other 3 GB is being used by your OS and other processes, and that is probably what is nearly crashing your PC. Load your data in batches for training.
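A minimal sketch of one way to batch the training data with a tf.data pipeline, assuming the x_train/y_train arrays from the question. Note that from_tensor_slices still keeps the source arrays in RAM, so it also helps to cast to float32 before normalizing: dividing a uint8 array by 255.0 otherwise produces float64, eight times the size of the raw uint8 data.

import tensorflow as tf

# Cast to float32 before normalizing: uint8 / 255.0 would yield float64,
# which doubles the memory footprint compared to float32
x_train = x_train.astype('float32') / 255.0

# Stream the data in batches; only one batch is materialized per training step
train_ds = (tf.data.Dataset.from_tensor_slices((x_train, y_train))
            .shuffle(10000)
            .batch(32)
            .prefetch(tf.data.experimental.AUTOTUNE))

model.fit(train_ds, epochs=1)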
