
My Google Colab session keeps crashing with the message "Your session crashed after using all available RAM", even though I am using a small dataset. Test size = 999892, train size = 2999892.

I have been searching for a solution to this problem but have been unable to find one.

import re
from nltk.tokenize import RegexpTokenizer
from sklearn.feature_extraction.text import CountVectorizer, TfidfTransformer

tokenizer = RegexpTokenizer(r'\w+')
cv = CountVectorizer()  # bag-of-words counts, converted to TF-IDF below

corpus = []
for i in range(0, 299989):
    # strip HTML tags first, then keep letters only
    SentimentText = re.sub('<[^<]+?>', ' ', dataset1['SentimentText'][i])
    SentimentText = re.sub('[^a-zA-Z]', ' ', SentimentText)
    SentimentText = ' '.join(tokenizer.tokenize(SentimentText))
    corpus.append(SentimentText)

X_train = cv.fit_transform(corpus)
X_train = TfidfTransformer().fit_transform(X_train)

X_train = X_train.toarray()
y_train = dataset.iloc[:, 1].values

After executing the third part (the X_train = X_train.toarray() line), the session-crash error is displayed.
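
For reference, here is a rough estimate of how large the dense array would be. The vocabulary size below is only a guess (I have not printed the real value, which would be len(cv.vocabulary_)):

# Back-of-the-envelope estimate of the dense array produced by .toarray().
# NOTE: n_features is a guessed vocabulary size, not the real value.
n_docs = 299989
n_features = 100_000      # hypothetical vocabulary size
bytes_per_value = 8       # float64
dense_bytes = n_docs * n_features * bytes_per_value
print(f"Dense matrix would need ~{dense_bytes / 1e9:.0f} GB")  # ~240 GB with these numbers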

1 Answer


Try turning on the GPU: open the Edit menu, select Notebook settings, and under Hardware accelerator choose GPU. This will increase the amount of RAM available to the session.
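
If you want to verify how much memory the runtime actually has after switching, a quick check (just a sketch, nothing here is specific to your dataset):

import psutil

ram = psutil.virtual_memory()
print(f"Total RAM: {ram.total / 1e9:.1f} GB, available: {ram.available / 1e9:.1f} GB")

# On a GPU runtime, this shell command shows the attached GPU and its memory:
# !nvidia-smi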
