My session in Google Colab keeps crashing with the message "Your session crashed after using all available RAM", even though my dataset is fairly small (test size = 999892, train size = 2999892).
I have searched for a solution to this problem but have been unable to find one.
import re
from nltk.tokenize import RegexpTokenizer

tokenizer = RegexpTokenizer(r'\w+')
corpus = []
for i in range(0, 299989):
    # strip HTML tags first, then drop everything except letters
    SentimentText = re.sub('<[^<]+?>', ' ', dataset1['SentimentText'][i])
    SentimentText = re.sub('[^a-zA-Z]', ' ', SentimentText)
    # tokenize and rejoin to collapse the extra whitespace
    SentimentText = ' '.join(tokenizer.tokenize(SentimentText))
    corpus.append(SentimentText)
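For clarity, this is what the cleaning step produces for a single made-up review (the HTML string below is only an illustrative example, not a row from my data):

sample = "<b>Loved it!!</b> 10/10, would watch again :)"
sample = re.sub('<[^<]+?>', ' ', sample)                   # remove HTML tags
sample = re.sub('[^a-zA-Z]', ' ', sample)                  # keep letters only
print(' '.join(RegexpTokenizer(r'\w+').tokenize(sample)))  # "Loved it would watch again"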
from sklearn.feature_extraction.text import CountVectorizer

cv = CountVectorizer()
X_train = cv.fit_transform(corpus)  # returns a sparse document-term matrix
from sklearn.feature_extraction.text import TfidfTransformer
X_train = TfidfTransformer().fit_transform(X_train)
# converting the sparse matrix to a dense NumPy array is extremely memory-hungry
X_train = X_train.toarray()
y_train = dataset1.iloc[:, 1].values  # sentiment labels
The session crashes with the error above right after the third part (the toarray() call) finishes executing.
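I suspect that toarray() call is what kills the session: the TF-IDF matrix for ~300,000 documents is stored sparsely, and even with a modest vocabulary of, say, 100,000 terms, the dense float64 copy would need roughly 299989 * 100000 * 8 bytes ≈ 240 GB, while the free Colab tier only provides on the order of 12 GB of RAM. Below is a minimal sketch of what I could try instead, keeping everything sparse end to end; TfidfVectorizer (a one-step replacement for the CountVectorizer + TfidfTransformer pair) and the LogisticRegression placeholder are my assumptions, since I have not shown my actual classifier:

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# TfidfVectorizer does the counting and the TF-IDF weighting in one step
# and returns a scipy.sparse matrix, so no dense copy is ever made
vectorizer = TfidfVectorizer()
X_train_sparse = vectorizer.fit_transform(corpus)

# placeholder classifier (assumption): LogisticRegression accepts sparse
# input directly, so toarray() is never needed
clf = LogisticRegression(max_iter=1000)
clf.fit(X_train_sparse, y_train)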