
I'm new to ML and trying to build a model by fine-tuning GPT-2.

I got the dataset and preprocessed it (file_name). But when I actually run the code below to fine-tune GPT-2, Colab always says 'Your session crashed after using all available RAM.'

import gpt_2_simple as gpt2

gpt2.download_gpt2(model_name='124M')   # download the pretrained 124M weights
sess = gpt2.start_tf_sess()             # start a TensorFlow session

gpt2.finetune(sess,
              dataset=file_name,        # path to the preprocessed dataset
              model_name='124M',
              steps=50,
              restore_from='fresh',     # start from the pretrained weights
              run_name='run1',
              print_every=10,
              sample_every=10,
              save_every=10,
              batch_size=16
              )

I'm already on Colab Pro with 25 GB of RAM, and the file is only about 500 MB. I tried lowering the training steps and the batch size, but the error keeps happening.
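For reference, this is roughly the smallest configuration I tried (the same gpt-2-simple call as above, just with fewer steps and the minimum batch size; the exact numbers are only an example):

gpt2.finetune(sess,
              dataset=file_name,
              model_name='124M',
              steps=10,                 # fewer training steps
              restore_from='fresh',
              run_name='run1',
              print_every=10,
              sample_every=10,
              save_every=10,
              batch_size=1              # smallest possible batch
              )

The session still crashes the same way.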

Any idea how I can stop this behavior?

Seungjun
