I'm fine-tuning a GPT-2 model following a tutorial, with its associated GitHub repository:
https://github.com/nshepperd/gpt-2
I have been able to replicate the examples. My issue is that I can't find a parameter to set the number of iterations: the training script prints a sample every 100 iterations and saves a model checkpoint every 1000 iterations, but there doesn't seem to be a parameter to train it for, say, 5000 iterations and then stop.
The script for training is here: https://github.com/nshepperd/gpt-2/blob/finetuning/train.py
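For context, the training loop in that script has no stopping condition; it is driven by a counter inside a try/except. A simplified sketch of its structure (paraphrased; helper names like save() and generate_samples() only approximate the actual script):

counter = 1
try:
    while True:  # no iteration limit: runs until interrupted
        if counter % args.save_every == 0:
            save()              # write a checkpoint every --save_every steps (1000 by default)
        if counter % args.sample_every == 0:
            generate_samples()  # print generated text every --sample_every steps (100 by default)
        # ... run one optimization step on the next batch ...
        counter += 1
except KeyboardInterrupt:
    save()  # save a final checkpoint when stopped with Ctrl+C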
EDIT:
As suggested by cronoik, I'm trying to replace the while loop with a for loop.
I'm adding these changes:
Adding one additional argument:
parser.add_argument('--training_steps', metavar='STEPS', type=int, default=1000, help='a number representing how many training steps the model shall be trained for')
Changing the loop:
try:
    for iter_count in range(training_steps):
        if counter % args.save_every == 0:
            save()
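(The intent is that only the while True: line is replaced; the rest of the original loop body, including the counter += 1 at the end, stays indented under the new for, roughly:)

try:
    for iter_count in range(training_steps):
        if counter % args.save_every == 0:
            save()
        if counter % args.sample_every == 0:
            generate_samples()
        # ... rest of the original loop body unchanged ...
        counter += 1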
Using the new argument:
python3 train.py --training_steps 300
But I'm getting this error:
File "train.py", line 259, in main
for iter_count in range(training_steps):
NameError: name 'training_steps' is not defined