1

I'm trying to fine-tune a GPT-3 model on my tweets. I want the model to generate tweets with no prompt. Is it possible?

The dataset is required to have "prompt" and "completion" columns. Do I just take the first couple of words of each tweet and use them as the prompt?
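
If it helps, here is roughly what I had in mind, with the prompt taken from the first few words of each tweet (made-up examples, JSONL format):

```
{"prompt": "Monday mornings", "completion": " should be optional, honestly."}
{"prompt": "Just saw a", "completion": " dog wearing sunglasses and my whole week is made."}
```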

user1743703
  • 133
  • 2
  • 10
  • 3
    You could also try to have the prompt always be `"Someone tweeted this: \""`, and see how good the results are after fine-tuning. – Philipp Lenssen Oct 03 '22 at 10:11

2 Answers

0

What I did to create something similar was to use all past tweets as the dataset, then generate new ones and add them back to it. For instance, you could start with something like this:

Tweet1: Life is just an arc in the wind.

Tweet2: Crazy day to be alive!

Tweet3:

Then run GPT on that prompt to get something like this:

Tweet1: Life is just an arc in the wind.

Tweet2: Crazy day to be alive!

Tweet3: I love life! Keep on living people!

Then you just append the next empty slot, like this:

Tweet1: Life is just an arc in the wind.

Tweet2: Crazy day to be alive!

Tweet3: I love life! Keep on living people!

Tweet4:

and continue on from there to get unlimited tweets!

NOTE: the more initial data you give, say 15-50 pre-written tweets of varying length and topic, the more likely you are to get a well-rounded, unique tweet each time. Otherwise it might just stick to the same topic.
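
Here is a rough sketch of that loop in Python, assuming the (pre-1.0) `openai` package and the legacy Completion API; the model name, temperature, and stop sequence are just reasonable defaults you would want to tune:

```python
import openai  # pre-1.0 openai package; reads the API key from OPENAI_API_KEY

# Seed the context with real past tweets (the more varied, the better).
tweets = [
    "Life is just an arc in the wind.",
    "Crazy day to be alive!",
]

def generate_next_tweet(history):
    # Build the "Tweet1: ... Tweet2: ..." prompt and leave the next slot open.
    prompt = "\n".join(f"Tweet{i + 1}: {t}" for i, t in enumerate(history))
    prompt += f"\nTweet{len(history) + 1}:"

    response = openai.Completion.create(
        model="text-davinci-003",  # assumption: any GPT-3 completion model works here
        prompt=prompt,
        max_tokens=60,
        temperature=0.9,
        stop=["\n"],               # stop at the end of the line: exactly one tweet
    )
    return response["choices"][0]["text"].strip()

# Append each generated tweet and keep going for "unlimited" tweets.
for _ in range(5):
    new_tweet = generate_next_tweet(tweets)
    tweets.append(new_tweet)
    print(new_tweet)
```

In practice you would also want to cap `history` at the most recent 15-50 tweets so the prompt stays inside the model's context window.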

0

GPT is a generative model, so the prompt needs some input to generate a response. What you can do is this:

1. Create a category/genre for your tweets, e.g. memes, humor, etc.
2. Fine-tune it as follows: prompt: category/genre, completion: your tweet.

This way you have structured data ready for fine-tuning. Hope this helps.
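
A minimal sketch of preparing that training file, assuming the JSONL format the GPT-3 fine-tuning endpoint expects (the ` ->` separator and the leading space / trailing newline in the completion are just common conventions, not something this answer requires):

```python
import json

# Hypothetical labelled tweets: (category/genre, tweet text).
labelled_tweets = [
    ("humor", "Coffee is the only meeting I never skip."),
    ("memes", "Me: I'll sleep early tonight. Also me at 3am: one more episode."),
]

with open("tweets_finetune.jsonl", "w", encoding="utf-8") as f:
    for category, tweet in labelled_tweets:
        record = {
            # " ->" marks the end of the prompt; the completion starts with a
            # space and ends with a newline so the model learns where to stop.
            "prompt": f"{category} ->",
            "completion": f" {tweet}\n",
        }
        f.write(json.dumps(record) + "\n")

# Then fine-tune with the legacy CLI, e.g.:
#   openai api fine_tunes.create -t tweets_finetune.jsonl -m davinci
```

At inference time you then prompt the fine-tuned model with just the category (e.g. `"humor ->"`) and it completes with a tweet in that style.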

Muhammad Ahmed
  • 318
  • 2
  • 8