Questions tagged [peft]
16 questions
0 votes, 0 answers
I am trying to use a GPT-J LoRA model to generate text, but the generated output seems to be capped at 20 tokens. How can I make it longer?
import transformers
from transformers import AutoTokenizer
#from transformers import AutoModelWithHeads
# model is assumed to be loaded earlier (not shown in this excerpt)
model.load_adapter("./", adapter_name='lora')
peft_model_path = "./"
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-j-6B")
tokenizer.pad_token =…

Dapeng Zhang
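The 20-token cap matches the default generation length in transformers, so the usual fix is to pass an explicit max_new_tokens (or max_length) to generate(). Below is a minimal sketch, assuming the question's LoRA-adapted GPT-J model and its tokenizer are already loaded; the prompt and token budget are illustrative values, not from the question.

# Assumes `model` is the LoRA-adapted GPT-J model and `tokenizer` is the
# EleutherAI/gpt-j-6B tokenizer configured in the snippet above.
prompt = "Once upon a time"
inputs = tokenizer(prompt, return_tensors="pt")

# max_new_tokens sets how many tokens are generated beyond the prompt;
# without it, generate() falls back to a default max_length of 20 tokens.
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))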