
On the open_llm_leaderboard there are many interesting 30B LoRA models with extremely good performance.

But how can I load them without an adapter_config.json?

I'm sorry, I'm new to the field, but if I understand correctly, the .bin LoRA weights I can download should be loadable on top of the correct base model.

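For context, this is what I understand the normal loading path to look like with PEFT when the adapter repo does include an adapter_config.json (the base model and adapter repo names below are placeholders, not the actual models I'm trying to load):

```python
# Minimal sketch of how I understand PEFT loads a LoRA adapter when
# adapter_config.json IS present. Repo names are placeholders.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "huggyllama/llama-30b"       # assumed base model
adapter_id = "someuser/some-30b-lora"  # assumed adapter repo

base_model = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")
tokenizer = AutoTokenizer.from_pretrained(base_id)

# This call reads adapter_config.json from the adapter repo and then
# loads the adapter_model.bin weights on top of the base model.
model = PeftModel.from_pretrained(base_model, adapter_id)
```
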
However, in those high-rank LoRA repositories I cannot find an adapter_config.json among the files.

Can I create one myself, or do I have to retrain them? Can I retrain them with text-generation-webui? (A sketch of what I mean by "create one myself" is below.)

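This is the rough idea I had for writing the missing config by hand: read the rank and target modules back out of the downloaded .bin, then write a minimal adapter_config.json. The base model name and lora_alpha here are guesses and would have to match whatever the LoRA was actually trained with, so I'm not sure this is valid:

```python
# Sketch only: infer rank and target modules from the LoRA weights and
# write a minimal adapter_config.json. base_model_name_or_path and
# lora_alpha are guesses that must match the original training setup.
import json
import torch

state_dict = torch.load("adapter_model.bin", map_location="cpu")

target_modules = set()
rank = None
for key, tensor in state_dict.items():
    if "lora_A" in key:
        rank = tensor.shape[0]  # lora_A has shape (r, in_features)
        target_modules.add(key.split(".lora_A")[0].split(".")[-1])

config = {
    "peft_type": "LORA",
    "task_type": "CAUSAL_LM",
    "base_model_name_or_path": "huggyllama/llama-30b",  # assumed base model
    "r": rank,
    "lora_alpha": 2 * rank,  # pure guess; alpha = 2r is just a common convention
    "lora_dropout": 0.0,
    "target_modules": sorted(target_modules),
    "bias": "none",
    "inference_mode": True,
}

with open("adapter_config.json", "w") as f:
    json.dump(config, f, indent=2)
```

Would a hand-written config like this be enough for PeftModel.from_pretrained to accept the adapter, or is retraining unavoidable?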