We all hear GPT-3 called a large language model (LLM), but is it really more of a framework, since you can use GPT-3 with your own dataset to train your own version of a GPT-3 model?
My understanding is that a model is the result of training, and you can use one of many frameworks/libraries (e.g. TensorFlow) to train that model. If GPT-3 were just a model, you wouldn't be able to train it on your own data, right? So does that make GPT-3 a framework?
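To show what I mean by "a model is the result of training," here is a toy sketch in plain Python (purely illustrative, not actual GPT-3 or TensorFlow code): the training code plays the role of the "framework," and the learned parameters it produces play the role of the "model."

```python
def train(data):
    """Toy 'framework': fits y = w * x by averaging the y/x ratios."""
    w = sum(y / x for x, y in data) / len(data)
    return {"w": w}  # the "model" is just this learned parameter


def predict(model, x):
    """Inference: apply the trained model to new input."""
    return model["w"] * x


model = train([(1, 2), (2, 4), (3, 6)])  # training produces a model
print(predict(model, 5))                 # the model is then used for prediction
```

Under this picture, the training code is reusable across datasets, while each trained model is a fixed artifact, and I'm not sure which side of that line GPT-3 falls on.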
Can anyone help me better understand the AI terminology here?