I am using Colab to interact with a CSV file through LangChain, with a local Llama 2 GGML model loaded via CTransformers. This is my code:
from langchain.llms import CTransformers
from langchain.document_loaders import CSVLoader
from langchain.embeddings import HuggingFaceEmbeddings
from langchain.vectorstores import FAISS
from langchain.chains import ConversationalRetrievalChain

# Load the locally downloaded model
def load_llm():
    llm = CTransformers(
        model="llama-2-7b-chat.ggmlv3.q8_0.bin",
        model_type="llama",
        max_new_tokens=512,
        temperature=0.5
    )
    return llm
uploaded_file = '/content/2019.csv'
loader = CSVLoader(file_path=uploaded_file, encoding="utf-8", csv_args={'delimiter': ','})
data = loader.load()
# Initialize embeddings and vector store
embeddings = HuggingFaceEmbeddings(model_name='sentence-transformers/all-MiniLM-L6-v2', model_kwargs={'device': 'cpu'})
db = FAISS.from_documents(data, embeddings)
llm = load_llm()
chain = ConversationalRetrievalChain.from_llm(llm=llm, retriever=db.as_retriever())
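My understanding (not verified) is that CTransformers only treats the model string as a local file if that file actually exists on the Colab VM, and otherwise falls back to looking the string up as a Hugging Face repo id. A quick check like this (just a sketch; the path is where I assumed the file would be, relative to the working directory) seems relevant:

import os

model_path = "llama-2-7b-chat.ggmlv3.q8_0.bin"  # assumed location of the GGML file
print(os.path.exists(model_path))  # if this is False, the name is presumably sent to the Hub as a repo id

Running the chain anyway produces the traceback below.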
The above exception was the direct cause of the following exception:
RepositoryNotFoundError Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/huggingface_hub/utils/_errors.py in hf_raise_for_status(response, endpoint_name)
291 " make sure you are authenticated."
292 )
--> 293 raise RepositoryNotFoundError(message, response) from e
294
295 elif response.status_code == 400:
RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-64e367a0-03a4565e09b69d8b0d3624fb;5108be96-cf03-475c-a68f-c6481f96b567)
Repository Not Found for url: https://huggingface.co/api/models/llama-2-7b-chat.ggmlv3.q8_0.bin/revision/main.
Please make sure you specified the correct `repo_id` and `repo_type`.
If you are trying to access a private or gated repo, make sure you are authenticated.
Invalid username or password.
https://huggingface.co/api/models/llama-2-7b-chat.ggmlv3.q8_0.bin/revision/main is not found (it returns a 404), which is why I am getting this error. What is the resolution to get this model working in Colab? Any suggestion would be really helpful.
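One thing I am considering trying is downloading the GGML file explicitly and passing its local path to CTransformers instead of a bare file name. This is only a sketch: I have not verified that TheBloke/Llama-2-7B-Chat-GGML is the correct repo id for this exact file, or that it fixes the rest of the chain.

from huggingface_hub import hf_hub_download
from langchain.llms import CTransformers

# Assumption: this repo hosts the exact GGML file name used in my code above.
local_model_path = hf_hub_download(
    repo_id="TheBloke/Llama-2-7B-Chat-GGML",      # assumed repo id, not verified
    filename="llama-2-7b-chat.ggmlv3.q8_0.bin",   # same file name as in my code
)

llm = CTransformers(
    model=local_model_path,  # absolute local path instead of a bare file name
    model_type="llama",
    max_new_tokens=512,
    temperature=0.5
)

Would that be the right approach, or is there a better way to make the GGML model available to CTransformers in Colab?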