
I am trying to build an LLM that generates SQL queries. Does anyone know how to use HuggingFacePipeline.from_pretrained to load a locally stored LLM model? The from_pretrained method does not seem to exist on HuggingFacePipeline.

--------- Additional Information --------

I have used LaMini-T5-738M as the base model, run locally; the model files are downloaded to disk. Since huggingface does not seem to have a loading function for local LLM models, I used a pipeline object to load the model, as per the code below. My intention is to connect it to a DB and generate queries, but that does not seem to be working right.
I am uncertain whether this is due to a limitation of the model used. If so, I cannot find any references online to a model which supports anything beyond chat and QnA. Below is my code -

from transformers import AutoModelForSeq2SeqLM, AutoTokenizer, pipeline
from langchain.llms import HuggingFacePipeline
from langchain.sql_database import SQLDatabase
from langchain.agents import create_sql_agent
from langchain.agents.agent_toolkits import SQLDatabaseToolkit
from langchain_experimental.sql import SQLDatabaseChain  # langchain.chains in older versions

db_uri = f"type+psycopg2://{username}:{password}@{host}:{port}/{dbName}"
db = SQLDatabase.from_uri(db_uri)

checkpoint = "LaMini-T5-738M"  # local directory containing the model files

modelSeq = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)
tokenizer = AutoTokenizer.from_pretrained(checkpoint)

pipe = pipeline("text2text-generation",
                model=modelSeq,
                tokenizer=tokenizer)
local_llm = HuggingFacePipeline(pipeline=pipe)
db_chain = SQLDatabaseChain(llm=local_llm, database=db, verbose=True)

toolkit = SQLDatabaseToolkit(db=db, llm=local_llm)
agent_executor = create_sql_agent(toolkit=toolkit, llm=local_llm)
agent_executor.run(<query>)  # GIVES GARBAGE OR UNINTELLIGIBLE OUTPUT
RaptorX
  • Welcome to Stack Overflow. Please take the [tour] to learn how Stack Overflow works and read [ask] on how to improve the quality of your question. Please see: [Why is “Can someone help me?” not an actual question?](https://meta.stackoverflow.com/q/284236). Please show your attempts you have tried and the problems/error messages you get from your attempts. It is unclear what you are asking or what the problem is. – Progman Aug 04 '23 at 16:54
  • @Progman updated query with inputs. – RaptorX Aug 05 '23 at 18:15

0 Answers