
I'm new to LangChain and I'm trying to create a simple QA bot over documents. Following the documentation and guides on their website, I've created a simple working bot, but I'm struggling to understand certain parts of the code.

template = """Use the following pieces of context to answer the question at the end. 
If you don't know the answer, just say that you don't know, don't try to make up an answer. 
Use three sentences maximum and keep the answer as concise as possible. 
Always say "thanks for asking!" at the end of the answer. 
{context}
Question: {question}
Helpful Answer:"""

from langchain.chains import RetrievalQA
from langchain.chat_models import ChatOpenAI
from langchain.prompts import PromptTemplate

QA_CHAIN_PROMPT = PromptTemplate(input_variables=["context", "question"], template=template)

llm = ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0)
qa = RetrievalQA.from_chain_type(llm,
                                chain_type='stuff',
                                retriever=vectorstore.as_retriever(),
                                chain_type_kwargs={"prompt": QA_CHAIN_PROMPT})

query = "some query"
print(qa.run(query))

Given the sample code above, I have some questions.

  1. What is the point of having {context} and {question} inside our prompt template, when no arguments are passed inside?

  2. What does chain_type_kwargs={"prompt": QA_CHAIN_PROMPT} actually accomplish?

  3. If I were to include a new argument inside my prompt (e.g. {name}), how do I actually pass in the value for that argument?

whatsggon
  • Here's something that might assist you: consider exploring this implementation using LangChain - you can find it at [PrivateDocBot](https://github.com/Abhi5h3k/PrivateDocBot) – Abhi Aug 27 '23 at 15:26

1 Answer


What is the point of having {context} and {question} inside our prompt template, when no arguments are passed inside?

Answer - The {context} and {question} placeholders inside the prompt template are meant to be filled in with actual values when a prompt is generated from the template. With RetrievalQA you never fill them in yourself: at query time the chain puts your query into {question} and the retrieved documents into {context}.
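As a plain-Python sketch (PromptTemplate substitution works like str.format, so no LangChain is needed to see the mechanics), this is what "filling in the placeholders" amounts to:

```python
# Plain-Python sketch of the placeholder substitution that PromptTemplate
# performs when the chain runs: variables are filled in by name.
template = """Use the following pieces of context to answer the question at the end.
{context}
Question: {question}
Helpful Answer:"""

filled = template.format(
    context="LangChain is a framework for building LLM applications.",
    question="What is LangChain?",
)
print(filled)  # the braces are gone; real text has been substituted
```

Until .format() (or the chain) runs, the template is just a string with literal {context} and {question} markers in it.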

What does chain_type_kwargs={"prompt": QA_CHAIN_PROMPT} actually accomplish?

Answer - chain_type_kwargs is used to pass additional keyword arguments to the underlying question-answering chain that RetrievalQA builds. Here you are passing your custom prompt (QA_CHAIN_PROMPT) so it is used in place of the chain's default prompt.
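To see the forwarding mechanism, here is a deliberately simplified sketch (these are stand-in functions, not LangChain's real internals): from_chain_type unpacks the chain_type_kwargs dict as keyword arguments to the factory that builds the inner chain, which is how your "prompt" key reaches it.

```python
# Simplified stand-in for the chain factory: it accepts an optional
# custom prompt, falling back to a default otherwise.
def load_stuff_chain(llm, prompt="DEFAULT_PROMPT"):
    return {"llm": llm, "prompt": prompt}

# Simplified stand-in for RetrievalQA.from_chain_type: chain_type_kwargs
# is unpacked (**) into the factory call, so {"prompt": ...} becomes
# the prompt= keyword argument.
def from_chain_type(llm, chain_type="stuff", chain_type_kwargs=None):
    return load_stuff_chain(llm, **(chain_type_kwargs or {}))

chain = from_chain_type("my-llm", chain_type_kwargs={"prompt": "MY_PROMPT"})
print(chain["prompt"])  # → MY_PROMPT
```

Without chain_type_kwargs, the inner chain would simply use its built-in default prompt.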

If I were to include a new argument inside my prompt (e.g. {name}), how do I actually pass in the value for that argument?

Answer - Add the placeholder to the template, declare it in input_variables, and fill it in before handing the prompt to the chain. Two caveats: {answer} should not be a template variable, because the answer is what the LLM generates; and since RetrievalQA fills in {context} and {question} itself, an extra variable like {name} is best pre-filled with .partial(). Your code will look like below, please find my comments inline:

        template = """Use the following pieces of context to answer the question at the end. 
        If you don't know the answer, just say that you don't know, don't try to make up an answer. 
        Use three sentences maximum and keep the answer as concise as possible. 
        Always say "thanks for asking!" at the end of the answer. 
        {context}
        Question: {question}
        Name: {name}
        Helpful Answer:"""

        # Declare the new parameter alongside context and question
        QA_CHAIN_PROMPT = PromptTemplate(input_variables=["context", "question", "name"],
                                         template=template)

        # Pre-fill the extra parameter; RetrievalQA fills in
        # {context} and {question} itself at query time
        QA_CHAIN_PROMPT = QA_CHAIN_PROMPT.partial(name="AIBot")

        qa = RetrievalQA.from_chain_type(llm,
                                         chain_type='stuff',
                                         retriever=vectorstore.as_retriever(),
                                         chain_type_kwargs={"prompt": QA_CHAIN_PROMPT})

        result = qa.run("What is the purpose of life?")
        print(result)
ZKS
  • Thanks for your reply. Just a follow-up question to your answer for #3. In my example code, where I'm using RetrievalQA, I'm passing in my prompt (QA_CHAIN_PROMPT) as an argument, however the {context} and {question} values are yet to be filled in (since it is passing in the original string). From my understanding, RetrievalQA uses the vectorstore to answer the query that is given. Hence I'm still having trouble understanding how {context} and {question} are being used in the original PromptTemplate. – whatsggon Aug 22 '23 at 03:33
  • You don't fill those in yourself. When you call qa.run(query), RetrievalQA places your query into {question}, and vectorstore.as_retriever() fetches the matching documents from your embedded stored data, which are inserted as {context} before the prompt is sent to the LLM. – ZKS Aug 22 '23 at 07:40
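The flow described in this comment thread can be sketched in plain Python (this is an assumed, simplified model of the "stuff" chain, not LangChain's source: the retriever and template names here are stand-ins):

```python
# Simplified model of what RetrievalQA with chain_type="stuff" does:
# 1. retrieve documents for the query, 2. "stuff" them into one context
# string, 3. substitute both values into the prompt template.
def answer_query(query, retriever, prompt_template):
    docs = retriever(query)                 # the vectorstore.as_retriever() step
    context = "\n\n".join(docs)             # "stuff" all docs into one block
    return prompt_template.format(context=context, question=query)

# Stand-in retriever that always returns two fake documents
retriever = lambda q: ["Doc A about life.", "Doc B about purpose."]

template = """Use the following pieces of context to answer the question at the end.
{context}
Question: {question}
Helpful Answer:"""

print(answer_query("What is the purpose of life?", retriever, template))
```

The string this prints is what the LLM actually receives; only the final "Helpful Answer" generation step is missing from the sketch.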