
I have written a simple function to get a summary from my data, and in it I add memory (chat_history) using ConversationBufferMemory for follow-up questions. When the code below is not in a function, I see chat_history get loaded in the output, but when I put it in a function, chat_history appears to be empty. I am unable to understand why this is happening. Please give your suggestions. Thank you.

Here is my function code:

<openai credentials>
from langchain.prompts import PromptTemplate
from langchain.chains import LLMChain
from langchain.memory import ConversationBufferMemory
def summary_and_memory(text):
    template = """
    Chat history is:
    {chat_history}

    Your task is to write a summary based on the
    information provided in the data delimited by triple backticks, following
    the steps below.
    Consider the chat history and try to answer based on that.
    1. Analyse the input data.
    2. Extract key facts out of the input data.
    3. Do not add names and figures that are not present in the data.
    4. Do not write numbers in scientific notation or exponents or any other special symbols.
    5. Use at most 25 words.

    Data: ```{text_input}```
    """

    fact_extraction_prompt = PromptTemplate(
        input_variables=["text_input", "chat_history"],
        template=template)

    memory = ConversationBufferMemory(memory_key="chat_history")
    print(memory)
    fact_extraction_chain = LLMChain(llm=llm, prompt=fact_extraction_prompt,
                                     memory=memory, verbose=True)
    output = fact_extraction_chain.run(text)
    return output
Srishino

1 Answer


Every time the function is invoked, the memory is reset, because a fresh `ConversationBufferMemory` is constructed inside the function body. Create the memory outside the function (at module level, or pass it in as an argument) so that it persists across calls.
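The effect is plain Python scoping, so it can be shown without langchain at all. Here is a minimal sketch using a tiny stand-in class `FakeMemory` (hypothetical, only so the example is self-contained without an LLM) in place of `ConversationBufferMemory`:

```python
# Stand-in for ConversationBufferMemory: records every exchange in a buffer.
class FakeMemory:
    def __init__(self):
        self.buffer = []

    def save_context(self, user_input, output):
        self.buffer.append((user_input, output))


# Memory created at module level: it outlives any single function call.
memory = FakeMemory()

def summarize(text):
    # A real chain would render memory.buffer here as {chat_history}.
    output = f"summary of {text!r}"
    memory.save_context(text, output)
    return output

summarize("first document")
summarize("second document")
print(len(memory.buffer))  # both calls were remembered

# Contrast: memory created inside the function starts empty on every call,
# which is exactly why chat_history appeared empty in the question.
def summarize_forgetful(text):
    local_memory = FakeMemory()      # fresh, empty buffer each invocation
    local_memory.save_context(text, f"summary of {text!r}")
    return len(local_memory.buffer)  # always 1
```

Applied to the original code, that means moving `memory = ConversationBufferMemory(memory_key="chat_history")` above `def summary_and_memory(text):` (or accepting `memory` as a parameter) and leaving the rest unchanged.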