
Hope you are doing well. I've built a chatbot based on the LangChain documentation below:

Langchain chatbot documentation

In the above LangChain documentation, the prompt template has two input variables: history and human input.

I have variables for UserID and SessionID, and I store UserID, SessionID, UserMessage, and LLM-Response in a CSV file. I use the Python pandas module to read the CSV, filter the DataFrame for the given UserID and SessionID, and build the chat history for that specific user session. I pass this chat history as the 'history' input to the LangChain prompt template (the one discussed in the link above).

Since I set verbose=True, LangChain prints the prompt template to the console on every API call. I started a conversation for the first user and first session and sent 3 human inputs one by one. Then I started a second user session (so both the SessionID and UserID changed). Looking at the prompt template printed on the console, I noticed that LangChain was including not only the chat history of the second user session but also some of the chat history from the previous user session, even though my code builds the chat history for the given user session correctly. The code to get the chat history is below:

# get chat_history for a given user session
def get_chat_history(user_id, session_id, user_query):
    chat_history = ("You're a chatbot based on a large language model trained by OpenAI. "
                    "The text followed by Human: will be user input and your response "
                    "should be followed by AI: as shown below.\n")
    chat_data = pd.read_csv("DB.csv")
    # keep only the rows belonging to this user and session
    session_rows = chat_data[(chat_data['user_id'] == user_id) &
                             (chat_data['session_id'] == session_id)]
    for _, row in session_rows.iterrows():
        chat_history += "Human: " + row['user_query'] + "\nAI: " + row['gpt_response'] + "\n"
    chat_history += "Human: " + user_query + "\nAI: "
    return chat_history

How do I get LangChain to consider only the given user session's chat history in its prompt? Please help.

Surya

2 Answers


It's hard to say without seeing exactly what code you are using when constructing the calls to OpenAI (i.e., are you using ConversationChain or LLMChain? And most importantly, are you using ConversationBufferMemory like in the example you linked?).

The reason I ask about ConversationChain (which will by default initialize a ConversationBufferMemory for you if you don't pass a memory arg to it) and ConversationBufferMemory is that it sounds a lot like what you are seeing is a buffer that isn't cleared between sessions: basically, your sessions are sharing the same buffer, or message history.
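To illustrate the suspected failure mode in plain Python (this is a sketch, not LangChain's actual classes; the `Buffer` class and `get_memory` helper are hypothetical names I made up for illustration): if every request reuses one buffer, history leaks across sessions, while keeping one buffer per (user_id, session_id) key isolates them.

```python
class Buffer:
    """Minimal stand-in for a conversation memory buffer."""
    def __init__(self):
        self.messages = []

    def add(self, human, ai):
        self.messages.append(("Human: " + human, "AI: " + ai))

# Failure mode: one shared buffer reused across all sessions.
shared = Buffer()
shared.add("hi, I'm user 1", "hello user 1")  # turn from session (u1, s1)
shared.add("hi, I'm user 2", "hello user 2")  # session (u2, s2) now sees u1's turn

# Fix: look up (or create) a separate buffer per (user_id, session_id).
buffers = {}

def get_memory(user_id, session_id):
    return buffers.setdefault((user_id, session_id), Buffer())

get_memory("u1", "s1").add("hi, I'm user 1", "hello user 1")
get_memory("u2", "s2").add("hi, I'm user 2", "hello user 2")
```

With the per-session lookup, each session's buffer holds only its own turns, which is the behavior you want from whatever memory object you hand to your chain.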

Sorry, I can't be more helpful without seeing your exact code.

Fielding

You need to store per-user chat history in a database, and then run your main chain for the current user's user ID, fetching the history from the database by that user ID.

Here's a tutorial that illustrates doing this with Firestore, but you can swap in any other supported database.

https://wnmurphy.com/creating-a-versatile-multi-prompt-chatbot-with-memory-and-a-data-store-in-langchain/
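As a database-agnostic sketch of the same idea (using SQLite from the Python standard library instead of Firestore; the table and function names here are my own, not from the tutorial): store each turn keyed by user and session, and fetch only the matching rows when building the prompt.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE chat (user_id TEXT, session_id TEXT, user_query TEXT, gpt_response TEXT)"
)

def save_turn(user_id, session_id, user_query, gpt_response):
    conn.execute("INSERT INTO chat VALUES (?, ?, ?, ?)",
                 (user_id, session_id, user_query, gpt_response))

def fetch_history(user_id, session_id):
    # Only this user session's rows ever reach the prompt.
    rows = conn.execute(
        "SELECT user_query, gpt_response FROM chat "
        "WHERE user_id = ? AND session_id = ?",
        (user_id, session_id),
    ).fetchall()
    return "".join(f"Human: {q}\nAI: {a}\n" for q, a in rows)

save_turn("u1", "s1", "hello", "hi there")
save_turn("u2", "s2", "howdy", "hey")
history = fetch_history("u2", "s2")
```

Because the WHERE clause scopes every read to one (user_id, session_id) pair, no other session's history can leak into the string you pass as the 'history' input.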