
Inside LangChain's memory module there are different classes, e.g. ConversationBufferMemory or ConversationBufferWindowMemory.
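
For context, my setup looks roughly like the sketch below (the model name and the exact calls are placeholders, not my real code):

```python
from langchain.chains import ConversationChain
from langchain.chat_models import ChatOpenAI
from langchain.memory import ConversationBufferMemory

llm = ChatOpenAI(model_name="gpt-3.5-turbo")

# ConversationBufferMemory keeps the *entire* chat history,
# so the prompt grows with every exchange
memory = ConversationBufferMemory()

chain = ConversationChain(llm=llm, memory=memory)
chain.predict(input="Hello!")  # each call appends to the buffer
```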

Regardless, if the conversation gets long, at some point I get the following error:

openai.error.InvalidRequestError: This model's maximum context length is 4097 tokens. However, your messages resulted in 4113 tokens (4063 in the messages, 50 in the functions). Please reduce the length of the messages or functions.

because the context window fills up. How can I drop the oldest messages, or apply some moving-window strategy?
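
What I have in mind is something like the sketch below: either let ConversationBufferWindowMemory keep only the last k exchanges, or trim the underlying message list myself (chat_memory.messages is the list LangChain stores the history in; k=5 is an arbitrary choice):

```python
from langchain.memory import (
    ConversationBufferMemory,
    ConversationBufferWindowMemory,
)

# Option 1: moving window -- only the last k human/AI exchanges
# are included in the prompt; older ones are silently dropped
window_memory = ConversationBufferWindowMemory(k=5)

# Option 2: keep a full buffer but trim the oldest messages by hand.
# chat_memory.messages is a plain list of alternating Human/AI
# messages, so 2*k entries correspond to the last k exchanges.
memory = ConversationBufferMemory()
k = 5
memory.chat_memory.messages = memory.chat_memory.messages[-2 * k:]
```

Would either of these reliably keep me under the 4097-token limit, or is there a built-in way I am missing?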
