
I am using Llama 2, and its prompt template has a specific format to let the model follow the conversation, like this:

[INST] {First_human_input} [/INST]
{First_model_output}
[INST] {Second_human_input} [/INST]

but LangChain's default memory format looks like this:

Human: First_input
AI: First_output
Human: Second_input

Although Llama can still follow the conversation with LangChain's default format, it will produce output prefixed with "AI:". I have already tried setting ai_prefix and human_prefix, but the ": " separator cannot be removed. Is there a good way to solve this?

1 Answer


You can do this by extending ConversationStringBufferMemory and overriding its save_context method:

    # extract from ConversationStringBufferMemory.save_context;
    # here you can use a custom separator instead of ": "
    human = f"{self.human_prefix}: " + inputs[prompt_input_key]
    ai = f"{self.ai_prefix}: " + outputs[output_key]
    self.buffer += "\n" + "\n".join([human, ai])
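
For example, here is a minimal sketch of such a subclass. The class name Llama2BufferMemory is my own, and the input/output key resolution assumes it mirrors the parent save_context implementation; it stores each turn in Llama 2's [INST] ... [/INST] format instead of with "Human:"/"AI:" prefixes:

    from typing import Any, Dict

    from langchain.memory import ConversationStringBufferMemory
    from langchain.memory.utils import get_prompt_input_key


    class Llama2BufferMemory(ConversationStringBufferMemory):
        def save_context(self, inputs: Dict[str, Any], outputs: Dict[str, str]) -> None:
            # Resolve which keys hold the human input and the model output,
            # the same way the parent save_context does.
            if self.input_key is None:
                prompt_input_key = get_prompt_input_key(inputs, self.memory_variables)
            else:
                prompt_input_key = self.input_key
            if self.output_key is None:
                if len(outputs) != 1:
                    raise ValueError(f"One output key expected, got {outputs.keys()}")
                output_key = list(outputs.keys())[0]
            else:
                output_key = self.output_key
            # Store the turn in Llama 2 chat format, with no prefix separators.
            human = f"[INST] {inputs[prompt_input_key]} [/INST]"
            ai = outputs[output_key]
            self.buffer += "\n" + "\n".join([human, ai])

You can then pass memory=Llama2BufferMemory() to your chain in place of the default memory, and the buffer will accumulate turns in the [INST] format your prompt expects.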