I am working on creating a LangChain agent. The tools and the agent are initialized like this:

    logging.info("Initializing the tools")
    tools = [
        StructuredTool.from_function(some_func),
        Tool(
            name="QA",
            func=lambda message: extract_QA(docsearch, message, company, language, info=info),
            description="useful for when you need to answer questions. You should ask targeted questions",
            return_direct=True,
            verbose=True,
        ),
    ]

    # Init the agent
    logging.info("Initializing the agent")
    agent = initialize_agent(
        tools,
        chat_model,
        agent=AgentType.OPENAI_FUNCTIONS,
        verbose=True,
        max_iterations=2,
        agent_kwargs={"system_message": system_message} | agent_kwargs,
        memory=memory,
    )
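
For context, I invoke the agent roughly like this (user_message is just a placeholder name for the incoming chat message), and what comes back is only the final answer string:

    user_message = "What services does the company offer?"  # placeholder example input
    response = agent.run(user_message)
    # Because the QA tool has return_direct=True, response is exactly the string
    # returned by extract_QA -- nothing else comes back from the executor.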

Here, the extract_QA function performs a similarity search over the document store:

    def extract_QA(docsearch: Pinecone, message: str, company: str, language: str = "spanish", info: str = ""):
        # docsearch is called like a chain and returns "summaries" and "answer" keys
        result = docsearch(inputs={"message": message, "question": message, "company": company, "language": language})
        info = result.get("summaries", "")
        return result["answer"]

However, I would like the output to contain not only the final agent answer string, but also the closest document retrieved by the docsearch step. Is there a way to do this?
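
To make the goal concrete, this is roughly the shape of result I would like to end up with (the source_documents key is hypothetical, purely to illustrate the desired output):

    # Purely illustrative of what I would like to get back -- source_documents does
    # not exist anywhere in my current setup:
    desired = {
        "answer": response,
        "source_documents": [],  # the Document(s) retrieved by docsearch for this query
    }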

I haven't come up with anything yet, and the documentation does not seem to cover this use case.
