I have been working with LangChain for the past few days. I'm structuring data from user input using API specs, but I'm running into issues with the chain's memory.
Technology: LangChain
LLM Module: Vertex AI
Chain: OpenAPIEndpointChain, Method: from_api_operation
Operation: APIOperation, Method: from_openapi_spec
from langchain.chains import OpenAPIEndpointChain
from langchain.llms import VertexAI
from langchain.requests import Requests
from langchain.tools import APIOperation, OpenAPISpec

llm = VertexAI(max_output_tokens=1000, model_name="text-bison", top_k=0)
spec = OpenAPISpec.from_file(filepath)
operation = APIOperation.from_openapi_spec(spec, resource_path, method)
chain = OpenAPIEndpointChain.from_api_operation(operation, llm, requests=Requests(), return_intermediate_steps=True)
output = chain(question)
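Each call to the chain is independent, and I read the structured parameters back from the returned dict on every turn. (I'm assuming the "output" / "intermediate_steps" keys here, which is what I get with return_intermediate_steps=True; the exact layout may differ between LangChain versions.)

output = chain("on 7th July")
print(output["output"])              # final answer for this turn
print(output["intermediate_steps"])  # request args inferred from this turn only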
The chain fills in the parameters from the spec docs using only the current query, but I need the parameters to be built up from the history of queries.
Example:
Input: on 7th July
Current Output: {"date":"07/07/2023","time":""}
Input: at 4am
Current Output: {"date":"","time":"04:00"}
Expected Output (after the second input):
{"date":"07/07/2023","time":"04:00"}
Thanks in advance!!