
I am currently playing around with LangChain.js and OpenAI. My goal is to build a kind of chatbot that can answer questions specific to certain documents. It does work currently:

import { Request, Response } from "express";
import { OpenAI } from "langchain/llms/openai";
import { OpenAIEmbeddings } from "langchain/embeddings/openai";
import { HNSWLib } from "langchain/vectorstores/hnswlib";
import { RetrievalQAChain } from "langchain/chains";

export async function query(req: Request, res: Response) {
    const { directoryName, query } = req.body;

    const model = new OpenAI({
        openAIApiKey: process.env.OPEN_AI_API_KEY,
        maxTokens: -1, // -1 = use the maximum number of tokens available
        temperature: 1,
    });
    const embeddings = new OpenAIEmbeddings({
        openAIApiKey: process.env.OPEN_AI_API_KEY,
    });

    // Load the previously persisted HNSWLib vector store for this directory
    const vectorStore = await HNSWLib.load(`${baseDirectoryPath}/${directoryName}/index/data`, embeddings);

    const chain = RetrievalQAChain.fromLLM(model, vectorStore.asRetriever());

    const response = await chain.call({ query });

    return res.status(200).json({
        response: response.text.trim(),
    });
}

Is there any way to find out where exactly the answers come from? In an ideal world I would get the source of each answer, for example: "source: document.xyz, line: xyz".

Any help would be great! Thank you so much!

You can return source documents: https://js.langchain.com/docs/modules/chains/popular/vector_db_qa#return-source-documents – learner Aug 04 '23 at 03:32
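Following the linked docs, passing `returnSourceDocuments: true` when building the chain makes the response include `sourceDocuments` (the retrieved chunks with their metadata) alongside the answer. A minimal sketch of how you might surface that to the client; the helper `formatSources` is hypothetical, and which metadata fields exist (e.g. `source`, or `loc` line info from a text splitter) depends on how the documents were loaded and split at ingestion time:

```typescript
// With the option enabled, the chain call would look roughly like:
//
//   const chain = RetrievalQAChain.fromLLM(model, vectorStore.asRetriever(), {
//       returnSourceDocuments: true,
//   });
//   const response = await chain.call({ query });
//   // response.sourceDocuments holds the retrieved chunks

// Simplified shape of the documents the chain returns.
interface SourceDocument {
    pageContent: string;
    metadata: Record<string, unknown>;
}

// Hypothetical helper: map the returned chunks to human-readable source
// strings. `metadata.source` is set by most loaders; line-range info is
// only present if the splitter (or your own ingestion code) stored it.
function formatSources(docs: SourceDocument[]): string[] {
    return docs.map((doc) => {
        const source = doc.metadata.source ?? "unknown";
        const loc = doc.metadata.loc ? `, loc: ${JSON.stringify(doc.metadata.loc)}` : "";
        return `source: ${source}${loc}`;
    });
}

console.log(formatSources([
    { pageContent: "…", metadata: { source: "document.xyz" } },
]));
// → [ "source: document.xyz" ]
```

You could then return `formatSources(response.sourceDocuments)` next to `response.text` in the JSON payload. Exact per-line attribution is not something the chain computes for you, though: you only get it back if you attached line metadata to each chunk when you built the index.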

0 Answers