
I'm running a Ruby on Rails app that takes text entries, generates embeddings with OpenAI, and sends them to Pinecone. I'd like users to be able to ask a question and have the following happen:

  • Generate an embedding for the question
  • Query Pinecone with the question embedding and get back the most relevant text entries (matches)
  • Ask OpenAI to answer the question based on the retrieved entries.

I assumed I could do this by (for example) getting the top 5 most relevant entries, putting them into an OpenAI prompt, and asking it to answer the question; a rough sketch of that flow is below. Is this how LangChain's question-answering format works, or is it more comprehensive than that? I'm mostly interested in the process. Thanks!
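To make that concrete, here is a rough sketch of the flow I have in mind, written with the plain Python clients rather than my actual Rails code (pre-1.0 openai and v2 pinecone-client APIs; the index name, metadata field and model names are just placeholders):

```python
# Sketch of the "manual" flow: embed the question, fetch the top 5 matches
# from Pinecone, and ask the chat model to answer from that context.
# Assumes the pre-1.0 openai and v2 pinecone-client APIs; names are placeholders.
import openai
import pinecone

openai.api_key = "OPENAI_API_KEY"
pinecone.init(api_key="PINECONE_API_KEY", environment="us-east-1-aws")  # placeholder environment
index = pinecone.Index("text-entries")  # hypothetical index name

question = "What did the March update change?"

# 1. Embed the question with the same model used for the stored entries.
question_embedding = openai.Embedding.create(
    model="text-embedding-ada-002",
    input=question,
)["data"][0]["embedding"]

# 2. Query Pinecone for the 5 most similar text entries.
matches = index.query(
    vector=question_embedding,
    top_k=5,
    include_metadata=True,
)["matches"]
# Assumes the raw entry text was stored as metadata under "text" at upsert time.
context = "\n\n".join(m["metadata"]["text"] for m in matches)

# 3. Ask the model to answer using only the retrieved entries.
answer = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "Answer using only the provided context."},
        {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
    ],
)["choices"][0]["message"]["content"]

print(answer)
```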

MikeHolford

1 Answer


Technically you do not need LangChain to interact with Pinecone, but LangChain is a framework that makes the building process easier. It connects AI models (not only OpenAI's; it provides a simple, uniform API for many providers) with external sources of data and computation, wiring the individual components (embedding model, vector store, prompt, LLM) together so they work as one pipeline. Its question-answering chains do essentially what you describe: embed the question, retrieve the top-k matching entries from the vector store, put them into a prompt, and ask the model to answer from that context.
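As a rough illustration only (module paths and class names have moved around between LangChain versions, and the index name, text_key, and model choices below are placeholders, not your setup), a retrieval QA chain over an existing Pinecone index can be wired up like this:

```python
# Minimal sketch with the classic LangChain (0.0.x-era) Python API over an
# existing Pinecone index; assumes pinecone-client v2 and that OPENAI_API_KEY
# is set in the environment. Index and field names are placeholders.
import pinecone
from langchain.embeddings.openai import OpenAIEmbeddings
from langchain.vectorstores import Pinecone
from langchain.chat_models import ChatOpenAI
from langchain.chains import RetrievalQA

pinecone.init(api_key="PINECONE_API_KEY", environment="us-east-1-aws")

# Wrap the existing index; "text" is the metadata field holding the raw entry.
vectorstore = Pinecone.from_existing_index(
    index_name="text-entries",
    embedding=OpenAIEmbeddings(),
    text_key="text",
)

# "stuff" = put the retrieved documents directly into the prompt.
qa = RetrievalQA.from_chain_type(
    llm=ChatOpenAI(temperature=0),
    chain_type="stuff",
    retriever=vectorstore.as_retriever(search_kwargs={"k": 5}),
)

print(qa.run("What did the March update change?"))
```

The chain_type="stuff" option is the simplest variant and corresponds to your plan: the retrieved entries are concatenated into the prompt along with the question. If you'd rather stay in Ruby, the three calls from your list (embed the question, query the index, send a chat completion) against the OpenAI and Pinecone HTTP APIs are all the chain is doing for you.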

Yilmaz