
Previously, I was adding docs to my Supabase database using the function below, and it works just fine. You can see that I modify the document to populate an extra column (metadata):

// Import paths are for the classic single "langchain" package; newer releases
// split these across @langchain/openai, @langchain/community, etc.
import { RecursiveCharacterTextSplitter } from 'langchain/text_splitter';
import { Document } from 'langchain/document';
import { SupabaseVectorStore } from 'langchain/vectorstores/supabase';
import { OpenAIEmbeddings } from 'langchain/embeddings/openai';

async function generateEmbeddings(text, email) {
  // Split the raw text into overlapping chunks
  const textSplitter = new RecursiveCharacterTextSplitter({
    chunkSize: 1000,
    chunkOverlap: 200,
  });
  const documents = await textSplitter.createDocuments([text]);
  console.log(documents);

  // Re-join the chunks and attach the email as metadata
  const content = documents.map((doc) => doc.pageContent).join('\n');
  const metadata = { email: email };
  const docs = [new Document({ pageContent: content, metadata })];

  // Embed and insert into the "documents" table
  const vectorStore = await SupabaseVectorStore.fromDocuments(
    docs,
    new OpenAIEmbeddings(),
    {
      client: SUPABASE_CLIENT,
      tableName: 'documents',
    }
  );
}

The data is added as expected.
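
For reference, this is roughly how the stored rows can be checked (a minimal sketch, assuming the standard documents table with a metadata jsonb column and the usual supabase-js client):

// Hypothetical check: read back the rows for one email via the jsonb metadata column
const { data, error } = await SUPABASE_CLIENT
  .from('documents')
  .select('id, content, metadata')
  .eq('metadata->>email', 'abc@gmail.com');
if (error) console.error(error);
else console.log(data); // metadata is populated here, e.g. { "email": "abc@gmail.com" }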

Now I am trying to implement memory so it can remember my previous responses. Here is the code:

// Same note on import paths as above (classic "langchain" package)
import { ChatOpenAI } from 'langchain/chat_models/openai';
import { OpenAIEmbeddings } from 'langchain/embeddings/openai';
import { SupabaseVectorStore } from 'langchain/vectorstores/supabase';
import { VectorStoreRetrieverMemory } from 'langchain/memory';
import { PromptTemplate } from 'langchain/prompts';
import { LLMChain } from 'langchain/chains';
import * as readline from 'readline';

const model = new ChatOpenAI({
  openAIApiKey: OPENAI_API_KEY,
  modelName: 'gpt-3.5-turbo',
});

// Reuse the existing "documents" table; the custom RPC supports metadata filtering
const vectorStore = await SupabaseVectorStore.fromExistingIndex(
  new OpenAIEmbeddings(),
  {
    client: SUPABASE_CLIENT,
    tableName: 'documents',
    queryName: 'match_documents_with_filters',
    filter: { email: 'abc@gmail.com' },
  }
);

// Back the conversation memory with the Supabase retriever (top 5 matches)
const memory = new VectorStoreRetrieverMemory({
  vectorStoreRetriever: vectorStore.asRetriever(5),
  memoryKey: 'history',
  returnDocs: false,
});

const prompt =
  PromptTemplate.fromTemplate(`The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

Relevant pieces of conversation:
{history}

(You do not need to use these pieces of information if not relevant to your question)

Current conversation:
Human: {input}
AI:`);

const rl = readline.createInterface({
  input: process.stdin,
  output: process.stdout,
});

// Simple REPL: read a question, run the chain (the memory saves each turn), then loop
export const query = async () => {
  rl.question('input:', async (answer) => {
    const input = answer;

    const chain = new LLMChain({
      llm: model,
      prompt: prompt,
      verbose: true,
      memory: memory,
    });

    const res = await chain.call({ input });
    console.log(res);

    query();
  });
};

query();

I am also using metadata filtering to get only the relevant docs.
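For example, a standalone lookup with the same filter only returns that user's docs (a sketch using the vector store's similaritySearch; the filter shape is the same object I pass to fromExistingIndex):

// Hypothetical standalone check of the metadata filter:
// only documents whose metadata.email matches should come back.
const matches = await vectorStore.similaritySearch('what did I say earlier?', 5, {
  email: 'abc@gmail.com',
});
console.log(matches.map((d) => d.metadata));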

The problem is that as soon as I run the chain.call method, it stores the data in my Supabase database, but with empty metadata.

The embedding and content are added, but I can't figure out how to add the metadata now.

I am trying to add the metadata along with the other fields but can't find a way to do so.
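
As far as I can tell, VectorStoreRetrieverMemory writes each conversation turn back through the retriever as a plain Document with no metadata, which would explain the empty metadata column. The only workaround I can think of is something like the sketch below (hypothetical: subclassing the memory and overriding saveContext to attach the email before the documents are added; it assumes the LLMChain's output key is 'text' and that the retriever exposes addDocuments), but I'm not sure this is the intended way:

import { Document } from 'langchain/document';

// Hypothetical workaround: attach metadata ourselves before the turn is stored
class MemoryWithMetadata extends VectorStoreRetrieverMemory {
  async saveContext(inputValues, outputValues) {
    const pageContent = [
      `input: ${inputValues.input}`,
      `output: ${outputValues.text}`,
    ].join('\n');
    // Write the document through the retriever with the email metadata set
    await this.vectorStoreRetriever.addDocuments([
      new Document({ pageContent, metadata: { email: 'abc@gmail.com' } }),
    ]);
  }
}

const memory = new MemoryWithMetadata({
  vectorStoreRetriever: vectorStore.asRetriever(5),
  memoryKey: 'history',
  returnDocs: false,
});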
