
I am experimenting with a new UI where typical profile settings are updated via chat instead of through the UI. For example, instead of showing a frontend component that lets users cancel their billing, they can just chat with the bot.

I am wondering if it's possible to get my LLM (let's say GPT-3) to generate the GraphQL query necessary to run these operations. I am thinking I could ingest my GraphQL schema into a vector database like Pinecone, then feed that context into the LLM so that it can generate the appropriate GQL query/mutation.
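To make the idea concrete, here is a minimal sketch of the retrieval step. All names here are hypothetical: a naive keyword-overlap ranker stands in for Pinecone plus a real embedding model, and the schema chunks are invented examples, so this only illustrates the shape of the pipeline, not a production implementation.

```python
import re

# Hypothetical schema, pre-split into chunks (one type/field group per chunk).
# In the real system these would be embedded and stored in Pinecone.
SCHEMA_CHUNKS = [
    "type Mutation { cancelSubscription(userId: ID!): Subscription }",
    "type Mutation { updateEmail(userId: ID!, email: String!): User }",
    "type Query { invoices(userId: ID!): [Invoice!]! }",
]

def tokens(text: str) -> set:
    """Lowercased word tokens; a crude stand-in for an embedding."""
    return set(re.findall(r"[a-z]+", text.lower()))

def retrieve(request: str, chunks, k: int = 1):
    """Rank schema chunks by token overlap with the user's request.
    A vector database would do this with cosine similarity instead."""
    query = tokens(request)
    scored = sorted(chunks, key=lambda c: len(query & tokens(c)), reverse=True)
    return scored[:k]

def build_prompt(request: str) -> str:
    """Assemble the context-stuffed prompt sent to the LLM."""
    context = "\n".join(retrieve(request, SCHEMA_CHUNKS))
    return (
        "You are a GraphQL assistant. Using only this schema:\n"
        f"{context}\n"
        f"Write the GraphQL operation for: {request}"
    )

print(build_prompt("cancel my subscription"))
```

The point of retrieving only the relevant chunks is that a full schema often won't fit in the model's context window; only the mutations related to the user's request need to be in the prompt.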

Is this feasible, or a bad idea?

I have only theorized this so far.

0xterran
  • Did you find any solution? I am also trying to solve a similar problem and this feels like a good way to fit larger schemas inside models' context limit. – Jay Shukla May 26 '23 at 16:07

1 Answer


I managed to do this on simple queries by using LangChain's GraphQL agent.

user6714507