
I am trying to extract information from a docx file using the OpenAI GPT-3 model in Python, but the total length of the prompt exceeds the context limit GPT-3 provides.

If you have any suggestions for this problem, please help me. Thanks.

I tried splitting the document into several chunks, but that doesn't work for me, because I need information from the whole document, not from each chunk individually.

TopTen1310

1 Answer


You can use embeddings to find the pieces of the document most relevant to your question and include only those in the prompt. Here is a good example of using this approach in a similar case: Question answering using embeddings-based search
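A minimal sketch of that idea, assuming the `openai` Python package (pre-1.0 style API), `python-docx`, and `numpy`; the helper names (`load_chunks`, `answer_question`) and the chunk size are illustrative, not taken from the linked example:

    import numpy as np
    import openai
    import docx  # python-docx

    openai.api_key = "YOUR_API_KEY"

    def load_chunks(path, max_chars=1000):
        """Read a .docx file and split its text into roughly equal chunks."""
        doc = docx.Document(path)
        text = "\n".join(p.text for p in doc.paragraphs if p.text.strip())
        return [text[i:i + max_chars] for i in range(0, len(text), max_chars)]

    def embed(texts):
        """Embed a list of strings with the ada-002 embedding model."""
        resp = openai.Embedding.create(model="text-embedding-ada-002", input=texts)
        return np.array([item["embedding"] for item in resp["data"]])

    def answer_question(path, question, top_k=3):
        chunks = load_chunks(path)
        chunk_vecs = embed(chunks)
        q_vec = embed([question])[0]

        # Cosine similarity between the question and every chunk.
        sims = chunk_vecs @ q_vec / (
            np.linalg.norm(chunk_vecs, axis=1) * np.linalg.norm(q_vec)
        )
        best = [chunks[i] for i in sims.argsort()[::-1][:top_k]]

        # Only the most relevant chunks go into the prompt, so it stays
        # within the model's context limit.
        prompt = (
            "Answer the question using only the context below.\n\n"
            "Context:\n" + "\n---\n".join(best)
            + f"\n\nQuestion: {question}\nAnswer:"
        )
        completion = openai.Completion.create(
            model="text-davinci-003", prompt=prompt, max_tokens=300
        )
        return completion["choices"][0]["text"].strip()

If you truly need information spread across the whole document rather than an answer to a specific question, you can also summarize each chunk separately and then combine the summaries in a final prompt (a map-reduce style pass), which keeps every call under the token limit.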

Taras Drapalyuk