
I want to restrict the LLM so that it answers queries only from my custom documents, but it is also returning out-of-context results.

My code is below:

For token generation and index building:

from llama_index import GPTSimpleVectorIndex, LLMPredictor, PromptHelper, SimpleDirectoryReader
from langchain.llms import OpenAI

# Prompt / token settings
max_input_size = 4096
num_outputs = 512
max_chunk_overlap = 20
chunk_size_limit = 600
gpt_model_name = 'text-davinci-003'

prompt_helper = PromptHelper(max_input_size, num_outputs, max_chunk_overlap, chunk_size_limit=chunk_size_limit)
llm_predictor = LLMPredictor(llm=OpenAI(temperature=0, model_name=gpt_model_name, max_tokens=num_outputs))

# Build the index from the documents in ./static/ and save it to disk
documents = SimpleDirectoryReader('./static/').load_data()
index = GPTSimpleVectorIndex(documents, llm_predictor=llm_predictor, prompt_helper=prompt_helper)
index.save_to_disk('./static/dump/story.json')

For the query:

# Reload the saved index and query it, prefixing the question with an instruction
new_index = GPTSimpleVectorIndex.load_from_disk('./static/dump/story.json')
response = new_index.query("Only answer from provided content: " + ques, response_mode="compact")
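
I was also wondering whether passing a custom question-answer prompt template to the query call would help, so the model is told to refuse when the answer is not in the retrieved context. Below is a rough sketch of what I mean; I'm not sure whether QuestionAnswerPrompt and the text_qa_template parameter are the right way to do this in this version of llama_index, so please correct me if not:

from llama_index import GPTSimpleVectorIndex, QuestionAnswerPrompt

# Custom QA prompt that asks the model to answer only from the retrieved context
# and to say "I don't know" otherwise -- this is just an idea I'm considering, not tested.
QA_PROMPT_TMPL = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Using only the context above and no prior knowledge, answer the question: {query_str}\n"
    "If the answer is not contained in the context, reply with 'I don't know.'\n"
)
QA_PROMPT = QuestionAnswerPrompt(QA_PROMPT_TMPL)

ques = "my question about the documents"  # placeholder for the user's question
new_index = GPTSimpleVectorIndex.load_from_disk('./static/dump/story.json')
response = new_index.query(ques, text_qa_template=QA_PROMPT, response_mode="compact")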

As I'm new to this, any help would be appreciated.
