
FYI: I am trying to build a chatbot based on the instructions given by Dan Shipper (https://www.lennysnewsletter.com/p/i-built-a-lenny-chatbot-using-gpt). I'm using the gpt_index and langchain libraries to create a GPT-3-based search index with the OpenAI API, and I'm running everything in Google Colab. I have successfully installed the libraries.
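For reference, the setup cell I run first in Colab looks roughly like this (a sketch: the package names are the ones published on PyPI, and the API key value is just a placeholder):

# Colab setup cell (sketch): install the libraries and set the OpenAI API key.
!pip install gpt_index langchain openai

import os
os.environ["OPENAI_API_KEY"] = "sk-..."  # placeholder, not a real key

After that setup, the code I have is: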

from gpt_index import SimpleDirectoryReader, GPTListIndex, readers, GPTSimpleVectorIndex, LLMPredictor, PromptHelper
from langchain import OpenAI
import sys
import os
from IPython.display import Markdown, display

def construct_index(directory_path):
    ...
    llm_predictor = LLMPredictor(llm=OpenAI(temperature=0, model_name="text-davinci-003", max_tokens=num_outputs))
    prompt_helper = PromptHelper(max_input_size, num_outputs, max_chunk_overlap, chunk_size_limit=chunk_size_limit)
 
    documents = SimpleDirectoryReader(directory_path).load_data()
    
    index = GPTSimpleVectorIndex(
        documents, llm_predictor=llm_predictor, prompt_helper=prompt_helper
    )

    index.save_to_disk('index.json')
    return index

def ask_lenny():
    index = GPTSimpleVectorIndex.load_from_disk('index.json')
    while True: 
        query = input("What do you want to ask Lenny? ")
        response = index.query(query, response_mode="compact")
        display(Markdown(f"Lenny Bot says: <b>{response.response}</b>"))

When I call the construct_index function with the path to my documents, I get the following error: TypeError: __init__() got an unexpected keyword argument 'llm_predictor'

It seems there is a mismatch between the arguments the GPTSimpleVectorIndex constructor expects and the ones my code passes. Unfortunately, I cannot find any documentation or examples for these libraries that cover this.
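
In case it helps narrow things down, the keyword arguments the installed class actually accepts can be listed with plain Python introspection (this is generic inspect usage, not something from the library docs):

import inspect
from gpt_index import GPTSimpleVectorIndex

# Print the constructor signature to see which keyword arguments
# the installed GPTSimpleVectorIndex actually accepts.
print(inspect.signature(GPTSimpleVectorIndex.__init__))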

Could anyone help me understand how to correctly initialize the GPTSimpleVectorIndex class and resolve this error? Any guidance on using these libraries would be greatly appreciated.

Thank you!



1 Answer

I was able to use a hint from this forum about using ServiceContext, and with that and a little help from GPT-4 I resolved the issue: instead of passing the LLMPredictor and PromptHelper directly to the GPTSimpleVectorIndex constructor, bundle them into a ServiceContext and build the index with GPTSimpleVectorIndex.from_documents:

# ServiceContext ships with newer releases of the library (gpt_index was
# renamed to llama_index), so make sure it is imported:
from llama_index import ServiceContext

def construct_index(directory_path):
    max_input_size = 4096
    num_outputs = 256
    max_chunk_overlap = 20
    chunk_size_limit = 600

    llm_predictor = LLMPredictor(llm=OpenAI(temperature=0, model_name="text-davinci-003",
                                            max_tokens=num_outputs))
    prompt_helper = PromptHelper(max_input_size, num_outputs, max_chunk_overlap,
                                 chunk_size_limit=chunk_size_limit)

    # Bundle the predictor and prompt helper into a ServiceContext instead of
    # passing them straight to the index constructor.
    service_context = ServiceContext.from_defaults(llm_predictor=llm_predictor,
                                                   prompt_helper=prompt_helper)

    documents = SimpleDirectoryReader(directory_path).load_data()

    # Build the index from the documents with the configured service context.
    index = GPTSimpleVectorIndex.from_documents(documents, service_context=service_context)

    index.save_to_disk('index.json')
    return index
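
For completeness, a minimal way to exercise the fix in Colab (the directory name is just a placeholder):

# Build the index once from a folder of documents, then start the
# interactive loop defined in the question.
construct_index("lenny_posts")  # writes index.json next to the notebook
ask_lenny()                     # prompts: "What do you want to ask Lenny? "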
