I am following along with this video making a ChatGPT bot. Everything was fine until the very end, where I am trying to create the model and indexes for the bot.
I copied the code directly from the video creator's notebook:

```python
def construct_index(directory_path):
    # set maximum input size
    max_input_size = 4096
    # set number of output tokens
    num_outputs = 256
    # set maximum chunk overlap
    max_chunk_overlap = 20
    # set chunk size limit
    chunk_size_limit = 600

    # define LLM (ChatGPT gpt-3.5-turbo)
    llm_predictor = LLMPredictor(llm=ChatOpenAI(temperature=0, model_name="gpt-3.5-turbo", max_tokens=num_outputs))
    prompt_helper = PromptHelper(max_input_size, num_outputs, max_chunk_overlap, chunk_size_limit=chunk_size_limit)

    documents = SimpleDirectoryReader(directory_path).load_data()

    index = GPTSimpleVectorIndex(
        documents, llm_predictor=llm_predictor, prompt_helper=prompt_helper
    )

    index.save_to_disk('index.json')

    return index
```
```python
def ask_me_anything(question):
    index = GPTSimpleVectorIndex.load_from_disk('index.json')
    response = index.query(question, response_mode="compact")
    display(Markdown(f"You asked: <b>{question}</b>"))
    display(Markdown(f"Bot says: <b>{response.response}</b>"))
```
This code (defining the two functions) runs without any problems.
But when I then call:

```python
construct_index('/data/notebook_files/textdata')
```
I get this error:

```
Traceback (most recent call last):
  at cell 32, line 1
  at cell 31, line 17, in construct_index(directory_path)
  at /opt/python/envs/default/lib/python3.8/site-packages/llama_index/indices/vector_store/vector_indices.py, line 69, in __init__(self, nodes, index_struct, service_context, vector_store, **kwargs)
  at /opt/python/envs/default/lib/python3.8/site-packages/llama_index/indices/vector_store/base.py, line 54, in __init__(self, nodes, index_struct, service_context, vector_store, use_async, **kwargs)
TypeError: __init__() got an unexpected keyword argument 'llm_predictor'
```
I also tried running it directly in the video creator's notebook and got the same error. Is there something I am missing? What should I do to fix this?