
I am building a chatbot based on llama_index, langchain and gradio (as the GUI). My code looks like this at the moment:

import gradio as gr
import langchain
from langchain.memory import ConversationBufferMemory
from langchain.chains import ConversationChain
from llama_index import StorageContext, load_index_from_storage
from langchain import OpenAI
from langchain.chat_models import ChatOpenAI
import os
import openai

openai.api_key = os.environ['OPENAI_API_KEY']

def create_query_engine():
    # Load the persisted index from the "store" directory and expose it as a query engine
    storage_context = StorageContext.from_defaults(persist_dir="store")
    index = load_index_from_storage(storage_context)
    query_engine = index.as_query_engine()
    return query_engine

llm = OpenAI(temperature=0, model_name="gpt-3.5-turbo", max_tokens=600)
query_engine = create_query_engine()


def click_response(message, history):
    # Rebuild the LangChain memory from Gradio's chat history on every call
    memory = ConversationBufferMemory()
    for user_msg, ai_msg in history:
        memory.chat_memory.add_user_message(user_msg)
        memory.chat_memory.add_ai_message(ai_msg)

    conversation = ConversationChain(
        llm=llm,
        verbose=True,
        memory=memory
    )

    response = query_engine.query(message)
    return response

demo = gr.ChatInterface(click_response)

if __name__ == "__main__":
    demo.launch()
    

Currently I encounter the following error:

Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/gradio/routes.py", line 442, in run_predict
    output = await app.get_blocks().process_api(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/gradio/blocks.py", line 1392, in process_api
    data = self.postprocess_data(fn_index, result["prediction"], state)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/gradio/blocks.py", line 1326, in postprocess_data
    prediction_value = block.postprocess(prediction_value)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/gradio/components/chatbot.py", line 235, in postprocess
    self._postprocess_chat_messages(message_pair[1]),
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/gradio/components/chatbot.py", line 210, in _postprocess_chat_messages
    raise ValueError(f"Invalid message for Chatbot component: {chat_message}")
ValueError: Invalid message for Chatbot component: 

After the last line, "ValueError: Invalid message for Chatbot component:", the log shows a perfectly good answer built from the vectors in my "store" folder.

How can I fix this so that the answer is shown in the chatbox?

1 Answer


I've worked it out - I had to change one line as follows:

response = str(query_engine.query(message))
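
The reason this works: query_engine.query() returns a llama_index Response object rather than a plain string, and gr.ChatInterface expects the chat function to return a string the Chatbot component can display, so postprocessing fails on the raw Response even though the answer text inside it is fine. Converting it with str() (the Response object also exposes the text via its .response attribute) gives Gradio something it can render. A minimal sketch of the updated handler, assuming the same query_engine created in the question:

def click_response(message, history):
    # query() returns a Response object, not a str
    result = query_engine.query(message)
    # ChatInterface needs plain text; str(result) or result.response both work
    return str(result)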