
I have been trying to create a GPT-3.5-turbo chatbot with a Gradio interface. The chatbot works perfectly in the command line, but not when I implement it with Gradio. I am able to send my input and receive a response, but Gradio doesn't display the result properly: it shows the "role" and "content" dictionary keys instead of the chat strings. My goal is to have a simple chat conversation with the history recorded in the web interface.

I have tried returning strings and all sorts of different sections of the dictionary, and I'm completely at a loss. When I returned a string from the predict function, it complained that it wanted something to enumerate. Then I tried sending a list of strings, with no luck there either. When I return the whole dictionary it doesn't raise an error, but it displays the keys, not the values.
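A minimal illustration of what I suspect is happening (my guess at the mechanism, with a made-up message): iterating over a dict in Python yields its keys, which matches the symptom exactly.

```python
msg = {"role": "assistant", "content": "Hello!"}

# Iterating a dict yields its keys, so anything that loops over the
# message dict sees "role" and "content" instead of the text.
assert list(msg) == ["role", "content"]
assert list(msg.values()) == ["assistant", "Hello!"]
```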

Here is an image of the error occurring in the Gradio interface:

Here is my current code:

import openai
import gradio as gr

openai.api_key = "XXXXXXX"

history = []
system_msg = input("What type of chatbot would you like to create? ")
history.append({"role": "system", "content": system_msg})

with open("chatbot.txt", "w") as f:
    f.write("System: "+system_msg)

print("Say hello to your new assistant!")

def predict(input, history):
    if len(history) > 10:
        history.pop(1)
        history.pop(2)
        history.pop(3)
    history.append({"role": "user", "content": input})
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=history)
    reply = response["choices"][0]["message"]["content"]
    history.append({"role": "assistant", "content": reply})
    return history, history

with gr.Blocks() as demo:
    chatbot = gr.Chatbot()
    state = gr.State([])
    
    with gr.Row():
        txt = gr.Textbox(show_label=False, placeholder="What kind of chatbot would you like to create? ").style(container=False)
    
    txt.submit(predict, [txt, state], [chatbot, state])

demo.launch()
2 Answers


Okay, solved it. The issue was that history is a list of dictionaries, but the output of the submit wanted a list of lists. I finally solved it by creating a separate list: each item of the top-level list is itself a two-item list holding a question and its reply.
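For illustration, the pair structure described above looks like this (made-up messages, just to show the shape):

```python
# gr.Chatbot expects a list of [user_message, bot_reply] pairs.
chat_history = [
    ["Hello", "Hi! How can I help?"],
    ["What is Gradio?", "A Python library for building web UIs."],
]
assert all(len(pair) == 2 for pair in chat_history)
```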

Dictionaries are the death of me. Here is the fixed code.

import json

import gradio as gr
import openai

openai.api_key = "XXXXXXX"

chat_history = []

def gpt_reply(chat_history):
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=chat_history)
    reply = response["choices"][0]["message"]["content"]
    return reply


def predict(input, history=None):
    # Avoid a mutable default argument; Gradio passes the state list on each call.
    if history is None:
        history = []
    if len(history) == 0:
        system_msg = "You are an expert code completionist."
        history.append({"role": "system", "content": system_msg})
        with open("chatbot.txt", "w") as f:
            f.write("System: "+system_msg)
    if len(history) > 10:
        # Drop the three oldest messages but keep the system message at index 0.
        # (pop(1) three times; popping 1, 2, 3 would skip items as indices shift.)
        history.pop(1)
        history.pop(1)
        history.pop(1)
    
    message = input
    with open("chatbot.txt", "a") as f:
        f.write("\nUser: " + message + "\n")
    history.append({"role": "user", "content": message})
    reply = gpt_reply(history)
    history.append({"role": "assistant", "content": reply})
    with open("chatbot.json", "w") as f:
        json.dump(history, f, indent=4, ensure_ascii=False)
    # Build the display list gr.Chatbot expects: each item is a [user_message, bot_reply] pair of strings.
    chat_history.append([message, reply])
    with open("user_responses.json", "w") as f:
        json.dump(chat_history, f, indent=4, ensure_ascii=False)
    return chat_history, history

with gr.Blocks() as demo:
    chatbot = gr.Chatbot()
    state = gr.State([])
    
    with gr.Row():
        txt = gr.Textbox(show_label=False, placeholder="What kind of chatbot would you like to create? ").style(container=False)
    
    txt.submit(predict, [txt, state], [chatbot, state])

demo.launch()
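As a variation (a sketch, not part of the code above; `pairs_from_history` is a hypothetical helper), the pair list could also be derived from the message history on each call instead of keeping a separate global `chat_history`:

```python
def pairs_from_history(history):
    # Drop the system message, then zip consecutive user/assistant
    # messages into the [user, bot] pairs gr.Chatbot displays.
    msgs = [m for m in history if m["role"] != "system"]
    pairs = []
    for i in range(0, len(msgs) - 1, 2):
        if msgs[i]["role"] == "user" and msgs[i + 1]["role"] == "assistant":
            pairs.append([msgs[i]["content"], msgs[i + 1]["content"]])
    return pairs
```

`predict` could then end with `return pairs_from_history(history), history`, keeping the messages in one place.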

I tried the above solution and it didn't work for me. The following works:

reply = res["choices"][0]["message"]["content"]
reply = reply.replace('\n', '<br />')
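In context (a sketch with a made-up reply string), the fix amounts to:

```python
# Hypothetical API reply containing a literal newline.
reply = "Here is the code:\nprint('hi')"

# gr.Chatbot renders messages as HTML/Markdown, where literal newlines can be
# swallowed; <br /> tags keep the line breaks visible in the chat window.
reply = reply.replace('\n', '<br />')
assert reply == "Here is the code:<br />print('hi')"
```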