
I am playing around with ChatGPT and Delphi, using the OpenAI library at: https://github.com/HemulGM/DelphiOpenAI. It supports streaming, but I can't figure out the ChatGPT mechanism for streaming. I can create a Chat, and get all data back in one return message.

However, when I try to use streaming, I get an error. The following console code works fine. I submit my chat, and I get the entire answer back in one "event". I would like the same behavior as the ChatGPT website, so the tokens would be displayed as they are generated. My code is as follows...

var Buf: TStringList;
begin
...
  var Chat := OpenAI.Chat.Create(
    procedure(Params: TChatParams)
    begin
      Params.Messages([TChatMessageBuild.Create(TMessageRole.User, Buf.Text)]);
      Params.MaxTokens(1024);
      // Params.Stream(True);
    end);
  try
    for var Choice in Chat.Choices do
    begin
      Buf.Add(Choice.Message.Content);
      Writeln(Choice.Message.Content);
    end;
  finally
    Chat.Free;
  end;

This code works. When I turn on streaming (uncommenting Params.Stream(True)), I get an EConversionError, 'The input value is not a valid Object', and the call returns 'Empty or Invalid Response'.

Peter Mortensen
user1009073

1 Answer


Because in streaming mode the API responds not with a single JSON object, but in its own line-based format:

data: {"id": "cmpl-6wsVxtkU0TZrRAm4xPf5iTxyw9CTf", "object": "text_completion", "created": 1679490597, "choices": [{"text": "\r", "index": 0, "logprobs": null, "finish_reason": null}], "model": "text-davinci-003"}

data: {"id": "cmpl-6wsVxtkU0TZrRAm4xPf5iTxyw9CTf", "object": "text_completion", "created": 1679490597, "choices": [{"text": "\n", "index": 0, "logprobs": null, "finish_reason": null}], "model": "text-davinci-003"}

data: {"id": "cmpl-6wsVxtkU0TZrRAm4xPf5iTxyw9CTf", "object": "text_completion", "created": 1679490597, "choices": [{"text": "1", "index": 0, "logprobs": null, "finish_reason": null}], "model": "text-davinci-003"}

data: {"id": "cmpl-6wsVxtkU0TZrRAm4xPf5iTxyw9CTf", "object": "text_completion", "created": 1679490597, "choices": [{"text": ",", "index": 0, "logprobs": null, "finish_reason": null}], "model": "text-davinci-003"}

data: {"id": "cmpl-6wsVxtkU0TZrRAm4xPf5iTxyw9CTf", "object": "text_completion", "created": 1679490597, "choices": [{"text": " 2", "index": 0, "logprobs": null, "finish_reason": null}], "model": "text-davinci-003"}
...
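Until the library handles this itself, here is a minimal sketch of how one such `data:` line could be parsed in Delphi with the standard System.JSON unit. This is only an illustration of the format shown above, not part of the DelphiOpenAI API; the helper name `TryExtractToken` is hypothetical.

```pascal
uses
  System.SysUtils, System.JSON;

// Hypothetical helper: extracts the generated text fragment from one
// "data: {...}" line of the stream. Returns False for lines that carry
// no token (empty keep-alive lines or the terminating "data: [DONE]").
function TryExtractToken(const Line: string; out Token: string): Boolean;
var
  Payload: string;
  Obj: TJSONObject;
  Choices: TJSONArray;
begin
  Result := False;
  if not Line.StartsWith('data: ') then
    Exit;
  Payload := Line.Substring(6).Trim;
  if (Payload = '') or (Payload = '[DONE]') then
    Exit;
  Obj := TJSONObject.ParseJSONValue(Payload) as TJSONObject;
  if Obj = nil then
    Exit;
  try
    Choices := Obj.GetValue('choices') as TJSONArray;
    if (Choices <> nil) and (Choices.Count > 0) then
    begin
      Token := (Choices.Items[0] as TJSONObject).GetValue('text').Value;
      Result := True;
    end;
  finally
    Obj.Free;
  end;
end;
```

Calling this for each received line and writing each extracted token as it arrives would give the incremental, ChatGPT-website-style output the question asks for.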

I can start working on such a mode for the library.

HemulGM