
When receiving a response from OpenAI's text-davinci-003 model, I was able to extract the text from the response with the following PHP code:

$response = $response->choices[0]->text;

Here was the text-davinci-003 response code:

{
  "id": "cmpl-uqkvlQyYK7bGYrRHQ0eXlWi7",
  "object": "text_completion",
  "created": 1589478378,
  "model": "text-davinci-003",
  "choices": [
    {
      "text": "\n\nThis is indeed a test",
      "index": 0,
      "logprobs": null,
      "finish_reason": "length"
    }
  ],
  "usage": {
    "prompt_tokens": 5,
    "completion_tokens": 7,
    "total_tokens": 12
  }
}

I am now trying to alter my code to work with the recently released gpt-3.5-turbo model which returns the response slightly differently:

{
  "id": "chatcmpl-123",
  "object": "chat.completion",
  "created": 1677652288,
  "choices": [{
    "index": 0,
    "message": {
      "role": "assistant",
      "content": "\n\nHello there, how may I assist you today?"
    },
    "finish_reason": "stop"
  }],
  "usage": {
    "prompt_tokens": 9,
    "completion_tokens": 12,
    "total_tokens": 21
  }
}

My question is, how can I alter the code:

$response = $response->choices[0]->text;

...so that it can grab the content of the response message?

Rok Benko
sw123456

2 Answers


Python:

print(response['choices'][0]['message']['content'])

NodeJS:

Note: OpenAI NodeJS SDK v4 was released on August 16, 2023, and is a complete rewrite of the SDK. Among other things, there are changes in extracting the message content. See the v3 to v4 migration guide.

• SDK v3:

console.log(response.data.choices[0].message.content);

• SDK v4:

console.log(response.choices[0].message.content);

PHP:

var_dump($response->choices[0]->message->content);

Working example in PHP

If you run test.php the OpenAI API will return the following completion:

string(40) "

The capital city of England is London."

test.php

<?php
    // Build a Chat Completions request with cURL.
    $ch = curl_init();

    $url = 'https://api.openai.com/v1/chat/completions';

    $api_key = 'sk-xxxxxxxxxxxxxxxxxxxx'; // replace with your own API key

    $query = 'What is the capital city of England?';

    // gpt-3.5-turbo expects a "messages" array of role/content objects
    // instead of the "prompt" string used by text-davinci-003.
    $post_fields = array(
        "model" => "gpt-3.5-turbo",
        "messages" => array(
            array(
                "role" => "user",
                "content" => $query
            )
        ),
        "max_tokens" => 12,
        "temperature" => 0
    );

    $header  = [
        'Content-Type: application/json',
        'Authorization: Bearer ' . $api_key
    ];

    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_POST, 1);
    curl_setopt($ch, CURLOPT_POSTFIELDS, json_encode($post_fields));
    curl_setopt($ch, CURLOPT_HTTPHEADER, $header);

    $result = curl_exec($ch);
    if (curl_errno($ch)) {
        echo 'Error: ' . curl_error($ch);
    }
    curl_close($ch);

    // The completion text lives under choices[0]->message->content.
    $response = json_decode($result);
    var_dump($response->choices[0]->message->content);
?>
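One caveat: if the API returns an error (invalid key, rate limit, malformed request), the decoded object contains an `error` field instead of `choices`, and accessing `->choices[0]` directly triggers a warning. A minimal sketch of defensive extraction, reusing the sample chat response from the question so it runs without a network call:

```php
<?php
// Sample chat.completion response (copied from the question above);
// in practice this would be the raw string returned by curl_exec().
$result = '{
  "id": "chatcmpl-123",
  "object": "chat.completion",
  "created": 1677652288,
  "choices": [{
    "index": 0,
    "message": {
      "role": "assistant",
      "content": "\n\nHello there, how may I assist you today?"
    },
    "finish_reason": "stop"
  }],
  "usage": {"prompt_tokens": 9, "completion_tokens": 12, "total_tokens": 21}
}';

$response = json_decode($result);

if (isset($response->error)) {
    // API-level failure: surface the message instead of crashing.
    echo 'API error: ' . $response->error->message;
} elseif (isset($response->choices[0]->message->content)) {
    echo trim($response->choices[0]->message->content);
} else {
    echo 'Unexpected response shape: ' . $result;
}
?>
```

Running this prints `Hello there, how may I assist you today?` (the leading newlines are stripped by `trim()`).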
Rok Benko
  • On another note, I had: – sw123456 Mar 02 '23 at 10:17
  • $post_fields = [ "model" => "text-davinci-003", "prompt" => $query, "max_tokens" => 500, "temperature" => 0.8 ]; – sw123456 Mar 02 '23 at 10:17
  • ...when sending a query to text-davinci-003. But the new model requires "messages" (not "prompt") in the form: "messages": [{"role": "user", "content": "Hello!"}] – sw123456 Mar 02 '23 at 10:18
  • How can I alter the "prompt" => $query line to "messages" => so I can pass in role and content? – sw123456 Mar 02 '23 at 10:19
  • I'm trying "messages" => ["role" => "user" , "content" => $query], but responses are always null. – sw123456 Mar 02 '23 at 10:44
  • 1
    The solution for the comments above: [OpenAI ChatGPT (gpt-3.5-turbo) API: Why am I getting NULL response?](https://stackoverflow.com/questions/75614444/null-response-from-gpt-3-5-turbo-api-calls/75615117#75615117) – Rok Benko Mar 02 '23 at 12:28
  • Hi @RokBenko Can you please check my question if you have a time ? [HERE IS MY QUESTION](https://stackoverflow.com/questions/75910313/i-am-experimenting-using-chatgpt-api-but-the-query-gives-undefined-output) – AlwaysStudent Apr 02 '23 at 09:27
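Regarding the null responses discussed in the comments: `messages` must be an array *of* role/content arrays, not a single flat associative array, so that `json_encode()` produces a JSON list of message objects. A minimal sketch of the difference (the query string is just an example):

```php
<?php
$query = 'What is the capital city of England?';

// Incorrect (yields null responses): a flat associative array.
// "messages" => ["role" => "user", "content" => $query]

// Correct: an array containing one (or more) role/content arrays,
// which json_encode() serializes as [{"role": "user", "content": "..."}].
$post_fields = [
    "model"    => "gpt-3.5-turbo",
    "messages" => [
        ["role" => "user", "content" => $query]
    ]
];

echo json_encode($post_fields["messages"]);
?>
```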

This is what worked for me with text-davinci-003.

// Payload for the (legacy) Completions endpoint used by text-davinci-003.
$data = array(
  "prompt" => "Is Climate Change worrisome?", // your question or request
  "temperature" => 0.5,
  "max_tokens" => 500,
  "model" => "text-davinci-003",
  "top_p" => 1,
  "frequency_penalty" => 0,
  "presence_penalty" => 0
);

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL,
  'https://api.openai.com/v1/completions');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, json_encode($data));

$headers = array();
$headers[] = 'Content-Type: application/json';
$headers[] = 'Authorization: Bearer ' . $OPENAI_API_KEY; // assumes the key is defined elsewhere
curl_setopt($ch, CURLOPT_HTTPHEADER, $headers);

$response = curl_exec($ch);
if (curl_errno($ch)) {
  echo 'Error: ' . curl_error($ch);
}
curl_close($ch);

// On this endpoint the completion text is under choices[0]->text,
// not choices[0]->message->content.
$response = json_decode($response);
echo $response->choices[0]->text;
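For gpt-3.5-turbo (the model the question actually asks about), three things in this snippet change: the endpoint becomes `/v1/chat/completions`, `prompt` becomes a `messages` array, and the completion text moves from `->text` to `->message->content`. A hedged sketch of the adapted payload, keeping the rest of the cURL setup above unchanged:

```php
<?php
// Adapted for gpt-3.5-turbo: chat endpoint, "messages" instead of "prompt".
$data = array(
    "model"       => "gpt-3.5-turbo",
    "messages"    => array(
        array("role" => "user", "content" => "Is Climate Change worrisome?")
    ),
    "temperature" => 0.5,
    "max_tokens"  => 500
);

// POST json_encode($data) to https://api.openai.com/v1/chat/completions
// with the same cURL options and headers as above, then:
// $response = json_decode($response);
// echo $response->choices[0]->message->content;
?>
```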