Second answer, as the first one did not answer OP's question.
Based on this OpenAI Playground example, a 'conversation' can only be carried on by sending the whole exchange (all previous turns plus the new question) to the API in a single request.
I don't think there is a way of keeping the conversation going after retrieving a response; you have to resend the full transcript with every request (see the sketch at the end of this answer).
Consider this example, where we send the following text:
The following is a conversation with an AI assistant.
Human: Hello
Human: What is 3 * 3?
AI:
Human: What did I just ask?
AI:
The response I get is:
You asked me what 3 * 3 is. The answer is 9.
Code used for this:
<?php

require __DIR__ . '/vendor/autoload.php';

use Orhanerday\OpenAi\OpenAi;

$open_ai_key = getenv('OPENAI_API_KEY');
$open_ai = new OpenAi($open_ai_key);

// Sends the whole prompt (the full conversation so far) to the completions
// endpoint, prints the generated text and returns it.
function ask($ai, $question, $model = 'text-davinci-003') {
    $res = $ai->completion([
        'model'             => $model,
        'prompt'            => $question,
        'temperature'       => 0.9,
        'max_tokens'        => 150,
        'frequency_penalty' => 0,
        'presence_penalty'  => 0.6,
        // Stop generating as soon as the model starts writing the next turn.
        'stop'              => ["\nHuman:", "\nAI:"],
    ]);

    // The library returns the raw JSON response as a string.
    $json = json_decode($res);
    if ($json === null || !isset($json->choices)) {
        var_dump($res); // dump the raw response for debugging
        return null;
    }

    $answer = '';
    foreach ($json->choices as $choice) {
        $answer .= $choice->text;
    }
    echo $answer . PHP_EOL;

    return $answer;
}

$text = <<<EOL
The following is a conversation with an AI assistant.
Human: Hello
Human: What is 3 * 3?
AI:
Human: What did I just ask?
AI:
EOL;

$res = ask($open_ai, $text);
Note the stop array, which, quoting from the documentation:
Up to 4 sequences where the API will stop generating further tokens. The returned text will not contain the stop sequence.
This seems to let the AI know where to 'read' and where to 'write'.
If you remove that parameter from the request, the response comes back without the actual answer:
You asked what 3 times 3 is.
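So if you want to keep the 'conversation' going, the whole transcript has to grow and be resent on every request. Below is a rough sketch of that idea, building on the ask() helper above (which now returns the generated text); the $questions array is just hypothetical input for illustration:

// Keep the full transcript in one string and grow it with every exchange.
$conversation = "The following is a conversation with an AI assistant.\n";

// Hypothetical follow-up questions, purely for illustration.
$questions = [
    'Hello',
    'What is 3 * 3?',
    'What did I just ask?',
];

foreach ($questions as $question) {
    // Append the new Human turn plus an empty AI turn for the model to fill in.
    $conversation .= "Human: {$question}\nAI:";

    // Resend everything so far; ask() returns the generated reply.
    $answer = ask($open_ai, $conversation);

    // Append the model's reply so it becomes context for the next request.
    $conversation .= $answer . "\n";
}

Keep in mind that each request gets slower and more expensive as the prompt grows, so at some point the transcript would have to be trimmed to stay within the model's token limit.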