3

I'm getting an "unsupported data" error when trying to send a POST request to Azure OpenAI. What should I do to fix the error?

https://myopenai.openai.azure.com/openai/deployments/code-davinci-002/completions?api-version=2023-03-15-preview&API-KEY=xxxxxxxxxxxx&content-type=application/json

api-version = 2023-03-15-preview
API-KEY = xxxxx
content-type = application/json


{
     "model": "gpt-3.5-turbo",
     "messages": [{"role": "user", "content": "Say this is a test!"}],
     "temperature": 0.7
}
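
In Python terms, the request I'm sending is roughly this (just a sketch with placeholder resource, deployment and key, assuming the api-key and content type go in headers and the api-version in the query string):

import requests

# Placeholders for my real Azure resource, deployment and key
endpoint = "https://myopenai.openai.azure.com"
deployment = "code-davinci-002"
api_key = "xxxxxxxxxxxx"

response = requests.post(
    f"{endpoint}/openai/deployments/{deployment}/completions",
    params={"api-version": "2023-03-15-preview"},
    headers={"api-key": api_key, "Content-Type": "application/json"},
    json={
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": "Say this is a test!"}],
        "temperature": 0.7,
    },
)
# This is the call that comes back with the "unsupported data" error
print(response.status_code, response.text)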
Talha Tayyab
Kenny_I

3 Answers

5

You are missing a chat/ in the URL. It shouldn't be

https://myopenai.openai.azure.com/openai/deployments/<deployment>/completions?api-version=2023-03-15-preview&API-KEY=xxxxxxxxxxxx&content-type=application/json

but

https://myopenai.openai.azure.com/openai/deployments/<deployment>/chat/completions?api-version=2023-03-15-preview&API-KEY=xxxxxxxxxxxx&content-type=application/json

I was running into the same issue and it took me forever to figure out that the endpoint for the chat API is slightly different.
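
For example, a minimal Python sketch of the corrected call (the resource name, deployment and key are placeholders, and the api-key is assumed to be sent as a header):

import requests

# Placeholders for the real Azure resource, chat model deployment and key
endpoint = "https://myopenai.openai.azure.com"
deployment = "gpt-35-turbo"  # whatever name you gave the deployment in Azure
api_key = "xxxxxxxxxxxx"

response = requests.post(
    # Note the extra chat/ segment before completions
    f"{endpoint}/openai/deployments/{deployment}/chat/completions",
    params={"api-version": "2023-03-15-preview"},
    headers={"api-key": api_key, "Content-Type": "application/json"},
    json={
        # No "model" field needed: the deployment in the URL determines the model
        "messages": [{"role": "user", "content": "Say this is a test!"}],
        "temperature": 0.7,
    },
)
print(response.json())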

Arcadio Garcia
  • This is the answer – Hieu Nguyen May 18 '23 at 13:36
  • I'm using it just like you said and it's working, BUT the data is not from my private deployment; it comes from outside-world data. I asked Lady Gaga's age and it correctly answered me. I'm pretty sure we do not have that in our deployment files. Why is that? Is it because I didn't click DEPLOY? But deploying is for using Microsoft's app, which I don't want. I'm using curl calls and it's working, just not the way they're selling it: it's querying public data instead of my own data. – Alan Cruz Jul 20 '23 at 15:48
1

Seems like you may be mixing a davinci model in your deployment with a gpt model in your request body.

I am experiencing a similar error even though my model deployment is correct: gpt-35-turbo as the Azure deployment, api-key xxxx in the header, and api-version 2023-03-15-preview in the URL string.

400 model_error Unsupported data type.

Here is the payload.

{
    "messages": [
        {
            "role": "system",
            "content": "You are an AI assistant that helps people find information."
        },
        {
            "role": "user",
            "content": "Why do some oranges have seeds and others do not?"
        }
    ]
}
0

Try using a prompt in ChatML format instead (this is how chat-style input is passed to the non-chat completions endpoint).

For example:

{
  "prompt": "<|im_start|>system\nAssistant is a large language model trained by OpenAI.\n<|im_end|>\n<|im_start|>user\nWhat's the difference between garbanzo beans and chickpeas?\n<|im_end|>\n<|im_start|>assistant\n",
  "temperature": 0.9,
  "top_p": 1,
  "frequency_penalty": 0,
  "presence_penalty": 0,
  "max_tokens": 256,
  "stop": ["<|im_end|>"]
}
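
For instance, a Python sketch of posting that ChatML prompt to the completions endpoint (the resource, deployment and key are placeholders):

import requests

# Placeholders for the real Azure resource, deployment and key
endpoint = "https://myopenai.openai.azure.com"
deployment = "gpt-35-turbo"
api_key = "xxxxxxxxxxxx"

chatml_prompt = (
    "<|im_start|>system\nAssistant is a large language model trained by OpenAI.\n<|im_end|>\n"
    "<|im_start|>user\nWhat's the difference between garbanzo beans and chickpeas?\n<|im_end|>\n"
    "<|im_start|>assistant\n"
)

response = requests.post(
    # Plain completions endpoint, not chat/completions
    f"{endpoint}/openai/deployments/{deployment}/completions",
    params={"api-version": "2023-03-15-preview"},
    headers={"api-key": api_key, "Content-Type": "application/json"},
    json={
        "prompt": chatml_prompt,
        "temperature": 0.9,
        "top_p": 1,
        "frequency_penalty": 0,
        "presence_penalty": 0,
        "max_tokens": 256,
        "stop": ["<|im_end|>"],
    },
)
print(response.json())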
ccshih