10

I don't understand why I am receiving this error.

Refused to set unsafe header "User-Agent"

I am trying to use OpenAI's API for a personal project. I don't understand why it's refusing to set this "unsafe header" or how, or if, I can make it safe. I've tried googling the issue; the top result is a GitHub issue thread explaining that it might be something Chrome does, but I tried the app in Safari and it didn't work there either.

import { Configuration, OpenAIApi } from "openai";

const onFormSubmit = (e) => {
    e.preventDefault();

    const formData = new FormData(e.target),
      formDataObj = Object.fromEntries(formData.entries())
    console.log(formDataObj.foodDescription);

    //////OPENAI
    const configuration = new Configuration({
      apiKey: process.env.REACT_APP_OPENAI_API_KEY,
    });
    const openai = new OpenAIApi(configuration);

    openai.createCompletion("text-curie-001", {
      prompt: `generate food suggestions from the following flavor cravings: ${formDataObj.foodDescription}`,
      temperature: 0.8,
      max_tokens: 256,
      top_p: 1,
      frequency_penalty: 0,
      presence_penalty: 0,
    })
    .then((response) => {
      setState({
        heading: `AI Food Suggestions for: ${formDataObj.foodDescription}`,
        response: `${response.data.choices[0].text}`
      });
    })
  }
SlickRick
  • Note OpenAI includes this notice in the API keys section: "OpenAI may also automatically rotate any API key that we've found has leaked publicly." After a half-dozen commits, I realized that as soon as I pushed my code to GitHub, OpenAI would recycle the key. – Jacob Valdez Jul 30 '22 at 00:10

5 Answers

9

As you stated, you're receiving the error because the openai client tries to set a "User-Agent" header, which browsers refuse to let scripts set. The library is a Node.js client: since using it requires access to sensitive information (the API key), it is meant to run server-side rather than in the browser, which also prevents accidentally revealing secrets.

For a workaround, see https://github.com/openai/openai-node/issues/6 where AmanKishore manually requests completions.

I ended up writing my own completion function like so:

const DEFAULT_PARAMS = {
  "model": "text-davinci-002",
  "temperature": 0.7,
  "max_tokens": 256,
  "top_p": 1,
  "frequency_penalty": 0,
  "presence_penalty": 0
}

// `openai_api_key` is assumed to be defined elsewhere (e.g. loaded from an environment variable).
export async function query(params = {}) {
  const params_ = { ...DEFAULT_PARAMS, ...params };
  const requestOptions = {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Authorization': 'Bearer ' + String(openai_api_key)
    },
    body: JSON.stringify(params_)
  };
  const response = await fetch('https://api.openai.com/v1/completions', requestOptions);
  const data = await response.json();
  return data.choices[0].text;
}
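A minimal usage sketch (the prompt below is only an illustration, and `openai_api_key` must be defined where `query` lives):

// e.g. somewhere in a submit handler:
const suggestion = await query({
  prompt: "Generate food suggestions from the following flavor cravings: sweet and spicy",
});
console.log(suggestion);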
Jacob Valdez
3

This worked for me, but it depends on implementation details of the Configuration class:

// The Configuration constructor hardcodes a 'User-Agent' entry in baseOptions.headers
let config = new Configuration({ apiKey: key });

// Delete it so the browser no longer refuses to send the request
delete config.baseOptions.headers['User-Agent'];

let api = new OpenAIApi(config);
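With the header removed, a hedged usage sketch (assuming the v3 openai package, whose createCompletion takes a single params object) might look like:

const response = await api.createCompletion({
  model: "text-curie-001",
  prompt: "Suggest a dish for someone craving something spicy.",
  max_tokens: 64,
});
console.log(response.data.choices[0].text);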

Purplie
1

Using Jacob's answer as a reference, here is the workaround for the GPT-3.5 Turbo (chat completions) API.

const [chatList, setChatList] = useState([]); // your chat history (React useState hook)
async function createCompletion(params = {}) {
        const DEFAULT_PARAMS = {
            model: "gpt-3.5-turbo",
            messages: [{ role: "user", content: "Hello World" }],
            // max_tokens: 4096,
            temperature: 0,
            // frequency_penalty: 1.0,
            // stream: true,
        };
        const params_ = { ...DEFAULT_PARAMS, ...params };
        const result = await fetch('https://api.openai.com/v1/chat/completions', {
            method: 'POST',
            headers: {
                'Content-Type': 'application/json',
                'Authorization': 'Bearer ' + String(your_api_key)
            },
            body: JSON.stringify(params_)
        });
        const stream = result.body
        const output = await fetchStream(stream);
        setChatList(previousInputs => (previousInputs.concat(output.choices[0].message)));
    }

A fetchStream() helper was needed because the OpenAI API response body is a ReadableStream, which is handled here with a recursive read:

 async function fetchStream(stream) {
    const reader = stream.getReader();
    const decoder = new TextDecoder("utf-8");
    let charsReceived = 0;
    let text = "";

    // read() returns a promise that resolves
    // when a value has been received
    const result = await reader.read().then(
        function processText({ done, value }) {
            // Result objects contain two properties:
            // done  - true if the stream has already given you all its data.
            // value - a Uint8Array chunk of data. Always undefined when done is true.
            if (done) {
                console.log("Stream complete");
                return text;
            }
            charsReceived += value.length;
            // Decode the chunk as UTF-8 and append it to the accumulated text.
            text += decoder.decode(value, { stream: true });
            console.log(`Received ${charsReceived} bytes so far.`);
            return reader.read().then(processText);
        });
    // The accumulated text is the full JSON response body.
    return JSON.parse(result);
}
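For example, a call might look like this (the message content is hypothetical; createCompletion and chatList come from the snippet above):

// Send the user's latest message along with the existing history.
await createCompletion({
  messages: [...chatList, { role: "user", content: "Suggest a spicy dinner idea." }],
});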
0

I had the same problem, and this code worked for me!

const configuration = new Configuration({
  apiKey: "YOUR_OPENAI_KEY",
  organization: "YOUR_OPENAI_ORGANIZATION",
});

// Overwrite the default headers so the hardcoded 'User-Agent' is no longer sent.
configuration.baseOptions.headers = {
  Authorization: "Bearer " + "YOUR_OPENAI_KEY",
};
-2

This error occurs when you call the OpenAI API from the frontend / client side instead of from a secure backend / server side. Move the call to a backend endpoint and have the browser call that endpoint instead; a sketch follows below.
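For example, a minimal sketch of such a backend, assuming Node.js 18+ with Express (the /api/complete route and request shape are illustrative, not part of the original answer):

// server.js - hypothetical Express proxy; only this process ever sees OPENAI_API_KEY.
const express = require("express");
const app = express();
app.use(express.json());

app.post("/api/complete", async (req, res) => {
  // Forward the browser's messages to OpenAI, attaching the secret key server-side.
  const r = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: "Bearer " + process.env.OPENAI_API_KEY,
    },
    body: JSON.stringify({
      model: "gpt-3.5-turbo",
      messages: req.body.messages,
    }),
  });
  const data = await r.json();
  res.json(data);
});

app.listen(3001, () => console.log("Proxy listening on port 3001"));

The React app then POSTs to /api/complete instead of api.openai.com, so the key never reaches the browser.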

Muhammad Bilal
  • I don't care about the downvote; I received exactly this error and resolved it by creating a backend API for the call. I've also read that on the front end your key can be leaked. – Muhammad Bilal Jun 09 '23 at 15:18