Update (29th May, 2023):
In order to use the library with Microsoft Azure endpoints, you need to set the OPENAI_API_TYPE, OPENAI_API_BASE, OPENAI_API_KEY, and optionally OPENAI_API_VERSION environment variables. OPENAI_API_TYPE must be set to 'azure', and the others must correspond to the properties of your endpoint. In addition, the deployment name must be passed to the model wrapper (deployment for embeddings, deployment_name for the LLM). See the example below:
import os
os.environ["OPENAI_API_TYPE"] = "azure"
os.environ["OPENAI_API_BASE"] = "https://<your-endpoint.openai.azure.com/"
os.environ["OPENAI_API_KEY"] = "your AzureOpenAI key"
from langchain.embeddings.openai import OpenAIEmbeddings
# Point the embeddings wrapper at your Azure deployment
embeddings = OpenAIEmbeddings(
    deployment="your-embeddings-deployment-name",
    model="your-embeddings-model-name",
    openai_api_base="https://<your-endpoint>.openai.azure.com/",
    openai_api_type="azure",
)
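If the configuration is correct, a quick call like the following should succeed (a minimal sanity check; the query text is arbitrary):

# Embed a single string with the Azure deployment
vector = embeddings.embed_query("Hello, Azure OpenAI!")
print(len(vector))  # dimensionality of the returned embedding vector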
The same goes for the LLM. See the example below, based on the notebook you linked:
import os
os.environ["OPENAI_API_TYPE"] = "azure"
os.environ["OPENAI_API_VERSION"] = "2022-12-01"
os.environ["OPENAI_API_BASE"] = "..."
os.environ["OPENAI_API_KEY"] = "..."
# Import Azure OpenAI and the QA chain helper
from langchain.llms import AzureOpenAI
from langchain.chains.question_answering import load_qa_chain

# Replace the deployment name with your own
chain = load_qa_chain(
    AzureOpenAI(
        deployment_name="td2",
        model_name="text-davinci-002",
    ),
    chain_type="stuff",
)
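Running the chain then follows the usual load_qa_chain pattern; the documents and question below are placeholders, not from your notebook:

from langchain.docstore.document import Document

# Hypothetical input: in practice these would come from your loader/splitter
docs = [Document(page_content="Azure OpenAI exposes deployments of OpenAI models.")]
answer = chain.run(input_documents=docs, question="What does Azure OpenAI expose?")
print(answer)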
Initial suspicion
An AuthenticationError occurs when there's an issue with your API key or token, for example because the key is invalid. This can happen due to a small mistake, like a typo or a formatting error. Try a different/new API key if the issue persists.
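To rule LangChain out, you can also hit the endpoint directly with the openai package (a rough sketch, assuming the pre-1.0 openai SDK that LangChain used at the time; the deployment name "td2" is just the placeholder from above):

import openai

openai.api_type = "azure"
openai.api_base = "https://<your-endpoint>.openai.azure.com/"
openai.api_version = "2022-12-01"
openai.api_key = "your AzureOpenAI key"

# If this also raises AuthenticationError, the problem is the key/endpoint, not LangChain
response = openai.Completion.create(
    engine="td2",  # your deployment name
    prompt="Say hello",
    max_tokens=5,
)
print(response["choices"][0]["text"])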