Run OpenAI models in Azure (Enterprise)

Microsoft Azure offers the ability to host OpenAI models within a private cloud. You can add these models in Stack AI using an "Azure" node. Hosting models in Azure has a few benefits:

  1. Lower and more consistent latency: models in Azure are not affected by the traffic on OpenAI's public API, so response times are faster and more predictable.

  2. Higher rate limits: models in Azure offer rate limits of up to 240,000 tokens per minute and 1,440 requests per minute.

  3. Data privacy and compliance: data sent to Azure stays within the private cloud and is not sent to OpenAI or any other external service. These models are covered under Azure's Business Associate Agreement (BAA) and are HIPAA compliant.

Enterprise users of Stack AI have access to models hosted in Azure.

We currently support text-davinci-003, gpt-3.5-turbo, and text-embedding-ada-002.
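For reference, the sketch below shows how an Azure-hosted chat deployment can be called directly with the openai Python SDK. The endpoint, API key, API version, and deployment name are placeholders; substitute the values from your own Azure OpenAI resource (the same kind of credentials you provide when configuring an "Azure" node).

```python
from openai import AzureOpenAI

# Placeholder credentials: replace with the values from your Azure OpenAI resource.
client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",
    api_key="<your-azure-openai-key>",
    api_version="2023-05-15",
)

# "model" is the name of your Azure *deployment*, which may differ from the
# underlying OpenAI model name (here, a deployment of gpt-3.5-turbo).
response = client.chat.completions.create(
    model="<your-gpt-35-turbo-deployment>",
    messages=[{"role": "user", "content": "Hello from an Azure-hosted model!"}],
)
print(response.choices[0].message.content)
```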

You can use azure-text-embeddings-ada-002 in Vector Databases or Offline Data Loaders nodes.
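As a rough illustration of what this node does under the hood, the sketch below generates an embedding against an Azure deployment of the ada-002 embedding model; the deployment name and credentials are placeholders for your own resource.

```python
from openai import AzureOpenAI

# Placeholder credentials: replace with the values from your Azure OpenAI resource.
client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",
    api_key="<your-azure-openai-key>",
    api_version="2023-05-15",
)

# "model" is the name of your Azure deployment of the ada-002 embedding model.
result = client.embeddings.create(
    model="<your-embedding-deployment>",
    input="Document chunk to index in a vector database",
)
print(len(result.data[0].embedding))  # text-embedding-ada-002 returns 1536-dimensional vectors
```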
