Azure OpenAI
Label Studio Enterprise integrates with Azure OpenAI through the Prompts interface, so you can run your Azure-hosted LLMs directly on project data. Once you add an Azure OpenAI connection, your deployments appear in the Base model dropdown alongside OpenAI, Gemini, Anthropic, and Vertex, letting you generate annotations, iterate on prompts, and evaluate model quality without leaving Label Studio.
With this connection, you can:
- Pre-label tasks using Azure OpenAI and write outputs into project controls
- Run structured evaluations on subsets with ground truth before you scale up
- Compare Azure OpenAI behavior with other providers on the same prompts and data
This Azure OpenAI integration is available in Label Studio Enterprise / SaaS via Prompts.
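To make the pre-labeling flow concrete, here is a minimal sketch of how an Azure OpenAI answer could be shaped into a Label Studio prediction for a Choices control. The function name `to_prediction`, the `model_version` tag, and the control names `sentiment`/`text` are illustrative assumptions; in practice they must match your own label config, and the payload shape follows Label Studio's predictions JSON format.

```python
# Sketch: turn a hypothetical Azure OpenAI classification answer into a
# Label Studio prediction payload for a <Choices name="sentiment"
# toName="text"> control. All names here are illustrative.

def to_prediction(task_id: int, model_answer: str) -> dict:
    """Wrap a raw model answer in Label Studio's prediction structure."""
    return {
        "task": task_id,
        "model_version": "azure-gpt-4o",  # hypothetical version label
        "result": [
            {
                "from_name": "sentiment",  # matches the Choices control name
                "to_name": "text",         # matches the labeled data field
                "type": "choices",
                "value": {"choices": [model_answer]},
            }
        ],
    }

pred = to_prediction(101, "Positive")
print(pred["result"][0]["value"]["choices"])  # ['Positive']
```

A payload like this is what ends up attached to a task as a pre-label, which annotators can then accept or correct.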
How Label Studio Connects with Azure OpenAI in Prompts
In Prompts, Azure OpenAI is configured as a model provider connection. Each connection knows how to reach a specific Azure deployment and how to authenticate. When you run a Prompt that uses Azure OpenAI:
- The Prompts UI sends your selected tasks, prompt template, and Azure OpenAI provider choice to the backend
- Label Studio builds runtime parameters using your connection details:
  model = 'azure/' + deployment_name
  base_url = <your Azure OpenAI endpoint>
  api_key = <your Azure key>
- The request is sent to Azure OpenAI via an OpenAI-compatible API layer
- Azure OpenAI returns model outputs, which Label Studio maps back into predictions that align with your label config (choices, spans, text fields, etc.)
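The parameter-building step above can be sketched as a small function. The name `build_runtime_params` and the example deployment/endpoint values are assumptions for illustration, not Label Studio's internal API; the dict it returns mirrors the `model`, `base_url`, and `api_key` values described in the list.

```python
# Sketch of how connection details map to OpenAI-compatible runtime
# parameters for an Azure deployment. Function and argument names are
# illustrative; the "azure/" prefix routes the call to Azure OpenAI.

def build_runtime_params(deployment_name: str, endpoint: str, api_key: str) -> dict:
    """Map Azure connection details to OpenAI-compatible client parameters."""
    return {
        "model": "azure/" + deployment_name,  # provider-prefixed model id
        "base_url": endpoint,                 # your Azure OpenAI endpoint
        "api_key": api_key,                   # your Azure key
    }

params = build_runtime_params(
    deployment_name="gpt-4o-deployment",               # hypothetical deployment
    endpoint="https://my-resource.openai.azure.com",   # hypothetical endpoint
    api_key="<your Azure key>",
)
print(params["model"])  # azure/gpt-4o-deployment
```

These parameters are then handed to the OpenAI-compatible API layer that actually issues the request to your Azure deployment.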
For prompt designers and annotators, Azure OpenAI behaves like any other base model in Prompts: pick it from the dropdown, write instructions, reference task fields like {{ text }}, and hit Run.
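The field references like {{ text }} are filled in from each task's data before the prompt is sent to the model. As a rough illustration (the rendering logic below is a simplified assumption, not Label Studio's internal implementation):

```python
import re

# Sketch: render a prompt template that references task fields such as
# {{ text }} against a task's data dict. Unknown fields are left as-is.

def render_template(template: str, task_data: dict) -> str:
    """Substitute {{ field }} placeholders with values from task_data."""
    return re.sub(
        r"\{\{\s*(\w+)\s*\}\}",
        lambda m: str(task_data.get(m.group(1), m.group(0))),
        template,
    )

prompt = render_template(
    "Classify the sentiment of: {{ text }}",
    {"text": "I love this product"},
)
print(prompt)  # Classify the sentiment of: I love this product
```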