Vertex AI
Label Studio Enterprise integrates with Vertex AI through the Prompts interface so you can run Gemini and other Vertex-hosted models directly on your project data. Once you configure Vertex AI as a provider, its models appear in the Prompts “Base model” dropdown alongside OpenAI, Gemini, and Anthropic models, letting you generate annotations, iterate on prompts, and evaluate outputs against ground truth inside Label Studio.
With this connection, you can:
- Use Vertex-hosted models to pre-label tasks and write results into project controls
- Run structured evaluations on real project data before scaling up
- Compare Vertex outputs with other providers while keeping the workflow in one place
This Vertex AI integration is available in Label Studio Enterprise and SaaS via Prompts.
How Label Studio Connects with Vertex AI in Prompts
Label Studio treats Vertex AI as a model provider behind the Prompts and evaluation flow. When you create a Prompt that uses Vertex:
- You select Vertex as the provider and choose a Vertex model in the Base model dropdown
- The Prompts UI sends your selected tasks, prompt template, and model choice to the Enterprise backend
- The backend builds a Vertex request from the connection settings (project, region, model) and runtime parameters such as max tokens, temperature, and top-p
- Vertex runs the model and returns outputs, which Label Studio writes back into your project as predictions or displays in the Response panel
- You can review outputs, refine your prompt, and then reuse the same Prompt to drive pre-labeling or evaluations at scale
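The request-building step above can be sketched in Python. This is an illustrative sketch only, not the Enterprise backend's actual code: the endpoint and payload shapes follow the public Vertex AI `generateContent` REST API, while the function name and its parameters are hypothetical stand-ins for the connection settings (project, region, model) and runtime parameters the doc describes.

```python
def build_vertex_request(project: str, region: str, model: str,
                         prompt_text: str,
                         max_tokens: int = 1024,
                         temperature: float = 0.2,
                         top_p: float = 0.95) -> dict:
    """Assemble a Vertex AI generateContent call from connection settings
    (project, region, model) and runtime parameters (hypothetical helper)."""
    # Regional Vertex AI endpoint for the publisher model.
    endpoint = (
        f"https://{region}-aiplatform.googleapis.com/v1/projects/{project}"
        f"/locations/{region}/publishers/google/models/{model}:generateContent"
    )
    # Request body: the rendered prompt plus the generation parameters
    # exposed in the Prompts UI (max tokens, temperature, top-p).
    body = {
        "contents": [{"role": "user", "parts": [{"text": prompt_text}]}],
        "generationConfig": {
            "maxOutputTokens": max_tokens,
            "temperature": temperature,
            "topP": top_p,
        },
    }
    return {"url": endpoint, "json": body}

request = build_vertex_request(
    project="my-gcp-project", region="us-central1",
    model="gemini-1.5-pro",
    prompt_text="Classify the sentiment of: 'Great service!'",
)
```

The same connection settings are reused for every task in the batch; only the rendered prompt text changes per task.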
From the user's perspective, it looks and feels like any other Prompt; under the hood, Label Studio swaps in the Vertex provider implementation.
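As a concrete illustration of the write-back step, a Vertex model response for a classification project maps onto Label Studio's standard prediction JSON. The sketch below assumes a labeling config with a `Choices` control named `sentiment` applied to a `Text` tag named `text`; the helper function itself is hypothetical, but the `result` structure matches Label Studio's documented prediction format.

```python
def response_to_prediction(model_output: str, model_version: str) -> dict:
    """Wrap a raw model output as a Label Studio prediction (hypothetical
    helper; assumes a Choices control 'sentiment' on a Text tag 'text')."""
    return {
        "model_version": model_version,
        "result": [{
            "from_name": "sentiment",   # name of the Choices control
            "to_name": "text",          # name of the Text tag it labels
            "type": "choices",
            "value": {"choices": [model_output.strip()]},
        }],
    }

prediction = response_to_prediction("Positive", "vertex-gemini-1.5-pro")
```

Predictions in this shape appear in the project's labeling stream as pre-annotations, where annotators can accept or correct them.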