Integrations · Azure AI & Vertex AI

Gemini and Azure AI, wrapped the same way.

Two hyperscalers, one shape of API. Wrap the Vertex AI or Azure AI Foundry client once (Gemini 2.5, Azure OpenAI, Phi, Mistral on Azure, Claude on Vertex) and every call becomes signed evidence.

One API across Vertex, Azure AI Foundry, and Azure OpenAI. Workload identity federation eliminates long-lived credentials; region-pinned signing keys honor residency programs.

Vertex AI

The integration wraps the generateContent and streamGenerateContent calls of Gemini 2.5 Pro and Flash, capturing thinking content, tool function-call cycles, grounding metadata, safety ratings, and usage metadata. Claude and Llama deployments on Vertex use the same wrapper.

```python
from google.cloud import aiplatform
import veridra

client = aiplatform.gapic.PredictionServiceClient()  # or Vertex SDK client
wrapped = veridra.wrap_vertex(client, system_id="benefits-eligibility-v1")
```

Azure AI Foundry

The wrapper for the Foundry inference client covers hosted models (Phi, Mistral, Cohere, Llama, Jais), emitting signed records that include the Foundry project, deployment name, and region.

```python
from azure.ai.inference import ChatCompletionsClient
from azure.identity import DefaultAzureCredential
import veridra

client = ChatCompletionsClient(
    endpoint="https://...inference.ml.azure.com",
    credential=DefaultAzureCredential(),
)
wrapped = veridra.wrap_azure_ai(client, system_id="contract-review-v4")
```

Azure OpenAI

The Azure OpenAI wrapper preserves Azure-specific fields: deployment name, API version, managed-identity scope, and content-filter results. It uses the same wrap_openai helper, since the Azure client is a drop-in replacement for the OpenAI client.
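To make "preserved fields" concrete, here is a minimal sketch of pulling Azure-specific metadata out of a chat-completions response into an evidence record. The field names and record shape are illustrative assumptions, not Veridra's actual schema.

```python
# Hypothetical sketch: the Azure-specific fields a signed record might carry.
# Keys like "content_filter" are illustrative, not Veridra's real schema.
def azure_openai_evidence(response: dict, deployment: str, api_version: str) -> dict:
    """Collect Azure-specific metadata alongside the model's finish state."""
    choice = response["choices"][0]
    return {
        "deployment": deployment,
        "api_version": api_version,
        "content_filter": choice.get("content_filter_results", {}),
        "finish_reason": choice.get("finish_reason"),
    }

record = azure_openai_evidence(
    {"choices": [{"finish_reason": "stop",
                  "content_filter_results": {"hate": {"filtered": False}}}]},
    deployment="gpt-4o-contract-review",
    api_version="2024-06-01",
)
```

The point is that nothing Azure-specific is flattened away: content-filter verdicts travel with the record rather than being discarded during normalization.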

Residency and key boundaries

Region-pinned signing ensures attestations remain in the same geography as model inference, supporting EU, UK, Swiss, Japanese, and Indian compliance requirements. Infrastructure uses workload identity federation on GCP and managed identities on Azure — no long-lived service account keys.

  • KMS binding via GCP KMS (workload identity) or Azure Key Vault (managed identity).
  • Private Service Connect on GCP; Private Endpoint on Azure. No public internet in the inference path.
  • Region pinning at the tenant level; signing keys can't be accessed from another region.
  • Captures safety-rating and content-filter interventions as signed records.
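A signed record boils down to a canonicalized payload plus a signature from a region-held key. The sketch below uses a local HMAC purely for illustration; in the actual deployment the key material never leaves GCP KMS or Azure Key Vault, and the signing algorithm may differ.

```python
import hashlib
import hmac
import json

# Illustrative only: real signing happens inside GCP KMS / Azure Key Vault,
# so the key bytes are never visible to application code as they are here.
def sign_record(record: dict, key: bytes, region: str) -> dict:
    """Canonicalize a record deterministically, then sign it."""
    canonical = json.dumps(record, sort_keys=True, separators=(",", ":"))
    sig = hmac.new(key, canonical.encode(), hashlib.sha256).hexdigest()
    return {"record": record, "region": region, "signature": sig}

signed = sign_record(
    {"system_id": "benefits-eligibility-v1", "model": "gemini-2.5-pro"},
    key=b"demo-key",
    region="europe-west4",
)
```

Deterministic canonicalization (sorted keys, fixed separators) is what makes the signature reproducible by a verifier; region pinning then guarantees the key used here can only be exercised in the same geography as the inference.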

One wrapper, three surfaces

Gemini, Azure AI Foundry, Azure OpenAI

The wrapper dispatches by client type. Your code sees the native SDK you already use; Veridra sees normalized, canonicalized evidence regardless of which hyperscaler provided the model.
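Dispatch by client type can be sketched as an inspection of the client's module. The selection logic below is an assumption for illustration; the real SDK module paths exist, but Veridra's internal routing is not documented here.

```python
# Illustrative dispatch-by-client-type; the routing logic is a sketch,
# not Veridra's actual implementation.
def select_wrapper(client) -> str:
    module = type(client).__module__
    if module.startswith("google.cloud"):
        return "vertex"
    if module.startswith("azure.ai.inference"):
        return "azure_ai"
    if module.startswith("openai"):
        return "openai"  # also covers Azure OpenAI's drop-in client
    raise TypeError(f"unsupported client: {type(client).__name__}")
```

Because the Azure OpenAI client lives in the openai package, it routes through the same branch as the stock OpenAI client, which is why wrap_openai handles both.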

Multi-cloud governance

Unified across AWS, Azure, GCP

The platform consolidates multi-cloud AI deployments across AWS, Azure, and GCP into unified governance, with attestation records stored in a tenant-isolated transparency log. An EU tenant running Gemini on Vertex and Claude on Bedrock sees one signed audit trail, not three.
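"One audit trail, not three" can be sketched as a tenant-scoped merge of records regardless of originating cloud. The record shape here is hypothetical; the actual transparency log is append-only and signed.

```python
# Hypothetical record shape; the real transparency log is append-only
# and tenant-isolated rather than an in-memory list.
def audit_trail(records: list[dict], tenant: str) -> list[dict]:
    """Filter to one tenant and order by timestamp, whatever the cloud."""
    mine = [r for r in records if r["tenant"] == tenant]
    return sorted(mine, key=lambda r: r["ts"])

records = [
    {"tenant": "eu-1", "cloud": "gcp", "model": "gemini-2.5-pro", "ts": 2},
    {"tenant": "eu-1", "cloud": "aws", "model": "claude-sonnet", "ts": 1},
    {"tenant": "us-9", "cloud": "azure", "model": "gpt-4o", "ts": 1},
]
trail = audit_trail(records, "eu-1")
```

The EU tenant's trail interleaves Vertex and Bedrock calls in time order; the other tenant's records never appear.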