Architecture · Multi-region

Four regions. Residency by construction.

Tenants are pinned to a region at provisioning. Signing keys, transparency logs, and evidence packs never leave it. Residency is enforced in the data path, not promised in a contract.

Regional footprint

We operate four primary regions. Each is a complete deployment of the attestation plane (signer, kms-adapter, Trillian log, witness relays, pack-builder) with independent blast-radius boundaries. A regional outage in one doesn't impair signing in the others, and no tenant's evidence ever fails over out of its home region.

US (us-east / us-west)

AWS us-east-1 and us-west-2. For customers under SR 11-7, SEC, state DFS, or HIPAA regimes that expect US-resident evidence. FedRAMP-aligned controls.

EU (eu-central / eu-west)

AWS eu-central-1 and eu-west-1 (Frankfurt and Dublin). For GDPR and EU AI Act residency. Encryption keys and logs remain in-region; no Schrems II transfer for signed evidence.

UK (uk-south)

AWS eu-west-2 (London). For UK GDPR, FCA SYSC, and PRA SS1/23 model-risk regimes that require UK-resident records.

Africa (af-south)

AWS af-south-1 (Cape Town) with partner DC presence in Lagos and Nairobi. For POPIA, NDPA, and regional financial-services residency. We run this as a full region, not a replicated afterthought.

How residency is enforced

Tenant pinning at provisioning

Each tenant is created with a residency tag. The tag is baked into the tenant record, the KMS keyring location, the Trillian tree, and the database shard. There is no API to move a tenant between regions; migration requires an explicit, signed, operator-approved event.
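A minimal sketch of the pinning pattern, in Python for illustration (the `Tenant` fields and the `provision` helper are ours, not the product API): one residency tag is set at provisioning, every region-scoped resource name is derived from it, and no mutator exists to re-pin.

```python
from dataclasses import dataclass

@dataclass(frozen=True)  # frozen: there is no setter to re-pin a tenant
class Tenant:
    tenant_id: str
    region: str          # residency tag, fixed at provisioning
    kms_keyring: str
    trillian_tree: str
    db_shard: str

def provision(tenant_id: str, region: str) -> Tenant:
    # Every region-scoped resource is derived from the one residency tag,
    # so there is no second place where the region could silently diverge.
    return Tenant(
        tenant_id=tenant_id,
        region=region,
        kms_keyring=f"{region}/keyring/{tenant_id}",
        trillian_tree=f"{region}/tree/{tenant_id}",
        db_shard=f"{region}/shard/{tenant_id}",
    )

t = provision("acme", "eu-central-1")
try:
    t.region = "us-east-1"   # re-pinning is not an API call, it is an error
except AttributeError:       # FrozenInstanceError subclasses AttributeError
    pass
```

Migration then has to be a new, explicit provisioning event, which is exactly the property the paragraph above describes.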

In-region KMS and HSM

A tenant pinned to the EU signs against an EU KMS key. The kms-adapter refuses to relay a signing request to a key in a different region; the policy is enforced by SPIFFE identity scope, not by configuration.
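One way to sketch that check, assuming a hypothetical SPIFFE trust-domain layout in which the region is the first label (the ID scheme and the `allow_signing` helper are illustrative, not our actual implementation):

```python
def allow_signing(caller_spiffe_id: str, key_region: str) -> bool:
    """Permit a signing relay only when the caller's trust domain matches
    the key's region. The region-as-first-label layout is an assumption
    made for this sketch."""
    if not caller_spiffe_id.startswith("spiffe://"):
        return False  # fail closed on anything that is not a SPIFFE ID
    trust_domain = caller_spiffe_id[len("spiffe://"):].split("/", 1)[0]
    caller_region = trust_domain.split(".", 1)[0]
    return caller_region == key_region

# An EU signer may use the EU key; a US signer may not.
assert allow_signing("spiffe://eu-central-1.attest.internal/ns/prod/sa/signer",
                     "eu-central-1")
assert not allow_signing("spiffe://us-east-1.attest.internal/ns/prod/sa/signer",
                         "eu-central-1")
```

The point of keying the decision off the workload identity rather than a config value is that a misdeployed configuration cannot widen the policy.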

In-region transparency log

Each region has its own Trillian deployment and its own set of witnesses. Inclusion proofs reference region-local STHs. Cross-region comparison of logs is explicit, not implicit.
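Checking an inclusion proof against a region-local STH follows the standard RFC 6962 verification algorithm that Trillian-backed logs use; a self-contained sketch:

```python
import hashlib

def leaf_hash(data: bytes) -> bytes:
    return hashlib.sha256(b"\x00" + data).digest()          # RFC 6962 leaf prefix

def node_hash(left: bytes, right: bytes) -> bytes:
    return hashlib.sha256(b"\x01" + left + right).digest()  # interior-node prefix

def verify_inclusion(leaf: bytes, index: int, tree_size: int,
                     proof: list, sth_root: bytes) -> bool:
    """Recompute the root from an inclusion proof and compare it to the
    region-local STH root (RFC 6962 algorithm)."""
    if index >= tree_size:
        return False
    fn, sn = index, tree_size - 1
    r = leaf_hash(leaf)
    for p in proof:
        if sn == 0:
            return False
        if fn % 2 == 1 or fn == sn:
            r = node_hash(p, r)
            while fn % 2 == 0 and fn != 0:  # fn == sn and even: skip levels
                fn >>= 1                    # where this node closes the level
                sn >>= 1
        else:
            r = node_hash(r, p)
        fn >>= 1
        sn >>= 1
    return sn == 0 and r == sth_root

# Two-leaf tree: the proof for leaf 0 is just the sibling leaf hash.
a, b = leaf_hash(b"decision-1"), leaf_hash(b"decision-2")
root = node_hash(a, b)
assert verify_inclusion(b"decision-1", 0, 2, [b], root)
assert not verify_inclusion(b"decision-1", 0, 2, [b], node_hash(b, a))
```

Because the root in that final comparison comes from a region-local STH, a proof can only ever bind evidence to its home region's log.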

No silent cross-region replication

Backups are in-region by default. Where a customer explicitly opts into cross-region DR, the target region is named in the contract and surfaces in the data-flow diagram. Nothing crosses a border by accident.
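The shape of that policy is simple enough to sketch (the `backup_targets` helper is illustrative): in-region is the default, and a cross-region target exists only when it is passed in explicitly.

```python
def backup_targets(home_region, dr_region=None):
    """In-region by default; cross-region DR only via an explicitly named
    target region. Helper name and signature are ours, for illustration."""
    targets = [home_region]
    if dr_region is not None:  # the opt-in is a named region, never a default
        targets.append(dr_region)
    return targets

assert backup_targets("af-south-1") == ["af-south-1"]
assert backup_targets("eu-central-1", dr_region="eu-west-1") == \
       ["eu-central-1", "eu-west-1"]
```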

What is regional and what is global

A small number of control-plane components are global: billing metadata, status page, documentation. No tenant evidence, no key material, no policy content sits in that set.

  • Signed decisions, policy records, incidents: regional, never replicated.
  • Signing keys and KMS references: regional, per tenant, never exported.
  • Transparency log and witnesses: regional, independent roots per region.
  • Evidence packs: generated and retrieved in the tenant region; no global cache.
  • Usage metering: global, computed from tenant-emitted metadata only.
  • Documentation, status page, marketing: global.
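The split above can be expressed as a fail-closed classifier (the names are ours, for illustration): everything is either explicitly regional or explicitly global, and an unclassified data class is rejected rather than guessed.

```python
REGIONAL = {"signed_decisions", "policy_records", "incidents", "signing_keys",
            "transparency_log", "evidence_packs"}
GLOBAL = {"usage_metering", "documentation", "status_page", "marketing"}

def must_stay_in_region(data_class: str) -> bool:
    if data_class in REGIONAL:
        return True
    if data_class in GLOBAL:
        return False
    # Fail closed: an unclassified data class is a bug, not a routing decision.
    raise ValueError(f"unclassified data class: {data_class!r}")

assert must_stay_in_region("evidence_packs")
assert not must_stay_in_region("status_page")
```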

Residency, keys, and isolation are one story

Geographically, cryptographically, logically

Multi-region deployment enforces residency geographically; key management enforces it cryptographically; data security enforces it logically inside a region. The three work together: a tenant can't be moved out of their region, their keys can't be read from another region, and their data can't be queried by another tenant.

What we give to regulators

Tenant-specific data-flow diagrams

For regulators who ask the residency question seriously, we provide a tenant-specific data-flow diagram showing exactly which region each component lives in, which KMS holds the signing key, which Trillian tree holds the log, and which backup location holds the encrypted archive. No hand-waving, no generic architecture slide.