TokenGate

LLM keys, prompts, and quotas in one place.

Centralize provider API keys, expose prompt endpoints, and meter token usage per client and model without storing user content.

Provider key vault

Store Gemini/OpenAI/Claude keys with envelope encryption and rotate them per tenant.

Prompt endpoints

Publish versioned prompts as endpoints that accept text and image payloads tagged with client user IDs.
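One way to picture a versioned prompt endpoint: a registry maps (prompt name, version) to a template plus a target model, and each request payload carries the client's user ID for attribution. This is a minimal sketch under assumed names; the registry shape, payload fields, and the example prompt are all hypothetical, not TokenGate's actual API.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class PromptVersion:
    template: str  # prompt text with placeholders
    model: str     # provider model this version targets


# Hypothetical in-memory registry keyed by (prompt name, version).
REGISTRY = {
    ("summarize", "v2"): PromptVersion(
        template="Summarize for user {user_id}: {text}",
        model="gpt-4o-mini",
    ),
}


def handle_prompt_request(name: str, version: str, payload: dict) -> dict:
    """Resolve a published prompt version and build the provider request.

    The payload carries the caller's end-user ID so usage can be
    attributed without the gateway persisting the content itself.
    """
    prompt = REGISTRY[(name, version)]
    rendered = prompt.template.format(
        user_id=payload["client_user_id"],
        text=payload["text"],
    )
    return {"model": prompt.model, "input": rendered}
```

Pinning clients to an explicit version string means a prompt can be revised and republished without silently changing behavior for existing callers.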

Usage + quotas

Track token usage by client and model to enforce default quotas and per-client overrides.