We are entering the era of agentic AI, where AI agents are no longer experimental tools or isolated assistants. They are becoming core components of modern software systems, executing tasks, making decisions, and interacting with external services on behalf of users and organizations. As adoption accelerates, enterprises are rapidly embedding AI agents into business-critical workflows.
Under the hood, most of these agents rely on GenAI platform APIs, with OpenAI being the most widely used provider today. To access these APIs, AI agents authenticate using API keys. This same pattern applies to other major providers such as Mistral and Anthropic Claude, and is likely to remain common across the broader GenAI ecosystem.
OpenAI API keys are long-lived, static bearer credentials: anyone who holds a key can use it, with no binding to the workload that presents it.
This model was acceptable when only a small number of centrally managed applications accessed AI APIs. It breaks down in an environment where large numbers of AI agents are deployed dynamically across clusters, regions, and teams.
Long-lived static API keys introduce several systemic issues.
In practice, these keys often end up embedded in configuration files, CI pipelines, container images, or environment variables, significantly increasing the blast radius of any compromise.
HashiCorp Vault and OpenBao address part of this problem by supporting dynamic OpenAI API keys through a secrets engine: keys are generated on demand for a specific role, live only for a configured TTL, and are revoked automatically when that TTL expires.
This is a major improvement over static secrets. Credentials are no longer perpetual, and the window of exposure is dramatically reduced.
However, this does not fully solve the problem.
Even temporary OpenAI API keys must still be distributed to each agent, stored somewhere the agent can read them, and refreshed before they expire.
To achieve this, developers typically need to embed a Vault client in every workload, handle Vault tokens, and implement their own renewal and distribution logic.
As the number of agents grows, this approach becomes increasingly complex and fragile.
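To make the fragility concrete, here is a rough sketch of the plumbing each workload otherwise ends up carrying. The Vault address, token handling, and role name are illustrative only, not part of any specific setup:

```bash
# Illustrative only: the manual pattern each AI agent would otherwise need.
export VAULT_ADDR="https://vault.example.internal:8200"   # assumed address
export VAULT_TOKEN="<a Vault token that somehow reached this workload>"

# Fetch a short-lived OpenAI key from a hypothetical dynamic-secrets role...
OPENAI_API_KEY=$(vault read -field=api_key openai/creds/my-agent-role)
export OPENAI_API_KEY

# ...and repeat before the TTL expires, for every agent, on every node,
# while also rotating and protecting VAULT_TOKEN itself.
```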
According to Gartner, 40% of enterprise applications will feature task-specific AI agents by 2026, up from less than 5% in 2025.
At this scale, secret distribution and credential management are no longer application-level concerns — they become platform-level security challenges.
Riptides addresses this challenge by starting from a different premise: identity must be the root of trust.
With Riptides, each AI agent workload receives a verifiable identity, and that identity, rather than a distributed secret, is what grants access to credentials.
Riptides relies on tokenex to access Vault and OpenBao secrets through their native JWT authentication mechanisms, exchanging workload identity tokens for secrets without issuing or storing any Vault/OpenBao tokens.
Using tokenex, Riptides authenticates to Vault/OpenBao with the workload's identity JWT, reads the requested secret, and delivers it to the workload.
This ensures that no Vault or OpenBao tokens are ever stored in or managed by the application, and that every secret access is tied to a verified workload identity.
Riptides retrieves short-lived OpenAI API keys from Vault/OpenBao on demand and places them into a dedicated sysfs file provided by the Riptides Linux kernel module, ensuring that read access is enforced at the kernel level.
Key properties of this approach: the key is exposed only through a kernel-managed file, and only the authorized workload process is permitted to read it.
From the AI agent's perspective, consumption is trivial: it simply reads the key from that file.
This makes secret consumption as simple as a file read, while ensuring that access control is stronger than environment variables or in-process memory, and cannot be bypassed by misconfiguration or code changes.
By combining verifiable workload identity, short-lived credentials, and kernel-level enforcement, Riptides eliminates static secrets, scopes every key to a single workload and a short lifetime, and makes every access attributable to a verified identity.
Together, these properties align with least privilege, zero trust, and defense-in-depth principles, while simplifying compliance with security frameworks and internal audit requirements.
This section walks through how to configure Vault/OpenBao and Riptides so that AI agents can securely obtain short‑lived OpenAI API keys without embedding Vault clients, handling Vault tokens, or managing secrets directly in application code.
The configuration is presented step by step, with each snippet accompanied by an explanation of what it does, why it is needed, and how it fits into the overall flow.
Prerequisites
This guide assumes:
- Vault or OpenBao is already running
- JWT authentication is configured in Vault/OpenBao to trust the Riptides Control Plane as an OIDC issuer
- The OpenAI secrets engine plugin is installed
We do not cover JWT auth configuration in detail here. A complete example can be found in this post.
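At a high level, that setup enables the JWT auth method and points it at the Riptides Control Plane's OIDC metadata. A minimal sketch, with the issuer URL as a placeholder for your environment:

```bash
# Minimal sketch; see the linked post for a complete example.
bao auth enable jwt

# Trust the Riptides Control Plane as the OIDC issuer for workload JWTs.
bao write auth/jwt/config \
    oidc_discovery_url="https://<riptides-control-plane>" \
    bound_issuer="https://<riptides-control-plane>"
```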
For instructions on installing the OpenAI secrets engine plugin, see:
https://github.com/gitrgoliveira/vault-plugin-secrets-openai
Note
In all examples below, if you are using HashiCorp Vault instead of OpenBao, replace the bao CLI with vault.
bao secrets enable openai
This command enables the OpenAI secrets engine at the openai/ path. Once enabled, Vault/OpenBao can dynamically generate OpenAI API keys instead of relying on static, long‑lived credentials.
bao write openai/config \
admin_api_key="$OPENAI_ADMIN_KEY" \
admin_api_key_id="$OPENAI_ADMIN_KEY_ID" \
organization_id="$OPENAI_ORG_ID"
Here you configure the secrets engine with an administrative OpenAI API key. This key is used only by Vault/OpenBao to create and revoke project‑scoped, short‑lived API keys on behalf of workloads.
This administrative key never leaves Vault/OpenBao and is not exposed to AI agents.
bao write openai/roles/my-role \
project_id="$OPENAI_PROJ_ID" \
service_account_name_template="vault-{{.RoleName}}-{{.RandomSuffix}}" \
ttl=15m \
max_ttl=1h
This role defines how Vault/OpenBao generates OpenAI API keys:
- The OpenAI project in which service accounts and keys are created (project_id)
- The naming pattern for the service accounts created on your behalf (service_account_name_template)
- The default lifetime (ttl) and maximum lifetime (max_ttl) of issued keys

Every API key generated through this role is short‑lived by design, limiting blast radius and exposure.
bao policy write read-openai-apikeys -<<EOF
path "openai/creds/my-role" {
capabilities = ["read"]
}
EOF
This policy allows a client to read credentials generated by the my-role OpenAI role. It does not grant administrative privileges or access to other secrets.
In the next steps, this policy will be bound to a specific workload identity.
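If you want to verify the wiring before involving any workloads, an operator or admin token can generate a credential directly from this path; the exact field names in the response come from the OpenAI secrets engine plugin:

```bash
# Each read mints a fresh, short-lived OpenAI API key for the my-role role.
bao read openai/creds/my-role
```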
bao write auth/jwt/role/openai-apikeys \
user_claim="sub" \
bound_audiences="vault" \
bound_subject="<AI_AGENT_WORKLOAD_SPIFFE_ID>" \
token_policies="read-openai-apikeys" \
token_type="service" \
token_num_uses=2 \
role_type="jwt"
This configuration tells Vault/OpenBao which identity is allowed to authenticate using JWTs:
- Only the workload whose SPIFFE ID matches the configured subject may log in (bound_subject)
- The presented JWT must carry the expected audience (vault)
- A successful login yields a service token restricted to the read-openai-apikeys policy and a small number of uses

No Vault token is stored or managed by the AI agent. Authentication is entirely JWT‑based and secretless.
apiVersion: core.riptides.io/v1alpha1
kind: Service
metadata:
  name: openai-api
  namespace: riptides-system
spec:
  addresses:
    - address: api.openai.com
      port: 443
  labels:
    app: openai-api
  external: true
This Riptides Service object declares OpenAI as an external dependency. It allows Riptides to apply identity‑aware egress policies when workloads communicate with the OpenAI API.
apiVersion: core.riptides.io/v1alpha1
kind: CredentialSource
metadata:
  name: vault-openai-apikeys
  namespace: riptides-system
spec:
  vault:
    address: http://localhost:8200
    jwtAuthMethodPath: jwt
    role: <JWT-AUTH-ROLE>
    path: openai/creds/my-role
    audience: ["vault"]
This CredentialSource tells Riptides:
- Where Vault/OpenBao is reachable (address)
- Which JWT auth mount and role to authenticate against (jwtAuthMethodPath, role)
- Which secret path to read credentials from (path)
- Which audience to set on the workload identity JWT (audience)
Riptides will exchange a workload identity JWT for a short‑lived OpenAI API key, without requiring Vault tokens or SDKs in the workload.
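Under the hood, this is roughly equivalent to the following two Vault/OpenBao API calls, shown here only to illustrate the flow that Riptides performs on the workload's behalf (assuming $WORKLOAD_JWT holds a Riptides-issued identity token):

```bash
# 1. Exchange the workload identity JWT for a short-lived Vault token.
VAULT_TOKEN=$(curl -s --request POST \
  --data "{\"role\": \"openai-apikeys\", \"jwt\": \"$WORKLOAD_JWT\"}" \
  http://localhost:8200/v1/auth/jwt/login | jq -r '.auth.client_token')

# 2. Use that token once to read a dynamically generated OpenAI API key.
curl -s --header "X-Vault-Token: $VAULT_TOKEN" \
  http://localhost:8200/v1/openai/creds/my-role | jq '.data'
```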
apiVersion: core.riptides.io/v1alpha1
kind: WorkloadIdentity
metadata:
  name: ai-agent
  namespace: riptides-system
spec:
  scope:
    agent:
      id: <AGENT_WORKLOAD_ID>
  workloadID: ai-agent
  selectors:
    - process:name: [<AI-AGENT-PROCESS-NAME>]
This object binds a SPIFFE-based workload identity to a specific process. Only the matching process is treated as the AI agent and is allowed to assume the ai-agent identity and receive the credentials bound to it in the next step.
apiVersion: core.riptides.io/v1alpha1
kind: CredentialBinding
metadata:
  name: vault-openai-apikeys-binding
  namespace: riptides-system
spec:
  workloadID: ai-agent
  credentialSource: vault-openai-apikeys
  propagation:
    sysfs: {}
This CredentialBinding connects the AI agent identity with the Vault credential source and specifies how the credential is delivered.
By selecting sysfs, Riptides exposes the secret through a file managed by the Riptides Linux kernel module.
kubectl get credentialbindings.core.riptides.io -n riptides-system vault-openai-apikeys-binding -o yaml
This command shows the resolved credential binding, including the exact sysfs file path where the OpenAI API key is exposed.
status:
  state: OK
  sysfs:
    files:
      - path: /sys/module/riptides/credentials/2e08019a-a435-55e3-9ba7-f01c09b9fc2b/crb-vault-openai-apikeys-binding/vault.json
        type: CREDENTIAL
The file path shown above is created and managed by the Riptides Linux kernel module. Only the authorized workload can read its contents.
{
"api_key": "sk-svcacct-V8rgR1rZ-MVmeVUR-....",
"api_key_id": "key_1Nj...",
"service_account": "vault-my-role-...",
"service_account_id": "user-gLNLYNq....."
}
This file contains the short‑lived OpenAI API key and associated metadata. The key exists only for its configured TTL and is revoked automatically by Vault/OpenBao.
curl https://api.openai.com/v1/models \
-H "Authorization: Bearer <api_key from vault.json>" \
-H "OpenAI-Organization: <ORGANIZATION ID>" \
-H "OpenAI-Project: <PROJECT ID>"
Output:
{
"object": "list",
"data": [
{
"id": "gpt-4-0613",
"object": "model",
"created": 1686588896,
"owned_by": "openai"
},
{
"id": "gpt-4",
"object": "model",
"created": 1687882411,
"owned_by": "openai"
},
{
"id": "gpt-3.5-turbo",
"object": "model",
"created": 1677610602,
"owned_by": "openai"
},
{
"id": "gpt-5.2-codex",
"object": "model",
"created": 1766164985,
"owned_by": "system"
},
{
"id": "gpt-4o-mini-tts-2025-12-15",
"object": "model",
"created": 1765610837,
"owned_by": "system"
},
{
"id": "gpt-realtime-mini-2025-12-15",
"object": "model",
"created": 1765612007,
"owned_by": "system"
},
{
"id": "gpt-audio-mini-2025-12-15",
"object": "model",
"created": 1765760008,
"owned_by": "system"
},
{
"id": "chatgpt-image-latest",
"object": "model",
"created": 1765925279,
"owned_by": "system"
}
...
...
]
}
This request demonstrates that the API key retrieved via Riptides and Vault/OpenBao works as expected.
The AI agent simply reads the api_key field from the sysfs file and uses it when calling the OpenAI API.
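For example, a shell-based agent could consume the key like this; the sysfs path is the one resolved for this binding, and yours will differ, as will the model you call:

```bash
# Read the short-lived key from the kernel-managed credential file.
CRED_FILE=/sys/module/riptides/credentials/2e08019a-a435-55e3-9ba7-f01c09b9fc2b/crb-vault-openai-apikeys-binding/vault.json
API_KEY=$(jq -r '.api_key' "$CRED_FILE")

# Call the OpenAI API with it (model name is just an example).
curl -s https://api.openai.com/v1/chat/completions \
  -H "Authorization: Bearer $API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-4o-mini", "messages": [{"role": "user", "content": "Hello"}]}'
```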
This provides a simple, secure, and auditable way for AI agents to consume GenAI APIs at scale.
As AI agents become a foundational building block of modern systems, credential handling must evolve.
Riptides enables an operating model where workloads authenticate with verifiable identities instead of distributed secrets, credentials are short-lived and scoped to a single workload, and access to those credentials is enforced at the kernel level.
This approach removes static API keys from configuration files, pipelines, and application code, and gives platform and security teams a consistent, auditable way to govern how AI agents consume GenAI APIs.
In an ecosystem rapidly moving toward autonomous workloads, identity-first, short-lived credentials are no longer optional — they are essential.
Riptides brings this model to GenAI workloads today, starting with OpenAI and extending naturally to the broader AI platform landscape.
In this post, we showed how short-lived OpenAI API keys can be securely delivered to AI agents using identity and kernel-level enforcement. In the next post, we’ll go one step further and show how Riptides can inject these keys directly into outbound requests, so workloads never read or handle API keys at all.