Cordon integrates with OpenAI Codex so your AI agent can make authenticated API calls without holding real credentials.
Cordon currently works with API key authentication only. If you use Codex with a ChatGPT Plus/Team subscription (OAuth login), credential injection won’t apply — Codex authenticates directly via OAuth, bypassing the proxy. Support for OAuth-based subscriptions is coming soon.

Automated setup

The fastest way to get started:
cordon setup codex
This command:
  1. Generates CA certificates (if not already present)
  2. Creates a scaffold cordon.yaml
  3. Writes proxy env vars to Codex’s ~/.codex/.env (HTTPS_PROXY, HTTP_PROXY, SSL_CERT_FILE)
Your existing .env is backed up to .env.cordon.bak before any changes are made.

Global setup with background service

To install cordon as a background service that starts automatically:
cordon setup codex --global
This additionally installs a launchd (macOS) or systemd (Linux) service so cordon runs in the background without a terminal window.

Remove the setup

cordon setup codex --remove           # project-scoped
cordon setup codex --remove --global  # global (also removes background service)

API key setup

Codex needs to be configured to use API key authentication (rather than OAuth) so traffic routes through cordon. Run codex login, select the API key option (option 3), and enter a dummy value:
codex login
# Select: API key
# Enter: dummy-replaced-by-cordon
Cordon strips this dummy key and injects the real one from your secret store at the network layer.
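Conceptually, the injection is a header rewrite at the proxy: whatever placeholder the client sent is discarded and the real token is substituted before the request goes upstream. A minimal sketch of that idea (the function name and key values are illustrative, not Cordon's actual internals):

```python
# Sketch: replace a placeholder Authorization header with the real key.
# The header name is standard HTTP; everything else is illustrative.

def inject_credentials(headers: dict, real_key: str) -> dict:
    """Return a copy of the request headers with the real bearer token.

    The dummy value the client sent is discarded, so the real key never
    has to exist in the client's environment or logs.
    """
    rewritten = dict(headers)
    rewritten["Authorization"] = f"Bearer {real_key}"
    return rewritten

client_headers = {"Authorization": "Bearer dummy-replaced-by-cordon"}
upstream_headers = inject_credentials(client_headers, "sk-real-key")
print(upstream_headers["Authorization"])  # Bearer sk-real-key
```

Because the rewrite happens at the network layer, the dummy value itself is irrelevant; it only exists so Codex takes the API-key code path.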

Adding routes

After setup, edit cordon.yaml to add a route for OpenAI:
routes:
  - name: openai
    match:
      host: api.openai.com
    auth:
      type: bearer
      secret:
        source: keyring
        account: openai
Then store the secret:
cordon secret set openai --config /path/to/cordon.yaml
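Secrets are resolved from the configured source once at startup, which is why config or secret changes require a restart. The resolution step can be pictured like this; the keyring lookup is faked with a dict, and all function names here are illustrative:

```python
# Sketch: resolve each route's secret once at startup.
# A real implementation would query the OS keyring; a dict stands in here.

def resolve_secrets(routes: list, keyring: dict) -> dict:
    """Map route name -> secret value, failing fast on missing entries."""
    resolved = {}
    for route in routes:
        account = route["auth"]["secret"]["account"]
        if account not in keyring:
            raise KeyError(f"no secret stored for account '{account}'")
        resolved[route["name"]] = keyring[account]
    return resolved

routes = [{"name": "openai",
           "auth": {"secret": {"source": "keyring", "account": "openai"}}}]
secrets = resolve_secrets(routes, {"openai": "sk-example"})
print(secrets)  # {'openai': 'sk-example'}
```

Failing fast at startup, rather than on the first matching request, means a missing secret surfaces immediately instead of as a confusing mid-session error.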

Manual setup

If you prefer manual configuration, add these to ~/.codex/.env:
HTTPS_PROXY="http://127.0.0.1:6790"
HTTP_PROXY="http://127.0.0.1:6790"
SSL_CERT_FILE="/path/to/ca-cert.pem"
Codex filters out CODEX_* prefixed variables from its .env file as a security measure, so you must use SSL_CERT_FILE instead of CODEX_CA_CERTIFICATE. If you need to set CODEX_CA_CERTIFICATE, it must be in your shell environment (e.g., ~/.zshrc), not in the .env file.
The CODEX_HOME env var can override the default ~/.codex/ path if Codex is installed in a non-standard location.

How it works

Codex is a Rust CLI that uses reqwest for HTTP and rustls for TLS. It loads ~/.codex/.env via dotenvy at startup (before any threads are created), so proxy env vars are picked up automatically. The SSL_CERT_FILE env var points directly to the Cordon CA certificate; since rustls adds custom CAs on top of the system trust store (rather than replacing it), no combined CA bundle is needed, which makes this simpler than the Hermes integration.

Cordon only MITMs connections to hosts with matching routes. All other traffic passes through as a transparent CONNECT tunnel: the upstream server's real certificate is presented to the client, and no CA configuration is needed for those connections.
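The per-host decision described above amounts to a simple route match. The hostnames and route shape below come from the example config; the function itself is an illustrative sketch, not Cordon's code:

```python
# Sketch: only hosts with a matching route are MITM'd; everything else
# is tunneled untouched via CONNECT.

def connection_mode(host: str, routes: list) -> str:
    """Return 'mitm' for hosts with a matching route, else 'tunnel'."""
    for route in routes:
        if route["match"]["host"] == host:
            return "mitm"
    return "tunnel"

routes = [{"name": "openai", "match": {"host": "api.openai.com"}}]
print(connection_mode("api.openai.com", routes))  # mitm
print(connection_mode("example.com", routes))     # tunnel
```

This is why only MITM'd hosts need the Cordon CA to be trusted: tunneled connections never see a substituted certificate.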

WebSocket fallback

Codex prefers WebSocket connections (wss://) for the OpenAI realtime API. WebSocket connections through cordon’s MITM currently fail with a TLS handshake error — Codex gracefully detects this and falls back to HTTP/SSE for the remainder of the session. The credential injection works identically on both transports; only the connection upgrade fails. WebSocket support is tracked in a future release.
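The fallback can be pictured as a try-then-degrade step on the client side. The exception and function names below are illustrative stand-ins, not Codex's actual code:

```python
# Sketch: prefer WebSocket, fall back to HTTP/SSE if the upgrade fails.

class TlsHandshakeError(Exception):
    """Stand-in for the TLS failure seen on wss:// through the MITM."""

def connect_websocket(url: str):
    # Through cordon's MITM this currently fails during the handshake.
    raise TlsHandshakeError(url)

def open_transport(url: str) -> str:
    """Try the preferred WebSocket transport, degrade to SSE on failure."""
    try:
        connect_websocket(url)
        return "websocket"
    except TlsHandshakeError:
        return "sse"  # credentials are injected identically on this path

print(open_transport("wss://api.openai.com/v1/realtime"))  # sse
```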

Workflow

Once configured, the workflow is:
  1. Start cordon: cordon start (or use the background service)
  2. Start Codex as usual
  3. When Codex makes API calls to api.openai.com, cordon transparently injects credentials
  4. Codex never sees or logs real API keys
Use cordon doctor to diagnose any setup issues. It checks config validity, cert paths, trust store status, and port availability.

Troubleshooting

Codex still uses OAuth instead of API key

If Codex is connecting to chatgpt.com or ab.chatgpt.com instead of api.openai.com, it’s using the OAuth path. Run codex login, select option 3 (API key), and enter a dummy value. Check cordon’s logs — you should see MITM: injecting credentials route=openai for api.openai.com requests.

Certificate errors

If you see TLS errors, verify SSL_CERT_FILE is set correctly in ~/.codex/.env and points to an existing file:
grep SSL_CERT_FILE ~/.codex/.env
ls -la "$(grep SSL_CERT_FILE ~/.codex/.env | cut -d'"' -f2)"

Proxy not being used

Verify the env vars are in ~/.codex/.env:
cat ~/.codex/.env
If the file exists but Codex isn’t routing through the proxy, ensure cordon is running:
cordon status
curl http://127.0.0.1:6790/health

CODEX_CA_CERTIFICATE not working from .env

Codex silently filters out all CODEX_* prefixed variables from its .env file. Use SSL_CERT_FILE instead, or set CODEX_CA_CERTIFICATE in your shell profile (~/.zshrc or ~/.bashrc).

New routes not taking effect

Cordon resolves routes and secrets at startup. If you add or change routes in cordon.yaml, restart the proxy:
# If running manually
# Ctrl+C, then:
cordon start

# If running as a service
cordon service stop codex && cordon service start codex

401 Unauthorized errors

  1. Ensure the secret is stored: cordon secret set openai --config /path/to/cordon.yaml
  2. Check the route auth type — OpenAI uses type: bearer
  3. Restart cordon after adding or changing secrets (secrets are resolved at startup)