Cordon integrates with OpenAI Codex so your AI agent can make authenticated API calls without holding real credentials.
Cordon currently works with API key authentication only. If you use Codex with a ChatGPT Plus/Team subscription (OAuth login), credential injection won’t apply — Codex authenticates directly via OAuth, bypassing the proxy. Support for OAuth-based subscriptions is coming soon.

Scope

Codex setup defaults to project scope: cordon.toml lives in $CWD, and proxy env vars are written to $CWD/.codex/.env. Because Codex loads config from $CODEX_HOME, you must point it at the project directory at runtime:
export CODEX_HOME="$PWD/.codex"
codex
Add this export to your shell profile or a project .envrc (e.g. direnv) to automate it. To use a single cordon config across all projects, override to user scope:
cordon setup codex --scope user
User scope writes env vars to $CODEX_HOME/.env (or $HOME/.codex/.env if CODEX_HOME is not set) and stores config at $XDG_CONFIG_HOME/cordon/cordon.toml. See Scopes for path details and trade-offs.
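The export above can be automated with a project .envrc (a sketch, assuming direnv is installed and allowed for this project):

```shell
# .envrc — direnv evaluates this when you cd into the project,
# so Codex always loads config from the project's .codex directory
export CODEX_HOME="$PWD/.codex"
```

Run direnv allow once after creating the file so direnv will load it.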

Automated setup

The fastest way to get started:
cordon setup codex
This command:
  1. Generates CA certificates (if not already present)
  2. Creates a scaffold cordon.toml
  3. Builds a combined CA bundle (system CAs + Cordon CA) and writes proxy env vars (HTTPS_PROXY, HTTP_PROXY, https_proxy, http_proxy, SSL_CERT_FILE, REQUESTS_CA_BUNDLE, CURL_CA_BUNDLE) to Codex's .env file ($CWD/.codex/.env at the default project scope)
Your existing .env is backed up to .env.cordon.bak before any changes are made. To run cordon as a background service, run cordon service install after setup (add --scope user if you set up Codex with --scope user).

Remove the setup

cordon integration disable codex

API key setup

Configure Codex to use API key authentication (rather than OAuth) so traffic routes through cordon. Run codex login, select the API key option (option 3), and enter a dummy value:
codex login
# Select: API key
# Enter: dummy-replaced-by-cordon
Cordon strips this dummy key and injects the real one from your secret store at the network layer.
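Conceptually, the substitution happens at the proxy layer, not inside Codex. A minimal sketch of the idea (the key value and the sed rewrite are illustrative only, not cordon's implementation):

```shell
# Codex sends the dummy key it was given at login...
REQUEST_HEADER="Authorization: Bearer dummy-replaced-by-cordon"

# ...and the proxy swaps in the real key from the secret store, per request
REAL_KEY="sk-example-from-secret-store"
INJECTED=$(printf '%s\n' "$REQUEST_HEADER" | sed "s/Bearer .*/Bearer $REAL_KEY/")
echo "$INJECTED"   # Authorization: Bearer sk-example-from-secret-store
```

The dummy value never leaves the machine with a request; only the rewritten header does.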

Adding routes

The cordon route, cordon start, and cordon service commands below default to project scope. If you set up Codex with --scope user, append --scope user to each of these commands so they target ~/.config/cordon/cordon.toml instead of ./cordon.toml.
After setup, add a route for OpenAI:
cordon route add
If you chose keyring as the secret source, store the secret:
cordon secret set openai
To modify an existing route, use cordon route edit <name> — see Routes for the full format.
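The authoritative schema is in the Routes reference; as a purely hypothetical illustration of the shape a route entry might take (every field name below is an assumption, not cordon's real format):

```toml
# Hypothetical sketch only — field names are assumptions;
# see the Routes reference for the real schema.
[[routes]]
name = "openai"
host = "api.openai.com"
auth = "bearer"      # OpenAI uses bearer-token auth
secret = "openai"    # key looked up in your secret store
```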

Manual setup

If you prefer manual configuration, add these to the appropriate .env file. Replace <PORT> with the port in your cordon.toml:
HTTPS_PROXY="http://127.0.0.1:<PORT>"
HTTP_PROXY="http://127.0.0.1:<PORT>"
https_proxy="http://127.0.0.1:<PORT>"
http_proxy="http://127.0.0.1:<PORT>"
SSL_CERT_FILE="/path/to/combined-ca.pem"
REQUESTS_CA_BUNDLE="/path/to/combined-ca.pem"
CURL_CA_BUNDLE="/path/to/combined-ca.pem"
Codex filters out CODEX_* prefixed variables from its .env file as a security measure, so you must use SSL_CERT_FILE instead of CODEX_CA_CERTIFICATE. If you need to set CODEX_CA_CERTIFICATE, it must be in your shell environment (e.g., ~/.zshrc), not in the .env file.
The CODEX_HOME env var can override the default ~/.codex/ path if Codex is installed in a non-standard location.
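A combined bundle is nothing more than the system CAs and the Cordon CA concatenated into one PEM file. A sketch using stand-in files (substitute your real system bundle, e.g. /etc/ssl/certs/ca-certificates.crt on Debian-based systems, and the Cordon CA path printed by setup):

```shell
# Stand-ins for the real PEM files:
printf -- '-----SYSTEM CAS-----\n' > system-cas.pem
printf -- '-----CORDON CA-----\n' > cordon-ca.pem

# The "combined" bundle is a plain concatenation of the two
cat system-cas.pem cordon-ca.pem > combined-ca.pem
grep -c -- '-----' combined-ca.pem   # both blocks present
```

Because SSL_CERT_FILE replaces the default trust store rather than extending it, pointing it at the Cordon CA alone would break connections to everything else; the concatenation is what keeps public CAs trusted.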

How it works

Codex is a Rust CLI that uses reqwest for HTTP and rustls for TLS. It loads ~/.codex/.env via dotenvy at startup (before any threads are created), so proxy env vars are picked up automatically. Setup generates a combined CA bundle (system CAs + Cordon CA) and sets SSL_CERT_FILE to point at it. This ensures tools that replace (rather than append to) the default cert store still trust both Cordon and public CAs. REQUESTS_CA_BUNDLE and CURL_CA_BUNDLE are also set for Python and curl compatibility. Cordon only MITMs connections to hosts with matching routes. All other traffic passes through as a transparent CONNECT tunnel — the upstream server’s real certificate is presented to the client, and no CA configuration is needed for those connections.
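The MITM-vs-tunnel decision can be pictured as a simple host match (a conceptual sketch of the logic above, not cordon's actual code):

```shell
# Conceptual sketch: hosts with a matching route are intercepted,
# everything else is tunneled untouched.
route_for() {
  case "$1" in
    api.openai.com) echo "mitm" ;;    # matching route: intercept, inject credentials
    *)              echo "tunnel" ;;  # no route: transparent CONNECT passthrough
  esac
}
route_for api.openai.com   # mitm
route_for example.com      # tunnel
```

Tunneled connections present the upstream server's real certificate, which is why only routed hosts ever touch the Cordon CA.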

WebSocket fallback

Codex prefers WebSocket connections (wss://) for the OpenAI realtime API. WebSocket connections through cordon’s MITM currently fail with a TLS handshake error; Codex detects this gracefully and falls back to HTTP/SSE for the remainder of the session. Credential injection works identically on both transports; only the connection upgrade fails. WebSocket support is planned for a future release.

Workflow

Once configured, the workflow is:
  1. Start cordon: cordon start (or use the background service)
  2. Start Codex as usual
  3. When Codex makes API calls to api.openai.com, cordon transparently injects credentials
  4. Codex never sees or logs real API keys
Use cordon doctor to diagnose any setup issues. It checks config validity, cert paths, trust store status, and port availability.

Troubleshooting

If Codex is connecting to chatgpt.com or ab.chatgpt.com instead of api.openai.com, it’s using the OAuth path. Run codex login, select option 3 (API key), and enter a dummy value. Check cordon’s logs — you should see MITM: injecting credentials route=openai for api.openai.com requests.
If you see TLS errors, verify SSL_CERT_FILE is set correctly in ~/.codex/.env and points to an existing file:
grep SSL_CERT_FILE ~/.codex/.env
ls -la "$(grep SSL_CERT_FILE ~/.codex/.env | cut -d'"' -f2)"
Verify the env vars are in ~/.codex/.env:
cat ~/.codex/.env
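To check several expected variables at once, a small loop works. Shown here against a stand-in file with a placeholder port; in practice point ENV_FILE at ~/.codex/.env (project scope: $CWD/.codex/.env):

```shell
ENV_FILE=demo.env   # stand-in; substitute your real .env path
cat > "$ENV_FILE" <<'EOF'
HTTPS_PROXY="http://127.0.0.1:8080"
SSL_CERT_FILE="/tmp/combined-ca.pem"
EOF

# Report each expected variable as present or missing
for var in HTTPS_PROXY HTTP_PROXY SSL_CERT_FILE; do
  if grep -q "^$var=" "$ENV_FILE"; then echo "$var: present"; else echo "$var: MISSING"; fi
done
```

In this stand-in file HTTP_PROXY is deliberately absent, so the loop flags it; against a correctly written .env every variable should report present.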
If the file exists but Codex isn’t routing through the proxy, ensure cordon is running:
cordon status
# Replace <PORT> with the listen port from your cordon.toml
curl http://127.0.0.1:<PORT>/health
Codex silently filters out all CODEX_* prefixed variables from its .env file. Use SSL_CERT_FILE instead, or set CODEX_CA_CERTIFICATE in your shell profile (~/.zshrc or ~/.bashrc).
Cordon loads routes at startup. If you add or change routes in cordon.toml, restart the proxy (secrets are fetched per-request and don’t require a restart):
# If running manually
# Ctrl+C, then:
cordon start

# If running as a service
cordon service stop codex && cordon service start codex
If credentials aren’t being injected, check the following:
  1. Verify the secret is stored (re-running cordon secret set openai overwrites any existing value)
  2. Check the route auth type: OpenAI uses type: bearer
  3. Secrets are fetched per-request, so a changed secret is picked up on the next request with no restart needed