Automated setup
The fastest way to get started is the automated setup, which:

- Generates CA certificates (if not already present)
- Creates a scaffold `cordon.yaml`
- Writes proxy env vars to Codex's `~/.codex/.env` (`HTTPS_PROXY`, `HTTP_PROXY`, `SSL_CERT_FILE`)
`.env` is backed up to `.env.cordon.bak` before any changes are made.
Global setup with background service
To install cordon as a background service that starts automatically:

Remove the setup:
API key setup
Codex needs to be configured to use API key authentication (rather than OAuth) so traffic routes through cordon. Run `codex login`, select the API key option (option 3), and enter a dummy value:
Adding routes
After setup, edit `cordon.yaml` to add a route for OpenAI:
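The route definition itself isn't reproduced here, and the exact schema depends on your cordon version. As a hedged sketch, a route might look something like the following; the `routes`, `host`, `auth`, and `secret` field names are illustrative assumptions, while the `openai` name, the `api.openai.com` host, and `type: bearer` come from elsewhere in these docs:

```yaml
# Hypothetical cordon.yaml sketch - field names are illustrative, not
# confirmed by cordon's schema; check your version's reference docs.
routes:
  - name: openai
    host: api.openai.com
    auth:
      type: bearer       # OpenAI uses bearer auth (see Troubleshooting)
      secret: openai     # stored via `cordon secret set openai`
```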
Manual setup
If you prefer manual configuration, add these to `~/.codex/.env`:
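The original entries aren't reproduced here; based on the variable names used throughout these docs, they would look roughly like this (the listen address, port, and certificate path are illustrative placeholders for your actual cordon setup):

```shell
# ~/.codex/.env - placeholder values; substitute the address/port your
# cordon proxy listens on and the path to its generated CA certificate.
HTTPS_PROXY=http://127.0.0.1:8080
HTTP_PROXY=http://127.0.0.1:8080
SSL_CERT_FILE=/path/to/cordon-ca.pem
```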
Codex filters out `CODEX_*`-prefixed variables from its `.env` file as a security measure, so you must use `SSL_CERT_FILE` instead of `CODEX_CA_CERTIFICATE`. If you need to set `CODEX_CA_CERTIFICATE`, it must be in your shell environment (e.g., `~/.zshrc`), not in the `.env` file.

The `CODEX_HOME` env var can override the default `~/.codex/` path if Codex is installed in a non-standard location.
How it works
Codex is a Rust CLI that uses `reqwest` for HTTP and `rustls` for TLS. It loads `~/.codex/.env` via `dotenvy` at startup (before any threads are created), so proxy env vars are picked up automatically.

The `SSL_CERT_FILE` env var points directly to the Cordon CA certificate. Since `rustls` adds custom CAs on top of the system trust store (rather than replacing it), no combined CA bundle is needed; this is simpler than the Hermes integration.
Cordon only MITMs connections to hosts with matching routes. All other traffic passes through as a transparent CONNECT tunnel — the upstream server’s real certificate is presented to the client, and no CA configuration is needed for those connections.
WebSocket fallback
Codex prefers WebSocket connections (`wss://`) for the OpenAI realtime API. WebSocket connections through cordon's MITM currently fail with a TLS handshake error; Codex gracefully detects this and falls back to HTTP/SSE for the remainder of the session. Credential injection works identically on both transports; only the connection upgrade fails. WebSocket support is tracked as future work.
Workflow
Once configured, the workflow is:

- Start cordon: `cordon start` (or use the background service)
- Start Codex as usual
- When Codex makes API calls to `api.openai.com`, cordon transparently injects credentials
- Codex never sees or logs real API keys
Troubleshooting
Codex still uses OAuth instead of API key
If Codex is connecting to `chatgpt.com` or `ab.chatgpt.com` instead of `api.openai.com`, it's using the OAuth path. Run `codex login`, select option 3 (API key), and enter a dummy value. Check cordon's logs: you should see `MITM: injecting credentials route=openai` for `api.openai.com` requests.
Certificate errors
If you see TLS errors, verify `SSL_CERT_FILE` is set correctly in `~/.codex/.env` and points to an existing file:
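One way to check (the helper below is our own illustration, not a cordon command):

```shell
# check_ssl_cert_file ENV_FILE: verify that SSL_CERT_FILE in the given
# env file is set and points to an existing file.
check_ssl_cert_file() {
  cert=$(sed -n 's/^SSL_CERT_FILE=//p' "$1")
  if [ -n "$cert" ] && [ -f "$cert" ]; then
    echo "ok: $cert"
  else
    echo "problem: SSL_CERT_FILE='$cert'" >&2
    return 1
  fi
}
```

Run it as `check_ssl_cert_file ~/.codex/.env`.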
Proxy not being used
Verify the env vars are in `~/.codex/.env`:
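For example, with a small helper of our own (not a cordon command) that lists the proxy-related entries in an env file:

```shell
# show_proxy_vars ENV_FILE: print the proxy-related entries; exits
# non-zero and prints nothing if none are present.
show_proxy_vars() {
  grep -E '^(HTTPS_PROXY|HTTP_PROXY|SSL_CERT_FILE)=' "$1"
}
```

Run it as `show_proxy_vars ~/.codex/.env`; all three variables should appear.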
CODEX_CA_CERTIFICATE not working from .env
Codex silently filters out all `CODEX_*`-prefixed variables from its `.env` file. Use `SSL_CERT_FILE` instead, or set `CODEX_CA_CERTIFICATE` in your shell profile (`~/.zshrc` or `~/.bashrc`).
New routes not taking effect
Cordon resolves routes and secrets at startup. If you add or change routes in `cordon.yaml`, restart the proxy:
401 Unauthorized errors
- Verify the secret is stored: `cordon secret set openai --config /path/to/cordon.yaml`
- Check the route auth type: OpenAI uses `type: bearer`
- Restart cordon after adding or changing secrets (secrets are resolved at startup)