

Cordon integrates with Hermes Agent so your AI agent can make authenticated API calls without holding real credentials.

Scope

Hermes setup only supports user scope — Hermes operates across projects, so cordon setup hermes --scope project is rejected by the CLI. User scope stores config at $XDG_CONFIG_HOME/cordon/cordon.toml and writes proxy env vars to $HERMES_HOME/.env (or ~/.hermes/.env). A single cordon instance handles credential injection regardless of which project Hermes is working in. See Scopes for path details and trade-offs.
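The user-scope path resolution above can be sketched as follows. This is illustrative, not cordon's implementation; the helper names are invented, and the fallback to ~/.config when XDG_CONFIG_HOME is unset is the standard XDG default.

```python
import os
from pathlib import Path

def cordon_config_path() -> Path:
    # User-scope config lives at $XDG_CONFIG_HOME/cordon/cordon.toml;
    # per the XDG spec, an unset XDG_CONFIG_HOME defaults to ~/.config.
    base = os.environ.get("XDG_CONFIG_HOME") or str(Path.home() / ".config")
    return Path(base) / "cordon" / "cordon.toml"

def hermes_env_path() -> Path:
    # Proxy env vars are written to $HERMES_HOME/.env, defaulting to
    # ~/.hermes/.env when HERMES_HOME is not set.
    home = os.environ.get("HERMES_HOME") or str(Path.home() / ".hermes")
    return Path(home) / ".env"
```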

Automated setup

The fastest way to get started:
cordon setup hermes
This:
  1. Generates CA certificates (if not already present)
  2. Creates a scaffold cordon.toml
  3. Writes the standard proxy and CA env vars from cordon env --scope user to Hermes’s ~/.hermes/.env
  4. Installs a cordon agent skill to ~/.hermes/skills/devops/cordon/SKILL.md
Your existing .env is backed up to .env.cordon.bak before any changes are made. To run cordon as a background service, run cordon service install --scope user after setup.
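The backup step amounts to the following (a minimal sketch of the behavior described above, not cordon's actual code; the function name is invented):

```python
import shutil
from pathlib import Path

def backup_env(env_path: str) -> None:
    # Copy an existing .env to .env.cordon.bak in the same directory
    # before any changes are made; do nothing if no .env exists yet.
    env = Path(env_path)
    if env.exists():
        shutil.copy2(env, env.with_name(".env.cordon.bak"))
```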

Remove the setup

cordon integration disable hermes

Adding routes

After setup, add a route for your LLM provider with cordon route add --scope user and, if using the keyring secret source, store the credential with cordon secret set. Hermes is user-scope only, so pass --scope user to cordon route, cordon start, and cordon service commands.
Anthropic uses type: header with header_name: x-api-key and no scheme. Using an Authorization header will result in 401 errors.
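An Anthropic route entry might look like the fragment below. Only type, header_name, and the absence of scheme come from the text above; the table name, host key, and host value are assumptions, not cordon's documented schema.

```toml
# Hypothetical cordon.toml route entry -- verify key names against
# your generated scaffold before relying on this.
[[routes]]
host = "api.anthropic.com"
type = "header"
header_name = "x-api-key"
# No scheme: the key is sent bare, not as "Authorization: Bearer <key>".
```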

Provider auto-detection

Hermes uses env vars to auto-detect which LLM provider to use. Since cordon injects the real API key at the network layer, Hermes still needs a dummy key to select the right provider. Add a placeholder to ~/.hermes/.env:
# Hermes sees this and selects the Anthropic provider.
# Cordon strips it and injects the real key from the keychain.
ANTHROPIC_API_KEY=dummy-replaced-by-cordon
Without this, Hermes won’t know which provider to use and will fail to make API calls even though cordon has the real credentials ready to inject.

Manual setup

Prefer cordon setup hermes for Hermes configuration. It writes the proxy and CA settings, generates the combined CA bundle, and backs up the existing .env before changing it. If setup cannot cover your environment, copy the values from cordon env --scope user to ~/.hermes/.env:
HTTPS_PROXY="http://127.0.0.1:<PORT>"
SSL_CERT_FILE="/path/to/combined-ca.pem"
The HERMES_HOME env var can override the default ~/.hermes/ path if Hermes is installed in a non-standard location. See Any tool (generic) for the full env-var contract.

How it works

Hermes uses Python’s httpx library for HTTP, which honors HTTPS_PROXY by default (trust_env=True). The OpenAI, Firecrawl, and Exa SDKs all use httpx or requests internally, and Hermes’s own Tavily client uses httpx directly, so all HTTP traffic routes through cordon automatically. No code changes or monkeypatching required. For Cordon’s matched-route TLS behavior and certificate troubleshooting, see TLS.
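The same environment-driven proxy discovery exists in the standard library, which makes it easy to verify outside Hermes; httpx performs the equivalent lookup when trust_env=True (its default). The port below is an arbitrary placeholder.

```python
import os
import urllib.request

# Standard Python HTTP stacks discover the proxy from the environment;
# setting HTTPS_PROXY is enough, with no client-side configuration.
os.environ["HTTPS_PROXY"] = "http://127.0.0.1:9999"
proxies = urllib.request.getproxies()
assert proxies["https"] == "http://127.0.0.1:9999"
```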

Sandboxed environments

Hermes supports several sandboxed execution backends (TERMINAL_ENV): docker, singularity, modal, daytona, and ssh. These environments run in isolated network namespaces where 127.0.0.1 refers to the container’s or remote host’s loopback, not the developer’s machine. The cordon proxy running on the host is not reachable from inside these sandboxes without network bridging. For local execution (TERMINAL_ENV=local), cordon works out of the box. For sandboxed backends, network reachability varies by backend and has not been fully tested. Docker may reach the host via host.docker.internal on macOS/Windows, but other backends (Modal, Daytona, SSH) have their own networking models. A remote cordon proxy with network-accessible binding would be needed for full support (not yet available).
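If you experiment with the Docker backend anyway, one untested approach is to point the sandbox at the host's proxy. This is speculative: the hostname below only resolves under Docker Desktop on macOS/Windows, and TLS trust inside the container still needs the CA bundle mounted.

```
# Untested: replaces the 127.0.0.1 value when set inside the container.
HTTPS_PROXY="http://host.docker.internal:<PORT>"
```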

Workflow

Once configured, the workflow is:
  1. Start cordon: cordon start --scope user (or use the background service)
  2. Start Hermes as usual
  3. When Hermes makes API calls to configured hosts, cordon transparently injects credentials
  4. Hermes never sees or logs real API keys
Use cordon doctor to diagnose any setup issues. It checks config validity, cert paths, trust store status, and port availability.

Troubleshooting

  1. Wrong header config: Anthropic uses type: header with header_name: x-api-key and no scheme. Check your cordon.toml route configuration.
  2. Missing dummy key: Hermes won’t select a provider without its API key env var set. Add ANTHROPIC_API_KEY=dummy-replaced-by-cordon to ~/.hermes/.env.
  3. Check the secret source: HTTP route secrets are fetched per-request, so a changed secret is picked up on the next request automatically; no restart is needed.
  4. Missing secret: make sure the credential was stored with cordon secret set anthropic
  5. TLS or certificate errors: verify ~/.hermes/.env includes the CA bundle values from cordon env --scope user, then follow TLS troubleshooting
  6. Proxy not applied: confirm the env vars are present with cat ~/.hermes/.env. Hermes loads this file at startup via load_hermes_dotenv(). If the file exists but Hermes isn’t routing through the proxy, run the shared “proxy not running” checks with --scope user
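A dotenv loader of this kind boils down to parsing KEY=VALUE lines. The sketch below shows the general shape; it is not load_hermes_dotenv()'s actual implementation.

```python
from pathlib import Path

def load_env_file(path: str) -> dict:
    # Parse KEY=VALUE lines, skipping blanks and # comments, and strip
    # surrounding double quotes -- the general shape of a dotenv loader.
    env = {}
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip().strip('"')
    return env
```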
If Hermes can’t determine which LLM provider to use, it’s likely missing the dummy API key env var. Add the appropriate key to ~/.hermes/.env:
ANTHROPIC_API_KEY=dummy-replaced-by-cordon    # for Anthropic
OPENAI_API_KEY=dummy-replaced-by-cordon       # for OpenAI
Restart Cordon after adding or editing route definitions. See Routes: route changes and secret rotation.