ocm: Add Responses WebSocket API proxy and fix client config docs

Support the OpenAI Responses WebSocket API (`wss://.../v1/responses`)
for bidirectional frame proxying with usage tracking.
Fix Codex CLI client config examples to use profiles and correct flags.

Update openai-go v3.24.0 → v3.26.0.
世界
2026-03-10 20:56:07 +08:00
parent 0b04528803
commit e0be8743f6
6 changed files with 293 additions and 15 deletions
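The bidirectional frame relay described in the message might look roughly like the sketch below. This is a hand-written illustration, not the commit's code: it assumes `github.com/gorilla/websocket`, and the handler name, the upstream dialing, and the usage-event shape are all assumptions.

```go
package main

import (
	"encoding/json"
	"log"
	"net/http"

	"github.com/gorilla/websocket"
)

var upgrader = websocket.Upgrader{
	CheckOrigin: func(r *http.Request) bool { return true },
}

// proxyResponses (hypothetical name) upgrades the client connection and
// relays frames in both directions, peeking at server frames for usage data.
func proxyResponses(w http.ResponseWriter, r *http.Request) {
	client, err := upgrader.Upgrade(w, r, nil)
	if err != nil {
		return
	}
	defer client.Close()

	// Dial the upstream Responses endpoint; auth-header forwarding omitted.
	upstream, _, err := websocket.DefaultDialer.Dial("wss://api.openai.com/v1/responses", nil)
	if err != nil {
		return
	}
	defer upstream.Close()

	errc := make(chan error, 2)
	relay := func(dst, src *websocket.Conn, onFrame func([]byte)) {
		for {
			mt, msg, err := src.ReadMessage()
			if err != nil {
				errc <- err
				return
			}
			if onFrame != nil {
				onFrame(msg)
			}
			if err := dst.WriteMessage(mt, msg); err != nil {
				errc <- err
				return
			}
		}
	}
	go relay(upstream, client, nil)         // client -> upstream, unchanged
	go relay(client, upstream, recordUsage) // upstream -> client, tracked
	<-errc                                  // first error (or normal close) tears down both sides
}

// recordUsage logs token usage from completed-response events; the event
// shape here is an assumption about the Responses stream.
func recordUsage(msg []byte) {
	var ev struct {
		Type     string `json:"type"`
		Response struct {
			Usage *struct {
				InputTokens  int `json:"input_tokens"`
				OutputTokens int `json:"output_tokens"`
			} `json:"usage"`
		} `json:"response"`
	}
	if json.Unmarshal(msg, &ev) == nil && ev.Type == "response.completed" && ev.Response.Usage != nil {
		log.Printf("usage: input=%d output=%d", ev.Response.Usage.InputTokens, ev.Response.Usage.OutputTokens)
	}
}

func main() {
	http.HandleFunc("/v1/responses", proxyResponses)
	log.Fatal(http.ListenAndServe("127.0.0.1:8080", nil))
}
```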


@@ -114,14 +114,18 @@ Add to `~/.codex/config.toml`:
```toml
[model_providers.ocm]
name = "OCM Proxy"
base_url = "http://127.0.0.1:8080/v1"
wire_api = "responses"
requires_openai_auth = false
supports_websockets = true

[profiles.ocm]
model_provider = "ocm"
# model = "gpt-5.4" # if the latest model is not yet publicly released
# model_reasoning_effort = "xhigh"
```
Then run:
```bash
codex --profile ocm
```
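Before pointing Codex at the proxy, you can sanity-check that the WebSocket endpoint accepts connections. A minimal probe, assuming `github.com/gorilla/websocket` and assuming the proxy serves upgrades on the same `/v1/responses` path:

```go
package main

import (
	"log"

	"github.com/gorilla/websocket"
)

func main() {
	// Dial the local proxy's Responses WebSocket endpoint (path assumed).
	conn, _, err := websocket.DefaultDialer.Dial("ws://127.0.0.1:8080/v1/responses", nil)
	if err != nil {
		log.Fatalf("handshake failed: %v", err)
	}
	defer conn.Close()
	log.Println("WebSocket handshake succeeded")
}
```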
### Example with Authentication
@@ -159,13 +163,17 @@ Add to `~/.codex/config.toml`:
```toml
[model_providers.ocm]
name = "OCM Proxy"
base_url = "http://127.0.0.1:8080/v1"
wire_api = "responses"
requires_openai_auth = false
supports_websockets = true
experimental_bearer_token = "sk-alice-secret-token"

[profiles.ocm]
model_provider = "ocm"
# model = "gpt-5.4" # if the latest model is not yet publicly released
# model_reasoning_effort = "xhigh"
```
Then run:
```bash
codex --profile ocm
```
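The `experimental_bearer_token` value is what the client sends; on the proxy side, enforcement presumably comes down to comparing the `Authorization` header against the configured per-user tokens, which also covers the WebSocket path since the upgrade request carries ordinary HTTP headers. A hypothetical sketch (the middleware name, token map, and attribution header are illustrative, not the commit's code):

```go
package main

import (
	"crypto/subtle"
	"net/http"
	"strings"
)

// requireBearer (hypothetical) rejects requests whose Authorization
// header does not carry one of the configured tokens.
func requireBearer(tokens map[string]string, next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		raw := strings.TrimPrefix(r.Header.Get("Authorization"), "Bearer ")
		for user, tok := range tokens {
			// Constant-time comparison avoids leaking token prefixes via timing.
			if subtle.ConstantTimeCompare([]byte(raw), []byte(tok)) == 1 {
				r.Header.Set("X-OCM-User", user) // hypothetical attribution header
				next.ServeHTTP(w, r)
				return
			}
		}
		http.Error(w, "unauthorized", http.StatusUnauthorized)
	})
}
```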