Update opencode default provider to infini kimi-k2.5

This commit is contained in:
hotwa
2026-03-18 15:43:07 +08:00
parent 4e1d0707ba
commit 537ee62363
3 changed files with 21 additions and 5 deletions


@@ -93,19 +93,19 @@ The cluster no longer uses a single "default to each machine's own local model"
 Current node-specific default policy:
-- `mac-5`: default `opencode` model is `opencode/minimax-m2.5-free`
+- `mac-5`: default `opencode` model is `infini/kimi-k2.5` via `https://cloud.infini-ai.com/maas/coding/v1`
 - `mac-6`: default `opencode` model is `vllm/Qwen3.5-27B` via `http://100.64.0.5:8000/v1`
 - `mac-7`: default `opencode` model is `vllm/Qwen3.5-27B` via `http://100.64.0.5:8000/v1`
 Operational meaning:
-- `mac-5` prefers the already-validated free cloud minimax route for daily ACP stability.
+- `mac-5` prefers the Infini coding endpoint with `kimi-k2.5`; the earlier `ckimi / kimi-for-coding` stopgap was removed because `Kimi For Coding` currently targets coding-agent products rather than this `opencode` path.
 - `mac-6` and `mac-7` prefer the shared local vLLM endpoint instead of their previous per-node local `oMLX` default for `opencode` ACP.
 - This rule is specific to current `opencode` defaults; it does not invalidate separate worker/subagent topology docs.
 Observed validation status:
-- `mac-5`: direct `opencode` and ACP minimal tests succeeded with `opencode/minimax-m2.5-free`
+- `mac-5`: direct `opencode` config validation succeeded with `infini/kimi-k2.5` (`opencode models infini` returned `infini/kimi-k2.5`)
 - `mac-6`: ACP minimal test succeeded with `vllm/Qwen3.5-27B`
 - `mac-7`: ACP minimal test succeeded with `vllm/Qwen3.5-27B`
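For reference, the `mac-5` default described in this diff could be expressed in an `opencode` config roughly as below. This is a sketch, not the repo's actual file: the provider shape (`npm`, `options.baseURL`, `models`) follows opencode's OpenAI-compatible custom-provider convention, and the `INFINI_API_KEY` environment-variable name is an assumption.

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "infini": {
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        "baseURL": "https://cloud.infini-ai.com/maas/coding/v1",
        "apiKey": "{env:INFINI_API_KEY}"
      },
      "models": {
        "kimi-k2.5": {}
      }
    }
  },
  "model": "infini/kimi-k2.5"
}
```

With a config of this shape in place, `opencode models infini` listing `infini/kimi-k2.5` is the validation signal the status section above records; `mac-6`/`mac-7` would differ only in pointing an analogous `vllm` provider entry at `http://100.64.0.5:8000/v1`.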