Update opencode default provider to infini kimi-k2.5
@@ -63,8 +63,11 @@ Each wrapper should:

 Current `opencode` default-model routing baseline:

 - `mac-5`
-  - default model: `opencode/minimax-m2.5-free`
-  - rationale: the minimax free cloud route already passed direct + ACP minimal validation and avoids the timeout-prone local `oMLX` ACP path
+  - provider label: `infini`
+  - base URL: `https://cloud.infini-ai.com/maas/coding/v1`
+  - model id: `kimi-k2.5`
+  - default model: `infini/kimi-k2.5`
+  - rationale: removed the temporary `ckimi / kimi-for-coding` route because `Kimi For Coding` is currently limited to coding-agent products and is not a reliable `opencode` default path
 - `mac-6`
   - provider label: `vllm`
   - base URL: `http://100.64.0.5:8000/v1`

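As the baseline list above implies, the effective default-model string is the provider label joined to the model id (`infini` + `kimi-k2.5` → `infini/kimi-k2.5`). The resolution step can be sketched as follows; the provider table and helper are purely illustrative assumptions, not `opencode` internals:

```python
# Hypothetical provider table mirroring the routing baseline above.
# The dict layout is an assumption for illustration, not opencode's schema.
PROVIDERS = {
    "infini": {
        "base_url": "https://cloud.infini-ai.com/maas/coding/v1",
        "models": ["kimi-k2.5"],
    },
    "vllm": {
        "base_url": "http://100.64.0.5:8000/v1",
        "models": ["Qwen3.5-27B"],
    },
}

def resolve_default(default_model: str) -> str:
    """Split a 'provider/model-id' string and return the base URL it routes to."""
    provider, _, model_id = default_model.partition("/")
    entry = PROVIDERS.get(provider)
    if entry is None or model_id not in entry["models"]:
        raise ValueError(f"unroutable default model: {default_model}")
    return entry["base_url"]

print(resolve_default("infini/kimi-k2.5"))
# -> https://cloud.infini-ai.com/maas/coding/v1
```

A removed route such as `ckimi/kimi-for-coding` would now fail this lookup, which is the desired behavior after the provider deletion.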
@@ -28,6 +28,19 @@

 - `mac-6` ACP minimal test result: `MAC6_VLLM_ACP_OK`
 - `mac-7` ACP minimal test result: `MAC7_VLLM_ACP_OK`
+
+## opencode provider/default switch (mac-5)
+
+- hotwa asked to remove the `ckimi` provider that was temporarily wired into `opencode` for `Kimi For Coding` in the previous session
+  - reason: `Kimi For Coding` is currently aimed only at coding agents (e.g. Kimi CLI / Claude Code / Roo Code / Kilo Code) and does not support this `opencode` usage path
+- `~/.config/opencode/opencode.json` has been changed to:
+  - delete the `ckimi` provider
+  - new default provider: `infini`
+  - new default model: `infini/kimi-k2.5`
+  - `baseURL`: `https://cloud.infini-ai.com/maas/coding/v1`
+  - `apiKey`: `{env:WARP_INFINI_API_KEY}`
+- minimal validation with `opencode models infini` returned: `infini/kimi-k2.5`
+- conclusion: the current `opencode` default-model baseline on `mac-5` has changed from `opencode/minimax-m2.5-free` to `infini/kimi-k2.5`
+
+## Notes
+
+- this update targets the `opencode` default-model policy only; it does not deprecate the existing subagent/worker local-model topology docs

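The `opencode.json` edit described above could look roughly like the fragment below. The nesting and field names are assumptions pieced together from the keys this note mentions (`baseURL`, `apiKey`, a default model, a provider label), not a verified `opencode` schema; the `ckimi` block would simply be deleted from the same file:

```json
{
  "model": "infini/kimi-k2.5",
  "provider": {
    "infini": {
      "options": {
        "baseURL": "https://cloud.infini-ai.com/maas/coding/v1",
        "apiKey": "{env:WARP_INFINI_API_KEY}"
      },
      "models": {
        "kimi-k2.5": {}
      }
    }
  }
}
```

Keeping the API key as an `{env:...}` reference means the secret stays in `WARP_INFINI_API_KEY` rather than in the config file itself.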
@@ -93,19 +93,19 @@ The cluster no longer uses a single "default to each machine's own local model"

 Current node-specific default policy:

-- `mac-5`: default `opencode` model is `opencode/minimax-m2.5-free`
+- `mac-5`: default `opencode` model is `infini/kimi-k2.5` via `https://cloud.infini-ai.com/maas/coding/v1`
 - `mac-6`: default `opencode` model is `vllm/Qwen3.5-27B` via `http://100.64.0.5:8000/v1`
 - `mac-7`: default `opencode` model is `vllm/Qwen3.5-27B` via `http://100.64.0.5:8000/v1`

 Operational meaning:

-- `mac-5` prefers the already-validated free cloud minimax route for daily ACP stability.
+- `mac-5` prefers the Infini coding endpoint with `kimi-k2.5`; the earlier `ckimi / kimi-for-coding` stopgap was removed because `Kimi For Coding` is currently targeted at coding-agent products rather than this `opencode` path.
 - `mac-6` and `mac-7` prefer the shared local vLLM endpoint instead of their previous per-node local `oMLX` default for `opencode` ACP.
 - This rule is specific to current `opencode` defaults; it does not invalidate separate worker/subagent topology docs.

 Observed validation status:

-- `mac-5`: direct `opencode` and ACP minimal tests succeeded with `opencode/minimax-m2.5-free`
+- `mac-5`: direct `opencode` config validation succeeded with `infini/kimi-k2.5` (`opencode models infini` returned `infini/kimi-k2.5`)
 - `mac-6`: ACP minimal test succeeded with `vllm/Qwen3.5-27B`
 - `mac-7`: ACP minimal test succeeded with `vllm/Qwen3.5-27B`

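The per-node policy above can be captured as a small routing table, e.g. for scripts that need to know which endpoint each node's `opencode` default hits. The values are taken from the policy list; the helper itself is a hypothetical convenience, not part of `opencode`:

```python
# Node -> (default model, base URL) for the current opencode defaults.
# Values mirror the node-specific policy list above; the helper is hypothetical.
OPENCODE_DEFAULTS = {
    "mac-5": ("infini/kimi-k2.5", "https://cloud.infini-ai.com/maas/coding/v1"),
    "mac-6": ("vllm/Qwen3.5-27B", "http://100.64.0.5:8000/v1"),
    "mac-7": ("vllm/Qwen3.5-27B", "http://100.64.0.5:8000/v1"),
}

def default_for(node: str) -> tuple[str, str]:
    """Return (model, base_url) for a node; raises KeyError for unknown nodes."""
    return OPENCODE_DEFAULTS[node]

# mac-6 and mac-7 intentionally share the same local vLLM endpoint:
assert default_for("mac-6")[1] == default_for("mac-7")[1]
```

A table like this also makes the policy easy to assert in CI or in ACP smoke tests, instead of re-reading each node's config file.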