mirror of
https://github.com/Youzini-afk/ST-Bionic-Memory-Ecology.git
synced 2026-05-15 22:30:38 +08:00
fix(llm): switch to max_tokens to avoid conflicting with the max_tokens injected by the ST proxy
ST's /api/backends/chat-completions/generate endpoint injects max_tokens from the preset configuration, which conflicts with the max_completion_tokens the extension passes on its own, causing the upstream API to return 400. Switching to max_tokens lets ST handle the parameter uniformly and eliminates the conflict. Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@@ -1441,7 +1441,7 @@ async function callDedicatedOpenAICompatible(
         model: config.model,
         messages,
         temperature: filteredGeneration.temperature ?? 1,
-        max_completion_tokens: resolvedCompletionTokens,
+        max_tokens: resolvedCompletionTokens,
         stream: filteredGeneration.stream ?? false,
         frequency_penalty: filteredGeneration.frequency_penalty ?? 0,
         presence_penalty: filteredGeneration.presence_penalty ?? 0,
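The patched payload construction can be sketched as follows. This is a minimal illustration, not the extension's actual code: the helper name `buildPayload` and its parameters are hypothetical, but the field defaults mirror the diff hunk. The point of the fix is that the request body carries only `max_tokens`, so it no longer collides with the `max_tokens` that ST's /api/backends/chat-completions/generate endpoint injects from the preset (some OpenAI-compatible backends reject a request carrying both token-limit keys with HTTP 400).

```javascript
// Hypothetical sketch of the request body after the fix.
// buildPayload and its arguments are illustrative names, not from the source.
function buildPayload(config, messages, filteredGeneration, resolvedCompletionTokens) {
  return {
    model: config.model,
    messages,
    temperature: filteredGeneration.temperature ?? 1,
    // Use max_tokens (not max_completion_tokens) so the value does not
    // conflict with the max_tokens ST injects from the preset config.
    max_tokens: resolvedCompletionTokens,
    stream: filteredGeneration.stream ?? false,
    frequency_penalty: filteredGeneration.frequency_penalty ?? 0,
    presence_penalty: filteredGeneration.presence_penalty ?? 0,
  };
}
```

With this shape, ST owns the token-limit key end to end and deduplication happens in one place instead of two.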