mirror of
https://git.mirrors.martin98.com/https://github.com/langgenius/dify.git
synced 2025-08-12 18:59:07 +08:00
fix(model_runtime): fix wrong max_tokens for Claude 3.5 Haiku on Amazon Bedrock (#10286)
This commit is contained in: parent 249b897872, commit cb245b5435
@@ -16,9 +16,9 @@ parameter_rules:
     use_template: max_tokens
     required: true
     type: int
-    default: 4096
+    default: 8192
     min: 1
-    max: 4096
+    max: 8192
     help:
       zh_Hans: 停止前生成的最大令牌数。请注意,Anthropic Claude 模型可能会在达到 max_tokens 的值之前停止生成令牌。不同的 Anthropic Claude 模型对此参数具有不同的最大值。
       en_US: The maximum number of tokens to generate before stopping. Note that Anthropic Claude models might stop generating tokens before reaching the value of max_tokens. Different Anthropic Claude models have different maximum values for this parameter.
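Claude 3.5 Haiku on Amazon Bedrock supports larger completions than the previous 4096-token cap, so both `default` and `max` are raised to 8192. A minimal sketch of how a parameter rule like this could be applied when resolving a request value (the rule dict mirrors the YAML above; the function name is illustrative, not Dify's actual API):

```python
# Hypothetical helper: resolve a requested max_tokens value against the
# parameter rule from the YAML (default 8192, min 1, max 8192).
MAX_TOKENS_RULE = {"default": 8192, "min": 1, "max": 8192}

def resolve_max_tokens(requested=None, rule=MAX_TOKENS_RULE):
    """Return the value to send to the model: the default when unset,
    otherwise the requested value clamped into [min, max]."""
    if requested is None:
        return rule["default"]
    return max(rule["min"], min(requested, rule["max"]))
```

With the old rule, a request for 8192 tokens would have been silently clamped to 4096; with the corrected cap it passes through unchanged.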