Fix the max token limit of the Tongyi-Qianwen text-embedding-v3 model to 8k (#2118)

### What problem does this PR solve?

Corrects the registered max token limit of the Tongyi-Qianwen text-embedding-v3 model from 2,048 (2K) to 8,192 (8K), matching the input limit the model actually supports.

Closes #2117

### Type of change

- [x] Bug Fix (non-breaking change which fixes an issue)
- [ ] New Feature (non-breaking change which adds functionality)
- [ ] Documentation Update
- [ ] Refactoring
- [ ] Performance Improvement
- [ ] Other (please describe):
Commit: e9f5468a49 (parent a2b4d0190c)
Author: zhuhao
Date: 2024-08-28 10:14:19 +08:00 (committed by GitHub)

@@ -106,8 +106,8 @@
         },
         {
             "llm_name": "text-embedding-v3",
-            "tags": "TEXT EMBEDDING,2K",
-            "max_tokens": 2048,
+            "tags": "TEXT EMBEDDING,8K",
+            "max_tokens": 8192,
             "model_type": "embedding"
         },
         {
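For context on where this value matters: `max_tokens` is the per-input token budget callers should respect before sending text to the embedding endpoint, so the old 2,048 value would truncate input far earlier than necessary. The snippet below is a minimal, hypothetical sketch (not RAGFlow's actual embedding wrapper) of how the corrected 8,192-token limit might be enforced; the `dashscope.TextEmbedding.call` usage and response shape follow the DashScope Python SDK as documented, while `truncate_to_limit`, `MAX_TOKENS`, and the characters-per-token heuristic are illustrative placeholders.

```python
# Sketch only: cap input to the corrected 8,192-token limit before calling
# Tongyi-Qianwen text-embedding-v3 via the DashScope SDK.
import dashscope  # assumes DASHSCOPE_API_KEY is set in the environment

MAX_TOKENS = 8192       # corrected limit from this PR (was 2048)
CHARS_PER_TOKEN = 3     # rough heuristic; a real tokenizer should be used instead


def truncate_to_limit(text: str, max_tokens: int = MAX_TOKENS) -> str:
    """Crude length guard: trim text to an estimated token budget."""
    return text[: max_tokens * CHARS_PER_TOKEN]


def embed(texts: list[str]) -> list[list[float]]:
    """Embed a batch of texts with text-embedding-v3, respecting the limit."""
    resp = dashscope.TextEmbedding.call(
        model="text-embedding-v3",
        input=[truncate_to_limit(t) for t in texts],
    )
    # The response exposes one embedding per input, keyed by text_index.
    return [item["embedding"] for item in resp.output["embeddings"]]
```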