Merge branch 'refs/heads/main' into feat/workflow-parallel-support

# Conflicts:
#	api/core/workflow/nodes/llm/llm_node.py
#	api/core/workflow/nodes/question_classifier/question_classifier_node.py
This commit is contained in:
takatost 2024-08-23 00:32:28 +08:00
commit 5b22d8f8b2
48 changed files with 381 additions and 543 deletions

CONTRIBUTING_VI.md (new file, 156 lines)

@@ -0,0 +1,156 @@
It's awesome that you want to contribute to Dify! We can't wait to see what you'll build. As a startup with limited headcount and funding, we have grand ambitions to design the most intuitive workflow for building and managing LLM applications. Any help from the community truly counts.

We need to stay nimble and ship fast, but we also want to make sure contributors like you get as smooth a contributing experience as possible. We've put together this contribution guide to help you get familiar with the codebase and how we work with contributors, so you can quickly jump to the fun part.

This guide, like Dify itself, is a constant work in progress. We appreciate your understanding if it sometimes lags behind the actual project, and we always welcome feedback that helps us improve.

In terms of licensing, please take a minute to read our short [License and Contributor Agreement](./LICENSE). The community also adheres to the [code of conduct](https://github.com/langgenius/.github/blob/main/CODE_OF_CONDUCT.md).
## Before you jump in

[Find](https://github.com/langgenius/dify/issues?q=is:issue+is:closed) an existing issue, or [open](https://github.com/langgenius/dify/issues/new/choose) a new one. We categorize issues into 2 types:

### Feature requests:

* If you're opening a new feature request, we'd like you to explain what the proposed feature achieves and include as much context as possible. [@perzeusss](https://github.com/perzeuss) has made a very helpful [Feature Request Assistant](https://udify.app/chat/MK2kVSnw1gakVwMX) for drafting out your needs. Feel free to give it a try.

* If you want to pick up an issue from the existing list, simply drop a comment under it saying you'll take it.

A team member working in the related area will be looped in. If all looks good, they will give the go-ahead for you to start coding. We ask that you hold off on working on the feature until then, so none of your work goes to waste should we propose changes.

Depending on which area the proposed feature falls under, you might talk to different team members. Here's a rundown of the areas each team member is currently working on:

| Member | Scope |
| ------------------------------------------------------------ | ---------------------------------------------------- |
| [@yeuoly](https://github.com/Yeuoly) | Architecting Agents |
| [@jyong](https://github.com/JohnJyong) | RAG pipeline design |
| [@GarfieldDai](https://github.com/GarfieldDai) | Building workflow orchestrations |
| [@iamjoel](https://github.com/iamjoel) & [@zxhlyh](https://github.com/zxhlyh) | Making our frontend easy to use |
| [@guchenhe](https://github.com/guchenhe) & [@crazywoola](https://github.com/crazywoola) | Developer experience, point of contact for any questions |
| [@takatost](https://github.com/takatost) | Overall product direction and architecture |

How we prioritize:

| Feature Type | Priority |
| ------------------------------------------------------------ | -------------- |
| High-priority features labeled by a team member | High Priority |
| Popular feature requests from our [community feedback board](https://github.com/langgenius/dify/discussions/categories/feedbacks) | Medium Priority |
| Non-core features and minor enhancements | Low Priority |
| Valuable but not immediate | Future-Feature |
### Anything else (e.g. bug reports, performance optimizations, typo corrections):

* Start coding right away.

How we prioritize:

| Issue Type | Priority |
| ------------------------------------------------------------ | -------------- |
| Bugs in core functions (cannot sign in, applications not working, security loopholes) | Critical |
| Non-critical bugs, performance boosts | Medium Priority |
| Minor fixes (typos, confusing but working UI) | Low Priority |
## Installing

Here are the steps to set up Dify for development:

### 1. Fork this repository

### 2. Clone the repo

Clone the forked repository from your terminal:

```
git clone git@github.com:<github_username>/dify.git
```

### 3. Verify dependencies

Dify requires the following dependencies to build; make sure they're installed on your system:

- [Docker](https://www.docker.com/)
- [Docker Compose](https://docs.docker.com/compose/install/)
- [Node.js v18.x (LTS)](http://nodejs.org)
- [npm](https://www.npmjs.com/) version 8.x.x or [Yarn](https://yarnpkg.com/)
- [Python](https://www.python.org/) version 3.10.x

### 4. Installations

Dify is composed of a backend and a frontend. Navigate to the backend directory with `cd api/`, then follow the [Backend README](api/README.md) to install it. In a separate terminal, navigate to the frontend directory with `cd web/`, then follow the [Frontend README](web/README.md) to install.

Check the [installation FAQ](https://docs.dify.ai/learn-more/faq/self-host-faq) for a list of common issues and troubleshooting steps.

### 5. Visit Dify in your browser

To validate your setup, head over to [http://localhost:3000](http://localhost:3000) (the default, or your self-configured URL and port) in your browser. You should see Dify up and running.
## Developing

If you are adding a model provider, [this guide](https://github.com/langgenius/dify/blob/main/api/core/model_runtime/README.md) is for you.

If you are adding a tool provider to Agent or Workflow, [this guide](./api/core/tools/README.md) is for you.

To help you quickly navigate where your contribution fits, here is a brief, annotated outline of Dify's backend & frontend:

### Backend

Dify's backend is written in Python using [Flask](https://flask.palletsprojects.com/en/3.0.x/). It uses [SQLAlchemy](https://www.sqlalchemy.org/) for ORM and [Celery](https://docs.celeryq.dev/en/stable/getting-started/introduction.html) for task queueing. Authorization logic goes via Flask-login.
```
[api/]
├── constants             // Constant settings used throughout the codebase.
├── controllers           // API route definitions and request-handling logic.
├── core                  // Core application orchestration, model integrations, and tools.
├── docker                // Docker & containerization-related configurations.
├── events                // Event handling and processing.
├── extensions            // Extensions with 3rd-party frameworks/platforms.
├── fields                // Field definitions for serialization/marshalling.
├── libs                  // Reusable libraries and helpers.
├── migrations            // Scripts for database migration.
├── models                // Database models & schema definitions.
├── services              // Specifies business logic.
├── storage               // Private key storage.
├── tasks                 // Handling of async tasks and background jobs.
└── tests
```
### Frontend

The website is bootstrapped on the [Next.js](https://nextjs.org/) boilerplate in TypeScript and uses [Tailwind CSS](https://tailwindcss.com/) for styling. [React-i18next](https://react.i18next.com/) is used for internationalization.

```
[web/]
├── app                   // layouts, pages, and components
│   ├── (commonLayout)    // common layout used throughout the app
│   ├── (shareLayout)     // layouts specifically shared across token-based sessions
│   ├── activate          // activate page
│   ├── components        // shared by pages and layouts
│   ├── install           // install page
│   ├── signin            // signin page
│   └── styles            // globally shared styles
├── assets                // Static assets
├── bin                   // scripts run at the build step
├── config                // adjustable settings and options
├── context               // shared contexts used by different portions of the app
├── dictionaries          // Language-specific translation files
├── docker                // container configurations
├── hooks                 // Reusable hooks
├── i18n                  // Internationalization configuration
├── models                // describes data models & shapes of API responses
├── public                // meta assets like favicon
├── service               // describes shapes of API actions
├── test
├── types                 // descriptions of function params and return values
└── utils                 // Shared utility functions
```
## Submitting your PR

At last, it's time to open a pull request (PR) to our repository. For major features, we first merge them into the `deploy/dev` branch for testing before they go into the `main` branch. If you run into issues like merge conflicts or don't know how to open a pull request, check out [GitHub's pull request tutorial](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests).

And that's it! Once your PR is merged, you will be featured as a contributor in our [README](https://github.com/langgenius/dify/blob/main/README.md).

## Getting Help

If you ever get stuck or have a burning question while contributing, simply shoot your queries our way via the related GitHub issue, or hop onto our [Discord](https://discord.gg/8Tpq4AcN9c) for a quick chat.

```diff
@@ -247,8 +247,8 @@ API_TOOL_DEFAULT_READ_TIMEOUT=60
 HTTP_REQUEST_MAX_CONNECT_TIMEOUT=300
 HTTP_REQUEST_MAX_READ_TIMEOUT=600
 HTTP_REQUEST_MAX_WRITE_TIMEOUT=600
-HTTP_REQUEST_NODE_MAX_BINARY_SIZE=10485760 # 10MB
-HTTP_REQUEST_NODE_MAX_TEXT_SIZE=1048576 # 1MB
+HTTP_REQUEST_NODE_MAX_BINARY_SIZE=10485760
+HTTP_REQUEST_NODE_MAX_TEXT_SIZE=1048576

 # Log file path
 LOG_FILE=
```

```diff
@@ -24,7 +24,7 @@ from fields.app_fields import related_app_list
 from fields.dataset_fields import dataset_detail_fields, dataset_query_detail_fields
 from fields.document_fields import document_status_fields
 from libs.login import login_required
-from models.dataset import Dataset, Document, DocumentSegment
+from models.dataset import Dataset, DatasetPermissionEnum, Document, DocumentSegment
 from models.model import ApiToken, UploadFile
 from services.dataset_service import DatasetPermissionService, DatasetService, DocumentService
@@ -202,7 +202,7 @@ class DatasetApi(Resource):
                             nullable=True,
                             help='Invalid indexing technique.')
         parser.add_argument('permission', type=str, location='json', choices=(
-            'only_me', 'all_team_members', 'partial_members'), help='Invalid permission.'
+            DatasetPermissionEnum.ONLY_ME, DatasetPermissionEnum.ALL_TEAM, DatasetPermissionEnum.PARTIAL_TEAM), help='Invalid permission.'
         )
         parser.add_argument('embedding_model', type=str,
                             location='json', help='Invalid embedding model.')
@@ -239,7 +239,7 @@ class DatasetApi(Resource):
                 tenant_id, dataset_id_str, data.get('partial_member_list')
             )
         # clear partial member list when permission is only_me or all_team_members
-        elif data.get('permission') == 'only_me' or data.get('permission') == 'all_team_members':
+        elif data.get('permission') == DatasetPermissionEnum.ONLY_ME or data.get('permission') == DatasetPermissionEnum.ALL_TEAM:
             DatasetPermissionService.clear_partial_member_list(dataset_id_str)
         partial_member_list = DatasetPermissionService.get_dataset_partial_member_list(dataset_id_str)
```

```diff
@@ -10,7 +10,7 @@ from core.model_runtime.entities.model_entities import ModelType
 from core.provider_manager import ProviderManager
 from fields.dataset_fields import dataset_detail_fields
 from libs.login import current_user
-from models.dataset import Dataset
+from models.dataset import Dataset, DatasetPermissionEnum
 from services.dataset_service import DatasetService
@@ -78,6 +78,8 @@ class DatasetListApi(DatasetApiResource):
         parser.add_argument('indexing_technique', type=str, location='json',
                             choices=Dataset.INDEXING_TECHNIQUE_LIST,
                             help='Invalid indexing technique.')
+        parser.add_argument('permission', type=str, location='json', choices=(
+            DatasetPermissionEnum.ONLY_ME, DatasetPermissionEnum.ALL_TEAM, DatasetPermissionEnum.PARTIAL_TEAM), help='Invalid permission.', required=False, nullable=False)
         args = parser.parse_args()
         try:
@@ -85,7 +87,8 @@ class DatasetListApi(DatasetApiResource):
                 tenant_id=tenant_id,
                 name=args['name'],
                 indexing_technique=args['indexing_technique'],
-                account=current_user
+                account=current_user,
+                permission=args['permission']
             )
         except services.errors.dataset.DatasetNameDuplicateError:
             raise DatasetNameDuplicateError()
```

```diff
@@ -1,15 +1,13 @@
 import logging
-import time
 from enum import Enum
 from threading import Lock
-from typing import Literal, Optional
+from typing import Optional
-from httpx import Timeout, get, post
+from httpx import Timeout, post
 from pydantic import BaseModel
 from yarl import URL
 from configs import dify_config
-from core.helper.code_executor.entities import CodeDependency
 from core.helper.code_executor.javascript.javascript_transformer import NodeJsTemplateTransformer
 from core.helper.code_executor.jinja2.jinja2_transformer import Jinja2TemplateTransformer
 from core.helper.code_executor.python3.python3_transformer import Python3TemplateTransformer
@@ -66,8 +64,7 @@ class CodeExecutor:
     def execute_code(cls,
                      language: CodeLanguage,
                      preload: str,
-                     code: str,
-                     dependencies: Optional[list[CodeDependency]] = None) -> str:
+                     code: str) -> str:
         """
         Execute code
         :param language: code language
@@ -87,9 +84,6 @@ class CodeExecutor:
             'enable_network': True
         }
-        if dependencies:
-            data['dependencies'] = [dependency.model_dump() for dependency in dependencies]
         try:
             response = post(str(url), json=data, headers=headers, timeout=CODE_EXECUTION_TIMEOUT)
             if response.status_code == 503:
@@ -119,7 +113,7 @@ class CodeExecutor:
         return response.data.stdout or ''

     @classmethod
-    def execute_workflow_code_template(cls, language: CodeLanguage, code: str, inputs: dict, dependencies: Optional[list[CodeDependency]] = None) -> dict:
+    def execute_workflow_code_template(cls, language: CodeLanguage, code: str, inputs: dict) -> dict:
         """
         Execute code
         :param language: code language
@@ -131,67 +125,12 @@ class CodeExecutor:
         if not template_transformer:
             raise CodeExecutionException(f'Unsupported language {language}')
-        runner, preload, dependencies = template_transformer.transform_caller(code, inputs, dependencies)
+        runner, preload = template_transformer.transform_caller(code, inputs)
         try:
-            response = cls.execute_code(language, preload, runner, dependencies)
+            response = cls.execute_code(language, preload, runner)
         except CodeExecutionException as e:
             raise e
         return template_transformer.transform_response(response)
-    @classmethod
-    def list_dependencies(cls, language: str) -> list[CodeDependency]:
-        if language not in cls.supported_dependencies_languages:
-            return []
-        with cls.dependencies_cache_lock:
-            if language in cls.dependencies_cache:
-                # check expiration
-                dependencies = cls.dependencies_cache[language]
-                if dependencies['expiration'] > time.time():
-                    return dependencies['data']
-                # remove expired cache
-                del cls.dependencies_cache[language]
-        dependencies = cls._get_dependencies(language)
-        with cls.dependencies_cache_lock:
-            cls.dependencies_cache[language] = {
-                'data': dependencies,
-                'expiration': time.time() + 60
-            }
-        return dependencies
-    @classmethod
-    def _get_dependencies(cls, language: Literal['python3']) -> list[CodeDependency]:
-        """
-        List dependencies
-        """
-        url = URL(CODE_EXECUTION_ENDPOINT) / 'v1' / 'sandbox' / 'dependencies'
-        headers = {
-            'X-Api-Key': CODE_EXECUTION_API_KEY
-        }
-        running_language = cls.code_language_to_running_language.get(language)
-        if isinstance(running_language, Enum):
-            running_language = running_language.value
-        data = {
-            'language': running_language,
-        }
-        try:
-            response = get(str(url), params=data, headers=headers, timeout=CODE_EXECUTION_TIMEOUT)
-            if response.status_code != 200:
-                raise Exception(f'Failed to list dependencies, got status code {response.status_code}, please check if the sandbox service is running')
-            response = response.json()
-            dependencies = response.get('data', {}).get('dependencies', [])
-            return [
-                CodeDependency(**dependency) for dependency in dependencies
-                if dependency.get('name') not in Python3TemplateTransformer.get_standard_packages()
-            ]
-        except Exception as e:
-            logger.exception(f'Failed to list dependencies: {e}')
-            return []
```

```diff
@@ -2,8 +2,6 @@ from abc import abstractmethod
 from pydantic import BaseModel
-from core.helper.code_executor.code_executor import CodeExecutor

 class CodeNodeProvider(BaseModel):
     @staticmethod
@@ -23,10 +21,6 @@ class CodeNodeProvider(BaseModel):
         """
         pass
-    @classmethod
-    def get_default_available_packages(cls) -> list[dict]:
-        return [p.model_dump() for p in CodeExecutor.list_dependencies(cls.get_language())]
     @classmethod
     def get_default_config(cls) -> dict:
         return {
@@ -50,6 +44,5 @@ class CodeNodeProvider(BaseModel):
                         "children": None
                     }
                 }
-            },
-            "available_dependencies": cls.get_default_available_packages(),
+            }
         }
```

```diff
@@ -1,6 +0,0 @@
-from pydantic import BaseModel

-class CodeDependency(BaseModel):
-    name: str
-    version: str
```

```diff
@@ -3,7 +3,7 @@ from core.helper.code_executor.code_executor import CodeExecutor, CodeLanguage
 class Jinja2Formatter:
     @classmethod
-    def format(cls, template: str, inputs: str) -> str:
+    def format(cls, template: str, inputs: dict) -> str:
         """
         Format template
         :param template: template
```

```diff
@@ -1,14 +1,9 @@
 from textwrap import dedent
-from core.helper.code_executor.python3.python3_transformer import Python3TemplateTransformer
 from core.helper.code_executor.template_transformer import TemplateTransformer

 class Jinja2TemplateTransformer(TemplateTransformer):
-    @classmethod
-    def get_standard_packages(cls) -> set[str]:
-        return {'jinja2'} | Python3TemplateTransformer.get_standard_packages()
     @classmethod
     def transform_response(cls, response: str) -> dict:
         """
```

```diff
@@ -4,30 +4,6 @@ from core.helper.code_executor.template_transformer import TemplateTransformer
 class Python3TemplateTransformer(TemplateTransformer):
-    @classmethod
-    def get_standard_packages(cls) -> set[str]:
-        return {
-            'base64',
-            'binascii',
-            'collections',
-            'datetime',
-            'functools',
-            'hashlib',
-            'hmac',
-            'itertools',
-            'json',
-            'math',
-            'operator',
-            'os',
-            'random',
-            're',
-            'string',
-            'sys',
-            'time',
-            'traceback',
-            'uuid',
-        }
     @classmethod
     def get_runner_script(cls) -> str:
         runner_script = dedent(f"""
```

```diff
@@ -2,9 +2,6 @@ import json
 import re
 from abc import ABC, abstractmethod
 from base64 import b64encode
-from typing import Optional
-from core.helper.code_executor.entities import CodeDependency

 class TemplateTransformer(ABC):
@@ -13,12 +10,7 @@ class TemplateTransformer(ABC):
     _result_tag: str = '<<RESULT>>'

     @classmethod
-    def get_standard_packages(cls) -> set[str]:
-        return set()
-    @classmethod
-    def transform_caller(cls, code: str, inputs: dict,
-                         dependencies: Optional[list[CodeDependency]] = None) -> tuple[str, str, list[CodeDependency]]:
+    def transform_caller(cls, code: str, inputs: dict) -> tuple[str, str]:
         """
         Transform code to python runner
         :param code: code
@@ -28,14 +20,7 @@ class TemplateTransformer(ABC):
         runner_script = cls.assemble_runner_script(code, inputs)
         preload_script = cls.get_preload_script()
-        packages = dependencies or []
-        standard_packages = cls.get_standard_packages()
-        for package in standard_packages:
-            if package not in packages:
-                packages.append(CodeDependency(name=package, version=''))
-        packages = list({dep.name: dep for dep in packages if dep.name}.values())
-        return runner_script, preload_script, packages
+        return runner_script, preload_script
     @classmethod
     def extract_result_str_from_response(cls, response: str) -> str:
```

```diff
@@ -428,7 +428,7 @@ class OAIAPICompatLargeLanguageModel(_CommonOAI_API_Compat, LargeLanguageModel):
                     if new_tool_call.function.arguments:
                         tool_call.function.arguments += new_tool_call.function.arguments

-        finish_reason = 'Unknown'
+        finish_reason = None  # The default value of finish_reason is None

         for chunk in response.iter_lines(decode_unicode=True, delimiter=delimiter):
             chunk = chunk.strip()
@@ -437,6 +437,8 @@ class OAIAPICompatLargeLanguageModel(_CommonOAI_API_Compat, LargeLanguageModel):
                 if chunk.startswith(':'):
                     continue
                 decoded_chunk = chunk.strip().lstrip('data: ').lstrip()
+                if decoded_chunk == '[DONE]':  # Some provider returns "data: [DONE]"
+                    continue
                 try:
                     chunk_json = json.loads(decoded_chunk)
```
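The hunk above hardens the stream loop against providers that terminate Server-Sent-Events output with a literal `data: [DONE]` sentinel, which is not valid JSON and would otherwise blow up `json.loads`. A minimal standalone sketch of that parsing loop (illustrative only; the real method also accumulates tool calls and usage, and the function name here is made up):

```python
import json


def iter_sse_json(lines):
    """Yield parsed JSON payloads from SSE-style 'data: ...' lines."""
    for raw in lines:
        chunk = raw.strip()
        if not chunk or chunk.startswith(':'):
            continue  # blank keep-alives and SSE comments carry no payload
        if chunk.startswith('data:'):
            chunk = chunk[len('data:'):].strip()
        if chunk == '[DONE]':
            continue  # some providers close the stream with "data: [DONE]"
        yield json.loads(chunk)


stream = ['data: {"delta": "Hel"}', ': ping', 'data: {"delta": "lo"}', 'data: [DONE]']
print(list(iter_sse_json(stream)))  # [{'delta': 'Hel'}, {'delta': 'lo'}]
```

Note the sketch slices off the `data:` prefix rather than using `lstrip('data: ')` as the diff does; `str.lstrip` strips a character set, not a prefix, so slicing is the safer idiom.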

```diff
@@ -25,7 +25,7 @@ parameters:
     pt_BR: Prompt
   human_description:
     en_US: Image prompt, you can check the official documentation of DallE 3
-    zh_Hans: 图像提示词,您可以查看DallE 3 的官方文档
+    zh_Hans: 图像提示词,您可以查看 DallE 3 的官方文档
     pt_BR: Imagem prompt, você pode verificar a documentação oficial do DallE 3
   llm_description: Image prompt of DallE 3, you should describe the image you want to generate as a list of words as possible as detailed
   form: llm
```

```diff
@@ -25,7 +25,7 @@ parameters:
     pt_BR: Prompt
   human_description:
     en_US: Image prompt, you can check the official documentation of CogView 3
-    zh_Hans: 图像提示词,您可以查看CogView 3 的官方文档
+    zh_Hans: 图像提示词,您可以查看 CogView 3 的官方文档
     pt_BR: Image prompt, you can check the official documentation of CogView 3
   llm_description: Image prompt of CogView 3, you should describe the image you want to generate as a list of words as possible as detailed
   form: llm
```

```diff
@@ -24,7 +24,7 @@ parameters:
     pt_BR: Prompt
   human_description:
     en_US: Image prompt, you can check the official documentation of DallE 2
-    zh_Hans: 图像提示词,您可以查看DallE 2 的官方文档
+    zh_Hans: 图像提示词,您可以查看 DallE 2 的官方文档
     pt_BR: Image prompt, you can check the official documentation of DallE 2
   llm_description: Image prompt of DallE 2, you should describe the image you want to generate as a list of words as possible as detailed
   form: llm
```

```diff
@@ -25,7 +25,7 @@ parameters:
     pt_BR: Prompt
   human_description:
     en_US: Image prompt, you can check the official documentation of DallE 3
-    zh_Hans: 图像提示词,您可以查看DallE 3 的官方文档
+    zh_Hans: 图像提示词,您可以查看 DallE 3 的官方文档
     pt_BR: Image prompt, you can check the official documentation of DallE 3
   llm_description: Image prompt of DallE 3, you should describe the image you want to generate as a list of words as possible as detailed
   form: llm
```

```diff
@@ -25,7 +25,7 @@ parameters:
     pt_BR: Prompt
   human_description:
     en_US: Image prompt, you can check the official documentation of step-1x
-    zh_Hans: 图像提示词,您可以查看step-1x 的官方文档
+    zh_Hans: 图像提示词,您可以查看 step-1x 的官方文档
     pt_BR: Image prompt, you can check the official documentation of step-1x
   llm_description: Image prompt of step-1x you should describe the image you want to generate as a list of words as possible as detailed
   form: llm
```

```diff
@@ -67,7 +67,6 @@ class CodeNode(BaseNode):
                 language=code_language,
                 code=code,
                 inputs=variables,
-                dependencies=node_data.dependencies
             )
             # Transform result
```

```diff
@@ -3,7 +3,6 @@ from typing import Literal, Optional
 from pydantic import BaseModel
 from core.helper.code_executor.code_executor import CodeLanguage
-from core.helper.code_executor.entities import CodeDependency
 from core.workflow.entities.base_node_data_entities import BaseNodeData
 from core.workflow.entities.variable_entities import VariableSelector
@@ -16,8 +15,12 @@ class CodeNodeData(BaseNodeData):
         type: Literal['string', 'number', 'object', 'array[string]', 'array[number]', 'array[object]']
         children: Optional[dict[str, 'Output']] = None

+    class Dependency(BaseModel):
+        name: str
+        version: str

     variables: list[VariableSelector]
     code_language: Literal[CodeLanguage.PYTHON3, CodeLanguage.JAVASCRIPT]
     code: str
     outputs: dict[str, Output]
-    dependencies: Optional[list[CodeDependency]] = None
+    dependencies: Optional[list[Dependency]] = None
```

```diff
@@ -138,12 +138,14 @@ class LLMNode(BaseNode):
             result_text = ''
             usage = LLMUsage.empty_usage()
+            finish_reason = None

             for event in generator:
                 if isinstance(event, RunStreamChunkEvent):
                     yield event
                 elif isinstance(event, ModelInvokeCompleted):
                     result_text = event.text
                     usage = event.usage
+                    finish_reason = event.finish_reason
                     break
         except Exception as e:
             yield RunCompletedEvent(
@@ -158,7 +160,8 @@ class LLMNode(BaseNode):
         outputs = {
             'text': result_text,
-            'usage': jsonable_encoder(usage)
+            'usage': jsonable_encoder(usage),
+            'finish_reason': finish_reason
         }

         yield RunCompletedEvent(
@@ -227,6 +230,7 @@ class LLMNode(BaseNode):
         prompt_messages: list[PromptMessage] = []
         full_text = ''
         usage = None
+        finish_reason = None
         for result in invoke_result:
             text = result.delta.message.content
             full_text += text
@@ -245,12 +249,16 @@ class LLMNode(BaseNode):
             if not usage and result.delta.usage:
                 usage = result.delta.usage

+            if not finish_reason and result.delta.finish_reason:
+                finish_reason = result.delta.finish_reason

         if not usage:
             usage = LLMUsage.empty_usage()

         yield ModelInvokeCompleted(
             text=full_text,
-            usage=usage
+            usage=usage,
+            finish_reason=finish_reason
         )

     def _transform_chat_messages(self,
```
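The accumulation pattern in this hunk — concatenate text, keep the first non-empty `usage` and `finish_reason` seen across the stream — can be isolated like so (simplified tuples stand in for the real delta/event objects; the function is a sketch, not Dify code):

```python
def collect_stream(chunks):
    """Fold (text, usage, finish_reason) chunks into a single result."""
    full_text = ''
    usage = None
    finish_reason = None
    for text, chunk_usage, chunk_finish in chunks:
        full_text += text
        if not usage and chunk_usage:
            usage = chunk_usage  # first usage report wins
        if not finish_reason and chunk_finish:
            finish_reason = chunk_finish  # usually arrives on the final chunk
    return full_text, usage, finish_reason


chunks = [('Hel', None, None), ('lo', {'total_tokens': 5}, 'stop')]
print(collect_stream(chunks))  # ('Hello', {'total_tokens': 5}, 'stop')
```

Defaulting `finish_reason` to `None` (rather than a placeholder string) is what lets downstream consumers distinguish "the provider never sent one" from a real stop reason.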

```diff
@@ -74,10 +74,12 @@ class QuestionClassifierNode(LLMNode):
         result_text = ''
         usage = LLMUsage.empty_usage()
+        finished_reason = None

         for event in generator:
             if isinstance(event, ModelInvokeCompleted):
                 result_text = event.text
                 usage = event.usage
+                finished_reason = event.finished_reason
                 break

         category_name = node_data.classes[0].name
@@ -104,6 +106,7 @@ class QuestionClassifierNode(LLMNode):
                 prompt_messages=prompt_messages
             ),
             'usage': jsonable_encoder(usage),
+            'finish_reason': finish_reason
         }
         outputs = {
             'class_name': category_name
```
@@ -28,6 +28,16 @@ class S3Storage(BaseStorage):
             region_name=app_config.get("S3_REGION"),
             config=Config(s3={"addressing_style": app_config.get("S3_ADDRESS_STYLE")}),
         )
+        # create the bucket if it does not exist yet
+        try:
+            self.client.head_bucket(Bucket=self.bucket_name)
+        except ClientError as e:
+            # if the bucket does not exist, create it
+            if e.response["Error"]["Code"] == "404":
+                self.client.create_bucket(Bucket=self.bucket_name)
+            else:
+                # any other error is re-raised
+                raise
     def save(self, filename, data):
         self.client.put_object(Bucket=self.bucket_name, Key=filename, Body=data)
@@ -1,4 +1,5 @@
 import base64
+import enum
 import hashlib
 import hmac
 import json
@@ -22,6 +23,11 @@ from .model import App, Tag, TagBinding, UploadFile
 from .types import StringUUID
+class DatasetPermissionEnum(str, enum.Enum):
+    ONLY_ME = 'only_me'
+    ALL_TEAM = 'all_team_members'
+    PARTIAL_TEAM = 'partial_members'
 class Dataset(db.Model):
     __tablename__ = 'datasets'
     __table_args__ = (
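Declaring `DatasetPermissionEnum` as a `str` subclass (rather than a plain `Enum`) means its members compare equal to their raw string values, so existing string-typed database columns and JSON payloads keep working unchanged. A minimal sketch of that behavior, reusing the enum from the hunk above:

```python
import enum

class DatasetPermissionEnum(str, enum.Enum):
    ONLY_ME = 'only_me'
    ALL_TEAM = 'all_team_members'
    PARTIAL_TEAM = 'partial_members'

# members compare equal to plain strings, handy for DB values and request JSON
assert DatasetPermissionEnum.ONLY_ME == 'only_me'

# the raw value round-trips back to the member identity
assert DatasetPermissionEnum('all_team_members') is DatasetPermissionEnum.ALL_TEAM
```

This is the same idea Python 3.11 later formalized as `enum.StrEnum`; the `(str, enum.Enum)` mixin spelling works on all supported versions.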
api/poetry.lock (generated)
@@ -1385,77 +1385,77 @@ testing = ["pytest (>=7.2.1)", "pytest-cov (>=4.0.0)", "tox (>=4.4.3)"]
 [[package]]
 name = "clickhouse-connect"
-version = "0.7.18"
+version = "0.7.19"
 description = "ClickHouse Database Core Driver for Python, Pandas, and Superset"
 optional = false
 python-versions = "~=3.8"
 files = [
-    {file = "clickhouse-connect-0.7.18.tar.gz", hash = "sha256:516aba1fdcf58973b0d0d90168a60c49f6892b6db1183b932f80ae057994eadb"},
+    {file = "clickhouse-connect-0.7.19.tar.gz", hash = "sha256:ce8f21f035781c5ef6ff57dc162e8150779c009b59f14030ba61f8c9c10c06d0"},
     (the remaining platform-specific wheel hash entries are likewise replaced with their 0.7.19 counterparts)
 ]
 [package.dependencies]
@@ -3826,18 +3826,22 @@ test = ["flufl.flake8", "importlib-resources (>=1.3)", "jaraco.test (>=5.4)", "p
 [[package]]
 name = "importlib-resources"
-version = "6.4.3"
+version = "6.4.4"
 description = "Read resources from Python packages"
 optional = false
 python-versions = ">=3.8"
 files = [
-    {file = "importlib_resources-6.4.3-py3-none-any.whl", hash = "sha256:2d6dfe3b9e055f72495c2085890837fc8c758984e209115c8792bddcb762cd93"},
+    {file = "importlib_resources-6.4.4-py3-none-any.whl", hash = "sha256:dda242603d1c9cd836c3368b1174ed74cb4049ecd209e7a1a0104620c18c5c11"},
-    {file = "importlib_resources-6.4.3.tar.gz", hash = "sha256:4a202b9b9d38563b46da59221d77bb73862ab5d79d461307bcb826d725448b98"},
+    {file = "importlib_resources-6.4.4.tar.gz", hash = "sha256:20600c8b7361938dc0bb2d5ec0297802e575df486f5a544fa414da65e13721f7"},
 ]
 [package.extras]
+check = ["pytest-checkdocs (>=2.4)", "pytest-ruff (>=0.2.1)"]
+cover = ["pytest-cov"]
 doc = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-lint"]
+enabler = ["pytest-enabler (>=2.2)"]
-test = ["jaraco.test (>=5.4)", "pytest (>=6,!=8.1.*)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=2.2)", "pytest-mypy", "pytest-ruff (>=0.2.1)", "zipp (>=3.17)"]
+test = ["jaraco.test (>=5.4)", "pytest (>=6,!=8.1.*)", "zipp (>=3.17)"]
+type = ["pytest-mypy"]
 [[package]]
 name = "iniconfig"
@@ -4175,13 +4179,13 @@ openai = ["openai (>=0.27.8)"]
 [[package]]
 name = "langsmith"
-version = "0.1.100"
+version = "0.1.101"
 description = "Client library to connect to the LangSmith LLM Tracing and Evaluation Platform."
 optional = false
 python-versions = "<4.0,>=3.8.1"
 files = [
-    {file = "langsmith-0.1.100-py3-none-any.whl", hash = "sha256:cae44a884a4166c4d8b9cc5ff99f5d520337bd90b9dadfe3706ed31415d559a7"},
+    {file = "langsmith-0.1.101-py3-none-any.whl", hash = "sha256:572e2c90709cda1ad837ac86cedda7295f69933f2124c658a92a35fb890477cc"},
-    {file = "langsmith-0.1.100.tar.gz", hash = "sha256:20ff0126253a5a1d621635a3bc44ccacc036e855f52185ae983420f14eb6c605"},
+    {file = "langsmith-0.1.101.tar.gz", hash = "sha256:caf4d95f314bb6cd3c4e0632eed821fd5cd5d0f18cb824772fce6d7a9113895b"},
 ]
 [package.dependencies]
@@ -5006,13 +5010,13 @@ tldextract = ">=2.0.1"
 [[package]]
 name = "nltk"
-version = "3.9.1"
+version = "3.8.1"
 description = "Natural Language Toolkit"
 optional = false
-python-versions = ">=3.8"
+python-versions = ">=3.7"
 files = [
-    {file = "nltk-3.9.1-py3-none-any.whl", hash = "sha256:4fa26829c5b00715afe3061398a8989dc643b92ce7dd93fb4585a70930d168a1"},
+    {file = "nltk-3.8.1-py3-none-any.whl", hash = "sha256:fd5c9109f976fa86bcadba8f91e47f5e9293bd034474752e92a520f81c93dda5"},
-    {file = "nltk-3.9.1.tar.gz", hash = "sha256:87d127bd3de4bd89a4f81265e5fa59cb1b199b27440175370f7417d2bc7ae868"},
+    {file = "nltk-3.8.1.zip", hash = "sha256:1834da3d0682cba4f2cede2f9aad6b0fafb6461ba451db0efb6f9c39798d64d3"},
 ]
 [package.dependencies]
@@ -5925,13 +5929,13 @@ tests = ["pytest (>=5.4.1)", "pytest-cov (>=2.8.1)", "pytest-mypy (>=0.8.0)", "p
 [[package]]
 name = "posthog"
-version = "3.5.0"
+version = "3.5.2"
 description = "Integrate PostHog into any python application."
 optional = false
 python-versions = "*"
 files = [
-    {file = "posthog-3.5.0-py2.py3-none-any.whl", hash = "sha256:3c672be7ba6f95d555ea207d4486c171d06657eb34b3ce25eb043bfe7b6b5b76"},
+    {file = "posthog-3.5.2-py2.py3-none-any.whl", hash = "sha256:605b3d92369971cc99290b1fcc8534cbddac3726ef7972caa993454a5ecfb644"},
-    {file = "posthog-3.5.0.tar.gz", hash = "sha256:8f7e3b2c6e8714d0c0c542a2109b83a7549f63b7113a133ab2763a89245ef2ef"},
+    {file = "posthog-3.5.2.tar.gz", hash = "sha256:a383a80c1f47e0243f5ce359e81e06e2e7b37eb39d1d6f8d01c3e64ed29df2ee"},
 ]
 [package.dependencies]
@@ -6139,19 +6143,6 @@ files = [
     {file = "pyarrow-17.0.0-cp312-cp312-win_amd64.whl", hash = "sha256:392bc9feabc647338e6c89267635e111d71edad5fcffba204425a7c8d13610d7"},
     {file = "pyarrow-17.0.0-cp38-cp38-macosx_10_15_x86_64.whl", hash = "sha256:af5ff82a04b2171415f1410cff7ebb79861afc5dae50be73ce06d6e870615204"},
     {file = "pyarrow-17.0.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:edca18eaca89cd6382dfbcff3dd2d87633433043650c07375d095cd3517561d8"},
-    {file = "pyarrow-17.0.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7c7916bff914ac5d4a8fe25b7a25e432ff921e72f6f2b7547d1e325c1ad9d155"},
-    {file = "pyarrow-17.0.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f553ca691b9e94b202ff741bdd40f6ccb70cdd5fbf65c187af132f1317de6145"},
-    {file = "pyarrow-17.0.0-cp38-cp38-manylinux_2_28_aarch64.whl", hash = "sha256:0cdb0e627c86c373205a2f94a510ac4376fdc523f8bb36beab2e7f204416163c"},
-    {file = "pyarrow-17.0.0-cp38-cp38-manylinux_2_28_x86_64.whl", hash = "sha256:d7d192305d9d8bc9082d10f361fc70a73590a4c65cf31c3e6926cd72b76bc35c"},
-    {file = "pyarrow-17.0.0-cp38-cp38-win_amd64.whl", hash = "sha256:02dae06ce212d8b3244dd3e7d12d9c4d3046945a5933d28026598e9dbbda1fca"},
-    {file = "pyarrow-17.0.0-cp39-cp39-macosx_10_15_x86_64.whl", hash = "sha256:13d7a460b412f31e4c0efa1148e1d29bdf18ad1411eb6757d38f8fbdcc8645fb"},
-    {file = "pyarrow-17.0.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:9b564a51fbccfab5a04a80453e5ac6c9954a9c5ef2890d1bcf63741909c3f8df"},
-    {file = "pyarrow-17.0.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:32503827abbc5aadedfa235f5ece8c4f8f8b0a3cf01066bc8d29de7539532687"},
-    {file = "pyarrow-17.0.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a155acc7f154b9ffcc85497509bcd0d43efb80d6f733b0dc3bb14e281f131c8b"},
-    {file = "pyarrow-17.0.0-cp39-cp39-manylinux_2_28_aarch64.whl", hash = "sha256:dec8d129254d0188a49f8a1fc99e0560dc1b85f60af729f47de4046015f9b0a5"},
-    {file = "pyarrow-17.0.0-cp39-cp39-manylinux_2_28_x86_64.whl", hash = "sha256:a48ddf5c3c6a6c505904545c25a4ae13646ae1f8ba703c4df4a1bfe4f4006bda"},
-    {file = "pyarrow-17.0.0-cp39-cp39-win_amd64.whl", hash = "sha256:42bf93249a083aca230ba7e2786c5f673507fa97bbd9725a1e2754715151a204"},
-    {file = "pyarrow-17.0.0.tar.gz", hash = "sha256:4beca9521ed2c0921c1023e68d097d0299b62c362639ea315572a58f3f50fd28"},
 ]
 [package.dependencies]
@ -8045,13 +8036,13 @@ test = ["pytest", "tornado (>=4.5)", "typeguard"]
[[package]] [[package]]
name = "tencentcloud-sdk-python-common" name = "tencentcloud-sdk-python-common"
version = "3.0.1215" version = "3.0.1216"
description = "Tencent Cloud Common SDK for Python" description = "Tencent Cloud Common SDK for Python"
optional = false optional = false
python-versions = "*" python-versions = "*"
files = [ files = [
{file = "tencentcloud-sdk-python-common-3.0.1215.tar.gz", hash = "sha256:2bc3be61598af06835f66a2b2f769fe32e7b6f8fdcde4bd5aefb2b0205a374ae"}, {file = "tencentcloud-sdk-python-common-3.0.1216.tar.gz", hash = "sha256:7ad83b100574068fe25439fe47fd27253ff1c730348a309567a7ff88eda63cf8"},
{file = "tencentcloud_sdk_python_common-3.0.1215-py2.py3-none-any.whl", hash = "sha256:45a1a995d2d972e0e2673c19b97eecc2176c30ab25016a814fda20a9254e6e88"}, {file = "tencentcloud_sdk_python_common-3.0.1216-py2.py3-none-any.whl", hash = "sha256:5e1cf9b685923d567d379f96a7008084006ad68793cfa0a0524e65dc59fd09d7"},
] ]
[package.dependencies] [package.dependencies]
@@ -8059,17 +8050,17 @@ requests = ">=2.16.0"
[[package]] [[package]]
name = "tencentcloud-sdk-python-hunyuan" name = "tencentcloud-sdk-python-hunyuan"
version = "3.0.1215" version = "3.0.1216"
description = "Tencent Cloud Hunyuan SDK for Python" description = "Tencent Cloud Hunyuan SDK for Python"
optional = false optional = false
python-versions = "*" python-versions = "*"
files = [ files = [
{file = "tencentcloud-sdk-python-hunyuan-3.0.1215.tar.gz", hash = "sha256:aba15f29f14f27ec7c8801562601d4720368435b925640c7efd83170e2bfefe1"}, {file = "tencentcloud-sdk-python-hunyuan-3.0.1216.tar.gz", hash = "sha256:b295d67f97dba52ed358a1d9e061f94b1a4a87e45714efbf0987edab12642206"},
{file = "tencentcloud_sdk_python_hunyuan-3.0.1215-py2.py3-none-any.whl", hash = "sha256:5baab9ded7509326d951aca553147e72d57b7c0ea961fa03afc123f7a45bf2b7"}, {file = "tencentcloud_sdk_python_hunyuan-3.0.1216-py2.py3-none-any.whl", hash = "sha256:62d925b41424017929b532389061a076dca72dde455e85ec089947645010e691"},
] ]
[package.dependencies] [package.dependencies]
tencentcloud-sdk-python-common = "3.0.1215" tencentcloud-sdk-python-common = "3.0.1216"
[[package]] [[package]]
name = "threadpoolctl" name = "threadpoolctl"
@@ -9116,13 +9107,13 @@ files = [
[[package]] [[package]]
name = "werkzeug" name = "werkzeug"
version = "3.0.3" version = "3.0.4"
description = "The comprehensive WSGI web application library." description = "The comprehensive WSGI web application library."
optional = false optional = false
python-versions = ">=3.8" python-versions = ">=3.8"
files = [ files = [
{file = "werkzeug-3.0.3-py3-none-any.whl", hash = "sha256:fc9645dc43e03e4d630d23143a04a7f947a9a3b5727cd535fdfe155a17cc48c8"}, {file = "werkzeug-3.0.4-py3-none-any.whl", hash = "sha256:02c9eb92b7d6c06f31a782811505d2157837cea66aaede3e217c7c27c039476c"},
{file = "werkzeug-3.0.3.tar.gz", hash = "sha256:097e5bfda9f0aba8da6b8545146def481d06aa7d3266e7448e2cccf67dd8bd18"}, {file = "werkzeug-3.0.4.tar.gz", hash = "sha256:34f2371506b250df4d4f84bfe7b0921e4762525762bbd936614909fe25cd7306"},
] ]
[package.dependencies] [package.dependencies]
@@ -9642,4 +9633,4 @@ cffi = ["cffi (>=1.11)"]
[metadata] [metadata]
lock-version = "2.0" lock-version = "2.0"
python-versions = ">=3.10,<3.13" python-versions = ">=3.10,<3.13"
content-hash = "d82b0af7ec92f91035cd0c69741f7b749cce94198d598eac431ee746b9daf4f4" content-hash = "69c20af8ecacced3cca092662223a1511acaf65cb2616a5a1e38b498223463e0"


@@ -216,7 +216,7 @@ twilio = "~9.0.4"
vanna = { version = "0.5.5", extras = ["postgres", "mysql", "clickhouse", "duckdb"] } vanna = { version = "0.5.5", extras = ["postgres", "mysql", "clickhouse", "duckdb"] }
wikipedia = "1.4.0" wikipedia = "1.4.0"
yfinance = "~0.2.40" yfinance = "~0.2.40"
nltk = "3.8.1"
############################################################ ############################################################
# VDB dependencies required by vector store clients # VDB dependencies required by vector store clients
############################################################ ############################################################


@@ -27,6 +27,7 @@ from models.dataset import (
Dataset, Dataset,
DatasetCollectionBinding, DatasetCollectionBinding,
DatasetPermission, DatasetPermission,
DatasetPermissionEnum,
DatasetProcessRule, DatasetProcessRule,
DatasetQuery, DatasetQuery,
Document, Document,
@@ -80,21 +81,21 @@ class DatasetService:
if permitted_dataset_ids: if permitted_dataset_ids:
query = query.filter( query = query.filter(
db.or_( db.or_(
Dataset.permission == 'all_team_members', Dataset.permission == DatasetPermissionEnum.ALL_TEAM,
db.and_(Dataset.permission == 'only_me', Dataset.created_by == user.id), db.and_(Dataset.permission == DatasetPermissionEnum.ONLY_ME, Dataset.created_by == user.id),
db.and_(Dataset.permission == 'partial_members', Dataset.id.in_(permitted_dataset_ids)) db.and_(Dataset.permission == DatasetPermissionEnum.PARTIAL_TEAM, Dataset.id.in_(permitted_dataset_ids))
) )
) )
else: else:
query = query.filter( query = query.filter(
db.or_( db.or_(
Dataset.permission == 'all_team_members', Dataset.permission == DatasetPermissionEnum.ALL_TEAM,
db.and_(Dataset.permission == 'only_me', Dataset.created_by == user.id) db.and_(Dataset.permission == DatasetPermissionEnum.ONLY_ME, Dataset.created_by == user.id)
) )
) )
else: else:
# if no user, only show datasets that are shared with all team members # if no user, only show datasets that are shared with all team members
query = query.filter(Dataset.permission == 'all_team_members') query = query.filter(Dataset.permission == DatasetPermissionEnum.ALL_TEAM)
if search: if search:
query = query.filter(Dataset.name.ilike(f'%{search}%')) query = query.filter(Dataset.name.ilike(f'%{search}%'))
@@ -330,7 +331,7 @@ class DatasetService:
raise NoPermissionError( raise NoPermissionError(
'You do not have permission to access this dataset.' 'You do not have permission to access this dataset.'
) )
if dataset.permission == 'only_me' and dataset.created_by != user.id: if dataset.permission == DatasetPermissionEnum.ONLY_ME and dataset.created_by != user.id:
logging.debug( logging.debug(
f'User {user.id} does not have permission to access dataset {dataset.id}' f'User {user.id} does not have permission to access dataset {dataset.id}'
) )
@@ -351,11 +352,11 @@ class DatasetService:
@staticmethod @staticmethod
def check_dataset_operator_permission(user: Account = None, dataset: Dataset = None): def check_dataset_operator_permission(user: Account = None, dataset: Dataset = None):
if dataset.permission == 'only_me': if dataset.permission == DatasetPermissionEnum.ONLY_ME:
if dataset.created_by != user.id: if dataset.created_by != user.id:
raise NoPermissionError('You do not have permission to access this dataset.') raise NoPermissionError('You do not have permission to access this dataset.')
elif dataset.permission == 'partial_members': elif dataset.permission == DatasetPermissionEnum.PARTIAL_TEAM:
if not any( if not any(
dp.dataset_id == dataset.id for dp in DatasetPermission.query.filter_by(account_id=user.id).all() dp.dataset_id == dataset.id for dp in DatasetPermission.query.filter_by(account_id=user.id).all()
): ):


@@ -6,14 +6,13 @@ from _pytest.monkeypatch import MonkeyPatch
from jinja2 import Template from jinja2 import Template
from core.helper.code_executor.code_executor import CodeExecutor, CodeLanguage from core.helper.code_executor.code_executor import CodeExecutor, CodeLanguage
from core.helper.code_executor.entities import CodeDependency
MOCK = os.getenv('MOCK_SWITCH', 'false') == 'true' MOCK = os.getenv('MOCK_SWITCH', 'false') == 'true'
class MockedCodeExecutor: class MockedCodeExecutor:
@classmethod @classmethod
def invoke(cls, language: Literal['python3', 'javascript', 'jinja2'], def invoke(cls, language: Literal['python3', 'javascript', 'jinja2'],
code: str, inputs: dict, dependencies: Optional[list[CodeDependency]] = None) -> dict: code: str, inputs: dict) -> dict:
# invoke directly # invoke directly
match language: match language:
case CodeLanguage.PYTHON3: case CodeLanguage.PYTHON3:
@@ -24,6 +23,8 @@ class MockedCodeExecutor:
return { return {
"result": Template(code).render(inputs) "result": Template(code).render(inputs)
} }
case _:
raise Exception("Language not supported")
@pytest.fixture @pytest.fixture
def setup_code_executor_mock(request, monkeypatch: MonkeyPatch): def setup_code_executor_mock(request, monkeypatch: MonkeyPatch):


@@ -28,14 +28,6 @@ def test_javascript_with_code_template():
inputs={'arg1': 'Hello', 'arg2': 'World'}) inputs={'arg1': 'Hello', 'arg2': 'World'})
assert result == {'result': 'HelloWorld'} assert result == {'result': 'HelloWorld'}
def test_javascript_list_default_available_packages():
packages = JavascriptCodeProvider.get_default_available_packages()
# no default packages available for javascript
assert len(packages) == 0
def test_javascript_get_runner_script(): def test_javascript_get_runner_script():
runner_script = NodeJsTemplateTransformer.get_runner_script() runner_script = NodeJsTemplateTransformer.get_runner_script()
assert runner_script.count(NodeJsTemplateTransformer._code_placeholder) == 1 assert runner_script.count(NodeJsTemplateTransformer._code_placeholder) == 1


@@ -29,15 +29,6 @@ def test_python3_with_code_template():
assert result == {'result': 'HelloWorld'} assert result == {'result': 'HelloWorld'}
def test_python3_list_default_available_packages():
packages = Python3CodeProvider.get_default_available_packages()
assert len(packages) > 0
assert {'requests', 'httpx'}.issubset(p['name'] for p in packages)
# check JSON serializable
assert len(str(json.dumps(packages))) > 0
def test_python3_get_runner_script(): def test_python3_get_runner_script():
runner_script = Python3TemplateTransformer.get_runner_script() runner_script = Python3TemplateTransformer.get_runner_script()
assert runner_script.count(Python3TemplateTransformer._code_placeholder) == 1 assert runner_script.count(Python3TemplateTransformer._code_placeholder) == 1


@@ -16,7 +16,7 @@ Use `docker-compose --profile certbot up` to use this features.
CERTBOT_DOMAIN=your_domain.com CERTBOT_DOMAIN=your_domain.com
CERTBOT_EMAIL=example@your_domain.com CERTBOT_EMAIL=example@your_domain.com
``` ```
excecute command: execute command:
```shell ```shell
sudo docker network prune sudo docker network prune
sudo docker-compose --profile certbot up --force-recreate -d sudo docker-compose --profile certbot up --force-recreate -d
@@ -30,7 +30,7 @@ Use `docker-compose --profile certbot up` to use this features.
```properties ```properties
NGINX_HTTPS_ENABLED=true NGINX_HTTPS_ENABLED=true
``` ```
excecute command: execute command:
```shell ```shell
sudo docker-compose --profile certbot up -d --no-deps --force-recreate nginx sudo docker-compose --profile certbot up -d --no-deps --force-recreate nginx
``` ```


@@ -236,6 +236,12 @@ import { Row, Col, Properties, Property, Heading, SubProperty, Paragraph } from
<Property name='name' type='string' key='name'> <Property name='name' type='string' key='name'>
Knowledge name Knowledge name
</Property> </Property>
<Property name='permission' type='string' key='permission'>
Permission
- <code>only_me</code> Only me
- <code>all_team_members</code> All team members
- <code>partial_members</code> Partial members
</Property>
</Properties> </Properties>
</Col> </Col>
<Col sticky> <Col sticky>
@@ -243,14 +249,15 @@ import { Row, Col, Properties, Property, Heading, SubProperty, Paragraph } from
title="Request" title="Request"
tag="POST" tag="POST"
label="/datasets" label="/datasets"
targetCode={`curl --location --request POST '${props.apiBaseUrl}/datasets' \\\n--header 'Authorization: Bearer {api_key}' \\\n--header 'Content-Type: application/json' \\\n--data-raw '{"name": "name"}'`} targetCode={`curl --location --request POST '${props.apiBaseUrl}/datasets' \\\n--header 'Authorization: Bearer {api_key}' \\\n--header 'Content-Type: application/json' \\\n--data-raw '{"name": "name", "permission": "only_me"}'`}
> >
```bash {{ title: 'cURL' }} ```bash {{ title: 'cURL' }}
curl --location --request POST '${apiBaseUrl}/v1/datasets' \ curl --location --request POST '${apiBaseUrl}/v1/datasets' \
--header 'Authorization: Bearer {api_key}' \ --header 'Authorization: Bearer {api_key}' \
--header 'Content-Type: application/json' \ --header 'Content-Type: application/json' \
--data-raw '{ --data-raw '{
"name": "name" "name": "name",
"permission": "only_me"
}' }'
``` ```
</CodeGroup> </CodeGroup>


@@ -236,6 +236,12 @@ import { Row, Col, Properties, Property, Heading, SubProperty, Paragraph } from
<Property name='name' type='string' key='name'> <Property name='name' type='string' key='name'>
知识库名称 知识库名称
</Property> </Property>
<Property name='permission' type='string' key='permission'>
权限
- <code>only_me</code> 仅自己
- <code>all_team_members</code> 所有团队成员
- <code>partial_members</code> 部分团队成员
</Property>
</Properties> </Properties>
</Col> </Col>
<Col sticky> <Col sticky>
@@ -243,14 +249,15 @@ import { Row, Col, Properties, Property, Heading, SubProperty, Paragraph } from
title="Request" title="Request"
tag="POST" tag="POST"
label="/datasets" label="/datasets"
targetCode={`curl --location --request POST '${props.apiBaseUrl}/datasets' \\\n--header 'Authorization: Bearer {api_key}' \\\n--header 'Content-Type: application/json' \\\n--data-raw '{"name": "name"}'`} targetCode={`curl --location --request POST '${props.apiBaseUrl}/datasets' \\\n--header 'Authorization: Bearer {api_key}' \\\n--header 'Content-Type: application/json' \\\n--data-raw '{"name": "name", "permission": "only_me"}'`}
> >
```bash {{ title: 'cURL' }} ```bash {{ title: 'cURL' }}
curl --location --request POST '${props.apiBaseUrl}/datasets' \ curl --location --request POST '${props.apiBaseUrl}/datasets' \
--header 'Authorization: Bearer {api_key}' \ --header 'Authorization: Bearer {api_key}' \
--header 'Content-Type: application/json' \ --header 'Content-Type: application/json' \
--data-raw '{ --data-raw '{
"name": "name" "name": "name",
"permission": "only_me"
}' }'
``` ```
</CodeGroup> </CodeGroup>


@@ -9,7 +9,7 @@ import { randomString } from '@/utils'
export type IAppBasicProps = { export type IAppBasicProps = {
iconType?: 'app' | 'api' | 'dataset' | 'webapp' | 'notion' iconType?: 'app' | 'api' | 'dataset' | 'webapp' | 'notion'
icon?: string icon?: string
icon_background?: string icon_background?: string | null
name: string name: string
type: string | React.ReactNode type: string | React.ReactNode
hoverTip?: string hoverTip?: string


@@ -24,6 +24,7 @@ import { LeftIndent02 } from '@/app/components/base/icons/src/vender/line/editor
import { FileText } from '@/app/components/base/icons/src/vender/line/files' import { FileText } from '@/app/components/base/icons/src/vender/line/files'
import WorkflowToolConfigureButton from '@/app/components/tools/workflow-tool/configure-button' import WorkflowToolConfigureButton from '@/app/components/tools/workflow-tool/configure-button'
import type { InputVar } from '@/app/components/workflow/types' import type { InputVar } from '@/app/components/workflow/types'
import { appDefaultIconBackground } from '@/config'
export type AppPublisherProps = { export type AppPublisherProps = {
disabled?: boolean disabled?: boolean
@@ -212,8 +213,8 @@ const AppPublisher = ({
detailNeedUpdate={!!toolPublished && published} detailNeedUpdate={!!toolPublished && published}
workflowAppId={appDetail?.id} workflowAppId={appDetail?.id}
icon={{ icon={{
content: appDetail?.icon, content: (appDetail.icon_type === 'image' ? '🤖' : appDetail?.icon) || '🤖',
background: appDetail?.icon_background, background: (appDetail.icon_type === 'image' ? appDefaultIconBackground : appDetail?.icon_background) || appDefaultIconBackground,
}} }}
name={appDetail?.name} name={appDetail?.name}
description={appDetail?.description} description={appDetail?.description}


@@ -43,9 +43,12 @@ const ConfigPanel = () => {
<> <>
<div className='flex items-center h-8 text-2xl font-semibold text-gray-800'> <div className='flex items-center h-8 text-2xl font-semibold text-gray-800'>
<AppIcon <AppIcon
iconType={appData?.site.icon_type}
icon={appData?.site.icon} icon={appData?.site.icon}
background='transparent' background='transparent'
imageUrl={appData?.site.icon_url}
size='small' size='small'
className="mr-2"
/> />
{appData?.site.title} {appData?.site.title}
</div> </div>


@@ -48,9 +48,12 @@ const ConfigPanel = () => {
<> <>
<div className='flex items-center h-8 text-2xl font-semibold text-gray-800'> <div className='flex items-center h-8 text-2xl font-semibold text-gray-800'>
<AppIcon <AppIcon
iconType={appData?.site.icon_type}
icon={appData?.site.icon} icon={appData?.site.icon}
imageUrl={appData?.site.icon_url}
background='transparent' background='transparent'
size='small' size='small'
className="mr-2"
/> />
{appData?.site.title} {appData?.site.title}
</div> </div>


@@ -68,6 +68,7 @@ const Panel = (props: PanelProps) => {
...tagList, ...tagList,
newTag, newTag,
]) ])
setKeywords('')
setCreating(false) setCreating(false)
onCreate() onCreate()
} }
@@ -123,11 +124,8 @@ const Panel = (props: PanelProps) => {
handleValueChange() handleValueChange()
}) })
const onMouseLeave = async () => {
props.onClose?.()
}
return ( return (
<div className='relative w-full bg-white rounded-lg border-[0.5px] border-gray-200' onMouseLeave={onMouseLeave}> <div className='relative w-full bg-white rounded-lg border-[0.5px] border-gray-200'>
<div className='p-2 border-b-[0.5px] border-black/5'> <div className='p-2 border-b-[0.5px] border-black/5'>
<SearchInput placeholder={t('common.tag.selectorPlaceholder') || ''} white value={keywords} onChange={handleKeywordsChange} /> <SearchInput placeholder={t('common.tag.selectorPlaceholder') || ''} white value={keywords} onChange={handleKeywordsChange} />
</div> </div>


@@ -8,7 +8,7 @@ import { useDebounceFn } from 'ahooks'
import { useContext } from 'use-context-selector' import { useContext } from 'use-context-selector'
import { useTranslation } from 'react-i18next' import { useTranslation } from 'react-i18next'
import { useStore as useTagStore } from './store' import { useStore as useTagStore } from './store'
import TagRemoveModal from './tag-remove-modal' import Confirm from '@/app/components/base/confirm'
import cn from '@/utils/classnames' import cn from '@/utils/classnames'
import type { Tag } from '@/app/components/base/tag-management/constant' import type { Tag } from '@/app/components/base/tag-management/constant'
import { ToastContext } from '@/app/components/base/toast' import { ToastContext } from '@/app/components/base/toast'
@@ -134,14 +134,15 @@ const TagItemEditor: FC<TagItemEditorProps> = ({
/> />
)} )}
</div> </div>
<TagRemoveModal <Confirm
tag={tag} title={`${t('common.tag.delete')} "${tag.name}"`}
show={showRemoveModal} isShow={showRemoveModal}
content={t('common.tag.deleteTip')}
onConfirm={() => { onConfirm={() => {
handleRemove() handleRemove()
setShowRemoveModal(false) setShowRemoveModal(false)
}} }}
onClose={() => setShowRemoveModal(false)} onCancel={() => setShowRemoveModal(false)}
/> />
</> </>
) )


@@ -87,8 +87,10 @@ const AppNav = () => {
})(isCurrentWorkspaceEditor, app) })(isCurrentWorkspaceEditor, app)
return { return {
id: app.id, id: app.id,
icon_type: app.icon_type,
icon: app.icon, icon: app.icon,
icon_background: app.icon_background, icon_background: app.icon_background,
icon_url: app.icon_url,
name: app.name, name: app.name,
mode: app.mode, mode: app.mode,
link, link,


@@ -16,13 +16,16 @@ import { Route } from '@/app/components/base/icons/src/vender/solid/mapsAndTrave
import { useAppContext } from '@/context/app-context' import { useAppContext } from '@/context/app-context'
import { useStore as useAppStore } from '@/app/components/app/store' import { useStore as useAppStore } from '@/app/components/app/store'
import { FileArrow01, FilePlus01, FilePlus02 } from '@/app/components/base/icons/src/vender/line/files' import { FileArrow01, FilePlus01, FilePlus02 } from '@/app/components/base/icons/src/vender/line/files'
import type { AppIconType } from '@/types/app'
export type NavItem = { export type NavItem = {
id: string id: string
name: string name: string
link: string link: string
icon_type: AppIconType | null
icon: string icon: string
icon_background: string icon_background: string
icon_url: string | null
mode?: string mode?: string
} }
export type INavSelectorProps = { export type INavSelectorProps = {
@@ -82,7 +85,7 @@ const NavSelector = ({ curNav, navs, createText, isApp, onCreate, onLoadmore }:
router.push(nav.link) router.push(nav.link)
}} title={nav.name}> }} title={nav.name}>
<div className='relative w-6 h-6 mr-2 rounded-md'> <div className='relative w-6 h-6 mr-2 rounded-md'>
<AppIcon size='tiny' icon={nav.icon} background={nav.icon_background} /> <AppIcon size='tiny' iconType={nav.icon_type} icon={nav.icon} background={nav.icon_background} imageUrl={nav.icon_url}/>
{!!nav.mode && ( {!!nav.mode && (
<span className={cn( <span className={cn(
'absolute w-3.5 h-3.5 -bottom-0.5 -right-0.5 p-0.5 bg-white rounded border-[0.5px] border-[rgba(0,0,0,0.02)] shadow-sm', 'absolute w-3.5 h-3.5 -bottom-0.5 -right-0.5 p-0.5 bg-white rounded border-[0.5px] border-[rgba(0,0,0,0.02)] shadow-sm',


@@ -133,7 +133,7 @@ const WorkflowToolAsModal: FC<Props> = ({
<div> <div>
<div className='py-2 leading-5 text-sm font-medium text-gray-900'>{t('tools.createTool.name')} <span className='ml-1 text-red-500'>*</span></div> <div className='py-2 leading-5 text-sm font-medium text-gray-900'>{t('tools.createTool.name')} <span className='ml-1 text-red-500'>*</span></div>
<div className='flex items-center justify-between gap-3'> <div className='flex items-center justify-between gap-3'>
<AppIcon size='large' onClick={() => { setShowEmojiPicker(true) }} className='cursor-pointer' icon={emoji.content} background={emoji.background} /> <AppIcon size='large' onClick={() => { setShowEmojiPicker(true) }} className='cursor-pointer' iconType='emoji' icon={emoji.content} background={emoji.background} />
<input <input
type='text' type='text'
className='grow h-10 px-3 text-sm font-normal bg-gray-100 rounded-lg border border-transparent outline-none appearance-none caret-primary-600 placeholder:text-gray-400 hover:bg-gray-50 hover:border hover:border-gray-300 focus:bg-gray-50 focus:border focus:border-gray-300 focus:shadow-xs' className='grow h-10 px-3 text-sm font-normal bg-gray-100 rounded-lg border border-transparent outline-none appearance-none caret-primary-600 placeholder:text-gray-400 hover:bg-gray-50 hover:border hover:border-gray-300 focus:bg-gray-50 focus:border focus:border-gray-300 focus:shadow-xs'


@@ -1,97 +0,0 @@
import type { FC } from 'react'
import React, { useCallback, useState } from 'react'
import { t } from 'i18next'
import {
RiArrowDownSLine,
RiSearchLine,
} from '@remixicon/react'
import type { CodeDependency } from './types'
import { PortalToFollowElem, PortalToFollowElemContent, PortalToFollowElemTrigger } from '@/app/components/base/portal-to-follow-elem'
import { Check } from '@/app/components/base/icons/src/vender/line/general'
import { XCircle } from '@/app/components/base/icons/src/vender/solid/general'
type Props = {
value: CodeDependency
available_dependencies: CodeDependency[]
onChange: (dependency: CodeDependency) => void
}
const DependencyPicker: FC<Props> = ({
available_dependencies,
value,
onChange,
}) => {
const [open, setOpen] = useState(false)
const [searchText, setSearchText] = useState('')
const handleChange = useCallback((dependency: CodeDependency) => {
return () => {
setOpen(false)
onChange(dependency)
}
}, [onChange])
return (
<PortalToFollowElem
open={open}
onOpenChange={setOpen}
placement='bottom-start'
offset={4}
>
<PortalToFollowElemTrigger onClick={() => setOpen(!open)} className='flex-grow cursor-pointer'>
<div className='flex items-center h-8 justify-between px-2.5 rounded-lg border-0 bg-gray-100 text-gray-900 text-[13px]'>
<div className='grow w-0 truncate' title={value.name}>{value.name}</div>
<RiArrowDownSLine className='shrink-0 w-3.5 h-3.5 text-gray-700' />
</div>
</PortalToFollowElemTrigger>
<PortalToFollowElemContent style={{
zIndex: 100,
}}>
<div className='p-1 bg-white rounded-lg shadow-sm' style={{
width: 350,
}}>
<div
className='shadow-sm bg-white mb-2 mx-1 flex items-center px-2 rounded-lg bg-gray-100'
>
<RiSearchLine className='shrink-0 ml-[1px] mr-[5px] w-3.5 h-3.5 text-gray-400' />
<input
value={searchText}
className='grow px-0.5 py-[7px] text-[13px] text-gray-700 bg-transparent appearance-none outline-none caret-primary-600 placeholder:text-gray-400'
placeholder={t('workflow.nodes.code.searchDependencies') || ''}
onChange={e => setSearchText(e.target.value)}
autoFocus
/>
{
searchText && (
<div
className='flex items-center justify-center ml-[5px] w-[18px] h-[18px] cursor-pointer'
onClick={() => setSearchText('')}
>
<XCircle className='w-[14px] h-[14px] text-gray-400' />
</div>
)
}
</div>
<div className='max-h-[30vh] overflow-y-auto'>
{available_dependencies.filter((v) => {
if (!searchText)
return true
return v.name.toLowerCase().includes(searchText.toLowerCase())
}).map(dependency => (
<div
key={dependency.name}
className='flex items-center h-[30px] justify-between pl-3 pr-2 rounded-lg hover:bg-gray-100 text-gray-900 text-[13px] cursor-pointer'
onClick={handleChange(dependency)}
>
<div className='w-0 grow truncate'>{dependency.name}</div>
{dependency.name === value.name && <Check className='shrink-0 w-4 h-4 text-primary-600' />}
</div>
))}
</div>
</div>
</PortalToFollowElemContent>
</PortalToFollowElem>
)
}
export default React.memo(DependencyPicker)


@@ -1,36 +0,0 @@
import type { FC } from 'react'
import React from 'react'
import RemoveButton from '../_base/components/remove-button'
import type { CodeDependency } from './types'
import DependencyPicker from './dependency-picker'
type Props = {
available_dependencies: CodeDependency[]
dependencies: CodeDependency[]
handleRemove: (index: number) => void
handleChange: (index: number, dependency: CodeDependency) => void
}
const Dependencies: FC<Props> = ({
available_dependencies, dependencies, handleRemove, handleChange,
}) => {
return (
<div className='space-y-2'>
{dependencies.map((dependency, index) => (
<div className='flex items-center space-x-1' key={index}>
<DependencyPicker
value={dependency}
available_dependencies={available_dependencies}
onChange={dependency => handleChange(index, dependency)}
/>
<RemoveButton
className='!p-2 !bg-gray-100 hover:!bg-gray-200'
onClick={() => handleRemove(index)}
/>
</div>
))}
</div>
)
}
export default React.memo(Dependencies)


@@ -5,7 +5,6 @@ import RemoveEffectVarConfirm from '../_base/components/remove-effect-var-confir
import useConfig from './use-config' import useConfig from './use-config'
import type { CodeNodeType } from './types' import type { CodeNodeType } from './types'
import { CodeLanguage } from './types' import { CodeLanguage } from './types'
import Dependencies from './dependency'
import VarList from '@/app/components/workflow/nodes/_base/components/variable/var-list' import VarList from '@/app/components/workflow/nodes/_base/components/variable/var-list'
import OutputVarList from '@/app/components/workflow/nodes/_base/components/variable/output-var-list' import OutputVarList from '@/app/components/workflow/nodes/_base/components/variable/output-var-list'
import AddButton from '@/app/components/base/button/add-button' import AddButton from '@/app/components/base/button/add-button'
@@ -60,11 +59,6 @@ const Panel: FC<NodePanelProps<CodeNodeType>> = ({
varInputs, varInputs,
inputVarValues, inputVarValues,
setInputVarValues, setInputVarValues,
allowDependencies,
availableDependencies,
handleAddDependency,
handleRemoveDependency,
handleChangeDependency,
} = useConfig(id, data) } = useConfig(id, data)
return ( return (
@@ -84,31 +78,6 @@ const Panel: FC<NodePanelProps<CodeNodeType>> = ({
filterVar={filterVar} filterVar={filterVar}
/> />
</Field> </Field>
{
allowDependencies
? (
<div>
<Split />
<div className='pt-4'>
<Field
title={t(`${i18nPrefix}.advancedDependencies`)}
operations={
<AddButton onClick={() => handleAddDependency({ name: '', version: '' })} />
}
tooltip={t(`${i18nPrefix}.advancedDependenciesTip`)!}
>
<Dependencies
available_dependencies={availableDependencies}
dependencies={inputs.dependencies || []}
handleRemove={index => handleRemoveDependency(index)}
handleChange={(index, dependency) => handleChangeDependency(index, dependency)}
/>
</Field>
</div>
</div>
)
: null
}
<Split /> <Split />
<CodeEditor <CodeEditor
isInNode isInNode


@@ -16,10 +16,4 @@ export type CodeNodeType = CommonNodeType & {
code_language: CodeLanguage code_language: CodeLanguage
code: string code: string
outputs: OutputVar outputs: OutputVar
dependencies?: CodeDependency[]
}
export type CodeDependency = {
name: string
version: string
} }


@@ -5,7 +5,7 @@ import useOutputVarList from '../_base/hooks/use-output-var-list'
import { BlockEnum, VarType } from '../../types' import { BlockEnum, VarType } from '../../types'
import type { Var } from '../../types' import type { Var } from '../../types'
import { useStore } from '../../store' import { useStore } from '../../store'
import type { CodeDependency, CodeNodeType, OutputVar } from './types' import type { CodeNodeType, OutputVar } from './types'
import { CodeLanguage } from './types' import { CodeLanguage } from './types'
import useNodeCrud from '@/app/components/workflow/nodes/_base/hooks/use-node-crud' import useNodeCrud from '@/app/components/workflow/nodes/_base/hooks/use-node-crud'
import useOneStepRun from '@/app/components/workflow/nodes/_base/hooks/use-one-step-run' import useOneStepRun from '@/app/components/workflow/nodes/_base/hooks/use-one-step-run'
@@ -21,19 +21,15 @@ const useConfig = (id: string, payload: CodeNodeType) => {
const appId = useAppStore.getState().appDetail?.id const appId = useAppStore.getState().appDetail?.id
const [allLanguageDefault, setAllLanguageDefault] = useState<Record<CodeLanguage, CodeNodeType> | null>(null) const [allLanguageDefault, setAllLanguageDefault] = useState<Record<CodeLanguage, CodeNodeType> | null>(null)
const [allLanguageDependencies, setAllLanguageDependencies] = useState<Record<CodeLanguage, CodeDependency[]> | null>(null)
useEffect(() => { useEffect(() => {
if (appId) { if (appId) {
(async () => { (async () => {
const { config: javaScriptConfig } = await fetchNodeDefault(appId, BlockEnum.Code, { code_language: CodeLanguage.javascript }) as any const { config: javaScriptConfig } = await fetchNodeDefault(appId, BlockEnum.Code, { code_language: CodeLanguage.javascript }) as any
const { config: pythonConfig, available_dependencies: pythonDependencies } = await fetchNodeDefault(appId, BlockEnum.Code, { code_language: CodeLanguage.python3 }) as any const { config: pythonConfig } = await fetchNodeDefault(appId, BlockEnum.Code, { code_language: CodeLanguage.python3 }) as any
setAllLanguageDefault({ setAllLanguageDefault({
[CodeLanguage.javascript]: javaScriptConfig as CodeNodeType, [CodeLanguage.javascript]: javaScriptConfig as CodeNodeType,
[CodeLanguage.python3]: pythonConfig as CodeNodeType, [CodeLanguage.python3]: pythonConfig as CodeNodeType,
} as any) } as any)
setAllLanguageDependencies({
[CodeLanguage.python3]: pythonDependencies as CodeDependency[],
} as any)
})() })()
} }
}, [appId]) }, [appId])
@@ -45,62 +41,6 @@ const useConfig = (id: string, payload: CodeNodeType) => {
setInputs, setInputs,
}) })
const handleAddDependency = useCallback((dependency: CodeDependency) => {
const newInputs = produce(inputs, (draft) => {
if (!draft.dependencies)
draft.dependencies = []
draft.dependencies.push(dependency)
})
setInputs(newInputs)
}, [inputs, setInputs])
const handleRemoveDependency = useCallback((index: number) => {
const newInputs = produce(inputs, (draft) => {
if (!draft.dependencies)
draft.dependencies = []
draft.dependencies.splice(index, 1)
})
setInputs(newInputs)
}, [inputs, setInputs])
const handleChangeDependency = useCallback((index: number, dependency: CodeDependency) => {
const newInputs = produce(inputs, (draft) => {
if (!draft.dependencies)
draft.dependencies = []
draft.dependencies[index] = dependency
})
setInputs(newInputs)
}, [inputs, setInputs])
const [allowDependencies, setAllowDependencies] = useState<boolean>(false)
useEffect(() => {
if (!inputs.code_language)
return
if (!allLanguageDependencies)
return
const newAllowDependencies = !!allLanguageDependencies[inputs.code_language]
setAllowDependencies(newAllowDependencies)
}, [allLanguageDependencies, inputs.code_language])
const [availableDependencies, setAvailableDependencies] = useState<CodeDependency[]>([])
useEffect(() => {
if (!inputs.code_language)
return
if (!allLanguageDependencies)
return
const newAvailableDependencies = produce(allLanguageDependencies[inputs.code_language], (draft) => {
const currentLanguage = inputs.code_language
if (!currentLanguage || !draft || !inputs.dependencies)
return []
return draft.filter((dependency) => {
return !inputs.dependencies?.find(d => d.name === dependency.name)
})
})
setAvailableDependencies(newAvailableDependencies || [])
}, [allLanguageDependencies, inputs.code_language, inputs.dependencies])
const [outputKeyOrders, setOutputKeyOrders] = useState<string[]>([]) const [outputKeyOrders, setOutputKeyOrders] = useState<string[]>([])
const syncOutputKeyOrders = useCallback((outputs: OutputVar) => { const syncOutputKeyOrders = useCallback((outputs: OutputVar) => {
setOutputKeyOrders(Object.keys(outputs)) setOutputKeyOrders(Object.keys(outputs))
@@ -223,11 +163,6 @@ const useConfig = (id: string, payload: CodeNodeType) => {
inputVarValues, inputVarValues,
setInputVarValues, setInputVarValues,
runResult, runResult,
availableDependencies,
allowDependencies,
handleAddDependency,
handleRemoveDependency,
handleChangeDependency,
} }
} }


@@ -14,10 +14,10 @@ export type SiteInfo = {
title: string title: string
chat_color_theme?: string chat_color_theme?: string
chat_color_theme_inverted?: boolean chat_color_theme_inverted?: boolean
icon_type?: AppIconType icon_type?: AppIconType | null
icon?: string icon?: string
icon_background?: string icon_background?: string | null
icon_url?: string icon_url?: string | null
description?: string description?: string
default_language?: Locale default_language?: Locale
prompt_public?: boolean prompt_public?: boolean