Merge branch 'refs/heads/main' into feat/workflow-parallel-support

Commit fed068ac2e by takatost, 2024-07-07 16:57:21 +08:00
688 changed files with 21,760 additions and 9,718 deletions

@@ -3,7 +3,7 @@
 cd web && npm install
 echo 'alias start-api="cd /workspaces/dify/api && flask run --host 0.0.0.0 --port=5001 --debug"' >> ~/.bashrc
-echo 'alias start-worker="cd /workspaces/dify/api && celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail"' >> ~/.bashrc
+echo 'alias start-worker="cd /workspaces/dify/api && celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail,ops_trace,app_deletion"' >> ~/.bashrc
 echo 'alias start-web="cd /workspaces/dify/web && npm run dev"' >> ~/.bashrc
 echo 'alias start-containers="cd /workspaces/dify/docker && docker-compose -f docker-compose.middleware.yaml -p dify up -d"' >> ~/.bashrc

.gitattributes (vendored, new file)

@@ -0,0 +1,7 @@
+# Ensure that .sh scripts use LF as line separator, even if they are checked out
+# to Windows(NTFS) file-system, by a user of Docker for Window.
+# These .sh scripts will be run from the Container after `docker compose up -d`.
+# If they appear to be CRLF style, Dash from the Container will fail to execute
+# them.
+*.sh text eol=lf


@@ -14,6 +14,8 @@ body:
     required: true
   - label: I confirm that I am using English to submit this report (我已阅读并同意 [Language Policy](https://github.com/langgenius/dify/issues/1542)).
     required: true
+  - label: "请务必使用英文提交 Issue，否则会被关闭。谢谢:)"
+    required: true
   - label: "Please do not modify this template :) and fill in all the required fields."
     required: true


@@ -12,6 +12,8 @@ body:
     required: true
   - label: I confirm that I am using English to submit report (我已阅读并同意 [Language Policy](https://github.com/langgenius/dify/issues/1542)).
     required: true
+  - label: "请务必使用英文提交 Issue，否则会被关闭。谢谢:)"
+    required: true
   - label: "Please do not modify this template :) and fill in all the required fields."
     required: true
 - type: textarea


@@ -12,6 +12,8 @@ body:
     required: true
   - label: I confirm that I am using English to submit this report (我已阅读并同意 [Language Policy](https://github.com/langgenius/dify/issues/1542)).
     required: true
+  - label: "请务必使用英文提交 Issue，否则会被关闭。谢谢:)"
+    required: true
   - label: "Please do not modify this template :) and fill in all the required fields."
     required: true
 - type: textarea


@@ -12,6 +12,8 @@ body:
     required: true
   - label: I confirm that I am using English to submit this report (我已阅读并同意 [Language Policy](https://github.com/langgenius/dify/issues/1542)).
     required: true
+  - label: "请务必使用英文提交 Issue，否则会被关闭。谢谢:)"
+    required: true
   - label: "Please do not modify this template :) and fill in all the required fields."
     required: true
 - type: input


@@ -1,13 +1,21 @@
+# Checklist:
+
+> [!IMPORTANT]
+> Please review the checklist below before submitting your pull request.
+
+- [ ] Please open an issue before creating a PR or link to an existing issue
+- [ ] I have performed a self-review of my own code
+- [ ] I have commented my code, particularly in hard-to-understand areas
+- [ ] I ran `dev/reformat`(backend) and `cd web && npx lint-staged`(frontend) to appease the lint gods
+
 # Description
-Please include a summary of the change and which issue is fixed. Please also include relevant motivation and context. List any dependencies that are required for this change.
+Describe the big picture of your changes here to communicate to the maintainers why we should accept this pull request. If it fixes a bug or resolves a feature request, be sure to link to that issue. Close issue syntax: `Fixes #<issue number>`, see [documentation](https://docs.github.com/en/issues/tracking-your-work-with-issues/linking-a-pull-request-to-an-issue#linking-a-pull-request-to-an-issue-using-a-keyword) for more details.
-Fixes # (issue)
+Fixes
 ## Type of Change
-Please delete options that are not relevant.
 - [ ] Bug fix (non-breaking change which fixes an issue)
 - [ ] New feature (non-breaking change which adds functionality)
 - [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected)
@@ -15,18 +23,12 @@ Please delete options that are not relevant.
 - [ ] Improvement, including but not limited to code refactoring, performance optimization, and UI/UX improvement
 - [ ] Dependency upgrade
-# How Has This Been Tested?
+# Testing Instructions
 Please describe the tests that you ran to verify your changes. Provide instructions so we can reproduce. Please also list any relevant details for your test configuration
-- [ ] TODO
+- [ ] Test A
+- [ ] Test B
-# Suggested Checklist:
-- [ ] I have performed a self-review of my own code
-- [ ] I have commented my code, particularly in hard-to-understand areas
-- [ ] My changes generate no new warnings
-- [ ] I ran `dev/reformat`(backend) and `cd web && npx lint-staged`(frontend) to appease the lint gods
-- [ ] `optional` I have made corresponding changes to the documentation
-- [ ] `optional` I have added tests that prove my fix is effective or that my feature works
-- [ ] `optional` New and existing unit tests pass locally with my changes


@@ -55,6 +55,14 @@ jobs:
       - name: Run Tool
        run: poetry run -C api bash dev/pytest/pytest_tools.sh
+      - name: Set up dotenvs
+        run: |
+          cp docker/.env.example docker/.env
+          cp docker/middleware.env.example docker/middleware.env
+
+      - name: Expose Service Ports
+        run: sh .github/workflows/expose_service_ports.sh
+
       - name: Set up Sandbox
         uses: hoverkraft-tech/compose-action@v2.0.0
         with:
@@ -71,13 +79,7 @@ jobs:
         uses: hoverkraft-tech/compose-action@v2.0.0
         with:
           compose-file: |
-            docker/docker-compose.middleware.yaml
-            docker/docker-compose.qdrant.yaml
-            docker/docker-compose.milvus.yaml
-            docker/docker-compose.pgvecto-rs.yaml
-            docker/docker-compose.pgvector.yaml
-            docker/docker-compose.chroma.yaml
-            docker/docker-compose.oracle.yaml
+            docker/docker-compose.yaml
           services: |
             weaviate
             qdrant
@@ -87,7 +89,5 @@ jobs:
             pgvecto-rs
             pgvector
             chroma
-            oracle
       - name: Test Vector Stores
         run: poetry run -C api bash dev/pytest/pytest_vdb.sh


@@ -38,6 +38,11 @@ jobs:
       - name: Install dependencies
         run: poetry install -C api
+      - name: Prepare middleware env
+        run: |
+          cd docker
+          cp middleware.env.example middleware.env
+
       - name: Set up Middlewares
         uses: hoverkraft-tech/compose-action@v2.0.0
         with:

.github/workflows/expose_service_ports.sh (vendored, new executable file)

@@ -0,0 +1,10 @@
+#!/bin/bash
+
+yq eval '.services.weaviate.ports += ["8080:8080"]' -i docker/docker-compose.yaml
+yq eval '.services.qdrant.ports += ["6333:6333"]' -i docker/docker-compose.yaml
+yq eval '.services.chroma.ports += ["8000:8000"]' -i docker/docker-compose.yaml
+yq eval '.services["milvus-standalone"].ports += ["19530:19530"]' -i docker/docker-compose.yaml
+yq eval '.services.pgvector.ports += ["5433:5432"]' -i docker/docker-compose.yaml
+yq eval '.services["pgvecto-rs"].ports += ["5431:5432"]' -i docker/docker-compose.yaml
+
+echo "Ports exposed for sandbox, weaviate, qdrant, chroma, milvus, pgvector, pgvecto-rs."

.gitignore (vendored)

@@ -139,10 +139,21 @@ web/.vscode/settings.json
 !.idea/icon.png
 .ideaDataSources/
 *.iml
+api/.idea
 api/.env
 api/storage/*
+docker-legacy/volumes/app/storage/*
+docker-legacy/volumes/db/data/*
+docker-legacy/volumes/redis/data/*
+docker-legacy/volumes/weaviate/*
+docker-legacy/volumes/qdrant/*
+docker-legacy/volumes/etcd/*
+docker-legacy/volumes/minio/*
+docker-legacy/volumes/milvus/*
+docker-legacy/volumes/chroma/*
 docker/volumes/app/storage/*
 docker/volumes/db/data/*
 docker/volumes/redis/data/*
@@ -153,6 +164,9 @@ docker/volumes/minio/*
 docker/volumes/milvus/*
 docker/volumes/chroma/*
+docker/nginx/conf.d/default.conf
+docker/middleware.env
 sdks/python-client/build
 sdks/python-client/dist
 sdks/python-client/dify_client.egg-info
@@ -160,3 +174,5 @@ sdks/python-client/dify_client.egg-info
 .vscode/*
 !.vscode/launch.json
 pyrightconfig.json
+.idea/


@@ -1,30 +1,16 @@
 {
-    // Use IntelliSense to learn about possible attributes.
-    // Hover to view descriptions of existing attributes.
-    // For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
     "version": "0.2.0",
     "configurations": [
-        {
-            "name": "Python: Celery",
-            "type": "debugpy",
-            "request": "launch",
-            "module": "celery",
-            "justMyCode": true,
-            "args": ["-A", "app.celery", "worker", "-P", "gevent", "-c", "1", "--loglevel", "info", "-Q", "dataset,generation,mail"],
-            "envFile": "${workspaceFolder}/.env",
-            "env": {
-                "FLASK_APP": "app.py",
-                "FLASK_DEBUG": "1",
-                "GEVENT_SUPPORT": "True"
-            },
-            "console": "integratedTerminal",
-            "python": "${command:python.interpreterPath}"
-        },
         {
             "name": "Python: Flask",
             "type": "debugpy",
             "request": "launch",
+            "python": "${workspaceFolder}/api/.venv/bin/python",
+            "cwd": "${workspaceFolder}/api",
+            "envFile": ".env",
             "module": "flask",
-            "justMyCode": true,
+            "jinja": true,
             "env": {
                 "FLASK_APP": "app.py",
                 "FLASK_DEBUG": "1",
@@ -34,11 +20,36 @@
                 "run",
                 "--host=0.0.0.0",
                 "--port=5001",
-                "--debug"
-            ],
-            "jinja": true,
+            ]
+        },
+        {
+            "name": "Python: Celery",
+            "type": "debugpy",
+            "request": "launch",
+            "python": "${workspaceFolder}/api/.venv/bin/python",
+            "cwd": "${workspaceFolder}/api",
+            "module": "celery",
             "justMyCode": true,
-            "python": "${command:python.interpreterPath}"
-        }
+            "envFile": ".env",
+            "console": "integratedTerminal",
+            "env": {
+                "FLASK_APP": "app.py",
+                "FLASK_DEBUG": "1",
+                "GEVENT_SUPPORT": "True"
+            },
+            "args": [
+                "-A",
+                "app.celery",
+                "worker",
+                "-P",
+                "gevent",
+                "-c",
+                "1",
+                "--loglevel",
+                "info",
+                "-Q",
+                "dataset,generation,mail,ops_trace,app_deletion"
+            ]
+        },
     ]
 }


@@ -174,6 +174,7 @@ The easiest way to start the Dify server is to run our [docker-compose.yml](dock
 ```bash
 cd docker
+cp .env.example .env
 docker compose up -d
 ```
@@ -183,7 +184,7 @@ After running, you can access the Dify dashboard in your browser at [http://loca
 ## Next steps
-If you need to customize the configuration, please refer to the comments in our [docker-compose.yml](docker/docker-compose.yaml) file and manually set the environment configuration. After making the changes, please run `docker-compose up -d` again. You can see the full list of environment variables [here](https://docs.dify.ai/getting-started/install-self-hosted/environments).
+If you need to customize the configuration, please refer to the comments in our [.env.example](docker/.env.example) file and update the corresponding values in your `.env` file. Additionally, you might need to make adjustments to the `docker-compose.yaml` file itself, such as changing image versions, port mappings, or volume mounts, based on your specific deployment environment and requirements. After making any changes, please re-run `docker-compose up -d`. You can find the full list of available environment variables [here](https://docs.dify.ai/getting-started/install-self-hosted/environments).
 If you'd like to configure a highly-available setup, there are community-contributed [Helm Charts](https://helm.sh/) and YAML files which allow Dify to be deployed on Kubernetes.
@@ -191,6 +192,11 @@ If you'd like to configure a highly-available setup, there are community-contrib
 - [Helm Chart by @BorisPolonsky](https://github.com/BorisPolonsky/dify-helm)
 - [YAML file by @Winson-030](https://github.com/Winson-030/dify-kubernetes)
+#### Using Terraform for Deployment
+
+##### Azure Global
+Deploy Dify to Azure with a single click using [terraform](https://www.terraform.io/).
+- [Azure Terraform by @nikawang](https://github.com/nikawang/dify-azure-terraform)
 ## Contributing


@@ -157,15 +157,17 @@
 ```bash
 cd docker
+cp .env.example .env
 docker compose up -d
 ```
 بعد التشغيل، يمكنك الوصول إلى لوحة تحكم Dify في متصفحك على [http://localhost/install](http://localhost/install) وبدء عملية التهيئة.
 > إذا كنت ترغب في المساهمة في Dify أو القيام بتطوير إضافي، فانظر إلى [دليلنا للنشر من الشفرة (code) المصدرية](https://docs.dify.ai/getting-started/install-self-hosted/local-source-code)
 ## الخطوات التالية
-إذا كنت بحاجة إلى تخصيص التكوين، يرجى الرجوع إلى التعليقات في ملف [docker-compose.yml](docker/docker-compose.yaml) لدينا وتعيين التكوينات البيئية يدويًا. بعد إجراء التغييرات، يرجى تشغيل `docker-compose up -d` مرة أخرى. يمكنك رؤية قائمة كاملة بالمتغيرات البيئية [هنا](https://docs.dify.ai/getting-started/install-self-hosted/environments).
+إذا كنت بحاجة إلى تخصيص الإعدادات، فيرجى الرجوع إلى التعليقات في ملف [.env.example](docker/.env.example) وتحديث القيم المقابلة في ملف `.env`. بالإضافة إلى ذلك، قد تحتاج إلى إجراء تعديلات على ملف `docker-compose.yaml` نفسه، مثل تغيير إصدارات الصور أو تعيينات المنافذ أو نقاط تحميل وحدات التخزين، بناءً على بيئة النشر ومتطلباتك الخاصة. بعد إجراء أي تغييرات، يرجى إعادة تشغيل `docker-compose up -d`. يمكنك العثور على قائمة كاملة بمتغيرات البيئة المتاحة [هنا](https://docs.dify.ai/getting-started/install-self-hosted/environments).
 يوجد مجتمع خاص بـ [Helm Charts](https://helm.sh/) وملفات YAML التي تسمح بتنفيذ Dify على Kubernetes للنظام من الإيجابيات العلوية.
@@ -173,6 +175,12 @@ docker compose up -d
 - [رسم بياني Helm من قبل @BorisPolonsky](https://github.com/BorisPolonsky/dify-helm)
 - [ملف YAML من قبل @Winson-030](https://github.com/Winson-030/dify-kubernetes)
+#### استخدام Terraform للتوزيع
+
+##### Azure Global
+استخدم [terraform](https://www.terraform.io/) لنشر Dify على Azure بنقرة واحدة.
+- [Azure Terraform بواسطة @nikawang](https://github.com/nikawang/dify-azure-terraform)
 ## المساهمة


@@ -179,11 +179,16 @@ Dify 是一个开源的 LLM 应用开发平台。其直观的界面结合了 AI
 ```bash
 cd docker
+cp .env.example .env
 docker compose up -d
 ```
 运行后，可以在浏览器上访问 [http://localhost/install](http://localhost/install) 进入 Dify 控制台并开始初始化安装操作。
+### 自定义配置
+如果您需要自定义配置，请参考 [.env.example](docker/.env.example) 文件中的注释，并更新 `.env` 文件中对应的值。此外，您可能需要根据您的具体部署环境和需求对 `docker-compose.yaml` 文件本身进行调整，例如更改镜像版本、端口映射或卷挂载。完成任何更改后，请重新运行 `docker-compose up -d`。您可以在[此处](https://docs.dify.ai/getting-started/install-self-hosted/environments)找到可用环境变量的完整列表。
+
 #### 使用 Helm Chart 部署
 使用 [Helm Chart](https://helm.sh/) 版本或者 YAML 文件，可以在 Kubernetes 上部署 Dify。
@@ -192,9 +197,11 @@ docker compose up -d
 - [Helm Chart by @BorisPolonsky](https://github.com/BorisPolonsky/dify-helm)
 - [YAML 文件 by @Winson-030](https://github.com/Winson-030/dify-kubernetes)
-### 配置
-如果您需要自定义配置，请参考我们的 [docker-compose.yml](docker/docker-compose.yaml) 文件中的注释，并手动设置环境配置。更改后，请再次运行 `docker-compose up -d`。您可以在我们的[文档](https://docs.dify.ai/getting-started/install-self-hosted/environments)中查看所有环境变量的完整列表。
+#### 使用 Terraform 部署
+##### Azure Global
+使用 [terraform](https://www.terraform.io/) 一键部署 Dify 到 Azure。
+- [Azure Terraform by @nikawang](https://github.com/nikawang/dify-azure-terraform)
 ## Star History


@@ -179,6 +179,7 @@ La forma más fácil de iniciar el servidor de Dify es ejecutar nuestro archivo
 ```bash
 cd docker
+cp .env.example .env
 docker compose up -d
 ```
@@ -188,7 +189,7 @@ Después de ejecutarlo, puedes acceder al panel de control de Dify en tu navegad
 ## Próximos pasos
-Si necesitas personalizar la configuración, consulta los comentarios en nuestro archivo [docker-compose.yml](docker/docker-compose.yaml) y configura manualmente la configuración del entorno
+Si necesita personalizar la configuración, consulte los comentarios en nuestro archivo [.env.example](docker/.env.example) y actualice los valores correspondientes en su archivo `.env`. Además, es posible que deba realizar ajustes en el propio archivo `docker-compose.yaml`, como cambiar las versiones de las imágenes, las asignaciones de puertos o los montajes de volúmenes, según su entorno de implementación y requisitos específicos. Después de realizar cualquier cambio, vuelva a ejecutar `docker-compose up -d`. Puede encontrar la lista completa de variables de entorno disponibles [aquí](https://docs.dify.ai/getting-started/install-self-hosted/environments).
 . Después de realizar los cambios, ejecuta `docker-compose up -d` nuevamente. Puedes ver la lista completa de variables de entorno [aquí](https://docs.dify.ai/getting-started/install-self-hosted/environments).
@@ -198,6 +199,12 @@ Si desea configurar una configuración de alta disponibilidad, la comunidad prop
 - [Gráfico Helm por @BorisPolonsky](https://github.com/BorisPolonsky/dify-helm)
 - [Ficheros YAML por @Winson-030](https://github.com/Winson-030/dify-kubernetes)
+#### Uso de Terraform para el despliegue
+
+##### Azure Global
+Utiliza [terraform](https://www.terraform.io/) para desplegar Dify en Azure con un solo clic.
+- [Azure Terraform por @nikawang](https://github.com/nikawang/dify-azure-terraform)
 ## Contribuir


@@ -179,6 +179,7 @@ La manière la plus simple de démarrer le serveur Dify est d'exécuter notre fi
 ```bash
 cd docker
+cp .env.example .env
 docker compose up -d
 ```
@@ -188,9 +189,7 @@ Après l'exécution, vous pouvez accéder au tableau de bord Dify dans votre nav
 ## Prochaines étapes
-Si vous devez personnaliser la configuration, veuillez
-vous référer aux commentaires dans notre fichier [docker-compose.yml](docker/docker-compose.yaml) et définir manuellement la configuration de l'environnement. Après avoir apporté les modifications, veuillez exécuter à nouveau `docker-compose up -d`. Vous pouvez voir la liste complète des variables d'environnement [ici](https://docs.dify.ai/getting-started/install-self-hosted/environments).
+Si vous devez personnaliser la configuration, veuillez vous référer aux commentaires dans notre fichier [.env.example](docker/.env.example) et mettre à jour les valeurs correspondantes dans votre fichier `.env`. De plus, vous devrez peut-être apporter des modifications au fichier `docker-compose.yaml` lui-même, comme changer les versions d'image, les mappages de ports ou les montages de volumes, en fonction de votre environnement de déploiement et de vos exigences spécifiques. Après avoir effectué des modifications, veuillez réexécuter `docker-compose up -d`. Vous pouvez trouver la liste complète des variables d'environnement disponibles [ici](https://docs.dify.ai/getting-started/install-self-hosted/environments).
 Si vous souhaitez configurer une configuration haute disponibilité, la communauté fournit des [Helm Charts](https://helm.sh/) et des fichiers YAML, à travers lesquels vous pouvez déployer Dify sur Kubernetes.
@@ -198,6 +197,12 @@ Si vous souhaitez configurer une configuration haute disponibilité, la communau
 - [Helm Chart par @BorisPolonsky](https://github.com/BorisPolonsky/dify-helm)
 - [Fichier YAML par @Winson-030](https://github.com/Winson-030/dify-kubernetes)
+#### Utilisation de Terraform pour le déploiement
+
+##### Azure Global
+Utilisez [terraform](https://www.terraform.io/) pour déployer Dify sur Azure en un clic.
+- [Azure Terraform par @nikawang](https://github.com/nikawang/dify-azure-terraform)
 ## Contribuer


@@ -178,6 +178,7 @@ Difyサーバーを起動する最も簡単な方法は、[docker-compose.yml](d
 ```bash
 cd docker
+cp .env.example .env
 docker compose up -d
 ```
@@ -187,7 +188,7 @@ docker compose up -d
 ## 次のステップ
-環境設定をカスタマイズする場合は、[docker-compose.yml](docker/docker-compose.yaml)ファイル内のコメントを参照して、環境設定を手動で設定してください。変更を加えた後は、再び `docker-compose up -d` を実行してください。環境変数の完全なリストは[こちら](https://docs.dify.ai/getting-started/install-self-hosted/environments)をご覧ください
+設定をカスタマイズする必要がある場合は、[.env.example](docker/.env.example) ファイルのコメントを参照し、`.env` ファイルの対応する値を更新してください。さらに、デプロイ環境や要件に応じて、`docker-compose.yaml` ファイル自体を調整する必要がある場合があります。たとえば、イメージのバージョン、ポートのマッピング、ボリュームのマウントなどを変更します。変更を加えた後は、`docker-compose up -d` を再実行してください。利用可能な環境変数の全一覧は、[こちら](https://docs.dify.ai/getting-started/install-self-hosted/environments)で確認できます
 高可用性設定を設定する必要がある場合、コミュニティは[Helm Charts](https://helm.sh/)とYAMLファイルにより、DifyをKubernetesにデプロイすることができます。
@@ -195,6 +196,12 @@ docker compose up -d
 - [Helm Chart by @BorisPolonsky](https://github.com/BorisPolonsky/dify-helm)
 - [YAML file by @Winson-030](https://github.com/Winson-030/dify-kubernetes)
+#### Terraformを使用したデプロイ
+
+##### Azure Global
+[terraform](https://www.terraform.io/) を使用して、AzureにDifyをワンクリックでデプロイします。
+- [nikawangのAzure Terraform](https://github.com/nikawang/dify-azure-terraform)
 ## 貢献


@@ -179,6 +179,7 @@ The easiest way to start the Dify server is to run our [docker-compose.yml](dock
 ```bash
 cd docker
+cp .env.example .env
 docker compose up -d
 ```
@@ -188,7 +189,7 @@ After running, you can access the Dify dashboard in your browser at [http://loca
 ## Next steps
-If you need to customize the configuration, please refer to the comments in our [docker-compose.yml](docker/docker-compose.yaml) file and manually set the environment configuration. After making the changes, please run `docker-compose up -d` again. You can see the full list of environment variables [here](https://docs.dify.ai/getting-started/install-self-hosted/environments).
+If you need to customize the configuration, please refer to the comments in our [.env.example](docker/.env.example) file and update the corresponding values in your `.env` file. Additionally, you might need to make adjustments to the `docker-compose.yaml` file itself, such as changing image versions, port mappings, or volume mounts, based on your specific deployment environment and requirements. After making any changes, please re-run `docker-compose up -d`. You can find the full list of available environment variables [here](https://docs.dify.ai/getting-started/install-self-hosted/environments).
 If you'd like to configure a highly-available setup, there are community-contributed [Helm Charts](https://helm.sh/) and YAML files which allow Dify to be deployed on Kubernetes.
@@ -196,6 +197,13 @@ If you'd like to configure a highly-available setup, there are community-contrib
 - [Helm Chart by @BorisPolonsky](https://github.com/BorisPolonsky/dify-helm)
 - [YAML file by @Winson-030](https://github.com/Winson-030/dify-kubernetes)
+#### Terraform atorlugu pilersitsineq
+
+##### Azure Global
+Atoruk [terraform](https://www.terraform.io/) Dify-mik Azure-mut ataatsikkut ikkussuilluarlugu.
+- [Azure Terraform atorlugu @nikawang](https://github.com/nikawang/dify-azure-terraform)
 ## Contributing
 For those who'd like to contribute code, see our [Contribution Guide](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).


@@ -172,6 +172,7 @@ Dify 서버를 시작하는 가장 쉬운 방법은 [docker-compose.yml](docker/
 ```bash
 cd docker
+cp .env.example .env
 docker compose up -d
 ```
@@ -181,8 +182,7 @@ docker compose up -d
 ## 다음 단계
-구성 커스터마이징이 필요한 경우, [docker-compose.yml](docker/docker-compose.yaml) 파일의 코멘트를 참조하여 환경 구성을 수동으로 설정하십시오. 변경 후 `docker-compose up -d` 를 다시 실행하십시오. 환경 변수의 전체 목록은 [여기](https://docs.dify.ai/getting-started/install-self-hosted/environments)에서 확인할 수 있습니다.
+구성을 사용자 정의해야 하는 경우 [.env.example](docker/.env.example) 파일의 주석을 참조하고 `.env` 파일에서 해당 값을 업데이트하십시오. 또한 특정 배포 환경 및 요구 사항에 따라 `docker-compose.yaml` 파일 자체를 조정해야 할 수도 있습니다. 예를 들어 이미지 버전, 포트 매핑 또는 볼륨 마운트를 변경합니다. 변경 한 후 `docker-compose up -d`를 다시 실행하십시오. 사용 가능한 환경 변수의 전체 목록은 [여기](https://docs.dify.ai/getting-started/install-self-hosted/environments)에서 찾을 수 있습니다.
 Dify를 Kubernetes에 배포하고 프리미엄 스케일링 설정을 구성했다는 커뮤니티가 제공하는 [Helm Charts](https://helm.sh/)와 YAML 파일이 존재합니다.
@@ -190,6 +190,12 @@ Dify를 Kubernetes에 배포하고 프리미엄 스케일링 설정을 구성했
 - [Helm Chart by @BorisPolonsky](https://github.com/BorisPolonsky/dify-helm)
 - [YAML file by @Winson-030](https://github.com/Winson-030/dify-kubernetes)
+#### Terraform을 사용한 배포
+
+##### Azure Global
+[terraform](https://www.terraform.io/)을 사용하여 Azure에 Dify를 원클릭으로 배포하세요.
+- [nikawang의 Azure Terraform](https://github.com/nikawang/dify-azure-terraform)
 ## 기여
 코드에 기여하고 싶은 분들은 [기여 가이드](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md)를 참조하세요.


@@ -39,7 +39,7 @@ DB_DATABASE=dify
 # Storage configuration
 # use for store upload files, private keys...
-# storage type: local, s3, azure-blob
+# storage type: local, s3, azure-blob, google-storage
 STORAGE_TYPE=local
 STORAGE_LOCAL_PATH=storage
 S3_USE_AWS_MANAGED_IAM=false
@@ -63,7 +63,7 @@ ALIYUN_OSS_REGION=your-region
 # Google Storage configuration
 GOOGLE_STORAGE_BUCKET_NAME=yout-bucket-name
-GOOGLE_STORAGE_SERVICE_ACCOUNT_JSON=your-google-service-account-json-base64-string
+GOOGLE_STORAGE_SERVICE_ACCOUNT_JSON_BASE64=your-google-service-account-json-base64-string
 # Tencent COS Storage configuration
 TENCENT_COS_BUCKET_NAME=your-bucket-name
@@ -72,11 +72,18 @@ TENCENT_COS_SECRET_ID=your-secret-id
 TENCENT_COS_REGION=your-region
 TENCENT_COS_SCHEME=your-scheme
+# OCI Storage configuration
+OCI_ENDPOINT=your-endpoint
+OCI_BUCKET_NAME=your-bucket-name
+OCI_ACCESS_KEY=your-access-key
+OCI_SECRET_KEY=your-secret-key
+OCI_REGION=your-region
+
 # CORS configuration
 WEB_API_CORS_ALLOW_ORIGINS=http://127.0.0.1:3000,*
 CONSOLE_CORS_ALLOW_ORIGINS=http://127.0.0.1:3000,*
-# Vector database configuration, support: weaviate, qdrant, milvus, relyt, pgvecto_rs, pgvector
+# Vector database configuration, support: weaviate, qdrant, milvus, relyt, pgvecto_rs, pgvector, pgvector, chroma, opensearch, tidb_vector
 VECTOR_STORE=weaviate
 # Weaviate configuration
@@ -144,6 +151,13 @@ CHROMA_DATABASE=default_database
 CHROMA_AUTH_PROVIDER=chromadb.auth.token_authn.TokenAuthenticationServerProvider
 CHROMA_AUTH_CREDENTIALS=difyai123456
+# OpenSearch configuration
+OPENSEARCH_HOST=127.0.0.1
+OPENSEARCH_PORT=9200
+OPENSEARCH_USER=admin
+OPENSEARCH_PASSWORD=admin
+OPENSEARCH_SECURE=true
+
 # Upload configuration
 UPLOAD_FILE_SIZE_LIMIT=15
 UPLOAD_FILE_BATCH_LIMIT=5


@@ -1,12 +1,11 @@
 # base image
-FROM python:3.10-slim-bookworm as base
+FROM python:3.10-slim-bookworm AS base
 WORKDIR /app/api
 # Install Poetry
 ENV POETRY_VERSION=1.8.3
-RUN pip install --no-cache-dir --upgrade pip && \
-    pip install --no-cache-dir --upgrade poetry==${POETRY_VERSION}
+RUN pip install --no-cache-dir poetry==${POETRY_VERSION}
 # Configure Poetry
 ENV POETRY_CACHE_DIR=/tmp/poetry_cache
@@ -14,7 +13,7 @@ ENV POETRY_NO_INTERACTION=1
 ENV POETRY_VIRTUALENVS_IN_PROJECT=true
 ENV POETRY_VIRTUALENVS_CREATE=true
-FROM base as packages
+FROM base AS packages
 RUN apt-get update \
     && apt-get install -y --no-install-recommends gcc g++ libc-dev libffi-dev libgmp-dev libmpfr-dev libmpc-dev
@@ -23,22 +22,21 @@ RUN apt-get update \
 COPY pyproject.toml poetry.lock ./
 RUN poetry install --sync --no-cache --no-root
 # production stage
 FROM base AS production
-ENV FLASK_APP app.py
-ENV EDITION SELF_HOSTED
-ENV DEPLOY_ENV PRODUCTION
-ENV CONSOLE_API_URL http://127.0.0.1:5001
-ENV CONSOLE_WEB_URL http://127.0.0.1:3000
-ENV SERVICE_API_URL http://127.0.0.1:5001
-ENV APP_WEB_URL http://127.0.0.1:3000
+ENV FLASK_APP=app.py
+ENV EDITION=SELF_HOSTED
+ENV DEPLOY_ENV=PRODUCTION
+ENV CONSOLE_API_URL=http://127.0.0.1:5001
+ENV CONSOLE_WEB_URL=http://127.0.0.1:3000
+ENV SERVICE_API_URL=http://127.0.0.1:5001
+ENV APP_WEB_URL=http://127.0.0.1:3000
 EXPOSE 5001
 # set timezone
-ENV TZ UTC
+ENV TZ=UTC
 WORKDIR /app/api
@@ -61,6 +59,6 @@ RUN chmod +x /entrypoint.sh
 ARG COMMIT_SHA
-ENV COMMIT_SHA ${COMMIT_SHA}
+ENV COMMIT_SHA=${COMMIT_SHA}
 ENTRYPOINT ["/bin/bash", "/entrypoint.sh"]


@@ -11,7 +11,8 @@
 ```bash
 cd ../docker
-docker-compose -f docker-compose.middleware.yaml -p dify up -d
+cp middleware.env.example middleware.env
+docker compose -f docker-compose.middleware.yaml -p dify up -d
 cd ../api
 ```
@@ -66,7 +67,7 @@
 10. If you need to debug local async processing, please start the worker service.
 ```bash
-poetry run python -m celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail
+poetry run python -m celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail,ops_trace,app_deletion
 ```
 The started celery app handles the async tasks, e.g. dataset importing and documents indexing.


@@ -1,8 +1,8 @@
 import os
-from configs.app_config import DifyConfig
+from configs import dify_config
-if not os.environ.get("DEBUG") or os.environ.get("DEBUG", "false").lower() != 'true':
+if os.environ.get("DEBUG", "false").lower() != 'true':
     from gevent import monkey
     monkey.patch_all()
@@ -24,7 +24,6 @@ from flask_cors import CORS
 from werkzeug.exceptions import Unauthorized
 from commands import register_commands
-from config import Config
 # DO NOT REMOVE BELOW
 from events import event_handlers
@@ -44,6 +43,8 @@ from extensions import (
 from extensions.ext_database import db
 from extensions.ext_login import login_manager
 from libs.passport import PassportService
+# TODO: Find a way to avoid importing models here
 from models import account, dataset, model, source, task, tool, tools, web
 from services.account_service import AccountService
@@ -82,8 +83,17 @@ def create_flask_app_with_configs() -> Flask:
     with configs loaded from .env file
     """
     dify_app = DifyApp(__name__)
-    dify_app.config.from_object(Config())
-    dify_app.config.from_mapping(DifyConfig().model_dump())
+    dify_app.config.from_mapping(dify_config.model_dump())
+
+    # populate configs into system environment variables
+    for key, value in dify_app.config.items():
+        if isinstance(value, str):
+            os.environ[key] = value
+        elif isinstance(value, int | float | bool):
+            os.environ[key] = str(value)
+        elif value is None:
+            os.environ[key] = ''
     return dify_app
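A standalone sketch of the populate-into-environment loop added above, with hypothetical config values, showing how each value type ends up as a string in `os.environ`:

```python
import os

# hypothetical config values mirroring dify_app.config.items()
config = {'EDITION': 'SELF_HOSTED', 'DEBUG': False, 'SENTRY_DSN': None}

for key, value in config.items():
    if isinstance(value, str):
        os.environ[key] = value
    elif isinstance(value, int | float | bool):  # numbers and bools are stringified
        os.environ[key] = str(value)
    elif value is None:                          # None becomes an empty string
        os.environ[key] = ''

print(os.environ['DEBUG'])       # 'False'
print(os.environ['SENTRY_DSN'])  # ''
```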
@@ -232,7 +242,7 @@ def register_blueprints(app):
 app = create_app()
 celery = app.extensions["celery"]
-if app.config['TESTING']:
+if app.config.get('TESTING'):
     print("App is running in TESTING mode")


@@ -8,10 +8,12 @@ import click
 from flask import current_app
 from werkzeug.exceptions import NotFound
+from configs import dify_config
 from constants.languages import languages
 from core.rag.datasource.vdb.vector_factory import Vector
 from core.rag.datasource.vdb.vector_type import VectorType
 from core.rag.models.document import Document
+from events.app_event import app_was_created
 from extensions.ext_database import db
 from extensions.ext_redis import redis_client
 from libs.helper import email as email_validate
@@ -111,7 +113,7 @@ def reset_encrypt_key_pair():
     After the reset, all LLM credentials will become invalid, requiring re-entry.
     Only support SELF_HOSTED mode.
     """
-    if current_app.config['EDITION'] != 'SELF_HOSTED':
+    if dify_config.EDITION != 'SELF_HOSTED':
         click.echo(click.style('Sorry, only support SELF_HOSTED mode.', fg='red'))
         return
@@ -585,6 +587,53 @@ def upgrade_db():
         click.echo('Database migration skipped')
+@click.command('fix-app-site-missing', help='Fix app related site missing issue.')
+def fix_app_site_missing():
+    """
+    Fix app related site missing issue.
+    """
+    click.echo(click.style('Start fix app related site missing issue.', fg='green'))
+
+    failed_app_ids = []
+    while True:
+        sql = """select apps.id as id from apps left join sites on sites.app_id=apps.id
+where sites.id is null limit 1000"""
+        with db.engine.begin() as conn:
+            rs = conn.execute(db.text(sql))
+
+            processed_count = 0
+            for i in rs:
+                processed_count += 1
+                app_id = str(i.id)
+
+                if app_id in failed_app_ids:
+                    continue
+
+                try:
+                    app = db.session.query(App).filter(App.id == app_id).first()
+                    tenant = app.tenant
+                    if tenant:
+                        accounts = tenant.get_accounts()
+                        if not accounts:
+                            print("Fix app {} failed.".format(app.id))
+                            continue
+
+                        account = accounts[0]
+                        print("Fix app {} related site missing issue.".format(app.id))
+                        app_was_created.send(app, account=account)
+                except Exception as e:
+                    failed_app_ids.append(app_id)
+                    click.echo(click.style('Fix app {} related site missing issue failed!'.format(app_id), fg='red'))
+                    logging.exception(f'Fix app related site missing issue failed, error: {e}')
+                    continue
+
+            if not processed_count:
+                break
+
+    click.echo(click.style('Congratulations! Fix app related site missing issue successful!', fg='green'))
+
 def register_commands(app):
     app.cli.add_command(reset_password)
     app.cli.add_command(reset_email)
@@ -594,3 +643,4 @@ def register_commands(app):
     app.cli.add_command(add_qdrant_doc_id_index)
     app.cli.add_command(create_tenant)
     app.cli.add_command(upgrade_db)
+    app.cli.add_command(fix_app_site_missing)


@@ -1,41 +0,0 @@
-import os
-
-import dotenv
-
-DEFAULTS = {
-}
-
-def get_env(key):
-    return os.environ.get(key, DEFAULTS.get(key))
-
-def get_bool_env(key):
-    value = get_env(key)
-    return value.lower() == 'true' if value is not None else False
-
-def get_cors_allow_origins(env, default):
-    cors_allow_origins = []
-    if get_env(env):
-        for origin in get_env(env).split(','):
-            cors_allow_origins.append(origin)
-    else:
-        cors_allow_origins = [default]
-
-    return cors_allow_origins
-
-class Config:
-    """Application configuration class."""
-
-    def __init__(self):
-        dotenv.load_dotenv()
-
-        self.TESTING = False
-
-        # cors settings
-        self.CONSOLE_CORS_ALLOW_ORIGINS = get_cors_allow_origins(
-            'CONSOLE_CORS_ALLOW_ORIGINS', get_env('CONSOLE_WEB_URL'))
-        self.WEB_API_CORS_ALLOW_ORIGINS = get_cors_allow_origins(
-            'WEB_API_CORS_ALLOW_ORIGINS', '*')


@@ -0,0 +1,3 @@
+from .app_config import DifyConfig
+
+dify_config = DifyConfig()
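The new package exports a single shared settings instance, so consumers import it instead of instantiating `DifyConfig` themselves (as the commands.py hunk below does). A minimal usage sketch:

```python
# anywhere in the api code base
from configs import dify_config

# fields resolve from .env / the process environment at import time
if dify_config.EDITION != 'SELF_HOSTED':
    print('hosted edition')
```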


@@ -1,4 +1,5 @@
-from pydantic_settings import BaseSettings, SettingsConfigDict
+from pydantic import Field, computed_field
+from pydantic_settings import SettingsConfigDict
 from configs.deploy import DeploymentConfig
 from configs.enterprise import EnterpriseFeatureConfig
@@ -9,9 +10,6 @@ from configs.packaging import PackagingInfo
 class DifyConfig(
-    # based on pydantic-settings
-    BaseSettings,
-
     # Packaging info
     PackagingInfo,
@@ -31,12 +29,39 @@ class DifyConfig(
     # **Before using, please contact business@dify.ai by email to inquire about licensing matters.**
     EnterpriseFeatureConfig,
 ):
+    DEBUG: bool = Field(default=False, description='whether to enable debug mode.')
+
     model_config = SettingsConfigDict(
         # read from dotenv format config file
         env_file='.env',
         env_file_encoding='utf-8',
+        frozen=True,
+
         # ignore extra attributes
         extra='ignore',
     )
+
+    CODE_MAX_NUMBER: int = 9223372036854775807
+    CODE_MIN_NUMBER: int = -9223372036854775808
+    CODE_MAX_STRING_LENGTH: int = 80000
+    CODE_MAX_STRING_ARRAY_LENGTH: int = 30
+    CODE_MAX_OBJECT_ARRAY_LENGTH: int = 30
+    CODE_MAX_NUMBER_ARRAY_LENGTH: int = 1000
+
+    HTTP_REQUEST_MAX_CONNECT_TIMEOUT: int = 300
+    HTTP_REQUEST_MAX_READ_TIMEOUT: int = 600
+    HTTP_REQUEST_MAX_WRITE_TIMEOUT: int = 600
+    HTTP_REQUEST_NODE_MAX_BINARY_SIZE: int = 1024 * 1024 * 10
+
+    @computed_field
+    def HTTP_REQUEST_NODE_READABLE_MAX_BINARY_SIZE(self) -> str:
+        return f'{self.HTTP_REQUEST_NODE_MAX_BINARY_SIZE / 1024 / 1024:.2f}MB'
+
+    HTTP_REQUEST_NODE_MAX_TEXT_SIZE: int = 1024 * 1024
+
+    @computed_field
+    def HTTP_REQUEST_NODE_READABLE_MAX_TEXT_SIZE(self) -> str:
+        return f'{self.HTTP_REQUEST_NODE_MAX_TEXT_SIZE / 1024 / 1024:.2f}MB'
+
+    SSRF_PROXY_HTTP_URL: str | None = None
+    SSRF_PROXY_HTTPS_URL: str | None = None
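For illustration only (plain Python, not the class above): the readable-size computed fields simply render the byte limits with the same f-string:

```python
HTTP_REQUEST_NODE_MAX_BINARY_SIZE = 1024 * 1024 * 10  # 10 MiB, the default above

# same formatting the computed_field applies
print(f'{HTTP_REQUEST_NODE_MAX_BINARY_SIZE / 1024 / 1024:.2f}MB')  # 10.00MB
```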


@@ -1,10 +1,21 @@
-from pydantic import BaseModel, Field
+from pydantic import Field
+from pydantic_settings import BaseSettings
-class DeploymentConfig(BaseModel):
+class DeploymentConfig(BaseSettings):
     """
     Deployment configs
     """
+    APPLICATION_NAME: str = Field(
+        description='application name',
+        default='langgenius/dify',
+    )
+
+    TESTING: bool = Field(
+        description='',
+        default=False,
+    )
+
     EDITION: str = Field(
         description='deployment edition',
         default='SELF_HOSTED',
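The recurring `BaseModel` → `BaseSettings` switch in these config hunks is what lets each mixin read values from the process environment and `.env`. A minimal sketch of the difference (standalone, hypothetical class names):

```python
import os

from pydantic import BaseModel, Field
from pydantic_settings import BaseSettings

os.environ['EDITION'] = 'CLOUD'

class ModelConfig(BaseModel):
    EDITION: str = Field(default='SELF_HOSTED')

class SettingsConfig(BaseSettings):
    EDITION: str = Field(default='SELF_HOSTED')

print(ModelConfig().EDITION)     # SELF_HOSTED -- plain models ignore the environment
print(SettingsConfig().EDITION)  # CLOUD -- BaseSettings pulls from env vars
```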


@@ -1,7 +1,8 @@
-from pydantic import BaseModel, Field
+from pydantic import Field
+from pydantic_settings import BaseSettings
-class EnterpriseFeatureConfig(BaseModel):
+class EnterpriseFeatureConfig(BaseSettings):
     """
     Enterprise feature configs.
     **Before using, please contact business@dify.ai by email to inquire about licensing matters.**


@@ -1,5 +1,3 @@
-from pydantic import BaseModel
-
 from configs.extra.notion_config import NotionConfig
 from configs.extra.sentry_config import SentryConfig


@@ -1,9 +1,10 @@
 from typing import Optional
-from pydantic import BaseModel, Field
+from pydantic import Field
+from pydantic_settings import BaseSettings
-class NotionConfig(BaseModel):
+class NotionConfig(BaseSettings):
     """
     Notion integration configs
     """


@@ -1,9 +1,10 @@
 from typing import Optional
-from pydantic import BaseModel, Field, NonNegativeFloat
+from pydantic import Field, NonNegativeFloat
+from pydantic_settings import BaseSettings
-class SentryConfig(BaseModel):
+class SentryConfig(BaseSettings):
     """
     Sentry configs
     """


@@ -1,11 +1,12 @@
 from typing import Optional
-from pydantic import AliasChoices, BaseModel, Field, NonNegativeInt, PositiveInt
+from pydantic import AliasChoices, Field, NonNegativeInt, PositiveInt, computed_field
+from pydantic_settings import BaseSettings
 from configs.feature.hosted_service import HostedServiceConfig
-class SecurityConfig(BaseModel):
+class SecurityConfig(BaseSettings):
     """
     Secret Key configs
     """
@@ -17,8 +18,12 @@ class SecurityConfig(BaseModel):
         default=None,
     )
+    RESET_PASSWORD_TOKEN_EXPIRY_HOURS: PositiveInt = Field(
+        description='Expiry time in hours for reset token',
+        default=24,
+    )
+
-class AppExecutionConfig(BaseModel):
+class AppExecutionConfig(BaseSettings):
     """
     App Execution configs
     """
@@ -28,12 +33,12 @@
     )
-class CodeExecutionSandboxConfig(BaseModel):
+class CodeExecutionSandboxConfig(BaseSettings):
     """
     Code Execution Sandbox configs
     """
     CODE_EXECUTION_ENDPOINT: str = Field(
-        description='whether to enable HTTP response compression of gzip',
+        description='endpoint URL of code execution servcie',
         default='http://sandbox:8194',
     )
@@ -43,36 +48,36 @@
     )
-class EndpointConfig(BaseModel):
+class EndpointConfig(BaseSettings):
     """
     Module URL configs
     """
     CONSOLE_API_URL: str = Field(
         description='The backend URL prefix of the console API.'
                     'used to concatenate the login authorization callback or notion integration callback.',
-        default='https://cloud.dify.ai',
+        default='',
     )
     CONSOLE_WEB_URL: str = Field(
         description='The front-end URL prefix of the console web.'
                     'used to concatenate some front-end addresses and for CORS configuration use.',
-        default='https://cloud.dify.ai',
+        default='',
     )
     SERVICE_API_URL: str = Field(
         description='Service API Url prefix.'
                     'used to display Service API Base Url to the front-end.',
-        default='https://api.dify.ai',
+        default='',
     )
     APP_WEB_URL: str = Field(
         description='WebApp Url prefix.'
                     'used to display WebAPP API Base Url to the front-end.',
-        default='https://udify.app',
+        default='',
     )
-class FileAccessConfig(BaseModel):
+class FileAccessConfig(BaseSettings):
     """
     File Access configs
     """
@@ -82,7 +87,7 @@
                     'Url is signed and has expiration time.',
         validation_alias=AliasChoices('FILES_URL', 'CONSOLE_API_URL'),
         alias_priority=1,
-        default='https://cloud.dify.ai',
+        default='',
     )
     FILES_ACCESS_TIMEOUT: int = Field(
@@ -91,7 +96,7 @@
     )
-class FileUploadConfig(BaseModel):
+class FileUploadConfig(BaseSettings):
     """
     File Uploading configs
     """
@@ -116,7 +121,7 @@
     )
-class HttpConfig(BaseModel):
+class HttpConfig(BaseSettings):
     """
     HTTP configs
     """
@@ -125,8 +130,30 @@
         default=False,
     )
+    inner_CONSOLE_CORS_ALLOW_ORIGINS: str = Field(
+        description='',
+        validation_alias=AliasChoices('CONSOLE_CORS_ALLOW_ORIGINS', 'CONSOLE_WEB_URL'),
+        default='',
+    )
+
+    @computed_field
+    @property
+    def CONSOLE_CORS_ALLOW_ORIGINS(self) -> list[str]:
+        return self.inner_CONSOLE_CORS_ALLOW_ORIGINS.split(',')
+
+    inner_WEB_API_CORS_ALLOW_ORIGINS: str = Field(
+        description='',
+        validation_alias=AliasChoices('WEB_API_CORS_ALLOW_ORIGINS'),
+        default='*',
+    )
+
+    @computed_field
+    @property
+    def WEB_API_CORS_ALLOW_ORIGINS(self) -> list[str]:
+        return self.inner_WEB_API_CORS_ALLOW_ORIGINS.split(',')
+
-class InnerAPIConfig(BaseModel):
+class InnerAPIConfig(BaseSettings):
     """
     Inner API configs
     """
@@ -141,7 +168,7 @@
     )
-class LoggingConfig(BaseModel):
+class LoggingConfig(BaseSettings):
     """
     Logging configs
     """
@@ -173,7 +200,7 @@
     )
-class ModelLoadBalanceConfig(BaseModel):
+class ModelLoadBalanceConfig(BaseSettings):
     """
     Model load balance configs
     """
@@ -183,7 +210,7 @@
     )
-class BillingConfig(BaseModel):
+class BillingConfig(BaseSettings):
     """
     Platform Billing Configurations
     """
@@ -193,7 +220,7 @@
     )
-class UpdateConfig(BaseModel):
+class UpdateConfig(BaseSettings):
     """
     Update configs
     """
@@ -203,7 +230,7 @@
     )
-class WorkflowConfig(BaseModel):
+class WorkflowConfig(BaseSettings):
     """
     Workflow feature configs
     """
@@ -224,7 +251,7 @@
     )
-class OAuthConfig(BaseModel):
+class OAuthConfig(BaseSettings):
     """
     oauth configs
     """
@@ -254,7 +281,7 @@
     )
-class ModerationConfig(BaseModel):
+class ModerationConfig(BaseSettings):
     """
     Moderation in app configs.
     """
@@ -266,7 +293,7 @@
     )
-class ToolConfig(BaseModel):
+class ToolConfig(BaseSettings):
     """
     Tool configs
     """
@@ -277,7 +304,7 @@
     )
-class MailConfig(BaseModel):
+class MailConfig(BaseSettings):
     """
     Mail Configurations
     """
@@ -309,7 +336,7 @@
     SMTP_PORT: Optional[int] = Field(
         description='smtp server port',
-        default=None,
+        default=465,
     )
     SMTP_USERNAME: Optional[str] = Field(
@@ -333,7 +360,7 @@
     )
-class RagEtlConfig(BaseModel):
+class RagEtlConfig(BaseSettings):
     """
     RAG ETL Configurations.
     """
@@ -359,7 +386,7 @@
     )
-class DataSetConfig(BaseModel):
+class DataSetConfig(BaseSettings):
     """
     Dataset configs
     """
@@ -370,7 +397,7 @@
     )
-class WorkspaceConfig(BaseModel):
+class WorkspaceConfig(BaseSettings):
     """
     Workspace configs
     """
@@ -381,7 +408,7 @@
     )
-class IndexingConfig(BaseModel):
+class IndexingConfig(BaseSettings):
     """
     Indexing configs.
     """
@@ -392,7 +419,7 @@
     )
-class ImageFormatConfig(BaseModel):
+class ImageFormatConfig(BaseSettings):
     MULTIMODAL_SEND_IMAGE_FORMAT: str = Field(
         description='multi model send image format, support base64, url, default is base64',
         default='base64',


@@ -1,9 +1,10 @@
 from typing import Optional
-from pydantic import BaseModel, Field, NonNegativeInt
+from pydantic import Field, NonNegativeInt
+from pydantic_settings import BaseSettings
-class HostedOpenAiConfig(BaseModel):
+class HostedOpenAiConfig(BaseSettings):
     """
     Hosted OpenAI service config
     """
@@ -68,7 +69,7 @@
     )
-class HostedAzureOpenAiConfig(BaseModel):
+class HostedAzureOpenAiConfig(BaseSettings):
     """
     Hosted OpenAI service config
     """
@@ -94,7 +95,7 @@
     )
-class HostedAnthropicConfig(BaseModel):
+class HostedAnthropicConfig(BaseSettings):
     """
     Hosted Azure OpenAI service config
     """
@@ -125,7 +126,7 @@
     )
-class HostedMinmaxConfig(BaseModel):
+class HostedMinmaxConfig(BaseSettings):
     """
     Hosted Minmax service config
     """
@@ -136,7 +137,7 @@
     )
-class HostedSparkConfig(BaseModel):
+class HostedSparkConfig(BaseSettings):
     """
     Hosted Spark service config
     """
@@ -147,7 +148,7 @@
     )
-class HostedZhipuAIConfig(BaseModel):
+class HostedZhipuAIConfig(BaseSettings):
     """
     Hosted Minmax service config
     """
@@ -158,7 +159,7 @@
     )
-class HostedModerationConfig(BaseModel):
+class HostedModerationConfig(BaseSettings):
     """
     Hosted Moderation service config
     """
@@ -174,7 +175,7 @@
     )
-class HostedFetchAppTemplateConfig(BaseModel):
+class HostedFetchAppTemplateConfig(BaseSettings):
     """
     Hosted Moderation service config
     """


@@ -1,12 +1,14 @@
 from typing import Any, Optional
-from pydantic import BaseModel, Field, NonNegativeInt, PositiveInt, computed_field
+from pydantic import Field, NonNegativeInt, PositiveInt, computed_field
+from pydantic_settings import BaseSettings
-from configs.middleware.redis_config import RedisConfig
+from configs.middleware.cache.redis_config import RedisConfig
 from configs.middleware.storage.aliyun_oss_storage_config import AliyunOSSStorageConfig
 from configs.middleware.storage.amazon_s3_storage_config import S3StorageConfig
 from configs.middleware.storage.azure_blob_storage_config import AzureBlobStorageConfig
 from configs.middleware.storage.google_cloud_storage_config import GoogleCloudStorageConfig
+from configs.middleware.storage.oci_storage_config import OCIStorageConfig
 from configs.middleware.storage.tencent_cos_storage_config import TencentCloudCOSStorageConfig
 from configs.middleware.vdb.chroma_config import ChromaConfig
 from configs.middleware.vdb.milvus_config import MilvusConfig
@@ -21,7 +23,7 @@ from configs.middleware.vdb.tidb_vector_config import TiDBVectorConfig
 from configs.middleware.vdb.weaviate_config import WeaviateConfig
-class StorageConfig(BaseModel):
+class StorageConfig(BaseSettings):
     STORAGE_TYPE: str = Field(
         description='storage type,'
                     ' default to `local`,'
@@ -35,14 +37,14 @@
     )
-class VectorStoreConfig(BaseModel):
+class VectorStoreConfig(BaseSettings):
     VECTOR_STORE: Optional[str] = Field(
         description='vector store type',
         default=None,
     )
-class KeywordStoreConfig(BaseModel):
+class KeywordStoreConfig(BaseSettings):
     KEYWORD_STORE: str = Field(
         description='keyword store type',
         default='jieba',
@@ -80,6 +82,11 @@ class DatabaseConfig:
         default='',
     )
+    DB_EXTRAS: str = Field(
+        description='db extras options. Example: keepalives_idle=60&keepalives=1',
+        default='',
+    )
+
     SQLALCHEMY_DATABASE_URI_SCHEME: str = Field(
         description='db uri scheme',
         default='postgresql',
@ -88,7 +95,12 @@ class DatabaseConfig:
@computed_field @computed_field
@property @property
def SQLALCHEMY_DATABASE_URI(self) -> str: def SQLALCHEMY_DATABASE_URI(self) -> str:
db_extras = f"?client_encoding={self.DB_CHARSET}" if self.DB_CHARSET else "" db_extras = (
f"{self.DB_EXTRAS}&client_encoding={self.DB_CHARSET}"
if self.DB_CHARSET
else self.DB_EXTRAS
).strip("&")
db_extras = f"?{db_extras}" if db_extras else ""
return (f"{self.SQLALCHEMY_DATABASE_URI_SCHEME}://" return (f"{self.SQLALCHEMY_DATABASE_URI_SCHEME}://"
f"{self.DB_USERNAME}:{self.DB_PASSWORD}@{self.DB_HOST}:{self.DB_PORT}/{self.DB_DATABASE}" f"{self.DB_USERNAME}:{self.DB_PASSWORD}@{self.DB_HOST}:{self.DB_PORT}/{self.DB_DATABASE}"
f"{db_extras}") f"{db_extras}")
@ -113,7 +125,7 @@ class DatabaseConfig:
default=False, default=False,
) )
SQLALCHEMY_ECHO: bool = Field( SQLALCHEMY_ECHO: bool | str = Field(
description='whether to enable SqlAlchemy echo', description='whether to enable SqlAlchemy echo',
default=False, default=False,
) )
@ -143,7 +155,7 @@ class CeleryConfig(DatabaseConfig):
@computed_field @computed_field
@property @property
def CELERY_RESULT_BACKEND(self) -> str: def CELERY_RESULT_BACKEND(self) -> str | None:
return 'db+{}'.format(self.SQLALCHEMY_DATABASE_URI) \ return 'db+{}'.format(self.SQLALCHEMY_DATABASE_URI) \
if self.CELERY_BACKEND == 'database' else self.CELERY_BROKER_URL if self.CELERY_BACKEND == 'database' else self.CELERY_BROKER_URL
@ -167,6 +179,7 @@ class MiddlewareConfig(
GoogleCloudStorageConfig, GoogleCloudStorageConfig,
TencentCloudCOSStorageConfig, TencentCloudCOSStorageConfig,
S3StorageConfig, S3StorageConfig,
OCIStorageConfig,
# configs of vdb and vdb providers # configs of vdb and vdb providers
VectorStoreConfig, VectorStoreConfig,

View File

@ -1,9 +1,10 @@
from typing import Optional from typing import Optional
from pydantic import BaseModel, Field, NonNegativeInt, PositiveInt from pydantic import Field, NonNegativeInt, PositiveInt
from pydantic_settings import BaseSettings
class RedisConfig(BaseModel): class RedisConfig(BaseSettings):
""" """
Redis configs Redis configs
""" """

View File

@ -1,39 +1,40 @@
from typing import Optional from typing import Optional
from pydantic import BaseModel, Field from pydantic import Field
from pydantic_settings import BaseSettings
class AliyunOSSStorageConfig(BaseModel): class AliyunOSSStorageConfig(BaseSettings):
""" """
Aliyun storage configs Aliyun storage configs
""" """
ALIYUN_OSS_BUCKET_NAME: Optional[str] = Field( ALIYUN_OSS_BUCKET_NAME: Optional[str] = Field(
description='Aliyun storage ', description='Aliyun OSS bucket name',
default=None, default=None,
) )
ALIYUN_OSS_ACCESS_KEY: Optional[str] = Field( ALIYUN_OSS_ACCESS_KEY: Optional[str] = Field(
description='Aliyun storage access key', description='Aliyun OSS access key',
default=None, default=None,
) )
ALIYUN_OSS_SECRET_KEY: Optional[str] = Field( ALIYUN_OSS_SECRET_KEY: Optional[str] = Field(
description='Aliyun storage secret key', description='Aliyun OSS secret key',
default=None, default=None,
) )
ALIYUN_OSS_ENDPOINT: Optional[str] = Field( ALIYUN_OSS_ENDPOINT: Optional[str] = Field(
description='Aliyun storage endpoint URL', description='Aliyun OSS endpoint URL',
default=None, default=None,
) )
ALIYUN_OSS_REGION: Optional[str] = Field( ALIYUN_OSS_REGION: Optional[str] = Field(
description='Aliyun storage region', description='Aliyun OSS region',
default=None, default=None,
) )
ALIYUN_OSS_AUTH_VERSION: Optional[str] = Field( ALIYUN_OSS_AUTH_VERSION: Optional[str] = Field(
description='Aliyun storage authentication version', description='Aliyun OSS authentication version',
default=None, default=None,
) )

View File

@ -1,9 +1,10 @@
from typing import Optional from typing import Optional
from pydantic import BaseModel, Field from pydantic import Field
from pydantic_settings import BaseSettings
class S3StorageConfig(BaseModel): class S3StorageConfig(BaseSettings):
""" """
S3 storage configs S3 storage configs
""" """

View File

@ -1,9 +1,10 @@
from typing import Optional from typing import Optional
from pydantic import BaseModel, Field from pydantic import Field
from pydantic_settings import BaseSettings
class AzureBlobStorageConfig(BaseModel): class AzureBlobStorageConfig(BaseSettings):
""" """
Azure Blob storage configs Azure Blob storage configs
""" """
@ -24,6 +25,6 @@ class AzureBlobStorageConfig(BaseModel):
) )
AZURE_BLOB_ACCOUNT_URL: Optional[str] = Field( AZURE_BLOB_ACCOUNT_URL: Optional[str] = Field(
description='Azure Blob account url', description='Azure Blob account URL',
default=None, default=None,
) )

View File

@ -1,9 +1,10 @@
from typing import Optional from typing import Optional
from pydantic import BaseModel, Field from pydantic import Field
from pydantic_settings import BaseSettings
class GoogleCloudStorageConfig(BaseModel): class GoogleCloudStorageConfig(BaseSettings):
""" """
Google Cloud storage configs Google Cloud storage configs
""" """

View File

@ -0,0 +1,36 @@
from typing import Optional
from pydantic import Field
from pydantic_settings import BaseSettings
class OCIStorageConfig(BaseSettings):
"""
OCI storage configs
"""
OCI_ENDPOINT: Optional[str] = Field(
description='OCI storage endpoint',
default=None,
)
OCI_REGION: Optional[str] = Field(
description='OCI storage region',
default=None,
)
OCI_BUCKET_NAME: Optional[str] = Field(
description='OCI storage bucket name',
default=None,
)
OCI_ACCESS_KEY: Optional[str] = Field(
description='OCI storage access key',
default=None,
)
OCI_SECRET_KEY: Optional[str] = Field(
description='OCI storage secret key',
default=None,
)
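
Since OCIStorageConfig extends BaseSettings, the new backend can be configured purely through the environment. A hedged example (all values invented for illustration):

import os

from configs.middleware.storage.oci_storage_config import OCIStorageConfig

os.environ.update({
    'OCI_ENDPOINT': 'https://objectstorage.us-ashburn-1.oraclecloud.com',  # hypothetical
    'OCI_REGION': 'us-ashburn-1',                                          # hypothetical
    'OCI_BUCKET_NAME': 'dify-assets',                                      # hypothetical
})
config = OCIStorageConfig()    # values are picked up from the environment
print(config.OCI_BUCKET_NAME)  # dify-assets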

View File

@ -1,9 +1,10 @@
from typing import Optional from typing import Optional
from pydantic import BaseModel, Field from pydantic import Field
from pydantic_settings import BaseSettings
class TencentCloudCOSStorageConfig(BaseModel): class TencentCloudCOSStorageConfig(BaseSettings):
""" """
Tencent Cloud COS storage configs Tencent Cloud COS storage configs
""" """

View File

@ -1,9 +1,10 @@
from typing import Optional from typing import Optional
from pydantic import BaseModel, Field, PositiveInt from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings
class ChromaConfig(BaseModel): class ChromaConfig(BaseSettings):
""" """
Chroma configs Chroma configs
""" """

View File

@ -1,9 +1,10 @@
from typing import Optional from typing import Optional
from pydantic import BaseModel, Field, PositiveInt from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings
class MilvusConfig(BaseModel): class MilvusConfig(BaseSettings):
""" """
Milvus configs Milvus configs
""" """
@ -29,11 +30,11 @@ class MilvusConfig(BaseModel):
) )
MILVUS_SECURE: bool = Field( MILVUS_SECURE: bool = Field(
description='wheter to use SSL connection for Milvus', description='whether to use SSL connection for Milvus',
default=False, default=False,
) )
MILVUS_DATABASE: str = Field( MILVUS_DATABASE: str = Field(
description='Milvus database', description='Milvus database, default to `default`',
default='default', default='default',
) )

View File

@ -1,9 +1,10 @@
from typing import Optional from typing import Optional
from pydantic import BaseModel, Field, PositiveInt from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings
class OpenSearchConfig(BaseModel): class OpenSearchConfig(BaseSettings):
""" """
OpenSearch configs OpenSearch configs
""" """

View File

@ -1,9 +1,10 @@
from typing import Optional from typing import Optional
from pydantic import BaseModel, Field, PositiveInt from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings
class OracleConfig(BaseModel): class OracleConfig(BaseSettings):
""" """
ORACLE configs ORACLE configs
""" """
@ -15,7 +16,7 @@ class OracleConfig(BaseModel):
ORACLE_PORT: Optional[PositiveInt] = Field( ORACLE_PORT: Optional[PositiveInt] = Field(
description='ORACLE port', description='ORACLE port',
default=None, default=1521,
) )
ORACLE_USER: Optional[str] = Field( ORACLE_USER: Optional[str] = Field(

View File

@ -1,9 +1,10 @@
from typing import Optional from typing import Optional
from pydantic import BaseModel, Field, PositiveInt from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings
class PGVectorConfig(BaseModel): class PGVectorConfig(BaseSettings):
""" """
PGVector configs PGVector configs
""" """
@ -15,7 +16,7 @@ class PGVectorConfig(BaseModel):
PGVECTOR_PORT: Optional[PositiveInt] = Field( PGVECTOR_PORT: Optional[PositiveInt] = Field(
description='PGVector port', description='PGVector port',
default=None, default=5433,
) )
PGVECTOR_USER: Optional[str] = Field( PGVECTOR_USER: Optional[str] = Field(

View File

@ -1,9 +1,10 @@
from typing import Optional from typing import Optional
from pydantic import BaseModel, Field, PositiveInt from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings
class PGVectoRSConfig(BaseModel): class PGVectoRSConfig(BaseSettings):
""" """
PGVectoRS configs PGVectoRS configs
""" """
@ -15,7 +16,7 @@ class PGVectoRSConfig(BaseModel):
PGVECTO_RS_PORT: Optional[PositiveInt] = Field( PGVECTO_RS_PORT: Optional[PositiveInt] = Field(
description='PGVectoRS port', description='PGVectoRS port',
default=None, default=5431,
) )
PGVECTO_RS_USER: Optional[str] = Field( PGVECTO_RS_USER: Optional[str] = Field(

View File

@ -1,9 +1,10 @@
from typing import Optional from typing import Optional
from pydantic import BaseModel, Field, NonNegativeInt, PositiveInt from pydantic import Field, NonNegativeInt, PositiveInt
from pydantic_settings import BaseSettings
class QdrantConfig(BaseModel): class QdrantConfig(BaseSettings):
""" """
Qdrant configs Qdrant configs
""" """

View File

@ -1,9 +1,10 @@
from typing import Optional from typing import Optional
from pydantic import BaseModel, Field, PositiveInt from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings
class RelytConfig(BaseModel): class RelytConfig(BaseSettings):
""" """
Relyt configs Relyt configs
""" """

View File

@ -1,9 +1,10 @@
from typing import Optional from typing import Optional
from pydantic import BaseModel, Field, PositiveInt from pydantic import Field, NonNegativeInt, PositiveInt
from pydantic_settings import BaseSettings
class TencentVectorDBConfig(BaseModel): class TencentVectorDBConfig(BaseSettings):
""" """
Tencent Vector configs Tencent Vector configs
""" """
@ -14,17 +15,17 @@ class TencentVectorDBConfig(BaseModel):
) )
TENCENT_VECTOR_DB_API_KEY: Optional[str] = Field( TENCENT_VECTOR_DB_API_KEY: Optional[str] = Field(
description='Tencent Vector api key', description='Tencent Vector API key',
default=None, default=None,
) )
TENCENT_VECTOR_DB_TIMEOUT: PositiveInt = Field( TENCENT_VECTOR_DB_TIMEOUT: PositiveInt = Field(
description='Tencent Vector timeout', description='Tencent Vector timeout in seconds',
default=30, default=30,
) )
TENCENT_VECTOR_DB_USERNAME: Optional[str] = Field( TENCENT_VECTOR_DB_USERNAME: Optional[str] = Field(
description='Tencent Vector password', description='Tencent Vector username',
default=None, default=None,
) )
@ -38,7 +39,12 @@ class TencentVectorDBConfig(BaseModel):
default=1, default=1,
) )
TENCENT_VECTOR_DB_REPLICAS: PositiveInt = Field( TENCENT_VECTOR_DB_REPLICAS: NonNegativeInt = Field(
description='Tencent Vector replicas', description='Tencent Vector replicas',
default=2, default=2,
) )
TENCENT_VECTOR_DB_DATABASE: Optional[str] = Field(
description='Tencent Vector Database',
default=None,
)
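
Relaxing TENCENT_VECTOR_DB_REPLICAS from PositiveInt to NonNegativeInt lets a deployment set zero replicas, which PositiveInt rejected. The validation difference in isolation:

from pydantic import BaseModel, NonNegativeInt, PositiveInt, ValidationError

class Before(BaseModel):
    replicas: PositiveInt = 2

class After(BaseModel):
    replicas: NonNegativeInt = 2

try:
    Before(replicas=0)
except ValidationError:
    print('0 was rejected under PositiveInt')

After(replicas=0)  # accepted now that 0 is a legal replica count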

View File

@ -1,9 +1,10 @@
from typing import Optional from typing import Optional
from pydantic import BaseModel, Field, PositiveInt from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings
class TiDBVectorConfig(BaseModel): class TiDBVectorConfig(BaseSettings):
""" """
TiDB Vector configs TiDB Vector configs
""" """
@ -15,7 +16,7 @@ class TiDBVectorConfig(BaseModel):
TIDB_VECTOR_PORT: Optional[PositiveInt] = Field( TIDB_VECTOR_PORT: Optional[PositiveInt] = Field(
description='TiDB Vector port', description='TiDB Vector port',
default=None, default=4000,
) )
TIDB_VECTOR_USER: Optional[str] = Field( TIDB_VECTOR_USER: Optional[str] = Field(

View File

@ -1,9 +1,10 @@
from typing import Optional from typing import Optional
from pydantic import BaseModel, Field, PositiveInt from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings
class WeaviateConfig(BaseModel): class WeaviateConfig(BaseSettings):
""" """
Weaviate configs Weaviate configs
""" """

View File

@ -1,14 +1,15 @@
from pydantic import BaseModel, Field from pydantic import Field
from pydantic_settings import BaseSettings
class PackagingInfo(BaseModel): class PackagingInfo(BaseSettings):
""" """
Packaging build information Packaging build information
""" """
CURRENT_VERSION: str = Field( CURRENT_VERSION: str = Field(
description='Dify version', description='Dify version',
default='0.6.11', default='0.6.12-fix1',
) )
COMMIT_SHA: str = Field( COMMIT_SHA: str = Field(

View File

@ -1,7 +1,3 @@
languages = ['en-US', 'zh-Hans', 'zh-Hant', 'pt-BR', 'es-ES', 'fr-FR', 'de-DE', 'ja-JP', 'ko-KR', 'ru-RU', 'it-IT', 'uk-UA', 'vi-VN', 'pl-PL', 'hi-IN']
language_timezone_mapping = { language_timezone_mapping = {
'en-US': 'America/New_York', 'en-US': 'America/New_York',
'zh-Hans': 'Asia/Shanghai', 'zh-Hans': 'Asia/Shanghai',
@ -18,9 +14,11 @@ language_timezone_mapping = {
'vi-VN': 'Asia/Ho_Chi_Minh', 'vi-VN': 'Asia/Ho_Chi_Minh',
'ro-RO': 'Europe/Bucharest', 'ro-RO': 'Europe/Bucharest',
'pl-PL': 'Europe/Warsaw', 'pl-PL': 'Europe/Warsaw',
'hi-IN': 'Asia/Kolkata' 'hi-IN': 'Asia/Kolkata',
} }
languages = list(language_timezone_mapping.keys())
def supported_language(lang): def supported_language(lang):
if lang in languages: if lang in languages:
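
Deriving languages from language_timezone_mapping keeps the list and the mapping from drifting apart: a new locale is added in one place and shows up in both. For instance (locale invented for illustration):

language_timezone_mapping['nl-NL'] = 'Europe/Amsterdam'  # hypothetical new locale
languages = list(language_timezone_mapping.keys())
assert 'nl-NL' in languages  # picked up without touching a second list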

View File

@ -22,7 +22,7 @@ default_app_templates = {
'model_config': { 'model_config': {
'model': { 'model': {
"provider": "openai", "provider": "openai",
"name": "gpt-4", "name": "gpt-4o",
"mode": "chat", "mode": "chat",
"completion_params": {} "completion_params": {}
}, },
@ -51,7 +51,7 @@ default_app_templates = {
'model_config': { 'model_config': {
'model': { 'model': {
"provider": "openai", "provider": "openai",
"name": "gpt-4", "name": "gpt-4o",
"mode": "chat", "mode": "chat",
"completion_params": {} "completion_params": {}
} }
@ -77,7 +77,7 @@ default_app_templates = {
'model_config': { 'model_config': {
'model': { 'model': {
"provider": "openai", "provider": "openai",
"name": "gpt-4", "name": "gpt-4o",
"mode": "chat", "mode": "chat",
"completion_params": {} "completion_params": {}
} }

View File

@ -20,6 +20,7 @@ from .app import (
generator, generator,
message, message,
model_config, model_config,
ops_trace,
site, site,
statistic, statistic,
workflow, workflow,
@ -29,7 +30,7 @@ from .app import (
) )
# Import auth controllers # Import auth controllers
from .auth import activate, data_source_bearer_auth, data_source_oauth, login, oauth from .auth import activate, data_source_bearer_auth, data_source_oauth, forgot_password, login, oauth
# Import billing controllers # Import billing controllers
from .billing import billing from .billing import billing

View File

@ -1,4 +1,3 @@
import json
import uuid import uuid
from flask_login import current_user from flask_login import current_user
@ -9,17 +8,14 @@ from controllers.console import api
from controllers.console.app.wraps import get_app_model from controllers.console.app.wraps import get_app_model
from controllers.console.setup import setup_required from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required, cloud_edition_billing_resource_check from controllers.console.wraps import account_initialization_required, cloud_edition_billing_resource_check
from core.tools.tool_manager import ToolManager from core.ops.ops_trace_manager import OpsTraceManager
from core.tools.utils.configuration import ToolParameterConfigurationManager
from fields.app_fields import ( from fields.app_fields import (
app_detail_fields, app_detail_fields,
app_detail_fields_with_site, app_detail_fields_with_site,
app_pagination_fields, app_pagination_fields,
) )
from libs.login import login_required from libs.login import login_required
from models.model import App, AppMode, AppModelConfig
from services.app_service import AppService from services.app_service import AppService
from services.tag_service import TagService
ALLOW_CREATE_APP_MODES = ['chat', 'agent-chat', 'advanced-chat', 'workflow', 'completion'] ALLOW_CREATE_APP_MODES = ['chat', 'agent-chat', 'advanced-chat', 'workflow', 'completion']
@ -194,6 +190,10 @@ class AppExportApi(Resource):
@get_app_model @get_app_model
def get(self, app_model): def get(self, app_model):
"""Export app""" """Export app"""
# The role of the current user in the ta table must be admin, owner, or editor
if not current_user.is_editor:
raise Forbidden()
app_service = AppService() app_service = AppService()
return { return {
@ -286,6 +286,39 @@ class AppApiStatus(Resource):
return app_model return app_model
class AppTraceApi(Resource):
@setup_required
@login_required
@account_initialization_required
def get(self, app_id):
"""Get app trace"""
app_trace_config = OpsTraceManager.get_app_tracing_config(
app_id=app_id
)
return app_trace_config
@setup_required
@login_required
@account_initialization_required
def post(self, app_id):
# add app trace
if not current_user.is_admin_or_owner:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument('enabled', type=bool, required=True, location='json')
parser.add_argument('tracing_provider', type=str, required=True, location='json')
args = parser.parse_args()
OpsTraceManager.update_app_tracing_config(
app_id=app_id,
enabled=args['enabled'],
tracing_provider=args['tracing_provider'],
)
return {"result": "success"}
api.add_resource(AppListApi, '/apps') api.add_resource(AppListApi, '/apps')
api.add_resource(AppImportApi, '/apps/import') api.add_resource(AppImportApi, '/apps/import')
api.add_resource(AppApi, '/apps/<uuid:app_id>') api.add_resource(AppApi, '/apps/<uuid:app_id>')
@ -295,3 +328,4 @@ api.add_resource(AppNameApi, '/apps/<uuid:app_id>/name')
api.add_resource(AppIconApi, '/apps/<uuid:app_id>/icon') api.add_resource(AppIconApi, '/apps/<uuid:app_id>/icon')
api.add_resource(AppSiteStatus, '/apps/<uuid:app_id>/site-enable') api.add_resource(AppSiteStatus, '/apps/<uuid:app_id>/site-enable')
api.add_resource(AppApiStatus, '/apps/<uuid:app_id>/api-enable') api.add_resource(AppApiStatus, '/apps/<uuid:app_id>/api-enable')
api.add_resource(AppTraceApi, '/apps/<uuid:app_id>/trace')
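
The new trace route can be exercised like any other console API endpoint. A rough sketch with the requests library (host, token, app id and provider value are all placeholders):

import requests

base = 'http://localhost:5001/console/api'             # placeholder host
headers = {'Authorization': 'Bearer <console-token>'}  # placeholder token
app_id = '<app-id>'                                    # placeholder uuid

# enable tracing for the app (admin/owner only, per the Forbidden check above)
resp = requests.post(f'{base}/apps/{app_id}/trace', headers=headers,
                     json={'enabled': True, 'tracing_provider': '<provider>'})
print(resp.json())  # {'result': 'success'}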

View File

@ -97,3 +97,21 @@ class DraftWorkflowNotSync(BaseHTTPException):
error_code = 'draft_workflow_not_sync' error_code = 'draft_workflow_not_sync'
description = "Workflow graph might have been modified, please refresh and resubmit." description = "Workflow graph might have been modified, please refresh and resubmit."
code = 400 code = 400
class TracingConfigNotExist(BaseHTTPException):
error_code = 'trace_config_not_exist'
description = "Trace config not exist."
code = 400
class TracingConfigIsExist(BaseHTTPException):
error_code = 'trace_config_is_exist'
description = "Trace config is exist."
code = 400
class TracingConfigCheckError(BaseHTTPException):
error_code = 'trace_config_check_error'
description = "Invalid Credentials."
code = 400

View File

@ -25,6 +25,7 @@ class ModelConfigResource(Resource):
@account_initialization_required @account_initialization_required
@get_app_model(mode=[AppMode.AGENT_CHAT, AppMode.CHAT, AppMode.COMPLETION]) @get_app_model(mode=[AppMode.AGENT_CHAT, AppMode.CHAT, AppMode.COMPLETION])
def post(self, app_model): def post(self, app_model):
"""Modify app model config""" """Modify app model config"""
# validate config # validate config
model_configuration = AppModelConfigService.validate_configuration( model_configuration = AppModelConfigService.validate_configuration(

View File

@ -0,0 +1,101 @@
from flask_restful import Resource, reqparse
from controllers.console import api
from controllers.console.app.error import TracingConfigCheckError, TracingConfigIsExist, TracingConfigNotExist
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from libs.login import login_required
from services.ops_service import OpsService
class TraceAppConfigApi(Resource):
"""
Manage trace app configurations
"""
@setup_required
@login_required
@account_initialization_required
def get(self, app_id):
parser = reqparse.RequestParser()
parser.add_argument('tracing_provider', type=str, required=True, location='args')
args = parser.parse_args()
try:
trace_config = OpsService.get_tracing_app_config(
app_id=app_id, tracing_provider=args['tracing_provider']
)
if not trace_config:
return {"has_not_configured": True}
return trace_config
except Exception as e:
raise e
@setup_required
@login_required
@account_initialization_required
def post(self, app_id):
"""Create a new trace app configuration"""
parser = reqparse.RequestParser()
parser.add_argument('tracing_provider', type=str, required=True, location='json')
parser.add_argument('tracing_config', type=dict, required=True, location='json')
args = parser.parse_args()
try:
result = OpsService.create_tracing_app_config(
app_id=app_id,
tracing_provider=args['tracing_provider'],
tracing_config=args['tracing_config']
)
if not result:
raise TracingConfigIsExist()
if result.get('error'):
raise TracingConfigCheckError()
return result
except Exception as e:
raise e
@setup_required
@login_required
@account_initialization_required
def patch(self, app_id):
"""Update an existing trace app configuration"""
parser = reqparse.RequestParser()
parser.add_argument('tracing_provider', type=str, required=True, location='json')
parser.add_argument('tracing_config', type=dict, required=True, location='json')
args = parser.parse_args()
try:
result = OpsService.update_tracing_app_config(
app_id=app_id,
tracing_provider=args['tracing_provider'],
tracing_config=args['tracing_config']
)
if not result:
raise TracingConfigNotExist()
return {"result": "success"}
except Exception as e:
raise e
@setup_required
@login_required
@account_initialization_required
def delete(self, app_id):
"""Delete an existing trace app configuration"""
parser = reqparse.RequestParser()
parser.add_argument('tracing_provider', type=str, required=True, location='args')
args = parser.parse_args()
try:
result = OpsService.delete_tracing_app_config(
app_id=app_id,
tracing_provider=args['tracing_provider']
)
if not result:
raise TracingConfigNotExist()
return {"result": "success"}
except Exception as e:
raise e
api.add_resource(TraceAppConfigApi, '/apps/<uuid:app_id>/trace-config')
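
Note that get and delete read tracing_provider from the query string (location='args'), while post and patch expect a JSON body, so a client has to switch accordingly. A sketch with the same placeholders as the earlier example:

import requests

resp = requests.get('http://localhost:5001/console/api/apps/<app-id>/trace-config',
                    headers={'Authorization': 'Bearer <console-token>'},
                    params={'tracing_provider': '<provider>'})  # query string, not JSON
print(resp.status_code)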

View File

@ -20,6 +20,8 @@ def parse_app_site_args():
parser.add_argument('icon_background', type=str, required=False, location='json') parser.add_argument('icon_background', type=str, required=False, location='json')
parser.add_argument('description', type=str, required=False, location='json') parser.add_argument('description', type=str, required=False, location='json')
parser.add_argument('default_language', type=supported_language, required=False, location='json') parser.add_argument('default_language', type=supported_language, required=False, location='json')
parser.add_argument('chat_color_theme', type=str, required=False, location='json')
parser.add_argument('chat_color_theme_inverted', type=bool, required=False, location='json')
parser.add_argument('customize_domain', type=str, required=False, location='json') parser.add_argument('customize_domain', type=str, required=False, location='json')
parser.add_argument('copyright', type=str, required=False, location='json') parser.add_argument('copyright', type=str, required=False, location='json')
parser.add_argument('privacy_policy', type=str, required=False, location='json') parser.add_argument('privacy_policy', type=str, required=False, location='json')
@ -55,6 +57,8 @@ class AppSite(Resource):
'icon_background', 'icon_background',
'description', 'description',
'default_language', 'default_language',
'chat_color_theme',
'chat_color_theme_inverted',
'customize_domain', 'customize_domain',
'copyright', 'copyright',
'privacy_policy', 'privacy_policy',

View File

@ -109,6 +109,34 @@ class DraftWorkflowApi(Resource):
} }
class DraftWorkflowImportApi(Resource):
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=[AppMode.ADVANCED_CHAT, AppMode.WORKFLOW])
@marshal_with(workflow_fields)
def post(self, app_model: App):
"""
Import draft workflow
"""
# The role of the current user in the ta table must be admin, owner, or editor
if not current_user.is_editor:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument('data', type=str, required=True, nullable=False, location='json')
args = parser.parse_args()
workflow_service = WorkflowService()
workflow = workflow_service.import_draft_workflow(
app_model=app_model,
data=args['data'],
account=current_user
)
return workflow
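
DraftWorkflowImportApi takes the exported definition as a single string field named data. A hedged example call (file name, host and ids are placeholders):

import requests

with open('exported_app.yml') as f:  # e.g. output of the app export endpoint
    data = f.read()

resp = requests.post('http://localhost:5001/console/api/apps/<app-id>/workflows/draft/import',
                     headers={'Authorization': 'Bearer <console-token>'},
                     json={'data': data})
print(resp.status_code)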
class AdvancedChatDraftWorkflowRunApi(Resource): class AdvancedChatDraftWorkflowRunApi(Resource):
@setup_required @setup_required
@login_required @login_required
@ -439,6 +467,7 @@ class ConvertToWorkflowApi(Resource):
api.add_resource(DraftWorkflowApi, '/apps/<uuid:app_id>/workflows/draft') api.add_resource(DraftWorkflowApi, '/apps/<uuid:app_id>/workflows/draft')
api.add_resource(DraftWorkflowImportApi, '/apps/<uuid:app_id>/workflows/draft/import')
api.add_resource(AdvancedChatDraftWorkflowRunApi, '/apps/<uuid:app_id>/advanced-chat/workflows/draft/run') api.add_resource(AdvancedChatDraftWorkflowRunApi, '/apps/<uuid:app_id>/advanced-chat/workflows/draft/run')
api.add_resource(DraftWorkflowRunApi, '/apps/<uuid:app_id>/workflows/draft/run') api.add_resource(DraftWorkflowRunApi, '/apps/<uuid:app_id>/workflows/draft/run')
api.add_resource(WorkflowTaskStopApi, '/apps/<uuid:app_id>/workflow-runs/tasks/<string:task_id>/stop') api.add_resource(WorkflowTaskStopApi, '/apps/<uuid:app_id>/workflow-runs/tasks/<string:task_id>/stop')

View File

@ -6,6 +6,7 @@ from flask_login import current_user
from flask_restful import Resource from flask_restful import Resource
from werkzeug.exceptions import Forbidden from werkzeug.exceptions import Forbidden
from configs import dify_config
from controllers.console import api from controllers.console import api
from libs.login import login_required from libs.login import login_required
from libs.oauth_data_source import NotionOAuth from libs.oauth_data_source import NotionOAuth
@ -16,11 +17,11 @@ from ..wraps import account_initialization_required
def get_oauth_providers(): def get_oauth_providers():
with current_app.app_context(): with current_app.app_context():
-        notion_oauth = NotionOAuth(client_id=current_app.config.get('NOTION_CLIENT_ID'),
-                                   client_secret=current_app.config.get('NOTION_CLIENT_SECRET'),
-                                   redirect_uri=current_app.config.get('CONSOLE_API_URL') + '/console/api/oauth/data-source/callback/notion')
+        if not dify_config.NOTION_CLIENT_ID or not dify_config.NOTION_CLIENT_SECRET:
+            return {}
+        notion_oauth = NotionOAuth(client_id=dify_config.NOTION_CLIENT_ID,
+                                   client_secret=dify_config.NOTION_CLIENT_SECRET,
+                                   redirect_uri=dify_config.CONSOLE_API_URL + '/console/api/oauth/data-source/callback/notion')
OAUTH_PROVIDERS = { OAUTH_PROVIDERS = {
'notion': notion_oauth 'notion': notion_oauth
@ -39,8 +40,10 @@ class OAuthDataSource(Resource):
print(vars(oauth_provider)) print(vars(oauth_provider))
if not oauth_provider: if not oauth_provider:
return {'error': 'Invalid provider'}, 400 return {'error': 'Invalid provider'}, 400
if current_app.config.get('NOTION_INTEGRATION_TYPE') == 'internal': if dify_config.NOTION_INTEGRATION_TYPE == 'internal':
internal_secret = current_app.config.get('NOTION_INTERNAL_SECRET') internal_secret = dify_config.NOTION_INTERNAL_SECRET
if not internal_secret:
return {'error': 'Internal secret is not set'},
oauth_provider.save_internal_access_token(internal_secret) oauth_provider.save_internal_access_token(internal_secret)
return { 'data': '' } return { 'data': '' }
else: else:
@ -60,13 +63,13 @@ class OAuthDataSourceCallback(Resource):
if 'code' in request.args: if 'code' in request.args:
code = request.args.get('code') code = request.args.get('code')
return redirect(f'{current_app.config.get("CONSOLE_WEB_URL")}?type=notion&code={code}') return redirect(f'{dify_config.CONSOLE_WEB_URL}?type=notion&code={code}')
elif 'error' in request.args: elif 'error' in request.args:
error = request.args.get('error') error = request.args.get('error')
return redirect(f'{current_app.config.get("CONSOLE_WEB_URL")}?type=notion&error={error}') return redirect(f'{dify_config.CONSOLE_WEB_URL}?type=notion&error={error}')
else: else:
return redirect(f'{current_app.config.get("CONSOLE_WEB_URL")}?type=notion&error=Access denied') return redirect(f'{dify_config.CONSOLE_WEB_URL}?type=notion&error=Access denied')
class OAuthDataSourceBinding(Resource): class OAuthDataSourceBinding(Resource):

View File

@ -5,3 +5,28 @@ class ApiKeyAuthFailedError(BaseHTTPException):
error_code = 'auth_failed' error_code = 'auth_failed'
description = "{message}" description = "{message}"
code = 500 code = 500
class InvalidEmailError(BaseHTTPException):
error_code = 'invalid_email'
description = "The email address is not valid."
code = 400
class PasswordMismatchError(BaseHTTPException):
error_code = 'password_mismatch'
description = "The passwords do not match."
code = 400
class InvalidTokenError(BaseHTTPException):
error_code = 'invalid_or_expired_token'
description = "The token is invalid or has expired."
code = 400
class PasswordResetRateLimitExceededError(BaseHTTPException):
error_code = 'password_reset_rate_limit_exceeded'
description = "Password reset rate limit exceeded. Try again later."
code = 429

View File

@ -0,0 +1,107 @@
import base64
import logging
import secrets
from flask_restful import Resource, reqparse
from controllers.console import api
from controllers.console.auth.error import (
InvalidEmailError,
InvalidTokenError,
PasswordMismatchError,
PasswordResetRateLimitExceededError,
)
from controllers.console.setup import setup_required
from extensions.ext_database import db
from libs.helper import email as email_validate
from libs.password import hash_password, valid_password
from models.account import Account
from services.account_service import AccountService
from services.errors.account import RateLimitExceededError
class ForgotPasswordSendEmailApi(Resource):
@setup_required
def post(self):
parser = reqparse.RequestParser()
parser.add_argument('email', type=str, required=True, location='json')
args = parser.parse_args()
email = args['email']
if not email_validate(email):
raise InvalidEmailError()
account = Account.query.filter_by(email=email).first()
if account:
try:
AccountService.send_reset_password_email(account=account)
except RateLimitExceededError:
logging.warning(f"Rate limit exceeded for email: {account.email}")
raise PasswordResetRateLimitExceededError()
else:
# Return success to avoid revealing email registration status
logging.warning(f"Attempt to reset password for unregistered email: {email}")
return {"result": "success"}
class ForgotPasswordCheckApi(Resource):
@setup_required
def post(self):
parser = reqparse.RequestParser()
parser.add_argument('token', type=str, required=True, nullable=False, location='json')
args = parser.parse_args()
token = args['token']
reset_data = AccountService.get_reset_password_data(token)
if reset_data is None:
return {'is_valid': False, 'email': None}
return {'is_valid': True, 'email': reset_data.get('email')}
class ForgotPasswordResetApi(Resource):
@setup_required
def post(self):
parser = reqparse.RequestParser()
parser.add_argument('token', type=str, required=True, nullable=False, location='json')
parser.add_argument('new_password', type=valid_password, required=True, nullable=False, location='json')
parser.add_argument('password_confirm', type=valid_password, required=True, nullable=False, location='json')
args = parser.parse_args()
new_password = args['new_password']
password_confirm = args['password_confirm']
if str(new_password).strip() != str(password_confirm).strip():
raise PasswordMismatchError()
token = args['token']
reset_data = AccountService.get_reset_password_data(token)
if reset_data is None:
raise InvalidTokenError()
AccountService.revoke_reset_password_token(token)
salt = secrets.token_bytes(16)
base64_salt = base64.b64encode(salt).decode()
password_hashed = hash_password(new_password, salt)
base64_password_hashed = base64.b64encode(password_hashed).decode()
account = Account.query.filter_by(email=reset_data.get('email')).first()
account.password = base64_password_hashed
account.password_salt = base64_salt
db.session.commit()
return {'result': 'success'}
api.add_resource(ForgotPasswordSendEmailApi, '/forgot-password')
api.add_resource(ForgotPasswordCheckApi, '/forgot-password/validity')
api.add_resource(ForgotPasswordResetApi, '/forgot-password/resets')
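
The reset handler above derives the stored credentials from a fresh 16-byte salt; both the salt and the hash are base64-encoded before being written to the Account row. The hashing steps in isolation (the password value is illustrative):

import base64
import secrets

from libs.password import hash_password  # same helper the handler uses

salt = secrets.token_bytes(16)
password_hashed = hash_password('new-password-123', salt)
base64_salt = base64.b64encode(salt).decode()
base64_password_hashed = base64.b64encode(password_hashed).decode()
# these two strings are what land in account.password_salt / account.password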

View File

@ -1,7 +1,7 @@
from typing import cast from typing import cast
import flask_login import flask_login
from flask import current_app, request from flask import request
from flask_restful import Resource, reqparse from flask_restful import Resource, reqparse
import services import services
@ -56,14 +56,14 @@ class LogoutApi(Resource):
class ResetPasswordApi(Resource): class ResetPasswordApi(Resource):
@setup_required @setup_required
def get(self): def get(self):
parser = reqparse.RequestParser() # parser = reqparse.RequestParser()
parser.add_argument('email', type=email, required=True, location='json') # parser.add_argument('email', type=email, required=True, location='json')
args = parser.parse_args() # args = parser.parse_args()
# import mailchimp_transactional as MailchimpTransactional # import mailchimp_transactional as MailchimpTransactional
# from mailchimp_transactional.api_client import ApiClientError # from mailchimp_transactional.api_client import ApiClientError
account = {'email': args['email']} # account = {'email': args['email']}
# account = AccountService.get_by_email(args['email']) # account = AccountService.get_by_email(args['email'])
# if account is None: # if account is None:
# raise ValueError('Email not found') # raise ValueError('Email not found')
@ -71,22 +71,22 @@ class ResetPasswordApi(Resource):
# AccountService.update_password(account, new_password) # AccountService.update_password(account, new_password)
# todo: Send email # todo: Send email
MAILCHIMP_API_KEY = current_app.config['MAILCHIMP_TRANSACTIONAL_API_KEY'] # MAILCHIMP_API_KEY = current_app.config['MAILCHIMP_TRANSACTIONAL_API_KEY']
# mailchimp = MailchimpTransactional(MAILCHIMP_API_KEY) # mailchimp = MailchimpTransactional(MAILCHIMP_API_KEY)
message = { # message = {
'from_email': 'noreply@example.com', # 'from_email': 'noreply@example.com',
'to': [{'email': account.email}], # 'to': [{'email': account['email']}],
'subject': 'Reset your Dify password', # 'subject': 'Reset your Dify password',
'html': """ # 'html': """
<p>Dear User,</p> # <p>Dear User,</p>
<p>The Dify team has generated a new password for you, details as follows:</p> # <p>The Dify team has generated a new password for you, details as follows:</p>
<p><strong>{new_password}</strong></p> # <p><strong>{new_password}</strong></p>
<p>Please change your password to log in as soon as possible.</p> # <p>Please change your password to log in as soon as possible.</p>
<p>Regards,</p> # <p>Regards,</p>
<p>The Dify Team</p> # <p>The Dify Team</p>
""" # """
} # }
# response = mailchimp.messages.send({ # response = mailchimp.messages.send({
# 'message': message, # 'message': message,

View File

@ -6,6 +6,7 @@ import requests
from flask import current_app, redirect, request from flask import current_app, redirect, request
from flask_restful import Resource from flask_restful import Resource
from configs import dify_config
from constants.languages import languages from constants.languages import languages
from extensions.ext_database import db from extensions.ext_database import db
from libs.helper import get_remote_ip from libs.helper import get_remote_ip
@ -18,22 +19,24 @@ from .. import api
def get_oauth_providers(): def get_oauth_providers():
with current_app.app_context(): with current_app.app_context():
-        github_oauth = GitHubOAuth(client_id=current_app.config.get('GITHUB_CLIENT_ID'),
-                                   client_secret=current_app.config.get('GITHUB_CLIENT_SECRET'),
-                                   redirect_uri=current_app.config.get('CONSOLE_API_URL') + '/console/api/oauth/authorize/github')
-        google_oauth = GoogleOAuth(client_id=current_app.config.get('GOOGLE_CLIENT_ID'),
-                                   client_secret=current_app.config.get('GOOGLE_CLIENT_SECRET'),
-                                   redirect_uri=current_app.config.get('CONSOLE_API_URL') + '/console/api/oauth/authorize/google')
-        OAUTH_PROVIDERS = {
-            'github': github_oauth,
-            'google': google_oauth
-        }
+        if not dify_config.GITHUB_CLIENT_ID or not dify_config.GITHUB_CLIENT_SECRET:
+            github_oauth = None
+        else:
+            github_oauth = GitHubOAuth(
+                client_id=dify_config.GITHUB_CLIENT_ID,
+                client_secret=dify_config.GITHUB_CLIENT_SECRET,
+                redirect_uri=dify_config.CONSOLE_API_URL + '/console/api/oauth/authorize/github',
+            )
+        if not dify_config.GOOGLE_CLIENT_ID or not dify_config.GOOGLE_CLIENT_SECRET:
+            google_oauth = None
+        else:
+            google_oauth = GoogleOAuth(
+                client_id=dify_config.GOOGLE_CLIENT_ID,
+                client_secret=dify_config.GOOGLE_CLIENT_SECRET,
+                redirect_uri=dify_config.CONSOLE_API_URL + '/console/api/oauth/authorize/google',
+            )
+        OAUTH_PROVIDERS = {'github': github_oauth, 'google': google_oauth}
return OAUTH_PROVIDERS return OAUTH_PROVIDERS
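
With this refactor a provider whose credentials are absent is registered as None instead of a half-built client, so a caller can reject it up front, much like the Notion handler earlier in this diff does. A sketch of that guard (the real handler's response shape is not shown here):

def resolve_provider(provider: str):
    providers = get_oauth_providers()
    oauth_provider = providers.get(provider)
    if not oauth_provider:  # unknown name, or credentials missing so the entry is None
        raise ValueError('Invalid or unconfigured provider')
    return oauth_provider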
@ -63,8 +66,7 @@ class OAuthCallback(Resource):
token = oauth_provider.get_access_token(code) token = oauth_provider.get_access_token(code)
user_info = oauth_provider.get_user_info(token) user_info = oauth_provider.get_user_info(token)
except requests.exceptions.HTTPError as e: except requests.exceptions.HTTPError as e:
-            logging.exception(
-                f"An error occurred during the OAuth process with {provider}: {e.response.text}")
+            logging.exception(f'An error occurred during the OAuth process with {provider}: {e.response.text}')
return {'error': 'OAuth process failed'}, 400 return {'error': 'OAuth process failed'}, 400
account = _generate_account(provider, user_info) account = _generate_account(provider, user_info)
@ -81,7 +83,7 @@ class OAuthCallback(Resource):
token = AccountService.login(account, ip_address=get_remote_ip(request)) token = AccountService.login(account, ip_address=get_remote_ip(request))
return redirect(f'{current_app.config.get("CONSOLE_WEB_URL")}?console_token={token}') return redirect(f'{dify_config.CONSOLE_WEB_URL}?console_token={token}')
def _get_account_by_openid_or_email(provider: str, user_info: OAuthUserInfo) -> Optional[Account]: def _get_account_by_openid_or_email(provider: str, user_info: OAuthUserInfo) -> Optional[Account]:
@ -101,11 +103,7 @@ def _generate_account(provider: str, user_info: OAuthUserInfo):
# Create account # Create account
account_name = user_info.name if user_info.name else 'Dify' account_name = user_info.name if user_info.name else 'Dify'
-    account = RegisterService.register(
-        email=user_info.email,
-        name=account_name,
-        password=None,
-        open_id=user_info.id,
-        provider=provider
-    )
+    account = RegisterService.register(
+        email=user_info.email, name=account_name, password=None, open_id=user_info.id, provider=provider
+    )
# Set interface language # Set interface language

View File

@ -8,7 +8,7 @@ import services
from controllers.console import api from controllers.console import api
from controllers.console.apikey import api_key_fields, api_key_list from controllers.console.apikey import api_key_fields, api_key_list
from controllers.console.app.error import ProviderNotInitializeError from controllers.console.app.error import ProviderNotInitializeError
from controllers.console.datasets.error import DatasetInUseError, DatasetNameDuplicateError from controllers.console.datasets.error import DatasetInUseError, DatasetNameDuplicateError, IndexingEstimateError
from controllers.console.setup import setup_required from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required from controllers.console.wraps import account_initialization_required
from core.errors.error import LLMBadRequestError, ProviderTokenNotInitError from core.errors.error import LLMBadRequestError, ProviderTokenNotInitError
@ -226,6 +226,15 @@ class DatasetApi(Resource):
except services.errors.dataset.DatasetInUseError: except services.errors.dataset.DatasetInUseError:
raise DatasetInUseError() raise DatasetInUseError()
class DatasetUseCheckApi(Resource):
@setup_required
@login_required
@account_initialization_required
def get(self, dataset_id):
dataset_id_str = str(dataset_id)
dataset_is_using = DatasetService.dataset_use_check(dataset_id_str)
return {'is_using': dataset_is_using}, 200
class DatasetQueryApi(Resource): class DatasetQueryApi(Resource):
@ -346,6 +355,8 @@ class DatasetIndexingEstimateApi(Resource):
"in the Settings -> Model Provider.") "in the Settings -> Model Provider.")
except ProviderTokenNotInitError as ex: except ProviderTokenNotInitError as ex:
raise ProviderNotInitializeError(ex.description) raise ProviderNotInitializeError(ex.description)
except Exception as e:
raise IndexingEstimateError(str(e))
return response, 200 return response, 200
@ -560,6 +571,7 @@ class DatasetErrorDocs(Resource):
api.add_resource(DatasetListApi, '/datasets') api.add_resource(DatasetListApi, '/datasets')
api.add_resource(DatasetApi, '/datasets/<uuid:dataset_id>') api.add_resource(DatasetApi, '/datasets/<uuid:dataset_id>')
api.add_resource(DatasetUseCheckApi, '/datasets/<uuid:dataset_id>/use-check')
api.add_resource(DatasetQueryApi, '/datasets/<uuid:dataset_id>/queries') api.add_resource(DatasetQueryApi, '/datasets/<uuid:dataset_id>/queries')
api.add_resource(DatasetErrorDocs, '/datasets/<uuid:dataset_id>/error-docs') api.add_resource(DatasetErrorDocs, '/datasets/<uuid:dataset_id>/error-docs')
api.add_resource(DatasetIndexingEstimateApi, '/datasets/indexing-estimate') api.add_resource(DatasetIndexingEstimateApi, '/datasets/indexing-estimate')

View File

@ -20,6 +20,7 @@ from controllers.console.datasets.error import (
ArchivedDocumentImmutableError, ArchivedDocumentImmutableError,
DocumentAlreadyFinishedError, DocumentAlreadyFinishedError,
DocumentIndexingError, DocumentIndexingError,
IndexingEstimateError,
InvalidActionError, InvalidActionError,
InvalidMetadataError, InvalidMetadataError,
) )
@ -388,6 +389,8 @@ class DocumentIndexingEstimateApi(DocumentResource):
"in the Settings -> Model Provider.") "in the Settings -> Model Provider.")
except ProviderTokenNotInitError as ex: except ProviderTokenNotInitError as ex:
raise ProviderNotInitializeError(ex.description) raise ProviderNotInitializeError(ex.description)
except Exception as e:
raise IndexingEstimateError(str(e))
return response return response
@ -493,6 +496,8 @@ class DocumentBatchIndexingEstimateApi(DocumentResource):
"in the Settings -> Model Provider.") "in the Settings -> Model Provider.")
except ProviderTokenNotInitError as ex: except ProviderTokenNotInitError as ex:
raise ProviderNotInitializeError(ex.description) raise ProviderNotInitializeError(ex.description)
except Exception as e:
raise IndexingEstimateError(str(e))
return response return response

View File

@ -83,3 +83,9 @@ class DatasetInUseError(BaseHTTPException):
error_code = 'dataset_in_use' error_code = 'dataset_in_use'
description = "The dataset is being used by some apps. Please remove the dataset from the apps before deleting it." description = "The dataset is being used by some apps. Please remove the dataset from the apps before deleting it."
code = 409 code = 409
class IndexingEstimateError(BaseHTTPException):
error_code = 'indexing_estimate_error'
description = "Knowledge indexing estimate failed: {message}"
code = 500

View File

@ -3,11 +3,10 @@ from functools import wraps
from flask import current_app, request from flask import current_app, request
from flask_restful import Resource, reqparse from flask_restful import Resource, reqparse
from extensions.ext_database import db
from libs.helper import email, get_remote_ip, str_len from libs.helper import email, get_remote_ip, str_len
from libs.password import valid_password from libs.password import valid_password
from models.model import DifySetup from models.model import DifySetup
from services.account_service import AccountService, RegisterService, TenantService from services.account_service import RegisterService, TenantService
from . import api from . import api
from .error import AlreadySetupError, NotInitValidateError, NotSetupError from .error import AlreadySetupError, NotInitValidateError, NotSetupError
@ -51,28 +50,17 @@ class SetupApi(Resource):
required=True, location='json') required=True, location='json')
args = parser.parse_args() args = parser.parse_args()
-        # Register
-        account = RegisterService.register(
-            email=args['email'],
-            name=args['name'],
-            password=args['password']
-        )
-        TenantService.create_owner_tenant_if_not_exist(account)
-        setup()
-        AccountService.update_last_login(account, ip_address=get_remote_ip(request))
+        # setup
+        RegisterService.setup(
+            email=args['email'],
+            name=args['name'],
+            password=args['password'],
+            ip_address=get_remote_ip(request)
+        )
return {'result': 'success'}, 201 return {'result': 'success'}, 201
def setup():
dify_setup = DifySetup(
version=current_app.config['CURRENT_VERSION']
)
db.session.add(dify_setup)
def setup_required(view): def setup_required(view):
@wraps(view) @wraps(view)
def decorated(*args, **kwargs): def decorated(*args, **kwargs):

View File

@ -245,6 +245,8 @@ class AccountIntegrateApi(Resource):
return {'data': integrate_data} return {'data': integrate_data}
# Register API resources # Register API resources
api.add_resource(AccountInitApi, '/account/init') api.add_resource(AccountInitApi, '/account/init')
api.add_resource(AccountProfileApi, '/account/profile') api.add_resource(AccountProfileApi, '/account/profile')

View File

@ -26,6 +26,8 @@ class AppSiteApi(WebApiResource):
site_fields = { site_fields = {
'title': fields.String, 'title': fields.String,
'chat_color_theme': fields.String,
'chat_color_theme_inverted': fields.Boolean,
'icon': fields.String, 'icon': fields.String,
'icon_background': fields.String, 'icon_background': fields.String,
'description': fields.String, 'description': fields.String,

View File

@ -32,7 +32,6 @@ from core.model_runtime.entities.model_entities import ModelFeature
from core.model_runtime.model_providers.__base.large_language_model import LargeLanguageModel from core.model_runtime.model_providers.__base.large_language_model import LargeLanguageModel
from core.model_runtime.utils.encoders import jsonable_encoder from core.model_runtime.utils.encoders import jsonable_encoder
from core.tools.entities.tool_entities import ( from core.tools.entities.tool_entities import (
ToolInvokeMessage,
ToolParameter, ToolParameter,
ToolRuntimeVariablePool, ToolRuntimeVariablePool,
) )
@ -141,24 +140,6 @@ class BaseAgentRunner(AppRunner):
app_generate_entity.app_config.prompt_template.simple_prompt_template = '' app_generate_entity.app_config.prompt_template.simple_prompt_template = ''
return app_generate_entity return app_generate_entity
def _convert_tool_response_to_str(self, tool_response: list[ToolInvokeMessage]) -> str:
"""
Handle tool response
"""
result = ''
for response in tool_response:
if response.type == ToolInvokeMessage.MessageType.TEXT:
result += response.message
elif response.type == ToolInvokeMessage.MessageType.LINK:
result += f"result link: {response.message}. please tell user to check it."
elif response.type == ToolInvokeMessage.MessageType.IMAGE_LINK or \
response.type == ToolInvokeMessage.MessageType.IMAGE:
result += "image has been created and sent to user already, you do not need to create it, just tell the user to check it now."
else:
result += f"tool response: {response.message}."
return result
def _convert_tool_to_prompt_message_tool(self, tool: AgentToolEntity) -> tuple[PromptMessageTool, Tool]: def _convert_tool_to_prompt_message_tool(self, tool: AgentToolEntity) -> tuple[PromptMessageTool, Tool]:
""" """

View File

@ -1,7 +1,7 @@
import json import json
from abc import ABC, abstractmethod from abc import ABC, abstractmethod
from collections.abc import Generator from collections.abc import Generator
from typing import Union from typing import Optional, Union
from core.agent.base_agent_runner import BaseAgentRunner from core.agent.base_agent_runner import BaseAgentRunner
from core.agent.entities import AgentScratchpadUnit from core.agent.entities import AgentScratchpadUnit
@ -15,6 +15,7 @@ from core.model_runtime.entities.message_entities import (
ToolPromptMessage, ToolPromptMessage,
UserPromptMessage, UserPromptMessage,
) )
from core.ops.ops_trace_manager import TraceQueueManager
from core.prompt.agent_history_prompt_transform import AgentHistoryPromptTransform from core.prompt.agent_history_prompt_transform import AgentHistoryPromptTransform
from core.tools.entities.tool_entities import ToolInvokeMeta from core.tools.entities.tool_entities import ToolInvokeMeta
from core.tools.tool.tool import Tool from core.tools.tool.tool import Tool
@ -42,6 +43,8 @@ class CotAgentRunner(BaseAgentRunner, ABC):
self._repack_app_generate_entity(app_generate_entity) self._repack_app_generate_entity(app_generate_entity)
self._init_react_state(query) self._init_react_state(query)
trace_manager = app_generate_entity.trace_manager
# check model mode # check model mode
if 'Observation' not in app_generate_entity.model_conf.stop: if 'Observation' not in app_generate_entity.model_conf.stop:
if app_generate_entity.model_conf.provider not in self._ignore_observation_providers: if app_generate_entity.model_conf.provider not in self._ignore_observation_providers:
@ -211,7 +214,8 @@ class CotAgentRunner(BaseAgentRunner, ABC):
tool_invoke_response, tool_invoke_meta = self._handle_invoke_action( tool_invoke_response, tool_invoke_meta = self._handle_invoke_action(
action=scratchpad.action, action=scratchpad.action,
tool_instances=tool_instances, tool_instances=tool_instances,
message_file_ids=message_file_ids message_file_ids=message_file_ids,
trace_manager=trace_manager,
) )
scratchpad.observation = tool_invoke_response scratchpad.observation = tool_invoke_response
scratchpad.agent_response = tool_invoke_response scratchpad.agent_response = tool_invoke_response
@ -237,8 +241,7 @@ class CotAgentRunner(BaseAgentRunner, ABC):
# update prompt tool message # update prompt tool message
for prompt_tool in self._prompt_messages_tools: for prompt_tool in self._prompt_messages_tools:
-                self.update_prompt_message_tool(
-                    tool_instances[prompt_tool.name], prompt_tool)
+                self.update_prompt_message_tool(tool_instances[prompt_tool.name], prompt_tool)
iteration_step += 1 iteration_step += 1
@ -275,14 +278,15 @@ class CotAgentRunner(BaseAgentRunner, ABC):
message=AssistantPromptMessage( message=AssistantPromptMessage(
content=final_answer content=final_answer
), ),
-                    usage=llm_usage['usage'] if llm_usage['usage'] else LLMUsage.empty_usage(
-                    ),
+                    usage=llm_usage['usage'] if llm_usage['usage'] else LLMUsage.empty_usage(),
system_fingerprint='' system_fingerprint=''
)), PublishFrom.APPLICATION_MANAGER) )), PublishFrom.APPLICATION_MANAGER)
def _handle_invoke_action(self, action: AgentScratchpadUnit.Action, def _handle_invoke_action(self, action: AgentScratchpadUnit.Action,
tool_instances: dict[str, Tool], tool_instances: dict[str, Tool],
-                              message_file_ids: list[str]) -> tuple[str, ToolInvokeMeta]:
+                              message_file_ids: list[str],
+                              trace_manager: Optional[TraceQueueManager] = None
+                              ) -> tuple[str, ToolInvokeMeta]:
""" """
handle invoke action handle invoke action
:param action: action :param action: action
@ -312,21 +316,22 @@ class CotAgentRunner(BaseAgentRunner, ABC):
tenant_id=self.tenant_id, tenant_id=self.tenant_id,
message=self.message, message=self.message,
invoke_from=self.application_generate_entity.invoke_from, invoke_from=self.application_generate_entity.invoke_from,
agent_tool_callback=self.agent_callback agent_tool_callback=self.agent_callback,
trace_manager=trace_manager,
) )
# publish files # publish files
for message_file, save_as in message_files: for message_file_id, save_as in message_files:
if save_as: if save_as:
self.variables_pool.set_file( self.variables_pool.set_file(
tool_name=tool_call_name, value=message_file.id, name=save_as) tool_name=tool_call_name, value=message_file_id, name=save_as)
# publish message file # publish message file
self.queue_manager.publish(QueueMessageFileEvent( self.queue_manager.publish(QueueMessageFileEvent(
message_file_id=message_file.id message_file_id=message_file_id
), PublishFrom.APPLICATION_MANAGER) ), PublishFrom.APPLICATION_MANAGER)
# add message file ids # add message file ids
message_file_ids.append(message_file.id) message_file_ids.append(message_file_id)
return tool_invoke_response, tool_invoke_meta return tool_invoke_response, tool_invoke_meta

View File

@ -50,6 +50,9 @@ class FunctionCallAgentRunner(BaseAgentRunner):
} }
final_answer = '' final_answer = ''
# get tracing instance
trace_manager = app_generate_entity.trace_manager
def increase_usage(final_llm_usage_dict: dict[str, LLMUsage], usage: LLMUsage): def increase_usage(final_llm_usage_dict: dict[str, LLMUsage], usage: LLMUsage):
if not final_llm_usage_dict['usage']: if not final_llm_usage_dict['usage']:
final_llm_usage_dict['usage'] = usage final_llm_usage_dict['usage'] = usage
@ -243,18 +246,19 @@ class FunctionCallAgentRunner(BaseAgentRunner):
message=self.message, message=self.message,
invoke_from=self.application_generate_entity.invoke_from, invoke_from=self.application_generate_entity.invoke_from,
agent_tool_callback=self.agent_callback, agent_tool_callback=self.agent_callback,
trace_manager=trace_manager,
) )
# publish files # publish files
for message_file, save_as in message_files: for message_file_id, save_as in message_files:
if save_as: if save_as:
self.variables_pool.set_file(tool_name=tool_call_name, value=message_file.id, name=save_as) self.variables_pool.set_file(tool_name=tool_call_name, value=message_file_id, name=save_as)
# publish message file # publish message file
self.queue_manager.publish(QueueMessageFileEvent( self.queue_manager.publish(QueueMessageFileEvent(
message_file_id=message_file.id message_file_id=message_file_id
), PublishFrom.APPLICATION_MANAGER) ), PublishFrom.APPLICATION_MANAGER)
# add message file ids # add message file ids
message_file_ids.append(message_file.id) message_file_ids.append(message_file_id)
tool_response = { tool_response = {
"tool_call_id": tool_call_id, "tool_call_id": tool_call_id,

View File

@@ -40,7 +40,7 @@ class AgentConfigManager:
                     'provider_type': tool['provider_type'],
                     'provider_id': tool['provider_id'],
                     'tool_name': tool['tool_name'],
-                    'tool_parameters': tool['tool_parameters'] if 'tool_parameters' in tool else {}
+                    'tool_parameters': tool.get('tool_parameters', {})
                 }

                 agent_tools.append(AgentToolEntity(**agent_tool_properties))
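The change above replaces a conditional-expression lookup with dict.get and a default; for a plain key lookup the two are equivalent, and the second is shorter:

    tool = {'provider_id': 'p1', 'tool_name': 'search'}

    # before: explicit membership test
    params_old = tool['tool_parameters'] if 'tool_parameters' in tool else {}
    # after: dict.get with a default, same result
    params_new = tool.get('tool_parameters', {})

    assert params_old == params_new == {}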


@@ -114,6 +114,10 @@ class VariableEntity(BaseModel):
     default: Optional[str] = None
     hint: Optional[str] = None

+    @property
+    def name(self) -> str:
+        return self.variable
+

 class ExternalDataVariableEntity(BaseModel):
     """
@@ -183,6 +187,14 @@ class TextToSpeechEntity(BaseModel):
     language: Optional[str] = None


+class TracingConfigEntity(BaseModel):
+    """
+    Tracing Config Entity.
+    """
+    enabled: bool
+    tracing_provider: str
+
+
 class FileExtraConfig(BaseModel):
     """
     File Upload Entity.
@@ -199,7 +211,7 @@ class AppAdditionalFeatures(BaseModel):
     more_like_this: bool = False
     speech_to_text: bool = False
     text_to_speech: Optional[TextToSpeechEntity] = None
+    trace_config: Optional[TracingConfigEntity] = None

 class AppConfig(BaseModel):
     """


@@ -20,6 +20,7 @@ from core.app.entities.app_invoke_entities import AdvancedChatAppGenerateEntity,
 from core.app.entities.task_entities import ChatbotAppBlockingResponse, ChatbotAppStreamResponse
 from core.file.message_file_parser import MessageFileParser
 from core.model_runtime.errors.invoke import InvokeAuthorizationError, InvokeError
+from core.ops.ops_trace_manager import TraceQueueManager
 from extensions.ext_database import db
 from models.account import Account
 from models.model import App, Conversation, EndUser, Message
@@ -29,13 +30,14 @@ logger = logging.getLogger(__name__)

 class AdvancedChatAppGenerator(MessageBasedAppGenerator):
-    def generate(self, app_model: App,
-                 workflow: Workflow,
-                 user: Union[Account, EndUser],
-                 args: dict,
-                 invoke_from: InvokeFrom,
-                 stream: bool = True) \
-            -> Union[dict, Generator[dict, None, None]]:
+    def generate(
+        self, app_model: App,
+        workflow: Workflow,
+        user: Union[Account, EndUser],
+        args: dict,
+        invoke_from: InvokeFrom,
+        stream: bool = True,
+    ) -> Union[dict, Generator[dict, None, None]]:
         """
         Generate App response.
@@ -57,7 +59,7 @@ class AdvancedChatAppGenerator(MessageBasedAppGenerator):
         inputs = args['inputs']

         extras = {
-            "auto_generate_conversation_name": args['auto_generate_name'] if 'auto_generate_name' in args else False
+            "auto_generate_conversation_name": args.get('auto_generate_name', False)
         }

         # get conversation
@@ -84,6 +86,13 @@ class AdvancedChatAppGenerator(MessageBasedAppGenerator):
             workflow=workflow
         )

+        # get tracing instance
+        trace_manager = TraceQueueManager(app_id=app_model.id)
+
+        if invoke_from == InvokeFrom.DEBUGGER:
+            # always enable retriever resource in debugger mode
+            app_config.additional_features.show_retrieve_source = True
+
         # init application generate entity
         application_generate_entity = AdvancedChatAppGenerateEntity(
             task_id=str(uuid.uuid4()),
@@ -95,7 +104,8 @@ class AdvancedChatAppGenerator(MessageBasedAppGenerator):
             user_id=user.id,
             stream=stream,
             invoke_from=invoke_from,
-            extras=extras
+            extras=extras,
+            trace_manager=trace_manager
         )

         return self._generate(


@@ -70,7 +70,8 @@ class AdvancedChatAppRunner(AppRunner):
                 app_record=app_record,
                 app_generate_entity=application_generate_entity,
                 inputs=inputs,
-                query=query
+                query=query,
+                message_id=message.id
         ):
             return
@@ -156,11 +157,14 @@ class AdvancedChatAppRunner(AppRunner):
         # return workflow
         return workflow

-    def handle_input_moderation(self, queue_manager: AppQueueManager,
-                                app_record: App,
-                                app_generate_entity: AdvancedChatAppGenerateEntity,
-                                inputs: dict,
-                                query: str) -> bool:
+    def handle_input_moderation(
+        self, queue_manager: AppQueueManager,
+        app_record: App,
+        app_generate_entity: AdvancedChatAppGenerateEntity,
+        inputs: dict,
+        query: str,
+        message_id: str
+    ) -> bool:
         """
         Handle input moderation
         :param queue_manager: application queue manager
@@ -168,6 +172,7 @@ class AdvancedChatAppRunner(AppRunner):
         :param app_generate_entity: application generate entity
         :param inputs: inputs
         :param query: query
+        :param message_id: message id
         :return:
         """
         try:
@@ -178,6 +183,7 @@ class AdvancedChatAppRunner(AppRunner):
                 app_generate_entity=app_generate_entity,
                 inputs=inputs,
                 query=query,
+                message_id=message_id,
             )
         except ModerationException as e:
             self._stream_output(


@@ -42,6 +42,7 @@ from core.app.task_pipeline.workflow_cycle_manage import WorkflowCycleManage
 from core.file.file_obj import FileVar
 from core.model_runtime.entities.llm_entities import LLMUsage
 from core.model_runtime.utils.encoders import jsonable_encoder
+from core.ops.ops_trace_manager import TraceQueueManager
 from core.workflow.entities.node_entities import NodeType, SystemVariable
 from core.workflow.nodes.answer.answer_node import AnswerNode
 from core.workflow.nodes.answer.entities import TextGenerateRouteChunk, VarGenerateRouteChunk
@@ -69,13 +70,15 @@ class AdvancedChatAppGenerateTaskPipeline(BasedGenerateTaskPipeline, WorkflowCyc
     _workflow_system_variables: dict[SystemVariable, Any]
     _iteration_nested_relations: dict[str, list[str]]

-    def __init__(self, application_generate_entity: AdvancedChatAppGenerateEntity,
-                 workflow: Workflow,
-                 queue_manager: AppQueueManager,
-                 conversation: Conversation,
-                 message: Message,
-                 user: Union[Account, EndUser],
-                 stream: bool) -> None:
+    def __init__(
+        self, application_generate_entity: AdvancedChatAppGenerateEntity,
+        workflow: Workflow,
+        queue_manager: AppQueueManager,
+        conversation: Conversation,
+        message: Message,
+        user: Union[Account, EndUser],
+        stream: bool
+    ) -> None:
         """
         Initialize AdvancedChatAppGenerateTaskPipeline.
         :param application_generate_entity: application generate entity
@@ -126,14 +129,16 @@ class AdvancedChatAppGenerateTaskPipeline(BasedGenerateTaskPipeline, WorkflowCyc
             self._application_generate_entity.query
         )

-        generator = self._process_stream_response()
+        generator = self._process_stream_response(
+            trace_manager=self._application_generate_entity.trace_manager
+        )
         if self._stream:
             return self._to_stream_response(generator)
         else:
             return self._to_blocking_response(generator)

     def _to_blocking_response(self, generator: Generator[StreamResponse, None, None]) \
             -> ChatbotAppBlockingResponse:
         """
         Process blocking response.
         :return:
@@ -164,7 +169,7 @@ class AdvancedChatAppGenerateTaskPipeline(BasedGenerateTaskPipeline, WorkflowCyc
         raise Exception('Queue listening stopped unexpectedly.')

     def _to_stream_response(self, generator: Generator[StreamResponse, None, None]) \
             -> Generator[ChatbotAppStreamResponse, None, None]:
         """
         To stream response.
         :return:
@@ -177,7 +182,9 @@ class AdvancedChatAppGenerateTaskPipeline(BasedGenerateTaskPipeline, WorkflowCyc
                 stream_response=stream_response
             )

-    def _process_stream_response(self) -> Generator[StreamResponse, None, None]:
+    def _process_stream_response(
+        self, trace_manager: Optional[TraceQueueManager] = None
+    ) -> Generator[StreamResponse, None, None]:
         """
         Process stream response.
         :return:
@@ -249,7 +256,9 @@ class AdvancedChatAppGenerateTaskPipeline(BasedGenerateTaskPipeline, WorkflowCyc
                 yield self._handle_iteration_to_stream_response(self._application_generate_entity.task_id, event)
                 self._handle_iteration_operation(event)
             elif isinstance(event, QueueStopEvent | QueueWorkflowSucceededEvent | QueueWorkflowFailedEvent):
-                workflow_run = self._handle_workflow_finished(event)
+                workflow_run = self._handle_workflow_finished(
+                    event, conversation_id=self._conversation.id, trace_manager=trace_manager
+                )
                 if workflow_run:
                     yield self._workflow_finish_to_stream_response(
                         task_id=self._application_generate_entity.task_id,
@@ -292,7 +301,7 @@ class AdvancedChatAppGenerateTaskPipeline(BasedGenerateTaskPipeline, WorkflowCyc
                     continue

                 if not self._is_stream_out_support(
                     event=event
                 ):
                     continue
@@ -361,7 +370,7 @@ class AdvancedChatAppGenerateTaskPipeline(BasedGenerateTaskPipeline, WorkflowCyc
             id=self._message.id,
             **extras
         )

     def _get_stream_generate_routes(self) -> dict[str, ChatflowStreamGenerateRoute]:
         """
         Get stream generate routes.
@@ -391,9 +400,9 @@ class AdvancedChatAppGenerateTaskPipeline(BasedGenerateTaskPipeline, WorkflowCyc
             )

         return stream_generate_routes

     def _get_answer_start_at_node_ids(self, graph: dict, target_node_id: str) \
             -> list[str]:
         """
         Get answer start at node id.
         :param graph: graph
@@ -414,14 +423,14 @@ class AdvancedChatAppGenerateTaskPipeline(BasedGenerateTaskPipeline, WorkflowCyc
             target_node = next((node for node in nodes if node.get('id') == target_node_id), None)
             if not target_node:
                 return []

             node_iteration_id = target_node.get('data', {}).get('iteration_id')
             # get iteration start node id
             for node in nodes:
                 if node.get('id') == node_iteration_id:
                     if node.get('data', {}).get('start_node_id') == target_node_id:
                         return [target_node_id]

             return []

         start_node_ids = []
@@ -457,7 +466,7 @@ class AdvancedChatAppGenerateTaskPipeline(BasedGenerateTaskPipeline, WorkflowCyc
                 start_node_ids.extend(sub_start_node_ids)

         return start_node_ids

     def _get_iteration_nested_relations(self, graph: dict) -> dict[str, list[str]]:
         """
         Get iteration nested relations.
@@ -466,18 +475,18 @@ class AdvancedChatAppGenerateTaskPipeline(BasedGenerateTaskPipeline, WorkflowCyc
         """
         nodes = graph.get('nodes')

         iteration_ids = [node.get('id') for node in nodes
                          if node.get('data', {}).get('type') in [
                              NodeType.ITERATION.value,
                              NodeType.LOOP.value,
                          ]]

         return {
             iteration_id: [
                 node.get('id') for node in nodes if node.get('data', {}).get('iteration_id') == iteration_id
             ] for iteration_id in iteration_ids
         }

     def _generate_stream_outputs_when_node_started(self) -> Generator:
         """
         Generate stream outputs.
@@ -485,8 +494,8 @@ class AdvancedChatAppGenerateTaskPipeline(BasedGenerateTaskPipeline, WorkflowCyc
         """
         if self._task_state.current_stream_generate_state:
             route_chunks = self._task_state.current_stream_generate_state.generate_route[
                 self._task_state.current_stream_generate_state.current_route_position:
             ]

             for route_chunk in route_chunks:
                 if route_chunk.type == 'text':
@@ -506,7 +515,8 @@ class AdvancedChatAppGenerateTaskPipeline(BasedGenerateTaskPipeline, WorkflowCyc
             # all route chunks are generated
             if self._task_state.current_stream_generate_state.current_route_position == len(
-                    self._task_state.current_stream_generate_state.generate_route):
+                self._task_state.current_stream_generate_state.generate_route
+            ):
                 self._task_state.current_stream_generate_state = None

     def _generate_stream_outputs_when_node_finished(self) -> Optional[Generator]:
@@ -519,7 +529,7 @@ class AdvancedChatAppGenerateTaskPipeline(BasedGenerateTaskPipeline, WorkflowCyc
         route_chunks = self._task_state.current_stream_generate_state.generate_route[
             self._task_state.current_stream_generate_state.current_route_position:]

         for route_chunk in route_chunks:
             if route_chunk.type == 'text':
                 route_chunk = cast(TextGenerateRouteChunk, route_chunk)
@@ -551,7 +561,8 @@ class AdvancedChatAppGenerateTaskPipeline(BasedGenerateTaskPipeline, WorkflowCyc
                         value = iteration_state.current_index
                     elif value_selector[1] == 'item':
-                        value = iterator_selector[iteration_state.current_index] if iteration_state.current_index < len(
-                            iterator_selector) else None
+                        value = iterator_selector[iteration_state.current_index] if iteration_state.current_index < len(
+                            iterator_selector
+                        ) else None
                 else:
                     # check chunk node id is before current node id or equal to current node id
                     if route_chunk_node_id not in self._task_state.ran_node_execution_infos:
@@ -562,14 +573,15 @@ class AdvancedChatAppGenerateTaskPipeline(BasedGenerateTaskPipeline, WorkflowCyc
                     # get route chunk node execution info
                     route_chunk_node_execution_info = self._task_state.ran_node_execution_infos[route_chunk_node_id]
                     if (route_chunk_node_execution_info.node_type == NodeType.LLM
                             and latest_node_execution_info.node_type == NodeType.LLM):
                         # only LLM support chunk stream output
                         self._task_state.current_stream_generate_state.current_route_position += 1
                         continue

                     # get route chunk node execution
-                    route_chunk_node_execution = db.session.query(WorkflowNodeExecution).filter(
-                        WorkflowNodeExecution.id == route_chunk_node_execution_info.workflow_node_execution_id).first()
+                    route_chunk_node_execution = db.session.query(WorkflowNodeExecution).filter(
+                        WorkflowNodeExecution.id == route_chunk_node_execution_info.workflow_node_execution_id
+                    ).first()

                     outputs = route_chunk_node_execution.outputs_dict
@@ -631,7 +643,8 @@ class AdvancedChatAppGenerateTaskPipeline(BasedGenerateTaskPipeline, WorkflowCyc
             # all route chunks are generated
             if self._task_state.current_stream_generate_state.current_route_position == len(
-                    self._task_state.current_stream_generate_state.generate_route):
+                self._task_state.current_stream_generate_state.generate_route
+            ):
                 self._task_state.current_stream_generate_state = None

     def _is_stream_out_support(self, event: QueueTextChunkEvent) -> bool:


@@ -19,6 +19,7 @@ from core.app.apps.message_based_app_queue_manager import MessageBasedAppQueueMa
 from core.app.entities.app_invoke_entities import AgentChatAppGenerateEntity, InvokeFrom
 from core.file.message_file_parser import MessageFileParser
 from core.model_runtime.errors.invoke import InvokeAuthorizationError, InvokeError
+from core.ops.ops_trace_manager import TraceQueueManager
 from extensions.ext_database import db
 from models.account import Account
 from models.model import App, EndUser
@@ -56,7 +57,7 @@ class AgentChatAppGenerator(MessageBasedAppGenerator):
         inputs = args['inputs']

         extras = {
-            "auto_generate_conversation_name": args['auto_generate_name'] if 'auto_generate_name' in args else True
+            "auto_generate_conversation_name": args.get('auto_generate_name', True)
         }

         # get conversation
@@ -82,6 +83,11 @@ class AgentChatAppGenerator(MessageBasedAppGenerator):
                 config=args.get('model_config')
             )

+            # always enable retriever resource in debugger mode
+            override_model_config_dict["retriever_resource"] = {
+                "enabled": True
+            }
+
         # parse files
         files = args['files'] if args.get('files') else []
         message_file_parser = MessageFileParser(tenant_id=app_model.tenant_id, app_id=app_model.id)
@@ -103,6 +109,9 @@ class AgentChatAppGenerator(MessageBasedAppGenerator):
             override_config_dict=override_model_config_dict
         )

+        # get tracing instance
+        trace_manager = TraceQueueManager(app_model.id)
+
         # init application generate entity
         application_generate_entity = AgentChatAppGenerateEntity(
             task_id=str(uuid.uuid4()),
@@ -116,7 +125,8 @@ class AgentChatAppGenerator(MessageBasedAppGenerator):
             stream=stream,
             invoke_from=invoke_from,
             extras=extras,
-            call_depth=0
+            call_depth=0,
+            trace_manager=trace_manager
         )

         # init generate records
@@ -153,7 +163,7 @@ class AgentChatAppGenerator(MessageBasedAppGenerator):
             conversation=conversation,
             message=message,
             user=user,
-            stream=stream
+            stream=stream,
         )

         return AgentChatAppGenerateResponseConverter.convert(
@@ -161,11 +171,13 @@ class AgentChatAppGenerator(MessageBasedAppGenerator):
             invoke_from=invoke_from
         )

-    def _generate_worker(self, flask_app: Flask,
-                         application_generate_entity: AgentChatAppGenerateEntity,
-                         queue_manager: AppQueueManager,
-                         conversation_id: str,
-                         message_id: str) -> None:
+    def _generate_worker(
+        self, flask_app: Flask,
+        application_generate_entity: AgentChatAppGenerateEntity,
+        queue_manager: AppQueueManager,
+        conversation_id: str,
+        message_id: str,
+    ) -> None:
         """
         Generate worker in a new thread.
         :param flask_app: Flask app
@@ -187,7 +199,7 @@ class AgentChatAppGenerator(MessageBasedAppGenerator):
                     application_generate_entity=application_generate_entity,
                     queue_manager=queue_manager,
                     conversation=conversation,
-                    message=message
+                    message=message,
                 )
             except GenerateTaskStoppedException:
                 pass


@@ -28,10 +28,13 @@ class AgentChatAppRunner(AppRunner):
     """
     Agent Application Runner
     """
-    def run(self, application_generate_entity: AgentChatAppGenerateEntity,
-            queue_manager: AppQueueManager,
-            conversation: Conversation,
-            message: Message) -> None:
+
+    def run(
+        self, application_generate_entity: AgentChatAppGenerateEntity,
+        queue_manager: AppQueueManager,
+        conversation: Conversation,
+        message: Message,
+    ) -> None:
         """
         Run assistant application
         :param application_generate_entity: application generate entity
@@ -100,6 +103,7 @@ class AgentChatAppRunner(AppRunner):
                 app_generate_entity=application_generate_entity,
                 inputs=inputs,
                 query=query,
+                message_id=message.id
             )
         except ModerationException as e:
             self.direct_output(
@@ -199,7 +203,7 @@ class AgentChatAppRunner(AppRunner):
             llm_model = cast(LargeLanguageModel, model_instance.model_type_instance)
             model_schema = llm_model.get_model_schema(model_instance.model, model_instance.credentials)

-            if set([ModelFeature.MULTI_TOOL_CALL, ModelFeature.TOOL_CALL]).intersection(model_schema.features or []):
+            if {ModelFeature.MULTI_TOOL_CALL, ModelFeature.TOOL_CALL}.intersection(model_schema.features or []):
                 agent_entity.strategy = AgentEntity.Strategy.FUNCTION_CALLING

         conversation = db.session.query(Conversation).filter(Conversation.id == conversation.id).first()
@@ -219,7 +223,7 @@ class AgentChatAppRunner(AppRunner):
             runner_cls = FunctionCallAgentRunner
         else:
             raise ValueError(f"Invalid agent strategy: {agent_entity.strategy}")

         runner = runner_cls(
             tenant_id=app_config.tenant_id,
             application_generate_entity=application_generate_entity,
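The set([...]) to {...} change in the hunk above is a pure style cleanup; a set literal avoids building a throwaway list first, and the result is identical:

    features = ['tool-call', 'vision']

    # both build the same set; the literal skips the intermediate list
    old_style = set(['multi-tool-call', 'tool-call']).intersection(features)
    new_style = {'multi-tool-call', 'tool-call'}.intersection(features)

    assert old_style == new_style == {'tool-call'}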


@@ -1,52 +1,56 @@
+from collections.abc import Mapping
+from typing import Any, Optional
+
 from core.app.app_config.entities import AppConfig, VariableEntity


 class BaseAppGenerator:
-    def _get_cleaned_inputs(self, user_inputs: dict, app_config: AppConfig):
-        if user_inputs is None:
-            user_inputs = {}
-
-        filtered_inputs = {}
-
+    def _get_cleaned_inputs(self, user_inputs: Optional[Mapping[str, Any]], app_config: AppConfig) -> Mapping[str, Any]:
+        user_inputs = user_inputs or {}
         # Filter input variables from form configuration, handle required fields, default values, and option values
         variables = app_config.variables
-        for variable_config in variables:
-            variable = variable_config.variable
-
-            if (variable not in user_inputs
-                    or user_inputs[variable] is None
-                    or (isinstance(user_inputs[variable], str) and user_inputs[variable] == '')):
-                if variable_config.required:
-                    raise ValueError(f"{variable} is required in input form")
-                else:
-                    filtered_inputs[variable] = variable_config.default if variable_config.default is not None else ""
-                    continue
-
-            value = user_inputs[variable]
-
-            if value is not None:
-                if variable_config.type != VariableEntity.Type.NUMBER and not isinstance(value, str):
-                    raise ValueError(f"{variable} in input form must be a string")
-                elif variable_config.type == VariableEntity.Type.NUMBER and isinstance(value, str):
-                    if '.' in value:
-                        value = float(value)
-                    else:
-                        value = int(value)
-
-            if variable_config.type == VariableEntity.Type.SELECT:
-                options = variable_config.options if variable_config.options is not None else []
-                if value not in options:
-                    raise ValueError(f"{variable} in input form must be one of the following: {options}")
-            elif variable_config.type in [VariableEntity.Type.TEXT_INPUT, VariableEntity.Type.PARAGRAPH]:
-                if variable_config.max_length is not None:
-                    max_length = variable_config.max_length
-                    if len(value) > max_length:
-                        raise ValueError(f'{variable} in input form must be less than {max_length} characters')
-
-            if value and isinstance(value, str):
-                filtered_inputs[variable] = value.replace('\x00', '')
-            else:
-                filtered_inputs[variable] = value if value is not None else None
-
+        filtered_inputs = {var.name: self._validate_input(inputs=user_inputs, var=var) for var in variables}
+        filtered_inputs = {k: self._sanitize_value(v) for k, v in filtered_inputs.items()}
         return filtered_inputs

+    def _validate_input(self, *, inputs: Mapping[str, Any], var: VariableEntity):
+        user_input_value = inputs.get(var.name)
+        if var.required and not user_input_value:
+            raise ValueError(f'{var.name} is required in input form')
+        if not var.required and not user_input_value:
+            # TODO: should we return None here if the default value is None?
+            return var.default or ''
+        if (
+            var.type
+            in (
+                VariableEntity.Type.TEXT_INPUT,
+                VariableEntity.Type.SELECT,
+                VariableEntity.Type.PARAGRAPH,
+            )
+            and user_input_value
+            and not isinstance(user_input_value, str)
+        ):
+            raise ValueError(f"(type '{var.type}') {var.name} in input form must be a string")
+        if var.type == VariableEntity.Type.NUMBER and isinstance(user_input_value, str):
+            # may raise ValueError if user_input_value is not a valid number
+            try:
+                if '.' in user_input_value:
+                    return float(user_input_value)
+                else:
+                    return int(user_input_value)
+            except ValueError:
+                raise ValueError(f"{var.name} in input form must be a valid number")
+        if var.type == VariableEntity.Type.SELECT:
+            options = var.options or []
+            if user_input_value not in options:
+                raise ValueError(f'{var.name} in input form must be one of the following: {options}')
+        elif var.type in (VariableEntity.Type.TEXT_INPUT, VariableEntity.Type.PARAGRAPH):
+            if var.max_length and user_input_value and len(user_input_value) > var.max_length:
+                raise ValueError(f'{var.name} in input form must be less than {var.max_length} characters')
+
+        return user_input_value
+
+    def _sanitize_value(self, value: Any) -> Any:
+        if isinstance(value, str):
+            return value.replace('\x00', '')
+        return value
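The rewrite above replaces one long validation loop with a dict comprehension over _validate_input plus a _sanitize_value pass. A condensed, runnable sketch of the resulting per-variable semantics (using a simplified stand-in for VariableEntity):

    from dataclasses import dataclass
    from typing import Any, Optional

    @dataclass
    class Var:  # simplified stand-in for VariableEntity
        name: str
        type: str  # 'text-input' | 'number' | 'select' | 'paragraph'
        required: bool = False
        default: Optional[str] = None
        options: Optional[list] = None
        max_length: Optional[int] = None

    def validate_input(inputs: dict, var: Var) -> Any:
        value = inputs.get(var.name)
        if var.required and not value:
            raise ValueError(f'{var.name} is required in input form')
        if not var.required and not value:
            return var.default or ''
        if var.type == 'number' and isinstance(value, str):
            return float(value) if '.' in value else int(value)
        if var.type == 'select' and value not in (var.options or []):
            raise ValueError(f'{var.name} must be one of {var.options}')
        if var.max_length and isinstance(value, str) and len(value) > var.max_length:
            raise ValueError(f'{var.name} must be under {var.max_length} characters')
        return value

    print(validate_input({'age': '3.5'}, Var('age', 'number')))         # 3.5
    print(validate_input({}, Var('city', 'text-input', default='NY')))  # NY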


@@ -338,11 +338,14 @@ class AppRunner:
             ), PublishFrom.APPLICATION_MANAGER
         )

-    def moderation_for_inputs(self, app_id: str,
-                              tenant_id: str,
-                              app_generate_entity: AppGenerateEntity,
-                              inputs: dict,
-                              query: str) -> tuple[bool, dict, str]:
+    def moderation_for_inputs(
+        self, app_id: str,
+        tenant_id: str,
+        app_generate_entity: AppGenerateEntity,
+        inputs: dict,
+        query: str,
+        message_id: str,
+    ) -> tuple[bool, dict, str]:
         """
         Process sensitive_word_avoidance.
         :param app_id: app id
@@ -350,6 +353,7 @@ class AppRunner:
         :param app_generate_entity: app generate entity
         :param inputs: inputs
         :param query: query
+        :param message_id: message id
         :return:
         """
         moderation_feature = InputModeration()
@@ -358,7 +362,9 @@ class AppRunner:
             tenant_id=tenant_id,
             app_config=app_generate_entity.app_config,
             inputs=inputs,
-            query=query if query else ''
+            query=query if query else '',
+            message_id=message_id,
+            trace_manager=app_generate_entity.trace_manager
         )

     def check_hosting_moderation(self, application_generate_entity: EasyUIBasedAppGenerateEntity,


@@ -50,6 +50,9 @@ class ChatAppConfigManager(BaseAppConfigManager):
             app_model_config_dict = app_model_config.to_dict()
             config_dict = app_model_config_dict.copy()
         else:
+            if not override_config_dict:
+                raise Exception('override_config_dict is required when config_from is ARGS')
+
             config_dict = override_config_dict

         app_mode = AppMode.value_of(app_model.mode)


@@ -19,6 +19,7 @@ from core.app.apps.message_based_app_queue_manager import MessageBasedAppQueueMa
 from core.app.entities.app_invoke_entities import ChatAppGenerateEntity, InvokeFrom
 from core.file.message_file_parser import MessageFileParser
 from core.model_runtime.errors.invoke import InvokeAuthorizationError, InvokeError
+from core.ops.ops_trace_manager import TraceQueueManager
 from extensions.ext_database import db
 from models.account import Account
 from models.model import App, EndUser
@@ -27,12 +28,13 @@ logger = logging.getLogger(__name__)

 class ChatAppGenerator(MessageBasedAppGenerator):
-    def generate(self, app_model: App,
-                 user: Union[Account, EndUser],
-                 args: Any,
-                 invoke_from: InvokeFrom,
-                 stream: bool = True) \
-            -> Union[dict, Generator[dict, None, None]]:
+    def generate(
+        self, app_model: App,
+        user: Union[Account, EndUser],
+        args: Any,
+        invoke_from: InvokeFrom,
+        stream: bool = True,
+    ) -> Union[dict, Generator[dict, None, None]]:
         """
         Generate App response.
@@ -53,7 +55,7 @@ class ChatAppGenerator(MessageBasedAppGenerator):
         inputs = args['inputs']

         extras = {
-            "auto_generate_conversation_name": args['auto_generate_name'] if 'auto_generate_name' in args else True
+            "auto_generate_conversation_name": args.get('auto_generate_name', True)
         }

         # get conversation
@@ -79,6 +81,11 @@ class ChatAppGenerator(MessageBasedAppGenerator):
                 config=args.get('model_config')
             )

+            # always enable retriever resource in debugger mode
+            override_model_config_dict["retriever_resource"] = {
+                "enabled": True
+            }
+
         # parse files
         files = args['files'] if args.get('files') else []
         message_file_parser = MessageFileParser(tenant_id=app_model.tenant_id, app_id=app_model.id)
@@ -100,6 +107,9 @@ class ChatAppGenerator(MessageBasedAppGenerator):
             override_config_dict=override_model_config_dict
         )

+        # get tracing instance
+        trace_manager = TraceQueueManager(app_model.id)
+
         # init application generate entity
         application_generate_entity = ChatAppGenerateEntity(
             task_id=str(uuid.uuid4()),
@@ -112,7 +122,8 @@ class ChatAppGenerator(MessageBasedAppGenerator):
             user_id=user.id,
             stream=stream,
             invoke_from=invoke_from,
-            extras=extras
+            extras=extras,
+            trace_manager=trace_manager
         )

         # init generate records
@@ -149,7 +160,7 @@ class ChatAppGenerator(MessageBasedAppGenerator):
             conversation=conversation,
             message=message,
             user=user,
-            stream=stream
+            stream=stream,
         )

         return ChatAppGenerateResponseConverter.convert(


@@ -96,6 +96,7 @@ class ChatAppRunner(AppRunner):
                 app_generate_entity=application_generate_entity,
                 inputs=inputs,
                 query=query,
+                message_id=message.id
             )
         except ModerationException as e:
             self.direct_output(
@@ -154,7 +155,7 @@ class ChatAppRunner(AppRunner):
                 application_generate_entity.invoke_from
             )

-            dataset_retrieval = DatasetRetrieval()
+            dataset_retrieval = DatasetRetrieval(application_generate_entity)
             context = dataset_retrieval.retrieve(
                 app_id=app_record.id,
                 user_id=application_generate_entity.user_id,
@@ -165,7 +166,8 @@ class ChatAppRunner(AppRunner):
                 invoke_from=application_generate_entity.invoke_from,
                 show_retrieve_source=app_config.additional_features.show_retrieve_source,
                 hit_callback=hit_callback,
-                memory=memory
+                memory=memory,
+                message_id=message.id,
             )

         # reorganize all inputs and template to prompt messages
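Note that DatasetRetrieval now takes the generate entity in its constructor instead of growing more retrieve() keyword arguments; presumably this gives retrieval access to shared request context such as the trace manager. A minimal sketch of that injection pattern (class bodies hypothetical, not Dify's implementation):

    from typing import Optional

    class GenerateEntity:
        def __init__(self, user_id: str, trace_manager: Optional[object] = None):
            self.user_id = user_id
            self.trace_manager = trace_manager

    class DatasetRetrieval:
        def __init__(self, application_generate_entity: Optional[GenerateEntity] = None):
            # the whole entity is injected once, so later features (tracing,
            # message ids) stay reachable without widening every method signature
            self.application_generate_entity = application_generate_entity

        def retrieve(self, query: str, message_id: str) -> str:
            entity = self.application_generate_entity
            if entity and entity.trace_manager:
                pass  # a retrieval trace could be enqueued here
            return f"context for {query!r} (message {message_id})"

    retrieval = DatasetRetrieval(GenerateEntity(user_id='u1'))
    print(retrieval.retrieve('pricing docs', message_id='m1'))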


@@ -19,6 +19,7 @@ from core.app.apps.message_based_app_queue_manager import MessageBasedAppQueueMa
 from core.app.entities.app_invoke_entities import CompletionAppGenerateEntity, InvokeFrom
 from core.file.message_file_parser import MessageFileParser
 from core.model_runtime.errors.invoke import InvokeAuthorizationError, InvokeError
+from core.ops.ops_trace_manager import TraceQueueManager
 from extensions.ext_database import db
 from models.account import Account
 from models.model import App, EndUser, Message
@@ -94,6 +95,9 @@ class CompletionAppGenerator(MessageBasedAppGenerator):
             override_config_dict=override_model_config_dict
         )

+        # get tracing instance
+        trace_manager = TraceQueueManager(app_model.id)
+
         # init application generate entity
         application_generate_entity = CompletionAppGenerateEntity(
             task_id=str(uuid.uuid4()),
@@ -105,7 +109,8 @@ class CompletionAppGenerator(MessageBasedAppGenerator):
             user_id=user.id,
             stream=stream,
             invoke_from=invoke_from,
-            extras=extras
+            extras=extras,
+            trace_manager=trace_manager
         )

         # init generate records
@@ -141,7 +146,7 @@ class CompletionAppGenerator(MessageBasedAppGenerator):
             conversation=conversation,
             message=message,
             user=user,
-            stream=stream
+            stream=stream,
         )

         return CompletionAppGenerateResponseConverter.convert(
@@ -158,7 +163,6 @@ class CompletionAppGenerator(MessageBasedAppGenerator):
         :param flask_app: Flask app
         :param application_generate_entity: application generate entity
         :param queue_manager: queue manager
-        :param conversation_id: conversation ID
         :param message_id: message ID
         :return:
         """
@@ -300,7 +304,7 @@ class CompletionAppGenerator(MessageBasedAppGenerator):
             conversation=conversation,
             message=message,
             user=user,
-            stream=stream
+            stream=stream,
         )

         return CompletionAppGenerateResponseConverter.convert(


@@ -77,6 +77,7 @@ class CompletionAppRunner(AppRunner):
                 app_generate_entity=application_generate_entity,
                 inputs=inputs,
                 query=query,
+                message_id=message.id
             )
         except ModerationException as e:
             self.direct_output(
@@ -114,7 +115,7 @@ class CompletionAppRunner(AppRunner):
         if dataset_config and dataset_config.retrieve_config.query_variable:
             query = inputs.get(dataset_config.retrieve_config.query_variable, "")

-            dataset_retrieval = DatasetRetrieval()
+            dataset_retrieval = DatasetRetrieval(application_generate_entity)
             context = dataset_retrieval.retrieve(
                 app_id=app_record.id,
                 user_id=application_generate_entity.user_id,
@@ -124,7 +125,8 @@ class CompletionAppRunner(AppRunner):
                 query=query,
                 invoke_from=application_generate_entity.invoke_from,
                 show_retrieve_source=app_config.additional_features.show_retrieve_source,
-                hit_callback=hit_callback
+                hit_callback=hit_callback,
+                message_id=message.id
             )

         # reorganize all inputs and template to prompt messages


@@ -35,22 +35,23 @@ logger = logging.getLogger(__name__)

 class MessageBasedAppGenerator(BaseAppGenerator):

-    def _handle_response(self, application_generate_entity: Union[
-        ChatAppGenerateEntity,
-        CompletionAppGenerateEntity,
-        AgentChatAppGenerateEntity,
-        AdvancedChatAppGenerateEntity
-    ],
-                         queue_manager: AppQueueManager,
-                         conversation: Conversation,
-                         message: Message,
-                         user: Union[Account, EndUser],
-                         stream: bool = False) \
-            -> Union[
-                ChatbotAppBlockingResponse,
-                CompletionAppBlockingResponse,
-                Generator[Union[ChatbotAppStreamResponse, CompletionAppStreamResponse], None, None]
-            ]:
+    def _handle_response(
+        self, application_generate_entity: Union[
+            ChatAppGenerateEntity,
+            CompletionAppGenerateEntity,
+            AgentChatAppGenerateEntity,
+            AdvancedChatAppGenerateEntity
+        ],
+        queue_manager: AppQueueManager,
+        conversation: Conversation,
+        message: Message,
+        user: Union[Account, EndUser],
+        stream: bool = False,
+    ) -> Union[
+        ChatbotAppBlockingResponse,
+        CompletionAppBlockingResponse,
+        Generator[Union[ChatbotAppStreamResponse, CompletionAppStreamResponse], None, None]
+    ]:
         """
         Handle response.
         :param application_generate_entity: application generate entity


@@ -20,6 +20,7 @@ from core.app.entities.app_invoke_entities import InvokeFrom, WorkflowAppGenerat
 from core.app.entities.task_entities import WorkflowAppBlockingResponse, WorkflowAppStreamResponse
 from core.file.message_file_parser import MessageFileParser
 from core.model_runtime.errors.invoke import InvokeAuthorizationError, InvokeError
+from core.ops.ops_trace_manager import TraceQueueManager
 from extensions.ext_database import db
 from models.account import Account
 from models.model import App, EndUser
@@ -29,14 +30,15 @@ logger = logging.getLogger(__name__)

 class WorkflowAppGenerator(BaseAppGenerator):
-    def generate(self, app_model: App,
-                 workflow: Workflow,
-                 user: Union[Account, EndUser],
-                 args: dict,
-                 invoke_from: InvokeFrom,
-                 stream: bool = True,
-                 call_depth: int = 0) \
-            -> Union[dict, Generator[dict, None, None]]:
+    def generate(
+        self, app_model: App,
+        workflow: Workflow,
+        user: Union[Account, EndUser],
+        args: dict,
+        invoke_from: InvokeFrom,
+        stream: bool = True,
+        call_depth: int = 0,
+    ) -> Union[dict, Generator[dict, None, None]]:
         """
         Generate App response.
@@ -46,6 +48,7 @@ class WorkflowAppGenerator(BaseAppGenerator):
         :param args: request args
         :param invoke_from: invoke from source
         :param stream: is stream
+        :param call_depth: call depth
         """
         inputs = args['inputs']
@@ -68,6 +71,9 @@ class WorkflowAppGenerator(BaseAppGenerator):
             workflow=workflow
         )

+        # get tracing instance
+        trace_manager = TraceQueueManager(app_model.id)
+
         # init application generate entity
         application_generate_entity = WorkflowAppGenerateEntity(
             task_id=str(uuid.uuid4()),
@@ -77,7 +83,8 @@ class WorkflowAppGenerator(BaseAppGenerator):
             user_id=user.id,
             stream=stream,
             invoke_from=invoke_from,
-            call_depth=call_depth
+            call_depth=call_depth,
+            trace_manager=trace_manager
         )

         return self._generate(
@@ -87,17 +94,16 @@ class WorkflowAppGenerator(BaseAppGenerator):
             application_generate_entity=application_generate_entity,
             invoke_from=invoke_from,
             stream=stream,
-            call_depth=call_depth
         )

-    def _generate(self, app_model: App,
-                  workflow: Workflow,
-                  user: Union[Account, EndUser],
-                  application_generate_entity: WorkflowAppGenerateEntity,
-                  invoke_from: InvokeFrom,
-                  stream: bool = True,
-                  call_depth: int = 0) \
-            -> Union[dict, Generator[dict, None, None]]:
+    def _generate(
+        self, app_model: App,
+        workflow: Workflow,
+        user: Union[Account, EndUser],
+        application_generate_entity: WorkflowAppGenerateEntity,
+        invoke_from: InvokeFrom,
+        stream: bool = True,
+    ) -> Union[dict, Generator[dict, None, None]]:
         """
         Generate App response.
@@ -131,7 +137,7 @@ class WorkflowAppGenerator(BaseAppGenerator):
             workflow=workflow,
             queue_manager=queue_manager,
             user=user,
-            stream=stream
+            stream=stream,
         )

         return WorkflowAppGenerateResponseConverter.convert(
@@ -158,10 +164,10 @@ class WorkflowAppGenerator(BaseAppGenerator):
         """
         if not node_id:
             raise ValueError('node_id is required')

         if args.get('inputs') is None:
             raise ValueError('inputs is required')

         extras = {
             "auto_generate_conversation_name": False
         }


@@ -1,6 +1,6 @@
 import logging
 from collections.abc import Generator
-from typing import Any, Union
+from typing import Any, Optional, Union

 from core.app.apps.base_app_queue_manager import AppQueueManager
 from core.app.entities.app_invoke_entities import (
@@ -36,6 +36,7 @@ from core.app.entities.task_entities import (
 )
 from core.app.task_pipeline.based_generate_task_pipeline import BasedGenerateTaskPipeline
 from core.app.task_pipeline.workflow_cycle_manage import WorkflowCycleManage
+from core.ops.ops_trace_manager import TraceQueueManager
 from core.workflow.entities.node_entities import NodeType, SystemVariable
 from core.workflow.nodes.end.end_node import EndNode
 from extensions.ext_database import db
@@ -104,7 +105,9 @@ class WorkflowAppGenerateTaskPipeline(BasedGenerateTaskPipeline, WorkflowCycleMa
             db.session.refresh(self._user)
             db.session.close()

-        generator = self._process_stream_response()
+        generator = self._process_stream_response(
+            trace_manager=self._application_generate_entity.trace_manager
+        )
         if self._stream:
             return self._to_stream_response(generator)
         else:
@@ -158,7 +161,10 @@ class WorkflowAppGenerateTaskPipeline(BasedGenerateTaskPipeline, WorkflowCycleMa
                 stream_response=stream_response
             )

-    def _process_stream_response(self) -> Generator[StreamResponse, None, None]:
+    def _process_stream_response(
+        self,
+        trace_manager: Optional[TraceQueueManager] = None
+    ) -> Generator[StreamResponse, None, None]:
         """
         Process stream response.
         :return:
@@ -215,7 +221,9 @@ class WorkflowAppGenerateTaskPipeline(BasedGenerateTaskPipeline, WorkflowCycleMa
                 yield self._handle_iteration_to_stream_response(self._application_generate_entity.task_id, event)
                 self._handle_iteration_operation(event)
             elif isinstance(event, QueueStopEvent | QueueWorkflowSucceededEvent | QueueWorkflowFailedEvent):
-                workflow_run = self._handle_workflow_finished(event)
+                workflow_run = self._handle_workflow_finished(
+                    event, trace_manager=trace_manager
+                )

                 # save workflow app log
                 self._save_workflow_app_log(workflow_run)


@@ -7,6 +7,7 @@ from core.app.app_config.entities import AppConfig, EasyUIBasedAppConfig, Workfl
 from core.entities.provider_configuration import ProviderModelBundle
 from core.file.file_obj import FileVar
 from core.model_runtime.entities.model_entities import AIModelEntity
+from core.ops.ops_trace_manager import TraceQueueManager


 class InvokeFrom(Enum):
@@ -89,6 +90,12 @@ class AppGenerateEntity(BaseModel):
     # extra parameters, like: auto_generate_conversation_name
     extras: dict[str, Any] = {}

+    # tracing instance
+    trace_manager: Optional[TraceQueueManager] = None
+
+    class Config:
+        arbitrary_types_allowed = True
+

 class EasyUIBasedAppGenerateEntity(AppGenerateEntity):
     """


@@ -44,6 +44,7 @@ from core.model_runtime.entities.message_entities import (
 )
 from core.model_runtime.model_providers.__base.large_language_model import LargeLanguageModel
 from core.model_runtime.utils.encoders import jsonable_encoder
+from core.ops.ops_trace_manager import TraceQueueManager, TraceTask, TraceTaskName
 from core.prompt.utils.prompt_message_util import PromptMessageUtil
 from core.prompt.utils.prompt_template_parser import PromptTemplateParser
 from events.message_event import message_was_created
@@ -100,7 +101,9 @@ class EasyUIBasedGenerateTaskPipeline(BasedGenerateTaskPipeline, MessageCycleMan
         self._conversation_name_generate_thread = None

-    def process(self) -> Union[
+    def process(
+        self,
+    ) -> Union[
         ChatbotAppBlockingResponse,
         CompletionAppBlockingResponse,
         Generator[Union[ChatbotAppStreamResponse, CompletionAppStreamResponse], None, None]
@@ -120,7 +123,9 @@ class EasyUIBasedGenerateTaskPipeline(BasedGenerateTaskPipeline, MessageCycleMan
             self._application_generate_entity.query
         )

-        generator = self._process_stream_response()
+        generator = self._process_stream_response(
+            trace_manager=self._application_generate_entity.trace_manager
+        )
         if self._stream:
             return self._to_stream_response(generator)
         else:
@@ -197,7 +202,9 @@ class EasyUIBasedGenerateTaskPipeline(BasedGenerateTaskPipeline, MessageCycleMan
                 stream_response=stream_response
             )

-    def _process_stream_response(self) -> Generator[StreamResponse, None, None]:
+    def _process_stream_response(
+        self, trace_manager: Optional[TraceQueueManager] = None
+    ) -> Generator[StreamResponse, None, None]:
         """
         Process stream response.
         :return:
@@ -224,7 +231,7 @@ class EasyUIBasedGenerateTaskPipeline(BasedGenerateTaskPipeline, MessageCycleMan
                     yield self._message_replace_to_stream_response(answer=output_moderation_answer)

                 # Save message
-                self._save_message()
+                self._save_message(trace_manager)

                 yield self._message_end_to_stream_response()
             elif isinstance(event, QueueRetrieverResourcesEvent):
@@ -269,7 +276,9 @@ class EasyUIBasedGenerateTaskPipeline(BasedGenerateTaskPipeline, MessageCycleMan
         if self._conversation_name_generate_thread:
             self._conversation_name_generate_thread.join()

-    def _save_message(self) -> None:
+    def _save_message(
+        self, trace_manager: Optional[TraceQueueManager] = None
+    ) -> None:
         """
         Save message.
         :return:
@@ -300,6 +309,15 @@ class EasyUIBasedGenerateTaskPipeline(BasedGenerateTaskPipeline, MessageCycleMan

         db.session.commit()

+        if trace_manager:
+            trace_manager.add_trace_task(
+                TraceTask(
+                    TraceTaskName.MESSAGE_TRACE,
+                    conversation_id=self._conversation.id,
+                    message_id=self._message.id
+                )
+            )
+
         message_was_created.send(
             self._message,
             application_generate_entity=self._application_generate_entity,
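The hook above is fire-and-forget: once the message commits, a MESSAGE_TRACE task is enqueued and the request path moves on, leaving export to a background consumer. A self-contained sketch of that producer/consumer shape (stand-in classes, not Dify's ops_trace_manager):

    import queue
    import threading

    class TraceTask:
        def __init__(self, name: str, **kwargs):
            self.name = name
            self.kwargs = kwargs

    class TraceQueueManager:
        def __init__(self):
            self._queue: "queue.Queue[TraceTask]" = queue.Queue()
            threading.Thread(target=self._worker, daemon=True).start()

        def add_trace_task(self, task: TraceTask) -> None:
            self._queue.put(task)  # non-blocking for the request thread

        def _worker(self) -> None:
            while True:
                task = self._queue.get()
                print(f"exporting {task.name} {task.kwargs}")  # ship to provider
                self._queue.task_done()

    manager = TraceQueueManager()
    manager.add_trace_task(TraceTask("MESSAGE_TRACE", conversation_id="c1", message_id="m1"))
    manager._queue.join()  # wait so the demo prints before exiting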


@@ -167,8 +167,11 @@ class MessageCycleManage:
                     extension = '.bin'
             else:
                 extension = '.bin'
-            # add sign url
-            url = ToolFileManager.sign_file(tool_file_id=tool_file_id, extension=extension)
+            # add sign url to local file
+            if message_file.url.startswith('http'):
+                url = message_file.url
+            else:
+                url = ToolFileManager.sign_file(tool_file_id=tool_file_id, extension=extension)

             return MessageFileStreamResponse(
                 task_id=self._application_generate_entity.task_id,


@ -22,6 +22,7 @@ from core.app.entities.task_entities import (
from core.app.task_pipeline.workflow_iteration_cycle_manage import WorkflowIterationCycleManage from core.app.task_pipeline.workflow_iteration_cycle_manage import WorkflowIterationCycleManage
from core.file.file_obj import FileVar from core.file.file_obj import FileVar
from core.model_runtime.utils.encoders import jsonable_encoder from core.model_runtime.utils.encoders import jsonable_encoder
from core.ops.ops_trace_manager import TraceQueueManager, TraceTask, TraceTaskName
from core.tools.tool_manager import ToolManager from core.tools.tool_manager import ToolManager
from core.workflow.entities.node_entities import NodeRunMetadataKey, NodeType from core.workflow.entities.node_entities import NodeRunMetadataKey, NodeType
from core.workflow.nodes.tool.entities import ToolNodeData from core.workflow.nodes.tool.entities import ToolNodeData
@ -94,11 +95,15 @@ class WorkflowCycleManage(WorkflowIterationCycleManage):
return workflow_run return workflow_run
def _workflow_run_success(self, workflow_run: WorkflowRun, def _workflow_run_success(
start_at: float, self, workflow_run: WorkflowRun,
total_tokens: int, start_at: float,
total_steps: int, total_tokens: int,
outputs: Optional[str] = None) -> WorkflowRun: total_steps: int,
outputs: Optional[str] = None,
conversation_id: Optional[str] = None,
trace_manager: Optional[TraceQueueManager] = None
) -> WorkflowRun:
""" """
Workflow run success Workflow run success
:param workflow_run: workflow run :param workflow_run: workflow run
@ -106,6 +111,7 @@ class WorkflowCycleManage(WorkflowIterationCycleManage):
:param total_tokens: total tokens :param total_tokens: total tokens
:param total_steps: total steps :param total_steps: total steps
:param outputs: outputs :param outputs: outputs
:param conversation_id: conversation id
:return: :return:
""" """
workflow_run.status = WorkflowRunStatus.SUCCEEDED.value workflow_run.status = WorkflowRunStatus.SUCCEEDED.value
@ -119,14 +125,27 @@ class WorkflowCycleManage(WorkflowIterationCycleManage):
db.session.refresh(workflow_run) db.session.refresh(workflow_run)
db.session.close() db.session.close()
if trace_manager:
trace_manager.add_trace_task(
TraceTask(
TraceTaskName.WORKFLOW_TRACE,
workflow_run=workflow_run,
conversation_id=conversation_id,
)
)
return workflow_run return workflow_run
def _workflow_run_failed(self, workflow_run: WorkflowRun, def _workflow_run_failed(
start_at: float, self, workflow_run: WorkflowRun,
total_tokens: int, start_at: float,
total_steps: int, total_tokens: int,
status: WorkflowRunStatus, total_steps: int,
error: str) -> WorkflowRun: status: WorkflowRunStatus,
error: str,
conversation_id: Optional[str] = None,
trace_manager: Optional[TraceQueueManager] = None
) -> WorkflowRun:
""" """
Workflow run failed Workflow run failed
:param workflow_run: workflow run :param workflow_run: workflow run
@@ -148,6 +167,15 @@ class WorkflowCycleManage(WorkflowIterationCycleManage):
         db.session.refresh(workflow_run)
         db.session.close()

+        if trace_manager:
+            trace_manager.add_trace_task(
+                TraceTask(
+                    TraceTaskName.WORKFLOW_TRACE,
+                    workflow_run=workflow_run,
+                    conversation_id=conversation_id,
+                )
+            )
+
         return workflow_run

     def _init_node_execution_from_workflow_run(self, workflow_run: WorkflowRun,
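Both the success and failure paths above now enqueue a `TraceTask(TraceTaskName.WORKFLOW_TRACE, ...)` whenever a `trace_manager` is supplied, keeping tracing out of the request path. `TraceQueueManager` itself is not part of this excerpt; below is a minimal sketch of the queue-and-background-worker pattern these call sites imply — everything beyond the `add_trace_task` entry point is assumed.

```python
import queue
import threading

class TraceTask:
    """Sketch of a trace work item; the real class carries typed fields."""
    def __init__(self, name, **kwargs):
        self.name = name
        self.kwargs = kwargs

class TraceQueueManager:
    """Hypothetical sketch: buffer trace tasks and drain them on a daemon
    thread so tracing never blocks the workflow response path."""
    def __init__(self):
        self._queue = queue.Queue()
        threading.Thread(target=self._drain, daemon=True).start()

    def add_trace_task(self, task: TraceTask) -> None:
        self._queue.put(task)

    def _drain(self) -> None:
        while True:
            task = self._queue.get()
            # A real implementation would serialize the task and ship it to an
            # observability backend here; printing stands in for that step.
            print(f'trace: {task.name} {sorted(task.kwargs)}')
            self._queue.task_done()
```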
@@ -180,7 +208,8 @@ class WorkflowCycleManage(WorkflowIterationCycleManage):
             title=node_title,
             status=WorkflowNodeExecutionStatus.RUNNING.value,
             created_by_role=workflow_run.created_by_role,
-            created_by=workflow_run.created_by
+            created_by=workflow_run.created_by,
+            created_at=datetime.now(timezone.utc).replace(tzinfo=None)
         )

         db.session.add(workflow_node_execution)
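The new `created_at` default is worth a note: `datetime.now(timezone.utc).replace(tzinfo=None)` takes the current UTC time and strips the tzinfo, yielding a *naive* datetime — the usual way to populate columns that store UTC without timezone information. A quick, self-contained illustration:

```python
from datetime import datetime, timezone

aware = datetime.now(timezone.utc)       # timezone-aware UTC "now"
naive_utc = aware.replace(tzinfo=None)   # same instant, tzinfo stripped

print(aware.isoformat())      # e.g. 2024-07-07T08:57:21.123456+00:00
print(naive_utc.isoformat())  # e.g. 2024-07-07T08:57:21.123456
```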
@@ -440,9 +469,9 @@ class WorkflowCycleManage(WorkflowIterationCycleManage):
         current_node_execution = self._task_state.ran_node_execution_infos[event.node_id]
         workflow_node_execution = db.session.query(WorkflowNodeExecution).filter(
             WorkflowNodeExecution.id == current_node_execution.workflow_node_execution_id).first()

         execution_metadata = event.execution_metadata if isinstance(event, QueueNodeSucceededEvent) else None

         if self._iteration_state and self._iteration_state.current_iterations:
             if not execution_metadata:
                 execution_metadata = {}
@@ -470,7 +499,7 @@ class WorkflowCycleManage(WorkflowIterationCycleManage):
         if execution_metadata and execution_metadata.get(NodeRunMetadataKey.TOTAL_TOKENS):
             self._task_state.total_tokens += (
                 int(execution_metadata.get(NodeRunMetadataKey.TOTAL_TOKENS)))

         if self._iteration_state:
             for iteration_node_id in self._iteration_state.current_iterations:
                 data = self._iteration_state.current_iterations[iteration_node_id]
@ -496,13 +525,18 @@ class WorkflowCycleManage(WorkflowIterationCycleManage):
return workflow_node_execution return workflow_node_execution
def _handle_workflow_finished(self, event: QueueStopEvent | QueueWorkflowSucceededEvent | QueueWorkflowFailedEvent) \ def _handle_workflow_finished(
-> Optional[WorkflowRun]: self, event: QueueStopEvent | QueueWorkflowSucceededEvent | QueueWorkflowFailedEvent,
conversation_id: Optional[str] = None,
trace_manager: Optional[TraceQueueManager] = None
) -> Optional[WorkflowRun]:
workflow_run = db.session.query(WorkflowRun).filter( workflow_run = db.session.query(WorkflowRun).filter(
WorkflowRun.id == self._task_state.workflow_run_id).first() WorkflowRun.id == self._task_state.workflow_run_id).first()
if not workflow_run: if not workflow_run:
return None return None
if conversation_id is None:
conversation_id = self._application_generate_entity.inputs.get('sys.conversation_id')
if isinstance(event, QueueStopEvent): if isinstance(event, QueueStopEvent):
workflow_run = self._workflow_run_failed( workflow_run = self._workflow_run_failed(
workflow_run=workflow_run, workflow_run=workflow_run,
@@ -510,7 +544,9 @@ class WorkflowCycleManage(WorkflowIterationCycleManage):
                 total_tokens=self._task_state.total_tokens,
                 total_steps=self._task_state.total_steps,
                 status=WorkflowRunStatus.STOPPED,
-                error='Workflow stopped.'
+                error='Workflow stopped.',
+                conversation_id=conversation_id,
+                trace_manager=trace_manager
             )

             latest_node_execution_info = self._task_state.latest_node_execution_info
@@ -531,7 +567,9 @@ class WorkflowCycleManage(WorkflowIterationCycleManage):
                 total_tokens=self._task_state.total_tokens,
                 total_steps=self._task_state.total_steps,
                 status=WorkflowRunStatus.FAILED,
-                error=event.error
+                error=event.error,
+                conversation_id=conversation_id,
+                trace_manager=trace_manager
             )
         else:
             if self._task_state.latest_node_execution_info:
@@ -546,7 +584,9 @@ class WorkflowCycleManage(WorkflowIterationCycleManage):
                 start_at=self._task_state.start_at,
                 total_tokens=self._task_state.total_tokens,
                 total_steps=self._task_state.total_steps,
-                outputs=outputs
+                outputs=outputs,
+                conversation_id=conversation_id,
+                trace_manager=trace_manager
             )

         self._task_state.workflow_run_id = workflow_run.id
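Taken together, all three terminal paths (stopped, failed, succeeded) now forward the same two optional arguments, and `conversation_id` falls back to the `sys.conversation_id` system variable when the caller passes None. A minimal, runnable sketch of that fallback — the helper name is invented for illustration; only the lookup key and the prefer-explicit-argument behavior come from the diff:

```python
from typing import Optional

def resolve_conversation_id(explicit: Optional[str], inputs: dict) -> Optional[str]:
    """Prefer the explicit argument; otherwise read the sys.conversation_id
    system variable from the generate entity's inputs (None if absent)."""
    if explicit is None:
        return inputs.get('sys.conversation_id')
    return explicit

assert resolve_conversation_id(None, {'sys.conversation_id': 'c-1'}) == 'c-1'
assert resolve_conversation_id('c-2', {'sys.conversation_id': 'c-1'}) == 'c-2'
assert resolve_conversation_id(None, {}) is None
```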

Some files were not shown because too many files have changed in this diff.