Mirror of https://git.mirrors.martin98.com/https://github.com/langgenius/dify.git (synced 2025-08-15 01:56:01 +08:00)

Merge branch 'main' into feat/r2

Commit 82be119fec
@@ -1,5 +1,4 @@
FROM mcr.microsoft.com/devcontainers/python:3.12

# [Optional] Uncomment this section to install additional OS packages.
# RUN apt-get update && export DEBIAN_FRONTEND=noninteractive \
#     && apt-get -y install --no-install-recommends <your-package-list-here>
RUN apt-get update && export DEBIAN_FRONTEND=noninteractive \
    && apt-get -y install libgmp-dev libmpfr-dev libmpc-dev
.github/workflows/style.yml (vendored, 1 change)
@@ -139,6 +139,7 @@ jobs:
      - name: Checkout code
        uses: actions/checkout@v4
        with:
          fetch-depth: 0
          persist-credentials: false

      - name: Check changed files
@@ -1,4 +1,4 @@
[image: cover-v5-optimized.png]
[image: cover-v5-optimized.png]

<p align="center">
  📌 <a href="https://dify.ai/blog/introducing-dify-workflow-file-upload-a-demo-on-ai-podcast">Introducing Dify Workflow File Upload: Recreate Google NotebookLM Podcast</a>
@@ -87,8 +87,6 @@ Please refer to our [FAQ](https://docs.dify.ai/getting-started/install-self-host
**1. Workflow**:
Build and test powerful AI workflows on a visual canvas, leveraging all the following features and beyond.

https://github.com/langgenius/dify/assets/13230914/356df23e-1604-483d-80a6-9517ece318aa

**2. Comprehensive model support**:
Seamless integration with hundreds of proprietary / open-source LLMs from dozens of inference providers and self-hosted solutions, covering GPT, Mistral, Llama3, and any OpenAI API-compatible models. A full list of supported model providers can be found [here](https://docs.dify.ai/getting-started/readme/model-providers).
@@ -1,4 +1,4 @@
[image: cover-v5-optimized.png]
[image: cover-v5-optimized.png]

<p align="center">
  <a href="https://cloud.dify.ai">Dify Cloud</a> ·
@@ -54,8 +54,6 @@

**1. Workflow**: Build and test powerful AI workflows on a visual canvas, leveraging all of the following features and more.

<https://github.com/langgenius/dify/assets/13230914/356df23e-1604-483d-80a6-9517ece318aa>

**2. Comprehensive model support**: Seamless integration with hundreds of proprietary / open-source LLMs from dozens of inference providers and self-hosted solutions, covering GPT, Mistral, Llama3, and any OpenAI API-compatible models. A full list of supported model providers can be found [here](https://docs.dify.ai/getting-started/readme/model-providers).

[image: models-square.png]
@@ -1,4 +1,4 @@
[image: cover-v5-optimized.png]
[image: cover-v5-optimized.png]

<p align="center">
  📌 <a href="https://dify.ai/blog/introducing-dify-workflow-file-upload-a-demo-on-ai-podcast">Introducing Dify Workflow File Upload: Recreate Google NotebookLM Podcast</a>
@@ -84,8 +84,6 @@ docker compose up -d
**1. Workflow**:
Build and test AI workflows on a visual canvas, using all of the following features and much more.

<https://github.com/langgenius/dify/assets/13230914/356df23e-1604-483d-80a6-9517ece318aa>

**2. Model support**:
Easy integration with proprietary/open-source LLMs from dozens of inference providers and self-hosted solutions, including GPT, Mistral, Llama3, and any OpenAI API-compatible models. A full list of supported model providers can be found [here](https://docs.dify.ai/getting-started/readme/model-providers).
@@ -1,4 +1,4 @@
[image: cover-v5-optimized.png]
[image: cover-v5-optimized.png]

<div align="center">
  <a href="https://cloud.dify.ai">Dify Cloud</a> ·
@@ -61,11 +61,6 @@ Dify is an open-source LLM app development platform. Its intuitive interface combines AI
**1. Workflow**:
Build and test powerful AI workflows on a visual canvas, leveraging all of the following features and more.

https://github.com/langgenius/dify/assets/13230914/356df23e-1604-483d-80a6-9517ece318aa

**2. Comprehensive model support**:
Seamless integration with hundreds of proprietary/open-source LLMs from dozens of inference providers and self-hosted solutions, covering GPT, Mistral, Llama3, and any OpenAI API-compatible models. A full list of supported model providers can be found [here](https://docs.dify.ai/getting-started/readme/model-providers).
@@ -1,4 +1,4 @@
[image: cover-v5-optimized.png]
[image: cover-v5-optimized.png]

<p align="center">
  📌 <a href="https://dify.ai/blog/introducing-dify-workflow-file-upload-a-demo-on-ai-podcast">Introducing Dify Workflow File Upload: Recreate Google NotebookLM Podcast</a>
@@ -83,11 +83,6 @@ Please refer to our [FAQ](https://docs.dify.ai/getting-started/install-sel
**1. Workflow**:
Build and test powerful AI workflows on a visual canvas, making use of all of the following features and more.

https://github.com/langgenius/dify/assets/13230914/356df23e-1604-483d-80a6-9517ece318aa

**2. Comprehensive model support**:
Seamless integration with hundreds of proprietary and open-source LLMs from dozens of inference providers and self-hosted solutions, covering GPT, Mistral, Llama3, and all OpenAI API-compatible models. A full list of supported model providers can be found [here](https://docs.dify.ai/getting-started/readme/model-providers).
@@ -1,4 +1,4 @@
[image: cover-v5-optimized.png]
[image: cover-v5-optimized.png]

<p align="center">
  <a href="https://cloud.dify.ai">Dify Cloud</a> ·
@@ -59,11 +59,6 @@ Dify is an open-source LLM application development platform.
**1. Workflow**:
Build and test powerful AI workflows on a visual canvas, taking advantage of all of the following features and more.

https://github.com/langgenius/dify/assets/13230914/356df23e-1604-483d-80a6-9517ece318aa

**2. Comprehensive model support**:
Seamless integration with hundreds of proprietary / open-source LLMs from dozens of inference providers and self-hosted solutions, covering GPT, Mistral, Llama3, and any model compatible with the OpenAI API. A full list of supported model providers can be found [here](https://docs.dify.ai/getting-started/readme/model-providers).
@@ -1,4 +1,4 @@
[image: cover-v5-optimized.png]
[image: cover-v5-optimized.png]

<p align="center">
  <a href="https://cloud.dify.ai">Dify Cloud</a> ·
@@ -59,11 +59,6 @@ Dify is an open-source LLM application development platform. Its in
**1. Workflow**:
Build and test powerful AI workflows on a visual canvas, using all of the following features and more.

https://github.com/langgenius/dify/assets/13230914/356df23e-1604-483d-80a6-9517ece318aa

**2. Comprehensive model support**:
Seamless integration with hundreds of proprietary / open-source LLMs from dozens of inference providers and self-hosted solutions, covering GPT, Mistral, Llama3, and all OpenAI API-compatible models. A full list of supported model providers can be found [here](https://docs.dify.ai/getting-started/readme/model-providers).
@@ -1,4 +1,4 @@
[image: cover-v5-optimized.png]
[image: cover-v5-optimized.png]

<p align="center">
  <a href="https://cloud.dify.ai">Dify Cloud</a> ·
@@ -60,11 +60,6 @@ Dify is an open-source LLM application development platfo
**1. Workflow**:
Build and test powerful AI workflows on a visual canvas, using all of the following features and more.

https://github.com/langgenius/dify/assets/13230914/356df23e-1604-483d-80a6-9517ece318aa

**2. Comprehensive model support**:
Provides seamless integration with hundreds of proprietary/open-source LLMs from dozens of inference providers and self-hosted solutions, covering GPT, Mistral, Llama3, and all OpenAI API-compatible models. A full list of supported model providers can be found [here](https://docs.dify.ai/getting-started/readme/model-providers).
@@ -1,4 +1,4 @@
[image: cover-v5-optimized.png]
[image: cover-v5-optimized.png]

<p align="center">
  <a href="https://cloud.dify.ai">Dify Cloud</a> ·
@@ -59,11 +59,6 @@ Dify is an open-source LLM app development platform. Its intuitive interface com
**1. Workflow**:
Build and test powerful AI workflows on a visual canvas, leveraging all the following features and beyond.

https://github.com/langgenius/dify/assets/13230914/356df23e-1604-483d-80a6-9517ece318aa

**2. Comprehensive model support**:
Seamless integration with hundreds of proprietary / open-source LLMs from dozens of inference providers and self-hosted solutions, covering GPT, Mistral, Llama3, and any OpenAI API-compatible models. A full list of supported model providers can be found [here](https://docs.dify.ai/getting-started/readme/model-providers).
@@ -1,4 +1,4 @@
[image: cover-v5-optimized.png]
[image: cover-v5-optimized.png]

<p align="center">
  <a href="https://cloud.dify.ai">Dify Cloud</a> ·
@@ -54,11 +54,6 @@

**1. Workflow**:
Build and test powerful AI workflows on a visual canvas, using the features below and many more.

https://github.com/langgenius/dify/assets/13230914/356df23e-1604-483d-80a6-9517ece318aa

**2. Comprehensive model support**:

Seamless integration with hundreds of proprietary and open-source LLMs from dozens of inference providers and self-hosted solutions, covering GPT, Mistral, Llama3, and all OpenAI API-compatible models. A full list of supported model providers can be found [here](https://docs.dify.ai/getting-started/readme/model-providers).
@@ -1,5 +1,4 @@
[image: cover-v5-optimized.png]
[image: cover-v5-optimized.png]

<p align="center">
  📌 <a href="https://dify.ai/blog/introducing-dify-workflow-file-upload-a-demo-on-ai-podcast">Introducing Dify Workflow with File Upload: Recreate the Google NotebookLM Podcast</a>
</p>
@@ -59,11 +58,6 @@ Dify is an open-source LLM application development platform.
**1. Workflow**:
Build and test powerful AI workflows in a visual interface, taking advantage of all of the following features and much more.

https://github.com/langgenius/dify/assets/13230914/356df23e-1604-483d-80a6-9517ece318aa

**2. Comprehensive model support**:
Seamless integration with hundreds of proprietary and open-source LLMs from multiple providers and self-hosted solutions, covering GPT, Mistral, Llama3, and any model compatible with the OpenAI API. The full list of supported providers can be found [here](https://docs.dify.ai/getting-started/readme/model-providers).
@@ -1,4 +1,4 @@
[image: cover-v5-optimized.png]
[image: cover-v5-optimized.png]

<p align="center">
  📌 <a href="https://dify.ai/blog/introducing-dify-workflow-file-upload-a-demo-on-ai-podcast">Introducing Dify Workflow File Upload: Recreate Google NotebookLM Podcast</a>
@@ -81,11 +81,6 @@ Please see our [FAQ](https://docs.dify.ai/getting-star
**1. Workflow**:
Build and test powerful AI workflows on a visual canvas, leveraging all of the following features and more.

https://github.com/langgenius/dify/assets/13230914/356df23e-1604-483d-80a6-9517ece318aa

**2. Comprehensive model support**:
Seamless integration with hundreds of proprietary/open-source LLMs from dozens of inference providers and self-hosted solutions, covering GPT, Mistral, Llama3, and all OpenAI API-compatible models. The full list of supported model providers is available [here](https://docs.dify.ai/getting-started/readme/model-providers).
@@ -1,4 +1,4 @@
[image: cover-v5-optimized.png]
[image: cover-v5-optimized.png]

<p align="center">
  <a href="https://cloud.dify.ai">Dify Cloud</a> ·
@@ -55,11 +55,6 @@ Dify is an open-source LLM application development platform. Its intuitive interf
**1. Workflow**:
Build and test powerful AI workflows in a visual interface, using all of the following features and more.

https://github.com/langgenius/dify/assets/13230914/356df23e-1604-483d-80a6-9517ece318aa

**2. Comprehensive model support**:
Provides seamless integration with hundreds of proprietary / open-source LLMs from numerous inference providers and self-hosted solutions, covering GPT, Mistral, Llama3, and all OpenAI API-compatible models. The full list of supported model providers is available [here](https://docs.dify.ai/getting-started/readme/model-providers).
@@ -1,4 +1,4 @@
[image: cover-v5-optimized.png]
[image: cover-v5-optimized.png]

<p align="center">
  📌 <a href="https://dify.ai/blog/introducing-dify-workflow-file-upload-a-demo-on-ai-podcast">Introducing Dify Workflow File Upload: Recreate Google NotebookLM Podcast</a>
@@ -86,8 +86,6 @@ docker compose up -d
**1. Workflow**:
Build and test powerful AI workflows on a visual canvas, leveraging all of the following features and more.

https://github.com/langgenius/dify/assets/13230914/356df23e-1604-483d-80a6-9517ece318aa

**2. Comprehensive model support**:
Seamless integration with hundreds of proprietary/open-source LLMs from dozens of inference providers and self-hosted solutions, covering GPT, Mistral, Llama3, and any OpenAI API-compatible model. The full list of supported model providers can be found [here](https://docs.dify.ai/getting-started/readme/model-providers).
@@ -1,4 +1,4 @@
[image: cover-v5-optimized.png]
[image: cover-v5-optimized.png]

<p align="center">
  <a href="https://cloud.dify.ai">Dify Cloud</a> ·
@@ -55,11 +55,6 @@ Dify is an open-source LLM application development platform. Its
**1. Workflow**:
Build and test powerful AI workflows on a visual canvas, leveraging all of the following features and more.

https://github.com/langgenius/dify/assets/13230914/356df23e-1604-483d-80a6-9517ece318aa

**2. Comprehensive model support**:
Seamless integration with hundreds of proprietary / open-source LLMs from dozens of inference providers and self-hosted solutions, covering GPT, Mistral, Llama3, and any OpenAI API-compatible model. A full list of supported model providers can be found [here](https://docs.dify.ai/getting-started/readme/model-providers).
@@ -348,6 +348,7 @@ SENTRY_DSN=

# DEBUG
DEBUG=false
ENABLE_REQUEST_LOGGING=False
SQLALCHEMY_ECHO=false

# Notion import configuration, support public and internal
@@ -54,6 +54,7 @@ def initialize_extensions(app: DifyApp):
        ext_otel,
        ext_proxy_fix,
        ext_redis,
        ext_request_logging,
        ext_sentry,
        ext_set_secretkey,
        ext_storage,
@@ -83,6 +84,7 @@ def initialize_extensions(app: DifyApp):
        ext_blueprints,
        ext_commands,
        ext_otel,
        ext_request_logging,
    ]
    for ext in extensions:
        short_name = ext.__name__.split(".")[-1]
@@ -17,6 +17,12 @@ class DeploymentConfig(BaseSettings):
        default=False,
    )

    # Request logging configuration
    ENABLE_REQUEST_LOGGING: bool = Field(
        description="Enable request and response body logging",
        default=False,
    )

    EDITION: str = Field(
        description="Deployment edition of the application (e.g., 'SELF_HOSTED', 'CLOUD')",
        default="SELF_HOSTED",
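The new `ENABLE_REQUEST_LOGGING` flag pairs with the `ext_request_logging` extension registered above. The extension's implementation is not part of this diff; the following is a minimal sketch of how such a request/response body logger could be wired into a Flask app, assuming the config flag above (hook names and log format here are illustrative, not Dify's actual code):

```python
import logging

from flask import Flask, request

logger = logging.getLogger(__name__)

# Hypothetical stand-in for dify_config.ENABLE_REQUEST_LOGGING shown above.
ENABLE_REQUEST_LOGGING = True


def init_app(app: Flask) -> None:
    """Register request/response body logging hooks (illustrative sketch)."""
    if not ENABLE_REQUEST_LOGGING:
        return

    @app.before_request
    def _log_request() -> None:
        # get_data() caches the body by default, so later handlers can still read it.
        logger.info("request %s %s body=%s", request.method, request.path, request.get_data(as_text=True))

    @app.after_request
    def _log_response(response):
        # Skip streaming responses, whose bodies are not materialized yet.
        if not response.direct_passthrough:
            logger.info("response %s body=%s", response.status, response.get_data(as_text=True))
        return response
```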
@@ -9,7 +9,7 @@ class PackagingInfo(BaseSettings):

    CURRENT_VERSION: str = Field(
        description="Dify version",
        default="1.3.1",
        default="1.4.0",
    )

    COMMIT_SHA: str = Field(
@@ -17,15 +17,13 @@ from controllers.console.wraps import (
)
from core.ops.ops_trace_manager import OpsTraceManager
from extensions.ext_database import db
from fields.app_fields import (
    app_detail_fields,
    app_detail_fields_with_site,
    app_pagination_fields,
)
from fields.app_fields import app_detail_fields, app_detail_fields_with_site, app_pagination_fields
from libs.login import login_required
from models import Account, App
from services.app_dsl_service import AppDslService, ImportMode
from services.app_service import AppService
from services.enterprise.enterprise_service import EnterpriseService
from services.feature_service import FeatureService

ALLOW_CREATE_APP_MODES = ["chat", "agent-chat", "advanced-chat", "workflow", "completion"]

@@ -75,7 +73,17 @@ class AppListApi(Resource):
        if not app_pagination:
            return {"data": [], "total": 0, "page": 1, "limit": 20, "has_more": False}

        return marshal(app_pagination, app_pagination_fields)
        if FeatureService.get_system_features().webapp_auth.enabled:
            app_ids = [str(app.id) for app in app_pagination.items]
            res = EnterpriseService.WebAppAuth.batch_get_app_access_mode_by_id(app_ids=app_ids)
            if len(res) != len(app_ids):
                raise BadRequest("Invalid app id in webapp auth")

            for app in app_pagination.items:
                if str(app.id) in res:
                    app.access_mode = res[str(app.id)].access_mode

        return marshal(app_pagination, app_pagination_fields), 200

    @setup_required
    @login_required
@@ -119,6 +127,10 @@ class AppApi(Resource):

        app_model = app_service.get_app(app_model)

        if FeatureService.get_system_features().webapp_auth.enabled:
            app_setting = EnterpriseService.WebAppAuth.get_app_access_mode_by_id(app_id=str(app_model.id))
            app_model.access_mode = app_setting.access_mode

        return app_model

    @setup_required
@@ -81,8 +81,7 @@ class DraftWorkflowApi(Resource):
            parser.add_argument("graph", type=dict, required=True, nullable=False, location="json")
            parser.add_argument("features", type=dict, required=True, nullable=False, location="json")
            parser.add_argument("hash", type=str, required=False, location="json")
            # TODO: set this to required=True after frontend is updated
            parser.add_argument("environment_variables", type=list, required=False, location="json")
            parser.add_argument("environment_variables", type=list, required=True, location="json")
            parser.add_argument("conversation_variables", type=list, required=False, location="json")
            args = parser.parse_args()
        elif "text/plain" in content_type:
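With `environment_variables` switching from `required=False` to `required=True`, clients saving a draft workflow must now always include the key, even as an empty list. A sketch of a conforming request; the route, app id, and token below are illustrative rather than taken from this diff:

```python
import requests

BASE_URL = "http://localhost:5001"  # assumed console API host
APP_ID = "<app-id>"

payload = {
    "graph": {"nodes": [], "edges": []},
    "features": {},
    "hash": None,
    "environment_variables": [],   # previously optional, now must be present
    "conversation_variables": [],
}

resp = requests.post(
    f"{BASE_URL}/console/api/apps/{APP_ID}/workflows/draft",
    json=payload,
    headers={"Authorization": "Bearer <console-token>"},
    timeout=10,
)
print(resp.status_code)
```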
@@ -1,3 +1,6 @@
from typing import cast

from flask_login import current_user
from flask_restful import Resource, marshal_with, reqparse
from flask_restful.inputs import int_range

@@ -12,8 +15,7 @@ from fields.workflow_run_fields import (
)
from libs.helper import uuid_value
from libs.login import login_required
from models import App
from models.model import AppMode
from models import Account, App, AppMode, EndUser
from services.workflow_run_service import WorkflowRunService


@@ -90,7 +92,12 @@ class WorkflowRunNodeExecutionListApi(Resource):
        run_id = str(run_id)

        workflow_run_service = WorkflowRunService()
        node_executions = workflow_run_service.get_workflow_run_node_executions(app_model=app_model, run_id=run_id)
        user = cast("Account | EndUser", current_user)
        node_executions = workflow_run_service.get_workflow_run_node_executions(
            app_model=app_model,
            run_id=run_id,
            user=user,
        )

        return {"data": node_executions}
@@ -24,7 +24,7 @@ from libs.password import hash_password, valid_password
from models.account import Account
from services.account_service import AccountService, TenantService
from services.errors.account import AccountRegisterError
from services.errors.workspace import WorkSpaceNotAllowedCreateError
from services.errors.workspace import WorkSpaceNotAllowedCreateError, WorkspacesLimitExceededError
from services.feature_service import FeatureService


@@ -119,6 +119,9 @@ class ForgotPasswordResetApi(Resource):
        if not reset_data:
            raise InvalidTokenError()
        # Must use token in reset phase
        if reset_data.get("phase", "") != "reset":
            raise InvalidTokenError()
        # Must use token in reset phase
        if reset_data.get("phase", "") != "reset":
            raise InvalidTokenError()

@@ -168,6 +171,8 @@ class ForgotPasswordResetApi(Resource):
            )
        except WorkSpaceNotAllowedCreateError:
            pass
        except WorkspacesLimitExceededError:
            pass
        except AccountRegisterError:
            raise AccountInFreezeError()
@@ -21,6 +21,7 @@ from controllers.console.error import (
    AccountNotFound,
    EmailSendIpLimitError,
    NotAllowedCreateWorkspace,
    WorkspacesLimitExceeded,
)
from controllers.console.wraps import email_password_login_enabled, setup_required
from events.tenant_event import tenant_was_created
@@ -30,7 +31,7 @@ from models.account import Account
from services.account_service import AccountService, RegisterService, TenantService
from services.billing_service import BillingService
from services.errors.account import AccountRegisterError
from services.errors.workspace import WorkSpaceNotAllowedCreateError
from services.errors.workspace import WorkSpaceNotAllowedCreateError, WorkspacesLimitExceededError
from services.feature_service import FeatureService


@@ -88,6 +89,11 @@ class LoginApi(Resource):
        # SELF_HOSTED only have one workspace
        tenants = TenantService.get_join_tenants(account)
        if len(tenants) == 0:
            system_features = FeatureService.get_system_features()

            if system_features.is_allow_create_workspace and not system_features.license.workspaces.is_available():
                raise WorkspacesLimitExceeded()
            else:
                return {
                    "result": "fail",
                    "data": "workspace not found, please contact system admin to invite you to join in a workspace",
@@ -198,6 +204,9 @@ class EmailCodeLoginApi(Resource):
        if account:
            tenant = TenantService.get_join_tenants(account)
            if not tenant:
                workspaces = FeatureService.get_system_features().license.workspaces
                if not workspaces.is_available():
                    raise WorkspacesLimitExceeded()
                if not FeatureService.get_system_features().is_allow_create_workspace:
                    raise NotAllowedCreateWorkspace()
                else:
@@ -215,6 +224,8 @@ class EmailCodeLoginApi(Resource):
                return NotAllowedCreateWorkspace()
            except AccountRegisterError as are:
                raise AccountInFreezeError()
            except WorkspacesLimitExceededError:
                raise WorkspacesLimitExceeded()
        token_pair = AccountService.login(account, ip_address=extract_remote_ip(request))
        AccountService.reset_login_error_rate_limit(args["email"])
        return {"result": "success", "data": token_pair.model_dump()}
@@ -46,6 +46,18 @@ class NotAllowedCreateWorkspace(BaseHTTPException):
    code = 400


class WorkspaceMembersLimitExceeded(BaseHTTPException):
    error_code = "limit_exceeded"
    description = "Unable to add member because the maximum workspace's member limit was exceeded"
    code = 400


class WorkspacesLimitExceeded(BaseHTTPException):
    error_code = "limit_exceeded"
    description = "Unable to create workspace because the maximum workspace limit was exceeded"
    code = 400


class AccountBannedError(BaseHTTPException):
    error_code = "account_banned"
    description = "Account is banned."
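Both new exceptions reuse the `limit_exceeded` error code and differ only in description. A self-contained sketch of the pattern, using a stand-in base class modeled on werkzeug's `HTTPException` (the real `BaseHTTPException` lives elsewhere in the codebase and is not shown in this diff):

```python
from werkzeug.exceptions import HTTPException


class BaseHTTPException(HTTPException):
    """Stand-in: Dify's base class additionally carries a machine-readable error_code."""

    error_code: str = "unknown"


class WorkspacesLimitExceeded(BaseHTTPException):
    error_code = "limit_exceeded"
    description = "Unable to create workspace because the maximum workspace limit was exceeded"
    code = 400


try:
    raise WorkspacesLimitExceeded()
except BaseHTTPException as e:
    # An API error handler would typically map these onto a JSON body such as
    # {"code": e.error_code, "message": e.description, "status": e.code}.
    print(e.error_code, e.code, e.description)
```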
@@ -23,3 +23,9 @@ class AppSuggestedQuestionsAfterAnswerDisabledError(BaseHTTPException):
    error_code = "app_suggested_questions_after_answer_disabled"
    description = "Function Suggested questions after answer disabled."
    code = 403


class AppAccessDeniedError(BaseHTTPException):
    error_code = "access_denied"
    description = "App access denied."
    code = 403
@@ -1,3 +1,4 @@
import logging
from datetime import UTC, datetime
from typing import Any

@@ -15,6 +16,11 @@ from fields.installed_app_fields import installed_app_list_fields
from libs.login import login_required
from models import App, InstalledApp, RecommendedApp
from services.account_service import TenantService
from services.app_service import AppService
from services.enterprise.enterprise_service import EnterpriseService
from services.feature_service import FeatureService

logger = logging.getLogger(__name__)


class InstalledAppsListApi(Resource):
@@ -48,6 +54,21 @@ class InstalledAppsListApi(Resource):
            for installed_app in installed_apps
            if installed_app.app is not None
        ]

        # filter out apps that user doesn't have access to
        if FeatureService.get_system_features().webapp_auth.enabled:
            user_id = current_user.id
            res = []
            for installed_app in installed_app_list:
                app_code = AppService.get_app_code_by_id(str(installed_app["app"].id))
                if EnterpriseService.WebAppAuth.is_user_allowed_to_access_webapp(
                    user_id=user_id,
                    app_code=app_code,
                ):
                    res.append(installed_app)
            installed_app_list = res
            logger.debug(f"installed_app_list: {installed_app_list}, user_id: {user_id}")

        installed_app_list.sort(
            key=lambda app: (
                -app["is_pinned"],
@@ -4,10 +4,14 @@ from flask_login import current_user
from flask_restful import Resource
from werkzeug.exceptions import NotFound

from controllers.console.explore.error import AppAccessDeniedError
from controllers.console.wraps import account_initialization_required
from extensions.ext_database import db
from libs.login import login_required
from models import InstalledApp
from services.app_service import AppService
from services.enterprise.enterprise_service import EnterpriseService
from services.feature_service import FeatureService


def installed_app_required(view=None):
@@ -48,6 +52,36 @@ def installed_app_required(view=None):
    return decorator


def user_allowed_to_access_app(view=None):
    def decorator(view):
        @wraps(view)
        def decorated(installed_app: InstalledApp, *args, **kwargs):
            feature = FeatureService.get_system_features()
            if feature.webapp_auth.enabled:
                app_id = installed_app.app_id
                app_code = AppService.get_app_code_by_id(app_id)
                res = EnterpriseService.WebAppAuth.is_user_allowed_to_access_webapp(
                    user_id=str(current_user.id),
                    app_code=app_code,
                )
                if not res:
                    raise AppAccessDeniedError()

            return view(installed_app, *args, **kwargs)

        return decorated

    if view:
        return decorator(view)
    return decorator


class InstalledAppResource(Resource):
    # must be reversed if there are multiple decorators
    method_decorators = [installed_app_required, account_initialization_required, login_required]

    method_decorators = [
        user_allowed_to_access_app,
        installed_app_required,
        account_initialization_required,
        login_required,
    ]
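The comment kept above the new list ("must be reversed if there are multiple decorators") refers to how flask_restful applies `method_decorators`: it wraps the method in list order, so the last entry ends up outermost and runs first. Here `login_required` runs first and the new `user_allowed_to_access_app` check runs last, immediately before the view, by which point `installed_app_required` has already resolved the `InstalledApp` argument. A standalone sketch of the mechanics:

```python
def make_decorator(name):
    def decorator(fn):
        def wrapped(*args, **kwargs):
            print(f"enter {name}")
            return fn(*args, **kwargs)
        return wrapped
    return decorator


a, b = make_decorator("a"), make_decorator("b")

# flask_restful wraps the view in list order, so the LAST list entry runs first:
meth = lambda: print("view")
for decorator in [a, b]:  # method_decorators = [a, b]
    meth = decorator(meth)
meth()  # prints: enter b, enter a, view


# The equivalent stacked form reads top-to-bottom in the opposite order:
@b
@a
def view():
    print("view")


view()  # prints: enter b, enter a, view
```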
@@ -6,6 +6,7 @@ from flask_restful import Resource, abort, marshal_with, reqparse

import services
from configs import dify_config
from controllers.console import api
from controllers.console.error import WorkspaceMembersLimitExceeded
from controllers.console.wraps import (
    account_initialization_required,
    cloud_edition_billing_resource_check,
@@ -17,6 +18,7 @@ from libs.login import login_required
from models.account import Account, TenantAccountRole
from services.account_service import RegisterService, TenantService
from services.errors.account import AccountAlreadyInTenantError
from services.feature_service import FeatureService


class MemberListApi(Resource):
@@ -54,6 +56,12 @@ class MemberInviteEmailApi(Resource):
        inviter = current_user
        invitation_results = []
        console_web_url = dify_config.CONSOLE_WEB_URL

        workspace_members = FeatureService.get_features(tenant_id=inviter.current_tenant.id).workspace_members

        if not workspace_members.is_available(len(invitee_emails)):
            raise WorkspaceMembersLimitExceeded()

        for invitee_email in invitee_emails:
            try:
                token = RegisterService.invite_new_member(
@@ -5,5 +5,6 @@ from libs.external_api import ExternalApi
bp = Blueprint("inner_api", __name__, url_prefix="/inner/api")
api = ExternalApi(bp)

from . import mail
from .plugin import plugin
from .workspace import workspace
api/controllers/inner_api/mail.py (new file, 27 lines)
@@ -0,0 +1,27 @@
from flask_restful import (
    Resource,  # type: ignore
    reqparse,
)

from controllers.console.wraps import setup_required
from controllers.inner_api import api
from controllers.inner_api.wraps import enterprise_inner_api_only
from services.enterprise.mail_service import DifyMail, EnterpriseMailService


class EnterpriseMail(Resource):
    @setup_required
    @enterprise_inner_api_only
    def post(self):
        parser = reqparse.RequestParser()
        parser.add_argument("to", type=str, action="append", required=True)
        parser.add_argument("subject", type=str, required=True)
        parser.add_argument("body", type=str, required=True)
        parser.add_argument("substitutions", type=dict, required=False)
        args = parser.parse_args()

        EnterpriseMailService.send_mail(DifyMail(**args))
        return {"message": "success"}, 200


api.add_resource(EnterpriseMail, "/enterprise/mail")
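The new inner-API endpoint takes a list of recipients (`action="append"`), a subject, a body, and optional substitutions. A sketch of a conforming call; the host and the shared-secret header enforced by `enterprise_inner_api_only` are assumptions, since neither appears in this diff:

```python
import requests

resp = requests.post(
    "http://localhost:5001/inner/api/enterprise/mail",  # blueprint prefix from __init__.py above
    json={
        "to": ["ops@example.com", "admin@example.com"],  # action="append" yields a list
        "subject": "Weekly report",
        "body": "<p>Hello</p>",
        "substitutions": {"name": "Team"},
    },
    headers={"X-Inner-Api-Key": "<shared-secret>"},  # hypothetical auth header
    timeout=10,
)
print(resp.json())  # {"message": "success"} on success
```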
@@ -1,12 +1,15 @@
from flask_restful import marshal_with
from flask import request
from flask_restful import Resource, marshal_with, reqparse

from controllers.common import fields
from controllers.web import api
from controllers.web.error import AppUnavailableError
from controllers.web.wraps import WebApiResource
from core.app.app_config.common.parameters_mapping import get_parameters_from_feature_dict
from libs.passport import PassportService
from models.model import App, AppMode
from services.app_service import AppService
from services.enterprise.enterprise_service import EnterpriseService


class AppParameterApi(WebApiResource):
@@ -40,5 +43,51 @@ class AppMeta(WebApiResource):
        return AppService().get_app_meta(app_model)


class AppAccessMode(Resource):
    def get(self):
        parser = reqparse.RequestParser()
        parser.add_argument("appId", type=str, required=True, location="args")
        args = parser.parse_args()

        app_id = args["appId"]
        res = EnterpriseService.WebAppAuth.get_app_access_mode_by_id(app_id)

        return {"accessMode": res.access_mode}


class AppWebAuthPermission(Resource):
    def get(self):
        user_id = "visitor"
        try:
            auth_header = request.headers.get("Authorization")
            if auth_header is None:
                raise
            if " " not in auth_header:
                raise

            auth_scheme, tk = auth_header.split(None, 1)
            auth_scheme = auth_scheme.lower()
            if auth_scheme != "bearer":
                raise

            decoded = PassportService().verify(tk)
            user_id = decoded.get("user_id", "visitor")
        except Exception as e:
            pass

        parser = reqparse.RequestParser()
        parser.add_argument("appId", type=str, required=True, location="args")
        args = parser.parse_args()

        app_id = args["appId"]
        app_code = AppService.get_app_code_by_id(app_id)

        res = EnterpriseService.WebAppAuth.is_user_allowed_to_access_webapp(str(user_id), app_code)
        return {"result": res}


api.add_resource(AppParameterApi, "/parameters")
api.add_resource(AppMeta, "/meta")
# webapp auth apis
api.add_resource(AppAccessMode, "/webapp/access-mode")
api.add_resource(AppWebAuthPermission, "/webapp/permission")
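Both new resources subclass plain `Resource` rather than `WebApiResource`, so `/webapp/access-mode` is reachable anonymously and `/webapp/permission` silently falls back to the `"visitor"` user when the bearer token is missing or invalid. A client-side sketch; the base URL is an assumption about where the web API blueprint is mounted:

```python
import requests

BASE = "http://localhost:5001/api"  # assumed web API base path

# 1) Ask how an app is gated (no auth needed):
mode = requests.get(f"{BASE}/webapp/access-mode", params={"appId": "<app-id>"}, timeout=10).json()
print(mode)  # e.g. {"accessMode": "public"}

# 2) Check permission; without a valid bearer token the server evaluates "visitor".
perm = requests.get(
    f"{BASE}/webapp/permission",
    params={"appId": "<app-id>"},
    headers={"Authorization": "Bearer <webapp-token>"},  # optional
    timeout=10,
).json()
print(perm)  # {"result": true} or {"result": false}
```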
@@ -121,9 +121,15 @@ class UnsupportedFileTypeError(BaseHTTPException):
    code = 415


class WebSSOAuthRequiredError(BaseHTTPException):
class WebAppAuthRequiredError(BaseHTTPException):
    error_code = "web_sso_auth_required"
    description = "Web SSO authentication required."
    description = "Web app authentication required."
    code = 401


class WebAppAuthAccessDeniedError(BaseHTTPException):
    error_code = "web_app_access_denied"
    description = "You do not have permission to access this web app."
    code = 401
api/controllers/web/login.py (new file, 120 lines)
@@ -0,0 +1,120 @@
from flask import request
from flask_restful import Resource, reqparse
from jwt import InvalidTokenError  # type: ignore
from werkzeug.exceptions import BadRequest

import services
from controllers.console.auth.error import EmailCodeError, EmailOrPasswordMismatchError, InvalidEmailError
from controllers.console.error import AccountBannedError, AccountNotFound
from controllers.console.wraps import setup_required
from libs.helper import email
from libs.password import valid_password
from services.account_service import AccountService
from services.webapp_auth_service import WebAppAuthService


class LoginApi(Resource):
    """Resource for web app email/password login."""

    def post(self):
        """Authenticate user and login."""
        parser = reqparse.RequestParser()
        parser.add_argument("email", type=email, required=True, location="json")
        parser.add_argument("password", type=valid_password, required=True, location="json")
        args = parser.parse_args()

        app_code = request.headers.get("X-App-Code")
        if app_code is None:
            raise BadRequest("X-App-Code header is missing.")

        try:
            account = WebAppAuthService.authenticate(args["email"], args["password"])
        except services.errors.account.AccountLoginError:
            raise AccountBannedError()
        except services.errors.account.AccountPasswordError:
            raise EmailOrPasswordMismatchError()
        except services.errors.account.AccountNotFoundError:
            raise AccountNotFound()

        WebAppAuthService._validate_user_accessibility(account=account, app_code=app_code)

        end_user = WebAppAuthService.create_end_user(email=args["email"], app_code=app_code)

        token = WebAppAuthService.login(account=account, app_code=app_code, end_user_id=end_user.id)
        return {"result": "success", "token": token}


# class LogoutApi(Resource):
#     @setup_required
#     def get(self):
#         account = cast(Account, flask_login.current_user)
#         if isinstance(account, flask_login.AnonymousUserMixin):
#             return {"result": "success"}
#         flask_login.logout_user()
#         return {"result": "success"}


class EmailCodeLoginSendEmailApi(Resource):
    @setup_required
    def post(self):
        parser = reqparse.RequestParser()
        parser.add_argument("email", type=email, required=True, location="json")
        parser.add_argument("language", type=str, required=False, location="json")
        args = parser.parse_args()

        if args["language"] is not None and args["language"] == "zh-Hans":
            language = "zh-Hans"
        else:
            language = "en-US"

        account = WebAppAuthService.get_user_through_email(args["email"])
        if account is None:
            raise AccountNotFound()
        else:
            token = WebAppAuthService.send_email_code_login_email(account=account, language=language)

        return {"result": "success", "data": token}


class EmailCodeLoginApi(Resource):
    @setup_required
    def post(self):
        parser = reqparse.RequestParser()
        parser.add_argument("email", type=str, required=True, location="json")
        parser.add_argument("code", type=str, required=True, location="json")
        parser.add_argument("token", type=str, required=True, location="json")
        args = parser.parse_args()

        user_email = args["email"]
        app_code = request.headers.get("X-App-Code")
        if app_code is None:
            raise BadRequest("X-App-Code header is missing.")

        token_data = WebAppAuthService.get_email_code_login_data(args["token"])
        if token_data is None:
            raise InvalidTokenError()

        if token_data["email"] != args["email"]:
            raise InvalidEmailError()

        if token_data["code"] != args["code"]:
            raise EmailCodeError()

        WebAppAuthService.revoke_email_code_login_token(args["token"])
        account = WebAppAuthService.get_user_through_email(user_email)
        if not account:
            raise AccountNotFound()

        WebAppAuthService._validate_user_accessibility(account=account, app_code=app_code)

        end_user = WebAppAuthService.create_end_user(email=user_email, app_code=app_code)

        token = WebAppAuthService.login(account=account, app_code=app_code, end_user_id=end_user.id)
        AccountService.reset_login_error_rate_limit(args["email"])
        return {"result": "success", "token": token}


# api.add_resource(LoginApi, "/login")
# api.add_resource(LogoutApi, "/logout")
# api.add_resource(EmailCodeLoginSendEmailApi, "/email-code-login")
# api.add_resource(EmailCodeLoginApi, "/email-code-login/validity")
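Note that all four `api.add_resource` registrations at the bottom of this new file are still commented out, so none of these routes is wired up yet. Assuming they were enabled exactly as written, the two-step email-code flow would look roughly like this from a client (base URL, app code, and code value are illustrative):

```python
import requests

BASE = "http://localhost:5001/api"      # assumed web API base path
HEADERS = {"X-App-Code": "<app-code>"}  # required by EmailCodeLoginApi

# Step 1: request a login code by email.
send = requests.post(
    f"{BASE}/email-code-login",
    json={"email": "user@example.com", "language": "en-US"},
    timeout=10,
).json()
token = send["data"]  # opaque token binding this email to the issued code

# Step 2: exchange email + code + token for a webapp session token.
login = requests.post(
    f"{BASE}/email-code-login/validity",
    json={"email": "user@example.com", "code": "123456", "token": token},
    headers=HEADERS,
    timeout=10,
).json()
print(login["token"])
```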
@@ -5,7 +5,7 @@ from flask_restful import Resource
from werkzeug.exceptions import NotFound, Unauthorized

from controllers.web import api
from controllers.web.error import WebSSOAuthRequiredError
from controllers.web.error import WebAppAuthRequiredError
from extensions.ext_database import db
from libs.passport import PassportService
from models.model import App, EndUser, Site
@@ -24,10 +24,10 @@ class PassportResource(Resource):
        if app_code is None:
            raise Unauthorized("X-App-Code header is missing.")

        if system_features.sso_enforced_for_web:
            app_web_sso_enabled = EnterpriseService.get_app_web_sso_enabled(app_code).get("enabled", False)
            if app_web_sso_enabled:
                raise WebSSOAuthRequiredError()
        if system_features.webapp_auth.enabled:
            app_settings = EnterpriseService.WebAppAuth.get_app_access_mode_by_code(app_code=app_code)
            if not app_settings or not app_settings.access_mode == "public":
                raise WebAppAuthRequiredError()

        # get site from db and check if it is normal
        site = db.session.query(Site).filter(Site.code == app_code, Site.status == "normal").first()
@@ -4,7 +4,7 @@ from flask import request
from flask_restful import Resource
from werkzeug.exceptions import BadRequest, NotFound, Unauthorized

from controllers.web.error import WebSSOAuthRequiredError
from controllers.web.error import WebAppAuthAccessDeniedError, WebAppAuthRequiredError
from extensions.ext_database import db
from libs.passport import PassportService
from models.model import App, EndUser, Site
@@ -29,7 +29,7 @@ def validate_jwt_token(view=None):

def decode_jwt_token():
    system_features = FeatureService.get_system_features()
    app_code = request.headers.get("X-App-Code")
    app_code = str(request.headers.get("X-App-Code"))
    try:
        auth_header = request.headers.get("Authorization")
        if auth_header is None:
@@ -57,35 +57,53 @@ def decode_jwt_token():
        if not end_user:
            raise NotFound()

        _validate_web_sso_token(decoded, system_features, app_code)
        # for enterprise webapp auth
        app_web_auth_enabled = False
        if system_features.webapp_auth.enabled:
            app_web_auth_enabled = (
                EnterpriseService.WebAppAuth.get_app_access_mode_by_code(app_code=app_code).access_mode != "public"
            )

        _validate_webapp_token(decoded, app_web_auth_enabled, system_features.webapp_auth.enabled)
        _validate_user_accessibility(decoded, app_code, app_web_auth_enabled, system_features.webapp_auth.enabled)

        return app_model, end_user
    except Unauthorized as e:
        if system_features.sso_enforced_for_web:
            app_web_sso_enabled = EnterpriseService.get_app_web_sso_enabled(app_code).get("enabled", False)
            if app_web_sso_enabled:
                raise WebSSOAuthRequiredError()
        if system_features.webapp_auth.enabled:
            app_web_auth_enabled = (
                EnterpriseService.WebAppAuth.get_app_access_mode_by_code(app_code=str(app_code)).access_mode != "public"
            )
            if app_web_auth_enabled:
                raise WebAppAuthRequiredError()

        raise Unauthorized(e.description)


def _validate_web_sso_token(decoded, system_features, app_code):
    app_web_sso_enabled = False

    # Check if SSO is enforced for web, and if the token source is not SSO, raise an error and redirect to SSO login
    if system_features.sso_enforced_for_web:
        app_web_sso_enabled = EnterpriseService.get_app_web_sso_enabled(app_code).get("enabled", False)
        if app_web_sso_enabled:
def _validate_webapp_token(decoded, app_web_auth_enabled: bool, system_webapp_auth_enabled: bool):
    # Check if authentication is enforced for web app, and if the token source is not webapp,
    # raise an error and redirect to login
    if system_webapp_auth_enabled and app_web_auth_enabled:
        source = decoded.get("token_source")
        if not source or source != "sso":
            raise WebSSOAuthRequiredError()
        if not source or source != "webapp":
            raise WebAppAuthRequiredError()

    # Check if SSO is not enforced for web, and if the token source is SSO,
    # Check if authentication is not enforced for web, and if the token source is webapp,
    # raise an error and redirect to normal passport login
    if not system_features.sso_enforced_for_web or not app_web_sso_enabled:
    if not system_webapp_auth_enabled or not app_web_auth_enabled:
        source = decoded.get("token_source")
        if source and source == "sso":
            raise Unauthorized("sso token expired.")
        if source and source == "webapp":
            raise Unauthorized("webapp token expired.")


def _validate_user_accessibility(decoded, app_code, app_web_auth_enabled: bool, system_webapp_auth_enabled: bool):
    if system_webapp_auth_enabled and app_web_auth_enabled:
        # Check if the user is allowed to access the web app
        user_id = decoded.get("user_id")
        if not user_id:
            raise WebAppAuthRequiredError()

        if not EnterpriseService.WebAppAuth.is_user_allowed_to_access_webapp(user_id, app_code=app_code):
            raise WebAppAuthAccessDeniedError()


class WebApiResource(Resource):
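The refactor replaces the SSO-specific check with a symmetric rule on the JWT's `token_source` claim: when webapp auth is enforced, only tokens issued by the webapp login are accepted; when it is not enforced, a leftover webapp token is rejected as expired so the client falls back to the normal passport flow. A standalone sketch of that branching, where `auth_enforced` collapses `system_webapp_auth_enabled and app_web_auth_enabled`:

```python
def check_token(token_source: str | None, auth_enforced: bool) -> str:
    """Mirrors _validate_webapp_token's branching (standalone sketch)."""
    if auth_enforced:
        if token_source != "webapp":
            return "WebAppAuthRequiredError"  # force login via webapp auth
    else:
        if token_source == "webapp":
            return "Unauthorized: webapp token expired."  # stale token, re-issue passport
    return "ok"


for source in (None, "webapp", "other"):
    for enforced in (False, True):
        print(f"token_source={source!r:9} enforced={enforced}: {check_token(source, enforced)}")
```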
@@ -29,9 +29,7 @@ from core.repositories import SQLAlchemyWorkflowNodeExecutionRepository
from core.workflow.repository.workflow_node_execution_repository import WorkflowNodeExecutionRepository
from extensions.ext_database import db
from factories import file_factory
from models.account import Account
from models.model import App, Conversation, EndUser, Message
from models.workflow import Workflow
from models import Account, App, Conversation, EndUser, Message, Workflow, WorkflowNodeExecutionTriggeredFrom
from services.conversation_service import ConversationService
from services.errors.message import MessageNotExistsError

@@ -165,8 +163,9 @@ class AdvancedChatAppGenerator(MessageBasedAppGenerator):
        session_factory = sessionmaker(bind=db.engine, expire_on_commit=False)
        workflow_node_execution_repository = SQLAlchemyWorkflowNodeExecutionRepository(
            session_factory=session_factory,
            tenant_id=application_generate_entity.app_config.tenant_id,
            user=user,
            app_id=application_generate_entity.app_config.app_id,
            triggered_from=WorkflowNodeExecutionTriggeredFrom.WORKFLOW_RUN,
        )

        return self._generate(
@@ -231,8 +230,9 @@ class AdvancedChatAppGenerator(MessageBasedAppGenerator):
        session_factory = sessionmaker(bind=db.engine, expire_on_commit=False)
        workflow_node_execution_repository = SQLAlchemyWorkflowNodeExecutionRepository(
            session_factory=session_factory,
            tenant_id=application_generate_entity.app_config.tenant_id,
            user=user,
            app_id=application_generate_entity.app_config.app_id,
            triggered_from=WorkflowNodeExecutionTriggeredFrom.SINGLE_STEP,
        )

        return self._generate(
@@ -295,8 +295,9 @@ class AdvancedChatAppGenerator(MessageBasedAppGenerator):
        session_factory = sessionmaker(bind=db.engine, expire_on_commit=False)
        workflow_node_execution_repository = SQLAlchemyWorkflowNodeExecutionRepository(
            session_factory=session_factory,
            tenant_id=application_generate_entity.app_config.tenant_id,
            user=user,
            app_id=application_generate_entity.app_config.app_id,
            triggered_from=WorkflowNodeExecutionTriggeredFrom.SINGLE_STEP,
        )

        return self._generate(
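Every call site here builds the repository from `sessionmaker(bind=db.engine, expire_on_commit=False)`. The flag matters because the ORM objects created inside the repository outlive the session that committed them; with the default `expire_on_commit=True`, touching their attributes afterwards would trigger a refresh against a closed session. A standalone illustration:

```python
from sqlalchemy import create_engine
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column, sessionmaker


class Base(DeclarativeBase):
    pass


class Item(Base):
    __tablename__ = "items"
    id: Mapped[int] = mapped_column(primary_key=True)
    name: Mapped[str]


engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

factory = sessionmaker(bind=engine, expire_on_commit=False)
with factory() as session:
    item = Item(name="run-1")
    session.add(item)
    session.commit()

# With expire_on_commit=False the committed attributes stay loaded, so the
# object remains usable after the session is gone:
print(item.name)  # -> "run-1"
```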
@@ -70,7 +70,7 @@ from events.message_event import message_was_created
from extensions.ext_database import db
from models import Conversation, EndUser, Message, MessageFile
from models.account import Account
from models.enums import CreatedByRole
from models.enums import CreatorUserRole
from models.workflow import (
    Workflow,
    WorkflowRunStatus,
@@ -105,11 +105,11 @@ class AdvancedChatAppGenerateTaskPipeline:
        if isinstance(user, EndUser):
            self._user_id = user.id
            user_session_id = user.session_id
            self._created_by_role = CreatedByRole.END_USER
            self._created_by_role = CreatorUserRole.END_USER
        elif isinstance(user, Account):
            self._user_id = user.id
            user_session_id = user.id
            self._created_by_role = CreatedByRole.ACCOUNT
            self._created_by_role = CreatorUserRole.ACCOUNT
        else:
            raise NotImplementedError(f"User type not supported: {type(user)}")

@@ -739,9 +739,9 @@ class AdvancedChatAppGenerateTaskPipeline:
                    url=file["remote_url"],
                    belongs_to="assistant",
                    upload_file_id=file["related_id"],
                    created_by_role=CreatedByRole.ACCOUNT
                    created_by_role=CreatorUserRole.ACCOUNT
                    if message.invoke_from in {InvokeFrom.EXPLORE, InvokeFrom.DEBUGGER}
                    else CreatedByRole.END_USER,
                    else CreatorUserRole.END_USER,
                    created_by=message.from_account_id or message.from_end_user_id or "",
                )
                for file in self._recorded_files
@@ -25,7 +25,7 @@ from core.app.task_pipeline.easy_ui_based_generate_task_pipeline import EasyUIBa
from core.prompt.utils.prompt_template_parser import PromptTemplateParser
from extensions.ext_database import db
from models import Account
from models.enums import CreatedByRole
from models.enums import CreatorUserRole
from models.model import App, AppMode, AppModelConfig, Conversation, EndUser, Message, MessageFile
from services.errors.app_model_config import AppModelConfigBrokenError
from services.errors.conversation import ConversationNotExistsError
@@ -223,7 +223,7 @@ class MessageBasedAppGenerator(BaseAppGenerator):
                belongs_to="user",
                url=file.remote_url,
                upload_file_id=file.related_id,
                created_by_role=(CreatedByRole.ACCOUNT if account_id else CreatedByRole.END_USER),
                created_by_role=(CreatorUserRole.ACCOUNT if account_id else CreatorUserRole.END_USER),
                created_by=account_id or end_user_id or "",
            )
            db.session.add(message_file)
@@ -27,7 +27,7 @@ from core.workflow.repository.workflow_node_execution_repository import Workflow
from core.workflow.workflow_app_generate_task_pipeline import WorkflowAppGenerateTaskPipeline
from extensions.ext_database import db
from factories import file_factory
from models import Account, App, EndUser, Workflow
from models import Account, App, EndUser, Workflow, WorkflowNodeExecutionTriggeredFrom

logger = logging.getLogger(__name__)

@@ -138,10 +138,12 @@ class WorkflowAppGenerator(BaseAppGenerator):

        # Create workflow node execution repository
        session_factory = sessionmaker(bind=db.engine, expire_on_commit=False)

        workflow_node_execution_repository = SQLAlchemyWorkflowNodeExecutionRepository(
            session_factory=session_factory,
            tenant_id=application_generate_entity.app_config.tenant_id,
            user=user,
            app_id=application_generate_entity.app_config.app_id,
            triggered_from=WorkflowNodeExecutionTriggeredFrom.WORKFLOW_RUN,
        )

        return self._generate(
@@ -262,10 +264,12 @@ class WorkflowAppGenerator(BaseAppGenerator):

        # Create workflow node execution repository
        session_factory = sessionmaker(bind=db.engine, expire_on_commit=False)

        workflow_node_execution_repository = SQLAlchemyWorkflowNodeExecutionRepository(
            session_factory=session_factory,
            tenant_id=application_generate_entity.app_config.tenant_id,
            user=user,
            app_id=application_generate_entity.app_config.app_id,
            triggered_from=WorkflowNodeExecutionTriggeredFrom.SINGLE_STEP,
        )

        return self._generate(
@@ -325,10 +329,12 @@ class WorkflowAppGenerator(BaseAppGenerator):

        # Create workflow node execution repository
        session_factory = sessionmaker(bind=db.engine, expire_on_commit=False)

        workflow_node_execution_repository = SQLAlchemyWorkflowNodeExecutionRepository(
            session_factory=session_factory,
            tenant_id=application_generate_entity.app_config.tenant_id,
            user=user,
            app_id=application_generate_entity.app_config.app_id,
            triggered_from=WorkflowNodeExecutionTriggeredFrom.SINGLE_STEP,
        )

        return self._generate(
@@ -6,7 +6,7 @@ from pydantic import BaseModel, ConfigDict

from core.model_runtime.entities.llm_entities import LLMResult
from core.model_runtime.utils.encoders import jsonable_encoder
from core.workflow.entities.node_entities import AgentNodeStrategyInit
from core.workflow.entities.node_entities import AgentNodeStrategyInit, NodeRunMetadataKey
from models.workflow import WorkflowNodeExecutionStatus


@@ -244,7 +244,7 @@ class NodeStartStreamResponse(StreamResponse):
    title: str
    index: int
    predecessor_node_id: Optional[str] = None
    inputs: Optional[dict] = None
    inputs: Optional[Mapping[str, Any]] = None
    created_at: int
    extras: dict = {}
    parallel_id: Optional[str] = None
@@ -301,13 +301,13 @@ class NodeFinishStreamResponse(StreamResponse):
    title: str
    index: int
    predecessor_node_id: Optional[str] = None
    inputs: Optional[dict] = None
    process_data: Optional[dict] = None
    outputs: Optional[dict] = None
    inputs: Optional[Mapping[str, Any]] = None
    process_data: Optional[Mapping[str, Any]] = None
    outputs: Optional[Mapping[str, Any]] = None
    status: str
    error: Optional[str] = None
    elapsed_time: float
    execution_metadata: Optional[dict] = None
    execution_metadata: Optional[Mapping[NodeRunMetadataKey, Any]] = None
    created_at: int
    finished_at: int
    files: Optional[Sequence[Mapping[str, Any]]] = []
@@ -370,13 +370,13 @@ class NodeRetryStreamResponse(StreamResponse):
    title: str
    index: int
    predecessor_node_id: Optional[str] = None
    inputs: Optional[dict] = None
    process_data: Optional[dict] = None
    outputs: Optional[dict] = None
    inputs: Optional[Mapping[str, Any]] = None
    process_data: Optional[Mapping[str, Any]] = None
    outputs: Optional[Mapping[str, Any]] = None
    status: str
    error: Optional[str] = None
    elapsed_time: float
    execution_metadata: Optional[dict] = None
    execution_metadata: Optional[Mapping[NodeRunMetadataKey, Any]] = None
    created_at: int
    finished_at: int
    files: Optional[Sequence[Mapping[str, Any]]] = []
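Switching these Pydantic fields from `dict` to `Mapping` widens what callers may pass and marks the payloads as read-only at the type level, since `Mapping` has no `__setitem__`. A standalone illustration of the same field shape:

```python
from collections.abc import Mapping
from typing import Any, Optional

from pydantic import BaseModel


class NodeFinish(BaseModel):
    # Mapping instead of dict: any mapping validates, and type checkers treat
    # the field as immutable because Mapping has no __setitem__.
    inputs: Optional[Mapping[str, Any]] = None
    outputs: Optional[Mapping[str, Any]] = None


event = NodeFinish(inputs={"query": "hi"}, outputs={"answer": "hello"})
print(event.outputs["answer"])       # reading is fine

# event.outputs["answer"] = "other"  # flagged by type checkers: Mapping is read-only
```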
@@ -1,3 +1,4 @@
from collections.abc import Mapping
from datetime import datetime
from enum import StrEnum
from typing import Any, Optional, Union

@@ -155,10 +156,10 @@ class LangfuseSpan(BaseModel):
        description="The status message of the span. Additional field for context of the event. E.g. the error "
        "message of an error event.",
    )
    input: Optional[Union[str, dict[str, Any], list, None]] = Field(
    input: Optional[Union[str, Mapping[str, Any], list, None]] = Field(
        default=None, description="The input of the span. Can be any JSON object."
    )
    output: Optional[Union[str, dict[str, Any], list, None]] = Field(
    output: Optional[Union[str, Mapping[str, Any], list, None]] = Field(
        default=None, description="The output of the span. Can be any JSON object."
    )
    version: Optional[str] = Field(
@@ -1,11 +1,10 @@
import json
import logging
import os
from datetime import datetime, timedelta
from typing import Optional

from langfuse import Langfuse  # type: ignore
from sqlalchemy.orm import sessionmaker
from sqlalchemy.orm import Session, sessionmaker

from core.ops.base_trace_instance import BaseTraceInstance
from core.ops.entities.config_entity import LangfuseConfig
@@ -30,8 +29,9 @@ from core.ops.langfuse_trace.entities.langfuse_trace_entity import (
)
from core.ops.utils import filter_none_values
from core.repositories import SQLAlchemyWorkflowNodeExecutionRepository
from core.workflow.nodes.enums import NodeType
from extensions.ext_database import db
from models.model import EndUser
from models import Account, App, EndUser, WorkflowNodeExecutionTriggeredFrom

logger = logging.getLogger(__name__)

@@ -113,8 +113,29 @@ class LangFuseDataTrace(BaseTraceInstance):

        # through workflow_run_id get all_nodes_execution using repository
        session_factory = sessionmaker(bind=db.engine)
        # Find the app's creator account
        with Session(db.engine, expire_on_commit=False) as session:
            # Get the app to find its creator
            app_id = trace_info.metadata.get("app_id")
            if not app_id:
                raise ValueError("No app_id found in trace_info metadata")

            app = session.query(App).filter(App.id == app_id).first()
            if not app:
                raise ValueError(f"App with id {app_id} not found")

            if not app.created_by:
                raise ValueError(f"App with id {app_id} has no creator (created_by is None)")

            service_account = session.query(Account).filter(Account.id == app.created_by).first()
            if not service_account:
                raise ValueError(f"Creator account with id {app.created_by} not found for app {app_id}")

        workflow_node_execution_repository = SQLAlchemyWorkflowNodeExecutionRepository(
            session_factory=session_factory, tenant_id=trace_info.tenant_id
            session_factory=session_factory,
            user=service_account,
            app_id=trace_info.metadata.get("app_id"),
            triggered_from=WorkflowNodeExecutionTriggeredFrom.WORKFLOW_RUN,
        )

        # Get all executions for this workflow run
@@ -124,23 +145,22 @@ class LangFuseDataTrace(BaseTraceInstance):

        for node_execution in workflow_node_executions:
            node_execution_id = node_execution.id
            tenant_id = node_execution.tenant_id
            app_id = node_execution.app_id
            tenant_id = trace_info.tenant_id  # Use from trace_info instead
            app_id = trace_info.metadata.get("app_id")  # Use from trace_info instead
            node_name = node_execution.title
            node_type = node_execution.node_type
            status = node_execution.status
            if node_type == "llm":
                inputs = (
                    json.loads(node_execution.process_data).get("prompts", {}) if node_execution.process_data else {}
                )
            if node_type == NodeType.LLM:
                inputs = node_execution.process_data.get("prompts", {}) if node_execution.process_data else {}
            else:
                inputs = json.loads(node_execution.inputs) if node_execution.inputs else {}
            outputs = json.loads(node_execution.outputs) if node_execution.outputs else {}
                inputs = node_execution.inputs if node_execution.inputs else {}
            outputs = node_execution.outputs if node_execution.outputs else {}
            created_at = node_execution.created_at or datetime.now()
            elapsed_time = node_execution.elapsed_time
            finished_at = created_at + timedelta(seconds=elapsed_time)

            metadata = json.loads(node_execution.execution_metadata) if node_execution.execution_metadata else {}
            execution_metadata = node_execution.metadata if node_execution.metadata else {}
            metadata = {str(k): v for k, v in execution_metadata.items()}
            metadata.update(
                {
                    "workflow_run_id": trace_info.workflow_run_id,
@@ -152,7 +172,7 @@ class LangFuseDataTrace(BaseTraceInstance):
                    "status": status,
                }
            )
            process_data = json.loads(node_execution.process_data) if node_execution.process_data else {}
            process_data = node_execution.process_data if node_execution.process_data else {}
            model_provider = process_data.get("model_provider", None)
            model_name = process_data.get("model_name", None)
            if model_provider is not None and model_name is not None:
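All of the trace integrations in this commit now resolve the app's creator account before constructing the repository, as in the hunk above. A minimal, self-contained sketch of that lookup pattern, using stand-in App/Account tables on an in-memory SQLite database rather than Dify's real models:

from sqlalchemy import Column, String, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()


class App(Base):  # stand-in, not models.App
    __tablename__ = "apps"
    id = Column(String, primary_key=True)
    created_by = Column(String, nullable=True)


class Account(Base):  # stand-in, not models.Account
    __tablename__ = "accounts"
    id = Column(String, primary_key=True)


engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add_all([Account(id="acc-1"), App(id="app-1", created_by="acc-1")])
    session.commit()

# The lookup mirrors the diff: app -> created_by -> service account, failing loudly at each step
with Session(engine, expire_on_commit=False) as session:
    app = session.query(App).filter(App.id == "app-1").first()
    if not app or not app.created_by:
        raise ValueError("app missing or has no creator")
    service_account = session.query(Account).filter(Account.id == app.created_by).first()
    if not service_account:
        raise ValueError("creator account not found")

print(service_account.id)  # -> acc-1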
@@ -1,3 +1,4 @@
from collections.abc import Mapping
from datetime import datetime
from enum import StrEnum
from typing import Any, Optional, Union
@@ -30,8 +31,8 @@ class LangSmithMultiModel(BaseModel):

class LangSmithRunModel(LangSmithTokenUsage, LangSmithMultiModel):
    name: Optional[str] = Field(..., description="Name of the run")
    inputs: Optional[Union[str, dict[str, Any], list, None]] = Field(None, description="Inputs of the run")
    outputs: Optional[Union[str, dict[str, Any], list, None]] = Field(None, description="Outputs of the run")
    inputs: Optional[Union[str, Mapping[str, Any], list, None]] = Field(None, description="Inputs of the run")
    outputs: Optional[Union[str, Mapping[str, Any], list, None]] = Field(None, description="Outputs of the run")
    run_type: LangSmithRunType = Field(..., description="Type of the run")
    start_time: Optional[datetime | str] = Field(None, description="Start time of the run")
    end_time: Optional[datetime | str] = Field(None, description="End time of the run")
@@ -1,4 +1,3 @@
import json
import logging
import os
import uuid
@@ -7,7 +6,7 @@ from typing import Optional, cast

from langsmith import Client
from langsmith.schemas import RunBase
from sqlalchemy.orm import sessionmaker
from sqlalchemy.orm import Session, sessionmaker

from core.ops.base_trace_instance import BaseTraceInstance
from core.ops.entities.config_entity import LangSmithConfig
@@ -29,8 +28,10 @@ from core.ops.langsmith_trace.entities.langsmith_trace_entity import (
)
from core.ops.utils import filter_none_values, generate_dotted_order
from core.repositories import SQLAlchemyWorkflowNodeExecutionRepository
from core.workflow.entities.node_entities import NodeRunMetadataKey
from core.workflow.nodes.enums import NodeType
from extensions.ext_database import db
from models.model import EndUser, MessageFile
from models import Account, App, EndUser, MessageFile, WorkflowNodeExecutionTriggeredFrom

logger = logging.getLogger(__name__)

@@ -137,8 +138,29 @@ class LangSmithDataTrace(BaseTraceInstance):

        # through workflow_run_id get all_nodes_execution using repository
        session_factory = sessionmaker(bind=db.engine)
        # Find the app's creator account
        with Session(db.engine, expire_on_commit=False) as session:
            # Get the app to find its creator
            app_id = trace_info.metadata.get("app_id")
            if not app_id:
                raise ValueError("No app_id found in trace_info metadata")

            app = session.query(App).filter(App.id == app_id).first()
            if not app:
                raise ValueError(f"App with id {app_id} not found")

            if not app.created_by:
                raise ValueError(f"App with id {app_id} has no creator (created_by is None)")

            service_account = session.query(Account).filter(Account.id == app.created_by).first()
            if not service_account:
                raise ValueError(f"Creator account with id {app.created_by} not found for app {app_id}")

        workflow_node_execution_repository = SQLAlchemyWorkflowNodeExecutionRepository(
            session_factory=session_factory, tenant_id=trace_info.tenant_id, app_id=trace_info.metadata.get("app_id")
            session_factory=session_factory,
            user=service_account,
            app_id=trace_info.metadata.get("app_id"),
            triggered_from=WorkflowNodeExecutionTriggeredFrom.WORKFLOW_RUN,
        )

        # Get all executions for this workflow run
@@ -148,27 +170,23 @@ class LangSmithDataTrace(BaseTraceInstance):

        for node_execution in workflow_node_executions:
            node_execution_id = node_execution.id
            tenant_id = node_execution.tenant_id
            app_id = node_execution.app_id
            tenant_id = trace_info.tenant_id  # Use from trace_info instead
            app_id = trace_info.metadata.get("app_id")  # Use from trace_info instead
            node_name = node_execution.title
            node_type = node_execution.node_type
            status = node_execution.status
            if node_type == "llm":
                inputs = (
                    json.loads(node_execution.process_data).get("prompts", {}) if node_execution.process_data else {}
                )
            if node_type == NodeType.LLM:
                inputs = node_execution.process_data.get("prompts", {}) if node_execution.process_data else {}
            else:
                inputs = json.loads(node_execution.inputs) if node_execution.inputs else {}
            outputs = json.loads(node_execution.outputs) if node_execution.outputs else {}
                inputs = node_execution.inputs if node_execution.inputs else {}
            outputs = node_execution.outputs if node_execution.outputs else {}
            created_at = node_execution.created_at or datetime.now()
            elapsed_time = node_execution.elapsed_time
            finished_at = created_at + timedelta(seconds=elapsed_time)

            execution_metadata = (
                json.loads(node_execution.execution_metadata) if node_execution.execution_metadata else {}
            )
            node_total_tokens = execution_metadata.get("total_tokens", 0)
            metadata = execution_metadata.copy()
            execution_metadata = node_execution.metadata if node_execution.metadata else {}
            node_total_tokens = execution_metadata.get(NodeRunMetadataKey.TOTAL_TOKENS) or 0
            metadata = {str(key): value for key, value in execution_metadata.items()}
            metadata.update(
                {
                    "workflow_run_id": trace_info.workflow_run_id,
@@ -181,7 +199,7 @@ class LangSmithDataTrace(BaseTraceInstance):
                }
            )

            process_data = json.loads(node_execution.process_data) if node_execution.process_data else {}
            process_data = node_execution.process_data if node_execution.process_data else {}

            if process_data and process_data.get("model_mode") == "chat":
                run_type = LangSmithRunType.llm
@@ -191,7 +209,7 @@ class LangSmithDataTrace(BaseTraceInstance):
                        "ls_model_name": process_data.get("model_name", ""),
                    }
                )
            elif node_type == "knowledge-retrieval":
            elif node_type == NodeType.KNOWLEDGE_RETRIEVAL:
                run_type = LangSmithRunType.retriever
            else:
                run_type = LangSmithRunType.tool
@@ -1,4 +1,3 @@
import json
import logging
import os
import uuid
@@ -7,7 +6,7 @@ from typing import Optional, cast

from opik import Opik, Trace
from opik.id_helpers import uuid4_to_uuid7
from sqlalchemy.orm import sessionmaker
from sqlalchemy.orm import Session, sessionmaker

from core.ops.base_trace_instance import BaseTraceInstance
from core.ops.entities.config_entity import OpikConfig
@@ -23,8 +22,10 @@ from core.ops.entities.trace_entity import (
    WorkflowTraceInfo,
)
from core.repositories import SQLAlchemyWorkflowNodeExecutionRepository
from core.workflow.entities.node_entities import NodeRunMetadataKey
from core.workflow.nodes.enums import NodeType
from extensions.ext_database import db
from models.model import EndUser, MessageFile
from models import Account, App, EndUser, MessageFile, WorkflowNodeExecutionTriggeredFrom

logger = logging.getLogger(__name__)

@@ -150,8 +151,29 @@ class OpikDataTrace(BaseTraceInstance):

        # through workflow_run_id get all_nodes_execution using repository
        session_factory = sessionmaker(bind=db.engine)
        # Find the app's creator account
        with Session(db.engine, expire_on_commit=False) as session:
            # Get the app to find its creator
            app_id = trace_info.metadata.get("app_id")
            if not app_id:
                raise ValueError("No app_id found in trace_info metadata")

            app = session.query(App).filter(App.id == app_id).first()
            if not app:
                raise ValueError(f"App with id {app_id} not found")

            if not app.created_by:
                raise ValueError(f"App with id {app_id} has no creator (created_by is None)")

            service_account = session.query(Account).filter(Account.id == app.created_by).first()
            if not service_account:
                raise ValueError(f"Creator account with id {app.created_by} not found for app {app_id}")

        workflow_node_execution_repository = SQLAlchemyWorkflowNodeExecutionRepository(
            session_factory=session_factory, tenant_id=trace_info.tenant_id, app_id=trace_info.metadata.get("app_id")
            session_factory=session_factory,
            user=service_account,
            app_id=trace_info.metadata.get("app_id"),
            triggered_from=WorkflowNodeExecutionTriggeredFrom.WORKFLOW_RUN,
        )

        # Get all executions for this workflow run
@@ -161,26 +183,22 @@ class OpikDataTrace(BaseTraceInstance):

        for node_execution in workflow_node_executions:
            node_execution_id = node_execution.id
            tenant_id = node_execution.tenant_id
            app_id = node_execution.app_id
            tenant_id = trace_info.tenant_id  # Use from trace_info instead
            app_id = trace_info.metadata.get("app_id")  # Use from trace_info instead
            node_name = node_execution.title
            node_type = node_execution.node_type
            status = node_execution.status
            if node_type == "llm":
                inputs = (
                    json.loads(node_execution.process_data).get("prompts", {}) if node_execution.process_data else {}
                )
            if node_type == NodeType.LLM:
                inputs = node_execution.process_data.get("prompts", {}) if node_execution.process_data else {}
            else:
                inputs = json.loads(node_execution.inputs) if node_execution.inputs else {}
            outputs = json.loads(node_execution.outputs) if node_execution.outputs else {}
                inputs = node_execution.inputs if node_execution.inputs else {}
            outputs = node_execution.outputs if node_execution.outputs else {}
            created_at = node_execution.created_at or datetime.now()
            elapsed_time = node_execution.elapsed_time
            finished_at = created_at + timedelta(seconds=elapsed_time)

            execution_metadata = (
                json.loads(node_execution.execution_metadata) if node_execution.execution_metadata else {}
            )
            metadata = execution_metadata.copy()
            execution_metadata = node_execution.metadata if node_execution.metadata else {}
            metadata = {str(k): v for k, v in execution_metadata.items()}
            metadata.update(
                {
                    "workflow_run_id": trace_info.workflow_run_id,
@@ -193,7 +211,7 @@ class OpikDataTrace(BaseTraceInstance):
                }
            )

            process_data = json.loads(node_execution.process_data) if node_execution.process_data else {}
            process_data = node_execution.process_data if node_execution.process_data else {}

            provider = None
            model = None
@@ -226,7 +244,7 @@ class OpikDataTrace(BaseTraceInstance):
            parent_span_id = trace_info.workflow_app_log_id or trace_info.workflow_run_id

            if not total_tokens:
                total_tokens = execution_metadata.get("total_tokens", 0)
                total_tokens = execution_metadata.get(NodeRunMetadataKey.TOTAL_TOKENS) or 0

            span_data = {
                "trace_id": opik_trace_id,
@@ -287,7 +287,9 @@ class OpsTraceManager:
        :return:
        """
        # auth check
        if tracing_provider not in provider_config_map and tracing_provider is not None:
        try:
            provider_config_map[tracing_provider]
        except KeyError:
            raise ValueError(f"Invalid tracing provider: {tracing_provider}")

        app_config: Optional[App] = db.session.query(App).filter(App.id == app_id).first()
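The hunk above swaps a compound membership test for an EAFP lookup, so a None provider now fails validation instead of slipping through. A small sketch of the difference, with a stand-in provider_config_map:

provider_config_map = {"langfuse": object(), "langsmith": object()}  # stand-in registry


def check_provider(tracing_provider):
    try:
        provider_config_map[tracing_provider]
    except KeyError:
        raise ValueError(f"Invalid tracing provider: {tracing_provider}")


check_provider("langfuse")  # passes
try:
    check_provider(None)  # the old test (`not in ... and ... is not None`) silently let None through
except ValueError as e:
    print(e)  # -> Invalid tracing provider: None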
@@ -1,3 +1,4 @@
from collections.abc import Mapping
from typing import Any, Optional, Union

from pydantic import BaseModel, Field, field_validator
@@ -19,8 +20,8 @@ class WeaveMultiModel(BaseModel):
class WeaveTraceModel(WeaveTokenUsage, WeaveMultiModel):
    id: str = Field(..., description="ID of the trace")
    op: str = Field(..., description="Name of the operation")
    inputs: Optional[Union[str, dict[str, Any], list, None]] = Field(None, description="Inputs of the trace")
    outputs: Optional[Union[str, dict[str, Any], list, None]] = Field(None, description="Outputs of the trace")
    inputs: Optional[Union[str, Mapping[str, Any], list, None]] = Field(None, description="Inputs of the trace")
    outputs: Optional[Union[str, Mapping[str, Any], list, None]] = Field(None, description="Outputs of the trace")
    attributes: Optional[Union[str, dict[str, Any], list, None]] = Field(
        None, description="Metadata and attributes associated with trace"
    )
@@ -1,4 +1,3 @@
import json
import logging
import os
import uuid
@@ -7,6 +6,7 @@ from typing import Any, Optional, cast

import wandb
import weave
from sqlalchemy.orm import Session, sessionmaker

from core.ops.base_trace_instance import BaseTraceInstance
from core.ops.entities.config_entity import WeaveConfig
@@ -22,9 +22,11 @@ from core.ops.entities.trace_entity import (
    WorkflowTraceInfo,
)
from core.ops.weave_trace.entities.weave_trace_entity import WeaveTraceModel
from core.repositories import SQLAlchemyWorkflowNodeExecutionRepository
from core.workflow.entities.node_entities import NodeRunMetadataKey
from core.workflow.nodes.enums import NodeType
from extensions.ext_database import db
from models.model import EndUser, MessageFile
from models.workflow import WorkflowNodeExecution
from models import Account, App, EndUser, MessageFile, WorkflowNodeExecutionTriggeredFrom

logger = logging.getLogger(__name__)

@@ -128,58 +130,57 @@ class WeaveDataTrace(BaseTraceInstance):

        self.start_call(workflow_run, parent_run_id=trace_info.message_id)

        # through workflow_run_id get all_nodes_execution
        workflow_nodes_execution_id_records = (
            db.session.query(WorkflowNodeExecution.id)
            .filter(WorkflowNodeExecution.workflow_run_id == trace_info.workflow_run_id)
            .all()
        # through workflow_run_id get all_nodes_execution using repository
        session_factory = sessionmaker(bind=db.engine)
        # Find the app's creator account
        with Session(db.engine, expire_on_commit=False) as session:
            # Get the app to find its creator
            app_id = trace_info.metadata.get("app_id")
            if not app_id:
                raise ValueError("No app_id found in trace_info metadata")

            app = session.query(App).filter(App.id == app_id).first()
            if not app:
                raise ValueError(f"App with id {app_id} not found")

            if not app.created_by:
                raise ValueError(f"App with id {app_id} has no creator (created_by is None)")

            service_account = session.query(Account).filter(Account.id == app.created_by).first()
            if not service_account:
                raise ValueError(f"Creator account with id {app.created_by} not found for app {app_id}")

        workflow_node_execution_repository = SQLAlchemyWorkflowNodeExecutionRepository(
            session_factory=session_factory,
            user=service_account,
            app_id=trace_info.metadata.get("app_id"),
            triggered_from=WorkflowNodeExecutionTriggeredFrom.WORKFLOW_RUN,
        )

        for node_execution_id_record in workflow_nodes_execution_id_records:
            node_execution = (
                db.session.query(
                    WorkflowNodeExecution.id,
                    WorkflowNodeExecution.tenant_id,
                    WorkflowNodeExecution.app_id,
                    WorkflowNodeExecution.title,
                    WorkflowNodeExecution.node_type,
                    WorkflowNodeExecution.status,
                    WorkflowNodeExecution.inputs,
                    WorkflowNodeExecution.outputs,
                    WorkflowNodeExecution.created_at,
                    WorkflowNodeExecution.elapsed_time,
                    WorkflowNodeExecution.process_data,
                    WorkflowNodeExecution.execution_metadata,
                )
                .filter(WorkflowNodeExecution.id == node_execution_id_record.id)
                .first()
        # Get all executions for this workflow run
        workflow_node_executions = workflow_node_execution_repository.get_by_workflow_run(
            workflow_run_id=trace_info.workflow_run_id
        )

            if not node_execution:
                continue

        for node_execution in workflow_node_executions:
            node_execution_id = node_execution.id
            tenant_id = node_execution.tenant_id
            app_id = node_execution.app_id
            tenant_id = trace_info.tenant_id  # Use from trace_info instead
            app_id = trace_info.metadata.get("app_id")  # Use from trace_info instead
            node_name = node_execution.title
            node_type = node_execution.node_type
            status = node_execution.status
            if node_type == "llm":
                inputs = (
                    json.loads(node_execution.process_data).get("prompts", {}) if node_execution.process_data else {}
                )
            if node_type == NodeType.LLM:
                inputs = node_execution.process_data.get("prompts", {}) if node_execution.process_data else {}
            else:
                inputs = json.loads(node_execution.inputs) if node_execution.inputs else {}
            outputs = json.loads(node_execution.outputs) if node_execution.outputs else {}
                inputs = node_execution.inputs if node_execution.inputs else {}
            outputs = node_execution.outputs if node_execution.outputs else {}
            created_at = node_execution.created_at or datetime.now()
            elapsed_time = node_execution.elapsed_time
            finished_at = created_at + timedelta(seconds=elapsed_time)

            execution_metadata = (
                json.loads(node_execution.execution_metadata) if node_execution.execution_metadata else {}
            )
            node_total_tokens = execution_metadata.get("total_tokens", 0)
            attributes = execution_metadata.copy()
            execution_metadata = node_execution.metadata if node_execution.metadata else {}
            node_total_tokens = execution_metadata.get(NodeRunMetadataKey.TOTAL_TOKENS) or 0
            attributes = {str(k): v for k, v in execution_metadata.items()}
            attributes.update(
                {
                    "workflow_run_id": trace_info.workflow_run_id,
@@ -192,7 +193,7 @@ class WeaveDataTrace(BaseTraceInstance):
                }
            )

            process_data = json.loads(node_execution.process_data) if node_execution.process_data else {}
            process_data = node_execution.process_data if node_execution.process_data else {}
            if process_data and process_data.get("model_mode") == "chat":
                attributes.update(
                    {
@@ -64,9 +64,9 @@ class PluginNodeBackwardsInvocation(BaseBackwardsInvocation):
        )

        return {
            "inputs": execution.inputs_dict,
            "outputs": execution.outputs_dict,
            "process_data": execution.process_data_dict,
            "inputs": execution.inputs,
            "outputs": execution.outputs,
            "process_data": execution.process_data,
        }

    @classmethod
@@ -113,7 +113,7 @@ class PluginNodeBackwardsInvocation(BaseBackwardsInvocation):
        )

        return {
            "inputs": execution.inputs_dict,
            "outputs": execution.outputs_dict,
            "process_data": execution.process_data_dict,
            "inputs": execution.inputs,
            "outputs": execution.outputs,
            "process_data": execution.process_data,
        }
@@ -6,6 +6,12 @@ from urllib.parse import urljoin
import requests
from requests import Response

from core.rag.extractor.watercrawl.exceptions import (
    WaterCrawlAuthenticationError,
    WaterCrawlBadRequestError,
    WaterCrawlPermissionError,
)


class BaseAPIClient:
    def __init__(self, api_key, base_url):
@@ -53,6 +59,15 @@ class WaterCrawlAPIClient(BaseAPIClient):
            yield data

    def process_response(self, response: Response) -> dict | bytes | list | None | Generator:
        if response.status_code == 401:
            raise WaterCrawlAuthenticationError(response)

        if response.status_code == 403:
            raise WaterCrawlPermissionError(response)

        if 400 <= response.status_code < 500:
            raise WaterCrawlBadRequestError(response)

        response.raise_for_status()
        if response.status_code == 204:
            return None
api/core/rag/extractor/watercrawl/exceptions.py (new file, 32 lines)
@@ -0,0 +1,32 @@
import json


class WaterCrawlError(Exception):
    pass


class WaterCrawlBadRequestError(WaterCrawlError):
    def __init__(self, response):
        self.status_code = response.status_code
        self.response = response
        data = response.json()
        self.message = data.get("message", "Unknown error occurred")
        self.errors = data.get("errors", {})
        super().__init__(self.message)

    @property
    def flat_errors(self):
        return json.dumps(self.errors)

    def __str__(self):
        return f"WaterCrawlBadRequestError: {self.message} \n {self.flat_errors}"


class WaterCrawlPermissionError(WaterCrawlBadRequestError):
    def __str__(self):
        return f"You are exceeding your WaterCrawl API limits. {self.message}"


class WaterCrawlAuthenticationError(WaterCrawlBadRequestError):
    def __str__(self):
        return "WaterCrawl API key is invalid or expired. Please check your API key and try again."
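Because both specific errors subclass WaterCrawlBadRequestError, a single handler can cover every 4xx case raised by process_response above. A small usage sketch that exercises the hierarchy with a mocked 403 response instead of a live API call:

from unittest.mock import Mock

# Build a fake 403 response so the sketch runs without the network
fake = Mock(status_code=403)
fake.json.return_value = {"message": "Quota exceeded", "errors": {"plan": ["upgrade required"]}}

try:
    raise WaterCrawlPermissionError(fake)
except WaterCrawlBadRequestError as err:  # the base class catches both subclasses
    print(err)  # -> You are exceeding your WaterCrawl API limits. Quota exceeded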
@@ -19,7 +19,7 @@ from core.rag.extractor.extractor_base import BaseExtractor
from core.rag.models.document import Document
from extensions.ext_database import db
from extensions.ext_storage import storage
from models.enums import CreatedByRole
from models.enums import CreatorUserRole
from models.model import UploadFile

logger = logging.getLogger(__name__)
@@ -116,7 +116,7 @@ class WordExtractor(BaseExtractor):
                extension=str(image_ext),
                mime_type=mime_type or "",
                created_by=self.user_id,
                created_by_role=CreatedByRole.ACCOUNT,
                created_by_role=CreatorUserRole.ACCOUNT,
                created_at=datetime.datetime.now(datetime.UTC).replace(tzinfo=None),
                used=True,
                used_by=self.user_id,
@@ -2,16 +2,30 @@
SQLAlchemy implementation of the WorkflowNodeExecutionRepository.
"""

import json
import logging
from collections.abc import Sequence
from typing import Optional
from collections.abc import Mapping, Sequence
from typing import Any, Optional, Union, cast

from sqlalchemy import UnaryExpression, asc, delete, desc, select
from sqlalchemy.engine import Engine
from sqlalchemy.orm import sessionmaker

from core.workflow.entities.node_entities import NodeRunMetadataKey
from core.workflow.entities.node_execution_entities import (
    NodeExecution,
    NodeExecutionStatus,
)
from core.workflow.nodes.enums import NodeType
from core.workflow.repository.workflow_node_execution_repository import OrderConfig, WorkflowNodeExecutionRepository
from models.workflow import WorkflowNodeExecution, WorkflowNodeExecutionStatus, WorkflowNodeExecutionTriggeredFrom
from models import (
    Account,
    CreatorUserRole,
    EndUser,
    WorkflowNodeExecution,
    WorkflowNodeExecutionStatus,
    WorkflowNodeExecutionTriggeredFrom,
)

logger = logging.getLogger(__name__)

@@ -23,16 +37,26 @@ class SQLAlchemyWorkflowNodeExecutionRepository(WorkflowNodeExecutionRepository)
    This implementation supports multi-tenancy by filtering operations based on tenant_id.
    Each method creates its own session, handles the transaction, and commits changes
    to the database. This prevents long-running connections in the workflow core.

    This implementation also includes an in-memory cache for node executions to improve
    performance by reducing database queries.
    """

    def __init__(self, session_factory: sessionmaker | Engine, tenant_id: str, app_id: Optional[str] = None):
    def __init__(
        self,
        session_factory: sessionmaker | Engine,
        user: Union[Account, EndUser],
        app_id: Optional[str],
        triggered_from: Optional[WorkflowNodeExecutionTriggeredFrom],
    ):
        """
        Initialize the repository with a SQLAlchemy sessionmaker or engine and tenant context.
        Initialize the repository with a SQLAlchemy sessionmaker or engine and context information.

        Args:
            session_factory: SQLAlchemy sessionmaker or engine for creating sessions
            tenant_id: Tenant ID for multi-tenancy
            app_id: Optional app ID for filtering by application
            user: Account or EndUser object containing tenant_id, user ID, and role information
            app_id: App ID for filtering by application (can be None)
            triggered_from: Source of the execution trigger (SINGLE_STEP or WORKFLOW_RUN)
        """
        # If an engine is provided, create a sessionmaker from it
        if isinstance(session_factory, Engine):
@@ -44,38 +68,168 @@ class SQLAlchemyWorkflowNodeExecutionRepository(WorkflowNodeExecutionRepository)
                f"Invalid session_factory type {type(session_factory).__name__}; expected sessionmaker or Engine"
            )

        # Extract tenant_id from user
        tenant_id: str | None = user.tenant_id if isinstance(user, EndUser) else user.current_tenant_id
        if not tenant_id:
            raise ValueError("User must have a tenant_id or current_tenant_id")
        self._tenant_id = tenant_id

        # Store app context
        self._app_id = app_id

    def save(self, execution: WorkflowNodeExecution) -> None:
        # Extract user context
        self._triggered_from = triggered_from
        self._creator_user_id = user.id

        # Determine user role based on user type
        self._creator_user_role = CreatorUserRole.ACCOUNT if isinstance(user, Account) else CreatorUserRole.END_USER

        # Initialize in-memory cache for node executions
        # Key: node_execution_id, Value: NodeExecution
        self._node_execution_cache: dict[str, NodeExecution] = {}

    def _to_domain_model(self, db_model: WorkflowNodeExecution) -> NodeExecution:
        """
        Save a WorkflowNodeExecution instance and commit changes to the database.
        Convert a database model to a domain model.

        Args:
            execution: The WorkflowNodeExecution instance to save
            db_model: The database model to convert

        Returns:
            The domain model
        """
        # Parse JSON fields
        inputs = db_model.inputs_dict
        process_data = db_model.process_data_dict
        outputs = db_model.outputs_dict
        metadata = db_model.execution_metadata_dict

        # Convert status to domain enum
        status = NodeExecutionStatus(db_model.status)

        return NodeExecution(
            id=db_model.id,
            node_execution_id=db_model.node_execution_id,
            workflow_id=db_model.workflow_id,
            workflow_run_id=db_model.workflow_run_id,
            index=db_model.index,
            predecessor_node_id=db_model.predecessor_node_id,
            node_id=db_model.node_id,
            node_type=NodeType(db_model.node_type),
            title=db_model.title,
            inputs=inputs,
            process_data=process_data,
            outputs=outputs,
            status=status,
            error=db_model.error,
            elapsed_time=db_model.elapsed_time,
            # FIXME(QuantumGhost): a temporary workaround for the following type check failure in Python 3.11.
            # However, this problem does not occur in Python 3.12.
            #
            # A case of this error is:
            # https://github.com/langgenius/dify/actions/runs/15112698604/job/42475659482?pr=19737#step:9:24
            metadata=cast(Mapping[NodeRunMetadataKey, Any] | None, metadata),
            created_at=db_model.created_at,
            finished_at=db_model.finished_at,
        )

    def to_db_model(self, domain_model: NodeExecution) -> WorkflowNodeExecution:
        """
        Convert a domain model to a database model.

        Args:
            domain_model: The domain model to convert

        Returns:
            The database model
        """
        # Use values from constructor if provided
        if not self._triggered_from:
            raise ValueError("triggered_from is required in repository constructor")
        if not self._creator_user_id:
            raise ValueError("created_by is required in repository constructor")
        if not self._creator_user_role:
            raise ValueError("created_by_role is required in repository constructor")

        db_model = WorkflowNodeExecution()
        db_model.id = domain_model.id
        db_model.tenant_id = self._tenant_id
        if self._app_id is not None:
            db_model.app_id = self._app_id
        db_model.workflow_id = domain_model.workflow_id
        db_model.triggered_from = self._triggered_from
        db_model.workflow_run_id = domain_model.workflow_run_id
        db_model.index = domain_model.index
        db_model.predecessor_node_id = domain_model.predecessor_node_id
        db_model.node_execution_id = domain_model.node_execution_id
        db_model.node_id = domain_model.node_id
        db_model.node_type = domain_model.node_type
        db_model.title = domain_model.title
        db_model.inputs = json.dumps(domain_model.inputs) if domain_model.inputs else None
        db_model.process_data = json.dumps(domain_model.process_data) if domain_model.process_data else None
        db_model.outputs = json.dumps(domain_model.outputs) if domain_model.outputs else None
        db_model.status = domain_model.status
        db_model.error = domain_model.error
        db_model.elapsed_time = domain_model.elapsed_time
        db_model.execution_metadata = json.dumps(domain_model.metadata) if domain_model.metadata else None
        db_model.created_at = domain_model.created_at
        db_model.created_by_role = self._creator_user_role
        db_model.created_by = self._creator_user_id
        db_model.finished_at = domain_model.finished_at
        return db_model

    def save(self, execution: NodeExecution) -> None:
        """
        Save or update a NodeExecution domain entity to the database.

        This method serves as a domain-to-database adapter that:
        1. Converts the domain entity to its database representation
        2. Persists the database model using SQLAlchemy's merge operation
        3. Maintains proper multi-tenancy by including tenant context during conversion
        4. Updates the in-memory cache for faster subsequent lookups

        The method handles both creating new records and updating existing ones through
        SQLAlchemy's merge operation.

        Args:
            execution: The NodeExecution domain entity to persist
        """
        # Convert domain model to database model using tenant context and other attributes
        db_model = self.to_db_model(execution)

        # Create a new database session
        with self._session_factory() as session:
            # Ensure tenant_id is set
            if not execution.tenant_id:
                execution.tenant_id = self._tenant_id

            # Set app_id if provided and not already set
            if self._app_id and not execution.app_id:
                execution.app_id = self._app_id

            session.add(execution)
            # SQLAlchemy merge intelligently handles both insert and update operations
            # based on the presence of the primary key
            session.merge(db_model)
            session.commit()

    def get_by_node_execution_id(self, node_execution_id: str) -> Optional[WorkflowNodeExecution]:
            # Update the in-memory cache for faster subsequent lookups
            # Only cache if we have a node_execution_id to use as the cache key
            if db_model.node_execution_id:
                logger.debug(f"Updating cache for node_execution_id: {db_model.node_execution_id}")
                self._node_execution_cache[db_model.node_execution_id] = execution

    def get_by_node_execution_id(self, node_execution_id: str) -> Optional[NodeExecution]:
        """
        Retrieve a WorkflowNodeExecution by its node_execution_id.
        Retrieve a NodeExecution by its node_execution_id.

        First checks the in-memory cache, and if not found, queries the database.
        If found in the database, adds it to the cache for future lookups.

        Args:
            node_execution_id: The node execution ID

        Returns:
            The WorkflowNodeExecution instance if found, None otherwise
            The NodeExecution instance if found, None otherwise
        """
        # First check the cache
        if node_execution_id in self._node_execution_cache:
            logger.debug(f"Cache hit for node_execution_id: {node_execution_id}")
            return self._node_execution_cache[node_execution_id]

        # If not in cache, query the database
        logger.debug(f"Cache miss for node_execution_id: {node_execution_id}, querying database")
        with self._session_factory() as session:
            stmt = select(WorkflowNodeExecution).where(
                WorkflowNodeExecution.node_execution_id == node_execution_id,
@@ -85,15 +239,28 @@ class SQLAlchemyWorkflowNodeExecutionRepository(WorkflowNodeExecutionRepository)
            if self._app_id:
                stmt = stmt.where(WorkflowNodeExecution.app_id == self._app_id)

            return session.scalar(stmt)
            db_model = session.scalar(stmt)
            if db_model:
                # Convert to domain model
                domain_model = self._to_domain_model(db_model)

                # Add to cache
                self._node_execution_cache[node_execution_id] = domain_model

                return domain_model

            return None

    def get_by_workflow_run(
        self,
        workflow_run_id: str,
        order_config: Optional[OrderConfig] = None,
    ) -> Sequence[WorkflowNodeExecution]:
    ) -> Sequence[NodeExecution]:
        """
        Retrieve all WorkflowNodeExecution instances for a specific workflow run.
        Retrieve all NodeExecution instances for a specific workflow run.

        This method always queries the database to ensure complete and ordered results,
        but updates the cache with any retrieved executions.

        Args:
            workflow_run_id: The workflow run ID
@@ -102,7 +269,7 @@ class SQLAlchemyWorkflowNodeExecutionRepository(WorkflowNodeExecutionRepository)
            order_config.order_direction: Direction to order ("asc" or "desc")

        Returns:
            A list of WorkflowNodeExecution instances
            A list of NodeExecution instances
        """
        with self._session_factory() as session:
            stmt = select(WorkflowNodeExecution).where(
@@ -129,17 +296,31 @@ class SQLAlchemyWorkflowNodeExecutionRepository(WorkflowNodeExecutionRepository)
            if order_columns:
                stmt = stmt.order_by(*order_columns)

            return session.scalars(stmt).all()
            db_models = session.scalars(stmt).all()

    def get_running_executions(self, workflow_run_id: str) -> Sequence[WorkflowNodeExecution]:
            # Convert database models to domain models and update cache
            domain_models = []
            for model in db_models:
                domain_model = self._to_domain_model(model)
                # Update cache if node_execution_id is present
                if domain_model.node_execution_id:
                    self._node_execution_cache[domain_model.node_execution_id] = domain_model
                domain_models.append(domain_model)

            return domain_models

    def get_running_executions(self, workflow_run_id: str) -> Sequence[NodeExecution]:
        """
        Retrieve all running WorkflowNodeExecution instances for a specific workflow run.
        Retrieve all running NodeExecution instances for a specific workflow run.

        This method queries the database directly and updates the cache with any
        retrieved executions that have a node_execution_id.

        Args:
            workflow_run_id: The workflow run ID

        Returns:
            A list of running WorkflowNodeExecution instances
            A list of running NodeExecution instances
        """
        with self._session_factory() as session:
            stmt = select(WorkflowNodeExecution).where(
@@ -152,26 +333,17 @@ class SQLAlchemyWorkflowNodeExecutionRepository(WorkflowNodeExecutionRepository)
            if self._app_id:
                stmt = stmt.where(WorkflowNodeExecution.app_id == self._app_id)

            return session.scalars(stmt).all()
            db_models = session.scalars(stmt).all()
            domain_models = []

    def update(self, execution: WorkflowNodeExecution) -> None:
        """
        Update an existing WorkflowNodeExecution instance and commit changes to the database.
            for model in db_models:
                domain_model = self._to_domain_model(model)
                # Update cache if node_execution_id is present
                if domain_model.node_execution_id:
                    self._node_execution_cache[domain_model.node_execution_id] = domain_model
                domain_models.append(domain_model)

        Args:
            execution: The WorkflowNodeExecution instance to update
        """
        with self._session_factory() as session:
            # Ensure tenant_id is set
            if not execution.tenant_id:
                execution.tenant_id = self._tenant_id

            # Set app_id if provided and not already set
            if self._app_id and not execution.app_id:
                execution.app_id = self._app_id

            session.merge(execution)
            session.commit()
            return domain_models

    def clear(self) -> None:
        """
@@ -179,6 +351,7 @@ class SQLAlchemyWorkflowNodeExecutionRepository(WorkflowNodeExecutionRepository)

        This method deletes all WorkflowNodeExecution records that match the tenant_id
        and app_id (if provided) associated with this repository instance.
        It also clears the in-memory cache.
        """
        with self._session_factory() as session:
            stmt = delete(WorkflowNodeExecution).where(WorkflowNodeExecution.tenant_id == self._tenant_id)
@@ -194,3 +367,7 @@ class SQLAlchemyWorkflowNodeExecutionRepository(WorkflowNodeExecutionRepository)
                f"Cleared {deleted_count} workflow node execution records for tenant {self._tenant_id}"
                + (f" and app {self._app_id}" if self._app_id else "")
            )

            # Clear the in-memory cache
            self._node_execution_cache.clear()
            logger.info("Cleared in-memory node execution cache")
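A minimal write-through cache sketch mirroring the caching rules the repository above now follows (cache on save keyed by node_execution_id, check the cache before querying, backfill on a miss). The types are stand-ins, not the Dify classes:

from typing import Optional


class CachingRepo:
    def __init__(self, db: dict[str, dict]):
        self._db = db
        self._cache: dict[str, dict] = {}

    def save(self, execution: dict) -> None:
        self._db[execution["id"]] = execution
        if execution.get("node_execution_id"):  # only cache when the key exists
            self._cache[execution["node_execution_id"]] = execution

    def get_by_node_execution_id(self, node_execution_id: str) -> Optional[dict]:
        if node_execution_id in self._cache:  # cache hit: no DB query
            return self._cache[node_execution_id]
        found = next(
            (e for e in self._db.values() if e.get("node_execution_id") == node_execution_id),
            None,
        )
        if found:
            self._cache[node_execution_id] = found  # backfill on a miss
        return found


repo = CachingRepo(db={})
repo.save({"id": "1", "node_execution_id": "n-1"})
assert repo.get_by_node_execution_id("n-1")["id"] == "1"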
@@ -32,7 +32,7 @@ from core.tools.errors import (
from core.tools.utils.message_transformer import ToolFileMessageTransformer
from core.tools.workflow_as_tool.tool import WorkflowTool
from extensions.ext_database import db
from models.enums import CreatedByRole
from models.enums import CreatorUserRole
from models.model import Message, MessageFile


@@ -339,9 +339,9 @@ class ToolEngine:
                url=message.url,
                upload_file_id=tool_file_id,
                created_by_role=(
                    CreatedByRole.ACCOUNT
                    CreatorUserRole.ACCOUNT
                    if invoke_from in {InvokeFrom.EXPLORE, InvokeFrom.DEBUGGER}
                    else CreatedByRole.END_USER
                    else CreatorUserRole.END_USER
                ),
                created_by=user_id,
            )
api/core/variables/consts.py (new file, 7 lines)
@@ -0,0 +1,7 @@
# The minimal selector length for valid variables.
#
# The first element of the selector is the node id, and the second element is the variable name.
#
# If the selector length is more than 2, the remaining parts are the keys / indexes paths used
# to extract part of the variable value.
MIN_SELECTORS_LENGTH = 2
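A hedged sketch of how a caller might enforce this constant; validate_selector is illustrative, not a function from this diff:

from collections.abc import Sequence

MIN_SELECTORS_LENGTH = 2  # restated so the sketch runs standalone


def validate_selector(selector: Sequence[str]) -> None:
    # A selector shorter than [node_id, variable_name] cannot address a variable
    if len(selector) < MIN_SELECTORS_LENGTH:
        raise ValueError(f"selector needs at least {MIN_SELECTORS_LENGTH} parts: node id and variable name")


validate_selector(["node_1", "text"])         # ok
validate_selector(["node_1", "obj", "key"])   # ok: extra parts are drill-down paths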
api/core/variables/utils.py (new file, 8 lines)
@@ -0,0 +1,8 @@
from collections.abc import Iterable, Sequence


def to_selector(node_id: str, name: str, paths: Iterable[str] = ()) -> Sequence[str]:
    selectors = [node_id, name]
    if paths:
        selectors.extend(paths)
    return selectors
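Usage of the to_selector helper defined above: node id and variable name come first, and any extra paths are appended as drill-down keys:

print(to_selector("node_1", "result"))                    # ['node_1', 'result']
print(to_selector("node_1", "result", paths=("a", "0")))  # ['node_1', 'result', 'a', '0']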
api/core/workflow/entities/node_execution_entities.py (new file, 98 lines)
@@ -0,0 +1,98 @@
"""
Domain entities for workflow node execution.

This module contains the domain model for workflow node execution, which is used
by the core workflow module. These models are independent of the storage mechanism
and don't contain implementation details like tenant_id, app_id, etc.
"""

from collections.abc import Mapping
from datetime import datetime
from enum import StrEnum
from typing import Any, Optional

from pydantic import BaseModel, Field

from core.workflow.entities.node_entities import NodeRunMetadataKey
from core.workflow.nodes.enums import NodeType


class NodeExecutionStatus(StrEnum):
    """
    Node Execution Status Enum.
    """

    RUNNING = "running"
    SUCCEEDED = "succeeded"
    FAILED = "failed"
    EXCEPTION = "exception"
    RETRY = "retry"


class NodeExecution(BaseModel):
    """
    Domain model for workflow node execution.

    This model represents the core business entity of a node execution,
    without implementation details like tenant_id, app_id, etc.

    Note: User/context-specific fields (triggered_from, created_by, created_by_role)
    have been moved to the repository implementation to keep the domain model clean.
    These fields are still accepted in the constructor for backward compatibility,
    but they are not stored in the model.
    """

    # Core identification fields
    id: str  # Unique identifier for this execution record
    node_execution_id: Optional[str] = None  # Optional secondary ID for cross-referencing
    workflow_id: str  # ID of the workflow this node belongs to
    workflow_run_id: Optional[str] = None  # ID of the specific workflow run (null for single-step debugging)

    # Execution positioning and flow
    index: int  # Sequence number for ordering in trace visualization
    predecessor_node_id: Optional[str] = None  # ID of the node that executed before this one
    node_id: str  # ID of the node being executed
    node_type: NodeType  # Type of node (e.g., start, llm, knowledge)
    title: str  # Display title of the node

    # Execution data
    inputs: Optional[Mapping[str, Any]] = None  # Input variables used by this node
    process_data: Optional[Mapping[str, Any]] = None  # Intermediate processing data
    outputs: Optional[Mapping[str, Any]] = None  # Output variables produced by this node

    # Execution state
    status: NodeExecutionStatus = NodeExecutionStatus.RUNNING  # Current execution status
    error: Optional[str] = None  # Error message if execution failed
    elapsed_time: float = Field(default=0.0)  # Time taken for execution in seconds

    # Additional metadata
    metadata: Optional[Mapping[NodeRunMetadataKey, Any]] = None  # Execution metadata (tokens, cost, etc.)

    # Timing information
    created_at: datetime  # When execution started
    finished_at: Optional[datetime] = None  # When execution completed

    def update_from_mapping(
        self,
        inputs: Optional[Mapping[str, Any]] = None,
        process_data: Optional[Mapping[str, Any]] = None,
        outputs: Optional[Mapping[str, Any]] = None,
        metadata: Optional[Mapping[NodeRunMetadataKey, Any]] = None,
    ) -> None:
        """
        Update the model from mappings.

        Args:
            inputs: The inputs to update
            process_data: The process data to update
            outputs: The outputs to update
            metadata: The metadata to update
        """
        if inputs is not None:
            self.inputs = dict(inputs)
        if process_data is not None:
            self.process_data = dict(process_data)
        if outputs is not None:
            self.outputs = dict(outputs)
        if metadata is not None:
            self.metadata = dict(metadata)
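A hedged sketch of constructing the NodeExecution domain model above and updating it after a run, assuming the entities in this file plus NodeType.LLM and NodeRunMetadataKey.TOTAL_TOKENS (both used elsewhere in this diff) are importable:

from datetime import datetime

execution = NodeExecution(
    id="11111111-1111-1111-1111-111111111111",
    workflow_id="wf-1",
    index=1,
    node_id="llm-node",
    node_type=NodeType.LLM,
    title="LLM",
    created_at=datetime.now(),
)
print(execution.status)  # NodeExecutionStatus.RUNNING (the default)

execution.update_from_mapping(
    outputs={"text": "hello"},
    metadata={NodeRunMetadataKey.TOTAL_TOKENS: 42},
)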
@@ -2,12 +2,12 @@ from collections.abc import Sequence
from dataclasses import dataclass
from typing import Literal, Optional, Protocol

from models.workflow import WorkflowNodeExecution
from core.workflow.entities.node_execution_entities import NodeExecution


@dataclass
class OrderConfig:
    """Configuration for ordering WorkflowNodeExecution instances."""
    """Configuration for ordering NodeExecution instances."""

    order_by: list[str]
    order_direction: Optional[Literal["asc", "desc"]] = None
@@ -15,10 +15,10 @@ class OrderConfig:

class WorkflowNodeExecutionRepository(Protocol):
    """
    Repository interface for WorkflowNodeExecution.
    Repository interface for NodeExecution.

    This interface defines the contract for accessing and manipulating
    WorkflowNodeExecution data, regardless of the underlying storage mechanism.
    NodeExecution data, regardless of the underlying storage mechanism.

    Note: Domain-specific concepts like multi-tenancy (tenant_id), application context (app_id),
    and trigger sources (triggered_from) should be handled at the implementation level, not in
@@ -26,24 +26,28 @@ class WorkflowNodeExecutionRepository(Protocol):
    application domains or deployment scenarios.
    """

    def save(self, execution: WorkflowNodeExecution) -> None:
    def save(self, execution: NodeExecution) -> None:
        """
        Save a WorkflowNodeExecution instance.
        Save or update a NodeExecution instance.

        This method handles both creating new records and updating existing ones.
        The implementation should determine whether to create or update based on
        the execution's ID or other identifying fields.

        Args:
            execution: The WorkflowNodeExecution instance to save
            execution: The NodeExecution instance to save or update
        """
        ...

    def get_by_node_execution_id(self, node_execution_id: str) -> Optional[WorkflowNodeExecution]:
    def get_by_node_execution_id(self, node_execution_id: str) -> Optional[NodeExecution]:
        """
        Retrieve a WorkflowNodeExecution by its node_execution_id.
        Retrieve a NodeExecution by its node_execution_id.

        Args:
            node_execution_id: The node execution ID

        Returns:
            The WorkflowNodeExecution instance if found, None otherwise
            The NodeExecution instance if found, None otherwise
        """
        ...

@@ -51,9 +55,9 @@ class WorkflowNodeExecutionRepository(Protocol):
        self,
        workflow_run_id: str,
        order_config: Optional[OrderConfig] = None,
    ) -> Sequence[WorkflowNodeExecution]:
    ) -> Sequence[NodeExecution]:
        """
        Retrieve all WorkflowNodeExecution instances for a specific workflow run.
        Retrieve all NodeExecution instances for a specific workflow run.

        Args:
            workflow_run_id: The workflow run ID
@@ -62,34 +66,25 @@ class WorkflowNodeExecutionRepository(Protocol):
            order_config.order_direction: Direction to order ("asc" or "desc")

        Returns:
            A list of WorkflowNodeExecution instances
            A list of NodeExecution instances
        """
        ...

    def get_running_executions(self, workflow_run_id: str) -> Sequence[WorkflowNodeExecution]:
    def get_running_executions(self, workflow_run_id: str) -> Sequence[NodeExecution]:
        """
        Retrieve all running WorkflowNodeExecution instances for a specific workflow run.
        Retrieve all running NodeExecution instances for a specific workflow run.

        Args:
            workflow_run_id: The workflow run ID

        Returns:
            A list of running WorkflowNodeExecution instances
        """
        ...

    def update(self, execution: WorkflowNodeExecution) -> None:
        """
        Update an existing WorkflowNodeExecution instance.

        Args:
            execution: The WorkflowNodeExecution instance to update
            A list of running NodeExecution instances
        """
        ...

    def clear(self) -> None:
        """
        Clear all WorkflowNodeExecution records based on implementation-specific criteria.
        Clear all NodeExecution records based on implementation-specific criteria.

        This method is intended to be used for bulk deletion operations, such as removing
        all records associated with a specific app_id and tenant_id in multi-tenant implementations.
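Because WorkflowNodeExecutionRepository is a typing.Protocol, any object with structurally matching methods satisfies it; no inheritance is required. A minimal in-memory sketch, assuming the NodeExecution entities above are importable (note the separate update method was folded into save in this diff, so an implementation no longer provides it):

from collections.abc import Sequence
from typing import Optional


class InMemoryNodeExecutionRepository:
    """Structurally satisfies WorkflowNodeExecutionRepository for tests or demos."""

    def __init__(self) -> None:
        self._store: dict[str, NodeExecution] = {}

    def save(self, execution: NodeExecution) -> None:
        # Keyed by node_execution_id when present, mirroring the SQLAlchemy implementation's cache
        if execution.node_execution_id:
            self._store[execution.node_execution_id] = execution

    def get_by_node_execution_id(self, node_execution_id: str) -> Optional[NodeExecution]:
        return self._store.get(node_execution_id)

    def get_by_workflow_run(self, workflow_run_id: str, order_config=None) -> Sequence[NodeExecution]:
        return [e for e in self._store.values() if e.workflow_run_id == workflow_run_id]

    def get_running_executions(self, workflow_run_id: str) -> Sequence[NodeExecution]:
        return [e for e in self.get_by_workflow_run(workflow_run_id) if e.status == NodeExecutionStatus.RUNNING]

    def clear(self) -> None:
        self._store.clear()


repo: WorkflowNodeExecutionRepository = InMemoryNodeExecutionRepository()  # type-checks structurally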
@@ -58,7 +58,7 @@ from core.workflow.repository.workflow_node_execution_repository import Workflow
from core.workflow.workflow_cycle_manager import WorkflowCycleManager
from extensions.ext_database import db
from models.account import Account
from models.enums import CreatedByRole
from models.enums import CreatorUserRole
from models.model import EndUser
from models.workflow import (
    Workflow,
@@ -94,11 +94,11 @@ class WorkflowAppGenerateTaskPipeline:
        if isinstance(user, EndUser):
            self._user_id = user.id
            user_session_id = user.session_id
            self._created_by_role = CreatedByRole.END_USER
            self._created_by_role = CreatorUserRole.END_USER
        elif isinstance(user, Account):
            self._user_id = user.id
            user_session_id = user.id
            self._created_by_role = CreatedByRole.ACCOUNT
            self._created_by_role = CreatorUserRole.ACCOUNT
        else:
            raise ValueError(f"Invalid user type: {type(user)}")
@ -46,26 +46,28 @@ from core.app.entities.task_entities import (
|
||||
)
|
||||
from core.app.task_pipeline.exc import WorkflowRunNotFoundError
|
||||
from core.file import FILE_MODEL_IDENTITY, File
|
||||
from core.model_runtime.utils.encoders import jsonable_encoder
|
||||
from core.ops.entities.trace_entity import TraceTaskName
|
||||
from core.ops.ops_trace_manager import TraceQueueManager, TraceTask
|
||||
from core.tools.tool_manager import ToolManager
|
||||
from core.workflow.entities.node_entities import NodeRunMetadataKey
|
||||
from core.workflow.entities.node_execution_entities import (
|
||||
NodeExecution,
|
||||
NodeExecutionStatus,
|
||||
)
|
||||
from core.workflow.enums import SystemVariableKey
|
||||
from core.workflow.nodes import NodeType
|
||||
from core.workflow.nodes.tool.entities import ToolNodeData
|
||||
from core.workflow.repository.workflow_node_execution_repository import WorkflowNodeExecutionRepository
|
||||
from core.workflow.workflow_entry import WorkflowEntry
|
||||
from models.account import Account
|
||||
from models.enums import CreatedByRole, WorkflowRunTriggeredFrom
|
||||
from models.model import EndUser
|
||||
from models.workflow import (
|
||||
from models import (
|
||||
Account,
|
||||
CreatorUserRole,
|
||||
EndUser,
|
||||
Workflow,
|
||||
WorkflowNodeExecution,
|
||||
WorkflowNodeExecutionStatus,
|
||||
WorkflowNodeExecutionTriggeredFrom,
|
||||
WorkflowRun,
|
||||
WorkflowRunStatus,
|
||||
WorkflowRunTriggeredFrom,
|
||||
)
|
||||
|
||||
|
||||
@ -78,7 +80,6 @@ class WorkflowCycleManager:
|
||||
workflow_node_execution_repository: WorkflowNodeExecutionRepository,
|
||||
) -> None:
|
||||
self._workflow_run: WorkflowRun | None = None
|
||||
self._workflow_node_executions: dict[str, WorkflowNodeExecution] = {}
|
||||
self._application_generate_entity = application_generate_entity
|
||||
self._workflow_system_variables = workflow_system_variables
|
||||
self._workflow_node_execution_repository = workflow_node_execution_repository
|
||||
@ -89,7 +90,7 @@ class WorkflowCycleManager:
|
||||
session: Session,
|
||||
workflow_id: str,
|
||||
user_id: str,
|
||||
created_by_role: CreatedByRole,
|
||||
created_by_role: CreatorUserRole,
|
||||
) -> WorkflowRun:
|
||||
workflow_stmt = select(Workflow).where(Workflow.id == workflow_id)
|
||||
workflow = session.scalar(workflow_stmt)
|
||||
@ -258,21 +259,22 @@ class WorkflowCycleManager:
|
||||
workflow_run.exceptions_count = exceptions_count
|
||||
|
||||
# Use the instance repository to find running executions for a workflow run
|
||||
running_workflow_node_executions = self._workflow_node_execution_repository.get_running_executions(
|
||||
running_domain_executions = self._workflow_node_execution_repository.get_running_executions(
|
||||
workflow_run_id=workflow_run.id
|
||||
)
|
||||
|
||||
# Update the cache with the retrieved executions
|
||||
for execution in running_workflow_node_executions:
|
||||
if execution.node_execution_id:
|
||||
self._workflow_node_executions[execution.node_execution_id] = execution
|
||||
|
||||
for workflow_node_execution in running_workflow_node_executions:
|
||||
# Update the domain models
|
||||
now = datetime.now(UTC).replace(tzinfo=None)
|
||||
workflow_node_execution.status = WorkflowNodeExecutionStatus.FAILED.value
|
||||
workflow_node_execution.error = error
|
||||
workflow_node_execution.finished_at = now
|
||||
workflow_node_execution.elapsed_time = (now - workflow_node_execution.created_at).total_seconds()
|
||||
for domain_execution in running_domain_executions:
|
||||
if domain_execution.node_execution_id:
|
||||
# Update the domain model
|
||||
domain_execution.status = NodeExecutionStatus.FAILED
|
||||
domain_execution.error = error
|
||||
domain_execution.finished_at = now
|
||||
domain_execution.elapsed_time = (now - domain_execution.created_at).total_seconds()
|
||||
|
||||
# Update the repository with the domain model
|
||||
self._workflow_node_execution_repository.save(domain_execution)
|
||||
|
||||
if trace_manager:
|
||||
trace_manager.add_trace_task(
|
||||
@@ -286,63 +288,67 @@ class WorkflowCycleManager:

         return workflow_run

-    def _handle_node_execution_start(
-        self, *, workflow_run: WorkflowRun, event: QueueNodeStartedEvent
-    ) -> WorkflowNodeExecution:
-        workflow_node_execution = WorkflowNodeExecution()
-        workflow_node_execution.id = str(uuid4())
-        workflow_node_execution.tenant_id = workflow_run.tenant_id
-        workflow_node_execution.app_id = workflow_run.app_id
-        workflow_node_execution.workflow_id = workflow_run.workflow_id
-        workflow_node_execution.triggered_from = WorkflowNodeExecutionTriggeredFrom.WORKFLOW_RUN.value
-        workflow_node_execution.workflow_run_id = workflow_run.id
-        workflow_node_execution.predecessor_node_id = event.predecessor_node_id
-        workflow_node_execution.index = event.node_run_index
-        workflow_node_execution.node_execution_id = event.node_execution_id
-        workflow_node_execution.node_id = event.node_id
-        workflow_node_execution.node_type = event.node_type.value
-        workflow_node_execution.title = event.node_data.title
-        workflow_node_execution.status = WorkflowNodeExecutionStatus.RUNNING.value
-        workflow_node_execution.created_by_role = workflow_run.created_by_role
-        workflow_node_execution.created_by = workflow_run.created_by
-        workflow_node_execution.execution_metadata = json.dumps(
-            {
+    def _handle_node_execution_start(self, *, workflow_run: WorkflowRun, event: QueueNodeStartedEvent) -> NodeExecution:
+        # Create a domain model
+        created_at = datetime.now(UTC).replace(tzinfo=None)
+        metadata = {
             NodeRunMetadataKey.PARALLEL_MODE_RUN_ID: event.parallel_mode_run_id,
             NodeRunMetadataKey.ITERATION_ID: event.in_iteration_id,
             NodeRunMetadataKey.LOOP_ID: event.in_loop_id,
         }
+
+        domain_execution = NodeExecution(
+            id=str(uuid4()),
+            workflow_id=workflow_run.workflow_id,
+            workflow_run_id=workflow_run.id,
+            predecessor_node_id=event.predecessor_node_id,
+            index=event.node_run_index,
+            node_execution_id=event.node_execution_id,
+            node_id=event.node_id,
+            node_type=event.node_type,
+            title=event.node_data.title,
+            status=NodeExecutionStatus.RUNNING,
+            metadata=metadata,
+            created_at=created_at,
+        )
-        workflow_node_execution.created_at = datetime.now(UTC).replace(tzinfo=None)

-        # Use the instance repository to save the workflow node execution
-        self._workflow_node_execution_repository.save(workflow_node_execution)
+        # Use the instance repository to save the domain model
+        self._workflow_node_execution_repository.save(domain_execution)

-        self._workflow_node_executions[event.node_execution_id] = workflow_node_execution
-        return workflow_node_execution
+        return domain_execution
-    def _handle_workflow_node_execution_success(self, *, event: QueueNodeSucceededEvent) -> WorkflowNodeExecution:
-        workflow_node_execution = self._get_workflow_node_execution(node_execution_id=event.node_execution_id)
+    def _handle_workflow_node_execution_success(self, *, event: QueueNodeSucceededEvent) -> NodeExecution:
+        # Get the domain model from repository
+        domain_execution = self._workflow_node_execution_repository.get_by_node_execution_id(event.node_execution_id)
+        if not domain_execution:
+            raise ValueError(f"Domain node execution not found: {event.node_execution_id}")

+        # Process data
         inputs = WorkflowEntry.handle_special_values(event.inputs)
+        process_data = WorkflowEntry.handle_special_values(event.process_data)
         outputs = WorkflowEntry.handle_special_values(event.outputs)
-        execution_metadata_dict = dict(event.execution_metadata or {})
-        execution_metadata = json.dumps(jsonable_encoder(execution_metadata_dict)) if execution_metadata_dict else None
+
+        # Convert metadata keys to strings
+        execution_metadata_dict = {}
+        if event.execution_metadata:
+            for key, value in event.execution_metadata.items():
+                execution_metadata_dict[key] = value

         finished_at = datetime.now(UTC).replace(tzinfo=None)
         elapsed_time = (finished_at - event.start_at).total_seconds()

-        process_data = WorkflowEntry.handle_special_values(event.process_data)
+        # Update domain model
+        domain_execution.status = NodeExecutionStatus.SUCCEEDED
+        domain_execution.update_from_mapping(
+            inputs=inputs, process_data=process_data, outputs=outputs, metadata=execution_metadata_dict
+        )
+        domain_execution.finished_at = finished_at
+        domain_execution.elapsed_time = elapsed_time

-        workflow_node_execution.status = WorkflowNodeExecutionStatus.SUCCEEDED.value
-        workflow_node_execution.inputs = json.dumps(inputs) if inputs else None
-        workflow_node_execution.process_data = json.dumps(process_data) if process_data else None
-        workflow_node_execution.outputs = json.dumps(outputs) if outputs else None
-        workflow_node_execution.execution_metadata = execution_metadata
-        workflow_node_execution.finished_at = finished_at
-        workflow_node_execution.elapsed_time = elapsed_time
+        # Update the repository with the domain model
+        self._workflow_node_execution_repository.save(domain_execution)

-        # Use the instance repository to update the workflow node execution
-        self._workflow_node_execution_repository.update(workflow_node_execution)
-        return workflow_node_execution
+        return domain_execution
     def _handle_workflow_node_execution_failed(
         self,
@@ -351,43 +357,52 @@ class WorkflowCycleManager:
         | QueueNodeInIterationFailedEvent
         | QueueNodeInLoopFailedEvent
         | QueueNodeExceptionEvent,
-    ) -> WorkflowNodeExecution:
+    ) -> NodeExecution:
         """
         Workflow node execution failed
         :param event: queue node failed event
         :return:
         """
-        workflow_node_execution = self._get_workflow_node_execution(node_execution_id=event.node_execution_id)
+        # Get the domain model from repository
+        domain_execution = self._workflow_node_execution_repository.get_by_node_execution_id(event.node_execution_id)
+        if not domain_execution:
+            raise ValueError(f"Domain node execution not found: {event.node_execution_id}")

+        # Process data
         inputs = WorkflowEntry.handle_special_values(event.inputs)
+        process_data = WorkflowEntry.handle_special_values(event.process_data)
         outputs = WorkflowEntry.handle_special_values(event.outputs)

+        # Convert metadata keys to strings
+        execution_metadata_dict = {}
+        if event.execution_metadata:
+            for key, value in event.execution_metadata.items():
+                execution_metadata_dict[key] = value

         finished_at = datetime.now(UTC).replace(tzinfo=None)
         elapsed_time = (finished_at - event.start_at).total_seconds()
-        execution_metadata = (
-            json.dumps(jsonable_encoder(event.execution_metadata)) if event.execution_metadata else None
-        )
-        process_data = WorkflowEntry.handle_special_values(event.process_data)
-        workflow_node_execution.status = (
-            WorkflowNodeExecutionStatus.FAILED.value
+
+        # Update domain model
+        domain_execution.status = (
+            NodeExecutionStatus.FAILED
             if not isinstance(event, QueueNodeExceptionEvent)
-            else WorkflowNodeExecutionStatus.EXCEPTION.value
+            else NodeExecutionStatus.EXCEPTION
         )
-        workflow_node_execution.error = event.error
-        workflow_node_execution.inputs = json.dumps(inputs) if inputs else None
-        workflow_node_execution.process_data = json.dumps(process_data) if process_data else None
-        workflow_node_execution.outputs = json.dumps(outputs) if outputs else None
-        workflow_node_execution.finished_at = finished_at
-        workflow_node_execution.elapsed_time = elapsed_time
-        workflow_node_execution.execution_metadata = execution_metadata
+        domain_execution.error = event.error
+        domain_execution.update_from_mapping(
+            inputs=inputs, process_data=process_data, outputs=outputs, metadata=execution_metadata_dict
+        )
+        domain_execution.finished_at = finished_at
+        domain_execution.elapsed_time = elapsed_time

-        self._workflow_node_execution_repository.update(workflow_node_execution)
+        # Update the repository with the domain model
+        self._workflow_node_execution_repository.save(domain_execution)

-        return workflow_node_execution
+        return domain_execution
     def _handle_workflow_node_execution_retried(
         self, *, workflow_run: WorkflowRun, event: QueueNodeRetryEvent
-    ) -> WorkflowNodeExecution:
+    ) -> NodeExecution:
         """
         Workflow node execution failed
         :param workflow_run: workflow run
@@ -399,47 +414,47 @@ class WorkflowCycleManager:
         elapsed_time = (finished_at - created_at).total_seconds()
         inputs = WorkflowEntry.handle_special_values(event.inputs)
         outputs = WorkflowEntry.handle_special_values(event.outputs)

         # Convert metadata keys to strings
         origin_metadata = {
             NodeRunMetadataKey.ITERATION_ID: event.in_iteration_id,
             NodeRunMetadataKey.PARALLEL_MODE_RUN_ID: event.parallel_mode_run_id,
             NodeRunMetadataKey.LOOP_ID: event.in_loop_id,
         }
-        merged_metadata = (
-            {**jsonable_encoder(event.execution_metadata), **origin_metadata}
-            if event.execution_metadata is not None
-            else origin_metadata
-        )
+
+        # Convert execution metadata keys to strings
+        execution_metadata_dict: dict[NodeRunMetadataKey, str | None] = {}
+        if event.execution_metadata:
+            for key, value in event.execution_metadata.items():
+                execution_metadata_dict[key] = value
+
+        merged_metadata = {**execution_metadata_dict, **origin_metadata} if execution_metadata_dict else origin_metadata
+
+        # Create a domain model
+        domain_execution = NodeExecution(
+            id=str(uuid4()),
+            workflow_id=workflow_run.workflow_id,
+            workflow_run_id=workflow_run.id,
+            predecessor_node_id=event.predecessor_node_id,
+            node_execution_id=event.node_execution_id,
+            node_id=event.node_id,
+            node_type=event.node_type,
+            title=event.node_data.title,
+            status=NodeExecutionStatus.RETRY,
+            created_at=created_at,
+            finished_at=finished_at,
+            elapsed_time=elapsed_time,
+            error=event.error,
+            index=event.node_run_index,
+        )
-        execution_metadata = json.dumps(merged_metadata)
-
-        workflow_node_execution = WorkflowNodeExecution()
-        workflow_node_execution.id = str(uuid4())
-        workflow_node_execution.tenant_id = workflow_run.tenant_id
-        workflow_node_execution.app_id = workflow_run.app_id
-        workflow_node_execution.workflow_id = workflow_run.workflow_id
-        workflow_node_execution.triggered_from = WorkflowNodeExecutionTriggeredFrom.WORKFLOW_RUN.value
-        workflow_node_execution.workflow_run_id = workflow_run.id
-        workflow_node_execution.predecessor_node_id = event.predecessor_node_id
-        workflow_node_execution.node_execution_id = event.node_execution_id
-        workflow_node_execution.node_id = event.node_id
-        workflow_node_execution.node_type = event.node_type.value
-        workflow_node_execution.title = event.node_data.title
-        workflow_node_execution.status = WorkflowNodeExecutionStatus.RETRY.value
-        workflow_node_execution.created_by_role = workflow_run.created_by_role
-        workflow_node_execution.created_by = workflow_run.created_by
-        workflow_node_execution.created_at = created_at
-        workflow_node_execution.finished_at = finished_at
-        workflow_node_execution.elapsed_time = elapsed_time
-        workflow_node_execution.error = event.error
-        workflow_node_execution.inputs = json.dumps(inputs) if inputs else None
-        workflow_node_execution.outputs = json.dumps(outputs) if outputs else None
-        workflow_node_execution.execution_metadata = execution_metadata
-        workflow_node_execution.index = event.node_run_index
+        # Update with mappings
+        domain_execution.update_from_mapping(inputs=inputs, outputs=outputs, metadata=merged_metadata)

-        # Use the instance repository to save the workflow node execution
-        self._workflow_node_execution_repository.save(workflow_node_execution)
+        # Use the instance repository to save the domain model
+        self._workflow_node_execution_repository.save(domain_execution)

-        self._workflow_node_executions[event.node_execution_id] = workflow_node_execution
-        return workflow_node_execution
+        return domain_execution
     def _workflow_start_to_stream_response(
         self,
@@ -469,7 +484,7 @@ class WorkflowCycleManager:
         workflow_run: WorkflowRun,
     ) -> WorkflowFinishStreamResponse:
         created_by = None
-        if workflow_run.created_by_role == CreatedByRole.ACCOUNT:
+        if workflow_run.created_by_role == CreatorUserRole.ACCOUNT:
             stmt = select(Account).where(Account.id == workflow_run.created_by)
             account = session.scalar(stmt)
             if account:
@@ -478,7 +493,7 @@ class WorkflowCycleManager:
                     "name": account.name,
                     "email": account.email,
                 }
-        elif workflow_run.created_by_role == CreatedByRole.END_USER:
+        elif workflow_run.created_by_role == CreatorUserRole.END_USER:
             stmt = select(EndUser).where(EndUser.id == workflow_run.created_by)
             end_user = session.scalar(stmt)
             if end_user:
@@ -515,9 +530,9 @@ class WorkflowCycleManager:
         *,
         event: QueueNodeStartedEvent,
         task_id: str,
-        workflow_node_execution: WorkflowNodeExecution,
+        workflow_node_execution: NodeExecution,
     ) -> Optional[NodeStartStreamResponse]:
-        if workflow_node_execution.node_type in {NodeType.ITERATION.value, NodeType.LOOP.value}:
+        if workflow_node_execution.node_type in {NodeType.ITERATION, NodeType.LOOP}:
             return None
         if not workflow_node_execution.workflow_run_id:
             return None
@@ -532,7 +547,7 @@ class WorkflowCycleManager:
                 title=workflow_node_execution.title,
                 index=workflow_node_execution.index,
                 predecessor_node_id=workflow_node_execution.predecessor_node_id,
-                inputs=workflow_node_execution.inputs_dict,
+                inputs=workflow_node_execution.inputs,
                 created_at=int(workflow_node_execution.created_at.timestamp()),
                 parallel_id=event.parallel_id,
                 parallel_start_node_id=event.parallel_start_node_id,
@@ -565,9 +580,9 @@ class WorkflowCycleManager:
         | QueueNodeInLoopFailedEvent
         | QueueNodeExceptionEvent,
         task_id: str,
-        workflow_node_execution: WorkflowNodeExecution,
+        workflow_node_execution: NodeExecution,
     ) -> Optional[NodeFinishStreamResponse]:
-        if workflow_node_execution.node_type in {NodeType.ITERATION.value, NodeType.LOOP.value}:
+        if workflow_node_execution.node_type in {NodeType.ITERATION, NodeType.LOOP}:
             return None
         if not workflow_node_execution.workflow_run_id:
             return None
@@ -584,16 +599,16 @@ class WorkflowCycleManager:
                 index=workflow_node_execution.index,
                 title=workflow_node_execution.title,
                 predecessor_node_id=workflow_node_execution.predecessor_node_id,
-                inputs=workflow_node_execution.inputs_dict,
-                process_data=workflow_node_execution.process_data_dict,
-                outputs=workflow_node_execution.outputs_dict,
+                inputs=workflow_node_execution.inputs,
+                process_data=workflow_node_execution.process_data,
+                outputs=workflow_node_execution.outputs,
                 status=workflow_node_execution.status,
                 error=workflow_node_execution.error,
                 elapsed_time=workflow_node_execution.elapsed_time,
-                execution_metadata=workflow_node_execution.execution_metadata_dict,
+                execution_metadata=workflow_node_execution.metadata,
                 created_at=int(workflow_node_execution.created_at.timestamp()),
                 finished_at=int(workflow_node_execution.finished_at.timestamp()),
-                files=self._fetch_files_from_node_outputs(workflow_node_execution.outputs_dict or {}),
+                files=self._fetch_files_from_node_outputs(workflow_node_execution.outputs or {}),
                 parallel_id=event.parallel_id,
                 parallel_start_node_id=event.parallel_start_node_id,
                 parent_parallel_id=event.parent_parallel_id,
@@ -608,9 +623,9 @@ class WorkflowCycleManager:
         *,
         event: QueueNodeRetryEvent,
         task_id: str,
-        workflow_node_execution: WorkflowNodeExecution,
+        workflow_node_execution: NodeExecution,
     ) -> Optional[Union[NodeRetryStreamResponse, NodeFinishStreamResponse]]:
-        if workflow_node_execution.node_type in {NodeType.ITERATION.value, NodeType.LOOP.value}:
+        if workflow_node_execution.node_type in {NodeType.ITERATION, NodeType.LOOP}:
             return None
         if not workflow_node_execution.workflow_run_id:
             return None
@@ -627,16 +642,16 @@ class WorkflowCycleManager:
                 index=workflow_node_execution.index,
                 title=workflow_node_execution.title,
                 predecessor_node_id=workflow_node_execution.predecessor_node_id,
-                inputs=workflow_node_execution.inputs_dict,
-                process_data=workflow_node_execution.process_data_dict,
-                outputs=workflow_node_execution.outputs_dict,
+                inputs=workflow_node_execution.inputs,
+                process_data=workflow_node_execution.process_data,
+                outputs=workflow_node_execution.outputs,
                 status=workflow_node_execution.status,
                 error=workflow_node_execution.error,
                 elapsed_time=workflow_node_execution.elapsed_time,
-                execution_metadata=workflow_node_execution.execution_metadata_dict,
+                execution_metadata=workflow_node_execution.metadata,
                 created_at=int(workflow_node_execution.created_at.timestamp()),
                 finished_at=int(workflow_node_execution.finished_at.timestamp()),
-                files=self._fetch_files_from_node_outputs(workflow_node_execution.outputs_dict or {}),
+                files=self._fetch_files_from_node_outputs(workflow_node_execution.outputs or {}),
                 parallel_id=event.parallel_id,
                 parallel_start_node_id=event.parallel_start_node_id,
                 parent_parallel_id=event.parent_parallel_id,
@@ -908,23 +923,6 @@ class WorkflowCycleManager:

         return workflow_run

-    def _get_workflow_node_execution(self, node_execution_id: str) -> WorkflowNodeExecution:
-        # First check the cache for performance
-        if node_execution_id in self._workflow_node_executions:
-            cached_execution = self._workflow_node_executions[node_execution_id]
-            # No need to merge with session since expire_on_commit=False
-            return cached_execution
-
-        # If not in cache, use the instance repository to get by node_execution_id
-        execution = self._workflow_node_execution_repository.get_by_node_execution_id(node_execution_id)
-
-        if not execution:
-            raise ValueError(f"Workflow node execution not found: {node_execution_id}")
-
-        # Update cache
-        self._workflow_node_executions[node_execution_id] = execution
-        return execution
-
     def _handle_agent_log(self, task_id: str, event: QueueAgentLogEvent) -> AgentLogStreamResponse:
         """
         Handle agent log
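The refactor above retires the manager's private `_workflow_node_executions` dict in favor of the repository as the single source of truth. A minimal sketch of that save/lookup contract — `FakeNodeExecution` and `InMemoryNodeExecutionRepository` are illustrative stand-ins, not Dify's real types:

from dataclasses import dataclass


@dataclass
class FakeNodeExecution:
    node_execution_id: str
    status: str = "running"
    error: str | None = None


class InMemoryNodeExecutionRepository:
    """Keeps executions keyed by node_execution_id, mirroring save()/get_by_node_execution_id()."""

    def __init__(self) -> None:
        self._store: dict[str, FakeNodeExecution] = {}

    def save(self, execution: FakeNodeExecution) -> None:
        # save() behaves as an upsert: callers mutate the domain model and save it again.
        self._store[execution.node_execution_id] = execution

    def get_by_node_execution_id(self, node_execution_id: str) -> FakeNodeExecution | None:
        return self._store.get(node_execution_id)


repo = InMemoryNodeExecutionRepository()
repo.save(FakeNodeExecution(node_execution_id="n1"))
execution = repo.get_by_node_execution_id("n1")
assert execution is not None
execution.status = "succeeded"
repo.save(execution)  # one write path replaces the old dict cache plus update() split

Because `save()` is an upsert, the success/failed/retried handlers above can all persist status transitions through the same call instead of juggling a cache and a separate `update()`.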
api/extensions/ext_request_logging.py (new file, 73 lines)
@@ -0,0 +1,73 @@
import json
import logging

import flask
import werkzeug.http
from flask import Flask
from flask.signals import request_finished, request_started

from configs import dify_config

_logger = logging.getLogger(__name__)


def _is_content_type_json(content_type: str) -> bool:
    if not content_type:
        return False
    content_type_no_option, _ = werkzeug.http.parse_options_header(content_type)
    return content_type_no_option.lower() == "application/json"


def _log_request_started(_sender, **_extra):
    """Log the start of a request."""
    if not _logger.isEnabledFor(logging.DEBUG):
        return

    request = flask.request
    if not (_is_content_type_json(request.content_type) and request.data):
        _logger.debug("Received Request %s -> %s", request.method, request.path)
        return
    try:
        json_data = json.loads(request.data)
    except (TypeError, ValueError):
        _logger.exception("Failed to parse JSON request")
        return
    formatted_json = json.dumps(json_data, ensure_ascii=False, indent=2)
    _logger.debug(
        "Received Request %s -> %s, Request Body:\n%s",
        request.method,
        request.path,
        formatted_json,
    )


def _log_request_finished(_sender, response, **_extra):
    """Log the end of a request."""
    if not _logger.isEnabledFor(logging.DEBUG) or response is None:
        return

    if not _is_content_type_json(response.content_type):
        _logger.debug("Response %s %s", response.status, response.content_type)
        return

    response_data = response.get_data(as_text=True)
    try:
        json_data = json.loads(response_data)
    except (TypeError, ValueError):
        _logger.exception("Failed to parse JSON response")
        return
    formatted_json = json.dumps(json_data, ensure_ascii=False, indent=2)
    _logger.debug(
        "Response %s %s, Response Body:\n%s",
        response.status,
        response.content_type,
        formatted_json,
    )


def init_app(app: Flask):
    """Initialize the request logging extension."""
    if not dify_config.ENABLE_REQUEST_LOGGING:
        return
    request_started.connect(_log_request_started, app)
    request_finished.connect(_log_request_finished, app)
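The extension hooks Flask's `request_started`/`request_finished` signals rather than `before_request`/`after_request` handlers, so it observes every request without being able to alter it. A self-contained sketch of the same pattern, independent of Dify's `configs` module:

import logging

from flask import Flask, jsonify, request
from flask.signals import request_finished, request_started

logging.basicConfig(level=logging.DEBUG)
_logger = logging.getLogger("request_logging_demo")

app = Flask(__name__)


def on_started(_sender, **_extra):
    # Runs once the request context is pushed, before any view executes.
    _logger.debug("Received Request %s -> %s", request.method, request.path)


def on_finished(_sender, response, **_extra):
    # Runs after the view returned; the response can only be inspected here.
    _logger.debug("Response %s %s", response.status, response.content_type)


request_started.connect(on_started, app)
request_finished.connect(on_finished, app)


@app.route("/ping")
def ping():
    return jsonify({"pong": True})


if __name__ == "__main__":
    app.run(port=5005)

In Dify itself the hook stays inert unless `ENABLE_REQUEST_LOGGING` is set and the logger is at DEBUG level, which keeps the JSON pretty-printing cost out of production paths.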
@@ -63,6 +63,7 @@ app_detail_fields = {
     "created_at": TimestampField,
     "updated_by": fields.String,
     "updated_at": TimestampField,
+    "access_mode": fields.String,
 }

 prompt_config_fields = {
@@ -98,6 +99,7 @@ app_partial_fields = {
     "updated_by": fields.String,
     "updated_at": TimestampField,
     "tags": fields.List(fields.Nested(tag_fields)),
+    "access_mode": fields.String,
 }


@@ -176,6 +178,7 @@ app_detail_fields_with_site = {
     "updated_by": fields.String,
     "updated_at": TimestampField,
     "deleted_tools": fields.List(fields.Nested(deleted_tool_fields)),
+    "access_mode": fields.String,
 }
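The new `access_mode` key is marshalled like any other string field. A minimal sketch of how these flask-restful field maps behave — `demo_fields` is a made-up map in the same style, not one of Dify's:

from flask_restful import fields, marshal

demo_fields = {
    "name": fields.String,
    "access_mode": fields.String,
}

record = {"name": "my-app", "access_mode": "private", "internal_only": True}
# marshal() keeps only the declared keys, so "internal_only" is dropped.
print(marshal(record, demo_fields))  # {'name': 'my-app', 'access_mode': 'private'}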
@@ -0,0 +1,51 @@
"""add WorkflowDraftVariable model

Revision ID: 2adcbe1f5dfb
Revises: d28f2004b072
Create Date: 2025-05-15 15:31:03.128680

"""

import sqlalchemy as sa
from alembic import op

import models as models

# revision identifiers, used by Alembic.
revision = "2adcbe1f5dfb"
down_revision = "d28f2004b072"
branch_labels = None
depends_on = None


def upgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.create_table(
        "workflow_draft_variables",
        sa.Column("id", models.types.StringUUID(), server_default=sa.text("uuid_generate_v4()"), nullable=False),
        sa.Column("created_at", sa.DateTime(), server_default=sa.text("CURRENT_TIMESTAMP"), nullable=False),
        sa.Column("updated_at", sa.DateTime(), server_default=sa.text("CURRENT_TIMESTAMP"), nullable=False),
        sa.Column("app_id", models.types.StringUUID(), nullable=False),
        sa.Column("last_edited_at", sa.DateTime(), nullable=True),
        sa.Column("node_id", sa.String(length=255), nullable=False),
        sa.Column("name", sa.String(length=255), nullable=False),
        sa.Column("description", sa.String(length=255), nullable=False),
        sa.Column("selector", sa.String(length=255), nullable=False),
        sa.Column("value_type", sa.String(length=20), nullable=False),
        sa.Column("value", sa.Text(), nullable=False),
        sa.Column("visible", sa.Boolean(), nullable=False),
        sa.Column("editable", sa.Boolean(), nullable=False),
        sa.PrimaryKeyConstraint("id", name=op.f("workflow_draft_variables_pkey")),
        sa.UniqueConstraint("app_id", "node_id", "name", name=op.f("workflow_draft_variables_app_id_key")),
    )
    # ### end Alembic commands ###


def downgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    # Dropping `workflow_draft_variables` also drops any index associated with it.
    op.drop_table("workflow_draft_variables")
    # ### end Alembic commands ###
@@ -27,7 +27,7 @@ from .dataset import (
     Whitelist,
 )
 from .engine import db
-from .enums import CreatedByRole, UserFrom, WorkflowRunTriggeredFrom
+from .enums import CreatorUserRole, UserFrom, WorkflowRunTriggeredFrom
 from .model import (
     ApiRequest,
     ApiToken,
@@ -112,7 +112,7 @@ __all__ = [
     "CeleryTaskSet",
     "Conversation",
     "ConversationVariable",
-    "CreatedByRole",
+    "CreatorUserRole",
     "DataSourceApiKeyAuthBinding",
     "DataSourceOauthBinding",
     "Dataset",
@@ -1,7 +1,7 @@
 from enum import StrEnum


-class CreatedByRole(StrEnum):
+class CreatorUserRole(StrEnum):
     ACCOUNT = "account"
     END_USER = "end_user"

@@ -14,3 +14,10 @@ class UserFrom(StrEnum):
 class WorkflowRunTriggeredFrom(StrEnum):
     DEBUGGING = "debugging"
     APP_RUN = "app-run"
+
+
+class DraftVariableType(StrEnum):
+    # "node" means the corresponding variable belongs to a specific workflow node.
+    NODE = "node"
+    SYS = "sys"
+    CONVERSATION = "conversation"
@@ -29,7 +29,7 @@ from libs.helper import generate_string
 from .account import Account, Tenant
 from .base import Base
 from .engine import db
-from .enums import CreatedByRole
+from .enums import CreatorUserRole
 from .types import StringUUID
 from .workflow import WorkflowRunStatus

@@ -1270,7 +1270,7 @@ class MessageFile(Base):
         url: str | None = None,
         belongs_to: Literal["user", "assistant"] | None = None,
         upload_file_id: str | None = None,
-        created_by_role: CreatedByRole,
+        created_by_role: CreatorUserRole,
         created_by: str,
     ):
         self.message_id = message_id
@@ -1417,7 +1417,7 @@ class EndUser(Base, UserMixin):
     )

     id = db.Column(StringUUID, server_default=db.text("uuid_generate_v4()"))
-    tenant_id = db.Column(StringUUID, nullable=False)
+    tenant_id: Mapped[str] = db.Column(StringUUID, nullable=False)
     app_id = db.Column(StringUUID, nullable=True)
     type = db.Column(db.String(255), nullable=False)
     external_user_id = db.Column(db.String(255), nullable=True)
@@ -1547,7 +1547,7 @@ class UploadFile(Base):
         size: int,
         extension: str,
         mime_type: str,
-        created_by_role: CreatedByRole,
+        created_by_role: CreatorUserRole,
         created_by: str,
         created_at: datetime,
         used: bool,
@@ -1,4 +1,7 @@
-from sqlalchemy import CHAR, TypeDecorator
+import enum
+from typing import Generic, TypeVar
+
+from sqlalchemy import CHAR, VARCHAR, TypeDecorator
 from sqlalchemy.dialects.postgresql import UUID


@@ -24,3 +27,51 @@ class StringUUID(TypeDecorator):
         if value is None:
             return value
         return str(value)
+
+
+_E = TypeVar("_E", bound=enum.StrEnum)
+
+
+class EnumText(TypeDecorator, Generic[_E]):
+    impl = VARCHAR
+    cache_ok = True
+
+    _length: int
+    _enum_class: type[_E]
+
+    def __init__(self, enum_class: type[_E], length: int | None = None):
+        self._enum_class = enum_class
+        max_enum_value_len = max(len(e.value) for e in enum_class)
+        if length is not None:
+            if length < max_enum_value_len:
+                raise ValueError("length should be greater than enum value length.")
+            self._length = length
+        else:
+            # Leave some room for future, longer enum values.
+            self._length = max(max_enum_value_len, 20)
+
+    def process_bind_param(self, value: _E | str | None, dialect):
+        if value is None:
+            return value
+        if isinstance(value, self._enum_class):
+            return value.value
+        elif isinstance(value, str):
+            self._enum_class(value)  # validate the raw string; raises ValueError if it is not a member
+            return value
+        else:
+            raise TypeError(f"expected str or {self._enum_class}, got {type(value)}")
+
+    def load_dialect_impl(self, dialect):
+        return dialect.type_descriptor(VARCHAR(self._length))
+
+    def process_result_value(self, value, dialect) -> _E | None:
+        if value is None:
+            return value
+        if not isinstance(value, str):
+            raise TypeError(f"expected str, got {type(value)}")
+        return self._enum_class(value)
+
+    def compare_values(self, x, y):
+        if x is None or y is None:
+            return x is y
+        return x == y
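A short round-trip sketch of how `EnumText` behaves at the boundary, assuming it is importable as `models.types.EnumText` from within the api project (in-memory SQLite engine and a made-up `Color` enum):

import enum

from sqlalchemy import Column, Integer, create_engine, select
from sqlalchemy.orm import DeclarativeBase, Session

from models.types import EnumText  # the type added in this diff


class Color(enum.StrEnum):
    RED = "red"
    BLUE = "blue"


class Base(DeclarativeBase):
    pass


class Widget(Base):
    __tablename__ = "widgets"
    id = Column(Integer, primary_key=True)
    color = Column(EnumText(Color))


engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add(Widget(id=1, color=Color.RED))
    session.add(Widget(id=2, color="blue"))  # plain strings are validated on bind
    session.commit()
    loaded = session.scalars(select(Widget).order_by(Widget.id)).all()
    assert loaded[0].color is Color.RED  # values come back as enum members
    # Widget(color="purple") would raise ValueError when flushed.

Storing the enum as plain VARCHAR (rather than a native database enum) keeps migrations cheap when new members are added, which is presumably why `__init__` reserves extra column width by default.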
@@ -1,29 +1,36 @@
 import json
 import logging
 from collections.abc import Mapping, Sequence
 from datetime import UTC, datetime
 from enum import Enum, StrEnum
 from typing import TYPE_CHECKING, Any, List, Optional, Self, Union
 from uuid import uuid4

+from core.variables import utils as variable_utils
+from core.workflow.constants import CONVERSATION_VARIABLE_NODE_ID, SYSTEM_VARIABLE_NODE_ID
+from factories.variable_factory import build_segment
+
-if TYPE_CHECKING:
-    from models.model import AppMode
-
 import sqlalchemy as sa
-from sqlalchemy import func
+from sqlalchemy import UniqueConstraint, func
 from sqlalchemy.orm import Mapped, mapped_column

 import contexts
 from constants import DEFAULT_FILE_NUMBER_LIMITS, HIDDEN_VALUE
 from core.helper import encrypter
-from core.variables import SecretVariable, Variable
+from core.variables import SecretVariable, Segment, SegmentType, Variable
 from factories import variable_factory
 from libs import helper

 from .account import Account
 from .base import Base
 from .engine import db
-from .enums import CreatedByRole
-from .types import StringUUID
+from .enums import CreatorUserRole, DraftVariableType
+from .types import EnumText, StringUUID
+
+_logger = logging.getLogger(__name__)
+
+if TYPE_CHECKING:
+    from models.model import AppMode
@@ -451,15 +458,15 @@ class WorkflowRun(Base):

     @property
     def created_by_account(self):
-        created_by_role = CreatedByRole(self.created_by_role)
-        return db.session.get(Account, self.created_by) if created_by_role == CreatedByRole.ACCOUNT else None
+        created_by_role = CreatorUserRole(self.created_by_role)
+        return db.session.get(Account, self.created_by) if created_by_role == CreatorUserRole.ACCOUNT else None

     @property
     def created_by_end_user(self):
         from models.model import EndUser

-        created_by_role = CreatedByRole(self.created_by_role)
-        return db.session.get(EndUser, self.created_by) if created_by_role == CreatedByRole.END_USER else None
+        created_by_role = CreatorUserRole(self.created_by_role)
+        return db.session.get(EndUser, self.created_by) if created_by_role == CreatorUserRole.END_USER else None

     @property
     def graph_dict(self):
@@ -656,24 +663,24 @@ class WorkflowNodeExecution(Base):

     @property
     def created_by_account(self):
-        created_by_role = CreatedByRole(self.created_by_role)
+        created_by_role = CreatorUserRole(self.created_by_role)
         # TODO(-LAN-): Avoid using db.session.get() here.
-        return db.session.get(Account, self.created_by) if created_by_role == CreatedByRole.ACCOUNT else None
+        return db.session.get(Account, self.created_by) if created_by_role == CreatorUserRole.ACCOUNT else None

     @property
     def created_by_end_user(self):
         from models.model import EndUser

-        created_by_role = CreatedByRole(self.created_by_role)
+        created_by_role = CreatorUserRole(self.created_by_role)
         # TODO(-LAN-): Avoid using db.session.get() here.
-        return db.session.get(EndUser, self.created_by) if created_by_role == CreatedByRole.END_USER else None
+        return db.session.get(EndUser, self.created_by) if created_by_role == CreatorUserRole.END_USER else None

     @property
     def inputs_dict(self):
         return json.loads(self.inputs) if self.inputs else None

     @property
-    def outputs_dict(self):
+    def outputs_dict(self) -> dict[str, Any] | None:
         return json.loads(self.outputs) if self.outputs else None

     @property
@@ -681,7 +688,7 @@ class WorkflowNodeExecution(Base):
         return json.loads(self.process_data) if self.process_data else None

     @property
-    def execution_metadata_dict(self):
+    def execution_metadata_dict(self) -> dict[str, Any] | None:
         return json.loads(self.execution_metadata) if self.execution_metadata else None

     @property
@@ -777,15 +784,15 @@ class WorkflowAppLog(Base):

     @property
     def created_by_account(self):
-        created_by_role = CreatedByRole(self.created_by_role)
-        return db.session.get(Account, self.created_by) if created_by_role == CreatedByRole.ACCOUNT else None
+        created_by_role = CreatorUserRole(self.created_by_role)
+        return db.session.get(Account, self.created_by) if created_by_role == CreatorUserRole.ACCOUNT else None

     @property
     def created_by_end_user(self):
         from models.model import EndUser

-        created_by_role = CreatedByRole(self.created_by_role)
-        return db.session.get(EndUser, self.created_by) if created_by_role == CreatedByRole.END_USER else None
+        created_by_role = CreatorUserRole(self.created_by_role)
+        return db.session.get(EndUser, self.created_by) if created_by_role == CreatorUserRole.END_USER else None


 class ConversationVariable(Base):
@@ -819,3 +826,201 @@ class ConversationVariable(Base):
     def to_variable(self) -> Variable:
         mapping = json.loads(self.data)
         return variable_factory.build_conversation_variable_from_mapping(mapping)
+
+
+# Only `sys.query` and `sys.files` may be modified.
+_EDITABLE_SYSTEM_VARIABLE = frozenset(["query", "files"])
+
+
+def _naive_utc_datetime():
+    return datetime.now(UTC).replace(tzinfo=None)
+
+
+class WorkflowDraftVariable(Base):
+    @staticmethod
+    def unique_columns() -> list[str]:
+        return [
+            "app_id",
+            "node_id",
+            "name",
+        ]
+
+    __tablename__ = "workflow_draft_variables"
+    __table_args__ = (UniqueConstraint(*unique_columns()),)
+
+    # id is the unique identifier of a draft variable.
+    id: Mapped[str] = mapped_column(StringUUID, primary_key=True, server_default=db.text("uuid_generate_v4()"))
+
+    created_at = mapped_column(
+        db.DateTime,
+        nullable=False,
+        default=_naive_utc_datetime,
+        server_default=func.current_timestamp(),
+    )
+
+    updated_at = mapped_column(
+        db.DateTime,
+        nullable=False,
+        default=_naive_utc_datetime,
+        server_default=func.current_timestamp(),
+        onupdate=func.current_timestamp(),
+    )
+
+    # `app_id` maps to the `id` field in the `model.App` model.
+    app_id: Mapped[str] = mapped_column(StringUUID, nullable=False)
+
+    # `last_edited_at` records when the value of a given draft variable
+    # was last edited.
+    #
+    # If it has not been edited after creation, its value is `None`.
+    last_edited_at: Mapped[datetime | None] = mapped_column(
+        db.DateTime,
+        nullable=True,
+        default=None,
+    )
+
+    # The `node_id` field is special.
+    #
+    # If the variable is a conversation variable or a system variable, the value of `node_id`
+    # is `conversation` or `sys`, respectively.
+    #
+    # Otherwise, if the variable belongs to a specific node, the value of `node_id` is
+    # the identity of the corresponding node in the graph definition. An example node id is `"1745769620734"`.
+    #
+    # However, there is one caveat. The id of the first "Answer" node in a chatflow is "answer". (Other
+    # "Answer" nodes conform to the rule above.)
+    node_id: Mapped[str] = mapped_column(sa.String(255), nullable=False, name="node_id")
+
+    # From `VARIABLE_PATTERN`, we may conclude that the length of a top-level variable name is less than
+    # 80 chars.
+    #
+    # ref: api/core/workflow/entities/variable_pool.py:18
+    name: Mapped[str] = mapped_column(sa.String(255), nullable=False)
+    description: Mapped[str] = mapped_column(
+        sa.String(255),
+        default="",
+        nullable=False,
+    )
+
+    selector: Mapped[str] = mapped_column(sa.String(255), nullable=False, name="selector")
+
+    value_type: Mapped[SegmentType] = mapped_column(EnumText(SegmentType, length=20))
+    # The value itself is stored as a JSON string.
+    value: Mapped[str] = mapped_column(sa.Text, nullable=False, name="value")
+
+    visible: Mapped[bool] = mapped_column(sa.Boolean, nullable=False, default=True)
+    editable: Mapped[bool] = mapped_column(sa.Boolean, nullable=False, default=False)
+
+    def get_selector(self) -> list[str]:
+        selector = json.loads(self.selector)
+        if not isinstance(selector, list):
+            _logger.error(
+                "invalid selector loaded from database, type=%s, value=%s",
+                type(selector),
+                self.selector,
+            )
+            raise ValueError("invalid selector.")
+        return selector
+
+    def _set_selector(self, value: list[str]):
+        self.selector = json.dumps(value)
+
+    def get_value(self) -> Segment | None:
+        return build_segment(json.loads(self.value))
+
+    def set_name(self, name: str):
+        self.name = name
+        self._set_selector([self.node_id, name])
+
+    def set_value(self, value: Segment):
+        self.value = json.dumps(value.value)
+        self.value_type = value.value_type
+
+    def get_node_id(self) -> str | None:
+        if self.get_variable_type() == DraftVariableType.NODE:
+            return self.node_id
+        else:
+            return None
+
+    def get_variable_type(self) -> DraftVariableType:
+        match self.node_id:
+            case DraftVariableType.CONVERSATION:
+                return DraftVariableType.CONVERSATION
+            case DraftVariableType.SYS:
+                return DraftVariableType.SYS
+            case _:
+                return DraftVariableType.NODE
+
+    @classmethod
+    def _new(
+        cls,
+        *,
+        app_id: str,
+        node_id: str,
+        name: str,
+        value: Segment,
+        description: str = "",
+    ) -> "WorkflowDraftVariable":
+        variable = WorkflowDraftVariable()
+        variable.created_at = _naive_utc_datetime()
+        variable.updated_at = _naive_utc_datetime()
+        variable.description = description
+        variable.app_id = app_id
+        variable.node_id = node_id
+        variable.name = name
+        variable.set_value(value)
+        variable._set_selector(list(variable_utils.to_selector(node_id, name)))
+        return variable
+
+    @classmethod
+    def new_conversation_variable(
+        cls,
+        *,
+        app_id: str,
+        name: str,
+        value: Segment,
+    ) -> "WorkflowDraftVariable":
+        variable = cls._new(
+            app_id=app_id,
+            node_id=CONVERSATION_VARIABLE_NODE_ID,
+            name=name,
+            value=value,
+        )
+        return variable
+
+    @classmethod
+    def new_sys_variable(
+        cls,
+        *,
+        app_id: str,
+        name: str,
+        value: Segment,
+        editable: bool = False,
+    ) -> "WorkflowDraftVariable":
+        variable = cls._new(app_id=app_id, node_id=SYSTEM_VARIABLE_NODE_ID, name=name, value=value)
+        variable.editable = editable
+        return variable
+
+    @classmethod
+    def new_node_variable(
+        cls,
+        *,
+        app_id: str,
+        node_id: str,
+        name: str,
+        value: Segment,
+        visible: bool = True,
+    ) -> "WorkflowDraftVariable":
+        variable = cls._new(app_id=app_id, node_id=node_id, name=name, value=value)
+        variable.visible = visible
+        variable.editable = True
+        return variable
+
+    @property
+    def edited(self):
+        return self.last_edited_at is not None
+
+
+def is_system_variable_editable(name: str) -> bool:
+    return name in _EDITABLE_SYSTEM_VARIABLE
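A quick sketch of how the three constructors partition the draft-variable space; `build_segment` wraps a plain Python value in a `Segment`, and the ids/values here are made up for illustration:

from factories.variable_factory import build_segment
from models.enums import DraftVariableType
from models.workflow import WorkflowDraftVariable, is_system_variable_editable

# A node-scoped variable: node_id is the graph node's identity, editable by default.
node_var = WorkflowDraftVariable.new_node_variable(
    app_id="app-1", node_id="1745769620734", name="result", value=build_segment("hello")
)
assert node_var.get_variable_type() == DraftVariableType.NODE
assert node_var.editable

# A system variable: node_id is fixed to "sys", and only sys.query / sys.files are editable.
sys_var = WorkflowDraftVariable.new_sys_variable(
    app_id="app-1", name="query", value=build_segment("hi"), editable=is_system_variable_editable("query")
)
assert sys_var.get_variable_type() == DraftVariableType.SYS
assert sys_var.get_node_id() is None  # only NODE-typed variables report a node id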
@@ -72,7 +72,7 @@ dependencies = [
     "python-dotenv==1.0.1",
     "pyyaml~=6.0.1",
     "readabilipy~=0.3.0",
-    "redis[hiredis]~=6.0.0",
+    "redis[hiredis]~=6.1.0",
     "resend~=2.9.0",
     "sentry-sdk[flask]~=2.28.0",
     "sqlalchemy~=2.0.29",
@@ -49,7 +49,7 @@ from services.errors.account import (
     RoleAlreadyAssignedError,
     TenantNotFoundError,
 )
-from services.errors.workspace import WorkSpaceNotAllowedCreateError
+from services.errors.workspace import WorkSpaceNotAllowedCreateError, WorkspacesLimitExceededError
 from services.feature_service import FeatureService
 from tasks.delete_account_task import delete_account_task
 from tasks.mail_account_deletion_task import send_account_deletion_verification_code
@@ -628,6 +628,10 @@ class TenantService:
         if not FeatureService.get_system_features().is_allow_create_workspace and not is_setup:
             raise WorkSpaceNotAllowedCreateError()

+        workspaces = FeatureService.get_system_features().license.workspaces
+        if not workspaces.is_available():
+            raise WorkspacesLimitExceededError()
+
         if name:
             tenant = TenantService.create_tenant(name=name, is_setup=is_setup)
         else:
@@ -928,7 +932,11 @@ class RegisterService:
         if open_id is not None and provider is not None:
             AccountService.link_account_integrate(provider, open_id, account)

-        if FeatureService.get_system_features().is_allow_create_workspace and create_workspace_required:
+        if (
+            FeatureService.get_system_features().is_allow_create_workspace
+            and create_workspace_required
+            and FeatureService.get_system_features().license.workspaces.is_available()
+        ):
             tenant = TenantService.create_tenant(f"{account.name}'s Workspace")
             TenantService.create_tenant_member(tenant, account, role="owner")
             account.current_tenant = tenant
@@ -40,7 +40,7 @@ IMPORT_INFO_REDIS_KEY_PREFIX = "app_import_info:"
 CHECK_DEPENDENCIES_REDIS_KEY_PREFIX = "app_check_dependencies:"
 IMPORT_INFO_REDIS_EXPIRY = 10 * 60  # 10 minutes
 DSL_MAX_SIZE = 10 * 1024 * 1024  # 10MB
-CURRENT_DSL_VERSION = "0.2.0"
+CURRENT_DSL_VERSION = "0.3.0"


 class ImportMode(StrEnum):
@@ -18,8 +18,10 @@ from core.tools.utils.configuration import ToolParameterConfigurationManager
 from events.app_event import app_was_created
 from extensions.ext_database import db
 from models.account import Account
-from models.model import App, AppMode, AppModelConfig
+from models.model import App, AppMode, AppModelConfig, Site
 from models.tools import ApiToolProvider
+from services.enterprise.enterprise_service import EnterpriseService
+from services.feature_service import FeatureService
 from services.tag_service import TagService
 from tasks.remove_app_and_related_data_task import remove_app_and_related_data_task

@@ -155,6 +157,10 @@ class AppService:

         app_was_created.send(app, account=account)

+        if FeatureService.get_system_features().webapp_auth.enabled:
+            # update web app setting as private
+            EnterpriseService.WebAppAuth.update_app_access_mode(app.id, "private")
+
         return app

     def get_app(self, app: App) -> App:
@@ -307,6 +313,10 @@ class AppService:
         db.session.delete(app)
         db.session.commit()

+        # clean up web app settings
+        if FeatureService.get_system_features().webapp_auth.enabled:
+            EnterpriseService.WebAppAuth.cleanup_webapp(app.id)
+
         # Trigger asynchronous deletion of app and related data
         remove_app_and_related_data_task.delay(tenant_id=app.tenant_id, app_id=app.id)

@@ -373,3 +383,15 @@ class AppService:
             meta["tool_icons"][tool_name] = {"background": "#252525", "content": "\ud83d\ude01"}

         return meta
+
+    @staticmethod
+    def get_app_code_by_id(app_id: str) -> str:
+        """
+        Get app code by app id
+        :param app_id: app id
+        :return: app code
+        """
+        site = db.session.query(Site).filter(Site.app_id == app_id).first()
+        if not site:
+            raise ValueError(f"App with id {app_id} not found")
+        return str(site.code)
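Taken together with the enterprise service below, the new helper lets callers resolve an app's web access mode starting from its id. A hypothetical call sequence (the id is a placeholder, and both calls need an app context with a database plus a reachable enterprise server):

from services.app_service import AppService
from services.enterprise.enterprise_service import EnterpriseService

app_id = "00000000-0000-0000-0000-000000000000"  # placeholder id
app_code = AppService.get_app_code_by_id(app_id)  # the Site.code backing the web app URL
settings = EnterpriseService.WebAppAuth.get_app_access_mode_by_code(app_code)
print(settings.access_mode)  # "private" for freshly created apps when webapp auth is enabled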
@@ -1075,7 +1075,7 @@ class DocumentService:
                 created_by=account.id,
             )
         else:
-            logging.warn(
+            logging.warning(
                 f"Invalid process rule mode: {process_rule.mode}, can not find dataset process rule"
             )
             return
@@ -1,11 +1,90 @@
+from pydantic import BaseModel, Field
+
 from services.enterprise.base import EnterpriseRequest


+class WebAppSettings(BaseModel):
+    access_mode: str = Field(
+        description="Access mode for the web app. Can be 'public' or 'private'",
+        default="private",
+        alias="accessMode",
+    )
+
+
 class EnterpriseService:
     @classmethod
     def get_info(cls):
         return EnterpriseRequest.send_request("GET", "/info")

     @classmethod
-    def get_app_web_sso_enabled(cls, app_code):
-        return EnterpriseRequest.send_request("GET", f"/app-sso-setting?appCode={app_code}")
+    def get_workspace_info(cls, tenant_id: str):
+        return EnterpriseRequest.send_request("GET", f"/workspace/{tenant_id}/info")
+
+    class WebAppAuth:
+        @classmethod
+        def is_user_allowed_to_access_webapp(cls, user_id: str, app_code: str):
+            params = {"userId": user_id, "appCode": app_code}
+            data = EnterpriseRequest.send_request("GET", "/webapp/permission", params=params)
+
+            return data.get("result", False)
+
+        @classmethod
+        def get_app_access_mode_by_id(cls, app_id: str) -> WebAppSettings:
+            if not app_id:
+                raise ValueError("app_id must be provided.")
+            params = {"appId": app_id}
+            data = EnterpriseRequest.send_request("GET", "/webapp/access-mode/id", params=params)
+            if not data:
+                raise ValueError("No data found.")
+            return WebAppSettings(**data)
+
+        @classmethod
+        def batch_get_app_access_mode_by_id(cls, app_ids: list[str]) -> dict[str, WebAppSettings]:
+            if not app_ids:
+                return {}
+            body = {"appIds": app_ids}
+            data: dict[str, str] = EnterpriseRequest.send_request("POST", "/webapp/access-mode/batch/id", json=body)
+            if not data:
+                raise ValueError("No data found.")
+
+            if not isinstance(data["accessModes"], dict):
+                raise ValueError("Invalid data format.")
+
+            ret = {}
+            for key, value in data["accessModes"].items():
+                curr = WebAppSettings()
+                curr.access_mode = value
+                ret[key] = curr
+
+            return ret
+
+        @classmethod
+        def get_app_access_mode_by_code(cls, app_code: str) -> WebAppSettings:
+            if not app_code:
+                raise ValueError("app_code must be provided.")
+            params = {"appCode": app_code}
+            data = EnterpriseRequest.send_request("GET", "/webapp/access-mode/code", params=params)
+            if not data:
+                raise ValueError("No data found.")
+            return WebAppSettings(**data)
+
+        @classmethod
+        def update_app_access_mode(cls, app_id: str, access_mode: str):
+            if not app_id:
+                raise ValueError("app_id must be provided.")
+            if access_mode not in ["public", "private", "private_all"]:
+                raise ValueError("access_mode must be either 'public', 'private', or 'private_all'")
+
+            data = {"appId": app_id, "accessMode": access_mode}
+
+            response = EnterpriseRequest.send_request("POST", "/webapp/access-mode", json=data)
+
+            return response.get("result", False)
+
+        @classmethod
+        def cleanup_webapp(cls, app_id: str):
+            if not app_id:
+                raise ValueError("app_id must be provided.")
+
+            body = {"appId": app_id}
+            EnterpriseRequest.send_request("DELETE", "/webapp/clean", json=body)
api/services/enterprise/mail_service.py (new file, 18 lines)
@@ -0,0 +1,18 @@
from pydantic import BaseModel

from tasks.mail_enterprise_task import send_enterprise_email_task


class DifyMail(BaseModel):
    to: list[str]
    subject: str
    body: str
    substitutions: dict[str, str] = {}


class EnterpriseMailService:
    @classmethod
    def send_mail(cls, mail: DifyMail):
        send_enterprise_email_task.delay(
            to=mail.to, subject=mail.subject, body=mail.body, substitutions=mail.substitutions
        )
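`send_mail` only enqueues a Celery task, so the call returns immediately; a hypothetical invocation (the address is a placeholder, and a Celery worker must be consuming the queue for anything to actually be sent):

from services.enterprise.mail_service import DifyMail, EnterpriseMailService

EnterpriseMailService.send_mail(
    DifyMail(
        to=["ops@example.com"],  # placeholder recipient
        subject="License expiring",
        body="<p>Hello {{name}}</p>",
        substitutions={"name": "admin"},  # filled into the template downstream
    )
)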
@@ -7,3 +7,7 @@ class WorkSpaceNotAllowedCreateError(BaseServiceError):

 class WorkSpaceNotFoundError(BaseServiceError):
     pass
+
+
+class WorkspacesLimitExceededError(BaseServiceError):
+    pass
@@ -1,6 +1,6 @@
 from enum import StrEnum

-from pydantic import BaseModel, ConfigDict
+from pydantic import BaseModel, ConfigDict, Field

 from configs import dify_config
 from services.billing_service import BillingService
@@ -27,6 +27,32 @@ class LimitationModel(BaseModel):
     limit: int = 0


+class LicenseLimitationModel(BaseModel):
+    """
+    - enabled: whether this limit is enforced
+    - size: current usage count
+    - limit: maximum allowed count; 0 means unlimited
+    """
+
+    enabled: bool = Field(False, description="Whether this limit is currently active")
+    size: int = Field(0, description="Number of resources already consumed")
+    limit: int = Field(0, description="Maximum number of resources allowed; 0 means no limit")
+
+    def is_available(self, required: int = 1) -> bool:
+        """
+        Determine whether the requested amount can be allocated.
+
+        Returns True if:
+        - this limit is not active, or
+        - the limit is zero (unlimited), or
+        - there is enough remaining quota.
+        """
+        if not self.enabled or self.limit == 0:
+            return True
+
+        return (self.limit - self.size) >= required
+
+
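The quota check is plain arithmetic over remaining headroom; a quick worked example against the model above:

from services.feature_service import LicenseLimitationModel

workspaces = LicenseLimitationModel(enabled=True, size=4, limit=5)
assert workspaces.is_available()        # 5 - 4 >= 1: one workspace left
assert not workspaces.is_available(2)   # 5 - 4 < 2: not enough headroom

unlimited = LicenseLimitationModel(enabled=True, size=100, limit=0)
assert unlimited.is_available(50)       # limit == 0 is treated as "no limit"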
class LicenseStatus(StrEnum):
|
||||
NONE = "none"
|
||||
INACTIVE = "inactive"
|
||||
@ -39,6 +65,27 @@ class LicenseStatus(StrEnum):
|
||||
class LicenseModel(BaseModel):
|
||||
status: LicenseStatus = LicenseStatus.NONE
|
||||
expired_at: str = ""
|
||||
workspaces: LicenseLimitationModel = LicenseLimitationModel(enabled=False, size=0, limit=0)
|
||||
|
||||
|
||||
class BrandingModel(BaseModel):
|
||||
enabled: bool = False
|
||||
application_title: str = ""
|
||||
login_page_logo: str = ""
|
||||
workspace_logo: str = ""
|
||||
favicon: str = ""
|
||||
|
||||
|
||||
class WebAppAuthSSOModel(BaseModel):
|
||||
protocol: str = ""
|
||||
|
||||
|
||||
class WebAppAuthModel(BaseModel):
|
||||
enabled: bool = False
|
||||
allow_sso: bool = False
|
||||
sso_config: WebAppAuthSSOModel = WebAppAuthSSOModel()
|
||||
allow_email_code_login: bool = False
|
||||
allow_email_password_login: bool = False
|
||||
|
||||
|
||||
class FeatureModel(BaseModel):
|
||||
@ -54,6 +101,8 @@ class FeatureModel(BaseModel):
|
||||
can_replace_logo: bool = False
|
||||
model_load_balancing_enabled: bool = False
|
||||
dataset_operator_enabled: bool = False
|
||||
webapp_copyright_enabled: bool = False
|
||||
workspace_members: LicenseLimitationModel = LicenseLimitationModel(enabled=False, size=0, limit=0)
|
||||
|
||||
# pydantic configs
|
||||
model_config = ConfigDict(protected_namespaces=())
|
||||
@ -68,9 +117,6 @@ class KnowledgeRateLimitModel(BaseModel):
|
||||
class SystemFeatureModel(BaseModel):
|
||||
sso_enforced_for_signin: bool = False
|
||||
sso_enforced_for_signin_protocol: str = ""
|
||||
sso_enforced_for_web: bool = False
|
||||
sso_enforced_for_web_protocol: str = ""
|
||||
enable_web_sso_switch_component: bool = False
|
||||
enable_marketplace: bool = False
|
||||
max_plugin_package_size: int = dify_config.PLUGIN_MAX_PACKAGE_SIZE
|
||||
enable_email_code_login: bool = False
|
||||
@ -80,6 +126,8 @@ class SystemFeatureModel(BaseModel):
|
||||
is_allow_create_workspace: bool = False
|
||||
is_email_setup: bool = False
|
||||
license: LicenseModel = LicenseModel()
|
||||
branding: BrandingModel = BrandingModel()
|
||||
webapp_auth: WebAppAuthModel = WebAppAuthModel()
|
||||
|
||||
|
||||
class FeatureService:
|
||||
@ -92,6 +140,10 @@ class FeatureService:
|
||||
        if dify_config.BILLING_ENABLED and tenant_id:
            cls._fulfill_params_from_billing_api(features, tenant_id)

        if dify_config.ENTERPRISE_ENABLED:
            features.webapp_copyright_enabled = True
            cls._fulfill_params_from_workspace_info(features, tenant_id)

        return features

    @classmethod
@@ -111,8 +163,8 @@ class FeatureService:
        cls._fulfill_system_params_from_env(system_features)

        if dify_config.ENTERPRISE_ENABLED:
            system_features.enable_web_sso_switch_component = True

            system_features.branding.enabled = True
            system_features.webapp_auth.enabled = True
            cls._fulfill_params_from_enterprise(system_features)

        if dify_config.MARKETPLACE_ENABLED:
@@ -136,6 +188,14 @@ class FeatureService:
        features.dataset_operator_enabled = dify_config.DATASET_OPERATOR_ENABLED
        features.education.enabled = dify_config.EDUCATION_ENABLED

    @classmethod
    def _fulfill_params_from_workspace_info(cls, features: FeatureModel, tenant_id: str):
        workspace_info = EnterpriseService.get_workspace_info(tenant_id)
        if "WorkspaceMembers" in workspace_info:
            features.workspace_members.size = workspace_info["WorkspaceMembers"]["used"]
            features.workspace_members.limit = workspace_info["WorkspaceMembers"]["limit"]
            features.workspace_members.enabled = workspace_info["WorkspaceMembers"]["enabled"]

    @classmethod
    def _fulfill_params_from_billing_api(cls, features: FeatureModel, tenant_id: str):
        billing_info = BillingService.get_info(tenant_id)
@@ -145,6 +205,9 @@ class FeatureService:
        features.billing.subscription.interval = billing_info["subscription"]["interval"]
        features.education.activated = billing_info["subscription"].get("education", False)

        if features.billing.subscription.plan != "sandbox":
            features.webapp_copyright_enabled = True

        if "members" in billing_info:
            features.members.size = billing_info["members"]["size"]
            features.members.limit = billing_info["members"]["limit"]
@@ -178,38 +241,53 @@ class FeatureService:
        features.knowledge_rate_limit = billing_info["knowledge_rate_limit"]["limit"]

    @classmethod
    def _fulfill_params_from_enterprise(cls, features):
    def _fulfill_params_from_enterprise(cls, features: SystemFeatureModel):
        enterprise_info = EnterpriseService.get_info()

        if "sso_enforced_for_signin" in enterprise_info:
            features.sso_enforced_for_signin = enterprise_info["sso_enforced_for_signin"]
        if "SSOEnforcedForSignin" in enterprise_info:
            features.sso_enforced_for_signin = enterprise_info["SSOEnforcedForSignin"]

        if "sso_enforced_for_signin_protocol" in enterprise_info:
            features.sso_enforced_for_signin_protocol = enterprise_info["sso_enforced_for_signin_protocol"]
        if "SSOEnforcedForSigninProtocol" in enterprise_info:
            features.sso_enforced_for_signin_protocol = enterprise_info["SSOEnforcedForSigninProtocol"]

        if "sso_enforced_for_web" in enterprise_info:
            features.sso_enforced_for_web = enterprise_info["sso_enforced_for_web"]
        if "EnableEmailCodeLogin" in enterprise_info:
            features.enable_email_code_login = enterprise_info["EnableEmailCodeLogin"]

        if "sso_enforced_for_web_protocol" in enterprise_info:
            features.sso_enforced_for_web_protocol = enterprise_info["sso_enforced_for_web_protocol"]
        if "EnableEmailPasswordLogin" in enterprise_info:
            features.enable_email_password_login = enterprise_info["EnableEmailPasswordLogin"]

        if "enable_email_code_login" in enterprise_info:
            features.enable_email_code_login = enterprise_info["enable_email_code_login"]
        if "IsAllowRegister" in enterprise_info:
            features.is_allow_register = enterprise_info["IsAllowRegister"]

        if "enable_email_password_login" in enterprise_info:
            features.enable_email_password_login = enterprise_info["enable_email_password_login"]
        if "IsAllowCreateWorkspace" in enterprise_info:
            features.is_allow_create_workspace = enterprise_info["IsAllowCreateWorkspace"]

        if "is_allow_register" in enterprise_info:
            features.is_allow_register = enterprise_info["is_allow_register"]
        if "Branding" in enterprise_info:
            features.branding.application_title = enterprise_info["Branding"].get("applicationTitle", "")
            features.branding.login_page_logo = enterprise_info["Branding"].get("loginPageLogo", "")
            features.branding.workspace_logo = enterprise_info["Branding"].get("workspaceLogo", "")
            features.branding.favicon = enterprise_info["Branding"].get("favicon", "")

        if "is_allow_create_workspace" in enterprise_info:
            features.is_allow_create_workspace = enterprise_info["is_allow_create_workspace"]
        if "WebAppAuth" in enterprise_info:
            features.webapp_auth.allow_sso = enterprise_info["WebAppAuth"].get("allowSso", False)
            features.webapp_auth.allow_email_code_login = enterprise_info["WebAppAuth"].get(
                "allowEmailCodeLogin", False
            )
            features.webapp_auth.allow_email_password_login = enterprise_info["WebAppAuth"].get(
                "allowEmailPasswordLogin", False
            )
            features.webapp_auth.sso_config.protocol = enterprise_info.get("SSOEnforcedForWebProtocol", "")

        if "license" in enterprise_info:
            license_info = enterprise_info["license"]
        if "License" in enterprise_info:
            license_info = enterprise_info["License"]

            if "status" in license_info:
                features.license.status = LicenseStatus(license_info.get("status", LicenseStatus.INACTIVE))

            if "expired_at" in license_info:
                features.license.expired_at = license_info["expired_at"]
            if "expiredAt" in license_info:
                features.license.expired_at = license_info["expiredAt"]

            if "workspaces" in license_info:
                features.license.workspaces.enabled = license_info["workspaces"]["enabled"]
                features.license.workspaces.limit = license_info["workspaces"]["limit"]
                features.license.workspaces.size = license_info["workspaces"]["used"]

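Note: the enterprise info payload consumed above has moved from snake_case to PascalCase keys. A minimal sketch of a payload the updated _fulfill_params_from_enterprise would accept; only the key names come from the code above, the values are illustrative assumptions:

    # Illustrative only: key names from the code above, values assumed.
    enterprise_info = {
        "SSOEnforcedForSignin": True,
        "SSOEnforcedForSigninProtocol": "oidc",
        "EnableEmailCodeLogin": True,
        "EnableEmailPasswordLogin": False,
        "IsAllowRegister": False,
        "IsAllowCreateWorkspace": False,
        "Branding": {"applicationTitle": "Acme AI", "loginPageLogo": "", "workspaceLogo": "", "favicon": ""},
        "WebAppAuth": {"allowSso": True, "allowEmailCodeLogin": True, "allowEmailPasswordLogin": False},
        "SSOEnforcedForWebProtocol": "saml",
        "License": {
            "status": "active",
            "expiredAt": "2026-01-01T00:00:00Z",
            "workspaces": {"enabled": True, "limit": 10, "used": 3},
        },
    }
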
@@ -19,7 +19,7 @@ from core.rag.extractor.extract_processor import ExtractProcessor
from extensions.ext_database import db
from extensions.ext_storage import storage
from models.account import Account
from models.enums import CreatedByRole
from models.enums import CreatorUserRole
from models.model import EndUser, UploadFile

from .errors.file import FileTooLargeError, UnsupportedFileTypeError
@@ -81,7 +81,7 @@ class FileService:
            size=file_size,
            extension=extension,
            mime_type=mimetype,
            created_by_role=(CreatedByRole.ACCOUNT if isinstance(user, Account) else CreatedByRole.END_USER),
            created_by_role=(CreatorUserRole.ACCOUNT if isinstance(user, Account) else CreatorUserRole.END_USER),
            created_by=user.id,
            created_at=datetime.datetime.now(datetime.UTC).replace(tzinfo=None),
            used=False,
@@ -133,7 +133,7 @@ class FileService:
            extension="txt",
            mime_type="text/plain",
            created_by=current_user.id,
            created_by_role=CreatedByRole.ACCOUNT,
            created_by_role=CreatorUserRole.ACCOUNT,
            created_at=datetime.datetime.now(datetime.UTC).replace(tzinfo=None),
            used=True,
            used_by=current_user.id,

@@ -87,7 +87,9 @@ class OpsService:
        :param tracing_config: tracing config
        :return:
        """
        if tracing_provider not in provider_config_map and tracing_provider:
        try:
            provider_config_map[tracing_provider]
        except KeyError:
            return {"error": f"Invalid tracing provider: {tracing_provider}"}

        config_class, other_keys = (
@@ -150,7 +152,9 @@ class OpsService:
        :param tracing_config: tracing config
        :return:
        """
        if tracing_provider not in provider_config_map:
        try:
            provider_config_map[tracing_provider]
        except KeyError:
            raise ValueError(f"Invalid tracing provider: {tracing_provider}")

        # check if trace config already exists

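Both validations now use the same EAFP-style lookup. Unlike the old `not in ... and tracing_provider` guard in the first hunk, the KeyError path also rejects an empty provider string. The pattern in isolation, as a sketch with a stand-in mapping:

    provider_config_map = {"langfuse": object(), "langsmith": object()}  # stand-in mapping

    tracing_provider = ""
    try:
        provider_config_map[tracing_provider]  # raises KeyError for any unknown provider, including ""
    except KeyError:
        print(f"Invalid tracing provider: {tracing_provider}")
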
api/services/webapp_auth_service.py (new file, 141 lines)
@@ -0,0 +1,141 @@
import random
from datetime import UTC, datetime, timedelta
from typing import Any, Optional, cast

from werkzeug.exceptions import NotFound, Unauthorized

from configs import dify_config
from controllers.web.error import WebAppAuthAccessDeniedError
from extensions.ext_database import db
from libs.helper import TokenManager
from libs.passport import PassportService
from libs.password import compare_password
from models.account import Account, AccountStatus
from models.model import App, EndUser, Site
from services.enterprise.enterprise_service import EnterpriseService
from services.errors.account import AccountLoginError, AccountNotFoundError, AccountPasswordError
from services.feature_service import FeatureService
from tasks.mail_email_code_login import send_email_code_login_mail_task


class WebAppAuthService:
    """Service for web app authentication."""

    @staticmethod
    def authenticate(email: str, password: str) -> Account:
        """authenticate account with email and password"""

        account = Account.query.filter_by(email=email).first()
        if not account:
            raise AccountNotFoundError()

        if account.status == AccountStatus.BANNED.value:
            raise AccountLoginError("Account is banned.")

        if account.password is None or not compare_password(password, account.password, account.password_salt):
            raise AccountPasswordError("Invalid email or password.")

        return cast(Account, account)

    @classmethod
    def login(cls, account: Account, app_code: str, end_user_id: str) -> str:
        site = db.session.query(Site).filter(Site.code == app_code).first()
        if not site:
            raise NotFound("Site not found.")

        access_token = cls._get_account_jwt_token(account=account, site=site, end_user_id=end_user_id)

        return access_token

    @classmethod
    def get_user_through_email(cls, email: str):
        account = db.session.query(Account).filter(Account.email == email).first()
        if not account:
            return None

        if account.status == AccountStatus.BANNED.value:
            raise Unauthorized("Account is banned.")

        return account

    @classmethod
    def send_email_code_login_email(
        cls, account: Optional[Account] = None, email: Optional[str] = None, language: Optional[str] = "en-US"
    ):
        email = account.email if account else email
        if email is None:
            raise ValueError("Email must be provided.")

        code = "".join([str(random.randint(0, 9)) for _ in range(6)])
        token = TokenManager.generate_token(
            account=account, email=email, token_type="webapp_email_code_login", additional_data={"code": code}
        )
        send_email_code_login_mail_task.delay(
            language=language,
            to=account.email if account else email,
            code=code,
        )

        return token

    @classmethod
    def get_email_code_login_data(cls, token: str) -> Optional[dict[str, Any]]:
        return TokenManager.get_token_data(token, "webapp_email_code_login")

    @classmethod
    def revoke_email_code_login_token(cls, token: str):
        TokenManager.revoke_token(token, "webapp_email_code_login")

    @classmethod
    def create_end_user(cls, app_code, email) -> EndUser:
        site = db.session.query(Site).filter(Site.code == app_code).first()
        if not site:
            raise NotFound("Site not found.")
        app_model = db.session.query(App).filter(App.id == site.app_id).first()
        if not app_model:
            raise NotFound("App not found.")
        end_user = EndUser(
            tenant_id=app_model.tenant_id,
            app_id=app_model.id,
            type="browser",
            is_anonymous=False,
            session_id=email,
            name="enterpriseuser",
            external_user_id="enterpriseuser",
        )
        db.session.add(end_user)
        db.session.commit()

        return end_user

    @classmethod
    def _validate_user_accessibility(cls, account: Account, app_code: str):
        """Check if the user is allowed to access the app."""
        system_features = FeatureService.get_system_features()
        if system_features.webapp_auth.enabled:
            app_settings = EnterpriseService.WebAppAuth.get_app_access_mode_by_code(app_code=app_code)

            if (
                app_settings.access_mode != "public"
                and not EnterpriseService.WebAppAuth.is_user_allowed_to_access_webapp(account.id, app_code=app_code)
            ):
                raise WebAppAuthAccessDeniedError()

    @classmethod
    def _get_account_jwt_token(cls, account: Account, site: Site, end_user_id: str) -> str:
        exp_dt = datetime.now(UTC) + timedelta(hours=dify_config.ACCESS_TOKEN_EXPIRE_MINUTES * 24)
        exp = int(exp_dt.timestamp())

        payload = {
            "iss": site.id,
            "sub": "Web API Passport",
            "app_id": site.app_id,
            "app_code": site.code,
            "user_id": account.id,
            "end_user_id": end_user_id,
            "token_source": "webapp",
            "exp": exp,
        }

        token: str = PassportService().issue(payload)
        return token

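End to end, a web controller would chain these calls roughly as follows; the email, password, and app_code values are hypothetical placeholders, and error handling is elided:

    # Hypothetical driver code for the service above; values are placeholders.
    account = WebAppAuthService.authenticate(email="user@example.com", password="s3cret!")
    end_user = WebAppAuthService.create_end_user(app_code="abc123", email=account.email)
    token = WebAppAuthService.login(account=account, app_code="abc123", end_user_id=end_user.id)
    # The JWT payload carries iss/app_id/app_code/user_id/end_user_id/token_source/exp,
    # and would presumably be checked on later requests via PassportService (verification call assumed).
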
@@ -5,7 +5,7 @@ from sqlalchemy import and_, func, or_, select
from sqlalchemy.orm import Session

from models import App, EndUser, WorkflowAppLog, WorkflowRun
from models.enums import CreatedByRole
from models.enums import CreatorUserRole
from models.workflow import WorkflowRunStatus


@@ -58,7 +58,7 @@ class WorkflowAppService:

        stmt = stmt.outerjoin(
            EndUser,
            and_(WorkflowRun.created_by == EndUser.id, WorkflowRun.created_by_role == CreatedByRole.END_USER),
            and_(WorkflowRun.created_by == EndUser.id, WorkflowRun.created_by_role == CreatorUserRole.END_USER),
        ).where(or_(*keyword_conditions))

        if status:

@@ -1,4 +1,5 @@
import threading
from collections.abc import Sequence
from typing import Optional

import contexts
@@ -6,11 +7,13 @@ from core.repositories import SQLAlchemyWorkflowNodeExecutionRepository
from core.workflow.repository.workflow_node_execution_repository import OrderConfig
from extensions.ext_database import db
from libs.infinite_scroll_pagination import InfiniteScrollPagination
from models.enums import WorkflowRunTriggeredFrom
from models.model import App
from models.workflow import (
from models import (
    Account,
    App,
    EndUser,
    WorkflowNodeExecution,
    WorkflowRun,
    WorkflowRunTriggeredFrom,
)


@@ -116,7 +119,12 @@ class WorkflowRunService:

        return workflow_run

    def get_workflow_run_node_executions(self, app_model: App, run_id: str) -> list[WorkflowNodeExecution]:
    def get_workflow_run_node_executions(
        self,
        app_model: App,
        run_id: str,
        user: Account | EndUser,
    ) -> Sequence[WorkflowNodeExecution]:
        """
        Get workflow run node execution list
        """
@@ -128,13 +136,18 @@ class WorkflowRunService:
        if not workflow_run:
            return []

        # Use the repository to get the node executions
        repository = SQLAlchemyWorkflowNodeExecutionRepository(
            session_factory=db.engine, tenant_id=app_model.tenant_id, app_id=app_model.id
            session_factory=db.engine,
            user=user,
            app_id=app_model.id,
            triggered_from=None,
        )

        # Use the repository to get the node executions with ordering
        order_config = OrderConfig(order_by=["index"], order_direction="desc")
        node_executions = repository.get_by_workflow_run(workflow_run_id=run_id, order_config=order_config)

        return list(node_executions)
        # Convert domain models to database models
        workflow_node_executions = [repository.to_db_model(node_execution) for node_execution in node_executions]

        return workflow_node_executions

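The repository construction above is the shape this commit uses everywhere: the old tenant_id argument is replaced by the acting user, and triggered_from is now explicit. In isolation:

    # Constructor shape introduced by this commit (arguments exactly as used above).
    repository = SQLAlchemyWorkflowNodeExecutionRepository(
        session_factory=db.engine,  # the engine doubles as a session factory here
        user=user,                  # Account | EndUser; tenant/creator fields are derived from it
        app_id=app_model.id,
        triggered_from=None,        # None here; SINGLE_STEP is used for draft node runs below
    )
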
@@ -10,10 +10,9 @@ from sqlalchemy.orm import Session

from core.app.apps.advanced_chat.app_config_manager import AdvancedChatAppConfigManager
from core.app.apps.workflow.app_config_manager import WorkflowAppConfigManager
from core.model_runtime.utils.encoders import jsonable_encoder
from core.repositories import SQLAlchemyWorkflowNodeExecutionRepository
from core.variables import Variable
from core.workflow.entities.node_entities import NodeRunResult
from core.workflow.entities.node_execution_entities import NodeExecution, NodeExecutionStatus
from core.workflow.errors import WorkflowNodeRunFailedError
from core.workflow.graph_engine.entities.event import InNodeEvent
from core.workflow.nodes import NodeType
@@ -26,7 +26,6 @@ from core.workflow.workflow_entry import WorkflowEntry
from events.app_event import app_draft_workflow_was_synced, app_published_workflow_was_updated
from extensions.ext_database import db
from models.account import Account
from models.enums import CreatedByRole
from models.model import App, AppMode
from models.tools import WorkflowToolProvider
from models.workflow import (
@@ -267,33 +266,37 @@ class WorkflowService:
        # run draft workflow node
        start_at = time.perf_counter()

        workflow_node_execution = self._handle_node_run_result(
            getter=lambda: WorkflowEntry.single_step_run(
        node_execution = self._handle_node_run_result(
            invoke_node_fn=lambda: WorkflowEntry.single_step_run(
                workflow=draft_workflow,
                node_id=node_id,
                user_inputs=user_inputs,
                user_id=account.id,
            ),
            start_at=start_at,
            tenant_id=app_model.tenant_id,
            node_id=node_id,
        )

        workflow_node_execution.app_id = app_model.id
        workflow_node_execution.created_by = account.id
        workflow_node_execution.workflow_id = draft_workflow.id
        # Set workflow_id on the NodeExecution
        node_execution.workflow_id = draft_workflow.id

        # Use the repository to save the workflow node execution
        # Create repository and save the node execution
        repository = SQLAlchemyWorkflowNodeExecutionRepository(
            session_factory=db.engine, tenant_id=app_model.tenant_id, app_id=app_model.id
            session_factory=db.engine,
            user=account,
            app_id=app_model.id,
            triggered_from=WorkflowNodeExecutionTriggeredFrom.SINGLE_STEP,
        )
        repository.save(workflow_node_execution)
        repository.save(node_execution)

        # Convert node_execution to WorkflowNodeExecution after save
        workflow_node_execution = repository.to_db_model(node_execution)

        return workflow_node_execution

    def run_free_workflow_node(
        self, node_data: dict, tenant_id: str, user_id: str, node_id: str, user_inputs: dict[str, Any]
    ) -> WorkflowNodeExecution:
    ) -> NodeExecution:
        """
        Run draft workflow node
        """
@@ -301,7 +304,7 @@ class WorkflowService:
        start_at = time.perf_counter()

        workflow_node_execution = self._handle_node_run_result(
            getter=lambda: WorkflowEntry.run_free_node(
            invoke_node_fn=lambda: WorkflowEntry.run_free_node(
                node_id=node_id,
                node_data=node_data,
                tenant_id=tenant_id,
@@ -309,7 +312,6 @@ class WorkflowService:
                user_inputs=user_inputs,
            ),
            start_at=start_at,
            tenant_id=tenant_id,
            node_id=node_id,
        )

@@ -317,21 +319,12 @@ class WorkflowService:

    def _handle_node_run_result(
        self,
        getter: Callable[[], tuple[BaseNode, Generator[NodeEvent | InNodeEvent, None, None]]],
        invoke_node_fn: Callable[[], tuple[BaseNode, Generator[NodeEvent | InNodeEvent, None, None]]],
        start_at: float,
        tenant_id: str,
        node_id: str,
    ) -> WorkflowNodeExecution:
        """
        Handle node run result

        :param getter: Callable[[], tuple[BaseNode, Generator[RunEvent | InNodeEvent, None, None]]]
        :param start_at: float
        :param tenant_id: str
        :param node_id: str
        """
    ) -> NodeExecution:
        try:
            node_instance, generator = getter()
            node_instance, generator = invoke_node_fn()

            node_run_result: NodeRunResult | None = None
            for event in generator:
@@ -380,20 +373,21 @@ class WorkflowService:
            node_run_result = None
            error = e.error

        workflow_node_execution = WorkflowNodeExecution()
        workflow_node_execution.id = str(uuid4())
        workflow_node_execution.tenant_id = tenant_id
        workflow_node_execution.triggered_from = WorkflowNodeExecutionTriggeredFrom.SINGLE_STEP.value
        workflow_node_execution.index = 1
        workflow_node_execution.node_id = node_id
        workflow_node_execution.node_type = node_instance.node_type
        workflow_node_execution.title = node_instance.node_data.title
        workflow_node_execution.elapsed_time = time.perf_counter() - start_at
        workflow_node_execution.created_by_role = CreatedByRole.ACCOUNT.value
        workflow_node_execution.created_at = datetime.now(UTC).replace(tzinfo=None)
        workflow_node_execution.finished_at = datetime.now(UTC).replace(tzinfo=None)
        # Create a NodeExecution domain model
        node_execution = NodeExecution(
            id=str(uuid4()),
            workflow_id="",  # This is a single-step execution, so no workflow ID
            index=1,
            node_id=node_id,
            node_type=node_instance.node_type,
            title=node_instance.node_data.title,
            elapsed_time=time.perf_counter() - start_at,
            created_at=datetime.now(UTC).replace(tzinfo=None),
            finished_at=datetime.now(UTC).replace(tzinfo=None),
        )

        if run_succeeded and node_run_result:
            # create workflow node execution
            # Set inputs, process_data, and outputs as dictionaries (not JSON strings)
            inputs = WorkflowEntry.handle_special_values(node_run_result.inputs) if node_run_result.inputs else None
            process_data = (
                WorkflowEntry.handle_special_values(node_run_result.process_data)
@@ -402,23 +396,23 @@ class WorkflowService:
            )
            outputs = WorkflowEntry.handle_special_values(node_run_result.outputs) if node_run_result.outputs else None

            workflow_node_execution.inputs = json.dumps(inputs)
            workflow_node_execution.process_data = json.dumps(process_data)
            workflow_node_execution.outputs = json.dumps(outputs)
            workflow_node_execution.execution_metadata = (
                json.dumps(jsonable_encoder(node_run_result.metadata)) if node_run_result.metadata else None
            )
            if node_run_result.status == WorkflowNodeExecutionStatus.SUCCEEDED:
                workflow_node_execution.status = WorkflowNodeExecutionStatus.SUCCEEDED.value
            elif node_run_result.status == WorkflowNodeExecutionStatus.EXCEPTION:
                workflow_node_execution.status = WorkflowNodeExecutionStatus.EXCEPTION.value
                workflow_node_execution.error = node_run_result.error
        else:
            # create workflow node execution
            workflow_node_execution.status = WorkflowNodeExecutionStatus.FAILED.value
            workflow_node_execution.error = error
            node_execution.inputs = inputs
            node_execution.process_data = process_data
            node_execution.outputs = outputs
            node_execution.metadata = node_run_result.metadata

        return workflow_node_execution
            # Map status from WorkflowNodeExecutionStatus to NodeExecutionStatus
            if node_run_result.status == WorkflowNodeExecutionStatus.SUCCEEDED:
                node_execution.status = NodeExecutionStatus.SUCCEEDED
            elif node_run_result.status == WorkflowNodeExecutionStatus.EXCEPTION:
                node_execution.status = NodeExecutionStatus.EXCEPTION
                node_execution.error = node_run_result.error
        else:
            # Set failed status and error
            node_execution.status = NodeExecutionStatus.FAILED
            node_execution.error = error

        return node_execution

    def convert_to_workflow(self, app_model: App, account: Account, args: dict) -> App:
        """

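Condensed, the single-step path now keeps a NodeExecution domain object end to end and only materializes a WorkflowNodeExecution DB row for the API response:

    # Condensed from the refactor above; not additional code in the commit.
    node_execution = self._handle_node_run_result(invoke_node_fn=..., start_at=start_at, node_id=node_id)
    node_execution.workflow_id = draft_workflow.id
    repository.save(node_execution)                                   # persist the domain model
    workflow_node_execution = repository.to_db_model(node_execution)  # DB model for serialization
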
@@ -6,6 +6,7 @@ from celery import shared_task  # type: ignore
from flask import render_template

from extensions.ext_mail import mail
from services.feature_service import FeatureService


@shared_task(queue="mail")
@@ -25,10 +26,24 @@ def send_email_code_login_mail_task(language: str, to: str, code: str):
    # send email code login mail using different languages
    try:
        if language == "zh-Hans":
            html_content = render_template("email_code_login_mail_template_zh-CN.html", to=to, code=code)
            template = "email_code_login_mail_template_zh-CN.html"
            system_features = FeatureService.get_system_features()
            if system_features.branding.enabled:
                application_title = system_features.branding.application_title
                template = "without-brand/email_code_login_mail_template_zh-CN.html"
                html_content = render_template(template, to=to, code=code, application_title=application_title)
            else:
                html_content = render_template(template, to=to, code=code)
            mail.send(to=to, subject="邮箱验证码", html=html_content)
        else:
            html_content = render_template("email_code_login_mail_template_en-US.html", to=to, code=code)
            template = "email_code_login_mail_template_en-US.html"
            system_features = FeatureService.get_system_features()
            if system_features.branding.enabled:
                application_title = system_features.branding.application_title
                template = "without-brand/email_code_login_mail_template_en-US.html"
                html_content = render_template(template, to=to, code=code, application_title=application_title)
            else:
                html_content = render_template(template, to=to, code=code)
            mail.send(to=to, subject="Email Code", html=html_content)

        end_at = time.perf_counter()

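The branding check above is repeated per language here and again in the invite and reset-password tasks below; a hypothetical helper, not part of this commit, that captures the shared pattern:

    def _select_template(base_template: str) -> tuple[str, dict]:
        # Hypothetical refactor, not in the commit: pick the branded or stock template.
        system_features = FeatureService.get_system_features()
        if system_features.branding.enabled:
            return (
                f"without-brand/{base_template}",
                {"application_title": system_features.branding.application_title},
            )
        return base_template, {}

    # e.g. template, extra = _select_template("email_code_login_mail_template_en-US.html")
    #      html_content = render_template(template, to=to, code=code, **extra)
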
api/tasks/mail_enterprise_task.py (new file, 33 lines)
@@ -0,0 +1,33 @@
import logging
import time

import click
from celery import shared_task  # type: ignore
from flask import render_template_string

from extensions.ext_mail import mail


@shared_task(queue="mail")
def send_enterprise_email_task(to, subject, body, substitutions):
    if not mail.is_inited():
        return

    logging.info(click.style("Start enterprise mail to {} with subject {}".format(to, subject), fg="green"))
    start_at = time.perf_counter()

    try:
        html_content = render_template_string(body, **substitutions)

        if isinstance(to, list):
            for t in to:
                mail.send(to=t, subject=subject, html=html_content)
        else:
            mail.send(to=to, subject=subject, html=html_content)

        end_at = time.perf_counter()
        logging.info(
            click.style("Send enterprise mail to {} succeeded: latency: {}".format(to, end_at - start_at), fg="green")
        )
    except Exception:
        logging.exception("Send enterprise mail to {} failed".format(to))

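Callers enqueue this like any Celery shared task; the recipients and template string below are hypothetical:

    # Hypothetical invocation; `body` is a Jinja template string rendered with `substitutions`.
    send_enterprise_email_task.delay(
        to=["ops@example.com", "admin@example.com"],
        subject="Scheduled maintenance",
        body="<p>Hello {{ name }}, maintenance starts at {{ start }}.</p>",
        substitutions={"name": "Admin", "start": "02:00 UTC"},
    )
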
@@ -7,6 +7,7 @@ from flask import render_template

from configs import dify_config
from extensions.ext_mail import mail
from services.feature_service import FeatureService


@shared_task(queue="mail")
@@ -33,21 +34,43 @@ def send_invite_member_mail_task(language: str, to: str, token: str, inviter_nam
    try:
        url = f"{dify_config.CONSOLE_WEB_URL}/activate?token={token}"
        if language == "zh-Hans":
            template = "invite_member_mail_template_zh-CN.html"
            system_features = FeatureService.get_system_features()
            if system_features.branding.enabled:
                application_title = system_features.branding.application_title
                template = "without-brand/invite_member_mail_template_zh-CN.html"
                html_content = render_template(
                    "invite_member_mail_template_zh-CN.html",
                    template,
                    to=to,
                    inviter_name=inviter_name,
                    workspace_name=workspace_name,
                    url=url,
                    application_title=application_title,
                )
                mail.send(to=to, subject=f"立即加入 {application_title} 工作空间", html=html_content)
            else:
                html_content = render_template(
                    template, to=to, inviter_name=inviter_name, workspace_name=workspace_name, url=url
                )
                mail.send(to=to, subject="立即加入 Dify 工作空间", html=html_content)
        else:
            template = "invite_member_mail_template_en-US.html"
            system_features = FeatureService.get_system_features()
            if system_features.branding.enabled:
                application_title = system_features.branding.application_title
                template = "without-brand/invite_member_mail_template_en-US.html"
                html_content = render_template(
                    "invite_member_mail_template_en-US.html",
                    template,
                    to=to,
                    inviter_name=inviter_name,
                    workspace_name=workspace_name,
                    url=url,
                    application_title=application_title,
                )
                mail.send(to=to, subject=f"Join {application_title} Workspace Now", html=html_content)
            else:
                html_content = render_template(
                    template, to=to, inviter_name=inviter_name, workspace_name=workspace_name, url=url
                )
                mail.send(to=to, subject="Join Dify Workspace Now", html=html_content)

@@ -6,6 +6,7 @@ from celery import shared_task  # type: ignore
from flask import render_template

from extensions.ext_mail import mail
from services.feature_service import FeatureService


@shared_task(queue="mail")
@@ -25,10 +26,26 @@ def send_reset_password_mail_task(language: str, to: str, code: str):
    # send reset password mail using different languages
    try:
        if language == "zh-Hans":
            html_content = render_template("reset_password_mail_template_zh-CN.html", to=to, code=code)
            template = "reset_password_mail_template_zh-CN.html"
            system_features = FeatureService.get_system_features()
            if system_features.branding.enabled:
                application_title = system_features.branding.application_title
                template = "without-brand/reset_password_mail_template_zh-CN.html"
                html_content = render_template(template, to=to, code=code, application_title=application_title)
                mail.send(to=to, subject=f"设置您的 {application_title} 密码", html=html_content)
            else:
                html_content = render_template(template, to=to, code=code)
                mail.send(to=to, subject="设置您的 Dify 密码", html=html_content)
        else:
            html_content = render_template("reset_password_mail_template_en-US.html", to=to, code=code)
            template = "reset_password_mail_template_en-US.html"
            system_features = FeatureService.get_system_features()
            if system_features.branding.enabled:
                application_title = system_features.branding.application_title
                template = "without-brand/reset_password_mail_template_en-US.html"
                html_content = render_template(template, to=to, code=code, application_title=application_title)
                mail.send(to=to, subject=f"Set Your {application_title} Password", html=html_content)
            else:
                html_content = render_template(template, to=to, code=code)
                mail.send(to=to, subject="Set Your Dify Password", html=html_content)

        end_at = time.perf_counter()

@@ -4,16 +4,19 @@ from collections.abc import Callable

import click
from celery import shared_task  # type: ignore
from sqlalchemy import delete
from sqlalchemy import delete, select
from sqlalchemy.exc import SQLAlchemyError
from sqlalchemy.orm import Session

from core.repositories import SQLAlchemyWorkflowNodeExecutionRepository
from extensions.ext_database import db
from models.dataset import AppDatasetJoin
from models.model import (
from models import (
    Account,
    ApiToken,
    App,
    AppAnnotationHitHistory,
    AppAnnotationSetting,
    AppDatasetJoin,
    AppModelConfig,
    Conversation,
    EndUser,
@@ -188,9 +191,24 @@ def _delete_app_workflow_runs(tenant_id: str, app_id: str):


def _delete_app_workflow_node_executions(tenant_id: str, app_id: str):
    # Get app's owner
    with Session(db.engine, expire_on_commit=False) as session:
        stmt = select(Account).where(Account.id == App.owner_id).where(App.id == app_id)
        user = session.scalar(stmt)

    if user is None:
        errmsg = (
            f"Failed to delete workflow node executions for tenant {tenant_id} and app {app_id}, app's owner not found"
        )
        logging.error(errmsg)
        raise ValueError(errmsg)

    # Create a repository instance for WorkflowNodeExecution
    repository = SQLAlchemyWorkflowNodeExecutionRepository(
        session_factory=db.engine, tenant_id=tenant_id, app_id=app_id
        session_factory=db.engine,
        user=user,
        app_id=app_id,
        triggered_from=None,
    )

    # Use the clear method to delete all records for this tenant_id and app_id

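The owner lookup above joins Account and App implicitly through the two where clauses; an equivalent explicit-join form, shown only for clarity:

    # Equivalent to the lookup above, with the join spelled out.
    stmt = select(Account).join(App, App.owner_id == Account.id).where(App.id == app_id)
    user = session.scalar(stmt)
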
(new file, 70 lines; presumably the without-brand en-US email-code-login template referenced by the mail task above)
@@ -0,0 +1,70 @@
<!DOCTYPE html>
<html>
<head>
    <style>
        body {
            font-family: 'Arial', sans-serif;
            line-height: 16pt;
            color: #101828;
            background-color: #e9ebf0;
            margin: 0;
            padding: 0;
        }
        .container {
            width: 600px;
            height: 360px;
            margin: 40px auto;
            padding: 36px 48px;
            background-color: #fcfcfd;
            border-radius: 16px;
            border: 1px solid #ffffff;
            box-shadow: 0 2px 4px -2px rgba(9, 9, 11, 0.08);
        }
        .header {
            margin-bottom: 24px;
        }
        .header img {
            max-width: 100px;
            height: auto;
        }
        .title {
            font-weight: 600;
            font-size: 24px;
            line-height: 28.8px;
        }
        .description {
            font-size: 13px;
            line-height: 16px;
            color: #676f83;
            margin-top: 12px;
        }
        .code-content {
            padding: 16px 32px;
            text-align: center;
            border-radius: 16px;
            background-color: #f2f4f7;
            margin: 16px auto;
        }
        .code {
            line-height: 36px;
            font-weight: 700;
            font-size: 30px;
        }
        .tips {
            line-height: 16px;
            color: #676f83;
            font-size: 13px;
        }
    </style>
</head>
<body>
    <div class="container">
        <p class="title">Your login code for {{application_title}}</p>
        <p class="description">Copy and paste this code, this code will only be valid for the next 5 minutes.</p>
        <div class="code-content">
            <span class="code">{{code}}</span>
        </div>
        <p class="tips">If you didn't request a login, don't worry. You can safely ignore this email.</p>
    </div>
</body>
</html>

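A quick way to sanity-check these templates outside Flask (hypothetical snippet, not part of the commit; this template only references application_title and code):

    from jinja2 import Template  # hypothetical smoke test

    with open("email_code_login_mail_template_en-US.html") as f:
        html = Template(f.read()).render(application_title="Acme AI", code="123456")
    assert "123456" in html and "Acme AI" in html
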
(new file, 70 lines; presumably the without-brand zh-CN email-code-login template referenced by the mail task above)
@@ -0,0 +1,70 @@
<!DOCTYPE html>
<html>
<head>
    <style>
        body {
            font-family: 'Arial', sans-serif;
            line-height: 16pt;
            color: #101828;
            background-color: #e9ebf0;
            margin: 0;
            padding: 0;
        }
        .container {
            width: 600px;
            height: 360px;
            margin: 40px auto;
            padding: 36px 48px;
            background-color: #fcfcfd;
            border-radius: 16px;
            border: 1px solid #ffffff;
            box-shadow: 0 2px 4px -2px rgba(9, 9, 11, 0.08);
        }
        .header {
            margin-bottom: 24px;
        }
        .header img {
            max-width: 100px;
            height: auto;
        }
        .title {
            font-weight: 600;
            font-size: 24px;
            line-height: 28.8px;
        }
        .description {
            font-size: 13px;
            line-height: 16px;
            color: #676f83;
            margin-top: 12px;
        }
        .code-content {
            padding: 16px 32px;
            text-align: center;
            border-radius: 16px;
            background-color: #f2f4f7;
            margin: 16px auto;
        }
        .code {
            line-height: 36px;
            font-weight: 700;
            font-size: 30px;
        }
        .tips {
            line-height: 16px;
            color: #676f83;
            font-size: 13px;
        }
    </style>
</head>
<body>
    <div class="container">
        <p class="title">{{application_title}} 的登录验证码</p>
        <p class="description">复制并粘贴此验证码,注意验证码仅在接下来的 5 分钟内有效。</p>
        <div class="code-content">
            <span class="code">{{code}}</span>
        </div>
        <p class="tips">如果您没有请求登录,请不要担心。您可以安全地忽略此电子邮件。</p>
    </div>
</body>
</html>

(new file, 69 lines; presumably the without-brand en-US invite-member template referenced by send_invite_member_mail_task above)
@@ -0,0 +1,69 @@
<!DOCTYPE html>
<html>
<head>
    <style>
        body {
            font-family: 'Arial', sans-serif;
            line-height: 16pt;
            color: #374151;
            background-color: #E5E7EB;
            margin: 0;
            padding: 0;
        }
        .container {
            width: 100%;
            max-width: 560px;
            margin: 40px auto;
            padding: 20px;
            background-color: #F3F4F6;
            border-radius: 8px;
            box-shadow: 0 4px 8px rgba(0, 0, 0, 0.1);
        }
        .header {
            text-align: center;
            margin-bottom: 20px;
        }
        .header img {
            max-width: 100px;
            height: auto;
        }
        .button {
            display: inline-block;
            padding: 12px 24px;
            background-color: #2970FF;
            color: white;
            text-decoration: none;
            border-radius: 4px;
            text-align: center;
            transition: background-color 0.3s ease;
        }
        .button:hover {
            background-color: #265DD4;
        }
        .footer {
            font-size: 0.9em;
            color: #777777;
            margin-top: 30px;
        }
        .content {
            margin-top: 20px;
        }
    </style>
</head>
<body>
    <div class="container">
        <div class="content">
            <p>Dear {{ to }},</p>
            <p>{{ inviter_name }} is pleased to invite you to join our workspace on {{application_title}}, a platform specifically designed for LLM application development. On {{application_title}}, you can explore, create, and collaborate to build and operate AI applications.</p>
            <p>Click the button below to log in to {{application_title}} and join the workspace.</p>
            <p style="text-align: center;"><a style="color: #fff; text-decoration: none" class="button" href="{{ url }}">Login Here</a></p>
        </div>
        <div class="footer">
            <p>Best regards,</p>
            <p>{{application_title}} Team</p>
            <p>Please do not reply directly to this email; it is automatically sent by the system.</p>
        </div>
    </div>
</body>

</html>

(new file, 69 lines; presumably the without-brand zh-CN invite-member template referenced by send_invite_member_mail_task above)
@@ -0,0 +1,69 @@
<!DOCTYPE html>
<html>
<head>
    <style>
        body {
            font-family: 'Arial', sans-serif;
            line-height: 16pt;
            color: #374151;
            background-color: #E5E7EB;
            margin: 0;
            padding: 0;
        }
        .container {
            width: 100%;
            max-width: 560px;
            margin: 40px auto;
            padding: 20px;
            background-color: #F3F4F6;
            border-radius: 8px;
            box-shadow: 0 4px 8px rgba(0, 0, 0, 0.1);
        }
        .header {
            text-align: center;
            margin-bottom: 20px;
        }
        .header img {
            max-width: 100px;
            height: auto;
        }
        .button {
            display: inline-block;
            padding: 12px 24px;
            background-color: #2970FF;
            color: white;
            text-decoration: none;
            border-radius: 4px;
            text-align: center;
            transition: background-color 0.3s ease;
        }
        .button:hover {
            background-color: #265DD4;
        }
        .footer {
            font-size: 0.9em;
            color: #777777;
            margin-top: 30px;
        }
        .content {
            margin-top: 20px;
        }
    </style>
</head>

<body>
    <div class="container">
        <div class="content">
            <p>尊敬的 {{ to }},</p>
            <p>{{ inviter_name }} 现邀请您加入我们在 {{application_title}} 的工作区,这是一个专为 LLM 应用开发而设计的平台。在 {{application_title}} 上,您可以探索、创造和合作,构建和运营 AI 应用。</p>
            <p>点击下方按钮即可登录 {{application_title}} 并且加入空间。</p>
            <p style="text-align: center;"><a style="color: #fff; text-decoration: none" class="button" href="{{ url }}">在此登录</a></p>
        </div>
        <div class="footer">
            <p>此致,</p>
            <p>{{application_title}} 团队</p>
            <p>请不要直接回复此电子邮件;由系统自动发送。</p>
        </div>
    </div>
</body>
</html>

(new file, 70 lines; presumably the without-brand en-US reset-password template referenced by send_reset_password_mail_task above)
@@ -0,0 +1,70 @@
<!DOCTYPE html>
<html>
<head>
    <style>
        body {
            font-family: 'Arial', sans-serif;
            line-height: 16pt;
            color: #101828;
            background-color: #e9ebf0;
            margin: 0;
            padding: 0;
        }
        .container {
            width: 600px;
            height: 360px;
            margin: 40px auto;
            padding: 36px 48px;
            background-color: #fcfcfd;
            border-radius: 16px;
            border: 1px solid #ffffff;
            box-shadow: 0 2px 4px -2px rgba(9, 9, 11, 0.08);
        }
        .header {
            margin-bottom: 24px;
        }
        .header img {
            max-width: 100px;
            height: auto;
        }
        .title {
            font-weight: 600;
            font-size: 24px;
            line-height: 28.8px;
        }
        .description {
            font-size: 13px;
            line-height: 16px;
            color: #676f83;
            margin-top: 12px;
        }
        .code-content {
            padding: 16px 32px;
            text-align: center;
            border-radius: 16px;
            background-color: #f2f4f7;
            margin: 16px auto;
        }
        .code {
            line-height: 36px;
            font-weight: 700;
            font-size: 30px;
        }
        .tips {
            line-height: 16px;
            color: #676f83;
            font-size: 13px;
        }
    </style>
</head>
<body>
    <div class="container">
        <p class="title">Set your {{application_title}} password</p>
        <p class="description">Copy and paste this code, this code will only be valid for the next 5 minutes.</p>
        <div class="code-content">
            <span class="code">{{code}}</span>
        </div>
        <p class="tips">If you didn't request, don't worry. You can safely ignore this email.</p>
    </div>
</body>
</html>

(new file, 70 lines; presumably the without-brand zh-CN reset-password template referenced by send_reset_password_mail_task above)
@@ -0,0 +1,70 @@
<!DOCTYPE html>
<html>
<head>
    <style>
        body {
            font-family: 'Arial', sans-serif;
            line-height: 16pt;
            color: #101828;
            background-color: #e9ebf0;
            margin: 0;
            padding: 0;
        }
        .container {
            width: 600px;
            height: 360px;
            margin: 40px auto;
            padding: 36px 48px;
            background-color: #fcfcfd;
            border-radius: 16px;
            border: 1px solid #ffffff;
            box-shadow: 0 2px 4px -2px rgba(9, 9, 11, 0.08);
        }
        .header {
            margin-bottom: 24px;
        }
        .header img {
            max-width: 100px;
            height: auto;
        }
        .title {
            font-weight: 600;
            font-size: 24px;
            line-height: 28.8px;
        }
        .description {
            font-size: 13px;
            line-height: 16px;
            color: #676f83;
            margin-top: 12px;
        }
        .code-content {
            padding: 16px 32px;
            text-align: center;
            border-radius: 16px;
            background-color: #f2f4f7;
            margin: 16px auto;
        }
        .code {
            line-height: 36px;
            font-weight: 700;
            font-size: 30px;
        }
        .tips {
            line-height: 16px;
            color: #676f83;
            font-size: 13px;
        }
    </style>
</head>
<body>
    <div class="container">
        <p class="title">设置您的 {{application_title}} 账户密码</p>
        <p class="description">复制并粘贴此验证码,注意验证码仅在接下来的 5 分钟内有效。</p>
        <div class="code-content">
            <span class="code">{{code}}</span>
        </div>
        <p class="tips">如果您没有请求,请不要担心。您可以安全地忽略此电子邮件。</p>
    </div>
</body>
</html>

@@ -16,10 +16,9 @@ from core.workflow.enums import SystemVariableKey
from core.workflow.nodes import NodeType
from core.workflow.repository.workflow_node_execution_repository import WorkflowNodeExecutionRepository
from core.workflow.workflow_cycle_manager import WorkflowCycleManager
from models.enums import CreatedByRole
from models.enums import CreatorUserRole
from models.workflow import (
    Workflow,
    WorkflowNodeExecution,
    WorkflowNodeExecutionStatus,
    WorkflowRun,
    WorkflowRunStatus,
@@ -94,7 +93,7 @@ def mock_workflow_run():
    workflow_run.app_id = "test-app-id"
    workflow_run.workflow_id = "test-workflow-id"
    workflow_run.status = WorkflowRunStatus.RUNNING
    workflow_run.created_by_role = CreatedByRole.ACCOUNT
    workflow_run.created_by_role = CreatorUserRole.ACCOUNT
    workflow_run.created_by = "test-user-id"
    workflow_run.created_at = datetime.now(UTC).replace(tzinfo=None)
    workflow_run.inputs_dict = {"query": "test query"}
@@ -107,7 +106,6 @@ def test_init(
):
    """Test initialization of WorkflowCycleManager"""
    assert workflow_cycle_manager._workflow_run is None
    assert workflow_cycle_manager._workflow_node_executions == {}
    assert workflow_cycle_manager._application_generate_entity == mock_app_generate_entity
    assert workflow_cycle_manager._workflow_system_variables == mock_workflow_system_variables
    assert workflow_cycle_manager._workflow_node_execution_repository == mock_node_execution_repository
@@ -123,7 +121,7 @@ def test_handle_workflow_run_start(workflow_cycle_manager, mock_session, mock_wo
        session=mock_session,
        workflow_id="test-workflow-id",
        user_id="test-user-id",
        created_by_role=CreatedByRole.ACCOUNT,
        created_by_role=CreatorUserRole.ACCOUNT,
    )

    # Verify the result
@@ -132,7 +130,7 @@ def test_handle_workflow_run_start(workflow_cycle_manager, mock_session, mock_wo
    assert workflow_run.workflow_id == mock_workflow.id
    assert workflow_run.sequence_number == 6  # max_sequence + 1
    assert workflow_run.status == WorkflowRunStatus.RUNNING
    assert workflow_run.created_by_role == CreatedByRole.ACCOUNT
    assert workflow_run.created_by_role == CreatorUserRole.ACCOUNT
    assert workflow_run.created_by == "test-user-id"

    # Verify session.add was called
@@ -215,24 +213,23 @@ def test_handle_node_execution_start(workflow_cycle_manager, mock_workflow_run):
    )

    # Verify the result
    assert result.tenant_id == mock_workflow_run.tenant_id
    assert result.app_id == mock_workflow_run.app_id
    # NodeExecution doesn't have tenant_id attribute, it's handled at repository level
    # assert result.tenant_id == mock_workflow_run.tenant_id
    # assert result.app_id == mock_workflow_run.app_id
    assert result.workflow_id == mock_workflow_run.workflow_id
    assert result.workflow_run_id == mock_workflow_run.id
    assert result.node_execution_id == event.node_execution_id
    assert result.node_id == event.node_id
    assert result.node_type == event.node_type.value
    assert result.node_type == event.node_type
    assert result.title == event.node_data.title
    assert result.status == WorkflowNodeExecutionStatus.RUNNING.value
    assert result.created_by_role == mock_workflow_run.created_by_role
    assert result.created_by == mock_workflow_run.created_by
    # NodeExecution doesn't have created_by_role and created_by attributes, they're handled at repository level
    # assert result.created_by_role == mock_workflow_run.created_by_role
    # assert result.created_by == mock_workflow_run.created_by

    # Verify save was called
    workflow_cycle_manager._workflow_node_execution_repository.save.assert_called_once_with(result)

    # Verify the node execution was added to the cache
    assert workflow_cycle_manager._workflow_node_executions[event.node_execution_id] == result


def test_get_workflow_run(workflow_cycle_manager, mock_session, mock_workflow_run):
    """Test _get_workflow_run method"""
@@ -261,12 +258,13 @@ def test_handle_workflow_node_execution_success(workflow_cycle_manager):
    event.execution_metadata = {"metadata": "test metadata"}
    event.start_at = datetime.now(UTC).replace(tzinfo=None)

    # Create a mock workflow node execution
    node_execution = MagicMock(spec=WorkflowNodeExecution)
    # Create a mock node execution
    node_execution = MagicMock()
    node_execution.node_execution_id = "test-node-execution-id"

    # Mock _get_workflow_node_execution to return the mock node execution
    with patch.object(workflow_cycle_manager, "_get_workflow_node_execution", return_value=node_execution):
    # Mock the repository to return the node execution
    workflow_cycle_manager._workflow_node_execution_repository.get_by_node_execution_id.return_value = node_execution

    # Call the method
    result = workflow_cycle_manager._handle_workflow_node_execution_success(
        event=event,
@@ -275,14 +273,9 @@ def test_handle_workflow_node_execution_success(workflow_cycle_manager):
    # Verify the result
    assert result == node_execution
    assert result.status == WorkflowNodeExecutionStatus.SUCCEEDED.value
    assert result.inputs == json.dumps(event.inputs)
    assert result.process_data == json.dumps(event.process_data)
    assert result.outputs == json.dumps(event.outputs)
    assert result.finished_at is not None
    assert result.elapsed_time is not None

    # Verify update was called
    workflow_cycle_manager._workflow_node_execution_repository.update.assert_called_once_with(node_execution)
    # Verify save was called
    workflow_cycle_manager._workflow_node_execution_repository.save.assert_called_once_with(node_execution)


def test_handle_workflow_run_partial_success(workflow_cycle_manager, mock_session, mock_workflow_run):
@@ -322,12 +315,13 @@ def test_handle_workflow_node_execution_failed(workflow_cycle_manager):
    event.start_at = datetime.now(UTC).replace(tzinfo=None)
    event.error = "Test error message"

    # Create a mock workflow node execution
    node_execution = MagicMock(spec=WorkflowNodeExecution)
    # Create a mock node execution
    node_execution = MagicMock()
    node_execution.node_execution_id = "test-node-execution-id"

    # Mock _get_workflow_node_execution to return the mock node execution
    with patch.object(workflow_cycle_manager, "_get_workflow_node_execution", return_value=node_execution):
    # Mock the repository to return the node execution
    workflow_cycle_manager._workflow_node_execution_repository.get_by_node_execution_id.return_value = node_execution

    # Call the method
    result = workflow_cycle_manager._handle_workflow_node_execution_failed(
        event=event,
@@ -337,12 +331,6 @@ def test_handle_workflow_node_execution_failed(workflow_cycle_manager):
    assert result == node_execution
    assert result.status == WorkflowNodeExecutionStatus.FAILED.value
    assert result.error == "Test error message"
    assert result.inputs == json.dumps(event.inputs)
    assert result.process_data == json.dumps(event.process_data)
    assert result.outputs == json.dumps(event.outputs)
    assert result.finished_at is not None
    assert result.elapsed_time is not None
    assert result.execution_metadata == json.dumps(event.execution_metadata)

    # Verify update was called
    workflow_cycle_manager._workflow_node_execution_repository.update.assert_called_once_with(node_execution)
    # Verify save was called
    workflow_cycle_manager._workflow_node_execution_repository.save.assert_called_once_with(node_execution)

265
api/tests/unit_tests/extensions/test_ext_request_logging.py
Normal file
265
api/tests/unit_tests/extensions/test_ext_request_logging.py
Normal file
@ -0,0 +1,265 @@
|
||||
import json
|
||||
import logging
|
||||
from unittest import mock
|
||||
|
||||
import pytest
|
||||
from flask import Flask, Response
|
||||
|
||||
from configs import dify_config
|
||||
from extensions import ext_request_logging
|
||||
from extensions.ext_request_logging import _is_content_type_json, _log_request_finished, init_app
|
||||
|
||||
|
||||
def test_is_content_type_json():
|
||||
"""
|
||||
Test the _is_content_type_json function.
|
||||
"""
|
||||
|
||||
assert _is_content_type_json("application/json") is True
|
||||
# content type header with charset option.
|
||||
assert _is_content_type_json("application/json; charset=utf-8") is True
|
||||
# content type header with charset option, in uppercase.
|
||||
assert _is_content_type_json("APPLICATION/JSON; CHARSET=UTF-8") is True
|
||||
assert _is_content_type_json("text/html") is False
|
||||
assert _is_content_type_json("") is False
|
||||
|
||||
|
||||
_KEY_NEEDLE = "needle"
|
||||
_VALUE_NEEDLE = _KEY_NEEDLE[::-1]
|
||||
_RESPONSE_NEEDLE = "response"
|
||||
|
||||
|
||||
def _get_test_app():
|
||||
app = Flask(__name__)
|
||||
|
||||
@app.route("/", methods=["GET", "POST"])
|
||||
def handler():
|
||||
return _RESPONSE_NEEDLE
|
||||
|
||||
return app
|
||||
|
||||
|
||||
# NOTE(QuantumGhost): Due to the design of Flask, we need to use monkey patch to write tests.
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def mock_request_receiver(monkeypatch) -> mock.Mock:
|
||||
mock_log_request_started = mock.Mock()
|
||||
monkeypatch.setattr(ext_request_logging, "_log_request_started", mock_log_request_started)
|
||||
return mock_log_request_started
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def mock_response_receiver(monkeypatch) -> mock.Mock:
|
||||
mock_log_request_finished = mock.Mock()
|
||||
monkeypatch.setattr(ext_request_logging, "_log_request_finished", mock_log_request_finished)
|
||||
return mock_log_request_finished
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def mock_logger(monkeypatch) -> logging.Logger:
|
||||
_logger = mock.MagicMock(spec=logging.Logger)
|
||||
monkeypatch.setattr(ext_request_logging, "_logger", _logger)
|
||||
return _logger
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def enable_request_logging(monkeypatch):
|
||||
monkeypatch.setattr(dify_config, "ENABLE_REQUEST_LOGGING", True)
|
||||
|
||||
|
||||
class TestRequestLoggingExtension:
|
||||
def test_receiver_should_not_be_invoked_if_configuration_is_disabled(
|
||||
self,
|
||||
monkeypatch,
|
||||
mock_request_receiver,
|
||||
mock_response_receiver,
|
||||
):
|
||||
monkeypatch.setattr(dify_config, "ENABLE_REQUEST_LOGGING", False)
|
||||
|
||||
app = _get_test_app()
|
||||
init_app(app)
|
||||
|
||||
with app.test_client() as client:
|
||||
client.get("/")
|
||||
|
||||
mock_request_receiver.assert_not_called()
|
||||
mock_response_receiver.assert_not_called()
|
||||
|
||||
def test_receiver_should_be_called_if_enabled(
|
||||
self,
|
||||
enable_request_logging,
|
||||
mock_request_receiver,
|
||||
mock_response_receiver,
|
||||
):
|
||||
"""
|
||||
Test the request logging extension with JSON data.
|
||||
"""
|
||||
|
||||
app = _get_test_app()
|
||||
init_app(app)
|
||||
|
||||
with app.test_client() as client:
|
||||
client.post("/", json={_KEY_NEEDLE: _VALUE_NEEDLE})
|
||||
|
||||
mock_request_receiver.assert_called_once()
|
||||
mock_response_receiver.assert_called_once()
|
||||
|
||||
|
||||
class TestLoggingLevel:
|
||||
@pytest.mark.usefixtures("enable_request_logging")
|
||||
def test_logging_should_be_skipped_if_level_is_above_debug(self, enable_request_logging, mock_logger):
|
||||
mock_logger.isEnabledFor.return_value = False
|
||||
app = _get_test_app()
|
||||
init_app(app)
|
||||
|
||||
with app.test_client() as client:
|
||||
client.post("/", json={_KEY_NEEDLE: _VALUE_NEEDLE})
|
||||
mock_logger.debug.assert_not_called()
|
||||
|
||||
|
||||
class TestRequestReceiverLogging:
|
||||
@pytest.mark.usefixtures("enable_request_logging")
|
||||
def test_non_json_request(self, enable_request_logging, mock_logger, mock_response_receiver):
|
||||
mock_logger.isEnabledFor.return_value = True
|
||||
app = _get_test_app()
|
||||
init_app(app)
|
||||
|
||||
with app.test_client() as client:
|
||||
client.post("/", data="plain text")
|
||||
assert mock_logger.debug.call_count == 1
|
||||
call_args = mock_logger.debug.call_args[0]
|
||||
assert "Received Request" in call_args[0]
|
||||
assert call_args[1] == "POST"
|
||||
assert call_args[2] == "/"
|
||||
assert "Request Body" not in call_args[0]
|
||||
|
||||
@pytest.mark.usefixtures("enable_request_logging")
|
||||
def test_json_request(self, enable_request_logging, mock_logger, mock_response_receiver):
|
||||
mock_logger.isEnabledFor.return_value = True
|
||||
app = _get_test_app()
|
||||
init_app(app)
|
||||
|
||||
with app.test_client() as client:
|
||||
client.post("/", json={_KEY_NEEDLE: _VALUE_NEEDLE})
|
||||
assert mock_logger.debug.call_count == 1
|
||||
call_args = mock_logger.debug.call_args[0]
|
||||
assert "Received Request" in call_args[0]
|
||||
assert "Request Body" in call_args[0]
|
||||
assert call_args[1] == "POST"
|
||||
assert call_args[2] == "/"
|
||||
assert _KEY_NEEDLE in call_args[3]
|
||||
|
||||
@pytest.mark.usefixtures("enable_request_logging")
|
||||
def test_json_request_with_empty_body(self, enable_request_logging, mock_logger, mock_response_receiver):
|
||||
mock_logger.isEnabledFor.return_value = True
|
||||
app = _get_test_app()
|
||||
init_app(app)
|
||||
|
||||
with app.test_client() as client:
|
||||
client.post("/", headers={"Content-Type": "application/json"})
|
||||
|
||||
assert mock_logger.debug.call_count == 1
|
||||
call_args = mock_logger.debug.call_args[0]
|
||||
assert "Received Request" in call_args[0]
|
||||
assert "Request Body" not in call_args[0]
|
||||
assert call_args[1] == "POST"
|
||||
assert call_args[2] == "/"
|
||||
|
||||
@pytest.mark.usefixtures("enable_request_logging")
|
||||
def test_json_request_with_invalid_json_as_body(self, enable_request_logging, mock_logger, mock_response_receiver):
|
||||
mock_logger.isEnabledFor.return_value = True
|
||||
app = _get_test_app()
|
||||
init_app(app)
|
||||
|
||||
with app.test_client() as client:
|
||||
client.post(
|
||||
"/",
|
||||
headers={"Content-Type": "application/json"},
|
||||
data="{",
|
||||
)
|
||||
assert mock_logger.debug.call_count == 0
|
||||
assert mock_logger.exception.call_count == 1
|
||||
|
||||
exception_call_args = mock_logger.exception.call_args[0]
|
||||
assert exception_call_args[0] == "Failed to parse JSON request"
|
||||
|
||||
|
||||
class TestResponseReceiverLogging:
    @pytest.mark.usefixtures("enable_request_logging")
    def test_non_json_response(self, enable_request_logging, mock_logger):
        mock_logger.isEnabledFor.return_value = True
        app = _get_test_app()
        response = Response(
            "OK",
            headers={"Content-Type": "text/plain"},
        )
        _log_request_finished(app, response)
        assert mock_logger.debug.call_count == 1
        call_args = mock_logger.debug.call_args[0]
        assert "Response" in call_args[0]
        assert "200" in call_args[1]
        assert call_args[2] == "text/plain"
        assert "Response Body" not in call_args[0]

    @pytest.mark.usefixtures("enable_request_logging")
    def test_json_response(self, enable_request_logging, mock_logger, mock_response_receiver):
        mock_logger.isEnabledFor.return_value = True
        app = _get_test_app()
        response = Response(
            json.dumps({_KEY_NEEDLE: _VALUE_NEEDLE}),
            headers={"Content-Type": "application/json"},
        )
        _log_request_finished(app, response)
        assert mock_logger.debug.call_count == 1
        call_args = mock_logger.debug.call_args[0]
        assert "Response" in call_args[0]
        assert "Response Body" in call_args[0]
        assert "200" in call_args[1]
        assert call_args[2] == "application/json"
        assert _KEY_NEEDLE in call_args[3]

    @pytest.mark.usefixtures("enable_request_logging")
    def test_json_response_with_invalid_json_as_body(self, enable_request_logging, mock_logger, mock_response_receiver):
        mock_logger.isEnabledFor.return_value = True
        app = _get_test_app()

        response = Response(
            "{",
            headers={"Content-Type": "application/json"},
        )
        _log_request_finished(app, response)
        assert mock_logger.debug.call_count == 0
        assert mock_logger.exception.call_count == 1

        exception_call_args = mock_logger.exception.call_args[0]
        assert exception_call_args[0] == "Failed to parse JSON response"


class TestResponseUnmodified:
    def test_when_request_logging_disabled(self):
        app = _get_test_app()
        init_app(app)

        with app.test_client() as client:
            response = client.post(
                "/",
                headers={"Content-Type": "application/json"},
                data="{",
            )
        assert response.text == _RESPONSE_NEEDLE
        assert response.status_code == 200

    @pytest.mark.usefixtures("enable_request_logging")
    def test_when_request_logging_enabled(self, enable_request_logging):
        app = _get_test_app()
        init_app(app)

        with app.test_client() as client:
            response = client.post(
                "/",
                headers={"Content-Type": "application/json"},
                data="{",
            )
        assert response.text == _RESPONSE_NEEDLE
        assert response.status_code == 200
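
The tests above pin down the observable behavior of the request-logging helper: debug lines that start with "Received Request" / "Response", positional arguments for method, path, content type, and (when it parses) the JSON body, plus logger.exception on malformed JSON. Below is a minimal sketch of hooks consistent with those assertions; init_app, _log_request_finished, and the message strings come from the tests themselves, while everything else (module layout, the gating handled by the enable_request_logging fixture) is assumed rather than taken from this diff.

import json
import logging

from flask import Flask, Response, request

logger = logging.getLogger(__name__)


def _log_request_started():
    # Skip all work unless DEBUG logging is on (the tests stub isEnabledFor).
    if not logger.isEnabledFor(logging.DEBUG):
        return
    if not request.is_json or not request.get_data(as_text=True):
        # Non-JSON or empty-bodied requests are logged without a body.
        logger.debug("Received Request: %s %s", request.method, request.path)
        return
    try:
        body = request.get_json()
    except Exception:
        logger.exception("Failed to parse JSON request")
        return
    logger.debug(
        "Received Request: %s %s, Request Body: %s",
        request.method,
        request.path,
        json.dumps(body),
    )


def _log_request_finished(app: Flask, response: Response) -> Response:
    if not logger.isEnabledFor(logging.DEBUG) or response is None:
        return response
    if response.mimetype != "application/json":
        logger.debug("Response: %s, Content-Type: %s", response.status, response.mimetype)
        return response
    try:
        body = json.loads(response.get_data(as_text=True))
    except Exception:
        logger.exception("Failed to parse JSON response")
        return response
    logger.debug(
        "Response: %s, Content-Type: %s, Response Body: %s",
        response.status,
        response.mimetype,
        json.dumps(body),
    )
    return response


def init_app(app: Flask) -> None:
    # Assumed wiring: register the hooks only when request logging is
    # enabled (toggled by the enable_request_logging fixture in the tests).
    app.before_request(_log_request_started)
    app.after_request(lambda resp: _log_request_finished(app, resp))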
187
api/tests/unit_tests/models/test_types_enum_text.py
Normal file
@ -0,0 +1,187 @@
from collections.abc import Callable, Iterable
from enum import StrEnum
from typing import Any, NamedTuple, TypeVar

import pytest
import sqlalchemy as sa
from sqlalchemy import exc as sa_exc
from sqlalchemy import insert
from sqlalchemy.orm import DeclarativeBase, Mapped, Session
from sqlalchemy.sql.sqltypes import VARCHAR

from models.types import EnumText

_user_type_admin = "admin"
_user_type_normal = "normal"


class _Base(DeclarativeBase):
    pass


class _UserType(StrEnum):
    admin = _user_type_admin
    normal = _user_type_normal


class _EnumWithLongValue(StrEnum):
    unknown = "unknown"
    a_really_long_enum_values = "a_really_long_enum_values"


class _User(_Base):
    __tablename__ = "users"

    id: Mapped[int] = sa.Column(sa.Integer, primary_key=True)
    name: Mapped[str] = sa.Column(sa.String(length=255), nullable=False)
    user_type: Mapped[_UserType] = sa.Column(EnumText(enum_class=_UserType), nullable=False, default=_UserType.normal)
    user_type_nullable: Mapped[_UserType | None] = sa.Column(EnumText(enum_class=_UserType), nullable=True)


class _ColumnTest(_Base):
    __tablename__ = "column_test"

    id: Mapped[int] = sa.Column(sa.Integer, primary_key=True)

    user_type: Mapped[_UserType] = sa.Column(EnumText(enum_class=_UserType), nullable=False, default=_UserType.normal)
    explicit_length: Mapped[_UserType | None] = sa.Column(
        EnumText(_UserType, length=50), nullable=True, default=_UserType.normal
    )
    long_value: Mapped[_EnumWithLongValue] = sa.Column(EnumText(enum_class=_EnumWithLongValue), nullable=False)


_T = TypeVar("_T")


def _first(it: Iterable[_T]) -> _T:
    ls = list(it)
    if not ls:
        raise ValueError("List is empty")
    return ls[0]


class TestEnumText:
    def test_column_impl(self):
        engine = sa.create_engine("sqlite://", echo=False)
        _Base.metadata.create_all(engine)

        inspector = sa.inspect(engine)
        columns = inspector.get_columns(_ColumnTest.__tablename__)

        user_type_column = _first(c for c in columns if c["name"] == "user_type")
        sql_type = user_type_column["type"]
        assert isinstance(sql_type, VARCHAR)
        assert sql_type.length == 20
        assert user_type_column["nullable"] is False

        explicit_length_column = _first(c for c in columns if c["name"] == "explicit_length")
        sql_type = explicit_length_column["type"]
        assert isinstance(sql_type, VARCHAR)
        assert sql_type.length == 50
        assert explicit_length_column["nullable"] is True

        long_value_column = _first(c for c in columns if c["name"] == "long_value")
        sql_type = long_value_column["type"]
        assert isinstance(sql_type, VARCHAR)
        assert sql_type.length == len(_EnumWithLongValue.a_really_long_enum_values)

    def test_insert_and_select(self):
        engine = sa.create_engine("sqlite://", echo=False)
        _Base.metadata.create_all(engine)

        with Session(engine) as session:
            admin_user = _User(
                name="admin",
                user_type=_UserType.admin,
                user_type_nullable=None,
            )
            session.add(admin_user)
            session.flush()
            admin_user_id = admin_user.id

            normal_user = _User(
                name="normal",
                user_type=_UserType.normal.value,
                user_type_nullable=_UserType.normal.value,
            )
            session.add(normal_user)
            session.flush()
            normal_user_id = normal_user.id
            session.commit()

        with Session(engine) as session:
            user = session.query(_User).filter(_User.id == admin_user_id).first()
            assert user.user_type == _UserType.admin
            assert user.user_type_nullable is None

        with Session(engine) as session:
            user = session.query(_User).filter(_User.id == normal_user_id).first()
            assert user.user_type == _UserType.normal
            assert user.user_type_nullable == _UserType.normal

    def test_insert_invalid_values(self):
        def _session_insert_with_value(sess: Session, user_type: Any):
            user = _User(name="test_user", user_type=user_type)
            sess.add(user)
            sess.flush()

        def _insert_with_user(sess: Session, user_type: Any):
            stmt = insert(_User).values(
                {
                    "name": "test_user",
                    "user_type": user_type,
                }
            )
            sess.execute(stmt)

        class TestCase(NamedTuple):
            name: str
            action: Callable[[Session], None]
            exc_type: type[Exception]

        engine = sa.create_engine("sqlite://", echo=False)
        _Base.metadata.create_all(engine)
        cases = [
            TestCase(
                name="session insert with invalid value",
                action=lambda s: _session_insert_with_value(s, "invalid"),
                exc_type=ValueError,
            ),
            TestCase(
                name="session insert with invalid type",
                action=lambda s: _session_insert_with_value(s, 1),
                exc_type=TypeError,
            ),
            TestCase(
                name="insert with invalid value",
                action=lambda s: _insert_with_user(s, "invalid"),
                exc_type=ValueError,
            ),
            TestCase(
                name="insert with invalid type",
                action=lambda s: _insert_with_user(s, 1),
                exc_type=TypeError,
            ),
        ]
        for idx, c in enumerate(cases, 1):
            with pytest.raises(sa_exc.StatementError) as exc:
                with Session(engine) as session:
                    c.action(session)

            assert isinstance(exc.value.orig, c.exc_type), f"test case {idx} failed, name={c.name}"
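        # Note (general SQLAlchemy behavior, not specific to this diff): an
        # exception raised while processing bind parameters is wrapped in
        # sqlalchemy.exc.StatementError with the original exception preserved
        # on .orig, which is why the loop above asserts against
        # exc.value.orig rather than against the wrapper itself.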

    def test_select_invalid_values(self):
        engine = sa.create_engine("sqlite://", echo=False)
        _Base.metadata.create_all(engine)

        insertion_sql = """
        INSERT INTO users (id, name, user_type) VALUES
        (1, 'invalid_value', 'invalid');
        """
        with Session(engine) as session:
            session.execute(sa.text(insertion_sql))
            session.commit()

        with pytest.raises(ValueError) as exc:
            with Session(engine) as session:
                _user = session.query(_User).filter(_User.id == 1).first()
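
Taken together, these tests document EnumText's contract: the column is a VARCHAR at least 20 characters wide and always wide enough for the longest enum value (an explicit length may widen it further), enum members and their string values are both accepted on write, invalid strings raise ValueError, non-string values raise TypeError, and unknown stored values raise ValueError when read back. A minimal sketch of a TypeDecorator with that contract follows; it is inferred from the assertions, not copied from models.types.EnumText, which may differ in detail.

from enum import StrEnum

import sqlalchemy as sa


class EnumTextSketch(sa.types.TypeDecorator):
    """Hypothetical EnumText-style type: stores a StrEnum as validated text."""

    impl = sa.String
    cache_ok = True

    def __init__(self, enum_class: type[StrEnum], length: int | None = None):
        self._enum_class = enum_class
        longest = max(len(member.value) for member in enum_class)
        # Column width: at least 20, never narrower than the longest value,
        # and an explicit length can only widen it (matches test_column_impl).
        super().__init__(length=max(length or 0, longest, 20))

    def process_bind_param(self, value, dialect):
        if value is None:
            return None
        if isinstance(value, self._enum_class):
            return value.value
        if isinstance(value, str):
            # Raises ValueError for strings outside the enum, surfaced by
            # SQLAlchemy as StatementError with .orig set (see the tests).
            return self._enum_class(value).value
        raise TypeError(f"expected str or {self._enum_class}, got {type(value)}")

    def process_result_value(self, value, dialect):
        if value is None:
            return None
        # Raises ValueError when the stored text is not a valid member.
        return self._enum_class(value)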
Some files were not shown because too many files have changed in this diff.