Fix test cases (#3718)

### What problem does this PR solve?

Fix test cases

### Type of change

- [x] Other (please describe): Fix failing test cases

---------

Signed-off-by: jinhai <haijin.chn@gmail.com>
Jin Hai 2024-11-28 17:37:46 +08:00 committed by GitHub
parent 964a6f4ec4
commit cdae8d28fe
8 changed files with 20 additions and 19 deletions


@@ -333,8 +333,7 @@ docker build -f Dockerfile -t infiniflow/ragflow:dev .
 cd web
 npm install --force
 ```
-7. Configure frontend to update `proxy.target` in **.umirc.ts** to `http://127.0.0.1:9380`:
-8. Launch frontend service:
+7. Launch frontend service:
 ```bash
 npm run dev
 ```


@@ -308,8 +308,7 @@ docker build -f Dockerfile -t infiniflow/ragflow:dev .
 cd web
 npm install --force
 ```
-7. Configure the frontend by updating `proxy.target` in **.umirc.ts** to `http://127.0.0.1:9380`:
-8. Launch the frontend application:
+7. Launch the frontend application:
 ```bash
 npm run dev
 ```


@@ -289,8 +289,7 @@ docker build -f Dockerfile -t infiniflow/ragflow:dev .
 cd web
 npm install --force
 ```
-7. Configure the frontend by updating `proxy.target` in **.umirc.ts** to `http://127.0.0.1:9380`:
-8. Launch the frontend service:
+7. Launch the frontend service:
 ```bash
 npm run dev
 ```


@@ -291,8 +291,7 @@ docker build -f Dockerfile -t infiniflow/ragflow:dev .
 cd web
 npm install --force
 ```
-7. Update `proxy.target` in **.umirc.ts** to `http://127.0.0.1:9380`:
-8. Start the frontend service:
+7. Start the frontend service:
 ```bash
 npm run dev
 ```


@@ -296,8 +296,7 @@ docker build -f Dockerfile -t infiniflow/ragflow:dev .
 cd web
 npm install --force
 ```
-7. Configure the frontend by updating `proxy.target` in **.umirc.ts** to `http://127.0.0.1:9380`:
-8. Launch the frontend service:
+7. Launch the frontend service:
 ```bash
 npm run dev
 ```


@@ -40,7 +40,10 @@ def login():
 @pytest.fixture(scope="session")
 def get_api_key_fixture():
-    register()
+    try:
+        register()
+    except Exception as e:
+        print(e)
     auth = login()
     url = HOST_ADDRESS + "/v1/system/new_token"
     auth = {"Authorization": auth}


@@ -14,8 +14,8 @@ def test_dataset(get_auth):
     dataset_list = []
     while True:
         res = list_dataset(get_auth, page_number)
-        data = res.get("data")
-        for item in data.get("kbs"):
+        data = res.get("data").get("kbs")
+        for item in data:
             dataset_id = item.get("id")
             dataset_list.append(dataset_id)
         if len(dataset_list) < page_number * 150:
@@ -43,8 +43,8 @@ def test_dataset_1k_dataset(get_auth):
     dataset_list = []
     while True:
         res = list_dataset(get_auth, page_number)
-        data = res.get("data")
-        for item in data.get("kbs"):
+        data = res.get("data").get("kbs")
+        for item in data:
             dataset_id = item.get("id")
             dataset_list.append(dataset_id)
         if len(dataset_list) < page_number * 150:
@@ -66,7 +66,7 @@ def test_duplicated_name_dataset(get_auth):
     # list dataset
     res = list_dataset(get_auth, 1)
-    data = res.get("data")
+    data = res.get("data").get("kbs")
     dataset_list = []
     pattern = r'^test_create_dataset.*'
     for item in data:
@@ -109,7 +109,7 @@ def test_update_different_params_dataset(get_auth):
     dataset_list = []
     while True:
         res = list_dataset(get_auth, page_number)
-        data = res.get("data")
+        data = res.get("data").get("kbs")
         for item in data:
             dataset_id = item.get("id")
             dataset_list.append(dataset_id)
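All four hunks in this file make the same adjustment: read the `kbs` list once from the `data` object instead of calling `.get("kbs")` in the loop header. The sketch below shows the pagination pattern as the tests now use it; `collect_dataset_ids` is a hypothetical helper, and it assumes `list_dataset(auth, page_number)` returns a body shaped like `{"data": {"kbs": [...]}}` with up to 150 entries per page, as the `page_number * 150` check implies.

```python
def collect_dataset_ids(get_auth):
    """Hypothetical helper mirroring the pagination loop used by these tests."""
    dataset_list = []
    page_number = 1
    while True:
        res = list_dataset(get_auth, page_number)
        # The dataset list is nested under data.kbs in the response body.
        data = res.get("data").get("kbs")
        for item in data:
            dataset_list.append(item.get("id"))
        # Fewer collected ids than page_number * 150 means the last page
        # has been reached (pages are assumed to hold up to 150 datasets).
        if len(dataset_list) < page_number * 150:
            break
        page_number += 1
    return dataset_list
```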


@@ -191,3 +191,6 @@ def test_retrieve_chunks(get_api_key_fixture):
     doc = docs[0]
     doc.add_chunk(content="This is a chunk addition test")
     rag.retrieve(dataset_ids=[ds.id],document_ids=[doc.id])
+    rag.delete_datasets(ids=[ds.id])
+    # test different parameters for the retrieval
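The final hunk adds a `delete_datasets` call so the test removes the dataset it created. A hedged sketch of the same cleanup wrapped in `try`/`finally` follows, using only the SDK calls that appear in this diff (`add_chunk`, `retrieve`, `delete_datasets`); the setup of `rag`, `ds`, and `docs` above the hunk is assumed rather than shown.

```python
def test_retrieve_chunks_with_cleanup(get_api_key_fixture):
    # Hypothetical variant of the test above: the dataset is deleted even if
    # retrieval raises, so repeated runs do not leave datasets behind.
    # `rag`, `ds`, and `docs` are assumed to be created as in the original test.
    doc = docs[0]
    try:
        doc.add_chunk(content="This is a chunk addition test")
        rag.retrieve(dataset_ids=[ds.id], document_ids=[doc.id])
    finally:
        rag.delete_datasets(ids=[ds.id])
```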