Mirror of https://git.mirrors.martin98.com/https://github.com/infiniflow/ragflow.git
Synced 2025-08-12 05:48:58 +08:00
Fixed a broken link (#2190)
To fix a broken link.

### Type of change

- [x] Documentation Update
This commit is contained in:
parent: c04686d426
commit: 922f79e757
```diff
@@ -357,7 +357,7 @@ This exception occurs when starting up the RAGFlow server. Try the following:
 1. Right click the desired dialog to display the **Chat Configuration** window.
 2. Switch to the **Model Setting** tab and adjust the **Max Tokens** slider to get the desired length.
 3. Click **OK** to confirm your change.

 ### 2. What does Empty response mean? How to set it?
@@ -370,7 +370,7 @@ You limit what the system responds to what you specify in **Empty response** if

 ### 4. How to run RAGFlow with a locally deployed LLM?

-You can use Ollama to deploy local LLM. See [here](https://github.com/infiniflow/ragflow/blob/main/docs/guides/deploy_local_llm.md) for more information.
+You can use Ollama to deploy local LLM. See [here](../guides/deploy_local_llm.mdx) for more information.

 ### 5. How to link up ragflow and ollama servers?
```
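The corrected link points to RAGFlow's guide on deploying a local LLM with Ollama. As a rough sketch of what that guide covers (the model name `llama3` is illustrative, and Ollama must already be installed), a minimal local deployment looks like:

```shell
# Pull a model into the local Ollama store (model name is illustrative).
ollama pull llama3

# Start the Ollama API server; it listens on http://localhost:11434 by default.
ollama serve &

# Sanity check: list the models the server can serve.
curl http://localhost:11434/api/tags
```

RAGFlow's model-provider settings can then be pointed at the Ollama base URL (`http://<your-ollama-host>:11434`); see the linked guide for the exact configuration steps.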