Update document (#3746)

### What problem does this PR solve?

Fix the description of the local LLM deployment case
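The corrected mapping from deployment layout to Ollama base URL can be sketched as a small shell helper. The function name and scenario labels below are illustrative only; RAGFlow itself just asks for the final URL in its model-provider settings:

```shell
#!/bin/sh
# Illustrative helper: pick the Ollama base URL for a given deployment layout.
# Scenario labels are made up for this sketch; they are not RAGFlow settings.
ollama_base_url() {
  case "$1" in
    same-machine)     echo "http://localhost:11434" ;;
    ollama-in-docker) echo "http://host.docker.internal:11434" ;;
    remote)           echo "http://$2:11434" ;;  # $2 = IP of the Ollama machine
  esac
}

ollama_base_url ollama-in-docker
# → http://host.docker.internal:11434
```

As a sanity check, `curl`ing the chosen base URL from wherever RAGFlow runs should return Ollama's "Ollama is running" banner if the server is reachable.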

### Type of change

- [x] Documentation Update

---------

Signed-off-by: jinhai <haijin.chn@gmail.com>
Co-authored-by: writinwaters <93570324+writinwaters@users.noreply.github.com>
Jin Hai 2024-11-29 14:50:45 +08:00 committed by GitHub
parent 06a21d2031
commit 0a62dd7a7e


@@ -74,9 +74,9 @@ In the popup window, complete basic settings for Ollama:
4. OPTIONAL: Switch on the toggle under **Does it support Vision?** if your model includes an image-to-text model.
:::caution NOTE
- If your Ollama and RAGFlow run on the same machine, use `http://localhost:11434` as base URL.
- If your Ollama and RAGFlow run on the same machine and Ollama is in Docker, use `http://host.docker.internal:11434` as base URL.
- If your Ollama runs on a different machine from RAGFlow, use `http://<IP_OF_OLLAMA_MACHINE>:11434` as base URL.
:::
:::danger WARNING