
### What problem does this PR solve?

Make `<xxxx>` visible; it was misinterpreted as part of the HTML tags.

Issue link: None

### Type of change

- [ ] Bug Fix (non-breaking change which fixes an issue)
- [ ] New Feature (non-breaking change which adds functionality)
- [ ] Breaking Change (fix or feature that could cause existing functionality not to work as expected)
- [x] Documentation Update
- [ ] Refactoring
- [ ] Performance Improvement
- [ ] Test cases
- [ ] Python SDK impacted, Need to update PyPI
- [ ] Other (please describe):
## Ollama

One-click deployment of local LLMs with Ollama.

### Install

### Launch Ollama
Decide which LLM you want to deploy (here's a list of supported LLMs), say, mistral:

```shell
$ ollama run mistral
```

Or, if Ollama runs in Docker:

```shell
$ docker exec -it ollama ollama run mistral
```
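Before pointing RAGFlow at Ollama, it can help to confirm the server is actually answering. A minimal Python sketch, assuming Ollama's default port 11434 and its `GET /api/tags` endpoint (which lists locally pulled models); the helper name `ollama_reachable` is ours, not part of any SDK:

```python
import json
import urllib.error
import urllib.request


def ollama_reachable(base_url: str, timeout: float = 2.0) -> bool:
    """Return True if an Ollama server answers at base_url.

    GET /api/tags lists locally pulled models; any well-formed JSON
    response means the server is up and reachable.
    """
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=timeout) as resp:
            json.load(resp)  # expect a JSON body such as {"models": [...]}
            return True
    except (urllib.error.URLError, OSError, ValueError):
        # Connection refused, timeout, or a non-JSON reply: treat as down.
        return False


if __name__ == "__main__":
    print(ollama_reachable("http://localhost:11434"))
```

If this prints `False`, fix the Ollama deployment (or the host/port) before entering the base URL in RAGFlow.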
### Use Ollama in RAGFlow
- Go to 'Settings > Model Providers > Models to be added > Ollama'.
- Base URL: enter the base URL where the Ollama service is accessible, e.g., `http://<your-ollama-endpoint-domain>:11434`.
- Use Ollama models.
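A common stumbling block is pasting a malformed base URL (missing scheme, or a trailing API path) into the settings form. A small sanity-check sketch using Python's standard `urllib.parse`; the function name and the exact rules (http/https scheme, a host, no extra path) are our assumptions about what a usable base URL looks like:

```python
from urllib.parse import urlparse


def valid_ollama_base_url(url: str) -> bool:
    """Loosely validate a candidate base URL before entering it in RAGFlow:
    it should carry an http(s) scheme, a hostname, and no trailing API path."""
    parsed = urlparse(url)
    return (
        parsed.scheme in ("http", "https")  # scheme must be explicit
        and bool(parsed.hostname)           # a host (or IP) must be present
        and parsed.path in ("", "/")        # no /api/... suffix
    )


print(valid_ollama_base_url("http://localhost:11434"))       # True
print(valid_ollama_base_url("localhost:11434"))              # False: no scheme
print(valid_ollama_base_url("http://10.0.0.5:11434/api"))    # False: extra path
```

The port (11434 by default) stays part of the URL; RAGFlow appends the API paths itself.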