# Ollama

One-click deployment of local LLMs with [Ollama](https://github.com/ollama/ollama).

## Install
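
Follow the installation instructions in the [Ollama repository](https://github.com/ollama/ollama), or run Ollama as a Docker container. A minimal sketch of the Docker route, assuming the official `ollama/ollama` image and Ollama's default port 11434:

```bash
$ docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
```

The container name `ollama` is the one the `docker exec` command below refers to.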

## Launch Ollama

Decide which LLM you want to deploy ([here is a list of supported models](https://ollama.com/library)), say, **mistral**:

```bash
$ ollama run mistral
```

Or, if Ollama is running in a Docker container named `ollama`:

```bash
$ docker exec -it ollama ollama run mistral
```
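
To confirm the model is up and serving requests, you can query Ollama's HTTP API directly. A minimal sanity check, assuming Ollama listens on its default port 11434 on localhost:

```bash
$ curl http://localhost:11434/api/generate -d '{
  "model": "mistral",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```

A JSON reply containing a `response` field means the model is ready.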

## Use Ollama in RAGFlow

- Go to 'Settings > Model Providers > Models to be added > Ollama'.

  Base URL: Enter the base URL where the Ollama service is accessible, e.g., `http://<your-ollama-host>:11434` (11434 is Ollama's default port; replace `<your-ollama-host>` with the address of the machine running Ollama). A quick reachability check is sketched after this list.

- Use Ollama models.
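
Note that if RAGFlow itself runs in Docker, `localhost` inside the RAGFlow container refers to the container, not your machine, so the base URL must use an address the container can reach (the host's IP, or `host.docker.internal` where your Docker setup supports it). A quick reachability check, with `<your-ollama-host>` as a placeholder for that address:

```bash
$ curl http://<your-ollama-host>:11434/api/tags
```

If the base URL is correct, Ollama answers with a JSON list of the locally installed models.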