From 42690cb74e6882620c1095a4d049b2c4e3afba82 Mon Sep 17 00:00:00 2001
From: Andy Lee
Date: Sat, 9 Aug 2025 16:46:47 -0700
Subject: [PATCH] docs: remove ollama embedding extra instructions

---
 README.md | 21 ---------------------
 1 file changed, 21 deletions(-)

diff --git a/README.md b/README.md
index 7db3882..748b252 100755
--- a/README.md
+++ b/README.md
@@ -98,27 +98,6 @@ uv sync
 
 
 
-### 🆕 Using Ollama for Embeddings (Privacy-Focused)
-
-LEANN now supports Ollama for generating embeddings locally, perfect for privacy-sensitive applications:
-
-```bash
-# First, pull an embedding model from Ollama
-ollama pull nomic-embed-text # or mxbai-embed-large, bge-m3, etc.
-
-# Build an index using Ollama embeddings
-leann build my-project --docs ./documents --embedding-model nomic-embed-text --embedding-mode ollama
-
-# Use with example apps
-python -m apps.document_rag --embedding-model nomic-embed-text --embedding-mode ollama --query "Your question"
-```
-
-**Available Ollama Embedding Models:**
-- `nomic-embed-text`: High-performing 768-dim embeddings
-- `mxbai-embed-large`: Large 1024-dim embeddings
-- `bge-m3`: Multilingual embeddings
-- See [Ollama library](https://ollama.com/library) for more embedding models
-
 ## Quick Start
 
 Our declarative API makes RAG as easy as writing a config file.