diff --git a/README.md b/README.md
index 6bdebfc..91b55c6 100755
--- a/README.md
+++ b/README.md
@@ -33,17 +33,26 @@ LEANN achieves this through *graph-based selective recomputation* with *high-deg
🪶 **Lightweight:** Graph-based recomputation eliminates heavy embedding storage, while smart graph pruning and CSR format minimize graph storage overhead. Always less storage, less memory usage!
+📦 **Portable:** Transfer your entire knowledge base between devices (or even share it with others) at minimal cost - your personal AI memory travels with you.
+
🚀 **Scalability:** Handle messy personal data that would crash traditional vector DBs, easily managing your growing personalized data and agent-generated memory!
✨ **No Accuracy Loss:** Maintain the same search quality as heavyweight solutions while using 97% less storage.
## Installation
-> **Prerequisites:** Install uv first if you don't have it:
-> ```bash
-> curl -LsSf https://astral.sh/uv/install.sh | sh
-> ```
-> 📖 [Detailed uv installation methods →](https://docs.astral.sh/uv/getting-started/installation/#installation-methods)
+
+📦 **Prerequisites:** Install uv first if you don't have it:
+
+```bash
+curl -LsSf https://astral.sh/uv/install.sh | sh
+```
+
+📖 [Detailed uv installation methods →](https://docs.astral.sh/uv/getting-started/installation/#installation-methods)
+
+
LEANN provides two installation methods: **pip install** (quick and easy) and **build from source** (recommended for development).
@@ -58,6 +67,7 @@ Clone the repository to access all examples and install LEANN from [PyPI](https:
git clone git@github.com:yichuan-w/LEANN.git leann
cd leann
uv venv
+source .venv/bin/activate
uv pip install leann
```
@@ -82,10 +92,55 @@ uv sync
```
-
-
-## Quick Start in 30s
-
-Our declarative API makes RAG as easy as writing a config file.
-[Try in this ipynb file →](demo.ipynb) [](https://colab.research.google.com/github/yichuan-w/LEANN/blob/main/demo.ipynb)
-
-```python
-from leann.api import LeannBuilder, LeannSearcher, LeannChat
-
-# 1. Build the index (no embeddings stored!)
-builder = LeannBuilder(backend_name="hnsw")
-builder.add_text("C# is a powerful programming language")
-builder.add_text("Python is a powerful programming language and it is very popular")
-builder.add_text("Machine learning transforms industries")
-builder.add_text("Neural networks process complex data")
-builder.add_text("Leann is a great storage saving engine for RAG on your MacBook")
-builder.build_index("knowledge.leann")
-
-# 2. Search with real-time embeddings
-searcher = LeannSearcher("knowledge.leann")
-results = searcher.search("programming languages", top_k=2)
-
-# 3. Chat with LEANN using retrieved results
-llm_config = {
- "type": "ollama",
- "model": "llama3.2:1b"
-}
-
-chat = LeannChat(index_path="knowledge.leann", llm_config=llm_config)
-response = chat.ask(
- "Compare the two retrieved programming languages and say which one is more popular today.",
- top_k=2,
-)
```
-## RAG on Everything!
-
-LEANN supports RAG on various data sources including documents (.pdf, .txt, .md), Apple Mail, Google Search History, WeChat, and more.
+
### 📄 Personal Data Manager: Process Any Documents (.pdf, .txt, .md)!
@@ -156,11 +176,6 @@ Ask questions directly about your personal PDFs, documents, and any directory co
The example below asks a question about summarizing two papers (uses default data in `examples/data`):
-```bash
-# Drop your PDFs, .txt, .md files into examples/data/
-uv run ./examples/main_cli_example.py
-```
-
```
# Or use python directly
source .venv/bin/activate
@@ -171,6 +186,7 @@ python ./examples/main_cli_example.py
### 📧 Your Personal Email Secretary: RAG on Apple Mail!
+> **Note:** The examples below currently support macOS only; Windows support is coming soon.
@@ -460,10 +476,10 @@ If you find Leann useful, please cite:
## ✨ [Detailed Features →](docs/features.md)
-## 🤝 [Contributing →](docs/contributing.md)
+## 🤝 [Contributing →](docs/CONTRIBUTING.md)
-## [FAQ →](docs/faq.md)
+## ❓ [FAQ →](docs/faq.md)
## 📍 [Roadmap →](docs/roadmap.md)
diff --git a/demo.ipynb b/demo.ipynb
index 016302c..e91ec01 100644
--- a/demo.ipynb
+++ b/demo.ipynb
@@ -4,7 +4,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
- "# Quick Start in 30s\n",
+ "# Quick Start\n",
"\n",
"**Home GitHub Repository:** [LEANN on GitHub](https://github.com/yichuan-w/LEANN)\n",
"\n",
@@ -49,68 +49,7 @@
"cell_type": "code",
"execution_count": null,
"metadata": {},
- "outputs": [
- {
- "name": "stderr",
- "output_type": "stream",
- "text": [
- "Writing passages: 100%|██████████| 5/5 [00:00<00:00, 17077.79chunk/s]\n",
- "Batches: 100%|██████████| 1/1 [00:00<00:00, 36.43it/s]\n",
- "WARNING:leann_backend_hnsw.hnsw_backend:Converting data to float32, shape: (5, 768)\n",
- "INFO:leann_backend_hnsw.hnsw_backend:INFO: Converting HNSW index to CSR-pruned format...\n"
- ]
- },
- {
- "name": "stdout",
- "output_type": "stream",
- "text": [
- "M: 64 for level: 0\n",
- "Starting conversion: index.index -> index.csr.tmp\n",
- "[0.00s] Reading Index HNSW header...\n",
- "[0.00s] Header read: d=768, ntotal=5\n",
- "[0.00s] Reading HNSW struct vectors...\n",
- " Reading vector (dtype=, fmt='d')... Count=6, Bytes=48\n",
- "[0.00s] Read assign_probas (6)\n",
- " Reading vector (dtype=, fmt='i')... Count=7, Bytes=28\n",
- "[0.14s] Read cum_nneighbor_per_level (7)\n",
- " Reading vector (dtype=, fmt='i')... Count=5, Bytes=20\n",
- "[0.24s] Read levels (5)\n",
- "[0.33s] Probing for compact storage flag...\n",
- "[0.33s] Found compact flag: False\n",
- "[0.33s] Compact flag is False, reading original format...\n",
- "[0.33s] Probing for potential extra byte before non-compact offsets...\n",
- "[0.33s] Found and consumed an unexpected 0x00 byte.\n",
- " Reading vector (dtype=, fmt='Q')... Count=6, Bytes=48\n",
- "[0.33s] Read offsets (6)\n",
- "[0.41s] Attempting to read neighbors vector...\n",
- " Reading vector (dtype=, fmt='i')... Count=320, Bytes=1280\n",
- "[0.41s] Read neighbors (320)\n",
- "[0.54s] Read scalar params (ep=4, max_lvl=0)\n",
- "[0.54s] Checking for storage data...\n",
- "[0.54s] Found storage fourcc: 49467849.\n",
- "[0.54s] Converting to CSR format...\n",
- "[0.54s] Conversion loop finished. \n",
- "[0.54s] Running validation checks...\n",
- " Checking total valid neighbor count...\n",
- " OK: Total valid neighbors = 20\n",
- " Checking final pointer indices...\n",
- " OK: Final pointers match data size.\n",
- "[0.54s] Deleting original neighbors and offsets arrays...\n",
- " CSR Stats: |data|=20, |level_ptr|=10\n",
- "[0.63s] Writing CSR HNSW graph data in FAISS-compatible order...\n",
- " Pruning embeddings: Writing NULL storage marker.\n",
- "[0.71s] Conversion complete.\n"
- ]
- },
- {
- "name": "stderr",
- "output_type": "stream",
- "text": [
- "INFO:leann_backend_hnsw.hnsw_backend:✅ CSR conversion successful.\n",
- "INFO:leann_backend_hnsw.hnsw_backend:INFO: Replaced original index with CSR-pruned version at 'index.index'\n"
- ]
- }
- ],
+ "outputs": [],
"source": [
"from leann.api import LeannBuilder\n",
"\n",
@@ -136,81 +75,7 @@
"cell_type": "code",
"execution_count": null,
"metadata": {},
- "outputs": [
- {
- "name": "stderr",
- "output_type": "stream",
- "text": [
- "INFO:leann.api:π LeannSearcher.search() called:\n",
- "INFO:leann.api: Query: 'programming languages'\n",
- "INFO:leann.api: Top_k: 2\n",
- "INFO:leann.api: Additional kwargs: {}\n",
- "INFO:leann.embedding_server_manager:Port 5557 has incompatible server, trying next port...\n",
- "INFO:leann.embedding_server_manager:Port 5558 has incompatible server, trying next port...\n",
- "INFO:leann.embedding_server_manager:Port 5559 has incompatible server, trying next port...\n",
- "INFO:leann.embedding_server_manager:Port 5560 has incompatible server, trying next port...\n",
- "INFO:leann.embedding_server_manager:Port 5561 has incompatible server, trying next port...\n",
- "INFO:leann.embedding_server_manager:Port 5562 has incompatible server, trying next port...\n",
- "INFO:leann.embedding_server_manager:Starting embedding server on port 5563...\n",
- "INFO:leann.embedding_server_manager:Command: /Users/yichuan/Desktop/code/test_leann_pip/LEANN/.venv/bin/python -m leann_backend_hnsw.hnsw_embedding_server --zmq-port 5563 --model-name facebook/contriever --passages-file /Users/yichuan/Desktop/code/test_leann_pip/LEANN/content/index.meta.json\n",
- "huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...\n",
- "To disable this warning, you can either:\n",
- "\t- Avoid using `tokenizers` before the fork if possible\n",
- "\t- Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)\n",
- "INFO:leann.embedding_server_manager:Server process started with PID: 31699\n"
- ]
- },
- {
- "name": "stdout",
- "output_type": "stream",
- "text": [
- "[read_HNSW - CSR NL v4] Reading metadata & CSR indices (manual offset)...\n",
- "[read_HNSW NL v4] Read levels vector, size: 5\n",
- "[read_HNSW NL v4] Reading Compact Storage format indices...\n",
- "[read_HNSW NL v4] Read compact_level_ptr, size: 10\n",
- "[read_HNSW NL v4] Read compact_node_offsets, size: 6\n",
- "[read_HNSW NL v4] Read entry_point: 4, max_level: 0\n",
- "[read_HNSW NL v4] Read storage fourcc: 0x6c6c756e\n",
- "[read_HNSW NL v4 FIX] Detected FileIOReader. Neighbors size field offset: 326\n",
- "[read_HNSW NL v4] Reading neighbors data into memory.\n",
- "[read_HNSW NL v4] Read neighbors data, size: 20\n",
- "[read_HNSW NL v4] Finished reading metadata and CSR indices.\n",
- "INFO: Skipping external storage loading, since is_recompute is true.\n",
- "INFO: Registering backend 'hnsw'\n"
- ]
- },
- {
- "name": "stderr",
- "output_type": "stream",
- "text": [
- "Traceback (most recent call last):\n",
- " File \"\", line 198, in _run_module_as_main\n",
- " File \"\", line 88, in _run_code\n",
- " File \"/Users/yichuan/Desktop/code/test_leann_pip/LEANN/.venv/lib/python3.11/site-packages/leann_backend_hnsw/hnsw_embedding_server.py\", line 323, in \n",
- " create_hnsw_embedding_server(\n",
- " File \"/Users/yichuan/Desktop/code/test_leann_pip/LEANN/.venv/lib/python3.11/site-packages/leann_backend_hnsw/hnsw_embedding_server.py\", line 98, in create_hnsw_embedding_server\n",
- " passages = PassageManager(passage_sources)\n",
- " ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n",
- " File \"/Users/yichuan/Desktop/code/test_leann_pip/LEANN/.venv/lib/python3.11/site-packages/leann/api.py\", line 127, in __init__\n",
- " raise FileNotFoundError(f\"Passage index file not found: {index_file}\")\n",
- "FileNotFoundError: Passage index file not found: /Users/yichuan/Desktop/code/test_leann_pip/LEANN/index.passages.idx\n",
- "ERROR:leann.embedding_server_manager:Server terminated during startup.\n"
- ]
- },
- {
- "ename": "RuntimeError",
- "evalue": "Failed to start embedding server on port 5563",
- "output_type": "error",
- "traceback": [
- "\u001b[31m---------------------------------------------------------------------------\u001b[39m",
- "\u001b[31mRuntimeError\u001b[39m Traceback (most recent call last)",
- "\u001b[36mCell\u001b[39m\u001b[36m \u001b[39m\u001b[32mIn[4]\u001b[39m\u001b[32m, line 4\u001b[39m\n\u001b[32m 1\u001b[39m \u001b[38;5;28;01mfrom\u001b[39;00m\u001b[38;5;250m \u001b[39m\u001b[34;01mleann\u001b[39;00m\u001b[34;01m.\u001b[39;00m\u001b[34;01mapi\u001b[39;00m\u001b[38;5;250m \u001b[39m\u001b[38;5;28;01mimport\u001b[39;00m LeannSearcher\n\u001b[32m 3\u001b[39m searcher = LeannSearcher(\u001b[33m\"\u001b[39m\u001b[33mindex\u001b[39m\u001b[33m\"\u001b[39m)\n\u001b[32m----> \u001b[39m\u001b[32m4\u001b[39m results = \u001b[43msearcher\u001b[49m\u001b[43m.\u001b[49m\u001b[43msearch\u001b[49m\u001b[43m(\u001b[49m\u001b[33;43m\"\u001b[39;49m\u001b[33;43mprogramming languages\u001b[39;49m\u001b[33;43m\"\u001b[39;49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mtop_k\u001b[49m\u001b[43m=\u001b[49m\u001b[32;43m2\u001b[39;49m\u001b[43m)\u001b[49m\n\u001b[32m 5\u001b[39m results\n",
- "\u001b[36mFile \u001b[39m\u001b[32m~/Desktop/code/test_leann_pip/LEANN/.venv/lib/python3.11/site-packages/leann/api.py:439\u001b[39m, in \u001b[36mLeannSearcher.search\u001b[39m\u001b[34m(self, query, top_k, complexity, beam_width, prune_ratio, recompute_embeddings, pruning_strategy, expected_zmq_port, **kwargs)\u001b[39m\n\u001b[32m 437\u001b[39m start_time = time.time()\n\u001b[32m 438\u001b[39m \u001b[38;5;28;01mif\u001b[39;00m recompute_embeddings:\n\u001b[32m--> \u001b[39m\u001b[32m439\u001b[39m zmq_port = \u001b[38;5;28;43mself\u001b[39;49m\u001b[43m.\u001b[49m\u001b[43mbackend_impl\u001b[49m\u001b[43m.\u001b[49m\u001b[43m_ensure_server_running\u001b[49m\u001b[43m(\u001b[49m\n\u001b[32m 440\u001b[39m \u001b[43m \u001b[49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[43m.\u001b[49m\u001b[43mmeta_path_str\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 441\u001b[39m \u001b[43m \u001b[49m\u001b[43mport\u001b[49m\u001b[43m=\u001b[49m\u001b[43mexpected_zmq_port\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 442\u001b[39m \u001b[43m \u001b[49m\u001b[43m*\u001b[49m\u001b[43m*\u001b[49m\u001b[43mkwargs\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 443\u001b[39m \u001b[43m \u001b[49m\u001b[43m)\u001b[49m\n\u001b[32m 444\u001b[39m \u001b[38;5;28;01mdel\u001b[39;00m expected_zmq_port\n\u001b[32m 445\u001b[39m zmq_time = time.time() - start_time\n",
- "\u001b[36mFile \u001b[39m\u001b[32m~/Desktop/code/test_leann_pip/LEANN/.venv/lib/python3.11/site-packages/leann/searcher_base.py:81\u001b[39m, in \u001b[36mBaseSearcher._ensure_server_running\u001b[39m\u001b[34m(self, passages_source_file, port, **kwargs)\u001b[39m\n\u001b[32m 72\u001b[39m server_started, actual_port = \u001b[38;5;28mself\u001b[39m.embedding_server_manager.start_server(\n\u001b[32m 73\u001b[39m port=port,\n\u001b[32m 74\u001b[39m model_name=\u001b[38;5;28mself\u001b[39m.embedding_model,\n\u001b[32m (...)\u001b[39m\u001b[32m 78\u001b[39m enable_warmup=kwargs.get(\u001b[33m\"\u001b[39m\u001b[33menable_warmup\u001b[39m\u001b[33m\"\u001b[39m, \u001b[38;5;28;01mFalse\u001b[39;00m),\n\u001b[32m 79\u001b[39m )\n\u001b[32m 80\u001b[39m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;129;01mnot\u001b[39;00m server_started:\n\u001b[32m---> \u001b[39m\u001b[32m81\u001b[39m \u001b[38;5;28;01mraise\u001b[39;00m \u001b[38;5;167;01mRuntimeError\u001b[39;00m(\n\u001b[32m 82\u001b[39m \u001b[33mf\u001b[39m\u001b[33m\"\u001b[39m\u001b[33mFailed to start embedding server on port \u001b[39m\u001b[38;5;132;01m{\u001b[39;00mactual_port\u001b[38;5;132;01m}\u001b[39;00m\u001b[33m\"\u001b[39m\n\u001b[32m 83\u001b[39m )\n\u001b[32m 85\u001b[39m \u001b[38;5;28;01mreturn\u001b[39;00m actual_port\n",
- "\u001b[31mRuntimeError\u001b[39m: Failed to start embedding server on port 5563"
- ]
- }
- ],
+ "outputs": [],
"source": [
"from leann.api import LeannSearcher\n",
"\n",
diff --git a/docs/CONTRIBUTING.md b/docs/CONTRIBUTING.md
new file mode 100644
index 0000000..4a37e26
--- /dev/null
+++ b/docs/CONTRIBUTING.md
@@ -0,0 +1,220 @@
+# 🤝 Contributing
+
+We welcome contributions! Leann is built by the community, for the community.
+
+## Ways to Contribute
+
+- 🐛 **Bug Reports**: Found an issue? Let us know!
+- 💡 **Feature Requests**: Have an idea? We'd love to hear it!
+- 🔧 **Code Contributions**: PRs welcome for all skill levels
+- 📖 **Documentation**: Help make Leann more accessible
+- 🧪 **Benchmarks**: Share your performance results
+
+## 🚀 Development Setup
+
+### Prerequisites
+
+1. **Install uv** (fast Python package installer):
+ ```bash
+ curl -LsSf https://astral.sh/uv/install.sh | sh
+ ```
+
+2. **Clone the repository**:
+ ```bash
+ git clone https://github.com/LEANN-RAG/LEANN-RAG.git
+ cd LEANN-RAG
+ ```
+
+3. **Install system dependencies**:
+
+ **macOS:**
+ ```bash
+ brew install llvm libomp boost protobuf zeromq pkgconf
+ ```
+
+ **Ubuntu/Debian:**
+ ```bash
+ sudo apt-get install libomp-dev libboost-all-dev protobuf-compiler \
+ libabsl-dev libmkl-full-dev libaio-dev libzmq3-dev
+ ```
+
+4. **Build from source**:
+ ```bash
+ # macOS
+ CC=$(brew --prefix llvm)/bin/clang CXX=$(brew --prefix llvm)/bin/clang++ uv sync
+
+ # Ubuntu/Debian
+ uv sync
+ ```
+
+## 🎨 Pre-commit Hooks
+
+We use pre-commit hooks to ensure code quality and consistency. They run automatically before each commit.
+
+### Setup Pre-commit
+
+1. **Install pre-commit** (installed with the dev dependencies when you run `uv sync`; or install it manually):
+ ```bash
+ uv pip install pre-commit
+ ```
+
+2. **Install the git hooks**:
+ ```bash
+ pre-commit install
+ ```
+
+3. **Run pre-commit manually** (optional):
+ ```bash
+ pre-commit run --all-files
+ ```
+
+### Pre-commit Checks
+
+Our pre-commit configuration includes:
+- **Trailing whitespace removal**
+- **End-of-file fixing**
+- **YAML validation**
+- **Large file prevention**
+- **Merge conflict detection**
+- **Debug statement detection**
+- **Code formatting with ruff**
+- **Code linting with ruff**
+
+## 🧪 Testing
+
+### Running Tests
+
+```bash
+# Run all tests
+uv run pytest
+
+# Run specific test file
+uv run pytest test/test_filename.py
+
+# Run with coverage
+uv run pytest --cov=leann
+```
+
+### Writing Tests
+
+- Place tests in the `test/` directory
+- Follow the naming convention `test_*.py`
+- Use descriptive test names that explain what's being tested
+- Include both positive and negative test cases
+
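+For example, a new test might look like this (a sketch only; the file and helper names below are illustrative, not part of LEANN's API):
+
+```python
+# test/test_scoring.py -- hypothetical example following the conventions above
+def normalize_scores(scores: list[float]) -> list[float]:
+    """Scale scores so they sum to 1.0; an empty list is returned unchanged."""
+    total = sum(scores)
+    return [s / total for s in scores] if total else scores
+
+
+def test_normalize_scores_sums_to_one():
+    # Positive case: normalized scores sum to 1.
+    assert abs(sum(normalize_scores([1.0, 3.0])) - 1.0) < 1e-9
+
+
+def test_normalize_scores_empty_input():
+    # Negative case: empty input stays empty.
+    assert normalize_scores([]) == []
+```
+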
+## 📝 Code Style
+
+We use `ruff` for both linting and formatting to ensure consistent code style.
+
+### Format Your Code
+
+```bash
+# Format all files
+ruff format
+
+# Check formatting without changing files
+ruff format --check
+```
+
+### Lint Your Code
+
+```bash
+# Run linter with auto-fix
+ruff check --fix
+
+# Just check without fixing
+ruff check
+```
+
+### Style Guidelines
+
+- Follow PEP 8 conventions
+- Use descriptive variable names
+- Add type hints where appropriate
+- Write docstrings for all public functions and classes
+- Keep functions focused and single-purpose
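+
+For instance, a function following these guidelines might look like (a sketch; `top_k_indices` is an illustrative name, not a LEANN API):
+
+```python
+def top_k_indices(scores: list[float], k: int) -> list[int]:
+    """Return the indices of the k highest scores, best first.
+
+    Args:
+        scores: Similarity scores, one per candidate.
+        k: Number of indices to return (clamped to len(scores)).
+    """
+    order = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
+    return order[: max(0, k)]
+```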
+
+## 🚦 CI/CD
+
+Our CI pipeline runs automatically on all pull requests. It includes:
+
+1. **Linting and Formatting**: Ensures code follows our style guidelines
+2. **Multi-platform builds**: Tests on Ubuntu and macOS
+3. **Python version matrix**: Tests on Python 3.9-3.13
+4. **Wheel building**: Ensures packages can be built and distributed
+
+### CI Commands
+
+The CI uses the same commands as pre-commit to ensure consistency:
+```bash
+# Linting
+ruff check .
+
+# Format checking
+ruff format --check .
+```
+
+Make sure your code passes these checks locally before pushing!
+
+## 🔄 Pull Request Process
+
+1. **Fork the repository** and create your branch from `main`:
+ ```bash
+ git checkout -b feature/your-feature-name
+ ```
+
+2. **Make your changes**:
+ - Write clean, documented code
+ - Add tests for new functionality
+ - Update documentation as needed
+
+3. **Run pre-commit checks**:
+ ```bash
+ pre-commit run --all-files
+ ```
+
+4. **Test your changes**:
+ ```bash
+ uv run pytest
+ ```
+
+5. **Commit with descriptive messages**:
+ ```bash
+ git commit -m "feat: add new search algorithm"
+ ```
+
+ Follow [Conventional Commits](https://www.conventionalcommits.org/):
+ - `feat:` for new features
+ - `fix:` for bug fixes
+ - `docs:` for documentation changes
+ - `test:` for test additions/changes
+ - `refactor:` for code refactoring
+ - `perf:` for performance improvements
+
+6. **Push and create a pull request**:
+ - Provide a clear description of your changes
+ - Reference any related issues
+ - Include examples or screenshots if applicable
+
+## 📖 Documentation
+
+When adding new features or making significant changes:
+
+1. Update relevant documentation in `/docs`
+2. Add docstrings to new functions/classes
+3. Update README.md if needed
+4. Include usage examples
+
+## 🤝 Getting Help
+
+- **Discord**: Join our community for discussions
+- **Issues**: Check existing issues or create a new one
+- **Discussions**: For general questions and ideas
+
+## 📄 License
+
+By contributing, you agree that your contributions will be licensed under the same license as the project (MIT).
+
+---
+
+Thank you for contributing to LEANN! Every contribution, no matter how small, helps make the project better for everyone. 🎉
diff --git a/docs/contributing.md b/docs/contributing.md
deleted file mode 100644
index 1cacc26..0000000
--- a/docs/contributing.md
+++ /dev/null
@@ -1,11 +0,0 @@
-# 🤝 Contributing
-
-We welcome contributions! Leann is built by the community, for the community.
-
-## Ways to Contribute
-
-- 🐛 **Bug Reports**: Found an issue? Let us know!
-- 💡 **Feature Requests**: Have an idea? We'd love to hear it!
-- 🔧 **Code Contributions**: PRs welcome for all skill levels
-- 📖 **Documentation**: Help make Leann more accessible
-- 🧪 **Benchmarks**: Share your performance results
diff --git a/docs/normalized_embeddings.md b/docs/normalized_embeddings.md
index d6f285e..46213e5 100644
--- a/docs/normalized_embeddings.md
+++ b/docs/normalized_embeddings.md
@@ -72,4 +72,4 @@ Using the wrong distance metric with normalized embeddings can lead to:
- **Incorrect ranking** of search results
- **Suboptimal performance** compared to using the correct metric
-For more details on why this happens, see our analysis of [OpenAI embeddings with MIPS](../examples/main_cli_example.py).
\ No newline at end of file
+For more details on why this happens, see our analysis of [OpenAI embeddings with MIPS](../examples/main_cli_example.py).
diff --git a/packages/leann-backend-diskann/pyproject.toml b/packages/leann-backend-diskann/pyproject.toml
index f8f38bc..ae3b3a9 100644
--- a/packages/leann-backend-diskann/pyproject.toml
+++ b/packages/leann-backend-diskann/pyproject.toml
@@ -4,8 +4,8 @@ build-backend = "scikit_build_core.build"
[project]
name = "leann-backend-diskann"
-version = "0.1.14"
-dependencies = ["leann-core==0.1.14", "numpy", "protobuf>=3.19.0"]
+version = "0.1.15"
+dependencies = ["leann-core==0.1.15", "numpy", "protobuf>=3.19.0"]
[tool.scikit-build]
# Key: simplified CMake path
diff --git a/packages/leann-backend-hnsw/pyproject.toml b/packages/leann-backend-hnsw/pyproject.toml
index 82a46b8..b989d6d 100644
--- a/packages/leann-backend-hnsw/pyproject.toml
+++ b/packages/leann-backend-hnsw/pyproject.toml
@@ -6,10 +6,10 @@ build-backend = "scikit_build_core.build"
[project]
name = "leann-backend-hnsw"
-version = "0.1.14"
+version = "0.1.15"
description = "Custom-built HNSW (Faiss) backend for the Leann toolkit."
dependencies = [
- "leann-core==0.1.14",
+ "leann-core==0.1.15",
"numpy",
"pyzmq>=23.0.0",
"msgpack>=1.0.0",
diff --git a/packages/leann-core/pyproject.toml b/packages/leann-core/pyproject.toml
index a8a9983..3b66c69 100644
--- a/packages/leann-core/pyproject.toml
+++ b/packages/leann-core/pyproject.toml
@@ -4,7 +4,7 @@ build-backend = "setuptools.build_meta"
[project]
name = "leann-core"
-version = "0.1.14"
+version = "0.1.15"
description = "Core API and plugin system for LEANN"
readme = "README.md"
requires-python = ">=3.9"
diff --git a/packages/leann/README.md b/packages/leann/README.md
index 0488c3d..4281ef1 100644
--- a/packages/leann/README.md
+++ b/packages/leann/README.md
@@ -16,25 +16,24 @@ uv pip install leann[diskann]
```python
from leann import LeannBuilder, LeannSearcher, LeannChat
+from pathlib import Path
+INDEX_PATH = str(Path("./").resolve() / "demo.leann")
# Build an index
builder = LeannBuilder(backend_name="hnsw")
builder.add_text("LEANN saves 97% storage compared to traditional vector databases.")
-builder.build_index("my_index.leann")
+builder.add_text("Tung Tung Tung Sahur called—they need their banana-crocodile hybrid back")
+builder.build_index(INDEX_PATH)
# Search
-searcher = LeannSearcher("my_index.leann")
-results = searcher.search("storage savings", top_k=3)
+searcher = LeannSearcher(INDEX_PATH)
+results = searcher.search("fantastical AI-generated creatures", top_k=1)
# Chat with your data
-chat = LeannChat("my_index.leann", llm_config={"type": "ollama", "model": "llama3.2:1b"})
-response = chat.ask("How much storage does LEANN save?")
+chat = LeannChat(INDEX_PATH, llm_config={"type": "hf", "model": "Qwen/Qwen3-0.6B"})
+response = chat.ask("How much storage does LEANN save?", top_k=1)
```
-## Documentation
-
-For full documentation, visit [https://leann.readthedocs.io](https://leann.readthedocs.io)
-
## License
MIT License
diff --git a/packages/leann/pyproject.toml b/packages/leann/pyproject.toml
index a6db993..6727621 100644
--- a/packages/leann/pyproject.toml
+++ b/packages/leann/pyproject.toml
@@ -4,7 +4,7 @@ build-backend = "setuptools.build_meta"
[project]
name = "leann"
-version = "0.1.14"
+version = "0.1.15"
description = "LEANN - The smallest vector index in the world. RAG Everything with LEANN!"
readme = "README.md"
requires-python = ">=3.9"
@@ -36,7 +36,5 @@ diskann = [
]
[project.urls]
-Homepage = "https://github.com/yourusername/leann"
-Documentation = "https://leann.readthedocs.io"
-Repository = "https://github.com/yourusername/leann"
-Issues = "https://github.com/yourusername/leann/issues"
+Repository = "https://github.com/yichuan-w/LEANN"
+Issues = "https://github.com/yichuan-w/LEANN/issues"
diff --git a/pyproject.toml b/pyproject.toml
index 0856945..aac0f78 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -53,6 +53,7 @@ dev = [
"ruff>=0.1.0",
"matplotlib",
"huggingface-hub>=0.20.0",
+ "pre-commit>=3.5.0",
]
diskann = [