ensemble-mcp docs

Installation Guide

Detailed installation instructions for ensemble-mcp, covering multiple install methods, system requirements, and upgrade procedures.

System Requirements

| Requirement | Minimum | Notes |
|---|---|---|
| Python | 3.11+ | 3.12 and 3.13 also supported |
| OS | Linux, macOS, Windows | Any platform with Python support |
| Disk space | ~50 MB | ~22 MB for ONNX model + DB + package |
| RAM | ~100 MB | ONNX Runtime embedding model |
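The Python floor above can be checked before installing; a minimal sketch:

```python
import sys

# Minimum Python version required by ensemble-mcp (per the table above)
MIN_PYTHON = (3, 11)

def python_ok() -> bool:
    """Return True when the running interpreter meets the minimum."""
    return sys.version_info[:2] >= MIN_PYTHON

if __name__ == "__main__":
    status = "OK" if python_ok() else "too old, need 3.11+"
    print(f"Python {sys.version_info.major}.{sys.version_info.minor}: {status}")
```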

Runtime Dependencies

These are installed automatically via pip:

| Package | Version | Purpose |
|---|---|---|
| mcp | ≥1.0 | MCP protocol implementation |
| onnxruntime | ≥1.17 | Local embedding model inference |
| numpy | ≥2.4.4 | Vector operations and cosine similarity |
| tokenizers | ≥0.15 | Tokenizer for the embedding model |
| rich | ≥15.0.0 | Terminal output formatting |
| aiohttp | ≥3.9 | Web dashboard HTTP server |
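Since pip resolves these automatically you rarely need to check them by hand, but a standard-library sketch can confirm they are present (package names taken from the table above):

```python
from importlib.metadata import PackageNotFoundError, version

# Runtime dependencies from the table above (names only; pip enforces versions)
REQUIRED = ("mcp", "onnxruntime", "numpy", "tokenizers", "rich", "aiohttp")

def missing_packages() -> list[str]:
    """Return the names of required distributions that are not installed."""
    missing = []
    for name in REQUIRED:
        try:
            version(name)  # raises if the distribution is absent
        except PackageNotFoundError:
            missing.append(name)
    return missing

if __name__ == "__main__":
    gaps = missing_packages()
    print("all dependencies present" if not gaps else f"missing: {gaps}")
```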

Install from PyPI

The recommended installation method:

pip install ensemble-mcp

Using uvx (No Install)

With uv, you can run without installing:

uvx ensemble-mcp

Using pipx (Isolated Install)

For a globally available, isolated installation:

pipx install ensemble-mcp

Command Detection During Registration

When you run ensemble-mcp install, the installer automatically detects how ensemble-mcp is available and registers the appropriate command in each AI tool's config:

| Priority | Detection | Registered command |
|---|---|---|
| 1st | ensemble-mcp on PATH (pip/pipx) | ensemble-mcp |
| 2nd | uvx on PATH | uvx ensemble-mcp |
| 3rd | Neither found | /path/to/python -m ensemble_mcp (full sys.executable path) |

The installer prefers a direct ensemble-mcp binary because it is the most specific and reliable option: it confirms the package is actually installed locally. The uvx fallback can auto-fetch from PyPI but may fail on private networks or if the package has not been published yet. The final fallback uses the current Python interpreter's absolute path (e.g. /home/user/.venv/bin/python), not the bare python command, to ensure the correct environment is used.
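The priority order can be sketched in Python. This mirrors the table above but is illustrative, not the installer's actual code:

```python
import shutil
import sys

def detect_command() -> list[str]:
    """Pick the command an AI tool should use to launch the server,
    following the installer's documented priority order."""
    if shutil.which("ensemble-mcp"):     # 1st: direct binary (pip/pipx)
        return ["ensemble-mcp"]
    if shutil.which("uvx"):              # 2nd: uvx can auto-fetch from PyPI
        return ["uvx", "ensemble-mcp"]
    # 3rd: absolute interpreter path, so the correct environment is used
    return [sys.executable, "-m", "ensemble_mcp"]
```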

Install from Source

Clone the repository and install in editable mode:

git clone https://github.com/LynkByte/ensemble.git
cd ensemble
pip install -e .

With Development Dependencies

To also install testing and linting tools:

pip install -e ".[dev]"

This adds: pytest, pytest-asyncio, pytest-aiohttp, pytest-cov, ruff, mypy, and build.

Docker

Build and run the server in a container:

# Build the image
docker build -t ensemble-mcp .

# Run the server (stdio mode)
docker run -i ensemble-mcp

# Run with a persistent data volume
docker run -i -v ensemble-data:/root/.cache/ensemble-mcp ensemble-mcp
Note: When running in Docker, the MCP server communicates over stdio, so your AI tool must be configured to launch the container instead of a local command.
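For example, in a tool that uses the common mcpServers JSON layout, an entry launching the container with the persistent volume might look like the sketch below. The exact file name and schema depend on your AI tool, and the server name "ensemble" is an arbitrary label:

```json
{
  "mcpServers": {
    "ensemble": {
      "command": "docker",
      "args": ["run", "-i", "-v", "ensemble-data:/root/.cache/ensemble-mcp", "ensemble-mcp"]
    }
  }
}
```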

Verifying Installation

After installing, verify the CLI is available:

# Check the command exists
ensemble-mcp --help

# Run the server (Ctrl+C to stop)
ensemble-mcp

# Check server health via the web dashboard
ensemble-mcp web

The first run will automatically:

  1. Create ~/.cache/ensemble-mcp/ directory
  2. Download the ONNX embedding model (~22 MB) to ~/.cache/ensemble-mcp/models/
  3. Create the SQLite database at ~/.cache/ensemble-mcp/data.db
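The three locations above can be derived with pathlib; a sketch of where to look after the first run, assuming default settings:

```python
from pathlib import Path

# Default cache directory created on first run
CACHE_DIR = Path.home() / ".cache" / "ensemble-mcp"

def first_run_paths() -> dict[str, Path]:
    """Return the files and directories the first run creates."""
    return {
        "cache_dir": CACHE_DIR,              # step 1
        "models_dir": CACHE_DIR / "models",  # step 2: ONNX model (~22 MB)
        "database": CACHE_DIR / "data.db",   # step 3: SQLite database
    }

if __name__ == "__main__":
    for label, path in first_run_paths().items():
        print(f"{label}: {path} ({'exists' if path.exists() else 'missing'})")
```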

Upgrading

From PyPI

pip install --upgrade ensemble-mcp

From Source

cd ensemble
git pull
pip install -e .

Database Migrations

Schema migrations are applied automatically on startup. The server uses ensure_schema() to create or update tables as needed. Your existing data is preserved across upgrades.
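The docs don't show ensure_schema() itself, but the behavior described (create-or-update on startup, data preserved) is commonly implemented as an idempotent migration loop keyed on SQLite's PRAGMA user_version. A sketch of that pattern, with illustrative table and column names:

```python
import sqlite3

# Ordered migrations: index i upgrades the schema from version i to i + 1.
# Table/column names here are illustrative, not ensemble-mcp's real schema.
MIGRATIONS = [
    "CREATE TABLE IF NOT EXISTS patterns (id INTEGER PRIMARY KEY, body TEXT)",
    "CREATE TABLE IF NOT EXISTS sessions (id INTEGER PRIMARY KEY, started TEXT)",
]

def ensure_schema(conn: sqlite3.Connection) -> int:
    """Apply any pending migrations; existing rows are untouched."""
    (current,) = conn.execute("PRAGMA user_version").fetchone()
    for step in MIGRATIONS[current:]:
        conn.execute(step)
        current += 1
        conn.execute(f"PRAGMA user_version = {current}")  # record progress
    conn.commit()
    return current
```

Because the loop starts at the stored version, re-running it on an up-to-date database is a no-op, which is what makes upgrades safe for existing data.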

Data Locations

| Path | Contents |
|---|---|
| ~/.cache/ensemble-mcp/data.db | SQLite database (WAL mode): patterns, sessions, indexes |
| ~/.cache/ensemble-mcp/models/ | ONNX MiniLM-L6-v2 model files (~22 MB) |
| ~/.config/ensemble-mcp/config.toml | Global configuration file (optional) |
| .ensemble-mcp.toml | Per-project configuration (optional, in project root) |

Uninstalling

Remove ensemble-mcp registration from all AI tools:

ensemble-mcp uninstall

To also remove agent/skill files and cached data:

ensemble-mcp uninstall --remove-agents --clean-data

Then remove the package:

pip uninstall ensemble-mcp

Next Steps