Installation
Prerequisites
- Node.js 20 or later - Required for running Bellwether
- An MCP server to test - Any server implementing the Model Context Protocol
- An LLM API key (optional) - Only required for bellwether explore; bellwether check is free and requires no API keys
Install via npm
The recommended way to install Bellwether:
npm install -g @dotsetlabs/bellwether
Verify the installation:
bellwether --version
Use with npx (No Installation)
Run Bellwether directly without global installation:
# Initialize and run check
npx @dotsetlabs/bellwether init npx @mcp/your-server
npx @dotsetlabs/bellwether check
This is useful for CI/CD pipelines or one-off usage. Most commands require a config file, so init is usually the first step (auth, discover, and registry can run without one; validate-config can validate an explicit path via --config).
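The npx form drops straight into a CI step. A minimal sketch assuming GitHub Actions (the workflow keys are standard; the server in bellwether.yaml and the package name follow the examples above):

```yaml
# Hypothetical GitHub Actions job running bellwether check via npx.
# Assumes bellwether.yaml is already committed to the repository.
name: bellwether
on: [push]
jobs:
  check:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: npx @dotsetlabs/bellwether check
```

Since check needs no API keys, no secrets configuration is required for this job.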
LLM Provider Setup (Optional)
By default, bellwether check requires no LLM and no API keys. It's free, fast, and deterministic. Only set up an LLM provider if you want bellwether explore for persona-based behavioral analysis and AGENTS.md documentation generation.
For explore mode, choose one of the following providers:
Anthropic Claude (Recommended)
Get your API key from console.anthropic.com.
Default model: claude-haiku-4-5 (best quality/cost balance)
OpenAI
Get your API key from platform.openai.com.
Default model: gpt-4.1-nano (budget option)
Setting Your API Key
Choose one of these methods (in order of recommendation):
Option A: Interactive setup (recommended)
The easiest way to configure your API key:
bellwether auth
This interactive wizard will:
- Ask which provider you want to use
- Prompt for your API key (input is hidden)
- Store it securely in your system keychain
Option B: System keychain (manual)
Store your API key in the system keychain directly:
bellwether auth add openai
# or
bellwether auth add anthropic
Option C: Global .env file
Set once, use everywhere:
mkdir -p ~/.bellwether
echo "OPENAI_API_KEY=sk-your-key-here" >> ~/.bellwether/.env
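Because the global .env holds a plaintext API key, you may want to restrict it to your user. This is general Unix practice, not a Bellwether requirement:

```shell
# Optional hardening: make the global .env readable only by your user.
if [ -f ~/.bellwether/.env ]; then
  chmod 600 ~/.bellwether/.env
fi
```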
Option D: Project .env file
Per-project configuration (overrides global):
echo "OPENAI_API_KEY=sk-your-key-here" >> .env
Option E: Shell environment
Temporary, for current session only:
export OPENAI_API_KEY=sk-your-key-here
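The .env options layer as described above: a project .env overrides the global one. A lookup sketch illustrating that precedence in plain shell (get_key is a hypothetical helper for illustration, not Bellwether's actual implementation):

```shell
# Look up a key, preferring the project .env over the global one.
get_key() {
  key="$1"
  for f in ./.env "$HOME/.bellwether/.env"; do
    if [ -f "$f" ]; then
      # First matching line wins; value is everything after the first "=".
      val="$(grep -m1 "^${key}=" "$f" | cut -d= -f2-)"
      if [ -n "$val" ]; then
        printf '%s\n' "$val"
        return 0
      fi
    fi
  done
  return 1
}
```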
Use bellwether auth for the best experience. It stores your API key securely in the system keychain (macOS Keychain, Windows Credential Manager, or Linux Secret Service) and works across all your projects.
Checking Auth Status
See which providers are configured:
bellwether auth status
Ollama (Free, Local)
For completely free, local LLM usage with explore mode:
# Install Ollama (macOS/Linux)
curl -fsSL https://ollama.com/install.sh | sh
# Start Ollama
ollama serve
# Pull a model
ollama pull qwen3:8b
# Initialize config for local Ollama
bellwether init --preset local npx @modelcontextprotocol/server-filesystem /tmp
# Run explore (no API key needed)
bellwether explore
Default model: qwen3:8b
Ollama is great for development and testing. For production CI/CD, OpenAI or Anthropic provide more consistent results.
Provider Selection
Bellwether uses the provider set in bellwether.yaml:
llm:
  provider: openai
Configuration File
For persistent configuration, create bellwether.yaml in your project root:
server:
  command: npx @mcp/your-server
  timeout: 30000
llm:
  provider: anthropic
  model: claude-sonnet-4-5
explore:
  personas:
    - technical_writer
    - security_tester
  maxQuestionsPerTool: 3
output:
  dir: "."
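Because check needs no LLM, a config used only for check mode can omit the llm and explore sections entirely; a minimal sketch reusing the same keys:

```yaml
# Minimal bellwether.yaml for check-only usage -- no LLM provider required.
server:
  command: npx @mcp/your-server
  timeout: 30000
```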
Verify Setup
Test your installation:
# Initialize with a test server
bellwether init npx @modelcontextprotocol/server-memory
# Check mode (free, no API key needed)
bellwether check
# Explore mode with Ollama (free)
ollama serve &
bellwether explore
For check mode, you should see:
- Connection to the MCP server
- Tool discovery
- Schema validation
- CONTRACT.md file generated
For explore mode, you should also see:
- Persona-based testing
- AGENTS.md file generated
Troubleshooting
"API key not found"
This only applies to explore mode. Set up your API key:
bellwether auth
Or check your current authentication status:
bellwether auth status
"Connection refused" with Ollama
Make sure Ollama is running:
ollama serve
Node.js version issues
Bellwether requires Node.js 20+:
node --version # Should be v20.x.x or higher
# Use nvm to install correct version
nvm install 20
nvm use 20
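If you script environment checks, the version gate can be automated. A small sketch in plain shell; the hard-coded version string stands in for the output of node --version:

```shell
# Fail fast when the Node.js major version is below Bellwether's minimum (20).
required_major=20
version="v20.11.1"        # in a real script: version="$(node --version)"
major="${version#v}"      # drop the leading "v"
major="${major%%.*}"      # keep only the major component
if [ "$major" -ge "$required_major" ]; then
  echo "Node.js OK (v$major)"
else
  echo "Node.js $version is too old; need v$required_major or later" >&2
  exit 1
fi
```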
Next Steps
- Quick Start - Run your first check
- Local Development - Test your MCP server during development
- Configuration Guide - Advanced configuration options
- CLI Reference - Full command documentation