# Installation
Clarissa can be installed in several ways depending on your preferences and setup. It supports multiple LLM providers including cloud APIs and local inference.
## Requirements
- Bun v1.0 or later (for running from source or npm install)
- At least one LLM provider:
  - OpenRouter API key (100+ models)
  - OpenAI API key (GPT models)
  - Anthropic API key (Claude models)
  - Apple Intelligence (macOS 26+ with Apple Silicon)
  - LM Studio (local server)
  - Local GGUF model (via `clarissa download`)
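Before installing, you can sanity-check the Bun requirement from a shell. This is a minimal sketch; the v1.0 minimum mirrors the list above:

```shell
# Report whether Bun is on PATH, and its version if so.
# The v1.0 minimum comes from the requirements list above.
if command -v bun >/dev/null 2>&1; then
  echo "Bun found: $(bun --version)"
else
  echo "Bun not found - install it from https://bun.sh"
fi
```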
## From npm (Recommended)
The easiest way to install Clarissa is via npm or bun:
### Using Bun

```shell
bun install -g clarissa
```

### Using npm

```shell
npm install -g clarissa
```

## From Source
Clone the repository and link the package locally:
```shell
git clone https://github.com/cameronrye/clarissa.git
cd clarissa
bun install
bun link
```

This will make the `clarissa` command available globally on your system.
## Standalone Binary
Download a pre-built binary from the releases page and add it to your PATH.
### Available Platforms
| Platform | Binary Name |
|---|---|
| macOS (ARM) | `clarissa-macos-arm64` |
| macOS (Intel) | `clarissa-macos-x64` |
| Linux (x64) | `clarissa-linux-x64` |
| Linux (ARM) | `clarissa-linux-arm64` |
| Windows (x64) | `clarissa-windows-x64.exe` |
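As a concrete sketch, a binary can be fetched with curl; this assumes GitHub's standard `releases/latest/download` URL scheme for the asset names listed above:

```shell
# Download the latest macOS ARM binary from the releases page.
# Swap the asset name for your platform; the URL scheme assumes
# GitHub's standard latest-release download path.
curl -fL -o clarissa \
  https://github.com/cameronrye/clarissa/releases/latest/download/clarissa-macos-arm64
```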
### Example: macOS ARM

```shell
chmod +x clarissa-macos-arm64
mv clarissa-macos-arm64 /usr/local/bin/clarissa
```

## Verify Installation
After installation, verify that Clarissa is working:
```shell
clarissa --help
```

## Quick Setup
Run the setup command to configure your API keys:
```shell
clarissa init
```

This will prompt for API keys for OpenRouter, OpenAI, and Anthropic. You can skip any provider you don't want to use. Clarissa will automatically use the best available provider based on your configuration.
## Local Models (Optional)
For offline or privacy-sensitive use, download a local GGUF model:
```shell
clarissa download
```

Then set it as the active model:

```shell
clarissa use Qwen2.5-7B.gguf
```

## Next Steps
For more configuration options, see the configuration guide, or jump straight to using Clarissa.