Overview

OpenAnalyst CLI is an independent, open-source AI coding agent that connects to every major LLM provider through a single, unified terminal interface. It features a full-screen terminal UI, multi-provider streaming, multi-agent orchestration, 51+ slash commands, 19 built-in tools, and MCP protocol support.

Why OpenAnalyst?

| Capability | OpenAnalyst CLI | Other CLI Tools |
| --- | --- | --- |
| Providers | 7 first-class (OpenAnalyst, Anthropic, OpenAI, xAI, Gemini, OpenRouter, Bedrock) | Typically 1–2 |
| Mid-conversation switching | Session persists across providers | Not supported |
| Terminal UI | Full-screen TUI (default) | Basic REPL |
| Multi-agent | Built-in parallel orchestration | Limited or none |
| Multimedia | /image, /voice, /speak, /vision, /diagram | Rarely supported |
| Slash commands | 51+ | 5–15 typical |
| Binary | Native binary, zero runtime deps | Often needs Node/Python |

Key Features

🌐 Universal Provider Support

Connect to OpenAnalyst, Anthropic Claude, OpenAI GPT, Google Gemini, xAI Grok, OpenRouter (350+ models), and Amazon Bedrock — all with live model discovery and streaming.
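A unified multi-provider interface like this is typically built around a single streaming abstraction that every provider adapter implements. The sketch below is a minimal illustration of that pattern in Python; the `Provider` protocol, `EchoProvider` adapter, and `run` helper are hypothetical names, not OpenAnalyst CLI's actual API.

```python
from dataclasses import dataclass
from typing import Iterator, Protocol

class Provider(Protocol):
    """Every provider adapter exposes the same streaming interface,
    which is what lets the CLI swap providers mid-session."""
    name: str
    def stream(self, prompt: str) -> Iterator[str]: ...

@dataclass
class EchoProvider:
    """Stand-in adapter; a real one would call the provider's API."""
    name: str
    def stream(self, prompt: str) -> Iterator[str]:
        for word in prompt.split():
            yield word  # a real adapter would yield tokens from the API

def run(provider: Provider, prompt: str) -> str:
    # The UI layer depends only on the Provider protocol, never on a
    # concrete provider, so switching providers keeps the session intact.
    return " ".join(provider.stream(prompt))

print(run(EchoProvider(name="anthropic"), "hello unified interface"))
```

Because the chat loop only sees the protocol, swapping `EchoProvider` for another adapter changes nothing upstream, which is the property that makes mid-conversation provider switching possible.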

🖥 Full Terminal UI

Scrollable chat with inline tool cards, startup banner, status line with spinner and token counts, vim-mode input, permission dialogs, and mouse navigation.

🤖 Multi-Agent Orchestration

Spawn sub-agents for parallel tasks, run autonomous agent loops with think→act→observe→verify cycles, and dispatch Mixture of Experts for complex problems.
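The think→act→observe→verify cycle above can be sketched as a simple bounded loop. This is an illustrative toy, assuming a trivial counter "task"; OpenAnalyst CLI's actual orchestrator internals are not documented here.

```python
def agent_loop(goal: int, max_steps: int = 10) -> int:
    """Toy agent loop: drive a counter toward `goal`, one step per cycle."""
    state = 0
    for _ in range(max_steps):
        # think: choose an action based on the current state
        step = 1 if state < goal else 0
        # act: apply the chosen action
        state += step
        # observe: read back the new state (trivial in this toy)
        observed = state
        # verify: stop once the goal is confirmed met
        if observed >= goal:
            break
    return state

print(agent_loop(3))  # -> 3
```

The `max_steps` bound is the important detail: autonomous loops are capped so a sub-agent that never verifies success still terminates.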

🔧 51+ Slash Commands & 19 Tools

Git operations, multimedia generation, web scraping, planning, analysis, Playwright browser control, and more — all from your terminal.

🔒 Permission System & Sandboxing

Three permission modes (read-only, workspace-write, full-access) with modal approval dialogs and filesystem sandboxing.
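A three-mode permission check of this kind usually reduces to a small decision function over the requested path and the workspace root. The sketch below is an assumption about how such a check could look, not OpenAnalyst CLI's actual rules; `Mode` and `may_write` are hypothetical names.

```python
from enum import Enum
from pathlib import Path

class Mode(Enum):
    READ_ONLY = "read-only"
    WORKSPACE_WRITE = "workspace-write"
    FULL_ACCESS = "full-access"

def may_write(mode: Mode, path: Path, workspace: Path) -> bool:
    """Decide whether a write to `path` is allowed under `mode`."""
    if mode is Mode.FULL_ACCESS:
        return True
    if mode is Mode.WORKSPACE_WRITE:
        # Allow writes only to files under the workspace root.
        return path.is_relative_to(workspace)
    return False  # READ_ONLY: no writes anywhere

ws = Path("/repo")
print(may_write(Mode.WORKSPACE_WRITE, Path("/repo/src/main.rs"), ws))  # True
print(may_write(Mode.READ_ONLY, Path("/repo/src/main.rs"), ws))        # False
```

In the real CLI a denied check would surface as a modal approval dialog rather than a silent refusal, letting the user grant the write once.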

🚀 Single Native Binary

Ships as one native binary with zero runtime dependencies. Fast startup, low memory use, and support for macOS, Linux, and Windows.

Documentation Pages

Getting Started

Core Concepts

Advanced