VibeCody is an AI-powered developer toolchain built entirely in Rust. It combines a terminal-first CLI coding assistant (VibeCLI) with a full-featured desktop code editor (VibeUI), both powered by a shared library of AI and editor primitives.
| Project | Description | Status |
|---|---|---|
| VibeCLI | AI coding assistant for the terminal (TUI + REPL) | Active |
| VibeUI | AI-powered desktop code editor (Tauri + Monaco) | Active |
| VibeApp | Secondary Tauri app | Active |
```bash
git clone https://github.com/TuringWorks/vibecody.git
cd vibecody
make setup   # Installs Rust, Node.js, system libs, npm deps
```

Or run the setup script directly:

```bash
./scripts/setup.sh
```

Run VibeUI in dev mode:

```bash
make ui
```

Build and run VibeCLI:

```bash
make cli                         # Build release binary
./target/release/vibecli --tui   # Run with TUI

# Or with a specific provider
./target/release/vibecli --tui --provider claude --model claude-3-5-sonnet-20241022
```

Verify your environment:

```bash
make doctor   # Checks all required + optional tools
```

| Command | Description |
|---|---|
| `make setup` | Install all prerequisites |
| `make doctor` | Verify dev environment is ready |
| `make ui` | Run VibeUI in dev mode (Vite + Tauri) |
| `make cli` | Build VibeCLI release binary |
| `make test` | Run all workspace tests |
| `make test-fast` | Run tests (excluding collab crate) |
| `make check` | Fast type-check (Rust + TypeScript) |
| `make lint` | Run clippy + TypeScript check |
| `make build` | Build everything for production |
| `make clean` | Remove build artifacts |
| `make docker` | Build Docker image |
```
vibecody/
├── Cargo.toml            # Workspace root (6 members)
├── Dockerfile            # Multi-stage musl build (Alpine runtime)
├── docker-compose.yml    # VibeCLI + Ollama sidecar (air-gapped)
├── install.sh            # One-liner installer (SHA-256 verified)
├── vibecli/
│   └── vibecli-cli/      # CLI binary (TUI + REPL)
│       ├── src/
│       │   ├── main.rs   # Entry point, command routing
│       │   ├── config.rs # TOML config (~/.vibecli/config.toml)
│       │   ├── serve.rs  # HTTP daemon for VS Code ext/SDK
│       │   ├── repl.rs   # Rustyline REPL helper
│       │   └── tui/      # Ratatui TUI (app, ui, components)
│       └── skills/       # 507 skill files (25 categories)
├── vibeui/
│   ├── src/              # React + TypeScript frontend
│   │   ├── App.tsx       # Root component
│   │   └── components/   # 60+ panel components
│   ├── src-tauri/        # Tauri Rust backend
│   └── crates/           # Shared Rust library crates
│       ├── vibe-core/        # Text buffer, FS, workspace, Git, index
│       ├── vibe-ai/          # 17 AI providers, agents, hooks, planner
│       ├── vibe-lsp/         # Language Server Protocol client
│       ├── vibe-extensions/  # WASM-based extension system
│       └── vibe-collab/      # CRDT multiplayer collaboration
├── vibeapp/              # Secondary Tauri app
├── vibe-indexer/         # Remote indexing service
├── vscode-extension/     # VS Code extension (chat + completions)
├── jetbrains-plugin/     # JetBrains IDE plugin
├── neovim-plugin/        # Neovim plugin
├── packages/
│   └── agent-sdk/        # TypeScript Agent SDK
├── docs/                 # Jekyll GitHub Pages site
└── .github/workflows/    # CI/CD (pages, release)
```
The `vibeui/crates/` libraries are designed to be reused across both VibeCLI and VibeUI:
**vibe-core** — Core editor primitives: text buffer (rope-based), file system operations, workspace management, Git integration, terminal PTY, diff engine, code search, and embedding-based codebase indexing.
**vibe-ai** — Unified AI provider abstraction with agent loop, hooks, planner, multi-agent orchestration, skills, artifacts, admin policy, trace/session resume, and OpenTelemetry. Supports 17 providers:
- Ollama — Local/private models (default)
- Anthropic Claude — Claude 3.5 Sonnet/Opus
- OpenAI — GPT-4 and variants
- Google Gemini — Gemini Pro 1.5
- xAI Grok — Grok Beta
- Groq — Fast inference
- OpenRouter — Multi-provider gateway
- Azure OpenAI — Enterprise Azure-hosted models
- AWS Bedrock — AWS-hosted models
- GitHub Copilot — Copilot integration
- LocalEdit — Local code editing model
- Mistral — Mistral AI models
- Cerebras — Wafer-scale inference
- DeepSeek — DeepSeek models
- Zhipu — GLM models
- Vercel AI — Vercel AI SDK
- Failover — Auto-failover wrapper (chains multiple providers)
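The `Failover` entry wraps other providers and falls through to the next one in the chain when the current one is unavailable. Below is a minimal, synchronous Rust sketch of that pattern; the `Provider` and `Mock` types are hypothetical stand-ins, not the real async `AIProvider` trait from `vibe-ai`.

```rust
// Simplified failover sketch. `Provider` and `Mock` are illustrative only;
// the real trait is async and has a much richer surface.
trait Provider {
    fn name(&self) -> &str;
    fn is_available(&self) -> bool;
    fn chat(&self, prompt: &str) -> Result<String, String>;
}

struct Mock {
    name: &'static str,
    up: bool,
}

impl Provider for Mock {
    fn name(&self) -> &str { self.name }
    fn is_available(&self) -> bool { self.up }
    fn chat(&self, prompt: &str) -> Result<String, String> {
        Ok(format!("{}: {}", self.name, prompt))
    }
}

// Tries each provider in order and uses the first one that reports available.
struct Failover {
    chain: Vec<Box<dyn Provider>>,
}

impl Failover {
    fn chat(&self, prompt: &str) -> Result<String, String> {
        for provider in &self.chain {
            if provider.is_available() {
                return provider.chat(prompt);
            }
        }
        Err("no provider available".to_string())
    }
}

fn main() {
    let failover = Failover {
        chain: vec![
            Box::new(Mock { name: "ollama", up: false }), // local model is down
            Box::new(Mock { name: "claude", up: true }),  // falls through to here
        ],
    };
    println!("{}", failover.chat("hello").unwrap()); // prints "claude: hello"
}
```

Because the chain holds trait objects, any mix of providers can be layered behind one interface, which is why the failover wrapper composes with all sixteen concrete providers above.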
**vibe-lsp** — Language Server Protocol client for intelligent code features (go-to-definition, diagnostics, completions).
**vibe-extensions** — WASM-based extension runtime (Wasmtime) exposing a plugin API.
**vibe-collab** — CRDT-based multiplayer collaboration for real-time shared editing sessions.
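The collaboration crate relies on CRDTs, which let replicas apply edits in any order and still converge. For intuition, here is a minimal grow-only counter CRDT in Rust; this is an illustration only, not vibe-collab's actual data model, which is a text CRDT for shared editing.

```rust
use std::collections::HashMap;

// Grow-only counter (G-Counter): each replica tracks its own increment count,
// and merging takes the element-wise maximum. Merge is commutative,
// associative, and idempotent, so replicas converge regardless of the order
// in which they exchange state.
#[derive(Clone, Default)]
struct GCounter {
    counts: HashMap<String, u64>,
}

impl GCounter {
    fn increment(&mut self, replica: &str) {
        *self.counts.entry(replica.to_string()).or_insert(0) += 1;
    }

    fn value(&self) -> u64 {
        self.counts.values().sum()
    }

    fn merge(&mut self, other: &GCounter) {
        for (replica, &n) in &other.counts {
            let entry = self.counts.entry(replica.clone()).or_insert(0);
            *entry = (*entry).max(n);
        }
    }
}

fn main() {
    let mut alice = GCounter::default();
    let mut bob = GCounter::default();

    // Concurrent edits on two replicas.
    alice.increment("alice");
    alice.increment("alice");
    bob.increment("bob");

    // Each side merges the other's state; both converge on the same value.
    let bob_before = bob.clone();
    bob.merge(&alice);
    alice.merge(&bob_before);

    println!("alice sees {}, bob sees {}", alice.value(), bob.value()); // both 3
}
```

A text CRDT applies the same merge discipline to sequences of characters rather than counters, which is what makes conflict-free shared editing possible.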
All providers implement the `AIProvider` trait from `vibe-ai`:

```rust
#[async_trait]
pub trait AIProvider: Send + Sync {
    fn name(&self) -> &str;
    async fn is_available(&self) -> bool;
    async fn complete(&self, context: &CodeContext) -> Result<CompletionResponse>;
    async fn stream_complete(&self, context: &CodeContext) -> Result<CompletionStream>;
    async fn chat(&self, messages: &[Message], context: Option<String>) -> Result<String>;
    async fn stream_chat(&self, messages: &[Message]) -> Result<CompletionStream>;
    // + chat_response, chat_with_images, and more
}
```

Configure providers in `~/.vibecli/config.toml`:
```toml
[ollama]
enabled = true
api_url = "http://localhost:11434"
model = "qwen2.5-coder:7b"

[claude]
enabled = false
api_key = "sk-ant-..."
model = "claude-3-5-sonnet-20241022"

[openai]
enabled = false
api_key = "sk-..."
model = "gpt-4o"

[gemini]
enabled = false
api_key = "AIza..."
model = "gemini-2.0-flash"

[grok]
enabled = false
api_key = "..."
model = "grok-3-mini"

[groq]
enabled = false
api_key = "gsk_..."
model = "llama-3.3-70b-versatile"

[mistral]
enabled = false
api_key = "..."
model = "mistral-large-latest"

# See docs/configuration.md for all 17 providers

[safety]
require_approval_for_commands = true
require_approval_for_file_changes = true
```

`make setup` installs everything automatically. If you prefer manual setup:
| Requirement | Version | Install |
|---|---|---|
| Rust | stable | `curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs \| sh` |
| Node.js | >= 20 | nodejs.org or nvm install 20 |
| Git | any | Usually pre-installed |
| Ollama | any | Optional — ollama.ai for local AI |
| Docker | any | Optional — for container sandbox |
Linux only (Tauri system dependencies):

```bash
# Debian/Ubuntu
sudo apt install libwebkit2gtk-4.1-dev libgtk-3-dev libayatana-appindicator3-dev librsvg2-dev patchelf build-essential libssl-dev pkg-config

# Fedora
sudo dnf install webkit2gtk4.1-devel gtk3-devel libappindicator-gtk3-devel librsvg2-devel patchelf openssl-devel

# Arch
sudo pacman -S webkit2gtk-4.1 gtk3 libappindicator-gtk3 librsvg patchelf openssl base-devel
```

macOS only: Xcode command line tools (`xcode-select --install`)
5,900+ unit tests across the workspace.

```bash
make test        # All workspace tests
make test-fast   # Skip collab crate (faster)
make check       # Type-check only (Rust + TypeScript)

# Specific crates
cargo test -p vibe-core
cargo test -p vibe-ai
cargo test -p vibecli
```

| Problem | Fix |
|---|---|
| `rustup could not choose a version of cargo` | Run `rustup default stable` |
| `npm run tauri dev` can't find cargo (Linux) | Use `make ui` or `npm run tauri:dev` — these prepend `~/.cargo/bin` to PATH |
| Port 1420 already in use | Kill stale Vite: `lsof -i :1420` then `kill <pid>` |
| "VibeUI" is damaged (macOS) | Run `xattr -cr /Applications/VibeUI.app` (the unsigned app triggers Gatekeeper quarantine) |
| Missing `libwebkit2gtk-4.1-dev` (Linux) | Run `make setup` or install manually (see Prerequisites) |
| `Failed to run cargo: No such file` (macOS .app) | Fixed in v0.3.0 — the app now inherits the shell PATH at startup |
Full documentation is available at the GitHub Pages site (replace with actual URL).
- Architecture Overview
- VibeCLI Reference
- VibeUI Reference
- Roadmap
- Roadmap v2 (Phases 6–9)
- Configuration Guide
- Contributing
MIT — see individual crate `Cargo.toml` files.
- Tauri — Desktop application framework
- Monaco Editor — Code editor component
- Ratatui — Terminal UI framework
- Ropey — Rope data structure for text buffers
- Ollama — Local LLM runtime