A multi-agent AI discussion platform where users can set up LLM participants to discuss topics autonomously.
- Create AI agents with custom system prompts and configurations
- Define conversation flows as graphs with event nodes (generate, evaluate, decide, summarize)
- Watch multiple agents discuss topics in real-time via WebSocket
- Upload documents for context (RAG integration with ChromaDB)
- Dark/light mode support
- Pause/resume discussion controls
- Frontend: React 19 + Vite + shadcn/ui + Tailwind CSS + React Flow
- Backend: FastAPI + LangGraph + Celery
- Database: Supabase (PostgreSQL + Auth)
- Message Queue: Redis
- Vector Store: ChromaDB
- LLM: vLLM (self-hosted, OpenAI-compatible)
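Because the LLM layer is OpenAI-compatible, the backend can target vLLM, OpenAI, or any compatible server with the same request shape. A minimal sketch of the chat-completions payload (the model name and messages below are placeholder examples, not values from this repo):

```python
import json


def chat_completion_payload(model: str, system_prompt: str, user_message: str,
                            temperature: float = 0.7) -> dict:
    """Build an OpenAI-compatible /v1/chat/completions request body.

    The same body works against vLLM, OpenAI, or any compatible server.
    """
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_message},
        ],
        "temperature": temperature,
    }


payload = chat_completion_payload(
    model="my-model",  # placeholder: use the model your server serves
    system_prompt="You are a debate moderator.",
    user_message="Open the discussion on renewable energy.",
)
print(json.dumps(payload, indent=2))
```

This is why swapping providers only requires changing the base URL, API key, and model name in the settings UI.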
- Docker and Docker Compose v2+
- A Supabase account and project (free tier works)
- An LLM API endpoint (vLLM, OpenAI, or any OpenAI-compatible API)
```bash
# Clone the repository
git clone <repository-url>
cd agent-discuss

# Copy environment file
cp .env.example .env
```

Edit `.env` with your credentials:
```env
# Get these from Supabase Dashboard > Settings > API
SUPABASE_URL=https://your-project.supabase.co
SUPABASE_KEY=your-anon-key
SUPABASE_SERVICE_KEY=your-service-role-key
SUPABASE_JWT_SECRET=your-jwt-secret

# Frontend needs these too
VITE_SUPABASE_URL=https://your-project.supabase.co
VITE_SUPABASE_ANON_KEY=your-anon-key
```

- Go to your Supabase project's SQL Editor
- Copy and run the contents of `backend/scripts/init.sql`; this creates all required tables with Row Level Security policies
```bash
# Start all services with Docker Compose
docker compose up -d

# Watch the logs
docker compose logs -f
```

| Service | URL |
|---|---|
| Frontend | http://localhost:5173 |
| Backend API | http://localhost:8000 |
| API Documentation | http://localhost:8000/docs |
| ChromaDB | http://localhost:8001 |
- Register/login at http://localhost:5173
- Go to Settings > LLM Providers
- Add your LLM connection:
  - Name: My vLLM / OpenAI / etc.
  - Base URL: your API endpoint (e.g., `http://host.docker.internal:8080/v1` for local vLLM)
  - API Key: your API key
  - Model: the model name to use
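To sanity-check a provider entry before saving it, you can query the `/models` endpoint that OpenAI-compatible servers expose. A rough sketch using only the standard library (the base URL and key below are examples for your own deployment):

```python
import json
import urllib.request


def models_url(base_url: str) -> str:
    """Join the provider's base URL with the OpenAI-compatible /models path."""
    return base_url.rstrip("/") + "/models"


def list_models(base_url: str, api_key: str) -> list[str]:
    """Return the model IDs the server advertises; raises on connection/auth errors."""
    req = urllib.request.Request(
        models_url(base_url),
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        data = json.load(resp)
    return [m["id"] for m in data.get("data", [])]


# Example (adjust host, port, and key to your setup):
# print(list_models("http://host.docker.internal:8080/v1", "sk-..."))
```

If the model name you entered is not in the returned list, generation requests will fail, so this catches typos early.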
```bash
cd backend

# Create virtual environment
python -m venv venv
source venv/bin/activate  # or `venv\Scripts\activate` on Windows

# Install dependencies
pip install -r requirements.txt

# Run the server
uvicorn app.main:app --reload --port 8000
```

```bash
cd frontend

# Install dependencies
npm install

# Start dev server
npm run dev
```

```bash
# Using Docker
docker run -d -p 6379:6379 redis:7-alpine

# Or install locally
brew install redis             # macOS
sudo apt install redis-server  # Ubuntu
```

```bash
cd backend
celery -A app.tasks.celery_app worker --loglevel=info
```

```
.
├── backend/                  # FastAPI backend
│   ├── app/
│   │   ├── api/routes/       # REST endpoints
│   │   ├── agents/           # LangGraph orchestration
│   │   │   ├── state.py      # State definitions
│   │   │   ├── graph.py      # Graph builder
│   │   │   ├── nodes.py      # Node implementations
│   │   │   └── context.py    # Context management
│   │   ├── services/         # Business logic
│   │   │   ├── websocket_manager.py
│   │   │   ├── vectorstore.py
│   │   │   └── document_processor.py
│   │   └── tasks/            # Celery tasks
│   └── scripts/              # Database scripts
│
├── frontend/                 # React frontend
│   └── src/
│       ├── components/
│       │   ├── ui/           # shadcn components
│       │   ├── discussions/  # Discussion views
│       │   ├── messages/     # Message display
│       │   ├── graph-editor/ # React Flow editor
│       │   ├── agents/       # Agent management
│       │   └── documents/    # Document uploads
│       ├── hooks/            # React hooks
│       ├── stores/           # Zustand stores
│       └── pages/            # Route pages
│
├── docker-compose.yml        # Docker orchestration
└── .env.example              # Environment template
```
The application supports these node types in conversation graphs:
| Node Type | Description |
|---|---|
| Start | Entry point of the conversation |
| Generate | Agents take turns generating content |
| Evaluate | Agents vote/score on criteria |
| Decision | Conditional branch based on evaluation |
| Summary | Compress context to manage token limits |
| End | Final summary and conclusion |
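The node types above can be pictured as a simple graph walk: non-decision nodes transform the discussion state and fall through to their single outgoing edge, while Decision nodes choose the next node from the evaluation result. The sketch below is an illustrative stand-in, not the actual LangGraph code in `backend/app/agents/`; the node names, state shape, and handlers are invented for the example:

```python
def run_graph(graph: dict, handlers: dict, state: dict) -> dict:
    """Walk the graph from 'start' to 'end'.

    Decision handlers return the name of the next node; every other
    handler returns updated state and falls through to node['next'].
    """
    current = "start"
    while current != "end":
        node = graph[current]
        handler = handlers.get(node["type"])
        if node["type"] == "decision":
            current = handler(state)  # branch on the evaluation result
        else:
            if handler:
                state = handler(state)
            current = node["next"]
    return state


graph = {
    "start": {"type": "start",    "next": "gen"},
    "gen":   {"type": "generate", "next": "eval"},
    "eval":  {"type": "evaluate", "next": "dec"},
    "dec":   {"type": "decision"},                 # branches on the score
    "sum":   {"type": "summary",  "next": "gen"},  # compress context, loop back
    "end":   {"type": "end"},
}

handlers = {
    "generate": lambda s: {**s, "turns": s["turns"] + 1},
    "evaluate": lambda s: {**s, "score": 1.0 if s["turns"] >= 2 else 0.0},
    "decision": lambda s: "end" if s["score"] >= 1.0 else "sum",
    "summary":  lambda s: s,  # real implementation would compress history
}

final = run_graph(graph, handlers, {"turns": 0, "score": 0.0})
print(final)  # two generate turns before the decision routes to end
```

The real implementation carries full message history, per-agent turn order, and token budgets in the state, but the control flow follows this pattern.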
Make sure all services are running:
```bash
docker compose ps
```

Check that:

- Backend is running on port 8000
- `VITE_WS_URL` is set correctly in `.env`
- No firewall is blocking WebSocket connections
- The base URL is accessible from Docker (use `host.docker.internal` for local services)
- The API key is correct
- The model name matches exactly
- The Celery worker logs look healthy: `docker compose logs celery-worker`
- ChromaDB is running: `docker compose ps chromadb`
- The document status in the UI shows "Processing", then "Ready"

```bash
# Stop and remove all containers and volumes
docker compose down -v

# Rebuild from scratch
docker compose up -d --build
```

Interactive API documentation is available at http://localhost:8000/docs when the backend is running.
Key endpoints:
- `POST /api/v1/auth/login` - Login
- `GET /api/v1/agents` - List agents
- `GET /api/v1/discussions` - List discussions
- `POST /api/v1/discussions/{id}/start` - Start a discussion
- `WS /api/v1/ws/discussions/{id}` - WebSocket for real-time updates
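A small client sketch for driving these endpoints from a script. It assumes bearer-token auth with the JWT returned by the login endpoint, which matches the Supabase setup above but is an assumption about the exact header format; the base URL and discussion ID are examples:

```python
import json
import urllib.request

BASE = "http://localhost:8000/api/v1"  # adjust for your deployment


def auth_headers(token: str) -> dict:
    """Attach the JWT as a bearer token (assumed auth scheme)."""
    return {"Authorization": f"Bearer {token}", "Content-Type": "application/json"}


def start_discussion(discussion_id: str, token: str) -> dict:
    """POST /discussions/{id}/start and return the parsed JSON response."""
    req = urllib.request.Request(
        f"{BASE}/discussions/{discussion_id}/start",
        data=b"{}",
        headers=auth_headers(token),
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.load(resp)


# Example (requires a running backend and a valid token):
# start_discussion("your-discussion-id", "your-jwt")
```

Once started, connect to the matching `WS /api/v1/ws/discussions/{id}` endpoint to stream messages as agents generate them.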
MIT