Genie is an intelligent agent that continuously discovers relevant career, speaking, and professional growth opportunities based on your personal goals.
All core features are implemented and tested. See FINAL_STATUS.md for complete details.
- IMPLEMENTATION_COMPLETE.md - Complete feature overview and architecture
- FINAL_STATUS.md - Implementation status and what works now
- QUICK_REFERENCE.md - Common commands and troubleshooting
- ABLY_SETUP.md - Ably real-time messaging setup
- AGENT_ARCHITECTURE.md - Multi-agent system details
- CLOUD_SETUP.md - Supabase and Temporal Cloud setup
- DEPLOYMENT.md - Production deployment guide
- Chat-Based Interface: ChatGPT-style conversational UI for creating goals
- AI-Powered Clarification: Natural language processing with conversational follow-up questions
- Multi-Source Scraping: Searches across 8+ platforms using LLM-powered extraction
- Smart Ranking: Vector similarity search with user feedback integration
- Real-time Updates: Live status updates via Ably during searches
- Beautiful Dashboard: Modern React interface with dark theme
- Feedback Learning: System learns from your preferences
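To make the ranking idea concrete, here is a minimal sketch of how vector similarity and feedback weighting might be combined. The 0.8/0.2 blend, the field layout, and the function names are illustrative assumptions, not the actual implementation (which uses pgvector on the database side):

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def score(goal_vec: list[float], opp_vec: list[float], feedback_boost: float = 0.0) -> float:
    # Blend raw similarity with accumulated user feedback;
    # the 0.8/0.2 split is an assumed weighting, not Genie's actual one.
    return 0.8 * cosine(goal_vec, opp_vec) + 0.2 * feedback_boost

# (title, embedding, feedback boost) -- toy 2-D embeddings for illustration
opportunities = [
    ("PyCon CFP", [0.9, 0.1], 0.5),
    ("Remote job", [0.2, 0.8], 0.0),
]
goal = [1.0, 0.0]
ranked = sorted(opportunities, key=lambda o: score(goal, o[1], o[2]), reverse=True)
print([title for title, _, _ in ranked])  # → ['PyCon CFP', 'Remote job']
```

In production the similarity step runs inside PostgreSQL via pgvector rather than in Python; the sketch only shows how feedback can re-weight raw similarity.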
- FastAPI - High-performance Python web framework
- PostgreSQL + pgvector - Database with vector similarity search
- Temporal - Workflow orchestration for async tasks
- OpenAI GPT-4 - LLM for goal clarification and summarization
- SQLAlchemy - ORM for database operations
- Ably - Managed realtime messaging and WebSocket infrastructure
- React 18 + TypeScript - Modern UI framework
- TanStack Query - Data fetching and caching
- Tailwind CSS - Utility-first styling
- Vite - Fast build tool
- Clarifier Agent - Refines user goals into structured filters
- Executor Agent - Coordinates parallel scraping across sources
- Ranker Agent - Ranks opportunities by relevance with feedback weighting
- Coordinator Agent - Orchestrates the entire workflow
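The hand-off between the four agents can be sketched as below. The class and method names are illustrative stand-ins, not the actual module API, and the LLM and scraper calls are faked with simple placeholders:

```python
import asyncio

class Clarifier:
    async def run(self, goal: str) -> dict:
        # In Genie this step calls the LLM; here we fake structured filters.
        return {"keywords": goal.lower().split(), "sources": ["papercall"]}

class Executor:
    async def run(self, filters: dict) -> list[str]:
        # The real implementation fans out to scrapers in parallel.
        return [f"opportunity matching {kw}" for kw in filters["keywords"]]

class Ranker:
    async def run(self, items: list[str]) -> list[str]:
        return sorted(items)  # stand-in for relevance + feedback ranking

class Coordinator:
    """Orchestrates clarify -> execute -> rank, mirroring the workflow."""
    def __init__(self):
        self.clarifier, self.executor, self.ranker = Clarifier(), Executor(), Ranker()

    async def run(self, goal: str) -> list[str]:
        filters = await self.clarifier.run(goal)
        items = await self.executor.run(filters)
        return await self.ranker.run(items)

results = asyncio.run(Coordinator().run("Speaking CFP"))
print(results)  # → ['opportunity matching cfp', 'opportunity matching speaking']
```

In the real system the Coordinator runs as a Temporal workflow, so each stage is durable and retryable rather than a plain coroutine.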
- Docker and Docker Compose
- OpenAI API key
- Ably account (for realtime updates)
- Supabase account (for auth and database)
- Temporal Cloud account (for workflow orchestration)
- Clone the repository:

  ```bash
  git clone <repository-url>
  cd genie
  ```

- Configure environment variables:

  ```bash
  cp .env.example .env
  # Edit .env and add your API keys
  ```

- Set up Ably for realtime communication: see ABLY_SETUP.md for detailed instructions.

- Start all services:

  ```bash
  docker-compose up -d
  ```

- Access the application:
  - Frontend: http://localhost:5173
  - Backend API: http://localhost:8000
  - API Docs: http://localhost:8000/docs
  - Temporal UI: http://localhost:8080
Backend:

```bash
cd backend
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install -r requirements.txt
uvicorn app.main:app --reload
```

Frontend:

```bash
cd frontend
npm install
npm run dev
```

Project structure:

```
genie/
├── backend/
│   ├── app/
│   │   ├── agents/      # Multi-agent system
│   │   ├── api/         # FastAPI routes
│   │   ├── models/      # SQLAlchemy models
│   │   ├── schemas/     # Pydantic schemas
│   │   ├── scrapers/    # Web scrapers
│   │   ├── services/    # Business logic
│   │   ├── workflows/   # Temporal workflows
│   │   ├── config.py    # Configuration
│   │   ├── database.py  # Database setup
│   │   └── main.py      # FastAPI app
│   ├── alembic/         # Database migrations
│   ├── requirements.txt
│   └── Dockerfile
├── frontend/
│   ├── src/
│   │   ├── api/         # TanStack Query hooks
│   │   ├── components/  # React components
│   │   ├── pages/       # Page components
│   │   ├── types/       # TypeScript types
│   │   └── App.tsx      # Main app component
│   ├── package.json
│   └── Dockerfile
└── docker-compose.yml
```
- `POST /api/goals` - Create a new goal
- `GET /api/goals` - List user's goals
- `GET /api/goals/{id}` - Get goal details
- `PATCH /api/goals/{id}` - Update goal status
- `DELETE /api/goals/{id}` - Delete goal
- `POST /api/goals/{id}/refresh` - Manually trigger scraping
- `GET /api/opportunities` - List opportunities (with filtering)
- `GET /api/opportunities/{id}` - Get opportunity details
- `POST /api/feedback` - Submit feedback on an opportunity
- `GET /api/feedback` - List user feedback
- `GET /api/feedback/stats` - Get feedback statistics
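For example, a goal-creation request might be built like this. The payload field names are assumptions for illustration; the real shape is defined by the Pydantic schemas in `backend/app/schemas/`:

```python
import json
from urllib import request

# Hypothetical request body; field names are illustrative assumptions.
payload = {
    "title": "Find speaking opportunities",
    "description": "CFPs for Python conferences in Europe",
}
body = json.dumps(payload).encode()

req = request.Request(
    "http://localhost:8000/api/goals",
    data=body,
    headers={"Content-Type": "application/json"},
    method="POST",
)
# Uncomment once the backend is running:
# with request.urlopen(req) as resp:
#     print(resp.status, resp.read())
print(req.get_method(), req.full_url)  # → POST http://localhost:8000/api/goals
```

The interactive docs at http://localhost:8000/docs expose the same endpoints with their exact request and response schemas.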
Currently integrated sources:
- Papercall.io - Speaking opportunities and CFPs
- Sessionize - Conference speaking slots
- RemoteOK - Remote job listings
- We Work Remotely - Remote jobs
- Indeed - General job search
- Y Combinator Jobs - Startup positions
- AngelList (Wellfound) - Startup jobs
- Eventbrite - Events and conferences
- Create a new scraper in `backend/app/scrapers/`:

  ```python
  from app.scrapers.base import BaseScraper

  class NewSourceScraper(BaseScraper):
      def __init__(self):
          super().__init__(
              source_name="newsource",
              base_url="https://newsource.com"
          )

      async def scrape(self, filters: dict) -> list:
          # Implement scraping logic
          pass
  ```

- Register it in `backend/app/scrapers/__init__.py`:

  ```python
  SCRAPER_REGISTRY["newsource"] = NewSourceScraper()
  ```

Triggered when a user creates a new goal:
- Clarify goal with LLM
- Execute scraping across relevant sources
- Rank and store opportunities
- Return results to user
Runs every 24 hours:
- Scrape all sources
- Store new opportunities
- Log scraping status
Continuous monitoring for active goals:
- Check for new opportunities (every 24h)
- Rank by relevance
- Send notifications if new matches found
Key environment variables:

```bash
# Supabase
SUPABASE_URL=https://your-project.supabase.co
SUPABASE_ANON_KEY=your-anon-key
SUPABASE_SERVICE_ROLE_KEY=your-service-role-key
SUPABASE_JWT_SECRET=your-jwt-secret

# Database (use direct port 5432, not pooled 6543)
DATABASE_URL=postgresql+asyncpg://user:password@host:5432/database

# OpenAI
OPENAI_API_KEY=sk-...

# Temporal Cloud
TEMPORAL_ADDRESS=your-namespace.tmprl.cloud:7233
TEMPORAL_NAMESPACE=your-namespace
TEMPORAL_API_KEY=your-temporal-api-key

# Ably Realtime
ABLY_API_KEY=your-ably-api-key

# Application
SECRET_KEY=your-secret-key-min-32-characters
DEBUG=True
ALLOWED_ORIGINS=http://localhost:5173,http://localhost:3000

# Scraping
SCRAPING_RATE_LIMIT=2
SCRAPING_USER_AGENT=Genie-Bot/1.0
```

Backend tests:

```bash
cd backend
pytest
```

Frontend tests:

```bash
cd frontend
npm test
```

- Build images:

  ```bash
  docker-compose build
  ```

- Run in production mode:

  ```bash
  docker-compose -f docker-compose.prod.yml up -d
  ```

- Set `DEBUG=False` in production
- Use a strong `SECRET_KEY`
- Configure proper CORS origins
- Set up SSL/TLS certificates
- Use managed PostgreSQL (recommended)
- Configure monitoring and logging
- All scrapers respect `robots.txt`
- Rate limiting prevents server overload (1-2 req/sec)
- Only public data is scraped
- Source attribution in all opportunity listings
- No personal data collected without consent
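The 1-2 req/sec limit can be enforced with a simple per-source throttle along these lines. The class name and its placement in the scraper layer are assumptions for illustration, not the actual scraper code:

```python
import asyncio
import time

class RateLimiter:
    """Allow at most `rate` requests per second for one source."""

    def __init__(self, rate: float = 2.0):
        self.min_interval = 1.0 / rate
        self.last_call = float("-inf")  # first call passes immediately

    async def wait(self) -> None:
        # Sleep just long enough to keep calls min_interval apart.
        elapsed = time.monotonic() - self.last_call
        if elapsed < self.min_interval:
            await asyncio.sleep(self.min_interval - elapsed)
        self.last_call = time.monotonic()

async def demo() -> float:
    limiter = RateLimiter(rate=2.0)  # matches SCRAPING_RATE_LIMIT=2
    start = time.monotonic()
    for _ in range(3):
        await limiter.wait()  # each fetch would happen here
    return time.monotonic() - start

elapsed = asyncio.run(demo())
# Three throttled calls at 2 req/sec take at least ~1 second in total.
print(f"{elapsed:.2f}s")
```

Each scraper would hold one limiter instance, so slowing down one source never delays the others running in parallel.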
- Email notifications
- Mobile app (React Native)
- Multi-goal support
- Advanced filters and preferences
- Social sharing
- API partnerships
- Skills gap analysis
- Learning resource recommendations
Contributions are welcome! Please:
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests
- Submit a pull request
MIT License - see LICENSE file for details
For issues, questions, or contributions:
- GitHub Issues: [repository-url]/issues
- Documentation: [docs-url]
Built with:
- OpenAI GPT-4
- FastAPI
- React
- Temporal
- pgvector
- Tailwind CSS