CLAUDE.md

This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.

Project Overview

SprintIQ is an AI-powered sprint intelligence and team operations copilot designed for seed to Series C startups (10-150 person engineering teams). The platform aggregates cross-platform development activity, identifies execution risks before they become blockers, and provides actionable intelligence for sprint planning and team optimization.

Technology Stack

Frontend

  • Framework: Next.js 15 with App Router and TypeScript
  • Styling: TailwindCSS with Radix UI components
  • State Management: Zustand for client state, TanStack Query for server state
  • Authentication: Supabase Auth with GitHub OAuth
  • Charts: Recharts for analytics dashboards

Backend

  • API: Express.js server with TypeScript
  • Database: PostgreSQL with Prisma ORM
  • Queue System: BullMQ for background job processing
  • AI Integration: OpenAI GPT-4 for insights and analysis
  • Integrations: GitHub, Slack, Asana, Trello, ClickUp, JIRA, Linear, Zendesk

Development Commands

Core Development

# Start both frontend and backend
npm run dev

# Start separately
npm run dev:next        # Next.js frontend on port 3000
npm run dev:server      # Express backend on port 3001

# Build for production
npm run build           # Builds both server and frontend
npm run build:server    # Server TypeScript compilation only

# Start production server
npm start               # Runs compiled server

Database Operations

# Generate Prisma client (run after schema changes)
npm run db:generate

# Push schema changes to database
npm run db:push

# Create and run migrations
npm run db:migrate

# Seed database with initial data
npm run db:seed

# Open Prisma Studio for database management
npm run db:studio

Testing

# Run all tests
npm test

# Run tests in watch mode
npm run test:watch

# Run tests with coverage
npm run test:coverage

# Run end-to-end tests
npm run test:e2e
npm run test:e2e:ui

# Run specific test suites
npm run qa:functional      # Functional tests
npm run qa:integration     # Integration tests
npm run qa:load           # Load testing with Artillery

# Run complete test suite
npm run test:all          # Lint + Jest + Playwright

Queue Workers

# Start background job workers
npm run queue:dev

Code Quality

# Lint code
npm run lint

# Type checking
npx tsc --noEmit          # Frontend
cd server && npx tsc --noEmit  # Backend

Architecture Overview

Dual-Server Architecture

The application runs two TypeScript servers:

  • Frontend: Next.js 15 App Router (src/app/) serving React components
  • Backend: Express.js API server (server/) handling business logic and integrations

Key Directories

src/
├── app/                    # Next.js App Router (pages and API routes)
│   ├── api/               # Next.js API routes
│   └── dashboard/         # Dashboard pages
├── components/            # React components
│   ├── ui/               # Reusable UI components (Radix-based)
│   ├── integrations/     # Third-party integration components
│   └── dashboard/        # Dashboard-specific components
├── lib/                  # Shared utilities
│   ├── integrations/     # Third-party service clients
│   ├── services/         # Business logic services
│   └── ai/              # AI/ML utilities
└── stores/               # Zustand state management

server/
├── routes/               # Express API routes
├── services/             # Backend business logic
├── workers/              # Background job processors
├── middleware/           # Express middleware
└── utils/               # Server utilities

prisma/
├── schema.prisma         # Database schema
├── migrations/           # Database migrations
└── seed.ts              # Database seeding

Database Architecture

  • PostgreSQL with Prisma ORM
  • Multi-tenant: Organizations contain users, teams, and data
  • Integration Support: Dedicated profile tables for GitHub, Slack, Asana, Trello, ClickUp
  • Task Delegation: Advanced task management with reassignment tracking and CDC-style logging
  • AI Insights: Generated insights with confidence scoring and delegation capabilities

Authentication & Authorization

  • Supabase Auth for user authentication with GitHub OAuth
  • Organization-based multi-tenancy
  • Role-based access control (Admin, Manager, Developer)
  • Integration profiles for third-party service connections
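The role-based access control above is typically enforced as Express middleware. A minimal sketch, assuming a `req.user` populated by the Supabase auth layer; `requireRole` and the request/response shapes are illustrative, not the actual SprintIQ implementation:

```typescript
// Hypothetical RBAC middleware sketch. Role names mirror the roles listed
// above; req.user and requireRole are assumptions for illustration.
type Role = "ADMIN" | "MANAGER" | "DEVELOPER";

interface AuthedRequest {
  user?: { id: string; role: Role; organizationId: string };
}

interface MinimalResponse {
  status(code: number): MinimalResponse;
  json(body: unknown): MinimalResponse;
}

function requireRole(...allowed: Role[]) {
  return (req: AuthedRequest, res: MinimalResponse, next: () => void) => {
    if (!req.user) {
      res.status(401).json({ error: "Not authenticated" });
      return;
    }
    if (!allowed.includes(req.user.role)) {
      res.status(403).json({ error: "Insufficient role" });
      return;
    }
    next();
  };
}
```

Usage would look roughly like `router.delete('/teams/:id', requireRole('ADMIN', 'MANAGER'), handler)`.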

Integration System

Supported Integrations

  • GitHub: Repository activity, commits, PRs, issues
  • Slack: Channel messages, team communication
  • Project Management: Asana, Trello, ClickUp, JIRA, Linear
  • Support: Zendesk ticket integration

Integration Architecture

Each integration follows a consistent pattern:

  1. OAuth/API Key Setup: Stored in encrypted profile tables
  2. Webhook Support: Real-time updates via /api/webhooks/
  3. Sync Workers: Background jobs for historical data sync
  4. Unified Data Model: External tasks normalized into common schema
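Step 4 (the unified data model) can be sketched as a pure normalization function per provider. Field names here are illustrative assumptions, not SprintIQ's actual schema; only the GitHub issue fields (`id`, `title`, `state`, `html_url`) follow GitHub's real API shape:

```typescript
// Hedged sketch: normalizing an external GitHub issue into a common task
// shape. UnifiedTask's fields are assumptions for illustration.
interface UnifiedTask {
  externalId: string;
  source: "github" | "asana" | "trello" | "clickup" | "jira";
  title: string;
  status: "open" | "closed";
  url: string;
}

interface GitHubIssue {
  id: number;
  title: string;
  state: "open" | "closed";
  html_url: string;
}

function normalizeGitHubIssue(issue: GitHubIssue): UnifiedTask {
  return {
    externalId: `github:${issue.id}`, // prefix keeps IDs unique across sources
    source: "github",
    title: issue.title,
    status: issue.state,
    url: issue.html_url,
  };
}
```

Each sync worker would emit `UnifiedTask` records, so downstream AI analysis never needs provider-specific branching.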

AI/ML Components

OpenAI Integration

  • GPT-4 for complex analysis and insights
  • Embeddings for semantic search and context matching
  • Prompt Engineering for domain-specific analysis

AI Services

  • Sprint Summaries: Automated sprint performance analysis
  • Team Health Scoring: Collaboration and productivity metrics
  • Blocker Detection: Risk identification and mitigation suggestions
  • Task Delegation: AI-powered task assignment with confidence scoring

Testing Strategy

Test Structure

  • Unit Tests: Jest with React Testing Library
  • Integration Tests: API endpoint testing
  • Functional Tests: Business logic validation
  • E2E Tests: Playwright for user workflows
  • Load Tests: Artillery for performance testing

Test Coverage

  • Minimum 70% coverage required across branches, functions, lines, and statements
  • Mock external API calls and database connections
  • Dedicated test helpers and fixtures

Background Jobs & Workers

Queue System

  • BullMQ for reliable job processing
  • Redis for job queue storage
  • Concurrent Workers: GitHub sync, third-party integrations, AI analysis

Job Types

  • Data Sync: GitHub, Slack, third-party tools
  • AI Analysis: Insight generation, team health scoring
  • Notifications: Slack digests, email reports
  • Cleanup: Expired insights, old activity data

Development Guidelines

Code Organization

  • TypeScript Strict Mode: All code must be strongly typed
  • Modular Services: Business logic in dedicated service classes
  • Error Handling: Comprehensive error boundaries and logging
  • Path Aliases: Use @/* for src imports, @/server/* for server imports
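The `@/*` and `@/server/*` aliases above are typically declared under `compilerOptions.paths` in `tsconfig.json`. A sketch assuming the directory layout shown earlier; the exact mapping in this repo may differ:

```json
{
  "compilerOptions": {
    "baseUrl": ".",
    "paths": {
      "@/*": ["./src/*"],
      "@/server/*": ["./server/*"]
    }
  }
}
```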

Database Patterns

  • Prisma Generate: Always run after schema changes
  • Migrations: Use prisma migrate for schema changes
  • Seeding: Maintain seed data for development
  • Soft Deletes: Use status fields rather than hard deletes where appropriate
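The soft-delete pattern above usually means a status enum in the Prisma schema instead of `DELETE` statements. A sketch with illustrative model and field names (not the actual SprintIQ schema):

```prisma
// Sketch: soft delete via a status field rather than a hard delete.
model Task {
  id     String     @id @default(cuid())
  title  String
  status TaskStatus @default(ACTIVE)
}

enum TaskStatus {
  ACTIVE
  ARCHIVED // "soft-deleted"; queries filter on status instead of removing rows
}
```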

Security Practices

  • Encrypted Tokens: All API keys and tokens encrypted at rest
  • CORS Configuration: Proper origin restrictions
  • Helmet: Security headers in production
  • Input Validation: Zod schemas for API validation
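The project validates input with Zod; the dependency-free stand-in below illustrates the same parse-don't-validate pattern at the API boundary. With Zod this would be roughly `z.object({ title: z.string().min(1) }).safeParse(body)`; the endpoint shape and field names are assumptions:

```typescript
// Hand-rolled stand-in for a Zod schema, illustrating boundary validation.
// The real code would use zod's safeParse; this mirrors its result shape.
interface ParseResult<T> {
  success: boolean;
  data?: T;
  error?: string;
}

function parseCreateTask(body: unknown): ParseResult<{ title: string }> {
  if (typeof body !== "object" || body === null) {
    return { success: false, error: "body must be an object" };
  }
  const title = (body as Record<string, unknown>).title;
  if (typeof title !== "string" || title.length === 0) {
    return { success: false, error: "title must be a non-empty string" };
  }
  return { success: true, data: { title } };
}
```

Handlers then branch on `success` and return 400 with the `error` message on failure.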

Environment Setup

Required Environment Variables

# Database
DATABASE_URL=postgresql://username:password@localhost:5432/sprintiq

# Authentication
NEXT_PUBLIC_SUPABASE_URL=your-supabase-url
NEXT_PUBLIC_SUPABASE_ANON_KEY=your-supabase-anon-key
SUPABASE_SERVICE_ROLE_KEY=your-supabase-service-role-key

# AI Services
OPENAI_API_KEY=your-openai-api-key

# MCP (Model Context Protocol) Services
KLAVIS_API_KEY=your-klavis-api-key

# Integration APIs
GITHUB_TOKEN=your-github-token
SLACK_BOT_TOKEN=your-slack-bot-token
ASANA_CLIENT_ID=your-asana-client-id
ASANA_CLIENT_SECRET=your-asana-client-secret

# Background Jobs
REDIS_URL=redis://localhost:6379

# Billing
STRIPE_SECRET_KEY=your-stripe-secret-key
NEXT_PUBLIC_STRIPE_PUBLISHABLE_KEY=your-stripe-publishable-key

Development Workflow

  1. Database Setup: Run migrations and seed data
  2. Start Services: Use npm run dev for concurrent frontend/backend
  3. Queue Workers: Start with npm run queue:dev for background jobs
  4. Testing: Run tests before commits
  5. Integration Testing: Use /api/integrations/status for health checks

Performance Considerations

Database Optimization

  • Connection Pooling: Prisma connection management
  • Query Optimization: Use select to limit fields
  • Indexing: Strategic indexes on frequently queried fields
  • Pagination: Implemented for large datasets
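The `select`-to-limit-fields and pagination points above combine naturally in one Prisma query-options builder. The `{ select, skip, take, orderBy }` shape matches Prisma's `findMany` options; the model, fields, and page size are assumptions:

```typescript
// Sketch: offset pagination plus field selection for prisma.task.findMany.
const PAGE_SIZE = 50; // assumed page size

function taskPageQuery(page: number) {
  return {
    select: { id: true, title: true, status: true }, // limit returned fields
    skip: (page - 1) * PAGE_SIZE,
    take: PAGE_SIZE,
    orderBy: { updatedAt: "desc" as const },
  };
}

// Usage (assumed model name):
// const tasks = await prisma.task.findMany(taskPageQuery(2));
```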

Background Processing

  • Job Prioritization: Critical tasks (blockers) get higher priority
  • Rate Limiting: Respect third-party API limits
  • Retry Logic: Exponential backoff for failed jobs
  • Monitoring: Job queue health and performance metrics
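The prioritization and retry points above correspond to BullMQ's per-job options (lower `priority` number = higher priority; `backoff: { type: "exponential" }` for retries). A sketch as a pure helper; the job-type names and concrete numbers are assumptions:

```typescript
// Sketch of job priority + retry policy, mirroring BullMQ's JobsOptions shape.
type JobType = "blocker-detection" | "data-sync" | "ai-analysis" | "cleanup";

interface SketchJobsOptions {
  priority: number; // lower = higher priority, as in BullMQ
  attempts: number;
  backoff: { type: "exponential"; delay: number };
}

function jobOptions(type: JobType): SketchJobsOptions {
  const priority =
    type === "blocker-detection" ? 1 // critical tasks jump the queue
    : type === "cleanup" ? 10        // housekeeping runs last
    : 5;                             // data-sync and ai-analysis in between
  return {
    priority,
    attempts: 5,
    backoff: { type: "exponential", delay: 2000 }, // 2s, 4s, 8s, ...
  };
}

// Usage (assumed queue name):
// await queue.add("blocker-detection", payload, jobOptions("blocker-detection"));
```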

Troubleshooting

Common Issues

  • Database Connection: Check DATABASE_URL and PostgreSQL service
  • Missing Prisma Client: Run npm run db:generate
  • Queue Workers Not Processing: Verify Redis connection
  • Integration Failures: Check API keys and webhook endpoints
  • Build Errors: Verify TypeScript configuration and dependencies

Debug Tools

  • Prisma Studio: Visual database management
  • Server Logs: Winston logging with structured output
  • Queue Dashboard: Bull Board for job monitoring (when enabled)
  • Integration Status: /api/integrations/status endpoint

MCP Context7 Integration

Overview

This project is configured to work with MCP (Model Context Protocol) Context7 for enhanced AI assistance. Context7 provides historical context, patterns, and insights from previous development work.

MCP Configuration

The project includes MCP server configurations in:

  • .claude.json - Claude Code MCP configuration
  • mcp.json - Standalone MCP server configuration

Available MCP Servers

  1. Context7 Server: Provides historical context and patterns
  2. Filesystem Server: Efficient codebase navigation and analysis
  3. Git Server: Git repository context and history

Context7 Setup

# Required environment variables for Context7
CONTEXT7_API_KEY=your-context7-api-key
CONTEXT7_PROJECT_ID=sprintiq
CONTEXT7_WORKSPACE=sprintiq-development

Using Context7 with Claude Code

When working with this codebase, Claude Code will automatically:

  • Query Context7 for similar patterns and past decisions
  • Reference historical context before suggesting architectural changes
  • Access previous integration patterns and lessons learned
  • Provide enhanced analysis based on development history

MCP Best Practices

  1. Context Queries: Use specific queries when asking for Context7 insights
  2. Pattern Recognition: Leverage historical patterns for consistent development
  3. Decision Documentation: Context7 tracks architectural decisions and rationale
  4. Integration Patterns: Reference previous integration implementations
  5. Code Analysis: Enhanced analysis with historical context and trends

Troubleshooting MCP

  • Connection Issues: Check Context7 API key and project configuration
  • Server Not Found: Verify MCP server installation and paths
  • Permission Errors: Ensure proper file system access for filesystem server