Competency System & AI Quiz Generation MVP

Date Started: 2025-01-17
Status: Phase 1 Complete - Foundation Built
Focus: Upload existing content + AI quiz generation (no authoring tools yet)

What We Built

1. Competency Database Foundation

File: /apps/web/supabase/migrations/023_create_competency_system.sql

Created complete database schema for competency-based learning:

Tables Created:

  • competency_frameworks: Collections of competencies (e.g., "Bloom's Taxonomy", "ISTE Standards")

    • Support for both standard frameworks and custom tenant-created frameworks
    • Row-level security for multi-tenancy
    • Full audit trail with created_by tracking
  • competencies: Individual competencies within frameworks

    • Hierarchical structure (parent/child relationships)
    • Bloom's taxonomy levels (1-6)
    • Keyword tagging for AI matching
    • Flexible metadata storage (JSONB)
  • activity_competencies: Maps learning activities to competencies

    • Weight system (0.0-1.0) for importance
    • Mastery threshold per competency (0-100%)
    • Evidence types: completion, quiz_score, essay_grade, scorm_completion, scorm_score, time_spent
    • AI suggestion tracking (ai_suggested, ai_confidence)
  • learner_competency_progress: Tracks learner mastery

    • Real-time mastery level calculation (0-100%)
    • Status tracking: not_started, in_progress, mastered, needs_review
    • Evidence count and last demonstrated timestamp

Functions & Triggers:

  • calculate_competency_mastery(): Weighted average calculation of competency mastery
  • update_competency_progress(): Auto-updates when activity progress changes
  • Trigger: Fires on activity_progress INSERT/UPDATE to maintain real-time competency tracking
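
The weighted-average calculation that calculate_competency_mastery() performs in SQL can be sketched in TypeScript for clarity. The shapes below are illustrative assumptions, not the actual migration schema:

```typescript
// Illustrative sketch of the weighted-average mastery calculation.
// `score` comes from the activity's evidence (0-100); `weight` mirrors
// activity_competencies.weight (0.0-1.0). Names are assumptions.
interface ActivityEvidence {
  score: number;  // 0-100
  weight: number; // 0.0-1.0
}

function calculateCompetencyMastery(evidence: ActivityEvidence[]): number {
  const totalWeight = evidence.reduce((sum, e) => sum + e.weight, 0);
  if (totalWeight === 0) return 0;
  const weightedSum = evidence.reduce((sum, e) => sum + e.score * e.weight, 0);
  return Math.round(weightedSum / totalWeight);
}

// Example: a quiz (weight 0.7, score 90) and a completion (weight 0.3, score 60)
console.log(calculateCompetencyMastery([
  { score: 90, weight: 0.7 },
  { score: 60, weight: 0.3 },
])); // → 81
```

This is why weighting matters: a simple average of 90 and 60 would give 75, understating the learner's performance on the more important evidence.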

RLS Policies:

  • Learners can view their own progress
  • Instructors can view progress for their courses
  • Instructors/admins can create/manage competency frameworks
  • Full tenant isolation

Note: Migration ready but not yet applied to database. Can be applied via Supabase Dashboard SQL Editor when ready.


2. AI Quiz Generation

Backend: OpenAI Integration

File: /apps/web/src/lib/ai/openai.ts

Added generateQuiz() function with:

  • Inputs:

    • learningObjectives: Array of objectives to assess
    • topic: Optional subject area
    • difficulty: beginner | intermediate | advanced
    • questionCount: 1-20 questions (default 5)
    • questionTypes: multiple_choice, true_false, short_answer
  • AI Features:

    • Uses GPT-4o for high-quality question generation
    • Bloom's Taxonomy cognitive level assignment (1-6)
    • Automatic point allocation based on difficulty
    • Generates explanations for correct answers
    • Creates plausible distractors for multiple choice
    • Structured JSON output for consistency
  • Question Structure:

    {
      question_text: string;
      question_type: 'multiple_choice' | 'true_false' | 'short_answer';
      options?: string[]; // For multiple_choice
      correct_answer: string;
      explanation?: string;
      points: number;
      bloom_level?: 1 | 2 | 3 | 4 | 5 | 6;
    }
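
A hypothetical question object matching that structure (the content is invented for illustration, not actual AI output):

```typescript
// Hypothetical example of a generated multiple-choice question.
const sampleQuestion = {
  question_text: 'Which organelle is the site of photosynthesis?',
  question_type: 'multiple_choice' as const,
  options: ['Mitochondrion', 'Chloroplast', 'Ribosome', 'Nucleus'],
  correct_answer: 'Chloroplast',
  explanation: 'Chloroplasts contain chlorophyll, which captures light energy.',
  points: 2,
  bloom_level: 1 as const, // Remember (lowest Bloom level)
};
```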

API Route

File: /apps/web/src/app/api/ai/generate-quiz/route.ts

Endpoint: POST /api/ai/generate-quiz

Authentication & Authorization:

  • Requires authenticated user
  • Restricted to instructors and admins only
  • Validates user role before generation

Validation:

  • Learning objectives required (non-empty array)
  • Difficulty must be valid enum
  • Question count: 1-20
  • Question types must be valid
  • Comprehensive error handling for AI service failures
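
The validation rules above could look roughly like this in the route handler. Field names follow the documented inputs, but the exact implementation may differ:

```typescript
// Sketch of the request validation described above; a string return value
// is an error message, null means the request is valid. Names are assumptions.
interface GenerateQuizRequest {
  learningObjectives: string[];
  topic?: string;
  difficulty: 'beginner' | 'intermediate' | 'advanced';
  questionCount: number;
  questionTypes: string[];
}

const VALID_DIFFICULTIES = ['beginner', 'intermediate', 'advanced'];
const VALID_TYPES = ['multiple_choice', 'true_false', 'short_answer'];

function validateQuizRequest(body: GenerateQuizRequest): string | null {
  if (!Array.isArray(body.learningObjectives) || body.learningObjectives.length === 0) {
    return 'At least one learning objective is required';
  }
  if (!VALID_DIFFICULTIES.includes(body.difficulty)) {
    return 'Invalid difficulty';
  }
  if (!Number.isInteger(body.questionCount) || body.questionCount < 1 || body.questionCount > 20) {
    return 'Question count must be between 1 and 20';
  }
  if (!body.questionTypes.every((t) => VALID_TYPES.includes(t))) {
    return 'Invalid question type';
  }
  return null; // request is valid
}
```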

Error Responses:

  • 401: Unauthorized (not signed in)
  • 403: Forbidden (learners cannot generate)
  • 400: Invalid input
  • 503: AI service unavailable
  • 500: Internal server error
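
A client-side caller can map these status codes to user-facing messages. This is a sketch; the response shape and message wording are assumptions:

```typescript
// Map the endpoint's documented status codes to user-facing messages.
function statusMessage(status: number): string {
  const messages: Record<number, string> = {
    400: 'Check your quiz settings and try again.',
    401: 'Please sign in to generate quizzes.',
    403: 'Only instructors and admins can generate quizzes.',
    503: 'The AI service is temporarily unavailable. Try again shortly.',
  };
  return messages[status] ?? 'Something went wrong. Please try again.';
}

// Hypothetical caller for POST /api/ai/generate-quiz.
async function requestQuiz(objectives: string[]): Promise<unknown> {
  const res = await fetch('/api/ai/generate-quiz', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      learningObjectives: objectives,
      difficulty: 'intermediate',
      questionCount: 5,
      questionTypes: ['multiple_choice'],
    }),
  });
  if (!res.ok) throw new Error(statusMessage(res.status));
  return res.json();
}
```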

Frontend: AI Quiz Generator Component

File: /apps/web/src/components/instructor/AIQuizGenerator.tsx

Features:

  • Learning Objectives Input: Dynamic list with add/remove
  • Topic Field: Optional context for better question generation
  • Difficulty Selector: Three-level visual toggle (beginner/intermediate/advanced)
  • Question Count: Numeric input (1-20)
  • Question Type Selection: Checkboxes for multiple_choice, true_false, short_answer
  • Real-time Validation: Client-side checks before API call
  • Loading States: Spinner + disabled button during generation
  • Error Handling: User-friendly error messages

Generated Questions Preview:

  • Displays all generated questions with metadata
  • Shows question type badges (Multiple Choice, True/False, etc.)
  • Point values and Bloom levels visible
  • Highlights correct answers in green
  • Shows explanations when available
  • Ready for instructor review/editing

Callback Support:

  • onQuizGenerated() callback for parent components
  • Can integrate with existing quiz builder workflow
  • Questions can be edited before final save

How to Use

For Developers:

  1. Apply Database Migration:

    -- Via Supabase Dashboard SQL Editor, run:
    -- /apps/web/supabase/migrations/023_create_competency_system.sql
    
  2. Configure OpenAI API Key:

    # .env.local
    OPENAI_API_KEY=sk-...
    
  3. Use AI Quiz Generator in Course Builder:

    import { AIQuizGenerator } from '@/components/instructor/AIQuizGenerator';
    
    function CourseBuilder() {
      const handleQuizGenerated = (questions) => {
        // Save questions to quiz activity
        console.log('Generated questions:', questions);
      };
    
      return (
        <AIQuizGenerator
          defaultTopic="Photosynthesis"
          onQuizGenerated={handleQuizGenerated}
        />
      );
    }
    

For Instructors:

  1. Generate Quiz:

    • Enter topic (optional)
    • Add learning objectives (e.g., "Students will identify parts of a plant cell")
    • Select difficulty level
    • Choose number of questions (1-20)
    • Select question types
    • Click "Generate Quiz"
  2. Review Generated Questions:

    • AI generates questions based on your objectives
    • Each question shows:
      • Question text
      • Answer options (for multiple choice)
      • Correct answer (highlighted in green)
      • Explanation (if provided)
      • Point value and Bloom level
  3. Next Steps (Future Implementation):

    • Edit questions if needed
    • Add to quiz activity
    • Assign to module

What's NOT Built Yet (Future Phases)

Based on user feedback, there are no authoring tools yet. The focus is on:

  ✅ Upload existing content (SCORM, documents, videos)
  ✅ AI quiz generation from learning objectives
  ❌ Full content authoring (text editor, lesson builder, etc.)

Future Roadmap:

Phase 2: Competency Mapping UI (Next)

  • Simple UI to tag existing activities with competencies
  • Competency framework selector
  • Bulk competency assignment
  • AI suggestions for SCORM package competencies

Phase 3: Teacher Dashboard

  • Real-time competency mastery heatmap
  • Student progress monitoring
  • Identify struggling learners

Phase 4: Adaptive Learning (Later)

  • Skip redundant content if competency mastered
  • Personalized learning paths
  • Automatic recommendations

Phase 5: Content Authoring (Much Later)

  • AI-assisted text content creation
  • Lesson outline generation
  • Rubric builder

Technical Decisions

Why These Choices?

  1. Competency Foundation First: Need database schema before building UI
  2. AI Quiz Generation: Quick win that provides immediate value to instructors
  3. No Authoring Yet: Per user feedback, upload workflows take priority over authoring tools
  4. GPT-4o for Quizzes: More capable model ensures high-quality questions
  5. Bloom's Taxonomy: Standard framework familiar to educators
  6. Weighted Competency Mastery: More accurate than simple average
  7. Auto-Update Triggers: Real-time progress without manual recalculation

Database Design Rationale:

  • JSONB Metadata: Flexible for future extensions without schema changes
  • RLS Policies: Security at database level, not just application
  • Hierarchical Competencies: Support nested standards (Common Core, NGSS)
  • Evidence Types: Different activities demonstrate competency differently
  • AI Confidence Tracking: Allow human override of AI suggestions

Files Modified/Created

New Files:

  1. /apps/web/supabase/migrations/023_create_competency_system.sql - Database schema
  2. /apps/web/src/app/api/ai/generate-quiz/route.ts - API endpoint
  3. /apps/web/src/components/instructor/AIQuizGenerator.tsx - UI component

Modified Files:

  1. /apps/web/src/lib/ai/openai.ts - Added generateQuiz() function

Testing Notes

Manual Testing Required:

  1. Test quiz generation with various learning objectives
  2. Verify AI quality of generated questions
  3. Test error handling (invalid API key, network errors)
  4. Verify instructor-only access (403 for learners)
  5. Test question type combinations

Unit Tests Needed (Future):

  • API route validation
  • Quiz generation prompt building
  • Component rendering
  • Error state handling

Performance Considerations

OpenAI API Costs:

  • GPT-4o: ~$0.005 per 1K input tokens, ~$0.015 per 1K output tokens
  • Average 5-question quiz: ~1,000 input + ~1,500 output tokens ≈ $0.03 per quiz
  • 100 quizzes/month = $3/month in AI costs (negligible)

Database Performance:

  • Indexed foreign keys (framework_id, competency_id, activity_id)
  • GIN index on keywords array for fast AI matching
  • Trigger efficiency: Only updates affected competencies

Frontend Performance:

  • Quiz generation: 3-10 seconds (depends on OpenAI API)
  • Loading states prevent UI blocking
  • No heavy client-side processing

Security Considerations

API Route:

  • ✅ Authentication required (Supabase auth)
  • ✅ Role-based access control (instructors/admins only)
  • ✅ Input validation (learning objectives, counts, types)
  • ✅ Error message sanitization (no API key leakage)

Database:

  • ✅ Row-level security on all tables
  • ✅ Tenant isolation
  • ✅ Audit trails (created_by, created_at)

AI Generation:

  • ✅ No user data sent to OpenAI (only learning objectives)
  • ✅ No PII in prompts
  • ✅ Content sanitization before display (DOMPurify is already used in the players)

Next Steps

  1. Apply Migration: Run migration 023 in Supabase Dashboard
  2. Add Component to Course Builder: Integrate AIQuizGenerator into existing quiz activity editor
  3. User Testing: Get instructor feedback on AI question quality
  4. Competency Mapping UI: Build simple UI to tag existing activities with competencies
  5. SCORM Competency Analyzer: AI that suggests competencies from SCORM manifests

Questions for User

  1. Should we integrate AIQuizGenerator directly into the existing QuizActivityEditor?
  2. Do you want to test quiz generation with your OpenAI API key first?
  3. What competency frameworks do you want pre-loaded? (Bloom's, ISTE, Common Core?)
  4. Should we auto-save generated quizzes as drafts or require manual confirmation?

Handoff Notes for Next Developer

If continuing with Competency Mapping UI:

  • Read the database schema in migration 023
  • Understand activity_competencies table structure
  • Look at existing activity editors for patterns
  • Follow AIQuizGenerator component style for consistency

If integrating with existing quiz editor:

  • Check /apps/web/src/components/instructor/QuizActivityEditor.tsx
  • AIQuizGenerator can be modal or sidebar
  • Use onQuizGenerated callback to populate quiz_questions
  • Maintain existing auto-save behavior

If working on SCORM competency analyzer:

  • Study SCORM manifest parser: /apps/web/lib/parser/manifest-parser.ts (scorm-api)
  • Extract adlcp:objectives from manifest
  • Send to OpenAI for competency normalization
  • Return competency suggestions with confidence scores
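
The normalization step above could be sketched as follows. The objective shape and prompt wording are hypothetical; the real manifest parsing lives in /apps/web/lib/parser/manifest-parser.ts:

```typescript
// Hypothetical sketch of the competency-normalization step: turn objectives
// extracted from a SCORM manifest into a prompt for OpenAI. Names and the
// prompt format are assumptions, not the actual implementation.
interface ManifestObjective {
  id: string;
  description: string;
}

function buildCompetencyPrompt(objectives: ManifestObjective[]): string {
  const list = objectives
    .map((o, i) => `${i + 1}. [${o.id}] ${o.description}`)
    .join('\n');
  return [
    'Normalize the following SCORM objectives into competencies.',
    'For each objective, return a competency name, a Bloom level (1-6),',
    'and a confidence score (0.0-1.0) for the ai_confidence column.',
    '',
    list,
  ].join('\n');
}

// Usage with a single extracted objective:
const prompt = buildCompetencyPrompt([
  { id: 'obj-1', description: 'Identify the parts of a plant cell' },
]);
```

The confidence score in the response maps directly onto the ai_confidence column in activity_competencies, so instructors can review and override low-confidence suggestions.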