Why AI curriculum quality varies so much
A vague prompt produces a generic lesson.
A specific prompt produces school-ready output.
The quality of AI-generated curriculum is almost entirely determined by the quality of the prompt. A vague prompt — ‘write a lesson on photosynthesis for Year 9’ — produces a competent but generic lesson. A specific, well-structured prompt that includes the student profile, the specific misconceptions to address, the preferred teaching approach, and the assessment format produces something a teacher could use with minimal editing. Writing the specific prompt takes an extra 3–5 minutes; it cuts editing time by 40–60 minutes.
Four prompt templates
Copy and fill.
One prompt per build stage.
🤖Course outline prompt
‘Design a [N]-week course on [specific topic] for [year group] [subject] students at [level]. Student profile: [prior knowledge, exam context, relevant gaps]. The course should enable students to [specific outcome]. Structure: [N] topics, one per week, with 3 lessons per topic. For each week include: topic title, specific learning objectives written as student abilities, and key concepts. Also generate: prerequisite knowledge checklist, common misconceptions to address early, and a final assessment task.’
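Templates like this are easiest to reuse when the bracketed fields become named placeholders. A minimal sketch, assuming Python's built-in `str.format` and entirely illustrative field values (the course details below are not from the source):

```python
# Hypothetical template-filling helper. The field names and the example
# values are assumptions for illustration; the template text mirrors the
# course outline prompt above.
OUTLINE_TEMPLATE = (
    "Design a {weeks}-week course on {topic} for {year_group} {subject} "
    "students at {level}. Student profile: {profile}. The course should "
    "enable students to {outcome}. Structure: {weeks} topics, one per week, "
    "with 3 lessons per topic. For each week include: topic title, specific "
    "learning objectives written as student abilities, and key concepts. "
    "Also generate: prerequisite knowledge checklist, common misconceptions "
    "to address early, and a final assessment task."
)

prompt = OUTLINE_TEMPLATE.format(
    weeks=6,
    topic="photosynthesis",
    year_group="Year 9",
    subject="science",
    level="foundation",
    profile="covered plant structure; weak on chemical equations; GCSE track",
    outcome="explain and apply the photosynthesis equation in novel contexts",
)
```

Filling every field before sending the prompt also acts as a checklist: an empty field is a visible gap rather than a silently vague prompt.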
🤖Lesson sequence prompt — all outlines before any content
‘Using the course outline above, generate lesson outlines for all [N] lessons. For each lesson: lesson title, 2 specific learning objectives (student abilities), the single most important concept, the primary activity type, and the concept the formative check will test. Do not generate full lesson content yet.’
🤖Lesson content prompt — 3 lessons per batch
‘Generate full lesson content for Lessons [X], [X+1], and [X+2] from the sequence above. For each lesson include: teacher-facing lesson notes, student-facing instruction, one guided practice activity, one independent practice activity, and a 3-question formative check where Q1 tests recall, Q2 tests understanding, and Q3 tests application in a novel context. Address the following specific misconception in Lesson [X]: [misconception].’
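The 3-lessons-per-batch pattern can be sketched as a simple loop. This is an assumed workflow, not a prescribed tool; the 18-lesson count comes from the section below, and the dispatch step is left as a comment:

```python
# Illustrative sketch: split an 18-lesson course into batches of three and
# build one content prompt per batch. Batch size and lesson count follow
# the text; how the prompt is sent to a model is out of scope here.
BATCH_SIZE = 3
TOTAL_LESSONS = 18

batches = [
    list(range(start, min(start + BATCH_SIZE, TOTAL_LESSONS + 1)))
    for start in range(1, TOTAL_LESSONS + 1, BATCH_SIZE)
]

for lessons in batches:
    prompt = (
        f"Generate full lesson content for Lessons {lessons[0]}, "
        f"{lessons[1]}, and {lessons[2]} from the sequence above. ..."
    )
    # send `prompt` to the model here, one batch at a time
```

Batching keeps each request small enough to stay consistent with the sequence while still covering the whole course in six requests.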
🤖Assessment generation prompt
‘Generate a final assessment for [course title] completable in [duration]. Include: 2 recall questions (1 mark each), 3 understanding questions (3 marks each), 2 application questions in novel contexts (5 marks each), 1 extended response requiring synthesis across multiple topics (10 marks). Provide a mark scheme with model answers, the most common incorrect answers, and the misconception each incorrect answer reveals.’
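The mark distribution in the template implies a fixed total, which is worth checking before setting the duration. A quick sketch of that arithmetic:

```python
# Mark distribution from the assessment template:
# 2 recall x 1, 3 understanding x 3, 2 application x 5, 1 synthesis x 10.
sections = {
    "recall": (2, 1),         # (question count, marks each)
    "understanding": (3, 3),
    "application": (2, 5),
    "synthesis": (1, 10),
}
total_marks = sum(count * marks for count, marks in sections.values())
print(total_marks)  # 31 marks in total
```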
The compounding prompt investment
A well-designed prompt template used 18 times
produces 18 coherent lessons.
The investment in prompt quality compounds across the whole curriculum build. A well-designed prompt template, used consistently across 18 lessons, produces 18 lessons that share a coherent voice, approach, and level. A vague prompt used 18 times produces 18 lessons with inconsistent quality, register, and level — each requiring significant individual editing.
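The compounding claim can be put as a back-of-envelope calculation using the figures from this article (3–5 extra minutes per specific prompt, 40–60 minutes of editing saved per lesson). The sketch below conservatively charges the prompt cost on every lesson, even though a template is written once and reused:

```python
# Back-of-envelope on the compounding claim, with low/high estimates
# taken from the text. Charging the prompt-writing cost per lesson is a
# deliberately conservative assumption.
lessons = 18
extra_prompt_minutes = (3, 5)     # per prompt: (low, high)
editing_saved_minutes = (40, 60)  # per lesson: (low, high)

net_saved_low = lessons * editing_saved_minutes[0] - lessons * extra_prompt_minutes[1]
net_saved_high = lessons * editing_saved_minutes[1] - lessons * extra_prompt_minutes[0]
print(net_saved_low, net_saved_high)  # roughly 630 to 1026 minutes saved
```

Even at the pessimistic end, the saving is over ten hours across a single 18-lesson build.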