AICI 501

Assessment as Design

This pathway reframes assessment as the architecture of evidence in The Aletheian Design Theory of Learning. Each lesson introduces a core assessment move; each “.1” mastery studio uses real or sample student work to critique and refine those designs until they serve learning with clarity and integrity.

Here, assessment is not a score at the end but a designed conversation among learners, evidence, and ADTL.
Lesson 1
Reframing Assessment Through The Aletheian Design Theory of Learning
Learning goal: redefine assessment as designed evidence of learning, not just grading.

Learners connect ADTL pillars to assessment, clarifying how evidence of learning should look, feel, and function across Cognitive Design, Cultural Connection, and Aesthetic Experience.

Objectives:
  • Differentiate between “grading events” and designed opportunities to generate evidence of learning.
  • Locate assessment within the Aletheian Learning Cycle and each ADTL pillar.
  • Identify current assessment practices that conflict with or distort ADTL values.

Activities:
  • Short case vignettes of assessment moments (quizzes, discussions, projects, reflections).
  • Sort activity: “Measurement only,” “Evidence of learning,” or “Unclear.”
  • Mini-lesson: assessment as design—framing prompt, task, context, and response mode as intentional choices.

Studio work:
  • Map one current assessment against ADTL pillars and the Aletheian Learning Cycle.
  • Draft a brief “Assessment as Design” stance statement for their context.

Artifacts:
  • Completed mapping of at least one assessment to ADTL elements.
  • Written stance statement that distinguishes assessment from grading.
This lesson meets the goal by anchoring assessment in theory and design, not habit or tradition.
Lesson 1.1
Mastery Studio: Assessment Autopsy & Redesign Intent
Mastery focus: critique a current assessment and name specific redesign priorities.

Learners bring an existing assessment, dissect what it currently measures and signals, and name how it must change to align with ADTL and their new stance.

Presentation:
  • Present a real assessment (task, test, project, etc.) and explain its current purpose.
  • Identify mismatches between intended outcomes and actual evidence produced.
  • Connect strengths and gaps to specific ADTL pillars.

Critique:
  • Peers ask: “What is this assessment really measuring?” “What kinds of learners does it serve best?”
  • Group explores hidden messages about intelligence, success, and belonging.

Mastery indicators:
  • Participant can clearly name at least two redesign priorities (e.g., more authentic evidence, clearer criteria).
  • Participant revises their stance statement to include assessment commitments.
  • Participant chooses one assessment to redesign during AICI 501.
Mastery appears when educators see existing assessments as malleable designs, not fixed requirements.
Lesson 2
Designing Outcomes & Evidence Maps
Learning goal: connect outcomes, success criteria, and evidence into a coherent map.

Learners build outcomes–evidence maps that specify what students should understand or be able to do, and what forms of evidence will make that learning visible across modalities.

Objectives:
  • Write clear, student-facing outcomes aligned to ADTL and standards.
  • Define what counts as sufficient, rich evidence of each outcome.
  • Distinguish between proxy measures (e.g., completion) and true evidence of understanding.

Activities:
  • Examples of weak vs. strong outcomes and evidence pairings.
  • Mini-lesson on outcomes, success criteria, and evidence in the Aletheian Learning Cycle.
  • Guided construction of an Outcomes & Evidence Map for one unit or sequence.

Studio work:
  • Draft 3–5 outcomes for a selected unit, with explicit success criteria.
  • List possible evidence forms (products, performances, conversations, reflections, digital artifacts).

Artifacts:
  • Completed Outcomes & Evidence Map that can guide later task and rubric design.
  • Annotated notes on where additional evidence types might support equity and access.
This lesson meets the goal by giving designers a blueprint for what assessment should actually reveal.
Lesson 2.1
Mastery Studio: Outcomes & Evidence Critique
Mastery focus: sharpen outcomes and evidence so they are precise, equitable, and teachable.

Learners present their Outcomes & Evidence Maps and use critique to clarify ambiguous language, expand evidence types, and align maps more tightly to learners’ realities.

Presentation:
  • Present a map including outcomes, criteria, and planned evidence forms.
  • Explain why these evidence forms fit their learners and context.
  • Identify where outcomes might still be too broad or too narrow.

Critique:
  • Peers look for jargon, vagueness, or hidden bias in outcome language.
  • Group suggests additional or alternate evidence forms that include more learners’ strengths.

Mastery indicators:
  • Participant revises at least one outcome and its evidence to improve clarity and equity.
  • Participant can articulate how students will understand what success looks like.
  • Participant connects maps explicitly back to ADTL pillars.
Mastery emerges when outcomes and evidence are transparent enough for students to own.
Lesson 3
Designing Aletheian Tasks & Performance Assessments
Learning goal: design tasks that generate rich, authentic evidence of learning.

Learners use their outcomes–evidence maps to design performance tasks and assessment experiences that invite thinking, creation, and application rather than recall alone.

Objectives:
  • Translate outcomes and evidence into performance tasks and assessment prompts.
  • Align tasks with ADTL pillars and the Aletheian Learning Cycle (e.g., Encounter, Weave, Illuminate).
  • Use AI as a co-designer to generate options, then refine for rigor and fit.

Activities:
  • Examples of strong Aletheian tasks (projects, role plays, portfolios, labs, design briefs).
  • Mini-lesson on crafting prompts and conditions that elicit thinking, not guessing.
  • Studio writing time to design 1–2 core tasks or assessment experiences.

Studio work:
  • Draft at least one performance task aligned to a specific outcome and evidence type.
  • Annotate where AI assisted with ideas or language and what was changed.

Artifacts:
  • Task drafts that clearly specify what students do, produce, and experience.
  • Notes on anticipated student responses and potential access barriers.
This lesson meets the goal by moving from abstract outcomes to concrete, lived assessment experiences.
Lesson 3.1
Mastery Studio: Task Critique & Student Experience
Mastery focus: refine tasks based on how diverse learners will encounter them.

Learners present tasks and, through critique, examine rigor, clarity, cultural resonance, and the likely emotional and cognitive experience for different students.

Presentation:
  • Present a task or assessment experience along with its intended outcome and evidence type.
  • Narrate how one or two different learner profiles might experience the task.
  • Identify where instructions, supports, or expectations may misfire.

Critique:
  • Peers offer feedback on clarity, relevance, and ADTL alignment.
  • Group suggests modifications to instructions, scaffolds, or choice structures.

Mastery indicators:
  • Participant revises the task to better support and challenge a wider range of learners.
  • Participant clarifies how the task’s evidence will be captured and interpreted.
  • Participant defends design choices using ADTL language.
Mastery appears when tasks feel challenging, inviting, and legible to real students—not just to designers.
Lesson 4
Criteria, Rubrics & AI-Assisted Calibration
Learning goal: build rubrics and criteria that make quality visible and discussable.

Learners design rubrics or single-point criteria aligned to outcomes, evidence, and ADTL, and explore how AI can support drafting and calibration without replacing human judgment.

Objectives:
  • Translate outcomes into clear criteria and rubric language.
  • Use AI to generate rubric drafts, then refine for precision and bias.
  • Plan how criteria will be shared and used with students.

Activities:
  • Examples of analytic, holistic, and single-point rubrics in Aletheian contexts.
  • Mini-lesson on criteria clarity, student-friendly language, and common pitfalls.
  • Studio time to draft or revise rubrics, optionally with AI-assisted first drafts.

Studio work:
  • Draft a rubric (or single-point rubric) for one key task.
  • Highlight where AI contributed to language and where human revision was needed.

Artifacts:
  • Rubric draft aligned to outcomes and evidence map.
  • Plan for introducing the rubric to students as a design tool, not just a scoring tool.
This lesson meets the goal by giving form and language to what quality looks like in Aletheian work.
Lesson 4.1
Mastery Studio: Rubric Calibration & Bias Check
Mastery focus: test rubrics against work samples and refine for fairness and clarity.

Learners apply their rubrics to sample or real student work, comparing judgments and using discrepancies to refine both criteria and their own expectations.

Calibration process:
  • Use the rubric to score several pieces of student work (real or anonymized samples).
  • Compare scores and rationales with peers.
  • Identify places where the rubric language is confusing or overly generous/harsh.

Critique:
  • Discuss potential bias in criteria (whose ways of demonstrating learning are privileged?).
  • Explore how AI could assist in spotting pattern bias, with human oversight.

Mastery indicators:
  • Participant revises rubric language for clarity, inclusivity, and specificity.
  • Participant identifies calibration strategies for teams (e.g., anchor samples, norming sessions).
  • Participant specifies how students will use the rubric for self- and peer-assessment.
Mastery is evident when rubrics support consistent, fair, student-visible judgments about work quality.
Lesson 5
Feedback, Reflection & Student Partnership
Learning goal: design feedback and reflection structures that move learning forward.

Learners design feedback rhythms, reflection prompts, and student-facing tools so assessment becomes a dialogue that helps learners see, name, and extend their own growth.

Objectives:
  • Design cycles of feedback and revision within the Aletheian Learning Cycle.
  • Use AI to draft feedback stems and reflection prompts, then revise for voice and specificity.
  • Plan how students will self-assess and set goals using criteria.

Activities:
  • Examples of feedback formats: comments, conferences, audio, check-ins.
  • Mini-lesson on effective feedback (timely, specific, actionable, kind).
  • Studio time to design feedback and reflection structures around one key task or unit.

Studio work:
  • Draft feedback stems and student reflection prompts tied to rubric criteria.
  • Map when and how feedback will happen within a sequence.

Artifacts:
  • Feedback and reflection plan aligned with ADTL and the assessment system.
  • Draft prompts that students could realistically answer with insight.
This lesson meets the goal by making assessment a co-created learning process, not a unilateral judgment.
Lesson 5.1
Mastery Studio: Feedback & Reflection Evidence
Mastery focus: analyze feedback exchanges and reflections as evidence of design quality.

Learners bring examples of feedback they have given or plan to give, along with student reflections, and interpret them as design evidence, asking what is working and what needs to change.

Presentation:
  • Share feedback samples and (if available) student responses or reflections.
  • Describe how students used or did not use the feedback to revise their work.
  • Identify patterns of misunderstanding or superficial revision.

Critique:
  • Peers respond to clarity, tone, and usefulness of feedback.
  • Group suggests changes to prompts or structures to deepen student thinking.

Mastery indicators:
  • Participant revises feedback and reflection prompts to better support agency and metacognition.
  • Participant names specific strategies for teaching students how to use criteria and feedback.
  • Participant sees feedback artifacts as mirrors of their design choices.
Mastery is evident when feedback and reflection produce visible shifts in student thinking and work.
Lesson 6
Building an Aletheian Assessment System
Learning goal: synthesize tasks, evidence, rubrics, and feedback into a coherent system.

Learners zoom out to see assessment as an ecosystem across a unit, course, or year, ensuring that evidence, tools, and experiences form a balanced Aletheian system rather than isolated events.

Objectives:
  • Map assessments across time, checking for overcrowding, gaps, and redundancy.
  • Align assessment rhythms with the Aletheian Learning Cycle and ADTL pillars.
  • Identify opportunities for student choice, revision, and portfolio building.

Activities:
  • Review of key artifacts created in AICI 501 (maps, tasks, rubrics, feedback plans).
  • Mini-lesson on balance: diagnostic, formative, summative, and portfolio evidence.
  • Studio time to assemble these pieces into a coherent assessment map or calendar.

Studio work:
  • Create an Assessment System Map for one course or major unit.
  • Label each assessment with its purpose, evidence type, and ADTL connections.

Artifacts:
  • Assessment system that shows intentional sequencing and variety of evidence.
  • Notes on how students will see and experience this system over time.
This lesson meets the goal by lifting assessment design to the systems level while staying grounded in student experience.
Lesson 6.1
Mastery Studio: Assessment System Portfolio Piece
Mastery focus: finalize a portfolio-ready assessment system and articulate its impact.

Learners present their Aletheian Assessment System as a portfolio artifact, showing how outcomes, tasks, rubrics, and feedback work together—and what this system is designed to change for their students.

Presentation:
  • Present the Assessment System Map and at least one associated task, rubric, and feedback structure.
  • Explain how the system embodies The Aletheian Design Theory of Learning.
  • Describe anticipated shifts in student experience and teacher practice.

Critique:
  • Peers provide feedback on coherence, equity, and feasibility.
  • Group surfaces cross-cutting ideas that could inform departmental or school-level assessment work.

Mastery indicators:
  • Participant refines the system into a portfolio-ready artifact for The Aletheia Design Portfolio.
  • Participant names at least two future adjustments or expansions to the system.
  • Participant reflects on how AICI 501 has changed their view of assessment.
Final mastery appears when assessment is understood and practiced as an ongoing design conversation between theory, evidence, and student lives.