Building Confidence into Phonics Assessment

Turning manual trial feedback from 50+ schools into a digital tool teachers could trust

COMPANY

Government (NZ)

ROLE

UX/UI Designer

DURATION

2 Months (Discovery → Concept Delivery for Validation)

KEY IMPACT

Turned ambiguous trial feedback into clear direction - synthesized 50+ schools' data into 4 testable hypotheses

Matched design to real teaching behavior - reduced cognitive load through flexible, context-aware workflows

Drove design alignment through stakeholder collaboration - led concept workshops that surfaced trade-offs and validated direction early

Balanced cultural integrity with technical efficiency - unified system across English and Te Reo Māori mediums

Designed for continuity - built handoff documentation that preserved research insights through the project transition

TL;DR

In 2024, the NZ government prioritized implementing a phonics assessment tool to help teachers identify each student's phonics level and provide accurate learning support.


Rather than building from scratch, the decision was made to adapt an Australian version of the UK phonics check for the NZ context - aligning with NZ's refreshed curriculum and working across both English and Te Reo Māori mediums.

My role was to collaborate with our vendors to bring this tool to life in digital format, enabling smarter assessment and reporting between schools and the Ministry.

Due to personal circumstances, my involvement concluded after the concept delivery.

Read more about the Phonics Assessment

CHALLENGES

  • Designing from secondhand sources - I worked from workshop outputs and written trial feedback from 50+ schools, synthesizing secondhand data into testable concepts without direct teacher access or real-time clarification.

  • Interpreting dense, unstructured field data - Trial feedback came as written observations with no ability to ask follow-up questions.

  • No vendor input during concept development - Contract delays meant the development team wasn't available to review concepts or validate technical feasibility.

  • Designing quickly to maintain project momentum - Time pressure meant prioritizing speed over certainty, getting testable concepts ready fast so user feedback could guide the next iteration, rather than waiting for perfect alignment.

DISCOVERY

Understanding What Teachers Experienced

By the time I joined the project, 50+ schools had gone through trials with the manual version of the phonics assessment material. I analyzed their feedback to understand where the materials failed in real classroom conditions.

The Problem

Feedback from teachers across the 50+ trial schools pointed to four main struggles:

Test length and difficulty caused student frustration

Why?

Extended testing time created anxiety, especially for younger or less proficient students, leading some to refuse completion and producing inaccurate results.

Image distraction

Why?

Alien images from the original version distracted younger students, producing results that didn't reflect their actual phonics skills.

Recording tools created friction during live testing

Why?

Annotation sheets for recording were hard to follow in real time, forcing teachers to find extra space and spend extra time completing them.

Insufficient guidance undermined confidence

Why?

Teachers needed clearer training and support to administer tests consistently and effectively.

THE INSIGHT

Connecting the Dots

One finding stood out from the trial - pre-filling student information and proper setup made testing significantly more effective.

Drawing on my teaching background helped me see what the feedback was really saying: the problem wasn't just the test itself - it was all the administrative work surrounding it.

HMW …

With varied assessments across different learning stages and mediums…

How Might We eliminate manual tracking and setup work so teachers can focus on what matters - observing and supporting student learning?

SOLUTION CONCEPT

Designing for Real Classroom Conditions

The four problems from the trial weren't isolated issues; they were interconnected friction points that compounded during live assessments. My design decisions aimed to reduce that friction while building in flexibility to test what actually worked for teachers and students in different school settings.

Decision No.1:

Supporting task decision through unified visibility

The identified opportunity guided me to surface key information about students' test records at a glance, eliminating unnecessary manual tracking.

It surfaces:

  • Tests scheduled for this week and upcoming weeks

  • Current phonics levels

  • Past test results

Decision No. 2:

Reducing student frustration during testing

I added a pause function for two scenarios:

  • When a student showed unwillingness to continue

  • When unexpected interruptions occurred (fire drills, emergencies)

    The system captured the context and allowed teachers to resume within an hour without losing progress.

If a student made five consecutive errors, the assessment ended automatically. This respected where the student was in their learning journey rather than creating unnecessary stress.

Decision No. 3:

Keeping student focus on the task

Alien imagery confused younger students, but we still needed to indicate non-real words for decoding skills assessment. I replaced aliens with simple colored dots using the branding color - same function, no distraction.

Decision No. 4:

Reducing recording friction during live testings

  • Designed the annotation interface with embedded guidance for each word and a responsive comment field to capture what the teacher has to say.

  • Provided two test modes teachers could choose from: Focus mode (one word at a time to minimize distraction) and List mode (all words visible for a holistic overview)

The strategic choice: design both modes and let user testing reveal which worked better in real classrooms, rather than assuming one approach fit all teaching styles.

Decision No. 5:

Building confidence through embedded guidance

  • Gave engagement instructions before the test starts

  • Offered a dedicated instruction page with guidance and best-practice examples (details to be consolidated with SMEs)

  • Teachers who needed more depth could access detailed resources or training recordings just one click away


Decision No. 6:

Adapting assessment to different school technology contexts

Not all schools have access to multiple devices for assessments, so I offered two delivery modes teachers could choose from based on their available resources:


  • Presentation mode - scan QR code to display assessment materials on another device for the student

  • Paper mode - download and print assessment materials to test with, ensuring schools with limited technology could still participate


This flexibility meant the tool could work across varied school settings without requiring hardware investments.


TESTING DIRECTION

This concept design focused on solving the core problems, so I kept visual details lightweight at this stage; the priority was testing whether these approaches actually worked in real classrooms.

Note on Te Reo Māori Medium

Without Māori medium trial feedback, I applied the same workflows as the English medium design with adapted visual identity and culturally appropriate patterns. This consistency will reduce cognitive load for teachers working across both mediums while minimizing development complexity and maintenance burden.

REFLECTION & NEXT STEP

While we gained valuable vendor perspective during user journey definition, earlier involvement in concept design discussions would have strengthened technical alignment and kept testing focused on viable solutions. When their involvement became uncertain, we had to adjust quickly, keeping the project moving forward with future collaboration in mind.

Next Step

If I had been able to continue on this project, my next steps would have been to:

  • run field tests with teachers to collect feedback for the next design iteration

  • bring the development team into design discussions to validate technical feasibility before or during testing

  • collaborate with the curriculum team on refining recording and annotation instructions

  • consult with SMEs and the branding team to ensure cultural integrity for Māori medium contexts
