Heuristic evaluation checklist: before, execution, and after

This heuristic evaluation checklist covers the full project, from scope definition to the prioritized action plan. Use it as a working document: copy it into your project notes, tick off each item as you go, and add comments wherever your project deviates from the standard flow. The checklist assumes the standard setup of 3–5 evaluators working against Nielsen’s 10 usability heuristics on a focused user flow; for larger evaluations or specialized heuristic sets the same items still apply, but plan more time for briefing and consolidation.

Before

  • Define the scope: 1–3 critical user flows, list of in-scope screens and states, list of explicit exclusions
  • Pick the user type the evaluators should adopt (novice, expert, specific role)
  • Choose the heuristic set (Nielsen’s 10 by default, plus domain-specific heuristics or WCAG criteria if relevant); a starter codebook sketch follows this list
  • Recruit 3–5 evaluators with at least basic UX background, none of whom designed the flow being evaluated
  • Brief evaluators on the heuristics; run a 30-minute calibration on a third-party product if it is the team’s first time
  • Choose the logging format (a shared Google Sheet, the NN/g workbook, a Miro board) and require every evaluator to use it
  • Prepare the screens, prototype access, or a working build that every evaluator can reach
  • Schedule the consolidation meeting in advance so evaluators have a deadline
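
A starter codebook for the default heuristic set, sketched in Python. The N1–N10 shorthand IDs are this checklist’s own convention (an assumption, not a standard notation); the point is only that every evaluator tags issues with the same identifiers in the shared log.

```python
# Starter codebook: Nielsen's 10 usability heuristics.
# The N1-N10 IDs are this checklist's own shorthand, not an official
# notation; they keep tags in the shared log short and consistent.
NIELSEN_HEURISTICS = {
    "N1": "Visibility of system status",
    "N2": "Match between the system and the real world",
    "N3": "User control and freedom",
    "N4": "Consistency and standards",
    "N5": "Error prevention",
    "N6": "Recognition rather than recall",
    "N7": "Flexibility and efficiency of use",
    "N8": "Aesthetic and minimalist design",
    "N9": "Help users recognize, diagnose, and recover from errors",
    "N10": "Help and documentation",
}
```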

Execution

  • Each evaluator does an orientation pass without logging issues, just to get familiar with the product
  • Each evaluator then does an evaluation pass, logging every violation in the structured format
  • Every issue record includes: screen, element, heuristic, what goes wrong, why it matters, severity, and a recommended fix (see the record sketch after this list)
  • Take a screenshot or short Loom for every major issue
  • Log severity against a fixed scale (cosmetic / minor / major / critical, or Nielsen’s 0–4 ratings)
  • Do not share findings between evaluators until the independent pass is complete
  • If using AI for a first pass, treat its output as a junior evaluator — verify before accepting
  • Cross-check findings against the WCAG quick reference if accessibility is in scope
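
A minimal sketch of one issue record, assuming a Python-readable log; the field names and the severity labels are our choices, not a fixed standard, but the fields map one-to-one onto the checklist items above. The numeric values follow Nielsen’s 0–4 severity ratings.

```python
from dataclasses import dataclass
from enum import IntEnum

class Severity(IntEnum):
    """Nielsen's 0-4 severity ratings; the label names are our shorthand."""
    NOT_A_PROBLEM = 0  # evaluator disputes that this is a usability problem
    COSMETIC = 1
    MINOR = 2
    MAJOR = 3
    CATASTROPHE = 4

@dataclass
class Issue:
    screen: str           # where the issue occurs
    element: str          # the specific UI element involved
    heuristic: str        # e.g. "N4" from the codebook above
    what_goes_wrong: str  # the observed behaviour
    why_it_matters: str   # impact on the user or the task
    severity: Severity
    recommended_fix: str
    evidence_url: str = ""  # screenshot or Loom link for major issues
    evaluator: str = ""     # kept private until the independent pass ends
```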

After

  • Hold a consolidation meeting; merge duplicates, discuss disagreements, recalibrate severity
  • Cluster issues into 3–7 thematic groups (navigation, feedback, errors, terminology, etc.)
  • Score each issue or cluster on severity, frequency, and business impact (a scoring sketch follows this list)
  • Pick the top 5–15 issues for the action plan with rough effort estimates
  • Annotate screenshots for the headline findings
  • Write the brief (5–10 pages or short deck) with method, headline findings, action plan, and limitations
  • Present to product, design, and engineering leads in person; do not just email the spreadsheet
  • Archive the consolidated backlog and the codebook of heuristics for the next evaluation
  • Schedule a follow-up usability test to validate the highest-impact fixes once they ship
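
A sketch of the scoring step, assuming the severity, frequency, and business-impact values have already been agreed in consolidation. The multiplicative formula and the 1–3 scales for frequency and impact are our assumptions; any monotonic combination works as long as the team applies it consistently. The issue titles below are hypothetical.

```python
# Hypothetical consolidated issues:
# (title, severity 0-4, frequency 1-3, business impact 1-3).
consolidated = [
    ("Checkout error message is blank", 4, 2, 3),
    ("Inconsistent button labels",      2, 3, 2),
    ("No loading indicator on search",  3, 3, 2),
]

def priority_score(severity: int, frequency: int, impact: int) -> int:
    # Multiplicative score; the formula itself is an assumption,
    # not part of the heuristic evaluation method.
    return severity * frequency * impact

ranked = sorted(consolidated, key=lambda i: priority_score(*i[1:]), reverse=True)
for title, sev, freq, imp in ranked[:15]:  # the action plan keeps the top 5-15
    print(f"{priority_score(sev, freq, imp):>3}  {title}")
```

Whatever formula you choose, record it in the brief’s method section so the prioritization is reproducible at the next evaluation.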