Reports & targets

Give leadership the information that connects to action

Why this matters

Leadership can’t enforce policies without reports that measure progress against targets.

Give leadership the tools they need

Reports should be thought of as tools to help leadership communicate the results the enterprise wants to see.

Spend time learning what reports are valuable to leadership.

For example, leadership may prefer to point to a single number or average score. In other contexts, leaders may wish to dive deeper to drive specific results.

Reporting data sources

Don’t underestimate the value of assembling and presenting all of this data, or the effort it requires.

To give leadership the data needed to enforce policies, you’ll collect information from manual testers, automated testing tools, project management queries, team surveys and training systems.

Manual and automated accessibility assessments

Manual testing

Manual testing is precisely that: a human actually testing the experience using the screen reader and browser combinations you need to support.

Experts can deliver an organized report of defects by severity. This is a necessary tool for improving the customer experience.

Limits of manual testing

A manual test isn’t the same as a usability study, but it is effective in uncovering the issues your customers experience.

Manual testing is performed by people, and perceptions of what constitutes a defect can vary slightly from one tester to the next. It helps to have your testers reference your severity definitions and use uniform testing acceptance criteria, such as those at MagentaA11y.com.

Automated testing

Automated tests find programmatic errors, but can’t describe actual customer experiences.

How to use automated scans effectively

Scanning tools quickly pinpoint syntax defects in code. Some flagged issues won’t affect the customer experience, but if a web page is riddled with invalid code and errors, apply extra scrutiny and manual testing.
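On the web, such a scan can run directly in the page. Here’s a minimal sketch using the open-source axe-core library; the severity tally at the end is an illustrative assumption about how the results might feed a defects-by-severity report:

```ts
// Minimal in-page scan with axe-core; run() resolves with an array of
// violations, each carrying a rule id and an impact rating.
import axe from "axe-core";

async function scanPage(): Promise<Record<string, number>> {
  const results = await axe.run(document);

  // Tally violations by impact so they can feed a severity trend report.
  const bySeverity: Record<string, number> = {};
  for (const violation of results.violations) {
    const impact = violation.impact ?? "unknown";
    bySeverity[impact] = (bySeverity[impact] ?? 0) + 1;
    console.log(`${impact}: ${violation.id} (${violation.help})`);
  }
  return bySeverity;
}
```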

Limitations of automated scans

Testing tools have value, but it’s important to understand their drawbacks. Even the most robust tools can identify fewer than half of the potential defects on a page.

Code can be inaccessible to a person using a keyboard or screen reader without being flagged as invalid markup by automated tools.

Practical examples
  • Automated scans can instantaneously test checkboxes for properly associated labels and other code attributes, but can’t tell you if the labels make sense.
  • Automation tools can flag an image for missing alt text, but can’t tell you if it would be better for the screen reader to ignore that particular icon.
  • Custom components, like an accordion expander, could be inaccessible with the keyboard and yet be formed of valid code that won’t raise an error (see the sketch below).
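To illustrate that last point, the hypothetical accordion header below is perfectly valid markup that raises no scanner errors, yet a keyboard user can never reach or activate it. A real fix would use a native button or add a role, tabindex, and key handling:

```ts
// Hypothetical custom accordion: valid, scanner-clean markup that is
// still unusable by keyboard. The div is not focusable and only
// responds to mouse clicks.
const header = document.createElement("div");
header.className = "accordion-header";
header.textContent = "Shipping details";

const panel = document.createElement("div");
panel.hidden = true;

header.addEventListener("click", () => {
  panel.hidden = !panel.hidden; // no keydown handler, no focus support
});

document.body.append(header, panel);
```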

Project management system

Query your project management system to track:

  • Usage of accessibility acceptance criteria before work begins
  • Recorded accessibility defects by severity and age (see the query sketch below)
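As a minimal sketch, assuming a Jira-style tracker reached over its REST search API (the JQL label, endpoint version, and token auth are illustrative assumptions about your setup):

```ts
// Hypothetical query against a Jira-style REST API: pull open
// accessibility defects and compute each one's age in days.
const JQL = 'labels = "accessibility" AND resolution = Unresolved';

async function defectAges(baseUrl: string, token: string) {
  const response = await fetch(
    `${baseUrl}/rest/api/2/search?jql=${encodeURIComponent(JQL)}&fields=created,priority`,
    { headers: { Authorization: `Bearer ${token}` } },
  );
  const { issues } = await response.json();

  return issues.map((issue: any) => ({
    key: issue.key,
    severity: issue.fields.priority?.name,
    ageInDays: Math.floor(
      (Date.now() - Date.parse(issue.fields.created)) / 86_400_000,
    ),
  }));
}
```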

Learning management systems

Collect the completion rate of accessibility training for each product team, compared against your training policy targets. This data indicates which teams are committed to meeting the accessibility policy, and it correlates with better results.

Validation check of remediation

Unfortunately, it’s entirely possible for teams to believe, or simply claim, they have remediated defects when that is not the case.

In the interests of due diligence, remediation work must occasionally be manually validated for quality using the same methodology as the original assessment.

| Input | Data source | Report | Target |
| --- | --- | --- | --- |
| Automated assessment | Automated assessment application | Code defects by severity | Trend down |
| Manual assessment | Import into project management app | Defects by severity & age | Meet remediation deadlines |
| Accessibility training by role | Learning management system (LMS) | Completion rates | 100% trained |
| Accessibility acceptance criteria usage | Project management app | Usage statistics for UI projects | 100% usage |

Report inputs and data sources

Accessibility score variables

One way to deliver data to leadership and teams is to condense defects and best-practice commitments into a single accessibility score for every product release cycle.

Acceptance criteria usage

When teams don’t add acceptance/test criteria at the beginning of their work, they will often skip accessibility testing. Tracking the usage of acceptance criteria where applicable indicates that teams are committed to proper design, development and testing practices.

Quantity of defects by severity

Categorize defects by severity to add meaning to trends.

It can be difficult to describe progress across a large number of defects. But if your reports show that high severity issues have been halved, you’ll give leadership a far more digestible concept.

Defect age

Track the number of days or release cycles from detection to remediation. When teams ignore high severity issues, it indicates misalignment of priorities with the enterprise.

This report will be used to enforce remediation policies.

Training completion

Tracking the percentage of team members who’ve completed training is an indicator of alignment with policy.
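Taken together, these variables can roll up into the single score described above. A minimal sketch follows; the weights, penalties, and blending here are illustrative assumptions, not a standard formula, so tune them to your own policy:

```ts
// Hypothetical composite accessibility score for one release cycle.
// Start from 100, subtract weighted penalties per open defect, then
// scale by best-practice commitments.
interface CycleMetrics {
  defects: { blocking: number; high: number; medium: number; low: number };
  trainingCompletion: number; // 0..1, from the LMS
  criteriaUsage: number;      // 0..1, from the project management app
}

const PENALTY = { blocking: 25, high: 10, medium: 4, low: 1 };

function accessibilityScore(m: CycleMetrics): number {
  const defectPenalty =
    m.defects.blocking * PENALTY.blocking +
    m.defects.high * PENALTY.high +
    m.defects.medium * PENALTY.medium +
    m.defects.low * PENALTY.low;

  // Best-practice commitments soften or amplify the defect picture.
  const practices = (m.trainingCompletion + m.criteriaUsage) / 2;

  const raw = (100 - defectPenalty) * (0.5 + 0.5 * practices);
  return Math.max(0, Math.round(raw));
}

accessibilityScore({
  defects: { blocking: 0, high: 2, medium: 0, low: 4 },
  trainingCompletion: 0.78,
  criteriaUsage: 0.2,
}); // = 57
```

Whatever formula you choose matters less than applying it identically every release cycle, since leadership will act on the trend rather than the absolute number.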

Sprint and quarterly reporting

What should be included in reports to teams and leadership?

Accessibility score trend

There are multiple ways to calculate an accessibility score, but what’s important to leadership is the trend of those scores.

By tracking the accessibility score variables every release cycle, you can produce a trending report leadership can use to enforce policy.

Risks to the organization

Risk can be modeled as a function of severity, page usage, and the age of the issue. A high severity issue on a low-traffic page might pose less risk than a medium severity issue on a high-traffic page.
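As a hedged sketch of that weighting (the severity weights, logarithmic traffic damping, and staleness factor are all assumptions chosen for illustration):

```ts
// Hypothetical risk score: severity weight, scaled by page traffic
// and by how long the defect has been open.
const SEVERITY_WEIGHT: Record<string, number> = {
  blocking: 8, high: 4, medium: 2, low: 1,
};

function riskScore(
  severity: string,
  monthlyPageViews: number,
  ageInDays: number,
): number {
  const traffic = Math.log10(Math.max(monthlyPageViews, 10)); // dampen raw traffic
  const staleness = 1 + ageInDays / 30; // grows the longer it sits open
  return (SEVERITY_WEIGHT[severity] ?? 1) * traffic * staleness;
}

// A medium defect on a high-traffic page can outrank a high defect
// on a low-traffic page:
riskScore("high", 500, 10);         // ~14.4
riskScore("medium", 2_000_000, 10); // ~16.8
```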

Best practices compliance by role

Teams consistently following best practices should achieve positive accessibility outcomes.

This data can be collected by a sample survey or quiz of individual team members across the enterprise.

Remediated vs net new defects

Unfortunately, it’s entirely possible for teams to remediate defects from an assessment, while simultaneously creating new issues when practices don’t change. This happens often when product teams don’t reach out for help from accessibility coaches.
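Reports can make this visible by diffing consecutive assessments. A minimal sketch, assuming each defect can be given a stable fingerprint (here, a hypothetical rule id plus CSS selector):

```ts
// Hypothetical defect record with a stable fingerprint, so the same
// issue can be matched across two assessments.
interface Defect { ruleId: string; selector: string; severity: string }

const fingerprint = (d: Defect) => `${d.ruleId}@${d.selector}`;

function compareAssessments(previous: Defect[], current: Defect[]) {
  const before = new Set(previous.map(fingerprint));
  const after = new Set(current.map(fingerprint));

  return {
    // Present last cycle, gone now: remediated.
    remediated: previous.filter((d) => !after.has(fingerprint(d))),
    // Present now, absent last cycle: net new.
    netNew: current.filter((d) => !before.has(fingerprint(d))),
  };
}
```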

Create attainable target scores

For products with dismally poor accessibility, this allows setting of an attainable target score. For example, if a product is scoring 40/100, a target score of 60 can be set to give the team a sense of progress and the ability to level up. Once they meet the target score, it can be raised to the next attainable level.

The capability to assemble this data gives the accessibility team direct leverage: leadership can use these metrics to enforce the policies that improve them.

| Product name | Product notes | Target score | Current score | Total issues | Blocking defects | Blocking age | High defects | High age | Medium defects | Medium age | Low defects | Low age | Training complete | Criteria usage |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Store | Quick remediation | 100 | 96 | 4 | 0 | 0 | 0 | 0 | 0 | 0 | 4 | 14 | 98% | 100% |
| Blog | 3rd party, slow to respond | 80 | 63 | 6 | 0 | 0 | 2 | 28 | 0 | 0 | 4 | 28 | 78% | 0% |
| Account settings | Struggling with priorities | 50 | 20 | 11 | 1 | 14 | 1 | 35 | 1 | 35 | 8 | 35 | 58% | 20% |

Example sprint product report

Benchmarking reports

While you shouldn’t plan your entire program from casual benchmarking, you need to understand peer organizations and competitor efforts.

This can help you make the case for a stronger accessibility program if you’re behind.

Compare accessibility program against peers

It can be helpful to gauge your enterprise’s commitment to accessibility by understanding what peers or competitors are doing.

If you find your competitors are placing significant resources into accessibility innovation and compliance, that’s a helpful marker to point out to your leadership.

Compare compliance against peers

Run an automated assessment on peers and competitors. For example, an automated testing tool can produce an assessment of a competitor’s public-facing website with very little effort.

Your checklist

  • Reporting data sources
  • Accessibility score variables
  • Sprint and quarterly reporting
  • Benchmarking reports