Reports & targets

Give leadership the information that connects to action

Why this matters

Without measurable targets, policies can’t be enforced. Without reports, leadership can’t see whether those targets are being met, and so can’t enforce policy.

What should be reported to leadership

Report data consists of manual and automated assessment results, expressed by severity for each product. Raw numbers are easily overwhelming and not always useful for expressing the scope of the problem, so the data must be distilled into targets for leadership.

Understand reporting limitations

The available data are only indicators of compliance or non-compliance. A product could have many relatively inconsequential issues detected by automated tools, yet few issues that actually affect the customer experience.

Benchmarking reports

While you shouldn’t plan your entire program around casual benchmarking, you do need to understand peer organizations’ and competitors’ efforts.

This can help you make the case for a stronger accessibility program if you’re behind.

Compare accessibility program against peers

It can be helpful to gauge your enterprise’s commitment to accessibility by understanding what peers or competitors are doing.

If you find your competitors are placing significant resources into accessibility innovation and compliance, that’s a helpful marker to point out to your leadership.

Compare compliance against peers

Run an automated assessment on peers and competitors. For example, an automated testing tool can produce an assessment of a competitor’s public-facing website with little effort or cost.
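A short script can then distill the raw scan output into a per-severity summary fit for a benchmarking slide. The sketch below assumes axe-core style results (a “violations” list where each entry carries an “impact” level and the affected “nodes”); adapt the field names to whatever tool you use, and the file name is purely illustrative.

```python
import json
from collections import Counter

def summarize_scan(path: str) -> Counter:
    """Count reported issues by impact level from an axe-core style results file."""
    with open(path) as f:
        results = json.load(f)
    counts: Counter = Counter()
    for violation in results.get("violations", []):
        impact = violation.get("impact", "unknown")
        counts[impact] += len(violation.get("nodes", []))  # one count per affected element
    return counts

if __name__ == "__main__":
    # Hypothetical file produced by scanning a competitor's public homepage
    print(summarize_scan("competitor_home.json"))
```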

Accessibility assessments

While accessibility assessments are a given, part of the capability is onboarding teams to the assessment process: what the expectations and deadlines are for remediation, and where to find help.

Automated scans often produce an overwhelming number of defects, and manual assessments may point out issues the team simply isn’t equipped to understand, let alone remediate.

Set the right expectations

Don’t deliver assessment reports and wait to see results.

Set an expectation that teams will require a degree of help from accessibility coaches to understand and act on the reports.

It’s a serious red flag when teams don’t reach out for help. Most likely they are failing to prioritize the work, or overestimating their ability to remediate the issues.

Validation check of remediation

Unfortunately, it’s entirely possible for teams to believe, or simply claim, they have remediated defects when that is not the case.

In the interests of due diligence, remediation work should be manually validated for quality using the same methodology as the original assessment.

Reporting data sources

Do not underestimate the effort necessary to assemble all of this data and present it.

To give leadership the data needed to enforce policies, you’ll collect information from manual testers, automated testing tools, project management tool queries, and training systems.

Spend time learning what reports are valuable to leadership.

For example, leadership may prefer to point to a single number or average score. In other contexts, leaders may wish to dive deeper to drive specific results.

Input | Data source | Report | Target
Automated assessment | Automated assessment application | Code defects by severity | Trend down
Manual assessment | Import into project management app | Defects by severity & age | Meet remediation deadlines
Accessibility training by role | Learning management system (LMS) | Completion rates | 100% trained
Accessibility acceptance criteria usage | Project management app | Usage statistics for UI projects | 100% usage

Report inputs and data sources

Sprint and quarterly reports

Reports and measurements must coincide with the enterprise’s cadence of work.

For instance, if your organization works in two-week sprints, you can deliver six trend reports on those releases every quarter.

Accessibility score

One way to deliver data to leadership is to condense defects and best practices into a single score for every product release cycle.
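There is no standard formula for such a score. The sketch below shows one illustrative approach: severity-weighted defect counts pull a 0 to 100 score down, while best-practice compliance pulls it up. The severity weights and the 70/30 split are assumptions to tune against your own policy.

```python
# Illustrative severity weights; adjust to match your defect policy.
SEVERITY_WEIGHTS = {"blocking": 10, "high": 5, "medium": 2, "low": 1}

def accessibility_score(defects: dict[str, int], practice_compliance: float) -> int:
    """defects: counts by severity; practice_compliance: 0.0 to 1.0."""
    penalty = sum(SEVERITY_WEIGHTS[sev] * count for sev, count in defects.items())
    defect_score = max(0.0, 100.0 - penalty)       # defects pull the score down
    practice_score = 100.0 * practice_compliance   # best practices pull it up
    return round(0.7 * defect_score + 0.3 * practice_score)  # assumed 70/30 blend

# Example release cycle: 2 high, 1 medium, 4 low defects; 80% best-practice compliance
print(accessibility_score({"blocking": 0, "high": 2, "medium": 1, "low": 4}, 0.8))  # 83
```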

Quantity of defects by severity

Categorize defects by severity to add meaning to trends. It can be difficult to describe progress on a large number of defects. But if your reports can show that high severity issues have been halved, you’ll present a more digestible concept to leadership.
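As a sketch, a per-severity comparison of two consecutive sprints gives leadership that digestible view; the counts below are illustrative.

```python
def severity_trend(previous: dict[str, int], current: dict[str, int]) -> None:
    """Print sprint-over-sprint change in defect counts for each severity."""
    for sev in ("blocking", "high", "medium", "low"):
        before, now = previous.get(sev, 0), current.get(sev, 0)
        print(f"{sev:>8}: {before} -> {now} ({now - before:+d})")

severity_trend(
    previous={"blocking": 2, "high": 10, "medium": 14, "low": 40},
    current={"blocking": 0, "high": 5, "medium": 15, "low": 38},  # high severity halved
)
```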

Average defect age

Track the number of days or release cycles from detection to remediation.

This information will come from project management tool queries. You must ensure uniform entry of accessibility issues and be able to render a report, by product, on the average age of issues by severity.

This report will be used to enforce remediation policies.
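A minimal sketch of that calculation, assuming issues exported from the project management tool carry a severity, a detection date, and an optional remediation date (the field names are illustrative):

```python
from datetime import date

def average_age_by_severity(issues: list[dict], as_of: date) -> dict[str, float]:
    """Average days from detection to remediation (or to today, if still open), by severity."""
    ages: dict[str, list[int]] = {}
    for issue in issues:
        closed = issue.get("remediated_on") or as_of   # open issues keep aging
        ages.setdefault(issue["severity"], []).append((closed - issue["detected_on"]).days)
    return {sev: sum(days) / len(days) for sev, days in ages.items()}

issues = [
    {"severity": "high", "detected_on": date(2024, 3, 1), "remediated_on": date(2024, 3, 15)},
    {"severity": "high", "detected_on": date(2024, 3, 10), "remediated_on": None},
    {"severity": "low", "detected_on": date(2024, 2, 1), "remediated_on": date(2024, 3, 1)},
]
print(average_age_by_severity(issues, as_of=date(2024, 4, 1)))  # {'high': 18.0, 'low': 29.0}
```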

Risks to the organization

Risk can be modeled as a function of severity, usage, and the age of the issue. A high severity issue on a low-traffic page might carry less risk than a medium severity issue on a high-traffic page.
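One illustrative way to express this is a simple weighting of severity, traffic, and age. The weights and normalization below are assumptions to calibrate, not a standard formula.

```python
SEVERITY_RISK = {"blocking": 8, "high": 4, "medium": 2, "low": 1}  # assumed weights

def risk_score(severity: str, monthly_page_views: int, age_days: int) -> float:
    """Rough risk estimate combining severity, page traffic, and how long the issue has lingered."""
    traffic_factor = monthly_page_views / 10_000   # normalize traffic (assumed baseline)
    age_factor = 1 + age_days / 30                 # risk grows the longer the issue stays open
    return SEVERITY_RISK[severity] * traffic_factor * age_factor

# High severity on a low-traffic page vs. medium severity on a high-traffic page
print(risk_score("high", monthly_page_views=2_000, age_days=30))     # 1.6
print(risk_score("medium", monthly_page_views=200_000, age_days=30)) # 80.0
```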

Remediated vs net new defects

Unfortunately, it’s entirely possible for teams to remediate defects from an assessment, while simultaneously creating new issues when practices don’t change.
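To surface this, compare consecutive assessments of the same product. The sketch below assumes each defect has a stable identifier (for example, rule id plus element selector) so remediated, net-new, and still-open issues can be told apart.

```python
def compare_assessments(previous: set[str], current: set[str]) -> dict[str, int]:
    """Split defects into remediated, net-new, and still-open by comparing two assessments."""
    return {
        "remediated": len(previous - current),  # present before, gone now
        "net_new": len(current - previous),     # introduced since the last assessment
        "still_open": len(previous & current),
    }

prev = {"color-contrast:#nav", "label:#search", "image-alt:#hero"}
curr = {"label:#search", "aria-required-attr:#signup", "image-alt:#footer-logo"}
print(compare_assessments(prev, curr))  # {'remediated': 2, 'net_new': 2, 'still_open': 1}
```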

Best practices compliance

Teams consistently following best practices should achieve positive accessibility outcomes. For each role, team members and managers should be able to report compliance with each best practice.

Training completion

Is the team making accessibility training a priority?

Acceptance criteria usage

When teams don’t add acceptance/test criteria at the beginning of their work, they will often skip accessibility.

Creating attainable target scores

For products with dismally poor accessibility, scoring allows you to set an attainable target. For example, if a product is scoring 40/100, a target score of 60 can be set to give the team a sense of progress and the ability to level up. Once they meet the target score, it can be raised to the next attainable level.
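A tiny sketch of that leveling-up rule, using an assumed 20-point step:

```python
def next_target(current_score: int, current_target: int, step: int = 20) -> int:
    """Raise the target to the next attainable level once a product meets its current target."""
    if current_score >= current_target:
        return min(100, current_target + step)
    return current_target

print(next_target(current_score=62, current_target=60))  # 80
print(next_target(current_score=45, current_target=60))  # 60
```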

The capability to assemble this data gives the accessibility team a direct way to leverage leadership to enforce policies to improve these metrics.

Product name | Product notes | Target score | Current score | Total issues | Blocking severity | Blocking age | High severity | High age | Medium severity | Medium age | Low severity | Low age | Training complete | Criteria usage
Store | Quick remediation | 100 | 96 | 4 | 0 | 0 | 0 | 0 | 0 | 0 | 4 | 14 | 98% | 100%
Blog | 3rd party, slow to respond | 80 | 63 | 6 | 0 | 0 | 2 | 28 | 0 | 0 | 4 | 28 | 78% | 0%
Account settings | Struggling with priorities | 50 | 20 | 11 | 1 | 14 | 1 | 35 | 1 | 35 | 8 | 35 | 58% | 20%

Example sprint product report

Your checklist

Download checklist
Benchmarking reports
Reporting data sources
Sprint and quarterly reports