Accessible to all who serve: uncovering barriers in a state workforce system

Grounded in empathy and guided by WCAG 2.1 Level AA standards, this work uncovers systemic accessibility barriers across a state workforce system, ensuring the employees behind the mission can do their best work without the system standing in their way.

My Role:

UX Designer, Consulting

Client:

State Workforce Development Program

Methods:

Automated Testing, Manual Testing, WCAG 2.1 AA Audit

Tools:

WAVE, Manual Keyboard and Screen Review

CONTEXT

Every barrier in a system is a barrier for a real person

This state workforce development program helps participants receiving SNAP or TANF benefits gain skills, build resumes, and find employment. The system that supports that mission is used daily by state employees including case managers, regional managers, receptionists, and facilitators.

As new legislation brought stricter accessibility requirements for government systems into effect, it became clear the application needed to be evaluated against WCAG 2.1 Level AA standards. Accessibility had always been a legal and ethical obligation. Now there was urgency to understand exactly where the system stood.

Accessibility is often framed as a technical requirement. But at its core it is an act of empathy. Every barrier in a system is a barrier for a real person trying to do their job. A case manager with low vision struggling with low contrast text. An employee relying on a keyboard who cannot navigate past a broken focus state. A screen reader user encountering a page with no heading structure and no way to orient themselves. These are the people this audit was for.

METHODOLOGY

Two layers of testing, because one was not enough.

The audit covered nine screens selected to represent the range of layouts, interaction patterns, and user workflows present across the application. Rather than auditing every screen individually, we identified screens that reflected the structural patterns repeated throughout the system. Violations found here were understood to be present more broadly.

  • Automated testing: Each screen was run through the WAVE accessibility evaluation tool, which identified contrast errors, missing labels, empty buttons, and structural issues. Automated testing provided a reliable baseline but had clear limits. It could not evaluate keyboard navigation, hover states, or the experience of moving through a page sequentially.

  • Manual testing: Each screen was then reviewed manually, with particular attention to tabbing order, focus visibility, hover state contrast, and the overall experience of navigating without a mouse. This layer caught violations that automated tools consistently missed, including focus outlines disappearing mid-page and hover states that failed contrast requirements.
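To make the automated layer concrete: several of the violation types an automated pass surfaces, such as empty buttons and inputs with no associated label, can be approximated with a short parser. This is a hypothetical sketch using only Python's standard library, not the actual WAVE engine, and it illustrates why such tools are a baseline rather than a complete audit.

```python
from html.parser import HTMLParser

class A11yScan(HTMLParser):
    """Approximates two automated checks: empty buttons and inputs
    with no associated <label>. A hypothetical sketch, not WAVE."""

    def __init__(self):
        super().__init__()
        self.issues = []
        self._in_button = False
        self._button_text = ""
        self._label_fors = set()
        self._input_ids = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "button":
            self._in_button = True
            # An aria-label also gives the button an accessible name.
            self._button_text = a.get("aria-label", "")
        elif tag == "label" and "for" in a:
            self._label_fors.add(a["for"])
        elif tag == "input" and a.get("type") not in ("hidden", "submit", "button"):
            self._input_ids.append(a.get("id"))

    def handle_data(self, data):
        if self._in_button:
            self._button_text += data

    def handle_endtag(self, tag):
        if tag == "button":
            self._in_button = False
            if not self._button_text.strip():
                self.issues.append("empty button")

    def report(self):
        # Inputs whose id never appears in a <label for="..."> are unlabeled.
        for input_id in self._input_ids:
            if input_id is None or input_id not in self._label_fors:
                self.issues.append("input without label")
        return self.issues

scanner = A11yScan()
scanner.feed('<button></button><label for="name">Name</label>'
             '<input id="name"><input id="phone">')
print(scanner.report())  # ['empty button', 'input without label']
```

Note what this kind of scan cannot see: tab order, focus visibility, and hover contrast are runtime behaviors, which is exactly why the manual layer was essential.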

9
screens audited across the application

38+
individual violations identified across all screens

3
violation categories present on every single screen audited

2
recently redesigned screens with the highest violation counts

FINDINGS

The violations were not random. They were systemic.

Across all nine screens, three violation categories appeared without exception. These were not isolated problems on individual pages. They were patterns embedded in the foundation of the application, which meant fixing them required systemic attention, not one-off patches.

Missing heading structure

Every screen audited lacked proper heading hierarchy. Headings provide document structure, navigation landmarks, and orientation for screen reader users. Without them, assistive technology users have no reliable way to understand or navigate the page.
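The structural rule is checkable mechanically: headings should start at h1 and never skip a level on the way down. A minimal sketch of such a check (the helper itself is hypothetical; the element names are standard HTML):

```python
import re

def heading_skips(html: str) -> list[str]:
    """Flag heading levels that jump more than one step down
    (e.g. h1 -> h3), a common structural violation. A simple
    sketch; real audits also verify landmark regions."""
    levels = [int(m) for m in re.findall(r"<h([1-6])[\s>]", html, re.I)]
    problems = []
    if levels and levels[0] != 1:
        problems.append(f"page starts at h{levels[0]}, not h1")
    for prev, cur in zip(levels, levels[1:]):
        if cur > prev + 1:
            problems.append(f"h{prev} jumps to h{cur}")
    return problems

print(heading_skips("<h1>Case</h1><h3>Details</h3>"))  # ['h1 jumps to h3']
print(heading_skips("<h1>Case</h1><h2>Details</h2>"))  # []
```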

Low color contrast

Contrast between foreground text and background colors fell below WCAG 2.1 AA requirements across all screens. This affects all users in challenging lighting conditions and is a significant barrier for users with low vision. Form labels, status tags, pagination controls, and hover states were among the most common offenders.
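WCAG 2.1 defines the contrast ratio precisely: each sRGB channel is linearized, the channels are combined into a relative luminance, and the two luminances are compared as (L1 + 0.05) / (L2 + 0.05). A minimal implementation of that formula:

```python
def _linear(channel: int) -> float:
    """sRGB channel (0-255) to linear light, per the WCAG 2.1 definition."""
    c = channel / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def luminance(rgb: tuple[int, int, int]) -> float:
    """Relative luminance of an sRGB color."""
    r, g, b = (_linear(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg) -> float:
    """Contrast ratio between two colors, from 1:1 up to 21:1."""
    l1, l2 = sorted((luminance(fg), luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black on white is the maximum possible ratio, 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
# Gray #767676 on white just clears the 4.5:1 AA threshold for body text.
print(contrast_ratio((118, 118, 118), (255, 255, 255)) >= 4.5)  # True
```

The 4.5:1 threshold applies to normal-size body text at Level AA; large text (roughly 18pt, or 14pt bold) only needs 3:1, which is also the threshold that hover states on this system's buttons were failing.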

Inadequate keyboard navigation

Focus outlines were either invisible or disappeared mid-page on every screen reviewed. Users who navigate by keyboard, including those with motor impairments, rely on visible focus indicators to know where they are on a page. This violation was only catchable through manual testing and would not have been surfaced by automated tools alone.

Insufficient text size

Very small text throughout the legacy screens made content difficult to read, particularly for users with low vision. Font sizes needed to be increased to a minimum of 12px across these pages. Notably, this issue was not present on the two newly redesigned screens, suggesting some progress had been made in this area.

The audit also uncovered a finding none of us expected. The two screens that had been most recently redesigned, the Appointment Roster and the Messaging interface, carried the highest concentration of violations. Our assumption going in was that the legacy screens would be the most problematic. The results told a different story.

A redesigned screen is not automatically an accessible one. Without inspection at the development stage, violations can be introduced even when the design intent was sound.

FULL SCOPE

Nine screens. One consistent pattern.

Each screen was evaluated individually with its own set of findings. The grid below captures the full scope of the audit: the screens highlighted above represent the strongest examples, and the rest are documented here with their key violation categories.

Login · Legacy · 4 annotation zones: Form labels, Contrast, Text size, Tabbing

Participant · Legacy · 3 annotation zones: Contrast, Orphaned labels, Tabbing

Participant Information · Legacy · 4 annotation zones: Contrast, Idle tabs, Text size, Tabbing

My Caseload · Legacy · 4 annotation zones: Contrast, Table headers, Text size, Tabbing

Caseload Search · Legacy · 4 annotation zones: Orphaned labels, Missing fieldset, Contrast, Caption

Search Non-Compliance · Legacy · 4 annotation zones: Orphaned labels, Missing fieldset, Contrast, Empty buttons

Resources · Legacy · 1 annotation zone: Hover contrast

Appointment Roster · Redesigned · 8 annotation zones: Contrast, Form labels, Empty buttons, Navigation, Tabbing

Messaging · Redesigned · 6 annotation zones: Orphaned labels, Missing fieldset, Contrast, Empty buttons

IMPACT

What this audit changed.

The audit had immediate consequences in two directions. The Appointment Roster had already been deployed when the violations were identified, which meant the development team had to revisit a screen already in production. That process underscored the real cost of catching accessibility issues late: rework, delays, and a system actively in use that was falling short of the people depending on it.

The Messaging interface had not yet been deployed, which gave the team an opportunity to address violations before they reached users. That difference in outcome between two screens at different stages of deployment made the case clearly: accessibility review cannot be a final step. It has to be woven into the process from the start.

For me personally, the most significant outcome was a change in how I work. Before this audit I was less involved in the development inspection stage. Afterward it became clear that a designer needs to be present and active during development review, not just at the handoff. Catching a contrast violation in a design file takes minutes. Catching it after deployment takes much longer and costs considerably more.

Appointment Roster

DEPLOYED SCREEN

Violations identified after deployment required the development team to revisit a screen already in active use, illustrating the cost of late accessibility review.

Messaging

PRE-DEPLOYMENT SCREEN

Violations caught before deployment gave the team the opportunity to remediate before the screen reached users, demonstrating the value of early review.

Seven remaining screens

LEGACY SCREENS

Currently in active use with documented violations. The audit established a clear record of what needs to be addressed as the system continues to evolve.

REFLECTION

What I carry forward


New does not mean accessible

The finding that surprised everyone most was that our redesigned screens had more violations than some of the legacy ones. Good design intent does not automatically translate into accessible outcomes during development. Inspection at every stage is the only way to close that gap.

Automated testing has a ceiling

WAVE caught a meaningful number of violations but could not evaluate tabbing, hover states, or sequential navigation. Manual testing was not a supplement to automated testing. It was an essential and irreplaceable part of the process.

Accessibility is empathy in practice

Behind every WCAG criterion is a real person. A focus outline that disappears is not just a technical failure. It is a case manager with a motor impairment losing their place on the page. Keeping that human reality visible throughout the audit shaped every recommendation we made.

Earlier is always better

This audit changed how I engage with the development process. Being involved earlier in the inspection stage means violations can be addressed when they are easiest and least costly to fix. That is not just good practice for accessibility. It is good practice for everything.
