Dell: Ensuring Equal Access Through Accessibility Improvements

Client
Dell
Services
Accessibility Audit
Deliverables
Audit Report
Timeline
5 months

Background

Dell initiated a comprehensive accessibility audit of over 600 webpages to ensure adherence to the WCAG 2.1 Level A and AA standards. By implementing the audit's detailed recommendations, Dell aimed not only to enhance its web accessibility but also to apply these improvements across its broader digital strategy and operations.

My Role

In my role as the project lead, I was responsible for several key contributions:
1. Developed Testing and Reporting Frameworks
2. Mentored Team Members
3. Conducted Accessibility Testing and Reporting

Impact

Audience Reach
20% increase
Significantly broadened access and engagement across diverse user groups.
Clients Onboarded
+2 new clients
Expanded our market presence through successful engagements in web and document accessibility services.

Final Report

This report was one of several provided to Dell, focusing on a specific set of URLs that were rigorously tested against the WCAG guidelines. It detailed the number of accessibility issues found, categorized by severity, to help Dell prioritize which problems required immediate resolution. By highlighting critical areas, the report served as a roadmap for Dell to enhance their web accessibility, ensuring compliance and improving user experience.

Guideline Violations

Provided a detailed summary of our evaluation, clearly listing each WCAG guideline with a straightforward 'pass' or 'fail' status for the tested URLs. This organized format gave immediate insight into compliance levels, serving as a vital resource for both our team and the client to quickly identify areas of success and those needing further attention.
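The pass/fail roll-up can be sketched as a small aggregation: a guideline "fails" overall if any tested URL violates it. The URLs and success criteria below are illustrative, not taken from the actual report.

```python
from collections import defaultdict

# Hypothetical audit results as (url, guideline, passed) records.
results = [
    ("https://www.dell.com/support", "1.1.1 Non-text Content", False),
    ("https://www.dell.com/support", "1.4.3 Contrast (Minimum)", True),
    ("https://www.dell.com/deals",   "1.1.1 Non-text Content", True),
    ("https://www.dell.com/deals",   "1.4.3 Contrast (Minimum)", False),
    ("https://www.dell.com/deals",   "2.4.2 Page Titled", True),
]

def summarize_by_guideline(results):
    """Collapse per-URL outcomes into one status per guideline:
    'fail' if any tested URL violated it, else 'pass'."""
    status = defaultdict(lambda: True)
    for _url, guideline, passed in results:
        status[guideline] = status[guideline] and passed
    return {g: ("pass" if ok else "fail") for g, ok in status.items()}

summary = summarize_by_guideline(results)
```

Keeping the roll-up separate from the raw per-URL data means the same records can feed both the guideline-level summary and the detailed issue log.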

Strategic Planning for Efficient Audit

During the planning phase of our web accessibility audit, we categorized URLs based on their structural similarities. This strategic grouping allowed us to efficiently test and report on similar pages, significantly reducing the time required for each audit cycle. By focusing on these similarities, we could apply findings and solutions across multiple pages more effectively, enhancing the overall speed and impact of our accessibility improvements.
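As a rough sketch of that grouping step, URLs can be bucketed by their leading path segment on the assumption that pages under one site section share one page template. The URLs and the grouping key are illustrative; the real categorization also considered page structure, not just paths.

```python
from collections import defaultdict
from urllib.parse import urlparse

def group_by_template(urls):
    """Group URLs whose paths share the same leading segment,
    assuming pages in one section use one template."""
    groups = defaultdict(list)
    for url in urls:
        segments = [s for s in urlparse(url).path.split("/") if s]
        key = segments[0] if segments else "home"
        groups[key].append(url)
    return dict(groups)

urls = [
    "https://www.dell.com/support/home",
    "https://www.dell.com/support/drivers",
    "https://www.dell.com/shop/laptops",
]
groups = group_by_template(urls)
```

Auditing one representative per group, then spot-checking the rest, is what let findings and fixes carry across similar pages.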

How we work

Automated Testing

Leveraged automated testing tools to quickly scan large numbers of pages for common accessibility issues.

Manual Testing

Conducted in-depth, hands-on evaluations to catch accessibility issues that automated tools might have overlooked.

Reporting

Reported all issues in Excel using predefined templates.

Conducting the Analysis

1. Automated Testing

We began with automated testing using tools such as AXE and WAVE to identify and report high-level issues detectable by these tools. These tools allowed us to quickly pinpoint common accessibility barriers, such as missing alt text, improper heading structures, and color contrast issues.
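A typical first step with results from a tool like axe is tallying how many elements each impact level affects, so the worst problems surface first. The JSON below is a trimmed, hypothetical result for one page; real axe output carries many more fields per violation.

```python
import json
from collections import Counter

# Trimmed, hypothetical axe-style result for a single page.
axe_output = json.loads("""
{
  "violations": [
    {"id": "image-alt", "impact": "critical", "nodes": [{}, {}]},
    {"id": "color-contrast", "impact": "serious", "nodes": [{}, {}, {}]},
    {"id": "heading-order", "impact": "moderate", "nodes": [{}]}
  ]
}
""")

def count_by_impact(result):
    """Tally affected elements per impact level across all violations."""
    counts = Counter()
    for violation in result["violations"]:
        counts[violation["impact"]] += len(violation["nodes"])
    return counts

counts = count_by_impact(axe_output)
```

Summed across hundreds of pages, these counts become the severity breakdown that drove prioritization in the final report.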

2.1 Manual Testing

Automated testing, however, covers only a fraction of the WCAG guidelines, so a deeper dive was necessary. We manually reviewed each of the 600 webpages to provide comprehensive and actionable insights.

Differences between errors identified in automated vs. manual testing

During the manual testing phase, we employed a range of techniques including keyboard testing and voice assistant tools to thoroughly evaluate the URLs against the A and AA WCAG guidelines. Each identified issue was comprehensively documented in an Excel sheet. This documentation included several crucial details:

01
Guideline Violated
The specific WCAG guideline that was not met.
02
Severity Level
The criticality of the issue, categorized to prioritize fixes.
03
Issue Description
A detailed explanation of the accessibility barrier encountered.
04
Screenshot with Markings
Visual evidence highlighting the exact nature and location of the issue within the webpage.
05
Suggested Changes
Recommendations for code or design adjustments necessary to rectify the issue and ensure compliance.

This structured documentation process was crucial for creating an actionable roadmap towards achieving full web accessibility.
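The five documented fields map naturally onto a fixed record shape; a minimal sketch, with illustrative field names and sample values not drawn from the actual audit sheet:

```python
from dataclasses import dataclass, asdict

@dataclass
class AccessibilityIssue:
    """One row of the audit sheet, mirroring the five fields
    recorded for every finding (names are illustrative)."""
    guideline: str      # WCAG success criterion violated
    severity: str       # e.g. "critical", "high", "medium", "low"
    description: str    # what the barrier is and where it occurs
    screenshot: str     # path to the annotated screenshot
    suggested_fix: str  # recommended code or design change

issue = AccessibilityIssue(
    guideline="1.1.1 Non-text Content",
    severity="critical",
    description="Product hero image has no alt text.",
    screenshot="screenshots/home-hero.png",
    suggested_fix="Add descriptive alt text to the img element.",
)
row = asdict(issue)  # ready to write out as one spreadsheet row
```

Enforcing one record shape across 600 pages is what made the later consolidation and dashboarding straightforward.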

2.2 Validation with users

After conducting manual testing, we validated the most critical user flows to ensure these experiences were barrier-free for actual users. This use-case testing was performed by testers with disabilities who are native assistive technology (AT) users.

Reporting

In the reporting phase of the project, we consolidated the findings from all categories of URLs to provide a high-level overview of the accessibility issues present. This comprehensive reporting was essential for understanding the severity and distribution of the issues across different webpages. To enhance clarity and immediacy of the data, we used Power BI to visualize the insights. This allowed the team to quickly grasp which areas required immediate attention and facilitated more informed decisions about prioritization and resource allocation for remediation efforts.
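The consolidation step is essentially a cross-tabulation of issue counts by URL category and severity, the same shape the Power BI charts presented. The categories and counts below are illustrative, not figures from the engagement.

```python
from collections import defaultdict

# Hypothetical flattened findings from all URL categories.
findings = [
    {"category": "Support pages", "severity": "critical"},
    {"category": "Support pages", "severity": "high"},
    {"category": "Product pages", "severity": "critical"},
    {"category": "Product pages", "severity": "low"},
    {"category": "Support pages", "severity": "critical"},
]

def pivot(findings):
    """Cross-tabulate issue counts by category and severity."""
    table = defaultdict(lambda: defaultdict(int))
    for f in findings:
        table[f["category"]][f["severity"]] += 1
    return {category: dict(sev) for category, sev in table.items()}

overview = pivot(findings)
```

A table like this makes it immediately visible which page category concentrates the critical issues and therefore deserves remediation effort first.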

Key Takeaways

01
Collaborating with Actual Users
Collaborating closely with people with disabilities provided invaluable insights into their experiences navigating websites. This direct interaction highlighted the challenges they face, underscoring the need for a thoughtful and inclusive approach in design processes to ensure accessibility.
02
Embracing New Challenges
Diving into the relatively unexplored territory of web accessibility marked a significant learning curve for our team. My proactive approach in learning the basics, coupled with actively seeking expertise from others, not only enriched my understanding but also distinguished my role within the team.
03
Developing Sustainable Frameworks
A key learning from the project was the importance of establishing robust frameworks to streamline future processes. By creating systematic testing and reporting frameworks, we not only improved the efficiency of our current project but also laid a foundation for ongoing and future accessibility initiatives.