Experiences in Testing a Live Project for Accessibility
The scope of an accessibility testing effort is very broad; it is often complete only when real users come in to test the application and provide their perspective.
A couple of years back, my company was tasked with testing an e-learning application’s accessibility readiness. This was an agile team of three developers and three testers, and by the end of the release cycle we had logged more than a thousand issues. We worked closely with the product team to get these resolved through a mix of workarounds and recommended fixes.
As a visually impaired user, I want to summarize the real accessibility issues I faced in this application—such as unlabeled form fields, missing headings, redundant links, and missing alternative text for images—so that other testers can leverage this information in their test efforts. Of the overall set of issues, I want to highlight three of them in greater detail.
Carousels were used in the application, but they were completely ignored by the screen reader. This caused issues not just for visually challenged users, but also for users with dyslexia and ADHD. We recommended adding alt text for all carousel images and marking the carousel’s updating content as an ARIA live region. With these two fixes in place, the carousel was picked up by the screen reader, enhancing overall accessibility.
However, there were a few additional issues:
- The screen reader was reading the alt text each time an image was updated in the carousel, affecting the normal reading experience of a screen reader user.
- Update pace in the carousel was very quick, which wasn’t conducive for people with dyslexia.
- The moving content on the page was impacting users with ADHD.
To resolve these issues, we proposed an option to pause or hide the carousel on the application’s pages. This was implemented quickly, making the carousel fully accessible.
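The pause behavior can be sketched as a small piece of state logic. This is a minimal illustration in TypeScript with hypothetical names (the class and methods are mine, not the application’s actual code); in practice `tick` would be driven by a timer and `togglePause` wired to a visible, keyboard-reachable Pause button.

```typescript
// Hypothetical sketch of a carousel with a user-controlled pause,
// modeling the "pause or hide" control we recommended.
class PausableCarousel {
  private index = 0;
  private paused = false;

  constructor(private readonly slideCount: number) {}

  // Called by a timer (e.g. setInterval). Does nothing while paused,
  // so the live region stops announcing and the motion stops.
  tick(): void {
    if (!this.paused) {
      this.index = (this.index + 1) % this.slideCount;
    }
  }

  // Wired to a visible "Pause" button, reachable by keyboard.
  togglePause(): boolean {
    this.paused = !this.paused;
    return this.paused;
  }

  current(): number {
    return this.index;
  }
}
```

Pausing both silences the repeated alt-text announcements for screen reader users and stops the motion that was affecting users with dyslexia and ADHD.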
Ajax modals were used but were inaccessible to screen reader and keyboard-only users: keyboard focus was not managed at all when these overlays opened. Our suggestion was to move keyboard focus directly into the overlay as soon as it opened, which resolved the issue.
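Beyond moving focus into the overlay, focus also has to stay there while the overlay is open. The Tab-cycling logic behind that can be sketched as a pure function; the name and shape below are mine (in a real fix this would feed a `keydown` handler on the modal container), not the application’s actual implementation.

```typescript
// Hypothetical sketch of the wrap-around logic used to trap keyboard
// focus inside an open overlay.
function nextFocusIndex(
  focusableCount: number, // number of focusable elements inside the overlay
  currentIndex: number,   // index of the element that currently has focus
  shiftKey: boolean       // Shift+Tab moves backwards
): number {
  if (focusableCount === 0) return -1; // nothing inside to focus
  const step = shiftKey ? -1 : 1;
  // Wrap around at both ends so focus never escapes the overlay.
  return (currentIndex + step + focusableCount) % focusableCount;
}
```

Tabbing forward from the last element wraps to the first, and Shift+Tab from the first wraps to the last, so the page behind the overlay is never reached.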
Many elements of the application had a contrast ratio below the recommended 4.5:1, causing issues for low-vision users. Due to time and cost constraints, it was not possible to fix each of these issues within the release, and the product team was not willing to compromise on the application’s UI. To address the contrast, we recommended adding an extra feature to let the user switch from “color” to “black and white” mode. This workaround was readily accepted, helping us reach consensus on addressing the issue.
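The 4.5:1 threshold comes from the WCAG contrast-ratio formula, which compares the relative luminance of foreground and background colors. Below is a minimal sketch of that calculation in TypeScript; the function names are mine, but the constants and formula follow the WCAG 2.x definition.

```typescript
// Convert one 8-bit sRGB channel to its linearized value (WCAG 2.x).
function channelLuminance(c8: number): number {
  const c = c8 / 255;
  return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

// Relative luminance of a "#rrggbb" color, weighted per WCAG.
function relativeLuminance(hex: string): number {
  const r = parseInt(hex.slice(1, 3), 16);
  const g = parseInt(hex.slice(3, 5), 16);
  const b = parseInt(hex.slice(5, 7), 16);
  return 0.2126 * channelLuminance(r) +
         0.7152 * channelLuminance(g) +
         0.0722 * channelLuminance(b);
}

// Contrast ratio (lighter + 0.05) / (darker + 0.05), from 1:1 up to 21:1.
function contrastRatio(fg: string, bg: string): number {
  const l1 = relativeLuminance(fg);
  const l2 = relativeLuminance(bg);
  return (Math.max(l1, l2) + 0.05) / (Math.min(l1, l2) + 0.05);
}
```

For example, mid-grey `#777777` text on a white background comes out at roughly 4.48:1, narrowly failing the 4.5:1 threshold; a black-and-white mode pushes ratios toward the maximum of 21:1, which is why it worked as a blanket fix.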
This project was a truly different accessibility test experience, and it reminded us that accessibility is not just about testing and reporting defects. It is about providing suggestions, recommendations, and workarounds that enhance the overall experience for varied groups of end-users, making the live product truly accessible to everyone.