Automated testing will conservatively catch about 40% of all accessibility issues. This type of testing is well-suited to finding and flagging issues such as invalid markup, improper use of ARIA, and duplicate IDs.
I've created a list of automation ideas. This list is not ready for public consumption, but is available as a conversation starter during our discovery efforts.
Automated testing cannot, and will not, replace a human tester's judgment. Many accessibility issues are "more art than science" and require a subjective opinion. For instance, an automated testing tool would not know to flag poor focus management as an issue. A human tester would be able to identify it as a potential problem for screen reader users, and log it accordingly.
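To make the duplicate-ID point concrete, here is a minimal sketch of the kind of check an automated tool performs. It uses only Python's standard-library HTML parser; the function names are illustrative, not from any particular tool:

```python
from html.parser import HTMLParser
from collections import Counter

class IdCollector(HTMLParser):
    """Collects every id attribute encountered in the document."""
    def __init__(self):
        super().__init__()
        self.ids = Counter()

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name == "id" and value is not None:
                self.ids[value] += 1

def find_duplicate_ids(html):
    """Return the ids that appear more than once in the markup."""
    collector = IdCollector()
    collector.feed(html)
    return sorted(i for i, count in collector.ids.items() if count > 1)

# Two elements share id="main", which assistive technology cannot disambiguate.
sample = '<div id="main"></div><nav id="nav"></nav><section id="main"></section>'
print(find_duplicate_ids(sample))  # ['main']
```

Checks like this are mechanical and exhaustive, which is exactly why they belong in the automated 40% rather than in manual review.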
- Run axe checks on every page or unique state (required)
- Run axe-coconut on every page. Coconut is a leading-edge tool that tests against additional WCAG 2.1 success criteria, which will eventually become the law of the land.
- Run WAVE on every page if time allows. WAVE is more visual than axe or axe-coconut, and offers some excellent ways to identify nested headings and HTML5 landmark regions.
- Color contrast checks
- Color blindness checks
- Zoom layouts to 400% and inspect them for readability. If a layout breaks at 400%, I will reduce the zoom until it becomes stable, and log the ratio at which things started breaking.
- Keyboard navigation for the happy path
- Screen reader testing using these preferred pairings:
- IE11 + JAWS
- Chrome + JAWS
- Firefox + NVDA
- Safari + VoiceOver
- iOS Safari + VoiceOver
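Of the checks above, color contrast is the most mechanical: WCAG defines an exact relative-luminance formula and contrast ratio, so it automates cleanly. A minimal sketch in Python (the function names are my own):

```python
def relative_luminance(rgb):
    """WCAG relative luminance for an sRGB color given as 0-255 ints."""
    def channel(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio between two colors, per WCAG (ranges from 1:1 to 21:1)."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

# Black on white is the maximum possible contrast.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
# WCAG AA requires at least 4.5:1 for normal-size text; #767676 on white passes.
print(contrast_ratio((118, 118, 118), (255, 255, 255)) >= 4.5)  # True
```

Tools like axe and WAVE run this same calculation against computed styles; the snippet only shows the underlying math.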
Teams can request accessibility reviews for their PDF and Word files. This includes several manual checks:
- Acrobat Pro accessibility checker
- Review with one or more screen readers, usually JAWS.
- Evaluate CommonLook as another option for quick PDF scans.
- Write issue tickets for findings
- Consult on code to fix tickets
- Review pull requests for accessibility fixes
- Help teams QA and close tickets
- Research new tools and interfaces
WCAG 2.1 introduces 17 new success criteria that are likely to be included in Section 508 requirements in the future. Many of these new success criteria focus on typography, cognition (understanding), and usability, and they will require additional time in manual testing.
- Windows high-contrast mode
- Inverse or "dark" mode testing
- Speech recognition testing using tools like Dragon NaturallySpeaking
- Screen magnification tools like ZoomText
- WCAG 2.1 mobile usability
- WCAG 2.1 typography
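As one example of the new typography work: WCAG 2.1 SC 1.4.12 (Text Spacing) requires that content survive user-applied spacing overrides at specific minimums relative to font size. A hypothetical helper that checks whether a set of override values (all in pixels) reaches those minimums:

```python
def meets_text_spacing(font_size, line_height, letter_spacing,
                       word_spacing, paragraph_spacing):
    """Check spacing values (px) against the WCAG 2.1 SC 1.4.12 minimums.

    The criterion requires no loss of content or functionality when users set:
    line height >= 1.5x, spacing after paragraphs >= 2x,
    letter spacing >= 0.12x, and word spacing >= 0.16x the font size.
    """
    return (
        line_height >= 1.5 * font_size
        and paragraph_spacing >= 2.0 * font_size
        and letter_spacing >= 0.12 * font_size
        and word_spacing >= 0.16 * font_size
    )

print(meets_text_spacing(16, 24, 2, 3, 32))  # True: at or above every minimum
print(meets_text_spacing(16, 18, 0, 0, 16))  # False: line height below 1.5x
```

In practice the manual test is to apply these overrides (for example, via a bookmarklet or user stylesheet) and confirm nothing clips or overlaps; the helper only documents the threshold math.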