A comparison of automated testing tools for digital accessibility | Equal Entry

There are quite a few tools that claim to help find accessibility issues through automated, pre-programmed test suites. But how accurate are they? Equal Entry have pitted six of the most popular tools (including aXe) against each other and found that they actually miss a fair bit, even in areas that they should be able to diagnose relatively easily. (To be clear: only automatable accessibility concerns were tested; everyone – including the tools' vendors themselves – agrees that you cannot test for accessibility fully without manual testing 😉)
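For context, tools like aXe generally work by running a rule engine over a rendered DOM and reporting any rule violations. A minimal sketch of that kind of automated scan, assuming Playwright with the @axe-core/playwright package and a placeholder URL (not the test site Equal Entry used), might look like this:

```typescript
import { chromium } from "playwright";
import AxeBuilder from "@axe-core/playwright";

// Placeholder target; Equal Entry's reference site has not been shared.
const TARGET_URL = "https://example.com/";

async function main() {
  const browser = await chromium.launch();
  const page = await browser.newPage();
  await page.goto(TARGET_URL);

  // Run the axe-core rule set against the rendered page.
  const results = await new AxeBuilder({ page }).analyze();

  // Each violation names the rule, its impact, and the offending markup.
  for (const violation of results.violations) {
    console.log(`${violation.id} (${violation.impact}): ${violation.help}`);
    for (const node of violation.nodes) {
      console.log(`  ${node.html}`);
    }
  }

  await browser.close();
}

main();
```

Scans like this only cover the rules the engine ships with, which is exactly the gap the study highlights: what the tooling reports is a floor, not a full audit.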

Unfortunately, it's hard to know which of the tools performs well or poorly, as the results are all anonymised. This is due to a desire for fairness (okay) and the fact that several of the tools contain anti-benchmarking clauses in their ToS (boo!), but it's still annoying. Equal Entry also haven't shared the test site used, so their results cannot be independently verified. Still, a useful (partial) benchmark.

On the results:

On our reference site, the tools tested tended to find a relatively small percentage of the defects we had embedded in the site.

It's worth noting that half the players in this study produced more false positives than true defects.

On the organisational costs of flagging too many accessibility "issues":

Every issue costs your organization time and money to investigate. A great deal of time and money can be wasted in examining "issues" that are not issues.
