Anticipating the story behind the spec

Most people, even developers, find it hard to get excited about abstract, technical specifications. Accessibility standards are no exception. But many things become more tangible when empathy comes into play: when you can see the people behind the specifications, you better understand the concrete effects that limited accessibility has on their lives.

Put simply: few people know what it means when a national rail operator fails to meet the "AA standard", but everyone can relate to the frustration of being unable to travel simply because you cannot read the ticketing website.

How does AI turn the user experience into an "aha" experience?

That's why we asked ourselves: How can we transform abstract, technical accessibility reports into personal, tangible, empathy-boosting experiences? How can we use advanced AI to surface the potential impact of accessibility issues on real people's lives? And how can we deliver these "aha" experiences to the people who design, develop and operate today's web?

Barriers for people with impaired vision: currently the most common accessibility problem

With A11y-AI, the accessibility of any website can be checked from the perspective of our first synthetic user, "Claudia". Claudia was designed to anticipate accessibility issues related to low vision, which covers some of the most common barriers on the web today. We are also working on the next perspective, so that additional impairments broaden the scope and foster a wider understanding of accessibility challenges.
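To make one of those low-vision barriers concrete: insufficient text contrast is among the most frequently reported accessibility failures. The sketch below (in Python, an illustration of the underlying WCAG 2.1 rule, not A11y-AI's actual implementation) computes the contrast ratio between a foreground and background color and checks it against the Level AA threshold of 4.5:1 for normal-size text.

```python
def relative_luminance(rgb):
    """Relative luminance of an sRGB color (0-255 channels), per WCAG 2.1."""
    def linearize(channel):
        c = channel / 255
        # sRGB gamma expansion as defined by WCAG 2.1
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio: (L_lighter + 0.05) / (L_darker + 0.05)."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

def passes_aa(fg, bg, large_text=False):
    """Level AA threshold: 4.5:1 for normal text, 3:1 for large text."""
    return contrast_ratio(fg, bg) >= (3.0 if large_text else 4.5)

# Black on white is the maximum possible contrast, 21:1 — clearly passes.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
# Light grey on white, a common design choice, fails AA for normal text.
print(passes_aa((170, 170, 170), (255, 255, 255)))  # False
```

A failing ratio like the grey-on-white example above is exactly the kind of finding that reads as a dry number in a report, but translates into "Claudia cannot read this paragraph" in an empathy-driven review.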

Let's create positive change – and a more inclusive web – together! What's the first site you're going to check?