Whether a carrot or stick approach works best for enforcing website accessibility compliance remains an ongoing debate across government and industry at large.
The former focuses on educating those controlling digital assets about both the business case and the moral case for maintaining websites and apps that can be easily accessed by individuals with a wide range of physical or cognitive impairments.
The latter entails a punitive response to websites that fail to comply with digital accessibility standards, either through public naming and shaming or through legal remedy in the courts.
One approach that blends aspects of both the carrot and the stick is the publication of sector-wide leaderboards ranked on a scoring system that rewards the most accessible websites.
However, given that website accessibility rankings are often published by different organizations using subjective algorithms and scoring systems, are the key takeaways easily diluted? And is an opportunity being missed to create a leaderboard that is widely recognized as universal and authoritative?
Furthermore, even though rankings are necessarily driven by data science, is there an art to how they should be packaged and presented to ensure an effective balance between the carrot and the stick?
Focusing on high achievers
One organization that routinely publishes and updates accessibility rankings tables across a range of private and public sector websites is Silktide.
The Silktide Index is a powerful tool, not only for identifying the organizations leading the way in digital accessibility within their sector, but also for monitoring general trends and making the effects of government policy visible.
A prime example of this is Silktide’s rankings of U.K. councils across 2020, in the run-up to the September deadline set by the Public Sector Bodies Accessibility Regulations (PSBAR) and beyond.
In addition to the leaderboard itself, Silktide’s data demonstrates a steady rise in the average accessibility scores of U.K. council websites over the course of the year, lending some credence to those who argue that government-mandated accessibility regulations can be effective.
Concerning its index, Silktide is keen to stress that the rankings have been specifically designed to reflect the aspirational carrot of being seen as a leader, rather than the stick of being shamed as a bottom feeder.
“Our goal in creating the Index was never to name and shame,” says Simon Waters, Silktide’s Head of Marketing.
“Right from the outset, we made a very conscious decision to only show the Top 10 or Top 30 most accessible websites within a sector on one page and then just have all the other scored websites in an alphabetical list below.”
Chris Fletcher, a Product Specialist at Silktide who also works across website privacy and SEO, adds, “Last year, prior to releasing the Index, we very much wanted to focus on making it aspirational.”
“We didn’t want to be saying to organizations, ‘Look, you have a poorly accessible website. Therefore, throw some money at us and let us help you fix it.’ We wanted to say to the industry at large, ‘This should be your long-term goal. Look at how well these other organizations are doing.’”
Establishing authoritative rankings
Lawrence Shaw, CEO of Sitemorse, is a veteran of website accessibility rankings tables. Sitemorse was one of the first organizations to produce an accessibility leaderboard of U.K. council websites, back in 2006.
A recently published blog post on the website of AAAtraq, Shaw’s other company, which deals with automated website compliance testing, reveals how, 15 years on, 85% of the top 100 NASDAQ-listed companies still score as “high risk” for non-compliance with the Americans with Disabilities Act (ADA).
Only Alphabet, Google’s parent company, was accredited as “low risk” following detailed website scanning and analysis.
Shaw firmly believes that the canonical but technically complex Web Content Accessibility Guidelines (WCAG) significantly muddy the waters when it comes to producing authoritative accessibility rankings, because different companies elect to base final positions on hundreds of different metrics and scoring models.
“Imagine if your accessibility scanning software was only capable of checking seven accessibility items,” says Shaw.
“The publisher would just end up saying that those seven things were the only checkpoints that matter for building an accessibility rankings table.”
He continues, “A better way would be for everybody to agree on, say, 20 critical checkpoints for website accessibility: key items that affect most of the people most of the time.
“If there was just one way to test and one way to report results, we could do away with the competitive nature of index tables. That has to be a good thing because index tables should not be competitive with each other.
“The only thing that should be competitive within the Index is how organizations are ranking and improving in relation to one another,” says Shaw.
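Shaw’s point about divergent scoring models is easy to demonstrate. The short Python sketch below ranks the same three fictional council sites under two invented models that check overlapping but different sets of checkpoints with different weights. Every site name, checkpoint and weight here is hypothetical, chosen purely for illustration; it does not represent the actual methodology of Silktide, Sitemorse, Monsido or any other vendor.

```python
# Illustrative only: two invented scoring models ranking the same fictional
# sites. Checkpoints, weights and pass/fail results are hypothetical and do
# not reflect any real vendor's methodology.

# Which hypothetical checkpoints each fictional site passes.
site_results = {
    "council-a.example": {"alt_text", "contrast", "keyboard", "captions"},
    "council-b.example": {"alt_text", "headings", "keyboard", "labels", "focus"},
    "council-c.example": {"contrast", "headings", "labels"},
}

# Model 1: seven checkpoints, all weighted equally.
model_1 = {c: 1.0 for c in [
    "alt_text", "contrast", "keyboard", "captions",
    "headings", "labels", "focus",
]}

# Model 2: a different, overlapping checkpoint set with unequal weights.
model_2 = {"alt_text": 3.0, "keyboard": 3.0, "contrast": 2.0,
           "captions": 1.0, "aria_roles": 1.0}

def score(passed, model):
    """Percentage of the model's total weight earned by passed checkpoints."""
    total = sum(model.values())
    earned = sum(weight for checkpoint, weight in model.items()
                 if checkpoint in passed)
    return 100.0 * earned / total

for name, model in [("Model 1", model_1), ("Model 2", model_2)]:
    table = sorted(site_results,
                   key=lambda site: score(site_results[site], model),
                   reverse=True)
    print(name, [(site, round(score(site_results[site], model), 1))
                 for site in table])
```

Nothing about the sites changes between the two print-outs, yet the leader does: Model 1 puts council-b.example on top, while Model 2 puts council-a.example first. Scaled up to hundreds of checkpoints and weights, this is how the same council can sit near the top of one index and far down another.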
Muddying the waters
The critical point is that wildly divergent rankings tables lead to a lack of clarity about who is performing well or badly on accessibility, undermining both accurate benchmarking and the motivation to do better.
This inconsistency is evidenced by recent analysis of U.K. council accessibility rankings undertaken by Sitemorse: Cambridge County Council was ranked in second place by accessibility and optimization outfit Monsido, with a compliance score of 74.65%.
Yet the same council did not even make the Silktide Index Top 30 and was ranked 210th in the country by Sitemorse.
For those with a clearer understanding of top-level sport than of website accessibility, the effect is akin to having several versions of English football’s Premier League table.
Each version assigns different points for games won and drawn, and some versions award points for entirely arbitrary metrics, such as the number of corners won in a game.
Though website accessibility rankings are not really about crowning undisputed champions, they should highlight clearly identifiable trends towards improvement, and those messages are lost when rankings are open to interpretation.
The reverse would be true of a single, uniformly recognized rankings table, possibly moderated by an independent government regulator.
Messaging, education and motivation are vital to those wanting both to promote digital accessibility and to learn more about it.
Access to an accessibility rankings master list, an undisputed list of lists, could go a long way towards achieving this. Organizations working within the digital accessibility space could then use their collective power as educators and publishers to amplify it further.
As to the question, “Are accessibility indexes useful tools for promoting good practice?”, the answer might be: “Yes, in theory, but less so in their currently subjective and fragmented format.”
As the father of the World Wide Web, Sir Tim Berners-Lee, once famously stated, “The power of the web is in its universality.”
Perhaps it’s time that this same principle was applied to how that noble ambition is continuously measured and assessed.
"help" - Google News
April 18, 2021 at 02:30PM
https://ift.tt/2OXI1hc
Do Published Website Accessibility Rankings Help Organizations Perform Better? - Forbes
"help" - Google News
https://ift.tt/2SmRddm
Bagikan Berita Ini
0 Response to "Do Published Website Accessibility Rankings Help Organizations Perform Better? - Forbes"
Post a Comment