Equal Access: Automated accessibility checker for web projects (github.com/ibma)
101 points by iovrthoughtthis on May 20, 2020 | 31 comments



Automated tools are all fine and dandy and I like to use them.

A pet peeve of mine is that "red on white" is considered bad while being, in fact, perfectly legible:

> Element has insufficient color contrast of 3.99 (foreground color: #ff0000, background color: #ffffff, font size: 9.4pt (12.6px), font weight: normal). Expected contrast ratio of 4.5:1

I think whatever algorithm is used to determine contrast is deeply flawed for some colour combinations.

I remember reading an article here on HN about how luminance (or luminosity?) varies in a way that isn't easily discernible from the RGB components of a colour, and about a different (much harder) algorithm that could take it properly into account when picking colours for a site.

I wish accessibility tests used _that_ instead of taking the easy way out with known-flawed RGB checks.


The algorithm used is defined in WCAG: https://www.w3.org/WAI/WCAG21/Techniques/general/G18.html#te.... It does attempt to calculate the luminance but you might be referring to a more accurate algorithm.

Personally I don't find pure red on white very readable, but I have noticed some oddities such as pure black on orange (#ff6a00) passing the check while pure white on orange fails, despite being much more readable to my eye.
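
For reference, the G18 calculation is easy to reproduce. Here's a small TypeScript sketch of the relative-luminance and contrast-ratio formulas from that page (the hex parsing is my own helper, not part of the spec):

    // WCAG 2.x relative luminance and contrast ratio, per the G18 technique.
    function srgbChannel(v: number): number {
      const s = v / 255;
      return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
    }

    function relativeLuminance(hex: string): number {
      const n = parseInt(hex.replace("#", ""), 16);
      return 0.2126 * srgbChannel((n >> 16) & 0xff)
           + 0.7152 * srgbChannel((n >> 8) & 0xff)
           + 0.0722 * srgbChannel(n & 0xff);
    }

    function contrastRatio(fg: string, bg: string): number {
      const a = relativeLuminance(fg), b = relativeLuminance(bg);
      const [lighter, darker] = a > b ? [a, b] : [b, a];
      return (lighter + 0.05) / (darker + 0.05);
    }

    contrastRatio("#ff0000", "#ffffff"); // ~4.0 -- the 3.99 quoted above; fails 4.5:1
    contrastRatio("#000000", "#ff6a00"); // ~7.3 -- black on orange passes
    contrastRatio("#ffffff", "#ff6a00"); // ~2.9 -- white on orange fails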


> A pet peeve of mine is that "red on white" is considered bad while being, in fact, perfectly legible:

When I get my eyes tested at the opticians, one of the tests is usually looking at a red ring and a green ring against a white background and telling the optician which ring seems clearer. I think it's something to do with long/short sightedness? I'm shortsighted so the red ring is always clearer to me without glasses, and trying to read green writing on a whiteboard can be more difficult.


Could it be because it is accounting for people who are colorblind?


I'm colour blind, and red on white is totally fine for me as long as the red is dark enough. Actually, any colour on white is OK if it's dark enough.


Thank you for that. Could you confirm whether "pure red on white" is okay for you, or whether it's too light?

i.e. #ff0000 on #ffffff


Just checked and I'd prefer it just a touch darker (#dd0000), but as long as the font weight isn't too light, I think it's OK.

There are different forms of colour blindness - I have the red-green type (though can't correctly identify many colours), but I don't think it should make any difference for this.


Yes


These tools are to help people who need accommodations. If you don't need accommodations, some of these accessibility rules may not make sense to you.



Red on white is almost indistinguishable to me from black on white. If you're trying to use red to indicate a state, it fails on me and I'm sure many others.


The parent's complaint is that while the red may look black to you, you have no trouble differentiating it from the white background. The problem with red text on a white background arises if someone can't see red and also has another impairment categorized as "low vision"; for them, red-on-white text at the size found on HN would not be readable. If the red-on-white text were larger and bold (at least 14pt, roughly 18.7px, at weight 700), its 3.99:1 ratio would be sufficient for WCAG's 3:1 large-text threshold.
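
For what it's worth, WCAG's cutoff for "large scale" text is 18pt (24px), or 14pt (roughly 18.7px) when bold; a quick sketch of how the required ratio is chosen (the function name is mine, not from the spec):

    // WCAG 1.4.3: normal text needs 4.5:1; "large scale" text only needs 3:1.
    // Large scale = at least 18pt (24px), or at least 14pt (~18.66px) when bold.
    function requiredRatio(fontSizePx: number, fontWeight: number): number {
      const bold = fontWeight >= 700;
      const large = fontSizePx >= 24 || (bold && fontSizePx >= 18.66);
      return large ? 3 : 4.5;
    }

    requiredRatio(12.6, 400); // 4.5 -- the quoted snippet's 3.99 fails
    requiredRatio(19, 700);   // 3   -- the same red would pass as large bold text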


Sometimes color contrast makes more sense when you can see all the colors in a scheme compared at once as color tiles, with the contrast spelled out numerically. https://github.com/prettydiff/colorContrast

As somebody has already pointed out, color contrast is defined by a relative luminance calculation specified by the W3C.


Just today I got flagged for a contrast of 4.49. Turned out that someone had mistakenly changed the page background from #f7f7f7 to #f4f4f4, which was enough to push the contrast with the foreground color just under the threshold.
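
The margins really are that thin near the threshold. A quick check with the contrastRatio sketch earlier in the thread (the grey foreground here is hypothetical, since the comment above doesn't say what the actual foreground colour was):

    contrastRatio("#717171", "#f7f7f7"); // ~4.56 -- passes
    contrastRatio("#717171", "#f4f4f4"); // ~4.44 -- three shades darker, now fails 4.5:1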


This looks extremely useful.

Ensuring that websites are accessible is important work. I used to work at a company where the CTO personally dedicated several sprints to refactoring the display layer code to better support screen readers. From what I remember, it was a bit of a tedious process, but nonetheless one appreciated by all of their customers who had disabled staff.

Tools like this could save a lot of time and make the web more accessible.


For WCAG work, I use and highly recommend WAVE https://wave.webaim.org


WAVE's API just shows the number of errors per type. Do you run it on your own servers, and does it give more info that way?


I have also found https://accessibilityinsights.io/ from Microsoft to be useful. It only covers the in-browser testing side for web, but has a nice UI.


I'm not entirely sure if it's the same use case, but I think the de facto tool here is axe: https://www.deque.com/axe/

(IIRC, it's also integrated in Webhint and Google's Lighthouse.)


Automated accessibility testing for Android - both static and dynamic accessibility issues, all automated with little to no setup: https://github.com/vontell/Bility (all open source too)


It identifies color, keyboard, and user-interaction issues, etc., by building up a state diagram dependent on a user's simulated abilities, and generates a report based on the WCAG 2.0 spec.


With some help from others, it is also extensible to other platforms like the web and iOS.


Amazing, I had a similar idea last year, and I am glad a corporation actually built this: https://giuseppegurgone.com/automating-accessibility-tests-p...


PSA: Global Accessibility Awareness Day is tomorrow (Thursday) and will be mostly celebrated on Twitter under #GAAD. Most years there are in-person events all over the world.


Very cool. It’s hard to find good tools to ensure you’re staying accessible once you’ve made the effort to get there.


You should make this a web service, imo.


The trouble with running these sorts of things as a web service is that it often means the assessment has to happen after deployment.

Obviously, it doesn't have to be live. You can deploy to test.example.com and try it out there, but I think that's too late.

Easily detectable poor accessibility should be as much of a closed gate as having the layout go all wonky and the words all in the wrong place.

Ideally, it should be runnable on your local copy as you are writing it, just the same as viewing the page in a browser.


There are web services that add accessibility checks to your CI process. https://tenon.io is one, and Deque has services too. Deque also makes axe-core, an open source library of accessibility checks that's used by Google's Lighthouse and Microsoft's Accessibility Insights for Web.

This code from IBM uses their own accessibility checklist; I'll be interested to see how it compares to axe-core.

https://www.ibm.com/able/checklists.html


The accessibility-checker package provides CI tools for Node and Karma. These tools started out as server-based checkers about a decade ago, but not all DOMs are serializable and you lose a lot of the CSS context that's needed to do proper evaluation. It's fine for fairly static pages, but has a lot of potential for false positives/negatives.

The tools provide checking for WCAG 2.0 AA, WCAG 2.1 AA and the IBM checklist. The IBM checklist is basically WCAG 2.1 AA, but has some more strict requirements for defining landmarks for screen reader navigation.
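
For the Node flavour, a rough sketch of a test using the getCompliance/assertCompliance calls described in the accessibility-checker README (the exact call names and return shape may differ slightly, so treat this as approximate):

    // Rough sketch: scan a URL (or an HTML string) and fail on violations.
    import * as aChecker from "accessibility-checker";

    async function scan(url: string): Promise<void> {
      const { report } = await aChecker.getCompliance(url, "homepage-scan");
      if (aChecker.assertCompliance(report) !== 0) {
        console.error(aChecker.stringifyResults(report));
        throw new Error("Accessibility violations found");
      }
    }

    scan("https://example.com").finally(() => aChecker.close());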


FYI - it's possible to run Google Lighthouse against any URL (includes accessibility checking), and there are web services for that.
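
For example, a sketch using the Lighthouse Node module together with chrome-launcher (the URL is a placeholder, and the options shown are the documented programmatic flags as I recall them):

    // Sketch: run only the accessibility audits against a URL and print the score.
    import * as chromeLauncher from "chrome-launcher";
    import lighthouse from "lighthouse";

    async function checkAccessibility(url: string): Promise<void> {
      const chrome = await chromeLauncher.launch({ chromeFlags: ["--headless"] });
      const result = await lighthouse(url, {
        port: chrome.port,
        onlyCategories: ["accessibility"],
        output: "json",
      });
      console.log("Accessibility score:", result?.lhr.categories.accessibility.score);
      await chrome.kill();
    }

    checkAccessibility("https://example.com");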




