Automated tools are all fine and dandy and I like to use them.
A pet peeve of mine is that "red on white" is considered bad while being, in fact, perfectly legible:
> Element has insufficient color contrast of 3.99 (foreground color: #ff0000, background color: #ffffff, font size: 9.4pt (12.6px), font weight: normal). Expected contrast ratio of 4.5:1
I think whatever algorithm is used to determine contrast is deeply flawed for some colour combinations.
I remember reading an article here on HN about how perceived luminance varies in a way that isn't easily discernible from a colour's RGB components, and how there's a different (much harder) algorithm that actually takes that perceptual luminosity into account when picking colours for a site.
I wish accessibility tests used _that_ instead of taking the easy way out with known-flawed RGB checks.
Personally I don't find pure red on white very readable, but I have noticed some oddities, such as pure black on orange (#ff6a00) passing the check while pure white on orange fails, despite the latter being much more readable to my eye.
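For reference, the check in question is the WCAG 2.x contrast ratio, built on a relative-luminance formula that weights green far more heavily than red or blue - which is exactly why black-on-orange passes while white-on-orange fails. A minimal sketch in Python (the 3.99 quoted in the report above is this ratio, truncated):

```python
# Minimal sketch of the WCAG 2.x contrast check.

def channel_to_linear(c8: int) -> float:
    """Undo the sRGB gamma curve for one 8-bit channel."""
    c = c8 / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(hex_color: str) -> float:
    """WCAG relative luminance: 0.0 for black, 1.0 for white."""
    h = hex_color.lstrip("#")
    r, g, b = (channel_to_linear(int(h[i:i + 2], 16)) for i in (0, 2, 4))
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: str, bg: str) -> float:
    """WCAG contrast ratio, ranging from 1:1 to 21:1."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

print(contrast_ratio("#ff0000", "#ffffff"))  # ~4.0 - fails the 4.5:1 threshold
print(contrast_ratio("#000000", "#ff6a00"))  # ~7.3 - passes
print(contrast_ratio("#ffffff", "#ff6a00"))  # ~2.9 - fails
```

Pure red only contributes the 0.2126 red coefficient, so its luminance ends up low enough that red-on-white lands just under the 4.5:1 AA threshold for normal-size text.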
> A pet peeve of mine is that "red on white" is considered bad while being, in fact, perfectly legible:
When I get my eyes tested at the opticians, one of the tests is usually looking at a red ring and a green ring against a white background and telling the optician which ring seems clearer. I think it's something to do with long/short sightedness? I'm shortsighted so the red ring is always clearer to me without glasses, and trying to read green writing on a whiteboard can be more difficult.
Just checked and I'd prefer it just a touch darker (#dd0000), but as long as the font weight isn't too light, I think it's OK.
There are different forms of colour blindness - I have the red-green type (though I can't correctly identify many colours), but I don't think it should make any difference for this.
Red on white is almost indistinguishable to me from black on white. If you're trying to use red to indicate a state, it fails on me and I'm sure many others.
The parent's complaint is that while the red may look black to you, you have no trouble differentiating it from the white background. Red text on a white background would be a problem for someone who couldn't see red and had another impairment categorized as "low vision"; for them, red-on-white text at the size found on HN would not be readable. If the red-on-white text were large and bold (at least 18.66px at weight 700, or 24px at normal weight), its 3.99:1 ratio would be sufficient for WCAG's 3:1 large-text threshold.
Sometimes color contrast makes more sense when you can see all colors in a scheme compared at once as color tiles with the contrast spelled out numerically. https://github.com/prettydiff/colorContrast
As somebody has already pointed out, color contrast is defined by a relative luminance calculation specified by the W3C.
Just today I got flagged for a contrast of 4.49. It turned out that someone had mistakenly changed the page background from #f7f7f7 to #f4f4f4, which was enough to push the contrast over the edge.
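The flip is easy to reproduce. The actual foreground colour isn't given above, so #717171 here is a hypothetical mid-gray chosen to sit right near the 4.5:1 line; against #f7f7f7 it scrapes past, against #f4f4f4 it fails:

```python
# Demonstrating how a tiny background shift can cross the 4.5:1 line.
# NOTE: #717171 is a hypothetical foreground picked for illustration,
# not the colour from the anecdote above.

def lum(hex_color: str) -> float:
    """WCAG relative luminance of a #rrggbb colour."""
    h = hex_color.lstrip("#")
    chans = []
    for i in (0, 2, 4):
        c = int(h[i:i + 2], 16) / 255
        chans.append(c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4)
    r, g, b = chans
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def ratio(fg: str, bg: str) -> float:
    """WCAG contrast ratio between two colours."""
    hi, lo = sorted((lum(fg), lum(bg)), reverse=True)
    return (hi + 0.05) / (lo + 0.05)

print(ratio("#717171", "#f7f7f7") >= 4.5)  # True  - passes AA
print(ratio("#717171", "#f4f4f4") >= 4.5)  # False - fails AA
```

A 3-point change per channel in the background is invisible to most eyes, but it's enough to move a borderline pairing across the threshold.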
Ensuring that websites are accessible is important work. I used to work at a company where the CTO personally dedicated several sprints to refactoring the display layer code to better support screen readers. From what I remember, it was a bit of a tedious process, but nonetheless one appreciated by all of their customers who had disabled staff.
Tools like this could save a lot of time and make the web more accessible.
Automated accessibility testing for Android - both static and dynamic accessibility issues, all automated with little to no setup: https://github.com/vontell/Bility (all open source too)
Identifies color, keyboard, and user-interaction issues, etc., by building up a state diagram dependent on a user's simulated abilities. Generates a report based on the WCAG 2.0 specs.
PSA: Global Accessibility Awareness Day is tomorrow (Thursday) and will be celebrated mostly on Twitter under #GAAD. Most years there are in-person events all over the world.
There are web services that add accessibility checks to your CI processes. https://tenon.io is one, and Deque has services too. Deque makes axe-core, an open-source library of accessibility checks that's used by Google's Lighthouse and Microsoft's Accessibility Insights for Web.
This code from IBM uses their own accessibility checklist; I'll be interested to see how it compares to axe-core.
The accessibility-checker package provides CI tools for Node and Karma. These tools started out as server-based checkers about a decade ago, but not all DOMs are serializable, and a server-side checker loses a lot of the CSS context needed to do proper evaluation. That approach is fine for fairly static pages but has a lot of potential for false positives/negatives.
The tools provide checking for WCAG 2.0 AA, WCAG 2.1 AA and the IBM checklist. The IBM checklist is basically WCAG 2.1 AA, but has some more strict requirements for defining landmarks for screen reader navigation.