Why I can no longer recommend a Mac to fellow blind computer users (applevis.com)
387 points by devinprater 11 months ago | 90 comments



Blind developer here. I could never recommend VoiceOver on macOS to anyone. 10 years ago, before I lost my eyesight, I was a huge fan of Apple - everything just worked so much more smoothly than in the Windows world. And it was much more visually appealing. However, after losing my eyesight, I had to switch to Windows. I learned VoiceOver on Mac at first, but working with it was so unbearably frustrating, even back in 2016 when I was trying. Here are some of my complaints that I still remember:

* Many actions only work every other time. I remember that interacting with the text area of the Terminal app was especially painful, since the sequence of commands was non-deterministic.

* The hierarchical navigation model is more cumbersome than the flat navigation on Windows. In Xcode, to access some project settings you need to interact with a panel, which has two horizontal subpanels, so you interact with the right one, which in turn has two more vertical subpanels; you interact with the bottom one, which has three more subpanels... The recursion depth was 9 levels, I kid you not, and making a single mistake lands you in the wrong place.

* Searching a webpage in a foreign language doesn't work, because Command+F has to be pressed in the English layout, which opens the VoiceOver search window, and switching to another layout doesn't work there.

* No easy way to open a link in another tab in the browser - as opposed to Control+Enter on Windows.

* Keystrokes that are too difficult - I remember one of the keystrokes involved 5 keys: Fn+Ctrl+Option+Command+Up/Down. By now I've forgotten what it does, but I remember that it gave me plumber's disease - pain in my left wrist from having to press too many keys for extended periods of time.

I probably forgot a bunch more. Not sure if any of these have been fixed since then. But my general impression was that Apple is not very interested in fixing bugs; instead, they seem to care a lot about presenting shiny keynote slides every year at WWDC claiming how much they care about accessibility.

In the Windows world things are considerably better. JAWS is much more convenient to use, even though I've heard many reports of them being unwilling to fix bugs. NVDA is open source and it is my favorite: if something doesn't work for me, I just go and fix it myself, but in general things are rarely broken to that extent in NVDA.

Also, if I remember correctly, JAWS and NVDA have about 45% of the market share in the screen reader world, while VoiceOver is at about 10%. So judging from this point alone, anyone would be much better off learning a Windows screen reader.


> I remember one of the keystrokes involved 5 keys: Fn+Ctrl+Option+Command+Up/Down.

What the actual fuck. This is ridiculous if you are blind and otherwise abled. Now imagine what it’s like if you have incomplete control over your fingers, or tremors, etc. At that point it’s simply impossible. Apple, with all those billions in the bank, maybe you could spend a few million getting these problems right.


If someone came to me with that, I'd recommend either sticky keys, where you can just press a key and the system holds it down, or VoiceOver's "trackpad commander", which basically gives VO control of the trackpad, so the user can swipe left for the previous item, swipe right for the next, that kind of thing. Of course, you then need to turn the trackpad commander off when you type, because any little tap will be sent to VoiceOver and your focus gets moved from the text field you were in to the submit button, right as you press Enter for a new line...

There's also the "numpad commander" for external keyboards with a number pad. I like this; you navigate with the 4 and 6 keys, 4 moves back an item, 6 moves forward, and there are keys to skip to the next heading and such. Of course, for hand tremors, I'd really hope that one day Apple's Voice Control and VoiceOver work well together. Right now it's rather hard to use if you don't know the names of the controls to click before you reach them with VoiceOver, and if you don't have headphones, the voice output will interfere with the voice input.


I haven’t seen that key combo before, but I wonder if part of the thinking is that they are all next to each other. I’m not sure fine motor control is needed for fn+ctrl+option+cmd; the side of the hand would be enough.


> But my general impression was that Apple is not very interested in fixing bugs; instead, they seem to care a lot about presenting shiny keynote slides every year at WWDC claiming how much they care about accessibility.

Yes, you could say the same about their sustainability/environment efforts (and arguably privacy). It's more about marketing and owning the narrative than it is about actual substance, IMO.


Thank you for a very insightful comment.


What is your opinion of iOS VoiceOver and its rotor?


iOS VoiceOver is better than Google TalkBack on Android. Although I've heard recently that Android is catching up fast.

I mean, it's amazing that blind people can use such a portable device, but I don't quite like that only so few gestures can be made on a touch screen. That's the reason they had to invent the rotor. I personally find it inconvenient compared to using a full-size keyboard, but given a touch screen, the rotor is probably the best you can get out of it.

But also I need to mention again my complaints about bugs that never get fixed. Just in case someone from Apple is reading this:

* In some apps the focus jumps about 1 second after opening a new screen. This is especially bad in the YouTube app, but it can also be reproduced in other apps, albeit with a lower delay.

* Back in the days when iPhones had a physical home button, a triple-press was supposed to toggle VoiceOver. But the problem is that on more than half of iPhones (mine and friends') it stopped working after a few weeks of use. Yes, it can be argued that it's hard to reproduce, but this bug was so well known and infamous in the blind community that shame on Apple for not fixing it.

In general I'd think that a smart decision would be to make both the iOS and macOS VoiceOvers open source, so that blind people themselves can fix all the bugs. Until then we're left with the situation that a bunch of uninterested sighted devs work on VoiceOver, and as I already mentioned, they are more interested in implementing shiny new features than in fixing decade-old bugs. And as far as I know, being an accessibility engineer is considered one of the worst positions in the pecking order of engineers, so only those people who couldn't find a better career become accessibility engineers.


I have been a regular Mac user for over two years (MacBook Air M1) and the issue is far less severe than this person claims.

It used to be a much bigger problem around Mac OS 12, but even then, Cmd + option + q (quit and keep windows) would solve it pretty quickly.

VoiceOver has a lot of issues, but so do Windows screen readers, and as a Mac user, I can at least decide to postpone an accessibility-breaking update, something which Windows doesn't let me do. Such an update was one of the main reasons why I switched OSes.


I'm glad you don't experience this anymore. I wish I could say the same. Of course, I have a 2019 MBP so no M1, and I'm not rich enough to just go out there and buy one when I hear it affects even M2 models. I'm not going to bet that kind of money on issues that halt productivity that much.


It's also possible to switch the default so that Cmd + q will quit and keep windows, while Cmd + option + q will quit and discard windows. I like it this way much better. I have it set up for just Safari however.


Not blind, I have no experience at all with any of this. However, what I found quite impressive was witnessing an Apple Store employee guide a blind person through VoiceOver. That, to me, implies that Apple at least trains some folks in the stores on accessibility features.


Yep. Sometimes they even hire blind people to do it. There was one at the Apple Store I went to when I first was looking into a Mac. Years later, I heard he left though, moving into another field.


Maybe some, maybe now.

I would love to believe it was due to my excoriating writeup of the experience a decade ago:

https://liam-on-linux.livejournal.com/18605.html

That was just shockingly bad treatment, in our opinions (categorically not humble.)


I was expecting some sort of systemic problem across the operating system, but it seems to be just one specific bug in WebKit—though there is also a link to “our recent post on problems in macOS Sonoma and the replies”, and though it does sound to be a rather debilitating bug for affected users, and though this sort of bug making it to production can be a symptom of systemic problems in the organisation.


They mention another post covering broader issues, just highlighting this as the worst:

> I want to emphasise that the “Safari not responding” bug is far from the only issue affecting VoiceOver users on Mac. As our recent post on problems in macOS Sonoma and the replies outline, there are numerous other VoiceOver frustrations and failures impacting users.

As a web developer who is committed to supporting blind and disabled users in general, I'm just getting deep into all of this now, and so far I can say that VO is the worst of the three major screen readers (JAWS, NVDA, and VO) for web usage. It has fallen far behind on standards support for modern ARIA attributes, completely ignoring many of them in practice, to the point where it's not always possible to achieve the intended screen reader behaviour in VO.
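
For anyone curious what I mean by modern ARIA attributes, here is a rough sketch (not my actual test code; the element ids are hypothetical, but the attributes are standard ARIA, and whether each one is announced correctly is exactly what differs between screen readers):

    // A live status region: screen readers should announce the text when it changes.
    const status = document.getElementById("save-status")!;
    status.setAttribute("role", "status");
    status.setAttribute("aria-live", "polite");
    status.textContent = "Draft saved";

    // A menu button: the expanded/collapsed state should be read back to the user.
    const menuButton = document.getElementById("menu-button")!;
    menuButton.setAttribute("aria-haspopup", "menu");
    menuButton.setAttribute("aria-expanded", "false");

    // aria-current is one of the newer attributes where support is spottier.
    const step = document.getElementById("step-2")!;
    step.setAttribute("aria-current", "step");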

Much as with Safari, Apple only seems to pay lip service to supporting VO. Completely blind users seem to only ever use VO if given no other option, but many online stats muddy OS usage with web usage, making this less obvious.


Would you tolerate a machine that just stops working for several minutes randomly all day while screaming at you?


This perfectly describes my typical experience using the internet in 2023.


That’s what ads are.


I think you just described compiler output.


aren't we already doing that every day?


One fault can leave a platform 100% broken.


This broadly seems -- for a blind user -- like the monitor periodically taking a break for a sighted user. Seems like it would get real old by, like, the 3rd time or so.


A single bug that stops you using the internet at all is far worse than hundreds of bugs that simply inconvenience you.


This is an interesting take, and a very salacious headline.

I’m a lawyer who represents a number of disabled individuals (including the blind). My clients with vision impairments are all in on Apple products, saying that the accessibility features are head and shoulders above other options.


I switched from MacOS to Linux/Wayland after 10 years on MacOS because it accommodated my vision impairment better.

Because I have cataracts, it helps to make everything on the screen bigger. MacOS can do it, but the result is blurry unless the scaling factor is exactly 1 or exactly 2, and choosing 2 reduces the horizontal resolution too much for some web sites. (My monitor is 1080p, so with a scaling factor of 2, the viewport is only 960 pixels wide.)

In contrast, Linux/Wayland offers scaling factors 1.0, 1.25, 1.5, 1.75, 2.0 and 2.25. (I've been using 1.75 for months.) Windows works similarly to Linux/Wayland (though apps that have not been updated to work with recent versions of the OS end up blurry), so MacOS is definitely a laggard here.
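
To spell out the arithmetic behind those numbers (just a quick illustrative sketch, using my 1920-pixel-wide panel):

    // Logical (point) width = physical width / scale factor.
    const physicalWidth = 1920; // 1080p panel
    for (const scale of [1, 1.25, 1.5, 1.75, 2, 2.25]) {
      console.log(`scale ${scale}: ~${Math.round(physicalWidth / scale)} logical px wide`);
    }
    // A factor of 2 leaves only 960 logical px, which is what breaks some sites;
    // 1.75 leaves roughly 1097, which is the compromise I've been using.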


MacOS 10 used to let you specify scaled resolution in Control Panel >> Displays >> Resolution (scaled). It doesn't give you numbers, though, more like "larger text" and "more space", with "effective resolution" under the image. The effect is a scaling factor, though, you just can't see the actual value. I expect macOS 11+ also continues to do this.


Sure, I know. That is the blurry option. There is a free utility that causes choices with "(HiDPI)" at the end of them to show up in the list of possible resolutions even when using a non-HiDPI monitor, and that is how you get a non-blurry scaling factor of 2.0 on a Mac.

(Also, it hasn't been called Control Panel for 20 years.)


> but the result is blurry unless the scaling factor is exactly 1 or exactly 2

I can’t relate. I’ve been using a scaling factor of 1.5 for about a decade now (4K monitor scaled to 2560x1440) and it’s never blurry.


It is not as sharp as it could be, and someone on this site once commented that he notices the difference (between Mac and Windows, where the software doesn't introduce blurriness) even on his 4K monitor.

(You could ask why, if I hate blurriness so much, I am still using a 1080p monitor. My answer is that my high-DPI monitor is coming soon.)


Sounds more like personal preference than something objective. Windows’ 1.5 scaling on the same monitor looks pretty bad to me (font hinting in particular, legibility isn’t great).


I chose not to scale the Windows UI but to deliberately run a lower screen resolution on my 4K monitor, and aliasing is minimal, almost unnoticeable. Great for coding. For gaming, though, I switch back to native 4K resolution. All this while Windows' software scaling remains 1:1.


Although I've used the "Display" tab of the "System" pane of the Settings app in Windows 10 to change the scaling factor (labeled as "Scale and Layout"), I cannot make sense of your comment.


Every time I tried to scale the Windows UI in the "software" way, as you just described it, I would encounter bug after bug after bug, either in Windows or apps or the taskbar. After realizing that Windows will never get UI scaling right, I decided to do "hardware" scaling, meaning I still keep my Windows scaling at 100% (recommended), but then I set the display resolution to a compatible lower one. My monitor is 38" and its native resolution is 3840x1600. But when I set that resolution, there is no way in hell I would be able to read the fonts on the screen. So, instead, I set my resolution to 2560x1080, which doesn't look blurry, scales perfectly, and has zero UI bugs.


OK I see now.


Mac really only works in the ~110 or ~220 ppi sweet spots, which most people don't realize. The only monitors I've seen that fully work are the LG UltraFines and the Mac displays. I'm on a ThinkVision display now and it's middling.


Oh we love our iPhones. But then as a computer OS, we get Windows. Some people try to stick with Apple, either because of Logic Pro or the M1 chip, but I'd say a good 85% of blind people that even have a computer in the first place, have Windows.


I have worked with some people in the local blind community in my city here in Denmark. They all use Windows even though I see lots of iPhones.


It's best to refer to those with disabilities as people rather than their disability, e.g. 'blind people' rather than 'the blind'.

Also, this is talking about VoiceOver on macOS, which is a specific aspect of Apple's accessibility solutions. For the most part, iOS on iPhone and iPadOS are better than Android here, though not without their issues. The issue at hand, however, is very real and, for professionals, highly inconvenient.


Am I gonna have to put :blind: in my username here too? Anyway, I'm blind and I'm a blind person, and a good many blind people like being called blind people, because it's shorter and doesn't minimize the way blindness affects us. Anyway, I was responding to someone who said something like "blind people use Apple" or some such; my point being that we do love our iPhones, but for computers we use Windows. And just in case you didn't get it, I'm blind too.


This is definitely true in the case of phones, but not for computers. NVDA on Windows is excellent and improving all the time.


I am curious what he's going to recommend instead. AFAIK, Macs are the best out there (by quite some way) for accessibility.


Windows with JAWS or NVDA is fairly mainstream, and its accessibility features are pretty well understood and supported. VoiceOver less so. That only covers one facet of accessibility of course, but they have something like 80 percent market share in that segment.


Last time I saw a poll of blind CS/engineering students across NA, the wide majority preferred JAWS as a screen reader. That implies Windows.


JAWS is very well-known, and well-supported. Web sites are often tested against it.

There’s something to be said about having a well-known system, as it cultivates a great deal of “tribal knowledge,” and that can be invaluable.

That said, I think that we train ourselves on whatever platform we use. Both Windows and Mac have fairly comprehensive support for accessibility. I think Apple is newer to the game, but the granularity of the options in the latest OSes is pretty crazy. Not just for blind folks, but for all kinds of disabilities. I’m confident that Apple takes accessibility very seriously.

I like to leverage accessibility and localization in my programming.


> I'm confident that Apple takes accessibility very seriously

I think Apple gets too much of a pass for being serious and committed, where other players have actual results.

On the milder end of accessibility needs, the part that surprised me the most was keyboard mapping. While macOS got Caps Lock/Esc/Ctrl remapping as a touted improvement, there is no blessed way to remap the rest of the keys.

That means third-party daemons like Karabiner are the only straightforward option, and since it stopped being a kernel extension, it's also outside the critical path and will lag under load. At the worst moments I see keystrokes getting processed with enough lag to miss the combined triggers or go to different applications.

Windows has more options out of the box (e.g. 106-key layouts, IME languages, etc.), PowerToys is blessed and efficient, and AutoHotKey works well even under stress.

I'm still hoping for more improvement in macOS land, but I wish there was less talk and more work on Apple's part.


My understanding of this is that Windows definitely makes it easier to write keylogging software etc., which of course also means it's easy to do all sorts of other keyboard things like hotkeys (or perhaps I should say Windows makes it easy to write all sorts of hotkey and keyboard-manipulating tools, which means writing bad things like keyloggers is also super easy).

This (again, my limited understanding) is why AutoHotKey has not been successfully ported to Mac: it is either not technically possible to achieve, or the time required would make it not worthwhile.


I haven't used OSX in a while, but in the past I created my own keymap using Ukelele. Would this not still be possible?


Ukelele seems to be about character output, and doesn't interfere with the actual physical key behaviors or modifiers (that's what I get from the site and the interface when running it).

For instance, I need the Left and Right Command keys to each toggle the IME on/off when pressed alone, and act as Command when combined with other keys. Likewise, the Fn key doesn't seem to be mappable.

Seems to work at the same level as the PowerToys remapper on Windows, minus the shortcuts.


I still use Ukelele and a custom Swedish Dvorak keyboard layout without any problems (in fact, the layout I created 10+ years ago still works out of the box, on every release of macOS).


I'm not someone who needs to use a screen reader, but whenever I've tried them, my experience is that JAWS and NVDA are infuriating, while the learning curve on VO is much more forgiving and I can get done what I want in short order.

JAWS is often used in universities in the U.S., giving it a lock on the market - perhaps this is why the people polled prefer it?


Implies Windows but without information about why Windows was chosen. Sometimes people have to use whatever computer is available.

So I take the question more as: given whatever computer platform you are stuck on, what screen reader do you prefer?

Feel free to share the poll though, maybe it will clarify.


N=1, but the only blind engineer I've worked with preferred Windows, I think for JAWS. The company was a little behind the curve on hardware, but was fairly progressive in a lot of ways, and made accommodations without question, so I know he could have gotten a Mac if he'd wanted one.

He even gave a demo to the team at one point of what it was like to navigate editors, screens and browsers as a blind person, starting with the screen reader at a normal speed and ramping it up to what he was accustomed to.

I'm a little sad I didn't get to work with him directly outside of a single time he helped me test some custom UI elements, so I don't know much more about his take on voiceover versus jaws, but this was also over a decade ago. I'm sure things have changed more than a little since then.


Microsoft actually takes accessibility quite seriously.


Back when I was at MS I heard of accessibility PMs walking into VP offices and telling them that their newest version wasn't shipping until accessibility was 100% done.

I don't know if it's the same now, but when I was there certain aspects of the customer experience were held sacred.


I wish it was still the same now. I updated to the latest Win11 feature drop thing, hit Windows + E to open File Explorer, and heard "File Explorer. Home, pane." And that's it. Luckily I know how to change the folder that File Explorer starts in to something useful. But a new computer user? They'll be confused and not know if they broke something, or if they're doing something wrong, or how to get to their files. But at least with Windows, I can quickly and easily browse the web and do just about anything I need to. Of course, I cheat and pull out Emacs with Emacspeak on WSL, and I'm like the only blind person that does this on Windows so I would never, ever expect other blind people to reach anywhere near WSL for anything.


What terminal emulator do you use for WSL?


Just Windows Terminal. Now that NVDA supports it really well. Or, I should say, Windows Terminal supports NVDA well.


Most of us use Windows.


> AFAIK, Macs are the best out there (by quite some way) for accessibility.

Well, now you know different :-)


There is a well-known bug in the German TTS that has been adding the word „Homograph" before and after each number for about half a year now. It's a shame for everyone who depends on that technology.


Non-sequitur to the article, but are there any good utilities for blind users to use on Linux?


* BRLTTY for Braille displays/Braille terminals

* Fenrir for a userland CLI screen reader

* Orca for the GUI screen reader; you'll want to start Orca before you start other apps so it can do some magic with exported accessibility environment variables and AT-SPI stuff. I don't understand it and I still do the ritual of putting export linux-a11y = 1 in .profile just in case. That's not the exact variable but the list is out there somewhere.

* cups-filters for printing to Braille embossers/printers. I think just about every major distro has the embosser drivers now. Linux can even use ImageMagick to convert images into Braille graphics. Meanwhile, Windows doesn't even come with embosser drivers, let alone printing graphics to them. If only GUI Linux accessibility were better; it's already got a really good base, with Braille support for just about any display you have. Not very good TTS engines, though. Anyway, sorry for making this longer than just a list of stuff.


Orca is one of the more prominent screen readers on Linux:

https://help.gnome.org/users/orca/stable/index.html.en

I used it briefly years ago for some tests but in general I'm guessing NVDA and JAWS on Windows are probably better options.


From a developer's POV this is really frustrating. This seems like something that makes for a bad experience, but at least in the post there are no steps to reproduce. Also, when a bug only shows up sometimes, if it's rare enough, it's really hard to nail down. I guess Apple's metrics are superior to mine, yet sometimes just catching an issue is tough.

The other complication with Apple is the fact that sometimes multiple teams may be needed to fix an issue (e.g. WebKit, Safari, and VoiceOver/Accessibility), and they all have their own Apple ways of working, which isn't as simple as a single team fixing an issue.

And lastly, yeah, it might just be prioritization, as there are some bugs that have been with us for years, or even regressions between OS iterations that stay forever.


That "other complication" should actually be in their favor. Getting an issue fixed can only be more difficult when the various working parts are the responsibility of completely different companies.


Apple used to be great at developing good accessibility experiences, especially on their iPhones. (I personally cannot say much about accessibility on macOS as I rarely use it)

I think Google and Microsoft are catching up with Android and Windows respectively.


I was worried that making macOS free with the purchase of Macs would lead to it withering on the vine and/or becoming a featureitis junkie.

Is that the case here? Not enough regular care and cage cleaning? This seems like a DEI class-action lawsuit nightmare in the making.


I tried using a Mac (2019 MBP, 8 GB RAM, 4 Thunderbolt ports) at work. I could create course material in Ulysses. I could manage files and folders with Finder. I could even slowly navigate through Moodle's stupid web interface. But then I had to start using Google Docs and a Salesforce instance. Google Docs works reasonably well on Windows. I even gave a good enough bug report for the Orca maintainer to make it more responsive on Linux with Chrome. But Mac? I can't even begin to describe how messy it is. VoiceOver goes in and out of the document, the VoiceOver cursor gets stuck on menu items, that kind of messed up, where even if I just want to download the document for offline use, it's... frustrating enough for me to throw away the MacBook, not (entirely) literally, and grab the Windows machine again.

Salesforce is slightly better, except when you get to a table of record entries, that table is empty. Simply nothing there. Luckily, I can use Firefox to get around this. Or I can simply go back to Windows and not have any of these problems. The web is snappy again; there is no freezing up or "not responding" or busy messages, nothing. It, amazingly for Windows, just works. And that's a shame for Apple.


I'd have dumped the Mac a long time back if it wasn't for Ulysses - such a great experience. Microsoft Word on Mac is a car crash of accessibility and, like a car crash, there is a dispute over whose fault it is... Apple with VoiceOver or Microsoft with Word.

Sadly, no plans for Ulysses on Windows.


I’m very sorry to hear about your experience. That really sucks. The fact we have a thriving browser ecosystem has got to count for something though. Could you recommend people use Firefox instead?

I have come to respect Apple in recent years for their focus on and promotion of accessibility within the industry. They are leading where others have often abrogated the responsibility to build inclusive products. We should celebrate this.

Therefore this headline made me sad. I appreciate sometimes drastic measures are required to persuade those with power to use it to change things, and it does sound like this bug has been around for too long. If Apple are serious about accessibility they need to fix it.

I have read that the relentless release cycle at Apple does lead to bugs like this never getting fixed. I hope this post gets upvoted enough that someone with influence sees it, and ensures that this one is different. Good luck!


>They are leading where others have often abrogated the responsibility to build inclusive products. We should celebrate this.

What "others" are there except Microsoft Windows and Linux - both of which also have plenty of users here saying they see those OSes as ahead of Apple? Seems to me that at best Apple is on par, not someone to celebrate above others.


Good point!

I was thinking about when Google launched their progressive web apps page a few years ago with the Accessibility section marked as ‘coming soon’ – not good enough!

I was also thinking about the marketing oomph Apple have put behind innovative accessibility features over the last year or so: https://www.apple.com/accessibility/ It’s great to see a company’s accessibility page being more than a dry statement of conformance with WCAG 2.1. As far as I could tell, they were the first major tech company to take a lead on this.

They clearly do need to do more work. As do we all!


I certainly don’t want to detract from the point this article is making, but as someone with a significant visual impairment who is not completely blind, I legitimately can’t imagine using anything that’s not macOS these days. This is coming from someone who grew up using Windows and had an extensive Linux phase…including Gentoo.

Screen readers are a bit of a lightning rod for accessible-technology interest, almost entirely because most people have some sort of sick curiosity: “How can someone use a computer so differently to the way that I do!?” Of course, most of these people stop here, never bothering to try using a screen reader to navigate whatever they’re making. They might open VoiceOver, realise they don’t know how to intuitively use it, and fumble around with trying to close it again. This tends to have the effect of sucking any motivation out of the room. Most people won’t then go and meaningfully improve their screen reader experience, but they also won’t think to address any other accessibility shortcomings, especially visual ones, because “blind people use screen readers!” is the pervasive meme.

This is part of why I am all in all so happy with how Apple has been going in this space lately. An obvious result of co-design / consultation, or dare I say it…hiring people with disabilities. Addressing accessibility concerns that the stereotypical SV techie has never even heard of.


OP is reporting a very long-standing bug, which they describe as: "This “Safari not responding” behaviour when using VoiceOver dramatically impacts ..."

OP seems to be happy with the functionality of VoiceOver on Mac but not with the stability of VoiceOver on Mac.

You don't mention what actually works for you (beyond being on a Mac, which you do mention), only that you disagree with the article.


So what do you use that is better?


Blog post from Applevis. Subtitle: Why I Can No Longer Recommend a Mac to Fellow Blind Computer Users

I don't mean to editorialize, the title was just too long for the title box.


Yup, the limit is 80 chars so it needs to be chopped. In this case the suffix is more informative than the prefix so I've swapped out "We deserve better from Apple" above. Thanks!


[flagged]


There’s so much wrong with your post:

- Blind users and other disabled people are often professionals.

- Apple is very much a personal computer company. They make iMovie, Photos, Apple TV, Pages, etc for personal end users. If you’ve used some of Apple’s pro apps like Final Cut or Logic, you’d know they can design for pro users - but macOS as a whole is the opposite, getting more simplistic over time as they cater more to consumers. The Mac is a good pro computer IN SPITE of it being designed for consumers.

- Many people in the blind community are dishonest? WTF. This just sounds like discrimination to me.


[flagged]


> as a professional in the IT industry, my recommendation to the my colleagues, don't fall for someone if they say they have a disability and demand something. It's not our concern.

Supplying your end users the tools they need to do their work is absolutely your job. What do you even mean?


I'm going to pile on here and implore you as best I can to check your priors. You are coming across as incredibly rude and unhinged. I’m sorry that a blind person…hurt you at some point? I don’t even know. But your behaviour is frankly unacceptable.


I'm confused by your statement. So you're saying blind computer users shouldn't recommend what works for them to fellow blind computer users? Only people who are getting paid to review should recommend based on their experience?

Also, are you saying that blind computer users can't use a wide assortment of computers? I have a buddy who is blind and is a developer; they happen to have a top-end MacBook Pro and use it quite well. I don't think that calling them a -low- user makes any sense at all.


[flagged]


this is an incredibly bizarre prejudice to hold


Apple is the best brand accessibility-wise, unless we are talking about some insignificant niche products.


Accessibility isn't a homogeneous, fungible quantity. Calling something 'the best brand accessibility-wise' is barely even meaningful, let alone useful.

Accessibility is a qualitative issue, and disabled people's needs are varied enough that many accessibility tools are only helpful or usable to a subset of them.


Please take a stroll through Apple's accessibility options and you will see the sheer quantity and variety of them, tackling many disabilities. If you actually have any Apple devices, that is.


So the blind users commenting here saying Apple is not the best are, what, lying? Or maybe you are not the target for those features and could instead listen to both sides of this argument and stay on the sidelines? Apple is not the clear winner (if a winner at all) in the eyes of the people commenting here that use the features discussed. Several are pro Apple but at least as many write long comments about how Linux and Windows are better for them.

I think we should listen to these commenters and not just tell others to go away and look at a list of features, with no background to tell if it is good or bad.


I'm visually impaired but still have a lot of vision left, and I can tell you from firsthand experience that this is an area where Apple's approach to accessibility falls down for my use cases because Apple's approach to scaling means that

- their maximum font sizes for UIs are small
- UI scaling breaks more apps than on other operating systems because Apple emphasizes pixel alignment more than reflowing
- the unified menu bar means that app menus swallow up menu bar icons if you scale the UI up at all

My mother, who also still has some vision but is already legally blind and is in the process of switching to an iPhone after many years of low-vision Android use, can confirm that iOS has much the same problems, including small maximum font sizes and worse reflowing/cutoff behavior than she saw on Android. She's enjoying the transition overall because she's relying more on TTS than she used to, and also because the person who is training her on phone accessibility features knows iOS better than he knows Android.

But for me and my sister, who have the same disease as my mother but whose cases are not as advanced, Apple devices are a worse fit due to their inflexibility with respect to font sizes and reflowing, which makes taking advantage of our remaining vision more cumbersome for us, even though doing so is generally more efficient for us at this point.

Apple devices don't have moar accessibility than other devices. They have different accessibility than other devices. Whether they work well will vary for different disabilities, including different presentations or stages of the same condition. And the usability of those accessibility features is also mediated socially by things like what apps or kinds of apps a person needs to use, as well as what training and support happen to be available to them where they live and work.

This kind of feature checklist thinking you're doing is a profoundly impractical way of thinking about accessibility needs. Checklists have their place in all this, but accessibility needs in the real world are highly varied and contingent on nitty gritty specifics.

That Apple puts considerable resources into accessibility features is great, but it's not like there's a certain 'level' they can reach where there are no more gaps. More crucially, there's no level they can reach where criticism from disabled users is no longer valid or meaningful.


This is correct. Apple is first when it comes to accessibility. However, accessibility wasn't invented just for the blind.


If you're not coding or programming, a Mac is just expensive and overkill.



