Interview with a blind developer on how he works (2017) (vincit.com)
335 points by agomez314 6 months ago | 94 comments



Microsoft has done a lot of good work with VSCode for blind developers. I'm totally blind and used Eclipse for many years. I started using VSCode when I was learning Go. I watched audio cues get added and thought it was a feature I'd never use because I am set in my ways. Now that I'm using VSCode to do Go programming professionally, I'm glad they exist. It's not a game changer, but it is a nice quality-of-life improvement.


I definitely second the VS Code recommendation. I'm doing a lot of work with Jupyter these days, and Jupyter's accessibility really sucks. The VS Code Jupyter plugin is just as good as the rest of the app.

I'm also a fan of the IndentNav[1] addon. It essentially allows you to navigate code like a tree, using indentation levels for hierarchy. This is very useful to skip over functions or blocks of code you're not currently interested in, get a quick overview of what a large function is doing without getting into the nitty-gritty details, or quickly figure out what if statement an "else" belongs to.

[1] https://marketplace.visualstudio.com/items?itemName=TonyMaly...


Off topic, and I hope it doesn't cause offense, but I'm very curious: what do you find to be good domains to work in as a blind developer? As a non-blind web developer whose domain is highly visual, I'm sometimes irrationally worried that I'd be out of a job if my vision went for one reason or another, and it gets me speculating about what the programming life is like for blind developers. Another visual aspect I'd worry about is graphical presentations made by other parties in my org.


I've done a lot of web API development in both legacy applications and microservices. It's pretty accessible since I don't need to rely on a UI and can use cURL.
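For example, a quick check of an endpoint from the command line (the URL and header here are just placeholders, not a real service) is about as accessible as it gets:

    curl -s -H "Authorization: Bearer $API_TOKEN" \
      https://api.example.com/v1/orders/42 | jq .

The response comes back as plain text in the terminal, which a screen reader handles far better than most web UIs.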


I know a blind man who works at Google and sometimes does 3D graphics programming.


That's a modern day Beethoven :) That's awesome. I think a lot of folks would find that amazing to read about!


Do you mind walking me through your setup?

Blind programmers are always so impressive with their skills and tooling. I’ve tried to adopt some stuff I’ve learned, but it’s hard and y’all make it look so easy


It looks like the audio samples have disappeared since 2017, but they're available on the Wayback Machine for those who want to listen: https://web.archive.org/web/20170831162900/https://www.vinci...


Thanks a lot! I expected the samples to be hard to understand, but this is absolutely mind blowing.

Have any sighted people around here managed to improve their ability to understand fast speech?


I routinely listen to podcasts at 2x while doing other things, and can go up to 3x if it's quiet enough (or a slow enough task) to focus. I'm willing to bet I could do it faster if I was doing nothing else.


Thanks, very interesting. Those samples sound like total gibberish to me. I'm a native English speaker, and if I really concentrate, I can sometimes pick out a word or two, maybe a phrase, with the written text right there too. Maybe I could get to understanding speech that fast if I worked my way up to it.


Thank you good man!


It’s like when I’m speed running a podcast or YT video.


I'm surprised at the number of commenters here that are blind, or commenters who work with someone who is blind. Maybe this is more common than I thought, and that's awesome. Goes to show that we should try to make accessibility more than just an afterthought.


I'm sighted, but goshdarnit, some of the little accessibility features that Microsoft has worked into its products are super useful. Read Aloud in Edge, for example, is a lifesaver when you have to trawl through Azure docs at double speed.

My biggest beef with Microsoft around accessibility is how keyboard shortcuts are slowly dying out as apps move to becoming embedded webapps. Excel on Windows, for example, has great keyboard navigability. Teams does not.


Excel on Windows is a JavaScript application too, right? No?


Not to my knowledge. The 365 apps can be run in a browser or as native Windows desktop apps. I avoid the former as they seem slow to me, and their keyboard shortcuts clash with those of the containing web browser.


My greatest realization with a11y came when I sat down and used screen reader software on my work project and realized how unusable it was for anyone visually impaired. Even tiny things like font sizes take on a different perspective once you realize what it can be like.

It also made me realize that making an accessible site, such as all the gov.uk ones, means that the site will almost by default have a great UX flow and be easy to use. You have to make some decisions that can limit you, but at the same time you achieve so much in usability once you design things with a11y in mind.


Yeah, and what's more frustrating is not getting time to work on those bugs. My current place is a little better, but my last place was a joke. We spent time and hired consultants to evaluate our product, logged the a11y bugs, and dove right into our next feature work. Those bugs are probably still there years later, if they haven't aged out.


Blind leading the sighted


One time when I was a kid, I rode with my dad while he gave a blind guy from our local Linux users group a ride home. With the windows rolled down and an occasional question, he gave us turn-by-turn directions to his house, based on the map he had in his head of the route between the venue and his house, plus the way traffic sounded as it echoed off of various key buildings and streets that were landmarks for him.


I once worked with a blind malware analyst. I was skeptical because of the working memory requirements for reverse engineering being hard in general... until we worked together and his screen reader gave him hex and assembly. He was beyond fast at reversing and generally excellent at his job.


Blind person here, do you remember what tools they used by any chance?

I find modern IDA pro to be pretty inaccessible, same with Ghidra (though I only tried that one on Mac, and the Windows version is apparently somewhat better).


As GUI applications, I find they are pretty challenging and deviate from normal application design. Ghidra's multi-window design is both powerful and a royal mess; many things aren't intuitive. Resizing columns in the listing window deserves some sort of dubious award.

Ghidra has a “headless” mode where it will just run without the UI.

Or have you tried Binary Ninja? Its UI is simpler and a bit more primitive, but it is designed more for programmatic analysis and less for manual/interactive usage. Its GUI is more for debugging scripts, not so much for manual reversing.

Programmatic analysis can be powerful for automated taint analysis and such.
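If it helps, a typical Ghidra headless run looks roughly like this (the project path, binary, and script name are placeholders, so treat it as a sketch):

    ./support/analyzeHeadless ~/ghidra_projects MyProject \
        -import ./target.bin \
        -postScript ExportFunctions.py

All of the analysis output then flows through the script and the console instead of the GUI, which sidesteps the multi-window mess entirely.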


TBH, hex and assembly sound like a sufficiently specific domain that they may have written their own tools/plugins. Maybe even a custom pronunciation library for reading off the symbols. With something like Festival, that could be pretty feasible.


So they had long been a Mac user, but the screen reader support started to slip around ~2014, so they actually switched back to Windows; it was just a much better experience, since Apple, who used to rock, had dropped the ball. I tried to get him in touch with some Cupertino people, but we got snubbed, sadly.

Anyway, it was IDA Pro, and they had spent some upfront time tuning the layout to optimize their flow. I don't remember the specifics, but yes, their IDA looked very different from the default setup.


I can't see why that would be a Mac-specific issue. I use both platforms, and it would be on the developer to support screen readers on either platform. Microsoft Office didn't support VoiceOver until 2016, for example, and you can't really report that to Apple accessibility.

Where I found a lot of bugs is in supporting web standards like MathML, and this is where I will usually switch to Windows, since the screen reader tends to be more flexible and you have a lot more software focused on assistive tech.


I think it's what you just mentioned: at the time there was a lot more assistive-tech software for Windows, which made for a much better experience, while OS X's screen reader had become difficult and buggy.

Previously, macOS just worked way better, given that the apps were developed with Apple's frameworks and polish, and accessibility was part of that.


Previous coverage from 2019 where the author commented as well: https://news.ycombinator.com/item?id=21898537

Related topic from 2020 (Ask HN: I'm a software engineer going blind, how should I prepare?): https://news.ycombinator.com/item?id=22918980


Thanks! Macroexpanded:

Blind software development at 450 words per minute (2017) - https://news.ycombinator.com/item?id=21898537 - Dec 2019 (53 comments)

Ask HN: I'm a software engineer going blind, how should I prepare? - https://news.ycombinator.com/item?id=22918980 - April 2020 (473 comments)


That second thread is really interesting — I don’t think I have to worry about blindness, but I’ve wondered how I’d adapt to working if I couldn’t see, or didn’t have use of one or both hands.


Thanks for the references, especially the second one. I came here to post it as a comment, but here on HN there will always be someone who outperforms you :).

For me, the mentioned post (Ask HN: I'm a software engineer..) is a true classic. The first/top comment is legendary as well.

Also, that story was the highest-voted Ask HN submission of all time until a few months ago, when it was beaten by Ask HN: I’m an FCC Commissioner proposing regulation of IoT security updates https://news.ycombinator.com/item?id=37392676


To me, this brings up all sorts of anxieties around how I could possibly function (professionally) if I lost my eyesight... and what I would practice in advance if I have reason to expect problems.


Very interesting article, and it was striking to see his desk without a display or mouse.

I posted a long and very moving story the other day about a software developer who went blind in one eye within six hours: https://elye-project.medium.com/i-became-blind-within-6-hour...

It made me think about how much time we spend in front of screens these days and that eyesight can't be taken for granted.

This article was a nice contrast describing the possibilities to do work as a blind developer.


If anyone wants to read the linked article without dealing with Medium's paywall, here it is: https://pastebin.com/Jyw04xJV


Link where you can hear the audio samples: https://boingboing.net/2017/08/28/this-blind-software-develo...


Wow, the fact that he can understand what's being said at that speed is incredible. His brain must've repurposed most of those visual processing neurons for audio.


> His brain must've repurposed most of those visual processing neurons for audio.

Maybe! Blind people's visual cortices do light up a lot more than you might think, apparently. Some blind people even experience visual synesthesia.

But I think a big part of it is just ordinary practice. My mom is in the process of learning screenreaders on her iPhone and Windows PC right now, and she's still working her way up to speeds like that. It takes a lot of time to reach that level.


Related:

How A Blind Developer Uses Visual Studio (2017) - https://www.youtube.com/watch?v=94swlF55tVc - https://news.ycombinator.com/item?id=14347908


"The answer of course is Visual Studio", pun intended.


Unfortunately that video is gone. Is that the one where he was using sped up audio cues to analyze code?


That's odd, the YouTube URL still works for me. Here's another link to the same video: https://www.youtube.com/watch?v=wKISPePFrIs

I think it's the same one you mean, his screen reader's audio is so fast it sounds like electronic blips.


If blind people can do things primarily with their sense of hearing, then I believe I can learn to do the same with practice. This would allow me to work with both my visual and auditory senses. Alternatively, I could switch between working with my sight and hearing during different sessions, giving each sense a break. It sounds like a wonderful productivity hack. What do you think of this idea? I wonder if there are any blind training services for non-blind people.


> I wonder if there are any blind training services for non-blind people.

Blind people generally get mobility (i.e., navigating with a cane) and technology accessibility (i.e., using your phone and computer with screenreaders) training through social programs that are for blind people, but the individual teachers often offer private tutoring beyond the scope of those programs. Presumably some of those instructors are also willing to teach sighted people. If you call around, I bet you'll find one.


I had the same thought and learned how to comprehend TTS at very high speeds (~5x plus or minus a bit depending on how alert or tired I am.) I recommend it, it allows me to "read" for longer/more than my normal fatigue limit would permit.


Thanks for sharing your case. But what do you do with the parts you miss while listening to TTS? Do you just read fiction or light reading that you don't mind missing?


My headphones have a button which I mapped to seek backwards 10 seconds. Sometimes I still can't make something out (often because TTS is butchering the pronunciation) and have to read it visually. I read both fiction and nonfiction like this. Books, news and technical articles, etc. I turn it off when I run into things like tables of numbers since that just turns into a wall of noise for me.


10,000% respect for anyone able to develop or work productively while blind. I lost the use of my dominant right hand for a few months three years ago, and it was so difficult to adapt even for that short time. Respect!


I work with an engineer who is visually challenged and uses emacs with dictation to code. He is a pretty phenomenal engineer.


This makes me realize it must suck to be a mediocre blind developer.

The average person is just average. Unless there is something about visual impairment that makes them better at development, I would expect them to be, on average, just average developers.

Yet I would guess that visually impaired developers get more scrutiny. If nothing else, just because they are a novelty.

Anyhow, I guess I am just reminding myself to treat each person as an individual.


I feel I'm a mediocre blind developer and it does suck, mostly just the blind part though. I do my best not to let my coworkers know I'm blind (I work remotely.) Mostly to avoid either 'cripple porn' or being disregarded. Either way, it's easier.


I'm "on the spectrum" as they say, and I don't talk about it much; I'm quite concerned with what's called "passing". I want people around me to focus on the things I can do, rather than the things I can't do. Can totally empathize with that someone with a more acute disability than mine, such as blindness, would want to do the same.


Quite bizarre that you can hide the fact you're blind at work. But yeah, why not, fuck pity.


If he can perceive bright light, I wonder if he can detect color?

Because I'm mostly normally sighted, I use audio cues for extremely low-bandwidth data, like chat or build status. I wonder if someone with a text-to-speech tool would be better off getting visual cues for low-bandwidth data. For example, one or more color panels on the wall in front of you to indicate chat status, unread email, or failed builds. Four pixels in three colors can encode a lot of status (3^4 = 81 distinct states).


I refer to my vision as "no useful vision." I can see enough light to sometimes tell if the lights are left on in a room or if the sun is out. I can not see color; it's either bright, neutral, or dark.


Interestingly, that's exactly the case for me if I close my eyes. Through the eyelids I can tell dark, neutral, and bright.


At best I can tell the direction of a light source with my eyes closed. That still seems like it could encode four states. But the light intensity might make my skin uncomfortable; better, perhaps, to use something tactile.


I wonder if there’s a disability friendly version of the famous Apple Pascal poster


I wonder how much multimodal LLMs are changing the game for blind developers. I imagine if used the right way it could be a huge productivity booster.


Why do you think this? I'm totally blind and find ChatGPT to be useful for quick answers to questions instead of searching through SEO spam and low-quality tutorials. I find Copilot to be useful for better code completion and for helping to write basic unit tests. I don't find that it can actually write complex code for me. I very well may be using the tools wrong, though.


Multimodal LLMs can take images and describe them as text, which I imagine could be useful in some cases for the vision-impaired.


It can be useful, although the usual caveats with LLMs apply. For example, I can get a description of the contents of a window or control with one keystroke. Here is what it says about this HN page:

The image shows a screenshot of a text-based discussion on an online forum. The discussion involves several users exchanging thoughts on various topics related to accessibility for visually impaired individuals, multimodal language models (LLMs), and coding. It features typical forum elements like usernames, timestamps, and reply links.

[... summary of the comments]

The color scheme is predominantly white with text in black, usernames in blue, and links in light blue, creating a contrast that's commonly seen in web-based forums.


> It features typical forum elements like usernames, timestamps, and reply links.

God, that's so on-brand for AI.

"The image shows a group of humans. They are engaged in human-like activities like talking, eating, and hiding their insecurities. They feature typical human elements like hands, ears, and nipples."


Thanks for this. Pretty useful I'd say!


That's a good point. This may have been useful when I was working on internal webapps to determine if the UI looked like I expected it to. That was in 2010 though when it was more common to use plain HTML instead of single page frameworks. I'm glad I moved into back-end development.


Not much, though I use GPT Vision (through a Mac app[1]) occasionally, especially for screenshots and sometimes plots.

[1] https://github.com/chigkim/VOCR


Related: How a Completely Blind Manager/Dev Uses Emacs Every Day - Parham Doustdar - https://emacsconf.org/2019/talks/08/


Interesting. I thought blind developers would be working 100% on a UNIX command line, using tools like ed for text editing (no GUI or TUI).

Related: "Command Line Programs for the Blind", by Karl Dahlke [1]

[1] http://www.eklhad.net/philosophy.html


I actually tried using ed and didn't like it very much. My main gripe is that you strictly have to operate on a line-by-line basis; there's no way to go word-by-word, hit option+backspace to erase the word you just went past, and then replace it with something else. Sure, there's the s command, but carelessly turning every "eric" into "daniel" also turns "American" into "Amdanielan", and you have to be extra careful around that sort of thing.
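To be fair, you can limit the collateral damage with word anchors. Here is the pitfall and the fix, shown with GNU sed rather than ed itself since it shares the s/// syntax (\< and \> are GNU word-boundary extensions, so consider this a sketch of the idea):

    $ printf 'eric is american\n' | sed 's/eric/daniel/g'
    daniel is amdanielan
    $ printf 'eric is american\n' | sed 's/\<eric\>/daniel/g'
    daniel is american

But having to think about that for every substitution is exactly the kind of friction I mean.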


This may have been a better option in the early '90s, when GUI screen readers were not as good. I did z/OS assembly programming for several years using ISPF as a text editor and submitting jobs to compile my code. I'll gladly take an IDE with features from 1995 like renaming variables, moving to the next error, and jumping to a function definition.


If you don't mind me asking, what kind of Mainframe stuff were you working on where writing assembly was the best way to go about it?


I was working on profiling and crash analysis software. The software had existed since the late '70s or early '80s. It was written in assembler, so that's what was used to update it. If it were written from scratch today, it's unlikely the majority of it would be written in assembly.


Did IDEs in 1995 actually support renaming variables? I thought that was a more modern thing like 2000s.


IDEs were mostly nonexistent in 1995.

I believe Smalltalk could have been considered an IDE? Just vague memories from people using it in those days; I have never used it.



Turbo Pascal, Macintosh Programmer's Workshop, QuickBASIC, and a variety of others would like to say hi.


I would hate to have to do any editing in ed. Just reading a file line by line would be painful.

GUI programs are actually often easier for blind people in the same way they are for everyone else as long as they have good keyboard navigation and use a toolkit with good accessibility support.

I currently mainly use VSCode for development; they are putting a lot of effort into accessibility.


Unfortunately, accessibility under Linux, at least, is considered lagging and problematic. Microsoft Windows seems to have better accessibility support all round than Linux.

See:

https://scribe.rip/@r.d.t.prater/linux-accessibility-an-unma...


Unix people can just use Emacspeak for everything.


Those don't expose semantics and state in the same way that well-coded GUIs do.


A mentor of mine now has very poor eyesight due primarily to age. It's affected his ability to use a computer significantly and even his ability to concentrate on a task for sustained periods of time. Also, a friend of my wife's has macular degeneration, and at time of writing has 30% or less of his eyesight remaining.

I find my own eyesight is starting to deteriorate (though with prescription lenses it's still pretty good), so in light of this and the above I'm definitely interested in how blind programmers work. Accordingly, I've started experimenting with Emacspeak so that, if I ever go totally or effectively blind, there's still a pathway I can use to continue using a computer effectively (including for programming) without sight.


I've known this to be true, as the subject of the article states:

> "Windows is the most accessible operating system there is"

and people don't believe me when I tell them.


One of the best talks I saw was by a blind dev in India. About halfway through I realized “holy crap, this guy had to memorize all his slides and talking points,” and then it was even more impressive.

Also to add: we are all temporarily disabled at some time in our life one way or another. It’s never a bad day to try out an accessibility tool and explore how well your work functions with it.


Wonderful article. I see it’s a few years old. Wonder if VS Code has improved vastly in accessibility and what the author’s take is now.


One of my biggest fears as a frontend dev is to build UIs that are inaccessible for people. It's easy to take so many things for granted.

The good news is you get a lot of accessibility out of the box using native HTML. For higher abstractions I love working with accessibility-first libraries like Headless UI or shadcn/ui (which is built on top of Radix).


Really interesting.

One thing that jumped out at me was how much of my own coding process as a sighted developer overlaps. I'm sure I'm not the only developer who talks about “seeing the code in my head” before typing it out. Not literal lines, but I craft a mental map beforehand, which guides the process more on a subconscious level.


One time I phone-screened a blind engineer. I had no idea he was blind at the time. He did the coding exercise with no problems, and I forwarded him to the onsite. The only thing that was unusual was that all his code was flush left. It was perfectly fine code, just formatted oddly. ¯\_(ツ)_/¯ Nothing autoformat couldn’t fix.

When someone else interviewed him at the onsite and saw my comment in the packet, they told me. We guessed it was some artifact of using a braille display or something.

If anyone is familiar with this technology, I’d be curious to know if my hypothesis is correct.


If my limited knowledge is correct, many braille displays are fairly loud, and you'd hear the clacking when the engineer wasn't typing (if you used a live coding website).

None of the blind engineers I've worked with used them: too slow to refresh, too limited a number of characters displayed at once, and/or too expensive. Some are better than others, but they're all expensive.

If I had to guess, the usual tool the engineer used for coding had voiceover support for tab stops, but not in the browser editor.


Actually...is there even a reason for a blind developer to indent their code (outside of languages like Python where whitespace is significant)? Sighted people do that so that the code is easier to read, but if you aren't reading visually, then the extra whitespace is useless (the nested context information is already conveyed by curly braces and the like).

It's possible that this developer normally writes without stopping to insert or adjust indentation, and then runs a formatter to pretty up the code before sharing it with others - but simply forgot to do so in this instance (or didn't have one available in the coding exercise editor)


Yes, in my opinion indentation is essential. Screen readers can tell you the indentation level, which definitely helps keep track of where you are when reading nested blocks. It is also important for making the code readable to others.


It's also possible they had very limited vision and used very high magnification, where indentation might be actively unhelpful since you'd just see lots of blank space at times.


You can change tabs to show up as a single character. Of course, if people are forcing spaces then you're stuck; you have to hope that the whitespace isn't significant and then collapse multiple spaces.


That's a nice post to read. Thanks for sharing.


I sure wish I could read this..



