How can you program if you're blind? (stackoverflow.com)
185 points by rayvega on Oct 28, 2010 | 67 comments



I've sailed the open oceans with a blind man at the helm. Which sounds like a line from PoTC but really: he used an audio compass that played a tone at different points of the compass, higher or lower so he knew how to correct. The rest was done by feeling the way the boat handled through the deck and the wheel.


I'm a blind programmer. I use an 80-character braille display which shows one line of text at a time. My preferred language is Python, although some blind programmers dislike it for its syntactic use of indentation. I find that scrolling with the braille display and feeling for the indentations on the left gives me a good feel for the shape of a program. I know another programmer who prefers Perl because of its rich use of sigils, although he uses speech synthesis. I prefer to use vim for all types of editing (I hate IDEs, although many blind programmers like Eclipse or Visual Studio). Why do you reckon the feedback blind programmers use is non-tangible, BTW?


I was just thinking, before reading your response, that Python would suck for the blind because of its strong preference for "beautification" of the code. So your response actually surprises me.

Do you think having braces instead of indentation would have helped accessibility? Any other issues with the language syntax?


Maybe one's preference depends on whether one uses speech or braille. Although I know of some blind programmers who won't learn Python because of the indentation, some like it. Some speech users use a piano scale to indicate indentation. Some use editor settings that transform the source code on load so that blocks are explicitly delimited, and reformat it on save. EdSharp, developed by a blind programmer, is one such editor.
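The load-transform/save-transform idea is simple enough to sketch. This is a hypothetical toy version (not EdSharp's actual code): on load, insert an explicit `# end` line wherever indentation decreases; on save, strip those markers again.

```python
def delimit(source):
    """Insert '# end' markers wherever a Python block closes (indent drops)."""
    out, stack = [], [0]
    for line in source.splitlines():
        if line.strip():
            indent = len(line) - len(line.lstrip())
            while indent < stack[-1]:          # block(s) just closed
                stack.pop()
                out.append(" " * stack[-1] + "# end")
            if indent > stack[-1]:             # new block opened
                stack.append(indent)
        out.append(line)
    while len(stack) > 1:                      # close blocks still open at EOF
        stack.pop()
        out.append(" " * stack[-1] + "# end")
    return "\n".join(out)

def undelimit(source):
    """Strip the markers again before saving."""
    return "\n".join(l for l in source.splitlines() if l.strip() != "# end")

src = "def f(x):\n    if x:\n        return 1\n    return 0"
print(delimit(src))
```

The transform round-trips: `undelimit(delimit(src)) == src`, so the file on disk is untouched while the in-editor view has every block end spelled out.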


I was thinking that too, but I wagered there would be a way to give white-space a representation that makes it manageable.

Although, it would seem to me that it would involve a lot of counting of white-space.


What do you think about languages like J or APL or even Haskell, where you can do an awful lot in very few characters?


I don't know APL and I'm not currently smart enough for Haskell. My current infatuation for learning is Scala. When I first started programming in the mid-80s, I was hacking on a COBOL stock-control system on VMS; the braille display I was using at the time didn't do terminal emulation, and this software built screens by outputting VT52 escape sequences. I learnt to decode them so that I could check my screens for alignment and spacing, etc. I also discovered that I could log an interactive session using SET HOST/LOG; I wrote a TECO program to translate these sessions back into a 2-dimensional layout that was much easier to read on my braille display. So, yes, I generally like dense languages.


> "I learnt to decode them so that I could check my screens for alignment and spacing etc."

That's just amazing, but I guess it makes a lot of sense... since your display is not 2D-based in the same way standard screens are, and you can read the code itself (i.e. the display is consistent). I'd upvote you a thousand times just for how cool it is to work around difficulties in such a leet way :)

Of course it also reminds me of... I don't see code, just blondes, brunettes, redheads...


I used to work in a braille lab during my master's. The lab generated braille books and reading material at blind students' request.

For regular text, we used to do a simple text conversion and print it using a special printer called a braille embosser. We used software called Duxbury for the purpose. http://www.duxburysystems.com/dbt.asp

For math (anything beyond a single line), braille uses something called Nemeth code, so we had to encode almost all the math functions/equations manually in Nemeth code.

For graphics, we used to emboss a 2D picture on special paper. The process involved: scan the picture -> convert it to grayscale -> remove all the noise -> take a printout -> use a tactile embosser to put it on swell touch paper (http://www.abledata.com/abledata.cfm?pageid=19327&top=15...) Typically, a chart, graph, or waveform was key to understanding the material.
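The grayscale and thresholding steps of that pipeline are simple pixel math. A toy sketch in pure Python (real labs used image-editing software; the function names and the nested-list pixel grid are made up for the example):

```python
def to_grayscale(rgb_rows):
    """Luminosity-weighted RGB -> grayscale conversion, per pixel."""
    return [[int(0.299 * r + 0.587 * g + 0.114 * b) for (r, g, b) in row]
            for row in rgb_rows]

def to_dots(gray_rows, cutoff=128):
    """Threshold to black/white: dark pixels become raised dots for embossing."""
    return [[1 if v < cutoff else 0 for v in row] for row in gray_rows]

# A tiny 2x2 "scan": white, near-black / dark gray, near-white.
scan = [[(255, 255, 255), (10, 10, 10)],
        [(30, 30, 30), (250, 250, 250)]]
print(to_dots(to_grayscale(scan)))   # -> [[0, 1], [1, 0]]
```

The real denoising step matters a lot more than this suggests: without it, every scanning artifact becomes a spurious raised dot under the reader's finger.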

During the process I interacted with a few blind CS engineers who had graduated and were working for Microsoft. At that time (10 years ago), Microsoft was the most blind-friendly company. They used Visual Studio for daily hacking.


Have you ever tried PostScript, Forth or Factor?


The compass tones are ingenious, but aside from that I'm not surprised he was able to do this, as sailing is one of those things where feel plays a much larger role than one would expect. There are many forces at play that have nothing to do with sight. Having helmed in night storms of 20-foot seas where I could basically see nothing, between the motion of the vessel and the feedback force exerted on the wheel, I quickly got a sense of how the boat was moving. In fact, in those situations I probably would have been much better off with your friend's compass than the soaked one at the wheel. And don't even get me started on trying to read the windex in such conditions!

Coding blind would be much different. The input is physical but there is nothing tangible coming back. It is more or less all visual. Come to think of it, if I ever lost my vision I would probably become a full time sailor :)


> The input is physical but there is nothing tangible coming back. It is more or less all visual.

Not really. At the end of the day it's just data, text, how you encode it is not very important as long as it can be understood.

Because humans very strongly rely on sight, it makes sense that the default serialization format for text output be print/visual, but there are a number of non-visual interfaces to text. And if none fits, build a new paradigm.

I think samlittlewood below makes a good point: even with a fairly slow (e.g. braille) output, it's at worst similar to coding using a line-oriented editor (e.g. ed). Except you're used to it because you see pretty much everything via a line-editor so you're probably proficient at it.


I went to school with a blind CS student. He had no problem using a computer by himself, but the normal classroom setting was very difficult for him. Based on how he operated, meetings would be far more difficult than actually writing code.

PS: Ok, he is not going to be building a GUI, but text-to-speech works well. And a keyboard-only interface works fine for both vi and Visual Studio.


Why were meetings more difficult than writing code?


The tendency of people explaining their ideas to draw diagrams on paper / a whiteboard. Once a diagram was on a computer there were tools to help him understand it, but it took a while. So the only way for him to keep up with a free-flowing discussion was to follow what people said.


I remember looking into whiteboard systems that automatically digitize what people write, as a way to conveniently record what was written. The implication that this tech could also be used to help a blind person during a meeting is interesting.


How did he handle other users of the ocean?


No worse than any other helmsman :-)

Which is to say with the boat heeled over or the jib up, the helm is reliant on the bowman and the rest of the crew keeping watch anyway, you can't see much forward.


We had a blind classmate all through my undergraduate program in CS, and he was a very capable hacker. He lived inside emacs using emacspeak with a high speed synthesized reader voice, and you could often hear his terminal talking to him in the front row if you listened carefully during class.

Blind people develop good memories because they are constantly building detailed mental models of spaces they need to navigate, and I think this might give them a leg up in programming because they are used to working with mental models rather than referring to maps, diagrams and documentation. There are obvious drawbacks, but I think programming is probably a pretty good choice of profession for blind people.


How similar is this to using a line editor (ed/edlin/edit/teco) on a slow terminal? You have a model in your head, code in a buffer, and you use the edit commands to update both - with occasional listing sanity checks to check you are keeping the two in sync, or move to a new region.

I hear some guys wrote a pretty cool operating system like that!


It is somewhat similar, but to get the real 'feel', you should also limit your line width to one to two characters. Braille reading typically is done with both hands (with one looking ahead, and one doing the actual reading), so it does 'show' slightly more than that, but you will still be faster at reading those characters than a blind person would be.

Also, you would have to give up syntax coloring and style and font variations.

Finally, chances are that some characters (braces, asterisk, plus sign) from your programming language take up more than one braille character. Real nerds would hack their character-to-braille-pattern translator to optimize that translation table, but the price you pay is having to cope with multiple mappings. Actually, you would have to do that anyway, as there is disagreement about the encoding to use, even for digits, and because the U.S. frequently uses a contracted braille system (where, for example, the single-letter word 't' must be read as 'the' where appropriate; that 'where appropriate' is for the reader to guess at).


> Also, you would have to give up syntax coloring and style and font variations.

I'm pretty sure ed was dead by the time those were invented; the wiki claims that one of the first syntax-highlighting editors appeared in '85.


There's a guy on the Lua mailing list who has mentioned (e.g. http://lua-users.org/lists/lua-l/2008-09/msg00513.html) that he appreciates how its relatively small, keyword-based syntax (http://www.lua.org/manual/5.1/manual.html#8) makes it easy to program in with a screen reader.

There's also Emacspeak (http://emacspeak.sourceforge.net/). I haven't used it, but there's a chapter about it in _Beautiful Code_.


It's interesting how quickly the text-to-speech setup can be made to run. My dad's computer talks so quickly I can't understand it without slowing it down.


There's a lot of talk about line editors in this thread. From a quick scan of Wikipedia, it seems that 40 cells is about the biggest braille display you can buy.

WTF? Why has nobody invented a full-screen braille display? Is there some reason it's massively more complicated than it seems?


Well, I'm working on a full-screen display. :-)

Software demo: http://www.youtube.com/watch?v=4qnL7479fbU

Source: http://github.com/hugs/pinmachine

Hardware demo: http://www.youtube.com/watch?v=AUgbu0cE2-Q

Biggest issues are cost and size. I'm currently at the "1977 Apple" stage - a big, expensive, hand-made prototype - and the biggest issue is cost per pin. Resolution is an issue as well... I'm going for "big and expensive" first, then will work on making it small and cheap over time. I'm working with small DC motors for now, but there are plenty of electromechanical tricks to get this down to braille-worthy stuff.


This seems like a project that the MIT Media Lab would like to get its hands on. Why don't you seek a collaboration?


I'd love to chat with the MIT Media Lab folks. If you're so inclined, check out my HN bio and have them contact me via email or twitter. ;-)

I haven't sought a collaboration yet, because "pinmachine" is my weekend project -- my time is otherwise filled up working on my startup (saucelabs.com).


I had a thought about it, ideally it would be a pin board (like these http://www.youtube.com/watch?v=ikkjT7ACJME ) with solenoids or similar motors on each pin... somewhat similar to core memory too, with a different geometry.


Yup, that's exactly what I'm going for... It'll be really cool when you can program anything to appear. My dream is to make it large enough to render "Han Solo in Carbonite" on this kind of display. :-)


I suppose that "big enough to palpate with both hands" would be already a tremendous progress for blind people :)


I'd love to have an 80x25 braille display. I believe they've been made in labs. The blocker is the expense.


Hm. I smell a really good startup opportunity, if one could build a reproducible design.

I know there's a ton of blind people out there... do most of them know or rely upon braille, and would be interested in such a product?


Most blind people use speech output. If braille displays were more affordable, perhaps more people would use them. But with the cost of 80-cell displays being roughly double that of 40-cell displays, how could one make an affordable multiline display?


I think you'd need a different paradigm. Right now each cell (if I understand the design) is a relatively complex mechanical entity. Scaling a display means scaling the number of cells linearly, which gets expensive fast.

But if you rethought the problem and built a sort of generic pixel-field display, only with each pixel being a pin, and a consistent way of activating them (perhaps something like e-ink), you could manufacture arbitrarily large screens with sub-linear cost.


This might be what you're looking for:

"When glass touch screens feel like sandpaper" : http://www.cnn.com/2010/TECH/innovation/10/08/tesla.touch.di...


Ah yes! A foldable display with nanotech that could make tactile maps and such with different textures. That's what I want.


Not quite foldable...

How about a single "cell" that is on an X-Y mount (or just X for a simpler single line display). Your finger "sees" only one cell at a time anyway. As you used your finger to "skate" over the surface of the board, the "display" computer would read the X-Y position and drive the pins appropriately for that spot.

Back to a one-dimensional (X only) readout with something like a scroll wheel on one side to scroll up and down seems pretty easy and useful. Picture a dot matrix printer where your finger is the "paper". The printhead would need a 2D set of pins, not just a vertical line... hmmm, maybe a vertical line of pins would be adequate in conjunction with the left-right sliding motion.

Result: size is limited by the X-Y mechanical accuracy (backlash, rigidity) and cost would scale very well compared to size: one cell's worth of pin drivers for any reasonable size readout.

[edit] Per Someone http://news.ycombinator.com/item?id=1842617, an X-only (line) readout with two heads, one for each hand (probably on the same platen), would mimic braille best. If you had a force sensor to sense the user pushing +/- in the "Y" dimension, you could use that to "scroll" the line without having to take your hands off the readout.


When I looked around St Dunstan's in Sussex (which provides support for the blind and visually impaired), it was suggested that braille was on the way out, which surprised me.


What is the market for such a device? How many blind computer users are there? How many blind programmers?


Then why not go for an alternative approach?

Perhaps a capacitive touch screen with certain subtle audio and haptic feedback?

It seems like the mechanical approach current braille devices use might've jumped the shark, and I wonder if the blind world doesn't need a system that's easier to build with today's technology.


In the USA, that typically is what people can afford. At about 3 characters an inch, it also fits nicely below a laptop. In parts of Western Europe, 70- or 80-character displays are common, too. I have even seen two-line displays with two times 80 characters. Those are huge, though.

Full-screen is a bit of a holy grail. What prevents it from happening is technology and price. The only reliable technology uses the space next to each character for a piezo-electric actuator. That makes it impossible to make displays that stretch far in two dimensions (that two-line display was over an inch high).

Also, those actuators are fairly expensive, say $40 per character. Production volume is low, so any manufacturer will need a margin of 100% or so. Result: 40 characters for a couple of grand.

AFAIK, every other technology still suffers from reliability problems. It turns out to be quite hard to make a small, reliable actuator with the required force, travel, and resilience to dirt.


40 cells is 240 individually actuated pins.

10 lines is 2400.

80 x 25 is 12,000.


Most refreshable Braille displays have 8 pins per character. That fits neatly in a byte, and allows for some more characters to show. That is quite a gain, as standard Braille (64 different patterns) doesn't even have separate code points for capitals.
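That byte-per-cell layout is also how Unicode encodes braille: the Braille Patterns block starts at U+2800, and the low eight bits of the code-point offset are exactly dots 1-8. A quick sketch:

```python
def cell(*dots):
    """Unicode braille character with the given dots (numbered 1-8) raised."""
    mask = 0
    for d in dots:
        mask |= 1 << (d - 1)   # dot n is bit n-1 of the one-byte pattern
    return chr(0x2800 + mask)

print(cell(1))              # dot 1 only: braille letter 'a'
print(cell(1, 2))           # dots 1 and 2: braille letter 'b'
print(cell(*range(1, 9)))   # all eight dots raised
```

Note this only gives you raw dot patterns; which pattern means which character is exactly the contested encoding question the comment above describes.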


The utility-to-cost ratio isn't there. You still need to scan it one line at a time, unlike a visual screen where you can see it all at once and focus on what you think will be relevant.


I know a blind Python programmer, Krishna Kant Mane[0], who works with Pylons, is a FOSS evangelist, leads the FOSS project GNU Khata[1], and is an awesome speaker[2]. I loved the way he uses all free and open-source tools to work on his laptop, program, and deliver. Hats off.

[0] http://www.blogger.com/profile/07891377863557367515
[1] http://gnukhata.gnulinux.in
[2] http://gnunify.in/speaker/profile/108


A group of students from Southern Illinois University Edwardsville and Washington State University is working on a programming language for the blind:

http://www.youtube.com/watch?v=lC1mOSdmzFc


Has anybody out there who's not blind tried doing a "blind for a day" thing? Just wear a blindfold for 24 hours and see how much of your life it affects. I haven't tried it but after reading all this I kind of want to.


I'm legally blind and have been since birth. I only have vision in my right eye. My visual acuity is low (around 20/180) and my field of vision very narrow (less than 20 degrees). Still, it doesn't really stop me from doing much. I program - and I also design.

A couple of years ago I developed narrow-angle glaucoma, in addition to my other problems. After several laser surgeries failed to resolve the problem, I had a more invasive surgical procedure done. Afterwards, I was completely blind for 48hrs. Aside from the fact that it was terrifying, because at that point we didn't know how much of my vision would come back, I found that I was able to adjust fairly quickly. In just a few hours, I was able to get around the house, get dressed, shower, brush teeth, etc. unassisted. Eating was a little more difficult. It's hard to fork food you can't see. I stuck to sandwiches. I didn't attempt reading (I don't know Braille) or using the computer.

I'm sure this was made easier by the fact that I'm very familiar with my house and by the fact that my vision was already poor, so I'm more used to doing more things by feel/sound/memory, especially at night.

Which, I guess, answers your question, but to finish off the story:

After a couple of days, my vision started to slowly come back, though it was very blurry for a long time. I was able to start using the computer, via a combination of the screen reader, enlarging, and inverting the colors on the display. All I could see at that point were shapes, and, after about a week, in order to read text, it had to be about 128pt and high contrast.

Fortunately, most of my vision returned. The first couple of months were slow going, but after that it got to the point where I could pretty much function as I did before. It took around 6 months to recover completely, though. I did end up losing some contrast sensitivity, and as a result I can no longer read printed text. There's just not enough contrast. I mean, I can see the words, but reading for more than just a few minutes gives me a headache. So now I read books on an iPad, and with the backlit screen (a backlit display helps a ton) and zoom capabilities, I can read without headaches.


I routinely test web applications with screen reading software; I am partially sighted so I have an interest in developing things that work well with screenreaders. I highly recommend "being blind" when you test the things you build, and the idea of blindfolding yourself for a day just to see what it's like is a good idea. It gives you a little more empathy. However, please keep in mind that it's also not as bad for the blind, as they have a lot of practice using assistive devices, technology, etc. The human body and mind are incredible, adaptable, wonderful things.

P.S. I did a video of a screenreader and twitter a few days ago - you can see the thread for that at http://news.ycombinator.com/item?id=1799246


I've thought about doing this. I think the biggest problem - aside from setting up the computer for text to speech - would be getting to work, and to other places. I'm familiar enough with my current route that I could walk there with a cane, I think, but crossing the street would be really hard. Especially since none of our walk signals are equipped with audio indicators.


The answers seem to reflect the technical "how", but I still don't get how you can efficiently program if it takes minutes to read a page of existing code.


One of the lead programmers of JAWS is Glen Gordon. He is blind and is one of the smartest programmers I have known.

http://en.wikipedia.org/wiki/JAWS_%28screen_reader%29

http://www.afb.org/afbpress/pub.asp?DocID=aw070204


There are plenty of 'impossible' things you can do blind.

For example, there are blind golf players. They have their own golf associations and tournaments. One of my old co-workers used to assist blind golfers by describing the location, distance, etc. The golfer does the rest.


I think the biggest thing for me would be living without intellisense/autosuggest - I mean it'd be OK to code ifs and fors and whatnots without them, but I can't imagine how I'd quickly navigate the large, enterprise software libraries I deal with regularly...


The key is to use a good screen reader like JAWS. As I posted here, it's next to impossible using Windows' built-in screen reader, Narrator.

http://news.ycombinator.com/item?id=1799588


Amazing. It's really hard to imagine programming blind and jumping between methods/files/debuggers/browsers effectively, but it seems some people do pull it off.

I was struggling with tendonitis in my right arm a year back and had to do all my coding left-handed, and that made me feel crippled and annoyingly slow. It's really impressive what people can overcome.


I always wanted to learn braille as a kid so I could read in the dark.


I worked with a programmer who was legally blind. He used speech-recognition software plus some sort of display software that blew each character up to about 10" x 6".


Are there any other HNers out there with other disabilities that cause you to modify how you interact with your computer or how you program?



A better question would be how can you program graphics if you're blind?

That's something I would love to find out and/or solve.


I've used Graphviz to make relationship diagrams to aid my communication with sighted people. There's an interesting Google Developer Podcast with T.V. Raman where he mentions programming directly in PostScript while he worked at Adobe. I've used LaTeX for my personal documents (CV etc.) and sighted people have asked me how on earth I managed to produce such a nice document. I wonder how feasible it would be to use Metafont for more intense graphical expressions.
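Graphviz works well for this precisely because its DOT input is plain text: the diagram is written and checked as lines of code, and only the sighted reader ever sees the rendered picture. A hypothetical sketch of emitting a relationship diagram (the `to_dot` helper is made up; it just produces DOT source to feed to the real `dot` tool):

```python
def to_dot(edges, name="relations"):
    """Emit Graphviz DOT source for a list of (source, target, label) triples."""
    lines = [f"digraph {name} {{"]
    for src, dst, label in edges:
        lines.append(f'    "{src}" -> "{dst}" [label="{label}"];')
    lines.append("}")
    return "\n".join(lines)

print(to_dot([("Parser", "AST", "builds"),
              ("AST", "Codegen", "feeds")]))
```

Piping the output through `dot -Tpng` would render the picture; the author works entirely on the text side of that pipeline.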


I wonder how useful a VoiceOver-optimized editor for iPad would be.


I read this headline as if Agent Smith were asking Mr. Anderson...


I suggest the person asking this question to watch "Sneakers."


With so many crappy apps out there, I figured blind people were already programming. :)



