
Somehow I don't think that learning how to operate a unique proprietary user interface (the iPad) will help people to operate a real desktop computer. While it may temporarily boost users' self-confidence in using a computer, it will only serve as a fallback after the inevitable frustrations with a full UI.

What is really absent from the minds of older generations is the idea that computers are just tools, albeit immensely powerful ones, and as with all tools we need time and especially motivation to learn how to use them.

Personally, I don't give a damn how my car works, but I had to learn how to use it to get from one place to another, and it would certainly be much harder to do if every other car had a radically different set of controls.




But why would all these people need a "real desktop computer"? People need to read their emails, browse their facebooks, store and browse their images... And most of them will get all this done more easily with a simpler device than with a "real" computer.


Apple is training people how to use the next generation of computers with their language. Because no one else has had the guts to do what Apple is doing, they are ceding the opportunity to define that language.


A person is entirely capable of learning and mastering more than one interface. All that's required is the ability to build a cognitive model of how the thing works that remains coherent as they use it. Humans are actually quite adept at learning new things, and the tendency to assume that familiarity is a necessity is one of the things that has caused interfaces to stagnate.

What is telling is that for as long as we've had the familiar "full UI" of a "real desktop", many intelligent and motivated people still struggle with many of the conventions. Have you ever marveled at the fact that most desktop software in 2010 still defaults to destroying your work? And if you do want to keep your work, it insists that you give it a name, a name which must be unique and must not contain certain characters that the computer is entirely capable of understanding, and then it asks that you choose where in some invisible hierarchy you want to keep it (even though the answer in almost every case is "where I can find it again"). All of this complicates something that might as well be done with pen and paper.

These are things that everyone who uses a desktop interface has to get used to, and there are some good reasons that they are the way they are, but they aren't necessary things for the task. It is possible to safely shed all of these conventions and end up with something that still gets the job done, possibly better than the traditional desktop way.


Hypothetically, if all I want to do is exchange emails and browse the web, why should I have to learn how to use a "real desktop computer" or a "full UI"? I'd be learning a whole bunch of things I probably don't really care about or need to know. Empower me as a user to do what I want to do; don't weigh me down with a whole lot of details that get in my way.

Computers can be very powerful tools, but I am not sure many people are really interested in using that power -- I think many, even those who grew up with computers, would like the details to go away.

To follow your car analogy, cars used to be much more complicated to operate than they are now. Modern cars have electric starters, automatic chokes, and in many cases automatic transmissions, for instance.

Apple is simplifying things with the iPad, Google is heading down the same route with Chrome OS, and I'm sure there are many other players we haven't heard from yet. It's going to be very interesting to see how things eventually play out.


That comparison is a bit off. A car is a tool, and it can be dangerous, which is why it has such a strict interface and education rules.

Computer usage is more like math. It's something you just need, part of everyday life, though many people still hate it, as it is more flexible. GUIs really share similar concepts; you just have to adapt to them, and if there's one word people hate, it's "adapt". They seriously hate it.

You really have time to figure out how things work, and GUIs are really not rocket science to figure out.

Nobody expects users to become system administrators or application developers, but having to answer questions about supposedly self-explanatory clickable graphical elements is very painful. (With a little endurance you could probably get a chimpanzee to understand most of our "modern" UIs.)


When I was using a Macintosh SE in 1989, my peers told me that I was dumb for not learning DOS commands so I could use a "real" computer. (Somewhat ironic: now I'm a Linux user and spend all my time at the shell, while most of them are happily using Windows or Mac OS X...)



