The death of consistency in UI design (osnews.com)
83 points by thomholwerda on June 18, 2012 | 58 comments



Repost from OSnews comments:

This philosophy of putting consistency on a giant pedestal reminds me of the strict, non-expressive philosophy of 60s modernist graphic design. They used Helvetica for everything because Helvetica is neutral, and believed that type should never be expressive because the meaning is in the content. Today many like the style, but very few share the philosophy that design should not be expressive.

Today graphic design is expressive. You can look at a poster and guess the content based on the typeface, colors, texture, etc. I think we are seeing a similar development in the world of UI design. Having the notes app actually sort of look like handwritten notes gives people visual cues to what this application does and makes it easy to understand. I would not be surprised if such visual cues make the app "disappear" more than a purely consistent app would. These apps are only different visually; in behavior they are often very consistent. You also have apps that take things much further, such as Convertbot, Clear or Paper. I think breaking UI conventions is completely acceptable if it makes the experience better. Personally I find Paper to be far more invisible than the other, more consistent sketching apps for iPad.

I think there is more than enough room for both philosophies (and everything between). Vote with your wallet and buy the apps that work well for you.


Additionally, there is the argument that today's users are way more familiar with the language of UIs. Because of that, some bandwidth that used to be necessary to drive home the message "this is what you can do here" can be used for other purposes.

And for those wondering whether that "more familiar" argument has merit: the same has happened in movies. Nowadays, everyone can follow what happens when a scene set in a New York office cuts straight to one where a shop owner has a slight French accent. In early movies, such a change required intermediate scenes showing someone taking a cab to the airport (and saying "to the airport" to the driver), getting on a plane, getting off the plane, etc.


Your point about Helvetica reminds me of this line from a post written today by Armin:

"Helvetica is the fixed-gear bike of typefaces: it’s as basic as it gets, but the statement it makes is as complex as anything else."

http://www.underconsideration.com/brandnew/archives/new_univ...


Allow me to explain to you why you're wrong.

My computer is my computer. Not Apple's computer when I'm using iTunes (hah, as if I'd ever use that piece of crap, but stay with me here), not Microsoft's when I'm using Windows Media Player, not Google's when I'm using Chrome and not Mozilla's when I'm using Thunderbird.

I know what I want my user interface to look like. I know what fonts, colours and relative sizes I want things to be. All in exactly the same way that I know what colour I want the walls to be painted in my home and where I want my furniture.

When some idiot breaks UI conventions because he thinks he knows what colours and shapes my monitor should display, I lose the ability to dictate what my own things look like. Instead of spending his time designing a theme for a GUI toolkit and skinning all of the applications on my computer at once, a designer wastes it coming up with a visual theme for just one of them.

Intelligence 101 - tend towards abstraction as much as possible. This recent trend clearly has its roots in the idiot depart... ahem, sorry, I mean marketing department where the thinking went something like "but we need to solidify our brand experience by making our application stand out!". Yeah, buddy, yours and everybody else's. Give me my shit back, provide the functionality I'm buying from you and design a nice theme for KDE or Windows if you feel so inclined.

To hammer the point in further with another example, CSS was designed with, err, designers in mind and look at what a head-fuck that turned out to be. Leave the thinking to people who know how to think - we'll tell you what needs prettying up and how to do it right.

To throw out an idea of how things could be better - why can't GUI toolkits support fluid designs which can be manipulated by central themes? Why can't some people opt for ribbon menus while others use the old drop-down menus just by changing a theme? What about if I want all my menus to be ribbon style but in a vertical fashion? Or if I want that list of porn actresses' names to be on the right rather than on the left of the porn database application I use to track my stalking habits? GUIs should adapt to how the individual thinks, as opposed to the other way round, as it currently is. That is the smart way to do it. This recent trend in UIs and, frankly, many other things, is taking the stupid approach to the problem.


If you are that particular about your tools you should consider making them yourself, or using software you can really claim ownership of. iTunes and Chrome are not your software in any meaningful sense. Even less so probably because they are free (gratis).

With Mozilla you might have a claim, and I assume you are free to make your own version that has the columns in your preferred order or whatnot. For official support you just have to convince a sufficient number of like-minded programmers that yours is a problem worth solving, and then some minor technical bits like "What determines the central theme?" and "What scripting language shall we standardize on?"


If my hardware (hard disk) has a particular configuration of bits stored on it then they are my bits. How can information stored on something I own not be mine? I'm not talking in the legal sense here, I think that patents are ridiculous, but in the common sense sense.


I can't tell if your original comment was more than a rant. In some sense you own those bits, but not in a way that is useful to you. Starting with just the bits, it's going to be a lot of work to make iTunes skinnable, and you likely won't be able to distribute your changes legally.

It seems similar to ranting that Miles Davis has recorded all of these long boring solos on _your_ CDs. You could spend your time editing them out, you could demand an edited reissue from the publisher, you could just listen to something else, or you could learn to play trumpet and record something you'd rather listen to. The least useful option seems to be posting to HN.


> I know what I want my user interface to look like. I know what fonts, colours and relative sizes I want things to be.

You do? Wow. I sure don't.

Designing a user interface is mostly not about picking your favorite font and color. It sounds like you use the word "design" for what is actually "theming" or "skinning".


  I know what fonts, colours and relative sizes I want things to be.
This is only tangentially related, but a few years ago I set up my web browser to default to white on black. Big mistake. I promptly found out that about half the websites out there specify a text color, but not a background color.

Customization needs to be actively supported. It rarely comes for free.
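
To show how deep the hole goes: the only thing that reliably fixed it for me was overriding both properties everywhere. A rough sketch in TypeScript, assuming a userscript context (the heavy-handed * selector is the point - you can't reliably tell from computed styles whether a page set a color on purpose):

  // Userscript sketch: pin BOTH text and background colors, because sites
  // that set only one leave the other to an unreadable browser default.
  const forced = document.createElement("style");
  forced.textContent = `
    * {
      color: white !important;
      background-color: black !important;
    }
  `;
  document.head.appendChild(forced);

Crude, but that is what "actively supported" means in practice.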


^ the truth. hurts, doesn't it.


>My computer is my computer. Not Apple's computer when I'm using iTunes (hah, as if I'd ever use that piece of crap, but stay with me here)

Yes, very funny. Well, I'm a CS graduate, and I have no qualms about using iTunes. My colleague at the next workstation is a PhD from Cambridge and he uses iTunes too. Might want to expand your horizons about what is a "generally accepted piece of crap" and what is a personal preference.

>I know what I want my user interface to look like. I know what fonts, colours and relative sizes I want things to be. All in exactly the same way that I know what colour I want the walls to be painted in my home and where I want my furniture. When some idiot breaks UI conventions because he thinks he knows what colours and shapes my monitor should display, I lose the ability to dictate what my own things look like.

That you "know what you want" doesn't mean what what you want is also right for you. For example, every obese person that craves for McDonalds also, er, "knows what he wants". But it would be better for him, in the long term, if he was denied that.

Furthermore it's not just about "you" or "me".

For one, the "idiot" you refer to is the programmer, and he also wants to express himself in his product. If you don't like it, don't buy it.

Second, for UIs in general to evolve, people must come up with and try different approaches and, yes, break from the conventions.

Third, your argument doesn't even make much sense. You say you "know what you want", but at the same time you despise programmers making customized UIs instead of following the OS guidelines.

It would surely make much more sense to despise OS companies making their UI guidelines.

For, if a programmer makes a custom-looking app, you can buy something else, but if your OS vendor comes up with designs you don't like you have no recourse except some basic customizations (which might not even apply, e.g. you cannot change how the menubar works or where it is placed on OS X) or the painful switch to a different platform altogether. So you have to suffer their UI decisions in almost every app.

>To hammer the point in further with another example, CSS was designed with, err, designers in mind and look at what a head-fuck that turned out to be. Leave the thinking to people who know how to think - we'll tell you what needs prettying up and how to do it right.

The implication that designers don't know how to think is as idiotic as it is preposterous.

And the insinuation that a non-designer "thinker" can just "tell 'em what needs prettying up" and we'll get good GUIs (as if design work is mere "prettying up"...) is even more laughable.


>Yes, very funny. Well, I'm a CS graduate, and I have no qualms about using iTunes. My colleague at the next workstation is a PhD from Cambridge and he uses iTunes too. Might want to expand your horizons about what is a "generally accepted piece of crap" and what is a personal preference.

http://www.straferight.com/photopost/data/500/medium/double-...

Whoopdeedoo, you've got a degree! Congratulations! I got a C in German when I was 16 but I don't brag about it.

>That you "know what you want" doesn't mean what what you want is also right for you. For example, every obese person that craves for McDonalds also, er, "knows what he wants". But it would be better for him, in the long term, if he was denied that.

Where's that triple facepalm jpeg...

What would help an obese person, and anyone else trying to break a habit in the long term is education, not taking away their free will. If an obese person understands what causes his sugar cravings, why eating sugary food makes him fat and that eating sugar will spike his blood sugar levels and then cause them to crash which will then cause yet another craving, he will stop eating it and lose weight. If that same person is educated about amino acids and how L-Glutamine, a non-essential amino acid, can stop cravings dead in their tracks when taken as a supplement, he will take a spoonful of that instead of reaching for the junk food when a craving hits him. If, through experience, he learns that exercise makes you feel awesome and has that little side effect of bringing weight down, he'll be sold for life. Denying people their right to make choices is a weak attempt to control what is not ours. Education is the way forward.

To address the other points you made, programmers do not like customized UIs. That's something which designers come up with because they a) are not aware of standards and the implications of breaking them and b) take orders from management, who are advised by marketing on how to make the product memorable. To quote those crazy Asian rappers, the Wu-Tang Clan - "Cash rules everything around me, C.R.E.A.M. - get the money, dollar dollar bills y'all!"

> but if your OS vendor comes up with designs you don't like you have no recourse except some basic customizations (which might not even apply, e.g. you cannot change how the menubar works or where it is placed on OS X)

I agree that this is exactly what's wrong with the current GUI toolkits - those kinds of customizations should be implemented at the library level, not the application level. I.e., have a widget for a menu which a theme can render however the hell it wants. Hell, make themes applicable on a per-application basis, since you might want a different style for, say, music production software or an IDE.
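
A rough sketch of the split I mean, with hypothetical names (TypeScript against the DOM for brevity - imagine the same division of labor in Qt or GTK):

  // Hypothetical toolkit API: the application declares WHAT a menu contains;
  // a user-selected theme decides HOW it gets rendered.
  interface MenuItem { label: string; action: () => void; }

  interface MenuRenderer { render(items: MenuItem[]): HTMLElement; }

  // Classic drop-down rendering.
  class DropdownRenderer implements MenuRenderer {
    render(items: MenuItem[]): HTMLElement {
      const list = document.createElement("ul");
      for (const item of items) {
        const entry = document.createElement("li");
        entry.textContent = item.label;
        entry.addEventListener("click", item.action);
        list.appendChild(entry);
      }
      return list;
    }
  }

  // Ribbon-style rendering: one big labelled button per item.
  class RibbonRenderer implements MenuRenderer {
    render(items: MenuItem[]): HTMLElement {
      const bar = document.createElement("div");
      for (const item of items) {
        const button = document.createElement("button");
        button.textContent = item.label;
        button.addEventListener("click", item.action);
        bar.appendChild(button);
      }
      return bar;
    }
  }

  // The theme is chosen centrally (per user, or per application), so the
  // application never hardcodes the presentation.
  const userPrefersRibbon = false; // would come from a central theme config
  const theme: MenuRenderer = userPrefersRibbon
    ? new RibbonRenderer()
    : new DropdownRenderer();
  document.body.appendChild(theme.render([
    { label: "Open", action: () => console.log("open") },
    { label: "Save", action: () => console.log("save") },
  ]));

Same application code, two presentations; swap the renderer centrally and every app on the system changes at once.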

Besides, OS X is possibly the worst example of a customizable UI, save Gnome/GTK. Almost none of it is customizable by design. I swear, Apple products are completely brain damaged.

> The implication that designers don't know how to think is as idiotic as it is preposterous.

Just as the implication that programmers can't design is idiotic and preposterous, not that anyone would ever claim such a thing. Nonetheless, they completely suck at it (see http://www.davelgil.com/BC_placeholder.png). Just as designers can think but completely suck at that. If they would just combine forces and form into a bigger and better machine...

> And the insinuation that a non-designer "thinker" can just "tell 'em what needs prettying up" and we'll get good GUIs (as if design work is mere "prettying up"...) is even more laughable.

But that's exactly what happens even with web UIs - the programmers define the structural elements - e.g. this is a button, this is a list - and the designers skin those widgets. You can't paint a structure which doesn't exist!

Auf Wiedersehen.


>Whoopdeedoo, you've got a degree! Congratulations! I got a C in German when I was 16 but I don't brag about it.

No, you only brag about how idiotic a) programmers that design custom UIs and b) designers in general are compared to you. Anyway, my point, which apparently went whoosh, wasn't to brag (lotsa people have CS degrees); it was to show that even people trained in CS can find iTunes OK --unlike your implication that it is objectively crap for ignorant idiots.

>What would help an obese person, and anyone else trying to break a habit in the long term is education, not taking away their free will. If an obese person understands what causes his sugar cravings, why eating sugary food makes him fat and that eating sugar will spike his blood sugar levels and then cause them to crash which will then cause yet another craving, he will stop eating it and lose weight.

I don't see this "educate them" thing working wonders in the US. Plus it's not about taking away his "free will", just his junk food. He can still "will for it" as much as he likes. It's not like badly made food with lots of additives, preservatives, second rate ingredients, sodium and saturated fats is a constitutional right.

>To address the other points you made, programmers do not like customized UIs. That's something which designers come up with because they a) are not aware of standards and the implications of breaking them and b) take orders from management, who are advised by marketing on how to make the product memorable.

You'd be surprised. I'm a programmer. So is Wil Shipley. So are tons of other, er, programmers that happen to like customized GUIs. Being a programmer is orthogonal to liking customized GUIs.

>Besides, OS X is possibly the worst example of a customizable UI, save Gnome/GTK. Almost none of it is customizable by design. I swear, Apple products are completely brain damaged.

A UI being "brain damaged" is also orthogonal to it being non customizable. You write as if there were no engineering and UI tradeoffs in a customizable UI. There are, and they are very real --telephone support costs from people accidentally switching their UI to some other style is an obvious example.

Also consider all the code needed to implement the "fluid GUI" you mention, with menus that can be shown as ribbons, regular toolbars or what have you by user choice. More code: more bugs, more costs, more complexity.

>Just as the implication that programmers can't design is idiotic and preposterous, not that anyone would ever claim such a thing. Nonetheless, they completely suck at it. Just as designers can think but completely suck at that.

This again implies that only the programmer's job has thinking involved in it. Which is as far from the truth as it can be. But check my response below for more:

>But that's exactly what happens even with web UIs - the programmers define the structural elements - e.g. this is a button, this is a list - and the designers skin those widgets. You can't paint a structure which doesn't exist!

In 1996, maybe. It's 2012. It hasn't been done like that for ages. If anything, with the modern emphasis on UX, it has gone the opposite way: the designers design all the structure, interactions and functionality, and the backend programmers have to implement it. But in the best web shops it's 50-50.


>unlike your implication that it is objectively crap for ignorant idiots.

I intended to communicate that it's crap in general.

>It's not like badly made food with lots of additives, preservatives, second rate ingredients, sodium and saturated fats is a constitutional right.

I believe that it is everyone's right to do what they want to their own bodies. Education lets people make the right decisions. Rules take away a person's right to make decisions. Maybe it's a constitutional right where you live or maybe not, but the law is not in line with common sense much of the time anyway.

>Also consider all the code needed to implement the "fluid GUI" you mention, with menus that can be shown as ribbons, regular toolbars or what have you by user choice. More code: more bugs, more costs, more complexity.

We're on the same page with more code being more costly, but I'd argue that that's exactly what we're doing with the current customized UIs! By implementing this stuff at the library level instead of tens of thousands of custom UI implementations at the application level, we, in effect, reduce the amount of code. Since the population using the implementation increases, there is more testing and any bugs get squashed more quickly.

>There are, and they are very real --telephone support costs from people accidentally switching their UI to some other style is an obvious example.

That's the issue I have with this approach - everything comes down to business costs. In other words, the utility that something has to a small segment of society (the company) becomes more important than the utility it has to everyone else. This, to me, suggests an organizational structure which is broken by design.

>This again implies that only the programmer's job has thinking involved in it.

I'll concede this point and rephrase it - programmers are good at abstraction and saving work in the long run. This needs to be utilised more often than it currently is, particularly in terms of design.

I'm quitting smoking so I apologise for the tone of some of these posts.


I'd like to add a thought: maybe inconsistency in UI design is the only way to experiment and actually improve the current state of design.

When Apple released its iPod (with the big wheel in the middle), it was nothing like the other players we were used to. Was it horrible? Was I annoyed by the difference? No, I thought it was cool, and I loved my iPod. Some years later, they did the exact same thing with the iPhone. Completely different, and yet, it set a new standard for design.

Now I understand the point of the article, and it's true that it's annoying to see every other small app coming up with new UI conventions. But I guess I'm not too radical about it: if this is the price to pay for innovation in UI design, I'll take it.


Agreed - note some of the most enduring new UI conventions we've come to expect in the last couple of years, all of which were pioneered by third-party apps bucking the trend (even in places where there was already established convention).

- Pull down to refresh (where Apple encouraged an explicit refresh button on the bottom of the screen)

- Swipe aside views (where Apple encouraged explicit navigation buttons)

- Slide aside menus (a la Path, where Apple expected tab bars to fulfill that role)

- Fan-out controls (a la Path, where Apple expected the use of drop downs)

The list goes on. I for one am happy that some people are unwilling to just follow the tried and true.


Beat me to it. On a similar note, imagine if interface design hadn't evolved since the first 5 years of GUIs.


Can't you make the argument that the web killed consistency in UI design a long time ago?

The web has always been a completely blank slate from a UI perspective, and what happened was that de facto UI/UX "standards" continue to grow and evolve. Someone will always try to do something completely different. But that's okay; it pushes the boundaries a bit. You'll still have a lot of apps that use the consistent, tried-and-true UX methods, but some will flout these and push where UI can go. Path and the Band of the Day apps both do this beautifully. Both are non-standard but very easy to navigate.


I think Twitter Bootstrap and Zurb Foundation are trying to bring some of that consistency back into the Web. Even completely unique applications with their own design and functionality are easy to navigate when they start with these frameworks.

For example, Roll20 [http://roll20.net/], an online RPG tabletop, is a full-featured web app that is easy to use even though it's quite complex, partly because they worried about designing a custom UI on the macro scale, leaving the micro scale (buttons, form elements, etc.) to Twitter Bootstrap.


I totally agree. The whole idea of responsive design that those types of frameworks promote is exactly the type of de facto standard that arises out of the mishmash of web UI/UX.


Initially all apps looked and felt different. It wasn't until apps started to live within a window on the desktop that consistency became a popular topic and concern.


My thoughts exactly. Every piece of software on the 8-bits and prior had a drastically different feel, even given the technical constraints.

But in the wake of MacOS and Windows 3.1, "native look" suddenly became a highly-touted feature. I think now that this was a result of low experience with desktop apps leading managers to "fake it till they made it" - presumably the OS makers invested more time in UX than the average app developer, so aping look+feel would get them a little closer to a "good" app.


Up until recently, native controls, limited font choices and limited bandwidth (and, further back, limited colors) constrained what designers could do.


Limited? I wish. Remember all those sites with the big cyan comic sans image text banners? I'm glad the web's wild days of experimentation are a decade in the past.


Limited indeed. Comic Sans was one of the ~11 web-safe typefaces that you could count on pretty much everyone to have. Now that there's a bunch of good-looking embeddable typefaces, people don't face as much temptation to use shitty Comic Sans.


I think he is mourning a problem that needs no solution. Let me put it this way: my wife, who has very little understanding of how to operate computers (to make capital letters, she uses capslock / letter / capslock), literally figured out how to use her iPhone 4 within 4 minutes of turning it on.

If the problem with the UI is that it makes it harder to use, then I don't think that consistency is the problem. Lest you believe that I have a sample size of one, I have had my entire family use the iPhone, and they aren't particularly computer/gadget savvy. My father, for instance, uses his Nokia "phone book" frequently, however that "phone book" is a taped-on list of numbers on the back of his mobile phone. And with a little prompting, he was playing Angry Birds within about 10 minutes. My young daughter, 4 years old, has worked out how to unlock the phone (to my chagrin) and plays "Peppa Pig" at odd hours of the night.

If anything, the darned iPhone interface is too easy to use!


My "old school" of UI dates back to the 1980s, and through the mid 1990s.

To my eye, UI shows elements of program legacy. From Apple (early Mac) to PC (DeskMate, Amiga, MS DOS), early Windows. On the Unix side, a proliferation of X toolkits: Xt, Athena, Xaw, Motif, Tk, gtk, Qt.

For desktops, one of the beauties of X is the emulation modes of many window managers -- if you prefer the look and feel of Windows 95, Amiga, BeOS, Motif, CDE, VUE, old-style Mac, Mac Aqua, or any of several dozen other desktops, it's possible to try them out easily. While not all of these emulate deep characteristics of their source GUIs, there are at least some characteristics which are available for review.

Even CLI/console tools show their legacy. Emacs draws from TOPS/ITS, mc from DOS via Norton Commander, dd's odd command-line syntax will be familiar to those who've used JCL. Even native Unix/Linux command sets draw from various BSD (short options, single dash, single character) and GNU (long option, double dash, word) formats. More recent commands with a weaker Unix philosophy legacy use a single dash and even MixedCase words (ImageMagick, some disk utilities).

While inconsistent among tools, the formats do give me a clue as to what the legacy, and possible behavior, of these tools will be.

In the mobile space, minimized UIs are somewhat necessary. Interestingly, several of these were pioneered on Linux (e.g.: Matchbox). This does not mean that there's no room for exploration of useful UI metaphors in a new and highly constrained form factor.

Of possible interest: http://en.wikipedia.org/wiki/List_of_widget_toolkits


The thing I hate most about modern UIs is the lack of information. Instead of buttons with text, it's just a whole bunch of mysterious icons. Give me at least some idea of what will happen when I click your icon, and please try to make it look clickable, rather than like some background glyph. Oh, and mobile OSes need to come up with some sort of analog for keyboard shortcuts.

Consistency is nice, but usability and discoverability are more important.


Google seems to be worst with this. In addition to having confusing icons, they're not even consistent with themselves. The Gmail archive icon on my Android phone is a picture of a filing cabinet. The archive icon on their main web app is a box with an arrow pointing down. They're two completely different icons for the exact same action.


They balance that by elsewhere using the same icon for completely different actions, like the sealed envelope that means Compose as well as Mark Unread.


Agree! And it's getting even worse with all smartphones.

1. Icons are used far more often on smartphones to save space.

2. You don't have any mouse cursor to hover with and get a tooltip; you are left with the option to gamble and click, or never try the button.

The Gmail Android app is horrible in this sense: "send" is just some weird arrow in the top right corner; I first thought it would open some menu. Fortunately it showed a confirmation dialog first... The rest of the icons aren't much better.


Windows Phone has a workable solution for this. Icons are shown for menus, but if you don't know what the icon means you can tap the ... at the edge of the menu bar and it pops up with some text underneath.


The ICS browser is the worst offender here. If you enable Labs mode, the address bar disappears and you need to gesture-swipe from the screen edge to get a semi-circular menu. Now, this in itself is a brilliant idea, only marred by the fact that it's a bunch of icons that you can't understand. I'd love it if they added a big semi-transparent textbox at the top informing you which option is currently under your finger, something analogous to the letters that pop up when you're scrubbing through the contact list's alphabetical index.


I guess you'll like that Windows 8's Metro has labels underneath its icons, then.


It just doesn't make sense for mobile to have keyboard shortcuts; the analog would be that you touch the button, rather than bringing up a keyboard and typing shift-zz.


I'd like something physical I can feel and use muscle memory to remember. Having to search the screen, interpret some obscure glyph and make sure I touch the right spot requires far too much attention for most tasks.

Volume down + left swipe to save a file. Camera button + screen tap to go to the app's home screen. Something like that, so I don't have to look. I'm not entirely familiar with screen gestures, but I guess that's the closest analog, so long as they aren't too complicated or picky.


The author is focusing on the wrong details. As long as the workflows and interface element positioning and shape are kept consistent, the visual theme is of little consequence. As long as the back button is on the upper left of your iOS app and has the right shape, it can have zebra stripes for all your user cares.


I couldn't agree more with what this guy is saying.

But, mobile applications aren't actually the worst offenders here. At the very least, both platforms have their guidelines for developers to follow if they want to make their apps consistent with the rest of the system. And I wouldn't be so sure this is not important: snazzy looks might gain you more installs, but UX that silently and subtly uses established patterns fits more firmly into users' minds and may noticeably increase retention (and thus the number of positive reviews).

The bigger culprits in terms of violating UI consistency are, I think, web applications. As far as I know, you don't even _have_ any definite guidelines there to begin with, which is of course due to the decentralized nature of the web ecosystem. There are some efforts that encourage consistency as a side effect (Twitter Bootstrap comes to mind), but the decade-old habit of designing every website completely from scratch does not seem to be dying anytime soon.


But mobile OSes are constantly changing the guidelines. Every new phone has a whole new UI language. Gingerbread, ICS, vendor mods, etc.


"As a proponent of what is now called the old school of UI design"

Doesn't sound like a very old school to me. vi, emacs and a large portion of popular command line tools came about in an era where devising one's own interface and interface conventions was the only route to actually providing an interface. Desktop environments ameliorated this somewhat, but only for common, generic tasks (copying, pasting, etc.). For any specialized or application-specific tasks it's always been roll-your-own, except that now people are actually thinking about the problem before creating the UI.

Consistency is important, but it's not the only factor.


An article like this coming from a site that has not updated its UI in what looks like 6 or 7 years rings alarm bells for me. The UI of the blog sticks out like a sore thumb; my eyes almost jumped at how different and compressed it looked.

I really disagree with this guy's thoughts. When I'm in Twitter, I want it to look like 'Twitter'. I want to know I'm in Instagram, or Facebook, and see the difference. Imagine if all apps used the exact same UI solutions. How would innovation happen? It just doesn't make sense on so many levels. Diversity is a good thing.


I remember when there was consistency in UI design: everything looked like Windows 95


Far more recently, everything looked like Gnome 2. It was nice because I could launch an application I'd never used before and know how various elements would behave.

* Certain shortcut keys would be consistent across applications

* Other shortcuts would be customizable by hovering over a menu item and pressing the shortcut keys.

* Listboxes could be filtered by typing and a little floating textbox would show what I'd typed.

* If I was in a dark room and was using a dark theme, I (probably) wouldn't suddenly have a bright white window blinding me.

* The file selector for opening/saving files would be standard.

* I'd be able to edit things on networked locations open in Nautilus.

And so on and so forth. There were all sorts of nice little touches that everything would have just because it was a Gnome 2 application. Going back to Windows XP, it always felt like everything was competing for my attention by being different in assorted inconsistent ways.


One of my favorite things about Mac OS is the extremely consistent shortcut keys across all apps.


Actually, Windows 95 looked better than KDE does today. So... yeah, I probably wouldn't mind if my current desktop looked like win95.


:shudder: Good point.


I think this is missing an important point.

The consistency in Android and iPhone is the touch screen. With abstraction removed, and with limited screen real estate and thus a limited scope of applications, I don't believe this is an actual problem.

In fact one could say that because of the above, exploration and inconsistency should be encouraged.

In the future apps won't be so much about on-screen interaction but rather about representing information based on automated real-life interaction. (Lots of automation in the coming years.)

The chrome will be all around you and not on the screen. Only the object/content will be on the screen (and the many screens).

So sure, apps are not UI-consistent, but neither are a hammer, a screwdriver and a ruler (i.e. apps).


This is going to be a serious problem for the adoption of web apps. Current problems with web apps are things like the lack of keyboard shortcuts (without the browser getting in the way), browser chrome wasting space in the UI, and an overall more "laggy" feeling than a native app.

All of these problems, however, can be (have been?) improved upon. UI consistency will be the last great bastion: whenever I use a web app I have to spend a considerable amount of time figuring out where everything will be in the UI, and this is even true for things like Gmail.

Perhaps a de facto standard widget set for web apps will emerge?


A de facto standard already seems to be emerging with Twitter Bootstrap. Give it another few years and I expect this will spread into more well-known apps (or the current startups using it will become more prominent).

And I do think people are addressing the other issues. Chrome's Web-App API can open pages in new windows without the browser UI, and Firefox is due to have their similar API available soon.

Things like keyboard shortcuts will likely sort themselves out eventually: for example, GitHub's issue tracking tries to mirror the keyboard shortcuts of Gmail wherever the actions are similar.


Thank goodness the sentiment in this rant is not more widespread. The author would prefer that app designers try to avoid standing out? Good luck with that. And really, is it that bad? Functional apps frequently make good use of prevailing design. And consumer and entertainment apps absolutely should be exploring. The form factor is so limited already that it's much harder to go very far astray.


I agree that consistency of UI is, broadly, an important part of general usability, but I think that the web has been more unforgivably destructive towards this than mobile, which is a younger and inherently more constrained environment.

This is why Bootstrap is so important: it gives the web the consistent look and feel that has been missing for so long.


Isn't this the same drum that has been beating since the days of desktop applications... for those old enough to remember when applications were actually run on a desktop OUTSIDE of a web browser ;).

Actually... even past the dark ages, does anyone remember the popularity of webpagesthatsuck.com?

No matter what the technology someone will need to do something that is out of the bounds of the 'stock user interface'. So you either account for it by allowing everyone to do anything or you limit it...

http://www.useit.com/jakob/constbook_preface_2nded.html


I would argue that innovation and trying new things are always a net positive, given enough time (i.e. on a long enough timeline), because new standards and best practices emerge -- standards and practices that are better than the previous ones. Yes, it requires effort and debate to get to that point. But it's better than becoming stale and irrelevant, IMO.

In fact, I think a new set of best practices is already emerging in mobile app development. You just have to know where to look.


Not a bad article, but I don't believe this is the big issue that he makes it out to be. Mobile apps are far less sophisticated than desktop ones. I have a Galaxy Nexus, and sure, there are differences between apps and between things like where the Menu icon is. The reality is, though, that you figure them out; it takes an extra second or so the first time you use the app. It's not the end of the world.


I think it's just a matter of the rapid pace of innovation in the mobile space right now. The common pattern in innovation is to diverge and then converge. We're already seeing the convergence around a couple of paradigms like pull down to refresh and sidebar menus and we're going to see more of them as we slowly figure out what's the best solution to a design problem.


I for one am glad that Tweetie, Path and Facebook broke iOS UI conventions and innovated. Pull to refresh, stacking views, swipe to show options, FB's peek-a-boo home screen, etc. If not for these innovations, we'd all be stuck in tedious UINavigationController and drab UITableView hell. Everything would look like the Settings app.


Anal.


>It's been one of my major pet peeves on both Android and iOS: the total and utter lack of consistency. Applications - whether first party or third party - all seem to live on islands, doing their own thing, making their own design choices regarding basic UI interactions, developing their own non-standard buttons and controls. Consistency died five years ago, and nobody seems to care but me.

Yes, it's called evolution.

As the saying goes, "A foolish consistency is the hobgoblin of little minds".

The consistency he asks for is that foolish consistency --where, e.g., buttons all look the same, all colors match, applications have a cookie-cutter look, etc. I.e., strictly following some platform's design guidelines.

Not only would that KILL the development of new UI concepts (a lot of widely adopted ideas come from some app experimenting with new look & feel ideas), it would also make any OS look drab and boring.

Consistency that should matter would be more like: drag and drop works everywhere it should, text copy/paste plays well from app to app (remember KDE/Gnome/Motif/what have you around 2005? I hope it's better nowadays), keyboard shortcuts are respected, etc. Generally, controls should MOSTLY work the same (with the provision for experimentation from the occasional app).

As for things like the color of the toolbar, the look of buttons, etc? Not so much.





