Dark patterns are everywhere and I'm afraid not enough people are raising a ruckus about them. For example, the "Google Pay" app in India, which works via the UPI system, shows its own "scanner code" in the app instead of the standard UPI QR code. Furthermore, you can't discover where the UPI QR code is if you want to pay using BHIM - an app that works directly with UPI payments. If you're using GPay too, then you can access the scanner code. If not, you used to be able to select the UPI QR code from a menu and scan that. However, later updates have hidden that - not eliminated it, mind you, deliberately hidden it. You have to "swipe left" when you see the GPay scanner code to get to the QR code - not at all a discoverable act. My wife discovered it by accident after we were puzzled a few times about where the QR code that used to be accessible went (both of us use BHIM). As a consequence, folks who use GPay do not know about UPI. This is, in my book, malicious design. Hanlon's razor would ask me to try "stupidity" first. But no. This isn't stupidity. It is downright malicious (the benign term being "dark pattern") to do this .. at least at Google's level.
When I want to really convey my point in a discussion, I call it a "manipulative pattern". People feel more implicated when you point out to them that their behaviour or actions have been actively manipulated.
"Human exploitative pattern" or "human manipulative pattern" makes me instantly pay attention and on edge. They are proactively exploiting/manipulating our brain's cognitive processing/award system/neural addiction pathways on a subconscious level and massive scale, which needs to be identified and categorized for all to see... cause we didn't evolve any cognitive firewalls to stop bad actors from intentionally nudging us to make bad choices that they profit from while pulling our levers that they gathered from data signals to dial up the deception and trickery to a science that predicts our probable actions to a degree of accuracy that is terrifyingly accurate.
This is just like the marketing and advertising industries playing emotional music and showing humans doing things that pull on our "heart strings" to get us to buy the crap they are peddling. We need protections from tech company dark patterns and cognitive processing manipulation/trickery.
I prefer "deceptive pattern". They try to trick you into thinking some option doesn't exist, that some default is the only way to do things, that you'll benefit from some harmful thing, etc.
We need an objective way to tell which things are manipulative. And to whom. Also, ads are manipulative too (for many people), but do they belong to the same category?
The "manipulative" here refers to the UI actively or passively lying to you, and slightly nudging you towards doing something you do not want.
Ads, or the way ads are presented can be manipulative too - e.g. articles which pose as informative, but in fact are written just to push a certain product. But if the ads are presented overtly, they are not manipulative in the sense that is discussed here.
I would favor “anti-user”, “user-exploitative”, or just “deceptive”.
There’s nothing inherently wrong with something that’s “pro-business”, and in fact I see “user-centric” or “user-driven” as being good business. “Pro-business” is bad when it comes at the cost of the user/customer/employee/etc.
Hostile pattern, à la hostile architecture. Though note that in the sense of hostile architecture, "hostile pattern" would refer to a superset of dark pattern, one that can also cover patterns that beneficially manipulate the user by making it difficult to do things they don't want to do anyway (e.g. turning Wi-Fi back on in phones some time after a user turns it off due to a poor connection).
The point of the term "dark pattern" is to stop short of calling it malicious, deceptive or downright fraudulent. That's thin ice.
If this article was released with the term "malicious", the Amazon lawyers would come knocking. And the author would be in big trouble, because it's very difficult to prove malicious intent.
ok, malignant. Are the Amazon lawyers going to argue that Amazon is not actually inimical to human existence and an invention of the devil so malignancy cannot be proven?
Let's call it Racket UI, and the practice UI Racketeering.
I mean, they get to say that infringing on intellectual property is piracy.
From Wikipedia[0]:
> However, according to the original and more specific definition, a racket or racketeering generally involves extortion or criminal coercion. Originally and often still specifically, a "racket" in this sense refers to an organized criminal act in which the perpetrators fraudulently offer a service that will not be put into effect, offer a service to solve a nonexistent problem, or offer a service that solves a problem that would not exist without the racket. Particularly, the potential problem may be caused by the same party that offers to solve it, but that fact may be concealed, with the specific intent to engender continual patronage for this party.
It fits what they're doing with "dark patterns" much better than torrenting a movie fits armed robbery on high seas.
I think they need to clamp down on these sorts of things. We need to advocate for the common users who are susceptible to these exploits optimized to trick the human brain.
On your Android phone or tablet, open the Google Maps app .
Touch and hold an area of the map that isn't labeled. You'll see a red pin appear.
You'll see the coordinates in the search box at the top.
If you want to be amazed at what can make it into official docs, I present to you the OneNote doc page for Find and Replace [0]:
In OneNote, you can use instant search to find specific text, and then replace it with different text using a keyboard shortcut.
1. On a blank page, type the replacement text that you want to use. For example, if you’re trying to update a project name in your notes, type the new project name.
2. Select the text you just typed, and then press Ctrl+C to copy it to the clipboard.
3. Press Ctrl+E to expand the search box in the top right corner of the OneNote window.
4. In the search box, type the text you want to find.
5. At the bottom of the results list, click Pin Search Results, or press Alt+O.
6. In the Search Results pane on the right side of your window, click the first search result (a text link next to a white page icon) to jump to the page where OneNote has highlighted the text it has found.
7. On the page, double-click each highlighted occurrence of the text, and then press Ctrl+V to paste your replacement text over it.
8. Repeat steps 6-7 for each additional page in the search results list.
Hey, OneNote is supposed to be a stable product, you can't expect them to just go and import some random feature from a bleeding-edge experiment like Notepad!
Almost all new legitimate uses of a GUI these days look like a workaround. Why are Edge and Chrome and Word and... stealing my window's titlebar?
Why can I not access the favorites menu with one click as I used to? Why do I need a big label which uses 30% of the screen and does nothing? Why does everything look like a label?
What's interesting to me is that those who shape the platform guidelines (Google/Android, MS/Windows; I don't have Apple so can't say) are often the worst offenders, completely disregarding them.
Just as an example, I've turned off auto updates on my phone and inspect each app update individually, but the Google apps often just say "bug fixes and performance improvements".
When that's the case, such general guidelines become useless. "If they don't follow them, then why should we?"
They don't really mean anything on desktop anymore, seeing as only a small minority of macOS apps are built using those frameworks. Everything else is either Electron or custom cross-platform UI frameworks like Adobe Suite, Ableton, Cinema4D etc.
I went the opposite way. When Google started deliberately making Maps harder to use without following their preferred usage pattern (removing 90% of street names etc. so you have to use search and directions for everything) I stopped using Maps. I'm now almost 100% degoogled and while it's less convenient sometimes, I feel so much less icky.
I think it depends on how it's linked/embedded. The street names showed up fine for me on the main site just now, but embedded versions on company websites etc. often only show 1-2 street names and leave the rest blank. Same with the mobile version iirc.
One could argue that removing the UPI QR code from the app would be the truly malicious act .. and at least they're keeping it in. If they could do that, they would. NPCI regulations require these payment services to interoperate via UPI. If they hadn't required that (I love NPCI and their work), you can be guaranteed that the UPI code would not feature in the app.
swiping became popular as smartphones matured. more than one monitor on a PC became popular around the same time. suddenly some cool genius transvented the concept of accessing a useless second desktop on a teeny tiny screen. his golf buddy thought it would be useful to hide ugly, pesky clutter there. hey presto! your grandma is using snapchat gestures to accidentally delete bridge club emails
the three bar menu button is more intuitive than this hidden-yet-integral functionality swiping bullshit, and you can hide an infinite amount of submenus behind such a recognisable icon. why are we still swiping?
don't get me started on phone gestures on laptop trackpads. they should come with CTRL-0 laser-etched on to their flimsy scratchplates
On Gmail swiping from left to right can either open the sidebar or delete the email you hit, depending on how far from the edge you started. Fantastic.
Google has been going rogue for a long time. But in terms of UX, this isn't necessarily a dark pattern. IMO, there is a bad smell, started by Google (particularly Chrome, if I remember correctly), which tends to hide menus/actions by moving them under the burger/3-dots menu. Such behavior saves a little bit of screen real estate but adds one extra click to finish a particular operation, and the bigger problem is that it hides the available options from the user and in turn makes applications less intuitive.
Nope. The presence of the burger icon still affords discoverability, even if it's a little less convenient. I can still see it and go "oh, here is an icon, let me see what comes up if I tap it". In the GPay case, you see nothing and so assume there is nothing beyond what you're seeing. The thing is, there used to be a menu you could tap to help get to the QR code. It was removed.
Same story in a lot of iOS design. There used to be a button where you could see all the options: now there’s some magic gesture for everything. Wtf, just show me a list, show me options I don’t yet know I need or wouldn't know what to call, show me what you can do.
A simple example is how you can slide some list items to expose a delete button. Yet only some lists support that. So you’re apparently expected to slide every list to see if it supports that paradigm. Instead of doing something like normalizing a small arrow on list items that support such a thing. Everything is hidden.
Burger menus are great. You’d never be able to fit all those options on screen at once, nor would it make sense for rare actions. But the burger menu gives people a place to put actions on mobile UI where others, like Apple, would just remove them and gimp the mobile experience.
It’s why all of Apple’s first party applications are so bad, like the podcast and news app. There are like zero options. Just add and remove and consume.
A similar recent trend that really bugs me: apps on macOS that run in the background, with a presence in the menu bar at the top of the screen, used to be closable with a single click on the icon and a move of the mouse to the last item in the menu, which always used to be "Quit".
Now it's hidden under layers of menus on purpose.
The worst experience I've encountered is Adobe Creative Cloud. They force you to open the app with the icon, then use the main menu of the app on the other side of the screen to choose "Quit Creative Cloud" to then be presented with the following choices:
(Cancel) (Quit) (Hide)
"Hide" is highlighted and "Quit" is the one you just asked for. Perviously that step would pop up a nag screen begging you not to quit the app, but that no longer happens.
They are obviously testing various ways to become "stickier". And with each iteration it gets more and more difficult to close. I just want my RAM back please!
I won't call this particularly malicious .. but even Chrome is in this game - it asks you to "hold Cmd-Q" on macOS to quit Chrome .. whereas every other app closes when you type Cmd-Q. My respect scales are tilting way away from Google, and Microsoft is looking pretty good relatively.
My least favorite google UX is in youtube. It's almost impossible to swipe backwards, and videos (ads) keep playing after back swiping multiple times. Even after swiping back to the home screen. It plays the ad over the home screen, and you have to click a tiny x to actually stop the ad.
It's also worth mentioning that the misconception that UX is solely about aesthetics is getting dangerously more common, even among professionals referring to themselves as UX designers.
Aesthetics is important, but, in some sorts of systems, it has much lower priority than other non-functional requirements, like overall speed, maintainability, testability and so on.
I don't know if my experience is the same as most of yours, but I've seen more and more systems getting aesthetically pleasant but painfully slow and/or buggy these days. It's easy when you can simply quit using the software, but when your bank, your broker, your airline, your accountant, the company that operates the subway ticket machines - when those sorts of businesses start to embrace such a vision, things start to get complicated.
And, unfortunately, all those examples I mention here are real and this is not an exhaustive list.
Besides, I'm not sure those companies (at least the ones in my examples) do that as an attempt to exploit users. I believe they do that due to plain lack of competence.
The biggest issue to me is the dumbing down of an experience and design, treating your users as incompetent vs. treating them as intelligent and able to figure things out; there's still bad design that can occur when things are complex, but good UX that presents information appropriately and contextually can create a rich but easy, intuitive flow.
The slowness of certain platforms is an equally big issue. Reddit, for example, seems to purposefully ignore and allow its browser experience to suffer, in an attempt to drive people to its app for ad tracking purposes, which I think is shooting themselves in the foot given the trend towards eliminating shallow, manipulative advertising from existence.
If you have the opportunity of visiting a company still using COBOL, Clipper or some other text mode software, ask their users for an opinion on those old systems compared to the current ones.
Most of them prefer the ancient ones. When asked why, they simply say "I don't know. It works, it feels right".
Even the younger users (who already had experience with web applications, big tech's stuff etc when they joined the company) prefer the text based ones.
I'm not at all implying that every software should be text based. But I think there are some very important lessons to be learned here. Things we developers used to know, intuitively.
I think UX designers should listen more to those users and less to theories and to Google and its MDC trying to make the internet look like their own products.
I don't think the issue is text based or non text based. In my opinion, the problem is TOOL versus ASSISTANT. A tool is easy to understand and does one thing, an Assistant is more hand waving, tries to guess what you "really" want, steers you away from error and lures you into something.
Can confirm. Started out writing code in IDEs, now I'm using Vim and CLI utilities. It's a polished, stable ecosystem that doesn't hide anything from the user and isn't incentivised to do anything other than its purpose, and that becomes very noticeable once you're familiar with it.
It doesn't come with file sync, plugin management or flashy animations. But I also don't get loading screens, don't need to look at SaaS pricing tiers and don't get overwhelmed by UI dark patterns. That's a trade-off I'll take any day.
One of the great things about command-line interfaces of old is that you could often enter your keyboard input faster than the system could respond to it, and it was OK because the system buffered it up for you and executed it as it caught up.
So if you know on this screen you need to hit F1, and that on the next screen you will have to hit X, then on the following screen, you need to type in the SKU number, you could simply issue as fast as you wanted: F1-X-4112295[ENTER], and take your mind off it while the system would just go through the whole experience successfully at its own pace.
With "better" GUI/mouse based systems, you have to click here, wait... wait... wait..., now click there, wait... wait... wait..., then click there and type in your SKU. Your attention needs to be on the screen at all times in order to wait for the UI and aim the mouse.
Next time you go to your bank web site to pay a bill, think to yourself: Wouldn't it be nice if, rather than navigating their cumbersome UX, you could just hit a known string of keystrokes that you memorized, hit enter, then walk away to get coffee while all the Javascript and page reloading chugged along and paid your bill?
So much this! I've noticed how common it is that interfaces can't keep up with me when, by all rights, they should be able to! I'm a limited, slow human, while their instructions execute billions of times per second. And yet I'm constantly running into scenarios where the next screen comes up and I try to give it an input, but it's not ready. Or when I hit the next-preset button for my car radio, and yet it has to pause to load that station before I can use the button again.
Yea. I have 48 cores on my workstation at work, dozens of gigabeans of RAM and an ultra-mega-terra-gigabit ethernet connection, and still I sit there watching spinners as websites composed primarily of text and images struggle to load.
Right, any desktop app where you have to wait for a non-stupid operation (like a Gaussian blur on a 50mpix image) makes me want to throw my computer through the window. It's 100% unacceptable, every interaction should have its result in the very next frame, and we have all the tools for that.
That would be nice but it is impossible, because to actually commit some money I have to go through 2FA involving the bank app on my phone and enter one or two other secrets on the bank website. This is Italy, and probably most of the EU since last year.
I understand this is for the safety of my money, probably also for the safety of the bank itself (if something goes wrong it's more and more my fault) but sending money to someone is becoming a PITA.
Even checking your balance from your non-mobile device is quite a pain now. It's much more likely for me to miss a fraudulent transaction now since I no longer login on a daily basis.
"So if you know on this screen you need to hit F1, and that on the next screen you will have to hit X, then on the following screen, you need to type in the SKU number, you could simply issue as fast as you wanted: F1-X-4112295[ENTER], and take your mind off it while the system would just go through the whole experience successfully at its own pace."
There actually are a few plugin managers for Vim. I think I used Vundle once upon a time, but honestly don't recall.
My biggest pet peeve with vim and emacs is that (a) discovery sucks, as with almost all text mode tools, and (b) emacs in particular is slow and difficult to configure to get some mainstream languages working. Doom was the closest I ever got to being happy with my setup, but the tooltips and autocomplete were still objectively inferior even to vscode (for typescript and react, at least).
Lisps are a different story, but not by much. I'm actually fairly happy with using vscode to write common lisp, though I don't do much of it these days.
Sorry to come at this late but there's discoverability-by-doing and I think it's clear from general user feedback that emacs doesn't have it.
I agree it's very well documented, and I assume your impulse upon reading the previous paragraph is to tell me there's a REPL. it's simply not casually approachable. sorry. ask someone who isn't drowning in koolaid!
Also, for Vim, `:<tab>` gives you all available commands in the current buffer and `:h command` gives you the vim help page for that command (which is usually at least a few sentences long, and pretty thorough).
> Also, for Vim, `:<tab>` gives you all available commands in the current buffer
Don't misunderstand me, I love vim and use it almost exclusively and every day, but tab-ing in the way you describe yields 2-3 screenfuls of commands that I have to manually (well, with my eyes anyhow) scan to find what I want, and it includes gems such as: "spellrare" and "{{{{{{{{"
The latter of which I do not know how to open a help page for.
To me "discoverable" would mean that I would be able to use this interface to learn more commands, but this is only as discoverable as having to read man-pages. Sure - that is a great way to learn, but it's not discoverable.
Now, what would probably get a little closer is if `:<tab>` opened a list of commands in an fzf-style list, also showing their descriptions and maybe grouping them by some other way than just alphabetical sort order.
(If anyone can tell me what is up with those '{' I would greatly appreciate it)
For a network of sites I am developing I am planning a command prompt where users can type /command for basically every function: /msg username, /logout, /search phrase, etc.
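A rough sketch of the dispatch side - just an illustration of the idea, with placeholder handlers rather than the real functions - could be as simple as this:

    // Parse "/command arg1 arg2 ..." and route it to a handler.
    const commands = {
      msg: (args) => console.log("message to " + args[0] + ": " + args.slice(1).join(" ")),
      logout: () => console.log("logging out"),
      search: (args) => console.log("searching for: " + args.join(" ")),
    };

    function runCommand(input) {
      if (!input.startsWith("/")) return false;            // plain text, not a command
      const parts = input.slice(1).trim().split(/\s+/);
      const handler = commands[parts[0]];
      if (!handler) { console.log("unknown command: /" + parts[0]); return false; }
      handler(parts.slice(1));
      return true;
    }

    // Wire runCommand to the Enter key of the site's prompt input; for example:
    runCommand("/msg alice see you at 5");
    runCommand("/search dark patterns");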
I did something similar for a web app at work and the people on the floor really liked it. It was much faster than trying to mouse to things, especially when half of their task is scanning barcodes anyway.
Interesting! I had a similar idea but as a library to drop into existing sites, perhaps even be a browser extension. Vimium does a great job of enabling keyboard control over a website but I think a generic web CLI could take the experience even further.
When I started at my current employer we were running our retail management software on DOS terminals connected to a SCO Unix box. My experience with the system was that it wasn't intuitive or easy to learn, but was pretty bulletproof for the veteran users who had been working with it for ~30 years. I learned as much of it as I needed to know, and got to the point where I could do some tasks without even thinking about which keys I was pressing.
I helped set up the replacement system and received a lot of complaints from those same users - some resistance to change, but also some legitimate failings of the new software. Tradeoffs in the network structure made it run more slowly at satellite locations, Windows 10 on all of the terminals caused network issues until we got updates and bandwidth moderation dialed in, and some employees needed additional practice using a mouse.
I see trade-offs between the old and the new here. The new software makes it easier to train new users (it's less arcane) and is able to securely handle things like chip credit cards (which had previously needed a separate card terminal system) and charge accounts. At the same time, the old system was faster (for the limited features it provided) and was able to run more smoothly on older equipment (very little OS overhead, text-only network transmissions).
Text-only as opposed to encrypted text and images.
As far as I've been able to make out, document images are generated on the server and sent to the terminals as PostScript files. I've noticed at satellite locations the receipt printers hesitate for a few seconds before printing the barcode at the bottom, which tells me that either 1) it's taking some time to transmit the barcode image over the internet or 2) the server is taking its sweet time cataloguing the transaction in the document archive. Either way, I've gotten complaints about it from the users.
I used a 3270 terminal for time registration many years ago, and it was 1000 times better and faster than all the web apps that followed. So I can confirm this.
I'm working freelance and built my own CLI for this exact purpose. I've been wanting to redo it as a web app, but I honestly like my CLI tool better. It lives in git, and GitLab CI handles generation of pdf/excel reports and uploading to Dropbox. Inputting new data is just so damn fast compared to opening any web app!
Some of the worst offenders in this area were the preinstalled apps on Windows 8. Their official OneDrive app had I think 1-2 options to toggle and didn't even show how much of the cloud storage was used/free. It was basically a picture viewer but without properly working right click.
And in terms of performance I fear slow, ad-filled websites slowly trained users to accept bad performance in applications outside the browser as well. At least I don't know how to otherwise explain the widespread indifference toward those things, often even among developers.
> And in terms of performance I fear slow, ad-filled websites slowly trained users to accept bad performance in applications outside the browser as well. At least I don't know how to otherwise explain the widespread indifference toward those things, often even among developers.
Yeah, I don't know either, and I think yours is a very good hypothesis.
I'm currently working on a Windows project after spending four years exclusively on Linux, using mostly vim and Eclipse. Visual Studio feels now way worse than it used to. You get more conscious about the time you're wasting.
And, most interestingly, the same happens to users. It's not a matter of being text based, it's a matter of getting useful stuff done, and fast.
Users can use the keyboard, and actually enjoy it. Users don't want to be delighted with our stuff (speaking of the enterprise here), they just want a system that helps them get their job done and, after that, gets out of their way.
Designers, and more and more developers, seem to forget that animations, complex colorful screens, all this stuff has to be built, tested, and processed by the user's hardware, which is already having a bad time supporting Windows. So, instead of shipping something fast, something we are able to build and fix and test quickly, we deliver something detrimental, in many ways, to the user experience.
> Visual Studio feels now way worse than it used to. You get more conscious about the time you're wasting.
Visual Studio is the perfect example to me. I begged my dad to upgrade the computer to 64/128MB of RAM to get VC++6 to run. It was feature complete then, and maybe a bit earlier. Sure, supported languages and syntaxes have come and gone, but the core product feature set (editor + intellisense) has not changed in over 20 years, yet the current version takes an order of magnitude more memory and is slower, despite my desktop being more powerful than anything I could have imagined back then.
> It's not a matter of being text based, it's a matter of getting useful stuff done, and fast.
Sometimes this is about being composable and scriptable, text based generally helps here.
> Sometimes this is about being composable and scriptable, text based generally helps here.
Absolutely. This is one of the main reasons I'm using Vim - not only because of the program itself but how I can just combine it with the multiplexer I want, in my preferred terminal emulator, interfacing with other text-based tools. Many things I do regularly would require me to write custom plugins in VS Code. In Vim I can just bind keys to an arbitrary sequence of inputs - something modern applications sorely lack.
It's limited, but that limitation severely reduced the complexity of the interface - it's all just a string and bash is, despite all its weirdness, an extremely productive string processing language.
I think this is why the spreadsheet is the killer productivity app. It's just a keyboard enabled information management tool, power users can take a spreadsheet really, really far. Entire businesses rest upon spreadsheets.
That is coming from someone who considers the need to use a spreadsheet as a marker that my job scope has gone awry. Love spreadsheets, hate managing information.
Imagine if Excel had 1 inch padding around any cell, a gray large font on a light gray background, no cell borders and autohiding row and column labels because "cleaner and modern UI" :-)
I really wish spreadsheets had a way to loop over rows and columns. Currently I have to create a huge mess of ARRAYFORMULAs whenever I need to apply some computation to all rows. I use my spreadsheets as databases and I want them to calculate new information as soon as I input new data. This use case should not be so hard...
Yeah, using an actual programming language would probably solve all this complexity. I'm using Google Sheets though and I absolutely need everything to be fully functional in the mobile app so I'm not sure how far I can take it with the javascripting. The app doesn't even support editing pivot tables.
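As a rough illustration of what the "javascripting" route could look like, here's a hypothetical Apps Script custom function (the PROFIT name and the qty/price/cost columns are made up, and I haven't verified how reliably custom functions recalculate in the mobile app):

    // Define this in the sheet's Apps Script editor, then call it from a cell as =PROFIT(A2:C100).
    // A custom function receives a range as a 2D array and can return a 2D array,
    // so one call fills a whole output column without nesting ARRAYFORMULAs.
    function PROFIT(rows) {
      return rows.map(function (row) {
        var qty = row[0], price = row[1], cost = row[2];
        return [qty * (price - cost)]; // one output cell per input row
      });
    }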
Makes me think of a system/application getting attention and reach that it doesn't deserve - hasn't earned - a negative of the advertising industry as well. One reason I love Tesla is that they don't advertise; any attention they get has been earned.
Part of the issue is inter-team communication and part of the problem comes from that very phrase "dumbing down". I often hear people argue for "dumbing things down" when there is an overwhelming amount of information, but that phrase:
1. Treats people's intelligence as 1-dimensional, which it is not.
2. Gives no clear direction to an engineer who is passionate about UX, recognises that humans are multidimensional, and is trying to ask about what their users really need.
You might say I'm arguing semantics, but communication matters!
Without it, you can write code at high speed, but you won't learn what direction to go in order to present information appropriately and contextually to create a rich but easy, intuitive flow.
In order to get my reddit addiction in check I uninstalled the app and started using it in the browser. After 1 week I quit reddit forever. Never looking back.
Reddit is an example of a good dark pattern for me: the popup reminds me that I need to leave the site. Unfortunately, like the Nigerian Prince Paradox, upfront obnoxiousness is often designed to filter people out early, if they risk not being gullible enough during the rest of the path.
May I ask what you think is wrong with Reddit's UI? In my opinion, it's one of the best out there and I've never had problems with it. HN is better, but mainly because it offers far less functionality.
Facebook is for me the prototypical example of a bad UI with lots of dark patterns, confusing navigation, different settings scattered all over the place, and frequent technical glitches.
Have you used Reddit on a phone without using the app? It constantly berates you to use the app, even when you've clearly stated your preference for using the site before.
There's also the problem that the new UI is clearly more buggy than the old one, and requires manually refreshing to fix glitches far too often, but that's more of a technical problem and not a UX issue.
When I click on a post it only shows 4 comments by default, and I have to click again to see more.
To be honest, they actually seem to have improved new reddit a lot since I last looked (a few months ago). Now once you do that second click it shows a decent number of comments, whereas previously even then it would still not show many.
> May I ask what you think is wrong with Reddit's UI?
Worth noting that the mobile experience is VERY different from the desktop website, and the desktop website is very different if you're using the Reddit Enhancement Suite plugin (and if you haven't used the site in a while, that difference is even starker now that RES uses "old.reddit.com" while everyone else is stuck on the new redesign)
I love the RES / old.Reddit experience, but the new desktop redesign and the mobile experience are both pretty annoying.
We are all busy and distracted. Dumbing down an experience to its essential aspects is a great way to make something easier to use.
If a 'fool could use it while trying to juggle' then that's good.
Making it dumb means speaking to intuition, not thought.
I should add, there are some good comments here about 'command line' etc. - I suggest that that makes sense but in an entirely different realm of usability.
Designing for something that's going to be used by competent people all day for their jobs, is different than designing for the proles standing in line at the bank, looking for 3 minutes of respite from whatever, or even trying to squeeze in that little update.
I somewhat agree. A UI that works well for inexperienced users is better than one that doesn't. But the real measure of a good UI is whether it works well for inexperienced users and actively allows them to gain experience while using the software.
A good example is how, in classic application menus, each entry would have its keyboard shortcut printed next to it. A fresh user can navigate the menu to find an action, and the same option is open to anyone who forgot how to invoke the action, but whenever they land on the action, they can see the keyboard shortcut, and eventually they'll remember it and learn to invoke the action faster without having to make an active effort.
This is good UX.
Then designers or product owners or someone decided that the shortcuts made their menus look cluttered (or maybe they really took that "treating your users as incompetent" mantra to heart and thought the shortcuts looked too technical or intimidating), thus condemning inexperienced users to stay inexperienced forever.
Those are good points - often we're in different headspaces as well on the matter.
Physical product design, mobile app, command line, common entry (like the software the waiter uses), pro software (i.e. Photoshop) are all very different aspects of usability representing colliding viewpoints.
I've run into this while interviewing people for UX positions. They'll walk me through a portfolio and point out a lot of the aesthetics, whereas I'm asking about their decision-making process: how did you identify the problem? What other solutions did you consider and why didn't you go with them? How did you determine the organization of X? Why did you choose this flow? Why are you framing the task this way? Why did you make this more important than that? Why are you grouping this and that together? Why not these other things? Why expose this information at a glance and hide that information behind an interaction? etc.
A disconnect between the UX department and the engineering department can cause this too.
Business objects can grow out of nothing.
UX designers might feel creative and introduce some unintended features, dynamic colors, even complex structured data. If the design is not validated regularly, these non-essential items get worked on by engineers who don't know that they are in nobody's interest other than the UX designer's.
This is not solely UX designer's fault. This is also the team workflow's fault to not accommodate time for validating design.
At the other end of the spectrum, on the engineering side it can come from the underlying mechanism, such as a request for huge data needing to be made stateful because one HTTP call can't handle it, multiple points of hardware failure in a seemingly single operation, etc.
These can sometimes change the game, and the UX design needs to be changed so that the user gets accurate information.
If these are not well communicated, the initial UX design may not suffice to honestly picture the actual underlying process of the product, causing the users to miss one or two options they should have, e.g. a missing retry button where the user is supposed to be able to retry an action.
Again, there should be a period where the UX design is validated again and again in the middle of development. A department or a role in the development team that takes care of the big picture of UX design (information architecture and interaction design), the big picture of system architecture, and the big picture of project management (to determine which developer works when) might be ideal for it.
> This is not solely UX designer's fault. This is also the team workflow's fault to not accommodate time for validating design.
We suffer from this greatly. Our app is complicated by nature. The designs that come from our product team turn a blind eye to the complexity and try to "make it simple". As a result, the product doesn't really work (in some aspects).
Every time we try to "validate" those designs, we're shut down "because simplicity".
Honestly, I think product designers should go through a database course or something to understand that if there's a one-to-many relationship between two entities in real life, you cannot force it into a one-to-one relationship and ignore all potential corner cases. Suddenly 40% of cases are "corner cases" and the app doesn't work.
And a very flawed interpretation of "simplicity" btw.
What is the point of creating an interface containing the bare minimum of buttons if that leads users to spend more time performing what they need? Just like Apple's "one-button" remote?
What is the point of employing fancy graphical diagramming techniques to lay out widgets if the resulting layout impairs users?
I'm OK with somebody designing an interface without talking to users if and only if the designer is being assisted by, or responds to, somebody who has already been in the future users' shoes, or has previous similar experience.
The prevalence of these theoretical, taste- and opinion-based ideas about how interfaces should be designed must stop.
Indeed. And they should listen to the actual users and product managers as well (assuming product managers know the use cases very well). Instead, so many UX designers behave like "they knew better".
That perfectly describes what happened to the software provided by my stock broker. They changed the interface so that tools stay on fixed positions, with fixed sizes.
Now the interface looks nice but it also makes some sorts of operations simply impossible.
They completely fail to realize that the "mess" of the old platform was actually obeying customers' ongoing needs for monitoring a specific set of assets, and we didn't f.. care how the interface looked, we just cared about numbers and that was all.
It's so depressing how fast even conservative sectors like finance are forgetting some decades-old stuff which could just be taken for granted until some years ago.
Then, the said UX designer is not a true UX designer.
User experience involves things other than visual appearance. They must also take care of interaction, information architecture, usability. Decreased usability is a symptom of misdesign.
People learn about these topics at different periods of their lives.
The most important thing, to me at least, is to work with people open to learning and exploring something new. People who, when they realize they lack a certain competence, get curious about the topic rather than denying it.
A good UX designer might not know these topics before they are hired, but they will be happy to explore when they hear the terms.
And that's why I'm such a big fan of simple, as-much-text-based-as-possible interfaces. By self-imposing a decrease in one's degrees of freedom, one gets fewer things compounding the probably already high project burdens.
In this context, validating the design means ensuring it portrays the business object as correctly as possible relative to the business requirement.
Like, "validating implementation against specification", but replace implementation with design and specification with business requirement
For me, the most frustrating example of this is the control panel on Windows. The new mobile-friendly menus exclude many important "advanced" (emphasis on the quotes) settings, especially related to audio and networking.
The old menu is still available, but over the years they've made it progressively harder to find. But when apps get confused about my bluetooth headset the only way to fix it is to pull up the old menu.
I actually like it better because I just type whatever I'm looking for in the search bar and it brings up the appropriate item, I don't have to go searching. In fact, for something like bluetooth settings I wouldn't even pull up the control panel, just click start and type "bluetooth" then enter (when you see it's brought up the correct item).
I remember in the old versions of Windows, changing environment variables was a bit of a pain because it was hidden behind multiple "advanced" dialogs. Now it's real easy: click start, type "envi" and hit enter. That actually brings up the "System Properties" advanced tab and you have to click "Environment Variables."
Don't you remember in Windows XP how you had to click on 6 "advanced" dialogs to set the machine IP? And DHCP wasn't the default on many, many routers at the time. Windows has always been bad at UX, probably because each dialog was the job of a different team.
Off-topic, one of the things that surprised me the most when I finally made the jump to Linux (I don't remember anymore which dark pattern finally turned me off of Windows, but it was one too goddamned many) -- the bluetooth support was light-years ahead of anything I'd previously encountered. I was expecting hardware incompatibilities to be way more problematic than they actually were.
It infuriates me to no end that bluetooth headsets have two modes -- one for listening only, and one for listening+voice, where the latter has absurdly low audio quality.
This has been a major source of problems for me, especially on Windows, where I often have to manually switch between the two.
I have two Linux laptops. On one, Bluetooth devices endlessly disconnect/reconnect and I gave up entirely. On the other, Bluetooth works flawlessly, better than Windows! So it's really hit and miss in my experience.
Came here to say that. UX should, by and large, be staffed with deep experience in User Psychology, with some tech and design experience to know how to apply that knowledge. Instead it's mostly discount designers who occasionally conduct a user interview.
My stock broker has recently redesigned their systems to provide us "an improved experience".
Except that, besides removing some functionality that existed in the previous platform, the new one no longer allows you to lay out your tools the way you find most convenient.
They asked for feedback and I said it feels like the artist's conceptual design has priority over the damn user's needs. It's evident that the people involved in that project have never performed anything beyond trivial operations on the stock market.
Financial industry usually values stability and it's a bit disconcerting to see a stock broker embarking on such trend now.
>Financial industry usually values stability and it's a bit disconcerting to see a stock broker embarking on such trend now.
It's because of Robinhood. Mobile-friendly (Material?) UX is very "welcoming" to users, compared to the multitudes of tables and charts that are typically part of a trading platform. Since RH was able to onboard a lot of users, especially amateurs, the others have been paying attention and doing everything to become similar to RH, to capture that huge amateur market.
> nothing prevented them from offering an alternate, more amateur-friendly platform, without turning the traditional one off
This is exactly the gripe I have with my broker too right now, so we might be using the same broker.
They try to squeeze the entire platform onto a mobile app, when I'm sure 95% of their user base trades from a PC. In the end, it results in botched functionality, a shitty UX of hiding things away, the works.
> Came here to say that. UX should, by and large, be staffed with deep experience in User Psychology
Maybe if you're trying to convince people to part with money. Most applications just need UX designers that understand the users' workflows and what they're trying to accomplish. When a person comes to this screen of the app, why are they there? What data do they need? What are their next steps? Are they doing it for 1 item or 15?
Programmers want to create generic, do-everything CRUD interfaces with 15 layers. "UX" people just want to make it pretty. No one is left to understand what the users are actually doing with the software.
Slack is a prime example of this. Simple things like switching a chat room, which was instantaneous in mIRC on my old 166mhz PC, can take seconds in Slack. But at least there are round corners everywhere, plenty of padding around elements, and animated gifs /s
Real talk though...if you polled slack users and said "You could have instantaneous room switching, but you have to give up animated gifs", what percentage of users would actually take that trade?
Absolutely, you're probably right if you meant to imply that the percentage of such users is low. But on the other hand, if Slack became as fast as desktop apps from the 1990s tomorrow, how many users do you think would raise their arms in joy and exclaim that their computers suddenly became much faster? Latency is less directly visible to users - which is probably why it receives less attention - but it silently adds to users' frustrations when working with the system.
I think it can be a challenge on both the supply and demand sides.
On the supply side the profession has ballooned and subjectively it feels like a disproportionate amount of the growth has been folks who tend towards the "aesthetic" side and/or who lack stat/psych/research/engineering experience. Not that aesthetics isn't valuable, mind you, just that the balance feels off.
On the demand side more product teams are considering UX but many of them seem to have the impression that UX is there to "make it look better". In my experience most teams are receptive to doing user research to identify and solve the underlying problems, but occasionally I'll get someone who insists that the solution is to use the latest UI fad.
Combine the two and I can see why UX is being perceived as more shallow.
The UX sin that bothers me the most lately is when I can enter data faster than the device can handle and it doesn't buffer. The example I gave last time is running calc.exe, then hitting enter and typing a formula. On Win10 you'll lose keypresses and half the formula, but not on the older versions. It used to be a very convenient way to do some quick maths without even interrupting my workflow, now I have to wait for it.
It's even in some games I've played recently, where I'm in a menu and hitting down twice then a button will sometimes ignore the second press because it's still playing a 'move the cursor' animation.
I was pleasantly surprised last time I snuck a look at the screen in my local post office, where I saw an ASCII terminal UI full of text with hotkeys. There is a mistaken belief that complicated is bad, but in a job where you use it all day, all week long, it can be very beneficial because everyone can be a power user.
> ...even among professionals referring to themselves as UX designers.
I believe this is a part of the problem. Because there is a word "designers" in "UX designers", a lot of designers expanded their credentials to include UX. In my experience, they invariably suck at it (at least the ~5 I had the pleasure to work with).
The proliferation of high-fidelity wireframes (and tools like Adobe XD, Figma) is another testament to that; impressive looking designs that hide basic UX mistakes. Low fidelity wireframes would make them obvious, but who wants to look at the "wires" when you can have colors?
I think the designers simply prioritize aesthetics over content / usability and it is difficult for them to get past that. The best UX designers I have worked with in the past were either engineers or coming from completely unrelated fields.
> other non-functional requirements, like overall speed, maintainability, testability and so on
If your title is UX designer, and the UX problem is 'overall speed', how do you solve that given your skills and responsibilities?
As software engineering has grown from someone hacking in their basement to large corporate teams practicing SAFe agile, the responsibilities of the individual software engineer have been reduced. The lone basement hacker is responsible for everything, but the corporate engineer is a cog in the machine with requirements handed down from project managers and ux designers.
Overall speed is a systems quality encompassing the whole system. It's something a lone engineer responsible for the whole system can tackle and be diligent about, but the corporate engineer must rehash the design over and over with ux designers and project managers until an efficient design, not just an aesthetically pleasing design, is what is used.
Another systems quality besides speed is security, and you'll notice more and more systems are less secure than they used to be.
I agree with your conclusion that it’s more likely lack of competence.
I appreciate the article ending with the impossible NY website for COVID-19 vaccine registration. The situation is the same in Maryland, except that one must register for three so as to be on a waiting list with the county health department and the private pharmacy and the local hospital (as there is no telling which might get supply first). Not to mention the larger problem: the target group is people age 75 and up! No one seems to have thought that maybe an optional system of registration by phone, or through a call center representative authorized by the caller to register them on their behalf, is plausible. Somehow, three bad websites is considered fine. And one can’t blame this on ‘government’- the private pharmacies are as bad.
So is it incompetence, laziness, or sheer lack of concern whether the most vulnerable elderly get vaccinated? Incompetence might be the most comforting interpretation.
This is happening everywhere. It took me around 2 hours to sign up my parents for the vaccine (in an EU country) on a website that was impossible to figure out. Add details here, then you have a list of details, but those are not the actual people that you can sign up, then you need to go somewhere else to actually be able to schedule someone. It's as if they made the scheduling website so complicated that only the people that really want the vaccine will spend 1+ hour to try and register. For a 65+ person, it is almost impossible to navigate through that. And for the icing on the cake, it's all a react SPA that they build in a hurry, probably, that when you got a server error back for one of the requests, the whole multi-step process crashed and you had to start over. After all the frustration I ended up completing the process with Postman.
At a social networking company, I once brought up the fact that the post-message-to-feed workflow required 3x more steps than any of our competitors', and that perhaps we should focus on making it easy to post. Design of course ignored it, as they wanted something "measurable" to iterate on.
I totally agree. It begins with using monochrome icons and ends with something that is pleasant to look at but where nothing useful stands out. As if they make screenshots to look good hanging on their walls. Form over function....
I disagree, it was always as common as it is now and never broke past the barrier of getting people to understand its value, beyond designers themselves. This was true before UX was called UX (before 2014 maybe, and as far back as designers of any kind have been around) and it's true now.
I'd actually argue otherwise - UX has gotten tremendously better in the last decade. Online products have been moulded perfectly towards improving the use case for their customers.
The main issue though is that you aren't the customer anymore.
Look at Facebook - they've done an absolutely incredible job becoming a dream experience for advertisers. Youtube has catered to these advertisers as well and blocked controversial content that wouldn't look good next to their brand. It's a golden age to be at the head of a marketing team with a large ad budget. These companies have made golden escalators for getting sponsored content in front of the productized consumers.
If you want UX to succeed and become something that benefits you as a user, then you should focus on platforms where you truly are the customer. Look at Square or Patreon - in both platforms you are truly the customer, and their UX is incredible.
>Look at Square or Patreon - in both platforms you are truly the customer, and their UX is incredible.
I'm sorry, are we talking about the same Patreon? Because patreon.com has a bunch of seemingly pointless clicks for posting content.
Here's the step-by-step for posting a video from the creator page:
1. Click on Posts in the sidebar. An animation creates a dropdown - this frequently stutters even on a reasonable desktop and creates a small delay.
2. Click on New. This loads a new page where you can select a post type.
3. Click on Video. This loads another new page where I can select Vimeo or Add URL.
4. Click on Add URL. This immediately creates a textbox under the button that I can paste a URL into. I can set a title, tags, additional content and post it.
---
Compare this to uploading a video to Youtube:
1. Click on Create. This immediately opens a dropdown.
2. Click on Upload video. This opens an upload video box that you can click on to open a file upload dialog or you can drag a file onto it. If you're on the Youtube frontpage then there's a page load in this step, but if you're on the Creator Page, then there's no page load.
3. While the file is uploading you can fill in the title etc and even press Publish immediately (while it's still uploading).
I understand that the two cases aren't equivalent, but the experience of posting content feels better on Youtube. I think in terms of UX something like imgur or streamable are great examples.
Edit: I was overly harsh at first. It's just a part of UX that has consistently annoyed me with Patreon. Overall it's decent, but these small things make it feel worse than it is.
If we are following what OP was implying, arguably the customers in Patreon's sense are the ones forking over money, in which case the UI is pretty damn easy.
The creators are an important part of the platform, but arguably will put up with more because they are getting paid.
As Closi mentions in the sibling comment, the customers are the ones paying the money.
What you've described is exactly what I'm talking about - the UX is optimized for whoever is paying. Your example is a perfect showing of this. A creator's experience is secondary for Patreon - a patron's experience comes first (hell, it's even in their name). Patreon improved the UX of funding creatives.
Patreon? What? Patreon has the worst UX of any platform I regularly use.
Sign up to be a patron of some artist to download their works. Your only way to do this is to scroll through the list of all their posts one at a time in an endless scroll. And to add insult to injury, the page will crash and you have to start over from the top of the scroll; there is no way to start 50 or 100 posts in.
For artists offering media via Patreon, a good UX might be something like Mega.
1. It is slow.
2. It is buggy.
3. They often introduce new features (instead of fixing old ones), call me on phone, and convince me to use them, only for them to not work right.
4. The UI seemingly tries hard to obscure useful stuff and make it easy to waste money. Sometimes it is blatant: for example, when I was getting a ton of click fraud coming from mobile apps, I found a hidden option to disable ads on mobile apps; 2 weeks later they removed the option entirely, restoring the fraud. When I went to research how to fix that, the answer was that I had to either let the fraud happen or stop advertising on mobile (even on sites) entirely. I took the second option (and the SEO and ad-ranking hit, since Google hates when you ignore mobile).
Indeed; consumer-oriented software used to treat user productivity as its primary concern, and now its primary concern is user engagement.
Well, he's right. I feel sorry for the students that studied psych, hoping to help people in need, and, instead, are working for companies to produce dark patterns.
That must suck; but at least they probably make more money than they would helping people. Unfortunately, in the US at least, we value the reaper more than the servant. Teachers and social workers are paid badly, treated with immense disrespect, and sidelined.
People who make money by treating their users like chattel are lionized and held up as national models.
Ah...well. That's the USofA.
For me, I decided that I love UX -the original kind, and I write software that implements it. I don't particularly care if I ever make a dime off it. I love to write high-quality, non-manipulative, useful software. It's a rare luxury, and I'm grateful to be in that position.
It does make me sad to know that the kind of work that I do is scorned, but that won't stop me from doing it.
Well said. A redefinition in design not driven by big tech would be a great initiative. Maybe even an open source data bank of non dark patterned practices.
Another shout out for “About Face : the essentials of interaction design” which everyone should read, especially programmers.
For basically everyone old enough to be reading HN, we've essentially never known anything other than technology (and life in general) improving over time, often exponentially.
If you've never seen this talk by Jonathan Blow, he makes a rather compelling argument that we don't necessarily have any reason to believe this will continue:
How about this. Tonight, when I want to watch a program on BBC Iplayer, I need to switch on the kindle fire stick, wait for it to load, find the iplayer app and wait for 30 seconds until it loads, search for a channel, with each character I type taking 0.5 seconds to show up on the screen (no kidding), select the channel and wait 10 seconds for it to buffer.
When I was a kid, I just used to press one button on the front of the tv and it was on in less than a second.
Not sure what my point is really but there’s something not quite right with this situation. Technology has improved choice but made the experience awful.
I have to wait many seconds for the "entertainment system" computer in my car to boot in order to simply hear the radio. And when I tune to a different station, it's a touchscreen with UP and DN buttons that I have to tap on to tune ±0.2 MHz, and it takes maybe 750ms to lock on to the station. Think about how absurd this is! Compared to my first car made in the early 80s where the radio started as soon as you turned the key, and you had an analog dial that panned through the spectrum instantly with the sound continuously playing.
The first telly I remember was b/w and had three channels and belonged to a neighbour. Provided the dog hadn't recently wazzed on the rubber plant that supported the aerial, in which case ITV was optional and then we only had BBC One and Two.
However, old school CRT screens could handle pans and zooms that my modern LG thing can only dream of. Then again the thing I'm watching now is comparatively huge and "waffer theeeen". A ~50" CRT would stick out from the wall about four feet and weigh enough that I'd be using some of the more robust Civil Engineering things I learned at college to fix it to the wall.
When I was a child it took a while to tune a TV by hand, channel by channel. Remember portables with the little aerials on the end of a wire? Then finding out that to watch the rugby today involved perching the TV on a chair near a window and the aerial held by long suffering (someone) holding it at a strange angle near the ceiling. You missed half the match faffing around.
My laptop runs Arch - that's far more friendly than anything I used in the '70s-'00s. I recall getting an Epson FX-80 dot matrix printer connected to our C-64 was quite traumatic and involved getting a Centronics (parallel) interface card made up and stuff. I still have the C-64 and it now has a USB interface.
Now I press the home button and pick a service on my TV. OK on my TV that isn't one of my RPi driven monsters that uses the MythTV backend. I have an Octo-LNB on my sat dish ...
(Sorry about this (twitch) but it's iPlayer and you were a child, not a kid)
> My laptop runs Arch - that's far more friendly than anything I used in the '70s-'00s.
Not to be too much of a Linux-using stereotype, but it really does seem like the things that have been getting better are those things that are made (usually for free) for users for the benefit of themselves and other users, and not by profit-seeking corporations as closed-source software.
The few exceptions to that rule are those areas that are heavily dependent on technological improvements, e.g. music production, gaming, media editing. But that's just because there are still huge profit-wins to be had just by improving the quality of what the user receives. There's no reason to think wins like that won't dry up.
Nah I was a kid and we had a colour tv. Your setup sounds very impressive but you’re not the average viewer, and it’s the average viewer that has an old fire stick like mine that takes 30s to load the iplayer app.
I assume you’re in the UK if you’re using iPlayer.
Why not just use Freeview? It’s not quite as fast as analogue TV because it has to acquire and decode the digital signal, but apart from that it’s just as convenient, the quality is better and there are many more channels.
I can walk over to my TV, it takes a second or two to start up. Maybe another second or two to open an app. And then I can watch any one of tens of thousands of shows.
Yes, when I was kid the TV turned on almost instantly but then I was stuck watching whatever minimal content was broadcasted. Can't even pause it!
I have the same argument with Jonathan Blow's talk -- he talks about how complex things are now compared to the past but the past was so much less capable. Yes, you can write an OS in 3 weeks as long as it doesn't do very much.
If we don't have choice, we manufacture happiness. Something we predict will be somehow worse than "real" happiness, but studies show it's not worse. If we have choice, we predict we will be more happy, but studies show we aren't.
It seems like all of those other steps allow you more flexibility to do a bunch of stuff you probably couldn't do as a kid, such as watch BBC from any country in the world, or watch something from the BBC that came on earlier, or to pause the BBC so you can get some chips from the pantry.
Every once in a while, Netflix will take like 15s to load. But before I cuss out my Xbox One, I do try to remember that as a kid I was happy downloading one picture off a 300bps modem in like 10 minutes from a BBS (which sometimes took 30 minutes to finally get into).
The cool thing is that while it's not easy, there's probably never been an easier time to make good stuff. There's a lot of opportunity out there to make better experiences.
My android box running pirated streaming and torrent services (or even IPTV occasionally when I want to watch “TV”) seems to work 100x better than your proprietary tech. Plus it runs Prime and Netflix just fine...
My $50 Chromecast is still the best thing around for watching movies or streams off my computer. Kodi works but it takes set up times which I don’t like investing.
Sounds like you either just have low end tech or bad gov run software services.
And I’m not running anything fancy just your typical Android SoC with a decent remote with a keyboard on the back, which I bought for <$100 max off Amazon.
I'm not sure I'd use the term "improving" but would instead say "changing." The perception of those changes has certainly been weighted towards "improvement" but some of that perception is a hangover from the popular mid-century views that most people held regarding science and technology (others held such views prior to that time but I wouldn't say they were the majority).
By "things" it seems you ignore the key bit of the parent I was replying to... namely "technology," especially within the confines of the main article which focused on UX.
Specifically I used the word "changing" rather than "improving" because within the tech world (and to a degree the larger world of product) there is a long history of new and novel equating to better, hence the tired ad slogan "new and improved!" That emphasis on newness as desirable is a result of the many measurable improvements that did result from the rapid pace of innovation that occurred in the 19th and 20th century, improvements that were obviously perceptible in that they resulted in large leaps forward vs incrementally over long time periods.
Also, to claim that there have been improvements for large-scale populations via some new thing is nowhere close to sufficient to explain how that new thing is an improvement for the individual customer.
Which one should I read first? I read Taleb's critique, who comes off as a raving lunatic in this context, and Gladwell who must have some axe to grind since he doesn't actually address the concrete metrics that Pinker brings up.
Lower childhood mortality, less poverty, longer life expectancy, less violent crime, less disease, more equality, less war. Things are getting better, along almost every metric we can think of. It's not just "different". It's also not just Pinker's work confirming that, OurWorldInData is good on this topic too.
The only sensible argument I've heard to the contrary is that our systemic tail risks have gotten bigger. Which is accurate but doesn't change the fact that things are much better for almost everyone presuming that such risks can be mitigated.
Jason Hinkel's article raises some good points but is hardly a valid rebuttal. It's extremely weak. I'm still firmly on the side of Pinker's narrative, which clearly fits the data far better.
(1) Hinkel falsely concludes that "The poverty rate has worsened dramatically since 1981", using a graph of the number of people in poverty (which increases by ~31% between 1981 and 2013) as justification. The rate did not increase. The rate decreased, according to his own graph! The population increased by 59% over that same time period, so the correct conclusion from his own data is that the poverty rate actually fell over that period (a quick back-of-the-envelope check is sketched below).
(2) Not only is poverty better, but almost every other meaningful metric (disease/mortality/war deaths/crime deaths) is also better, which he hand-waves away in a single paragraph after falsely asserting that the poverty rate has gone up!
It is borderline dishonest, or perhaps at best he is innumerate. If this is the best rebuttal then I am even more confident in the conclusion that things are getting better - MUCH better - aside from a number of existential tail risks that we need to mitigate.
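For anyone who wants to sanity-check the arithmetic, here's a back-of-the-envelope version using only the approximate figures quoted above (both eyeballed from the graph, so treat the output as illustrative only):

    # Rough check of the poverty-rate arithmetic using the approximate figures
    # quoted above: ~31% more people in poverty and ~59% more people overall
    # between 1981 and 2013. Both inputs are eyeballed, not exact.
    poverty_headcount_growth = 1.31  # people in poverty, 2013 relative to 1981
    population_growth = 1.59         # world population, 2013 relative to 1981

    rate_ratio = poverty_headcount_growth / population_growth
    print(f"2013 poverty rate is roughly {rate_ratio:.2f}x the 1981 rate")
    # -> roughly 0.82x: the rate fell even though the headcount rose.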
Excluding China, the graph in [1] shows ~3pp decrease in poverty over 30 years, and a good portion of that time was above the starting point so it might just be random fluctuation.
Also,
> only 5% of new income from global growth goes to the poorest 60% of humanity – people living on less than $7.40/day.
Excluding China is motivated reasoning - and even doing so still shows the opposite conclusion to the one he's clearly desperate to validate. What if I excluded Venezuela or North Korea in order to boost the conclusion that I've decided on a priori?
He explains why it makes sense to handle China differently -- basically because China hasn't applied the policies that Pinker advocates, so it shouldn't be used as evidence for them. China's policies have been very different. He should treat other countries the same way, as you suggest, though China's impact is likely larger than the others.
This is motivated reasoning on his part. If the question at hand is "is the world getting better?", then excluding China makes no sense in the pursuit of answering that question.
It's also ridiculous for him to say that China doesn't use policies that Pinker advocates for. That's absolutist and binary thinking. Pinker would advocate for the open-market liberalising policies of Deng Xiaoping relative to Maoist economic authoritarianism, which helped to lift millions out of poverty in China, even though there's still a lot he doesn't agree with in China's system.
If Pinker doesn't like Putin's strongman behavior in the region should he exclude Russia from his statistics on improving world peace? If Pinker doesn't like the US healthcare system should he exclude the US from his statistics on childhood mortality?
This is a dishonest rhetorical strategy that Hinkel is employing. It's clear to me now that Hinkel is a bad faith salesman who has set out to demonstrate his hypothesis at all costs.
If the question is "is the world getting better?" then yes, treat everything the same. But that question isn't as useful as others that could be asked, such as "which policies make the world get better?"
Hinkel's argument is that Pinker uses "is the world getting better?" as a proxy for "are my preferred policies making the world better?". If policy evaluation is their goal, the ad-hoc exclusion of China is overly simplistic, but so is drawing policy conclusions from overall trends without examining what's driving the trends. Both Pinker and Hinkel should be measuring the degree to which various policies were applied and evaluate poverty-reduction against that quantity.
I'm sure you can think of other dimensions in which things are worse than they used to be. (Not being snarky, I just think things are usually more complicated.)
In decade 1 as the author describes it, investors were still patient that their ROI would come, and AI/data wasn't yet feasible enough, so good UX was the name of the game. Think of carts that needed to be 1-click affairs.
When I did my training in advertising, the first line uttered by my teacher was "We are teaching you to lie here", and indeed the rest of the course was about how to deceive the customer by pushing the right buttons.
That’s the state of affairs of most of the commercial web. And I think you are right it won’t change.
There's another somewhat related problem: UI fatigue.
Many companies have now gotten into the habit of changing their UI with every release, moving things around and changing the depth at which functions lie. Apple is particularly guilty of this. Try following a tutorial for adding a button to a view in Xcode - the UI to open the assistant panel isn't where their documentation says it should be.
And this happens with every single release of Xcode. There isn't a single Xcode tutorial on the web that is accurate anymore (not even Apple's own docs).
I suspect that orgs perform arbitrary UI redesigns for the same reasons supermarkets occasionally reorganise their product layout. It stops customers getting too efficient at extracting only the value they think they want.
I can see that argument for apps that want to sell products or show ads, but what about GP's example? What value is there in getting in the way of a software developer trying to create apps for your platform?
My personal guess is that most UI redesigns are busywork explicitly created to justify the existence of the UI design department. (And I don't say this to throw shade on UI designers in general. This could just as well be the new manager trying to keep his headcount, or the new lead designer removing all traces of the previous designer's work over personal quarrels.)
Part of it may be the same reason that developers love to rewrite code: all creative works over a certain age accumulate compromises, and newcomers don't understand the reason for those compromises. So a new set of designers, like a new set of developers, feel compelled to "rationalise" what they've inherited.
Sometimes they succeed, and build a much more logical information architecture... but many times the new design succumbs to the same entropy that the original one did.
UX is the "user experience". An experience can be good or an experience can be bad. An experience can reduce friction, or it can intentionally create it. UX as a term on its own is neither positive nor negative.
The title should just be "I dislike UX anti-patterns", because good UX, i.e. low-friction/high-reward (for your customer) UX is still the nice thing it's always been.
I find it a bit ironic that the author of this post references a New York Times article on how difficult it is to cancel your Amazon Prime subscription without any mention of how difficult the New York Times makes it to cancel a subscription to their own service. If you have a New York Times subscription you can only cancel it by calling them on the phone during working hours or using their support chat, which is only available during working hours (for everyone except Californians, kinda odd how they won't make that available to the whole country, huh?). Why not just have a simple cancel subscription button on your account page?
> a New York Times article on how difficult it is to cancel your Amazon Prime subscription without any mention of how difficult the New York Times makes it to cancel a subscription to their own service
Well, not only that, but things have gotten so bad that Amazon Prime is really not the best example. I actually just went through this process: I accepted a 30 day Amazon Prime free trial just before Christmas because I was going to be doing a lot of ordering. Of course I forgot to cancel it until a few days after the trial was up. So I went to go cancel it and try to get any money they had charged me back.
And... it wasn't really that bad? From the main page, you click one link to manage your account / Prime subscription. Another one link takes you to cancel the subscription. There are like three or so pages that explain the benefits you'll be losing and allow you to back out, or choose options like "cancel my subscription at the end of the month", or "cancel it now and refund the money". I wouldn't describe any of this as dark-patterns because even if it's manipulative it was at least very clear how to do what I wanted to do.
That's the other thing, too. I was able to get $14 of the $15 monthly fee back when canceling - they didn't try to pull any bullshit with that.
A couple of qualifiers: (a) yes, it could (and should) be better, and (b) I'm a California resident, so it's possible they presented a different flow to me than to most other people.
Well at least you can cancel by phone. I was moving out and wanted to cancel my internet subscription last year. Months in advance, I went to their website and located (with difficulty!) their cancellation page. There was either a form online you could fill where they would call you back, or you could call them. I did the former, waited for a couple of weeks, never had an answer. So I called them, got someone who told me it would be cancelled in 3 months as requested and forgot about it. I even received a survey about how my cancellation process went.
4 months later, I received a bill for the first month after it was supposed to be cancelled. So I called support, which told me that cancellations could only be processed by registered letter, and that they had sent me a mail regarding this (which I never received). She then told me that the service would be cancelled in 2 months since it's the minimum delay, but it would be shortened to 1 month since I was moving to a place which already had a subscription with them, provided I gave the name/address/customer number of the place I was moving to. I asked if she was sure about the details and she said yes.
Now, 2 months later, of course I still received a bill. Called them again, turns out you actually need the signature of the person holding the other contract for the 1 month thing to be valid... Apparently, they sent me another mail, which again I did not receive. (I did, however, receive a late payment notice with massive fees to my new address...) I sent a letter complaining about the whole situation, and just got told to "read the contract attentively next time".
Piling onto your post because of recent bad experiences:
Did you know that literally anyone can email Comcast and complain about alleged DMCA violations on your residential internet? Comcast has no mechanism for submitting a counter-notice, and if you get too many such complaints they will ban you for up to 6 months. It's shocking that an ISP can be so bad that they're the impetus for moving all by themselves.
In your case though, if you're interested you might be able to get a lawyer involved? Lying is generally legal, but they might have some civil liability for fraud or something. It probably wouldn't be profitable for you, but if you won it might make you feel better.
I thought about it, but honestly it took a lot of time and energy out of me already, I just don't want to deal with them again.
I just won't ever do business with them again, and I tell my story from time to time to people that ask my opinion about ISPs.
I'm not even sure I have a case, I looked a bit online and as you said, lying is not illegal even if in cases like this that doesn't sit very well with me.
In your case, isn't having a counter notice mechanism a requirement? I guess you could have sued them as well for that? But I understand, I don't want to have to sue every company I do business with, that's not a healthy environment even though it probably would be better for everyone in the long run.
My Internet suddenly stopped working in the middle of a work day a few weeks back (I work from home) and I tried to call Spectrum to diagnose the problem and all they could tell me was the account holder canceled the account, which was clearly not true as I tried to tell them since I am the account holder and I did no such thing. Unfortunately, my wife (who they would actually talk to) works in a SCIF, so I had no way to get a hold of her and had to just leave her a message and wait.
It turns out someone had called and claimed to be moving into my house, so they canceled my account in order to enable setting up a new account for the person who claimed to be moving into my house.
I am still amazed that this is possible. You can get any customer's Internet turned off and they can't stop you just by signing up to get a new account at their address.
I had such a hard time cancelling The Economist last time, I joined their web chat to ask them if cancelling was as easy as signing up and if it was I would subscribe. Unfortunately my place in the queue kept _incrementing_, so I ended up buying myself a gift subscription so I knew I wouldn't have to go through that again.
Didn't California pass a law saying you got to cancel as easily as you signed up?
Yes, which is why a hack at some of these websites is to set your address as being in California, at which point one-click cancellation buttons magically seem to appear.
Wow, maintaining different pages based on state is enough effort that having the other version is basically an admission of fraud: you must make enough money off people not cancelling to justify maintaining two.
The worst I've seen is cancelling a subscription to the French newspaper Le Monde. You have to print and fill a form and send it to them by snail mail. Totally unbelievable.
It's very annoying. They want to be able to negotiate if you are leaving because of price, and you can get it for quite cheap. But I wish they just had a more attractive list price and a cancel button.
I don't normally subscribe or unsubscribe to services and so the NYTimes unsubscribe process really surprised me at the time. In the end, having to go through a chat system wasn't as bad as I thought, but it did annoy me at the time that I couldn't just hit a button to cancel.
Since then, I've been wary of signing up for anything new, knowing that there is likely a painful process on the other end to cancel my account.
Agreed. Much as there's a difference between "free as in speech and free as in beer", there's a distinction between "usable like a hammer and usable like cocaine". For as long as engagement is allowed to pay better than utility, that's what's on the menu.
The dark patterns described here are part of a broader pattern for big tech. The firms mentioned, Amazon, Google, etc., have made the strategic decision to debase their products to further monetize them. This is similar to the way in which high-end fashion brands are often tempted to cash in on their brand name by marketing to the mass market. Just as with high-end retail, this journey goes one way. Now that these companies have embraced the lowest-common-denominator versions of themselves, it will be difficult to reverse these decisions.
The one big tech exception to all this is Facebook, which basically zoomed straight towards the lowest common denominator version of itself yet continues to plough ever deeper trenches each year.
Recently I had a problem explaining to a recruiter why I am designing only UI, Visual language and care only for visual parts of the product. She didn't react at all when I started talking about "Dark Patterns":)
It is a requirement nowadays, not optional. If you as a designer don't think about new ways to exploit the user base, you are not worth your salt.
I moved only to visual stuff for two reasons:
1. Designers used to think about the user, functionally and aesthetically. Nowadays designers who cannot draw a straight line move templates around and comply with the management without any form of critical thinking.
2. There is no way for me to be a "writer" and do the dishes.
Yep, nowadays designers must not only code but also write compelling stories to be hired. Utter nonsense.
This is not a secret. Here is a typical article on UX designer requirements, and under number 4 we have:
UX writing
Writing is the unsung hero of UX. People speak highly of coding, which is a skill that shouldn’t be dismissed, but writing is a talent that can be nurtured over less time to create brilliant user experiences. Pick up your phone and look at any of your apps and it will be filled with perfectly crafted words.
https://bit.ly/3cnEkuo
At this point in my 20-year career as a designer, I simply refuse to comply.
You cannot create long term value for your customers by shifting the focus of design from the user to the product managers and shareholders.
> In Juul's case, fraudulent proclamations of "empathy" served as a smokescreen for the true aim of the company, which was to use d.school-inflected product design to addict, capture, and monetize an entire generation of young nicotine users.
People try to make everything about "empathy" these days. The term has been on the rise since September 2008 [0] and I've seen it routinely used and abused. Who can refute empathy, or a lack thereof? I recently took a corporate training about personalities. The whole seminar was supposed to show you the strengths of the various personality types, what their opposites are, and how to interface with them better. Make better teams through understanding; nothing new. One measurement went from what's basically Extremely Empathetic => Data Driven (neither being bad, explicitly stated). Routinely, people I worked with recategorized "data driven" colleagues as "non-empathetic" in casual conversation. Reading this article made me realize the real deceiving UX isn't unique to code or processes; it's just the human inclination to reframe things they don't like, whether that's the "cancel" button or a personality trait.
Leave the kids alone. Adults who market harmful products to kids are predatory assholes. I'd hope there is a special place in hell for whoever dreamed up Count Chocula and all the other bullshit used to sell candied cereal to kids. Just because it's legal doesn't mean it was ethical. And in the case of Juul, going into high schools with their marketing material, I doubt it was even legal.
One of my big time complaints about UX is the lack of reflexivity for end-users and support technicians.
As a college student, I worked part-time for the campus IT department. My job mostly consisted of responding to support tickets that came into our system. Around 90% of these requests required me to initiate a remote desktop connection with the client. Never in a million years could I simply understand the problem that the user was experiencing by their explanation.
So why on earth are there so many systems in place that don't offer a user-perspective view of the data? I'm not just scoping this problem to the world of IT - I've experienced this problem from both the support technician and end-user standpoint in sales, support, and education.
Three things I would change - UX, filtering and ranking - to be made user customizable, or allow alternative front ends and rankings provided by third parties. That would solve many problems. I would apply this rule to all large internet companies - Google's UX and ranking, Twitter's UX and ranking, etc. They all sin by tricking users (UX), hiding stuff (filtering) and selectively showing stuff in "the one true way" (ranking).
Why are they doing this? To milk the web for user data and ad money, and to extract more money from customers directly. Our eyeballs are their most prized product. But the end result is this mono-culture / single take on UX and ranking for all.
Practically it would mean to force them to open up their backends to competitors, or be split into front and back, with open competition. I would also spin off the user data part of backend to allow portability of user pods.
There should be more rights to users on the internet. Deplatforming, manipulation and exploitation can't continue like this.
In defense of the d.school, what they teach are techniques and tools - not ethics.
UX broadly is in the same boat. UX is a tool. We can use it for good or for evil.
At the end of the day UX is only as valuable as how well individuals (or companies) can align their personal definition of 'good' with 'good' that the underlying company cares for.
I think it would be correct to say that receiving an education should teach you about ethics. Whether that should be strategic from the top down (you take specific classes about ethics) or embedded in each class is a question I don’t know the answer to.
In an individual class the amount of ethics varies. When I took d.school classes there certainly wasn’t a “market mover” mentality. We were genuinely interested in understanding people better so we could improve lives. But ethics certainly wasn’t explicitly on the agenda. We just didn’t have any assholes taking the course at that time.
This is part of the argument within the university for loosening credit requirements. When I was there, I took the HCI track. That track had the smallest credit requirement and also let you take a broader base of classes that counted towards the major, such as philosophy. That was a more rounded experience than some of my peers who took harder cs tracks like networking and systems.
At least when I was teaching intro HCI courses at Carnegie Mellon we did have lectures and discussions on dark patterns and the ethics of UX. Taking advantage of human psychology to manipulate people has been a problem in the field for a long time (e.g. gambling machines, advertising, dark patterns, screen addiction) so IMO it's a bit irresponsible not to at least warn students that they're likely going to be asked to do some morally questionable things.
> sure, they said, we can do the research, listen to customers, and make recommendations for improvement. But what if leadership not only ignores our recommendations but tells us to do something different?
Do the research? Listen to customers? These are so 1990s.
I'm surprised the author didn't cover that one explicitly: how much of the problem with UX comes with it being "data-driven" these days? In lieu of acquiring direct feedback from the users, everyone these days just overloads their products with telemetry, runs countless A/B tests, gaze tracking experiments... just about everything imaginable except talking to the users.
It's nice to be data-driven - good, quantifiable data is easy to gradient-descent on. But somehow, the metrics tend to be selected in context of what profits the business the most, and not what maximizes the value of a product to the user...
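As a toy illustration (all numbers and variant names below are invented), the same A/B experiment can have two different "winners" depending entirely on which metric the business decided to optimize:

    # Hypothetical A/B result: variant B "wins" on an engagement metric but
    # makes the user's actual task slower. Which variant ships depends on the
    # metric that was chosen up front, not on what serves the user.
    variants = {
        "A (plain checkout)": {"session_minutes": 4.0, "task_minutes": 2.0},
        "B (upsell-heavy)":   {"session_minutes": 7.5, "task_minutes": 5.5},
    }

    def winner(metric, higher_is_better=True):
        sign = 1 if higher_is_better else -1
        return max(variants, key=lambda name: sign * variants[name][metric])

    print("Engagement metric picks:", winner("session_minutes"))
    print("User-time metric picks: ", winner("task_minutes", higher_is_better=False))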
> But somehow, the metrics tend to be selected in context of what profits the business the most, and not what maximizes the value of a product to the user...
The divergence happens when the user isn't the one paying for the product; e.g. when the user's company is paying for it or when advertisers pay for it instead.
It's not all sunshine and rainbows on the paid side of the house. All the disconnected, out-of-touch decision making still happens, but the metrics are ARR and churn. The game is making it as easy as possible for people to pay us and as inconvenient as possible for people to stop.
B2B SaaS companies offer a great UX to the people who pay for their software - long golfing trips, fancy dinners, powerpoints full of feature checklists. The users aren't the customers.
"users don't know what they want" is one factor. Nevermind that Macs lacked more than one mouse button, or iPhones more than one physical button for many years.
The single-button approach is valid, IMO - for the right audience. I hated it myself... and then I had to help my parents and grandparents learn the ropes with mobile devices. Turned out that iOS made them more comfortable for one simple reason: no matter where they ended up, if they felt like they were lost or overwhelmed, there was always that physical button, looking the same, and in the same exact place, that they knew would backtrack them back to "the beginning" - i.e. to a known state from which they could try again to do whatever they wanted.
So, in some ways, it is genius, and I really hated it when Apple killed it off. The problem wasn't the concept itself - it was trying to sell it as UX for everyone. I think that's one of the big problems with UX today - the things that are done under the guise of "simplification" aren't actually simple.
The problems posed by multiple mouse buttons are very similar to the "hidden swipe gestures" people are complaining about elsewhere in the comments for this article. There's nothing inherently wrong with either - any more than there is with e.g., English being a language with thousands of "hidden" words that people have to learn and remember instead of limiting you to only the words visible in an autotext menu - but you have to think about how people learn them, how consistently they're available, how to accommodate people not knowing them or learning them at different rates, etc.
HN commenters tend to be power users of desktop-style UI more than touch or mobile-style UI which can make it hard to see similarities between them (right click conventions really aren't much more consistent or obvious than swipe conventions)
I will disagree on that last bit, again, based on personal experience. Right-clicking to produce a context menu in Windows was easy to explain to my mom, and she very quickly started to use it everywhere to discover the available actions - and it worked. In mobile UIs, on the other hand, to this day, she is struggling to figure out where you're supposed to press-and-hold, where you're supposed to swipe etc. Quite often, she'll do press-and-hold to do something that worked in one app, and it doesn't work in another. Desktop apps are much more consistent in that regard.
The "you can always go work somewhere else" argument will never stop being a cop out. Your employer controls access to your salary, healthcare benefits, and potentially more. Leaving your job is never a completely safe move and there's an abundance of reasons why it's simply not an option to most people, and it doesn't matter what tier of middle class occupation you have for that to be true.
Moreover, this argument isn't one to be taken seriously because it assumes every collective failure is a failure of millions of individual choices rather than any of the cultural inertia or environmental influences that steer our choices.
We're talking about UX designers, and in particular UX designers at large tech companies (FAMAG). UX designers at e.g. Amazon, do have huge optionality to go elsewhere and still get paid good salaries. They are not e.g. clothing workers in Bangladesh.
> Moreover, this argument isn't one to be taken seriously because it assumes every collective failure is a failure of millions of individual choices rather than any of the cultural inertia or environmental influences that steer our choices.
That's a strawman - I'm not saying this applies for 'every collective failure'. Systemic factors are important, however we shouldn't absolve individuals at large tech firms of their own responsibility.
There are plenty of tech companies not using dark patterns (I used to work for one). Granted Amazon employees might not be able to get paid as much elsewhere, but that's the choice you make.
I have worked for 5 companies in my 15-year career. One of those companies was an ecommerce consulting firm which dealt with a lot of mom-and-pop ecommerce stores. Let me tell you, not a single one of them was willing to give up any dark patterns if that meant losing revenue. Unless you are working on a non-revenue-generating product like an internal dashboard, it is all about getting sales any way possible.
I am sure there are ethical companies (I see many of them mentioned here, started by readers of this forum), but what percentage are they? IMO, less than 1%.
I recently needed a plumber because of an emergency clog. I don't use Angie's List but I gave it a shot. Their plumber search runs you through a ridiculous multi-page questionnaire process rather than just showing relevant results. Some genius was paid highly to come up with this crap. It can't be helping their future prospects as an operating concern.
I worked at a company that did this. The data showed that the longer we made the questionnaire, the more people felt invested in the process, leading to fewer abandoning when we finally asked for email address.
Things were better when the Operating System decided how a gui should look and work.
I just cannot comprehend how anybody could come up with something like the redesign of
https://www.tagesschau.de/ which was launched this week.
Sarcasm? While I don't read German, this is actually a pretty good design for a news site. It's clean, consistent and uncluttered. It loads quickly and doesn't have any obnoxious animations or advertising. The font choice is clean and well spaced, but let down by poor definition of link vs title/theme colour.
I agree that the new style looks a lot better, however I'm not comparing their new site to the old one but to actually usable news sites. Maybe I used the wrong word; what's awful is the layout. I want information, not one row of pictures which takes up half my screen.
It's definitely a new site, in the dystopian sense.
Usage hint: Reducing the browser window to approximately smartphone width in portrait orientation somewhat helps with the experience. Clearly, using anything other than a mobile device is deviant.
On the other hand, in terms of UX, I felt somewhat neglected as there wasn't a "Use our App!" popup. Also, the lack of welcoming "Subscribe!" popups (options "I can't wait!" and "Later, but for sure!") is a bit reminiscent of past decades of public services and their arrogant attitude towards customers' needs. I was even left on my own in any attempts to scroll, without any reassuring popups gratifying my engagement with another opportunity to click a button. I could just scroll and scroll without noticing, accompanied by a growing feeling of loneliness and solitude in this digital void. If this isn't pure negligence… ;-)
I clicked your link and opened my devtools out of curiosity, and every headline is under the class name "teaser", so it would seem the main intent of the site is to tease you!
Besides that, it looks like no one did any design at all, like what you'd get on the default template of WordPress or some other CMS.
They must've looked at their traffic stats, seen that most users are mobile and decided to focus on that version instead of having substantially different layouts for mobile vs desktop.
I fail to see anybody mention this, but isn't UX entirely subjective? Also, big corps have A/B tested the hell out of everything, leading to a UX that converts the most and hence is probably the best.
Complaining about bad UX and then not showing many examples but merely a few outliers also is a bit misleading.
To me it seems a possible way forward with UX is adaptivity. We need to make users reach their goals in the way they want it (hence subjectivity). Any static UX in the way of that is by definition bad. Good UX should adapt, evolve and morph for the user at hand. And it should do this dynamically and not compiled before hand by a few UX think lords that know how it should be done.
Counterintuitive wisdom from my QA/Test days. Maybe it's a koan.
Others have already dismantled the A/B testing sham. From my own experience, noobs do this to show management they're doing something. Justifying their continued employment.
There haven't been that many innovations in UX on the web for a while. It's all essentially buttons, toggles and drop downs. I was expecting a lot more boutique interfaces, tightly crafted to the specific use case.
My guess is we are actually not very good at developing that kind of thing as an industry.
Product configurators are a good example of what I mean, they are super rare and considered monumental undertakings by most agencies. Prized monoliths of culminated talent, that break in two years and get replaced by a series of drop downs because as an industry we avoid risk like the plague yet cycle agencies annually.
It's not that we aren't very good at it, it's that nobody wants to do it.
I'd liken it to building codes: given the advances in materials science, one would expect us to be building wildly more diverse styles of homes, with ornate forms and functions that were unimaginable 40 years ago. Why don't we?
It turns out that consumers are cheap and unsophisticated. As hardwood becomes plywood+edging becomes MDF+veneer, people are willing to boil the quality frog to grease economies of scale, and by the time we wonder why the hell nothing in our house lasts longer than 2 years, a very heavy pendulum is in motion and that's about the end of it.
There are real gains here for templatization, repeatability, speed, and maintenance. When everything is built the same way, it's easy to find someone to work on it for a low price, and the people who use it know how it works before they walk in the door.
Just as every house looks like home depot, every app looks like bootstrap.
Breaking these cycles requires that the masses have taste, funds, bargaining power, and lasting resolve. On average, we don't.
I don't know if it's possible to separate the good from the bad. Things that used to be metal and last 30 years are now plastic that might not make it 30 days. But those same forces are the only reason it's economical to produce eg a flagship phone, which are downright miracles of manufacturing.
The universe of implemented use cases, solutions to people's real world problems, has exploded.
But just considering the widgets (ignoring voice, touch, haptics, need I go on?), just the actual visual language and interaction models, there's never been a better time to be a designer (UX/UI).
I used to create custom (owner draw) widgets for Windows. At the time I thought they were super complex.
Today, the simplest web widget dwarfs anything I did. Just the typeahead combo dropbox add-on for Bootstrap has more code and subtlety and capability than anything I did.
I don't disagree that we've got a lot of solutions, but they all use pretty much the same inputs. I'm not sure if that's a bad thing, I'm just surprised is all.
> Just the typeahead combo dropbox add-on for Bootstrap has more code and subtlety and capability
I know there wasn't much in my comment, but really what I'm thinking of is akin to a steering wheel in a car, where you move a lever and it affects the system in a novel way. Most of what we have is variations on text entry and boolean flipping. If we step away from web applications you do have a whole plethora of UX innovation inside video games, apps, and products in general. I am enamored with the elegance of simple portable battery packs, for example.
> Then make them.
I think the world will remain a better place if I just stick to code, but I like the enthusiasm.
The worst dark pattern I've noticed lately is in the cookie acceptance pop-ups where you take 30s to uncheck all the cookies and then press the "Accept All" button, so evil and annoying!!
> This example may not seem like much: cancellation processes are often a hassle, even in Internet companies. (As far back as 2006, AOL made national news for insolently refusing a customer's cancellation request.)
> When Vincent Ferrari, 30, of the Bronx, called AOL to cancel his membership last month, it took him a total of 21 minutes
Wait… that's … that's it? I just cancelled my Comcast account. The call is here in my history: 28m 35s. That only counts the time on the call, mind you; you still have to return their hardware. Including that, it took me well over an hour (driving, waiting in line at the UPS store, driving home), particularly since it had to go to a UPS store. If we count the time that I also spent going to a Comcast store, trying to return their equipment there, which the store and phone representatives later informed me that I "can't" do, because they don't accept "business" class equipment there despite their website directly contradicting that[1], then it took well over two hours to cancel.
(There was a separate call to disconnect service, that was 10m 37s. We moved, and thought we might have Comcast on the other end, so we did that part separately. When we shopped for an ISP on the other end, nobody picked up the phone for Comcast, so we went with their (only) competitor in our area. So, whether you want to count that against the time it took to cancel, IDK.)
I remember reading or hearing somewhere: we have some of our smartest people working on the best way to sell ads to people.
On an unrelated note, I like the style of the blog: no 3 column bullshit. I can resize the screen however I want to and the text flows all the way from the left edge of the screen to the right edge. Barely any wasted space.
I think there's a difference between the objectives and motivations of the providers and those of their users. It is this difference that poisons their relationship, UX included. Why would service providers not play underhandedly (like with dark patterns) when there's immediate value for them in it, and no real negative consequence? Their goodwill would cost them money directly. Sometimes they even _have to_ play like that, otherwise competitors will squeeze them out of the game altogether. And it's not like users in general are any better morally.
I think what needs to change is the idealistic view of UX: that service providers manage UX to make the experience better for the users. I think what happens is that they do just that, but not in a way that will hurt the health of the business. And in the meantime, they will do everything that they can get away with.
It all depends on what the tech company putting out the service is trying to achieve. While there may have been an era of increased hype in 'good UX' prior to 2008, UX as user exploitation has always been around. Both serve distinct roles for the tech company in pursuit of the ultimate goal: making money. One attracts users away from the competition, the other prevents them from leaving. The users themselves are secondary in this equation, because cash and profit come first to ensure the business survives and expands.
I looked at some of the author's services, like Good Todo, and the law I have described above also applies to them. They are merely coming at it from a different angle, emphasizing user privacy and convenience as a competitive edge to get users to stay.
At BigCo I've found that senior UX designers are primarily engaged in grift. The daily grind of polishing the ubiquitous rough edges in the existing UI is beneath them, so they feel like they need to go redesign the entire product every two years. The sole clients of this redesign are the execs, who want to see something shiny.
The massive redesigns aren't really amenable to A/B testing; after launching, they generally are deeply flawed and engineers spend a quarter clawing back to the usability of the original design. Eventually it becomes the new baseline, which means it's boring, which means it's time to redesign again. Repeat until you leave, collect your accolades for all the redesigns you presided over, and depart to go do the same at BigCo2.
The problems I see with UX, having been a UX lead for several years before switching to PO: UX was valuable when most people who produced websites didn't have much idea and team composition was totally different. Since then, market maturity has increased and the accumulation of good-practice patterns has made UX totally secondary. Most businesses are started from frameworks that include a huge amount of good UX, but that is not enough. In order to be sure, you have two options: go full qualitative tests, or go quantitative tests. For the second one you don't need UX; you need a good PO with tons of answers and a data team that crunches the data for you. This way you can move faster, as the article says.
So true. I think the UX of websites is still better than before, but now there is also reverse-UX / anti-UX (difficult cancellation of subscriptions, flows that lead you into purchases).
I don't think it's bad per se, just not very moral.
Governments are notorious for contracting websites to the lowest bidder that claims they can check all the requirements boxes. Websites run through the same procurement process as buying toilet paper for the offices.
And the result is websites that do technically check all the boxes in the document thought up by someone charged with "getting a website built for this", but rarely anything that would be considered a good experience.
The federal government has gotten a lot better at this since the founding of 18f, but state and local still has the problem of checklistware
It has likely become not just unprofitable to focus on UX, but downright detrimental to the company’s bottom line. I say this without the evidence to back it up, but the anecdotal evidence is everywhere. I see it all the time as I preach UX improvements.
If my theory here is right, then the problem can't be solved by preaching UX to businesses. I think it can only be solved by reminding end users that they deserve better.
Easy, incentivized signup with difficult off-ramping has been a business model since … forever. This is not new to digital; it is just much more nefarious now, because the ability to quickly iterate on patterns makes the possibility space much larger.
The OP is focused on the retail/consumer space; UX is still a huge differentiator in enterprise and will continue to reside at or near the C-suite.
When I happen to come across an old site from the 90s or early 2000s, it’s such a breath of fresh air. No complicated UX, no “get my newsletter!!” pop ups, and content focused on text instead of hideously huge images that serve no purpose.
While it might have been ugly, the Internet was so much more fun before the finance and marketing people got involved.
Well designers, now you know what it's like for users who have had to suffer with you sticking bigger fonts and more whitespace padding in every design for the last ten years despite users wanting the opposite.
lol, similar to the blog, I just got tricked into a Prime membership. I'd swear that I didn't click the button in their full-screen "upsell" page in the middle of the checkout process.. but the money was gone and only half was refunded. Their customer contact details are hidden so deep that it took me 10 minutes to find them. I finally contacted them and got the remaining amount as store credit. Experiences like these push me further away from Amazon. Now think about the definition of good UX.
Replace UX with programming and it is the same exact story: “Losing faith in Programming”. You go in hoping that you are creating code that helps your user and end up optimizing for ARR.
Companies have deliberately made cancelation processes more difficult for a long time. That's not some byproduct of the "UX dark ages" you're having nightmares about.
In addition to user exploitation, there are a lot of shallow visual-appeal hooks but no interest in making things useful or better for accomplishing tasks/goals.
Its practitioners believe that they work for the "users", but really these "users" they imagine don't exist.
Which is hard for them to accept, after carving their user personas and praying to them for answers.
The owners don’t mind having the UX priesthood at the table. Especially once they realized that they just needed to adopt the language of user experience cult without giving up any real power.
Years later practitioners are noticing the trick and realizing they’re fooling themselves.
1) Users are customers.
2) You work for your boss/company.
3) It's good to advocate for customers, but don't turn your preferences into a religious crusade.
It's a young field. We need UX advocates with a seat at the table defending the users. Better yet, we need UX advocates across the board.
But it'll take time; most companies are just getting into it. Making better UX a company-wide effort is a good start.
As for the bigCorps, obviously this is a wider discussion than UX alone, as having these dark patterns fall through the process indicates a certain problem in company culture and/or decision-making.
It's a bit early to start losing faith, we're just getting started.
> It's a young field and we need UX advocates with a seat at the table actually defending the users. It's not easy and it will take some time, but as the knowledge of UX is increasing across decision-makers and people building products, we will get to a better place.
With all due respect: you are new here. The level of UX is at Windows 1.0 or Xt widgets. I hate to say it, but Windows 3.1 looks like a revolution compared with Windows 10, OSX (current), or Material "design".
There was a time when things worked. Now the dark ages of UX have come, and it will be a long time until a new industrial revolution arrives.
> It's a bit early to start losing faith, we haven't even properly started yet.
This is true for Win 10 and Google's brain-dead interface. Not so long ago, things were much, much better.
The worst thing is that there is budget allocated for such things.
With all due respect, I'm talking about the practice of aligning user needs to product. If we're talking about UI elements inside a design system (such as Material Design), that is a different discussion altogether. That's not even what the OP is about.
"The worst thing that there's budget allocated for such things" is just offensive and invalidates all the progress that was made in the past decades in the field of HCI.
Frankly it's disappointing to see on HN how little understanding there is of what is happening in this field, and all this passive-aggressiveness towards UX design is demotivating. UX is something that in my opinion is carried by every person that touches the product in some way; it's about more than just putting some flows together as a designer and harassing developers.
Sorry OP, but this just means that since '97, UX has evolved faster than a human can follow (as usual in tech) and you are just out of touch with current trends. UX is still of the same importance as a decade ago, if not greater (remember that Uber is just taxis with better UX, Airbnb is just accommodation with better UX, and Revolut is just banking with better UX).
Thanks for introducing "user exploitation" to my vocabulary, it certainly describes the overwhelming focus on giving ad networks as much screen real estate as possible so we can be surveilled by marketers and governments alike.
As an additional correlation to the timeline, Taboola took off from 2007 to 2013 especially, when all the online journalism sites were trying to figure out how to stay afloat and suddenly they were all handed a guaranteed monthly income: all they have to do is mix scams, health scares, and "Tommy Chong CBD" fraud in with your news, and your readers will love having relevant recommendations for what to read next!
(It's my conspiracy theory that the only reason these ad networks pay out so well is to keep the surveillance dragnet wide, track what information every person sees -- for the price of a little click fraud)
But back to the topic, these design changes (even editorial changes to what headlines are on your front page) are certainly not done with the user in mind.
Another design trend that has completely taken over, of course, is algorithmic timelines. Both Twitter and Instagram started pushing non-chronological sorting onto previously chronological feeds in 2016. This is a transparent effort to exploit the slot-machine addiction of reward and disappointment brought about by a refresh of content -- requiring you to refresh the page and load more ad impressions. I can't tell you how many times I tried to use Facebook's search feature but ended up scrolling down the timeline trying to find a post I saw earlier -- how many dozens of ad impressions were bought and sold while I scrolled -- all because Facebook couldn't be bothered to write a useful timeline search.
The places where people spend most of their time online (social media, YouTube) are designed to act like quicksand: information is re-arranged every time you hit the back button, just because they need to mix some payola in and keep you on the site.
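To make the mechanism concrete, here is a minimal sketch in Python of the difference between the chronological feed people ask for and the engagement-ranked feed they get. The names and scoring are hypothetical illustrations, not any platform's actual ranking code:

    import random
    from dataclasses import dataclass

    @dataclass
    class Post:
        author: str
        posted_at: float             # unix timestamp
        predicted_engagement: float  # model score: how likely you are to react/share

    def chronological(posts):
        # What users ask for: newest first, stable across refreshes.
        return sorted(posts, key=lambda p: p.posted_at, reverse=True)

    def engagement_ranked(posts, churn=0.1):
        # What the business wants: order by predicted engagement, with a little
        # noise so every refresh reshuffles the feed and earns fresh ad impressions.
        return sorted(posts,
                      key=lambda p: p.predicted_engagement + random.uniform(0, churn),
                      reverse=True)

The only point of the sketch is that the second sort optimizes the platform's metric rather than the reader's intent, and the injected noise guarantees the page looks "fresh" on every refresh.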
/rant
I think there is still a lot of creativity to be had with computers. For my part, I'm building a WYSIWYG CMS that frames all of its data into polygon 'tiles' that fit together into a mosaic in two, three, or more dimensions -- just something to get away from the rectangles and single-column timelines that make everything look the same. I can't even remember which website I saw something on, because I can't tell one from another. See also another great piece on web design from 2016 [1] relating to design trends in physical spaces: the trend of cafes and Airbnbs to ignore any of their local context in favor of a bland Le Corbusier-meets-Edison-bulbs style, a full-on McDonaldization of service so that you never have to try anything new (except even McDonald's adapts its menu to local tastes).
> [Big Tech] turning UX into an actively harmful discipline has drained talent and expertise away from projects that could, and should, have had more help. An enraging example comes from right here in New York City: our vaccine websites are impossible to use.
So how is Facebook to blame for bad vaccine sites? Should they fire their user-experience teams to stop the "drain"?
It's the "drained talent" aspect, which IMO is correct. If I remember correctly, government salaries typically max out around G14, which was lower than most starting SFBA bigco salaries.
The world is pretty rife with people who can make a stable, usable vaccine website. But on the whole, they don't want that job at $50k in a dingy building from the 1950s when they could make $250k working on a shiny app that psychologically manipulates their peers in perverse but increasingly effective ways.
Imagine soda companies buying up basically all of the water in America and Canada to make some highly addictive line of drinks, such that when a hurricane hits and people need rations, the government can't get water at a price and quantity sufficient to provide relief. Big Tech is so profitable compared to every other sector that it can outbid the rest of the nation for engineering resources, leaving every other sector to be staffed by the dregs.
This is an oversimplification, of course. Some people really want to stay in not-San-Francisco. Some people really want to serve their country. But in aggregate, engineering talent is disproportionately hoarded by SV tech cos because they're willing to buy as much as the world can produce for more than anyone else can pay.
edit: I say "engineering" a lot here but it's similarly true for UX design. The concentration effects are weaker, but the price deltas are bigger.
> In the single biggest public health crisis in the world, New York can't build a usable vaccine website. The telephone - 1950s technology - is our best option, after 25 years of web development.
I think the latest thing in the 1950s was Touch-Tone. The telephone itself is actually 1870s technology.
" but a major one was the exodus of financialization experts from Wall Street to Silicon Valley. Suddenly the "get rich quick" mentality that had caused the 2008 crash was being adopted by senior leadership at Big Tech firms. "
Stop with these 'blame Wall Street' populist fantasies.
The 'fault' for whatever is happening lies 100% with tech, and it has nothing to do with bankers.
In reality, young people coming into the industry with a different set of values than those who came before them are driving this. A generation that simultaneously declares itself to be more 'socially conscious' is literally driving these activities, which is quite hypocritical.
The shift is easy to see: people who have grown up more separated from their communities, with culturally secular values (not bad ones, just more intellectual than rooted in relationships with those around them), can detach themselves from the consequences of their work while at the same time participating in a 'protest' about something or other in order to validate their self-identity as 'socially conscious'.
We're going to need a broad set of basic regulations, hopefully not too harsh, probably led by the EU as the only institution that seems to have enough power and wherewithal to care enough.
While not making any statement about politics: Biden's 'side' would be the most likely group to be concerned about privacy, and yet we haven't heard a peep from them on the issue. The other 'side' is deeply concerned about 'freedom of expression', which is good; perhaps they can find a way to make some legislation about it.
tl;dr: I wanted UX to be a principle, to have a noble moral standing and be used only for "good". But it turns out UX is just a tool and as just a tool it is now being used for "evil".
I eschewed clicking this link when it hit the front page. I thought it'd just be more whining. I was wrong. I'm sorry for the delay.
That angelfish metaphor is just perfect. The WFMU graphic is terrific. The criticism is spot on.
--
Yes, and: Why I also left UI.
TLDR:
No one, at any time, has ever cared about mental models and metaphors.
The serious aspects of UI, QA/Test, methodologies have all been swept aside. I don't know why.
Even in the 90s, other geeks were hostile towards UI.
--
In the 90s, I considered myself a UI designer. Before the kids started saying "UX". Which is like my kid telling me that "emo" wasn't anything like "goth", because "goth" is what old people listen to. Um, ok. Sorry, kid: Same smell, new slogan.
I mostly did direct manipulation graphical user interfaces, similar to AutoCAD and Illustrator. I had been programming since grade school. Slinging code was just what one did.
I fucking loved working on UI. During a time when most geeks hated it. Except game devs.
At some point, I got tired of trying to persuade the priests working at Autodesk and MicroStation to stop abusing their users. I'll just make my own UIs, dammit.
My one contribution to the art of UI was somehow divining new mental models. Transmuting complex stuff into "well, duh" simple. You know you've nailed a solution when you can delete the related pages and chapters from the docs. Good UI explains itself. h/t Donald Norman (Just like good code.)
Example 1: For a game, I created navigation controls which unified walking on the ground and flying in the air (like a bird), while avoiding getting "hyperlost". (We never shipped. For comparison, the 3D RTS game Myth's UI was kinda the same, but not really.)
Example 2: For our production printing software apps, I "fixed" the color mapping UI. Mapping named colors with RGB values to CMYK is straightforward, but lithography has a bunch of colorants which cannot be represented in RGB, like double black and metallics. I made it stupid easy to control. (A rough sketch of the mapping follows after Example 3.)
Example 3: Also for printing, I created a parametric image positioning (imposition) app which generated the production plan from a simple specification. The Holy Grail. Just in time for the print industry to crash.
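A rough sketch of the shape of the mapping in Example 2, with made-up names rather than anything from the actual product: each named color in the artwork resolves to a full ink recipe, not just a CMYK tuple, because spot colorants have no faithful RGB/CMYK representation.

    from dataclasses import dataclass, field

    @dataclass
    class InkRecipe:
        # Process inks, each 0.0-1.0 coverage.
        c: float = 0.0
        m: float = 0.0
        y: float = 0.0
        k: float = 0.0
        # Spot colorants (double black, metallics, varnish) keyed by plate name;
        # these have no faithful RGB/CMYK equivalent.
        spots: dict = field(default_factory=dict)

    # Each named color in the artwork maps to a full ink recipe, not just CMYK:
    color_map = {
        "Logo Blue":   InkRecipe(c=1.0, m=0.6),
        "Rich Black":  InkRecipe(k=1.0, spots={"Black 2": 1.0}),   # "double black"
        "Gold Accent": InkRecipe(spots={"Metallic Gold": 1.0}),
    }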
--
To implement my UIs, I had to dive deeper and deeper into programming. I can't really explain why. Something always got lost in the translation. I've always had to show vs tell. There was always some legacy technical reason why I couldn't get the UIs I wanted.
So I'd have to fix the next layer down. So instead of engaging directly with end users, working on their real problems, I'm coding OpenGL bindings and web services and persistence layers.
I think this is called yak shaving.
It's a terrible, terrible trap. Over the years, the harder I worked, the further away I got from my objectives.
--
My criterion for a successful idea, back when we geeks still cared about learning organizations, was whether the idea outlives one's own participation.
By my own standards, I've had very few successes. Regardless of the domain. Project management, testing, writing, UI design, etc. I did a lot of cool shit. Most of it now lost.
I also never figured out how to monetize or profit from my ideas and creations. My guess is that doing stuff and getting paid are two different domains, and very few people excel at both.
I think both of these failures are related.
Or along the lines of this Hurst article, maybe being a working artist is just hard. I know a lot of struggling artists. I respect any creative who figures out how to get paid.
--
Geeks remain hostile to stuff they don't understand.
Yes, there's now better visual art and visual communication, but it's still pretty rare. I attribute the improvement to having fewer gatekeepers: anyone can throw their work up on the web.
And just like visual design, geeks (and non-geeks) don't understand mental models and metaphors.
They'll see the work. Acknowledge that it's good. But since they can't figure out how it came to be, the work is somehow invalid. Or due to luck. Or otherwise illegitimate.
You can't A/B test yourself to greatness. (Riffing on wisdom from my time doing QA/Test.)
But I've never been able to explain my intuition and non-linear thinking to other geeks. There is no "show your work", like all the steps of long division to satisfy the math teacher.
I'm still grateful to the one geek boss I had who was at least honest about his rejection. He was a math whiz and had previously made a lot of money selling software to Wall St. He demanded that I walk him thru my solutions. I'd demo, show him my notebooks (eg storyboards), the results from usability testing, etc. But he just couldn't accept the end results because he couldn't understand how I logically got there.
OMG we had such fights.
Years later he apologized. He had tried to do his own UI for some of the same problems. He admitted mine were better. And that he should have been more accepting of my style of thinking. Which he called "abduction" (vs deduction or induction), but I don't think that's right either.
There are certainly other reasons why creative work is diminished, rejected. This is just the brick wall that I noticed.
--
So. To wrap up this rant.
Being a creative is hard.
Everything Hurst wrote is spot on.
I just wanted to add my own personal history (color) to complement his thesis.