Hacker News
Where will UX Design be in 5 years? (trydesignlab.com)
163 points by andrewwilshere on April 4, 2017 | 184 comments



To be honest, UI dev tools just plain suck, making everything move more slowly than it needs to. If I'm on the web it's the HTML + CSS clusterfuck, if I'm in C# I'm writing color converters for WPF, if it's Java I have to use the outdated-looking Swing, if it's C++ it's Qt which I need to dole out cash for. Additionally, every smartphone OS has its own groundbreaking GUI layout concepts and ideas too.

Then you get the layers on top, Xamarin or Eto.Forms or X, that convert down to the native widgets or just create their own entirely. These usually come with a healthy dose of missing functionality, such as mouse drag detection.

And then there are the languages with no UI libraries at all. As if this is even acceptable in 2017.

All in all, whoever decided GUI frameworks were a good idea got it very very wrong indeed. This stuff should have been standardized long ago, stored in some universal data format, and then every programming language could add support for it. They would want to add support for it. Closest to this is HTML of course... but, well, I think everyone would hope for something more elegant.
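
To make that "universal data format" idea less hand-wavy, here's a purely hypothetical sketch (TypeScript object literal; none of these field names come from an existing standard) of a toolkit-neutral UI description that any language could parse and map onto its native widgets:

    // Hypothetical, made-up schema - not an existing standard.
    // The idea: describe the UI as data; each toolkit decides how to render it.
    const loginForm = {
      type: "column",
      children: [
        { type: "text-input", id: "email", label: "Email" },
        { type: "text-input", id: "password", label: "Password", masked: true },
        { type: "button", id: "submit", label: "Log in" },
      ],
    };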


Sun NeWS should have been that GUI standard, thirty years ago:

https://en.wikipedia.org/wiki/NeWS

There was a simplicity and purity in its approach to front-end interactivity and rendering that completely eludes the HTML+CSS+JS stack we're now stuck with. As the Wikipedia article states...

"NeWS was architecturally similar to what is now called AJAX, except that NeWS coherently: used PostScript code instead of JavaScript for programming; used PostScript graphics instead of DHTML and CSS for rendering; used PostScript data instead of XML and JSON for data representation."

PostScript is basically a graphics-oriented Forth, so it's kind of a weird language to write directly... But it would have been great as a target for all sorts of interesting compilers and GUI tools.


If you have compilers and GUI tools, how does it matter that the underlying layer is HTML+CSS+JSON? The designer shouldn't see it anyway.

Now that we have a web-aware platform (which NeWS was not), nothing stops us from building good UI tools on top - yet it is happening very slowly.


You mean by piling up <div>, <span>, CSS and JavaScript?


Yes, why not? That's no different than piling objects in a GUI widget library to build a tree structure for the layout of a window; except that it has a human-readable string representation.
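
To make the comparison concrete, here's a rough sketch (TypeScript; the Widget class is a made-up stand-in, not a real toolkit's API) of the same tree built both ways:

    // A made-up widget class standing in for any GUI toolkit.
    class Widget {
      children: Widget[] = [];
      constructor(public name: string) {}
      add(child: Widget) { this.children.push(child); return this; }
    }

    // "Piling objects" in a widget library:
    const settings = new Widget("window")
      .add(new Widget("label: Volume"))
      .add(new Widget("slider"));

    // "Piling divs and spans" in a browser - the same tree, but it also
    // has a human-readable string form:
    document.body.insertAdjacentHTML("beforeend", `
      <div class="window">
        <span class="label">Volume</span>
        <input type="range">
      </div>`);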


Sure there is: with a GUI toolkit I can always control at some level how those pixels are drawn, while in a browser I need to hope it does the right thing.

Also there is a big performance impact, where piling up divs with CSS requires the right incantations, some of them browser-version specific, for hardware acceleration to kick in.

And in any case, it is impossible to fully replicate the UX of the host platform.


> I can always control at some level how those pixels are drawn

How well does that work when your GUI must be shown on arbitrary devices, with arbitrary screen sizes and arbitrary resolutions? And how well does it reflow when the user changes the viewport size?

In any case, the original post to which I replied was comparing HTML5 to the NeWS platform, which provided a unified UI stack itself, so you wouldn't be able to match the native behavior following that approach either.


That is what layout managers and logical pixels are for.

You are missing the fact that NeWS offered this kind of control.

"The NeWS Book" - https://archive.org/details/bitsavers_sunNeWSThe_11666863


Didn't NeWS have its own issues? It lost to X11 for a reason...

https://www.usenix.org/legacy/event/usenix2000/invitedtalks/...


you either die a hero or...


I think the React Native-style architecture is the way forward. Mainly, thinking of the UI as a state machine, and managing that state from a cross-platform runtime.
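
A minimal sketch of what I mean (TypeScript; the screen and event names are invented): the shared runtime owns the state and its transitions, and each platform just maps the current state onto its native widgets.

    // UI as an explicit state machine; rendering is a function of state.
    type State = { screen: "list" | "detail"; selectedId: number | null };
    type Action =
      | { type: "OPEN_ITEM"; id: number }
      | { type: "GO_BACK" };

    function transition(state: State, action: Action): State {
      switch (action.type) {
        case "OPEN_ITEM":
          return { screen: "detail", selectedId: action.id };
        case "GO_BACK":
          return { screen: "list", selectedId: null };
      }
    }

    declare function render(state: State): void; // per platform: UIKit, Android views, DOM...

    let state: State = { screen: "list", selectedId: null };
    function dispatch(action: Action) {
      state = transition(state, action);
      render(state);
    }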

The reality is every platform offers its own encouraged design patterns, and to create the best UX you need to maintain builds for each platform.

In the next 5 years I don't see JS being replaced as that cross-platform runtime, but maybe a transpiled strongly typed language could become the de facto standard.


buzzzzzzzz No.

Standardization exists, in two ways: either your brand has a standard or your platform does. Both work. It's like saying "we only need one coding language" when the reality is quite different.

The disconnect is when brand isn't as complete as platform. Granted, platform has really only fallen into place in the last few years for iOS and Android. But does brand really need standardization initially? No. Because new paradigms need to stand out, either for UX or for impression. Otherwise AOL would be Snapchat, and it ain't.

I respect that coders don't grok product design fully or in parts. But please understand, a great deal of the fun, the joy of tech, is in UI/UX evolution.


I think we're arguing different things here. I fully agree that from a design perspective, things need to look distinctive (MacOS vs Windows vs Android etc). But implementation wise, the only actual difference is the images used and the colour of the gradient brushes.

The only shake-up UI-wise in the last 10 years was the introduction of smartphones, which meant the need to design for smaller screens, as well as work with gesture commands. Instead of adding to what we had, we spawned Bootstrap on the web, Android has its own custom widgets, as does iOS, and so does Windows Mobile. This is the divided state of technology we live in. A smaller screen somehow communicated to people that we need four new GUI frameworks.

This stifles design, because if I design for iOS, I have to re-make everything again for Android (including any custom controls, themes and so on). This has a cost: I can't use custom Android UI controls if the same don't exist on iOS or other platforms. So people might choose to layer Xamarin on top, essentially throwing another framework into the mix. Why do we need so much abstraction?

Recently I switched my desktop project from WPF to Eto.Forms, so it can also target Mac and Linux. But the result is I now have no mouse drag functionality, because that isn't implemented yet. These are real limitations I face every day. What exactly is the problem, I wonder? We are literally talking about squares with text/images on them. It is mind-boggling.


>if it's Java I have to use the outdated looking Swing

Swing has been deprecated in favor of JavaFX for a while now.


and neither are used on Android


Thanks Google.


I get the /s but does anyone know why this is the case? Would make life a lot easier IMO.


Oh, I did not mean to be sarcastic, just to point out that the state of these frameworks doesn't matter that much when they are not used on the main platform where people write UIs in Java.

As for the lack of Swing/JavaFx on Android, I don't know if there is an official reason.

It could be that Google only wanted to take the language and replicate everything else in order to avoid lawsuits.

Another option (and they are not exclusive) is that you take a lot of cruft with you when you integrate something like Swing.

Also, widgets like RecyclerView are really complex and do some deep optimizations like pre-binding views when the ui thread is idle. Having ownership of this part of the stack in order to make it evolve might have been considered worth the development cost.


I'm working on something much more elegant that takes designers above the focus of "HTML+CSS code" to an abstraction that makes sense. Similar to webflow, but allows you to continue to build out your app without having to drop down into text editing. A solution may be closer than you think ;)


Interested to hear more about your approach. I'm also trying to solve it with Pragma - https://www.laktek.com/2016/11/29/introducing-pragma/

Here's a little sneak peek: https://twitter.com/laktek/status/845815533450682370


we should definitely have a chat - could you drop me an email? my address is in my profile. Thanks!


Also interested. Do you have anything online to explain your base ideas?


hey! I actually don't, but was planning to write up a blog post to explain the core concept behind Basis (the working project title). If you'd like, I can send you some info if you drop me an email :)


What do UI dev tools have to do with UX design?


Designs have to be implemented in something.

Currently UX designers are often like chefs who work by drawing pictures of food and writing down conceptual recipes. When they've created a beautiful menu this way, they're surprised to find out that their recipes can't actually be made using the cooking equipment available, and some of the ingredients don't even exist. You're not going to run a Michelin-rated restaurant like that.

The industry desperately needs better UI development tools. To continue my (already strained) food analogy... Right now we are still in a "hunter-gatherer" stage where everyone does their own thing, teams are foraging and hunting, and occasionally they stumble on something great. Consistent UX will require an "agricultural revolution", a systematic production innovation that makes the essential pieces more understandable and repeatable in isolation.


You really underestimate the amount of time it takes to practice and do adequate UX design; unless you spend 10,000 hours doing design, you are going to suck at it, and most of the jack-of-all-trades designer-devs I've met have been mediocre at both. "Designers should code" is as fallacious as "accountants should be lawyers". Specialization has been the direction of human civilization ever since the neolithic revolution allowed us to have enough food surplus to move into cities.

Designers should definitely work with devs; if they are surprised their designs can't be implemented in the end, then that organization is REALLY dysfunctional.

We definitely need better UI development tools, but this is orthogonal to UX design, much of which doesn't even involve the aesthetics of the UI anyways.


I have spent 10000+ hours doing design and I've got to say that my leap into code (especially backend) has been incredibly eye-opening. I've realized how so much of design is ego-driven, how design studios design for design awards, and what a crime it is that so many technical minds are steered to work on problems more related to fashion than accessibility or productivity.

Designers should definitely code. Too much of design is siloed in fanciful thinking, and fancy people competing with fancy people over fancy things perpetuate delusion.


I agree, but would like to add a caveat to 'designers should code': don't let a programming language or framework dictate design through its limitations or structure.

I am a designer who codes, in fact my background is UX and my job is as a developer. It's easy to think about implementation during design, for example, designing a web-app and knowing it will be implemented in Angular leads to thinking about how Angular will implement it, leading generally to modals everywhere, just everywhere. If I knew nothing about the technical implementation I wonder if it would be a smoother design process, possibly better results. But then I see so much poor design everywhere that I can never be sure if something was designed by a coder who knows little about design or a designer who knows nothing about implementation.


I've spent maybe 1000 hours designing and now 1000 hours coding, and I can say actually developing something certainly feels a lot more wholesome than designing the layout of an interface.


I constantly have to export my work in webflow for further development on Node, React Webpack, etc. Nothing like fine grained control over the digital elements! :-)


Designers must understand the media they work in. Design involves tradeoffs and accommodations for practical limits. A designer who doesn't understand the capabilities of the systems they design for is destined to produce bad designs. You can't understand the limits of CSS unless you can work with it.

Imagine if Apple hardware designers depended on their manufacturers to tell them what's achievable with aluminum and glass. Instead Apple has a small manufacturing facility in their design studio so the designers can iterate rapidly and actually understand what is possible.


I would say that knowing some code has allowed me to know when developers are bullshitting me. Most developers I've worked with have been great people, I'm happy to say. But occasionally I come across someone who goes "Uh-uh, can't be done, theoretically impossible, defies the laws of physics" when they mean "I can't be arsed to change the convoluted mess that is our production spaghetti code". Usually at larger institutions. Knowing some programming allows me to call their bluff.

Also, it's far more common that organisational culture, law, or the marketing department is a bigger constraint than the code stack. As a designer, I should consider my "medium" to be far broader than knowing some CSS, or I'm not doing my job.

Being a good designer is inherently being a generalist, and in being a generalist, the designer should understand some basic things about programming in order to have common ground. But that is certainly not the same as saying "designers should code". If designers should be programmers, then they should also be accountants, customer service, marketing, operations, lawyers, c-suite, and every other profession they interact with to solve a problem.


Yes. But this is far far far away from "designers must code." Except for a few unicorns, this just leads to people who are mediocre at both design and code. A designer who is bogged down in code will also produce bad designs, especially when they let their inability to think of how something will be coded (because they aren't advanced coders) limit them!

Designers have their own tool chain to prototype UIs and user interactions, but these are not UI devtools.


I'm not advocating that designers should code -- far from it. I wish they had better tools for modeling user experiences.

You mention that designers have their own toolchains for prototyping, but these have nothing to do with UI dev tools. I've often wondered why that has to be so.

Look at other advanced design fields like architecture and industrial design. It would be unthinkable today that professional architects would just draw a facade and not use CAD to actually model the structure. Similarly, it would be very unusual for an industrial designer to model a product using a toolchain that is entirely separate from manufacturing. The assumption today is that CAD and CAM work together, not separately.

Why can't we have the same in software? For my part, I've been trying to work on a solution in the form of apps like React Studio [1]. It's not there yet, but I'm convinced that proper software modeling tools will happen eventually.

[1] https://reactstudio.com


UI dev tools are tools for producing a UI. Prototyping and modeling tools are quite different; it would be unthinkable today for a professional architect to personally build the building they are designing rather than model the structure. Also consider that much of the design that goes on is far away from the actual pixels that show up on the screen; e.g. interaction designers have very little to do with the UI.

> Why can't we have the same in software?

Because devs and designers have different needs.


I'm not sure what you're responding to. Did someone say that designers should do all their own coding?


I'll just add one caveat - it's my experience that developers often have little interest in great design code. I write far better mobile design code than any developer I've met (which to be fair isn't many). For me, I contribute best by focusing on that part, leaving developers to focus on their interests which is often new languages, tools, libraries, architecture...


FWIW - designers are facing the other way - to meeting and exceeding client expectations. In many cases it's not about what can be built, but what could be. It's not great for devs and most of their best stuff gets tossed when it hits real world production. But for a moment, the client was overwhelmed. And that's what sells design.


I can do both, design and development. For what it's worth, I can't design the best nor code the best, but I make damn fine products because I'm not caught up in being the best at either.

I make products that ship, people like, and can be further developed in both design and code by specialists. That's good enough.


I agree with the first part! The second part can never happen since there is no process that can handle the complexity of the real world. There might be patterns that could be found generic enough, but not a deconstructed atlas of everything you need to know. It's a process of discovery at best.

To deliver a really great user experience, it's really important to not just look at the interfaces between the customer and your products, but instead try to figure out the "flows" and processes that lead the customer to use your platform or app.

In some domains, you will need an awful lot of domain knowledge, paired with UI design expertise, modeling and development skills to do just that.

Typically, it's very hard to find all this in one single person, so excelling in teamwork is always a good trait.


That's a great analogy, thanks. I still have the scars of doing the JS powering a crazy UX related to a time chart zooming in and out with various aggregation-level filters. On the other hand, when I asked the designers to do X (just change this div to be beveled), they said they wouldn't; I needed to add the divs around it, and they wouldn't (couldn't) change the CSS rules to make the UI changes. Sometimes it seems backwards to me, coming from a declarative UI background.


Do you think it's really worth it to have a menu that's so "beautiful" it's difficult to implement without more tooling?

Does it enhance accessibility? Elucidate the IA? Or is it just more whiz-bang?


I'm actually advocating exactly the opposite of "beautiful menus" that are hard to implement.

The current state of tools doesn't allow designers to think in terms of actual UI components and navigation structures. Instead, they are forced to draw pictures and make animated slideshows. This leads designers down a path of tweaking aesthetics because that's what their tools are designed for. (In this regard, newer apps like Sketch and Principle are not fundamentally different from Illustrator or After Effects.)

We should have tools that allow designers to think in terms of higher-level UX components with the confidence that the designs can always translate into code.

I worked on two products that came from this school of thinking, Neonto's Native Studio [1] and React Studio [2] (they're both variations of the same codebase really). I know someone will bridge this gap eventually...

[1] https://neonto.com/nativestudio [2] https://reactstudio.com


> if it's C++ it's Qt which I need to dole out cash for.

What?

EDIT: TIL, Qt is GPL, but offers a lesser licence for a price. For some reason, I didn't know that.


Nokia relicensed it to LGPL 3. Bindings for other languages can have another license though, obviously. For example, the Python PyQt bindings are licensed under GPL or a commercial license.


Yeah, either you make your whole project GPL, or you buy a license so you can make your code proprietary/non-GPL.


I thought LGPL meant if you link to (Qt) shared libraries, your project does not have to be LGPL?


You're right, Qt is available under LGPL 3.0 now.

https://www.qt.io/licensing-comparison/


Not all platforms allow for dynamic linking.


Out of interest, which platforms that Qt targets cannot have dynamic linking?


Embedded devices, car infotainment systems, iOS (only dynamic frameworks from Apple are allowed).


If you are targeting ios you are already paying Apple for a developer account, so paying Qt basically just doubles the license cost. Everything else is fairly exclusively corporate, where Qt licenses are a non-issue.


I don't see an issue paying for Qt, other professions pay for their tools.

I was just giving an example where LGPL isn't possible.


iOS allows dynamic libraries these days. Granted, it didn’t allow for them for a long time, but since iOS 8 you can use dynamic libraries just fine.


Weird, I feel that UI tools are evolving very fast on Android.

ConstraintLayout is pretty much AutoLayout without the bad parts, and Databinding makes declarative-style UI easy to write.

It is not perfect by any means, but it is definitely improving quite steadily.


But these ideas of declarative frameworks supporting databinding are old, and have been implemented many times in many heterogeneous frameworks and platforms over the past two decades (probably longer, but I don't have context).


I don't have the context either, but according to the people behind databinding on Android, the previous databinding frameworks are usually pretty slow.

They focused on making it run fast by compiling it to bytecode that makes the same minimal calls you would write by hand in Java.

It is true that it is not a new idea at all, though.


As a designer I can tell you right now: solve this problem and make lots of money as an engineer. If you need help, PM me, and I will list you all the clusterfucks Adobe has made trying to solve this problem.


Could you send that list to me? :) You have no email in your profile, so maybe you could send it to the email in mine?


> if it's Java I have to use the outdated looking Swing

Java has JavaFX now


Swing is themeable, as it always has been.

With some minimal effort, it's possible to do a nice-looking GUI with fonts that don't look like vomit, etc.

If Sun and Oracle had spent more than 5 minutes on packaging applications - why not .jxe files for executable jars, as a simple start, to avoid torturing your users - on using nice fonts, and on a discreet, non-whiny, non-bundling auto-updating JRE, they could actually have won on the desktop, back when it was still relevant.


Spending a few minutes a day reading "Filthy Rich Clients" would already help.

Sun is the one to blame; as a traditional UNIX company, their Swing team just didn't have any clue about the desktop world outside enterprise walls.

Oracle is actually doing much more than Sun ever did: they introduced a way to package applications with Java 8, originally part of the JavaFX project, and are adding support for AOT compilation (it was taboo at Sun) and application-specific runtimes.


I was a Swing developer (and loved it). Distributing Swing applications was never a major issue. The issue was that once the web hit, all my co-workers wanted to do web applications and convinced management web was where it was at.


I specifically mean something as simple as the fact that they used the .jar file extension both for executable jars and for plain archives of files.

If they had defined and promoted .jxe as the file name extension for executable jars, there would have been so much less confusion for end users.

(Just like they have done with ear, war, rar, etc)

Also, the security considerations for Java Web Start applications were rather strange, with ugly icons warning that you were running a Java application etc., and much higher requirements for certificates and signed executables compared to just about everything else that you run on a computer.

If they had re-imagined the framework as an "installation" providing automatic updates, just like normal applications, they could have lowered the bar a little.


I have been on both sides of the fence, and when I have the freedom to choose which projects I apply for, projects with native UIs always win my heart.

Personally I think it was a good move to go back into native UIs, right about the time people were starting to adopt Angular.


The biggest problem was always that everyone hated the native Win32 theme. Even MS didn't use it in big products like Office.


UI <> UX


UX = UI + ?

UX % UI = ?

Is it a feeling or a thing perhaps?


UX is about the needs and goals of users. It is about research with the users, documentation, and then UI and usability. UI is a part of UX. The most visible, not the most important.


So UX is really about product design?


It is about the experience that the user has when using your product.


and service


Great topic, but the article doesn't answer its own question.

Predictions:

- Microsoft will move the start menu to a different corner of the screen again.

- Some phones will have a dedicated hardware "dismiss popup" button.

- Neon colors with contrasting fringes will be used for visibility on AR displays.

- Alexa will start initiating conversations.

- Hyperreality.[1]

[1] https://vimeo.com/166807261


"Microsoft will move the start menu to a different corner of the screen again."

Aren't they transitioning to a new business model that doesn't incentivize marketing-driven design anymore?


Where did you read that? All I'm seeing is more and more ads in win 10 as time goes on.


Next, Windows video ads you can't block that only appear when you have a projector connected, chosen by adtech which reads your PowerPoint presentations.


Fucking patent that.


Wait, wait, I've been in an OSX/AdBlock/uBlock bubble since Windows 7, are you saying that the latest mass consumer version of Windows has advertisements built in like network television?


Yes. The login screen has "click to buy this desktop wallpaper" ads and occasionally paid ad placements (I believe there was a Tomb Raider one). And recently they've been putting their cloud drive (OneDrive??) ads in Explorer (even for the Pro edition). It was one of the final straws that pushed me back to Linux.


The crazy thing is that it does that per account apparently.

I ran a few tools under my admin account and I stopped seeing most of the crap.

I saw my wife clicking the start menu under her account on the same PC, and sure enough, ads there.

I think I'll just move Win10 under a VM to preserve some apps and run Linux.


I've heard people talk about them, but I've never seen it on any of the 4 Windows 10 machines running under my roof. Apart from the (already plenty annoying) notification about a new version of Office.


I haven't seen any either. It makes me wonder if people are using pirated versions of Windows and Microsoft has figured out a way to know this reliably and target them with ads.

I do have my "developer tools extensions" enabled on my Windows 10 machine; I wonder if that affects being targeted by ads.


A/B testing for Windows 10 occurs in the wild, instead of extensive QA testing.

You don't see it, because you're in a separate group.


I upgraded from paid win7 ultimate. Maybe they a/b test?


Android already has a hardware back button (sometimes)... somebody should rig that up to get rid of popups in browsers.


I think it's time we whitelist sites that can create popups and popunders. Things like the firefox popup blocker flat out do not work anymore.


Is there any real use case for popunders?

As for popup, Chrome on Android blocks them by default for me (maybe I touched a setting a long time ago).


Addendum: Everything will be in JavaScript. HTML & CSS will be JS DSLs by then.
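
It's already half-true; here's a hand-rolled sketch of what "HTML and CSS as JS DSLs" looks like (TypeScript; the `h` helper is written inline here, not a real library's API):

    // Markup becomes a function call, styling becomes an object literal.
    function h(tag: string, style: Partial<CSSStyleDeclaration>, text: string): HTMLElement {
      const el = document.createElement(tag);
      Object.assign(el.style, style);
      el.textContent = text;
      return el;
    }

    const button = h("button", { background: "tomato", padding: "8px 16px" }, "Buy now");
    document.body.appendChild(button);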


"Alexa will start initiating conversations"

hilarious :-)


Unfortunately probably true. Alexa: "Tomorrow is your wedding anniversary. Do you want me to send your wife a present?"

Did I just write "unfortunately"? :-)


Alexa: link to product that your wife would like, made by the company who paid most for advertising, not the identical product made by their competitor


If I could express a desire rather than a prediction, it's that "undiscoverable" UIs die. I hate having to swipe and tap and generally mess with the screen in the hope of triggering some secret feature. I don't know when it became trendy to get rid of UI chrome but I really wish it would come back.

(I wouldn't mind if the hamburger menu died either.)


While we're at it, can we kill the fixed headers that waste 10-20% of my screen space on every page?

I already know what website I'm on, thanks.


I use this [1] extension to automatically remove fixed elements. And this [2] in case there is something left, or something I just don't like to look at. Probably the most used extensions I have unfortunately.

[1] https://chrome.google.com/webstore/detail/fix-fixed/fmekfmdh...

[2] https://chrome.google.com/webstore/detail/click-to-remove-el...


Add to that the "EU cookies" notice.


uBlock Origin has an optional blocklist you can turn on to kill those. Look in the settings. It's called "EU: Prebake - Filter Obtrusive Cookie Notices"


Thank you! Totally forgot to see if there was a setting for that.


I don't mind the hamburger menu because at least there's a consistent UI that tells me where I can click to find stuff.

Otherwise agree, undiscoverable hidden UIs must die, and so must these flat UIs (which are equivalent to undiscoverable ones). Underlined blue words are good too.

Worst UI trend ever. I eagerly await a return to the 1980s. Which is not nostalgia, because I didn't use computers then — I'm too young and my family was too poor, despite being early adopters of at-home computers.


The 80s had their high points for sure, but computer interfaces really aren't one of them unless you like green-screen mainframe interfaces.

IMHO, the peak for UIs that are easily usable by regular people with minimal training was back in the mid-to-late-00s. It was before this idiotic flat-UI fad, when UIs had a colorful 3D look to them so they were actually attractive, and also functional (since they were also very discoverable). This was just before everyone suddenly decided that PC UIs needed to be the same as those used on small touchscreens.

However, I will say those 80s mainframe interfaces were extremely functional. For someone who had taken the time to learn it, they could get work done really quickly.


Are you sure about this? Have you seen what the Xerox Star could do in 1981 (http://toastytech.com/guis/star.html)? Or how about Symbolics Genera? Both of these systems had UI functionality in some ways more advanced than what we have today.

They tried to build smart workstations, where today we have essentially "dumb" operating systems and rely on applications to make them useful. The price we pay for this choice is the poor integration across applications. Web applications are even worse: zero integration across applications coupled with huge security problems.


>Are you sure about this? Have you seen what the Xerox Star could do in 1981

Yes, I'm quite sure. I was alive in the 80s, and I remember what things were like. I sure as hell don't remember any Xerox Stars, but I do remember lots of mainframe terminals at places like banks. I never even heard of the Star until I saw your link, similar to how I only learned about the Xerox Alto many, many years later and only saw one in the Smithsonian museum, never in actual use. Computers like that were rare research projects, not at all representative of how things were in the 80s. Macs are as close as you're going to get to today's UIs for computers which normal people had access to and were actually in any kind of widespread use, and these weren't really popular until the mid/late 80s (and even then were pretty limited as they were expensive).

You are correct about integration and how shitty web applications are though.


I'm convinced that flat got popular because it was cheaper than creating textured graphics. The endless fawning over flat on HN was extremely obnoxious - back in 2013 you couldn't go a day without a "flat design is good design" post ending up on the front page.

Glad to see that there are other people out there who hate it. If there's one example of comically dumb flat design patterns right now, it's that "filter" icon you see around the web that looks like a solid-colored upside down triangle and nothing like a filter.


I doubt that mobile screens will get bigger. The resolution will increase, but that just means that the few over-designed details that are shown will be crisper.


I think screens themselves could get bigger, but the devices themselves are as big as they'll ever get. Bendable screens are already a reality, now we just need to get them to make sharp folds. Then you could fold your screen up and put it in your pocket. :D


I see a couple challenges with the future of UX design. One being that everyone has an opinion as to how something should work. Unless someone has years of widely-recognized, valid experience/expertise in design, who's to say if one person's instincts are better than another's?

The other issue I see is process taking over creativity. SoundCloud's iconic design was driven largely by the intuition of the designers and developers. What if they had let 'user feedback' and 'UX research' drive the design instead? Would they have created the same interface? Maybe someone can easily knock my points down, but at least that's how I see it. To be clear, I think UX research is a great tool to improve features, but in general I have a hard time trusting it when it comes to conceiving a new feature.


I've seen the argument of creativity vs process before, and I think it comes from a lack of understanding of the design process.

Most design systems now use a 'double diamond' process. The first stage is researching the problem and coming up with as many potential solutions as possible - no matter how crazy. That's the stage where creativity is part of the process.

The second diamond is refining a concept - this is what most people view as "UX design" - but it's actually the entire process. If you look at design studios such as IDEO or Empathy Design, you'll see that conceiving new feature ideas is their specialty.


Yes. A few years back I had the good fortune of participating as an embedded member of an IDEO design team when a former employer hired them. They spent an entire month exclusively on researching the specific problem area we were tackling, conducting in-home interviews with customers and even casually observing them browsing relevant aisles at retail stores.

Actual design work didn't even get started until well over halfway through the engagement, after the extensive research revealed where the big opportunities were for a new product.


Even worse, what if someone has years of widely recognized, valid experience and is also wrong?

If that seems preposterous, I'd point to things like Helicobacter pylori causing ulcers. Decades of dogmatic thinking.


> I'd point to things like Helicobacter pylori causing ulcers. Decades of dogmatic thinking.

This is tangential, but huh? H. pylori was never, to my knowledge, believed to be the only cause of gastric ulcers. It is definitely correlated, and eradication resolves the disease for some patients. Eradication also reduces stomach cancer risk by about half.


It was the other way around. The dogma was that it was caused by excess acid and the revelation was that the bacteria were a major part of it. I was pointing to the story in general.


Ok. I see what you mean now. Thanks.


> Even worse, what if someone has years of widely recognized, valid experience and is also wrong?

A more practical one might be the skeuomorphic phase Apple went through.


That's actually a great example. Skeuomorphic design has its place and can be good design, but my Calendar app doesn't need to look like a leather diary as it did in 2014.

A good example is DJ software that looks like a set of decks and a mixer.


The calendar one seems to be endemic; every calendar seems to go for that "monthly" view that physical calendars had. I live an unorganized life; all I really need from a calendar is an ordered list of the half dozen upcoming events in there.

I was never a physical calendar user either, so when I get confused with the digital reproduction I wonder if I'm representative of the next generation that's never had a physical calendar.


> One being that everyone has an opinion as to how something should work. Unless someone has years of widely-recognized, valid experience/expertise in design, who's to say if one person's instincts are better than another's?

Test and measurement. The proof should be in the pudding. If one design converts better, signs up more customers, brings in more revenue, etc. than another, it should win. "Rely on designers' hunches" is not a repeatable process.
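
For what it's worth, the measurement side doesn't have to be elaborate; a back-of-the-envelope sketch (TypeScript, a standard two-proportion z-test; the numbers are invented):

    // Compare sign-up conversion of design A vs design B.
    function zScore(convA: number, nA: number, convB: number, nB: number): number {
      const pA = convA / nA;
      const pB = convB / nB;
      const pooled = (convA + convB) / (nA + nB);
      const se = Math.sqrt(pooled * (1 - pooled) * (1 / nA + 1 / nB));
      return (pA - pB) / se;
    }

    const z = zScore(450, 10000, 540, 10000); // 4.5% vs 5.4% conversion
    console.log(`z = ${z.toFixed(2)}`); // |z| > 1.96 ~ significant at the 5% level (two-sided)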


that approach has its own dangers (that at least need to be carefully guarded against): getting stuck in local maxima, incoherence, user attention fatigue, vulnerability to disruption, etc


I think you don't understand the design process. Not being bitchy, so let me explain.

Instinct should be 1% of design, if anything. Design isn't about colors and pretty; it's about a detailed process that includes data structures, mapping processes, analysis of users' behavior, and removing all instinct and subjectivity from the process. It should involve a minimum of guesswork wherever possible. It always amazes me, for example, that awstats comes as standard on 99% of shared hosts, yet designers and web developers alike ignore all the data sitting there available for analysis - just one example of how designers aren't using the tools available to them (and you have to know what to do with the awstats data too).

My prediction for the future of UX is that the army of self-taught pseudo-UX 'experts' are going to have to up their game, get some real training and stop being intuitive if the industry is to continue to grow. Would anyone on HN hire a full time software tester/QA who didn't have some training in that field? Probably, but you'd be better off with a QA who is certified.


> it's about a detailed process that includes data structures, mapping processes, analysis of users' behavior and removing all instinct and subjectivity from the process of design.

Ugh no. Please, no.

Don't do this. I'm saying – pleading – this as a user.

Apple didn't do it (but maybe now they do, and it's starting to show in some of their products as a sort of creeping clinical sterility). Steve Jobs didn't do it [0]. And yet the UX set by them has been admired, imitated and aspired-to for over 30 years.

Plenty of other examples can be seen in the UI of Japanese games versus Western games.

I'm sorry, but relying on "data structures, mapping processes, analysis of users' behavior" means you suck at UX design – like reading books on how to be socially adept – and the best you're going to achieve is a functional but sterile, neutral, gray block of clinical equipment, devoid of personality and soul and color and warmth. If you sneered at these words, then I for sure wouldn't want to be stuck using your products.

In any case, if data structures, mapping processes, and analysis of user behavior are your primary skills then you'll be replaced by AI within a decade anyway. :)

[0] http://www.newyorker.com/news/news-desk/steve-jobs-technolog...


I think you are mixing up visual design, graphic design, with UX and interaction design.


No, he's right, in part. UX & interaction design shouldn't be just about measuring clickthrough and abandon rates, or you get interfaces like Google's products, who use that design philosophy.[1]

UX design must be informed by data and field research (which depends on those "data structures, mapping processes, analysis of users' behavior"), but it still needs an opinionated designer who empathizes with the user pains and problems, and creates an interface that solves the needs as well as provides the adequate usage feeling (which, even being non-functional, is an important part of making a design usable) [2].

[1] https://medium.com/the-design-innovator/iteration-is-not-des...

[2] https://en.wikipedia.org/wiki/Emotional_Design#Content


Intuition is not going away anytime soon. The data that one can collect today cannot compete head to head with the intuition and rich mental models of someone who deeply understands some domain and how that domain connects to other aspects of the human experience. It's no contest.

Data should be used as a powerful supplement, assumptions should be validated, and due diligence must be carried out.

But if there isn't a vision behind the project, it will likely be mediocre at best.


Design should be data-informed but not data-led, simply because data tells you about the situation now, but is rarely enough to build a model that predicts the effect of future changes accurately. Further, data generally doesn't fully describe what's happening and must be interpreted. This means both the act of understanding how the product is used and the act of improving it remain subjective at the point of design. It's only later, after the product has been changed, that we can verify our designs.

People with good taste and domain knowledge aren't going to go away anytime soon.


SoundCloud has an iconic design? The little waveform-esque view?


Yes, as someone who was using MySpace a lot for consuming music, SoundCloud's design was revolutionary.


The problem I see with UX today is that there is a split.

On the one hand are the people who just want to make good products people love to use.

On the other hand are the corps who want to use it as a tool to steer people into spending more money for nothing gained.


There are lots of issues with UX as a field right now, but there's zero question in my mind this is the biggest one. There are incentives that are to some degree not aligned with the interests of users.

It doesn't even end at the profit interests of business owners -- career development is another one. UX workers in the field have incentives to demonstrate currency in design fashions (whether or not those fashions work well with a given application or problem domain), and the more changes you can stake a claim to, the more credit you can take.

There isn't a solution to this. The one mitigating factor is that there are limits to how badly you can misalign user incentives before many decide to go elsewhere (though there's also such a thing as monopoly power and other forms of captive audience, and there's a long tail of users who are sticky even when they technically have the ability to go elsewhere).


> One the one hand are the people who just want to make good products people love to use.

I don't want to "love to use" any product, the product is just a means to an end and just needs to let me do what I need to do and stay out of my way otherwise.


Good UI is invisible. You don't even notice it's there.

Sadly people seem to have taken this to mean "good UI is literally not visible and you have to try every obscure swipe and tap combo in the hope of discovering new commands."


Some say the journey is as important as the destination. In which case, I prefer to use products that I can love.


Exactly. A hammer is just a tool so I can put a framed photo of my family on the wall.


Absolutely this. And UX is increasingly about creating dark patterns that manipulate users for profit.

So my prediction is that because of financial feedback loops Dark UX will become more and more of a problem - which means the digital world will become more and more hostile, frustrating, intrusive, and abrasive, and less and less rewarding for users.


My own predictions for Hot Web Trends of 2022:

Websites will be closer to content streaming apps rather than today's downloadable text/image/media pages. You'll view text media within a container that draws the content on the server side - ensuring that the user can't copy-paste it as text. Advertising will be baked in to the displayed content with no clear way for the client to distinguish it.

UX will aim to be as unobtrusive as possible. Users will be provided few direct controls, if any, and instead apps will infer what is best at keeping the user watching.

Video and animation will be more prevalent and seamlessly integrated with text content. Boxes of mixed media side-by-side will be considered messy and confusing, instead the app will decide what is most relevant to display, and when. Scrollbars will be a relic of the past.

Search boxes will be considered poor design: who enjoys wasting time typing? Server selected content will be continuously streamed until the user shows signs of wanting to leave, at which point clickbait will appear.

Voice will be pushed as the preferred method for interacting with devices. This will be done to move the user further away from the mouse/remote/controller or any other input that can exit/turn off the app or device. Voice commands will be advertised as active, but a ToS change will allow apps to eavesdrop on the mood of the audience. Advertisers will be especially interested in this data. Commercials may be skipped by saying out loud, "skip [brand name]".


Dear god do I hope you are wrong.

Though as an aside, in this content-recycling ecosystem, I can't see sites that hijack copy-pasting being particularly successful. Rebloggers would just stick to sources they can easily bake into their own content streams, so you'd need to consistently be serving exclusive content, or content much earlier than it 'hits the grapevine'.


drink verification can to continue


how is this different from last gen television? (apart from the "voice commands")


Design Lab seems like they have an excellent program, based on reviews and seeing some course material.

I think they're bullish on "new markets" for UX design (understandably). It seems like UX is the last thing companies prioritize and the first they boot. And it's not really a mystery why - most projects still fail to deliver basic functionality, let alone provide a thoughtful user experience [0].

[0] http://www.cio.com.au/article/533532/why_it_projects_really_...


Seems to me like a pretty severe underestimation of augmented reality. Even the most basic feature of simulating multiple, large monitors in one's visual field is enough to push it into the mainstream– at least among creative professionals like developers, engineers, and designers. The current technologies seem to be well within five years of doing that well, and we still haven't seen anything from Magic Leap or Apple. Based on the opinions expressed, I'd hazard to guess the author of this article hasn't even tried a hololens...


I am a UX designer.

UX is becoming more differentiated; more distinct UX roles and positions will emerge:

UX Researcher - is already there

IA Architect - will be more common

UX Analyst - collects and interprets data from Keboola-like tools (or "Hadoop-based" tools collecting data about the user base), providing additional data to the UX researcher

"Prototyper" - coder responsible for technical clarity of coded prototypes and their preparedness to be inherited into production

UX designer - responsible for synthesizing data from research and applying them into the product/service in proper context; preferably a people-person, as he/she connects all other sides

...


That's just the business around UX, not UX itself.


It's a breakdown of roles, as UX is no longer a one-person business (as you say).


I think this merits the question, what is UX Design? Seriously.


Translating technical processes into something people can use effectively... and hopefully enthusiastically.


Snapchat's Spectacles [1] offers an amazing example of user experience design as Norman describes it:

“Today that term [user experience] has been horribly misused. It’s been used by people who say, ‘I’m a user experience designer, I design websites,’ or, ‘I design apps,’ and they have no clue what they’re doing, and they think the ‘experience’ is that simple device – the website or the app or who knows what – no, it’s everything – it’s the way you experience the world, it’s the way you experience your life, it’s the way you experience a service, or, yeah, an app, or a computer system, but it’s a system that’s everything. Got it?”

1. https://www.spectacles.com/


"A full picture of the Mac’s UX design includes advertising, store layout, the purchase process, the box, the documentation, how it feels to hold, the esteem and social meaning of owning it, and so on."

At some point does it become so all encompassing to be meaningless as a term?


We might need a new all-encompassing term, but UX is definitely theater.

The stuff on the screen is just one of many props used in the “performance”.

Apple seems to be the only company that gets this.


I call it end-to-end UX and while a lot of UX designers focus only on the UI, no, I do not think it's a meaningless term. There's still a lot of confusion over what UX really means and a lot of designers don't design for the end-to-end experience.


Do a search on "service design", and you'll find plenty of information on designing at that level of scope, all-encompassing as it may seem.

"Service design is the activity of planning and organizing people, infrastructure, communication and material components of a service in order to improve its quality and the interaction between the service provider and its customers." (From Wikipedia)

It's less common that I see UX defined in such broad terms, however.


Blowing up project budgets trying to get markup to display like random native app controls. Same place it was five years ago, and ten years before that.


This is why I went from someone who enjoyed web development in its early days to working on native UIs every time I get the opportunity.

Nowadays I see Web dev as plain work, for fun I code only native.


UX design will start shifting toward taking human emotions into account. Our metrics will also start shifting toward reason-/need-oriented data.


Dark design will become much darker once we start designing for emotions with more intention. Emotionally manipulative UX will spread.

It will be combated by UX designed to implicitly teach emotional responsibility, regulation, and resiliency.


I don't think the latter is the job of UX. I think it may be the job of AI.

I'm imagining that a few decades from now everyone will have a personal AI "guardian angel"/personal assistant, which filters and defangs online bullshit, distills the most useful/effective information using its own context aware initiative, and presents it in a form that's customised to be ideal for individuals, given a lifetime of knowledge of each user combined with broader deep insight modelling to maintain an evolving psych profile.

Of course this could go horribly wrong. But it's an interesting idea which could be the basis for a next-generation AI OS - something that isn't just about maintaining files, running processes on threads, and managing a GUI, but which is psychologically sophisticated and maybe even appears more mature and informed than its owner.

Humans are more similar than different, and so are human problems. In the same way that online text has replaced dead media text in a literal way, AI could replace the content and insight of both dead media text and social learning with dynamic evolving summary models.


Your proposed solution externalizes all of the things I said. Instead of teaching humans how to human, it sounds like you're proposing to pawn off that role.

That's why it's a UX problem: to make us more human, not less.

You know the phrase "First learn to do it right, then learn to do it well"? Maybe let's hold off on externally augmenting humans until we've learned to human properly.


Cool! A personal Navi (Zelda)


Not so sure about the teach-responsibility part; how do you explain casinos?


Casinos = dark design. Of course they wouldn't teach emotional responsibility.


User interface design will always be about the experience of the person using the software. How well can you do the things you're trying to do with it? Trends like AI and voice interfaces are less important than the fundamentals -- clarity, consistency, efficiency, and so forth -- which most software still doesn't get right.


By the way - this article gets almost everything about UX wrong. But it's worth it to get the HN community engaged on UX.


UX Design will be significantly more important. Bad UX will equal NO Customers. UX has to be more, it has to not just be the digital medium but encapsulate everything the user or customer needs. HUGE growth ahead for UX Design!


It will be more web based. Again.


It will be more in the background, and not so much in your face as it is today.

Manually inputting information is not the way forward; automatically adding information based on context is.

In five years UX will be less UX.


04/April/2022


This I can get behind! Is there a name for this format?


As nice as it is, it's language-dependent.


Does anyone have any good resources for UX that they would recommend? Books, courses, certifications, or otherwise?


Double Hamburger with Bacon and Cheese menu..


I wonder what devices will come in the next 5 years that require significantly new developments in UX.


If the MTA metrocard machines are any indication, it may be a sorry state indeed!


This is not really about what UX will be in 5 years, it is how it is right now!


I've had the (still fairly unique) opportunity of actually having to make very different considerations with regards to user experience. This came in the form of designing for HoloLens. I bring this up because I believe in 5 years we'll be much closer to realizing the potential of VR and MR (mixed reality, like HoloLens.)

I do think VR devices will live a short life and die off. VR (in the form of a single purpose device) doesn't really have a place once a device can provide both augmentation and a totally virtual experience. I don't think we're far away from that.

Certainly what we'll see is an uptake in MR devices. I suspect future iterations of HoloLens and other MR devices will bring forth a desire to experience and incorporate MR. Some businesses are already jumping on the opportunity, but in my opinion the technology isn't quite there. In the case of HoloLens, it's clear it still lacks a real understanding of intuitive input.

I think one of the big mistakes people make when considering new UX patterns in 3D space is that not everything is designed for 3D space. And conversely, not everything designed for 2D interfaces works in 3D. You can't just move your app to VR. Certainly you'd think it would be rather strange if a restaurant provided its menu as a stack of blocks. Some interfaces are indeed better suited for 2D, so for that reason I believe there is a place for some 2D interfaces in a mixed reality future.

The other big mistake I've found people make is the idea that in order for something to have great UX, it should mimic real life. Perhaps this is true in some cases, and if you're building a product that's designed to mimic real life then that's probably the best choice. But new experiences will undoubtedly emerge (more often than not, I suspect), experiences which are foreign in concept to us now because they're simply not possible in our physical world, and that will be a real test for the UX experts out there.

One thing we will need to focus on is understanding what it means for an interaction to be discoverable. A lot of people seem to think voice is intuitive, but I don't think that could be further from the truth unless you've got a general intelligence to talk to. No one thinks automated answering systems are user friendly. Even with general intelligence it can be hard to put your intention into words when it comes time to "take an action."

Personally I think the optimal solution for these types of interfaces will be a mixture of context awareness and neural impulses. If I can look at a TV, and the device can see what I'm looking at (on board camera also sees a TV) then it has an understanding of what actions I might be interested in performing. At that point it can show options in 2D above the TV or however you want to lay it out. I'd then be able to look at the option I want (device tracks pupils and knows with accuracy what I'm looking at) and think about touching it. This impulse acts as an "invoke" action on the current thing I'm focused on.

If this stuff becomes possible, then that's about as low-friction as I can think of without interfacing with the brain. Will we be there in 5 years? Hard to say. I'd be willing to bet we might have something that gets us partly or mostly there, and it may have to be tethered to a secondary device like a phone for additional processing.


For the UX of HoloLens/AR, I really hope - as a step before neural links - they combine it with something like a keyboard glove (there are several such devices in development). Something that lets you make an input while barely moving, as is the case today with the mouse and/or keyboard. This just might lead to the experience you mentioned, where the AR device recognizes objects you can interact with and the interaction takes minimal effort.

Of course, you can try to interact while in VR or AR by throwing your arms around and pointing in the air. This works, if you'd like to be immersed in a game. But for everyday tasks, that is not the subtle interaction as provided by current haptic interfaces like a mouse, keyboard or touchscreen.


NLP will take over. "Alexa make the page 'pop' more".

Nevermind.


Has there been anything new in UX since the 1970s?


Probably the same as with programming, so no, haha.

But, we can always use all the research that went into desktop machines and sell it as new things for mobile and web.

This works better than I initially thought, lol.


Swipes, pinches, shakes, 3D effects.

One phone UX thing I hate, for example: the phone makes random noises at night. Now try to pinpoint the app, or the setting, or the setting within an app, that causes it.


Swiping, and pinching was pretty game changing. Shaking and 3D effects are just useless novelties.


Proliferation of touchscreen interfaces


Pervasive bitmapped displays.


Color.


It seems bizarre to use Google Glass as justification that AR won't end up being a thing. Check out Magic Leap.


I'm a UX Consultant working at a full-service agency in Europe. The most valuable (and interesting) part of my job has more to do with strategy and the design process than it does with UI.

In my view, my job is first and foremost to understand the "problem" that a client communicates. The stated problem may or may not correspond to the actual problem that needs to be solved. This is especially true of semi-privatized, previously national monopolies working in infrastructure (like SNCF or EDF in France, Deutsche Bahn in Germany...) or service industries, but it is also true for smaller mid-size businesses.

The stated problem might be something vague like "more digitalization" or "being more competitive with newer, smaller actors" (e.g. SNCF/DB vs. Trainline/Uber), or something specific, like "redesign our website so it's more modern" or "design a new iOS/Android app".

The real problem might be: aligning internal departments on a vision (seriously, getting the heads of Marketing and Engineering on the same page is VERY hard at that scale); making sure that good ideas at the company don't get diluted and killed going through middle management; promoting innovation within a company (which then is translated to a website, an application or even a change in strategy, management, structure).

Sometimes the real problem is that the company just hasn't spent enough time (or money) to understand their users'/clients' experience of their product (in the context of their normal lives). It could be something as simple as clients not really knowing how to use the product properly, being frustrated because they had to wait too long, not knowing which option to pick, or pressing "like" just to find a post later. It could be that clients absolutely love a small detail that wasn't meant to be very important. You learn a lot just by talking to people and asking questions.

So, depending on the type of project my job as UX Consultant is to:

— coordinate ethnographic interviews (mini-ethnographic maybe; the idea is to learn from real, in depth conversations) with either clients, potential clients or internal management

— organize ideation/innovation workshops (two goals: come up with new ideas AND get different departments and levels of management aligned on the same vision)

— map out customer journeys and create personae (with the client) so that we're able to empathise with real users (not always relevant; there's a risk of doing meaningless work if it's done just because it was sold, but when used properly it can be really useful)

— translate these insights/findings into strategic recommendations or digital objects (apps, websites)

— design wireframes to concretize ideas and encourage discussion/debate

— work on high-fidelity interactive prototypes to pass on to developers (who might be working client-side)

— design and organize usability tests (and A/B tests) when relevant.

So that's how I see it.

Of course, there's also that part of UX which is to create addiction, increase "engagement" or encourage certain behavior (increase newsletter signup). That's not really my thing.


The answer to most questions beginning with "where will (insert job) be...?" will invariably be "done by a neural net", as is partially covered in this article.


Disagree about VR. GPU power will finally reach the point where 3D is commonplace. 3D UI and 3D animations will be the biggest new trend.


I still haven't found a game which properly does a 3D UI, let alone a piece of productivity software. Even games which attempt 3D ultimately end up with 2.5D UIs.



