
Where will UX Design be in 5 years? - andrewwilshere
http://trydesignlab.com/blog/where-ux-design-5-years-predictions/
======
hacker_9
To be honest, UI dev tools just plain suck, making everything move more slowly
than it needs to. If I'm on the web it's the HTML + CSS clusterfuck; if I'm in
C# I'm writing color converters for WPF; if it's Java I have to use the
outdated-looking Swing; if it's C++ it's Qt, which I need to dole out cash
for. Additionally, every smartphone OS has its own groundbreaking GUI layout
concepts and ideas too.

Then you get the layers on top, Xamarin or Eto.Forms or X, that convert down
to the native widgets or just create their own entirely. These usually come
with a healthy dose of missing functionality, such as mouse drag detection.

And then there are the languages with no UI libraries at all. As if this is
even acceptable in 2017.

All in all, whoever decided GUI frameworks were a good idea got it very very
wrong indeed. This stuff should have been standardized long ago, stored in
some universal data format, and then every programming language could add
support for it. They would _want_ to add support for it. Closest to this is
HTML of course... but, well, I think everyone would hope for something more
elegant.
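To make that "universal data format" idea concrete, here is a minimal sketch; Python stands in for "any language", and the format, field names, and widget types are all invented for illustration:

```python
import json

# A hypothetical language-neutral UI description: plain data, no behavior.
ui = {
    "type": "window",
    "title": "Login",
    "children": [
        {"type": "label", "text": "Username"},
        {"type": "textbox", "id": "username"},
        {"type": "button", "text": "OK", "on_click": "submit"},
    ],
}

def render(node, indent=0):
    """Walk the description and emit a textual 'native' widget tree.

    A real binding would instantiate WPF/Qt/Swing widgets here instead
    of printing; the point is that each toolkit only needs this one walker.
    """
    pad = "  " * indent
    label = node.get("text") or node.get("title") or ""
    lines = [f"{pad}{node['type']}({label!r})"]
    for child in node.get("children", []):
        lines.extend(render(child, indent + 1))
    return lines

# Because it is plain data, any language that can parse JSON can consume it.
print(json.dumps(ui)[:40], "...")
print("\n".join(render(ui)))
```

The appeal is that the description carries no code, so supporting it in a new language is a parser plus a mapping to native widgets, not a whole rendering engine.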

~~~
pavlov
Sun NeWS should have been that GUI standard, thirty years ago:

[https://en.wikipedia.org/wiki/NeWS](https://en.wikipedia.org/wiki/NeWS)

There was a simplicity and purity in its approach to front-end interactivity
and rendering that completely eludes the HTML+CSS+JS stack we're now stuck
with. As the Wikipedia article states...

 _"NeWS was architecturally similar to what is now called AJAX, except that
NeWS coherently: used PostScript code instead of JavaScript for programming;
used PostScript graphics instead of DHTML and CSS for rendering; used
PostScript data instead of XML and JSON for data representation."_

PostScript is basically a graphics-oriented Forth, so it's kind of a weird
language to write directly... But it would have been great as a target for all
sorts of interesting compilers and GUI tools.
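As a toy illustration of PostScript as a compile target, here is a hypothetical "GUI tool" that lowers a button spec into PostScript. The operators emitted (findfont, moveto, lineto, stroke, show) are real PostScript; the button abstraction and function name are invented for this sketch:

```python
def compile_button(x, y, w, h, label):
    """Compile a hypothetical button spec into plain PostScript text.

    Emits a stroked rectangle plus a label, using only standard
    PostScript operators.
    """
    return "\n".join([
        "/Helvetica findfont 12 scalefont setfont",  # select a font for 'show'
        "newpath",
        f"{x} {y} moveto",              # trace the button outline...
        f"{x + w} {y} lineto",
        f"{x + w} {y + h} lineto",
        f"{x} {y + h} lineto",
        "closepath stroke",             # ...and draw it
        f"{x + 8} {y + 8} moveto ({label}) show",  # draw the label inside
    ])

print(compile_button(100, 100, 80, 24, "OK"))
```

A higher-level GUI builder could emit this kind of code without the designer ever touching the stack language directly, which is the "target for compilers" role described above.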

~~~
TuringTest
If you have compilers and GUI tools, what does it matter that the underlying
layer is HTML+CSS+JSON? The designer shouldn't see it anyway.

Now that we have a web-aware platform (which NeWS was not), nothing stops us
from creating good UI tools on top - yet it is happening very slowly.

~~~
pjmlp
You mean by piling up <div>, <span>, CSS and JavaScript?

~~~
TuringTest
Yes, why not? That's no different than piling objects in a GUI widget library
to build a tree structure for the layout of a window; except that it has a
human-readable string representation.
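The point is easy to see if you treat both as one tree with two serializations; a minimal sketch (class and node names invented, HTML void-element rules glossed over):

```python
class Node:
    """A generic UI tree node: reads equally as 'widget' or 'DOM element'."""
    def __init__(self, tag, text="", *children):
        self.tag, self.text, self.children = tag, text, list(children)

    def to_html(self):
        """The human-readable string representation (the DOM's advantage).

        Note: real HTML void elements like <input> have no closing tag;
        this sketch closes everything for simplicity.
        """
        inner = self.text + "".join(c.to_html() for c in self.children)
        return f"<{self.tag}>{inner}</{self.tag}>"

# 'Piling up objects' as you would in a widget library...
window = Node("div", "",
              Node("span", "Name:"),
              Node("input"))

# ...is the same structure you'd write as markup:
print(window.to_html())
```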

~~~
pjmlp
Sure there is: with a GUI toolkit I can always control at some level how those
pixels are drawn, while in a browser I need to hope it does the right thing.

There is also a big performance impact: piling up divs with CSS requires the
right incantations, some of them browser-version specific, for hardware
acceleration to kick in.

And in any case, it is impossible to fully replicate the UX of the host
platform.

~~~
TuringTest
> I can always control at some level how those pixels are drawn

How well does that work when your GUI must be shown on arbitrary devices, with
arbitrary screen sizes and resolutions? And how well does it reflow when the
user changes the viewport size?

In any case, the original post to which I replied was comparing HTML5 to the
NeWS platform, which provided a unified UI stack itself, so you wouldn't be
able to match the native behavior following that approach either.

~~~
pjmlp
That is what layout managers and logical pixels are for.

You are missing the fact that NeWS offered this kind of control.

"The NeWS Book" \-
[https://archive.org/details/bitsavers_sunNeWSThe_11666863](https://archive.org/details/bitsavers_sunNeWSThe_11666863)

------
Animats
Great topic, but the article doesn't answer its own question.

Predictions:

\- Microsoft will move the start menu to a different corner of the screen
again.

\- Some phones will have a dedicated hardware "dismiss popup" button.

\- Neon colors with contrasting fringes will be used for visibility on AR
displays.

\- Alexa will start initiating conversations.

\- Hyperreality.[1]

[1] [https://vimeo.com/166807261](https://vimeo.com/166807261)

~~~
zzzcpan
"Microsoft will move the start menu to a different corner of the screen
again."

Aren't they transitioning to a new business model that doesn't incentivize
marketing-driven design anymore?

~~~
quickben
Where did you read that? All I'm seeing is more and more ads in win 10 as time
goes on.

~~~
fludlight
Wait, wait, I've been in an OSX/AdBlock/uBlock bubble since Windows 7, are you
saying that the latest mass consumer version of Windows has advertisements
built in like network television?

~~~
flukus
Yes. The login screen has "click to buy this desktop wallpaper" ads and
occasionally paid ad placements (I believe there was a Tomb Raider one). And
recently they've been putting ads for their cloud drive (OneDrive??) in
Explorer (even in the Pro edition). It was one of the final straws that pushed
me back to Linux.

~~~
quickben
The crazy thing is that it apparently does this per account.

I ran a few tools under my admin account and stopped seeing most of the crap.

I saw my wife clicking the start menu under her account on the same PC, and
sure enough, ads there.

I think I'll just move Win10 under a VM to preserve some apps and run Linux.

------
LeoPanthera
If I could express a desire rather than a prediction, it's that
"undiscoverable" UIs die. I hate having to swipe and tap and generally mess
with the screen in the hope of triggering some secret feature. I don't know
when it became trendy to get rid of UI chrome but I really wish it would come
back.

(I wouldn't mind if the hamburger menu died either.)

~~~
wlesieutre
While we're at it, can we kill the fixed headers that waste 10-20% of my
screen space on every page?

I already know what website I'm on, thanks.

~~~
sogen
Add to that the "EU cookies" notice.

~~~
LeoPanthera
uBlock Origin has an optional blacklist you can turn on to kill those. Look
in the settings. It's called "EU: Prebake - Filter Obtrusive Cookie Notices"

~~~
sogen
Thank you! Totally forgot to see if there was a setting for that.

------
capkutay
I see a couple of challenges with the future of UX design. One is that
everyone has an opinion as to how something should work. Unless someone has
years of widely recognized, valid experience/expertise in design, who's to say
whether one person's instincts are better than another's?

The other issue I see is process taking over creativity. Soundcloud's iconic
design was driven largely by the intuition of the designers and developers.
What if they had let 'user feedback' and 'UX research' drive the design
instead? Would they have created the same interface? Maybe someone can easily
knock my points down, but at least that's how I see it. To be clear, I think
UX research is a great tool for improving features, but in general I have a
hard time trusting it when it comes to conceiving a new feature.

~~~
kagamine
I think you don't understand the design process. Not being bitchy, so let me
explain.

Instinct should be 1% of design, if anything. Design isn't about colors and
prettiness; it's a detailed process that includes data structures, mapping
processes, and analysis of users' behavior, and it removes instinct and
subjectivity wherever possible, keeping guesswork to a minimum. It always
amazes me, for example, that awstats comes as standard on 99% of shared
hosts, yet designers and web developers alike ignore all the data sitting
there available for analysis; that's just one example of how designers aren't
using the tools available to them (and you have to know what to do with the
awstats data too).
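For concreteness, the kind of data awstats summarizes is just the server's access log, which is trivial to mine yourself; a sketch with invented log lines standing in for the real thing:

```python
from collections import Counter

# Invented access-log lines (simplified common log format), standing in
# for the raw data awstats summarizes on a typical shared host.
log = [
    '1.2.3.4 - - [01/Mar/2017] "GET /pricing HTTP/1.1" 200 512',
    '1.2.3.4 - - [01/Mar/2017] "GET /signup HTTP/1.1" 200 734',
    '5.6.7.8 - - [01/Mar/2017] "GET /pricing HTTP/1.1" 200 512',
    '5.6.7.8 - - [01/Mar/2017] "GET /docs/api HTTP/1.1" 404 90',
]

def page_views(lines):
    """Count requests per path: the kind of behavioral evidence a
    designer could consult instead of guessing what users do."""
    # The request line sits between the first pair of double quotes;
    # its second token is the path.
    return Counter(line.split('"')[1].split()[1] for line in lines)

print(page_views(log).most_common(2))
```

Even this crude count answers design questions ("do people reach /signup from /pricing at all?") that instinct alone cannot.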

My prediction for the future of UX is that the army of self-taught pseudo-UX
'experts' is going to have to up their game, get some real training and stop
relying on intuition if the industry is to continue to grow. Would anyone on
HN hire a full-time software tester/QA who didn't have some training in that
field? Probably, but you'd be better off with a QA who is certified.

~~~
Razengan
> it's about a detailed process that includes data structures, mapping
> processes, analysis of users' behavior and removing all instinct and
> subjectivity from the process of design.

Ugh no. _Please,_ no.

Don't do this. I'm saying – pleading – this as a user.

Apple didn't do it (though maybe now they do, and it's starting to show in
some of their products as a sort of creeping clinical sterility). Steve Jobs
didn't do it [0]. And yet the UX they set has been admired, imitated and
aspired to for over 30 years.

Plenty of other examples can be seen in the UI of Japanese games versus
Western games.

I'm sorry, but relying on "data structures, mapping processes, analysis of
users' behavior" means you suck at UX design – like reading books on how to be
socially adept – and the best you're going to achieve is a functional but
sterile, neutral, _gray_ block of clinical equipment, devoid of personality
and soul and color and warmth. If you sneered at these words, then I for sure
wouldn't want to be stuck using your products.

In any case, if data structures, mapping processes, and analysis of user
behavior are your primary skills then you'll be replaced by AI within a decade
anyway. :)

[0] [http://www.newyorker.com/news/news-desk/steve-jobs-technolog...](http://www.newyorker.com/news/news-desk/steve-jobs-technology-alone-is-not-enough)

~~~
kagamine
I think you are mixing up visual design, graphic design, with UX and
interaction design.

~~~
TuringTest
No, he's right, in part. UX & interaction design shouldn't be just about
measuring clickthrough and abandonment rates, or you get interfaces like
Google's products, which use that design philosophy.[1]

UX design must be _informed_ by data and field research (which depends on
those "data structures, mapping processes, analysis of users' behavior"), but
it still needs an opinionated designer who empathizes with the user pains and
problems, and creates an interface that solves the needs as well as provides
the adequate usage feeling (which, even being non-functional, is an important
part of making a design usable) [2].

[1] [https://medium.com/the-design-innovator/iteration-is-not-des...](https://medium.com/the-design-innovator/iteration-is-not-design-668695445f76#.ogdyuar44)

[2]
[https://en.wikipedia.org/wiki/Emotional_Design#Content](https://en.wikipedia.org/wiki/Emotional_Design#Content)

------
k__
The problem I see with UX today is that there is a split.

On the one hand are the people who just want to make good products people
love to use.

On the other hand are the corps who want to use it as a tool to direct people
into spending more money for nothing gained.

~~~
flukus
> On the one hand are the people who just want to make good products people
> love to use.

I don't want to "love to use" any product, the product is just a means to an
end and just needs to let me do what I need to do and stay out of my way
otherwise.

~~~
taneq
Good UI is invisible. You don't even notice it's there.

Sadly people seem to have taken this to mean "good UI is literally not visible
and you have to try every obscure swipe and tap combo in the hope of
discovering new commands."

------
syphilis2
My own predictions for Hot Web Trends of 2022:

Websites will be closer to content-streaming apps than to today's
downloadable text/image/media pages. You'll view text media within a container
that draws the content on the server side - ensuring that the user can't
copy-paste it as text. Advertising will be baked into the displayed content
with no clear way for the client to distinguish it.

UX will aim to be as unobtrusive as possible. Users will be provided few
direct controls, if any, and instead apps will infer what is best at keeping
the user watching.

Video and animation will be more prevalent and seamlessly integrated with text
content. Boxes of mixed media side-by-side will be considered messy and
confusing; instead, the app will decide what is most relevant to display, and
when. Scrollbars will be a relic of the past.

Search boxes will be considered poor design: who enjoys wasting time typing?
Server selected content will be continuously streamed until the user shows
signs of wanting to leave, at which point clickbait will appear.

Voice will be pushed as the preferred method for interacting with devices.
This will be done to move the user further away from the
mouse/remote/controller or any other input that can exit/turn off the app or
device. Voice commands will be advertised as active, but a ToS change will
allow apps to eavesdrop on the mood of the audience. Advertisers will be
especially interested in this data. Commercials may be skipped by saying out
loud, "skip [brand name]".

~~~
_ZeD_
how is this different from last gen television? (apart from the "voice
commands")

------
mshenfield
Design Lab seems to have an excellent program, based on reviews and some of
the course material I've seen.

I think they're bullish on "new markets" for UX design (understandably). It
seems like UX is the last thing companies prioritize and the first they boot.
And it's not really a mystery why - most projects still fail to deliver basic
functionality, let alone provide a thoughtful user experience [0].

[0]
[http://www.cio.com.au/article/533532/why_it_projects_really_...](http://www.cio.com.au/article/533532/why_it_projects_really_fail/)

------
laser
Seems to me like a pretty severe underestimation of augmented reality. Even
the most basic feature of simulating multiple large monitors in one's visual
field is enough to push it into the mainstream – at least among creative
professionals like developers, engineers, and designers. The current
technologies seem to be well within five years of doing that well, and we
still haven't seen anything from Magic Leap or Apple. Based on the opinions
expressed, I'd hazard a guess that the author of this article hasn't even
tried a HoloLens...

------
mirap
/I am a UX designer/

UX is becoming more differentiated; more distinct UX roles and positions will
emerge:

UX Researcher - is already there

Information Architect - will be more common

UX Analyst - is collecting and interpreting data from Keboola-like tools (or
"Hadoop-based" tools collecting data about user base), providing additional
data to UX researcher

"Prototyper" \- a coder responsible for the technical clarity of coded
prototypes and their readiness to be carried over into production

UX designer - responsible for synthesizing data from research and applying
them into the product/service in proper context; preferably a people-person,
as he/she connects all other sides

...

~~~
ThomPete
That's just the business around UX, not UX itself.

~~~
mirap
It's a division of roles, as UX is no longer a one-person business (as you say).

------
tlow
I think this raises the question: what is UX design? Seriously.

~~~
radley
Translating technical processes into something people can use effectively...
and hopefully enthusiastically.

------
snailletters
Snapchat's Spectacles [1] offers an amazing example of user experience design
as Norman describes it:

“Today that term [user experience] has been horribly misused. It’s been used
by people who say, ‘I’m a user experience designer, I design websites,’ or, ‘I
design apps,’ and they have no clue what they’re doing, and they think the
‘experience’ is that simple device – the website or the app or who knows what
– no, it’s everything – it’s the way you experience the world, it’s the way
you experience your life, it’s the way you experience a service, or, yeah, an
app, or a computer system, but it’s a system that’s everything. Got it?”

1\. [https://www.spectacles.com/](https://www.spectacles.com/)

------
rileymat2
"A full picture of the Mac’s UX design includes advertising, store layout, the
purchase process, the box, the documentation, how it feels to hold, the esteem
and social meaning of owning it, and so on."

At some point does it become so all encompassing to be meaningless as a term?

~~~
noblethrasher
We might need a new all-encompassing term, but UX is definitely theater.

The stuff on the screen is just one of many props used in the “performance”.

Apple seems to be the only company that gets this.

------
knieveltech
Blowing up project budgets trying to get markup to display like random native
app controls. Same place it was five years ago, and ten years before that.

~~~
pjmlp
This is why I went from someone who enjoyed web development in its early days
to someone who works on native UIs every time I get the opportunity.

Nowadays I see web dev as plain work; for fun I code only native.

------
crawfordcomeaux
UX design will start shifting toward taking human emotions into account. Our
metrics will also start shifting toward reason-/need-oriented data.

~~~
crawfordcomeaux
Dark design will become much darker once we start designing for emotions with
more intention. Emotionally manipulative UX will spread.

It will be combated by UX designed to implicitly teach emotional
responsibility, regulation, and resiliency.

~~~
TheOtherHobbes
I don't think the latter is the job of UX. I think it may be the job of AI.

I'm imagining that a few decades from now everyone will have a personal AI
"guardian angel"/personal assistant, which filters and defangs online
bullshit, distills the most useful/effective information using its own context
aware initiative, and presents it in a form that's customised to be ideal for
individuals, given a lifetime of knowledge of each user combined with broader
deep insight modelling to maintain an evolving psych profile.

Of course this could go horribly wrong. But it's an interesting idea which
could be the basis for a Next Generation AI OS - something that isn't just
about maintaining files, running processes on threads, and managing a GUI,
but which is psychologically sophisticated and maybe even appears more mature
and informed than its owner.

Humans are more similar than different, and so are human problems. In the same
way that online text has replaced dead media text in a literal way, AI could
replace the _content and insight_ of both dead media text and social learning
with dynamic evolving summary models.

~~~
crawfordcomeaux
Your proposed solution externalizes all of the things I said. Instead of
teaching humans how to human, it sounds like you're proposing to pawn off that
role.

That's why it's a UX problem: to make us more human, not less.

You know the phrase "First learn to do it right, then learn to do it well"?
Maybe let's hold off on externally augmenting humans until we've learned to
human properly.

------
panic
User interface design will always be about the experience of the person using
the software. How well can you do the things you're trying to do with it?
Trends like AI and voice interfaces are less important than the fundamentals
-- clarity, consistency, efficiency, and so forth -- which most software still
doesn't get right.

------
radley
By the way - this article gets almost everything about UX wrong. But it's
worth it to get the HN community engaged on UX.

------
UXArchitect929
UX Design will be significantly more important. Bad UX will equal NO
Customers. UX has to be more, it has to not just be the digital medium but
encapsulate everything the user or customer needs. HUGE growth ahead for UX
Design!

------
EGreg
It will be more web based. Again.

------
ThomPete
It will be more in the background and not so much in your face as it is today.

Manually inputting information is not the way forward; automatically adding
information based on context is.

In five years UX will be less UX.

------
bruno2223
04/April/2022

~~~
orik
This I can get behind! Is there a name for this format?

~~~
alphapapa
As nice as it is, it's language-dependent.
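The language dependence is easy to demonstrate: spelling the month out ties the string to a locale, unlike an ISO 8601 date. A small Python illustration (the spelled-out output assumes an English/C locale):

```python
from datetime import date

d = date(2022, 4, 4)

# %B spells the month name out, so the result changes with the active
# locale ("April", "avril", "April"...):
print(d.strftime("%d/%B/%Y"))   # "04/April/2022" under an English locale

# An ISO 8601 date, by contrast, is locale-independent and sorts naturally:
print(d.isoformat())
```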

------
Justen
Does anyone have any good resources for UX that they would recommend? Books,
courses, certifications, or otherwise?

------
yunocat
Double Hamburger with Bacon and Cheese menu..

------
ng-user
I wonder what devices will come in the next 5 years that require significantly
new developments in UX.

------
hullsean
If the MTA metrocard machines are any indication, it may be a sorry state
indeed!

------
d--b
This is not really about what UX will be in 5 years - it's what UX is right
now!

------
iLoch
I've had the (still fairly unique) opportunity of actually having to make very
different considerations with regards to user experience. This came in the
form of designing for HoloLens. I bring this up because I believe in 5 years
we'll be much closer to realizing the potential of VR and MR (mixed reality,
like HoloLens.)

I do think VR devices will live a short life and die off. VR (in the form of a
single purpose device) doesn't really have a place once a device can provide
both augmentation and a totally virtual experience. I don't think we're far
away from that.

Certainly what we'll see is an uptake in MR devices. I suspect future
iterations of HoloLens and other MR devices will bring forth a desire to
experience and incorporate MR. Some businesses are already jumping on the
opportunity, but in my opinion the technology isn't quite there. In the case
of HoloLens, it's clear it still lacks a real understanding of intuitive
input.

I think one of the big mistakes people make when considering new UX patterns
in 3D space is forgetting that not everything is designed for 3D space. And
conversely, not everything designed for 2D interfaces works in 3D. You can't just move
your app to VR. Certainly you'd think it would be rather strange if a
restaurant provided its menu as a stack of blocks. Some interfaces are indeed
better suited for 2D, so for that reason I believe there is a place for some
2D interfaces in a mixed reality future.

The other big mistake I've found people make is the idea that in order for
something to have great UX, it should mimic real life. Perhaps this is true in
some cases, and if you're building a product that's designed to mimic real
life, then that's probably the best choice. But new experiences will
undoubtedly emerge (more often than not, I suspect), experiences which are
foreign in concept to us now because they're simply not possible in our
physical world, and that will be a real test for the UX experts out there.

One thing we will need to focus on is understanding what it means for an
interaction to be discoverable. A lot of people seem to think voice is
intuitive, but nothing could be further from the truth unless you've got a
general intelligence to talk to. No one thinks automated
answering systems are user friendly. Even with general intelligence it can be
hard to put your intention into words when it comes time to "take an action."

Personally I think the optimal solution for these types of interfaces will be
a mixture of context awareness and neural impulses. If I can look at a TV, and
the device can see what I'm looking at (on board camera also sees a TV) then
it has an understanding of what actions I might be interested in performing.
At that point it can show options in 2D above the TV or however you want to
lay it out. I'd then be able to look at the option I want (device tracks
pupils and knows with accuracy what I'm looking at) and think about touching
it. This impulse acts as an "invoke" action on the current thing I'm focused
on.
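That look-at-it-and-think interaction can be sketched as a tiny state machine; everything here (object labels, action lists, the "impulse" event, class and method names) is invented for illustration:

```python
# Hypothetical mapping from camera-recognized objects to contextual actions.
ACTIONS = {
    "tv":   ["power", "volume up", "volume down", "change input"],
    "lamp": ["toggle", "dim"],
}

class GazeUI:
    """Sketch of the context-aware, gaze-plus-impulse loop described above."""

    def __init__(self):
        self.focused = None  # the action the user's gaze currently rests on

    def on_camera_sees(self, obj):
        """Context awareness: surface only the actions relevant to what's
        in the user's view (shown in 2D above the object)."""
        return ACTIONS.get(obj, [])

    def on_gaze(self, action):
        """Pupil tracking tells us which displayed option has focus."""
        self.focused = action

    def on_impulse(self):
        """The 'think about touching it' event invokes the focused action."""
        return f"invoke:{self.focused}" if self.focused else None

ui = GazeUI()
menu = ui.on_camera_sees("tv")  # the user looks at the TV
ui.on_gaze(menu[1])             # gaze rests on "volume up"
print(ui.on_impulse())
```

The hard parts this sketch hides (object recognition, pupil tracking, detecting an "impulse") are of course exactly the open problems.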

If this stuff becomes possible, then that's about as low friction as I can
think of without interfacing with the brain. Will we be there in 5 years? Hard
to say. I'd be willing to bet we might have something that gets us partly or
mostly there, and may have to be tethered to a secondary device like a phone
for additional processing.

~~~
nerthus
For the UX of HoloLens/AR, I really hope - as a step before neural links -
they combine it with something like a keyboard glove (there are several such
devices in development): something that lets you make inputs while barely
moving, as with today's mouse and/or keyboard. This just might lead to the
experience you mentioned, where the AR device recognizes objects you can
interact with and the interaction takes minimal effort.

Of course, you can try to interact while in VR or AR by throwing your arms
around and pointing in the air. This works, if you'd like to be immersed in a
game. But for everyday tasks, that is not the subtle interaction as provided
by current haptic interfaces like a mouse, keyboard or touchscreen.

------
ben_jones
NLP will take over. "Alexa make the page 'pop' more".

Nevermind.

------
avodonosov
Has there been anything new in UX since the 1970s?

~~~
bbcbasic
Swipes, pinches, shakes, 3D effects.

One phone UX thing I hate, for example: the phone makes random noises at
night. Now try to pinpoint the app, the setting, or the setting within an app
that causes it.

~~~
overcast
Swiping, and pinching was pretty game changing. Shaking and 3D effects are
just useless novelties.

------
chenz
It seems bizarre to use Google Glass as justification that AR won't end up
being a thing. Check out Magic Leap.

------
that_ux_guy
I'm a UX Consultant working at a full-service agency in Europe. The most
valuable (and interesting) part of my job has more to do with strategy and the
design _process_ than it does with UI.

In my view, my job is first and foremost to understand the "problem" that a
client communicates. The stated problem may or may not correspond to the
actual problem that needs to be solved. This is especially true of semi-
privatized, previously national monopolies working in infrastructure (like
SNCF or EDF in France, or Deutsche Bahn in Germany) or service industries,
but it is also true for smaller mid-size businesses.

The stated problem might be something vague like "more digitalization" or
"being more competitive with newer, smaller actors" (e.g. SNCF/DB vs.
Trainline/Uber), or something specific, like "redesign our website so it's
more modern" or "design a new iOS/Android app."

The real problem might be: aligning internal departments on a vision
(seriously, getting the heads of Marketing and Engineering on the same page is
VERY hard at that scale); making sure that good ideas at the company don't get
diluted and killed going through middle management; promoting innovation
within a company (which then is translated to a website, an application or
even a change in strategy, management, structure).

Sometimes the real problem is that the company just hasn't spent enough time
(or money) understanding their users'/clients' experience of their product
(in the context of their normal lives). It could be something as simple as
clients not really knowing how to use the product properly, being frustrated
because they had to wait too long, not knowing which option to pick, or
pressing "like" just to find a post again later. It could be that clients
absolutely love a small detail that wasn't meant to be very important. You
learn a lot just by talking to people and asking questions.

So, depending on the type of project, my job as a UX Consultant is to:

— coordinate ethnographic interviews (mini-ethnographic, maybe; the idea is to
learn from real, in-depth conversations) with either clients, potential
clients or internal management

— organize ideation/innovation workshops (two goals: come up with new ideas
AND get different departments and levels of management aligned on the same
vision)

— map out customer journeys and create personae (with the client) so that
we're able to empathise with real users (not always relevant; there's a risk
of doing meaningless work if it's done only because it was sold, but when
used properly it can be really useful)

— translate these insights/findings into strategic recommendations or digital
objects (apps, websites)

— design wireframes to concretize ideas and encourage discussion/debate

— work on high-fidelity interactive prototypes to pass on to developers (who
might be working client-side)

— design and organize usability tests (and A/B tests) when relevant.

So that's how I see it.

Of course, there's also that part of UX which is to create addiction, increase
"engagement" or encourage certain behavior (increase newsletter signup).
That's not really my thing.

------
danm07
The answer to most questions beginning with "where will (insert job) be
... ?" will invariably be "done by a neural net," as is partially covered in
this article.

------
KaoruAoiShiho
Disagree about VR. GPU power will finally reach the point where 3D is
commonplace. 3D UI and 3D animations will be the biggest new trend.

~~~
H4CK3RM4N
I still haven't found a game which properly does a 3D UI, let alone a piece of
productivity software. Even games which attempt 3D ultimately end up with 2.5D
UIs.

