
We Have Reached Peak Screen. Now Revolution Is in the Air - montrose
https://www.nytimes.com/2018/06/27/technology/peak-screen-revolution.html
======
pcprincipal
I've been using Moment ([https://inthemoment.io/](https://inthemoment.io/))
for about a year and am really proud of how I've been able to reduce total
screen time from 4 hours to 45 minutes per day. Most of the gains were due to
deleting Safari and Twitter from my device (Moment breaks down app usage based
on battery usage).

To anyone reading this, be aware of how grabbing your phone in the morning to
read headlines or a Medium digest can turn into 30 minutes or even an hour.
Think about how checking e-mails can take you completely out of context and
badly dilute the quality of your work. Consider how the amazing small
interactions with other people that make life beautiful can be destroyed by
even a glance at the phone.

I really feel that monitoring my phone usage and actively cutting it down has
improved my quality of life. Now that Apple has built tools for this into the
system, I hope more people will treat it as seriously as they treat exercise,
nutrition, etc.

~~~
codethief
Can anyone recommend an app for Android that does the same thing as Moment
and is effective? (It's not that I don't know of any apps offering similar
functionality, but so far none of them has really convinced me / worked for
me.)

~~~
graeme
Android P is baking it into the system soon. That's probably the best you'll
get.

[https://www.theverge.com/2018/6/5/17426922/apple-digital-hea...](https://www.theverge.com/2018/6/5/17426922/apple-digital-health-vs-google-wellbeing-time-well-spent-wwdc-2018)

------
plurgid
I say I'm a musician on the side, but truthfully I've been messing with sound
instinctively since I came out of the womb, and I only picked up programming
and design and stuff later on. As such, I gravitated toward electronic music
doodads almost immediately.

This issue of "falling down the LCD well" has been at the forefront of
electronic music for a while. Synthesizers were cool in the 70's because they
had knobs, and any abstract logic involved was done by you, so you had to be
into it.

Then the 80's and 90's came along and we got stuff like the DX7 and
innumerable "workstation keyboards" that were little more than a tiny display
and two or three buttons. Maybe a jog wheel if you were lucky.

These were unambiguously cleaner from a design perspective, but what people
began to realize is that the screen was too much of an abstraction. Then of
course DAWs came along and even the synthesizers themselves moved into the
computer physically.

Music, like life in general, is visceral, and the screen is not. Musicians
became frustrated with the lack of physicality.

The modular synthesizer approach of the 70's has made a HUGE resurgence with
the eurorack standard.

Not everyone, but LOTS of people actually PREFER a gigantic mess of tangled
wires with physical plugs and knobs to a sterile pure-logic implementation on
the computer that can do all the same stuff cheaper and in a more reproducible
way.

I suspect we'll see a similar sort of resurgence of physicality across every
product that has been absorbed into the computer screen.

~~~
Adamantcheese
I'm somewhat sad that modular synthesizers are so expensive, even semimodular
standalone units. They're really fun pieces of equipment to play with.

~~~
ricardobeat
What’s expensive to you? With 2-3k you can gather an amazing array of
equipment these days, unthinkable 10 years ago.

~~~
Adamantcheese
A basic modular system with all the basic oscillators and filters you'd find
in a standard VST plugin set will cost roughly $5,000, if not more. That's a
bit expensive really, but I suppose that's the cost of analog in a digital
world. It might just be me and my reluctance to spend money, but if the
comeback were as big as it seems, I'd expect equipment prices to be dropping.
Instead they seem relatively stable, and sometimes higher, as new small-scale
manufacturers come out with modules.

~~~
ricardobeat
Five grand? No way. Take a look: [https://www.bax-shop.nl/producten.html?keyword=analog+synth&...](https://www.bax-shop.nl/producten.html?keyword=analog+synth&o=score&p%5Bmax%5D=800)

------
Nasrudith
I know that I am not the typical use case for several reasons, including the
failure of voice recognition with speech impediments, but I doubt voice will
be the way forward any more than smartwatches were - niche at best.

First off, with speech everyone can tell what you are doing, and it is
obtrusive - look at the old DMV signs against cellphones, from back when
phones could literally just make calls.

Second, it is just plain worse as an interface - just try to use a phone
tree. People have dutifully ignored the existence of phone-tree-based
answering systems, except for the visually impaired, who frankly lack options
and must use what everyone else would consider useless.

Third, there is less to do with it and thus less reason to get involved with
the frontier. It's the same trap as the smartwatch: people asked what you can
do with it, and the Apple Watch flopped despite Apple trendiness.

Direct thought reading might work better, but that is in the
easier-said-than-done category. We can't even make an acceptably accurate
non-invasive glucose meter, so it is very unlikely to show up in consumer
goods.

Google Glass, interestingly, also largely flopped for several reasons despite
heading in the opposite direction, and it provoked sheer irrational hatred
above and beyond all other carriable or constantly recording cameras. AR
sounds nice, but it also needs to account for the significant glasses-wearing
population. Glass also apparently had short battery life for something meant
to be worn as a HUD.

Noting where things can go wrong is easy compared to figuring out where to go
in the future, even with caveats like "10 years of directed research lead
time". I think the market for personal electronics has matured for now, until
they can offer "magic" again. VR is neat but a niche stuck in a
chicken-and-egg situation.

~~~
ewzimm
I appreciate your sense of doubt about often overblown predictions for the
usefulness of the next new thing, but I think some of your claims are a bit
too pessimistic. The Apple Watch can hardly be considered a flop. They're now
selling 8 million per quarter and have outsold the entire Swiss watch
industry. In just 3 years, it became the most popular watch in the world.

Now more and more people have a voice-activated assistant not only in certain
rooms of their home or in their pocket, but also on their wrist. It might seem
to be a small thing not to have to reach into your pocket to invoke a voice
assistant, but that small amount of time saved really adds up, especially if
you're doing something with your hands like cooking or holding a baby, which
some people still do!

~~~
Retra
Holiday season comes around and everybody gets Apple Watches. Doesn't mean
they're good. It often means people feel compelled to buy things they don't
want, because gift buying is hard. It happens every year.

------
dgudkov
The article points in the wrong direction. Yes, smartphones are the new
smoking, but not because of their screens. It's not the screens that are
addictive, it's what's _on_ the screens -- the internet. The internet is too
good not to be addictive, especially with the rise of social media, internet
news, and messengers. Switching to other, non-screen-based interaction
channels will only bring all the notifications, social media, and news to
those new channels -- be it voice assistants, watches, or whatever else.

~~~
taneq
Exactly. The screen is just a window into another world where information is
far more accessible and available. And calling it "addictive" isn't really
appropriate (in general, obviously there are a lot of nasty Skinner-box apps
which do deserve that name) when it's a high-quality resource and paying it
more attention makes sense.

~~~
civilitty
I agree that calling it addictive isn't appropriate, but only because the vast
majority of our clinical and social experience with addiction is entirely
based around physical dependency. Just look at the amount of research into
opioids, nicotine, amphetamines, benzos, and alcohol compared to the dozens of
less addictive/harmful psychedelics and research chemicals.

Information and the psychological dependency it creates, whether it be a
message from someone we are attracted to or the stimulating audiovisual
response from a slot machine, used to be tied to slow physical mediums like
snail mail and restricted by location. In that context, the smartphone has
created a whole new era of crazy that we are completely unprepared to deal
with. Coupled with a pace of development so rapid that no one had time to
adapt to the internet before it became ubiquitous, smartphones have provided
a perfect delivery mechanism for psychological dependency optimized on a
massive scale.

~~~
taneq
Consider music. Music is a form of information, it's pleasurable to listen to,
and many people spend significant portions of their day listening to it and
would be strongly unwilling to give it up. You could call this 'psychological
dependency' or you could just accept that music is a good addition to most
people's lives and that their spending time and money on experiencing it is an
entirely reasonable decision.

~~~
civilitty
You're demonstrating my point. Portable cassette/CD/MP3 players have existed
for what, four decades? The vast majority of humans only experienced music
through bards and family/friends up until the last century. This is all a
vastly unexplored topic and that's exactly why calling it an addiction is
inaccurate. Our understanding of pleasure and neurological dependency is still
in its infancy yet our ability to manipulate it at a commercial level through
sound, visuals, and base human instinct has grown exponentially - despite the
fact that the latter has happened largely through free market trial and error.

The phrase _"you could just accept that [music/video games/heroin] is a good
addition to most people's lives and that their spending time and money on
experiencing it is an entirely reasonable decision"_ is right out of the
hardcore addict's self-defense-from-intervention playbook (been there).
Likewise, it's equally ridiculous to use music, a passive art form that has
existed for thousands of years, to argue in good faith about the kind of
technology that we have today. Especially when that technology has spawned
several unique industries ranging from social media to gaming that are
dependent on "whales" spending thousands of dollars a month and other
behaviors distinguishable from clinical addiction only by the lack of violent
withdrawals.

------
yanslookup
The article argues that Apple is quietly releasing tech (Apple Watch,
AirPods, the Screen Time app) whose unstated aim is to eventually phase out
the screen.

With AR and VR, my money is on more screen time in the future not less.

~~~
jpl56
Today, the services we interact with on a screen bring us ads. That is how we
pay for them.

Tomorrow, when new interactions exist, we will also have ads. On the watch,
during Siri conversations, ...

Same as Facebook at its beginning, they will wait until we are used to it
before starting. I'm looking forward to the "voice assistant" version of
uBlock Origin ;)

~~~
jdietrich
_> Tomorrow, when new interactions exist, we will also have ads. On the
watch, during Siri conversations_

Apple's essential point of differentiation from Android is the fact that they
make money on hardware. Google give away their mobile OS to funnel more
attention into their attention monetisation machine. Every step that Apple
takes to protect user privacy deepens their moat, because Google only make
money by harvesting data and monetising attention.

Google can't compete on privacy, so it's very much in Apple's interests to
push that issue as hard as possible. Their decision to block ad trackers by
default in Safari was very smart; deciding to insert ads into Siri or WatchOS
would be indescribably stupid.

~~~
drb91
> deciding to insert ads into Siri or WatchOS would be indescribably stupid.

Most notifications are just ads to use an app. I'd say the ads are already
here.

------
darkpicnic
> When you do give in, you lose your mind.

Does anyone else think this kind of writing is unnecessarily hyperbolic? I'm
so tired of reading articles that resemble the next Michael Bay script. The
core of this article may have value, but I can't even get to it since it's
drenched in distracting click-bait sauce.

This isn't story time, New York Times. Treat me like an adult.

------
rb808
> and about 11 hours a day looking at screens of any kind.

That is so depressing, but thinking about it myself, 11 is probably a minimum
for me. I need to change careers to selling ice cream at the beach or
something.

~~~
vkjv
Despite earning marginally more than minimum wage, being a lifeguard is still
the happiest I've ever been at a job.

~~~
rambleramble
The summer after high school, I worked for the parks department of my city. I
just did general maintenance things, sold tokens for batting cages, emptied
trash, "ensured the safety of the people in the park". I remember being so
thrilled when I got an offer for a "real job" doing software development.

I miss working at the park. And I'm pretty sure it's not just nostalgia. I
liked being outside. I liked working with the people there.

I've been doing pretty well at work lately, but I don't seem to get any
happier with any type of promotion I receive. It's just more of a challenge,
which I like, but it doesn't make me any happier.

I make more money now, and I just keep working hoping that I'll get to a point
where I'm happy with my job, where I find something I like, but it doesn't
really seem to be happening.

---

I made a throwaway for this because after I typed it up, I realized it's just
me lamenting growing up... but I figure it's worth saying anyway.

~~~
bonestamp2
Happiness is not something you find, it's something you make when you
recognize what is missing in your life and you make that thing part of your
life. It sounds like you're halfway there if you've identified what did make
you happy.

If you enjoy your job, you're still well ahead of most people there too, even
if it doesn't make you happy. Use your spare time to do the things you miss
about working in the park, get outside, help people, etc. Maybe you could
start a meetup.com group to give hiking tours of local trails/parks. Talk to a
park ranger and find out if there's anything you can do with your group while
in the park that would help the park.

~~~
sp527
Happiness is exactly the opposite. It's realizing that nothing external or
material can leave you perfectly content for any length of time because that
violates what it is to be human. Happiness is a choice you make and a mindset
you adopt.

------
meathook
Great message, but this is mostly an opinion piece with few technical details.
I am also not a fan of how the answer is to wait for Apple (Big Tech) to save
us.

Bret Victor wrote a piece [0] lamenting the convergence on screens as _the_
interaction design paradigm almost 7 years ago. Bret explains why screens are
limiting interaction design through examples centered around the human body.

Ironically, this NYT piece gives the impression that a human being is a
floating head and fingers, i.e. the very AR/VR avatar it seems to loathe. I
hope the future of computing isn't just the ability to check my calendar
without a screen while walking. I want to use my body in tandem with
computation. I don't have a Killer App for this interaction paradigm, but I
found this paper by Scott Klemmer, Björn Hartmann, and Leila Takayama useful
for thinking about it [1].

[0]
[http://worrydream.com/ABriefRantOnTheFutureOfInteractionDesi...](http://worrydream.com/ABriefRantOnTheFutureOfInteractionDesign/)
[1]
[https://hci.stanford.edu/publications/2006/HowBodiesMatter-D...](https://hci.stanford.edu/publications/2006/HowBodiesMatter-DIS2006.pdf)

~~~
jeffreyrogers
I disagree with Bret for fundamental reasons. I think Bret is trying to make
computing more human, when I view computing as fundamentally _unhuman_. It
seems that there is an inverse correlation between screen use and mental
health. Bret's solution is to improve our technology. My solution would be to
limit our use of technology. Computers are fabulous for certain things:
transmitting information, processing data, automating repetitive things, but
they're awful for many others. I worry that by trying to make our interactions
with computers more human we're just going to become more dependent on them.

------
niftich
Visual signals and interfaces have information density and permanence, which
gives gleanability. You can put lots of things into your visual field (e.g.
multiple application windows, multiple computing devices, multiple inanimate
objects), where they'll stay put without further input. Inherent cognitive
ability even allows us to track multiple objects in motion throughout a scene
and keep them coherent -- which is the skill that enables driving. Visual
interfaces are very well suited to how humans best absorb information, and how
they context switch given a large number of potential tasks.

A world where we begin to move off of visual interfaces will be awkward. While
humans are good at absorbing conversational audio, they mentally filter most
of it out to distill it down to its essential elements, and knowing what's
essential may not even be known ahead of time. We'll direct voice-outputting
interfaces to repeat things often, and they must be smart enough to accurately
determine the context of our inquiry.

Voice output is often paired with voice input, but voice propagates well in
public, leaking information to everything in range. Devices that capture
speech-like input in a private way are not yet widespread. Meanwhile,
structured command input through voice is awkward, and natural language
processing doesn't sound natural yet. It's complex to implement and the
computer frequently encounters a situation it doesn't yet understand, which is
the most discouraging kind of interaction one can have with a computing
platform. Factors like these highlight that audio-based interfaces are rarely
programmed to be discoverable, and even if they were, exchanging that
information over audio is less efficient than doing so visually.

New research into interface design is needed to address many of the
shortcomings of current attempts to de-emphasize screens.

------
rm_-rf_slash
I took a class in rapid prototyping earlier this spring. One question has
resonated with me since: “How does a human appear to the computer?”

There was a grotesque drawing of an eyeball attached to an ear along with a
couple of fingers. It’s not entirely inaccurate: most of our interactions with
computers are with our fingers, eyes, and ears. But now that microcontrollers
like the Arduino and SBCs like the Raspberry Pi are so cheap and accessible,
we can begin to look at different ways to interact with computers, through
sensors instead of touchscreens and keyboards.

In a few decades, we may see a shift in our human-computer interfaces as
lasting and profound as the leap from mainframe terminals to personal
computers.

------
bwang29
The author sees this as a problem whose solution is a new revolution, but
doesn't realize the underlying problem might be that we are constantly
seeking the next revolution, or refusing to accept the horror of people no
longer buying stuff.

------
matt_s
The title suggested to my brain some sort of eyeglass device that would
handle projections from your phone in a non-intrusive way, for quick access
to things like your calendar, the last text received, etc.

No 3D needed - it could just be text, with voice control over what to
display. The eyeglasses would be normal-looking, maybe with a heavier frame
to house the needed electronics. Or maybe the ear loop holds the extra stuff,
without being thick like a hearing aid.

Anyhow, I think what the article intended was that the next revolution will
be in perfecting voice commands for apps. This is obviously for consumers,
not computer geeks who work on computers all day.

------
tonyedgecombe
It’s not screens that are the problem, it’s the internet.

~~~
EGreg
This!

And notifications from irresponsible programs. It's a tragedy of the commons,
where the commons is human attention.

~~~
mnx
Holy shit, this is a good take. Can anyone recommend an article looking at
attention from this perspective?

------
germinalphrase
This seems like a clear next step; however, I have a hard time believing that
the underlying goal of this direction is NOT a pair of AR glasses - another,
potentially more addictive, screen.

I’m personally really excited about that potential. I would love to pivot my
career away from teaching to building AR workflow mediation for teachers. I
would even be pleased to only carry a watch and headphones to fulfill the
majority of my computer-related tasks.

That said, I do fear Hyper-Reality[1] and such a persistent, obligatory
mediation of our lived experience.

~~~
shawn
[1]
[https://www.youtube.com/watch?v=YJg02ivYzSs](https://www.youtube.com/watch?v=YJg02ivYzSs)

------
tyu100
Wow, yet another internet moral panic piece from Farhad Manjoo and the New
York Times. For extra credit, replace smartphone and Facebook with MTV and
television and see if you can tell the difference from something written in
1983.

------
personjerry
Developing nations are nowhere near “peak screen”.

------
tk75x
Link without paywall: [http://archive.is/6nR92](http://archive.is/6nR92)

------
PerilousD
The article is behind a paywall; I can't read it.

~~~
AndrewDucker
[https://bypasspaywalls.weebly.com/](https://bypasspaywalls.weebly.com/)

