
What’s Next in Computing? - MichaelAO
https://medium.com/@cdixon/what-s-next-in-computing-e54b870b80cc
======
Animats
Coming up next:

\- If your job involves sitting at a desk, and your inputs and outputs come in
via phone or display, expect to be automated.

\- Automatic driving.

\- AIs which sell. These will be annoying but effective.

\- Big Brother will be much more effective.

\- Within ten years, an AI running some investment will fire a CEO.

Probably not important:

\- Virtual reality. Other than for games, it won't be big.

\- Internet of Things for the home. Home remote control is a niche product.
It's been available since the 1980s and never got much traction.

Not yet:

\- Robots for routine unstructured tasks. Still a hard problem from hardware,
software, and cost perspectives.

\- Nanotechnology (excluding surface chemistry stuff)

~~~
Ensorceled
> Virtual reality. Other than for games, it won't be big.

There are other niche markets for VR: Interior Design, Real Estate (especially
commercial), Education, Health Care, Industrial Design ...

Every major sports network is working to deliver VR for sporting events.

~~~
TheCondor
Sports and concerts seem like pretty compelling VR apps. If the music people
handle their business right, I could see VR front-row tickets at home being
priced at a premium over mid-grade tickets. Same for sports: a couple of prime
vantage points that the user can toggle between? I could see people paying
more than for a live ticket.

~~~
tim333
I tried William Hill's VR horse racing app. It uses computer game style horses
linked to live position detectors on the real ones. It was quite good.
Wouldn't use it myself but I could see it working. They hope to make money
from the betting in the usual manner.

[http://home.bt.com/tech-gadgets/future-tech/virtual-reality-...](http://home.bt.com/tech-gadgets/future-tech/virtual-reality-comes-to-horse-racing-11364008360119)

------
ianamartin
Whenever I see a headline of the form "What's next in X," I immediately think
to myself, "probably not what this article thinks it is."

That said, yeah, some of the handwriting is on the wall. I think IoT will be
the next big hit, not because people want it or will use any of it, but
because that's all anyone is going to be selling. Samsung, LG, GE, et al.
aren't going to give us a choice.

It won't be a big dramatic change like self-driving cars. It will be a slow
trickle of toaster fridges that we don't notice until we are out trying to
replace a washing machine one day and can't find one without a self-ironing
board attached that wants to connect to our bed to know when to wake up the
coffee pot that makes your egg-white substitute omelet and makes sure your
shirt is wrinkle free at exactly the moment your shower dries you off and
combs your hair.

And while all of us nerds are busy disabling all that crap so we can just
clean some underwear, Wall Street will be declaring that IoT is here and
winning! What they won't mention is that it's winning because there's nothing
else left to buy.

~~~
monk_e_boy
I don't agree. There are plenty of things in my life that could be hooked up
to some online AI to make them smarter.

Anything with a lock: just figure out that I'm me and open. If I leave, lock
up. Open up for anyone on my authorised list.

More cameras pointed at stuff. Our local club (100 members) purchased a
weather station and decent camera and pointed it at the sea so we can all
check the wind and surf. Cool. More of these cheaper please.

We have Woo devices that we attach to kiteboards, these measure jump height,
duration and G force. These are then synced to a phone and internet, so not
quite IoT but close. More of these please. Attach them to anything that moves
(shoes, kites, surfboards, swim fins, soccer balls, bike) more data is fun.

GoPros, most people I know have these. They need to be better hooked up to the
internet.

I think there are a ton of IoT devices that will make sense and that people
will buy. Just because you don't see the value in a toaster hooked up to the
'net doesn't mean the IoT is dumb. I think it's great.

~~~
daveguy
The big problem I have with IoT is that _none_ of this requires online AI.
What part of having an authorized list or identifying you requires internet
access?

The IoT will take off when services are run on site. When the processing power
is available from a small appliance box, and companies advertise the security
and reliability of local processing, it will catch on.

What happens to your smart lock if your internet connection goes out? What
about if the server that holds your whitelist is compromised?
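
The point above, that checking an authorized list requires no internet access,
can be sketched in a few lines of Python; the names and credential IDs here
are made up for illustration:

```python
# Minimal sketch of a lock whose authorization runs entirely on-device:
# the whitelist lives in local storage, so no network round-trip is needed.

AUTHORIZED = {"alice-keyfob-01", "bob-phone-7f"}  # hypothetical local whitelist

def should_unlock(credential_id, whitelist=AUTHORIZED):
    """Pure local check: no server, no internet dependency."""
    return credential_id in whitelist

print(should_unlock("alice-keyfob-01"))   # True
print(should_unlock("mallory-clone-99"))  # False
```

The lock keeps working when the network is down; internet access only adds
remote management on top, it isn't required for the core decision.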

~~~
estreeper
I can see value in internet access. The lock having internet access means I
can be on vacation in Thailand and grant a friend access to my house to drop
off the rent check I left on the counter. And later, I can make sure my friend
didn't forget. Or, make sure my dog-sitter is showing up. I also can't think
of a more _convenient_ way to grant access and identify people off-hand.

As for the connection going out, there are solutions. Redundant cellular
connections, maybe? And if you really can't get in, it's not the end of the
world. We already have solutions for that: locksmiths. Might be expensive
though, and end up destroying the device, or maybe your door.

The data security thing is really bad. Unfortunately, that's a much larger
problem though, not really related just to IoT.

~~~
marcosdumay
Got a nice start reading that comment. A nice amount of good sense. And
then...

> We already have solutions for that:

Yes, finally somebody will say "local cache"!

> locksmiths.

Ok. Not yet.

~~~
estreeper
That's a good point!

I guess more broadly, I was thinking about a scenario in which the lock device
dies, and making the point that conventional devices aren't foolproof either.
In this case, the fool being me, and the proof being locking my keys in the
house.

So, any lock can fail, but I'm not really concerned as long as it has a
reasonably low failure rate. We've tolerated conventional lock systems failing
(via user error mostly) for a long time.

------
eva1984
I think people need to pay more attention to the current state of AI. It is
rapidly maturing and offering surprises every year. Most of the progress
doesn't make the news the way AlphaGo did, but it is going to have a bigger
impact on our lives, even our jobs.

like: [http://arxiv.org/abs/1512.00965](http://arxiv.org/abs/1512.00965)

~~~
amelius
It would be nice if there were a searchable blog about AI and its
applications. That way, even people like me who are not so interested in the
algorithms behind deep learning, can still see where this technology is
heading.

~~~
Pamar
DataTau ([http://www.datatau.com/news](http://www.datatau.com/news)) is a
clone of HN dedicated to Data Science.

Maybe something similar for AI?

~~~
Houshalter
DataTau seems somewhat abandoned. I would suggest the subreddits
[https://www.reddit.com/r/machinelearning](https://www.reddit.com/r/machinelearning)
or
[https://www.reddit.com/r/thisisthewayitwillbe](https://www.reddit.com/r/thisisthewayitwillbe)

------
dcw303
I had a Windows Mobile phone in 2005 (One of the earlier O2 XDA models, from
memory) and it certainly fell into the gestation phase. Email was passable if
you were connected to an Exchange server, but tapping out long responses with
the stylus was a chore. It was definitely possible to browse the web, but the
prevailing Windows-style point and click interface combined with a low res
screen made it more trouble than it was worth. At the time, I couldn't imagine
how to make mobile easier to use, but once I experienced an iPhone it was very
obvious that the Touch UI fundamentally changed things.

I say this because I got an Oculus DK2 a couple of years ago, and after a
month of enthusiastic use it got cast aside. I know it doesn't affect
everyone, but I suffered from some nausea which soured me quickly. Queasiness
aside though, there was a bigger problem. When I tried a demo, there was an
obvious wow factor at the start, but it wore off quickly. Just attaching VR to
a standard 3D game was not as exciting as I had hoped.

The first television programs were essentially radio shows with a camera put
in front of the presenter, until someone was able to use video to harness the
story telling possibilities. The first web sites were glorified electronic
newspapers, and then someone figured out how to integrate interactivity that
was impossible in print. All new media forms initially imitate the old. As
McLuhan said, the medium is the message.

I think it will take an as-yet unknown paradigm shift to make VR compelling.

~~~
hellomrjack
I am actually looking for people who are susceptible to VR motion sickness. I
worked on a VR project where we tried to make it as accessible as possible,
and it would be really useful for me to see how it is for people who are
particularly affected by it. You can grab it from
[https://share.oculus.com/app/super-turbo-atomic-ninja-rabbit-experience](https://share.oculus.com/app/super-turbo-atomic-ninja-rabbit-experience)
(I definitely agree that a lot more work needs doing to work out how to tell
stories effectively with VR).

------
kordless
I wrote a hypothesis [1] on the cause and effect of the cycles Chris mentions.
With those assumptions about the separation of user and their data in mind, I
make the following near term technology predictions:

\- a swing back to private cloud will be kicked off by container/microkernel
technologies, starting the largest cycle we've seen to date in terms of growth
and value

\- public cloud computing growth will slow slightly and will have to refocus
on lower privacy needs use-cases (or die trying)

\- IoT and cloud computing will start to merge as a market, where the compute
resources of your IoT devices are used to operate on the data other IoT
devices nearby

\- cryptocurrencies and the blockchain will finally find a home in securing
the processes behind cloud provisioning of what was known previously as SaaS
software

\- cryptocurrency deployments make open source project's revenue models
sustainable

\- open source hardware, including circuitry, becomes ubiquitously available
via physical printing processes, which then drives the IoT and cloud computing
markets even higher.

\- the digital nomad lifestyle explodes as a result of the changes to the open
source model

\- startups and the VC culture in Silicon Valley will be forced to retool
their financial strategies and software business models for accessing
decentralized markets

\- a meltdown of global financial markets will be the only thing which enables
bringing positive, decentralized change to those financial markets.

As John Chambers, chairman of Cisco, said a few years back, "You're going to
see a brutal, brutal consolidation of the IT industry". Better buckle up.

[1] [https://medium.com/@kordless/an-ecology-of-cloud-1cedfa326b8...](https://medium.com/@kordless/an-ecology-of-cloud-1cedfa326b8c)

BTW, there is only one thing that will make me stop coming to HN and
participating in the community, and that's when I see people's opinions on
something downvoted. Downvote if you must, but keep in mind that downvoting
someone's opinion or view on something amounts to blaming a post that is
itself blameless.

~~~
eitally
All of the points subsequent to your first one conflict with that conjecture,
that we'll see a huge swing back toward private cloud & on-premise DCs.

Speaking from where I saw the industry [when I was a part of it] and now from
what kinds of CIO/CTO conversations I see the GCP AMs/SEs having, I don't
think we're remotely close to a swing back. Enterprise Capex budgets move VERY
slowly most of the time, and it will take multiple years to alter the current
momentum, which has finally reached the point where it's almost a foregone
conclusion for many industries that on-prem DCs no longer make sense.

This opinion is driven almost as much by the increasingly compelling portfolio
of enterprisey SaaS products like NetSuite, plus a plethora of workforce
management, CRM, analytics/visualization, and so much more of what has
historically been internally developed business productivity apps.

~~~
kordless
There are a lot of people who think public cloud is the bee's knees and is
going to continue to grow without bounds. The fact is, private cloud revenue
is 30X as big as public cloud. It's this big because of previous growth
cycles. Also, there's the consumer market to consider, just like in previous
cycles.

Public cloud is simply multi-tenant computing on data managed by someone else,
away from where the customer created it. I'm predicting data will come home to
roost on single tenant hardware (which comes in by way of IoT) and the code
that has been called SaaS will follow it there. We'll still have managed
services, they'll just run near us, instead of on compute owned by someone
else.

Regardless of whether you agree with that or not, it's a HARD fact that
bandwidth isn't growing as fast as storage and compute. That fact alone
suggests heavy decentralization of service architectures in the coming years.
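
As a rough illustration of why that gap matters (the growth rates below are
assumptions chosen for the sketch, not measured figures), even a modest
difference in annual growth compounds into a large divergence:

```python
# Illustrative only: assumed annual growth rates, not measured data.
storage_growth = 1.40    # assume storage capacity grows ~40% per year
bandwidth_growth = 1.25  # assume delivered bandwidth grows ~25% per year

years = 10
gap = (storage_growth / bandwidth_growth) ** years
print(f"After {years} years, storage outpaces bandwidth by ~{gap:.1f}x")  # ~3.1x
```

If local storage and compute keep outgrowing the pipes connecting them to
remote datacenters, it becomes relatively cheaper to process data where it was
created, which is the decentralization argument in a nutshell.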

------
jedberg
I think the author is right about cars being next. There is a lot of work by a
lot of people in this area, and one of them is going to get it very right (my
personal bet is on Tesla or Apple).

~~~
wr1472
You're choosing Apple over all the existing car manufacturers? What have they
done in this space other than CarPlay?

~~~
jedberg
> You're choosing Apple over all the existing car manufacturers?

Yes. Much like smartphones and tablets, they hadn't done much in the space
until they exploded on the scene after sitting back and watching everyone else
get things wrong. That's what they're really good at -- seeing what others do
wrong and not repeating those mistakes.

Unlike tablets though, it's a lot harder to hide when you're making a car.
They've already secured a test track and they're hiring vehicle engineers.

~~~
icebraining
Apple had always been involved in that space. The Newton, the eMate, the
collaboration with Motorola, and (obviously) the iPod, were all in the same
type of device. The latter even had scheduling, contacts, and downloadable
games which you could buy from the iTunes Store, before the iPhone was
released.

------
ocdtrekkie
I suspect we will finally reach the point where enough is enough on
proprietary cloud, and we'll start to see decentralization move to the
forefront again.

A lot of big companies are driving this AI talk, but by and large it helps
them more than it helps the rest of us.

~~~
pixl97
>but by and large it helps them more than it helps the rest of us.

Correct. If you have 10,000 employees and you could replace 5,000 of them with
'computerized smarts' you shift a huge amount of liabilities. They don't
require health insurance that seems to go up in cost every year. No retirement
fund is needed. In theory the costs of using AI will go down over time per
unit of work accomplished.
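
A back-of-envelope sketch of that liability shift, with entirely made-up
figures; the point is the shape of the trade, not the specific numbers:

```python
# Hypothetical headcount and costs, purely for illustration.
employees_replaced = 5000
cost_per_employee = 80_000  # assumed salary + benefits + insurance, per year
ai_cost_per_seat = 10_000   # assumed software/compute cost, per year

human_cost = employees_replaced * cost_per_employee  # recurring, and rising
ai_cost = employees_replaced * ai_cost_per_seat      # tends to fall over time
print(f"Annual saving: ${human_cost - ai_cost:,}")   # Annual saving: $350,000,000
```

And since the AI line item tends to fall per unit of work while the human line
item rises, the gap only widens from the employer's point of view.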

As for an individual it is difficult to see how AI will decrease our workload,
increase our happiness, or increase our wealth.

~~~
jedberg
> As for an individual it is difficult to see how AI will decrease our
> workload, increase our happiness, or increase our wealth.

You have to be the guy or gal creating the AI. :)

------
dfischer
Something that has been on my mind lately is whether we are near the end of
viable startups in apps. I mean this in a multi-platform sense, not just
mobile. For apps in general, I am starting to feel that what's possible has
been done, and we are nearing a point where the big guys are already doing it
and you might as well just join FANG instead of building something on your
own.

Maybe I'm getting jaded though.

There are always new niches coming out... and there are reinvented cycles.
How many big dating apps have there been since the 90s? Quite a few, actually.
Can there really be another dating app within current internet app
conventions, across desktop and mobile? Probably not.

Maybe the next dating app will be in VR and AR. In fact I guarantee it.

But I'm feeling that we are getting tapped out on app ideas before the next
flood.

Makes me a little sad because I'm more entrepreneurial than anything and not
finding many problems to solve lately. Maybe it's just me. Anyone else feel
similar?

~~~
deegles
More platforms = more opportunities. The Kim Kardashian game on Android/iOS
made $200 million in 2014. You don't have to come up with a new niche to be
wildly successful; you can innovate on execution.

That being said, I believe there is huge potential for new types of voice-only
apps on the Amazon Alexa platform.

~~~
sdenton4
Yeah, though there's a huge problem with discovery for new apps, which
Kardashian has a huge leg up on.

Other challenges are that people tend to spend 80% of their time on 3 apps,
and that people are likely experimenting with new apps much less than they
were a few years ago, as they've already got their goto apps at this point.

------
ronnier
What if there's an upper limit on what can be created, and what if we are
mostly there? What if we never really progress much beyond where we are, other
than gradually getting faster, longer battery life, and more storage? Games
and movies and TV shows are now mostly sequels and rehashed versions. A 2006
car vs. a 2016... basically the same.

~~~
nwjtkjn
There were no self-driving cars in 2006, and electric cars seemed like a far
off dream.

~~~
ocdtrekkie
[https://en.wikipedia.org/wiki/General_Motors_EV1](https://en.wikipedia.org/wiki/General_Motors_EV1)
\- Available to consumers (for lease, and they were recalled)

[https://en.wikipedia.org/wiki/DARPA_Grand_Challenge_%282005%...](https://en.wikipedia.org/wiki/DARPA_Grand_Challenge_%282005%29)
\- Competitions of self-driving cars for over a decade

We really haven't taken the magical leaps some people believe we have. We just
learned to market them better. :)

~~~
nwjtkjn
So since the "internet" existed in the 70's, nothing exciting happened in the
90's?

------
Geee
I think the general trend is that computers will start to understand and be
aware of the physical world, rather than just numbers entered by humans. And
not just understand, but manipulate, which means that you can go to a forest
and run `forest.forEach(tree => { if (tree.height > 5) tree.cut() })`. I.e.,
the world becomes computable (or rather manipulable).
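
The snippet above can be made concrete in Python, with a hypothetical `Tree`
class standing in for a real sensor-and-actuator layer:

```python
from dataclasses import dataclass

@dataclass
class Tree:
    height: float         # meters, as reported by some hypothetical sensor
    cut_down: bool = False

    def cut(self):
        self.cut_down = True  # stand-in for commanding a real actuator

forest = [Tree(3.0), Tree(6.5), Tree(8.1)]

# The comment's forest.forEach(...) idea, expressed as a plain loop:
for tree in forest:
    if tree.height > 5:
        tree.cut()

print([t.cut_down for t in forest])  # [False, True, True]
```

The hard part, of course, is not the loop but building the sensing and
actuation that make `tree.height` and `tree.cut()` real.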

~~~
chm
The world is already this way, just not accessible in such a concise and
precise manner. When you ask a lumberjack to cut down all trees greater than 5
meters tall, you're effectively doing the same thing as in your example. The
execution model, runtime, etc are not the same, but the point still stands.

~~~
Geee
Yup, that's true, but computers also used to be people who did computing.

~~~
chm
Indeed, and that was my main point. "Computing" in a general sense is as
fundamental as any physical law (if one wants to make a distinction between
those). We have reduced computing to its essence, and are now building up
complex systems in a "rational" way. The Universe already "knew" about the
essence of computing, and just let things happen.

------
AndrewKemendo
I am biased as an AR developer, but I think Chris underplays the impact of AR
as a computing and communications platform.

AI - and eventually AGI - is an application of computation. It's revolutionary
and I think will transform everything but it's not a computing platform. That
is, AI itself doesn't have an interface, so it still needs a platform to run
on. AR and eventually BCI and wetware are the platforms for interfacing with
it.

Cars aren't a computing platform either - they will certainly utilize and be
transformed by new waves of computation capabilities, but cars aren't a
replacement for a personal computer.

Same with drones and IoT. I think wearables will fall to AR as well -- or
rather integrate with it as a peripheral.

So when asking what is next for computing, in the context of the evolution of
interface/platform from Mainframe to Micro computer to Smartphone, AR is
unquestionably "next" as a platform.

~~~
karmacondon
What's the killer app for AR? spreadsheets:computing, email:internet, ____:AR?

~~~
jasonwatkinspdx
I'm not quite sure what to call it, other than operations research.

People focus a lot on consumer/entertainment applications, but I think heavy
and high tech industry will rapidly adopt AR. Imagine a mechanic working their
way through an interactive checklist while completing some maintenance task on
a jet engine or other complex machinery. Even with hardware as rudimentary as
google glass that's useful. But then consider Boeing in the moment that said
engine unexpectedly caught fire. Imagine that they can go back and review
footage of every time folks touched some component on that engine, or use
basic machine-vision techniques to confirm the position or state of some part.
Now imagine they have that kind of visibility into the majority of what
people's hands do on the shop floor or in hangars.

There will be social issues and debate over this, which we're seeing with
police body cameras, but I think ultimately safety will trump people's
reluctance to have their every task recorded. Certainly in industries with
high safety/risk implications, we'll see similar strong arguments for going
there.

~~~
patrickk
Slightly less complex, consumer-focused possibilities:

\- Cook like an expert: with AR, you don't have to glance at a screen or
cookbook; all the steps are visible in your view or whispered in your ear as
you need them, with reminders to take the pasta off the boil at the correct
moment, etc.

\- Repair or perform maintenance on a car - when you open the engine bay, the
steps to change the air filter are available for your model of car.

~~~
mstolpm
"Cook like an expert": There's much more to cooking than just following steps.
AR won't do much to keep you from slicing your fingers instead of the onions.
The wrong temperature, an egg that is larger or smaller... it's timing and
knowledge/experience, not a rule book, that makes a great dish.

Same for the car maintenance: perhaps you could do it yourself and save a few
bucks. But would you repair your brakes yourself with AR but no knowledge and
understanding of cars and maintenance, if the lives of your children depend on
those brakes?

Both scenarios might work for an expert or at least someone who knows the
fundamentals, but not for an unprepared consumer.

~~~
ativzzz
I think he's talking about experts.

But for cooking, the AR could find out the temperature (you can do this with
some tools already, minus the AR), determine the sizes of food and how long to
cook them, maybe overlay a line on food you're cutting to help you cut it
better... etc.

------
vannevar
My own prediction is that in terms of personal computing, the hardware will
start to fade into the background, along with the OS. AI will take over
consumer mindshare. Windows, iOS, and Android will give way to Cortana, Siri,
whatever Google comes up with, and Amazon's Alexa. Probably something similar
from Facebook as well. The consistent user experience with any of them will be
available everywhere. It won't be important to carry a particular device, the
hardware around you will recognize you. I already catch myself trying to ask
Alexa for something in my car. I'm sure Amazon will make that a reality sooner
rather than later.

------
fizixer
The scene is from Terminator 2 (1992) not The Terminator (1984).

------
Houshalter
How did he miss robots? He talked all about AI and self driving cars, but
missed robots.

Robots have been pretty good for decades, but have been waiting for AI to get
good enough to take advantage of them. This is _just_ starting to happen.

------
mastax
> Of particular importance are GPUs (graphics processors), the best of which
> are made by Nvidia.

Don't let any gaming forums see this; they'll be arguing for weeks afterward.

------
crb002
Processors in memory like Micron's Automata
[http://www.micronautomata.com](http://www.micronautomata.com)

The Von Neumann architecture is dead for a lot of simple compute tasks. A CISC
central CPU will only be used for very specific workloads that need custom
circuitry like floating-point units. Also, every CPU will come with a small
FPGA stock, not just the on-die GPU that AMD offers.

------
greendestiny_re
Predicting the future is a thoroughly thankless task.

I just wish the author of this article elaborated more on the financial side
of the AI revolution, rather than just mentioning it at the start and then
dropping it completely.

------
bholdr
"Post hoc ergo propter hoc" (Latin)

------
Sven7
DARPA's SyNAPSE, and then the self-powered brain of an ant. That's computing.
Everything else could find a place under "what's next in consumerist garbage."

~~~
rwallace
The 'consumerist garbage' is what's paying to build the factories that make
the chips that power the exciting research stuff.

------
amatic
Plastic analog neural networks. Here is a bit : www.billpct.org

------
DrNuke
All front-end everywhere without programming imho.

