
It's Cold in the Information Age - akakievich
https://natalian.org/2019/01/07/Winter/
======
sek
What you can see here is market maturity; this is a very common thing. There
was just so much left to do in the computer industry until now.

Look at the history of other industries, there have been hundreds of car
companies before Ford took out most of them with the Model T.

Imagine you are one of those people working in the car industry before that.
"Company X had a car that drove 50 miles per hour, a new record, insane! The
last record got broken just a few months ago...".

What happens in other mature markets with good dynamics is that gains come
much more slowly, and it becomes "boring", so you don't think about it too
much. You get an incredibly efficient and secure car for a few thousand
dollars; profits come from people who buy for status. If you look at the truck
industry or aviation, the margins are really low, and buyers there mostly care
about price.

Most people don't need more than a smartphone; they already buy for status.
Others need a Mac or a cloud instance, like truck drivers need a special kind
of vehicle to do their work.

There are only so many car companies, and there will be only so many cloud
hosters, who will compete for ever-shrinking margins to run your cloud-
function app, magically infinitely scalable, on their gigantic datacenters. If
you want your own, get a Raspberry Pi rack.

This is amazing, this frees so much energy and resources to focus on other
stuff that matters.

Look at software, the next decades we can focus on creating ERP-Systems that
are as perfect as Whatsapp.

Then SpaceX and Blue Origin will compete to bring into space the 3D-printed
robot spaceship you just simulated in your hyperrealistic Kerbal Space
Program. The enabler here is price per kilogram to orbit.

That's where my imagination stops, because it's "the final frontier" and the
problems there are really infinitely hard.

~~~
amelius
> Look at software, the next decades we can focus on creating ERP-Systems that
> are as perfect as Whatsapp.

Like with zero customizability?

~~~
coldtea
Hopefully yes.

~~~
jl6
Oh great, my general ledger is now free, but in return I have to route all my
data through a semi-trusted 3rd party with unclear motives, who is financially
incentivized to act against my interests, and who can force-deploy workflow-
breaking updates whenever they choose, and maybe is secure but who can ever
tell...

Rant over. I actually like WhatsApp. It’s pretty far from perfect though.

------
buboard
I love myself some pessimism, but tech is not slowing; it's HN being obsessed
with things that don't matter.

\- VR was a niche from the start, a lot of people were doubtful, and the
people that weren't now know that it's not going to work and are already
moving on.

\- javascript frontends are great for programmer fights, but thankfully most
of the public-facing web doesn't use them; they're irrelevant to tech's
progress

\- Uber has nothing critical to offer to technology except an also-ran in self
driving cars

\- kubernetes is relevant to like 100 people on earth, and doomed to be
obsolete in 1 year. not critical to tech progress

\- AWS isn't critical to tech progress either; it's bubbled because of easy
startup money and laziness. It will be a good thing to see it fade away.
Meanwhile, check Google's TPU offerings for some progress.

\- Who needs a 3rd browser engine when every website looks like bootstrap?

That said, crypto is entering the plateau of productivity. It's being held
back by hostile regulation, governments, and banks, and AI hasn't even
started; it's going to be a wild ride. The web is slowly decentralizing, which
will be huge too.

~~~
vortegne
"javascript frontends are great for programmer fights, but thankfully most of
the web doesn't use them"

Wait what? Have you like, been on the web at all? How the hell are javascript
frontends not used everywhere?!

~~~
buboard
they do have a lot of usage, but hardly everywhere. Remember, most of the web
is WordPress.

~~~
oxfordmale
Yes, that is true, but most WordPress sites are hosted on a cloud provider.

------
Causality1
>The only thing I can think of is that all my laptops (thinkpads & Apple MBP)
are now rocking USB-C PSUs.

Personally I think type-C might be the best example of moving the problem
behind a layer of obfuscation rather than fixing it. If you look at a random
device with a type-C port you have no idea what that port is capable of. Maybe
it'll charge your Macbook but it won't charge your Asus. Maybe it'll support
USB 3.1 or maybe it'll only be USB 2.0 like OnePlus. Counterintuitively,
chances are if it's a phone and it does USB 3.1 then it won't support USB to
HDMI or DisplayPort although some do, but if it's only USB 2.0 over type-C
it'll probably support HDMI-out with a DisplayLink adapter. Maybe it'll break
half the specifications just for the hell of it like the Nintendo Switch.

It's a bloody nightmare.

~~~
ajmurmann
And you haven't even talked about the cable yet...

~~~
Causality1
It does seem inordinately prone to getting filled up with dust or mysteriously
smooshed flat when you weren't looking.

~~~
ajmurmann
My reference was really to the fact that it has the same hidden properties as
the port. What data throughput does it support? What charging capabilities
does it have? You can't tell by looking at it.

------
cheschire
You know, maybe it’s not a bad thing if everyone takes a break from growing
and changing. There’s plenty of ways where we would benefit from stability and
refinement right now.

------
pbourke
I really don’t get articles like this. You can rent a fleet of computers for
an hour for a few dollars. Your cellular phone is more powerful than the time-
sharing systems of 25 years ago. There is pervasive wireless broadband
internet available in much of the world. Single computers can be bought with
dozens of cores and a lake of RAM. You can look up nearly any piece of human
knowledge in a few seconds. The labor of millions of engineers can be accessed
via APIs which you can use to assemble a stunningly capable application with a
minimal team.

~~~
trabant00
> You can rent a fleet of computers for an hour for a few dollars.

On which you deploy a web app with 30 moving components written in 10
different languages, which does a fraction of the work of a single vertically
scaled server running 3 components written in C. And has less than two nines
of uptime. No one person can grasp it mentally. It is statically linked and
must be updated piece by piece, unable to use the distribution repos and
upstream work. Oh, and man pages? What man pages? Every one of the 30
components is "open core" with cool newly invented terminology that pushes you
into paying for consulting or reading spaghetti "object oriented" code. It
also has stack traces for logging.

How ungrateful of little sysadmin old me not to appreciate progress.

~~~
sgt101
So, why are we doing this?

In truth because the world of C (I was a C and then C++ programmer 30 years
ago) did not enable the production of effective applications cheaply and
quickly. Yes - you can write anything, but it took months and months and
months.

We've traded efficiency for productivity. We have been able to do this because
of Moore's law and all its friends.

Also, C code is hellish to read and understand (#define anyone?), and logging
almost never happened.

As a sysadmin you will be able to look at the organisation you work in and
count up the number of sysadmins; my guess is that there are fewer than 50% of
the number from 10 years ago. Maybe that's the problem?

~~~
jcelerier
> Yes - you can write anything, but it took months and months and months.

The C & C++ of 30 years ago, where compile times took days - sure. Nowadays,
building a whole Yocto software stack - basically a full Linux distro, from
GCC, to the kernel, to glibc, to X11, to Qt - takes 6 to 8 hours on a good
laptop. In addition, modern language features and the general shift from
runtime to compile-time polymorphism, especially in C++, make the potential
for errors decrease sharply, and language simplifications - standard library
improvements, terser syntax - also make time to release go down. Finally,
tooling has greatly improved: I barely ever need to run a build with my IDE to
check for errors, because IDEs now embed clang, clang-tidy, clazy... which
check your code as soon as you type and highlight issues in the editor. And
ASan and UBSan handle the remaining cases which cannot be caught at compile
time, leaving only logic errors to handle.

~~~
ckannan90
Pretty sure the comment you're replying to is talking about development time,
not compile time. They're claiming it took months longer to develop a similar
app in C, not that it took longer to compile. Even if compile times are
negligible now for the reasons you state, they still have a point about it
taking longer to develop an equivalent app.

------
baybal2
I can follow up on this with my recent work experience.

Microelectronics has been on a declining trend for half a decade. A lot of
people were saying that we are heading for a "long winter" in the industry.

Long-term investments in manufacturing are at an all-time low. Only the recent
events of the trade war made some bigger OEMs reopen lines in third-world
countries other than China.

It's important to make the distinction that it's not only consumer goods that
have sagged; "enterprisey" stuff sees no growth either, and that's why people
don't invest more in high-end fab processes: they don't see server chips,
router chips, or memory going up in sales any time soon.

Why is that bad? Because every time the industry moves up a node, it's paid
for by the highest bidders first. If Intel/Broadcom/AMD and the other big boys
hadn't paid for 7nm, there wouldn't have been 7nm for the rest of us.

We may not see 5nm, despite EUV being "just months away".

On the other hand, our engineering consultancy saw an enormous influx of new
clients, most of whom are complete newcomers to the industry. One of them
is... a furniture company, and a big one.

~~~
dfrage
EUV is in production today with Samsung and TSMC's "7nm" processes, and the
latter has started "5nm" risk production which uses even more EUV. Here's a
recent SemiWiki article on details of what Samsung's foundry business is
doing: [https://semiwiki.com/semiconductor/259664-samsung-foundry-
up...](https://semiwiki.com/semiconductor/259664-samsung-foundry-update-2019/)

Intel's all-193nm-UV "10nm" node (roughly equivalent to the above 7nm) is a
failure; we'll see if they can get their EUV-using "7nm" node to ship in
quantity. In all these cases the demand for lower power consumption in
battery-powered devices continues to drive demand, even if that demand isn't
as healthy as it used to be.

------
povertyworld
The current crappiness of AR reminds me of the early days of MP3 players. Only
a few geeks had them, and it took forever to load an album over the printer
port if it didn't crash half way. Then 5 years later, everyone had an iPod.

~~~
Illniyar
I was in highschool when MP3 players were a thing, just before and slightly
after the iPod. Everyone had one.

I wasn't in a particularly affluent or tech-savvy area. It was cheap to buy
and just required you to know how to move files from one drive to the other in
a computer - which was a skill that was very common and was actually taught at
schools in my area.

~~~
povertyworld
Well, I just remember I didn't see another person besides me with an MP3
player on public transportation for quite some time in the NYC area.

------
bsg75
> kubernetes's insane complexity proliferating

Is K8S any more complex than the other options for deploying apps across a
fleet?

I’m genuinely curious because it looks complex to me, but I have little
background to determine if what I see is complexity or unfamiliar territory.

~~~
naniwaduni
Deploying apps across a fleet is far more complex than most of its
practitioners need.

~~~
theyinwhy
How is that? You create an object with some metadata and that's it. Don't know
how that can be done any easier. I honestly get the feeling only people who
have never used K8s deem it complex.

Running a cluster yourself is another topic, although there are way more
complex orchestrators on the market than K8s.
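For what it's worth, the "object with some metadata" described above is
roughly a Deployment manifest. Here is a minimal sketch of one, built as a
plain Python dict rather than the usual YAML so it can be inspected directly;
the names `my-app` and `nginx:1.25` are placeholders:

```python
# Minimal sketch of a Kubernetes Deployment: an "object with some metadata".
# In practice this structure would be serialized to YAML and handed to
# `kubectl apply -f`; building it as a dict shows how little is required.

def make_deployment(name: str, image: str, replicas: int = 2) -> dict:
    """Build a minimal apps/v1 Deployment manifest as a Python dict."""
    labels = {"app": name}
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name, "labels": labels},
        "spec": {
            "replicas": replicas,
            # selector must match the pod template's labels
            "selector": {"matchLabels": labels},
            "template": {
                "metadata": {"labels": labels},
                "spec": {
                    "containers": [{"name": name, "image": image}],
                },
            },
        },
    }

manifest = make_deployment("my-app", "nginx:1.25", replicas=3)
print(manifest["kind"], manifest["spec"]["replicas"])  # → Deployment 3
```

The cluster's controllers then do the hard part: reconciling reality toward
those three replicas, which is where the edge-case logic mentioned below
lives.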

~~~
sascha_sl
It seems to be a typical "if I don't understand it and reading up on it for 5
minutes doesn't help it must be too complex" case.

Turns out turning a bunch of machines and maybe a cloud API into a functional
PaaS includes a lot of logic, a lot of which is dedicated to handling edge
cases.

------
peteforde
To quote David Mitchell (the comedian, not the author) it would be tempting to
take this kind of thing seriously if it wasn't for a damnable sense of
perspective.

Some thoughts:

1\. Much like how the entire centrist media machine feeds off the insanity of
the current US administration, somehow this article is at the top of HN. We
put it there. (Well, 42 of you did, as of the time I wrote this.)

2\. Perhaps the best-kept secret in the world is that it is getting better by
almost every measurement. The late, great Hans Rosling was superb at breaking
down these dimensions in video form (eg.
[https://www.youtube.com/watch?v=jbkSRLYSojo](https://www.youtube.com/watch?v=jbkSRLYSojo)
) but if you want the real deal, I recommend Steven Pinker's excellent
"Enlightenment Now".

3\. I couldn't care less that Intel has "stalled" since you can now buy a
Raspberry Pi that pwns my first computer by an order of magnitude or three and
costs less than $25. Meanwhile, Apple is refocusing on Mac hardware again
because so many "poor" people have phones now that they can see a sales
ceiling... this means that a huge percentage of the planet has a significant
computer on them all of the time.

4\. Not only does age and wisdom allow you to observe tech over many cycles,
but the older I get the more I realize that tech is nothing outside of its
relationship to politics, culture, and ourselves. Look at how young children
just assume everything is a screen, now. Tech is now an important aspect of
the daily political conversation. In just 15 years we've gone from a society
that rents VHS movies to one that feels entitled to comment, upvote and
subscribe to everything they watch. It's fucking crazy how much tech has
reprogrammed everything from the way we find love to the way we get from A to
B.

Finally, to the author: sorry that a lot of the comments here are negative.
They _aren't_ wrong, but we're still working on chilling out and defaulting to
presuming that, in any given moment, people are generally trying their best in
this community. The good news: there's lots to be excited about in this "worst
possible timeline" we've fallen into.

~~~
FranzFerdiNaN
Remove China from the numbers and Pinker's and Rosling's points pretty much
disappear. Neither is a secret; they are well known, and criticized a lot for
their use and abuse of numbers.

~~~
oblio
I'm pretty sure India also has a positive outlook.

And if we remove those countries we're removing half the world's population...

~~~
zasz
Child malnutrition rates in India are worse than in some parts of sub-Saharan
Africa, actually: [https://www.deccanherald.com/state/top-karnataka-
stories/mal...](https://www.deccanherald.com/state/top-karnataka-
stories/malnutrition-in-india-worse-than-sub-saharan-africa-731071.html)

Impressive GDP growth aside, persistent malnutrition suggests that for huge
chunks of the Indian population (a shocking _seventy_ percent of women are
anemic), something's not working. Who cares what GDP growth is? What's it
really measuring if people still aren't getting enough to eat?

~~~
dfrage
The number of calories lower-class Indians are consuming is _dropping_, which
your link must in part be reflecting. It's a mess.

------
labster
Unfortunately for those of us living in the Anthropocene, it's hot.

------
0x8BADF00D
What he lists are not technical problems, they’re political problems.

Tech progress has always been incremental in nature. It seems “nothing is
going on”, but there is a constant iterative cycle that will eventually
automate almost every repetitive job or task.

It will be cold for many who do repetitive jobs and tasks.

------
peter_retief
It is sad that open source has been hijacked by the likes of Amazon. However,
open hardware is an exciting option, with desktop fabrication, electronics,
and low-energy wireless creating new opportunities for small entrepreneurs.

~~~
tannhaeuser
I think the situation is a bit more complex: the proliferation of F/OSS
_itself_ has driven the market into the situation where integrating and
running other people's commodity software is one of the only commercially
viable options left. Another factor is that commodity hardware is so powerful,
and has been for over ten years, that it just had to be put back into the
hands of datacenters/clouds.

~~~
peter_retief
Sure, every generation has had its own challenges; there are cracks where new
ideas can grow.

------
Havoc
Cold as in impersonal and lacking innovative warmth, perhaps. Certainly not
slow and in crisis, as the author suggests.

------
masonic

      Internet speeds never seem to get better faster or more reliable
    

Sure they do. The problem is that tracking/adware consumes bandwidth and
resources at the same increasing rate... or worse.

------
ddmma
From the last few years, my personal favorite will remain ingesting SQL into
Hadoop data lakes and querying big data as SQL.

------
bni
"VR is struggling" \- linking to a Linus Tech Tips video from January, no
less. Right...

Meanwhile Oculus Quest came out, and PCVR has or is about to enter the second
gen.

~~~
shasheene
I've played quite a bit with the Oculus DK1, GearVR, Oculus CV1 and the Oculus
Go, and after 2 weeks of usage I can say the Oculus Quest is _much_ better
than any of them. The low barrier to putting the headset on and immediately
walking around in VR is hard to overstate.

I think it will be a very big hit. The virality levels are very high with
people showing their family and friends, who then want to go out and get their
own.

The downsides are that the games are a bit too expensive, and the headset
itself costs $400, which is alright, but it does need to be cheaper if they
want to achieve Nintendo Wii-type sales figures (which the Quest fully has the
potential to do).

------
Ayesh
About Uber's departure: Grab is a terrible app compared to Uber in terms of
UI, UX, and overall driver attitude (personal opinion, of course).

However, in Indonesia, services like GoJek and Traveloka are picking up. I'm
currently in Indonesia, and GoJek has grown from a service where you can hail
a ride into a full marketplace with payments, vouchers, and rewards integrated
into several consumer services such as package delivery, movie ticketing,
massages, make-up services, etc. They continue to offer even more and better
services, and gained a lot of attention due to Uber's departure.

Georgia, for example, a relatively unheard-of country, has a huge startup boom
with many ride-sharing apps and mobile money solutions, and the city of Batumi
is full of online casino kiosks.

It's probably a cold age in the information age elsewhere, but certainly not
in developing countries.

------
thinkingkong
Everything’s amazing and nobody’s happy.

~~~
TeMPOraL
This isn't complaining about today vs. life before the industrial revolution;
it's complaining about today vs. last year or two, within a particular job
sector.

In other words, total amassed progress may be amazing, but according to the
author, its derivative in web/consumer tech sucks (I'm inclined to agree, even
if not for the same reasons).

------
ngcc_hk
Ask any IT person, or the public: how do you keep up with the changes? Can you
still use today what was being done 5 years ago?

Yes, it is better than 1990 vs 1995, or 1995 vs 2000, etc. You really had to
learn a totally new paradigm each time: Java, Unix, Windows, object-oriented
programming, GUIs... Now the basics are done, but still no. The Python you
learn is not important compared to the Python used for AI, for example.

------
TeMPOraL
I don't know, #cryptowinter makes me feel warm inside, and for Uber I only
wish it happened sooner :). What also warms my spirit is that it seems GDPR
had a side-effect in that many other places are now considering implementing
similar data protection regulation. So it's not all bad news.

That said, I sort of agree with the sentiment. Where's the technological
innovation on the web these days? Where are the tools and technologies that
allow me to work _better_ than 10 years ago, and not just worse but with
JavaScript, or worse but with high-resolution and shiny (yet bloaty and
unergonomic) UI? What's there currently beyond WebAssembly (i.e. "all the old
stuff, but in web browser", which has a few benefits)?

~~~
antidesitter
> don't know, #cryptowinter makes me feel warm inside, and for Uber I only
> wish it happened sooner :)

What did you mean by this?

~~~
TeMPOraL
I mean that I consider cryptocurrency scene to be net social negative (a nasty
breed of fraud, gambling and MLM) and cryptocurrencies themselves a technology
that's an environmental disaster and which should have never left the research
stage. As for Uber, their _only_ innovation so far was showing the sheer scale
at which you can run an inherently unsustainable business built on
anticompetitive subsidizing and illegal behaviour without people noticing.

~~~
antidesitter
> I mean that I consider cryptocurrency scene to be net social negative

This sounds like nonsense. Cryptocurrencies allow people to perform secure
financial transactions without relying on central banking systems, and to use
the blockchain for things like decentralized time-stamping, automated escrow,
prediction markets, online voting, and other services. There’s a huge amount
of social value to these innovations.

> an environmental disaster

What specifically are you referring to?

> should have never left the research stage

I’m glad it did. In addition to their utility as decentralized mediums of
exchange, they have inspired a lot of research in the fields of game theory
and cryptography, including work on voting protocols, mechanism design, and
secure multiparty computation. I think the world is much better off with this
newfound knowledge, though you seem to think otherwise.

> As for Uber, their only innovation so far

Their innovation was revolutionizing ridesharing and ride-hailing. In 2015,
and in the United States alone, they created $6.8 billion of consumer surplus
[1], a surplus which has only increased since then. Over 95 million people use
it on a monthly basis [2], with 14 million trips completed each day [3]. It is
popular with the public, especially the young [4]. There is huge demand for
services like Uber and Lyft to become more widely available. See for example
[5].

You clearly have an axe to grind and it’s clouding your ability to evaluate
things objectively.

[1] [https://www.nber.org/papers/w22627](https://www.nber.org/papers/w22627)

[2] [https://www.statista.com/statistics/833743/us-users-ride-
sha...](https://www.statista.com/statistics/833743/us-users-ride-sharing-
services/)

[3] [https://www.uber.com/en-CA/newsroom/company-
info/](https://www.uber.com/en-CA/newsroom/company-info/)

[4] [https://vancouversun.com/news/local-news/canadian-
millennial...](https://vancouversun.com/news/local-news/canadian-millennials-
would-rather-hail-an-uber-than-a-taxi-poll)

[5] [https://globalnews.ca/news/4895650/bc-ridesharing-
poll/](https://globalnews.ca/news/4895650/bc-ridesharing-poll/)

------
LifeLiverTransp
Consolidation was also usually the stage where new attackers formed and old
companies entrenched themselves mentally, becoming unable to see new
opponents.

------
nafey
70s and 80s were relatively quiet on the tech innovation front. We might be
headed for a couple of decades like that.

~~~
tannhaeuser
Except for having invented the microprocessor, the x64 and ARM ISAs, Unix, C,
SQL, SGML, TCP/IP, digital audio and video, electronic music, GUIs, logic
programming, and basically everything we're taking for granted today.

Edit: x86 of course

~~~
314
The x64 ISA was invented in the late 90s. The rest of the list looks accurate.

