
The Miasma - mooreds
https://feld.com/archives/2019/10/the-miasma.html
======
anateus
There are a lot of interesting societal effects caused by the internet and
social media, many of them very problematic. But when was this time in the
past when people agreed about facts? Is it during the period of time when we
lionized science after WWII and before the 80s but during which smoking was
still healthy? Or was it when the Church decided what was true? Or was it when
Hearst could print what he wanted and make it so? We have been in the Miasma
since the dawn of civilization.

Neal Stephenson had a hand in sparking my interest in cuneiform and I've read
many tablets myself (mostly Ugaritic and some Akkadian). The same effects are
evident all the way to the beginning of recorded history, and likely predate
it. The Miasma is the result of any indirect epistemology combined with human
proclivities. There is a certain scale and swiftness that is novel, but it's
not a brand-new thing: much as flash crashes on the stock market are a new
side effect of high-frequency trading, the crash of 1929 happened just fine
without it.

The Miasma is us, not the tech we use to connect us.

~~~
zamfi
Thank you. There were tabloids and conspiracy theories and alternative facts
long before Facebook's newsfeed.

But: there is the question of whether the tech somehow amplifies it, makes it
worse in a way that we are somehow less equipped to handle.

~~~
drewcoo
Are humans capable of detecting and understanding what that amplification
means? Even though our pattern-matching is a wonderful skill, it doesn't
always serve us well. We exaggerate local effects. We are known to be terrible
at understanding risk. And I've even heard that nine out of ten dentists are
bad at statistics.

I'm not saying we shouldn't ask the question. I'm saying we need a way to
answer it that factors out human perceptual error.

~~~
TeMPOraL
Feedback loops are shorter, news reports are increasingly useless[0], and our
current age is somewhat unique - reporting in the past didn't have a strong
economic system attached whose sole purpose is making it less truthful, less
accurate, and more disagreeable. I'm, of course, talking about funding media
through advertising impressions at article granularity.

--

[0] - Gwern makes a really good argument that with the number of people on
the planet and how near-instantly information moves around the world, you can
plausibly assume that all news reports are flukes, one-in-a-million events:
"rare datapoints driven by unusual processes such as the mentally ill or
hoaxers are increasingly unreliable as evidence of anything at all and must
be ignored."
[https://www.gwern.net/Littlewood](https://www.gwern.net/Littlewood)

------
peterkelly
One question that recently came to me regarding technology is: "What happened
to all the hope and optimism?".

Growing up as a kid in the 80s, I remember the development of personal
computing as _exciting_ , and especially with the explosion of the Internet in
the 90s, the possibilities being envisaged by everyone were utterly
intoxicating. Finally we would have a world that connected disparate cultures, a
bridge of communication that would help us understand each other and
appreciate different points of view. A true global community. For those of us
coming of age during this period, "surfing the web" was exciting because every
week you'd learn about or discover something new, and there was a highly
experimental, creative, and fun tone to the whole thing.

Instead now we have social media that is largely just a venue for people to
yell without listening, political entities to peddle misinformation that
serves only their own interests, an entire industry sector dedicated to
exploiting personal information to target ads, a culture of consumption rather
than creation, and the centralisation of online life into the hands of a small
corporate oligarchy. This isn't what most people were hoping for in the 90s,
though a few were warning of potential risks.

Like the author, I recently read Edward Snowden's book, and in one of the
early chapters he talks about the experience of growing up with the Internet
in the 90s in a way that resonated closely with me. Fast forward to the end of
the book, which is absolutely chilling, and it feels like you're reading a
dystopian science fiction novel, except it's _real_.

The Internet of today is almost unrecognisable from the early days. Stepping
back I just throw my hands up and wonder what the fuck happened? Is there any
hope that it can become better, or is it just going to be a further descent
into cynicism? And what, if anything, can we do about it?

~~~
dr_dshiv
What we need is to quantify the negative impact that negative storytelling has
on our wellbeing and direction. That will help us understand the importance of
positive heuristics for the future, as in the '60s-'90s.

We should avoid a negative vision of the future. But we also need a positive
vision of future to attract us and guide us.

Stewart Brand had one. Many techno utopians did. But now so few intelligent
people believe it is _morally acceptable_ to be optimistic. After all,
climate, race, capitalism...

~~~
tjr225
> But now so few intelligent people believe it is morally acceptable to be
> optimistic. After all, climate, race, capitalism...

We have to consider the moral implications of our social and economic
structures as well as our impacts on the only place we have in the universe to
live if we are going to have a positive vision of the future.

The greatest challenge here is not being silenced by those who have a vested
interest - logical or otherwise - in preserving the status quo with regards to
these things.

~~~
dr_dshiv
Well, my argument is that we need a positive vision without considering how
_exactly_ we will fix our social, economic and environmental issues. That's my
point. We can't wait till we figure that out to have a guiding optimistic
vision of the future. We simply have to assume we will figure it out.

Second, and with respect, I would suggest that your rhetoric, which is quite
common (and, by the way, represents vested interests in the status quo
intelligentsia), is doing a lot of silencing. It's quite pervasive.

I'd argue that contemporary liberal group norms are much more silencing than,
say, the oil industry.

~~~
tjr225
> Second, and with respect, I would suggest that your rhetoric, which is quite
> common (and, by the way, represents vested interests in the status quo
> intelligentsia), is doing a lot of silencing. It's quite pervasive.

> I'd argue that contemporary liberal group norms are much more silencing
> than, say, the oil industry.

Do you not see the hypocrisy in this statement? You're attempting to discredit
my entire viewpoint by painting it as "liberal intelligentsia rhetoric" -
whatever that means.

And as to your second point, the oil industry has a long record of covering up
evidence of climate change, silencing victims of pollution/fracking,
manipulating political process in order to continue destroying the planet, and
so on.

I'm not saying Harvey Weinstein should have lost his job, but the oil industry
is certainly responsible for more silencing than "liberal norms" by orders of
magnitude.

~~~
dr_dshiv
I don't see the hypocrisy, no.

To be clear, I'm not trying to discredit your view, I'm only suggesting that
your comment was an example of optimism silencing.

And, we may have to agree to disagree that the oil industry suppresses more
people's speech than political correctness.

------
ShardPhoenix
>When I described this to Amy, she responded with a magnificent rant that was
something like “this is a romanticized utopian ideal about a thing that was
inhabited by socially inhibited, white male nerds who consider themselves too
smart to be misogynistic but, well, often are.”

This kind of thinking is itself a major cause and component of the Miasma.

~~~
deogeo
I don't get it - the nerds were misogynistic, therefore his memory and
experience are invalid?

~~~
taneq
Not just that, she's saying the nerds themselves were invalid.

Guess she's a misollectual.

~~~
tom_mellior
I don't read her as saying that the nerds were invalid.

I think what she's saying is that if you were part of this group, the early
Internet seemed welcoming and inclusive, but if you weren't, it was the
opposite. From _within_ a community it's easy to feel inclusive!

For whatever it's worth, even though I am a white male nerd myself, I never
really felt welcome in "hacker spaces" and such. It must be a lot worse for
people who are not white or male. And I'll believe her when she says that that
was her experience with the early Internet. (Or, well, today's Internet.)

~~~
TeMPOraL
> _I think what she's saying is that if you were part of this group, the
> early Internet seemed welcoming and inclusive, but if you weren't, it was
> the opposite. From within a community it's easy to feel inclusive!_

Arguably, the people who weren't "in the group" at that time were people who
weren't interested or didn't have access. Internet was neither global nor
cheap nor popular back then.

As for the misogyny angle, I feel either she fixated on a problem and is
projecting, or this remark was included to signal allegiance with the social
justice crowd. As it is now, the comment reeks of anti-intellectualism.

> _It must be a lot worse for people who are not white or male._

I don't think it was back then, because back then people didn't care much
about it, and a lot of communities of post-university Internet were text-only
pseudonymous communication anyway. As for what's today, increasingly it's
being white or male that makes you out of place in a hacker space.

~~~
tom_mellior
> a lot of communities of post-university Internet were text-only pseudonymous
> communication anyway

So is this. Which is enough for you to spread the old, tired, harmful, and
incorrect trope that "people who are not in tech are just not interested". A
text-only discussion of how women or minorities don't belong because they
don't _want_ to belong is unwelcoming in itself.

> As for what's today, increasingly it's being white or male that makes you
> out of place in a hacker space.

You are twisting my words. For one thing, I don't feel "out of place" among
non-white non-male people. People are people, I'm not afraid of them because
of the color of their skin or what I guess their genitalia are like. For the
other, all the hacker spaces I've seen where I didn't feel welcome had large
white male majorities.

~~~
TeMPOraL
> _So is this._

Yes. And I don't know what your race or gender is, nor do I care, nor would it
make a difference if I knew.

> Which is enough for you to spread the old, tired, harmful, and incorrect
> trope that "people who are not in tech are just not interested".

At this point in time, not only is it not a tired trope, it's pretty much an
obvious truth. Programming today has zero structural barriers to entry,
minimal capital requirements, and an unprecedented amount of affirmative
action targeted at all kinds of minority groups. If in 2019 you aren't
programming, you're either not interested in doing so (a fine choice!), or
can't (due to economic or health constraints).

Still, I was talking about the times when the Internet was a peculiar thing
only particular types of nerds, discriminated against in the physical world,
ever found interesting. Getting online, or into programming in general,
required more effort and money - you had to convince yourself or your parents
that a PC and a modem and future phone bills were useful expenses - and it
didn't yet offer obvious paths to riches. If you were in there, it meant you
were interested _and_ had wealth to spare. That means, obviously, that a lot
of people were excluded.

But if the point you're making is that any group with barriers to entry will
exclude someone, then I don't see the point of making that point.

(You'll also notice that all the talk of the tech being unwelcoming started
only after commercial Internet exploded and some of those high-school
oppression targets made a shit ton of money and influence. Once tech became
seen as the easiest path into money and fame, people started asking "how come
the population of tech workers isn't uniformly distributed across all the
characteristics you could think of", and people who were underrepresented
followed that with "how can we fix it so we too get a piece of the pie?".)

I see some people online have a peculiar definition of what it means for a
field to be welcoming - not only does it have to remove all the barriers to
entry, it also has to bend over backwards to make the demographics uniform.
It's true that tech, until recently, didn't do the latter.

~~~
tom_mellior
> If in 2019 you aren't programming, you're either not interested in doing so

In a very trivial sense this is true-ish. In the US, lots of women seem to
have lost interest in computing in the mid-1980s:
[https://www.npr.org/sections/money/2014/10/21/357629765/when-women-stopped-coding](https://www.npr.org/sections/money/2014/10/21/357629765/when-women-stopped-coding?t=1570789952549)

With a massive change like this, there must be some sort of societal shift
behind it. You don't have to buy the article's thesis regarding the concrete
reason. But there was _something_ going on in society that changed young
women's minds.

But "society pressured me into losing interest" is not the same as "I wasn't,
or could not have been, interested in the first place". So yes, many women
"aren't interested" in the trivial sense of "we had enough discussion threads
in HN and elsewhere signaling that they shouldn't be interested, and finally
this perception stuck".

> If you were in there, it meant you were interested and had wealth to spare.

If we agree that in many cases that wealth came from parents, there is no
reason to assume that young women had less of it than young men. _Unless_
there were factors like parents saying things like "computers are for boys".
From the article above: "In the 1990s, researcher Jane Margolis interviewed
hundreds of computer science students at Carnegie Mellon University, which had
one of the top programs in the country. She found that families were much more
likely to buy computers for boys than for girls — even when their girls were
really interested in computers."

If you were in there, it meant you were interested (check), your parents had
wealth to spare (check), _and you were very likely a boy_.

Anyway, all of this has been rehashed many times before, and we're unlikely to
change each other's minds.

------
whatshisface
> _When I described this to Amy, she responded with a magnificent rant that
> was something like “this is a romanticized utopian ideal about a thing that
> was inhabited by socially inhibited, white male nerds who consider
> themselves too smart to be misogynistic but, well, often are.”_

Hey, there were some black male nerds on the internet then too.

------
DanielBMarkham
The internet was better in the past and that had nothing to do with the types
of people using it. It could have been used by alien circus clowns and it
would have been better. Why? Because it hadn't been weaponized yet. Nobody
important realized in the 90s that there was going to be an upcoming war for
attention span, for control of the narrative, for monitoring of the
population, or for industrial/state-sponsored espionage. And it all was going
to happen on the internet.

So if you ran across a discussion on X, you could poke around, find some
scholarly articles on X written by, well, scholars. Not by scholars who were
paid off or were looking to become the next internet superstar. So no, people
didn't magically agree on facts, but there wasn't this endless chasm of
belief-reinforcing bullshit that there is today, either. You could just go
look stuff up.

There's another, larger discussion about whether it's good or bad, how to sew
a silk purse from a sow's ear and the rest of it. But a difference in quantity
can result in a difference in quality. Simply having more and more people come
online and do things changed the nature of the net.

~~~
thefz
Well said. And weaponisation aside, usage and consumption of the whole new
medium was entirely better, because advertising and marketing companies still
hadn't gotten hold of it.

------
soylentcola
The first half of that book was so much more enjoyable (to me) than the second
half.

I understand the limits and personal opinions that go along with this sort of
semi-satirical/semi-predictive fiction. Still, the bits about identity
verification, acceptance of "fake news" and slander, personal feed editors,
and the fragmentation of society based on choice of info feeds straddled the
line between believable and crazy/entertaining.

I could have done with a whole novel set in the world prior to the simulation
bits.

------
look_lookatme
> I’ve deliberately disengaged from some of it through deleting my Facebook
> account, limiting my Twitter usage to broadcast only, and trying to use
> LinkedIn in a productive way even though the UX seems to be set up to
> purposely inhibit you from using it in a way that doesn’t suck you into the
> LI vortex.

Unless you have deleted your Twitter, you have not disengaged from the
Miasma. It is far more intellectually toxic than Facebook ever will be. I
honestly think it might be the worst thing that's ever happened to Western
thought.

------
dr_dshiv
Cynicism, I believe, is the enemy of thought. Trump, for those who loathe
him, is a result of the left's cynicism in not voting for Hillary. It's not
because the right was too strong! As they say, "The perfect is the enemy of
the good."

In the Diamond Age, Stephenson alludes to this, in his criticism of those
focused on hypocrisy. After all, you can only be a hypocrite if you try to
pursue moral actions.

------
blue_devil
The Internet works fine for getting good information on fast-changing things -
like programming. Otherwise, I'd stick to libraries (the book ones).

Unfortunately, while we're staring at the "Miasma", the real miasma has been
wreaking havoc in the real world - climate change. And the Internet is
literally contributing to this miasma now with online video traffic for
entertainment burning up data centres and networks around the world.

~~~
icebraining
Let's do a rough estimate. A home router uses about 10W. It's unlikely an ISP
router will use more per stream, so if there are 6 en route, we're talking
about 60W. For the server, even if the CDN used a dedicated NAS for each
user, we're talking about ~30W. Doubling all that to account for overhead,
it's roughly 200W.

For comparison, a Tesla Model S uses about 200 Wh/km, so a mile driven uses
about the same energy as streaming 1.5 hours of video. A round-trip to a
movie theater 15 miles away = roughly 45 hours of streaming video.
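The back-of-envelope arithmetic above can be written out explicitly. Every figure here is an assumed round number from the comment, not a measurement:

```python
# Rough per-viewer streaming power vs. driving, using the assumed figures
# from the comment above (not measurements).

ROUTER_W = 10        # assumed draw of one router handling the stream
HOPS = 6             # assumed routers between the CDN and the home
SERVER_W = 30        # assumed per-viewer share of the CDN server
OVERHEAD = 2.0       # double everything to cover cooling and other overhead

stream_w = (ROUTER_W * HOPS + SERVER_W) * OVERHEAD   # (60 + 30) * 2 = 180 W

TESLA_WH_PER_KM = 200          # rough Model S consumption
KM_PER_MILE = 1.609
mile_wh = TESLA_WH_PER_KM * KM_PER_MILE              # ~322 Wh per mile

hours_per_mile = mile_wh / stream_w                  # ~1.8 h of streaming
round_trip_hours = hours_per_mile * 30               # 15 miles each way

print(f"streaming draw: {stream_w:.0f} W")
print(f"one mile of driving = {hours_per_mile:.1f} h of streaming")
print(f"30-mile round trip = {round_trip_hours:.0f} h of streaming")
```

With these inputs the round trip comes out near 50 hours of streaming; the comment's ~45-hour figure just rounds the streaming draw up to 200 W.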

~~~
blue_devil
You're forgetting about networks' and data centres' energy requirements. A
recent report estimates that the carbon footprint of digital technologies is
currently about 4% of global emissions, and that in a worst-case scenario
digital technologies may amount to 7% of global carbon emissions by 2025. For
reference, global emissions as of 2018 grew to 37 billion tonnes of CO2
equivalent.

>>Digital technologies now emit 4% of greenhouse gas emissions (GHG), and its
energy consumption is increasing by 9% a year. [...] only one form of digital
use, online video, generates 60% of world data flows and thus over 300 million
tons of CO2 per year. This use is far from being “dematerialized”. On the
contrary, it represents 20% of the greenhouse gas emissions of all digital
devices (use and production included), and 1% of global emissions, i.e. as
much as Spain.

[https://theshiftproject.org/en/article/unsustainable-use-online-video/](https://theshiftproject.org/en/article/unsustainable-use-online-video/)

~~~
icebraining
> You're forgetting about networks' and data centres' energy requirements.

No, I'm literally counting those requirements. Hence routers + server (+100%
for overhead, like cooling).

> A recent report estimates that in a worst case scenario by 2025, digital
> technologies may amount to 7% of global carbon emissions.

Why do you trust that report? The numbers don't even make sense; just as an
example, dividing the consumed VoD data by the watched VoD time, you get
24 Mb/s, yet they say on the same row, "Average Bitrate: 3 Mbps". That's an
8x difference!
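The sanity check being described is just implied bitrate = data volume / watch time. The figures below are hypothetical, chosen only to reproduce the 8x gap; they are not the report's actual row values:

```python
# Hypothetical report row: total VoD data consumed and total VoD hours watched.
vod_data_bits = 2.16e20     # assumed consumed VoD data, in bits (~27 EB)
vod_hours = 2.5e9           # assumed total watched VoD time, in hours

implied_mbps = vod_data_bits / (vod_hours * 3600) / 1e6
claimed_mbps = 3            # the report's stated "Average Bitrate: 3 Mbps"

print(f"implied bitrate: {implied_mbps:.0f} Mb/s")
print(f"ratio vs claimed: {implied_mbps / claimed_mbps:.0f}x")
```

If two columns of the same row imply 24 Mb/s while a third states 3 Mbps, at least one of the three numbers has to be wrong.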

~~~
blue_devil
>>No, I'm literally counting those requirements. Hence routers + server (+100%
for overhead, like cooling).

I would not say you're "counting", you're rather doing a back-of-the-envelope
calculation with eye-balled numbers.

These estimations are difficult by nature. I don't "trust" the report per se,
but they do provide sources, from which I can inform myself. Which is more
than I can say about your calculations, which come without sources.

Digital tech is one of the fastest-growing carbon polluters, and data centres
alone account for almost 50% of the emissions. Now, the specific attribution
to online video may be less clear-cut, but the trends have been replicated in
multiple studies. You may also be interested in this recent report:
[https://www.sciencedirect.com/science/article/pii/S095965261...](https://www.sciencedirect.com/science/article/pii/S095965261733233X)

>>Why do you trust that report? The numbers don't even make sense; just as an
example, dividing the consumed VoD data by the watched VoD time, you get
24 Mb/s, yet they say on the same row, "Average Bitrate: 3 Mbps". That's an
8x difference!

Can you point me to this inconsistency?

~~~
icebraining
> I would not say you're "counting", you're rather doing a back-of-the-
> envelope calculation with eye-balled numbers.

Yes, of course. My point is that I wasn't forgetting them.

> I don't "trust" the report per se, but they do provide sources, from which I
> can inform myself.

And did you?

> Now, the specific attribution to online video may be less clear-cut

That's the thing, though. We _know_ that streaming a video is a very low-power
activity, since we can do so on a very low-power device. And I added 100%
overhead, whereas the PUE of a real datacenter is actually about 1.09
nowadays.
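PUE (Power Usage Effectiveness) is total facility power divided by IT equipment power, so the 100%-overhead assumption above corresponds to a PUE of 2.0. A quick sketch of the gap versus a reported hyperscale figure, using the assumed ~90 W routers-plus-server draw from earlier in the thread:

```python
it_power_w = 90                    # assumed IT draw (6 routers + server share)

assumed_total = it_power_w * 2.0   # 100% overhead, i.e. an implied PUE of 2.0
modern_total = it_power_w * 1.09   # reported hyperscale datacenter PUE

print(f"with 100% overhead (PUE 2.0): {assumed_total:.0f} W")
print(f"with PUE 1.09:               {modern_total:.0f} W")
```

Strictly, PUE covers only the datacenter side, not the routers, so applying it to the whole draw overstates the correction; the point is just that the 100%-overhead assumption was generous.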

So any report that tries to tell me Google or Amazon, who spend millions on DC
energy improvements, are wasting orders of magnitude of energy more, is not a
serious report.

A very big issue is that they assume a linear correlation between transferred
data and power use. For the network part that may be reasonable, but for the
DC it absolutely is not. Mining cryptocurrency, for example, takes a huge
amount of energy to produce a message of a few KBs.

> Can you point me to this inconsistency?

It's in the Technical details of the report.

~~~
blue_devil
>>A very big issue is that they assume a linear correlation between
transferred data and power use.

Right, so they do have to make lots of assumptions (same as all reports I've
seen that try to estimate carbon footprint) - e.g., about the type of
network, the device, the energy mix supplying DCs, etc. For instance, only 2%
of network connections in a country like Germany go over fibre; the rest is
copper. In Italy lots of people outside major cities use LTE for regular
Internet access - absolutely insane in terms of energy use.

>>any report that tries to tell me Google or Amazon, who spend millions on DC
energy improvements, are wasting orders of magnitude of energy more, is not a
serious report.

You're talking past the point here, seems to me. The report is _not_ claiming
that any specific technology is "wasting energy" in terms of efficiency - it
poses the question of whether this energy is spent "wisely", seeing that
supposedly the world is trying to reduce global emissions, and the consumption
of say DCs or online video is growing constantly. Surely, not an unreasonable
question to be raised, don't you think?

Amazon may be efficient but their energy comes mostly from non-renewable
sources. On a scale of 1-10 how important do you think this use of dirty
energy is?

------
undershirt
A cloud of malaise in the miasma sky—suffused by Kelly’s technium, by
Mumford’s megamachine, et al.

