Hacker News

Personally I find it really irritating that so many people, such as the humanities teacher, suggest that technology has made life on Earth worse recently like it’s a commonly-understood truth. Yes there are problems with a handful of companies right now, but for the most part life has been improved for the vast majority of people in the world thanks to things like mobile phones.

It's a form of thinking that comes from a sentiment that human civilization is in need of radical change. The problem is that it's hard to get people to support radical change, which will necessarily come with extreme amounts of suffering and resistance, when everyone is relatively comfy with their way of life.

The solution is to put a hyper focus on the problems and suffering which you can find, warn of an apocalyptic future just on the horizon, and sow complete dissatisfaction with your way of life by making you resentful of those who have it better than you and penitent for having it better than so many. This way, you can take a person who is living an extremely privileged life and make them want to turn it upside down.

I think it's for the same reasons that most developers who work on projects that they didn't create from the ground up want to do a refactor or a total rewrite. The idea in your mind never has the flaws that the real implementation acquires through time and use.

I don't think the concern here is the abuses of companies, or technologies like mobile phones. The concern is technologies like thermonuclear weapons (far more destructive than even the nuclear weapons used against Japan), which we are just very lucky have so far not been used in earnest. I think it is a commonly-understood truth that those technologies present an existential threat to society, if not all humanity.

Artificial intelligence is assumed by many to present such a threat (i.e. Terminator-style intelligence singularity scenario). Many speculate that applications enabled by quantum computing could also be similarly dangerous.

There is also an argument that nuclear weapons have actually been a great good for humanity, as they have prevented wars between the countries that have them.

Yet those thermonuclear weapons have been largely responsible for the Long Peace [1].

[1]: https://en.wikipedia.org/wiki/Long_Peace

"I'm mad that people are making sweeping generalizations about technology not being good, because [sweeping generalization that technology IS good]."

Really not at all what he meant. The problem is not the act of generalizing itself; the problem is that this specific generalization (that technology has made life overall worse) is obviously wrong.

He didn't actually make that claim, though. The mention of medical science suggests that he isn't opposed to any and all technological progress.

I do agree that the phrasing of the question suggests a more skeptical opinion of technology than I hold, but his basic concern (a bunch of apes that can now hit each other with nuclear warheads instead of sticks) looks legitimate to me...

This is true if you consider a snapshot in time from the perspective of an individual, but the trajectory technology has enabled us to put the planet on is speeding us toward crisis. We have already caused a major extinction event and continue to do so with no meaningful sign of stopping. Keeping our societies running involves extraction of finite resources. We are living on borrowed time. If we burn through all our fossil fuels and get hit with a big solar flare or some other major event happens, we may not have the energy reserves to recover. It's easy to celebrate now because the worst consequences of our collective actions are barely starting to come back to us. Just because that timescale is long compared to a human lifespan doesn't mean everything is great.

You may want to read "The Industrial Society and Its Future". Its author is underrated as a philosopher.

Also the books "Anti-Tech Revolution" and "Technological Slavery"

Why is that? In myself I see a common thread of recalling "bad" things more strongly than "good" things. Is it an evolutionary-benefit kind of thing, where avoiding bad things matters much more than running toward good things, because being dead is far more permanent than being happy?
