Superficially, it might seem informative.
But if you look into the various industries, not so much.
In aviation, engines have become 1% more efficient per year, which means 3 related things:
1) they burn half as much fuel as ~72 years ago
2) the range is double, so 737s can fly globe-spanning international routes now (this is game-changing)
3) engines are "powerful enough" for the first time - for any mission, there is an off-the-shelf engine. Until recently the v1 was always underpowered.
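The compounding math checks out, by the way — a quick sanity check (not from the original comment, just illustrating the 1%/year claim):

```python
# A constant 1%-per-year efficiency gain means fuel burn shrinks
# by a factor of 0.99 each year, compounding multiplicatively.
years = 72
fuel_fraction = 0.99 ** years
print(f"After {years} years: {fuel_fraction:.1%} of original fuel burn")
# roughly 48.5%, i.e. about half
```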
Computers might be banal outside of IT, but I remember when floating point was optional and compilers were a crap shoot. :) Also, Moore's Law kept pace in the time window the author wrote about.
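For scale, here's how Moore's Law compares to aviation's pace over the same window (a minimal sketch, assuming the usual doubling-every-two-years formulation):

```python
# Transistor counts doubling every 2 years vs. a 1%-per-year gain,
# compounded over the same 20-year window.
years = 20
moore = 2 ** (years / 2)   # doubling period of 2 years
aviation = 1.01 ** years   # 1% compound annual improvement
print(f"Moore's Law: {moore:.0f}x; 1%/year: {aviation:.2f}x")
# ~1024x vs. ~1.22x
```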
We do need to put the pollution genie back in the bottle though. I remember when the Great Lakes and Niagara River were flammable, with green froth oozing from the banks.
There is little awareness of reuse in the West, which is hugely better than recycling.
Not well known: while the US recycles 99% of cars, most municipalities just landfill those blue bins, since private garbage contractors can't make money off them.
In computing, CPUs have been improving continuously at a regular pace for the past 20 years, and you would not want to work with a 2000-era CPU in 2020.
Is the fiftieth-anniversary 737 much better than the original? Ignoring the MCAS issue, sure. But compared to the best airliners from 50 years before that, those 737s perform virtually identically.
And the price for that was the 737 Max, which cannot be made to fly safely but at least it only barely sips fuel when it crashes. Thank goodness it was designed by bean-counters rather than engineers.
This reminds me of a line from Snow Crash describing kids' pajamas, which can either be flameproof or non-toxic but never both at the same time.
I doubt that very much. If the 737 MAX never flies again, it will be because of the same much-increased aversion to risk in modern society that the article describes, not because the design cannot be made safe.
Boeing cut corners with the 737 MAX. As a result, bad things happened. If Boeing had had an appropriate aversion to risk, they would be in a better place. Instead they took reckless risks, and now they are screwed.
Sure, if we had cheap fusion or room-temperature superconductors, that might be game-changing.
Barring those, I'll take 1%/year improvement for 100 years or more.
For example, you're not going to see supersonic flight again due to the parasitic drag (physics) requiring so much fuel.
You could make ICE engines twice as efficient by cutting the car weight in half.
Municipalities had this great program before - swap your old fridge or water heater for a free, more efficient one.
Maybe have Amazon ship your items to a community mailbox without the brown box?
Those are doable things that pay benefits immediately.
I ship to an Amazon-owned pickup location sometimes. I thought this would be enough of a nudge. But no, it still comes in a box. Because everyone walks in downtown, the first thing we all do is remove the box right there, recycle it, and walk out with whatever small item we bought. Come on, just put the stupid thing in the lockbox. I am standing right on the other side :(
If we look at flights, he lists "8 hour Transatlantic flights" as being more important than the cell phone(!!).
Is it 8 hours that was the breakthrough? What about 9 hours? 10?
It seems to me that if there was any real breakthrough in that domain (and with travel generally) it's the change from "multi-day travel" to "waste one day in travel" to "I can go for a meeting during the day and be back in the evening".
While Concorde technically moved transatlantic travel into the "there and back in a day" bucket, practically it didn't - the timing of the flights meant you had to stay overnight anyway.
In that case, there probably aren't any more breakthroughs possible - except cheap, reliable, high quality video calling (which he totally ignores).
Cell phones have made access to information available to the poorest. A tuk-tuk driver in Cambodia can affordably stream YouTube all day long (and they do!). The impact of cell phones in the last 10 years is far larger, and will impact far, far more people, than all of air travel.
And we should avoid the Kelvin trap, as we can rarely foresee a breakthrough before it happens.
And your point about international business made me realise he has ignored all the huge innovations in the payments and international remittance space.
The 'Sec was the standard size of all such units, determined by what can fit comfortably in the human hand. At a quick glance, it did not differ greatly from one of the small electronic calculators that had started coming into general use at the end of the twentieth century. It was, however, infinitely more versatile, and Duncan could not imagine what life would be like without it.
Because of the finite size of clumsy human fingers, it had no more controls than that of its ancestor of three hundred years earlier. There were fifty neat little studs; each, however, had an unlimited number of functions, according to the mode of operation - for the character visible on each stud changed according to the mode.
-- Arthur C. Clarke, Imperial Earth, 1976
That itself was nearly a decade after the newspad of 2001: A Space Odyssey (1968):
See also Vannevar Bush's seminal "As We May Think" (1945) (https://www.theatlantic.com/magazine/archive/1945/07/as-we-m...) which discusses desktop devices, the ever-popular Dick Tracy watch, introduced in 1946 (Garyn G. Roberts, Dick Tracy and American Culture: Morality and Mythology, Text and Context (McFarland, 2003), p. 38), H.G. Wells predicting a Wikipedia-like service, "The World Brain" (1937) (https://sherlock.ischool.berkeley.edu/wells/world_brain.html), and E.M. Forster's "The Machine Stops" (1909) (https://www.ele.uri.edu/faculty/vetter/Other-stuff/The-Machi...)
1830 was the year the Industrial Revolution got out of beta. The Liverpool and Manchester railway opened that year. Double track, multiple steam engines, signals, schedules, and tickets. There were previous prototype steam engines and railroads, but that was the first time it got beyond the prototype stage.
The next 80 years brought huge changes. Electricity. Telegraphs. Steel. (Steel was about as exotic as titanium is now until the Bessemer converter, and it wasn't until about 1870 that the process was fully debugged and steel mills were built on a large scale.) By 1910 there were the first cars and airplanes. The New York City subway was running. Sailing ships were obsolete. Transportation had totally changed for the first time in thousands of years. By 1910, most of the pieces of the modern world before electronics existed in some form.
The first transatlantic dirigible flight was in 1919, the same year as the first nonstop transatlantic aircraft flight (Alcock and Brown); Lindbergh's solo crossing followed in 1927.
Steel at low cost was an amazingly late invention. The Romans could have built a Bessemer converter. Now that would have changed history.
If you go into a virgin forest and cut down all of the big trees... that's the easy stuff. It's kind of absurd, then, to ask why the world isn't producing as much wood any more.
The CPUs of the 1950s, which used thousands of vacuum tubes and took up whole floors of buildings, once seemed like a breakthrough. Today we only have these little phone things. Only incremental progress, right?
He credits to his golden quarter things like antibiotics (penicillin was discovered in 1928 and was used by Allied forces during WW2) and electronics (vacuum tubes had been in use since 1902), but calls the world wide web and cell phones incremental.
For example, was the internet seminal or incremental? Most people would say seminal, but most aspects of the internet were already in place at the time - as the telegraphy network.
For example, the almost simultaneous development of jet engines by both Allied and German air forces during WW2 makes it appear it was an inevitable, incremental innovation.
The problem as I see it is that it's not really clear how progress should be defined to address the sort of questions being asked in the article. Maybe someone has defined it convincingly and the article doesn't convey this, but without a clearer definition things seem to be a non-starter.
I have some intuition that the question of the piece is not really nonsense, that there was something about the interwar and postwar period that was different. A lot of advances since then have been powerful incremental improvements on those postwar innovations, but incremental nonetheless. The one exception I personally maybe see is networked communications, which have really revolutionized a lot and are still disrupting things. But outside of that my sense is that life (in the US) hasn't changed in fundamental ways since, say, 1975. Things are really different in a lot of ways, but the basics are the same. Conversely, my sense is things are really different compared to 1915.
On the other hand, I'm not sure any of my subjective senses of things are really accurate. I wasn't alive in 1915. So without quantifying anything convincingly this is all some armchair discussion.
I do think these kinds of questions are important, and I suspect there's some way of getting at this, maybe with very high-level economic or public health data (I've read that the biggest increases in life expectancy in the last 200 years or so, for example, have not actually come from any medical procedure or tech advances, but rather public health advances). But I agree that the way we define things is critical, and it would be interesting to address the question of the article definitively positively or negatively.
It's intuitive for anyone how an automobile is distinctly different from a locomotive. You need some understanding of computer science or physics to understand how quantum computing is different.
At the same time, there was indeed an explosion of new foundations during the earlier 20th century (it's a stretch to claim it went on until the 70s) and it turns out we're not done just by formulating them, in order to explore and succeed in them as a society we need a couple of generations to let them sink in.
Once we get there, I am sure we will have new paradigm shifts. Just wait until after the next global crisis when structures in societies have fallen apart and we have fresh ground to build on. Not that I think that's desirable, but I'm sure I'm not the only one getting pre-WW vibes right now.
Cell phones have been available for 40 years, and ubiquitous for 25. Smartphones have been around for about 20 years, and have been in their fully-modern form for 13 years (dated from the iPhone). They've been pretty much static since then.
Something tells me that in 20 years people will still be trotting out cell phones as a counterexample to claims that technological advancement has slowed down significantly.
Upload photos to the cloud and Google photos does facial recognition of my individual pets!
Stupid? Maybe. But the technology and research to pull that off is impressive.
Smartphone cameras alone have pushed entire fields of research forward. Fancy 3d cameras are now used to save fractions of a second unlocking a phone! Try going back a decade and saying that 3d scanning would be a minor line item component that is shoved in a tiny notch above an OLED screen, a screen that has an image quality beyond anything available back then outside of a research lab.
Oh, then show them a 108-megapixel sensor. Then finish it off with one of the TPUs that is yet another barely-mentioned component in every phone.
For example, WhatsApp now lets people do very cheap, high quality, reliable video calls from anywhere on the earth.
For example, in a lot of ways we live in a sci-fi future, if the number of classic films featuring videochats as a sign of the future is any indication. On the other hand, those films did basically get the idea right, that there would be some screen with some little speaker and microphone that you'd talk to someone through.
If you go back a bit further though, like to the 1910s, there wasn't any idea of that at all. It was off the collective radar completely.
So in some sense, we are living in some continuation of the early 1970s, in the sense it was part of a future we could accurately envision. But we're not living in a continuation of the same era as pre "golden quarter" in that we aren't really entirely part of the future they imagined (I think 1945 isn't quite right, I'd push it back to pre-WWI).
At least that's one argument. I wish we could go back in time and interview a bunch of people who lived through roughly 1900-1960+ about this issue, and compare it to what people who were born in, say, 1950, might say about it now.
FWIW, my sense is that people in the mid-twentieth century underestimated change in communication technology and overestimated change in transportation tech.
My smartphone is also a full navigation device, 3d scanner, barometer, and can morph my face in real time.
Remember when facial morphs were crazy insane sci-fi stuff? Then they became "static image morphs that look like crap but wowee!"
Now the only reason we don't have photo-realistic morphs in real time is because society has collectively decided that looking like a slobbering cartoon dog is a lot more fun.
In a sense even reading ads is "learning": you learn about new products that are out there and where to get them at what price.
At least you are learning that someone is advertising such a product. How useful that is, is of course a different question.
That's a textbook example of biased thinking. You're judging the historical and present condition of other cultures by your own present condition.
When you study history or sociology, the first lesson you will be taught is that you will need to question your own frame of reference, before you can make factual statements about the historical record.
If anything, there are ample examples of traditional tribal communities who are struggling with modern problems - health, addiction, crime - which they didn't have just 50 years ago. Ask any traditional fishing community within the Northern polar circle, for instance.
What you consider as "poverty" is the baseline way of living. Isolated tribes simply don't know any better. The modern concept of "poverty" you try to apply, makes absolutely zero sense from their frame of reference.
If these tribes act resentful or hostile towards modernity, then it's because they see "progress" as an existential threat to a way of living that worked out for them for thousands of years. And that's a totally valid way of reasoning about how a human could live their lives.
This is the noble savage belief, but it really doesn’t match up with reality. Life was nasty, brutish, and short before modern civilization. People died early from preventable disease, war, famine, and crime was still a thing. Most people were peasants, serfs, or slaves, chained to the land and forced to work for a small upper class who enjoyed the fruits of their labor.
If you look at the history of South America or Africa, they still had kingdoms and empires, they still had wars, and they certainly still had poverty. People in China or elsewhere also weren’t in “isolated tribes” they were still poor peasants in an empire.
(Edit: just to be clear I do accept there are some problems unique to modernity but I want to challenge the idea that the past was a golden age)
I was referring to ancient tribal communities that lost their traditional way of living to Western dominated modernity.
> This is the noble savage belief
That's what you read into my statements. That's not what I implied.
Let me be crystal clear here: of course life wasn't easy back then. People suffered and died. If you didn't adapt or pull your weight, you died. If you were unlucky in any respect, you died. I'm very much aware of that. And I reject any glorification or "one-is-better-than-the-other" comparison vehemently.
The idea that modern civilization and "progress" somehow solved all of that is manifestly false. Millions upon millions are born and live in abject misery, simply because economies of scale have allowed those lives to exist, but without the affordances - social cohesion, cultural identity, an authentic belief system,... - to sustain them properly or give their lives a due sense of purpose and meaning.
I'm pushing back against usage of the past - see, they were poor, primitive and miserable - to be able to laud the merits of "Progress" without considering the misery it has caused and is causing as well.
Beware of generalization.
> tribes want to come into the modern world and experience the creature comforts of money
Beware of switching cause and effect.
Would they still have wanted to "experience the creature comforts of modernity" if modernity hadn't presented itself at their doorstep?
> it is us in the developed world who push back against it because of a desire to maintain a connection to our past, a living history.
That's projection. And it's not even "our past", so that would be cultural appropriation.
If anything, the developed world isn't a homogeneous society of like-minded individuals either. Most ethnological collections in Western museums don't have anything to do with Western culture, beyond the fact that these objects were taken out of context out of sheer curiosity.
There's an ongoing discussion to return objects to traditional communities, or what's left of them, as they carry particular cultural value. It's a debate which is hugely controversial because it forces us to recognize that our Western sense of "progress" hasn't benefited other cultures in absolute terms.
Case in point: Read Heart of Darkness and wonder why Ben Affleck's present endeavour to film that book is making certain European countries quite uncomfortable.
Yes, absolutely. Without either hospitals or contraception, it's quite common for women to have lost multiple children in their life. Without other sources of income, you are a crop failure or drought away from hunger.
It's easy to romanticise pre-modern life from the comfortable position of a developed country. It's much harder for the people who live it.
No, not even close. Pre-modern tribes don't even have language or concepts that define "hospital" or "contraception". If you don't have the ideas, it's improbable that they will emerge by themselves.
The fact that it happened in our Western world wasn't a wilful act. Society didn't wake up one day thinking "let's build a hospital". Our present situation is the result of hundreds of years of evolving thinking, and so having those concepts today is more attributable to happenstance than anything else.
It's incorrect to assume that any human tribe can and will come up with those ideas all by themselves and somehow "want them because that's what they want".
I stand by my comment.
> Without other sources of income, you are a crop failure or drought away from hunger.
Of course. Nobody is debating that. We're debating the biased notion that "progress" - Western modernity - is the primary and only way how to solve this; and - more importantly - the underlying, false, assumption that "progress is by definition good".
A slew of researchers, philosophers, politicians, writers, artists,... have moved on from that point of view since the 1940s and shown that this is a naive way of thinking.
It's called "Post-Modernism":
> It's easy to romanticise pre-modern life from the comfortable position of a developed country. It's much harder for the people who live it.
Of course it is. We're not debating that either.
But there are less people in poverty now, so... that seems like a good thing?
This is about which frame of reference you use to look at the world. And how that can severely distort your view.
Consider isolated amazonian tribes, or tribes on islands in the South Pacific. By our standards, they live in abject poverty: no electricity, no clean water, no jobs,... Then again, these tribes do not know anything else. They have lived like this since the dawn of humanity.
Then consider native tribes in America, Alaska, Greenland. People whose traditional ways of living have been replaced by the affordances of modern technology. And that hasn't necessarily been for the better. We are talking about fundamental lifestyle changes that happened over the course of less than a generation.
And that's even without the historical violence we tend to discuss / guilt trip over in the modern world: such as traditional colonization, imperialism and such-like.
Take Inuit tribes. Just 2 generations ago - some 50 years - these people lived in small traditional fishing communities, mostly secluded from the world. They lived according to their own ancestral culture, and that worked out fine for them.
Today, their way of living has been entirely upended. Sure, they still fish, but they now have social security numbers, salaried yet often low-wage jobs, Internet and television, pollution and noise from cars, snowmobiles and so on. The net result is that unemployment among the active population is a serious issue, which in turn causes mental and physical health issues, including crime and substance abuse.
When native tribes protest against oil pipelines, that's not a stand against the benefits of modernity itself. It's a protest against the fact that the arrival of modernity literally implies the disappearance of an environmental and cultural context in which they have been rooted for thousands of years.
To them, the benefits of electricity, healthcare, modern jobs and cars also come with a great cost: giving up their identity, their way of life and the risk of ending up in abject poverty due to the dynamics inherent to modern economic systems rooted in Western / European values and morals.
As such, when we talk about progress, it's presumptuous to assume that anyone who doesn't have access to modern affordances is "living in poverty" and "more progress" is the only answer going forward.
The big thing that traditional tribal cultures have going for them is sustainability. Sure, individuals lived precarious lives and relied on magical thinking to explain the world, but these communities as a whole have persisted for many generations with a low impact on their environment.
From their frame of reference, their way of life totally makes sense. Assuming that their way of thinking is backwards or obsolete, or assuming that their ending up unemployed or struggling with modernity is them not pulling their weight in "accepting progress" - well, that's just our own superiority bias clouding our thinking.
However, the OP said millions of people shouldn't ever have been in poverty, but were placed there by some combination of capitalism, colonialism, slavery, racism, politics, and probably a few more such fun things - which of course isn't much like the "traditional tribe" scenario.
Additionally, the traditional tribe scenario never supported large numbers of people, which is what average life expectancy measures. Large numbers of people are associated with cities, and financial poverty is a reasonable proxy for quality of life in a city context, on average.
Exactly. Thing is, cities didn't emerge by themselves. The underlying dynamic is the Agricultural Revolution of 12,000 years ago.
Moreover, I'd argue that we number in the billions today less because of our own individual volition than thanks to another agricultural revolution, spawned by the Industrial Revolution: synthetic fertilizer, machinery, etc.
The emergence of cities and metropolitan areas is caused by another dynamic: specialization. If you don't have to gather / hunt food, you can become a specialist. Money and finance are belief systems that helped galvanize that dynamic. As such, urbanization is the collective result of an individual adaptation strategy: get close to a large group of people with lots of demands for specialists. This ranges from doctors to factory workers.
The fundamental problem with poverty, then, is that the survival of an individual doesn't depend on social cohesion in a small tribe. It depends largely on market dynamics of supply and demand for specialist labour. Moreover, the cost of switching between specialisms is prohibitively high (i.e. factory worker becomes lawyer).
Yuval Harari laments the emergence of agriculture and cities as a disaster for Humanity in his seminal work "Sapiens". He makes a worthwhile point to consider.
> Additionally, the traditional tribe scenario never supported large numbers of people, which is what is measured by the average life expectancy.
Ultimately, there is absolutely no moral argument or reason, outside of our own thinking, to be found in nature that obligates us to become a multi-billion species of which each individual should live according to European / Western living standards.
Equally, there's no argument, outside of our own thinking, to be found in nature that says each and everyone is entitled to live at least 80 years; or have biological kids. One can aspire and hope that will happen, but reality just doesn't give out any guarantees.
That type of existential precariousness and uncertainty is well understood in small tribes and deeply ingrained in their cultures.
Now, to be clear, I don't want to make the case that living in small tribes is better than living in cities. Absolutely not. My point is that both are vastly different ways of living life, and comparing them from the vantage point of "one is better than the other" simply doesn't make much, if any, sense. What we can do is try to understand that tribal / traditional living is just as valid as a survival strategy and comes with its own advantages and downsides.
Hence why I vehemently disagree with Harari when he laments agriculture and cities. That's just one-sided value attribution.
If we want to live in a society that forcefully favors large metropolitan cities, large industries, optimized specialization and so on: fine. But then we have to accept that poverty will be a thing and there will always be large swathes of the living who have to make do with the left-overs.
Most importantly, if risk acceptance is the magic sauce, why do all these advances start only in 1945? It's not like people were super risk-averse in the early 1900s or the 1800s. Then again, I suspect there is some selection bias here to make all the data points fit in this time period. It's not like feminism started from nothing in 1945; first-wave feminism goes back well before that.
And they actually didn't start in 1945. We entered WWII with propellers and exited with jets; entered with visual/optical targeting, exited with radar and targeting computers; entered with Enigma-style and other primitive mechanical calculation devices, exited with the first real computers; entered with an idea of the possibility of nuclear energy, exited with the A-bomb; etc. The V-2 ballistic missile became the foundation of the Space Age in the US and USSR. Mass production of antibiotics, and really mass production of anything - Liberty ships, for example. We exited WWII with tremendously scaled-up R&D and mass-production capabilities, which were put in significant part to peaceful use after the war. And of course the GI Bill in the US. And the formation of two super-empires - the Eastern Bloc and NATO/Bretton Woods. As far as I can see, at least, the technology and economy of the Golden Quarter is firmly rooted in WWII.
* Cell phones
* Sequencing the human genome (and many others)
* Personal computers
* Lithium-ion batteries
* The world-wide web
* Quantum computing
* Video games
* Cheap wind and solar energy
* Hybrid cars
* The ISS (International Space Station)
* Genetically engineered crops
* SMS + instant messaging
* Hubble space telescope
* On-demand TV/streaming
* Self-driving cars
* Video calling
* Digital cameras
* Voice-controlled computers
* Digital music
* Social media
* Kepler space telescope, discovering 2662 exoplanets
* LED lighting + indoor farming
* Gene editing (Crispr)
* Atomic-force microscopy
* Magnetic-resonance imaging (MRI)
* Reusable rockets
* 3D printers
* MOOCs/online education
* Online marketplaces
* Micromobility (e-scooters)
* Biodegradable plastics
* Automated translators
* Optical character recognition
* Asymmetric key cryptography
* Electronic payments
* Online dating
* High-temperature superconductors
* Soft lithography
* Machine learning + deep learning
* Cloud computing
* Immunotherapy for cancer treatment
* Stealth technology (F-117, B2 etc.)
* Mars rovers: Spirit, Opportunity + Curiosity
* Voyagers I & II probes
* Cassini-Huygens probe to Saturn + moons
Out of 53 things on your list, I'd say that ~10 of them unquestionably pass as "large technical inventions, long after 1971". Maybe ~5, such as "online streaming/dating", pass the timeframe but are wishy-washy and don't fit the spirit of it for me.
* Cell phones - no, cells proposed for vehicle phones in 1947, mobile phones for vehicles launched in Sweden in 1956. And "In 1959 a private telephone company in Brewster, Kansas, USA, the S&T Telephone Company, (still in business today) with the use of Motorola Radio Telephone equipment and a private tower facility, offered to the public mobile telephone services in that local area of NW Kansas" - Wikipedia, history of mobile phones.
* GPS - US Navy track submarine locations using Doppler shift from polar satellites in "the mid 1960s" - https://www.nasa.gov/directorates/heo/scan/communications/po... Not the modern GPS system, but is "The GPS system" an invention where "demonstrated location tracking from satellite signals" isn't?
* Sequencing the human genome (and many others) - "The first method for determining DNA sequences involved a location-specific primer extension strategy established by Ray Wu at Cornell University in 1970". - Wikipedia, DNA sequencing.
* Personal computers - no, IBM designed a computer for "personal automatic computer" in 1957 - https://en.wikipedia.org/wiki/History_of_personal_computers#...
* Drones - maybe if you count autonomous flying machines with Li-Ion batteries and cameras. Self-propelling, unmanned, or remote control machines (Torpedo, Lunar explorer prototype), no.
* Lithium-ion batteries - yes!
* The world-wide web - the WWW specifically yes, hypertext systems generally, no. Doug Englebart demonstrated a hypertext+mouse+monitor system in 1968 https://en.wikipedia.org/wiki/NLS_%28computer_system%29
* Quantum computing - yes!
* Video games - no, 1950s; https://en.wikipedia.org/wiki/Early_history_of_video_games#I...
* Bitcoin/blockchain - yes.
* Cheap wind and solar energy - maybe. People have been using Windmills for work for years, and sun for heating water to wash in. Modern solar panels, invented a long time ago. Is there a specific "cheapening" invention since then?
* Hybrid cars - nitpicky if you mean "commercial Li-Ion and Gasoline cars", but hybrid diesel-electric vehicles go back to ~1903 for ships, 1920s for submarines and trains. https://en.wikipedia.org/wiki/Diesel%E2%80%93electric_transm... Counting cars as a new invention post 1971 is dubious, I think.
* The ISS (International Space Station) - Salyut 1, the first space station was built before 1971 and launched in 1971 - https://en.wikipedia.org/wiki/Salyut_1.
* Genetically engineered crops - yes.
* SMS + instant messaging - SMS yes 1992, instant messaging maybe - what about pagers (1949), wireless telegraphy, or the "talk" command of realtime chat on multiuser systems going back to the 1960s? - http://web.archive.org/web/20110615131332/http://osdir.com/m...
* HDTV - "The term has been used since 1936" - https://en.wikipedia.org/wiki/High-definition_television#His...
* Hubble space telescope - Hubble yes, space telescopes generally no - https://en.wikipedia.org/wiki/Space_telescope
* On-demand TV/streaming - probably.
* Self-driving cars - they either a) don't exist yet, or b) go back to the 1960s, with the NASA prototype autonomous wheeled lunar vehicle that could follow white lines around a track.
* Video calling - no, public videophone system in Germany 1936-1940 - https://en.wikipedia.org/wiki/History_of_videotelephony#Worl...
* Digital cameras - yes, depends too much on silicon process advances to argue.
* Voice-controlled computers - voice recognition of anything no, Audrey in Bell Labs, 1952 - https://medium.com/@joshdotai/its-all-in-the-voice-fba7e5a03...
* Touchscreens - no; "Historians consider the first touch screen to be a capacitive touch screen invented by E.A. Johnson at the Royal Radar Establishment, Malvern, UK, around 1965 - 1967." - https://www.thoughtco.com/who-invented-touch-screen-technolo...
* Digital music - no. Electronic music, Theremin, 1920. Pulse Code Modulation 1928 patent. First computer audio recording, 1950s. First commercial digital audio recording Jan 1971. https://en.wikipedia.org/wiki/Digital_recording#Timeline First digital sampling on a pair of PDP-8s in 1969 - https://en.wikipedia.org/wiki/Sampler_(musical_instrument)#H...
* Social media - vague, depends a lot on microcomputing. Did Ceefax/Teletext and the letters pages count? That's "early 1970s" tech.
* Kepler space telescope, discovering 2662 exoplanets - comes under "space telescopes", and no, for me.
* LED lighting + indoor farming - no; LEDs and Infra-Red LEDs, 1962 - http://www.historyoflighting.net/light-bulb-history/history-...
* Gene editing (Crispr) - yes, 1987.
* Atomic-force microscopy - yes, 1982.
* Magnetic-resonance imaging (MRI) - no, 1971 - https://en.wikipedia.org/wiki/Magnetic_resonance_imaging#His... . Nuclear magnetic resonance itself, 1940s - https://en.wikipedia.org/wiki/Nuclear_magnetic_resonance#His...
* Reusable rockets - as actually built, yes.
* 3D printers - yes.
* MOOCs/online education - Hypertext was educational. Arguable depending on "the internet" or not.
* Ebooks - too close to "text on a computer" for comfort.
* Online marketplaces - requiring the internet, yes, but remote ordering (catalogue, telephone) was not particularly new.
* Ridesharing - no? Taxi cabs, rickshaws.
* Micromobility (e-scooters) - battery scooter no, 1915, otherwise another way of saying Lithium Ion batteries. https://www.smithsonianmag.com/history/motorized-scooter-boo...
* Biodegradable plastics - mass production of them yes, invention/discovery of them, no - https://en.wikipedia.org/wiki/Biodegradable_plastic#History
* Automated translators - no; the Logos Machine Translation (MT) system translated military manuals into Vietnamese during the Vietnam war, before a 1972 report, and the first experimental system was demonstrated in 1951 - https://en.wikipedia.org/wiki/Machine_translation#History
* Optical character recognition - nope, GISMO 1951 - http://www.historyofinformation.com/detail.php?entryid=885
* Asymmetric key cryptography - right on the border; conceived in 1970, implemented by 1973. https://en.wikipedia.org/wiki/Public-key_cryptography#Classi...
* RFID - no, "Russian physicist Leon Theremin is commonly attributed as having created the first RFID device in 1946 (Scanlon, 2003)" - http://www.u.arizona.edu/~obaca/rfid/history.html
* Electronic payments - no, "Electronic payments have their roots in the 1870s, when Western Union debuted the electronic fund transfer (EFT) in 1871." - https://blog.forte.net/electronic-payments-history/
* Online dating - yes, because "online". Telephone dating (remote, technological), probably not.
* High-temperature superconductors - yes, 1986 or so.
* Soft lithography - yes.
* Machine learning + deep learning - no; neural networks were first demonstrated in 1959 and put to commercial use - https://cs.stanford.edu/people/eroberts/courses/soco/project... . Deep learning - is there a specific invention there apart from "really big GPUs"?
* Cloud computing - "invention" is what, specifically? Centralised computing, no, that was mainframes.
* Immunotherapy for cancer treatment - no, dates back to 1891 - https://www.targetedonc.com/publications/special-reports/201...
* Stealth technology (F-117, B2 etc.) - no; "Development of modern stealth technologies in the United States began in 1958" - https://en.wikipedia.org/wiki/Stealth_technology
* Mars rovers: Spirit, Opportunity + Curiosity - those two yes, planetary rovers as an invention, no.
* Voyagers I & II probes - maybe; they launched in 1977, so they must have been invented some years before that. The general concept of space probes, such as the Russian Venera probes to Venus, goes back to the 1960s, and the calculations for the Voyager mission lining up with the outer planets do too. Are the specific Voyager probes in the 1970-1975 timeframe strong enough inventions to make the article "absolute nonsense"?
* Cassini-Huygens probe to Saturn + moons - under space probes, so no for me.
It is hard to judge, and in many ways arbitrary - there must be a lot of inventions along the way to making something mature enough to be generally available and useful, and improving things gradually does open doors to new ways of using them. Inventing a way to increase LiIon battery density 10% through new factory processes is valuable and useful, and doing that several times makes for a sea change in the kinds of devices which can be economically viable. I'm hoping for room-temperature superconductivity to be discovered in my lifetime, and think it will be a breakthrough if it is; but the transition temperatures of superconductors have increased over the years, and I have no reason (as a layperson) to think there is an arbitrary universe cutoff around 273 Kelvin, so if we keep progressing and trying, it will eventually appear. The moment when Onnes discovered superconductivity at all was much more of a breakthrough into a new realm, even if a room-temperature superconductor sounds like crossing a significant threshold.
The article to me is about the feeling of novelty and new discovery missing from post-1971 life, rather than saying no economic growth in useful mature technology is happening.
I was absolutely certain the article said "we've forgotten what it feels like to have a breakthrough" in more or less exactly those words. And it says nothing like that at all. This bothers me.
By saying everything is innovative, you're making no distinctions - if everything is, nothing is.
Automated translation: the 1950s.
And this is a weird list... some of these things are genuine advances improving the human condition, then there is bitcoin and e-scooters. Not to mention things that don't really exist in useful forms yet (quantum computing).
This would be like saying, in 1960, that the transistor wasn't a big deal. It wasn't... yet. It was sure going to be, though.
What might be the current stuff? ML might be a big deal. It's too early to tell. CRISPR might be enormous. Can't tell yet. There might be stuff I haven't heard about yet, too. Will that add up to anything as significant as what happened in the 1950s? Ask me in 60 or 70 years.
Managers couldn't accept more than 0.01%
It turned out to be ... 1%.
Talk about a disconnect in risk assessment!
Even though that is the law of the land.
Sure. But modern politics finds this unacceptable.
In general it has been predicted that the next level of automation is intellectual automation driven by AI. The work of lawyers and doctors and even computer programmers can be done by AI, they predict. Or at least AI becomes a powerful helper for them. Does this mean there will be mass unemployment? Well, I think it is just fine if the work week is reduced to one day. The only issue is financial inequality, but that is easily taken care of with progressive taxation.
We'll really progress when AI cannot be tricked by politicians' doublespeak.
His argument also ignores all the battles for civil rights progress before 1945 (e.g., Gandhi's Salt March was in 1930, and the Suffragettes were active between 1900 and 1928. Pretty sure there was some war in the US that did something about slavery a while before 1945 too).
Edit: err.. heavy downvotes? Isn't this obviously correct? What am I missing here?
This doesn't explain why we haven't sent a person to mars or had meaningful civil rights reforms. But technological progress is often limited by the switching costs from one paradigm to another. And those costs tend to compound over time
On the one hand, it's kind of sad that even the most progressive big trends in R&D today, clean energy and space travel, will only take us forward to an incrementally updated version of 1971. On the other hand, black swan events like the coronavirus are forcing the world to break ranks with the slow pace of progress. Wish the times were more interesting.
Maybe the “golden quarter” was just a coincidence of findings and engineering feats that only owes its success to pure chance.
I mean: if scientific breakthroughs follow some kind of Poisson distribution per discipline, then it is bound to happen that multiple breakthroughs that are completely independent will fall within a relatively small period.
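This clustering effect is easy to see in a toy simulation. The sketch below is my own illustration, not anything from the article: it models each discipline as having a small, independent per-year probability of a breakthrough (a discrete approximation of a Poisson process), then finds the densest 25-year window. All parameter values (`n_disciplines`, `p`, etc.) are arbitrary choices for demonstration.

```python
import random

def densest_window(n_disciplines=10, years=200, p=0.02, window=25, seed=1):
    """Simulate independent breakthroughs and find the busiest span.

    Each discipline has probability `p` of a breakthrough in any given
    year, independent of everything else. Returns (start_year, count,
    total_breakthroughs) for the densest `window`-year span.
    """
    rng = random.Random(seed)
    events = sorted(
        year
        for _ in range(n_disciplines)
        for year in range(years)
        if rng.random() < p
    )
    best_start, best_count = 0, 0
    for start in range(years - window + 1):
        count = sum(1 for y in events if start <= y < start + window)
        if count > best_count:
            best_start, best_count = start, count
    return best_start, best_count, len(events)

start, count, total = densest_window()
uniform_share = 25 / 200  # a 25-year window is 12.5% of the 200 years
print(f"{total} independent breakthroughs over 200 years")
print(f"densest 25-year window: years {start}-{start + 24}, "
      f"holding {count / total:.0%} of them vs {uniform_share:.0%} if even")
```

Running this with different seeds, the densest window routinely holds well more than its "fair share" of breakthroughs, even though nothing in the model connects the disciplines: a "golden quarter" emerges from pure chance.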
1. Real-world performance is not necessarily linear with technological progress.
2. Landmarks by which we measure progress may be far apart.
#1. We frequently compare computer memory to automobile engines and ask why engines have improved so little while memory capacity has skyrocketed. It's an unfair comparison. If we were still using magnetic core-memory, as we were in the 60's, capacities would now be only incrementally larger. The growth of memory capacity has been driven by a succession of novel technologies, while the fundamentals of an automotive engine are essentially the same as they were a century ago. We must ask, "Why is it harder to come up with novel types of engines than novel forms of memory?". Perhaps it's due to complexity. Computer memory is a very simple thing repeated many times. Improving that simple thing yields great benefit with comparatively little effort. An engine is a collection of many different parts that interact in complex ways. Incrementally redesigning individual parts is unlikely to result in novel types of engines with fundamentally different capabilities. To come up with a new type of engine you have to design a very complex system all at once. Expecting memory and engines to improve at the same rate is not realistic.
#2. What landmark comes after sending men to the moon? Sending them to Mars? The Apollo missions were tremendously expensive and the scientific value of research conducted on the Moon was small compared to the progress achieved in getting there. That may be why we stopped sending manned missions to the moon. Why it's taken so long to send men to Mars may be because the difficulty is much greater. We need far more thrust to transfer orbits. We need to sustain humans in space for long periods of time without protection from the Earth's magnetosphere. We need to send squishy humans down a gravity well much more significant than the Moon's without much atmosphere for braking. We may have been making steady progress on all of these problems but, until we actually land a human on Mars, the Moon remains the high-tide mark.
YET, today, we carry pocket supercomputers with nearly all of the world's information at our fingertips.
Ten years ago this was not the case.
Individually, we have ended up incentivizing the smartest people first to figure out how to give mortgages to people who can't afford them, and then how to get people to click more ads. Lo and behold, that's what we got ...
And no, I am not promoting full-on communism. I am saying there is a goldilocks case here. There are things that are so important to research and develop that we need to collectively put in more resources than the free market allocates. There also are endeavours that are harmful and should be regulated. And there are things that should be left to the free market.
Except the web. And ecommerce. And video streaming. You might call them "banal", but they changed how people live more than most other things before, except electric power and cars.
Humans have finite needs, so if we are competent enough to solve them it is only natural that progress will stall. That should be a good thing: a sign that the most relevant problems were already solved.
But yeah this is a US centric, physically biased view of the world that ignores major changes in technology and society.
Firstly, I'd note that of the key innovations he cites (the Pill, electronics, computers and the birth of the internet, nuclear power, television, antibiotics, space travel, civil rights), all except the Pill are clear refinements of developments that were occurring prior to 1945, and most prior to WW2.
Sure, our phones are great, but that’s not the same as being able to fly across the Atlantic in eight hours or eliminating smallpox.
It's not a competition, but I think there are pretty compelling arguments that incredibly cheap, highly reliable instant communication devices available everywhere (with the exception of North Korea) are more important than being able to fly across the Atlantic.
And most recent advances in longevity have come about by the simple expedient of getting people to give up smoking, eat better, and take drugs to control blood pressure.
There is some truth in this, but what an advance it is! In the US, life expectancy at birth in 1971 was 71; in 2019 it was 78. But this undersells the advance, since most of the world has improved even more. For example, China went from 60 to 76!
So why has progress stalled? It hasn't stalled because it was never going forward. It was always going around in circles. A few benefiting here, a few losing out there. And those places are constantly changing and morphing. America was indeed making progress, 20, 30, 40 years ago. Now it's going backwards on most of the issues mentioned in the article. We used to be somewhat democratic. Now we are just ruled by oligarchs. Europe has taken up the slack, whereas 70 years or so ago it was in ruins.
To think of progress as some constant driving force is a folly. There is no such thing as progress. Things happen. In any given place, at any given time, sometimes they are for the better, sometimes for the worse. That's the only conclusion one can draw from history, even with a myopic look at only the last 100 years. If you go back further, there is no other conclusion to be drawn. Progress is a dream. It's not reality.
Life expectancy is increasing everywhere, in every socio-economic group except in the US. That's bad for the US, but we don't say progress has stalled just because of political issues in one country.
The biggest trick Devil played on mankind was to make us believe that progress didn't exist :-)