Where has all the productivity gone? (johndcook.com)
236 points by wellpast on July 31, 2021 | hide | past | favorite | 211 comments



There is a huge assumption here that productivity isn't increasing, which is false. Economic output per hour worked has been steadily increasing in every country since their respective industrial revolutions.

When you go to the source tweet, it turns out that it's comparing the developed world to the developing world. Developing economies will always grow faster, but not necessarily efficiently, because there's more low hanging fruit to pick. These low hanging fruit can be picked by throwing more people at the problem (increasing workforce participation or working longer hours).

One example is how much more time it takes to build infrastructure in America vs China. If you look at the data[0], productivity in both countries is growing at roughly the same rate. It pretty much invalidates the assertion in the blog and the source tweet.

[0] https://ourworldindata.org/grapher/labor-productivity-per-ho...


Those are PPP numbers, which means they are a total joke. The graph you post does not show the post war boom, nor does it show the productivity stagnation after 1973. The USA went from 3% per year productivity growth, before 1973, to 1.5% productivity growth after 1973. None of that is visible in the chart because the chart is using PPP numbers.

But, as a separate issue, the above essay is focused on computers, and so it implicitly raises the Solow Productivity Paradox: why don't computers seem to add anything to productivity?

See:

The Solow Productivity Paradox: What Do Computers Do to Productivity?

https://www.brookings.edu/articles/the-solow-productivity-pa...

McKinsey wrote this in 2018, predicting a productivity boom was coming soon, though of course this prediction has been made often since 1973, and the only time it was even slightly true was during the boom in big box retail 1995-2005:

https://www.mckinsey.com/business-functions/mckinsey-digital...

It's well known that in the 1800s the telegraph and then the telephone allowed a large scale re-organization of capital, and capital invested heavily to take advantage of the new technologies, but it was impossible to see a productivity benefit. These technologies allowed new systems of control, but not necessarily any productivity benefit. Since 1970 computer network technologies have offered a similar reorganization, but they are vastly more powerful than the telephone, and they allowed a re-organization that was vastly more profound than any previous reorganization of capital. Millions of jobs were outsourced to India, China, Hong Kong, Singapore, and then Brazil, Russia, Romania, etc.


> But, as a separate issue, the above essay is focused on computers, and so it implicitly raises the Solow Productivity Paradox: why don't computers seem to add anything to productivity?

Just going by eye, there is a pretty sharp trend upward from about 1990-2005, which I think roughly corresponds to the period when the bulk of computerization happened in homes, offices, schools, as well as being put to use in many of the most important embedded applications -- factories and machinery, automobiles, etc.

Eyeballing the trend before 1990 has it continuing to low-$50s, instead it's mid-$60s, a 20% increase.

So why do you think this graph shows computers don't seem to add anything to productivity? I'm not saying they have, but I can't see how that graph demonstrates they haven't.


I cannot take credit for the Solow Productivity Paradox; all credit goes to Solow himself.

The uptick in productivity between 1995 and 2005 is mostly attributable to 2 sectors:

retail

finance

In retail, it was the era of big boxes, led by Walmart. Basically, it was the era when big retail learned how to take advantage of big databases to better manage inventory. Sadly, this did not seem to be the beginning of a sustained period of higher productivity trends. Rather, it was a one-off innovation.

In finance, much of the uptick involved HFT. Some economists feel this contributed to the crash of 2008, so how much we should celebrate this particular uptick is open to question.


One of my thoughts is that the hard return on efficiency isn't linear.

140 years ago, generators were 60% efficient. 20 years later they were 96% efficient: a gain of nearly 40 percentage points. 20 years after that, 98% efficient: 2 points better... Now maybe 99.5%, so 1.5 points better.
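The diminishing returns can be made concrete with the rough milestones above (a back-of-envelope sketch using the comment's own figures, not sourced data):

```python
# Rough generator-efficiency milestones from the comment above, about 20 years
# apart. Each step adds fewer percentage points than the last, even though
# each step eliminates a large share of the remaining loss.
milestones = [60.0, 96.0, 98.0, 99.5]  # efficiency in percent

for prev, cur in zip(milestones, milestones[1:]):
    gain = cur - prev
    print(f"{prev}% -> {cur}%: +{gain:.1f} points "
          f"(remaining loss {100 - prev:.1f}% -> {100 - cur:.1f}%)")
```

The absolute gains shrink from ~36 points to 2 to 1.5, which is the "hard return isn't linear" point in numbers.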

I feel like a lot of the things we throw computers at to improve productivity are kind of already efficient. My mom's first washing machine washed clothes. Mine, with a microcomputer, also washes clothes. My parents' first microwave had a power dial and a mechanical timer. You reheat food in it. Mine, with a computer, also reheats food.

There is no real fundamental change here.


Do you have a source for that? You're saying a productivity increase that amounted to around 20% across the entire workforce is mostly attributable to advances in HFT and logistics in the retail industry?

Even if we ignore that both of those were probably only made possible with computers, that seems astounding considering retail and finance make up less than 20% of the economy by GDP.


So here is a link that is a few years earlier than the one I want, but which covers the productivity numbers in the same way:

https://krugman.blogs.nytimes.com/2009/11/09/reagan-mythbust...

But that was written in 2009, before it was clear that productivity slowed again after 2005.

In a later post, Krugman analyzed the higher productivity of 1995 to 2005 and shows much of it was simply big box stores and HFT.

In general, the period since 1973 has been disappointing for productivity. And as yet, computers have not brought us the kind of sustained productivity growth that we saw in the early to mid 20th Century.


I can't load that page for some reason, and I don't really like Krugman; if it just comes from a blog post with his opinion and vague handwaving, I don't think it's worth much. But it surprises me that he would have changed his tune so quickly after being famously against HFT, e.g.,

https://www.nytimes.com/2009/08/03/opinion/03krugman.html

> It’s hard to imagine a better illustration than high-frequency trading. The stock market is supposed to allocate capital to its most productive uses, for example by helping companies with good ideas raise money. But it’s hard to see how traders who place their orders one-thirtieth of a second faster than anyone else do anything to improve that social function.

> What about Mr. Hall? The Times report suggests that he makes money mainly by outsmarting other investors, rather than by directing resources to where they’re needed. Again, it’s hard to see the social value of what he does.

> And there’s a good case that such activities are actually harmful. For example, high-frequency trading probably degrades the stock market’s function, because it’s a kind of tax on investors who lack access to those superfast computers — which means that the money Goldman spends on those computers has a negative effect on national wealth.


"he would have changed his tune so quickly after being famously against HFT"

He was still against HFT, that was part of his point. He was saying that even some of the surge of 1995-2005 might have been an illusion, because it was "only" HFT. If you feel that HFT really does raise the USA standard of living, then of course you are free to view the surge of 1995-2005 as entirely real.


The link should be here, if Google was working:

solow hft retail site:krugman.blogs.nytimes.com

At the moment Google is acting as if Krugman never mentioned HFT or retail.

Adding to the problem, the New York Times is giving 503 errors over most of Krugman’s old weblog, so perhaps the Times has given up on maintaining the old blog.

Am I allowed to make the joke that Solow’s Productivity Paradox is mostly explained by rampant link rot? But I’m only half kidding, because the rapid way technology breaks down is an obvious hindrance to productivity.


“You're saying a productivity increase that amounted to around 20% across the entire workforce is mostly attributable to advances in HFT and logistics in the retail industry?”

No one ever suggested that. It’s the excess productivity growth of 1995-2005 that is largely attributable to retail and what improvements there were in the productivity of trading financial instruments.


I'm not talking about the total increase of productivity, I'm talking about the roughly 20% it increased beyond where it would have been if the pre-95 trajectory had continued. I.e., the excess. It just doesn't seem like a plausible explanation.

EDIT: going back and looking at the actual numbers rather than eyeballing it's more like 10-12% excess. That's across the entire economy of course, so it would still be a massive increase to attribute that mostly to a subset of retail and finance sectors.


I have a research assistant I hire to deal with FRED for me, since it is really its own specialty, and at some point we will get to the productivity numbers, but I'm not going to engage them simply for a comment on Hacker News. I probably should not be writing on Hacker News, I keep thinking that I should focus my efforts elsewhere. But eventually I'll have more complete work to share with you and the world. Feel free to follow my blog, you'll see an announcement at some point in the next 6 months.


There are internal corporate studies out there that try to determine which manager or technology brought productivity gains.

The math is so complex they could not draw any conclusions.

Understand: we listened to less-educated men of the previous generation about building business and the economy. Men who missed out on our early education in science, and who were often far more religious. Men who had much different political narratives to satisfy; IBM was gifted government research. How many of our startups are?

Brookings conveniently avoids discussing the weakening political power of the US over the decades; easy to grow when the world needs you to manufacture for it after WW2.

It’s not hard for me to believe much of our economic theory is bullshit, and output is driven by political arrangements. Humans will generate economic activity naturally to survive. One can only induce so much superfluous activity beyond that.

Humans have a history of ignorantly extrapolating the wrong conclusions from reality.


I think the tweet linked to by OP is using the colloquial definition of productivity (i.e. I can get more stuff done) and uses examples of computer technology etc.

This isn't to be confused with the economic definition of GDP per hour worked - which has far more to do with the sectors of industry that a country has.

After all, a few financiers doing million-dollar deals is always going to generate more GDP per hour than coltan miners, even though it's not obvious which is actually generating more wealth. Robert F Kennedy famously said the GNP "measures everything in short, except that which makes life worthwhile".


You increase what you optimize for. GDP increase is the stated goal of most current societies and they work to achieve it.


>There is a huge assumption here that productivity isn't increasing, which is false.

Actually the assumption in the post is the exact opposite: that productivity has increased.

The question it poses then is "so what do we have to show for it?".


Cheap clothes, cheap flights, cheap music, movies, cheap food, longer life expectancy, bigger homes with smaller household sizes, cheap washing machines and various appliances, all with far fewer working hours, and far less household work while you're off-work.

I'm definitely better off than my parents or grandparents. Hell, we have stories in our family where my uncles would shower at the communal bathhouse cause their home didn't have a bath nor hot water. Everyone used to wear hand-me-downs. We used to send cassette tapes to family abroad because international calls were so expensive. We flew every few years instead of multiple times a year. We saved up for CDs, and I literally never, ever went out for dinner to a restaurant (outside of a rare holiday) with my parents. My dad lost both parents as a child to old age, my grandfather worked 6, sometimes 7 days a week, my grandmother spent almost the entire day running the household (e.g. washing clothes by hand), they had no retirement and lived like that till death.

I'm often surprised about this idea we've got it worse off. The conversations usually revolve around housing, healthcare and college having become drastically more expensive. (although, never normalised for interest rates that dropped from 15% to 3%, homes that average 2x larger than 60 years ago with fewer people, life expectancy increasing by a decade since 1950 etc etc). Life isn't better in all respects, but on the whole I certainly wouldn't want to trade with past generations. And a lot of that boils down to productivity allowing new measures of wealth for many.

If we look at it on a worldwide scale, the difference is much, much more pronounced. Many lower income countries saw massive productivity improvements. We shouldn't forget that the past 3-4 decades have been all about globalising markets. Factory workers in the US now compete with those in China. IT staff in the UK are competing with IT staff in India. Technology is relatively mobile and crosses borders easily. You cannot measure ceteris paribus productivity improvements in terms of GDP per capita in the West without correcting for increased competition abroad (e.g. in China). And in doing so, you will find that there are even bigger productivity increases in lower/middle income countries, too.


>Cheap clothes, cheap flights, cheap music, movies, cheap food, longer life expectancy, bigger homes with smaller household sizes, cheap washing machines and various appliances, all with far fewer working hours, and far less household work while you're off-work.

So, not much then.

For the consumer devices, cheap = disposable, as opposed to consumer devices you bought and could use for decades.

Likewise, cheap clothes mean diminishing quality (regardless of price bracket, from $5 items to $200 brand-name ones, they are more often than not made in sweatshops in the developing world, often with the same materials and processes). Plus tons of clothes thrown away, which is an environmental atrocity.

(As for cheap music and movies, that's a byproduct of one basic development: high speed internet, which enabled a new delivery method. Not the best example of an increase in productivity, not to mention those are pastimes.)

As for "longer life expectancy", all accounts I've read say that we got that early on in the 20th century, so 100+ years ago, and it was due to running water, bathrooms, hand washing, and a few such things. Low hanging fruit. Certainly not due to the last 70+ years of "increased productivity".

Meanwhile we pay more for healthcare, housing, and education (the 3 things that matter more), wages have stagnated since the 70s, and work-life balance has gotten infinitely worse, depression ever more prevalent, the middle and working classes have collapsed, and so on. No savings and no chance of retirement at any reasonable age for the majority as well.

>I'm often surprised about this idea we've got it worse off.

Probably it's from people who are not impressed by more and cheaper trinkets.


> As for "longer life expectancy", all accounts I've read say that we got that early on in the 20th century

In the US, life expectancy at birth was around 53 in 1921. Today it's almost 79. That's a nearly 50% increase.

https://www.statista.com/statistics/1040079/life-expectancy-...

It's a particularly interesting time to discount the life extension benefits of modern technology. We just saw multiple vaccines get developed and deployed in record time to significantly reduce the death toll from a worldwide pandemic. It's unlikely we would have seen this remarkable success even a decade ago.

Much of the rest of what you say is also inconsistent with actual data.


I think your graph is a good illustration of the loss of productivity. Compare the gains from 1900-1920 with those from 2000-2020. The earlier period saw an 11% gain in life expectancy, compared to a 2% gain in the latter period. Just looking at the line in your graph, you can tell progress is slowing down.

Why did we make more progress before and less later? We have better tools, computers, and software. We have more people, both in the US and coming in from worldwide along with a greater portion of the population being college educated and free to work on scientific and engineering problems. We have better developed theories of biological, chemical, and physical systems. And yet, when we knew very little, had little resources, and few people working on the problem - we made huge strides. Now, not so much.

Maybe this issue is that we've already plucked the low hanging fruit. That doesn't seem quite right to me though. Even if we have, we should be equipped, via the advantages I referenced earlier, to pick higher fruits.


I wouldn't be at all surprised if there are slowdowns of growth in various areas; some things are by definition going to be constrained in terms of space/resources (e.g. growth of the number of home units in New York City). I don't think infinite linear or infinite exponential growth is the expectation.

That having been said, for this particular example (life expectancy) it's typically a bit better to look at life expectancy at age x, often 5, 10 or 15 are chosen.

It's because infant mortality used to be sky-high: as much as 25% (one in four) in 1900 and 16% (one in six) in 1950 worldwide, versus under 2.9% today, with the best countries now around 0.2%.

If, say, one in four humans lives virtually zero years, that skews the average down a lot, e.g. from 65 to below 50. But it doesn't mean that the average person who makes it through the first few years would've died at that age.

If we look at improvements of life expectancy for those who already reached age 10, it looks like this, and is a pretty straight line with fewer indications of slowing down: https://ourworldindata.org/grapher/life-expectancy-at-age-10...
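The arithmetic behind that skew can be sketched with toy numbers (the infant-mortality rates are the rough worldwide figures quoted above; the assumption that survivors average 65 years is purely illustrative):

```python
def life_expectancy_at_birth(infant_mortality, survivor_expectancy):
    """Weighted average over the cohort: infants who die contribute ~0 years."""
    return (1 - infant_mortality) * survivor_expectancy

# Same survivor expectancy, very different expectancy at birth:
for era, im in (("~1900", 0.25), ("~1950", 0.16), ("today", 0.029)):
    print(f"{era}: infant mortality {im:.1%} -> "
          f"life expectancy at birth ~{life_expectancy_at_birth(im, 65):.1f}")
```

With 25% infant mortality, expectancy at birth drops to about 49 even though every survivor still reaches 65, which is why "life expectancy at age 10" is the cleaner series to look at.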


I think you’re reading too much into “life expectancy at birth”. Life expectancy at age 20 hasn’t moved nearly as much. The big wins were reductions in deaths during childhood.

Not to sound harsh, but the main economic benefit is that now people need to have fewer children to maintain a stable population.


Frankly, as a parent, I think "fewer dead children" is a pretty big deal. Perhaps one of the biggest improvements in human welfare that one would hope to see in society.


In the late 80s, life expectancy in Italy was around 70; now it's well above 80, and a large part of the difference is due to how many people died at 55-65 of cardiovascular problems or cancer.

A lot more of my relatives died of heart attack or cancer in the 80s than those from the next generation did in the last ten years. In fact in the last few years I witnessed many people getting surgery for those same heart issues or cancer that would have killed you a couple decades back.


I've heard this argument a surprising number of times, and I must say I find this degree of callousness about dead children to be utterly horrifying.


> It's a particularly interesting time to discount the life extension benefits of modern technology. We just saw multiple vaccines get developed and deployed in record time to significantly reduce the death toll from a worldwide pandemic. It's unlikely we would have seen this remarkable success even a decade ago.

Which appear to be uniquely dangerous and ineffective, so it's odd to use them as an example of progress. Besides that, mRNA is not a new thing. What's new is the EUA.


>In the US, life expectancy at birth was around 53 in 1921. Today it's almost 79. That's a nearly 50% increase.

This has nothing to do with the rise in productivity and where we see its results (what is discussed here), but with the merits of technology.

Even so, the rise in life expectancy at birth is about those low hanging hygiene practices (nothing to do with post-1950 technology) and a few vaccines (most of which were available before the 1950s).

Not any big technological breakthroughs, like artificial hearts and so on (which hardly make a dent), and with extremely little to do with any post-1950s tech.

So the article's question still remains: where is all this benefit from the productivity rise?


As you can see from my link, life expectancy at birth was around 67 in 1950. There has been an 18% increase since then.

You continue to make claims that are inconsistent with the data.


>We just saw multiple vaccines get developed and deployed in record time

But can you show me the exact breakthrough that came from this pouring of resources into the research, and that enabled such vaccines by removing blockers that had blocked them for decades?


The main vaccine “innovation” was that the FDA decided to finally approve an mRNA therapy. Moderna’s been trying to get such things approved for about a decade.

So, the “breakthrough” was a brief suspension of questionable bureaucratic barriers. (Temporary suspension of “Dissipation” from the article.)


>So, the “breakthrough” was a brief suspension of questionable bureaucratic barriers. (Temporary suspension of “Dissipation” from the article.)

So as I doubted.

Naturally, the next question would be: what were the questionable bureaucratic barriers that were temporarily suspended? It really blows my mind that the answer to this is not widely known. It is kind of disturbing.


>Today it's almost 79. That's a nearly 50% increase.

What about the quality of life? We know that people are getting sicker at a younger age, and if we consider that in addition to the ever increasing cost of the treatments, it does not appear to be a net positive.


I would guess that those who are getting sicker at a younger age would previously just have died.

Let's say hypertension is now diagnosed at 50 instead of 70. How many people used to die of a stroke between 50 and 70 due to undiagnosed hypertension?


How many people used to NOT have hypertension and other cardiovascular issues, for want of exercise and other physical activity? Thanks to increased tech addiction in the form of 24x7 chitchat on smartphones and video games...

So if technology takes away some and then gives back some (not without a huge cost, mind you), how can you evaluate its impact on human existence?


You are selectively picking one example from the parent post and making a rather personal attack. There’s no need for that. Stick to your argument and don’t make it personal.


You couldn’t be more wrong, or more dismissive. Ask a working single mother whether inexpensive, effective appliances are a significant productivity and quality of life enhancer.


This reminds me of Simpson's Paradox, in the sense that it may be true that life is better for a single mother because modern appliances make life easier, but it is also true that society has greatly increased the number of single mothers. In 1980, 18% of births were to single mothers, compared to 40% in 2020[1].

It's possible that the amount of single mother suffering has increased even though the suffering per single mother may have decreased. (No change, I expect, in the children's suffering)

1 - https://www.statista.com/statistics/276025/us-percentage-of-...
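As a toy illustration of that point (the birth shares are the 1980/2020 figures quoted above; the per-mother "suffering" index is entirely hypothetical):

```python
# Share of births to single mothers, from the comment above.
share_1980, share_2020 = 0.18, 0.40

# Hypothetical hardship index per single mother: assume appliances and the
# ability to work cut it from 1.0 to 0.6. These numbers are illustrative only.
per_mother_1980, per_mother_2020 = 1.0, 0.6

total_1980 = share_1980 * per_mother_1980
total_2020 = share_2020 * per_mother_2020

print(f"per mother: {per_mother_1980} -> {per_mother_2020} (down)")
print(f"population-wide: {total_1980:.2f} -> {total_2020:.2f} (up)")
```

Under these assumed numbers the per-mother index falls while the population-wide total rises, which is the Simpson's-paradox-style possibility being described.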


That statistic sounds totally wrong, and after looking at your source, it is highly misleading. It just measures whether the parents were married at the time of birth.

Of the people I know with kids, around 50% of them were not married when they started having kids. They either got married later or never married, because there was no advantage to doing so and they didn't care.

Yet none of them are what you would call “single mothers”. They are all still together with the fathers of their children.

So basically it says more about marriage than it does about parenting.


If you have better data, I'm happy to look at it. I agree that cohabiting parents are likely more common now, and it is possibly more common to get married just after a birth than it used to be.

Table CH-1 from this census website shows that the share of children living with a single parent goes from 10% in 1960 to 36% in 2020 (=E13/C13). I'm not certain, but I think they are accounting for cohabiting parents here, because they don't use any language related to marriage, and the census could tell if the parents were cohabiting or not.

https://www.census.gov/data/tables/time-series/demo/families...


>>>No change, I expect, in the children's suffering

Drug use, gang activity, and other criminality are all positively correlated with single-parent households, and some data shows even MORE positively correlated with single-mother, as opposed to single-father, households.

https://www.ojp.gov/ncjrs/virtual-library/abstracts/single-p...

https://www.researchgate.net/publication/316649639_The_effec...

https://www.heritage.org/crime-and-justice/report/the-real-r....


Again, we can go deeper. How many single mothers would be forced to remain in otherwise abusive situations but for the fact that she is able to both work and care for her family due to “progress”?


My cursory look for some evidence suggests single mothers see dramatically elevated rates of violence and abuse. This source surveys children who live with both parents and just one. Children living with just one are seven times as likely to have witnessed someone hitting their mother in the past year than children living with both parents. I take that to mean that the increase in single motherhood means there is more domestic abuse happening than if no such increase had taken place.

https://ifstudies.org/blog/children-in-single-parent-familie...


Or maybe those women are now single mothers because they were able to escape their abusers. You would have to compare data across decades, and adjust for people being more/less willing to disclose abuse.


I very much agree with you. I grew up without microwave oven or dishwasher. Neither of those are essential in any way, but claiming that those don’t provide great time savers for parents (single or not) with small children would be ridiculous.

I don’t remember where I saw the numbers, but in 1980 a microwave cost more than a month’s (average) salary. Today I can buy a good one for less than half a day’s earnings.


Ask a working single mother if she would trade "inexpensive, effective appliances" for 1960s-era healthcare, housing/rent, and education costs, and working/middle class job availability.

The answer might surprise you...


I don’t think quite as many “working/middle class job(s)” were available to single mothers in the 1960s as you seem to be claiming.


Wow--

I have never heard anyone say, "Bosch made my life easier."

And it reminds me I'm on my second pump for an E12 error.

(After tossing too many Bosch appliances, I finally got lucky repairing one. The guys on the internet helped.)


There’s nothing forcing you to use any electric appliances, with the possible exception of the stove, if you want to make warm food and don’t have access to a wood stove. Try a week washing the dishes by hand, doing your laundry in the sink or in the bathtub. I’m sure you’ll be a bit embarrassed about your above comment after that.


You probably should ask someone who was born in the 1920s. If you asked a middle class woman in the 50s to choose between a car and a washing machine, I would bet she would pick the washing machine.


> For the consumer devices, cheap = disposable, as opposed to consumer devices you bought and could use for decades.

How nicely should a consumer device be made? I feel like technological improvements make the device obsolete and disposable, so making the build quality higher just artificially slows down iteration time. (Why will 5G be widely adopted? Because eventually everyone will drop their 4G phone down the stairs or the battery will be uneconomical to replace.)

If you go back and look at old appliances, they basically have no value despite being well built. They don't solve the problem they were designed to solve anymore. A shitty laptop from Amazon is a whole different piece of technology compared to a hand-crafted Apple 1 from 1976, enabling you to do things that people wouldn't have even dreamed of when the Apple 1 came out. So making it "survive the test of time" is futile -- that computer isn't garbage because it was put together sloppily, it's garbage because technology advanced past it. (Probably a bad example, I think the Apple 1 was just a circuit board cobbled together in a garage.)

I feel like even consumer appliances have changed. Clothes washers use significantly less water. Stoves use precise electronic control instead of burning flammable gas that gets piped into your home. TVs display content on-demand, with 8.3 million individually-colored pixels and 6 channels of sound. Computers connect to a library where you can read pretty much any book ever written, right in your home. Technology changes so people discard their old devices and replace them with new ones. It's not just some scam to make companies that can't innovate some cheap money. Everything has a lifetime, and sometimes it's shorter than the lifetime of the physical materials that the device is built from.

Maybe you don't need any of that, and just want to read novels from the 1950s with your dog in a log cabin. That's totally legit, but where you think people excited about 4k TVs are weird, they probably think you're weird.

Honestly, I think phones are something people hate the most for being a little too breakable, but thinking about it more deeply, I think the right tradeoffs were made. They last as long as they need to on average. 10 years ago, the technology (hardware, software, connectivity, design, everything) was immature. So they broke more and cost less, because you were going to need to upgrade. Now we have $1200 phones, and they are a little nicer, and the technology isn't changing enough to make them obsolete every 6 months. This all sounds about right to me. I have a $1000 phone and thoughts about upgrading it haven't crossed my mind in the ~3 years I've had it. That is very different from the phones I had before that, where I felt like they were compromises on the day they came out. (And hey, I only paid $300 for them.)

TL;DR: maybe everything isn't awful all the time.


In terms of entertainment, I was happy with 40 cable channels and seeing a great movie every once in a while.

Hell, I was a happy kid watching free cartoons on the Zenith.

While tech got better, talent seems to have nosedived (the exception being a lot of free talent on YouTube).

The pure talent of the '60s-'70s was magic, and that era was before my time.


The fact that more entertainment is available is the growth in wealth, not that it is suitable for your tastes.


>Why will 5G be widely adopted?

I think we should reject 5G as a community.

Not because of any danger from radiation, but because I think people should be able to say "this is enough" and stop being forced to think "we need moar!", thus putting some brake on this progress for the sake of progress.


Yeah, and what's with these mechanised looms, hand-operated is fine.


>hand-operated is fine...

Maybe it was.

You make a poor argument, because resources are limited, and "progress" will have to wait at some point. The human race can either opt for slower progress, to match the available resources, or be selected out.

Indiscriminate growth is a hallmark of cancerous things. You can detect it, cut it out and attempt to survive, or surely die with it.


Progress doesn't have to imply more consumption, there's been a ton of progress in the field of recycling for instance. Also in renewable power sources.

Finally, with regard to poor arguments, you should probably read this: https://en.m.wikipedia.org/wiki/Reductio_ad_absurdum


> there's been a ton of progress in the field of recycling for instance. Also in renewable power sources.

Obviously that is not the kind of progress we were talking about in the context of transitioning to 5G.


It's literally exactly what we're talking about. You seem to be unaware that one of the major benefits of 5g (aside from the significantly increased density per tower requiring fewer towers) is all the energy savings baked into 5g-nr. Try not to believe the hype.

https://www.ericsson.com/en/blog/2019/9/energy-consumption-5...


> requiring fewer towers

It is the first time I hear this: the usual narrative is that it would require more towers. (And the idea of the emitters being more pervasive in the territory is one of the main concerns of some).


> You seem to be unaware that one of the major benefits of 5g

True.

> Up to 15 percent energy savings with software features

'Up to 15 percent' sounds like a very optimistic estimate (you know, when they advertise 'up to 50% discount', most of the products are discounted at a much lower rate). And it's unclear whether this power saving holds under real network traffic. So I am not really convinced.


No doubt. We are so spoiled we just take all our advances as a given.

On top of all this, so many people are doing more interesting work. My father didn't just work 7 days a week, sometimes 16-hour days; he did it doing something so boring to me that I would rather be homeless or dead.

Healthcare is more expensive because it saves people's lives.

Healthcare is really the best example that we are just spoiled children who cannot be satisfied no matter what Santa brings for Christmas. Even getting extra years of life isn't good enough because it "costs too much".


Isn't this always true? If you were born three hundred years ago you could have just listed off some recent technological development and concluded everyone was spoiled babies.


True in recent history, but not always. For 200k years, humans lived the same exact life as their parents.


Amen.

You know what? I paid double what my parents paid for the house I really think of as my childhood house that they bought in 1989, and I paid more than quadruple what they paid for our actual first house in 1979. But my house is also double the size, it's downtown in a major metro, I'm not going to have to rebuild it from the inside out because it's not falling apart and everything already works. There is no asbestos in my walls, no lead in my paint.

I guess I'm paying more for healthcare, but I also get better healthcare. I suffered severe spine damage in the Army, but I'm basically fine now thanks to procedures and techniques that didn't exist 20 years ago. I had severe pneumonia when I was 5. If I'd been born prior to 1928 when antibiotics were invented, I would have just died as a child and wouldn't even be here to be paying for anything.

My grandparents, man. One of my grandfathers effectively worked, grueling manual labor, until the day he died. The other one died with nothing, left in a nursing home rotting with Alzheimer's for the last decade while con-artists tricked him out of what little he could have left to his children. And he experienced unimaginable horrors as a young adult. His ship in the South Pacific in 1945 was destroyed and he survived for six weeks on a raft drinking fish blood. I served 8 years in the Army on a tank crew and I can't even imagine what these people from earlier generations went through. My grandmother literally murdered someone when she was a teenager. I guess it's a minor miracle she never went to prison, but I suppose the police just didn't bother policing the segregated Mexican neighborhoods back then, which is probably why she was killing people.

I have no idea where this idea comes from that we're worse off than our parents. I guess it's because the people believing this kind of thing are mostly writers in a collapsing industry who came from upper middle class highly educated backgrounds? Well, guess what? A lot of us didn't and we're way better off than our parents and grandparents.

College got more expensive. Great, but I was the first person on either side of my family in any generation to even go to college. The idea of getting educated at all and not working your entire life doing manual labor would have been an unimaginable luxury to my great-grandfather who grew up picking cotton and taught himself stonemasonry.

These people who think it's been so downhill since the 70s. Do you seriously not remember the 80s and 90s? I was with my dad and got trapped doing side jobs during the LA riots. Over 60 people were murdered. The National Guard enforcing curfews and perimeters in the nation's second largest city. Nowadays we flip out that a single woman died trying to break into the capitol. My school got shot at in drive-bys more times than I can remember. We weren't allowed to wear red, blue, or Raider's jerseys because the school district was so afraid gangs would kill us for it. My sister's best friend was murdered.

I guess life was great in the 70s for the 0.0001% of the world population employed on the factory floors of major US auto manufacturers, and it sucks those opportunities dried up, but man is it myopic to act like that's the only thing that has happened in the last 50 years.


> Do you seriously not remember the 80s and 90s?

The average American is something like 39 years old atm. That means the average birth date is 1982 so most people probably do not remember the 80s and only barely remember the 90s. If you were alive when your parents bought a house in the 70s then you have lived a life that most Americans don’t have a connection to. You were in the golden years and wonder why people don’t have the same view you do?


>These people who think it's been so downhill since the 70s.

Hmmm. Old enough to remember the 70s (and 60s and a bit of the 50s).

It's easy enough to cast about for a few real advantages of modern times.

. The ability to type to this forum. Essentially a Usenet group with a different interface.

. Replacement of Sears with Amazon

. Video on demand

. LED lightbulbs

. Greatly improved medical care in some areas, particularly surgery.

. Faster closing on stock trades.

A time traveler from 1965 would notice nothing unusual about my house aside from the flat panel TV.

I think that aside from some healthcare tech, I'd willingly trade cheaper beachfront property and skinnier / more athletic kids for 99% of modern advances.


Itt: people being willfully ignorant about climate change.

If we have to destroy the planet to have these things, which we are, then none of it is productive. Quite the opposite.


I make about 20% of what people claim the SV crowd earns.

I just filled up my gas tank. It cost me about $23, or around 6 cents per mile. Somewhere around $3.

I could still afford to drive a fair amount even if gas cost ten times what it does now. There's no guarantee that would reduce demand enough.

But if gas was made extremely expensive, then aren't we going to see this, but bigger?

  https://en.wikipedia.org/wiki/Yellow_vests_protests

My idea is to have a huge fuel tax, but to return it to each citizen in equal amounts, not use it for general purposes, so that people who really need to burn a lot of fuel can, but those who don't are incentivized not to.
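Back-of-the-envelope, the tax-and-dividend mechanic works out like this (all numbers below are made up for illustration, not policy figures):

```python
# A revenue-neutral fuel tax: everyone pays the tax on what they burn,
# and the total revenue is returned to all citizens in equal shares.

TAX_PER_GALLON = 2.00  # hypothetical tax rate, dollars per gallon

# gallons burned per year by three hypothetical citizens
usage = {"low": 200, "average": 500, "heavy": 1500}

revenue = sum(gallons * TAX_PER_GALLON for gallons in usage.values())
dividend = revenue / len(usage)  # equal share per citizen

for name, gallons in usage.items():
    paid = gallons * TAX_PER_GALLON
    net = dividend - paid  # positive means they come out ahead
    print(f"{name}: pays {paid:.0f}, receives {dividend:.0f}, net {net:+.0f}")
```

Anyone burning less fuel than the average comes out ahead, heavy burners pay in on net, and nobody is actually prohibited from driving.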


I've come around to the idea that we should put excise taxes on the sale of new cars that use fossil fuels and use the revenue to help working class people replace their existing gasoline cars with electric[1]. The idea is to help people do what you want them to do instead of applying the rubber hose.

[1] One way to think about that is you force people buying new cars to be carbon neutral by paying to retire existing cars.


The goal is to reduce carbon dioxide emissions. In order to do that, it has to be done in a way that the masses will not revolt and eliminate the political leaders who want to prevent climate change.

I think that your suggestion, while I guess it's mainstream and basically what's already being done in most places, is guaranteed to fail in both respects.

If we tax one thing and subsidize another, without any linkage to CO2 emissions, then we will not get the reductions we need.

We cannot reconcile the idea that CO2 is the problem, while maintaining that some CO2 is more equal than others.

Can you explain further why you do not like the idea of pursuing the goal that matters, and prefer substituting one that doesn't?

Do you personally, at work, observe how metrics and KPIs that aren't chosen well, lead to bad behavior?


I see the fundamental problem as dependence on existing capital assets that emit CO2, like gasoline and diesel cars and trucks. Focusing there, it seems reasonable that you a) don't want people to buy new ones, and b) want to expedite the retirement of current ones.

Putting an excise tax on new fossil fueled cars would discourage manufacture and sales. Using the revenue to help working class people retire and replace their older fossil fueled vehicles with zero-emissions ones reduces the number currently in service. The key word there is 'help', as in helping them do what's needed, instead of carbon taxes, which punish working class people for not doing something unspecified to fix the problem.

That said I also support banning production of fossil fuel powered cars with enough forewarning that industry and the economy can adjust.


>I see the fundamental problem is dependence on existing capital assets that emit CO2

I don't think you're wrong, but an opinion like the one you're expressing doesn't say what should be done, by whom, or how. It doesn't set a goal that can be achieved, and it doesn't imply that success would solve the problem of reducing global CO2 levels.

Have you ever had a project manager tell you how to write code?

>carbon taxes which punish working class people

That's why the revenue needs to be sent right back to the people in equal amounts.

If that isn't good enough, it doesn't mean carbon taxes are less necessary. You haven't stated why it won't work or is not necessary in a way that I can understand yet.

Doing other things seems to me like the ultimate example of "looking for your keys under the streetlight even though you dropped them in the dark".

Or, it's like the joke "we must do something; this is something, therefore we must do it".

Building things produces CO2. People will argue endlessly about how much. It's impossible for everyone to agree on the facts, but the quantity matters.

We don't want any particular activity or product to be generated, we want less CO2. If we reward specific things that are associated currently with less CO2 but don't logically entail it, it will be a very short time until people optimize those goals by producing more CO2.


I need to see most streets and apartment lots lined with chargers. I don’t personally have permission to install any.


And if you're poor and have to live far from your job (because you're poor) screw you? There are a lot of people who will be locked out of driving by this because they don't have the float to overpay for gasoline and wait for the refund.

If the US had better public transit options, then maybe. But you have to do that first.


>My idea is to have a huge fuel tax, but to return it to each citizen in equal amounts, not use it for general purposes, so that people who really need to burn a lot of fuel can, but those who don't are incentivized not to.

Interesting. Have you described it in detail somewhere?


This general idea is called a revenue neutral carbon tax: ex https://yaleclimateconnections.org/2019/07/revenue-neutral-c...


But people seem negative on it recently, and I'm not sure why.

I don't see any feasible alternative and I'm getting cognitive dissonance from people who are getting more vocal about climate change being an emergency and aren't pushing this.


The notion of a carbon tax being revenue neutral is pretty widespread, in fact I think most carbon taxes are designed that way.


I'm unclear on the gap between the "notion" and the protests in France.

The Wikipedia page (https://en.wikipedia.org/wiki/Carbon_fee_and_dividend) is kind of inconsistent - it says that a revenue neutral tax is a "conservative" idea, then it has more about a "universal climate income" under "social justice".

Widespread is not the same thing as generally accepted. I think I've been reading vaguely hostile stuff recently about carbon taxes from vaguely progressive sources, but I'm not sure why.

I'm wondering if progressives are against it because it's considered conservative.


I think one of the reasons is that it is a market-based solution that allows companies to pollute as long as they are willing to pay for it. Many progressives prefer an approach of behaviour and lifestyle modifications.


>I think one of the reasons is that it is a market-based solution that allows companies to pollute as long as they are willing to pay for it. Many progressives prefer an approach of behaviour and lifestyle modifications.

I don't see how that hypothesis is compatible with regarding climate change as an existential threat.

I mean, maybe it is potentially the end of the world for practical purposes, and maybe it isn't.

But if it is, then concentrating on behavior modifications that are appealing to impose independently of CO2 levels seems like an expression of either a desire for human extinction or of a disbelief in the problem.


It seems to be a right wing proposal that's been around for some time. That must be why I either didn't run across it or forgot; I mostly ignore conservative media.


It might be that we're spending the increased productivity on temporary things rather than long-lived things. For example, spending $1M on keeping a retired cancer patient alive. That's good for them, good for their family, good for lots of people to feel like they'll have a chance of survival if they get it, but it doesn't cause a new bridge to be built, etc. The technology improves, the knowledge improves, but those things aren't visible on a skyline.

It might also be things like higher quality food, varieties of food, yachts for the rich that most of us don't ever see, or things like that, things that are either consumed and don't accumulate wealth, or things that are relatively long lived and durable but not part of most people's lives because most of the gains have gone to the top 1%.

Other ideas: paper pushing services like handling unnecessarily large volumes of litigation just to run a business, which requires work and causes GDP to go up but doesn't produce actual products.


> These low hanging fruit can be picked by throwing more people at the problem (increasing workforce participation or working longer hours).

I haven't read the article yet, but note increasing workforce participation or working hours doesn't increase productivity - that just increases the denominator for the productivity calculation, and unless it increases the numerator faster there is no productivity increase.
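To make the denominator point concrete, here's a toy sketch of the productivity ratio (all figures invented for illustration):

```python
# Productivity is a ratio: output divided by hours worked.
# Adding workers or hours grows the denominator; unless output
# (the numerator) grows faster, measured productivity is flat.

def productivity(output, hours):
    return output / hours

base = productivity(output=1000, hours=100)            # 10 units/hour

# Throwing 50% more hours at the problem at the same per-hour
# efficiency grows total output but leaves productivity unchanged.
more_hours = productivity(output=1500, hours=150)      # still 10

# Actual productivity growth: same hours, more output per hour.
more_efficient = productivity(output=1200, hours=100)  # 12
```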


This is why I said "not necessarily efficiently". At a very macro level, you can increase economic output by either working more or increasing efficiency (productivity per hour).


https://ourworldindata.org/grapher/labor-productivity-per-ho...

Are you sure about that? Are you measuring in absolute GDP per person or rate of growth per person?


GDP doesn't take into consideration public debt.

If public debt is up 100% and GDP has barely changed - I don't think that's cause for celebration that GDP / hours worked = we are more productive.

I'm not saying this is the case. But people use GDP a lot to make points like this - and it's important to talk about how much public debt is increasing GDP.


If you are in a country where pensions are managed by the state, your contributions are considered taxes. If you are in a country with a private pension system, your contributions are in the GDP!

The GDP is not a scientific measure. It is an ideological concept.

It doesn't mean it is useless, but it should not be taken at face value.


Taxed income isn’t subtracted from income in national accounts though...?


> The Great Dissipation.

Yeah I was consulting for about 5 years and this is something I saw inside a lot of companies, especially when they get sizable funding: they become simultaneously obsessed with hiring and process, and very little real work gets done.

To be honest, I think it might be an inevitability of the productivity boom. Thanks to technology and automation, it's possible for a single individual to accomplish what it might have taken dozens or hundreds of people to do a generation or so ago.

When it becomes possible for a small subset of the population to create enough value to support essentially everyone else, it just changes the parameters of the social game we are all playing. When essentially all the manpower is needed to progress society, the optimal strategy is to reward the most productive individuals to incentivize others to increase their own productivity. When there's an excess of productivity, the optimal strategy may be to put yourself in a position to capture the value of those few productive individuals.

But I think it's also something we can try to address via IP law. Currently we don't do a very good job of incentivizing creating something new. The current state of patent law essentially serves to bully small players out of the market who cannot afford to defend against dubious patent claims. Copyright is used as a cudgel to keep small creators from iterating on culture. In a lot of ways, it's fair to see the game as rigged, and to come to the conclusion that it's not really worth it to create something new.

If you ask me, the single best thing we could do to improve productivity would be to make it easier for people to own the products of their creativity.


> this is something I saw inside a lot of companies, especially when they get sizable funding: they become simultaneously obsessed with hiring and process

A lot of it just seems to be driven by strange attempts at optimization. For example, at my company, there was a lot of time spent on figuring out developer metrics (for all four of us on the team at its peak) and cost accountability.

On the cost accountability side, one thing that has happened three times is that the developers have been idled so the project will not go over budget in time tracking. We were still paid. We were still at work. But to keep an internal budget on track, we were told to drop our tools and sit there.


Yeah this also sounds like part of the "MBA-ification" of business.

I think a lot of this comes from a reasonable goal to have: if you can find appropriate metrics which represent the work your company is doing, this should give you a reasonably objective way of measuring how different interventions move you closer to or farther from your goal.

But I think the problem is that actually finding and implementing those metrics is a bit of a pipe dream in most cases. A lot of these metrics which people come up with to try to measure productivity - story points, ticket flow, time-on-task - are all fairly subjective, and not very good proxies for measuring actual value creation.

I think it gets worse in orgs where there's a disconnect between management and the workforce which is actually getting things done. Often times, upper management only has these artificial metrics to evaluate the performance of a team. A lot of company cultures put management in a bit of a bubble, so middle managers have more of a focus on conversations with other people in that management layer, who are metrics focused, rather than dealing with development teams, who are actually engaged with the product.

As a result, the "ground truth" for management becomes the metrics, and you start to see really silly things happen because the metrics matter more than the product.


You Can't Improve What You Can't Measure Disease

YCIWYCM is a great way to kill productivity and performance improvement efforts when taken too far. Sometimes, you've just got to go with your gut. The amount of effort needed to make measurements, especially for less tangible knowledge work, can kill productivity overall.

Often, the measurable parts are at the conclusion of a particular effort, the total time it took to get there and the value of the output. But it's not necessarily possible to meaningfully measure the internal process of reaching it when (again, knowledge work specific) half the time may be spent thinking about the thing. How do you measure that? How do you indicate progress when you're one "aha" moment away from success but can't force one to occur? Synthesis, what most programmers do, the combining of ideas to form a new theory [Naur1985] is hard to do mechanically. It can be done, for a lot of work, but not necessarily for everything. But because it's not a purely mechanical process (in contrast to a lot of other business processes or physical manufacturing work where you have more mechanical work and less synthesis) it's hard to identify useful metrics and collect them.

When the work is more mechanical ("We've done this before, but...") then you can start finding more places to measure and gain useful insights. But the more novelty that occurs, the more you need to collaborate with each other or spend time in thought, the harder it is to find useful measurements.

[Naur1985] https://pages.cs.wisc.edu/~remzi/Naur.pdf


This empiricism theater that so many engage in these days is just another form of CYA.


I witnessed what you are describing first-hand. Some years ago, in a job far away, the overall account manager for my region managed everything through a single budget spreadsheet. Decisions were made entirely based on cost without any actual thought to the outcome of the decision. While in the middle of cutting costs, the company failed to deliver some important contracts, which led to the company being penalized (to the tune of tens of millions of dollars). The irony is that if they'd just spent a fraction of the penalty amount hiring people, they would have been able to deliver on the contracts and avoid the penalty.


Sounds like a case of Goodhart's law, "Any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes."


I feel metrics are only valuable to the team itself (if they find them helpful). No one else should be looking at them because, as you touch on, they are meaningless to anyone else.

My team points stories and we just use that as an idea of how much work to take on for the week. If we under- or overshoot, it's not a big deal, but story points help us relay externally what will likely get done this week. No one is going to be able to look at how many points we did in a week and get anywhere close to an accurate representation of how much work we did. Even looking at how often we hit our mark is going to tell them nothing about our productivity.
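As a minimal sketch of that internal-only use of points (the numbers and the three-week window here are entirely hypothetical):

```python
# A team-internal planning heuristic: average recent velocity to
# decide how much work to take on next week. The numbers mean
# nothing outside the team that produced them.

from statistics import mean

completed_points = [21, 18, 25, 20, 22]  # points finished, last five weeks

def planning_target(history, window=3):
    """Size next week around the mean of the last `window` weeks."""
    return mean(history[-window:])

target = planning_target(completed_points)  # mean of 25, 20, 22
```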


SAP.

Enough said.


> If you ask me, the single best thing we could do to improve productivity would be to make it easier for people to own the products of their creativity.

The problem arises when my creativity arises from finding new uses for your invention. Whose creativity was the end result? It’s clear that the moral rights should be split, but fairly determining that split for legal accounting is near-impossible. As a concrete example, look to the endless lawsuits over unauthorized music sampling against other musicians.


It's indeed very notable that there's a lot of talk about how to make people generate more value for others, and nothing about how people find ways to consume the value that others create.


'Why aren't we far more productive?' sounds not too different from 'Why don't we have infinite growth on a finite planet?'

- There is not enough meaningful democratizable work. Most work is either menial, or challenging. And I'm not even talking about the third kind of work [1].

- Menial work is getting more and more automated.

- Challenging work doesn't necessarily scale with human resources beyond a certain point (e.g., If 1000 researchers are already working on cancer, increasing the number to 10,000 isn't necessarily going to help find a cure faster).

[1] https://www.strike.coop/bullshit-jobs/


Why do you think that a 10x increase in cancer researchers would not help find the cure faster? Do you think there is one supergenius destined to find the solution and he is already working on it? 10x researchers can try out 10x approaches or concentrate on specific types of cancer. From what I understand, most actual research is done by hungry grad students, and the higher your position gets, the more time you spend trying to keep or expand the funding. I know that keeping the grad students overworked and hungry keeps many grifters out of science, but at what cost?


IMO Joel Spolsky kind of covers a possible answer to your question in his blog post "Hitting the High Notes"[0]. While I'm sure there are plenty of counterexamples, it's important to note the wide-ranging variety of discoveries and breakthroughs that have come from individuals and small teams. In many of these cases, that person or small team is uniquely breaking with an otherwise agreed-upon convention or state of the field.

1,000 people who agree on a paradigm which may be false are not helped by another 9,000 people who agree with that same paradigm. In the era of Einstein, how many physicists accepted the traditional view of space and time? Throwing more physicists at the same problem probably doesn't mean getting more Einsteins.

A potential example for today (though IANAD, nor a biologist): how many scientists and researchers agree on removing plaques as a treatment for Alzheimer's? How long has that theory been the dominant narrative in spite of failed treatments aimed at removing plaques? How many individuals wanted to try/research something different while all the other grants went to people pursuing the held narrative, because the grantors also held that same narrative?

There are countless companies where scaling up the number of people just adds noise and bureaucracy, and smaller companies with strong-minded individuals were able to break through.

[0]: https://www.joelonsoftware.com/2005/07/25/hitting-the-high-n...


another way to look at it is that cargo cult thinking is the default mode for human thinking. This is something I've noticed recently and increasingly believe to be true. The vast majority of people do NOT take the time to build an opinion from weighing multiple sources of information, looking for counter arguments etc. Cargo cult is the default mode for a civilization.


That has to be true, right? There's just way too much to understand and it's impossible to come up with an absolute ordering of importance. At some point you have to delegate huge chunks of your understanding.

And it works 9 out of 10 times. It's just that one time that's a problem, but it's still probably the optimal strategy for nearly everyone.


Is it a strategy if it's inevitable?

I have almost no understanding of anything. Everyday I use technology that just works. What happens in the background, I'll never know. I interact with complex social systems without knowing what makes them work or how my interactions affect me and any of these systems.

All of this works, because I build simple heuristics on everything I interact with. Mostly, these heuristics are called expectations. I expect something to happen, because I don't know with certainty that it will.

"I know that I don't know".


There's a difference between knowing you don't understand something and thinking that you do - that's the difference between cargo cult thinking vs actual thinking.

Cargo cults are antivaxx, anti-climate change. Things that are obviously not true, but that many people want to believe, so they do. The issue is that this kind of thinking is actually the default, which I've grown to understand recently. I.e., many people who are in favor of good things don't really have a deep understanding of why that is; they've just adopted an opinion that sounds good to them. This is particularly pronounced in e.g. economics and politics, where understanding is extremely shallow.


Resource allocation is another issue. But even if the other teams try to independently replicate results (imagine such luxury), that would be useful. The more people there are in the field the probability of challenging the status quo is higher. Also why scale up the number of companies? You can increase the number of startups trying something new.


I've worked in research most of my career and I tend to agree. The issue is that throwing more people at the same narrow approach most likely won't help if that approach is off to begin with. It has been promising and we've been getting, and still get, some returns out of it, but is it possible we're stuck in a local minimum of the solution space in terms of representing the real system we're trying to understand and manipulate? We of course can't know, and those who are captains of a discipline, along with the funding agencies that decide who gets funded and who doesn't, steer the field to focus on narrow incremental changes. Some of this has to do with business management seeping into basic scientific research, where I believe it doesn't belong, because it thinks in the wrong time horizons and has the wrong goals.

High risk, novel approaches often aren't funded, or few opportunities exist for them. There is good reason for this: it can be abused by those just looking for fun or easy work while labeling it novel. It may also encourage fringe sciences that border on pseudo-scientific work: they incorporate a bit of science but then go off in directions that are almost provably wrong. Distinguishing which novel approaches are genuinely novel, realistic, and non-abusive proposals isn't always easy. In some cases they're just laughably wrong because there are so many assumptions baked in that are almost provably wrong, or at least self-inconsistent. Others aren't so easy and require a significant amount of effort and insight to understand exactly what's being proposed. This is often where paradigm-shifting research really occurs.

It can take incredibly brilliant scientists with enough creativity to see the opportunity in a proposal and approve it, and those aren't usually the people awarding proposals. On the flip side, most of these sorts of proposals, no matter how valid and novel they may be, simply aren't going to be correct, and the reviewer is right to take a more critical eye. Novel approaches are inherently high risk and most will be wrong. I still think we need to fund them. I've been involved in proposals that seemed only slightly novel in direction from accepted paradigms, and reviewer responses came back (some agencies return anonymized responses) in a way that showed they clearly didn't understand enough about the domain to even make the assessment: their critiques were clearly invalid, they just didn't agree with the proposal's approach. Maybe they were right that the overall approach was off and created a lame excuse, maybe they were wrong, but this tendency to be risk averse, even in the few funding opportunities that were explicitly budgeted to be high risk, shows the culture we have in modern science.

The burden of responsibility often lies on those proposing these huge shifts, and we may be reaching points in some domains of human knowledge where the burden of proof is simply too high for an individual to provide for a given novel idea. Imagine if Peter Higgs alone had had to provide evidence for his idea. It wasn't until so many other iterative approaches were exhausted that particle physics decided it had to start testing other novel options. How much time, effort, even careers were wasted chasing other ideas? Should science be depth-first search, breadth-first search, a mixture of the two to hedge our bets (so that if we're headed in the wrong direction we might find a better paradigm in parallel to jump to when we're stuck), or perhaps a different heuristic?

Then the really really difficult question, for me: if we support a mixture model of say BFS, DFS (and perhaps a handful of others) for the search space of acquirable knowledge, who should decide resourcing allocations for the search mixture model? What is the best set of approaches and resourcing? Science already incorporates complex search mixture approaches for knowledge at various levels but it seems at the highest level (how we resource stuff) it doesn't, it's incredibly iterative, risk averse, and focused on short time horizons for returns on investment.
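A toy way to see the tradeoff being described, with every probability and payoff invented purely for illustration (not drawn from any real funding data): treat the grant pool as a split between incremental "depth-first" projects and novel "breadth-first" ones and compare the mean payoff per funded project.

```python
import random

def mean_payoff(n_projects, explore_frac, seed=0):
    """Toy funding-portfolio model. Incremental projects usually
    succeed with a modest payoff; novel ones rarely succeed, but a
    success is worth far more. All numbers here are made up."""
    rng = random.Random(seed)
    total = 0.0
    n_explore = int(explore_frac * n_projects)
    for i in range(n_projects):
        if i < n_explore:
            # novel: rarely works, but a success shifts the paradigm
            total += 100.0 if rng.random() < 0.02 else 0.0
        else:
            # incremental: usually works, small payoff
            total += 1.0 if rng.random() < 0.6 else 0.0
    return total / n_projects
```

Under these invented numbers the novel slice has a higher expected value (0.02 * 100 = 2.0 vs 0.6 * 1.0 = 0.6) but far higher variance, which is arguably exactly what risk-averse reviewers are reacting to.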


Cancer really isn’t one disease. It’s thousands, if not millions or even billions, of diseases that share some similarities. The odds of a silver-bullet cure are effectively zero, but there’s plenty of opportunity to help sufferers die with it rather than of it. It’s a game of incremental improvement, so it’s likely that greater investment would produce reasonable returns.


>The odds of a silver bullet cure are effectively zero

A way to kill a specific type of cells doesn't sound impossible (zero).


Well it isn't, there's just many different types of cells that cause cancer.


If you have a team of programmers who always miss their deadlines, will increasing the headcount necessarily make them a team that always meets them? No: beyond the fact that more people means more distractions and more need for communication, there might be other problems, such as technical debt, miscommunicated requirements, inexperience, or outright unreasonable demands.

The only variable that increasing the headcount necessarily improves is the headcount itself.


No, but increasing the number of teams, all competing with each other, would probably let us see progress scale with the number of teams. The point isn't to put everyone on the same team.


Scientific discoveries don't just happen because there are more people "competing" to solve a problem. The more likely outcome of what you're describing is more teams competing, sure, but largely redoing each other's work.


The replication crisis suggests to me that we'd probably benefit from lots of repeated work.


As an outside observer, the replication crisis appears to be more a result of other things, like "publish or perish" and other politics in academia.


They don't even compete to solve the problem, they compete for funding. People who are better at politics get more funding, so adding more people could even be a net negative, with them draining all the funding away from those who do the actual research.


I hate to tell you, but there is not, and never will be, a cure for cancer. Every cancer is unique. The best we can hope for in a realistic time frame is a set of drugs that cover a significant portion of the lineage dependencies corresponding to a majority of the most common cancers. Ideally, we will be able to drug/ligand every human protein, but we're still a long way off.

Furthermore, science isn't necessarily a tractable problem that scales linearly with the amount of manpower thrown at it - many of our greatest discoveries have happened serendipitously.


I agree with you that there will never be a universal cure for cancer once a person is diagnosed. The problem is not that each cancer is unique (in fact there are many common patterns and stereotyped responses to treatments); the problem is that cancer is difficult to eradicate once established. Trying to cure advanced cancer is most probably the wrong idea, despite the colossal resources deployed toward that aim. Even curing an early-stage cancer is a fraught process, and some apparently cured cancers can sit dormant for 20 or 30 years before returning, incurable.

However, I believe that there will eventually be a strategy which prevents almost all cancers. Cancer begins from one cell 100% of the time, and eliminating that cell or preventing the transformation in the first place will stop the cancer from ever developing. One way is to genetically modify the human organism to be cancer resistant. We know this is possible in theory, because there are organisms out there which seem quite resistant to cancers for their size (the naked mole rat, elephants, the bowhead whale). But rationally altering the genome of everyone on the planet is a long way off and possibly unpalatable to many. There are other more near term possibilities.

I should probably add that cancer research is my day job.


'10x researches can try out 10x approaches'

Can they though, and is that helpful? Is there sufficient communication between everyone in the field to ensure no one is investigating the same things? Is it still 10x, given the communication overhead of ensuring that what you want to start doing isn't already being worked on by one of the 9,999 others? Are there 10x as many approaches that can be investigated in parallel at all times?

The lessons from the mythical man month still apply even with research.


Those are all general thoughts and results would be different for different problems. Some can be parallelized for millions of people, some can not be at all. Parallelization is as much about throughput as latency. Nine women can not gestate a child in one month, one can apparently gestate eight in nine months.


The numbers 1000 and 10,000 are just to make a point. It could be 10,000 vs 100,000. I just don't know.

I've added "beyond a certain point" to 10x. So e.g., increasing from 10 to 100 helps. 100 to 1000 helps. 1000 to 10,000? maybe. 10,000 to 100,000? maybe not. And so on (like approaching diminishing returns). (again the numbers are for illustration, and would vary with the nature of work).
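One crude way to put numbers on that intuition (under the deliberately simplistic assumption that each researcher independently picks one of K distinct approaches at random; real fields coordinate, however imperfectly): the expected number of distinct approaches covered saturates as headcount grows.

```python
def expected_coverage(n_researchers, n_approaches):
    """Expected number of distinct approaches tried when each of
    n_researchers independently picks one of n_approaches uniformly
    at random. A toy model of diminishing returns; it ignores
    coordination, talent, and shared results entirely."""
    k = n_approaches
    return k * (1 - (1 - 1 / k) ** n_researchers)
```

With 1,000 possible approaches, 10 researchers cover about 10 of them, 1,000 researchers cover about 632 (the 1 - 1/e mark), and going from 10,000 to 100,000 adds almost nothing: diminishing returns in miniature, with the numbers again only for illustration.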


I think there are only so many possible solutions; the first researcher will choose to research the solution with the highest probability of working. As you add more researchers, they will research less probable solutions. It is the law of diminishing returns.


Scientific research is not a high school problem where the algorithm is known in advance. A lot of approaches need to be tried before valid ones are found. And for cancer, a single approach is most likely not enough.


The grandparent talks about going from 1,000 to 10,000 researchers, not a single researcher. Are you implying the law of diminishing returns doesn't apply to research?


But how does the first researcher choose the approach "with the highest probability of working"? Since the research hasn't been conducted yet, they can only use their own previous experience, also known as bias. Judging by the results, the approaches to curing cancer that were pursued first haven't been very effective.


"A man cannot want what he wants".

You cannot really pay a man to "focus" (I mean, work passionately) on something, which is what exactly is required in these kind of things.

And I think this might be what's blocking our progress right now: people cannot work on things they are passionate about, because they are told to focus on some other thing that some other idiot thought they'd better work on.


Our processing speed is way up, but I believe our cost of energy has not fallen substantially.

In fact, with proper economic cost accounting like externalities as we try to deal with global warming, costs are going up.

Solar/wind are gradually starting to improve things in meaningful ways but they are a fundamentally different grid design.

Perhaps a modular LFTR reactor design can beat whatever solar/wind settle at on an EROEI measure, once solar/wind are past their main economies-of-scale and technological development curves.

But what we are entering for the next century is clearly one of resource limitations, and working much harder to more efficiently use them.


> The Great Dissipation.

One of the shocks about entering the world of work was how little I often got done in any given day because I was in a meeting or had to deal with some form or was waiting on permission or many other things.

I have had whole days disappear with nothing much to show for them at the end. I am sure I am not the only one. And even though it is almost assuredly known that this happens, nobody seems to care.


I had a job once where I did nothing except for when I would find things to do in order to quell the boredom.

When I tell people this they are usually envious but I always say 'be careful what you wish for'. For me it was the most miserable period of my life.

I've had depression and been homeless - that job was worse.


Agreed. I once had a job like this - the first so many months was little to no work, because of external inefficiencies. Nice in the beginning - I used the time to learn cool stuff (Emacs, etc). But after a number of months you feel like you were put on this Earth to do something useful, and you start looking for jobs.


Meanwhile, one of the great shocks that I experienced when becoming a manager was that I had no better solution to keeping a large group of people aligned and unblocked than a bunch of meetings. Sure, I tinkered around the edges a bit, but it's not fundamentally different. Solutions welcome!


> a large group of people aligned and unblocked than a bunch of meetings

Could you explain this a bit? Why weren't the interested parties talking to each other in smaller groups? Wasn't there a shared base vision? What caused the blocks? What caused the decoherence of alignments? Also, how many meetings? How many people? Thanks!


Maintain a backlog for them, at least an idea backlog


I think people are doing more work than ever, but that doesn't mean the work is useful.

Think about how much easier it is to build a SaaS or something vs. even just ten years ago, but that doesn't mean that the new stuff being built will bring economic value.

In fact we might expect there to be diminishing returns in terms of what delivers real utility and thus improvements in productivity don't translate to a similar increase in wealth generation.


Everyone churning out Tiktok, Clubhouse, and Robinhood while we need more SpaceX and Teslas. I’m unsure how to tilt the equation so solving for hard (necessary) engineering is more profitable (or lower risk) than window dressing.


TikTok, Clubhouse and Robinhood have touched the lives of orders of magnitudes more people than Tesla or SpaceX.

Tesla sold half a million cars last year. TikTok provided entertainment for a billion people.


drug cartels "touch" the lives of millions of people and make billions in revenue and I'd say they are a net negative for society

most consumer apps are net negatives that sell digital dopamine hits, which studies have shown make people more depressed the more they use them


I think there’s a disconnect in what we deem to be value. SpaceX has rapidly reduced the cost of lift to orbit (and shortly, revolutionizing global comms via StarLink). Tesla has forced an entire industry (light vehicles) to electrify by demonstrating very little compromise vs combustion vehicles while their stationary storage products are forcing thermal generators off the electrical grid (cannibalizing ancillary/frequency response revenue, the last pieces of significant revenue thermal generators are holding on to). TikTok, Clubhouse, and Robinhood are various degrees of poison (both at the individual level, and the societal level). They're our generation's Philip Morris and Standard Oil (imho).


I have held this view for most of my life. But now I kind of see them as complementary to each other.

At the end of a workday, a spacex or Tesla employee will likely sit down and watch stupid tiktok videos or Netflix or any number of things the rest of us do (or just sleep).

The point is that we need both these things and they do complement each other.

The way I think of it though, is more about scarcity of talent.

The idea that a really smart human being (a finite quantity) can either help build a better renewable energy system or a better video compression pipeline, that feels more obvious. One is using their talents to help humanity. The other is using their talents to help a company’s bottom line but doesn’t actually add to the pleasure of the audience or the enjoyment of humanity like the actual content does. And the cost of having a slightly less efficient video processing pipeline or subtitles software, is negligible compared to the cost of taking another five years to build better toilets, ways to recycle plastic, or use drones to plant trees.

So it’s less about whether the companies are useful. It’s more about where we most need our best and brightest talents used.

It’s that gross misallocation of talent at places like Facebook, Netflix, and Apple that bothers me.


In the past, the systems that thought they knew, top down and at a granular level, what is "valuable" in the economy and society and what isn't didn't work out so well.

Maybe the only reason we are spared extermination by some alien enemy is because our videos are so great (I think there is a short story by Philip K. Dick similar to that).


Right, but if we go too far on this, you end up with the USSR's problem of having large investment in infrastructure but no investment in consumer goods.

Although it might be true that we're over-investing in a sort of ultra-sugary sort of consumer goods that are not healthy for society in the long term.


I believe this is the major problem. At all times, humanity spends most of its resources making more of the same. Does this world really need another social network? More movies? Games? Novels? Fashion? More e-commerce reselling existing products in a new design?

All of this work doesn't change a single thing. It's more of the same. Tomorrow, it'll be replaced by more of the same. Maybe it'll not be replaced, just added to a growing pile of more of the same.

The space industry has a shot at changing humanity's future. So do the aging industry, prosthetics, robotics, AI, fusion, and others. It might not work out, but if it does, it'll change something.


And was what TikTok did positive impact?


You don't think bringing people joy is a positive impact?


Were those people not being brought joy before TikTok? Did the amount of joy in the world increase? Does it increase with every new frivolous app? If so eventually we will be overwhelmed with the pure bliss of being alive, but that does not seem to be the trend.


Yes TikTok increases the joy its users experienced, otherwise they would still be doing what they were doing before.

I do think people are much happier than they were 20 or even 10 years ago, largely thanks to the increased joy brought by better apps. Can that continue until people are "overwhelmed with the pure bliss of being alive"? No clue, doesn't seem likely, but it's certainly possible.


Then what's with all the "social media causing depression" we hear about all the time, and the studies highlighting it, not to mention countless anecdotal stories?

E.g this study: http://journals.sagepub.com/doi/full/10.1177/216770261772337...

Suicide rates increasing because of too much bliss?


It's debatable. Tiktok like other social media could be actually causing harm by a) being addictive, b) causing mental health issues, c) ruining your brain reward system, d) giving you adhd, possibly.

Once you start scrolling there, it is very difficult to stop (I have tried), but I don't think, at least for me, that it adds any actual value to my life, even if I can't stop myself from using the app.

There's other entertainment, that I would think is more healthy, meaningful and can improve you as a person, so any time spent on Tiktok, I think is straying away from that other type of media.

So TikTok definitely has impact, but I don't see how that impact is more positive than, say, reading books or similar, which can also provide near-infinite entertainment but also improve your brain.


Cocaine triggers a dopamine response (simplified; I dated someone with an addiction). Is that a positive impact? Moderation is important, as is reflecting on why you’re enjoying something and whether it's healthy.


If cocaine was free I think it'd have a tremendously positive impact.


Cocaine was cheap and easily available a century-plus ago (for example, the original Coca-Cola was an over-the-counter headache remedy containing alcohol and cocaine). It didn't lead to much positive impact. (Note: I'm for full, unlimited legalization, though not because cocaine is that good; it's just that 1. adults have the right to practice Darwin's theories and 2. Prohibition is really bad.)


Maybe with robust social safety nets and mental health services available (Portugal model).


Not for me it won't.


That is just counting interactions, not the impact of the interactions.


tesla does not exist in a vacuum though.


“The best minds of my generation are thinking about how to make people click ads.” – Jeff Hammerbacher


This has also been my experience.

Clients will always try to push as much as possible. Ie. if a task is now faster to do, the client will ask for that productivity improvement. It makes sense, if you continue to charge the old price and "keep" the productivity improvement for yourself, the client will eventually go elsewhere, where the productivity improvement is passed to him.

You can somewhat mitigate the problem if you do creative work.


> This is what happened with the introduction of household appliances. Instead of spending less time doing laundry, for example, we do laundry more often.

I have thoroughly debunked this commonly referenced saying. I don't have a washer and a dryer any more and I can't go into laundromats because Tide delivers a chemical burn. (Even with 2 empty loads in a row.) No, I hand wash all my clothes and it takes way longer than if I did my laundry 5 times a week in my house if I had the machines.


I took it to mean people used to wear clothes longer between washes, not just that we do more frequent, smaller loads. It would be interesting to see a comparison showing how much longer an average person should wear the same clothes in order to make hand washing time commensurate with machine washing time.
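The break-even is easy to estimate once you pick numbers for the active time each method takes per load (the minute figures below are my guesses, purely for illustration):

```python
def breakeven_wear_multiplier(hand_min_per_load, machine_min_per_load):
    """How many times longer each garment must be worn between washes
    for hand washing to cost the same weekly time as machine washing.
    Weekly wash time scales inversely with wears per garment, so the
    break-even multiplier is just the ratio of per-load times."""
    return hand_min_per_load / machine_min_per_load

# Illustrative guess: 60 min to hand wash a load vs ~15 min of active
# time (loading, hanging, folding) with a machine.
print(breakeven_wear_multiplier(60, 15))  # 4.0
```

Under those guessed numbers you'd have to wear everything four times longer between washes before hand washing stops costing you extra time each week.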


Agree with that, but then: I have 3 kids and take care of laundry, with a wife who wears jeans a maximum of one day in a row (yeah...). It probably takes me between 1 and 2 hours per week, most of it folding and sorting. If I were to wash clothes manually, even say 4 times less often, it would still be hugely more time consuming.


Yeah I re-wear the same clothes a LOT and it still takes way more time.


I don't understand this at all. Doesn't hand washing expose you to more chemicals?


It would, except that I use exclusively apple cider vinegar. Not the point.


Possibly dumb question, but is there a reason you couldn't just pour vinegar into a top loader and wash in that?

Also, why apple cider over white vinegar?


Hippie bullshit, mostly.


The Great Disparity In Income surely has nothing to do with it. If the boss is making absurd multipliers of your income and barely paying you enough to get by, the last thing you wanna do is get your work done as fast as possible. Pad those hours, give yourself the raise the boss never will. Take stuff from work.


Acknowledging receipt of the King Missile reference!


<3


6) We've made laws that greatly restrict our abilities to use these magical new technologies in their most productive ways.

Google Books was the canary in the coal mine. Remember when Google was going to organize all the world's information and make it accessible and useful? Then the copyright lobby attacked, and over a decade later that project (which is so obviously a quantum leap in terms of what it could do for human productivity) remains on pause (or permanently abandoned? not sure).

People today have access to an endless supply of information noise. If you want quality signal, it's still a trickle.

All the best Internet ideas now are not getting built because of copyright law.


And the problem here is that writing a good book on a thing is something that takes time and effort. Usually enough that it's a full-time or part-time job.

While I am no fan of the copyright maximalism that Disney and other corporations employing armies of people doing work-for-hire have turned copyright law into, I am also a person eking out a living as an artist, and I would like to be able to say "please don't use this thing I made to promote viewpoints inimical to my existence, please don't sell copies of this thing I made without asking my permission first and working out licensing terms", and even "please don't reproduce this thing at all without giving me some money". And I would also like to continue to make enough money off of my work for my bills to be paid. I've found that Patreon's enabling me to give a lot of stuff away, but if I wanted to make some of my work a scarce good, that's a choice I'd prefer to be able to enforce in the courts.

Or perhaps more succinctly: if you want information to be free, you need to figure out how to free its gatherers, tenders, and creators from worrying about their bills.

If you think you have a better idea for freeing information-creators from worrying about their finances than "Kickstarter/Patreon except XXX" or "put ads on all the things", then you might have a project worth chasing.


> which is so obviously a quantum leap in terms of what it could do for human productivity

that's a very strong claim. and it's very likely, unfortunately, false.

we have the great classics all available for free in a multitude of formats. a lot of works slowly but surely mature into the public domain each year, yet there's no discernible productivity growth based on the availability of books.

also, this sadly goes for research papers too. sure, fuck Elsevier and the whole paper mill industry, but the limiting factor is expertise, tenured spots, academic integrity. (see the replicability crisis in psychology, in particular how often the whole scientific enterprise is backwards: https://statmodeling.stat.columbia.edu/2011/01/11/one_more_t... and https://statmodeling.stat.columbia.edu/2016/09/21/what-has-h... )


Weird article; productivity has increased and is increasing. The real question is whether the results of that productivity are well harnessed, although that is subjective to a degree. Questions still remain, however, over why so much work is done and yet many of the individuals doing it end up in in-work poverty, and why huge swaths of the labour force cannot even afford their own housing.

But I digress, I suspect there is a lot of "hidden" productivity that only emerges at the collective level, which is enabled primarily by better communication and transportation, that is not obvious at the individual level.


I find it hard to agree with the article. With the escalation of technology the amount of things I get done every day, like, tangible, countable things, has skyrocketed, even more as my education with that technology went up. Not just programming and admin stuff, that includes stuff like art (cheaper, takes less time, gets nicer results) which some people argued would never ever be a replacement for traditional art (I still do some traditional things but I don't have shelves full of materials anymore). Not to mention new avenues like 3D modeling, I don't need a huge studio and large chunks of rock to create a 3D figure anymore, and there are ways to make it a real sculpture if I wanted. Music composing, I don't need several hundred dollars in instruments anymore (nor a complex soundproof studio). I don't need rooms full of machinery to make code or plan complex network arrangements, etc. And for the things that still need to be done the old way©, there are better tools, information and whatnot.

Most of my computer tasks are automated via scripts and homebrewn programs. Maybe I'm some sort of rare example of the third point in the article ("The productivity is here, it’s just only harnessed by the indistractable few."), but I doubt I'm special in any way. It's not like the past had no distractions, and I'd argue distractions in the past were harder to dismiss (having to personally travel to X place to do Y paperwork instead of resolving that with a quick phone call or digital process).

Sure, the pandemic made a few disasters in terms of paperwork and bureaucracy turning extremely slow, but I'd blame that more on poor organization than a side effect of technology.


> Most of my computer tasks are automated via scripts and homebrewn programs. Maybe I'm some sort of rare example of the third point in the article ("The productivity is here, it’s just only harnessed by the indistractable few."), but I doubt I'm special in any way. It's not like the past had no distractions, and I'd argue distractions in the past were harder to dismiss (having to personally travel to X place to do Y paperwork instead of resolving that with a quick phone call or digital process).

This makes you a rare example. I’ve been saying for years that the real computer revolution will happen when people work with computers, rather than on computers.

Most of the workforce does the same things they did on paper but on a screen. For example, I’ve seen people fill out spreadsheets manually, with a calculator.

A lot of jobs have not seen the speedup of automation, while compliance costs keep growing.


>This makes you a rare example. I’ve been saying for years that the real computer revolution will happen when people work with computers, rather than on computers.

Heh, it's funny you mention that, because I always had a deeply rooted philosophical belief that computers are tools, and a properly configured computing device is an "extension" of oneself. Not in some crazy cyberpunk sense of extension, but in the same way that regular, old-school tools are customizable to better suit the user's hand, size, or strength, so we can perform better with them, as if the tool were part of our body.

Sure, it does require a certain amount of know-how, but even at the barest there's some degree of organizational feature like moving launcher icons so you can reach for a tool without jumping through hoops. Every "papercut" removed helps.

For example, this week I had a bit of a need to get text from some images. It's short enough bits of text that can be typed by hand, but that's annoying, so I set up a couple scripts to get the text via OCR and copy it to clipboard, after massaging the input a little bit using imageMagick, and now I can quickly retrieve it with ridiculously small % of failure (and it's usually very obvious), and works with Japanese and everything. At least half a minute of back-and-forth was turned into a keypress and a verification step (I usually double-check what I type so that step was going to happen anyway). It's a tiny thing and not even remotely my finest work in the field, but everything adds up. Got lots of homemade tools to do pretty much everything I do regularly, and properly commented so it serves as a bit of a personal repo of arcane tricks and best practices. I even recoded a few tools in pure bash or awk/gawk so I can carry them around as fallbacks, which allows me to use those things in obsolete or busybox-tier systems.
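A sketch of that kind of pipeline, where the tool names, temp path, and ImageMagick cleanup flags are my guesses at a typical setup rather than the actual script described above:

```python
import subprocess

def build_commands(image_path, cleaned="/tmp/ocr_input.png", lang="jpn+eng"):
    """Command lines for a preprocess-then-OCR pipeline: upscale and
    grayscale the image (small text OCRs far better after this), then
    run Tesseract. These flags are common cleanup choices, nothing more."""
    preprocess = ["convert", image_path, "-resize", "300%",
                  "-colorspace", "Gray", "-contrast-stretch", "2%", cleaned]
    ocr = ["tesseract", cleaned, "stdout", "-l", lang]
    return preprocess, ocr

def ocr_to_clipboard(image_path):
    """Run the pipeline and push the recognized text to the clipboard.
    Assumes xclip on X11; substitute pbcopy on macOS or wl-copy on
    Wayland."""
    preprocess, ocr = build_commands(image_path)
    subprocess.run(preprocess, check=True)
    text = subprocess.run(ocr, check=True, capture_output=True,
                          text=True).stdout.strip()
    subprocess.run(["xclip", "-selection", "clipboard"],
                   input=text, text=True, check=True)
    return text
```

Bound to a hotkey, this turns the whole screenshot-to-text round trip into a single keypress plus a verification glance.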

At this point I might as well admit it's a bit of a hobby. Having artistic skills also allows me to "brand" my system with custom decorations and of course allow me to draw comfortably by having most automation available from my left hand.

Anyway, sorry for the long personal post, but the point is to use my experience as example of the things the computer can do for you if you are willing to put some time into making it behave the way you want, the way it suits your own usages, experiences and abilities and of course environment. Maybe all you need to be happily productive are a few custom document templates and rearranging some icons, or you do complex coding tasks that can be automated to save you from lots of busywork, or you are working with faulty hardware or unstable internet connection that requires some babysitting that can be automated with a few scripts. I'm not saying it's something everyone should know, but it might be useful for people to openly discuss their use cases and experiences, someone might have a recommendation or trick available and everyone wins. It's a lot like working in the kitchen if you think about it.


I try to do the same, for me WSL has been a godsend, because at work I’m forced to work on windows (a lot of engineering tools are windows only, and some of them don’t work nicely even with fully virtualized environments).

Having the scripts I use on my linux machine close has made my life easier.

To my surprise, even very technical and knowledgeable people don't do this. For example, at work we log our times of entry and exit (in Spain this is required by the government). I built a simple workflow that uses my phone's location and automatically fills in the log for me. The coders on my team, with whom I have shared the script, are still logging each day by hand.

I think it’s more of a mindset than something that requires any particular skill.


> And while moving from Smith Corona 1950 to Word 95 is a big improvement, moving from Word 95 to Word 365 isn’t.

This is an interesting and very important point about computing today in general: for the vast majority of use cases, many of today's top patterns are not just a small improvement but actually worse. Keyboard shortcuts are usually (not always) faster than mouse/trackpad-driven visual pointers. Many current UIs are resource hogs that slow things down a lot. UI patterns have also gotten less usable; in many cases one cannot easily tell if a surface is clickable in iOS and Android.


Speculation and rent seeking are currently being rewarded more than hard work in this phase of the economic cycle, combined with high inflation which is eroding gdp in real terms.


the article is a bit weird, I think it refers to productivity paradox: https://en.wikipedia.org/wiki/Productivity_paradox

But it is a well studied topic, with much more than 5 possible explanations offered for it.

I personally think the whole thing is a bit of a nothingburger that comes about as a simple consequence of how we measure productivity, which is by measuring how much money is paid to people. And since monetary/fiscal policy more or less ensures that everyone is always paid something for something, productivity growth manifests as the invention of new service industries, from personal trainers to dietitians. In other words, our productivity metrics mostly measure hours worked.

A better metric would be how many hours of work it takes to produce a ton of nickel, or a bushel of corn, or a typical family car. And those numbers have been going down consistently, suggesting strong productivity growth and no paradox.
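Converting such a physical metric into an annual growth rate is one line of arithmetic (the tripling figure below is hypothetical, just to show the calculation):

```python
def annual_productivity_growth(hours_per_unit_then, hours_per_unit_now, years):
    """Annualized growth in output per hour, given how many labor hours
    one unit of output (a ton of nickel, a bushel of corn) took then
    versus now. Productivity is the inverse of hours per unit, so the
    ratio is hours_then / hours_now."""
    return (hours_per_unit_then / hours_per_unit_now) ** (1 / years) - 1

# Hypothetical: a unit that took 3 labor hours in 1973 and takes 1 hour
# today (48 years later) implies roughly 2.3% annual productivity growth.
rate = annual_productivity_growth(3.0, 1.0, 48)
```

Plug in real commodity series and the same formula gives the "no paradox" number directly.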


Do you have links or suggestions for stats about “hours worked per bushel of grain etc.”? I’m interested in this area.


something like this: https://sdgdata.gov.uk/2-3-1/

look at the table of labour productivity, it has been growing consistently and has almost tripled between 1973 and today


The material and capital gains of productivity improvements since the 1970s have gone to mega-corporations and their owners instead of to workers.


Median U.S. income is higher than any other large developed country, including Germany, France, the nordics, etc. More likely is that computers and automation have enabled small numbers of people to have huge impacts. If one person can create a 10% better solution that solution scales infinitely without any supply constraints to the rest of the market, creating a winner take-all effect.

Scale of impact is historically why finance and law paid well, and it's also why management gets paid more than ever now. If you as a lawyer are 10% better than your competition and you have automation to make the grunt work un-fuck-up-able, you have much more time to make decisions on many more cases. That means the gains accrue to you instead of being spread among a bunch of mediocre lawyers, whereas previously without computers you might have been limited in how much time you could spend on high value work by communication and physical constraints.

In contrast, most workers are performing jobs that are either menial (paper pushing) or automatable, which means there is no way for top performing workers to highly differentiate themselves, since there's no barrier to entry. If you look at software engineers, there is a large gap in skill and business impact between even a Fortune 500 engineer and a FAANG engineer. That gap means 1. ability to highly differentiate yourself, at least to a degree and 2. skilled engineers have huge leverage and impact which translates to higher salaries. When the skill gap ranges from "doesn't understand what a linked list is" to "started a billion dollar company" the income distribution is bound to be heavily bimodal and heavily biased towards the top.

It is no coincidence that the advent of automation and software has both caused finance and law salaries to become more bimodal, and created a highly bimodal market for software engineers themselves.


> The Great Dissipation

We should stop reinventing the concept of "efficiency gains can be decreased or even nullified by psychological effects" and instead agree on a term for it. So far I found:

- Rebound Effect

- Jevons paradox

- Le Chatelier's principle

- SnackWell effect

- Induced demand

- Parkinson's law

- Risk compensation

I bet there are many more applications to specific domains.


In 2007 I ran an automated testing team for a product that had 60M in revenue. It was just me by myself. Today you can find startups with less than 5M in revenue using 5 people for the same productivity. Organizations have expanded to fill the budget provided to them, mostly through headcount. Similar to Parkinson's Law.


This anecdote proves little. Even if taken as representative, there are valid reasons for a differing level of investment, like:

- importance of quality to the product domain

- differences in talent levels and supply of talent

- differences in company stage and expected future revenue growth


Does the author really expect computers to make writing novels by hand much more productive? This is such an edge case, and the productivity increase is there (spelling, navigation, version control; there is dedicated software that helps you keep your lore and timelines consistent), but until GPT can write consistent books, no technology will help him with the core task: an engaging story. In most other areas of human endeavor technology does increase productivity; no bluff there.


The west has a productivity stagnation because it's concentrating on services instead of industry. The productivity of services grows much more slowly than that of industry.


Amusingly we outsourced the sector of the economy with high productivity gains while keeping the part subject to Baumol's cost disease.

https://en.wikipedia.org/wiki/Baumol%27s_cost_disease


For me it has gone into being 'Agile'. 'Agile' has been the single worst thing to happen to software development in my experience. It may be appropriate for small shops fighting for their lives, but as a 'waterfall' replacement it has no discernible productivity benefits. It just seems to generate a lot of work about the work.


> Technology calls our bluff. Improvements in technology show us that technology wasn’t the obstacle that we thought it was.

It would make sense that, as technology improves but humans don't, the bottleneck would eventually become the humans.

To get around this would require improving the humans (via genetic engineering, for example) or going human-less (via artificial general intelligence, for example).


I'd say there's a Great Desynchronization.

Not only are more and more producers simply out to enrich themselves, but social conditions make workers more and more likely to be at each other's throats over topics unrelated to their work. It plays out like destructive interference: a massive amplitude of effort is put in on all sides, but the sum is 0.


It went to:

- distractions

- the plethora of (mis)information

- the lack of proper books, guides and documentation

- the twisted fact that everybody needs to be rich and famous

- the constant chasing of revenue instead of science, art, philosophy

We're living in a fake bubble of bullshit, and everybody wants to be the polished turd.


Productivity is to a great extent a measure of producer surplus.

If you can sell something at a higher price, that often shows up as 'greater productivity' when really it isn't.

So if consumers have more buying power, and we make the same amount and quality of stuff but sell at lower prices, consumers are getting a better surplus, i.e. 'consumer profit' ... then that's where productivity can 'disappear' to. Literally a higher standard of living.

If we measured in some objective price, or just measured the 'quality of things produced', we'd see that better.


The Great Incompetence: Factor productivity increases only if the factor (=new technology) can be fully applied. A better factor can even decrease productivity if less of it applies than from the previous factor (=old technology). I'd say that most people make neither the fullest nor any reasonable use of technology, especially information technology as far as the office and standard company processes are concerned. Either they can't, or simply won't, either by their own volition or by instruction.


Almost feels like what is meant here is some kind of effectiveness at solving problems and coming up with new "stuff", not productivity in the classical sense.

From my observation there is something to it, as most efforts are directed towards efficiency, which is much easier to tackle (removing frictions etc.). Effectiveness is often a far less certain endeavor and therefore a tougher sell before you have the solution in hand.


I am constantly distracted by others, by technology interruption products, and by life as I WFH.

I have to attend too many zero value meetings which I generally ignore.

I have to field questions constantly from people doing busy work then others wonder why I am not more productive.

Technology might make processes more efficient, but the gains are filled with more distractions, which prevent progress toward providing real value to the widget I'm working on.


The power of presupposition in questions: we’re all trying to figure out where productivity has gone without questioning if it has in fact gone.


This sort of questioning inexorably leads to (post-)Keynesian / demand-side thinking. Do get there!


Productivity compared to which time in the past? If we compare productivity today to, say, 1950/1960, then we have definitely improved. We can manufacture at speed and scale. We can communicate with people across continents like never before. We can work from home to a larger extent than ever before. We can destroy each other at a much grander scale than ever before.

But like any system, any improvement in efficiency results in shifting the bottlenecks. They never truly go away. The nature of the bottleneck may be different, but it's there. Things like forms, compliance, and process are the new bottlenecks.

Technology is a means to an end. It cannot be the end itself (at a macro level). Tech is by humans, for humans and of humans. Humans will remain the centerpiece for a long time to come (unless AI becomes as good as they portray in sci-fi films and TV series).


Red tape has essentially destroyed the last year and a half of my life.


What you do is more important than how quickly you can do it, unless you're very slow and doing the same thing most other people are.


I'm entirely clueless as to how to measure productivity at scale in a standardised manner.

For example, washing machines create a lot of extra leisure, because they make humans more productive. A washing-machine-assisted human can wash 100x as much clothing in a given minute of operation time. But I have no clue where such a productivity improvement would show up in the data.

Of course I can study things like 'how many hours do you spend maintaining your household', and you'd see that dropping decade after decade, despite people living in ever larger homes with more functionality (and thus a larger 'maintenance load'). That's productivity, and on its own it can be analysed and recognised as such.

But fitting that into the same productivity data as say, the construction of homes per worker in an elegant way? I don't know how you would do that.

Nor do I know how you account for completely different units across time and sectors. For example, how do we measure productivity of the pharmaceutical industry? Patents per person? Medicines launched? Number of lives saved by vaccines?

GDP per hour worked is of course a great start. But it leaves a lot out, too. For example, technology that sequences a genome for a penny in a short timespan may indirectly lead to GDP growth, perhaps, but in and of itself that massive cheap boost to a researcher's insight into a genome will not show up as a direct boost to GDP; in fact, the opposite.

Curious to understand from economists how such challenges are typically handled and what the data says.


The whole falling-productivity meme is more of a myth, or at least overstated.

It doesn't mean anything. Look how well the stock market has done over the past decade. No one is losing sleep over a .1% reduction in total factor productivity or whatever.

What matters more is whether technology is advancing, living standards are rising, and the economy is growing. The answer seems to be yes for all of those.


The economy is growing more slowly, as are living standards (at least in the US post-1973 or so), because productivity growth has been slower. Both are obviously advancing, but more slowly.


We are at a time when trust is low. People want one thing; governments and companies want something else. Left and right are completely entrenched. While we keep fighting each other we waste a lot of time. And because things are changing too fast, everyone keeps reinventing the wheel.


It is quite simple: to distractions.

Look at me. I could be doing something useful. Instead I am talking to people about inconsequential things on Hacker News.

And man, a day just disappears, just like that...


Is this about productivity measured as economists usually define it (so - value added divided by hours worked)?

It is a very misleadingly named indicator. If you're a barber working in a small village in a poor country, you cut hair for 8 people a day at 3 USD each and pay 100 USD a month for rent and other costs. Over 20 working days that's 8 * 3 * 20 = 480 USD of revenue, minus 100 USD of costs = 380 USD / 160 hours a month, so about 2.4 USD/hour.

If you go to a big city in a developed country, you now charge 30 USD per haircut, still do 8 of them a day, and pay 1000 USD for rent and other costs. That works out to (8 * 30 * 20 - 1000) / 160, or about 23.8 USD/hour. You have become a "10x barber" just like that :)

It has nothing to do with how well or how fast you cut hair, or how "productive" or distracted you are. It's all about the price of labor.
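The barber arithmetic above can be written out directly (the working-days and hours figures are the same simplifying assumptions as in the example):

```python
def value_added_per_hour(haircuts_per_day, price_usd, monthly_costs_usd,
                         working_days=20, hours_per_day=8):
    """Economist's 'labor productivity': value added divided by hours worked."""
    revenue = haircuts_per_day * price_usd * working_days
    hours = working_days * hours_per_day
    return (revenue - monthly_costs_usd) / hours

village = value_added_per_hour(8, 3, 100)     # ~2.4 USD/hour
city = value_added_per_hour(8, 30, 1000)      # ~23.8 USD/hour
print(city / village)                         # 10.0 -- same barber, "10x" productivity
```

Note that the physical work (8 haircuts a day, 160 hours a month) is identical in both calls; only the prices change.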

If that's the productivity we're talking about, then the answer is the "Great Averaging". Every job that could be moved to low-income countries to lower costs has been, so the cost of labor goes down. Meanwhile land, patents, and brands remain in high-income countries, so the relative cost of labor is lowered (or at least its growth is artificially slowed). For people in low-income countries (like me) the "productivity" goes up; for people in high-income countries it goes down. Great Averaging.

A good physical analogy is two containers of water, one hot and one cold. When you connect them, the water in each changes temperature in opposite directions. And you can extract work from the temperature difference (that's what the millionaires who do the outsourcing do).

It also increases local inequalities (you get rich people in poor countries and poor people in rich countries), but decreases the inequality globally.



