This article makes me think of the woodworker's dilemma. You might start working with cutting, planing, joining and finishing wood because you want to make a chair or end table, and you like the idea of learning to do it yourself, maybe saving some money, or at least getting some extra tools out of the process, and having some pride in your work. But before you know it, you've spent 4 years accumulating tools, but more importantly, getting really good at making jigs, shelves, etc. to organize your tools and make your hobby easier and more enjoyable. In fact, you spend 90% of your time building tools. You do this not because you have to, but because you enjoy it. And because the brain makes it so easy to think of new ways you can use the skills you're building to make more tools in a virtuous cycle.
When you actually build a chair or end table, you complete the project, and you do enjoy the fruits of that labor, but there's no real cycle there. It's just an ending.
Software developers might fall into a similar trap, being so enthralled with building their own tools, writing libraries, designing and implementing frameworks, creating processes like CI/CD that obviously make the whole software development life cycle better... but of course it's largely an internal cycle that's more interesting than a lot of the end results of software that might actually benefit business (and measurable productivity.)
1) You never save money building it yourself (at least not in comparison to a standard consumer option, like a chair or table from your local furniture store. This may hold up if you compare what you build to high-end hardwood furniture. But with the cost of tools, you're probably still losing money if this is just a hobby)
2) 90% of my enjoyment of woodworking is just being in my shop working. The results are almost secondary. It's a hobby for a reason.
3) I know professional woodworkers and they definitely do not spend most of their time building organization. They buy anything that will speed up production. I can spend 6 months building a dream workbench, they'll go to Benchcrafted and just buy one.
There's something amazing/odd/horrifying about a world where it is generally no longer cheaper to do anything yourself.
I love to bake and cook. I can do so fairly frugally.
A supermarket frozen pie is still gonna be cheaper than anything I can make, especially if I count my time.
It's a benefit and goal of the modern global industrialized supply chain, but I cannot help feel that making me work a full time job to buy cheap things means we don't put enough economic value on people making things themselves.
Then again, I like not having to mine my own lithium, copper, etc. to get a computer.
> > There's something amazing/odd/horrifying about a world where it is generally no longer cheaper to do anything yourself.
> How? It’s pretty much the point and purpose of civilisation, to say nothing of industrialisation.
Civilization isn't necessary for this to be true. It is even true in small-scale societies. You can observe this whenever they get cut off from each other: they end up technologically regressing. It is more expensive to do everything themselves, so they end up losing access to technologies.
The last world where this wasn't true was when the common ancestor of us and Neanderthals was still around.
really, it's about division of labor, aka "allow someone to become much, much better at making chairs than you, a non-chairmaker, will ever be."
When you get so good at making something that it takes you 0.1-0.5 of the time it takes someone less experienced/skilled, the cost of your labor becomes a diminishingly small part of the overall cost. That makes it ever more difficult for the unskilled person to ever make it "cheaper", even if they consider their time to be free.
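To make that arithmetic concrete (hypothetical numbers): say materials run $60 either way, the professional needs 1 hour at $40/h, and the hobbyist needs 10 hours.

    C_{\text{pro}} = M + t_{\text{pro}} \cdot w = 60 + 1 \times 40 = \$100
    C_{\text{DIY}} \ge M + \text{tool wear} \approx \$60\text{--}80 \quad (\text{even valuing your 10 hours at } \$0)

The most the DIYer can ever save is the professional's (shrinking) labor slice, and waste, mistakes, and tool amortization eat into that.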
I agree with almost everything you have above except "a supermarket frozen pie is still gonna be cheaper than anything I can make." I'd say you couldn't buy the same pie at a store (or at least not at most stores). The ingredients list might be the same, but somehow the gloppy, starched-gel filling in a mushy crust you can buy from Kroger is (hopefully) pretty far from any pie you'd make by hand yourself.
There is a middle ground between doing it yourself and buying industrialized food. In Italy, people really value food produced locally. There is even a slogan for it "kilometro zero", which means "buy food close to the source".
No wonder food tastes way better in Italy than in the US (in general), for example.
It’s not a bad thing to make something yourself even if it costs more. The joy of making it is valuable, as is the knowledge that you are a little more self-reliant.
Assembly lines and clever optimizations to workflow improve the efficiency of a factory somewhat over DIY. Machinery and automation make a lot of things much cheaper by reducing labor, at the cost of consuming non-renewable resources. Global supply chains waste even more resources to take advantage of underpriced labor from disadvantaged people.
If there was a real accounting of the externalities, the sweet spot for efficiency would involve a lot more numerous, smaller producers of goods that don't require big investments in highly specialized equipment. It would be reasonable to have a couple chairs made by a regional furniture maker if not for the cheapness of burning oil to send things around the world.
I certainly enjoy the perks of living in a post-agrarian society. I too love to cook (and garden) a bit, particularly so when I know that I have options for feeding myself that don't start with tending the fields all day.
It's a matter of quality. That frozen cake pales in comparison to something truly self-made. If I make bread from basic ingredients (flour, sourdough, water), the result more often than not is great (after some experimenting) and costs maybe half of an equally nice commercial product. Plus I spend my time in my kitchen and not out there shopping.
I mean, division of labor has been a thing since we started organizing in tribes, and maybe even before then (families?).
DIY starts making sense again if there are really high taxes or regulations or if money is scarce (i.e. a recession, money being just a method to organize labor efficiently and that breaks down during a liquidity crisis).
I think what you're worried about is the death of ownership and how people are increasingly starting to either sell their information or rent the things they need.
Being able to buy things more cheaply from elsewhere is just specialization, and it's one of the first things that brought humans from the stone age to more modern civilization.
I agree, if what you want is a standard chair or table. But it's common that I will want some piece of furniture that exactly fits some space or is otherwise unusual. For example, shelves that exactly fit my rooms.
It does help that I have somewhat low standards for appearance, and am quite content building things out of cheap wood.
You'd still probably be better off hiring a carpenter who already has the tools than doing it yourself. They can amortize the fixed costs over hundreds of projects.
Exactly that. Top-brand tools are expensive, but lower mid-range these days is surprisingly cheap, and even a one-off project might "pay for itself". And tools keep getting cheaper while labor costs keep increasing. Obviously this comes with a few caveats: a) you have to enjoy what you do, so the labor is free and you treat the time as me-time, and b) you actually know what you're doing... running into a disaster and then getting someone to fix it can be very, very expensive :)
A carpenter charges like $50/hr. A tablesaw alone would likely set you back more than the labor for some simple shelves. Then they can get better deals on material, they do higher quality work, and they do all the parts that aren't fun, like cleaning up at the end.
Also your time still has value even if you enjoy what you're doing. You could be doing other hobbies you enjoy more, or other aspects of the same hobby, or making money so that you could afford other opportunities instead. You may enjoy woodworking enough that you're willing to forego the value of that time to pursue it, but that doesn't make it any more economical.
I expect to keep doing this, and most of my tools aren't even halfway through their useful life.
There are some downsides, and you're right that a professional would make a lot of these things look better, but there are also serious downsides to working with a professional aside from the cost. It can be quite hard to get them to come when you want, they may not build exactly what you had in mind since communication is hard, and some people will cheat you (https://www.jefftk.com/p/brendin-lange-is-a-scammer)
Over your lifetime, you might need 5 or 6 sets of shelves. Buy yourself a porta-tablesaw for about $400-$500 (or cheaper on Craigslist etc.), learn to use it, and chances are that somewhere in the middle of your life, the cost equation will have flipped in your favor.
You don't need a tablesaw to make some shelves. You'd get very far with a hand held jigsaw. Can be stored in a drawer and costs around 100€ for a decent brand. You just need to be a lot more careful to get things perfectly straight, compared to a circular saw.
Absolutely fair point. That's definitely a very different situation. Probably one of the reasons I doubt I could ever live in such a situation, at least until I'm 75 or so :)
This comes up in the DIY Audio community. It seems like the economic value proposition is very sensitive to what commercial segment you compare against. You absolutely can't beat low-end, mass-produced speakers on price, but it's easy to beat high-end, boutique speakers on price while getting close on quality.
A common money-saving trick is to buy cheap electronics from AliExpress or the likes and upgrade just the right components.
Often, these dirt cheap products are based on pretty decent ICs, but with corners cut on the surrounding circuitry. Sometimes, just changing some components to match the reference circuit can do wonders. Sometimes it is just a matter of replacing a counterfeit cap with a bigger counterfeit cap :)
I know where you're coming from with this. I've been refurbing my house myself and I doubt I've saved that much because the "saving" has meant that I've been able to buy tools and, of course, better quality materials (e.g., more expensive flooring). I've also often chosen to go the extra mile with improvements where I might have scaled back if I were paying someone. I suppose you could argue this is a saving in that I've got more value out of the money I've spent by treating my own time as "free labour", but have I spent less? I doubt it.
Still, I don't know if it's entirely true in all circumstances. Here, for example, TheGeekPub (The 8-Bit Guy's brother) manages to save himself a ton by building his own electronics station rather than buying one or paying a carpenter to do it for him:
But then, as becomes evident when you watch the video, he already owned all the tools he needed and just had to buy the materials, which were relatively inexpensive.
I think the results look great though, and it's clearly an extremely functional piece of furniture.
Like a lot of things in life, does it save you money? It really depends on your starting conditions (skill level, tools and facilities), and how much you want to invest in the project (both time and money).
I may not have saved myself any money at all by doing things myself when I was 25. But now I'm 57, and I can more or less fix anything in my house, and carry out more or less any changes I want or need to do, without outside assistance.
Me fixing that one toilet leak myself at 25 was probably a net cost to myself - tools & time considered. But at this point, when I can just remove a defunct radiant heating system from my house without even thinking of calling a plumber - by this point, I'm way ahead on cost. Similarly for electrical and basic framing work.
Plus, I manage to almost entirely avoid having to interview/audition contractors and deal with the stress of them (almost invariably) doing a worse job than I would have done myself.
Something to keep in mind, despite those 30 years of experience that let you skip calling a professional, is knowing whether the work you're about to perform requires any permits or code compliance; lacking either might affect homeowners insurance claims or draw local code enforcement.
Part of the 30 years of experience is knowing when you need permits. That's why I got licensed by the state of NM to install my own solar array, and why I don't bother with permits for domestic plumbing here in NM (mostly).
I view it from my perspective as a hobbyist. I enjoy being in the shop working -- and I enjoy making stuff like my work bench, my saw tables and dust shields, my French cleat shelves, etc. I don't really expect to save money (if I do the numbers) so much as I expect to get more out of the process (skills, tools, etc.) than what I get buying something off the shelf.
And yes - a professional most likely values their productive time over time spent making jigs, etc. so they will often allocate capital wisely to save time!
I built my dream workbench. The commercial ones were too light. I wanted one that was 8 feet long and 4 deep, that I could bolt a big vise to, and whale away at whatever was in the vise without the bench skittering across the floor.
It is built entirely from 4x4s for the legs, 2x4s for the rest of the frame, and 1x8 planks for the top and shelf. It's all held together with carriage bolts so it can be disassembled, and the top can be replaced. No plywood or glued sawdust.
It only took an hour or so to put together. Very happy with it. I later installed wall sockets in the front so power cords needn't be draped over the top.
The only problem was drilling the bolt holes perpendicular. I later acquired a drill press to solve that.
Wow, I built almost the identical thing a few years ago. Wanted a bench to put a small lathe onto while still having a copious amount of non-machine workspace. On the top I used 2x12's though. And (as a complete novice at the time just putting "lego blocks" together sourced from Home Depot), I just used regular screws to hold it together. That thing has moved with us for the past few moves. Super heavy. Also (not literally) bulletproof. Thanks for the description on your build.
Spent so much time planning out and figuring out how to build a custom workbench for a weird area (sometimes, you just need an airbrush booth in a bath/shower stall).
Ended up just buying something from Harbor Freight.
Not much to it. No sawing required other than the length. 6 4x4 vertical posts in a 3 by 2 configuration. A skirt around the top of 2x4s, and another 1/3 up from the ground. A 2x4 connecting the center posts at top and bottom. Then just plank the top, and plank 1/3 up to make a shelf. Drill & bolt.
Don't tighten the bolts until it is all together. Then set it in place and let it settle all the posts firmly on the floor, then tighten.
I left it au naturel because I like the look and feel of sawn wood.
Yeah, GP really should have prefaced this with "hobbyist woodworker's dilemma", because it did ring true for me as a, well, hobbyist woodworker, but it is definitely nowhere near true for professional woodworkers, joiners, cabinetmakers, and furniture makers.
For them, there is no "end" even when completing a build. There's simply just the next client to tend to.
Agree completely, with the caveat that standard consumer options are usually absolute crap unless you're spending serious money. I'm probably still in the red on my workshop as a whole (damn you Festool) but once I started needing stacks of large canvases to paint on and shelving to store them, for example, the most expensive piece of kit (Domino XL) started rapidly paying itself off. If you're like me and want to do everything yourself, the payback is exponential the more you do and the longer you do it. The possibilities compared to standard consumer options are downright overwhelming, but once you get started, you find out a lot of the consumer stuff is easy to replicate with minimal skills or expense. Take a look at what it takes to make a decent sectional [1] for example - it's a lot of time, but if you're frugal with fabric at your local upholstery store, the savings on labor are easily $1k+.
The problems set in the second you try to profit from it - professional [anything] is a completely different ballgame.
The deciding factor besides the fixed cost of tools is your time. Consider kitchen cabinets. Sure, you can build them yourself, but it'll take weeks, maybe a month of your time; compared to that, you could spend more on IKEA/box-store cabinets to save time. If you want to spend even more, you can get custom work done with little to no effort needed on your part.
As a woodworker, I've noticed 90% of the time is easily spent setting the tools up. Jigs help save time because they let you spend less time setting up tools.
Economies of scale are gained when you can get more products out of the same number of tool setups.
Very much this. I also work in wood and it amazes people that it can often be easier to make 5 or 10 of something than just one simply because of the jigs we make.
That is why I use hand tools. For most one-off items, it's not slower. And I'm actually shaping and joining wood instead of spending most of my time adjusting machines. I've got enough of that in my day job.
This effect gets worse in a large organization, where eventually you get a named group with a leader for every possible sub-specialty or task. Human nature, then, is to grow (or at least preserve) the size of your group. So you end up inventing projects and work that keep your group busy, rather than ceding people to other groups that actually need more help. There's also a tendency to create mandatory process that forces other groups to engage with your group (forms, approvals, reviews, etc).
People of the same trade seldom meet together, even for merriment and diversion, but the conversation ends in a conspiracy against the public, or in some contrivance to raise prices. - Adam Smith
This happens even more outside of creator roles. Process people invent processes and then that requires other processes to check those processes, and so on.
Then you are in a situation where just buying a product in a way that hasn't been done before, like from a foreign country, can take months to accomplish because all of these processes were created without that situation in mind.
A real-world example of this is the DHS EFiMS. By the time they are able to complete all the process steps, the product they are trying to buy is no longer within the necessary specs, and they start all over again every 4 years. https://fcw.com/articles/2020/10/06/dhs-financial-modernizat...
Sometimes I feel like the complexity introduced by those tools results in less productivity. When that happens this weird cycle forms where more and more developers are hired to manage the ever changing interfaces used by internal tools which results in more churn.
> When that happens this weird cycle forms where more and more developers are hired to manage the ever changing interfaces used by internal tools which results in more churn.
The Node ecosystem leaps to mind. To a much lesser extent, the Python ecosystem.
"This article makes me think of the woodworker's dilemma."
I'll definitely say that this applies in the car hobby.
It's a helluva lot more fun to arrange a garage than to pull out a transmission.
In terms of software (and this is perhaps just my age and industry showing), it would be interesting to set up a shop that used only simple/traditional makefiles, gdb/gcc, simple text editors, extremely simple source control, and waterfall design.
It wouldn't work at Google but you sure can get wrapped up in building the garage at smaller companies.
I don't know if it is a rare thing, but I have also loved making scripts and automated bits during software dev; and making jigs in woodwork took 50% of my time because that was such fun.
> It's a helluva lot more fun to arrange a garage than to pull out a transmission.
Also, thanks for this comment - it sums up the fun of jig making and also gave me a light bulb moment in games. The fun doesn't have to be killing the orc, it can be organising your weapons and potions ready to kill the orc.
As a software engineer who loves this type of stuff I’m thrilled someone’s willing to pay me exceptionally well to do it. It’s unfathomable to think of a company having 3-4 woodworking teams each with different shops, tools, jigs etc.
> It’s unfathomable to think of a company having 3-4 woodworking teams each with different shops, tools, jigs etc.
Not really, depending on the company. Building something complex, like a house, could require at least carpenters, joiners and cabinet makers, which are quite different jobs.
This is all true. That being said, it's not completely a bad thing. We have to accept that different people are motivated by different things. To run a business, you need both people motivated by money and people motivated by building things. People only motivated by money work in marketing and management but don't have the mindset necessary to build the stuff they want to sell. So you get people motivated by building things to help. The trick is getting all those people to work together.
Money is a means to an end - profit should not be justification in itself, like it is today. Also, there are other good ways of allocating finite resources - allocation by money tends to create long-term winners and losers, as it is usually allowed to propagate across generations, reaching absurd levels such as people living off good land purchasing decisions made by their ancestors centuries ago. Sure, allocation by money is better than allocation by force or allocating all resources to the king, but it's hardly ideal.
Either way, we weren't discussing limited resources here, but just the opposite: creating new value. There are people who actually create the useful items or services, and people who profit off it - and we have all mostly accepted that it's good and proper that these are different people. But looking at this with a cold rationality shows it to be absurd: we take a group of people who are interested in something of use, and instead of letting them do it, we take another group of people who just want to accumulate money&power and put them in charge of deciding how the useful thing will be produced. This is quite obviously absurd, and it has led to global warming denialism, cigarette cancer denialism, and so many other issues.
> There are people who actually create the useful items or services, and people who profit off it - and we have all mostly accepted that it's good and proper that these are different people.
No we haven't. People can be sole traders, and that's fine. What you're missing is some things need more capital to pay those people than is guaranteed return on investment, and thus that taking a risk is a valuable thing to do in and of itself.
This was exactly me when I was big into video game emulation. I would spend 90% of my time hunting down rare roms and then categorizing them, testing them, ensuring that they were valid, etc and only 10% of my time playing the games. On top of that, the vast majority of the games were barely worth playing at all.
I think that while your observation is true, the cause is something else. When looking to do something productive (write software or apply woodworking skills), we are mostly victims of our own worldly experience. A software engineer lives and breathes software development, and so it's only natural that he only sees software problems and commences with implementing solutions for those problems.
Sometimes there are other problems, but it's really our lived experiences that limit us from building enough of an affinity with them to naturally want to solve them.
It's a useful thing to ask yourself periodically -- at work or "play", programming or woodworking -- "Wait, what am I actually trying to accomplish/build here?".
Woah, I didn't realize I do this, but this really resonates. It seems to me that one "hack" to break out of the tool development cycle is to iterate chairs. If it's iteration itself that is intrinsically pleasurable and addictive, then perhaps simply switching what you iterate on can switch you onto a more productive path, without the usual resistance/procrastination that accompanies work.
I think the woodworker's dilemma is a thing from the dev perspective. But that still doesn't deal with the software users' side. Why does a modern company need more people in accounting, HR, even management? Shouldn't the ability to email everyone, digitized forms, and such make fewer people necessary to do the same job?
If Mcdonalds invents a new sandwich maker that requires half as many cooks per burger...
This is actually ideal when they don't overdo it. OTOH you have teams that don't give a shit about engineering, rarely apply any thought, and push work around. What should take a week takes a year, because engineers don't take joy in their work, don't spread knowledge around or learn quickly from others, etc., leading to Dilbertian managers who want butts in seats and hold war rooms to get work done.
I couldn't agree more. We see it all the time that companies spend a ton of time and money on top of their tools. We call it DIY DevOps https://about.gitlab.com/DIY-DevOps/ and frequently see a big saving when moving to a DevOps Platform.
I'm definitely guilty of that. I should focus on some of the copy-paste-edit work to add support for half a dozen configuration modules, but I'm still mentally in the building scaffolding phase, also because for each module I will need to add some functionality (versioning, revert, import, etc).
The original comment citing Musk was fine. The followup comment citing Koshkin was fine. There was no adversarial subtext in the second comment. No defense of the first comment was (or is) needed.
> In fact, you spend 90% of your time building tools.
This is a ridiculous exaggeration for almost any woodworker. Making tools and jigs doesn't require much time, and someone usually only does it after they have already done something without them at least once.
Programming tools are much more difficult to make. You need special skills and most tools aren't made to be easily extended.
I make jigs because I can't figure out how to do a project without it. I rarely make the same thing twice, but a lot of things need a special jig to do right.
There are two things here, and the more important by far is that making something like a jig is nothing compared to making a programming tool. You can measure, scribe, and cut a few times, then drive self-tapping screws and have something useful. Making something that is not only new but integrates with some existing tool doesn't take an hour; it takes weeks, months, or years.
I do a lot of jig like things when programming. Sometimes I spend months perfecting helper code, but sometimes I only need it to run once and I'm done in a few hours.
I'd argue woodworking tools are harder. People expect physical objects to work correctly, whereas the world has largely come to accept buggy, glitchy software.
I take issue with the basic assumption of the article. Sustained productivity growth is hard and would not have continued without the software revolution.
Mechanization and the widespread adoption and improvement of mechanized farming have led to staggering growth in productivity per farmer over the last 70 years. But there is only so much you can do with "dumb" machines. Today growth is being driven by computerized information gathering, planning, monitoring, and precision planting / soil maintenance.
To maintain a growth curve takes constant innovation. Just because the growth doesn't significantly alter its slope does not mean that there is a missing improvement bump.
If you decomposed slopes like these you would see they are compound sigmoids where growth is driven by one technology and then another, or an adoption of a new process, etc.
So IMO if "software doesn't show up in productivity" you're not looking hard enough.
And TFP is a real monster of a formula[1]. It's not just GDP / hours. There are like 20 variables going into its calculation, including things like 'labor quality' and 'capital's share of income' (alpha).
I'm not a smart man, but I think this suggests that a society that lays off factory workers and retrains them as software engineers will not register on this metric. And looking at alpha, there's a pretty clear phase change at 2000: it hovers at 0.31-0.33 for 50 years, then marches up from 0.31 to 0.38. Sounds to me like you could tell a story that labor is more productive, but seeing less of the gains than before.
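For anyone who wants the back-of-the-envelope version, the core of that number is the Solow residual under a Cobb-Douglas assumption (a sketch only; the official series layers labor-quality, composition, and other adjustments on top, which is where the rest of those ~20 variables come in):

    Y = A\,K^{\alpha}L^{1-\alpha}
    \;\Rightarrow\;
    \frac{\Delta A}{A} \approx \frac{\Delta Y}{Y} - \alpha\,\frac{\Delta K}{K} - (1-\alpha)\,\frac{\Delta L}{L}

Since alpha is capital's share of income, a drift from 0.31 to 0.38 literally means a bigger slice of whatever gains exist is credited to capital rather than labor, which is the story being hinted at above.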
edit: just to belabor the point, here's a random chart I googled for US productivity that _doesn't_ feature the same trendline: https://tradingeconomics.com/united-states/productivity. If anything it looks like productivity has accelerated during the past 20 years.
This is a bit off topic since the article was really talking about something tangentially different, and it's probably a really unpopular thing to say, but it's been on my mind for a while. If you look at the world's population growth rate, you can see that it is beginning to level off. That world of 10 billion that we were all preparing for several years ago is probably not going to come.
If the population stabilizes, or even starts shrinking, how important is growth of productivity? Making "stuff" is obviously important, but in a world with falling demand, maybe quality and distribution are the metrics we should be concentrating on.
I have 2 kids; I am the only one with kids in both my family and my wife's. My 2 kids are the only grandkids between 3 sets of grandparents (my wife's parents got divorced and remarried). They are inundated with LOADS of stuff. So much so that it's a real problem. I tell my parents to stop buying them stuff. They think I'm joking. I'm not. IT'S TOO MUCH STUFF. I wish they would all go in together and just get my kids 1 good high-quality thing. They just don't need all this cheap, low-quality stuff.
I bring this up because thinking this way is a different paradigm. Agile is still very relevant to quality-driven development, but less so to scale.
That’s a legacy of increasing life expectancy, not an increasing number of births. ~139 million people were born in 1988 and ~140 million people were born in 2020.
It's a different discussion, quite off topic. When the boomers go into retirement, our current economic system with its GDP-growth focus has a big problem. Our system is based on producing more stuff and buying more stuff. But when the boomers retire, they will buy less stuff. Who will buy the supply overhang? Nobody. Even worse, who will pay for the debt incurred to buy overly inflated assets, like housing and factories, when boomers start downsizing and start selling?
Look at Japan. They are 10 years ahead of us. It will be tough and depressing until the economic system has adapted and prices have normalized. On an optimistic note, humanity will progress, and when an economy cannot sell more stuff it has to sell better stuff.
Japan is not an apt demographic comparator for the USA. The census recently came out and found that the number of Asian and Hispanic residents of the USA has increased hugely (~20-30%) over the past 10 years. These are younger populations with higher birth rates than "White" people, whose percentage share of the US population is heading downwards.
Japan, on the other hand, is xenophobic and has discouraged immigration very heavily. Combine that xenophobia with historical matters: sneak-attacking an industrial powerhouse in 1941 in one of the most ill-advised, terrible wars, and losing repute by massacring hundreds of thousands of civilians. Meanwhile, thousands upon thousands of their own best young men and civilians were killed by the vastly superior manpower and industrial might of the US. Japan was hobbled by WW2 and has never fully recovered; consider that the greatest catastrophes of its history were only 80 years ago: losing a generation of youth, having its cities fire-bombed, having its savings depleted by phony war bonds, and being the only country ever to be nuked. Japan is simply not a good demographic comparator for the USA.
You're trying to explain current Japanese demographics with a weird rant about WW2 without mentioning the subsequent economic miracle and population growth? The population of Japan was not far off doubling from 1945 to 2010, and in case you somehow haven't noticed, they became a major first-world economy, eclipsing many many nations which were not nuked.
80 years is really quite a long time. Germany also bounced back rapidly in the second half of the twentieth century to become a major industrial power.
I don't disagree it could definitely be a problem with the US, but so far, with immigration, it's not been a problem. As long as our immigration #'s stay up, the US should be fine.
So far it's been a steady state, but it's unknown if that will continue into the future.
I agree with both you and the GP, and also see immigration as moving the hard problems from one field to another.
To solve the economic issue of maintaining growth, the US/EU moved to the issue of how to maintain a peaceful but diverse and divergent society.
I see Japan's partial bailing on immigration as a sign they don't see any good way to get through it, and we can see a lot of today's US internal fight as the result of not paying enough attention to how hard it is to adapt a society to the new challenges.
I wonder if there would be third ways, with economic powerhouse moving their "growth" to other countries without a stigma of stagnation or exploitation.
In the not too distant future, competition for immigration is going to be tough. I wonder if the US has the political atmosphere to offer competitive packages to win over the immigrants we'll need.
Why do you think that? It feels that the mid-term (e.g. 20-40 years) outlook for migration would include a large increase of migrant supply due to e.g. climate change issues in the "global south", so instead of competition for immigration it seems likely that places like US would be able to pick whatever kind of migrants they'd prefer to allow.
Yes, the supply of immigrants will increase because of climate change, but I think it's important to understand the current structure of age distributions in the world. The US had a pretty large Millennial generation. Most other countries did not. Which means we're not REALLY going to be feeling the need to take on immigrants for a while. But as boomers retire and millennials move to replace them, the countries that didn't have a sizable millennial generation are going to have much higher demand. Those countries are going to be more desperate than the US, and will likely start developing very sizable offers. The US is going to catch up though; the US does appear to be inverting its demographic distribution as well, but we're 20 years behind. That's actually a pretty big advantage in a bunch of ways, but in terms of competing for immigrants, it's a disadvantage.
Productivity, especially in relevant areas like administration, stagnated despite computers hitting every desk. I read the Cowen book (Complacent Class) at the same time I was reading Graeber's "Bullshit Jobs." Heterodox writers from both sides of the spectrum. Same observation.
On the face of it, it doesn't make sense. How could, for example, a local college's administration not have become more efficient because of computers?
A factory's productivity, with its legible inputs and outputs, is really different from the productivity of something that doesn't have them.
Software is management technology, perhaps, but only in cases where management is already pretty efficient. Modern warehouses, ports, and such are more productive because of software. But they were already pretty efficient. They already had pretty well formalized, legible processes.
That said, software is also a tool. Say your job is to receive applications, payments, or such. You process them. File. Respond. Software is undeniably a good tool for such things. We can't abstract that away by looking at the top-level trends. It is a productivity tool for administrative tasks. Top-line trends don't suggest a productivity gain, but I'm not willing to conclude that software is not a productivity tool for administration.
On the face of it, banks, universities, government departments, the legal sector, accounting, perhaps the whole finance sector are bigger today, not smaller. They have computers now, which are productivity tools. WTF is going on?
Do we have more justice, better records? What is "productivity" anyway, outside of legible productivity like a factory's?
For offices and other non-tech-savvy places, my experience hints that computers are far from being a main factor.
Software is not mastered even by the people in charge of picking it. It changes all the time. Users aren't digital-first (that might be different with the next gen), and software design has regressed a lot; AS/400 terminals were so damn fast and predictable. Now users face desks with various UI paradigms (in 2021 people still don't know whether a single or double click will cause an action; from a hand-tool POV it's an absolute failure, but software is not approached like a tool, except maybe in industrial settings with big buttons and lag-free interfaces).
Fun bit: the intranet failed the other day, so we had to use the good old paper template, filled in manually. It took me (newb here) 2 minutes. With the webapp it's 3 or more, with lots of clicks and waits and maybes... Besides a DB tracking the document creation, it's of zero value.
The DB tracking is the primary benefit. Run a report on all of those paper templates. Look up papers filled out last Tuesday, or on July 3rd for each of the past 10 years. The software makes the staff's job easier only because now they enter it once and they're done, instead of pulling files and compiling reports as a manual process every time the boss needs something.
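A toy sketch of that difference, with a made-up table and column names: once submissions land in a database, "July 3rd for each of the past 10 years" is one query instead of an afternoon in the filing cabinets.

    import sqlite3

    conn = sqlite3.connect("office.db")  # hypothetical DB the intranet form writes into
    rows = conn.execute(
        """
        SELECT strftime('%Y', filed_at) AS year, COUNT(*) AS forms_filed
        FROM forms                               -- one row per submitted form
        WHERE strftime('%m-%d', filed_at) = '07-03'
        GROUP BY year
        ORDER BY year DESC
        LIMIT 10                                 -- the last 10 years with any matches
        """
    ).fetchall()
    for year, count in rows:
        print(year, count)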
Lots of places have multiple software systems with "glue humans" in between to read information in one system and re-enter data into another.
You will often find that management has no idea that they have extra people doing literally nothing. These data entry clerks have titles that represent a process so it's thought of as the person doing that job (and perhaps that glue process does run partially on their tacit knowledge). The only requirement was for reporting which their pointless task of data re-entry accomplishes. Not much serious thought was given to productivity really.
I often try to estimate how much information is flowing in a given office. I may be wrong, but I believe a recent iPhone could handle the storage and processing. It takes 5 minutes for a person to input a few tokens here and there.
Man, trust me, they probably don't even have useful accountability. By useful I mean the DB is there, and maybe some very niche service has some statistical view of what's going on, but the main manager actually hands out paper tally sheets for people to write down by hand what they do on a daily basis. Primitive redundancy on all layers, plus absolutely fake data; any employee in this kind of structure will lie and double the digits.
I've had a suspicion for some years that productivity increases from the addition of computers and software to an operation are very uneven, and that apparent great progress overall is because it's a 1000x improvement in select areas while being a 0.5-1.05x change in most areas, with somewhat negative change perhaps even being the norm.
With paper, you do it your way, which maps to the organisation's way at both ends (and is probably just a minor improvement over the organisation's way). With computers, unless you can reprogram the system (which is tricky even for programmers), you're doing it the computer's way, which is often worse than the way the organisation would want to have it set up.
I also suspect management is placing a high value on some ideal of "total visibility" into their organization, for which they are sold computerization as a solution, and are willing to accept significantly greater friction across their organization in order to get it. They imagine it will be of such great value that it'll be worth it.
Of course what happens is one or (usually) both of: they aren't actually ready or able to use that visibility for any productive purpose sufficient to justify its (labor-to-use, and direct monetary) cost; the system doesn't actually deliver so perfect a view as they wanted (though it may act like it does).
"which maps to the organisation's way at both ends"
this isn't always a positive or benefit to an org as a whole, although sometimes it's a net benefit to a specific decision maker.
many times the "org's way" was set years ago by someone who doesn't even work there any longer, and they chose "steps XYZ" because it's all that was available at the time. As things grow, the org info changes, needs change, and people try to squeeze new exceptions and rules in to the existing process. No one has 'authority' to revamp the process (whether with computers or not), and it just gets weirder and weirder.
"If the customer number starts with W and their date is earlier than 2007, give them a 10% credit on any items they ordered from the summer catalog, then email joe@ourcompany.com".
"There's no Joe that works here... ?"
"Don't try to change anything - this works just fine as it is".
> many times the "org's way" was set years ago by someone who doesn't even work there any longer, and they chose "steps XYZ" because it's all that was available at the time.
Whereas the “computer's way” was basically set by some people in Microsoft who'd never heard of the organisation and likely weren't even thinking about what it does, because that's not what they were making the software to do.
if you're just dumping everything into Excel, perhaps. I was thinking more about custom software, internally developed or contracted by the company to automate/amplify whatever their processes are.
Well, they certainly are; you need a much smaller department to do the same things, and those departments used to rely on things like the secretarial pool for typing documents, people carrying internal mail, etc., so the productivity of generic business administration tasks certainly has increased.
However, all of those gains appeared with basic "office computerization" in the late 1990s and early 2000s (which is quite visible as productivity growth in the article), with WordPerfect/Word and VisiCalc/1-2-3/Excel, and haven't meaningfully changed with more recent developments. Accounting today is automated roughly as much as it could be in 2000, at least if you were up to date with year-2000 tech.
They are. Hence the steep gradient section 1995-2003 as computers and internet were added. But you don’t keep getting additional benefit at the same rate.
Go from paper to spreadsheet workflow? Useful step. But then what? Eliminate the typing pool? Saves money. Then what?
> Productivity, especially in relevant areas like administration, stagnated despite computers hitting every desk.
I don't know if this has been quantified, but to some extent the extra capability is simply repurposed to more detailed administration. Things that were not possible become possible. Things that we did not have time for, we suddenly have time to do. Per Parkinson, "work expands so as to fill the time available for its completion".
An example would be logistics within the US -- at some point, probably after 9/11 or a similar event, it was decided that all packages flying in commercial airlines within the US needed to be vouched for by entities known to the US government, the individual packages tracked at a more detailed level, etc. This would not have been possible without automation throughout the industry, and definitely "soaked up" some of the productivity benefits of this automation.
That also happens in e.g. manufacturing. I was talking to an uncle who worked in aluminium manufacturing, he was explaining that as computers developed they could convert waste to very precisely understood ingots (in terms of composition), then when an order arrives the manufacturing program would know exactly what ingots should be picked to fulfil the order with as little pure metal (both aluminium and solutes) as possible, as that’s where the plant’s margin was. Iirc he told me they were above 99% (so only needed pure aluminium straight out of a smelter for less than 1% of their production).
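The ingot-picking problem he describes sounds like a small blending/linear-programming exercise. A minimal sketch under made-up assumptions (one solute, a fictitious inventory, scipy's linprog) of how software can push the pure-metal top-up toward zero:

    import numpy as np
    from scipy.optimize import linprog

    # Hypothetical inventory: (available kg, copper fraction) of each recycled ingot.
    ingots = [(600, 0.05), (500, 0.03), (400, 0.08)]
    order_kg, target_cu = 1000.0, 0.04          # order: 1000 kg of a 4% Cu alloy

    n = len(ingots)
    # Decision variables: kg taken from each ingot, plus kg of pure (0% Cu) aluminium top-up.
    c = np.zeros(n + 1)
    c[-1] = 1.0                                 # minimize the pure-aluminium top-up
    A_eq = np.array([
        [1.0] * n + [1.0],                      # total mass must equal the order mass
        [cu for _, cu in ingots] + [0.0],       # total copper must hit the target
    ])
    b_eq = np.array([order_kg, target_cu * order_kg])
    bounds = [(0, avail) for avail, _ in ingots] + [(0, None)]

    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
    print("kg from each ingot:", res.x[:-1].round(1), "| pure Al needed:", round(res.x[-1], 1))

With this toy inventory the solver happens to need no pure metal at all; a real plant has many solutes and many more ingots, but the shape of the problem is the same, which is how a figure above 99% becomes plausible.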
Prevented waste is a lot of it. Factories used to produce full on until inventory built up too much, and then they did clearance sales to empty it, or sometimes sent it right to a landfill. Now with just in time there is less produced than before, but it is only produced as needed, so the total produced is less, but what is produced is what is needed.
> On the face of it, it doesn't make sense. How could, for example, a local college's administration not have become more efficient because of computers?
Is the claim from those books that the administration of small colleges has not become more efficient because of computers? I would have guessed that there are far fewer employees (per student) doing clerical work at small colleges now than before office PCs were ubiquitous.
There are more employees (per student) doing administrative work at small colleges now than before office PCs were ubiquitous.
"Clerical* isn't really a term we use much now, and it's often associated with job descriptions from before the PC era. Stuff that happens in colleges, in office, that isn't academia, is generally known as administration. We have more of it, whatever you call it.
"Administration" strikes me as too broad to make a reasonable comparison before and after PCs. I would expect that in the same time frame that office PCs came into existence, the roles of colleges has also expanded greatly, and if those new roles are considered as part of "administration" I don't think the comparison tells you much about the effects of computers. Are we counting things like counseling, career planning, financial aid, health (including mental health), immigration services, legal and compliance work, everything related to sports and athletics, etc.?
IMO the biggest "improvement" that this sort of clerical "productivity" generates is that the same workforce can keep up with inflicting even more inane bureaucratic demands on the people forced to interact with it.
> A factory's productivity, which has legible inputs and outputs is really different to something which doesn't.
I've worked in the semiconductor industry, and the situation is just as bad over there, if not worse.
To the common capitalist's credit, this absolutely has to do with lack of competition (not necessarily due to regulation, but because the cost of the ticket to get in, the capital to build and operate a fab, is so high). Under something more Taylor-esque, 2x productivity is on the low end of what I'd expect a contender to be able to operate at, relative to the clip that the small pool of incumbents moves at. The main sources of inefficiency, based on what I observed 2014–2020, are either people problems or process problems that call for a technical solution that doesn't look anything like materials science, chemistry, physics, etc. (Elite overproduction ⨯ a poorly trained/selected workforce + terrible, absolutely godawful software supporting the whole operation; bullshit jobs abound.)
>Almost every recent A.I. advance has come from one tiny corner of the field, machine learning. Machine learning exposes a set of connected nodes, known as neural nets, to mass amounts of labeled real-world data in an attempt to give those nodes tacit knowledge. The breakthrough example was software that was able to identify cat pictures.
>So far, these neural nets have given us some great demos but mostly niche real-world applications. We don't have self-driving cars quite yet!
What's the author's threshold for "real-world applications"?
- Google's Youtube algorithm for recommendations uses neural nets[1]. So ~2 billion viewers being affected by it doesn't seem like a "niche" application.
- Google language translation uses neural net[2]
- Apple Siri voice recognition uses neural net
It doesn't seem like neural nets are analogous to the joke that "graphene is the wonder material that can do everything except escape the research lab".
In contrast, deep learning neural nets have escaped the research lab and are widely used in production systems today.
The author's blog post is dated August 2021, so it seems like he hasn't kept up to date on this topic since the experimental neural net that won ImageNet in 2012. Yes, that was an artificial contest, but things have progressed quickly and there is real-world commercial deployment of NN-trained models.
The author didn't really mean "real world" here. That is a misnomer. The article is about explaining why productivity gains have slowed even in the face of better software techniques. Machine learning has possibly made YouTube more addictive than it otherwise would have been, but this isn't increasing aggregate economic output for the world. It's just concentrating ad revenue in Google accounts, where it used to be spread between many more content distribution platforms.
> but this isn't increasing aggregate economic output for the world
How is it not? Demand is increased as people want to consume more content, and supply has risen to match it. We have more creative & interesting content by more creators than ever, that's certainly a huge increase in overall economic output by any measure.
If you want to make some separate point about how "this isn't good for society" then make it, the economic productivity benefits of YT are huge regardless.
I think you'd have to consider the productivity benefits of youtube relative to the productivity benefits of anything else. For instance, a customer goes on youtube for three hours and sees some advertisements or whatever. Pays nothing. Another customer goes to a local restaurant with some friends for three hours. Spends $60 on food and drinks. Of the two, what is better for the economy? I would wager spending disposable income within the local economy is better than watching advertisements for companies registered in Dublin or wherever has the lowest taxes.
I think the author's wording is a little bit confusing. I think he wants to point out that narrow AI is used successfully only in (niche) applications with a narrow purpose. Niche here is not about total market size.
Not to mention Google search uses neural networks for language parsing [1], social networks use neural networks for ranking feeds, etc. I'd venture to guess the author interacts with the results of neural networks dozens if not hundreds of times per day but simply doesn't realize it.
> While software improves through better tooling and faster hardware
In my experience faster hardware leads to worse software. I'm doing the same things I've always done on my "smart" phone, but apparently the same-sized text messages now need more phone. Good god, ICQ from 2000 had more user-facing features than texting apps today, and that was running on Windows-swap-everything-unconditionally.
Yeah, much software development time is spent on invisible "features" that aren't relevant to the poor bastards who will have to use it. It makes the case for more vertical, in-house software development. There's much less pushback against specialized features, which often aren't nearly as specialized as the outside developer thinks they are, thanks to an absent understanding of the job fortified by arrogance.
But even when the job is well understood... display fucking words on the screen... I mean, come on! Why am I ever waiting 3 seconds after unlock for that?
I've been in software development 28 years. Seems like we're writing the same stuff we always did, except now it's in a language virtual machine running in a container running in a hardware abstraction VM running on actual hardware in the cloud. Thanks to hardware advances, we can move exponentially sharper images, bigger files, more polygons, but the actual complexity of what software is doing is increasing at a slow linear pace. The biggest gains now are in horizontal scaling, we can now handle way more users in parallel than we ever did 10 years ago. But for most products that don't need to handle a billion users, it's at least as hard to produce a shipping product as it was 20 years ago.
Firstly, GDP is a bad measure of productivity. A pill that replaced all healthcare would reduce GDP by 15%, but I doubt anyone (even economists) would call that a catastrophe.
Productivity has stalled mostly because people have already filled their needs, so it makes little sense to buy more. Basically everyone has the clothes they need, the food they need, the car they need, the computer they need, already. Screen entertainment and information are basically free nowadays. So no matter how much you increase productivity, these sectors will remain mostly constant.
What do people still buy? Housing, but that is mostly a competitive good: people spend as much as they have on housing, and since supply is limited, prices just increase to whatever people can afford. Same thing with education, international flights, and the restricted-supply free-market healthcare you have in the USA.
Another thing is access to other people's time. You can pay a person to clean your home or do your lawn or drive you somewhere or renovate your kitchen or provide a massage or other things. There is no way to significantly increase that productivity; it is mostly fixed.
So personally I see no need to increase GDP (what he calls productivity) further. Not to mention that many things get cheaper: a family buying a TV today gets a much better TV for the same amount of money as a family buying a TV 30 years ago. The main thing would be to automate tasks so you no longer need as much access to other people's time, but that is mostly an unsolved issue for now. Automating information delivery worked great, but it didn't lead to increased GDP; rather it led to those products becoming essentially free to consume, effectively making them useless from an economist's perspective.
> Productivity has stalled mostly because people have already filled their needs, so it makes little sense to buy more. Basically everyone has the clothes they need, the food they need, the car they need, the computer they need, already. Screen entertainment and information are basically free nowadays. So no matter how much you increase productivity, these sectors will remain mostly constant.
I know you are not being literal with your claims, but I just wanted to provide some data, so that others who read it will have some grounded context. Across several categories, there is a stubborn (and maybe surprisingly or not-so-surprisingly consistent) 10-15% of Americans who don't have many of those needs filled that you mentioned:
- It should come as no surprise that many poor people cannot afford cars and hence rely on public transit more, but that public transit systems are woefully underfunded (or funded but misappropriated/delayed/etc.) in America. Because economic disparity is interwoven with racial inequality, this is not just an economic problem; suffice it to say that, no, not everyone in America has the car they need: https://www.urban.org/features/unequal-commute
- The FCC reports large gains in the past 5 years for broadband and mobile broadband access, but this is baselined against a paltry and outdated 25/3 and 10/3 Mbps standard definition for "broadband speed" (try living in a 25/3 Mbps household while remote-working, video conferencing, streaming Netflix, etc.): https://docs.fcc.gov/public/attachments/FCC-21-18A1.pdf
Those 15% lack those goods not because we can't provide them but because of how the country is run. If policies were changed so that those 15% got what they need, that would be a small one-time bump in GDP and then lead to the same effect I described. Or possibly it would even reduce GDP, since people would no longer have to fight over the 85% of spots that provide what they need to live a good life. For example, if regulations around healthcare were changed to make it much easier to become a doctor and start clinics, greatly increasing the supply of healthcare providers, then the extra competition might even reduce the overall cost of healthcare to levels similar to other countries, reducing total GDP even though the value delivered to consumers increased.
Anyway, the point is that USA intentionally keeps 15% in a bad state in order to motivate people to not be a part of that group. There is no need to do that, many other countries doesn't. But keeping people poor doesn't seem to hurt GDP, rather it seems like keeping a part of your people lacking like that increases GDP, making it an even worse measure.
This seems like pure lack of will, not resources to me.
I feel that big productivity gains will be made once government software is required to be open source.
Also if some form of better organization and decision-making emerges of the kind the blockchain-space folks are working on. I feel the economy is currently too supply-driven, and if more people could securely "invest" in what they actually want built, it would make a drastic difference (Kickstarter and other crowdfunding are examples, but they're too prone to scammers).
> Firstly, GDP is a bad measure of productivity. A pill that replaced all healthcare would reduce GDP by 15%, but I doubt anyone (even economists) would call that a catastrophe.
This is only true in a vacuum. In the real world, if this pill totally annihilated the healthcare sector, a massive amount of resources would be freed and reallocated to other sectors.
I am writing a book covering a lot of this ground (shout out to Ronald Coase) - but I have a different (ish) conclusion.
It's not going to be a few "software-friendly" companies like FAANG that eventually lead the charge and we see productivity - it's waaay longer term than that.
My take is that software is a form of literacy - and it will only be when managers code daily that enough of the control layers (model, monitor, mentor) actually become software for software to show up in the productivity stats.
If you like an analogy - steam engines used to power factories, but there was one central engine and the power was spread out to other areas via belts and chains. Electricity came along but at first mostly replaced the central engine - it was not until people experimented with sending power to many small motors that the modern (Fordist) factory layout became feasible.
In short - everyone needs to learn to code
or - if an SRE is what you get when you ask a coder to design a software development process, a programmable company is what you get when you ask a coder to design a company
No offense but this doesn't sound like a well thought out idea. Have you worked at a tech company as either a coder or a manager?
Why should managers code daily? What would they be coding up anyway? If you're saying they should be coding features in the product, then I think you're just saying there should be no middle managers. That's been tried a lot in many different forms over the years, and there's a reason the majority of successful tech companies still have managers.
If you're saying that managers should be coding the tools that they use themselves for managing their employees - why? Their job is talking to the people they manage, listening to them, helping to solve problems, helping to make trade-offs to deliver the product on time, coordinating with other people in the company to keep things running smoothly on their team, etc. If you have a problem that does require a technology solution, like keeping track of tasks or automating some part of the team's work, then it will be orders of magnitude more effective to buy a solution that already exists or hire a team of coders to build it rather than have each individual manager trying to code up solutions themselves. That would be a big distraction from their main job of actually managing their team. Until you can automate the whole job of a manager with an AI, the important parts of the job are just not things that you can easily code up a solution to.
Do managers write daily? If so, my conjecture is they should code daily.
Will they write a UI button? No. But I am not sure that's what a typical coder should be doing in a "programmable company" either. I am really finding it hard to express this conjecture (it might be rubbish), but why should humans be involved in day-to-day creativity in the operations of a company? Is it possible at some point to have the operations of a company encoded, moving along, and fixed? That is the ideal anyway (the whole point of "change management" is improvement).
Anyway, my conjecture is that there come to be three basic management functions - monitoring the company, modelling the company and mentoring the people. I can easily imagine code written to explore new models (it's what everyone does poring over Excel all day - in fact I would suggest a company whose managers can do away with Excel is winning).
At my workplace, managing the team is only a portion of each manager's job. The other portion is carrying out administrative tasks assigned by their manager. Examples include gathering data for updating and maintaining a variety of metrics and dashboards, writing procedures, and so forth. In fact a lot of that stuff could be automated, and it is. The managers, including one former coder, use mostly Excel. If they have to create a tool for others to use, with a user interface, they usually find a canned solution that can be bought and installed.
In fact, my lack of interest in that side of middle management is one of the things that led me to turn in my manager hat and return to the rank and file. Ironically, they promoted me out of management, which is a pretty good hint. ;-)
Coding is hard though. I remember helping a smart friend of mine with his homework in college for an intro CS course and even basic fizz-buzz type stuff was very hard for him. This is a guy who has had a highly successful career post-college and did well in his (non-technical) major. I'm skeptical that more than a small fraction of the population will ever learn to code at more than a superficial level.
I don't think anyone has seriously said that for a long, long time, if ever. Literacy is obviously pretty easy, much easier than algebra, for example, because in societies with universal schooling almost everyone learns how to do it.
If software causes soft cost savings (reducing the number of required people) the savings may not actually be realized. The internal feudalism of large enterprises protected by monopolistic moats means people resist headcount reductions. And since these large enterprises still employ the majority of people, they are over represented in the statistics.
> Alienation Is Not ‘Bullshit’: An Empirical Critique of Graeber’s Theory of BS Jobs
> David Graeber’s ‘bullshit jobs theory’ has generated a great deal of academic and public interest. This theory holds that a large and rapidly increasing number of workers are undertaking jobs that they themselves recognise as being useless and of no social value. Despite generating clear testable hypotheses, this theory is not based on robust empirical research. We, therefore, use representative data from the EU to test five of its core hypotheses. Although we find that the perception of doing useless work is strongly associated with poor wellbeing, our findings contradict the main propositions of Graeber’s theory. The proportion of employees describing their jobs as useless is low and declining and bears little relationship to Graeber’s predictions. Marx’s concept of alienation and a ‘Work Relations’ approach provide inspiration for an alternative account that highlights poor management and toxic workplace environments in explaining why workers perceive paid work as useless.
Perhaps because mainstream software today doesn't add much to what we had in 199x. We already had the same Word, Excel and e-mail back then (and I doubt the live chats we have today add much to productivity; well-thought-out and well-organized emails are better).
It might be possible to boost productivity with something like Roam Research (especially used in collaborative mode), but that would require a lot of enhancements to it and a lot more work on teaching people to use it the right way.
Even people with the skills to use Word the right way (i.e. using styles instead of ad-hoc manual font adjustments and extra carriage returns) or to use non-basic features of Excel are rare. Teaching the masses something entirely new (or even getting them interested in it), requiring a new way of thinking and a totally new workflow, would probably take enormous effort.
I disagree completely. I'm insanely more productive as part of a team than I was in 199x.
Not even talking about things for software development specifically, but for general-purpose word-processing and spreadsheeting and scheduling, you've got:
- Live collaborative cloud editing over mobile. The back-and-forth that previously might take a week can now be done in half an hour while you're in the back of an Uber in a different country
- Googling how to accomplish spreadsheet tasks. Stuff that you'd just give up and not do before, or would take you days to figure out on your own, there are tons of blog posts and YouTube videos letting you get it done in half an hour
- Tons of scheduling and information-gathering improvements. Looking at people's Google Calendars live to find a meeting that everyone can attend, sending a Google Form to collect lunch preferences rather than contacting people individually, and so on
The productivity of modern administrative office tasks has skyrocketed with collaborative, mobile, cloud-based tools.
Getting stuff done as a team with static Word and Excel files that were stuck on a physical computer at a physical office while you tried to decipher printed software manuals was slow.
On the other hand, for all of these points you could argue you aren't gaining all that much added productivity for the amount of compute power this all took vs the 1990s.
- Live editing requires people to be working on the same thing at the same time; otherwise it's back to back-and-forthing, as people have different schedules and don't get to things right away.
- Google has now gone to shit for basic search terms. Too often you end up in some long-winded article seeking another 10 seconds from you to pay advertisers before you back out and look for another. The web had a lot more signal and a lot less noise in the 1990s. I'd even reach for a book on Excel today, where I can quickly flip through (or Ctrl-F a PDF), rather than wait for a 15-minute YouTube video to get to the point.
- The scheduling improvements have costs: you now have to put anything and everything on your Google Calendar, lest you be scheduled for a meeting when you are "free" on the calendar but really working on something else. And invites for these meetings/Zoom calls/calendar events still arrive over decades-old email.
Really, the biggest productivity gain was going from a printed book to a PDF you can search. Everything else, IMO, is either a wash or a massive waste, depending on how you look at it, given the compute resources used to run that Zoom meeting (which could just be a conference call like in the 1990s).
You're speaking of compute resources like they're the biggest bottleneck and not human time. If something takes 100x more compute resources and it saves me 2x time, then I'm 2x more productive. Sure, most of those compute resources are wasted, but that's beside the point.
Years ago, if you wanted to do a spreadsheet task Microsoft had a help file on the PC that was readable and indexed and told you how to do common tasks. Now there is no on-system help whatsoever and searching Google doesn’t even get you something from Microsoft but rather a page from the Houston Chronicle (??) that probably doesn’t even work for your version of Excel.
I think in most respects basic “office suite” software has not improved in over twenty years.
95% of the things you wanted to do were never covered in Microsoft's help file.
The help file covered all the "building blocks", but there's a long tail of use cases requiring combining those building blocks in non-trivial ways.
If the Houston Chronicle happens to be the one covering one of those long-tail use cases then that's amazing. Because no user manual could ever be large enough for them all. And even if it's not for "your version of Excel", adapting it is probably the easy part.
> Not even talking about things for software development specifically
In 199x we had straightforward visual RAD IDEs like Delphi and C++ Builder. IMHO that was the pinnacle of app-development productivity; what we have now (Electron plus a soup of web front-end frameworks) is a nightmare, post-apocalyptic chaos following the golden age.
Interesting read that harkens back to my econ days. I agree with the author based on my experience that digitizing a process often requires the developer to know the system better than the person who operates it - due to the nature of programming. I wonder if AI is really the way to transform software development into a General-Purpose-Technology. Codex is showing the way, in a niche and gimmicky way, but such is the way that many great ideas start. It's really hard to know though if _this is it_ or if it's merely another invention in a long line of inventions that failed to make it.
Sometimes it just seems like we are swimming in a sea of code with no apparent gain. Incredible to think that people managed the construction of Pyramids, Cathedrals and awesome constructions with nothing but papyrus, ink, leather straps and good ol' memory. I can't even remember the function arguments for fs.read!
If we counted goofing off as part of productivity, we might start to see the gains we expected.
Put another way, while computers have made us more productive, the internet has made it much easier to not do our jobs while at work. I don't think it is a coincidence that the graph stops being as steep around 2005.
It is because we are no longer improving our society's "hardware" (read: infrastructure). No matter how quickly I can design a BIM model using state-of-the-art software, I am still constrained by the fact that it takes me an hour to drive to work, an hour to my client's house, then an hour back to the office, and then an hour home.
"VisiCalc took 20 hours of work per week for some people and turned it out in 15 minutes and let them become much more creative." - Dan Bricklin
It's interesting to me that VisiCalc (1979) and its successors (Lotus 1-2-3 and Excel) undoubtedly made some key business jobs vastly more productive and yet software spreadsheets don't really make a dent in the productivity numbers. I'd argue that software spreadsheets are a 'management technology' as the article defines them, but that they are a counter to the article's claim that management technologies spread slowly. They've been widely adopted by businesses of all scales, starting from the introduction of VisiCalc.
Because of this, I wonder whether we are measuring productivity properly
Because GDP doesn't measure right. It measures dollars, not value.
Take Google search, for instance. I can look up (approximately) all the information in the world, for free. That shows up in the GDP as $0, because it's free.
But what is Google search actually worth? Would your business, say, actually pay for it as a tool? Probably, at least for those of us who need Stack Overflow answers. Real value is being produced, but it isn't being measured because it is being given away. (Yeah, I know, ads. Search itself is still being given away. So is Linux and gcc and...)
And does Google search help productivity? Yes. Does Linux? Yes. Does gcc? Yes. The ability to get all these things for free greatly expands the things you can do.
Software is producing value. But because so much is being given away, the value isn't showing up in the dollar-based metrics.
In my experience (mostly as a consultant to the public sector) all the (potential) productivity gains brought by software are eaten up by two things:
a) Layers of managers whose job appears to be to hold meetings to talk endlessly without ever making a decision or assigning an action, and
b) a complete failure of the same managers to understand how to use the software at their disposal to its fullest extent (e.g. no enforcement of data quality, no idea how to report useful metrics).
Both of these are the result of the people in positions of authority having no technical background or education and recruiting similar people. If they do think they need someone to analyse data they think that's a low-level position and hire based on a low-level salary with predictable results.
That is a graph of US TFP. I expect that once Asia is included in the mix the change will be a bit more pronounced. Most of the hardware related change in the IT revolution is happening there. The S&P 500 rank 1 company (Apple) would look to an alien like an Asian company since all the actual manufacturing happens there.
Also, while I don't think it is necessarily the major driving factor, the US has a capital misallocation problem. People keep sinking fortunes into companies with bad profit margins.
Also, Total Factor Productivity will include the effects of everything a society does that can be measured in monetary terms.
Dumping poison in rivers would probably show up as an "efficiency gain" at least until it causes ecosystem collapse and widespread illness and death, similarly so would clearcutting forests.
I would expect at least some attempt to ground the claim in something real before leaping to a headline-grabbing "computers aren't productive" conclusion (except for that stretch when productivity went up faster than before).
For starters, do we care about TFP (the article implies we do), and if so, has whatever benefit we expect it to provide followed a similar graph? If not, then who cares whether computers or anything else make the graph go up.
Secondly, how can we tell whether computers have made it shoot upwards while some other, unconnected change mostly negated that impact?
It all feels very shoddy.
The first time this came up in economics neatly lines up with when the graph suddenly changes direction upwards:
"You can see the computer age everywhere but in the productivity statistics." Solow, 1987
My family watched Miracle on 34th Street a couple years ago. Aside from being generally impressed with how well it's aged, I was particularly impressed with the office technology on display (pneumatic tubes etc).
In Victorian London, mail could be posted up to 12 times per day.[1] That's about as often as e-mail can be turned around.
Bronze Age merchants exchanged clay tablets with remarkable throughput.[2]
On the consumer side...
I live in Silicon Valley. My grandparents had better access to services than I do — fresh milk delivery, an MD that came to their bedside, and an electric trolley — in the 1930s in a town of 12k ppl. My grandfather was a driver for a laundry service, my grandmother taught piano. [3]
But maybe the most fundamental issue here is that productivity is 'measured' by dividing GDP by hours worked. But work seems better characterized as a mechanism that distributes, rather than creates, GDP.[4]
A lot of those jobs ultimately became redundant for people. Milk delivery makes no sense when you can choose your bottle down at the grocery store next time you're there. Bedside care became impractical as medical technology advanced, and electric trolleys are pretty cumbersome (especially alongside city streets).
Eventually, we realized that we could cut out the milkman: we laid off a lot of people in the process, but I'm sure the ice deliverymen are thankful that they no longer need to haul 25-pound blocks of ice up New York staircases.
Ice delivery is the only valid example of a redundant job here. My grandparents bought milk at the grocery store, too. And I get milk delivered with Instacart. Medical technology did not make bedside care impractical. Quite the opposite: more can now be done from the bedside. Not that a typical doctor's visit involves any meaningful use of technology. Of course there are many other reasons why bedside care is superior.
FYI, trolleys were phased out because auto, oil, and gas conglomerates bought up trolley companies, shut them down, and lobbied local governments to drop support.
I'm not sure that we're measuring productivity correctly these days. Take AI, for example. Going from barely being able to play chess, to winning at Starcraft and Go and basically any game Google thinks is worth cracking is a huge, huge leap in technological capacity. What was the impact of the people that worked on these and related technologies on "productivity"? You can count apples. You can't really quantify DynamoDB or Rust on the economy.
Now, median worker productivity growth seems like it's drastically slowed and I think we have real problems in the economy, but as everything started merging with tech it gets harder to see the full picture.
How much more productive is a software engineer who uses Rust compared to one who uses, say, C/C++? That is to say, do I need fewer software engineers to deliver the same product if they use Rust rather than another language?
If the answer is "not really", then Rust does not increase productivity (I don't know the answer, btw).
In general, I don't think that software as a tool made much difference to people's productivity in the last 20 years or so. The boost enabled by internet connectivity was probably over by ~2005.
Since then, not much has happened. Uber, Deliveroo, etc. are great for consumers, but they don't increase the productivity of drivers: drivers cannot drive faster or serve more customers per hour; these things are bounded by physical constraints.
> How much more productive is a software engineer who uses Rust compared to one who uses, say, C/C++? That is to say, do I need fewer software engineers to deliver the same product if they use Rust rather than another language?
Anyone who's used both in production for any length of time can tell you that you will not deliver the same product in C++ unless you're willing to put in JPL levels of investment. So yes, it's far more productive for "the same product." The question is what you mean by "the same product." Do you consider a project implementing a feature set with loads of memory errors to be functionally the same thing as one without? If you do, then Rust probably won't be that much more productive than C++ (IME still more productive, but within the realm of subjectivity). If you do not, then it's not even close. This is the calculation companies like Microsoft and Google have been doing when they invested in Rust for new OS development.
The same product means something that is seemingly the same in the hands of the customer. Users do not care or know whether MS Word is written in C++, Rust, or whatnot; they just see what the product can do and how well.
Maybe that's a good way to compare productivity: Let's say you have to develop and maintain MS Word. What's the size of the team if you do in C++ vs Java vs Rust vs whatever? The smallest team has the highest productivity by definition, all else being equal.
I didn't say anything about consumers caring whether a program was written in C++ or Rust, and I don't think consumers care about that. I talked about whether the product is full of memory unsafety or not. Where that is a relevant aspect of your product, Rust is far, far more productive--it's more or less impossible to produce a large C++ program without exploitable memory unsafety bugs, while it's fairly tractable in Rust. Where it is not, I expect productivity gains to be modest at best.
My point here is that there is pretty much no sized team that will deliver a C++ program with equivalent functionality to Rust, when considering security as part of the feature set. We know that about 70% of CVEs in memory unsafe programs come directly from exploitation of UB, that there is no commensurate increase in CVEs in safe languages, and that only about 1% of LOC is unsafe--so weighing the percentage security bug reduction is pretty easy. But not all customers prioritize security very much, so Rust's benefits over C++ in this context will have to be downgraded according to how highly they value a 70% reduction in CVEs. That is why I said it depends on the product--you can't come up with a flat "productivity" metric, like you seem to want, that applies the same to every situation.
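To make the bug class concrete, here is a minimal, hypothetical Rust sketch (made up for illustration, not from any real codebase); the point is just that the borrow checker turns a whole category of dangling-reference bugs into compile errors:

    fn main() {
        let mut scores = vec![10, 20, 30];

        let first = &scores[0];             // shared borrow of `scores`
        println!("first score: {}", first); // last use of `first`; the borrow ends here

        // Fine here: the borrow has already ended.
        // Moving this push above the println! is a compile error, because a push
        // may reallocate the vector and invalidate `first`. The analogous C++
        // (holding a pointer or iterator across push_back) compiles silently
        // and is undefined behaviour.
        scores.push(40);
        println!("now {} scores", scores.len());
    }

That is roughly the sense in which "the same feature set" is cheaper to ship without the memory-safety bugs: the compiler does the review that would otherwise take extra people and time.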
There is indeed a flat productivity metric. But of course, a programming language being a tool, its impact on productivity varies from product to product.
I think my previous comment stands: Do you need fewer people to develop and maintain the product? That applies to every situation.
It entirely depends on what you mean by "the product" and I can't help but feel that you are deliberately missing my point. Suppose we take it as a given that for any given program, ignoring performance, a pure Python program requires fewer developers to implement the same number of features than the same program in C++; does it then follow that we would need fewer Python developers to build a web browser than C++ developers? It does not, because you simply could not build a web browser in pure Python that would fulfill people's performance expectations. But for many other projects, the claim does follow, because they are within the realm of "stuff you can do in pure Python."
My claim is that a similar phenomenon applies for C++ and security, regardless of whether Rust is otherwise more productive: no matter how many people you add to a large C++ project, you are not going to come close to the level of assurance on security properties that you get from a Rust program of the same size (your only real hope is formal verification, which for verifying code in just about any language is multiple times less productive than writing the same code without verification). For these cases, asking which is more productive is pointless--C++ cannot deliver the expected product. However, if this is not the case for your project, I believe Rust's productivity benefits are more modest and are probably outweighed by things like team experience and available tooling.
I think I'm being pretty clear here about what I mean and why your question does not have a single answer that applies in all situations.
You can't eliminate all the paperwork and archival jobs twice.
Some of the improvements tied to communication require cultural changes that can be slow. Telemedicine has been possible for a long time, but the shift only picked up steam due to Covid.
I find that the opening graph undermines the central thesis a bit: the dot-com bubble burst in 2002, but it is barely reflected on the graph; the growth levels off in 2005 or 2006. If the change in productivity were largely driven by software, I would expect a larger change around the dot-com bust. Meanwhile the large change seems to coincide with the 2007 subprime mortgage crisis - and presumably the follow-up change in interest rates and investment patterns.
Before we can have a meaningful discussion about this we really need to understand what this Total Factor Productivity graph even means. "In the U.S." implies that it's using GDP or some other nationwide measurement. So we're talking about software's impact on productivity in a system (a large nation) with many other forces at play. That makes the entire discussion a bit narcissistic, don't you think?
In my experience, Low Code tries to fix the non-problem and makes the real problem worse. They will get you up to speed fast, but with a much lower output plateau than normal programming tools. Some experience from one low code tool I used this year:
Non-problem: Writing code. This is the easy part. COBOL took typists, gave them a week of courses, and made them successful basic coders. Low code helps the most basic junior but slows down the average coder by forcing everything through drag and drop.
Problem: Reading code. Most low-code platforms I've seen show you only a small part of the code, requiring a lot of clicking around in a GUI to make sure you found it all.
They either transform the code into a mess of arrows and boxes or spread it out so wide that you spend more time scrolling than reading. I've found myself reading the XML dumps of our current tool just to save some time.
Problem: One size fits all. You can't polish or fine-tune the standard components. What you see is what you get. This guarantees you both a minimum and a maximum level of quality. Yes, there are escape hatches. No, they won't help you. You will make parts of your program unstable or less user friendly because your low-code vendor didn't foresee all of your needs.
Problem: Versioning. Boxes and arrows don't merge well. There is generally only a small team working on one piece of code; you can't scale it past 3-4 people. Also, emergency fixes in prod don't easily propagate back to dev, especially in high-stress situations. You'll have to do it manually. This almost guarantees regression bugs.
Problem: Searching code. If you have enough code, the day comes where you'll need to find all references to something. I've grepped code bases of >10 000 000 lines. Can't do it in more than the most limited way with low code.
Problem: Knowledge exchange. Something like Stack Exchange works because you can paste text. A screenshot is the only option available in most low-code tools.
As the saying goes, the core of ICT is not programming but Information and Communication. If you want to make programmers obsolete, you need tools that help you organize information and ease communication.
Low code is simply the wrong way to look at the problem. It ends up throwing tons of man-hours at the problem. In the long term, it creates more programmer jobs, not fewer.
But that's exactly what people used to think in the 60s and 70s: instead of requiring a bunch of electrical engineers to build some arcane contraption, now ordinary folks can just write something that almost looks like English and you can automate anything and do calculations in seconds that used to take months! If that didn't pan out even though it seemed so freaking obvious that it would, why will No Code be any different?
To add an anecdote: No Code was already the hot new thing in the 90s when I studied CS. You could click together custom interfaces in Delphi and even do basic wiring with clicking alone, IIRC. Devs expected that laypeople would click together the solution they wanted and developers would do the remaining wiring. Yet no non-developer could actually use that thing. Nowadays I think the main hurdle is the transformation of a fluffy real-world problem into something like an algorithm. Developers do this almost unconsciously, because they practice it all the time, and thus are usually not aware of it. Yet this process of formalizing the real-world problem is often the actual problem, not writing it down as code.
> I think the main hurdle is the transformation of a fluffy real-world problem into something like an algorithm.
I came to a very similar conclusion after I had been teaching programming in high school for a few years: the difficulty of "programming" is in learning to think algorithmically, and no amount of "No Code" tooling gets you around that problem. The article alludes to this with the "PBJ sandwich problem" - people are used to specifying processes based on a collective (and often unconscious) cultural understanding, which computers obviously do not share!
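To make that concrete with a toy, entirely made-up sketch (the names and quantities are mine, not from the article): even "spread peanut butter on the bread" hides decisions that a program has to state explicitly.

    // Toy sketch of turning a "fluffy" instruction into explicit steps.
    // Every constant below is a decision a human makes without noticing.
    #[derive(Debug)]
    enum Step {
        OpenJar,
        ScoopGrams(u32),
        SpreadOnSlice { slice: u8, side: &'static str },
    }

    fn spread_peanut_butter(slices: u8) -> Vec<Step> {
        let mut steps = vec![Step::OpenJar];
        for slice in 0..slices {
            steps.push(Step::ScoopGrams(15));                       // how much? nobody said
            steps.push(Step::SpreadOnSlice { slice, side: "top" }); // which side? also unsaid
        }
        steps
    }

    fn main() {
        for step in spread_peanut_butter(2) {
            println!("{:?}", step);
        }
    }

None of those individual choices are hard; the hard part is noticing that they exist at all, which is exactly where students get stuck.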
I'm inclined to agree. One of the most successful "No Code" programs is Excel. Yet we still, time and time again, see people struggle with basic calculations in it. It's literally elementary school mathematics we're talking about.
I think most "No Code" and especially RPA in general will fall into that. The required mindset to think programmatically is not something the majority of people have unfortunately. But "No Code" will enable those that is somewhat technically inclined and able to think sufficiently programmatically.
Yes! SQL, for example, was invented to let business people pull their own reports instead of having to bother programmers to do it for them. We all know what really happened.
We are doing that, though. There are tons of flexible systems like that, where developers provide components/plugins and somewhat technical people, or rather domain experts, fit them together for a specific task. WordPress, Unity3D and Shopify, to name a few.
Software is mostly a management technology. But there are all sorts of automation: CNC machines and industrial robots making stuff.
I think that a graph of "total factor productivity in the USA" is misleading without looking at factors like, say, how much manufacturing has disappeared from the USA and gone overseas in that period!
You have to look at how much you're producing with how many people; and that cannot be a per-capita figure based on the whole population, but the actual head counts in the industries covered by that graph: how much output with how many people?
I still think that software in general does show up in productivity trends. Take a look at the nineties on the chart in the article: that productivity surge can easily be attributed to spreadsheets, word processors and other software innovations that became ubiquitous in that period. It's also true that niche software is hard to get right, but then again, look at Amazon: its crazily efficient logistics is built on custom software, and it seems to work fine for them.
Something I don't see addressed here is how much of software is an arms race. I think this reality is hidden a bit from people who have only ever worked on commercial software that exists for the purpose of creating economic value. A lot of software doesn't have that purpose and exists mainly for defensive reasons.
I have spent most of my career working on fielding new software systems for the intelligence community and the DoD. We can't say we haven't seen productivity gains in the form of many processes being automated to the point we can scale them much larger and process much more data. But this isn't economic productivity. 60 years ago, satellite imagery involved dropping film from the satellite on a little parachute and intercepting it before it hit the ground, developing the film, and deploying any improvements in imaging capabilities by launching a new satellite. Now we can do almost all of that with radio and software and we have virtually the entire globe covered, a near non-stop stream of imagery constantly being turned into possibly useful and actionable intelligence depending on what the interest is in knowing what is happening in that region.
But in terms of what we're doing, much of it is economically purely a sink. We're monitoring foreign ports, known locations of military units, missile silos, to maintain the strategic advantage of not being caught with our pants down if anyone out there ever decided to launch a large-scale conventional attack. A lot of people would probably argue what we're doing is pointless, fighting yesterday's wars while losing today's. Maybe. I'm not really trying to make an argument either way for whether this activity is useful or not.
But it's not increasing American economic output, and it's not intended to. But it is an incredibly expensive and enormous scale application of deployed software technologies. It's effectively a new category of cost for the world's major military powers. They now need to spend on maintaining an enormous development pipeline and operational environment for software capabilities that bring no economic gain, but just keep them from being overtaken by their enemies.
You do see some patterns like this in commercial software, especially in the realm of information security. We may or may not be able to easily deploy huge force multipliers to make our workforces more productive, but then we find they have vulnerabilities in them, and we've exposed ourselves to a new kind of criminal taking advantage of that and extracting some of that value. So we devote more and more resources to securing these systems, often making them less efficient and more difficult to use in the process. We have to do it, because the added security is at least some of the time worth it, given the enormous cost of a breach. But it's purely defensive spending. You're not making your system any more effective at producing whatever it is your company produces that creates economic value. Often, you're making it less effective at doing that.
Cal Newport points out the lack of productivity increases in his book about email (which is actually about email and instant messaging), and how it's the default workflow tool and how badly it works for that.
Solid thinking. But I think the Jevons paradox deserves a mention. How much of our expanded capacity is spent on intangibles like extra complexity that doesn't show up in GDP?