Anytime you start delving into what has changed for a subset of people over decades, the most important piece of the puzzle is deciding who qualifies for the subset.
I mean concepts like "middle class" or "without a college degree."
A lot more people get degrees in 2018. Does the graduates/non-graduates split still represent a comparable group of people?
The human capital theory that recommended this explosion in tertiary education assumed, and still assumes, that it does. College makes you more productive; make everyone go, and they get productive and earn more.
It turned out college was partly a proxy for class. What college-grad earnings in the 60s were actually saying is that upper-middle-class kids grow into upper-middle-class adults.
...at least partially.
I certainly agree that a lot of cities are too expensive for the average person. But, I don't think that you can take education in 1960 and 2018 and treat the populations as similar.
I'm also dubious about college as a proxy for skill, in the skilled labour sense. A lot of college is very general education, with even less focus on marketable skills than high school.
The American use of the term reflects (whether or not it is borne out in reality) the achievement of a comfortable and secure lifestyle (relative to the current time period) by an employee. In a way, it's a more flexible definition than the British one. The post-WWII economic boom was the first time that, en masse, people who were not inheritors of significant educational or monetary capital or status could achieve that lifestyle.
However, the problem with the American definition is that it chooses to ignore how much that achievement actually has to do with class and other inertial societal constructs, perhaps because frank discussion of class effects is still considered taboo in American society, whereas older societies with openly acknowledged class structures can actually have that discussion.
Working class: work to survive.
Middle class: work to do other things.
Upper middle class: work for wealth.
Upper class: work optional.
People rightfully like to talk about the working poor and the disappearing middle class, but I think we are actually already past that. What we are seeing now is that between education, housing and financial insecurity it is hard to be even upper middle class anymore without existing wealth.
I guess I don't have anything to add, just struck a nerve because that is exactly how I feel, so I wanted to say something.
I'm not sure that's really true - or at least, it's an incomplete model.
Rather, what's happening with education, health care, & housing is that you have three sectors with flat productivity growth and large barriers to entry, which implies that the number of people they can serve remains constant. The population is growing. When you can provide a service to X people but there are Y > X people who need that service, Y - X people will necessarily end up going without. The financial insecurity is because of the mad scramble to avoid being part of the Y - X who get left out in this game of musical chairs.
The reason the middle class (using danan's definition of "employees with a comfortable and secure lifestyle") expanded so much after WW2 was because we had a number of technological breakthroughs that dramatically increased the productivity of these sectors. Antibiotics and vaccines meant that illnesses which previously would kill or require long medical care could be prevented by a simple shot. The green revolution increased the yields on crops by 10x. The interstate highway system and automobile opened up vast suburban areas for development, and developers (like our current president's father) became wealthy applying mass-production techniques to building housing. The number of skilled scientists in the U.S. increased dramatically during WW2 as they fled turmoil in the rest of the world, allowing the country to support larger college populations.
We still see those technological breakthroughs in other areas like computers & media, but they don't happen anymore in fundamental areas of life like health care & education. Some of this is probably inherent in the problem domain (with all the easy diseases cured, we now die of heart disease, cancer, suicide, overdoses, etc, most of which have proven stubbornly resistant to cures) while some is because of artificial restrictions (like housing zoning laws).
There are probably some artificial barriers to entry for education as well - it's hard to explain how Ph.D.s have no job prospects at the same time that college tuition keeps rising. And I suspect that has to do with college's role as a signaling device for job prospects. If a bunch of Ph.D.s wanted to open a new college where they personally tutor a bunch of incoming college freshmen for much less than college costs, those kids would probably get a better education than any of their college-bound peers, but they also wouldn't have a degree from any credible university or much in the way of job prospects.
...come to think of it, I wonder if colleges know that their primary value is as a signaling mechanism, and intentionally restrict the size of the incoming class to preserve that signal. In other words, they're aware that they don't actually add any educational value, they just ensure that they only admit students who would've succeeded anyway and then point to all their successful graduates.
I think what you're saying is that they don't add any additional educational value beyond what your hypothetical band of PhDs could. They do provide educational value and also the signalling, though perhaps they don't acknowledge the latter so much.
But at least with pre-university education, as far I've seen, the process you describe is what most private preschools, primary and secondary schools engage in: They can essentially select their student population for the children who are most prepared, either by means (== family support system) or by their natural abilities.
Even those private schools that have scholarship programs for underprivileged students only accept students who are exceptional and don't have serious social functioning issues.
These schools then point to the achievement of their students as an indication of their superior educational process (which in some cases might be true, due to just having more money to provide better educational services).
Public schools, by contrast, must accept students of all abilities, and thus look worse on paper, when they are often dealing with a much more diverse, and often more challenging set of student backgrounds.
Housing (which you have to procure from the university for at least two years), amenities, and administration are.
You can provide a world-class education by having professors lecture at a whiteboard to students in cargo containers or prefabs, for a fraction of the price of a modern education. But then you don't get the 'college experience' of paying tens of thousands of dollars a year to live in a hotel, surrounded by underage booze, drugs, and parties.
A lot of the costs go into intangibles, like my college had some pricey guest lecturer programs. It is true though that university overhead has been growing quite fast lately and a lot of money in the last decade has shifted to things that are decidedly unrelated to education (I mean, LSU has their own lazy river).
Working class: One who makes money by collecting a wage.
Upper class: One who makes money through investments or similar business dealings.
Middle class: One who partially makes money by collecting a wage and partially through investing.
Wealth figures are more difficult to come by, but the internet suggests the median household has a net worth of around $200,000. However, I would venture to guess that a large portion of that is tied up in a primary residence, which does not normally generate any income. It is not clear how much a typical person does make in investment income.
If you are right that 60% of the population do not have investment income, that seems reasonable enough. I think it is fair to say that most people are working class. I am sure that we can agree that, under modern usage, the middle class is meant to represent something that is not easily attainable, but more attainable than upper class.
I believe that the push to educate all young Americans was a push to expand the power of the middle social classes at the expense of the typically uneducated lower classes. Success in this endeavor is measured by the proliferation of middle class norms and tastes, not necessarily the expansion of middle class economic prosperity.
This is also why so many Americans talk about being “middle class”, even if they fall well outside the middle of the economic range, because they’re talking about the middle social class(es), not the middle economic class.
Also because money is the only god in this country.
it also helps us to conveniently ignore economic sovereignty when contemplating 'success' or 'prosperity'
which of course is a useful frame given the bulk of the post war american middle class was created by essentially trading economic sovereignty in the form of small farms for corporatism
not sure how it could have developed otherwise given the rise of factory farms, etc, but this direction was certainly influenced by large players as well..
I really enjoyed college, especially my liberal arts classes.
That being said, with AWS Certs going for between $75-$300 how long do colleges think they can survive?
If you were a 22-year-old who studied from zero computing knowledge for 12 months and got the AWS Solutions Architect Pro cert (somewhat possible assuming 2 hours/day), I can already guarantee you'd receive a written offer for $100k-plus within days from my current company (in Texas, so that's quite a bit of money).
Again, I don't think colleges are going to compete well against that.
Last thing: here in Texas, many high schools can now issue Associate degrees. Also, the whole model of high school has shifted from a test focus to a job-placement focus. Starting this year you can get an Associate's in cybersecurity out of high school...
I think this highlights the true question. Do colleges exist to promote education and intellectual curiosity or to train workers?
According to PEW, Americans believe colleges are for work training:
"Americans are split on the main purpose of college, with 47% saying it is to teach work-related skills and 39% saying it is to help a student grow personally and intellectually."
When college in the US cost what it did in the 1960s, this sort of part false-dichotomy, part navel-gazing question was at least fairly harmless. Now it's just fucking ridiculous, even for most public schools.
There are no jobs for people.
Therefore people try to get college degrees, even if they aren’t suited for it or going to be happy with it.
This drives up the demand for college, and specifically for degrees that lead to jobs.
This degrades the image and purpose of college (students aren’t here to expand their mind. They’re here so that recruiters won’t ding them.)
All of which leads invariably to downstream issues, like credential inflation and growing student debt.
Eventually, college = jobs, and one of the greatest achievements of America, its liberal arts education, will tarnish and crumble.
Economic value now comes from fewer roles, which require specific skill sets to perform.
With more lopsided reward structures (few people corner most of the gains), there is increasing pressure for people to move into the few places where rewards collect.
This ties into issues like "History majors can't get a job that will help them hold down a house or family".
The societal cost of it is that society becomes a monoculture of a few degrees and sources of information.
The greater advantages of a modern first-world society in terms of societal achievement, art, culture, and so on weaken.
There's something very broken with our modern society.
We also should make it easier for those academics to bring those massive gains, because academia isn't organized for that.
I still think this is not the real goal, but maybe it was at some time. (Is there more than one goal? The system appears to not be optimized for any reasonable goal.)
On top of that, it can be difficult to separate the people who are actually intellectual elites and the people who are good self-marketers.
I kind of think we should just be committing a lot more funding to those interested, who meet some reasonable, objective level of expertise.
I'm not saying either of these are the case, but it is worth exploring before we outright dismiss.
The other point I think worth adding isn't that many of these humanities and arts fields are unemployable, but largely that they are some of the most popular majors and in turn their economic value is diminished as such. If there were twice or three times as many graduates per year in Computer Science, I'd expect the perceived economic value of a CS degree to decline as well.
I don't really think the teaching part at most universities is oriented toward training workers. If it were, then the administration would be worried about what skills are in demand by employers...probably training students on particular software, computer languages, jargon and techniques that are common. There would be more emphasis on getting various certifications while still in undergrad. I can't speak to promoting "intellectual curiosity" because it seems hard to define, and if you did I bet it would depend mainly on the nature of the student...probably his/her genes, peer group and childhood role models...not on the actions of someone the student sees three hours per week during adulthood.
As it is, a professor can (very often, though not always) be totally incomprehensible and irrelevant to work, as long as they are not offensive, without any repercussions nor even any gentle communication that he/she explain things clearly. Humans tend to craft stories favorable to themselves, so even if the student evaluations are negative, the professor can just blame, e.g., short attention spans in the era of marijuana and Fortnite...or the professor just might not look at the student evaluations if they don't want to. Over time, some professors even develop a sort of perverse joy at confusing students...their confusion being the sign that the professor is very smart and that he is "challenging" them, when in all likelihood he is just asking them things he never explained clearly in the first place. In my undergrad, an EE professor bragged that the class average was a 23%; he also used his TA as a translator because he could not speak English.
It is expected most everywhere that students will learn work skills on the job and in internships...not at school. Professors at research universities spend all their time thinking about hard frontiers of a topic, and that frontier might have little to do with what you do on the job.
And taking classes also means giving up time for something else. Opportunity cost is also a thing.
So yes, the liberal arts are the higher forms of education, which are also further from the rest of us who need a credential, because of credentialism, so we can merely live. (Food, water, electricity, housing, medical care, and internet are effectively all required to live these days.)
Also, your characterization of community colleges as only being for those too poor to attend a 4 year school is totally out of touch and frankly a little stigmatizing towards an important piece of public education.
I know many people who graduated with 4 year degrees after getting an AS at community college, which carries the exact same core liberal arts + math and science course requirements that state schools impose. These people didn't do it because they were poor; they went because their high school GPA didn't let them get admitted to the schools they wanted right away. The fact that they got to save a ton of money, pick schedules conducive to part time work, and get a real credential after 2 years (I know many employers who prefer an AS degree holder to someone who dropped out of a 4 year degree 2 or 3 years in) is just icing on the cake. And it didn't hurt long term prospects for career or education either; one of them even has a graduate degree from a school that is extremely respected in the field he studied.
Community colleges are awesome things, both for people studying specific trades and careers and for eventual 4 year school attendees. The more we can destigmatize them, invest in them, and keep them affordable and accessible to working people - the more we'll all share in the benefits.
Colleges shouldn't be in the business of training people for jobs - that is what tech schools are for. If you just want to get a job, go to a tech school, don't go to college. College is (or should be) about expanding your mind, your horizons. It should be about exposing you to new ideas and thoughts you hadn't been exposed to before. There is a reason why University and Universe have the same root.
Even if we pretend that the majority of people attend college not for the purpose of increasing their value to employers and getting out of their parents' house, but for personal edification, colleges don't even accomplish this ostensible goal! Most students leave more ignorant of most subject matter than on the day they first set foot on campus - https://www.nysun.com/new-york/students-know-less-after-4-co... ; https://www.theatlantic.com/magazine/archive/2018/01/whats-c.... Often they're propagandized for four years, saddled with debt, and told that they got an "education", so it's nobody's fault but their own if they can't find a decent job.
Three cheers for the AWS certs and modern-day educational options and good riddance to the corrupt university system (propped up by taxpayer-subsidized monies, of course).
the motte being the liberal arts/holistic focused learning (only a philistine would be against that) and the bailey being the salesmanship of economically useful learning.
> I must study Politicks and War that my sons may have liberty to study Mathematicks and Philosophy. My sons ought to study Mathematicks and Philosophy, Geography, natural History, Naval Architecture, navigation, Commerce and Agriculture, in order to give their Children a right to study Painting, Poetry, Musick, Architecture, Statuary, Tapestry and Porcelaine.
Nothing wrong with the arts, but the fact is very few can support a reasonable lifestyle on it. By burning the inheritance you can produce art but nothing remains for your children. Study art by all means, but make sure you have something to support yourself on without burning up the hard work of your ancestors first.
I imagine the answer to that question would be interesting and nuanced.
If we're doing our job as a generation we are moving the focus of human education from the gross to the subtle.
This is patently false. Colleges compete and advertise based on their job placement rankings. Just a few years ago, law schools were sued for misrepresenting their employment statistics.
Whatever ideal, romanticized version of university and college experiences are touted, the reality is that they are marketed to parents and kids as the way to a better lifestyle.
Sure, "Liberal Arts" are largely about enriching the mind, but that doesn't change the fact that very, very few people can afford the luxury of going to get a 4 year degree just to "expand your horizons".
But then you have to be clear-eyed about what we're talking about here: from the perspective of a college student, is it worth paying $100k to $200k to become more worldly, with some auxiliary benefits to their income potential? I would probably still send my (future) kids to such a place, but then I'm in a pretty high income/wealth percentile. I suspect that many people (including my family when I was a kid), looking at this clear-headedly, would rationally decline this deal. If you think the pricing of college reflects the "broaden your horizons" aspect instead of the perception that it's "protection money for a middle-class career", you're sorely mistaken. Which means there are and can be far, far, far more cost-effective ways to broaden your horizons, particularly in the era of Internet communities, cheap travel, and all of humanity's knowledge at your fingertips at all times.
I'm aware that there are ways to get through college more cheaply, like starting in community college. But a system in which the majority of people go through CCs first is a very different system, so the thrust of GP's point still stands.
I don’t think programmers will be obsolete; but I think the skill set will shift away from the analytical and mathematical traits that set programmers apart today, towards the fuzzier knowledge of the humanities. (Though my personal belief is that liberal arts educations must include science and mathematics as a core pillar.)
Specialization in algorithmic and mathematical thinking will still be of extremely high value, but the level of achievement required to be successful in that area will likely be crushingly competitive.
But, this is just a fun guess — let’s check back in ten years so we can see how utterly off base I was :)
People thought the same thing in the 80s....
Also, given that computers can test stuff on billions of people, I would say there is a good chance that you could have popular pieces of music and art that are produced largely by algorithms in the very near future.
I keep seeing this on HN. Only the privileged can afford to go to college to “expand their minds”. The middle class and even the upper middle class are not spending tens of thousands of dollars a year to go to college to “expand their minds”. They are doing it so they can get a job.
Where are all of these students with “expanded minds” going to get a job? I specifically told both of my sons that I would not support them in getting a degree that didn’t have an outlook for a decent paying job. I specifically had them look at the starting salaries of graduates, the placement rates, and the five year salary averages for the school and the degree they choose.
Whatever it "should" be for doesn't really matter. The majority of people (in America) operate under the assumption that going to College helps you get a job.
Anecdotally, the college I went to, New College of Florida, is a public liberal arts school whose costs aren't too far out of alignment with other state schools in Florida. According to their fees page, in-state tuition, fees, room and board would total $16K a year; for out-of-state students, that would be $39K a year.
Actually, I'd like to gently question your figures: you're indicating your kids are paying $2000 a semester -- so that would be $4000 a year, or $16,000 over four years. And you're saying that doesn't include their tuition, so that's just room and board, right? (That's under half the cost of New College's, which again sounds like quite a deal.) Further, room, board and tuition at this private college would still be under $20,000 over four years? Unless I'm missing something, that would mean tuition at this private school is less than $500 a semester. I can't help but suspect that either one of us has our math wrong, or you have the best school deal in the history of ever.
I too am very interested in the answer to this question. If you're spending that much and not going to a top-30 school, you're a chump. If you are going to a truly elite school, you're not paying anything close to the sticker price unless your parents are quite well off.
I'm not saying college isn't unreasonably expensive these days, but I often see this number for undergrad and I always wonder where the hell it came from.
Where did you get this idea? The term "university" (as an Anglicization of "universitas") refers to its guild-like corporate organization.
- college is, in every way, dumped on teenagers as THE ONLY way forward
Absolutely true in my experience. High schools are entirely structured around two things: standardized test performance, and pushing kids into college. If you are not a college-bound kid, you essentially don't exist.
- No one is there to mature. No one is there to improve their concept of individuality. No one goes to college to learn for the sake of learning.
"No one" is too strong, but I think this is true for the majority. Most people in an undergraduate program (and probably also a majority in Master's programs) are there because they think it is the path to a good job. Because that is what has been drummed into them since elementary school. Learning, expanding horizons, etc. are secondary.
If most people at college were there simply for the sake of learning, enrollment would be a small fraction of what it is.
I studied Ancient Near Eastern Studies and Historical Linguistics, neither of which is part of anyone's plan for getting rich. But I studied those things because I was interested in them, purely for the sake of learning without any hope of financial reward.
Also, I made the best friends of my life in college, many of whom, 20 years later, are still my best friends.
Overall, I feel like I matured a lot in college.
As for my own kids, I'm encouraging them to do what I think will be best for them and their future based on their skills and temperament. My two oldest I encouraged to go to college. But my third I am strongly encouraging to go to trade school. I encouraged the two oldest to study whatever made them happy in college and to not worry about studying to make money. One is studying non-profit management and the other is studying animal behavioral psychology (she loves working with dogs). Neither of those are good career paths, but they are things they enjoy learning.
You don't need to pay tens of thousands of dollars for college. My wife and I graduated without any student debt. My two oldest are about halfway through and so far they have no debt either (through a combination of working their way through college and academic scholarships).
Well, colleges used to be for the elites (either rich or very talented) to learn science and to do research.
It wasn't meant for vocational training for the general salaried worker. Now it's mostly that.
I ask one simple architectural question that weeds out many of the paper tigers:
“We need to deploy a website that is fault tolerant, secure, scalable, and able to handle one AZ going down. How would you do it using AWS technologies?”
I’m saying this as someone who has 5 AWS certs and interviews AWS DevOps candidates. But I also have 20+ years of professional software development experience, a degree, and before that I was a hobbyist for 6 years.
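For readers who haven't seen this kind of interview question before, here is a hedged sketch of the shape a passing answer usually takes, written as a plain Python description of the architecture plus a check of the one property the question asks for. All the names, AZs, counts, and service choices below are illustrative assumptions for discussion, not a real deployment script or anyone's actual answer.

```python
# Illustrative sketch of a multi-AZ, fault-tolerant web architecture on AWS.
# Everything here (region, AZ names, counts) is an assumption, not a recipe.

architecture = {
    "dns": "Route 53 alias record pointing at the load balancer",
    "load_balancer": {
        "type": "Application Load Balancer",
        "availability_zones": ["us-east-1a", "us-east-1b"],  # spans >= 2 AZs
        "listener": "HTTPS only, TLS terminated with an ACM certificate",
    },
    "compute": {
        "type": "Auto Scaling group of EC2 instances",
        "availability_zones": ["us-east-1a", "us-east-1b"],
        "min_size": 2,  # at least one instance per AZ
        "health_checks": "ELB health checks so unhealthy instances are replaced",
        "subnets": "private subnets; instances accept traffic only from the ALB",
    },
    "database": {
        "type": "RDS with Multi-AZ failover (or DynamoDB, multi-AZ by default)",
    },
    "security": {
        "security_groups": "ALB reachable on 443 only; DB reachable only from app tier",
        "credentials": "IAM roles on instances instead of hard-coded keys",
    },
}

def survives_one_az_loss(arch):
    """The property the question tests: every tier either spans at least
    two AZs or fails over automatically when one AZ goes down."""
    lb_ok = len(arch["load_balancer"]["availability_zones"]) >= 2
    compute_ok = (len(arch["compute"]["availability_zones"]) >= 2
                  and arch["compute"]["min_size"] >= 2)
    db_ok = "Multi-AZ" in arch["database"]["type"]
    return lb_ok and compute_ok and db_ok

print(survives_one_az_loss(architecture))  # prints: True
```

The point of asking it this way is that a cert-holder who has only memorized service names will list products, while someone with real experience will walk each tier and explain what happens to it when an AZ disappears.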
Of course you can learn the foundations without college, and fail to learn them with college, but on average it's much easier to learn them with college.
When I started my career as a scientist in a corporate analytical lab 36 years ago, corporate funding for travel and continuing education was plentiful. When budgets got tight, that was the first to go. Those of us who were self-directed learners survived the downsizing and cost-cutting much longer than those who didn't.
That’s all well and good until they find themselves competing with people from other countries who are being trained to hit the ground running and are willing to work for less....
14 chapters of Digital Signal Processing in 12 weeks w00t!
For many of my engineering courses at a top-tier public university it was in-one-ear-out-the-other.
Though most of the stuff I learned at community college (pre-engineering) did stick.
As it happens, I use stuff I learned in my distributed systems courses--in that oh-so-awful university, hiss--on a weekly basis, if not daily, to solve those problems. (I use stuff I learned in my microeconomics and my political science and my sociology classes daily, but you can't put that on a resume, so obviously it doesn't exist.)
In the Salesforce consulting world the Certified Technical Architect (CTA) cert means you walk in the door at any of the big consulting firms at managing director or equivalent rank and that comes out to around $200k/year. The cert isn't that easy to get, it requires a dozen or so prerequisite certs and then an exam plus a board review but you could probably do it in a year and a half.
Also, I think over half of all CTAs are employed at Salesforce itself, so you could always go work for them if consulting isn't your cup of tea (it's not for everyone, that's for sure).
(Nobody cares about SWF.)
Except for button-pusher roles I've always hired for whys, not hows, and certification programs seem to never care about the whys. College degrees don't assert that you know how to learn, that you know how to derive the whys--but, in my experience, they correlate better than cert-hunting does.
Who thinks they're ever gonna do any cloud service right employing some dude who only has worked with traditional servers?
You'd be surprised how many insanely expensive and dorked-up cloud setups are out there that honestly could have been done far more simply and cheaply by some guy with a random AWS cert.
I'm not talking about a former barista turned AWS guy in a couple of weeks; some background is required for that, and for the random server guy of course, but the barriers are dropping.
I'd likely never hire an engineer with one of these certs and no relevant experience into a full-time position. I might offer an internship, but that's not a six-figure position.
Even still, I agree with the parent up above. A degree isn't strictly necessary, and if someone had the motivation, the talent, and the cert, I'd be willing to give them a chance as an intern.
Edit: I see we have some touchy SJW-types in the house this evening :)
However: Some of your other recent comments are spot on. The upvotes are on me. Try not to spend them all in one place.
When TX is taken, the US will officially turn into a social/welfare state; the US is doomed already, similar to what happened to Rome in the past.
People from CA or Mexico are running from there for a reason; however, when they arrive in TX, their first to-do is to make TX look like where they used to live. They are a virus.
Lots of blue-collar jobs (especially industrial ones) paid enough for a "middle class" lifestyle and financial security. Small businesses were competitive and many of their owners were not college educated either.
College wasn't a necessary qualification for most blue-collar jobs, including some for which it now is (such as police work, in many places) or even for many entry-level white-collar jobs - you could start out as a file clerk with a high school degree and work your way up into middle or even upper management (at least if you were white and male).
Unrelated, a huge point I'd like to see addressed more is how housing costs have outgrown inflation over the course of decades. We used to build enough housing, but NIMBYism and restrictive zoning have taken over recently. Housing is much more expensive in big cities in real terms than it was then, even though fewer people live in them in many cases now (e.g. the population of Manhattan reached its peak in the 1910 census and is down 30% since then). Real wages haven't decreased since the middle-class heydays of the 60s, and a lot of consumer goods have gotten much, much cheaper since then in real terms. The real headwinds in the face of middle-class prosperity are housing costs, healthcare costs, and (to end the digression) university costs, all of which have grown well above inflation.
Hundreds of thousands would live on a street that now houses a thousand people.
I don’t think that density is coming back or desirable.
No one is saying the tenements were desirable, or wishes for them to come back. What we want is the construction of more dense residential housing that can meet or exceed that population density while providing good quality of life. Said construction is entirely possible, doable, and profitable, except that zoning prohibits it in most places.
to save anyone else a search.
> I don’t think that density is coming back or desirable.
I think there's plenty of examples of dense cities (certainly far denser than most US cities) that aren't full of death-trap tenements and it's borderline intellectually dishonest to equate density with that.
In fact, letting NIMBYs have free rein to obstruct/delay/interdict housing construction is far more likely to cause overcrowded/unsafe living situations (which include homelessness) than the other way around.
But on housing, there's a major issue you're not considering. The links below list US city populations in 1950 and 1960. So we're looking at the sweet spot of the housing boom, and a period where the population also grew dramatically, increasing by about 30 million (~20%) in those 10 years. Now you might notice something funny: nearly every major city shrank in size!
The housing boom did not involve building up (rather literally as is the desire of some people today) in desirable areas so everybody could affordably live there. Instead people sacrificed some comfort and moved outside of cities and started building houses in uninhabited areas outside of the cities. Houses were cheap because people were developing in areas where there was nothing and that nobody wanted before. In turn this movement away from city centers helped keep those prices within the city reasonable, as it had a depressing effect on demand.
The reason prices are so high today is simple market dynamics. People are less willing to live in less desirable areas. This is driving prices in desirable areas into crazy land. There is still plenty of cheap housing available outside of these areas. Some cities are so hungry for new citizens that they're literally giving away land on the condition that you put up or build a house on it.
 - https://www.biggestuscities.com/1950
 - https://www.biggestuscities.com/1960
That upwards mobility depends on one thing: that old people drop out of the workforce either by death or by pension, so that young people can rise up the ranks.
Now, given that people in their 80s need to work (!) to survive, this "generational contract" is broken. People are stuck with shit jobs until well into their 30s, so how are they supposed to procreate or save for their own retirement?
We should hope that older workers stay in jobs as long as they're able. The coming demographic bulge isn't pretty.
Imagine an island society, with a bunch of 40 year old workers. You own the company that gets all the profit from their labour. So, you have a real resource, which you could trade with other places.
Now of course this island has no children. How do you feel about your investment once the workers reach age 80? It wouldn't produce very much. In fact, you'd need to put resources in just to keep your island of ex-employees alive.
But that's an extreme example. Let's suppose that 20% of the population is only 20 instead of 40. When the current 40 year olds hit 65 and retire, the younger generation will be 35 and can support them. What happens to your profits?
They will probably still either collapse or near collapse. You had 100% of people in the labour force, now you have 20%. This poor 20% not only has to support themselves, they must support the 80% idle old. And that's before any surplus is generated for your profit as owner.
So, your savings in the form of ownership of the island weren't very useful. It just dies out.
Could you have saved otherwise? Not really. To a limited extent you could store the outputs of the island. We do this currently with grain reserves, oil stockpiles, etc.
But most of the stuff we consume is produced in the same year. You can't save real goods, only ownership of entities.
So if the entities all have a shrinking labour force, they aren't worth as much. The old people in the US won't have meaningful assets with which to pay for their care.
(The one exception is a small old country that owns rights to part of the output of other countries which are younger. There's also an exception for automation, if the companies owned by the old are massively more productive per worker by the time the demographic bulge retires)
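The island arithmetic above can be sketched in a few lines. This is a toy model, not anything from the thread; normalizing to one unit of output per worker is an arbitrary assumption for illustration.

```python
# Toy model of the island thought experiment: total output is spread
# across everyone on the island, workers and retirees alike.

def output_per_person(workers, retirees, per_worker_output=1.0):
    """Output available per inhabitant, before any owner profit."""
    total_people = workers + retirees
    return workers * per_worker_output / total_people

# Everyone works: a full unit per person.
all_working = output_per_person(100, 0)    # 1.0

# 20 workers carrying 80 retirees: a fifth of a unit per person,
# and that's before the island's owner takes any profit at all.
aging_island = output_per_person(20, 80)   # 0.2

print(all_working, aging_island)
```

The ownership stake is only worth whatever residual output is left after everyone is fed, which is the point about savings-as-ownership dying out with the workforce.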
You're making the mistake of those who think of money itself as wealth, as opposed to a medium to exchange for wealth. Money is mostly fungible, but in edge cases it just can't buy certain things.
Such a society would nonetheless face much difficulty if the worker/retiree ratio changed from 100:0 to 20:80. That's true regardless of individual productivity level: case two is harder than case one, regardless of capital stock such as ports, rail lines, buildings, etc.
And here is where it gets funny. I wonder what pension systems would look like if they were supported by the billions of dollars that FAANG in particular, but also other huge megacorps such as IKEA, McDonald's, and Burger King, "creatively avoid" in taxes. Or if minimum wages were set so high that a pension built on a life of minimum wage were enough to live on in old age (this is a problem that currently runs high in Germany, where many old people don't have enough pension to get by and so require social welfare)?
And corporate taxes are just taxes on people, in the end (the owners of the shares). If there are fewer people working and more people to support, things just get harder, absent large productivity increases or immigration of young workers.
It's not like the people running these corporations are all evil and out to ruin people's lives. People behave based on the incentives laid out in front of them and the current combination of capitalism and laws incentivizes the behavior we are seeing.
True, but the same people also largely support programs and politicians that preserve or extend those incentives. They're not innocent bystanders.
Pension systems own the FAANG companies. For example, in the US, CalPERS manages a third of a trillion dollars. Two thirds of that is in equities such as Apple.
An interesting touchstone for the article: high-rent cities in the US have started to see restaurant turnover not from 'normal' financial failures but from labor shortages. For midrange restaurants that aren't either selling luxury or exploiting captive demand (e.g. lunch for nearby offices), there's no price point where they can make ends meet while paying servers, dishwashers, etc enough to accept high rents and/or long commutes.
I agree that treating "non-college" as a constant bracket is a misleading measure, and I can complain all day about the spread of overpriced credentialism in hiring. But observationally at least, I think there's a separate issue with big cities and low-skilled labor where work we consider part of "normal functioning" for our society is increasingly incompatible with the housing prices and transit options available around urban centers.
Restaurants that try to charge "too much for their station" don't last here.
> In SF, for example, fine dining is doing well, while the mid range is slowly failing and being replaced by fast casual, where they focus on fast turnover.
It has to do with readily available substitutes for a given price point affecting elasticity of demand. The type of food offered by a casual dining concept can be replaced by fast-casual, whereas higher end food has relatively inelastic demand, because it has fewer substitutes.
That's mostly not a change in type of food, just front-of-house service model. And casual and fast-casual both focus on turning tables, fast casual just has counter ordering (and often counter service for drinks) rather than full table service.
I think there's a very plausible argument that Applebee's-range restaurants are very labor-intensive without adding substantial value, and so they naturally get shut out of any market where labor is expensive. The "cheap hot meal" role is filled by fast food, while the "place to sit and chat" role is filled by everything from pubs (order at the counter, high-margin alcohol) to dedicated dessert places (order at counter, often cheap nonperishable ingredients).
But granting that it's not something to 'solve', it's still newsworthy. There are ~3M waitstaff jobs in the US, and despite the tipped minimum wage it's often pretty well paid, with nonstandard hours that can help part-time students, households with two working parents, etc. If our biggest cities are no longer compatible with common forms of work, that's worth discussing, as are its secondary impacts.
Makes sense to me.
But I do think can versus will is a thorny question. Restaurant and delivery dining are highly elastic in general, and big cities aren't necessarily an exception. People who can stably afford to live in downtown SF, NYC, LA, Boston, etc. are still frequently paying >33% of income in rent, and ~50% isn't shocking for younger people. As a result, you've got a major demographic that's relatively high-income but still tight on cash and consequently price sensitive. (More speculatively, I think there's also a grain of truth to the "avocado toast" thing; young urbanites seem likely to cut back on purchase frequency before they cut back on quality.) So it's not necessarily the case that restaurants can make things work by raising prices; there's no rule saying that a given type of restaurant has to be viable at any price point.
I realize that we can label all of this as free markets doing their thing; maybe mid-tier restaurants with lots of staff are just an inefficient use of valuable urban space. But real estate is infamously messy, and there are some interesting questions about whether city rents would be more functional if not for the conversion of usable space into artificial demand (e.g. mandatory parking with housing) and disused investment space. And even if that's not the case, it's still news if we've built a market where "going out for dinner" is no longer a standard transaction.
Elasticity of demand depends on whether or not there's readily available substitutes. If you're a mid-end restaurant, you can be replaced easily by fast-casual concepts (food quality doesn't differ much from fast-casual to casual full-service). However if you're high end, your competitors will be other high-end restaurants, with the same staff and space constraints as you. This is why in higher-rent markets such as NYC, Tokyo, London, Hong Kong, high-end restaurants flourish.
> there's no rule saying that a given type of restaurant has to be viable at any price point.
Of course not. But in a dense, urban, high-rent area, there's likely a model that does work.
> maybe mid-tier restaurants with lots of staff are just an inefficient use of valuable urban space
They are. Rent is your main fixed cost, staff is your main variable cost. COGS matters, but you need a product to sell in the first place, and every restaurant brings in similar products. Having too much staff and too much space is a recipe for disaster.
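As a back-of-the-envelope sketch of that cost structure: rent is fixed, staff is the big variable cost, and margin per cover has to absorb both. Every number here is invented for illustration, and `cogs_ratio` is an assumed cost-of-goods share of each check.

```python
# How many covers a restaurant must serve per month just to pay
# rent (fixed) and staff (variable). Illustrative numbers only.

def breakeven_covers(rent, staff_cost, avg_check, cogs_ratio=0.30):
    """Covers per month needed to pay rent + staff from gross margin."""
    margin_per_cover = avg_check * (1 - cogs_ratio)
    return (rent + staff_cost) / margin_per_cover

# Mid-range full service: big staff, modest check.
mid_range = breakeven_covers(rent=30_000, staff_cost=60_000, avg_check=25)

# Fast casual in the same space: counter service, smaller staff.
fast_casual = breakeven_covers(rent=30_000, staff_cost=25_000, avg_check=20)

print(round(mid_range), round(fast_casual))
```

With these made-up inputs, the full-service model needs roughly a third more covers per month to break even, despite the higher average check, which is the "too much staff and too much space" problem in miniature.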
To conclude, here's some statistics for you concerning household expenditures on food: https://www.bls.gov/regions/west/news-release/consumerexpend...
> San Francisco-area households spent $4,487, or 50.3 percent, of their food dollars on food at home and $4,431 (49.7 percent) on food away from home. In comparison, the average U.S. household spent 56.3 percent of its food budget on food at home and 43.7 percent on food away from home.
In San Francisco, households are spending a higher percentage of their (already higher) income on food away from home than other Americans.
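As a quick check, the quoted shares follow directly from the dollar figures:

```python
# Recomputing the BLS shares from the quoted dollar amounts.
at_home, away = 4487, 4431             # SF-area household food dollars
total = at_home + away

share_at_home = 100 * at_home / total  # ~50.3%
share_away = 100 * away / total        # ~49.7%

print(round(share_at_home, 1), round(share_away, 1))
```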
Anyhow, in my experience, restaurants fail because there's a lot of shitty restaurant owners out there. Not knowing their demographics, costs, labour situation, etc..., qualifies them as being a shitty restaurateur.
That would assume that the real estate market is a free market, which in many cities is a laughable idea.
I think it's quite likely that midprice, full-service restaurants are an inefficient use of expensive urban space. It's also possible that they're an inefficient use of valuable urban space, and would be squeezed out of dense cities even by healthy, responsive real estate markets.
That second claim is much bolder, though, and my personal guess is that taller buildings and better mass transit would relax labor costs via lower rent, while also lowering the price of downtown real estate. A few extremely dense cities (NYC, Tokyo, whatever) might use better transit for a "reverse commute" where people live near work and go outbound for restaurants, but that's even more hypothetical. And I can't really think of a "free real estate market" city to use as a reference, because I'm not sure anything close exists at this point.
There are far more high-end restaurants downtown in my city than there are fast-food franchises.
And even the more-than-working-class definition clashes with other trends; some pundits seem to define "working class" as nothing more than "does not have a college degree", which means that some trends, rather than being interesting demographically, instead just become a restatement of an underlying trend (like you said, the increased prevalence of college degrees).
And the final problem with all attempts to analyze this is that we don't really have comprehensive data that supports a single way of measuring these aspects historically, because the definitions and criteria keep changing. Even when there is insightful research, it is hard for it to apply outside its direct area, because the definitions and proxies for historical values are not guaranteed to be uncorrelated with the new phenomena being measured.
I think this is a critical observation that lots of people--including some economists--have missed.
Per a Pew study Brookings cites (https://www.brookings.edu/blog/social-mobility-memos/2014/02...):
> children born in the lowest income quintile who do not earn a four-year degree are four times as likely to wind up in the bottom (47%) as those who earn a four-year degree (10%).
So while upper middle class people are still the ones with the best chance of getting admitted to, paying for, and finishing college degrees, lower class people who persist and do the same really are far more likely to join the middle or upper middle class. Though it's far from guaranteed, especially since the relative value of a degree does depend on which degree, and sometimes school prestige as well, which is definitely harder for the lower class to prioritize (costs, scholarship availability, and nearness to family, if they need to provide income or other support, tend to matter more in my experience).
The observation you mention is accurate, but I just wanted to provide additional context to readers who may not realize the life changing effect a degree can still have for poor people. This is as someone who experienced a comfortable middle class upbringing only because of the combination of a state school in the rural south that gave merit scholarships in the 80s to locals, and my mom's efforts to graduate with a degree that would be worth money.
That said, the idea of pursuing liberal arts in college is still incredibly foreign to me. Being able to support oneself after getting that sort of degree is definitely something I can see as mostly a proxy for upper middle class upbringing, and it likely feeds into the rising trends we see today, where the payoff for the lower class is not hugely significant or comparable to well-connected peers, even upon degree completion.
While it is true that the school system acts as a filter that removes those who have almost no chance of upward mobility (those with crippling disabilities, for example) from completing higher and higher levels of schooling, we need to be careful to not reverse that observation. Attaining a degree is not going to undo the disability that limits one's economic growth.
With the great push for everyone to have a post-secondary education we've witnessed over the last decade or two, virtually everyone who is capable of attaining a post-secondary education has done so. Those who are not completing those levels of schooling now are those who had no chance in the first place, and they struggle equally in the rest of their life for the same reason they did not succeed in school.
That said, if you were born with what it takes to be top of the class at Harvard, but chose not to pursue that avenue of life, chances are you still have every bit as much upward-mobility potential as someone who did graduate from Harvard.
Well, minus the Harvard contacts and network, which is a huge part of the success factor of Harvard grads.
So again, without a college degree, 73% of people born poor will remain lower or lower-middle class. Only 3% will make it into the top quintile (vs 9% with degrees) and 8% into the 4th (vs 17% with degrees). So if we count only the 3rd quintile and up as middle class and above (which is questionable in itself, given the shrinking middle class), a degree gives a lower-class person a 52% chance of genuine upward mobility, compared to 27% without. And it nearly guarantees that they at least won't remain in the very worst stratum of poverty. 52% for real mobility may not be great odds, but neither is 27%, and if it were my future on the line, I definitely would not want to be staking it on 27%.
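Putting the quoted Pew figures side by side: the top- and 4th-quintile shares are quoted directly, the bottom-two share with a degree is implied by the 52% mobility figure, and the 3rd-quintile shares are inferred as remainders rather than quoted.

```python
# Destination odds for children born in the bottom income quintile,
# using the percentages quoted in the comment above.

top_quintile    = {"no_degree": 3,  "degree": 9}    # quoted
fourth_quintile = {"no_degree": 8,  "degree": 17}   # quoted
bottom_two      = {"no_degree": 73, "degree": 48}   # 48 implied by 52%

# "Genuine upward mobility" = landing in the 3rd quintile or above.
mobility = {k: 100 - v for k, v in bottom_two.items()}
print(mobility)  # {'no_degree': 27, 'degree': 52}

# The middle (3rd) quintile share, inferred as a remainder:
third_quintile = {k: mobility[k] - top_quintile[k] - fourth_quintile[k]
                  for k in mobility}
print(third_quintile)  # {'no_degree': 16, 'degree': 26}
```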
Hypothesis A: college coursework actually increases a student's future work productivity.
Hypothesis B: the least productive laborers are unable to be admitted to a college, or unable to graduate. The admissions and graduation filters thus raise the productivity average in the pool of graduates.
Hypothesis C: businesses with the most productive workers tend to prefer college graduates. Workers without degrees appear to be less productive because the only jobs available to them have inherently lower productivity.
Hypothesis D: productivity is measured differently for some workers, based in part on whether they have college degrees.
Hypothesis E: some workers are able to influence the measurement of productivity, such that productivity originating from other workers is reassigned to them, and those cheating workers tend to have degrees.
Hypothesis F: workers with degrees tend to have debts, and are forced by necessity to be more productive in order to first pay them off, and then later save enough for retirement with a shorter savings window.
Hypothesis G: workers with degrees are more productive because the past correlation generates an expectation that they be more productive.
The observation could have complex cause, and all of the above may be true.
Claiming that high education has only signaling value is ridiculous. (It's even more ridiculous than claiming it has no signaling value.)
At one time there was an argument that general skills had value. We as a society have become super specialized, and those general skills have become so ubiquitous that they aren't terribly valued any more. Your employer doesn't care if you understand Cartesian philosophy or have an understanding of classical rhetoric; they just want to see you spent 4 years learning accounting/law/human resources, because those are the skills that are required.
This may also be why our leaders have gone from
"Success is not final, failure is not fatal: it is the courage to continue that counts."
to
"We love winners. We love winners. Winners are winners."
in the last century.
Being able to effectively communicate over email takes more writing skills than most people assume. The internet adds a lot of flexibility in the kind of things the average office worker will do.
The general consensus I can see is that that is not the case - employers want specialties.
And absolutely communication is a skill we require, but for some reason it's not valued.
Part of it is the size and scope of the projects and the general complexity added to the field, but 20 years ago 1-2 people would have done all of these.
I'm not sure exactly where the bifurcation lies--it certainly isn't unique to software. Medicine and law are also getting highly specialized. On the other hand, are administrative jobs going the other way? Accounting, HR, compliance? I'm always a bit puzzled when companies recruit CEOs from unrelated industries. Does domain knowledge matter that little for some, even very senior, roles?
Inside a company, programmers such as myself have been assigned to work on React without ever having seen it before, with the expectation that they will pick it up. It's only when looking for new employees that these differences have much weight.
PS: I have been told to pick up low-level network programming, frameworks such as React, and new languages, and have even jumped into web programming from nothing. I can only assume this is generally the norm.
A car salesman needs to know a lot about the product, more general sales tactics, more general skills like email, and even more general skills like just speaking. But, as you narrow down into the ultra specific niche the percent of time working in that domain decreases. What percentage of the time is the sales guy dredging up specific horsepower numbers etc related just to the car they are selling?
Over time what we could consider generalists jobs like secretary have been cut while the tasks have not. So, by handing out those tasks to others those other jobs have in turn become more generalist on a day to day basis. Dev-Ops for example is in many ways the opposite of specialization.
PS: Put another way, if my last job had been using Java instead of C#, I would have done the same thing with ~80% of my time. You would be reading the same requirements if the code were in another language.
That works well for languages, yes, but what about data scientists, business intelligence, cyber security, and machine learning experts? Those are all jobs that launched off the dev backbone, but they are very different, involve unique, specific knowledge, and require training past what a normal degree requires. Dev-Ops may be a generalist position, but you'd rarely see someone who does Dev-Ops do those jobs.
Yes, the occasional maverick will study outside their target degree, but that is rare. My mate who is a classics teacher (public school) has a class size of two.
According to this article from 2017, only 33.4% of Americans over the age of 25 have finished their undergraduate education. As such, MOST people are still without an undergraduate education, and thus the median American does not have one. Since the middle class should represent a median resident, it would follow that people without an undergrad education should be quite solidly in the middle class.
Well, it could, if it meant "near median income" and few had income near the median. Median doesn't mean particularly common (or even the most common, which is the mode).
(working) ^ ^ ^ (upper)
         median
It doesn't force the working and upper class to be equally large unless you further assume the distribution is symmetric, but yes, there is a limit to how small the upper class can be since at least half of the distribution has to be split between the upper and middle class.
Currently in developed countries it happens to be the case that the people with median income are middle class. However this was not always the case, ie the group referred to as middle class did not always include people with median income and so the above definition does not coincide with what middle class typically referred to.
> Today’s bachelor’s degree is the equivalent of a high school graduation certificate from fifty years ago, and today’s graduate degree falls short of a bachelor’s degree from a generation ago.
This is not factually correct - tuition at non-state schools increased 10X in the last 30 years, way ahead of the pace of inflation. College education is significantly less affordable than it was years ago.
> "This is not factually correct - tuition at non-state schools increased 10X in the last 30 years, way ahead of the pace of inflation."
You are only considering the monetary cost of college which has, indeed, skyrocketed.
However, a more significant cost in the era you are both speaking of was the opportunity cost of not working immediately after high school, especially when that same social class very likely had already started a family, or came from a family that needed support.
People coming from a high SES background could afford four years of non-earning, etc.
The result is that college is more attainable to people across the social stratum. And so degrees are less rare, commanding a lower premium, while the people holding them have more debt.
So, I agree: having a college degree is only a proxy for skill if that degree is in something marketable. Comp Lit is not marketable, hence the reason a lot of people at Starbucks have degrees.
Add the very real prospect of social security collapsing and we have a good setup for the streets being filled with homeless and vast portions of the population failing simply because there was just not enough money to go around.
Where do people get this strange idea that someone else making money somehow precludes others from doing it? Bill Gates got rich with an invention that also generated probably a trillion dollars in economic growth for everyone, including the income of most people on this site.
There is an easy way to check that: has the median inflation-adjusted hourly wage risen in the US since the early 1970s? Have inflation-adjusted weekly earnings risen? They have not; despite GDP growth, both have fallen.
That this even has to be discussed shows the deep control those heirs who expropriate surplus labor time from workers have over discourse, the media, forums like this (run and controlled by an accelerator) etc.
Workers create wealth at a mature company. Some of that pays the electricity bills etc., but the rest goes either to dividends for the heirs or to wages. That is the "preclusion". The heirs expropriate the profits of the surplus labor time from the workers creating the wealth, who are shorted on their wages. Sometimes this is explicit, like the wage-fixing cabal between Steve Jobs and Eric Schmidt that came out in the lawsuit.
Insofar as Bill Gates and invention: it takes a hell of a lot of gullibility to swallow the fantasy you concocted. Kemeny and Kurtz created BASIC. Gates hacked into a military research computer at Harvard and stole computer time from it, according to Paul Allen's book (and Harvard admin found out and held proceedings); he and Allen ported BASIC to the Altair.
Then IBM comes to Microsoft. IBM got wealthy with computers on taxpayer-funded government contracts and a lightly overseen monopoly. Gates's mother is on the United Way board with IBM CEO Opel, who helps make this meeting happen. Microsoft sells Seattle Computer Products' QDOS to IBM (Gary Kildall said it was a complete ripoff of his CP/M). A purchased ripoff of another product, sold thanks to family connections. So much for "invention".
At times like the current one, with real wages falling since the early 1970s despite economic growth, that this is even discussed is a sign of how the heirs have bought and paid for the narrative as well.
The question was really simple. Your answer is the kind that inevitably ends with "you just have to read Das Kapital". That is not an answer.
As I said, when, after the electricity bill etc. is paid, created wealth is split between dividends to heirs and wages to those who worked and created the wealth.
If in your hypothetical situation there is no split - if the person who worked and created wealth keeps everything - there is no expropriation.
The expropriation is the last few hours of work he does, of the wealth he creates - none of it going to him, the one who created the wealth and did the work. All of the profit going to the heir. But if there is no split, this expropriation does not happen.
Nobody, and Marx doesn't say that anyone is in such a case.
>Is it not possible for everyone to be richer than they started and also create wealth??
In some way, it is possible - after all, the worker is richer than he was before (he has more money thanks to his wage) as is the capitalist (who has made money from the sale). Now what wealth is created? Arguably, social wealth in the case of new technology being developed, for example. But Marx doesn't deal with this concept of "wealth" or even immediately money, he deals with the concept of value, which works at a different level of abstraction to money. The value-form only becomes the price-form, they're not identical.
I'd suggest looking up theories of exploitation as they have been figured in Marxisms that don't rely on a labour theory of value. It's too much to summarize in one HN comment, but you don't need to read Capital either (though it would be helpful).
The same concepts can be applied to the wealthy having access to lobbyists and consequently the ability to reduce regulations leading to long term health care costs for those forced to live in now less regulated environments. Gentrification is another venue for this. People live in a neighborhood, wealthier people move in and property taxes go up. Poor people are forced out. The actions of the wealthy have ramifications on the lives and incomes of the less wealthy.
I've read conflicting things about gentrification. But on the "gentrification is good" side, the theory goes that the areas that "suffer" the most gentrification tend to be areas that already have very high levels of displacement. They tend to be poor, with lots of foreclosures or deadbeat renters. In other words, gentrification doesn't necessarily increase levels of displacement. Owners in gentrified areas stand to benefit greatly from gentrification. The renters get displaced - but that was happening whether the area was gentrified or not.
Any monopoly tilts the playing field so 90% of the business sector's profits pour into the pockets of a tiny few, leaving others high and dry. And virtually all the notable tech successes of the past 20 years arose and thrived via monopoly.
So it's little wonder why VCs like unicorns so much. The profits involved aren't shared with others.
Yes, it's possible for the rich to get richer while the poor also get richer. It's also possible for the rich to get richer by taking all the economic gains for themselves, leaving the poor to stagnate or to get poorer.
Over the past 40 years the bottom half of the income distribution in the US has seen zero growth. This despite massive growth overall. This despite rising house prices, growing student loan debt, etc. The rich have gotten richer, and the poor and the middle class have been edged out. It doesn't have to be this way, but it is.
Why? Because of wage suppression, union busting, wage theft, predatory and usurious student loans, corporate welfare, tax cuts for the wealthy, and on and on and on.
You seem to believe there's a fixed pile of money in the world that every human competes for (e.g., 16th century Mercantilism).
Go into the woods and turn a tree into a chair. Where did that value come from? Will it run out?
The problem is when a company makes x, the CEO takes 0.1x in compensation, while everyone else on the payroll combined also takes only 0.1x, just because they didn't show up to work with the same provenance as the CEO. Within a company, there is very much a fixed pile of money that every employee competes for.
I don't see how it's inherently wrong for a CEO to take 10% of revenue and employees split 10%. And why do companies have a fixed pile: shouldn't an effective CEO grow top-line revenue? What if employees are splitting twice the revenue compared to a year ago? (If the company isn't growing, replace the CEO for someone who earns their 10%.)
Money-making inequality, on its face, doesn't seem different than artistic inequality. Should I be angry that Warren Buffet gets 20% annual return on his investments while index funds only do 8%?
Stories of countless species hunted to extinction show that this is not what happens.
Let's look at the physics(?) of the economy: where does wealth actually come from? It's either from transforming things into more useful things or increasing efficiency of processes. There are limits to both of those things.
We live on a planet that's in a rough equilibrium in terms of both matter and energy.
So there is a finite maximum amount of value we can extract from a constant amount of matter, which means overall wealth is limited. We're just moving it around, and at a finite rate at that: somebody is getting a bigger slice of the pie than others.
There may be a finite limit on potential wealth but it’s effectively limitless for human planning purposes.
(For the trees, we can certainly mismanage our resources; it doesn’t mean it’s a priori impossible to create a growing and sustainable form of wealth.)
Actually a pretty decent amount over the life cycle of a person. After all those things require a functioning civilisation that educates properly to begin with.
The internet is mostly a means of making communication more efficient - that was one stupidly inefficient process ripe for an upgrade.
So yeah, no additional wealth really - just far smaller losses.
The internet and other tech doesn't create new wealth? Why has the size of the economy grown exponentially since the industrial revolution?
If wealth were fixed, then as the population increased we'd have gotten vastly poorer worldwide (1.6 billion people in 1900 -> 7 billion today). Have we gotten more than 4x poorer on average?
(These facts are easily googleable.)
I’m not saying local deviations don’t exist - I can lose my job - but globally wealth has been growing continuously and poverty is being eradicated. This is only possible with wealth creation. Look up Hans Rosling’s work.
Either you care and the 'allocation' is unfair.
Or you don't care, which means that the market doesn't work since you caring is what is supposed to allocate resources fairly, and it is still unfair.
Thinking "someone earned money which would have gone to me otherwise" is like thinking "someone got in shape and that physical fitness would have gone to me otherwise".
When I deposit $1k in the bank and you take a $500 loan, $500 is created out of nowhere. You are using my deposit yet I still have a balance of $1000 on the books.
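That deposit-and-loan cycle can repeat: the $500 loan gets spent, redeposited somewhere, and partially lent out again. A toy simulation of the money multiplier (the 50% reserve ratio just matches the example above; real reserve requirements differ):

```python
def total_deposits(initial_deposit, reserve_ratio, rounds=60):
    """Toy money-multiplier: each deposit is partially lent out,
    spent, and redeposited, round after round."""
    total = 0.0
    deposit = float(initial_deposit)
    for _ in range(rounds):
        total += deposit
        deposit *= (1 - reserve_ratio)  # the bank lends out the rest
    return total

# $1,000 deposited; the bank keeps 50% in reserve and lends $500, etc.
# The total converges to initial / reserve_ratio = $2,000 "on the books".
print(total_deposits(1000, 0.5))
```

With a 10% reserve ratio the same $1,000 supports close to $10,000 in deposits, which is the sense in which bank lending "creates" money on the books.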
Uber created a lot of wealth. Some got more than others. Is that objectively bad if the wealth didn't exist before?
...there is. Not in the sense of "money," but in the sense of "wealth" and ownership of private property. Wealth is not "created"; it is diverted. If wealth grows in one place, it's because it shrank in another.
Unless you believe that growth and resources are infinite, like many capitalist economists.
Potential wealth is ‘finite’ because humans are finite but there’s no practical limit. (Just like there’s no practical limit to how much art can be made. If art isn’t being made it isn’t because there’s only so much creativity to go around.)
You described one of the only scenarios in which wealth is "created" and the only reason it's created is because it derives "free" energy from the sun. But that is just one of the economic inputs of the seed, and there are many more (the labor to water it, the water to grow it, the land on which it grows).
The sun's energy in this case is a small portion of the actual economic inputs the seed requires. And those economic inputs are diverted from somewhere else. They do not appear out of thin air.
So yes, I agree, we should harness the sun's energy, or other "free" energy sources, as economic inputs.
That doesn't change the fact that the overwhelming majority of wealth already exists and is not created out of thin air, but diverted.
We take natural resources, often plentiful, apply labor + skill, and get a more valuable product. We're wealthier as a result.
Yes, natural resources are finite, but not the limiting reagent for most things. There's a zero-sum game in that land used for farming is not available for an entertainment complex. But we have so much used for "nothing" that switching it to "something" is a giant increase in wealth. You can buy an acre in Kansas for a month's worth of minimum wage work. Las Vegas was built in a desert surrounded by hundreds of miles of wasteland. Did turning land, earth, trees, and iron ore into Las Vegas add zero value? (Not saying it was the best use of resources, just that assembling buildings improved the value of the raw materials.)
As a counterexample, consider melting down a car into slag. Are you less wealthy with the slag than with a working car? Of course -- the slag is less valuable, even though the number of atoms is the same. In other words, is an assembled watch worth the same as a pile of gears? Are you indifferent between the two?
If wealth were fixed in the earth, we must be getting poorer as the population grows. A few thousand years ago we had 1M people on earth. Were those peasants 7000x wealthier than us?
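The arithmetic behind that reductio is just a ratio (the population figures are the comment's own round numbers, not precise historical estimates):

```python
# If total wealth W were truly fixed, per-capita wealth would be W / N,
# scaling inversely with population N.
ancient_population = 1_000_000        # round figure from the comment
current_population = 7_000_000_000

ratio = current_population / ancient_population
print(ratio)  # each ancient person's share of a fixed pie vs. ours today
```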
In regards to your watchmaker: a pile of gears might be marginally useful to a few, but a working watch will be valuable to many. Has the watch created wealth? No, because the people buying watches are now not buying something else. The creation of the watch did not add the dollar value of exactly one watch to everyone's wealth, allowing them to spend money on the watch. The wealth was diverted! The creation and selling of the watch certainly created societal value, but it did not create wealth; it merely diverted wealth from some other purchase each watch-buying customer would have made and sent it to the watchmaker.
My point is that the overwhelming majority of transactions are diversions, not creations, of wealth.
To your very original point: "You seem to believe there's a fixed pile of money in the world that every human competes for"
I suppose my argument should be changed to: perhaps the amount of wealth is not currently fixed (as defined by the shrinking pool of resources one could use to create wealth), but those who create wealth (and not merely divert it) are generally already very wealthy, and are generally the only ones with the means to create this wealth due to extremely high barriers to entry.
So, I would say theoretically, you're correct: there is a growing pool of wealth. But practically, any normal, everyday person cannot go around creating new wealth...they can only hope it is diverted to them by someone who already has wealth.
You may enjoy the elephant curve: https://www.brookings.edu/research/whats-happening-to-the-wo...
The very rich and the moderately poor have seen wealth increase, while the upper-middle of the global distribution (e.g., many Americans) has seen wealth stagnate. Globally we're richer on average, but not everyone participated. Clearly it's best if we can lift all boats, not just certain subsets.
The ability to create new wealth is predicated on our expansionist monetary policy; otherwise people would simply never invest and we'd have a society comparable to a feudal state.
There is only so much land (ish), food, and materials to go around.
I say "ish" for the first one because hopefully humanity will grow beyond Earth in the near future.
Generally, all of the above are false unless you are talking about a population which never stops growing, and in that case that would be the problem. Not wealth inequality.
There is also something to be said for the personal choices of the non-wealthy. I see many in my lower-middle class city with new luxury BMW or Benz ($30-50k) vehicles who are living in $120k condo units. I see many people with shiny, massively spec'd-out trucks that they never use to haul anything and just park at their office jobs. What would the country look like if average people saved more and invested some of what they now splurge on lifestyle?
So, while the tax structure and skill gaps are big, we can also say consumers need to be much smarter and more modest if they want to build wealth and improve their station.
True. But this is an interesting point. It becomes a "fool's game," like a casino, where everyone knows the odds are stacked against them, so when they lose, they can internalize the fault as their own. Yet there's an abundance of "shiny" going around, so people's choices become emotional and hope-based instead of rational.
Therefore there's no incentive to change the system because it's so easy to blame those who lose out: "see they made poor choices"—nevermind how we bombard those people with a ton of fine-tuned marketing. Taking the massively spec'd truck as an example, consider what the car manufacturer is selling: the feeling of power. Now watch a popular game on TV and get that message drilled into you about 30 times each week, while living a life that is otherwise very constrained in terms of finances and opportunities... many give in.
How the hell do you know the financial situation of strangers? What economic data do you have to back up this assertion that lots of people would be better off if they weren't foolishly spending their money? Or are you the local "leading authority on what shouldn't be in poor people's grocery carts"?
Wealth is not a zero sum game.
A) The upper class is extremely rich and the rest are desperately poor, or:
B) The masses have enough money to live comfortably and the upper class has 9x of that growth, because that's how percentages work -- thus world B has to have many times more total wealth than world A. It works only if that much wealth is feasible given that world's industry.
But what about when inequality keeps growing as projected, and the 0.1% have 90% of the wealth? Then the 0.01%? For the rest of the people to have a similar quality of life, the total number of assets in the world would have to grow by 10,000x. Otherwise, some people have to get poorer for a perpetually smaller percentage of people to have 90%.
Wealth isn't strictly a zero-sum game, but at a certain absurd point the math doesn't work. At that threshold, cold hard physics kicks in, and physics is as zero-sum as it gets -- nature balances her books mercilessly. If wealth inequality is at reasonable levels and there's a reasonable amount of growth, it's perfectly possible for the middle and lower classes to prosper while the wealthy classes obtain much more money than they do. But when the wealthy hold an extreme majority of all assets, keeping the rest of society at a tolerable standard of living would require raising the total assets in the world by something like ten thousand times. Obviously that wouldn't work: it assumes totally unrealistic rates of growth, which is a recurring issue for our civilization these days. Wealth does become a zero-sum game under specific conditions. Our civilization isn't a perpetual motion machine.
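The "ten thousand times" intuition can be made concrete. If a fraction f of people holds a share s of all wealth, the average member of that group is richer than the average member of everyone else by a factor of (s/f) / ((1-s)/(1-f)). A quick check of the 0.1%-holds-90% scenario from the comment above:

```python
def per_capita_gap(top_frac, top_share):
    """How many times richer the average member of the top group is
    than the average member of everyone else."""
    top_avg = top_share / top_frac          # top group's per-capita share
    rest_avg = (1 - top_share) / (1 - top_frac)  # everyone else's
    return top_avg / rest_avg

# Top 0.1% of people holding 90% of all wealth:
print(per_capita_gap(0.001, 0.90))  # ~8991x
```

The gap comes out near 9,000x, which is the rough order of the growth the comment argues would be needed to close it by expansion alone rather than redistribution.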
This isn't up for debate. The two things are, by definition, not the same. You may think they are related, but that burden is wonderwonder's (and perhaps yours) to prove, not mine. Two separate concepts are not correlated, causal, or interchangeable until they have been shown to be. Not the other way 'round.
I don't think everyone else is stupid. But people that conflate those two things are either missing something or doing it deliberately (knowing full well they are wrong). Telling someone this is not arrogance, and my comment was perfectly coherent to a great many people if upvotes are any indication. If you want an example of incoherence, look no further than the comment which talked about "driving past food banks" as if that was some sort of "gotcha" about wealth inequality. I think it was lost on wonderwonder that food banks themselves are a great example of wealth not being a zero sum game.
You really should work on your debating style. Phrases like "this is not up for debate" are just bullying to stop a debate you don't want to have. If you are the same at work I feel for the people who have to put up with this.
Now, as to your comment (which was never the topic of discussion in this part of the thread): if the entire economy grows at 2%, and 0.1% of people’s wealth grows at 10%, the growth of the entire economy can absolutely account for that subset’s gain in wealth. It depends entirely on how big those respective groups are, not purely on “this % is larger than that %”. The money is not necessarily coming from someone else. The growth of the economy is not a fixed rate — if more people produce more good work, the economy will grow more. You also might be forgetting globalism and the fact that the "1%"’s wealth is not necessarily coming from
One Place™. For example, Jeff Bezos is wealthy because of AMZN. AMZN operates in many countries all over the world. Therefore, saying “the US is growing at X rate, so Jeff Bezos’ wealth growth rate is too high!” is misguided at best. This is true for many of the "1%" -- they are creating value all over the world, not just in their home country, so it follows that the growth of their wealth is not tied to the growth rate of their home country.
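That decomposition can be written down: overall growth is the wealth-share-weighted average of each group's growth, so a small subset growing fast is arithmetically consistent with positive growth for everyone else. A sketch (the 10% wealth share is a made-up figure for illustration; the 2% and 10% rates come from the comment above):

```python
def rest_growth(subset_share, subset_growth, overall_growth):
    """Implied growth rate of everyone else's wealth, given the
    subset's share of total wealth and its growth rate:
    overall = share * subset_growth + (1 - share) * rest_growth."""
    return (overall_growth - subset_share * subset_growth) / (1 - subset_share)

# Suppose the fast-growing group holds 10% of total wealth and grows 10%/yr,
# while the whole economy grows 2%/yr:
print(rest_growth(0.10, 0.10, 0.02))  # everyone else still grows ~1.1%/yr
```

The arithmetic can also flip: if the subset's wealth share is large enough relative to its growth rate, everyone else's implied growth goes to zero or negative, so "it depends on how big those groups are" is exactly right.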
I hope making calculations is not part of your job, because if so I feel for the people that are forced to rely on them.
Good luck with your wealth redistribution campaign. I'm sure it'll work out great. Who knows, you might get a life of abundance you did nothing to deserve!
Or, at least, rivalrous, in that for anyone to gain utility from wealth compared to an alternative scenario, someone else must lose; it's not at all clear that utility can be meaningfully aggregated across individuals.
That comment reads like someone fed a thesaurus of non-words and a copy of the Communist Manifesto into a Markov chain.
There is absolutely no prospect whatsoever of Social Security collapsing. A 0% chance. There is no factual basis, only flat-earth anti-vaxxer type scare mongering, for any claim to the contrary. Social Security is funded by payroll taxes. The only way Social Security could "collapse" is if people stop getting paid.
There are fewer and fewer workers holding jobs and paying taxes, while there are more and more benefits to pay for. This applies especially to the elderly and pensions in countries with socialized pension systems.
Imagine a situation where both parents are retiring, having been promised 80% of their salaries, while both of their children can't find jobs. It's all too common nowadays. There is no tax to collect.
This may not be too much of a problem in the US, where pension, healthcare, and unemployment benefits are limited. Europe will hurt, though.