Hacker News
Big Cities No Longer Deliver for Low-Skill Workers (bloomberg.com)
336 points by pseudolus 28 days ago | 406 comments



"middle-class life to those without a college degree"

Anytime you start delving into what has changed for a subset of people over decades, the most important piece of the puzzle is deciding who qualifies for the subset.

I mean concepts like "middle class" or "without a college degree."

A lot more people get degrees in 2018. Does the graduates/nongraduates segment still represent a comparable group of people?

The human capital theory that recommended this explosion in tertiary education assumed, and still assumes, that it does. College makes you more productive. Make everyone go, get productive, and earn more.

It turned out, college was partly a proxy for class. What college grad earnings in the 60s were actually saying is that upper-middle-class kids grow into upper-middle-class adults.

...at least partially.

I certainly agree that a lot of cities are too expensive for the average person. But, I don't think that you can take education in 1960 and 2018 and treat the populations as similar.

I'm also dubious about college as a proxy for skill, in the skilled labour sense. A lot of college is very general education, with even less focus on marketable skills than high school.


The American term "middle-class" means something far less specific than the original British meaning of "well to do, town dwelling, non-aristocratic business owner". In that definition, a "middle-class" person is never an employee (which would make someone "working-class").

The American use of the term reflects (whether or not it is borne out in reality) the achievement of a comfortable and secure lifestyle (relative to the current time period) by an employee. In a way, it's a more flexible definition than the British one. The post-WWII economic boom was the first time that, en masse, people who were not inheritors of significant educational or monetary capital or status could achieve that lifestyle.

However, the problem with the American definition is that it chooses to ignore how much that achievement actually has to do with class and other inertial societal constructs, perhaps because frank discussion of class effects is still considered taboo in American society, whereas older societies with openly acknowledged class structures can actually have that discussion.


The most concise modern class definition I have seen goes something like this:

Working class: work to survive. Middle class: work to do other things. Upper middle class: work for wealth. Upper class: work optional.

People rightfully like to talk about the working poor and the disappearing middle class, but I think we are actually already past that. What we are seeing now is that between education, housing and financial insecurity it is hard to be even upper middle class anymore without existing wealth.


It's not even just that it's hard to be upper middle class (build wealth), but that people aren't even being taught to build wealth. Learning to build wealth isn't taught in schools. It's passed down from parents to children, and it was far more common to find families where the parents taught these skills when there were lots more mom&pop businesses and homesteads where they built their own wealth and taught the skills to their progeny. The rise of the company man and the firm all but eliminated the need to teach about wealth building. People outsourced thinking about wealth building to pension plans. Now that pension plans have failed, we have 2-3 generations that have lost a lot of the teachings of how to build wealth.


So can you give more detail about this strategy to build wealth? Today the strategy might be get a good education, marry someone also well educated, get good jobs in lifetime possible careers (uh, doctors or programmers), buy a house of lasting and increasing value, buy stocks in low-cost mutual funds, don't waste money. This worked for me but this world was barely beginning in 1950, even though it was probably common in the 1960s. It's only available to a few fortunate fields today.


This is something I think a lot about. If you have a pension plan, you have zero incentive not to spend all your money. You have guaranteed income to maintain your standard of living. Having money for a rainy day becomes someone else's problem, just like you said.

I guess I don't have anything to add, just struck a nerve because that is exactly how I feel, so I wanted to say something.


I would also add that if you had a pension and pensions are a cultural norm, you not only have an incentive to spend everything, you also have no incentive to teach your children to save and invest in anything except maybe education (so they can get a job with a good pension), since you expect pensions to still be a thing when they hit retirement age. Pensions were a form of corporate welfare that essentially wiped out the cultural norm/practice of investing in your own future and made people dependent on the firm. No one foresaw a time when pensions would end up being a massive corporate expense that would lead bigger companies with them to fall behind nimbler upstarts without them.


"What we are seeing now is that between education, housing and financial insecurity it is hard to be even upper middle class anymore without existing wealth."

I'm not sure that's really true - or at least, it's an incomplete model.

Rather, what's happening with education, health care, & housing is that you have three sectors with flat productivity growth and large barriers to entry, which implies that the number of people they can serve remains constant. The population is growing. When you can provide a service to X people but there are Y > X people who need that service, Y - X people will necessarily end up going without. The financial insecurity is because of the mad scramble to avoid being part of the Y - X who get left out in this game of musical chairs.
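A minimal sketch of this musical-chairs arithmetic (the numbers are made up for illustration, not the commenter's): a flat-productivity sector serves a fixed X people while the population Y compounds, so the shortfall Y - X grows every year.

```python
def left_out(capacity: int, population: int) -> int:
    """People left unserved when a fixed-capacity sector meets a growing population."""
    return max(0, population - capacity)

capacity = 100  # X: the sector's flat, capped output
for year in (0, 10, 20, 30):
    population = round(100 * 1.01 ** year)  # Y: assume ~1% annual growth
    print(f"year {year:2d}: population {population}, left out {left_out(capacity, population)}")
```

Even 1% growth against flat capacity leaves over a quarter of the population unserved after 30 years, which is the scramble the parent comment describes.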

The reason the middle class (using danan's definition of "employees with a comfortable and secure lifestyle") expanded so much after WW2 was because we had a number of technological breakthroughs that dramatically increased the productivity of these sectors. Antibiotics and vaccines meant that illnesses which previously would kill or require long medical care could be prevented by a simple shot. The green revolution increased the yields on crops by 10x. The interstate highway system and automobile opened up vast suburban areas for development, and developers (like our current president's father) became wealthy applying mass-production techniques to building housing. The number of skilled scientists in the U.S. increased dramatically during WW2 as they left turmoil in the rest of the world, which could support larger college populations.

We still see those technological breakthroughs in other areas like computers & media, but they don't happen anymore in fundamental areas of life like health care & education. Some of this is probably inherent in the problem domain (with all the easy diseases cured, we now die of heart disease, cancer, suicide, overdoses, etc, most of which have proven stubbornly resistant to cures) while some is because of artificial restrictions (like housing zoning laws).


In higher ed, at least, we have an abundance of people with PhDs that could theoretically teach at a college level - which is why a single tenure-track job will have 300 applicants. Not saying that supply and demand aren't involved here, but a simple application of supply and demand like you suggest isn't an adequate description.


Don't many of them actually teach (for bargain-basement wages)? At least, most of my friends who went to grad school had to teach undergrads at some point.

There are probably some artificial barriers to entry for education as well - it's hard to explain how Ph.Ds have no job prospects at the same time that college tuition keeps on rising. And I suspect that has to do with college's role as a signaling device for job prospects. If a bunch of Ph.Ds wanted to open a new college where they personally tutor a bunch of incoming college freshmen for much less than college costs, those kids would probably get a better education than any of their college-bound peers, but they also wouldn't have a degree from any credible university or much in the way of job prospects.

...come to think of it, I wonder if colleges know that their primary value is as a signaling mechanism, and intentionally restrict the size of the incoming class to preserve that signal. In other words, they're aware that they don't actually add any educational value, they just ensure that they only admit students who would've succeeded anyway and then point to all their successful graduates.


> ...come to think of it, I wonder if colleges know that their primary value is as a signaling mechanism, and intentionally restrict the size of the incoming class to preserve that signal. In other words, they're aware that they don't actually add any educational value, they just ensure that they only admit students who would've succeeded anyway and then point to all their successful graduates.

I think what you're saying is that they don't add any additional educational value beyond what your hypothetical band of PhDs could. They do provide educational value and also the signalling, though perhaps they don't acknowledge the latter so much.

But at least with pre-university education, as far I've seen, the process you describe is what most private preschools, primary and secondary schools engage in: They can essentially select their student population for the children who are most prepared, either by means (== family support system) or by their natural abilities.

Even those private schools that have scholarship programs for underprivileged students only accept students who are exceptional and don't have serious social functioning issues.

These schools then point to the achievement of their students as an indication of their superior educational process (which in some cases might be true, due to just having more money to provide better educational services).

Public schools, by contrast, must accept students of all abilities, and thus look worse on paper, when they are often dealing with a much more diverse, and often more challenging set of student backgrounds.


Teaching is not the expensive part of University education.

Housing (which you have to procure from the University for at least two years), amenities, and administration are.

You can provide a world-class education by having professors use a whiteboard to lecture to students in cargo containers, or prefabs, for a fraction of the price of a modern education. But then you don't get the 'college experience' of paying tens of thousands of dollars a year for living in a hotel, surrounded by underage booze, drugs, and parties.


So room and board often gets separated out from tuition and, while significant, is not the larger part of what a student pays at a "four year campus" college or university. One can also typically map room and board to cost of living in the area and find that you are merely getting a bad deal in most cases, not an astronomically bad deal.

A lot of the costs go into intangibles, like my college had some pricey guest lecturer programs. It is true though that university overhead has been growing quite fast lately and a lot of money in the last decade has shifted to things that are decidedly unrelated to education (I mean, LSU has their own lazy river).


No signalling required here. A degree from a non-accredited institution might as well be toilet paper.


For a modern definition, I like:

Working class: One who makes money by collecting a wage.

Upper class: One who makes money through investments or similar business dealings.

Middle class: One who partially makes money by collecting a wage and partially through investing.


I don't believe that "makes money partially through investing" is true for a significant part of the population, given that 60+% of Americans don't have $500 in savings. Though perhaps that is the point.


The internet suggests that the median American household has closer to $5,000 in a savings account. Regardless, while you may see some paltry returns, a savings account is not what most would consider a source of meaningful investment income. The primary purpose of a savings account is to keep a modest sum of liquid cash available for emergencies, not to hold most of one's assets.

Wealth figures are more difficult to come by, but the internet suggests the median household has a net worth of around $200,000. However, I would venture to guess that a large portion of that is tied up in a primary household which does not normally generate any income. It is not clear how much a typical person does make in investment income.
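To make the "tied up in a primary household" point concrete, here is a rough sketch; every figure below is an assumption of mine for illustration, not data from the thread:

```python
# All numbers are illustrative assumptions, not sourced figures.
net_worth = 200_000      # ballpark median household net worth mentioned above
home_equity = 150_000    # assume most of it sits in the primary home
investable = net_worth - home_equity
annual_yield = 0.04      # a generic 4% annual return assumption
wage_income = 60_000     # assumed household wage income

investment_income = investable * annual_yield
share = investment_income / (investment_income + wage_income)
print(f"investment income ${investment_income:,.0f}, {share:.1%} of total income")
```

Under these assumptions, investment income is a rounding error next to wages, which is consistent with the claim that most households are, functionally, working class.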

If you are right that 60% of the population do not have investment income, that seems reasonable enough. I think it is fair to say that most people are working class. I am sure that we can agree that, under modern usage, the middle class is meant to represent something that is not easily attainable, but more attainable than upper class.


In addition to most "middle class" wealth being tied up in home equity, I would wager that the vast majority of "investments" is tied up in retirement funds. Very few middle class people have regular income from investments. Outside of Silicon Valley where RSUs and ESPPs are common, few people can rely on stocks or other equities to generate any usable income outside of retirement.


The upper middle class is actually growing.


Is it growing as a percentage of total population or only in absolute terms?


You’ll also find that Americans regularly conflate social class with economic class, because talking about the former makes us uncomfortable.

I believe that the push to educate all young Americans was a push to expand the power of the middle social classes at the expense of the typically uneducated lower classes. Success in this endeavor is measured by the proliferation of middle class norms and tastes, not necessarily the expansion of middle class economic prosperity.

This is also why so many Americans talk about being “middle class”, even if they fall well outside the middle of the economic range, because they’re talking about the middle social class(es), not the middle economic class.


> You’ll also find that Americans regularly conflate social class with economic class, because talking about the former makes us uncomfortable.

Also because money is the only god in this country.


That’s true, but our nominal classlessness was a big point of pride and distinction from old Europe and its explicit class system.


well put

it also helps us to conveniently ignore economic sovereignty when contemplating 'success' or 'prosperity'

which of course is a useful frame given the bulk of the post war american middle class was created by essentially trading economic sovereignty in the form of small farms for corporatism

not sure how it could have developed otherwise given the rise of factory farms, etc, but this direction was certainly influenced by large players as well..


In my experience, in Britain middle class means driving an Audi and eating hummus (i.e. a lifestyle), nothing to do with owning a business


You make very good points above.

I really enjoyed college, especially my liberal arts classes.

That being said, with AWS Certs going for between $75-$300 how long do colleges think they can survive?

If you were a 22 yr old who studied from zero-level computing knowledge for 12 months and got the AWS Solutions Architect Pro cert (somewhat possible assuming 2 hours/day), I can already guarantee you'd receive a written offer for $100k plus within days from my current company (in Texas, so that's quite a bit of money).

Again don’t think colleges are going to compete well against that.

Last thing: here in Texas, high schools can now often issue associate degrees. Also, the whole model of high school has shifted from test-focused to job-placement focused. Starting this year you can get an associate's in cybersecurity out of high school...


==I really enjoyed college, especially my liberal arts classes.==

I think this highlights the true question. Do colleges exist to promote education and intellectual curiosity or to train workers?

According to PEW, Americans believe colleges are for work training:

"Americans are split on the main purpose of college, with 47% saying it is to teach work-related skills and 39% saying it is to help a student grow personally and intellectually."

http://www.pewresearch.org/fact-tank/2011/06/02/purpose-of-c...


> Do colleges exist to promote education and intellectual curiosity or to train workers?

When college cost in the US what it did in the 1960s, this sort of question (part false dichotomy, part navel gazing) was at least fairly harmless. Now it's just fucking ridiculous even for most public schools.


The state school system is the interesting issue. California has the CSU system, which offers education for about $280 a unit, or about $3,000 before financial aid, with over a dozen campuses throughout most of the state in both rural and urban areas. The Cal States have become the primary option for any California resident below upper income. Most states don't have a similar system set up, with at most 1 or 2 major state schools per state. The cheaper state schools are often key to bringing the price down.


This. For what it costs in 2019, college had better be opening a door to a lucrative career path.


Or it is a luxury good for those who can afford it for their children, like in the old days.


The American college question is always a jobs question dressed up.

There are no jobs for people.

Therefore people try to get college degrees, even if they aren’t suited for it or going to be happy with it.

This drives up the demand for college, and specifically for degrees that lead to jobs.

This degrades the image and purpose of college (students aren’t here to expand their mind. They’re here so that recruiters won’t ding them.)

All of which leads invariably to downstream issues, like credential inflation and growing student debt.

Eventually, college = jobs, and one of the greatest achievements of America, its liberal arts education, will tarnish and crumble.


To be clearer, the issue is the hollowing out of the job ecosystem in America.

Economic value now comes from fewer roles, which require specific skill sets to perform.

With more lopsided reward structures (few people corner most of the gains), there is increasing pressure for people to move into the few places where rewards collect.

This ties into issues like "History majors can't get a job that will help them hold down a house or family".

The societal cost of it is that society becomes a monoculture of a few degrees and sources of information.

The greater advantages of a modern first world society in terms of societal achievement, art, culture and so on weaken.


Wasn't "education and intellectual curiosity" a way to increase people's productivity? (Not even a way; wasn't it supposed to be the best way?) Because that's the reason colleges are funded by the government.

There's something very broken with our modern society.


I always thought that college was not to increase your typical office worker's productivity by X% but instead to be a long term bet on massive gains for society through academic research.


If that's true, we should make colleges more intellectually elitist and put most of the funding in places that provide that better worker productivity, or simply cut funding if there is no better option.

We also should make it easier for those academics to bring those massive gains, because academia isn't organized for that.

I still think this is not the real goal, but maybe it was at some time. (Is there more than one goal? The system appears to not be optimized for any reasonable goal.)


That's probably not the (only) way to do it. There are two interrelated but distinct types of breakthroughs that happen in research: the kind that occurs well within expectation, the next step in a well-understood plan; and the kind that is serendipitous, usually by way of uncommon or novel interpersonal or intellectual communication. For the latter, you need... diversity. Expertise, too, of course, and some amount of common ground. However, the most interesting and creativity-enabling interactions require people who aren't all aiming at the same thing before they initially interact.

On top of that, it can be difficult to separate the people who are actually intellectual elites from the people who are good self-marketers.

I kind of think we should just be committing a lot more funding to those interested, who meet some reasonable, objective level of expertise.


It'd be hard to justify the costs - both actual college costs and opportunity costs - if it were purely for intellectual curiosity.


What if increased intellectual curiosity has a high enough ROI over someone's lifetime to cover the costs? What if the higher education allows them to enjoy life more and thus increase quality of life over the long-run?

I'm not saying either of these are the case, but it is worth exploring before we outright dismiss.


Especially since intellectual curiosity can be easily satisfied in other ways these days.


This is because the college majors people take break down roughly in half: one half generally viewed as practical or career focused, such as STEM, business, education, agriculture, and communications; the other half more enriching fields, such as psychology, visual and performing arts, humanities, and other social sciences. (https://nces.ed.gov/programs/digest/d17/tables/dt17_322.10.a...)

The other point I think worth adding isn't that many of these humanities and arts fields are unemployable, but largely that they are some of the most popular majors and in turn their economic value is diminished as such. If there were twice or three times as many graduates per year in Computer Science, I'd expect the perceived economic value of a CS degree to decline as well.


> Do colleges exist to promote education and intellectual curiosity or to train workers?

I don't really think the teaching part at most universities is oriented toward training workers. If it were, the administration would be worried about what skills are in demand by employers...probably training students on particular software, computer languages, jargon, and techniques that are common. There would be more emphasis on getting various certifications while still in undergrad. I can't speak to promoting "intellectual curiosity" because it seems hard to define, and if you did I bet it would depend mainly on the nature of the student...probably his/her genes, peer group, and childhood role models...not on the actions of someone the student sees three hours per week during adulthood.

As it is, a professor can (very often, though not always) be totally incomprehensible and irrelevant to work, as long as they are not offensive, without any repercussions, nor even any gentle suggestion that they explain things clearly. Humans tend to craft stories favorable to themselves, so even if the student evaluations are negative, the professor can just blame, e.g., short attention spans in the era of marijuana and Fortnite...or the professor just might not look at the student evaluations if they don't want to. Over time, some professors even develop a sort of perverse joy at confusing students...their confusion being the sign that the professor is very smart and that he is "challenging" them, when in all likelihood he is just asking them things he never explained clearly in the first place. In my undergrad, an EE professor bragged that the class average was 23%; he also used his TA as a translator because he could not speak English.

It is expected most everywhere that students will learn work skills on the job and in internships...not at school. Professors at research universities spend all their time thinking about hard frontiers of a topic, and that frontier might have little to do with what you do on the job.


The higher the cost, the more college has to directly justify itself in terms of better prospective earnings.


Does it matter in practice? College is ridiculously expensive, forcing many (most?) people to go into massive debt in order to get a degree. If you come out with a master's in art history you're going to be feeling the economic pain for quite some time, and that's not good for society. Until costs come down, college needs to be about job training (which is a shame; I think many aspects of the system need to change.)


Well, the plebes can't afford to take classes to further their academic horizons. When a class at a community college (read: too poor to go to a proper uni) is $900, with 30 students and no equipment needed, that class had better have strong income assurances. Else it's a waste of money and a further burdening of student loans.

And taking classes also means giving up time wise something else. Opportunity cost is also a thing.

So yes, liberal arts are the higher forms of education, which are also further from the rest of us who need a credential because of credentialism so we can merely live. (Food, water, electricity, housing, medical, internet are effectively all required to live these days. )


What community college is charging $900 for a single class? That really is highway robbery. I just looked up costs at the community college where I took classes, and it's $76 per credit hour for in-state tuition.

Also, your characterization of community colleges as only being for those too poor to attend a 4 year school is totally out of touch and frankly a little stigmatizing toward an important piece of public education.

I know many people who graduated with 4 year degrees after getting an AS at community college, which requires the exact same core liberal arts + math and science course requirements that state schools require. These people didn't do it because they were poor; they went because their high school GPA didn't let them get admitted to the schools they wanted right away. The fact that they got to save a ton of money, pick schedules conducive to part time work, and get a real credential after 2 years (I know many employers who prefer an AS degree holder to someone who dropped out of a 4 year degree 2 or 3 years in) is just icing on the cake. And it didn't hurt long term prospects for career or education either; one of them even has a graduate degree from a school that is extremely respected in the field he studied.

Community colleges are awesome things, both for people studying specific trades and careers and for eventual 4 year school attendees. The more we can destigmatize them, invest in them, and keep them affordable and accessible to working people - the more we'll all share in the benefits.


My wife and kids are all taking one class at the local community college. Tuition was nearly $400/student, plus a $200 book and a mandatory $200 online program. You can find the book used, and we economized by only buying 2 books for 4 people.


You can usually google the book and find a pdf on the first page of results. The online codes some of the professors require are highway robbery.


WaMAP and similar open source offerings are finally starting to eat into this market, and they work significantly better than Pearson's MyLab & ilk.


I wholeheartedly agree. I went to a community college for 1 year before transferring to a top regional technical university. The upshot was that I obtained a $14k/year scholarship for transfer students just for getting a 3.7 GPA. I would not have been accepted right out of high school, and I had the privilege of avoiding dorms.


"That being said, with AWS Certs going for between $75-$300 how long do colleges think they can survive?"

Colleges shouldn't be in the business of training people for jobs - that is what tech schools are for. If you just want to get a job, go to a tech school, don't go to college. College is (or should be) about expanding your mind, your horizons. It should be about exposing you to new ideas and thoughts you hadn't been exposed to before. There is a reason why University and Universe have the same root.


Colleges sell themselves as providing remunerative employment training when it suits them, and hide behind this excuse when kids (esp. Humanities majors) can't garner gainful employment. "What, you thought you were paying $200k to increase your earning potential?! Crazy kids---this is all about personal development. Life is about more than money!" (Try not paying your student loans back, and you can see exactly how little they care about money.)

Even if we pretend that the majority of people attend college not for the purpose of increasing their value to employers and getting out of their parents' house, but for personal edification, colleges don't even accomplish this ostensible goal! Most students are more ignorant in most subject matter than the day they step foot on the campus - https://www.nysun.com/new-york/students-know-less-after-4-co... ; https://www.theatlantic.com/magazine/archive/2018/01/whats-c.... Often they're propagandized for four years, saddled with debt, and told that they got an "education", so it's nobody's fault but their own if they can't find a decent job.

Three cheers for the AWS certs and modern-day educational options and good riddance to the corrupt university system (propped up by taxpayer-subsidized monies, of course).


the colleges are guilty of a motte and bailey argument

the motte being the liberal arts/holistic focused learning (only a philistine would be against that) and the bailey being the salesmanship of economically useful learning.


Letter from John Adams to Abigail Adams, 1780-05-12:

> I must study Politicks and War that my sons may have liberty to study Mathematicks and Philosophy. My sons ought to study Mathematicks and Philosophy, Geography, natural History, Naval Architecture, navigation, Commerce and Agriculture, in order to give their Children a right to study Painting, Poetry, Musick, Architecture, Statuary, Tapestry and Porcelaine.


And the great-grandchildren don't get to study because they have to get a job right away. After a few generations they finally save up enough that some kid can study "Mathematicks and Philosophy, Geography, natural History, Naval Architecture, navigation, Commerce and Agriculture" and the cycle repeats.

Nothing wrong with the arts, but the fact is very few can support a reasonable lifestyle on it. By burning the inheritance you can produce art but nothing remains for your children. Study art by all means, but make sure you have something to support yourself on without burning up the hard work of your ancestors first.


What proportion of Americans now have the ability to study topics with less direct monetary connection, as compared to those of Adams' day?

I imagine the answer to that question would be interesting and nuanced.


There were only about a dozen colleges at the time, and women couldn't attend. 90% of colonists were farmers. I don't see where there can be much nuance. Americans today have vastly more ability to study a bunch of nonsense.


So narrow the lens a bit - compare only those with the ability to attend university.


This.

If we're doing our job as a generation we are moving the focus of human education from the gross to the subtle.


> If you just want to get a job, go to a tech school, don't go to college. College is (or should be) about expanding your mind, your horizons

This is patently false. Colleges compete and advertise based on their job placement rankings. Just a few years ago, law schools were sued for misrepresenting their employment statistics.

Whatever ideal, romanticized version of university and college experiences are touted, the reality is that they are marketed to parents and kids as the way to a better lifestyle.

Sure, "Liberal Arts" are largely about enriching the mind, but that doesn't change the fact that very, very few people can afford the luxury of going to get a 4 year degree just to "expand your horizons".


I fully agree that you can't ignore the broader education that college provides, even when that diverges from marketability.

But then you have to be clear-eyed about what we're talking about here: from the perspective of a college student, is it worth paying $100-200k[1] to become more worldly, with some auxiliary benefits to their income potential? I would probably still send my (future) kids to such a place, but then I'm in a pretty high income/wealth percentile. I suspect that many people (including my family when I was a kid), looking at this clear-headedly, would rationally decline this deal. If you think the pricing of college reflects the "broaden your horizons" aspect instead of the perception that it's "protection money for a middle-class career", you're sorely mistaken. Which means there are and can be far, far more cost-effective ways to broaden your horizons, particularly in the era of Internet communities, cheap travel, and all of humanity's knowledge at your fingertips at all times.

[1] I'm aware that there are ways to get through college more cheaply, like starting in community college. But a system in which the majority of people go through community colleges first is a very different system, so the thrust of GP's point still stands.


Today it’s hard to make an economic case for a well-rounded liberal arts education — but I’d bet that in ten years, computers will be pretty good at everything except the stuff you get from a good liberal arts education.


Ten years? Do you really think software development as we know it will be gone in ten years? I think it’s important on hacker news that we have perspective on our place in society and history and that we don’t think of ourselves (speaking for software engineers mostly here) as gods or something, but it seems very extreme to suggest that what we do will be gone or severely reduced in ten years.


The timing is a guess. But I do believe that we’re not far away from a point at which computers are much better at generating a CRUD app, or finding the most efficient algorithm, etc, but are still far from crossing the uncanny valley of making a truly moving piece of music, or work of art.

I don’t think programmers will be obsolete; but I think the skill set will shift away from the analytical and mathematical traits that set programmers apart today, towards the fuzzier knowledge of the humanities. (Though my personal belief is that liberal arts educations must include science and mathematics as a core pillar.)

Specialization in algorithmic and mathematical thinking will still be of extremely high value, but the level of achievement required to be successful in that area will likely be crushingly competitive.

But, this is just a fun guess — let’s check back in ten years so we can see how utterly off base I was :)


> The timing is a guess. But I do believe that we’re not far away from a point at which computers are much better at generating a CRUD app,

People thought the same thing in the 80s....


In regards to art and music, it is more about artificial scarcity and popularity. The few artists at the top make a lot of money. As for the rest - the term "starving artist" describes them.

Also, given that computers can test stuff on billions of people, I would say there is a good chance that you could have popular pieces of music and art that are produced largely by algorithms in the very near future.


Actually, it's becoming increasingly less difficult to make a living in art by finding a niche, because the niche that any given person can access has grown.


Colleges shouldn't be in the business of training people for jobs - that is what tech schools are for. If you just want to get a job, go to a tech school, don't go to college. College is (or should be) about expanding your mind, your horizons.

I keep seeing this on HN. Only the privileged can afford to go to college to “expand their minds”. The middle class and even the upper middle class are not spending tens of thousands of dollars a year to go to college to “expand their minds”. They are doing it so they can get a job.

Where are all of these students with “expanded minds” going to get a job? I specifically told both of my sons that I would not support them in getting a degree that didn’t have an outlook for a decent paying job. I specifically had them look at the starting salaries of graduates, the placement rates, and the five year salary averages for the school and the degree they choose.


>Colleges shouldn't be in the business of training people for jobs - that is what tech schools are for.

Whatever it "should" be for doesn't really matter. The majority of people (in America) operate under the assumption that going to College helps you get a job.


What is it with this attitude that somehow learning new things is only honorable when those things are useless in the job market? Do you really think that anybody is taking up 4 years of their life and 200 thousand of their (probably borrowed) dollars just for the purpose of "expanding their horizons"?


Who is paying 200,000 for college? That's crazy. I have 2 kids attending a private college and each one pays $2000 a semester. They both have academic scholarships - so their tuition is covered - but even if it wasn't, that is less than $20,000 over 4 years. You can get a good education without attending an ivy league university.


It's terrific that you've found a private school that would cost $20K total for four years but everything that I've read indicates that such a school is a tremendous outlier. The average tuition for a public school last school year was $5500 for in-state students and $12K for out-of-state students; private schools tended to be substantially higher.

https://www.collegetuitioncompare.com/statistics/

Anecdotally, the college I went to, New College of Florida, is a public liberal arts school whose costs aren't too far out of alignment with other state schools in Florida. According to their fees page, in-state tuition fees, room and board would total $16K a year; for out-of-state students, that would be $39K a year.

https://www.ncf.edu/admissions/tuition-and-fees/

Actually, I'd like to gently question your figures: you're indicating your kids are paying $2000 a semester -- so that would be $4000 a year, or $16,000 over four years. And you're saying that doesn't include their tuition, so that's just room and board, right? (That's under half the cost of New College's, which again sounds like quite a deal.) Further, room, board and tuition at this private college would still be under $20,000 over four years? Unless I'm missing something, that would mean tuition at this private school is less than $500 a semester. I can't help but suspect that either one of us has our math wrong, or you have the best school deal in the history of ever.


I didn't say anything about room and board (their housing is about $1300 a semester). The $16,000 would be the tuition cost. I was saying that they have academic scholarships which is covering their tuition. But, if they didn't have those scholarships, the 4 years (8 semesters) of tuition would be less than $20,000. I know of at least 3 private colleges with similar tuition. I imagine there are lots more if I bothered to look.
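For what it's worth, the parent's arithmetic checks out once tuition and housing are kept separate. A trivial sanity check, using only the figures already given in the comments above:

```python
# Figures from the thread: $2000 tuition per semester, 8 semesters over 4 years.
tuition_per_semester = 2000
semesters = 8

total_tuition = tuition_per_semester * semesters
print(total_tuition)        # 16000 -> under the "$20,000 over 4 years" claimed
assert total_tuition < 20000
```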


> Who is paying 200,000 for college?

I too am very interested in the answer to this question. If you're spending that much and not going to a top 30 school, you're a chump. If you are going to a truly elite school, you're not paying anything close to the sticker price unless your parents are quite well off.

I'm not saying college isn't unreasonably expensive these days, but I often see this number for undergrad and I always wonder where the hell it came from.


I just googled the current tuition at the school I went to and roughly multiplied by 4. The exact number isn't my point, just that it's a ridiculous amount.


Costly signaling.


> There is a reason why University and Universe have the same root.

Where did you get this idea? The term "university" (as an Anglicization of "universitas") refers to its guild-like corporate organization.


At the same time, maybe colleges shouldn't charge that much for a major/profession that might not let the student pay off his/her student loans for decades?


[dead]


Sounds like you had a bad experience with college but I think you raised a couple of good points:

- college is in every way, dumped on teenagers as THE ONLY way forward

Absolutely true in my experience. High schools are entirely structured around two things: standardized test performance, and pushing kids into college. If you are not a college-bound kid, you essentially don't exist.

- No one is there to mature. No one is there to improve their concept of individuality. No one goes to college to learn for the sake of learning.

"No one" is too strong, but I think this is true for the majority. Most people in an undergraduate program (and probably also a majority in Master's programs) are there because they think it is the path to a good job. Because that is what has been drummed into them since elementary school. Learning, expanding horizons, etc. are secondary.

If most people at college were there simply for the sake of learning, enrollment would be a small fraction of what it is.


>No one goes to college to learn for the sake of learning.

I studied Ancient Near Eastern Studies and Historical Linguistics, neither of which is part of anyone's plan for getting rich. But I studied those things because I was interested in them, purely for the sake of learning without any hope of financial reward.

Also, I made the best friends of my life in college, many of whom, 20 years later, are still my best friends.

Overall, I feel like I matured a lot in college.

As for my own kids, I'm encouraging them to do what I think will be best for them and their future based on their skills and temperament. My two oldest I encouraged to go to college. But my third I am strongly encouraging to go to trade school. I encouraged the two oldest to study whatever made them happy in college and not to worry about studying to make money. One is studying non-profit management and the other is studying animal behavioral psychology (she loves working with dogs). Neither of those is a good career path, but they are things they enjoy learning.

You don't need to pay tens of thousands of dollars for college. My wife and I graduated without any student debt. My two oldest are about halfway through and so far they have no debt either (through a combination of working their way through college and academic scholarships).


You are claiming a lot of absolutes in this. I attended college to expand my mind, learn new things, and mature into an adult. I went to a large but reasonably priced state school and did everything you said no one does. I don't even use my degree, but going through taught me a lot of lessons I still use (working well with people I don't know, learning things quickly and applying them, making new friends). I, and I bet a lot of others, learned and grew a lot during university without the notion of finding a career.


In my view, there's some survivorship bias going on there. Somebody has to know that the cert is in demand, and that they have a good chance of getting it. We think it's easy because we're the ones for whom learning computer skills was in fact easy. But it's not easy for everybody.


>That being said, with AWS Certs going for between $75-$300 how long do colleges think they can survive?

Well, colleges used to be for the elites (either rich or very talented) to learn science and to do research.

It wasn't meant for vocational training for the general salaried worker. Now it's mostly that.


And if you just take the cert without the understanding of the underlying technology, or any practical skills you will be embarrassed when you interview. Certs by themselves are worth next to nothing.

I ask one simple architectural question that weeds out many of the paper tigers:

“We need to deploy a website that is fault tolerant, secure, can handle one AZ going down and scalable. How would you do it using AWS technologies?”

I’m saying this as someone who has 5 AWS certs and interviews AWS DevOps candidates. But I also have 20+ years of professional software development experience, a degree, and before that I was a hobbyist for 6 years.
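For readers unfamiliar with the interview question above, here is a minimal sketch of the classic answer (DNS in front of a load balancer spanning two AZs, an Auto Scaling group spanning the same AZs, and a multi-AZ database), expressed as plain data plus a sanity check rather than real AWS API calls. All names and numbers are illustrative, not the interviewer's actual rubric:

```python
# Hypothetical outline of a fault-tolerant, AZ-redundant AWS architecture.
ARCHITECTURE = {
    "dns": "Route 53",
    "load_balancer": {"type": "Application Load Balancer",
                      "azs": ["us-east-1a", "us-east-1b"]},
    "compute": {"type": "Auto Scaling group", "min": 2, "max": 10,
                "azs": ["us-east-1a", "us-east-1b"]},
    "database": {"type": "RDS", "multi_az": True},
    "security": ["HTTPS via ACM certificate", "security groups",
                 "private subnets for compute/db"],
}

def survives_az_failure(arch):
    """One AZ going down is survivable only if every tier spans >= 2 AZs
    and there are enough instances to lose a whole AZ's worth."""
    return (
        len(set(arch["load_balancer"]["azs"])) >= 2
        and len(set(arch["compute"]["azs"])) >= 2
        and arch["compute"]["min"] >= 2
        and arch["database"]["multi_az"]
    )
```

The check encodes exactly the reasoning the question is probing for: no single tier may depend on one availability zone.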


This


I really wouldn't compare learning a very specific technology with college. The idea with college level education is that you get the foundations of what you study and a learning method, so that you can adapt to changes (eg new technologies) faster and better.

Of course you can learn the foundations without college and not learn them with college, but on average it's much easier to do so with that.


Yes. Technology and business environments change. Students will only succeed if they become self-motivated, life-long learners who can identify the key problems/technology that their employers need solved/developed and rapidly build upon their skill-set to solve those problems and implement the solutions. Ideas/technology are like Legos - the point is to know how to assemble the pieces you have to build what you need.

When I started my career as a scientist in a corporate analytical lab 36 years ago, corporate funding for travel and continuing education was plentiful. When budgets got tight, that was the first to go. Those of us who were self-directed learners survived the downsizing and cost-cutting much longer than those who didn't.


> I really wouldn't compare learning a very specific technology with college. The idea with college level education is that you get the foundations of what you study and a learning method, so that you can adapt to changes (eg new technologies) faster and better.

That’s all well and good until they find themselves competing with people from other countries who are being trained to hit the ground running and are willing to work for less....


In my experience undergrad programs can fail hard in this too because of the limited time to cover the subject matter.

14 chapters of Digital Signal Processing in 12 weeks w00t!

For many of my engineering courses at a top tier public university it was in-one-ear-out-the-other.

Though, most of the stuff I learned at community college (pre-engineering) did stick.


Even if it does stick, unless you use it you will likely forget it... but I think it's still useful to have learned, since you hopefully retain enough to know things exist and can look them up.


Who on earth employs a "solutions architect" whose only qualification is some random certificate that tests AWS trivia?


Quite a few people. The returns are generally poor, which is why I chuckled at the post to which you replied; as a consultant I consistently found myself doing more work (and costing more) coming in after such credentialed-but-not-taught folks to fix the mess of "well, I know AWS, so all I'll use are AWS-recommended AWS solutions whether or not they actually fit what my project needs. AWS!". Which isn't, I stress, a knock on AWS--AWS is fine. Its managed services aren't appropriate for every situation, but the AWS certs that I've taken have never-not-once entertained the idea of not using a proper-nouned hosted service at any point. There are plenty of places where those services are fine, if sometimes overpriced even taking into account people costs...but there are plenty of places where they're not and no AWS cert I've seen has even nodded at the idea.

As it happens, I use stuff I learned in my distributed systems courses--in that oh-so-awful university, hiss--on a weekly basis, if not daily, to solve those problems. (I use stuff I learned in my microeconomics and my political science and my sociology classes daily, but you can't put that on a resume, so obviously it doesn't exist.)


Lots of people.

In the Salesforce consulting world the Certified Technical Architect (CTA) cert means you walk in the door at any of the big consulting firms at managing director or equivalent rank and that comes out to around $200k/year. The cert isn't that easy to get, it requires a dozen or so prerequisite certs and then an exam plus a board review but you could probably do it in a year and a half.

Also, i think over half of all CTAs are employed at Salesforce itself so you could always go work for them if consulting isn't your cup of tea (it's not for everyone that's for sure).


Granted, I know little to nothing about the IT industry, but is a "managing director" someone who has a high school diploma plus some technical skills?


The best manager I’ve ever had never went to college. I was extremely productive under this person because there was a high level of trust and ownership. This was at a company whose CEO also didn’t finish his undergrad at Yale. Prior to this company I worked as an engineer at Google and had some launch/promo-obsessed managers who lacked basic management skills.


So no different than a high-level Cisco, VMware, or Redhat cert.


More employers than you would think. The professional certs are legit. Also, 50% of some cloud native DevOps and infrastructure engineering roles are covered by the pro certs.


Covered pretty poorly. I've let my AWS certs lapse because they're not worth the time to keep them up. The hard parts of my job have very little to do with "these AWS services exist", which is the extent of what I've gotten out of AWS certs.

(Nobody cares about SWF.)


Yeah, a serverless certification would be great. These certs are more foundational in scope.


I don't see why it would be, to be frank. "Serverless" is a how. It's a tool, not a solution, and you should be expected to pick up new tools without waving around a piece of paper saying you know a tool.

Except for button-pusher roles I've always hired for whys, not hows, and certification programs seem to never care about the whys. College degrees don't assert that you know how to learn, that you know how to derive the whys--but, in my experience, they correlate better than cert-hunting does.


AWS certification exams test for basic and foundational AWS knowledge. They don’t try to do anything other than that. The “whys” are learned in the field, certified or not.


It goes both ways.

Who thinks they're ever gonna do any cloud service right employing some dude who only has worked with traditional servers?

You'd be surprised how many insanely expensive, dorked-up cloud setups out there could honestly have been done far more cheaply by some guy with a random AWS cert.

I'm not talking about a former barista turned AWS guy in a couple of weeks; some background is required for that, and for the random server guy of course. But the barriers are dropping.


Backwater IT orgs deep in the heart of Texas that have heard that they have to move everything into the cloud?


No, startups in Silicon Valley are starting to look at these certs because they realize how expensive poorly configured cloud infrastructure can be. It’s a skill that few people have at this point.


As a hiring manager for an infrastructure engineering team in a startup in the Silicon Valley, I have some opinion about the value of these certs.

I'd likely never hire an engineer with one of these certs and no relevant experience into a full-time position. I might offer an internship, but that's not a six-figure position.


I agree. Work experience always comes first. Anyone obtaining AWS certs without work experience should be a huge red flag.


Maybe not a "red flag" but I wouldn't be desperate to hire anyone with no experience, even in the Silicon Valley where competition for engineers is fierce. A headcount is a headcount as far as investors are concerned, so I'd rather have someone with more experience that also costs more.

Even still, I agree with the parent up above. A degree isn't strictly necessary, and if someone had the motivation, the talent, and the cert, I'd be willing to give them a chance as an intern.


The more I learn about Texas, the more it seems like they're doing everything right.


Hear hear. I found the “Draw Mohammed Contest” particularly amusing.

Edit: I see we have some touchy SJW-types in the house this evening :)


I downvoted you not to express disagreement, but because I think your comment detracts from this particular conversation. I suspect others did for the same reason.

However: Some of your other recent comments are spot on. The upvotes are on me. Try not to spend them all in one place.

ausjke 27 days ago [flagged]

Until Texas is overwhelmed by migrants from CA and Mexico that is, that joint-force is changing TX from red to purple, and to blue soon. At the moment, Austin is more liberal than most CA cities already.

When TX is taken the US will officially turn into a social/welfare state, the US is doomed already, similar to what happened to Rome in the past.

People from CA or Mexico are running from there for a reason, however when they arrive TX, their first to-do is to make TX look like where they used to live. They are virus.


I couldn't hear you over your dog whistling

codinger 27 days ago [flagged]

No kidding, wait until he finds out Texas was formerly Mexico and many of the brown people are descendants of those who became US citizens when Texas joined the US.

ausjke 27 days ago [flagged]

That's like saying most Americans should return to Europe because what they did here was so evil. Unfortunately, you cannot roll back history.


No, it's not like I said that at all. What point are you trying to make?


Are these certs actually worth anything? I usually see tech certs as a red flag. I couldn't imagine hiring a high school grad with an AWS cert and expecting anything close to someone with a CS degree.


I didn't know about these courses, and have been contemplating getting some certifications for a while. This looks like it might really help. Thanks for posting this!


Only a couple generations ago, college was not a proxy for "middle class".

Lots of blue-collar jobs (especially industrial ones) paid enough for a "middle class" lifestyle and financial security. Small businesses were competitive and many of their owners were not college educated either.

College wasn't a necessary qualification for most blue-collar jobs, including some for which it now is (such as police work, in many places), or even for many entry-level white-collar jobs - you could start out as a file clerk with a high school diploma and work your way up into middle or even upper management (at least if you were white and male).


The over-arching point I believe the parent comment is making is that the kinds of people who didn't get college degrees back then because they weren't needed are getting college degrees now, because they are needed as qualifications for jobs. They weren't incapable of it back then, it just wasn't necessary nor widespread. Now it is. So it's the same kinds of people doing police work then as now, with the same aptitudes, just those people are going to college now because they have to whereas they didn't then. (And in a way, increasing prevalence of higher education is a marker of success for a society.)

Unrelated, a huge point I'd like to see addressed more is how housing costs have outgrown inflation over the course of decades. We used to build enough housing, but NIMBYism and restrictive zoning has taken over recently. Housing is much more expensive in big cities in real terms than it was then, even though fewer people live in them in many cases now (e.g. the population of Manhattan reached its peak in the 1910 census and is down 30% since then). Real wages haven't decreased since the middle-class heydays of the 60s, and a lot of consumers goods have gotten much, much cheaper since then in real terms. The real headwinds in the face of middle-class prosperity are housing costs, healthcare costs, and (to end the digression) university costs, all of which have grown well above inflation.


Just a point on the population of Manhattan: be careful with those numbers and with applying them to the modern day. People used to cram into death-trap tenements, 10 to one windowless room in some cases.

Hundreds of thousands would live on a street that now houses a thousand people.

I don’t think that density is coming back or desirable.


You're exaggerating the population density. It definitely wasn't hundreds of thousands per street! Some figures here: https://urbanomnibus.net/2014/10/the-rise-and-fall-of-manhat... And it's worth pointing out that the peak population density in walk-up tenements was still well below what can be accomplished with tall modern day residential apartment buildings.

No one is saying the tenements were desirable, or wishes for them to come back. What we want is the construction of more dense residential housing that can meet or exceed that population density while providing good quality of life. Said construction is entirely possible, doable, and profitable, except that zoning prohibits it in most places.


SROs have also largely gone away. Not that many of them were very nice, but better than the street for a single person with limited means.



> Hundreds of thousands would live on a street that now houses a thousand people.

> I don’t think that density is coming back or desirable.

I think there's plenty of examples of dense cities (certainly far denser than most US cities) that aren't full of death-trap tenements and it's borderline intellectually dishonest to equate density with that.

In fact, letting NIMBYs have free rein to obstruct/delay/interdict housing construction is far more likely to cause overcrowded/unsafe living situations (which include homelessness) than the other way around.


Yes, in 1880 people were crammed into tenements. Today we live roughly as densely as we did in 1960 (maybe slightly more densely due to more high rises), but housing is far, far more expensive.


I think you phrase the grandparent's comment much better than he did. That is much more clear. Appreciated.

But on housing, there's a major issue you're not considering. This [1] is a list of US city populations in 1950. This [2] is the list for 1960. So we're looking at the sweet spot of the housing boom and a period where the population also dramatically grew increasing by about 30 million (~20%) in those 10 years. Now you might notice something funny. Nearly every major city shrank in size!

The housing boom did not involve building up (rather literally as is the desire of some people today) in desirable areas so everybody could affordably live there. Instead people sacrificed some comfort and moved outside of cities and started building houses in uninhabited areas outside of the cities. Houses were cheap because people were developing in areas where there was nothing and that nobody wanted before. In turn this movement away from city centers helped keep those prices within the city reasonable, as it had a depressing effect on demand.

The reason prices are so high today is because of simple market dynamics. People are less willing to live in less desirable areas. This is driving the prices in desirable areas into crazy land. There is still immense cheap housing available outside of these areas. Some cities are so hungry for new citizens that they're literally even giving away land on the condition that you put or build a house on it.

[1] - https://www.biggestuscities.com/1950

[2] - https://www.biggestuscities.com/1960


I’d also argue it has to do with the limited availability of previously undesirable areas in coastal regions that are still close enough to large commerce centers. Everyone wants to live on the coastline. It’s the reason why Amazon can’t just relocate to North Dakota and start a second Silicon Valley there.


A couple of generations ago the US had a military draft and wars to send low-skill working-class men to. That significantly changes the meaning of "start out as a file clerk with a high school degree and work your way up into middle or even upper management".


> you could start out as a file clerk with a high school degree and work your way up into middle or even upper management (at least if you were white and male).

That upwards mobility depends on one thing: that old people drop out of the workforce either by death or by pension, so that young people can rise up the ranks.

Now, given that people in their 80s need to work (!) to survive, this "generational contract" is broken. People are stuck with shit jobs well into their 30s, so how are they supposed to procreate or save for their own retirement?


This is the lump of labour fallacy. Rest assured that life will be harder for young people if a sizeable portion of the population is old, out of the labour force, and needs to be supported by a rising share of taxes from a small number of young workers.

We should hope that older workers stay in jobs as long as they're able. The coming demographic bulge isn't pretty.


The old people in America now own the majority of the assets. If they transfer those assets to the young in exchange for work, things will get better. If the old are split into those that have a huge amount of money and spend a little percent of it in retirement, passing the rest onto their heirs, and a large number of old people that need to keep working just to survive, locking out the young people from moving up, there will be trouble. Unfortunately it seems like the latter is happening at the moment.


At a global scale, ownership of assets isn't really savings.

Imagine an island society, with a bunch of 40 year old workers. You own the company that gets all the profit from their labour. So, you have a real resource, which you could trade with other places.

Now of course this island has no children. How do you feel about your investment once the workers reach age 80. It wouldn't produce very much. In fact you'd need to put resources in just to keep your island ex-employees alive.

But that's an extreme example. Let's suppose that 20% of the population is only 20 instead of 40. When the current 40 year olds hit 65 and retire, the younger generation will be 35 and can support them. What happens to your profits?

They will probably still either collapse or near collapse. You had 100% of people in the labour force, now you have 20%. This poor 20% not only has to support themselves, they must support the 80% idle old. And that's before any surplus is generated for your profit as owner.
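The arithmetic behind that collapse can be made concrete with toy numbers (illustrative, not from the comment): say each worker produces 2 units a year and every person on the island, working or idle, consumes 1.

```python
# Toy dependency-ratio arithmetic for the island example above.

def surplus_per_worker(workers, retirees, output_per_worker=2.0, consumption=1.0):
    """Units left over per worker after everyone on the island is fed.

    Negative means the island runs a deficit before any owner profit at all.
    """
    total_output = workers * output_per_worker
    total_consumption = (workers + retirees) * consumption
    return (total_output - total_consumption) / workers

print(surplus_per_worker(100, 0))   # 1.0  -> every worker generates surplus
print(surplus_per_worker(20, 80))   # -3.0 -> deficit; the island eats capital
```

Going from 100 workers and no retirees to 20 workers and 80 retirees flips each worker from generating a unit of surplus to carrying a three-unit deficit, which is why the owner's profit collapses first.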

So, your savings in the form of ownership of the island weren't very useful. It just dies out.

Could you saved otherwise? Not really. To a limited extent you could store the outputs of the island. We do this currently with grain reserves, oil stockpiles, etc.

But most of the stuff we consume are produced in the same year. You can't save real goods, only ownership of entities.

So if the entities all have a shrinking labour force, they aren't worth as much. The old people in the US won't have meaningful assets with which to pay for their care.

(The one exception is a small old country that owns rights to part of the output of other countries which are younger. There's also an exception for automation, if the companies owned by the old are massively more productive per worker by the time the demographic bulge retires)

You're making the mistake of those who think of money itself as wealth, as opposed to a medium to exchange for wealth. Money is mostly fungible, but in edge cases it just can't buy certain things.


Savings are invested into creation of long-lasting capital goods that are just as real as grain reserves or oil stockpiles, and that yield a positive return over the original investment - companies being "massively more productive per worker" isn't the exception, it's (hopefully) the rule. Yes there is risk involved and some investments won't work out, but that's why you diversify and make conservative assumptions. (Now if the economy was projected to stagnate in the future, you might have a bit of a point. But since our economy is growing, even "paper" wealth like government bonds can easily yield a positive return.)


Wouldn't all of that be true for my island society too, as long as the population was stable?

Such a society would nonetheless face much difficulty if the worker/retiree ratio changed from 100:0 to 20:80. That's true regardless of individual productivity level, and regardless of capital stock such as ports, rail lines, buildings, etc.: the second case is harder than the first.


> and needs to be supported by a rising share of taxes from a small number of young workers

And here is where it gets funny. I wonder what pension systems would look like if they were supported by the billions of dollars that FAANG especially, but also other huge megacorps such as IKEA, McDonald's, and Burger King, "creatively avoid" in taxes. Or if minimum wages were set high enough that a pension built on a life of minimum wage were enough to live on in old age (this is a pressing problem in Germany right now, where many old people's pensions are not enough to get by on, so they require social welfare)?


This still doesn't work. Monetary savings aren't real goods. Retirement is funded from current production. If the labour force participation rate drops, then a greater percentage of current production is needed for retirement.

And corporate taxes are just taxes on people, in the end (the owners of the shares). If there are fewer people working and more people to support, things just get harder, absent large productivity increases or immigration of young workers.


It’s unacceptable that social services are a zero-sum game between those employed and those not employed, while employers cut margins and funnel out basically all profits - the famous „socialize the losses, privatize the profits”. I hope FAANG, McDonalds, IKEA, Deutsche*, automotive companies, fossil fuels companies, investment funds, and whatnot will get the feel of cold talons of the social justice.


> I hope FAANG, McDonalds, IKEA, Deutsche*, automotive companies, fossil fuels companies, investment funds, and whatnot will get the feel of cold talons of the social justice.

It's not like the people running these corporations are all evil and out to ruin people's lives. People behave based on the incentives laid out in front of them and the current combination of capitalism and laws incentivizes the behavior we are seeing.


> People behave based on the incentives laid out in front of them and the current combination of capitalism and laws incentivizes the behavior we are seeing.

True, but the same people also largely support programs and politicians that preserve or extend those incentives. They're not innocent bystanders.


"I wonder how pension systems would look like if they were supported by the billions of dollars that especially FAANG but also other huge megacorps such as IKEA, McDonald's, Burger King "creatively avoid" in taxes"

Pension systems own the FAANG companies. For example, in the US, CalPERS manages a third of a trillion dollars. Two thirds of that is in equities such as Apple.


aahhh, the ol' lump of labour fallacy fallacy


Or that employment is growing rather than shrinking - but I absolutely agree.


> Does the graduates/nongraduates segment still represent a comparable group of people?

An interesting touchstone for the article: high-rent cities in the US have started to see restaurant turnover not from 'normal' financial failures but from labor shortages. For midrange restaurants that aren't either selling luxury or exploiting captive demand (e.g. lunch for nearby offices), there's no price point where they can make ends meet while paying servers, dishwashers, etc enough to accept high rents and/or long commutes.

I agree that treating "non-college" as a constant bracket is a misleading measure, and I can complain all day about the spread of overpriced credentialism in hiring. But observationally at least, I think there's a separate issue with big cities and low-skilled labor where work we consider part of "normal functioning" for our society is increasingly incompatible with the housing prices and transit options available around urban centers.


Re. failing restaurants: that's a result of bad business planning. If you can't find staff, pay more and pass those costs on to customers. If people can afford to live in high-rent areas, they can afford to pay more for food.


The reality doesn't agree with you. In SF, for example, fine dining is doing well, while the mid range is slowly failing and being replaced by fast casual, where they focus on fast turnover. The only way to pay the bills seems to be to either be super high end (fewer customers overall, but at a much higher service level and price point), or just above fast food, where you serve a reasonable-quality menu, but items that can be prepared quickly, and served without waitstaff.

Restaurants that try to charge "too much for their station" don't last here.


Funny, I just wrote a reply to another post which explains exactly what you wrote.

> In SF, for example, fine dining is doing well, while the mid range is slowly failing and being replaced by fast casual, where they focus on fast turnover.

It has to do with readily available substitutes for a given price point affecting elasticity of demand. The type of food offered by a casual dining concept can be replaced by fast-casual, whereas higher end food has relatively inelastic demand, because it has fewer substitutes.


> The type of food offered by a casual dining concept can be replaced by fast-casual,

That's mostly not a change in type of food, just front-of-house service model. And casual and fast-casual both focus on turning tables, fast casual just has counter ordering (and often counter service for drinks) rather than full table service.


Yes, hence them being substitutes.


is this truly indicative of market failure though? maybe it's not unreasonable that traditional table service becomes a luxury in high CoL areas.


I commented on this elsewhere, but my general reaction is that it's an interesting change regardless.

I think there's a very plausible argument that Applebee's-range restaurants are very labor-intensive without adding substantial value, and so they naturally get shut out of any market where labor is expensive. The "cheap hot meal" role is filled by fast food, while the "place to sit and chat" role is filled by everything from pubs (order at the counter, high-margin alcohol) to dedicated dessert places (order at the counter, often cheap nonperishable ingredients).

But granting that it's not something to 'solve', it's still newsworthy. There are ~3M waitstaff jobs in the US, and despite the tipped minimum wage it's often pretty well paid, with nonstandard hours that can help part-time students, households with two working parents, etc. If our biggest cities are no longer compatible with common forms of work, that's worth discussing, along with its secondary impacts.


Isn't this literally indicative of the economy as a whole? The entire middle is being hollowed out. You've got to go high or go low, which is the same segregation we've seen in terms of wages.

Makes sense to me.


I don't disagree, exactly - I'm strongly opposed to most rent-control measures; the intervention I'd support is scrapping protectionist zoning rules.

But I do think can versus will is a thorny question. Restaurant and delivery dining are highly elastic in general, and big cities aren't necessarily an exception. People who can stably afford to live in downtown SF, NYC, LA, Boston, etc. are still frequently paying >33% of income in rent, and ~50% isn't shocking for younger people. As a result, you've got a major demographic that's relatively high-income but still tight on cash and consequently price sensitive. (More speculatively, I think there's also a grain of truth to the "avocado toast" thing; young urbanites seem likely to cut back on purchase frequency before they cut back on quality.) So it's not necessarily the case that restaurants can make things work by raising prices; there's no rule saying that a given type of restaurant has to be viable at any price point.

I realize that we can label all of this as free markets doing their thing; maybe mid-tier restaurants with lots of staff are just an inefficient use of valuable urban space. But real estate is infamously messy, and there are some interesting questions about whether city rents would be more functional if not for the conversion of usable space into artificial demand (e.g. mandatory parking with housing) and disused investment space. And even if that's not the case, it's still news if we've built a market where "going out for dinner" is no longer a standard transaction.


> Restaurant and delivery dining are highly elastic in general

Elasticity of demand depends on whether or not there's readily available substitutes. If you're a mid-end restaurant, you can be replaced easily by fast-casual concepts (food quality doesn't differ much from fast-casual to casual full-service). However if you're high end, your competitors will be other high-end restaurants, with the same staff and space constraints as you. This is why in higher-rent markets such as NYC, Tokyo, London, Hong Kong, high-end restaurants flourish.

> there's no rule saying that a given type of restaurant has to be viable at any price point.

Of course not. But in a dense, urban, high-rent area, there's likely a model that does work.

> maybe mid-tier restaurants with lots of staff are just an inefficient use of valuable urban space

They are. Rent is your main fixed cost, staff is your main variable cost. COGS matters, but you need a product to sell in the first place, and every restaurant brings in similar products. Having too much staff and too much space is a recipe for disaster.

To conclude, here are some statistics for you concerning household expenditures on food: https://www.bls.gov/regions/west/news-release/consumerexpend...

> San Francisco-area households spent $4,487, or 50.3 percent, of their food dollars on food at home and $4,431 (49.7 percent) on food away from home. In comparison, the average U.S. household spent 56.3 percent of its food budget on food at home and 43.7 percent on food away from home.

In San Francisco, households spend a higher percentage of their food budget (out of already higher incomes) on food away from home than other Americans.

Anyhow, in my experience, restaurants fail because there are a lot of shitty restaurant owners out there. Not knowing their demographics, costs, labour situation, etc. qualifies them as shitty restaurateurs.


> I realize that we can label all of this as free markets doing their thing; maybe mid-tier restaurants with lots of staff are just an inefficient use of valuable urban space.

That would assume that the real estate market is a free market, which in many cities is a laughable idea.


Sorry, I should have broken down that thought - because you're of course right about urban real estate.

I think it's quite likely that midprice, full-service restaurants are an inefficient use of expensive urban space. It's also possible that they're an inefficient use of valuable urban space, and would be squeezed out of dense cities even by healthy, responsive real estate markets.

That second claim is much bolder, though, and my personal guess is that taller buildings and better mass transit would relax labor costs via lower rent, while also lowering the price of downtown real estate. A few extremely dense cities (NYC, Tokyo, whatever) might use better transit for a "reverse commute" where people live near work and go outbound for restaurants, but that's even more hypothetical. And I can't really think of a "free real estate market" city to use as a reference, because I'm not sure anything close exists at this point.


But once you pass the rising costs to customers, you’re in the bracket of the high end restaurants.


Yes, you are. And the business owner should know that to exist in a high-cost market, they need to be higher end.


Can a fast food chain restaurant exist in such a market?


Maybe, with high enough turnover. Of course franchise fees kill margins, and you need to sell an awful lot of Big Macs to equal the kind of revenue a 1500 sq. ft. fine dining restaurant can do in a year.

There's far more high end restaurants downtown in my city than there are fast-food franchises.


according to google maps, there are about ten mcdonald's restaurants in SF, so I would argue yes.


How can they afford the labor?


Part of the problem here when discussing specifics and trends like this is that we use a term like "middle class" to mean two things: either the simple middle quantile of income (1/3 or 1/5 or 3/5, depending on who you talk to), or the more historical definition of less-than-upper-class (those who do not need to work, because they have sufficient capital to live off its passive income) and more-than-working-class (those who have to work and whose income does not allow them to accrue savings).

And even the more-than-working-class definition clashes with other trends; some pundits seem to define "working class" as nothing more than "does not have a college degree", which means that some trends, rather than being interesting demographically, instead just become a restatement of an underlying trend (like you said, the increased prevalence of college degrees).

And the final problem of all attempts to analyze this is that we don't really have comprehensive data that supports a single way of measuring these aspects historically, because the definitions and criteria keep changing. Even when there is insightful research, it really is hard for it to be applicable outside its direct area because the definitions and proxies for historical values are not guaranteed to be uncorrelated with new phenomena that are attempting to be measured.


> It turned out, college was partly a proxy for class. What college grad earnings in the 60s was actually saying is that upper-middle class kids grow into upper-middle class adults.

I think this is a critical observation that lots of people--including some economists--have missed.


I've heard this said in another, less correct way: "the real number of 1960s-equivalent college graduates has never changed, it's just become harder to find them based on a label". But, broadly, I've heard a lot of acknowledgement of this. If the education is right but the cultural trappings are not, then suddenly the education isn't the issue.


A college degree may not be a panacea for class mobility, but lets not forget that class mobility without a degree is pretty much not possible in the slightest.

Per a Pew study Brookings cites (https://www.brookings.edu/blog/social-mobility-memos/2014/02...):

> children born in the lowest income quintile who do not earn a four-year degree are four times as likely to wind up in the bottom (47%) as those who earn a four-year degree (10%).

So while upper middle class people still have the best chance at getting admitted to, paying for, and finishing college degrees, lower class people who persist and do the same really are far more likely to join the middle or upper middle class. Though it's far from guaranteed, especially since the relative value of a degree depends on which degree, and sometimes on school prestige as well, which is definitely harder for the lower class to prioritize (costs, scholarship availability, and nearness to family, if they need to provide income or other support to them, tend to matter more in my experience).

The observation you mention is accurate, but I just wanted to provide additional context to readers who may not realize the life changing effect a degree can still have for poor people. This is as someone who experienced a comfortable middle class upbringing only because of the combination of a state school in the rural south that gave merit scholarships in the 80s to locals, and my mom's efforts to graduate with a degree that would be worth money.

That said, the idea of pursuing liberal arts in college is still incredibly foreign to me. Being able to support oneself after getting that sort of degree is definitely something I can see as mostly a proxy for an upper middle class upbringing, and it likely feeds into the rising trend we see today where the payoff for the lower class is not hugely significant or comparable to that of well-connected peers, even upon degree completion.


> lets not forget that class mobility without a degree is pretty much not possible in the slightest.

While it is true that the school system acts as a filter that removes those who have almost no chance of upward mobility (those with crippling disabilities, for example) from completing higher and higher levels of schooling, we need to be careful to not reverse that observation. Attaining a degree is not going to undo the disability that limits one's economic growth.

With the great push for everyone to have a post-secondary education we've witnessed over the last decade or two, virtually everyone who is capable of attaining a post-secondary education has done so. Those who are not completing those levels of schooling now are those who had no chance in the first place, and they struggle equally in the rest of their life for the same reason they did not succeed in school.

That said, if you were born with what it takes to be top of the class in Harvard, but chose not to pursue that avenue of life, chances are you still have every bit of upward mobility potential as someone who did graduate from Harvard.


> chances are you still have every bit of upward mobility potential as someone who did graduate from Harvard.

Well, minus the Harvard contacts and network, which is a huge part of the success factor of Harvard grads.


...A 53% chance is "pretty much not possible in the slightest"?


Did you look at the graph that article linked? Do you think that moving from the first income quintile into the second one is genuine class mobility? People in the second-quintile income range may not be relying on food stamps, but for most of America that's still less than a comfortable situation, still in the territory of living nearly paycheck-to-paycheck (23% of Americans have no savings, and another 22% have less than 3 months' worth), and it's not going to buy you many advantages in living situation or education opportunity for your kids either. Of the ones without college degrees, 26% of that 53% you're calling out still ended up in that bracket.

So again, without a college degree, 73% of people born poor will remain lower or lower-middle class. Only 3% will make it into the top quintile (vs 9% with degrees) and 8% into the 4th (vs 17% with degrees). So if we define only the 3rd quintile and up as middle class and up (which is questionable in itself, given the shrinking middle class), a degree gives a lower-class person a 52% chance of genuine upward mobility, compared to 27% without. And it nearly guarantees that they at least won't remain in the very worst stratum of poverty. 52% may not be great odds for real mobility, but neither is 27%, and if it were my future on the line, I definitely would not want to be staking it on 27%.
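For anyone who wants to check the arithmetic, here's a sketch built from the percentages quoted in this thread. The values not stated directly (the 3rd quintile in both rows, and the 2nd quintile for degree holders) are inferred so each row sums to 100, so treat those as illustrative:

```python
# Destination quintiles (1 = bottom, 5 = top) for children born in the
# bottom income quintile, per the Pew/Brookings figures cited above.
# Unstated cells are inferred so each distribution sums to 100.

no_degree   = {1: 47, 2: 26, 3: 16, 4: 8,  5: 3}
with_degree = {1: 10, 2: 38, 3: 26, 4: 17, 5: 9}

def upward_mobility(dest, middle_and_up=(3, 4, 5)):
    """Chance (in percent) of landing in the 3rd quintile or higher."""
    return sum(dest[q] for q in middle_and_up)

upward_mobility(no_degree)    # 27 -- roughly 1 in 4
upward_mobility(with_degree)  # 52 -- roughly 1 in 2
```

The 73% figure is just the first two cells of the no-degree row (47 + 26), and the "53% chance" from the sibling comment is its complement.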


College doesn't necessarily make people more productive. Rather it acts as a filter to knock unproductive people out of the hiring pool. And in economics terminology, completing a degree acts as a signal for candidates to demonstrate intelligence and persistence to employers.

https://en.wikipedia.org/wiki/Signalling_(economics)


The signaling isn't just for intelligence and grit (although those are factors); as Caplan argues in The Case Against Education, a college degree also signals capacity for obedience and conformity.

https://press.princeton.edu/titles/11225.html


I don’t buy this. Are you saying that if you get a college degree in Chemical Engineering, you wouldn’t be far more productive as a Chemical Engineer than if you only went to high school? Or in, say, physics or chemistry or botany?


No, the commenter is saying that having completed a degree in chemical engineering serves as a signal of your productivity. It does not in fact determine how productive you actually are.


Observation: college graduates are more productive laborers in the work force.

Hypothesis A: college coursework actually increases a student's future work productivity.

Hypothesis B: the least productive laborers are unable to be admitted to a college, or unable to graduate. The admissions and graduation filters thus raise the productivity average in the pool of graduates.

Hypothesis C: businesses with the most productive workers tend to prefer college graduates. Workers without degrees appear to be less productive because the only jobs available to them have inherently lower productivity.

Hypothesis D: productivity is measured differently for some workers, based in part on whether they have college degrees.

Hypothesis E: some workers are able to influence the measurement of productivity, such that productivity originating from other workers is reassigned to them, and those cheating workers tend to have degrees.

Hypothesis F: workers with degrees tend to have debts, and are forced by necessity to be more productive in order to first pay them off, and then later save enough for retirement with a shorter savings window.

Hypothesis G: workers with degrees are more productive because the past correlation generates an expectation that they be more productive.

The observation could have complex cause, and all of the above may be true.


Almost everything interesting that sits above chemistry[1] in the science hierarchy has complex causes. It would be a good thing if people in those fields would get over their physics envy and try to communicate this fact to the public better. "This flawed study suggests that some small minority of people might be a bit healthier if they restricted their sodium chloride intake" becomes, in the general population, "eating less salt is good for you". And foods can somehow be labeled "low salt" when the manufacturers lower the sodium chloride and jack up the potassium chloride (both are salts) to keep the taste close to the same.

[1]https://xkcd.com/435/


Well, I understand the GP to be saying that expecting the same person to have the same productivity as a chemical engineer before and after graduating in that area is ridiculous.

Claiming that higher education has only signaling value is ridiculous. (It's even more ridiculous than claiming it has no signaling value.)


There are varying extents to which signalling is a good explanation, and employers have more fine-grained information than just degree-or-not. As usual, the literature is less univocal.


> I'm also dubious about college as a proxy for skill, in the skilled labour sense. A lot of college is very general education, with even less focus on marketable skills than high school.

At one time there was an argument that general skills had value. We as a society have become super specialized, and general skills have become so ubiquitous that they aren't terribly valued any more. Your employer doesn't care if you understand Cartesian philosophy or have an understanding of classical rhetoric; they just want to see you spent 4 years learning accounting/law/human resources, because those are the skills that are required.

This may also be why our leaders have gone from "Success is not final, failure is not fatal: it is the courage to continue that counts." to "We love winners. We love winners. Winners are winners." in the last century.


Jobs have become less specialized not more.

Being able to effectively communicate over email takes more writing skills than most people assume. The internet adds a lot of flexibility in the kind of things the average office worker will do.


> Jobs have become less specialized not more.

https://www.forbes.com/sites/joshbersin/2012/01/31/the-end-o... https://hbr.org/2011/07/the-big-idea-the-age-of-hyperspecial... http://theconversation.com/want-a-job-its-still-about-educat...

The general consensus I can see is that that is not the case - employers want specialties.

And absolutely communication is a skill we require, but for some reason it's not valued.


I disagree with that first part. At least in software engineering. Remember when you just had software engineers? Eventually we got things like game programmers and web developers. Now every game has at the very least an audio programmer, gameplay programmers, network programmers, graphics programmers, platform engineers, etc. Web Developers are even more specialized, walk into any web development shop and you can find people that only ever work with say React on mobile, or one person that specializes on browser compatibility, or analytics engineering.

Part of it is the size and scope of the projects and the general complexity added to the field, but 20 years ago 1-2 people would have done all of these.


I think there are different trends happening in different professions. I remember once when the sales team was hiring. They invited in a half dozen people one day and gave 3-4 offers. Whereas in engineering it'd take weeks to find anyone with the matching experience for the roles we had open. This was a startup selling something pretty new, so I doubt the sales peeps had similar experience with a different company. I'm assuming that sales experience was just considered highly transferable.

I'm not sure exactly where the bifurcation lies--it certainly isn't unique to software. Medicine and law are also getting highly specialized. On the other hand, are administrative jobs going the other way? Accounting, HR, compliance? I'm always a bit puzzled when companies recruit CEOs from unrelated industries. Does domain knowledge matter that little for some, even very senior, roles?


That's an outgrowth of the same rise in general competency.

Inside a company, programmers such as myself have been assigned to work on React without ever having seen it before, with the expectation that they will pick it up. It's only when looking for new employees that these differences have much weight.

PS: I have been told to pick up low-level network programming, frameworks such as React, and new languages, and I even jumped into web programming from nothing. I can only assume this is generally the norm.


Programming is a specialized, incredibly dynamic, role. Do you think it's fair to assume that all positions in a company act like that?


All jobs past and present involve some very specific domain knowledge and a range of more general skills.

A car salesman needs to know a lot about the product, more general sales tactics, more general skills like email, and even more general skills like just speaking. But, as you narrow down into the ultra specific niche the percent of time working in that domain decreases. What percentage of the time is the sales guy dredging up specific horsepower numbers etc related just to the car they are selling?

Over time, what we could consider generalist jobs, like secretary, have been cut while the tasks have not. So, by handing out those tasks to others, those other jobs have in turn become more generalist on a day-to-day basis. Dev-Ops, for example, is in many ways the opposite of specialization.

PS: Put another way, if my last job had been using Java instead of C#, I would have done the same thing with ~80% of my time. You would be reading the same requirements even if the code was in another language.


> Put another way, if my last job had been using Java instead of C#, I would have done the same thing with ~80% of my time. You would be reading the same requirements even if the code was in another language.

That works well for languages, yes, but what about data scientists, business intelligence, cyber security, and machine learning experts? Those are all jobs that launched off the dev backbone, but they are very different, involve unique, specific knowledge, and require training past what a normal degree provides. Dev-Ops may be a generalist position, but you wouldn't see someone who does Dev-Ops doing those jobs.


'Data scientists' is one of those interdisciplinary fields that does not have an ultra deep dive into any one silo. Rather it's a collection of several skills that are all useful for doing other things which is not really specialization. People can dip into and out of that role with minimal transition unlike say becoming a Doctor.


You're both right. Specific jobs are becoming more specialized, with a higher emphasis on understanding specific frameworks, languages and patterns. At the same time, we switch jobs more often, so engineers have to become more adaptable and less specialized to remain relevant to new roles.


American degrees and education are much more general than in many other countries. In the UK, an entrant into a STEM degree will have specialised in relevant classes for the last two years of high school, as well as for the two years prior to that at GCSE level.

Yes, the occasional maverick will also study outside their target degree, but that is rare. My mate, who is a classics teacher (public school), has a class size of two.


That almost seems a shame. It's a thing of beauty to hear those with a classical education speak. The ability to put together clear, cogent, complicated paragraphs of thought on the fly is awesome to hear.


>Does the graduates/nongraduates segment still represent a comparable group of people?

According to this [0] article from 2017 only 33.4% of Americans over the age of 25 have finished their undergraduate education. As such, MOST people are still without undergraduate education, and thus a median American does not have undergrad education. Since middle class should represent a median resident, it would follow that people without undergrad education should be quite solidly in the middle class.

[0] https://thehill.com/homenews/state-watch/326995-census-more-...


I don’t think middle class refers to median income. At least it certainly didn’t used to. How else could a population have a small middle class? I don’t think the answer is by having equally large upper and working classes.


> I don’t think middle class refers to median income. At least it certainly didn’t used to. How else could a population have a small middle class?

Well, it could, if it meant “near median income” and few people actually had near-median incomes. Median doesn't mean particularly common (or even the most common, which would be the mode).


The median is what you get if you sort all people by income and pick the middle person. If one forces the median to be in the middle class, then this forces the upper class to be large when the middle is small. Example with a large middle class:

  |...|...........|...|
    ^       ^       ^
    |     median    |
  (working)      (upper)
Example with small middle class:

  |.......|...|.......|
    ^       ^       ^
    |     median    |
  (working)      (upper)
Real life example with small middle class:

  |.............|...|.|
  (working) ^     ^  ^
          median  | (upper)
                middle
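The diagrams can be restated programmatically. A tiny sketch, with made-up class boundaries (fractions of the population, ordered from bottom to top), showing that which class contains the median person depends entirely on where the class boundaries sit:

```python
# Which class contains the median person? Class shares here are invented
# for illustration, matching the shapes of the diagrams above.

def median_class(class_shares):
    """class_shares: ordered (name, share) pairs summing to 1.0.
    Returns the class containing the 50th-percentile person."""
    cumulative = 0.0
    for name, share in class_shares:
        cumulative += share
        if cumulative >= 0.5:
            return name

# Large middle class: the median person is middle class.
median_class([("working", 0.2), ("middle", 0.6), ("upper", 0.2)])  # 'middle'

# Hollowed-out middle: the median person is working class.
median_class([("working", 0.6), ("middle", 0.2), ("upper", 0.2)])  # 'working'
```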


> If one forces the median to be in the middle class then this forces the upper class to be large when the middle is small

It doesn't force the working and upper class to be equally large unless you further assume the distribution is symmetric, but yes, there is a limit to how small the upper class can be since at least half of the distribution has to be split between the upper and middle class.


If you look here [0] it would suggest that median is well within middle class (only 29% are "lower").

[0] http://www.pewresearch.org/fact-tank/2018/09/06/the-american...


Well sure. The point I contest is that the definition of middle class is “the class of the person/household with median income.”

Currently in developed countries it happens to be the case that people with median income are middle class. However, this was not always so: the group referred to as "middle class" did not always include people with median income, and so the above definition does not coincide with what "middle class" typically referred to.


The entire comment chain was not about some abstract middle class, rather middle class in the US right now.


> A lot more people get degrees, in 2018. Does the graduates/nongraduates segment still represent a comparable group of people?

http://www.unz.com/anepigone/average-iq-of-college-undergrad...

> Today’s bachelor’s degree is the equivalent of a high school graduation certificate from fifty years ago, and today’s graduate degree falls short of a bachelor’s degree from a generation ago.


Sure, I'm not sure anyone actually denies this. College education was something only upper class folks could receive (especially a liberal arts degree with dubious value in society) due to its expense. With the emergence of things like state schools, the barrier became lower. But the fact of the matter is higher-paying jobs required college degrees, and those jobs are the best chance most people have at a middle-class lifestyle.

Well now, wages have stagnated for most jobs, the true value of a non-STEM college degree has become glaringly obvious, and opportunities for people to maintain a middle-class lifestyle are rapidly declining. The MBA has been serving as that proxy for how upper class you are, especially one from an expensive, prestigious university. When the highest value of a degree is "the opportunity to network," that should tell you something.

Now, with the pressures of off-shoring manufacturing, automation, and growing immigrant populations willing to do low-skilled labor to survive, is it really at all surprising that many people are being shut out of the economy? It all comes down to a fundamentally broken economic model that values growth above all. Our population grew in tandem with our economy. Once automation of manufacturing processes took hold, we quickly started to need fewer people. The reality is we probably don't need 7 billion humans anymore, and we need to transition to a more efficient, sustainable economic model that isn't aiming for growth above all.


> College education was something only upper class folks could receive (especially a liberal arts degree with dubious value in society) due to its expense. With the emergence of things like state schools, the barrier became lower.

This is not factually correct - tuition at non-state schools increased 10X in the last 30 years, way ahead of the pace of inflation. College education is significantly less affordable than it was years ago.


It's even more stark when you look at the University of California. My first year at UCI in 1990, tuition was $80; five years later, my last year, tuition had leaped to $10,000. By the time all my student loans were paid off, I had paid $750,000. There's a lot a person could do with that money. Baby Boomers with their $80-a-year tuition were able to buy homes and invest in the market. There was a decision made in the 90s to transform education from a top-down investment in society into a system of indentured servitude.


Wait, how do five years of university at $80 to $10,000 a year add up to $750,000? Because that is a huge number.
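For a rough sanity check of that number, here is a back-of-the-envelope amortization sketch in Python; the 8% rate and 30-year term are assumptions for illustration, not figures from the comment:

```python
# Five years at the later $10,000/year tuition is $50k of principal.
principal = 5 * 10_000
r = 0.08 / 12   # assumed 8% APR, as a monthly rate
n = 30 * 12     # assumed 30-year repayment, in months

# Standard fixed-payment amortization formula.
payment = principal * r / (1 - (1 + r) ** -n)
total_paid = payment * n
print(round(total_paid))  # on the order of $130k, far short of $750k
```

Even under these deliberately unfavorable terms, total payments come out around a fifth of the figure claimed.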


Really, UC tuition was essentially free before 1990? I had no idea. Is that true?


>> "College education was something only upper class folks could receive ... due to its expense."

> "This is not factually correct - tuition at non-state schools increased 10X in the last 30 years, way ahead of the pace of inflation."

You are only considering the monetary cost of college which has, indeed, skyrocketed.

However, a more significant cost in the era you are both speaking of was the opportunity cost of not working immediately after high school - especially when that same social class very likely had started a family, or came from a family that needed support.

People coming from a high SES background could afford four years of non-earning, etc.


Part of the reason it exploded, though, is that there seems to be a bottomless supply of student loans - I am guessing because of how hard they are to discharge, unlike regular debt. So many teenagers are encouraged to borrow whatever the schools ask by virtually everyone they know so that they can get their degree, and lenders will gladly give them 30k a year to study zoology or some other field with a rough job market.

The result is college is more attainable to people across the social stratum. And then degrees are less rare, commanding a lower premium while the people holding them have more debt.


College is fundamentally a positional good - many job markets will preferentially hire applicants with a college degree over those without, particularly for well-paying and white-collar jobs. What the college graduate earnings indicate is that the people you are willing to preferentially hire end up making more money. When you have more college graduates, this preferential hiring effect is less significant, since you're now looking at the top X% of jobs rather than the top Y% of 50 years ago.


Right. Your credential only has value because other people don't have it. Increase the supply of degree holders and the value of the degree drops. If I made this statement about any other economic commodity it would be called obvious, but for some reason people don't get it with education.


The other side of the coin is that college has become increasingly more and more expensive and there are now many times more people with bachelors degrees, so college is no longer the sure path to middle class it once was.


Eh, splitting hairs a bit. In a recent span of five years we added 150k new jobs and built 14k new housing units (SF Bay Area).


A lot has been written about people with college degrees working at Starbucks, having a lot of student loan debt, and never being able to save for retirement.

So, I agree: having a college degree is only a proxy for skill if that degree is in something marketable. Comp Lit is not marketable, hence the reason a lot of people at Starbucks have degrees.


They could have started at Starbucks and gotten the degree for free from ASU online.

https://www.starbucks.com/careers/working-at-starbucks/educa...


The top 1% of Americans own more wealth than the bottom 90%, and this concentration is only increasing. Soon no cities, regardless of size, will deliver for low-skilled workers, as there will simply not be enough of the proverbial pie to go around. As wealth is concentrated and rents and property values rise, the vast majority of this country will (does?) live paycheck to paycheck. This is unsustainable and will either need to be addressed or we will be living in a very real dystopia.

Add the very real prospect of social security collapsing and we have a good setup for the streets being filled with homeless and vast portions of the population failing simply because there was just not enough money to go around.


“not enough money to go around”

Where do people get this strange idea that if someone else makes money it somehow precludes others from doing it? Bill Gates got rich with an invention that also generated probably a trillion dollars in economic growth for everyone, including the incomes of most people on this site.


> if someone else makes money it somehow precludes others from doing it?

There is an easy way to see it: check whether the median inflation-adjusted hourly wage has risen in the US since the early 1970s, or whether inflation-adjusted weekly earnings have risen. They have not; despite GDP growth, both have fallen.

That this even has to be discussed shows the deep control those heirs who expropriate surplus labor time from workers have over discourse, the media, forums like this (run and controlled by an accelerator) etc.

Workers create wealth at a mature company. Some of that pays the electricity bills etc., but the rest goes either to dividends for the heirs or to wages. That is the "preclusion": the heirs expropriate the profits of the surplus labor time from the workers creating the wealth, who are shorted on their wages. Sometimes this is explicit, like the wage-fixing cabal between Steve Jobs and Eric Schmidt that came out in the lawsuit.

Insofar as Bill Gates and invention go - it takes a hell of a lot of gullibility to swallow the fantasy you concocted. Kemeny and Kurtz created BASIC. Gates hacked into a military research computer at Harvard and stole computer time from it, according to Paul Allen's book (and Harvard admin found out and held proceedings) - then they ported BASIC to the Altair.

Then IBM comes to Microsoft. IBM got wealthy with computers on taxpayer-funded government contracts and a lightly overseen monopoly. Gates's mother was on the United Way board with IBM CEO Opel, who helped make this meeting happen. Microsoft sold Seattle Computer Products' QDOS to IBM (Gary Kildall said it was a complete ripoff of his CP/M). A purchased ripoff of another product, sold thanks to family connections. So much for "invention".

At times like the current one, with real wages falling since the early 1970s despite economic growth, that this is even discussed is a sign of how the heirs have bought and paid for the narrative as well.


[flagged]


It's strange that you dismiss a heavily researched area well respected in modern philosophy and heterodox economics as being "idiots".[0] You may say their research is flawed, but there is no grounds to call such people (who usually hold PhDs in economics and work at world-tier universities) "idiots".

[0] https://news.ycombinator.com/item?id=18490388


If someone invents a tool and sells it for a profit - who was exploited? Is it not possible for everyone to be richer than they started and also create wealth??

The question was really simple. Your answer is the kind that inevitably ends with "you just have to read Das Kapital". That is not an answer.


> If someone invents a tool and sells it for a profit - who was exploited? Is it not possible for everyone to be richer than they started and also create wealth??

As I said, when, after the electricity bill etc. is paid, created wealth is split between dividends to heirs and wages to those who worked and created the wealth.

If in your hypothetical situation there is no split - if the person who worked and created wealth keeps everything - there is no expropriation.

The expropriation is the last few hours of work he does, of the wealth he creates - none of it going to him, the one who created the wealth and did the work. All of the profit going to the heir. But if there is no split, this expropriation does not happen.


I never understood why people are so casual about inequality. To put it bluntly a lot of people will eventually, if not already, be working something like three quarters of their lives for society, a company and a land owner. And only one quarter for themselves.


>If someone invents a tool and sells it for a profit - who was exploited?

Nobody, and Marx doesn't say that anyone is in such a case.

>Is it not possible for everyone to be richer than they started and also create wealth??

In some way, it is possible - after all, the worker is richer than he was before (he has more money thanks to his wage) as is the capitalist (who has made money from the sale). Now what wealth is created? Arguably, social wealth in the case of new technology being developed, for example. But Marx doesn't deal with this concept of "wealth" or even immediately money, he deals with the concept of value, which works at a different level of abstraction to money. The value-form only becomes the price-form, they're not identical.

I'd suggest looking up theories of exploitation as they have been figured in Marxisms that don't rely on a labour theory of value. It's too much[0] to summarize in one HN comment, but you don't need to read Capital either (though it would be helpful).

[0] https://plato.stanford.edu/entries/exploitation/


Just one example. If I live in city A and all rental properties are owned by a select few wealthy companies and individuals, they are able to set the rental market. If they raise prices and I am a wage earner, I have no other option but to pay what they want, reducing my net income and increasing theirs. The vast majority of people are not able to spin a startup into being in their garages, they simply live and pay their bills the best that they can. I am not suggesting we vilify the wealthy but to suggest that one set of people earning greater portions of the total wealth does not negatively affect the less wealthy is disingenuous.

The same concepts can be applied to the wealthy having access to lobbyists and consequently the ability to reduce regulations leading to long term health care costs for those forced to live in now less regulated environments. Gentrification is another venue for this. People live in a neighborhood, wealthier people move in and property taxes go up. Poor people are forced out. The actions of the wealthy have ramifications on the lives and incomes of the less wealthy.


They can't really set the price of the rental market, can they? Salaries either have to keep pace with the rents, or renters get displaced. Displacement can only be sustained so long as there are transplants to take their place who can afford the rents, and there is usually finite supply.

I've read conflicting things about gentrification. But on the "gentrification is good" side, the theory goes that areas that "suffer" the most gentrification tend to be areas that already have very high levels of displacement: they tend to be poor, with lots of foreclosures and deadbeat renters. In other words, gentrification doesn't necessarily increase levels of displacement. Owners in gentrified areas stand to benefit greatly from gentrification. The renters get displaced - but that was happening whether the area was gentrified or not.


Gates succeeded by creating a 40 year monopoly which established and largely controlled the heartbeat of the computer revolution. That's about as exclusive as it gets.

Any monopoly tilts the playing field so 90% of the business sector's profits pour into the pockets of a tiny few, leaving others high and dry. And virtually all the notable tech successes of the past 20 years arose and thrived via monopoly.

So it's little wonder why VCs like unicorns so much. The profits involved aren't shared with others.


When the statistics back it up?

Yes, it's possible for the rich to get richer while the poor also get richer. It's also possible for the rich to get richer by taking all the economic gains for themselves, leaving the poor to stagnate or to get poorer.

Over the past 40 years the bottom half of the economy in the US has seen zero growth. This despite massive growth overall. This despite rising house prices, growing student loan debt, etc. The rich have gotten richer and the poor and the middle class have been edged out. It doesn't have to be this way, but it is.

Why? Because of wage suppression, union busting, wage theft, predatory and usurious student loans, corporate welfare, tax cuts for the wealthy, and on and on and on.


So you can’t see how a CEO being compensated 1000x more than whoever cleans their office is a problem? Money flows like a river, and when every executive diverts egregious amounts of money that they’d never spend in their lifetime into their little pools, there’s less money flowing downstream along the river. Do you really think that Bill Gates needed to become one of the richest men on earth to proselytize MS-DOS? Beyond paying the man enough to keep him fed and working, that’s millions and millions of dollars of compensation that’s removed from the broader economy solely to generate Mr. Gates a comfortable dividend.


Do I care if someone has 1000x the artistic talent I do? Is there a shared river of inspiration we're all drinking from?

You seem to believe there's a fixed pile of money in the world that every human competes for (e.g., 16th century Mercantilism).

Go into the woods and turn a tree into a chair. Where did that value come from? Will it run out?


Eventually you will run out of wood to make chairs, or of people who want chairs. You can’t make infinite value: every product ever made requires commodities going in one end and customers with disposable income on the other.

The problem is when a company makes x, the CEO takes 0.1x in compensation, while everyone else on the payroll combined also takes 0.1x in compensation, just because they didn’t show up to work with the same provenance as the CEO. Within a company, there is very much a fixed pile of money that every employee competes for.


(As people use forests, others will notice and start making tree farms.)

I don't see how it's inherently wrong for a CEO to take 10% of revenue and employees split 10%. And why do companies have a fixed pile: shouldn't an effective CEO grow top-line revenue? What if employees are splitting twice the revenue compared to a year ago? (If the company isn't growing, replace the CEO for someone who earns their 10%.)

Money-making inequality, on its face, doesn't seem different than artistic inequality. Should I be angry that Warren Buffet gets 20% annual return on his investments while index funds only do 8%?


> (As people use forests, others will notice and start making tree farms.)

Stories of countless species hunted to extinction show that this is not what happens.

Let's look at the physics(?) of the economy: where does wealth actually come from? It comes either from transforming things into more useful things or from increasing the efficiency of processes. There are limits to both.

We live on a planet that's in a rough equilibrium in terms of both matter and energy.

So we have a finite maximal amount of value that we can extract from a constant amount of matter, which means that overall wealth is limited. We're just moving it around, and at a finite rate at that; somebody is getting a bigger slice of the pie than the others.


Is wealth truly proportional to matter? How many physical resources are used when making software, music, art, theater, writing? How much wealth did the internet create and how much matter did it use?

There may be a finite limit on potential wealth but it’s effectively limitless for human planning purposes.

(For the trees, we can certainly mismanage our resources; it doesn’t mean it’s a priori impossible to create a growing and sustainable form of wealth.)


> How many physical resources are used when making software, music, art, theater, writing?

Actually a pretty decent amount over the life cycle of a person. After all those things require a functioning civilisation that educates properly to begin with.

The internet is mostly a means of making communication more efficient - that was one stupidly inefficient process ripe for an upgrade.

So yeah, no additional wealth really - just much less losses.


What "pretty decent" physical resources does making software consume? (Beyond energy, which is renewable.)

The internet and other tech doesn't create new wealth? Why has the size of the economy grown exponentially since the industrial revolution?

If wealth were fixed, as population increased we'd be getting vastly poorer worldwide (1.6 billion people in 1900 -> 7 billion today). Have we gotten roughly 4x poorer on average?

(These facts are easily googleable.)


Many people think the economy is a fixed pie. Do a quick google: inflation adjusted gdp per capita is increasing exponentially(!) over time as we find new ways to create wealth. If the pie were fixed our living standards should decrease over time as there’s more mouths to feed.

https://www.google.com/search?q=world+gdp+per+capita+over+ti...

I’m not saying local deviations don’t exist - I can lose my job - but globally wealth has been growing continuously and poverty is being eradicated. This is only possible with wealth creation. Look up Hans Rosling’s work.

https://goo.gl/images/QejAcY


I think it is hard to make that into a constructive argument.

Either you care and the 'allocation' is unfair.

Or you don't care, which means the market doesn't work - since your caring is what is supposed to allocate resources fairly - and it is still unfair.


The word 'allocate' presupposes a fixed pie that we're expected to split in a zero-sum game. That works for redistributing the fraction of wealth captured in taxes. But the parent's argument is about the process of accumulating wealth itself being zero-sum somehow.

Thinking "someone earned money which would have gone to me otherwise" is like thinking "someone got in shape and that physical fitness would have gone to me otherwise".


Where does the dollar come from? A printer in a U.S. Mint building. Every single one. No dollar can come into existence without being printed in one of those buildings. This is not like working out or singing a song; you can’t will money into existence and into your pocket. Ultimately, someone in charge of a larger pile of money wrote you a check. There is a finite amount of money in the world.


Banks will money into existence every time they issue a loan.

https://en.wikipedia.org/wiki/Money_creation

When I deposit $1k in the bank and you take a $500 loan, $500 is created out of nowhere. You are using my deposit yet I still have a balance of $1000 on the books.
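That lend-and-redeposit loop can be sketched as a toy simulation; the 10% reserve ratio below is an assumption for illustration (real reserve rules are more complicated):

```python
def total_deposits(initial_cash: float, reserve_ratio: float, rounds: int) -> float:
    """Sum of bank balances after repeated lend-and-redeposit cycles."""
    deposits = 0.0
    cash = initial_cash
    for _ in range(rounds):
        deposits += cash             # cash is deposited at a bank
        cash *= (1 - reserve_ratio)  # the lendable fraction goes back out
    return deposits

# $1,000 of base money supports close to $10,000 of deposits at a
# 10% reserve ratio: the geometric-series limit is 1000 / 0.10.
print(round(total_deposits(1000, 0.10, 100)))  # prints 10000
```

Each loop iteration is exactly the $1k deposit / $500 loan story above, repeated: balances on the books grow well beyond the original cash.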


Banks can't will money into existence. If you withdrew your money, the bank would have to pay you from other funds. During the Great Depression, banks collapsed this way; the only thing stopping that today is that banks have enough funding to buffer people withdrawing from their accounts.


It doesn't really matter if it is a growing pie or not; what matters is the share you are getting. I am sure Uber creates a lot of new wealth, but the average driver most likely isn't seeing much of it.


Why doesn't the absolute share matter? If I give everyone an iPhone and a few people get an iPhone + Macbook, is that "bad"? Isn't everyone better off? (Yes, humans can be petty and jealous, but they are objectively better off.)

Uber created a lot of wealth. Some got more than others. Is that objectively bad if the wealth didn't exist before?


> You seem to believe there's a fixed pile of money in the world

...there is. Not in the sense of "money" but in the sense of "wealth" and ownership of private property. Wealth is not "created"; it is diverted. If wealth grows in one place, it's because it shrank in another.

Unless you believe that growth and resources are infinite, like many capitalist economists.


When you invent the wheelbarrow, stack bricks into a house, plant a seed, where is wealth diverted from? Who is losing? What shrank?

Potential wealth is ‘finite’ because humans are finite but there’s no practical limit. (Just like there’s no practical limit to how much art can be made. If art isn’t being made it isn’t because there’s only so much creativity to go around.)


Where did you get the land to grow your seed? Where did you buy the bricks to build your house? And the fence to keep the deer away? What of the police and army to keep people from killing you and taking your seedling? The water to nourish it?

You described one of the only scenarios in which wealth is "created" and the only reason it's created is because it derives "free" energy from the sun. But that is just one of the economic inputs of the seed, and there are many more (the labor to water it, the water to grow it, the land on which it grows).

The sun's energy in this case is a small portion of the actual economic inputs the seed requires. And those economic inputs are diverted from somewhere else. They do not appear out of thin air.

So yes, I agree, we should harness the sun's energy, or other "free" energy sources, as economic inputs.

That doesn't change the fact that the overwhelming majority of wealth already exists and is not created out of thin air, but diverted.


It's helpful to separate natural resources (land, woods, oil, etc.) from wealth (valuable possessions).

We take natural resources, often plentiful, apply labor + skill, and get a more valuable product. We're wealthier as a result.

Yes, natural resources are finite, but not the limiting reagent for most things. There's a zero-sum game in that land used for farming is not available for an entertainment complex. But we have so much used for "nothing" that switching it to "something" is a giant increase in wealth. You can buy an acre in Kansas for a month's worth of minimum wage work. Las Vegas was built in a desert surrounded by hundreds of miles of wasteland. Did turning land, earth, trees, and iron ore into Las Vegas add zero value? (Not saying it was the best use of resources, just that assembling buildings improved the value of the raw materials.)

As a counterexample, consider melting down a car into slag. Are you less wealthy with your charred steel than a working car? Of course -- charred steel is less valuable. The number of atoms is the same. In other words, is an assembled watch worth the same as a pile of gears? Are you indifferent to the two?

If wealth were fixed in the earth, we must be getting poorer as the population grows. A few thousand years ago we had 1M people on earth. Were those peasants 7000x wealthier than us?


Fair points in regard to my response. However, I believe part of your argument is still predicated on endless resources. Agreed, there are a great many (ever-shrinking) resources we can use while the population is growing; however, at some point we will reach an equilibrium where there is no more vacant land and the creation of new wealth (barring space exploration, at least for this argument) will cease.

In regards to your watchmaker: a pile of gears might be marginally useful to a few, but a working watch will be valuable to many. Has the watch created wealth? No, because the people who were not buying watches are now not buying something else in order to buy the watch. The creation of the watch did not add the dollar-value of exactly one watch to everyone's wealth, allowing them to spend money on the watch. The wealth was diverted! The creation and selling of the watch certainly did create societal value but it did not create wealth, it merely diverted wealth from some other purchase each watch-buying customer would have made and sent it to the watchmaker.

My point is that the overwhelming majority of transactions are diversions, not creations, of wealth.

To your very original point: "You seem to believe there's a fixed pile of money in the world that every human competes for"

I suppose my argument should be changed to: Perhaps amount of wealth is not currently fixed (as defined by the shrinking number of resources one could use to create wealth), but those who create wealth (and not just merely divert it) are generally already very wealthy and are generally the only ones who have the means to create this wealth due to extremely high barriers to entry.

So, I would say theoretically, you're correct: there is a growing pool of wealth. But practically, any normal, everyday person cannot go around creating new wealth...they can only hope it is diverted to them by someone who already has wealth.


Thanks for taking the time to clarify, I mostly agree with your updated version. Global wealth can grow but is not necessarily (or often) distributed, and not everyone is in a circumstance to grow it on their own.

You may enjoy the elephant curve: https://www.brookings.edu/research/whats-happening-to-the-wo...

The very rich and the moderately poor have seen wealth increase while the top quintile (e.g. many Americans) has seen wealth stagnate. Globally we’re richer on average, but not everyone participated. Clearly it’s best if we can lift all boats, not just certain subsets.


[flagged]


The Gates Foundation does do tremendous good. However, Bill Gates certainly pays for a good deal of luxury premium in his life, and ultimately that is a dollar that could have gone to the Gates Foundation instead of, say, luxury airfare (unless Gates does indeed fly economy). Maybe that’s also a dollar that could have gone to employees at the bottom of his payroll who struggle to exist in Seattle.


The whole purpose of a business is to capture the wealth it creates. A business will create less wealth if it can capture more of what it does create. The housing market is a prime example.


Because that's how our economy works: there is a limited cash supply dictated by the central bank. The only way for more liquid cash to be created is through central banks buying up securities and in turn introducing more liquidity into the economy (which then results in all sorts of other consequences).

The ability for new wealth to be created is predicated on the fact that we have an expansionist monetary policy, or else people would simply never invest and we'd have a society comparable to a feudal state.


What was Bill Gates invention? Wasn't that PARC?


Microsoft.


I mean, which invention created a trillion dollars in economic growth? Microsoft's main thing was to retard the progress of web browsers for half a decade.


It's really unfair to discredit Microsoft over anything but the web-browsers issue. They massively pushed the idea of personal computers, probably more than any other company in history.


Commodore? ...Apple?


Likely because money and wealth are a zero-sum game. There is only so much land, food, and material to go around.


Money is just a tool to represent value. Value is not a zero-sum game. If a person creates something of value, she is creating something that did not exist before.


For that item to have quantifiable value to others though, they must be willing to exchange their money for the new item. The money that is to be exchanged already existed and is supposed to be limited in supply via fiscal policy. If the majority of that supply is owned by a small group of people the value that can be exchanged for the newly created item is limited as well unless those with the wealth want it.


Assuming that the monetary supply is fixed, then there is a deflationary effect. Even those with less money are able to purchase more value as more value is created. This is why incentives for creating value are important to an economy as opposed to just increasing the monetary supply.
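That deflationary effect follows from the quantity-theory identity M = P × Y (with velocity folded in). A one-equation sketch, with all numbers invented for illustration:

```python
money_supply = 1_000_000  # fixed M (invented figure)

# As real output Y grows with a fixed money supply, the price
# level P = M / Y must fall, so a constant nominal income buys
# more goods: each dollar commands a larger share of output.
for output in (100_000, 200_000):
    price_level = money_supply / output
    print(price_level)  # 10.0, then 5.0
```

Doubling real output halves the price level here, which is the incentive argument above: value creation, not money printing, is what makes everyone's dollars go further.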


Non-monetary value exists; look at entertainment or education, both available free via YouTube.


It's not free: you are the product. Advertisers pay Google, and Google is then able to provide the infrastructure for your entertainment. It's provided to you in hopes that you will buy something based on the ads you are shown. And it must be effective advertising if people continue to pay for it.


  There is only so much land, food, and materials to go around.
True(ish). False. False.

I say "ish" for the first one because hopefully humanity will grow beyond Earth in the near future.

Generally, all of the above are false unless you are talking about a population which never stops growing, and in that case that would be the problem. Not wealth inequality.


If you have a population that did stop growing, but in which class divisions exist, and the upper class keeps taking larger and larger share of overall wealth generated by the society, you're going to have a similar problem.


I don't disagree, but the wealth disparity is different than the income disparity. On the one hand, a wealth disparity can be directly passed down, and repealing the estate tax will only deepen this trend. On the other hand, reducing income inequality may help with the here-and-now, helping families to BUILD wealth while not also living paycheck to paycheck. So, IMO, we should tackle both income inequality via progressive taxation, and we should tackle wealth inequality by instating a much stronger estate tax that prevents a landed gentry, a permanently wealthy and insulated class of millionaires and trust fund babies.

There is also something to be said for personal choices of the non-wealthy. I see many in my lower-middle-class city with new luxury BMW or Benz ($30-50k) vehicles who are living in $120k condo units. I see many people with shiny, massively spec'd-out trucks that they never use to haul anything and just park at their office job. What would the country look like if average people saved more and invested some of their lifestyle splurging?

So, while the tax structure and skill gaps are big, we can also say consumers need to be much smarter and more modest if they want to build wealth and improve their station.


> we can also say consumers need to be much smarter and more modest if they want to build wealth and improve their station.

True. But this is an interesting point. It becomes a "fool's game," like a casino where everyone knows the odds are stacked against them, so when they lose, they internalize the fault as their own. Yet there's an abundance of "shiny" going around, so people's choices become emotional and hope-based instead of rational.

Therefore there's no incentive to change the system because it's so easy to blame those who lose out: "see they made poor choices"—nevermind how we bombard those people with a ton of fine-tuned marketing. Taking the massively spec'd truck as an example, consider what the car manufacturer is selling: the feeling of power. Now watch a popular game on TV and get that message drilled into you about 30 times each week, while living a life that is otherwise very constrained in terms of finances and opportunities... many give in.


> I see many in my lower-middle class city with new luxury BMW or Benz ($30-50k) vehicles, who are living in $120k condo units.

How the hell do you know the financial situation of strangers? What economic data do you have to back up this assertion that lots of people would be better off if they weren't foolishly spending your money? Or are you the local "leading authority on what shouldn't be in poor people's grocery carts"? [0]

[0]https://local.theonion.com/woman-a-leading-authority-on-what...


In some American cities the streets are already filled with homeless. For example in Miami and San Francisco.

fastball 28 days ago [flagged]

Repeat after me:

Wealth is not a zero sum game.

Wealth is not a zero sum game.

Wealth is not a zero sum game.


Sure, okay. What about inflation? If we have 1% of people with 90% of money and assets, either:

A.) The upper class are extremely rich and the rest are desperately poor, or:

B.) The masses have enough money to live comfortably and the upper class has 9x that, because that's how percentages work--thus this world has to have many times more total wealth than the world in option A. That only works if such growth is feasible given that world's industry.

But what about when inequality keeps growing as projected, and the 0.1% have 90% of the wealth? Then the 0.01%? For the rest of the people to have a similar quality of life, the total number of assets in the world would have to grow by 10,000x. Otherwise, some people have to get poorer for a perpetually smaller percentage of people to have 90%.

Wealth isn't strictly a zero-sum game, but at a certain absurd point the math stops working. At that threshold, cold hard physics kicks in, and physics is as zero-sum as it gets; nature balances her books mercilessly. If wealth inequality is at reasonable levels and there's a reasonable amount of growth, it's perfectly possible for the middle and lower classes to prosper while the wealthy obtain much more money than they do. But when the wealthy hold an extreme majority of all assets, keeping the rest of society at a tolerable standard of living would mean raising the total assets in the world by something like ten thousand times. Obviously that wouldn't work: it assumes totally unrealistic rates of growth, which is a recurring issue for our civilization these days. Wealth does become a zero-sum game under certain conditions. Our civilization isn't a perpetual motion machine.
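To make the share arithmetic above concrete, here is a quick sketch with made-up numbers (population and total wealth are normalized so average wealth is 1.0; the figures are illustrative, not data):

```python
# Illustrative only: per-capita wealth of the top group vs. everyone else,
# when a fraction `top_fraction` of people holds `top_share` of all wealth.

def per_capita(total, population, top_fraction, top_share):
    """Return (top per-capita wealth, rest per-capita wealth)."""
    top = top_share * total / (top_fraction * population)
    rest = (1 - top_share) * total / ((1 - top_fraction) * population)
    return top, rest

W, N = 1_000_000, 1_000_000  # normalize so average wealth is 1.0

for f in (0.01, 0.001, 0.0001):
    top, rest = per_capita(W, N, f, 0.90)
    print(f"top {f:.2%} holds 90%: top p.c. = {top:,.0f}, rest p.c. = {rest:.3f}")
```

At a fixed total, shrinking the top fraction while holding its 90% share multiplies each remaining top member's wealth tenfold per step, while the bottom's per-capita slice barely moves; the difference must come either out of displaced former top members' pockets or from new wealth, which is the scale of growth the comment above argues is unrealistic.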


It absolutely is. Western economies have been growing at about 0-4% yearly over the last few decades. That means that if somebody's wealth grows at a faster rate, the money has to come from somebody else. If the US were suddenly full of people like Bezos or Gates, I don't think the economy would grow much faster; they would just fight over the same piece of the pie.


Should I just chant this as I drive past the food banks at the people in line?


[flagged]


Is that... is that really your argument? That everything is fine because these people in line at the food bank are able to get free food so that their children don't starve to death? So we can ignore the massive wealth disparity and the vast number of people living in poverty because we don't force them to die of hunger. That's very magnanimous of you. Imagine what they could accomplish with both free food and knowledge of your mantra.


[flagged]


Your first two paragraphs should get an award for most arrogant comment in quite a while.


How do you figure?


You basically told the previous poster that he is an idiot. That's arrogance in my view, especially since you are only stating things with nothing to back them up, and with no coherent thought beyond that you are right and everyone else is stupid and not even worth discussing with.


The idea that two different things are in fact different does not require "back up". Wealth inequality and poverty are not interchangeable concepts, but wonderwonder was treating them like they were by appealing to everyone's innate sense that poverty is bad and then conflating that with wealth inequality.

This isn't up for debate. The two things are, by definition, not the same. You may think they are related, but that burden is wonderwonder's (and perhaps yours) to prove, not mine. Two separate concepts are not correlated, causal, or interchangeable until they have been shown to be. Not the other way 'round.

I don't think everyone else is stupid. But people that conflate those two things are either missing something or doing it deliberately (knowing full well they are wrong). Telling someone this is not arrogance, and my comment was perfectly coherent to a great many people if upvotes are any indication. If you want an example of incoherence, look no further than the comment which talked about "driving past food banks" as if that was some sort of "gotcha" about wealth inequality. I think it was lost on wonderwonder that food banks themselves are a great example of wealth not being a zero sum game.


Your starting point was that wealth is not a zero-sum game. I replied that it is. If you do some arithmetic and look at growth rates, you will quickly see that the economy has never grown beyond certain rates. That means that if someone, or a whole industry or population group, accumulated wealth at a higher rate, the money had to come from someone else. It wasn't "created" out of thin air.

You really should work on your debating style. Phrases like "this is not up for debate" are just bullying to stop a debate you don't want to have. If you are the same at work I feel for the people who have to put up with this.


I’ve dealt with enough people who willfully conflate different topics to know that it is not worth my time to engage on that level and be forced to show repeatedly that two things are different, only to have the other person continue to conflate them. It is not up for debate because it first needs to be shown that there is some relation in order for there to be a debate. This is neither arrogance nor bullying: it is pragmatism.

Now, as to your comment (which was never the topic of discussion in this part of the thread): if the entire economy grows at 2%, and 0.1% of people’s wealth grows at 10%, the growth of the entire economy can absolutely account for that subset’s gain in wealth. It depends entirely on how big those respective groups are, not purely on “this % is larger than that %”. The money is not necessarily coming from someone else. The growth of the economy is not a fixed rate — if more people produce more good work, the economy will grow more. You also might be forgetting globalism and the fact that the "1%"’s wealth is not necessarily coming from One Place™. For example, Jeff Bezos is wealthy because of AMZN. AMZN operates in many countries all over the world. Therefore, saying “the US is growing at X rate, so Jeff Bezos’ wealth growth rate is too high!” is misguided at best. This is true for many of the "1%" -- they are creating value all over the world, not just in their home country, so it follows that the growth of their wealth is not tied to the growth rate of their home country.

I hope making calculations is not part of your job, because if so I feel for the people that are forced to rely on them.


psst, look at yourself and the way you talk to people, there are several problems here but I'm not one of them. I hope you are eventually able to find whatever you are looking for. Have a good night and optimally a better life.


I plan on having a better life every day, as I work towards that goal and get rewarded for my efforts.

Good luck with your wealth redistribution campaign. I'm sure it'll work out great. Who knows, you might get a life of abundance you did nothing to deserve!


Ad hominem attacks are an extremely poor choice for comments and have no place on this site.


Misusing the phrase "ad hominem" is not constructive and has no place on this site.


Wealth is absolutely a zero sum game depending on how you segment things. Increases in productivity and innovation are not zero sum, but things like land ownership on earth is absolutely zero sum.


That's, at best, extremely misleading. Outside of the most extreme absolute deprivation, relative deprivation (or prosperity) appears to contribute far more to experienced disutility (or utility) than absolute deprivation or prosperity does. So in practice, setting aside the alleviation of the most crushing poverty (which isn't where marginal gains go in the developed world, even where such poverty remains) and the maintenance of illusions that shield the less well off from the prosperity of the well off (consumer-oriented capitalism actively seeks the opposite), wealth is, in practice, zero sum [0] in terms of its impact on experienced utility, even if it isn't in measures of absolute possessions or consumption.

[0] Or at least rivalrous, in that for anyone to gain utility from wealth compared to an alternative scenario, someone else must lose; it's not really clear that utility can be meaningfully aggregated across individuals.


[flagged]


[flagged]


What substance?

That comment reads like someone fed a thesaurus of non-words and a copy of the Communist Manifesto into a Markov chain.


Not just that, my God, the Alpine sentence structure!


Repeating something 3 times doesn't make it true.


[flagged]


[flagged]


I am impressed with the level of nonsense in this comment.


> Add the very real prospect of social security collapsing

There is absolutely no prospect whatsoever of Social Security collapsing. A 0% chance. There is no factual basis for any claim to the contrary, only flat-earth, anti-vaxxer-style scaremongering. Social Security is funded by payroll taxes; the only way it could "collapse" is if people stopped getting paid.


Collapse is maybe a strong word, let's call it an adjustment.

There are fewer and fewer workers holding jobs and paying taxes, while there are more and more benefits to pay for. This applies especially to the elderly and pensions, in countries with a socialized pension system.

Imagine the situation where both parents are retiring and were promised 80% of their salaries, while neither of their children can get a job. It's all too common nowadays. There is no tax to collect.

This may not be too much of a problem in the US where there is limited pension, healthcare or unemployment. Europe will hurt though.
