Google commits $1B in grants to train U.S. workers for high-tech jobs (techcrunch.com)
366 points by thesanerguy 63 days ago | 351 comments



I wonder how long it would take for Google to spend an extra $1 billion by bringing janitorial, food service, and other non-technical jobs in-house instead of contracting them out to the lowest bidder.

To put it in perspective, $1 billion is 10,000 times $100,000: 1,000 jobs at $100,000/year for ten years, before discounting for the time value of money. Instead, big chunks of the money will get siphoned off to administrators, technical instructors, computer manufacturers, and lots of other areas that already have plenty of money.

A billion dollars is less than half the annual budget of the University of Nebraska, which serves ~50,000 students [1]. Back-of-envelope math turns $1 billion into ~22,000 student-years, which is in the same people-helped ballpark as the 10,000 worker-years, with the difference that those 10,000 worker-years come with actual jobs at $100,000 a year. And the 10,000 worker-years are offset by the current cost of contracting out the work and the value that work returns to Google's bottom line.

[1]: https://en.wikipedia.org/wiki/University_of_Nebraska_system
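
A minimal sketch of that back-of-envelope math (the per-student-year cost is my own rough assumption, backed out of the Nebraska budget figure above):

    # Back-of-envelope check of the figures above; all inputs are rough assumptions.
    budget = 1_000_000_000           # Google's $1B commitment
    salary = 100_000                 # assumed salary per job per year
    worker_years = budget / salary   # 10,000 worker-years, e.g. 1,000 jobs for 10 years

    cost_per_student_year = 45_000   # assumed: ~$2.2B+ Nebraska budget / ~50k students
    student_years = budget / cost_per_student_year
    print(worker_years, round(student_years))  # 10000.0 22222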


They aren't doing this for charity; they're doing it to ensure a steady supply of technical hires. Technical jobs are profit centers; non-technical contracted services are cost centers.


The thing about hiring more software developers than you need is that it just adds complexity to the project without adding any new features.

Big companies are terrible at this. As companies add more engineers, the engineers become less productive. Managers get worried, so they (1) hire more engineers and (2) introduce new project management tools and methodologies, which lower productivity even further but create an illusion of progress for idiotic upper management. Over time, managers and engineers get used to lower and lower levels of productivity, and the company keeps hiring more and more engineers whose engineering skills keep declining from being poorly used.


> As companies add more engineers, the engineers become less productive

This is definitely true, but it's a necessary evil. If you want to build something small, you can do it with a super-efficient tiger team. But if you want something huge, you need a massive, inefficient team, even if each additional engineer is only 1/10th as effective as an engineer on the tiger team.

This is explained, with real-life examples, in the classic The Mythical Man-Month.
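
Brooks's point can be sketched numerically: with n engineers there are n(n-1)/2 potential communication channels, so coordination overhead grows quadratically while output grows at best linearly. A toy illustration (mine, not from the book):

    # Pairwise communication channels in a team of n people: n * (n - 1) / 2.
    def channels(n: int) -> int:
        return n * (n - 1) // 2

    for n in (5, 10, 50, 100):
        print(f"{n:>3} engineers -> {channels(n):>5} channels")
    #   5 engineers ->    10 channels
    #  10 engineers ->    45 channels
    #  50 engineers ->  1225 channels
    # 100 engineers ->  4950 channels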


They are getting tons of apps from top schools like Harvard, Stanford, etc. I don't think they will have any problem hiring people. Maybe this would help a lot more at some small company that can't afford to pay Google salaries. Also, if you believe the Pareto principle, most people at Google probably aren't that important to its success.


They actually do have problems hiring enough people. They get tons of applications, but the vast majority are not worth looking at.

The people from top schools are in high demand, and Google no longer has a monopoly on cool tech workspaces with free lunches and random expensive perks.

They have plenty of openings, and they harass qualified people, trying to poach them just as hard as everyone else. And since they actually DO have some unique challenges, they sometimes actually DO need people who know more than Rails CRUD apps, which makes things even trickier.


It seems the only stories you hear about them involve interviewing people and then ghosting them. If they actually cared about hiring people, you'd figure they would at least treat their candidates better. In my experience even Amazon has been more on the ball with the recruiting process.


I'm not particularly fond of Google or the way they interview, but I've lived just a few blocks from their Cambridge office, so almost every engineer I know around here has interviewed there at one point or another.

Aside from their criteria and interview questions, which I find absurd (but that's subjective; I know a bunch of people who think they're fine), I haven't heard much that was really wrong with it, aside from obviously underqualified candidates being brought in and then leaving pissed off after bombing it.

The reality is that in this industry right now, unless what you're doing is little webapps with REST APIs that store and retrieve data and not much else, hiring people is hard. All the somewhat large companies have 100+ openings at any given time, and the majority of candidates are code monkeys. My current employer is doing decently, and our reputation seems to get us a steady stream of above-average people applying, but there are still so many openings.

Only so much you can do.


Their goal is to increase the supply and diversity of good engineers, likely decreasing salaries in the long run. It's about damn time!


Google could cut salaries by half or more while delivering equal or better quality of life if it built a substantial engineering presence in cheap cities.

Education is a wickedly complicated problem compared to not being in California.


>They are getting tons of apps from top schools like Harvard, Stanford, etc. I don't think they will have any problem hiring people.

You're outright dismissing the idea that hiring is difficult for Google? You know they have over a thousand recruiters working for them, right?


Don't forget that if they increase the supply of technical jobs, they get to drive down the cost of those technical hires.


I think I know what you meant to say, but what you typed is logically wrong. s/jobs/workers/


Even if Google doesn't hire most of the people who graduate, having more people who can do tech jobs could increase the supply and stop salaries from inflating further.


I guess, other than wanting a continuing supply of in-country techs to hire (as others have touched on), it sort of comes back to the old "give a man a fish" vs. "teach him to fish" shenanigans.

They could hire a bunch of janitors, who are already employed anyway, and essentially just transfer that $1B directly into their pockets, but at the end of the day they're still going to be janitors. By investing that same amount of money into education, though, they're enabling people to go off and achieve something more meaningful that they wouldn't otherwise have been able to do.


This implies that being a janitor is an inherently inferior job that deserves lower pay. It might be true, but in my opinion an average janitor is more beneficial to society than an average programmer.

More to the point though, what happens when a janitor learns to code and gets a coding job? You still need a janitor!


Direct vs. indirect utility. Janitors are directly beneficial to a small number of people. The typical Google engineer is indirectly and marginally beneficial to millions.


Is that what we tell ourselves, "I'm making a difference"?


It might not be positive, but we are making a difference


True.


> More to the point though, what happens when a janitor learns to code and gets a coding job? You still need a janitor!

That is a good point, but I think it comes down to the barrier to entry: it's easier to become a janitor than it is to become a programmer, so one can probably assume there will always be a steady stream of people willing to do janitorial work.

> This implies that being a janitor is an inherently inferior job that deserves lower pay

It does, unfortunately. Given some of the awful stuff janitors have to deal with, I believe they're deserving of higher pay, but unfortunately the job market doesn't agree with me on that.


The janitor's salary goes up, and the incentive to automate increases. It's all win, really, assuming knowledge work is more enjoyable than janitorial work.


If they brought that support staff in-house, wouldn't it be easier to teach them how to fish?


Possibly, but I wouldn't think it's likely that every janitor is going to want to be a programmer. Whereas if you invest money in a school that teaches programming, there's a good chance that most of the people attending that school will want to be a programmer.

You make a good point, though: with even a fraction of that budget, plus maybe a little time investment from their staff, they could create some great opportunities for their otherwise non-technical employees.


I suspect $1 billion in the hands of Google will go much further than it would in the hands of an academic institution.


It's never been easier to get training in all sorts of things, for free. For example, I'm currently taking MIT's 6.002 electronics course because I never learned circuit analysis properly. It's on YouTube, it's free, and I can watch it anytime.

There's the Khan Academy, too.


Exactly.

I'm learning equity investing, and there seems to be a never-ending stream of free learning material online: lectures from top universities, YouTube videos that teach you curious spreadsheet skills, assignments. As a matter of fact, you have very few excuses these days.

It just comes down to personal motivation.

For all practical purposes these days the cost of learning anything is equal to your monthly internet bill.


Yeah, but there's no certificate. And, despite what everyone wants to believe, most employers still want that certificate or degree. They're not gonna accept "I watched a bunch of videos on YouTube" as an alternative.


In the programming biz, they often will if you've got a track record to back it up. Contributing to open source projects is a route to getting such a track record.

If you already have a degree, just in the wrong field, you don't need to get another degree. Just get the training by whatever means.

Heck, my degree is in mechanical engineering, yet I had jobs writing software.


In my own experience, a track record means less and less, and employers are really starting to fixate on credentials. It's very weird and disconcerting to a fellow like myself who likes to believe an impressive record should speak for itself. But apparently it's not speaking as loudly as it used to.

There really is a "cult of the degree" and I really don't get it. Is the field getting saturated, so employers find it necessary to erect barriers? Does this, on average, make good fiscal sense? I would think successful, relevant experience (when verified) is something that would reduce risk for businesses.


Usually, as soon as you get through the first interview, your papers won't matter much. Credentials might help get you that first interview, but getting a good engineering job still requires deep technical skills, and that's where those who "watched a bunch of YouTube videos" can shine. I have my credentials listed on my CV, but I've never been asked to show any of those papers or seen any interest in them.


"Usually, as soon as you get through the first interview, your papers won't matter much"

Unfortunately, getting that first interview can depend heavily on those papers.


My company seems happy to hire qualified college dropouts. A college diploma is a signal that you've learned something, but to be honest, it's not a very strong signal. Some of our most talented engineers didn't graduate college or didn't major in CS, and I know several CS majors who aren't very competent at all. (I might be one of them; I'd say I was when I first graduated.)

Big, mature bureaucracies want formal signals. Small/young businesses and startups care about competence. Google is very large, but I suspect still manages to act "young" in many respects.


>>They're not gonna accept "I watched a bunch of videos on YouTube" as an alternative.

Soon they will.

If the internet people come at cheaper prices.

There isn't much difference between people who watch a lecture on the internet vs those who watch it live.


> If the internet people come at cheaper prices.

But filtering the good ones out of the "internet people" is more expensive than sifting through college graduates.


And I can rewind/pause yootoob if I missed something or need a little more time to understand it.


No degree here, and when I was in college I was essentially an artist. No one has ever asked me about it in an interview. Instead I get asked how to solve problems, and maybe what problems I’ve solved in the past.


A certificate is one thing.

Expecting your education to make something of you, instead of making something of your education (a track record), is another.

Managing to get a certificate does not guarantee one can apply those YouTube videos.


I never, ever once implied anything of the sort.


Training often requires other resources besides one-way communication. A professor who only lectured and didn't otherwise interact with students would be a bad, ineffective teacher.


The internet is full of people who can help you, like stackexchange.com for programmers.


Yes, but it's not really the same thing at all as a teacher in a classroom and after class, focusing on the student and their educational development, or a professor in their office. Few I've encountered online are as knowledgeable as 90% of my professors. (And most fields of knowledge have less online community than IT.)

If someone said they were shutting down local schools and just telling the kids to watch YouTube and ask questions online, I'd be horrified: the kids would get far less attention and teaching, and they would end up far less educated. Also, few would have the self-motivation to pull it off.

Not to knock Khan Academy, etc. completely. One idea I've heard is having students watch the lecture at home - everyone can see the best lecturers in the world - and do their 'homework', with personal attention from the teacher, in class.


>>Yes, but it's not really the same thing at all as a teacher in a classroom and after class, focusing on the student and their educational development, or a professor in their office.

In most top universities, teaching largely works through learning by doing. It's just that they keep the quality of assignments high, so your projects make up for it. Also note that those taking online courses have an active interest in getting good at things, so they would work on those projects.

In the real world, a lecture gives you about as much as you are going to get anyway; beyond that, you mostly have to learn on the job. So online learning will work in most cases. Yes, in some cases you can't fully learn without a teacher; such cases will exist, but they will be exceptions to the rule.

>>If someone said they were shutting down local schools and just telling the kids to watch YouTube and ask questions online, I'd be horrified: the kids would get far less attention and teaching, and they would end up far less educated. Also, few would have the self-motivation to pull it off.

There is a reason why, despite the US having an awesome public education system compared to the rest of the world, only a fixed percentage of awesome people come out of it.

Everything at the end comes down to personal motivation. You can't make anyone succeed without their participation and will.


This is how most university lectures are. The only interaction you get is badly answered questions that would not have been asked if it were possible to pause or rewind the lecture.


6.002 was a great resource when I was in CSE.


How does flooding the market with developers who will never be able to pass their interviews help them? I guess it will increase the number of people in the 1% by expanding the size of that pool?

Just seems like Google, of all companies, has a need for the best, not just blue-collar code slingers. Many people with 4-year CS degrees from good schools do not get hired.


A couple of misconceptions seem buried in here. First of all, plenty of Google engineers are doing “boring” work that can be done by coding school grads. Secondly, Google declines qualified candidates roughly 50% of the time according to people who work there. The hiring process is far more random than we'd like to imagine it is when you have a “high bar”.


> First of all, plenty of Google engineers are doing “boring” work that can be done by coding school grads.

Google's hiring process does not reflect this at all. Even entry-level positions have a very high bar. Boring has nothing to do with it; Google wants the best.

> Secondly, Google declines qualified candidates roughly 50% of the time according to people who work there.

That's not random, that's just setting a really high bar.

Have you been through a Google interview? Their phone screen questions are equivalent to many companies' "hardest" interview problems.


I've been through the entire Google interview, yes. And I'm going to go (slightly) out on a limb and say that if Google could reduce their false negative rate, they would. IOW, yes, that's a factor of having a high bar, but what that statement is capturing in aggregate is the fact that their interviewers are not all equally calibrated and the interview process can be very uneven, although always looking for very strong people. They then turn around and give those really strong engineers a very high salary and (in many cases) relatively uninteresting work. It would be surprising to me if that changes as a result of this $1B grant, but I would be equally surprised if they don't try to skim the cream off that particular crop.


Maybe they want to free up "good" developers who work jobs requiring a lower skill level, by providing a better-fitting low-skill person for the job?

(Not to be taken too seriously)


I'll take you seriously.

$1bn is a lot of money when put towards education and Google can make 5+ year investments when it comes to tech talent.


It's a nice sum of money, but education is extremely expensive; it buys less than you'd think. It's roughly enough to pay for 7,000 people to go to college, or maybe for 50,000 people to take a one-semester programming bootcamp.

A great start, but it's not exactly transformational.
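
Rough arithmetic behind those estimates (both per-person costs are my assumptions, not figures from the article):

    # Assumed all-in costs; adjust to taste.
    budget = 1_000_000_000
    cost_per_degree = 140_000     # assumed cost of a four-year college education
    cost_per_bootcamp = 20_000    # assumed cost of a one-semester bootcamp
    print(budget // cost_per_degree)    # ~7,142 college educations
    print(budget // cost_per_bootcamp)  # 50,000 bootcamp seats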


Google spends what, upwards of $10B a year on software engineer salaries and benefits? So this seems like a good deal. If they can spend $1B and decrease salaries and benefits by at least 10% over a few years, they get a return on investment.
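
The implied break-even, taking the comment's numbers at face value (the $10B payroll figure is a guess, not a reported number):

    # Break-even sketch on assumed figures.
    annual_payroll = 10_000_000_000   # guessed SWE salary + benefits spend per year
    investment = 1_000_000_000        # the $1B grant commitment
    savings_rate = 0.10               # hypothetical 10% payroll reduction
    annual_savings = annual_payroll * savings_rate
    print(investment / annual_savings)  # 1.0: pays for itself in about a year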


Why would salaries or benefits decrease?


You're paid based on your leverage, not how much your output is worth. If the supply of workers with your skills goes up, your leverage to negotiate goes down, and so do your wages.

Labor does not get to share in economic growth; it gets to split an ever-decreasing share of profits. The more people available to sell their labor, the more pieces that shrinking pie has to be cut into.


Law of supply and demand. More supply, steady demand => More bargaining power on the side of the company.


This was exactly my thought as well


I wish Google would spend just a fraction of that money employing technical writers to better document their technologies. So much of the documentation is outdated or flat-out broken. Even stock sample projects right in Android Studio sometimes fail to work.

Oh, and remember what a mess Gradle was in 2015/2016? How much money could it possibly cost to better document some of the major tools?


I don't understand why they don't open more remote offices instead. Around 90% of their employees are currently within the US; wouldn't it be a lot easier to find tech talent if they had a major presence in other areas of the world as well?


Where are you getting that 90% statistic from? I'm pretty sure at least 30-40% of their employees are based internationally, and even within the US they have offices in pretty much every major city. So most Google employees are already working "remotely".


Two reasons: time difference and communications.


Those are not issues when you have thousands of employees at a location; then you can base entire products there. There are plenty of people in Europe who would love to work for Google, but there are so few positions here that it is almost impossible to get one unless you want to move to the US. It wouldn't be hard for Google to get ten times as many engineers here as they currently have if they just wanted to.


There are plenty of people in Europe who work in Google's European offices. When I left in 2014, Zürich was something like 1,500 people, with London and IIRC Paris having similar sizes, and there were many other smaller offices. That's hardly "so few positions", let alone "almost impossible to get one".

Also, time zone differences (and to a lesser extent communications) were inconvenient, although there were very viable ways to work around them.


I know; I work there. Most people I know don't even bother applying since it is hard to get an interview; they don't see getting into Google as a realistic alternative. All Google would need to do to gobble up all the talent in Europe is start pestering every developer like they do in Silicon Valley; they already pay twice what 99% of developers here earn, so taking everything would be easy for them.


Does Google pay more than decent contracts in big European cities (i.e. €120k+)?


Depending on your skill level: yes.


And why would Google move a product away from its HQ? Low-level support, maybe (but Google doesn't do much support, and they can just recruit in Ireland for that).


Several products are already based in Europe, so it is just a question of how much. Ireland does mostly support, yes, but both London and Zurich own products.


Universities need to start focusing on skills instead of general education. Imagine the skills you could learn if you didn't have to take 60 hours of nonsense credits and could focus on the skills you require.

Programming, networking, and hardware need a new type of university, similar to trade colleges but focused primarily on the skills and nothing more. The first one to two years could cover the foundations, while the next two focus on solid design principles and actually developing projects (real or fake).


I know this is a cliche, but universities (are supposed to) teach you critical thinking as well as social and life skills, not just train you for a specific vocation.

Watering down university education is not necessary. If someone wants to focus just on tech skills, they can go to a boot camp. Expanding boot camps so they become multi-year experiences may be a good idea, but they should not be called universities.


I would agree, and my use of the term "university" wasn't necessarily meant in that traditional sense. What I believe is that there should be an alternative to universities that focuses more on that aspect and presents you with a 4-year degree.

I guarantee that a tech school focusing strictly on tech-related classes for 4 years would produce higher-quality students than a university where half of your college years are spent being taught things that are not directly related to your job.


Maybe "higher quality" in the sense that they've spent more time growing their tech skills, but there is more to a person than their skills in a chosen vocation.

Also, what happens if a student wants to switch majors? Or graduates but later wants to change careers? Your proposal doesn't make that feasible.


I would say few universities teach critical thinking or transferable skills in the context you are referring to.

Some programs in universities can be glorified vocational schools. Other programs are excellent, relevant and evolving both for industry, and as innovators.


> ...universities (are supposed to) teach you critical thinking as well as social and life skills...

I disagree. I think they provide services to customers. Customers decide what the services are for. I suppose some students are very interested in laying a foundation in the liberal arts. Most, in my experience, are more worried about getting their careers started out on the right foot (especially considering all the loans that they are taking out). There is also a nontrivial number of students there for unsupervised extensions of their adolescence, though.

> ...they should not be called universities...

Why not? Is that going to be a regulated word now? What purpose does the distinction serve? We don't look at a degree and think, "Oh! University degree! This is a well-rounded person with a good foundation in the liberal arts!" No, we see, B.S. in Communications from Boise State and draw inferences from there.


There's nothing preventing you from naming your institution a "university", but if you want accreditation, there are already certain criteria you have to meet.


> nonsense credits

The assumption here is that education is nonsense. I think the skills are far less valuable: Education teaches you about the world you have to live in, in business, and as a citizen, a parent, a consumer, etc., by exposing you to leading people and ideas from around the world and throughout history. It teaches you to reason, by exposing you to the great thinkers now and in history. Reading blogs on the Internet isn't nearly the same thing.

If we send people home from college only with the vocational skill of building an integrated circuit, they will find that it doesn't begin to address most of the challenges in life, much less their community's and society's.

Integrated circuits aren't what the West needs to move forward at this point, to address poverty, discrimination, war, and all the other issues. In our current society, we do very well with integrated circuits and very poorly with life, social and political issues, perhaps we need to focus on education in the latter, not the former.


[flagged]


Yes, education teaches you things, important things, that likely you will not know otherwise. For example, I studied Chinese history and culture with some of the world's leading experts; I'm confident that I know much more about it than people who spent the same time learning Java. And because I know those things, I have a much better understanding of everything I read about China and also about my own country's history and culture, because I can see it context of what people in other countries do.

Another thing education teaches you is how to reason effectively, the most important skill of all: Reason is what separates people from animals, the Enlightenment from the Dark Ages, science from superstition, rationality from hokum. And almost the first thing you learn is intellectual humility: I am wrong about so much, and others know so much that I don't and have such different experience, that it's foolish to dismiss them. You learn we are each prisoners of our own narrow perspectives and experiences, and that the more challenging another point of view is, the more likely it is to be worth listening to. So when you say you "laugh" at my ideas, I think you should have spent less time on Java and more on learning.


I'm sorry, but reason isn't taught in college; it's taught by experience. I know many people who have reason, empathy, and everything else without having been taught. It's part of your upbringing, not something taught in college.

I believe most people would benefit more from learning the skills they need for a job than from learning about China, because rarely will a tech job require your vast knowledge of Chinese history. They want to know whether or not you can push the code the business needs.

I understand your point, and I am not saying that these classes are useless. But at the end of the day, colleges are there to prepare people for jobs, and my point is that if you spent more time on tech-related classes instead of classes you don't need, you would most certainly be more ready for a job.

When I went to college there were so many classes I really didn't need. Regardless of whether they were "good" for me, it was a waste of my time because I didn't care about nor need that information. I took classes in sociology, macro- and microeconomics, accounting 1 and accounting 2, and so on and so on.

Did these classes help me in some way? Yes, I would say so. But what if I had taken an extra 5 classes honing skills related to the job I was going to seek after college? Can you really sit here and say I wouldn't have been better off with those 5 classes? I don't think so.


> reason isn't taught in college; it's taught by experience

I know I learned an awful lot about reason in college, and many others I know did too; sorry you missed out! If you read your Facebook feed and I study the great thinkers of history, guided by modern-day experts, I'm very confident I'll be far ahead. Unless you think you can come up with all that on your own - who needs Descartes or Hannah Arendt, apparently; anyone can figure it out? To disparage all that knowledge and learning is easy to say but very hard to support. To say those people knew and said nothing valuable seems like willful ignorance.

> colleges are there to prepare people for jobs

Says who? I disagree strongly. I know the parent's claim is fashionable now, but that certainly hasn't been true for most of history. The liberal arts, which are not vocational, have long (always?) been dominant, and that wasn't for job skills.

I'll add that few businesses value actual job skills learned in college or grad school. A new lawyer or engineer right out of school knows nothing, in many ways, and needs to be trained in real-world job skills by their employer.


You mean making meat robots that can code? Higher education is supposed to be much more than just acquiring technical skills. If you want purely technical skills, there are plenty of intense courses, workshops etc. available.


Universities should remain centers of education and the pursuit of knowledge.

Training people for jobs is the responsibility of vocational schools and corporate training.


I am not sure how "skills" would be helpful in the long term. People need to develop a good understanding of, and ideas about, computer science, not learn a particular piece of software or language or whatever.

There is no guarantee that it will even be alive by the time you leave university, or that your potential employer will be looking for that particular "skill".


Wait, what? If you went to school today and learned Java, I bet Java will still be around in 4 years. I bet C# will still be around in 4 years, and I bet C++ will still be around in 4 years. What kind of world are you living in? Are you saying that taking 60 hours of non-tech classes is more beneficial than dedicating those hours to actually putting those skills to work?

Also, you realize that learning one programming language can translate to other programming languages, right?


Learning a programming language doesn't translate directly into other languages, but the principles of programming in general do.

In another language you may not be using classes and objects heavily, so learning how to make class factories means very little. What matters is that, through this, you might learn about good abstractions and their power, and that's what is important.

And that is what universities should be teaching, with the use of any language or tool. I am not sure what you mean by "non-tech" classes; I am pretty sure there is no such thing in CS programs. If you dedicate yourself to one particular tool, you will have a very narrow view of software development. For that you don't need to go to university; you can learn it by yourself and save both time and money.


Name a giant in this industry; that person benefitted from a liberal arts education.


Are we training giants now?


I'm sure all the gender studies classes have helped the vast majority of tech students become better developers.


Given all the stuff that's happening lately with people like Susan Fowler, a whole lot of developers could have benefitted from more of those.

Of course, that's completely ignoring that there is far, far, far more to "liberal arts" than gender studies.


Diversity training doesn't work, though, and it often makes things worse.

https://hbr.org/2012/03/diversity-training-doesnt-work


You will need more citations to prove that point. Same publication, more recently published, "Two Types of Diversity Training That Really Work".

"For one, a recent meta-analysis of over 40 years of diversity training evaluations showed that diversity training can work, especially when it targets awareness and skill development and occurs over a significant period of time."

https://hbr.org/2017/07/two-types-of-diversity-training-that...


I think it's more the people around them that would have benefitted from it.


That too. Everyone would benefit!


Steve Jobs


You know, I'm actually really impressed with Google for speaking with their money on this. There's a lot of "why don't these hicks just get a real job", but no one really seems interested in following through on that sentiment.

There's popular opposition to Trump's promise to give people jobs by resurrecting industries that a lot of people (probably Google as well) would rather see stay dead. But there's no denying people need jobs, and formal education ain't cheap.

Now we've got a tech giant backing that up with cold hard cash. It would be great to see other companies getting on board and putting some dough in the ring or at least offering some kind of internship/work experience programs for people coming out of an education funded by these grants.


Stated differently: "In light of recent criticism on its ability to import foreign, cheap labor, Google commits to improving their public image for American labor"


How is Google importing foreign, cheap labor? Their H1B applications are public and you can check the salaries yourself. None of the majors are abusing foreign labor.


You missed something: "Google commits to improving their public image with a future of cheap American labor".

There you go. Perfect!


Yeah, more bootcamp-trained JS "artisans" and Ruby "artists" for all of us! If they wanted to do good, they'd invest in the school system, use their power to monitor Betsy DeVos, and call her out for the bullshit to come from her. If Google could raise interest in STEM in high schools, that would be way more organic.


I don't think there is anything wrong or ignoble about teaching people JS. It's an excellent start.


To the critics in this thread: maybe it's time to consider unionizing tech workers after all?


I agree but how? Or would co-op consulting shops make more sense?


Why not try both? The point is that the status quo is untenable; corporate power is going to commodify devs just as it has done to workers in many other industries, and you might as well try to fix it rather than get disrupted like so many of the other dinosaur entities tech has rolled over.


The status quo is untenable???

I am not sure what world you live in, but the world I live in is one where software engineering salaries and jobs have continued to increase over the last 5 years, despite all these bootcamp grads and the large increase in CS majors.


The tech labor shortage seems to me to be essentially a fabrication invented by tech companies to encourage further oversaturation. I just cannot believe jobs are growing faster than the workforce when my experience has been this: https://news.ycombinator.com/item?id=15461080

The status quo is untenable because it's a bubble. There are far too many junior programmers.


Hmm, I am a couple years out of college and therefore fit the definition of "senior developer", so perhaps you are right, and I just haven't noticed the effects yet.

But right now, things are really really good for developers who only have a couple years of experience. But perhaps that is going to change as all these junior devs become senior.


Software dev is too outsourceable for unionization to make sense. Also, real engineering professions are protected by licensure requirements, and furthermore their labor, by the nature of the discipline, cannot be shipped out. Software is too global and portable to protect like that. So let's enjoy the good times while they last.


Google is often criticized for not hiring developers who "still need more work", and instead hires experienced people. Hiring less capable people and training them internally would make some sense to me.


I'm not Google, but my guess is they'd rather invest in the market and let it float the "experienced" people than take a gamble on a known "inexperienced" person and attempt to train them.

Maybe this is shitty, maybe it isn't, but it's probably more economical.


Makes sense given the antitrust movement is getting stronger.


I'd be more impressed if they committed to hiring employees that don't already have the skills they are looking for and doing on the job training. That would be far more effective both in terms of actual skills transfer and in terms of future career trajectory than Yet More Retraining Programs untethered from any actual employer or employment opportunities. We've been doing the latter since at least the Kennedy administration to little effect.


> I'd be more impressed if they committed to hiring employees that don't already have the skills they are looking for and doing on the job training

I work for Google and I find there's already a very large amount of on the job training required due to all the powerful but complicated internal tools. My manager has told me that I should not feel pressured to contribute at all for my first 6 months and I should feel free to just focus on learning as much as I can.

I've been here for almost a year now and feel like I still know nothing so the training is still certainly not done.

I didn't mean to jump to the defense of my employer, as I'm obviously biased, and I do think we are far from perfect (although I am very happy here :). I just wonder what the right balance is. It seems like if people can't program at all, then it's not really on-the-job training, because they wouldn't be working, right?

At least my own experience has been that I came in with some college experience but no college degree, not really knowing anything beyond the bare minimum to contribute (being able to program, having some grasp of CS fundamentals), and I have learned a ton on the job and still have much, much more to learn. I think there is an expectation (and pressure!) for all SWEs to hit the senior level, L5, so in a way everyone who is hired below that level is doing on-the-job training, no?

Sorry for the long post!


At the dawn of the computer age, AT&T, IBM, and other companies taught people how to program from scratch. Early on because there were no college programs to do so and later on because programs didn't graduate nearly enough people. And no one was self-taught because there were no home computers.

I don't see why Google couldn't do that today.

I'm not saying they are morally obligated to do so. At the end of the day they are a for profit company and they don't have to do anything at all about this problem. And, sure, the donations they are making instead are probably better than nothing. But as I mentioned, and as top19 mentions on what is as of this writing the top post on this article, the track record for these job retraining programs going back decades isn't very good.

What I'm saying is that if Google were to go to Pittsburgh, Youngstown, or Detroit, hire some bright unemployed people, pay them a decent but by no means exorbitant salary while it taught them how to code and then, as you point out, how to be a productive engineer at Google, at the end of the process those people would likely have very marketable skills to either continue moving up the ladder at Google or elsewhere. Of course this would cost Google something -- not only in the salaries while people were learning and not contributing but also in the salaries of people that were training and mentoring them. But Google is planning on spending a billion dollars anyway, so here's another way they could do that. A way that I think would be more effective.

--

Congrats on getting what sounds like a great job.


My company, LinkedIn, has started an in-company program called Reach to do exactly that. https://careers.linkedin.com/reach

Rather than hire a traditional Stanford CS graduate, we find people who have self-taught to some basic extent and give them the chance to work under an apprenticeship. We started with a class of about 25 with decent results and will tune it more based on what we learn.


> Early on because there were no college programs to do so and later on because programs didn't graduate nearly enough people. And no one was self-taught because there were no home computers.

This made sense back then because most of the tools, languages, and techniques were either proprietary or niche. IBM wouldn't hire an AT&T employee the way FB hires a Google employee today, and back then people saw jobs as "until retirement", with a very different culture.

If you invested a lot of money training someone, you could expect them to stick around for a while and "pay back" your investment.

Today that isn't worth it. You could hire someone out of high school, spend a lot of resources training them just to have them poached by another company right after they are "ready".

That time is gone.


It's true that it doesn't make as much sense from a business perspective as it once did to have a deep training pipeline. If I were suggesting that Google switch over to such a model for all their hiring that would be all there was to say on the matter.

But here Google is looking to donate $1 billion as a charitable contribution to address a specific social problem. I'm suggesting an alternative way of "spending" that money. From that standpoint the fact that such a training program is sub-optimal in a bottom line sort of way isn't as important -- indeed it is kind of the point. The difference between the optimal business focused process and such a hiring and training program would be in lieu of the donation that Google is planning on making.

It's my contention that this way of spending money would yield more benefits than the current plan. I could well be wrong, but just criticizing the idea from a business efficiency angle misses the point.


> At the dawn of the computer age, AT&T, IBM, and other companies taught people how to program from scratch. Early on because there were no college programs to do so and later on because programs didn't graduate nearly enough people. And no one was self-taught because there were no home computers. I don't see why Google couldn't do that today.

That does make sense and I think that would be super cool if companies could do that today.

I admit I don't know much about the history, but isn't building software today more complicated, though not necessarily harder, than it was before?

My impression is that before, we would just build programs that ran on computers. So in that sense, if you knew how to program and had a compiler to use, that was sufficient to do what companies wanted. However, now we have people building other things like services, which requires additional complexity and tooling on top of what was previously necessary.

The point I'm trying to make is that I suspect nowadays there's more prerequisite knowledge for most jobs than before.

> But Google is planning on spending a billion dollars anyway, so here's another way they could do that. A way that I think would be more effective.

That's a really good point. I imagine even if it takes a very long time to train people, with a billion dollars you have the time to do that.

The optimistic side of me wants to say it's because handling the logistics for that would be a nightmare and it's really outside of our core expertise.

The realistic side of me thinks that it probably is too expensive to give the same benefits to those employees that we give to current full time employees.

The pessimistic side of me also thinks that there's a certain level of prestige associated with working at Google that has been, at least to some level, successfully marketed both inwards and outwards, and hiring people who don't meet whatever "bar" would undermine that.

> Congrats on getting what sounds like a great job.

Thanks!


Training people on-the-job from scratch to be professional engineers is extreme.

A more reasonable path is HelpDesk / Hardware Tech -> operations software / SysOp -> software engineering


But Google and maybe a few other really big tech companies are exceptions since they are so big they have their own internal tooling for everything. In general though, it's probably more useful teaching people the generic solutions that are used in most medium sized companies, rather than Google's specialized tooling. And generally a lot of the workflows are transferable.


Another major glaring problem that no one wants to admit is that lots of tech work requires (at the very least) a college-level IQ, which excludes the majority of the population. And the lower-IQ jobs are quickly being automated away.


The vast, vast majority of tech work doesn't require any exceptional intelligence. It just requires a system of thinking, one which can be...trained.


... and the ability to be trained in arbitrary skills is IQ. Higher IQ means it's easier to train yourself to use a new system of thinking.

Like, being a soldier doesn't require any sort of exceptional intelligence either. It doesn't even require average intelligence. What it requires is an IQ of 85; below that point, the US Army cannot effectively train you to become a soldier. Despite having every incentive in the world, the US Army still rejects about 15% of the population.


15% of the population doesn't apply to join the Army, and those who do self-select. Using Army rejection rates to determine the IQ level of the population isn't a good method.


It's the definition of IQ. 15 points is 1 standard deviation, so about 15% of the population has an IQ of 85 or below by definition.


Just following up to add that these two 15's are unrelated, in case any readers get the wrong idea.

The area under the normal distribution curve from minus infinity to -1 (one standard deviation left of centre) is about 0.1586 or about 15-16%.

The 15 points = 1 stddev property of IQ is arbitrarily established.
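
A quick check of that area using the standard normal CDF (standard library only; this just restates the figures above):

    import math

    def phi(x: float) -> float:
        # Cumulative distribution function of the standard normal.
        return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

    print(phi(-1.0))             # ~0.1587: fraction at or below one sigma under the mean
    print(phi((85 - 100) / 15))  # the same value expressed on the IQ scale (IQ <= 85)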


IQ just means potential, not capability. Given enough time, I believe most people can be trained to do almost any job.

I don't think our brains suddenly evolved in 200 years, taking us from illiterate peasants to software engineers. We just have more school time these days.


Since it is empirically obvious that 200 years ago the world was not entirely populated by illiterate peasants, perhaps your model of the population distribution of IQ and its causes is inaccurate.


Not entirely, but the vast majority were. It's empirically obvious writing existed 200 years ago; therefore someone somewhere knew how to write.

What do you think people did before the industrial revolution?


A lot.


> IQ just means potential, not capability.

Ok, you can say that.

> Given enough time, I believe most people can be trained to do almost any job.

That's a statement about potential, not capability.


It probably helps, but saying it's necessary seems a little far-fetched to me. Also, since pretty much everyone I knew growing up went to college, I don't think "college-level" is really meaningful for much except wealth nowadays.


That may say as much about you as it does about the population at large; right now about a third of adults in the US have a bachelor's degree [1].

1: http://thehill.com/homenews/state-watch/326995-census-more-a...


I would guess the vast majority of people in my high school were in the top fourth of income in America (including my family; we were well off but not super rich, since my dad was in the military and then a commercial pilot, and my mom was a teacher). I'm sure other people have different experiences, but waaaayyyy more than the national average from my high school went to college, and I can tell you that while it was a decent school, it was not filled with geniuses. It's why I think a large part of the reason people go to college is cultural, and that for many people college ends up being more a symbol of status than an intellectual achievement.


That's not true. It's a common complaint among engineers that developing line-of-business or CRUD applications is tedious and not challenging enough as it doesn't require a lot of creative thinking.

Truth of the matter is that most tech work doesn't involve AI, machine learning, self-driving cars or augmented reality but down-to-earth business applications. Developing those requires abstract thinking, empathy and problem-solving skills but it doesn't necessarily require a college-level IQ.

In fact, a high IQ could even be harmful in that situation because, apart from getting bored quickly, highly intelligent people can display a tendency to overthink problems (which is probably how many notorious enterprise frameworks came about...).


You've clearly never met a person who is unable to place a hanger inside of a shirt and hang it on a rack, or a person who can't sort 5 single-digit numbers mentally. What you are describing are 120+ IQ problems.


You make this sound like someone with an IQ less than 120 is mentally impaired.

An IQ of 100 is defined as the median IQ level for a population. The range between 90 and 130 covers the whole gamut of human intelligence that's commonly considered normal.

Much of the day-to-day work in IT often doesn't need original thinking but merely skillful application of known methods and patterns, which in turn doesn't require a college education.


Find one successful programmer at Google/Apple/Facebook/Cisco/etc. with an IQ of 90. Remembering basic equations, keeping long sequences of functions sorted in your head, even understanding FTP/Git/CL instructions requires this level of cognitive ability. When you deal with people who can't hang shirts, you'll start to understand this. There are just under 200 million Americans with IQs between "can't hang shirts" and "productive programmer."


> Find one successful programmer at Google/Apple/Facebook/Cisco/

Point proven, eh? I'm not talking about these people (though I surmise there are quite a few in these companies as well who don't have that level of intelligence and simply tag along ...) but about the vast majority of IT workers working on some supposedly 'boring' database application.

Google and Apple in particular have been known for hiring highly qualified engineers only to then have them maintain some run-of-the-mill administrative software they're vastly overqualified to work on.


>doesn't require a lot of creative thinking

It doesn't, but you still need "higher level" developers to cover security and concurrent/parallel/distributed problems and maybe architecture.

It's not that someone can't also learn that stuff, but exposing a product publicly without some experience in those areas is asking for trouble.

IMO, software is resisting the division of labor by collapsing roles into "full-stack devs" or "DevOps engineers".


I agree there have to be different skill levels, but I think that distinction has to occur within denominations like "full stack" and "DevOps". These labels simply determine what you do, not how skilled you are at it.

In my opinion, full stack and DevOps are the normal, sane way of approaching software development. Artificially dividing up roles into labels such as 'front-end', 'back-end', 'database programmer', and 'system administrator' only leads to more silos, less collaboration, and sometimes even downright hostility between those roles.


IQ is standardized to a population distribution. As such someone with an IQ of 120 is not 50% smarter than someone with an IQ of 80; rather, they simply have a 50% higher "score". But there is no standard zero-point on the IQ scale; the "IQ of a rock" could be anywhere from -5000 to +50 or so. That doesn't mean IQ is meaningless -- it correlates with a variety of positive outcomes -- but that low-IQ people are not "proportionally" less intelligent as measured by the IQ. Rather the IQ scale is set up to have a seemingly normal distribution relative to the observed distribution of fluid reasoning in the human population. The standard deviation of IQs is arbitrarily set at 15, and the mean is set at 100. We could easily reform the IQ scale to have a mean of 1000 and a sigma of 1, or a mean of zero and a sigma of 100. The number, itself, is meaningless without context.

There is no evidence that a person with low IQ cannot accomplish the tasks involved in e.g. software engineering, although they would probably do so more slowly than a person with high IQ. From Wiki:

"The prevailing view among academics is that it is largely through the quicker acquisition of job-relevant knowledge that higher IQ mediates job performance. "

Also, going to college has no effect on IQ, so "college-level IQ" is meaningless gobbledygook.
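
The rescaling point is just an affine change of units; a toy illustration (the function and numbers are mine, only restating the examples above):

    # Same ranking, different numbers: IQ rescaling is affine.
    def rescale(iq: float, new_mean: float, new_sigma: float) -> float:
        # Map a score from the (mean=100, sigma=15) IQ scale onto a new scale.
        return new_mean + (iq - 100.0) / 15.0 * new_sigma

    print(rescale(115, 1000, 1))  # 1001.0 on a mean-1000, sigma-1 scale
    print(rescale(115, 0, 100))   # 100.0 on a mean-0, sigma-100 scale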


Re: college-level IQ, I think they were referring to the level of IQ required to successfully complete a college degree, i.e. keep up with courses (which requires quick acquisition of knowledge), rather than the level of IQ 'acquired' after completing a college degree.


What exactly is a college-level IQ? Does everybody who goes to college have a college-level IQ?


A college-level IQ these days means something low like 100 -- you're probably thinking of a more stringent standard.


An IQ of 100 is supposed to represent the mean score of the general population.

The average IQ of someone with a college degree is a good bit higher than that. From a quick Google search it looks like it's around 115, which is a standard deviation higher than the general population.


I was referring to more of a minimum than an average.


I was under the impression that among people with undergraduate degrees it was closer to 110-115.


Isn't half the population below 100? How can you consider that low?


That's lower than half the population.


Because it's terrible.

Pretty much the same way we can consider the median personal income ($31,100 in 2016 - https://fred.stlouisfed.org/series/MEPAINUSA646N ) "low". Because it's low.


The vast majority of the population already has a "college-level IQ" for whatever definition of that you like. Tech work is not a geniuses-only game.


Maybe, or maybe it just requires a specific way of thinking about things, i.e. problem solving and time/task management? As well as the ability to learn the task, of course.


What's the difference between "problem solving" and "IQ" ? They seem like almost the same thing.


"IQ" is a scalar semi-objectice well-defined metric that predicts some things about the much more complicated and less-well-defined concept of problem solving.


Problem-solving: the ability to deconstruct a problem into atomic pieces and construct solutions

IQ: How fast you do it.

tl;dr: Problem-solving is software, IQ is hardware


"IQ" measures something called "G", which is pattern-recognition ability (coincidentally the cutting edge of today's ML neural nets). The scientific theory of IQ is that that if you are good at pattern recognition, then you will be good at a wide range of cognitive tasks.


What is it about tech work that requires such a high intelligence, as you state? Because I'm a tech employee and would argue that your claim is completely unfounded. There's nothing special about learning how to code—it's a very trainable skill. Takes a lot of time and dedication, sure, but trainable nonetheless.


You can't even teach calculus to some people to the point where they can pass a class. Some people in programming classes can't even submit homework that compiles.


Calculus and programming are in two different leagues of complexity, and most programming doesn't require a knowledge of calculus.

Like any other skill, programming comes more easily to some than others, but with the right approach can be acquired by anyone.


Calculus classes are more passable than programming classes, and the claim you're making is just pulled out of your hat, framed in such a way that you can never be proven wrong, because no matter what, you can just say the right approach wasn't taken.


What criteria determine who should get a position, then? What salary should those positions command? Sounds pretty handwavy.

Edit: responders to this comment seem to miss that the parent comment is suggesting Google hire new individuals and train them, not find talent in its existing workforce and train there. That's the unrealistic part: creating a secondary application process for individuals without the skills, when they already reject a ridiculous number of people who have many of them.


On-the-job training was pretty standard in the tech industry in the past. I worked with a guy at an HP (oldest high-tech co. in CA) spinoff who was the head embedded software architect and had worked his way up from the machine shop in the '70s. He started out manning a drill press and learned to program when they started using NC (numerical control) machine tools.

Do you think Google lets anyone work their way up from front-desk admin or barista?


This is basically how my mother was trained as a programmer. She got a job, was literally handed a stack of books and reference material and started figuring shit out.

She changed careers before I can really remember her at that job, but I too am self-taught in this field (but I trained myself at home).

The whole idea of companies not investing in their people is relatively new.


To be fair, employees are no longer particularly faithful to employers-- after all, it's considered completely normal to spend only 2-5 years at a job, then move to the next one. This definitely complicates the benefits of on-the-job training: why waste 3-12 months training somebody to basic competence when they might leave a few months after that? Some of this is obviously due to employees fighting back against stagnant wages/lack of promotions, but it makes me wonder: who started the faithlessness first, employers or employees?


I don't know about who started it - it's been back and forth forever - but in recent history a big epochal change was the hostile takeover movement of the '80s. Many companies, including those that had had a reputation for loyalty to their employees, were bought out by corporate raiders who often took actions that led to huge numbers of layoffs.


Yes, actually Google does have programs to let people work their way up on the inside. Barista is a bit extreme and not really internal since they're vendors, but admins, help desk, etc. do have avenues to move into other roles.


That's good to hear. Carly Fiorina worked her way up from admin to CEO. Though that is a bad example, because it didn't work out too well for HP.

I don't think being a barista is much different than my example, though. Maybe they should make them internal hires.


>"What criteria determines who should get a position then?

A basic aptitude for the role or transferable skills maybe?

>"What salary should those positions command?"

Whatever the company wants to pay. Why does that matter at all? Employment is still an agreement between the two parties.

>"Sounds pretty handwavy."

Not at all, "on the job training" has a history stretching from Medieval Ages and the guild system, to the industrial revolution, powering the war-time workforce etc:

https://msu.edu/~sleightd/trainhst.html


I don't have solid answers, but that's exactly how my father-in-law got into the computer field back in the early 60s. He was working an administrative job at a bank, someone IDed him as having potential, and the bank put him through internal training in programming and computer science-y stuff. It worked out, he spent some time programming, and eventually moved into management.


Who determines who should get a training slot? What kind of subsistence should that training command?

The commitment of capital to tangible jobs that lead toward OJT is not handwavy at all. That said, I bet the tax breaks from the schooling are substantial.


I'm thankful I found a company like that after graduating - seemingly the only such specimen in America today. It's a giant international consultancy with a questionable reputation online, but it was all I could get that paid anything remotely respectable, and they even do paid training. It seems to be one of a rapidly dwindling handful of paths for new grads who are discovering that the online hype - "it's easy to get a job in CS if you do [some combination of good side projects, internships, good GPA, etc.]" - is an outright falsehood at this point. It's very hard unless an opportunity approaches you, which is what eventually happened to me; it turned out all that applying and trying was a waste of time.

In my case I had no choice but to do a draining unpaid internship in college, and their return offer was only $15/hr. Nowhere I applied to for months (likely hundreds of applications) would take me - even though I did all the things on that list and continually put my resume through those resume threads on reddit - except for one offer of 30k. $15/hr and 30k are abysmal insults given the time, money, and effort put into programming since middle school, getting a BS in CS, volunteering on big online projects since high school, etc. I'm not in the middle of nowhere either; this was NY/NJ.

These experiences signal to me that the tech field is highly oversaturated for new grads, and it's a matter of time before people realize the bubble popped. Someone in my position should not be offered what amounts to poverty wages once you take into account student loans, the high price of car insurance for a driver of my age range, etc. Programs like what Google is doing will just make this problem worse, as companies feel further emboldened to require ever more experience of junior programmers while offering them salaries ever closer to minimum wage.


Was this Accenture? They used to have a 3-week java training program for those entering as new grads, but I believe that's been reduced in scope now.


Tata. Is Accenture one of those companies with clauses requiring you to pay a fine if you quit within a few years, like Revature?


Yup, companies like Accenture have been doing new hire "training" programs for well over a decade. It's a great way for young college grads who didn't go to one of the elite schools that the Googles of SV recruit from to get over the "Jr. Engineer with 3 years experience" hurdle.


Yeah, it's frustrating how the narrative is that where you got your degree doesn't matter when it absolutely does.


> continually put my resume through those resume threads on reddit

Out of curiosity, and certainly not to offend... you _did_ place your resume elsewhere as well, right?


I meant resume critique/improvement threads. I put the resume on sites like Monster as well as sending it to specific postings, usually customized for that posting.


Oh, good, that makes a lot more sense. Sorry for my confusion.


I doubt that hiring average talent and training it is best for Google's business. They believe it is better to try hiring only the very best, even if employees do not match the specific position they ostensibly interviewed for (e.g., train them for new skills or find them new positions, but only after a very high entry bar).

I think retraining programs primarily help sharp, energetic folks who somehow got into a bad state (useless major, bad school, rough childhood, etc.); maybe even social connections and stability are more important than the skills they end up getting. However, those programs are IMO worthless for folks who lost a stable job and hope that Yet Another Certification Class will put them into a pipeline for a similar one. I am not sure how to help the second type.


Google hasn't hired "the very best" for at least five to seven years now; there are plenty of "average talent" employees who do their job adequately.

Mind you they do hire some of the best, but the idea that google engineers are “the best” is a myth.


I never said Google engineers are "the best". I said that this is what Google tries to do. IMO it did this pretty well 10 years ago, but as a company gets big this gets much harder.

I've heard, though, that they are still trying to do this, even while aware of the lower success ratio.


These companies do on-the-job training. The problem is that a college degree in computer science or software engineering doesn't actually prepare you to work at these companies. It makes you effectively literate, so that you are feasibly trainable.


Usually I'd agree that on-the-job training is more beneficial than untethered training programs.

However, just today I listened to the most recent Freakonomics Radio episode, which was about how Germany managed to become the economic powerhouse it is today. Most economists that were asked agreed that an essential ingredient of Germany's economic success is its unique concept of vocational training, which combines on-the-job training with school education and general - as opposed to employer-specific - job training.

Perhaps a system that combines both on-the-job training and more formal training programs would be beneficial in this case as well.


Yeah, it's also helpful that German tertiary education is free.


I suggested exactly that 2 months ago [1]; the idea was not popular here.

[1]: https://news.ycombinator.com/item?id=15003048


I am glad to see Google doing something like this.

All of humanity growing together towards a brighter future for everyone is truly our highest calling.


But nobody wants to address the mounting student loan problem!!

Degrees don't come for free, technical graduates don't come for free; students have to go to college for that, and in the US you need a LOT of money for that.

This is a chicken-and-egg problem where nobody wants to address the real issue and everyone goes around offering superficial solutions.


Student loans are only a problem for students doing one or more degrees at low-2nd- and 3rd-tier schools with few job-market prospects.

The rate of student loan defaults is actually inversely proportional to the amount owed.


I'm not saying that student loans aren't getting repaid; I'm saying that you need to take out a loan to study. That's the problem.


What do they mean by high tech, specifically? Do Android dev, web dev, etc. qualify as high-tech jobs?


I think not only do we need to be thinking about getting people "prepared" with new skills, but also about smooth lateral movement across industries.

In a lot of cases these are probably viewed as the same thing, but, for example, I would ask: when was the last time a Senior Java Developer was a candidate for a Senior Front-End Web Developer position?

I think the future is going to be a lot less about being hired for "jobs" with "companies". Instead it's going to be substantially more about "projects" being done by "groups / organizations". The groups / organizations being assembled / disassembled with high frequency.


Start with basic observations. Tool acquisition is not tool mastery. Tool mastery does not mean jobs for or at Google. Janitors are not getting IPO payouts for lives of leisure. Cold war coastal higher education, city living and corporate trading prowess have drained many interior communities of their best brains throughout decades of deindustrialization and deskilling.

Some better nerds here can hardly imagine perfectly smart people who cannot yet touch type or turn a spreadsheet into a group calendar. Our miraculous, simple decision-support tools are still opaque to majorities of Americans. Only Americans far outside Google will create the value that creates jobs. That takes planning for any possible sweat-equity or financial investment. We have generations of people to train with tools. The boy-genius prizes for ever-new tooling are not really separate concerns. Cultivating and harvesting new boy geniuses from the field is expensive. They don't exactly grow on trees.

Google, like Apple or Microsoft, has had to discover and rediscover its own relevance. They cultivate their markets now with intensive growth. This is a good move.


Everyone hates a cynic so bring on the downvotes:

I'm going to warn everyone of what's coming.

Software engineer jobs will be blue collar, $40-$60k a year jobs, by 2030.

The HUGE push from government, and private business, to fill the PERCEIVED lack of engineers, will come to fruition around that time.

Make no mistake about it - there is NOT a lack of skilled engineers right now. There is a disinterest among businesses in paying higher and higher salaries.

If you are a SWE right now, save your money, and invest your time into improving YOURSELF. Have a backup plan, because I promise you, the good times are coming to an end sooner than you think.


I disagree; I see people fail at learning programming over and over again. Even three years into a computer engineering degree, they struggle with how a for loop works. (No, this is not a joke, and this is in Europe.)

Learning to program takes a lot of dedication and focus, which a lot of people have no interest in; it is just too much work and too difficult. Every student who pursues an engineering degree here has to take an introductory coding class. And everyone, except the few who enjoy computer science, says that class was by far the hardest one to pass.

So I believe the opposite will happen. The demand for software developers will grow beyond our imagination.


I just can't share that optimism. I taught myself to code using YouTube videos and books. There are 14-year-olds on YouTube coding iOS video games in a matter of weeks.

Combine that with the fact that big companies (like Google) release SDKs that make application development trivial, and you've got a recipe for the skill cap lowering along with wages.


Do those apps that 14-year-olds work on require an in-depth understanding of how threads work? Of how different hardware components work together at scale? Most people with CS degrees never acquire the skillset to do such work, so why should I believe that those trained via job-training programs will be able to? I'd be more afraid of potential retraining of other highly skilled workers who want to switch careers than anything else.

The skillset of building a personal website, or even a website for your small business should be something anyone can do, and will in no way impact the overall salary of software engineers in the future.

Major companies will always need people who understand the computational sciences, as scale and complexity follow some of the same rules as entropy, in that they are always increasing.

Additionally, the reason for high salaries is not a lack of engineers, it is that top companies have decided that it is in their best interest to outbid each other for top talent. In parts of the midwest, where there is less competition, engineers are already paid 50k a year.


>Do those apps that 14 year olds work on require an in depth understanding of how threads work?

I am constantly surprised that other CS students in my classes have zero idea how anything beyond the particular language we're learning works. Even in higher-level classes that require a fair amount of proficiency with the language, if you asked what the length of the pointer they just properly used was, all you would get is blank stares.
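For anyone unsure what answer was expected there: a pointer's size is fixed by the platform, typically 8 bytes on a 64-bit system. A quick way to check, sketched in Python via ctypes (the result is platform-dependent):

    import ctypes

    # Size in bytes of a generic pointer (void *) on this machine.
    # Typically prints 8 on a 64-bit system, 4 on a 32-bit one.
    print(ctypes.sizeof(ctypes.c_void_p))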


> "how anything beyond the particular language we're learning works. "

and

> "if you asked what the length of the pointer they just properly used was all you would get is blank stares"

So what is it?

It's a computer science program, not learn a dozen language's quirks and implementation detail that you use for a single class to understand some concept.


It's not that bad in the midwest. Salaries in the 70-80K range are common enough.


Maybe you are both right?

The new class of programmers that governments and Google want to train up from your average worker will not be as skilled or intelligent as the current generation of programmers. But, there are still opportunities for them in software development. They will take jobs that pay 40-60k a year, while the higher skilled and more intelligent programmers will be architects or leads who command much higher salaries.


Having more code technicians is great. I'd love to have more people to help maintain shit, write documentation, do small bug fixes, etc.

There will soon be a differentiation between engineers and coders. Hooking up a few different APIs and doing some JS and HTML does not count as engineering.


Making it easier to do some things we do with code today doesn’t mean there won’t be new hard problems to solve tomorrow.


I've not only seen the same; I've seen people who actually work as developers struggle with basic for loops. I used to get worried when people rang alarm bells about jobs going overseas or an influx of developers into the market, but a large portion of the population seems to either be unable to wrap their head around programming or just not find it that interesting.


"PERCEIVED lack"? Doesn't the fact that salaries are sky rocketing essentially disprove your point? Why are the salaries getting pushed higher and higher? Because demand for SWEs is outstripping supply.

This to me looks like the same kind of privileged outlook that other professional guilds, like the AMA, desire. Do you want cheaper healthcare, or doctor compensation that keeps going up? Hey, letting nurse practitioners take on some of the load is "flooding the market with n00bs".

This just seems like protectionism by another name.

Yes, the good times for software engineering will come to an end. I'm a software engineer, this will affect me. But the question is, do I have a natural god given right to have a ballooning salary every year, while fighting attempts to increase labor supply that might cut that growth rate?


> Doesn't the fact that salaries are sky rocketing essentially disprove your point?

What's your definition of "skyrocketing"? Outside of maybe a dozen high-prestige companies located in a couple of specific areas, I don't see salaries skyrocketing. Mine hasn't; I'm not saying I'm not well compensated, just not as overpaid or in demand as some people make it sound.

Further, my experience with the aforementioned high-prestige companies is that they are picky as hell. That tells me that either there is no shortage of talent for them, or they are choosy beggars.


While it may appear that salaries are rising quickly, in reality, for the average tech employee, they have been standing still for quite a while.

In 2003, the average salary in tech was $69,400.

By 2017, the average salary had risen to $92,081.[1]

While that might seem like a pretty large pay rise, after adjusting for inflation you come to a clear conclusion: over those 14 years, salaries have stood basically still. You can point to people getting $120k+ as first-year employees at Google, but salaries like that are massive outliers. The average developer in America earns much less.

[1] Dice Tech Salary Survey 2017 https://marketing.dice.com/pdf/Dice_TechSalarySurvey_TechPro...
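A quick back-of-the-envelope check of that inflation claim (the CPI-U annual averages below are approximate figures I'm assuming, not numbers from the survey):

    # Adjust the 2003 average tech salary into 2017 dollars.
    # CPI-U annual averages (approximate): 2003 ~ 184.0, 2017 ~ 245.1.
    cpi_2003, cpi_2017 = 184.0, 245.1
    salary_2003 = 69_400
    adjusted = salary_2003 * cpi_2017 / cpi_2003
    print(round(adjusted))  # ~92,445 -- essentially the 2017 average of $92,081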


Storage and networking != SWE, and these surveys cover much wider ground than the particular tech-hub job markets; they also don't adjust for purchasing power parity or cost of living.


> Yes, the good times for software engineering will come to an end. I'm a software engineer, this will affect me. But the question is, do I have a natural god given right to have a ballooning salary every year, while fighting attempts to increase labor supply that might cut that growth rate?

Does our owner class have a natural god-given right to a 6% return on their investment every year, for doing nothing?

They are certainly spending their energy on fighting attempts to spread the economic pie around. We need solidarity, not shaming people for protecting their means to make a living.


> Does our owner class have a natural god-given right to a 6% return on their investment every year, for doing nothing?

You, too, can join the "owner class" by opening an online trading account and buying stocks. Commissions are often under $10 per trade.


Since I wasn't born into money and didn't win the lottery, I won't be able to live off that 6% return for another two decades.

Either way, even if I eventually can, the guy who makes my morning coffee can't, and never will be able to.


> for another two decades

I.e., start investing and in 20 years you'll be financially independent. Sounds like a good deal to me for someone living in the US.

> the guy who makes my morning coffee can't

I talked with a guy once who told me he "can't". He was driving a new car, and the payments, rent, etc., added up to more than his income. I suggested he sell the car, buy a car he can afford to pay cash for, and start investing.

He partially did take my advice. He sold the car, bought one he could pay cash for, and then blew the extra income on some other luxuries. Of course, then he still was in "can't" territory.

My current car I bought used 25 years ago and still drive every day. It costs me practically nothing.


You're forgetting the part about having enough money to make meaningful investments.


If you'd invested $1000 in Boeing in the early 80's, it'd be worth $200,000 today. Sounds meaningful to me.
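Taking that claim at face value, you can back out the implied growth rate (the ~35-year holding period is my assumption for "early 80's to today"):

    # Implied compound annual growth rate: $1,000 -> $200,000 over ~35 years.
    start, end, years = 1_000, 200_000, 35
    cagr = (end / start) ** (1 / years) - 1
    print(f"{cagr:.1%}")  # ~16.3% per year, far above broad-market averages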


And if I was a fetus in the early 80s? Or if I got sick and needed to sell that stock early to pay for treatment?

Your point still is predicated on the "already having money" part.


Far and away, most people are healthy from 20 to 60 and are not sidelined by disastrous health problems.

The point is invest early in your working life, and you'll have the needed results when you're ready to retire.

$1,000 is not what people would consider "having money"; $200,000 is. If you have a car, it surely cost far more than $1,000.


A car is a necessity in most of America for getting around. Most people likely do not have a spare $1,000 lying around. And knowing what to invest in is another can of worms in itself.

And depending on who you're talking to, having a spare $1,000 to risk on an investment is "having money".


> Most people likely do not have a spare $1000 laying around.

True enough. Because they spend it, like the person who bought a new car. How much do they spend on beer/cigarettes/weed in a year?

> And knowing what to invest in is another can of worms in itself.

That's true. Apparently investing is more than "doing nothing", and one is taking a risk. But it is within the means of the vast majority of adults.


> And knowing what to invest in is another can of worms in itself.

For the record, knowing what to invest in is easy: a broad-based, low-cost index fund. The research in "A Random Walk Down Wall Street" statistically shows that actively picking specific companies has a less than 50% success rate and often comes with higher fees. Most public libraries have a copy of "A Random Walk", and its conclusions are widely shared.


What if I invested in Blockbuster instead of Boeing? How much would my $1000 be worth today?


If you're not willing to invest because it's risky, that's fair, but if you're also going to complain that those who do are getting something for nothing, then it isn't so fair.


I wasn't the original commenter, but I don't think that by "owner class" they meant passive minority shareholders with $200K portfolios or people with some money in their 401(k)s.

While some of the richest are certainly entrepreneurs, the largest share of the Forbes 400 are people who got there with OPM - other people's money - like hedge fund managers. Another chunk are heirs, like the Waltons or Kochs. Did the Waltons get something for nothing, without any risk? I would say yes, they did.


Check this on Sam Walton before assuming he got something for nothing:

https://en.wikipedia.org/wiki/Sam_Walton

More accurately he made something from nothing.

You and anyone else could have made a little something, too, if you'd bought Walmart stock. Lots of people did.

Also, you could be a millionaire by retirement if you adhered to a regular investment program rather than only one investment of $1,000.
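As a rough sketch of what a regular investment program can compound to (the $5,000/year contribution and 7% return are illustrative assumptions, not figures from this thread):

    # Future value of contributing $5,000/year for 40 years at a 7% annual
    # return, with each contribution made at the start of the year.
    payment, rate, years = 5_000, 0.07, 40
    balance = 0.0
    for _ in range(years):
        balance = (balance + payment) * (1 + rate)
    print(round(balance))  # ~1,068,000 -- a millionaire by retirement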


Sam Walton is dead; I'm not talking about him. I'm talking about his children - they're in the top 20 richest people, or something like that. Each one.


And if I didn't have the money to buy that stock?

Also, the Walton kids most assuredly got something for nothing.


> And if I didn't have the money to buy that stock?

But you do.

> the Walton kids most assuredly got something for nothing.

85% of American millionaires are self-made. See "The Millionaire Next Door" by Stanley.



Stop paying $10 and use Robinhood, which offers free trades.

It has limitations, so you'd want a traditional broker as well.


Sure. Solidarity to push back against the owner class, making it take a smaller piece of the pie so the rest of the workers underneath - whether long-time veterans or recent entrants - may share it. In this light, new workers shouldn't be seen as a threat, but as an opportunity to create a bigger force to band together against the execs.


I continue to be thoroughly surprised by the people who are accusing me of things like "protectionism".

If you re-read my original post you will notice that I did NOT advocate for artificially restricting worker supply, or any kind of protectionism.

I merely warned that what we're seeing in tech WILL bring an end to the "high" salaries, and that those who wish to maintain their current state should consider PERSONAL GROWTH and advancing their skillset as a means of protection.

We can debate all day about whether it's a "perceived" lack or a real lack. I wholeheartedly disagree that salaries are, as you put it, "skyrocketing", especially once you account for cost of living in the areas where the "skyrocketing" is happening.

And oh yes, how dare a doctor who spent upwards of $500,000 on medical training, and devoted years to internships and residency, be worried about a lowered skill cap or the regulatory protections (which they counted on) for entering their profession! How dare they!

Please. Of all the examples of protectionism you could have given, you chose perhaps the most acceptable and understandable one. Yes, people care about their livelihood and about retaining the fruits of the efforts and investments they have made; so would you.


The difference is people saying "I got mine, and I'm going to vote against and oppose anything that increases opportunities for other people to enter my industry and compete with me."

To me, that's unfair and wrong.


"Yes, the good times for software engineering will come to an end. I'm a software engineer, this will affect me. But the question is, do I have a natural god given right to have a ballooning salary every year, while fighting attempts to increase labor supply that might cut that growth rate?"

Does your employer have a natural, God-given right to cheap labor?


Do I have a right not to face competition from other people in my industry underbidding me? My employer's rights are not the issue; the issue is whether I have a right to restrain others from entering the market by opposing training and educational opportunities.


Your employer's rights are very much the issue.


Are we talking about a minimum wage for SWEs now? I mean what, specifically, are you suggesting? That employers can't hire people who bid lower?

Salary in the tech industry is a function of demand. Right now, for example, people with expertise in machine learning are in high demand. You can graduate with an MS or PhD in machine learning, form a startup with no product whatsoever, and get acqui-hired just because of the insane bidding war going on right now.

When I moved to SV in the 90s, at the height of the dot-com boom, kids fresh out of college were getting insane signing bonuses worth $10k or more. I knew people who would switch jobs every few months, just to collect freebies.

Perhaps it is different elsewhere, but here, tech workers are very highly privileged. Seeing people complain about a desk job that pays $100k+, with weeks of vacation, great healthcare and benefits, and flexible working hours - compared to the utter suffering happening in the working class across this country - just looks tone deaf to me.

Before we worry about the poor suffering tech workers in their gentrified neighborhoods and swanky cafes, facing stagnating nominal wages, how about we consider the masses of people who missed out on the tech utopia for the upper 10% - people who would like to move up the value chain, and who have a right to compete for your job.

My mother worked as a cashier at Safeway before she died. Comparing my hourly wage as a SWE to hers is kind of obscene, and for the people I grew up with, seeing tech-bros complain about their current threatened position has got to look like people completely ignorant of how much privilege they have.


>Why are the salaries getting pushed higher and higher?

Because Bay Area cost of living is getting pushed higher and higher. The median programmer already has roommates, a 2-hour roundtrip commute, and no hope of family-sized housing or (gasp) ownership; if standard of living falls any further, we'll all go do something else.

Obviously many people have it worse, but we have alternatives.


Counterpoint:

The number of programmers needed is rising across the globe. Every country is going to try to retain its IT talent. In a cut-throat, globally competitive world, you can't afford to be the country that lets its best minds leave for greener pastures.

Also, programming isn't the sort of job where you can fake your way through. If new people are trained up to enter the field, they'll need the skills to match. As a share of population, the number of people studying computer science has been falling, not rising. These bootcamps and training programs are trying to bridge the skills gap, and so far they have not succeeded. If anything there's going to be a skills shortage, with a corresponding rise in pay.

Sure, employers always try to minimize pay; IT is not special, and this is the case for all industries. Labor prices are set through supply and demand, and programmers' wages are no different. Can you give a single example of an industry that used to have high wages but now has low wages? It would be exceedingly unlikely for IT to behave unlike every other industry.


> Also, programming isn't the sort of job where you can fake your way through.

Not sure why you think programming would be special in that regard. In a lot of companies you can keep doing your job badly for a really long time, as long as you have a good bond with your higher-ups. This goes for programming the same as for sales or any other profession.


Yep. You need to move often enough to stay ahead of the technical debt you run up as you string your spaghetti code together. Or transition into a role where you talk about code much more than read or write it.


That’s true, but at some point someone has to actually do the work. The company will need to hire a certain level of skill to remain competitive.


I'll believe there's a shortage or urgent demand when I notice a change in hiring standards and practices, and recruiting methods. As a former hiring person, yeah, it's kind of a pain in the ass to have to interview a lot of people before you find one who's any good. But if there was a genuine shortage + this alleged urgent demand, I have to believe that something about the tired, inefficient, and false-positive-allergic process would change in a newsworthy way.


The solution to lower wages created by more workers isn't to clamp down on the supply. The solution is to unionize and to ensure that labor has leverage through collective bargaining. Trying to play gatekeeper against bringing in more workers seems as wrongheaded as arguing against building more homes in a housing crisis.


> Trying to play gatekeeper against bringing in more workers...

Except many unions do exactly this through closed shops and other tactics.


Sure. 19th-century unions in the U.S. were often racist and/or nativist, seeing non-WASP workers as a threat. But surely a union formed by 21st-century tech workers, with over a century of historical wisdom and the modern-day spirit of innovation, should be able to do better, no?


I sincerely cannot tell if you're being sarcastic or not. The first order of business of any tech union will be to stall the H-1B program and then follow that up with strict gatekeeping rules. Just read HN, a supposedly enlightened class of programmers: it's all short-term protectionism from people insecure about their ability.


1. it seems to be more complex than that. i mean, if you look at, say, the SEIU's position on immigration, it does not oppose additional immigration to the US. if anything, the SEIU has taken a pro-immigration position: http://seiu.org/cards/solutions-for-immigration-reform-expla...

2. protectionism from people insecure of their ability describes both tech workers and tech companies. Google, Facebook, Apple et al have increased their political presence in recent years with additional lobbyists. corporations love an unfair playing field as much as anyone.


Surely an industry that prides itself on thinking different can figure out new and innovative solutions to seemingly impossible problems.


I think the point is that they don't want to.


i partially agree. e.g. public sector labor unions in Los Angeles have succeeded at maintaining high wages and great benefits for themselves.

but when you say the solution "isn't to clamp down on the supply" i lose the thread of your argument.

clamping down on the supply is exactly what a public sector union does. union work rules and other union-favorable city regulations exclude or limit non-union workers who might otherwise be hired to carry out various city functions.

working for the city or the department of water and power can be a very good deal for the worker, but city residents pay more in taxes and see less service as a result. merely unionizing the labor force helps some people but hurts others.

indeed, a case can be made that, because police officers are so highly paid and benefitted, they are scarce. and because they are scarce, there's more property crime, and murder, than there would be otherwise. it seems quite plausible that some city residents pay a very high price because of this public sector unionization.

in politics, this leads to a strategy wherein city residents who live in "electorally unimportant" areas (i.e. poor areas with lower voter turnout) receive lower levels of government service than city residents in areas that vote a lot.


"but when you say the solution "isn't to clamp down on the supply" i lose the thread of your argument."

I've not seen an example of this. And I do not consider being a member of the union to be clamping down.


here's an example: in LA a large number of electrical power poles are at end of life and need replacement. but the city is slowed by union rules in efforts to contract this work out to take advantage of the labor force at large:

One key obstacle, officials say, is the contract with DWP’s largest union, IBEW Local 18. The agreement requires that managers negotiate with the International Brotherhood of Electrical Workers before hiring contractors. Initially, the department is supposed to attempt to fill any internal vacant positions, Howard said. The contract also obligates managers to offer IBEW workers overtime to fill some of the need.

IBEW business manager Brian D’Arcy declined to be interviewed for this story.

from: http://www.dailynews.com/2014/05/06/dwp-lagging-behind-on-re...


Unions? How has that worked for every other craft industry in the US?

Forgive my crass reply, but I have only seen unions become weaker and weaker in the U.S. And even when they were "strong", they didn't protect labor interests against increases in labor supply. Globalization has wrecked a number of industries that unions were powerless to defend. The strongest card in any laborer's hand is their scarcity.

Also, nobody (including myself) is advocating for "playing gatekeeper". I merely made a post warning people of what's coming. If you look closely my advice was to the individual - invest in yourself. Don't count on unions, or governments playing gatekeeper, to protect your current salary.


Doctors, actors, pro athletes. There are many types of guilds and professional associations beyond unions.

Furthermore, if tech is about disrupting everything, including the nature of work itself (through on-demand) and even the nature of human relations (through social media), then surely some attempt could be made to better labor relations by inventing a better type of union. It's especially rich to hear "no, it can't happen, it's always failed in the past" comments about labor unions from workers in an industry that's supposedly all about innovation.

I wasn't trying to attack you for your original comment, in any case. You actually offer good advice. But my general sentiment is that we shouldn't try to restrict the labor market - it seems as wrongheaded as trying to fight gentrification by limiting housing construction just because some of the units will be luxury condos instead of affordable housing - and that lowered wages could be fought by a tech union that protects tech workers.


How does flooding the market with more developers who can't pass their interviews help them lower salaries?


If you depress market rates aggressively, you can "increase" pay from 120% to 140% of market rate without giving raises to anyone.
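The arithmetic behind that, with made-up numbers for illustration:

    # The same $120k salary, restated against a falling market rate.
    salary = 120_000
    market_before, market_after = 100_000, 85_700
    print(f"{salary / market_before:.0%}")  # 120% of market before the drop
    print(f"{salary / market_after:.0%}")   # ~140% of market after, no raise given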


>software engineer jobs will be blue collar, $40-$60k a year jobs, by 2030.

There have always been 'blue collar' engineering jobs. When I started in the tech industry I was making $13 an hour writing HTML and a bit of SQL here and there.

There are probably tens of thousands of "Software Engineers" putting together PHP sites, doing front end JS work at an entry level, hacking together some minor software customizations. I think you're right, this will become more prevalent. The world needs a lot more engineers to do this kind of work.

>There is a disinterest among business to pay higher, and higher salaries.

Evidence points to the contrary: salaries have skyrocketed in the last ten years... have you been paying attention?


Agreed. I'll add that it's worth noting that current job openings don't capture potential job openings. There are entire things we might not even be considering as a society simply because there aren't enough people to consider them. I think this is the case in tech. So much of our society lags behind what current technical capability allows; I still have to fax paper forms sometimes. Why? The potential employment might be 10x, maybe even 100x, what it is now, just to keep entire industries even remotely up to date.


> There have always been 'blue collar' engineering jobs.

I think the industry is overdue for consolidating on some language around the various kinds of software jobs. Nurses, doctors, surgeons, anesthesiologists, PAs, medical technicians, orderlies, general practitioners, obstetricians, pediatricians, pharmacists, etc. could all be "medical engineers". But broadly understood names for the different roles clarify expectations for each.


>Software engineer jobs will be blue collar, $40-$60k a year jobs, by 2030.

I'd be fine with that. $60k is a solid living. 1/4th of a pretty nice house. In Wisconsin.

I'd take that salary today - in Wisconsin. But I'm not sure you could even get your own room for that in the Bay Area, where they'll inevitably require you to be to earn it.


I don't think you're wrong; I could see certain kinds of engineering jobs becoming more blue collar. But at the risk of sounding pretentious, there's a bit of an intelligence restriction on certain kinds of engineering work. There will probably always be more of a demand than supply for people who are really good and useful.


Software engineering jobs will never be blue collar, because it isn't a blue-collar job. Sure, anyone can make a tic-tac-toe game, but that's different from software engineering. A good software engineer can effectively do the work of several mediocre ones by making decisions that benefit the company in the long run.


I would agree with this, but it's not going to happen by 2030, that's for sure. It's going to take a lot longer.


Personally, I think you are wrong. If you are an average engineer, you should be very afraid. But if you are exceptional, you will see your comp shoot through the roof in the coming years.


Average developer here, and I am scared. I'm worried that programming is going to end up with a winner-take-everything compensation scheme not unlike music or art, where only the very best make any money at all.
