To put it in perspective, $1 billion is 10,000 times $100,000. Or 1,000 $100,000/year jobs for ten years, before discounting for the time value of money. Instead, big chunks of the money will get siphoned off to administrators and technical instructors and computer manufacturers and lots of other areas that already have plenty of money.
A billion dollars is less than half the annual budget of the University of Nebraska for serving ~50,000 students. Back-of-envelope math turns $1 billion into ~22,000 student-years, which is in the same people-helped ballpark as the 10,000 worker-years, with the difference being that those 10,000 worker-years come with actual jobs at $100,000 a year. And the 10,000 worker-years are offset by the current cost of contracting out the work and the value that work returns to Google's bottom line.
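For the curious, here's that back-of-envelope math as a quick Python sketch (the ~$2.3B Nebraska budget figure is my assumption, working backwards from "less than half"):

    # Rough numbers only; the budget figure is an assumption
    billion = 1_000_000_000
    worker_years = billion / 100_000             # 10,000 worker-years at $100k
    nebraska_budget = 2.3e9                      # assumed: "less than half" => budget > $2B
    cost_per_student_year = nebraska_budget / 50_000
    student_years = billion / cost_per_student_year
    print(worker_years, round(student_years))    # 10000.0, ~21,700 student-years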
Big companies are terrible at this. As companies add more engineers, the engineers become less productive. Managers get worried, so they 1. hire more engineers and 2. introduce new project management tools and methodologies, which lowers productivity even more but creates an illusion of progress for idiotic upper management. Over time, managers and engineers get used to lower levels of productivity as it keeps declining, and the company keeps hiring more and more engineers whose engineering skills themselves keep declining from being poorly used.
This is definitely true, but it's a necessary evil. If you want to build something small, you can do it with a super efficient tiger team. But if you want something huge, you need a massive inefficient team, even if each additional engineer is only 1/10th as efficient as the engineer in the tiger team.
This is explained, with real-life examples, in Fred Brooks's classic The Mythical Man-Month.
The people from top schools are in high demand, and Google no longer has a monopoly on cool tech workspaces with free lunches and random expensive perks.
They have plenty of openings, and they harass qualified people, trying to poach them, just as hard as everyone else. And since they actually DO have some unique challenges, they sometimes actually DO need people who know more than Rails CRUD apps, which makes things even trickier.
Aside from their criteria and interview questions, which I find absurd (but that's subjective; I know a bunch of people who think they're fine), I haven't heard much that was really wrong with it, aside from obviously underqualified candidates being brought in and then leaving pissed off after bombing it.
The reality is that in this industry right now, unless what you're doing is little webapps with REST APIs that store/retrieve data and not much else, hiring people is hard. All the somewhat large companies have 100+ openings at any given time, and the majority of candidates are code monkeys. My current employer is doing decently and our reputation seems to get us a steady stream of above-average people applying, but there are still so many openings.
Only so much you can do.
Education is a wickedly complicated problem compared to not being in California.
You're outright dismissing the idea that hiring is difficult for Google? You know they have over a thousand recruiters working for them, right?
They could hire a bunch of janitors, who are already employed anyway, and essentially just transfer that $1b directly into their pockets, but at the end of the day they're still going to be janitors. By investing that same amount of money into education, though, they're enabling them to go off and achieve something more meaningful that they wouldn't otherwise have been able to do.
More to the point though, what happens when a janitor learns to code and gets a coding job? You still need a janitor!
That is a good point, but I think it comes down to the barrier to entry - it's easier to become a janitor than it is to become a programmer so one could probably assume there's always going to be a steady stream of people who are willing to do janitorial work.
> This implies that janitor is an inherently inferior job that deserves lower pay
It does, unfortunately. Given some of the awful stuff janitors have to deal with, I believe they're deserving of higher pay, but unfortunately the job market doesn't agree with me on that.
You make a good point though: with even a fraction of that budget, plus maybe a little time investment from their staff, they could create some great opportunities for their otherwise non-technical employees.
There's the Khan Academy, too.
I'm learning equity investing, and there seems to be a never-ending stream of free learning material online. Lectures from the top universities, YouTube videos that teach you curious spreadsheet skills, assignments. As a matter of fact, you have very few excuses these days.
It just comes down to personal motivation.
For all practical purposes these days the cost of learning anything is equal to your monthly internet bill.
If you already have a degree, just in the wrong field, you don't need to get another degree. Just get the training by whatever means.
Heck, my degree is in mechanical engineering, yet I had jobs writing software.
There really is a "cult of the degree" and I really don't get it. Is the field getting saturated, and employers are finding it necessary to erect barriers? Does this, on average, make good fiscal sense? I would think successful, relevant experience is something that (when verified) would reduce risk for businesses.
Unfortunately, getting that first interview can highly depend on those papers.
Big, mature bureaucracies want formal signals. Small/young businesses and startups care about competence. Google is very large, but I suspect still manages to act "young" in many respects.
Soon they will.
If the internet people come at cheaper prices.
There isn't much difference between people who watch a lecture on the internet vs those who watch it live.
But filtering the good ones out of the "internet people" is more expensive than sifting through college graduates.
Expecting your education to make something of you, instead of making something of your education (track record) is another.
Managing to get a certificate does not guarantee one can apply those YouTube videos.
If someone said they were shutting down local schools and just telling the kids to watch YouTube and ask questions online, I'd be horrified; the kids would get far less attention and teaching, and would be far less educated. Also, few would have the self-motivation to pull it off.
Not to knock Khan Academy, etc. completely. One idea I've heard is having students watch the lecture at home - everyone can see the best lecturers in the world - and do their 'homework', with personal attention from the teacher, in class.
In most top universities, teaching largely works through learning by doing. It's just that they keep the quality of assignments high, so your projects make up for it. Also note that those taking online courses have an active interest in getting good at things, so they would work on these projects.
In the real world, you rarely need more than what you can get from a lecture; beyond that, you mostly have to learn on the job. So online learning will work for most cases. Yes, in some cases you can't fully learn something without a teacher; such cases will exist, but they will be exceptions to the rule.
>>If someone said they were shutting down local schools and just telling the kids to watch YouTube and ask questions online, I'd be horrified; the kids would get far less attention and teaching, and would be far less educated. Also, few would have the self-motivation to pull it off.
There is a reason why, despite the awesome public education system the US has compared to the rest of the world, only a fixed percentage of awesome people comes out of it every year.
Everything in the end comes down to personal motivation. You can't make anyone succeed without their participation and will.
Just seems like Google, of all companies, has a need for the best, not just blue-collar code slingers. Many people with four-year CS degrees from good schools do not get hired.
Google's hiring process does not reflect this at all. Even entry-level positions have a very high bar. Boring has nothing to do with it; Google wants the best.
> Secondly, Google declines qualified candidates roughly 50% of the time according to people who work there.
That's not random, that's just setting a really high bar.
Have you been through a Google interview? Their phone screen questions are equivalent to many companies' "hardest" interview problems.
(Not to be taken too seriously)
$1bn is a lot of money when put towards education and Google can make 5+ year investments when it comes to tech talent.
A great start, but it's not exactly transformational.
Labor does not get to share in economic growth, they get to split an ever decreasing share of profits. The more people that are available to sell their labor, the more pieces that shrinking pie has to be cut into.
Oh, and remember the mess Gradle was in around 2015/2016? How much money could it possibly cost to better document some of the major tools?
Also, time zone differences (and to a lesser extent communications) were inconvenient, although there were very viable ways to work around them.
Programming / networking / hardware need a new type of university, similar to trade colleges but focused primarily on the skills and nothing more. The first 1-2 years could cover the foundations while the next 2 years focus on solid design principles and actually developing projects (real or fake).
Watering down university education is not necessary. If someone wants to focus just on tech skills, they can go to a boot camp. Expanding boot camps so they become multi-year experiences may be a good idea, but they should not be called universities.
I guarantee that a tech school focusing strictly on tech-related classes for 4 years would produce higher-quality students than a university that spends half of your college years teaching you things that are not directly related to your job.
Also, what happens if a student wants to switch majors? Or graduates but later wants to change careers? Your proposal doesn't make that feasible.
Some programs in universities can be glorified vocational schools. Other programs are excellent, relevant and evolving both for industry, and as innovators.
I disagree. I think they provide services to customers. Customers decide what the services are for. I suppose some students are very interested in laying a foundation in the liberal arts. Most, in my experience, are more worried about getting their careers started on the right foot (especially considering all the loans they are taking out). There is also a nontrivial number of students there for unsupervised extensions of their adolescence, though.
> ...they should not be called universities...
Why not? Is that going to be a regulated word now? What purpose does the distinction serve? We don't look at a degree and think, "Oh! University degree! This is a well-rounded person with a good foundation in the liberal arts!" No, we see, B.S. in Communications from Boise State and draw inferences from there.
The assumption here is that education is nonsense. I think the skills are far less valuable: Education teaches you about the world you have to live in, in business, and as a citizen, a parent, a consumer, etc., by exposing you to leading people and ideas from around the world and throughout history. It teaches you to reason, by exposing you to the great thinkers now and in history. Reading blogs on the Internet isn't nearly the same thing.
If we send people home from college only with the vocational skill of building an integrated circuit, they will find that it doesn't begin to address most of the challenges in life, much less their community's and society's.
Integrated circuits aren't what the West needs to move forward at this point, to address poverty, discrimination, war, and all the other issues. In our current society, we do very well with integrated circuits and very poorly with life, social, and political issues; perhaps we need to focus education on the latter, not the former.
Another thing education teaches you is how to reason effectively, the most important skill of all: Reason is what separates people from animals, the Enlightenment from the Dark Ages, science from superstition, rationality from hokum. And almost the first thing you learn is intellectual humility: I am wrong about so much, and others know so much that I don't and have such different experience, that it's foolish to dismiss them. You learn we are each prisoners of our own narrow perspectives and experiences, and that the more challenging another point of view is, the more likely it is to be worth listening to. So when you say you "laugh" at my ideas, I think you should have spent less time on Java and more on learning.
I believe most people would benefit from learning the skills they need for a job rather than learning about China because rarely will a tech job require your vast knowledge of Chinese history. They want to know whether or not you can push code that the business needs.
I understand your point, and I am not saying that these classes are useless. But at the end of the day colleges are there to prepare people for jobs, and my point is that if you had more time in tech-related classes instead of classes you don't need, then you would most certainly be more ready for a job.
When I went to college there were so many classes I really didn't need. Regardless of whether they were "good" for me, it was a waste of my time because I didn't care about nor need that information. I took classes in sociology, macro- and microeconomics, accounting 1 and accounting 2, and so on and so on.
Did these classes help me in some way? Yes, I would say so, but what if I had taken an extra 5 classes honing skills related to the job I was going to seek after college? Can you really sit here and say I wouldn't have been better off with those 5 classes? I don't think so.
I know I learned an awful lot about reason in college, and many others I know did too; sorry you missed out! If you read your Facebook feed and I study the great thinkers of history, guided by modern-day experts, I'm very confident I'll be far ahead. Unless you think you can come up with all that on your own - who needs Descartes or Hannah Arendt, apparently; anyone can figure it out? To disparage all that knowledge and learning is easy to say but very hard to support. To say those people knew and said nothing valuable seems like willful ignorance.
> colleges are there to produce for jobs
Says who? I disagree strongly. I know the parent's claim is fashionable now, but that certainly hasn't been true for most of history. The liberal arts, which are not vocational, long (always?) have been dominant - that wasn't for job skills.
I'll add that few businesses value actual job skills learned in college or grad school. A new lawyer or engineer right out of school knows nothing, in many ways, and needs to be trained in real-world job skills by their employer.
Training people for jobs is the responsibility of vocational schools and corporate training.
Also, there is no guarantee that it will even be alive by the time you leave university, or that your potential employer will be looking for that particular "skill".
Also you realize that learning a programming language can translate into other programming languages right?
In another language you may not be using classes and objects heavily, so learning how to make class factories means very little. What matters is that through this you might learn about good abstractions and their power, and that's what is important.
And that is what universities should be teaching, with the use of any language/tool. I am not sure what you mean by "non-tech" class; I am pretty sure there aren't any such things in CS programs. If you dedicate yourself to one particular tool, you will have a very narrow view of software development. No need to go to university for that; you can learn it by yourself and save both time and money.
Of course, that's completely ignoring that there is far, far, far more to "liberal arts" than gender studies.
"For one, a recent meta-analysis of over 40 years of diversity training evaluations showed that diversity training can work, especially when it targets awareness and skill development and occurs over a significant period of time."
There's popular opposition to Trump's promise to give people jobs by resurrecting industries that a lot of people (probably Google as well) would rather see stay dead. But there's no denying people need jobs, and formal education ain't cheap.
Now we've got a tech giant backing that up with cold hard cash. It would be great to see other companies getting on board and putting some dough in the ring or at least offering some kind of internship/work experience programs for people coming out of an education funded by these grants.
There you go. Perfect!
I am not sure what world you live in, but the world I live in is one where software engineering salaries and jobs have continued to increase over the last 5 years, despite all these bootcamp grads and the large increase in CS majors.
The status quo is untenable because it's a bubble. There are far too many junior programmers.
But right now, things are really really good for developers who only have a couple years of experience. But perhaps that is going to change as all these junior devs become senior.
Maybe this is shitty, maybe it isn't, but it's probably more economical.
I work for Google and I find there's already a very large amount of on the job training required due to all the powerful but complicated internal tools. My manager has told me that I should not feel pressured to contribute at all for my first 6 months and I should feel free to just focus on learning as much as I can.
I've been here for almost a year now and feel like I still know nothing so the training is still certainly not done.
I didn't mean to jump to the defense of my employer, as I'm obviously biased, and I do think we are far from perfect (although I am very happy here :). I just wonder what the right balance is. It seems like if people can't program at all, then it's not really on-the-job training, because they wouldn't be working, right?
At least my own experience has been that I came in with some college experience but no college degree, not really knowing anything besides the bare minimum to contribute (being able to program, having some grasp of CS fundamentals), and I have learned a ton on the job and still have much, much more to learn. I think there is an expectation (and pressure!) for all SWEs to hit the senior level, L5, so in a way everyone who is hired under that level is doing on-the-job training, no?
Sorry for the long post!
I don't see why Google couldn't do that today.
I'm not saying they are morally obligated to do so. At the end of the day they are a for profit company and they don't have to do anything at all about this problem. And, sure, the donations they are making instead are probably better than nothing. But as I mentioned, and as top19 mentions on what is as of this writing the top post on this article, the track record for these job retraining programs going back decades isn't very good.
What I'm saying is that if Google were to go to Pittsburgh, Youngstown, or Detroit, hire some bright unemployed people, pay them a decent but by no means exorbitant salary while it taught them how to code and then, as you point out, how to be a productive engineer at Google, at the end of the process those people would likely have very marketable skills to either continue moving up the ladder at Google or elsewhere. Of course this would cost Google something -- not only in the salaries while people were learning and not contributing but also in the salaries of people that were training and mentoring them. But Google is planning on spending a billion dollars anyway, so here's another way they could do that. A way that I think would be more effective.
Congrats on getting what sounds like a great job.
Rather than hire a traditional Stanford CS graduate, we find people who have self-taught to some basic extent and give them the chance to work under an apprenticeship. We started with a class of about 25 with decent results and will tune it more based on what we learn.
This made sense back then because most of the tools, languages, and techniques were either proprietary or niche. IBM wouldn't hire an AT&T employee the way FB hires a Google employee today, and back then people saw jobs as "until retirement", with a very different culture.
If you invested a lot of money training someone, you could expect them to stick around for a while and "pay back" your investment.
Today that isn't worth it. You could hire someone out of high school, spend a lot of resources training them just to have them poached by another company right after they are "ready".
That time is gone.
But here Google is looking to donate $1 billion as a charitable contribution to address a specific social problem. I'm suggesting an alternative way of "spending" that money. From that standpoint the fact that such a training program is sub-optimal in a bottom line sort of way isn't as important -- indeed it is kind of the point. The difference between the optimal business focused process and such a hiring and training program would be in lieu of the donation that Google is planning on making.
It's my contention that this way of spending money would yield more benefits than the current plan. I could well be wrong, but just criticizing the idea from a business efficiency angle misses the point.
That does make sense and I think that would be super cool if companies could do that today.
I admit I don't know much about the history, but isn't building software today more complicated, though not necessarily harder, than it was before?
My impression is that before, we would just build programs that ran on computers. So in that sense, if you knew how to program and had a compiler to use, that was sufficient to do what companies wanted to do. However, now we have people building other things like services, which requires additional complexity and tooling on top of what was previously necessary.
The point I'm trying to make is that I suspect nowadays there's more prerequisite knowledge for most jobs than before.
> But Google is planning on spending a billion dollars anyway, so here's another way they could do that. A way that I think would be more effective.
That's a really good point. I imagine even if it takes a very long time to train people, with a billion dollars you have the time to do that.
The optimistic side of me wants to say it's because handling the logistics for that would be a nightmare and it's really outside of our core expertise.
The realistic side of me thinks that it probably is too expensive to give the same benefits to those employees that we give to current full time employees.
The pessimistic side of me also thinks that there's a certain level of prestige associated with working at Google that has been, at least to some level, successfully marketed both inwards and outwards, and hiring people who don't meet whatever "bar" would undermine that.
> Congrats on getting what sounds like a great job.
A more reasonable path is HelpDesk / Hardware Tech -> operations software / SysOp -> software engineering.
Like, being a soldier doesn't require any sort of exceptional intelligence either. It doesn't even require average intelligence. What it requires is an IQ of 85; below that point, the US army cannot effectively train you to become a soldier. Despite having every incentive in the world, the US army still rejects about 15% of the population.
The area under the normal distribution curve from minus infinity to -1 (one standard deviation left of centre) is about 0.1586 or about 15-16%.
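You can sanity-check that figure with Python's standard library (a minimal sketch):

    from statistics import NormalDist
    # Fraction of the population more than 1 SD below the mean
    print(NormalDist().cdf(-1))  # ~0.1587, i.e. about 15-16%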
The 15 points = 1 stddev property of IQ is arbitrarily established.
I don't think our brains suddenly evolved in 200 years from those of illiterate peasants to those of software engineers. We just have more school time these days.
What do you think people did before the industrial revolution?
Ok, you can say that.
> Given enough time I believe most people can be trained to do almost any job.
That's a statement about potential, not capability.
Truth of the matter is that most tech work doesn't involve AI, machine learning, self-driving cars or augmented reality but down-to-earth business applications. Developing those requires abstract thinking, empathy and problem-solving skills but it doesn't necessarily require a college-level IQ.
In fact a high IQ could even be harmful in that situation because apart from getting bored quickly highly intelligent people can display a tendency to overthink problems (which is probably how many notorious enterprise frameworks came about ...).
An IQ of 100 is defined as the median IQ level for a population. The range between 90 and 130 covers the whole gamut of human intelligence that's commonly considered normal.
Much of the day-to-day work in IT often doesn't need original thinking but merely skillful application of known methods and patterns, which in turn doesn't require a college education.
Point proven, eh? I'm not talking about these people (though I surmise there are quite a few in these companies as well who don't have that level of intelligence and simply tag along ...) but about the vast majority of IT workers working on some supposedly 'boring' database application.
Google and Apple in particular have been known for hiring highly qualified engineers only to then have them maintain some run-of-the-mill administrative software they're vastly overqualified to work on.
It doesn't, but you still need "higher level" developers to cover security and concurrent/parallel/distributed problems and maybe architecture.
It's not that someone can't also learn that stuff, but exposing a product publicly without some experience in those areas is asking for trouble.
IMO, software is resisting division of labor by collapsing roles into "full stack devs" or "DevOps engineers".
In my opinion, full stack and DevOps is the normal, sane way of approaching software development. Artificially dividing up roles into labels such as 'front-end', 'back-end', 'database programmer', 'system administrator' only leads to more silos, less collaboration and sometimes even downright hostility between these roles.
There is no evidence that a person with low IQ cannot accomplish the tasks involved in e.g. software engineering, although they would probably do so more slowly than a person with high IQ. From Wiki:
"The prevailing view among academics is that it is largely through the quicker acquisition of job-relevant knowledge that higher IQ mediates job performance. "
Also, going to college has no effect on IQ, so "college-level IQ" is meaningless gobbledygook.
The average IQ of someone with a college degree is a good bit higher than that. From a quick Google search it looks like it's around 115, which is a standard deviation higher than the general population.
Pretty much the same way we can consider the median personal income ($31,100 in 2016 - https://fred.stlouisfed.org/series/MEPAINUSA646N) "low". Because it's low.
IQ: How fast you do it.
tl;dr: Problem-solving is software, IQ is hardware
Like any other skill, programming comes more easily to some than others, but with the right approach can be acquired by anyone.
Edit: responders to this comment seem to miss that the parent comment is suggesting Google hire new individuals and train them, not find talent in their workforce and do training there. That's the unrealistic part: creating a secondary application process for individuals without the skills, when they already reject a ridiculous number of people with many of the skills.
Do you think Google lets anyone work their way up from front desk admin or barista?
She changed careers before I can really remember her at that job, but I too am self-taught in this field (but I trained myself at home).
The whole idea of companies not investing in their people is relatively new.
I don't think being a barista is much different than my example, though. Maybe they should make them internal hires.
A basic aptitude for the role or transferable skills maybe?
>"What salary should those positions command?"
Whatever the company wants to pay. Why does that matter at all? Employment is still an agreement between the two parties.
>"Sounds pretty handwavy."
Not at all. "On the job training" has a history stretching from the Middle Ages and the guild system, to the industrial revolution, to powering the wartime workforce, etc.
The commitment of capital to tangible jobs that lead toward OJT is not handwavy at all. That said, I bet the tax breaks from the schooling are substantial.
In my case I had no choice but to do a draining unpaid internship in college, and their return offer was only $15/hr. Nowhere I applied to for months (likely hundreds of applications) would take me, even though I did all those things on the list and continually put my resume through those resume threads on reddit - except for one offer of 30k. $15/hr and 30k are abysmal insults given the amount of time, money, and effort put into programming since middle school, getting a BS CS degree, volunteering on big online projects since high school, etc. I'm not in the middle of nowhere either; this was NY/NJ.
These experiences signal to me that the tech field is highly oversaturated for new grads and it's a matter of time before people realize the bubble popped. Someone in my position should not be getting offered what amount to poverty wages, taking into account student loans, the high price of car insurance for a driver of my age range, etc. Programs like what Google is doing are just going to make this problem even worse, as companies feel further emboldened to require increasingly more experience from junior programmers while offering salaries ever closer to minimum wage.
Out of curiosity, and certainly not to offend... you _did_ place your resume elsewhere as well, right?
I think retraining programs primarily help sharp, energetic folks who somehow got into a bad state (useless major, bad school, rough childhood, etc.); maybe even social connections and stability are more important than the skills they end up getting. However, those programs are IMO worthless for folks who lost a stable job and hope that Yet Another Certification Class will put them into a pipeline for a similar one. I am not sure how to help the second type.
Mind you they do hire some of the best, but the idea that google engineers are “the best” is a myth.
I heard they are still trying to do this, though, even being aware of the lower success ratio.
However, just today I listened to the most recent Freakonomics Radio episode, which was about how Germany managed to become the economic powerhouse it is today. Most economists that were asked agreed that an essential ingredient of Germany's economic success is its unique concept of vocational training, which combines on-the-job training with school education and general - as opposed to employer-specific - job training.
Perhaps a system that's essentially a combination of both on-the-job training and more formal training programs would be conducive in this case as well.
All of humanity growing together towards a brighter future for everyone is truly our highest calling.
Degrees don't come for free, and technical graduates don't come for free; students have to go to college for that, and in the US you need a LOT of money for that.
This is the chicken-and-egg problem, where nobody wants to address the real problem and everyone is going around giving superficial solutions.
The rate of student loan defaults is actually inversely proportional to the amount owed.
In a lot of cases these are probably viewed as the same thing but, for example, I would ask: When was the last time a Senior Java Developer was a candidate for a Senior FrontEnd Web Developer position?
I think the future is going to be a lot less about being hired for "jobs" with "companies". Instead it's going to be substantially more about "projects" being done by "groups / organizations". The groups / organizations being assembled / disassembled with high frequency.
Some better nerds here can hardly imagine perfectly smart people who cannot yet touch type or turn a spreadsheet into a group calendar. Our miraculous simple decision support tools are still opaque to majorities of Americans. Only Americans far outside Google will create value to create jobs. That takes planning for any possible sweat equity or financial investment. We have generations of people to train with tools. The boy-genius prizes for ever newer tooling are not really separate concerns. Cultivating and harvesting new boy geniuses from the field is expensive. They don't exactly grow on trees.
Google like Apple or Microsoft had to discover and rediscover their own relevance. They cultivate their markets now with intensive growth. This is a good move.
I'm going to warn everyone of what's coming.
Software engineer jobs will be blue collar, $40-$60k a year jobs, by 2030.
The HUGE push from government, and private business, to fill the PERCEIVED lack of engineers, will come to fruition around that time.
Make no mistake about it - there is NOT a lack of skilled engineers right now. There is a disinterest among business to pay higher, and higher salaries.
If you are a SWE right now, save your money, and invest your time into improving YOURSELF. Have a backup plan, because I promise you, the good times are coming to an end sooner than you think.
Learning to program takes a lot of dedication and focus, which a lot of people have no interest in; it is just too much work and too difficult. Every student pursuing an engineering degree here has to take a class with an introduction to coding. And everyone, except those few who enjoy computer science, says that class was by far the hardest class to pass compared to the rest.
So I believe the opposite will happen. The demand for software developers will grow beyond our imagination.
Combine that with the fact that big companies (like Google) release SDKs that make application development trivial, and you've got a recipe for the skill cap lowering along with wages.
The skillset of building a personal website, or even a website for your small business should be something anyone can do, and will in no way impact the overall salary of software engineers in the future.
Major companies will always need people who understand the computational sciences, as scale and complexity follow some of the same rules as entropy, in that they are always increasing.
Additionally, the reason for high salaries is not a lack of engineers, it is that top companies have decided that it is in their best interest to outbid each other for top talent. In parts of the midwest, where there is less competition, engineers are already paid 50k a year.
I am constantly surprised when other CS students in my classes have zero idea how anything beyond the particular language we're learning works. Even in higher-level classes that require a fair amount of proficiency with the language, if you asked what the length of the pointer they just properly used was, all you would get is blank stares.
> "if you asked what the length of the pointer they just properly used was all you would get is blank stares"
So what is it?
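For what it's worth, the answer depends on the platform, not the language: a pointer is typically 8 bytes on a 64-bit system and 4 bytes on a 32-bit one. A quick way to check on your own machine (a sketch in Python):

    import ctypes
    # Size of a void pointer on the current platform
    print(ctypes.sizeof(ctypes.c_void_p))  # 8 on 64-bit, 4 on 32-bit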
It's a computer science program, not an exercise in learning a dozen languages' quirks and implementation details that you use for a single class to understand some concept.
The new class of programmers that governments and Google want to train up from your average worker will not be as skilled or intelligent as the current generation of programmers. But, there are still opportunities for them in software development. They will take jobs that pay 40-60k a year, while the higher skilled and more intelligent programmers will be architects or leads who command much higher salaries.
There will be a differentiation between Engineers and Coders soon. Hooking up to a few different APIs, doing some JS and HTML does not count as engineering.
This to me looks like the same kind of privileged outlook that other professional guilds like the AMA desire. Do you want cheaper healthcare, or doctor compensation to keep going up? Hey, letting nurse practitioners take on some of the load is "flooding the market with n00bs"
This just seems like protectionism by another name.
Yes, the good times for software engineering will come to an end. I'm a software engineer, this will affect me. But the question is, do I have a natural god given right to have a ballooning salary every year, while fighting attempts to increase labor supply that might cut that growth rate?
What's your definition of "skyrocketing"? Outside of, maybe, a dozen high-prestige companies located in a couple of specific areas, I don't see salaries skyrocketing. Mine hasn't; not saying I'm not well compensated, just not as overpaid or in demand as some people make it sound.
Further, my experience with the aforementioned high-prestige companies is that they are picky as hell. That tells me that either there is no shortage of talent for them or they are choosy beggars.
In 2003, the average salary in tech was $69,400.
By 2017, the average salary had risen to $92,081.
While that might seem like a pretty large pay rise, after adjusting it for inflation you come to a clear conclusion: over that period, salaries have stood basically still. You can point to people getting $120k+ as first-year employees at Google, but salaries like that are massive outliers. The average developer in America earns much less.
Source: Dice Tech Salary Survey 2017, https://marketing.dice.com/pdf/Dice_TechSalarySurvey_TechPro...
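To make the inflation adjustment concrete, a quick sketch (the CPI-U averages are my approximate figures, not from the survey):

    # Approximate CPI-U annual averages: 2003 ~184.0, 2017 ~245.1
    cpi_2003, cpi_2017 = 184.0, 245.1
    adjusted = 69_400 * cpi_2017 / cpi_2003
    print(round(adjusted))  # ~92,450 in 2017 dollars, essentially the $92,081 average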
Does our owner class have a natural god-given right to a 6% return on their investment every year, for doing nothing?
They are certainly spending their energy on fighting attempts to spread the economic pie around. We need solidarity, not shaming people for protecting their means to make a living.
You, too, can become an "owner class" by opening an online trading account and buying stocks. Commissions are often under $10 for a trade.
Either way, even if I can, the guy who makes my morning coffee can't, and never will be able to.
I.e., start investing and in 20 years you'll be financially independent. Makes living in the US sound good to me.
> the guy who makes my morning coffee can't
I talked with a guy once who told me he "can't". He was driving a new car, and the payments, rent, etc., added up to more than his income. I suggested he sell the car, buy a car he can afford to pay cash for, and start investing.
He partially did take my advice. He sold the car, bought one he could pay cash for, and then blew the extra income on some other luxuries. Of course, then he still was in "can't" territory.
My current car I bought used 25 years ago and still drive every day. It costs me practically nothing.
Your point still is predicated on the "already having money" part.
The point is invest early in your working life, and you'll have the needed results when you're ready to retire.
$1,000 is not what people would consider "having money". $200,000 is. If you have a car, it surely cost far more than $1,000.
And depending on who you're talking to, having a spare $1,000 to risk on an investment is "having money".
True enough. Because they spend it, like the person who bought a new car. How much do they spend on beer/cigarettes/weed in a year?
> And knowing what to invest in is another can of worms in itself.
That's true. Apparently investing is more than "doing nothing", and one is taking a risk. But it is within the means of the vast majority of adults.
For the record, knowing what to invest in is easy: a broad-based, low-cost index fund. The research in "A Random Walk Down Wall Street" statistically shows that actively picking specific companies has a less than 50% success rate and often comes with higher fees. Also, most public libraries have a copy of "A Random Walk" and its conclusions are well shared.
While some of the richest are certainly entrepreneurs, the largest percentage of Forbes 400 are people who got there with OPM, other people's money, like hedge fund managers. Another chunk are heirs like Waltons or Kochs. Did Waltons get something for nothing without any risk? I would say yes, they did.
More accurately he made something from nothing.
You and anyone else could have made a little something, too, if you'd bought Walmart stock. Lots of people did.
Also, you could be a millionaire by retirement if you adhered to a regular investment program rather than only one investment of $1,000.
Also, the Walton kids most assuredly got something for nothing.
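To put numbers on the "regular investment program" point, a rough sketch (the $500/month and 6% return are illustrative assumptions, not advice):

    # Future value of $500/month at 6%/year for 40 years
    monthly, r, n = 500, 0.06 / 12, 40 * 12
    fv = monthly * ((1 + r) ** n - 1) / r
    print(round(fv))  # ~995,700, roughly a million by retirement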
But you do.
> the Walton kids most assuredly got something for nothing.
85% of American millionaires are self-made. See "The Millionaire Next Door" by Stanley.
It has limitations, so you'd want a traditional broker also.
If you re-read my original post you will notice that I did NOT advocate for artificially restricting worker supply, or any kind of protectionism.
I merely warned that what we're seeing in tech WILL bring an end to the "high" salaries, and that those who wish to maintain their current state, should consider PERSONAL GROWTH and advancing their skillset as a means of protection.
We can debate all day about whether it's a "perceived" lack or a real lack. I wholeheartedly disagree that salaries are, as you put it, "skyrocketing", especially when you account for cost of living in the areas where the "skyrocketing" is happening.
And oh, yes, how dare a doctor who spent upwards of $500,000 in medical training, and devoted years to internships, and residency, be worried about a lowered skill cap or regulatory protections (which they counted on) for entering their profession! How dare they!
Please. Of all the examples of protectionism you could have given, you chose perhaps the most acceptable and understandable. Yes, people care about their livelihood and the ability to retain the efforts and investments they have made; so would you.
To me, that's unfair and wrong.
Does your employer have a natural, God given right to cheap labor?
Salary in the tech industry is a function of demand. Right now, for example, people with expertise in Machine Learning are in high demand. You can graduate with an MS or Phd in machine learning, form a startup with no product whatsoever, and get acqui-hired just because of the insane bidding war going on right now.
When I moved to SV in the 90s, at the height of the dot-com boom, kids fresh out of college were getting insane signing bonuses worth $10k or more. I knew people who would switch jobs every few months, just to collect freebies.
Perhaps it is different elsewhere, but here, tech workers are very highly privileged. Seeing people complain about a desk job that pays $100k+, with weeks of vacation, great healthcare and benefits, and flexible working hours, compared to the utter suffering that's happening in the working class across this country, just looks tone-deaf to me.
Before we worry about the poor suffering tech workers in their gentrified neighborhoods, in swanky cafes, facing stagnating nominal wages, how about we consider the masses of people who missed out on the tech-utopia for the upper 10%, people who would like to move up the value chain and have a right to compete for your job.
My mother worked as a cashier at Safeway before she died. Comparing my hourly wage as a SWE to hers is kind of obscene, and for the people I grew up with, seeing tech-bros complain about their current threatened position has got to look like people completely ignorant of how much privilege they have.
Because Bay Area cost of living is getting pushed higher and higher. The median programmer already has roommates, a 2-hour roundtrip commute, and no hope of family-sized housing or (gasp) ownership; if standard of living falls any further, we'll all go do something else.
Obviously many people have it worse, but we have alternatives.
The amount of programmers needed is rising across the globe. Every country is going to try to retain its IT talent. In a cut-throat, globally competitive world you can't afford to be the country that lets its best minds leave for greener pastures.
Also, programming isn't the sort of job where you can fake your way through. If new people are trained up to enter the field, they'll need the skills to match. As a share of the population, the number of people studying computer science has been falling, not rising. These bootcamps and training programs are trying to bridge the skills gap, and so far have not succeeded. If anything there's going to be a skills shortage, with a corresponding rise in pay.
Sure, employers always try to minimize pay. IT is not special, this is the case for all industries. Labor price is set through supply and demand, and programmer's wages are no different. Can you give a single example of an industry that used to have high wages but now has low wages? It would be exceedingly unlikely for IT to behave unlike every other industry.
Not sure why you think programming would be special in that regard. In a lot of companies, you will be able to keep your badly done job for a really long time as long as you have a good bond with your higher-ups. This goes for programming the same as sales or any other profession.
Except many unions do exactly this through closed shops and other tactics.
2. protectionism from people insecure about their ability describes both tech workers and tech companies. Google, Facebook, Apple et al have increased their political presence in recent years with additional lobbyists. corporations love an unfair playing field as much as anyone.
but when you say the solution "isn't to clamp down on the supply" i lose the thread of your argument.
clamping down on the supply is exactly what a public sector union does. union work rules and other union-favorable city regulations exclude or limit non-union workers who might otherwise be hired to carry out various city functions.
working for the city or the department of water and power can be a very good deal for the worker, but city residents pay more in taxes and see less service as a result. merely unionizing the labor force helps some people but hurts others.
indeed, a case can be made that, because police officers are so highly paid and benefitted, they are scarce. and because they are scarce, there's more property crime, and murder, than there would be otherwise. it seems quite plausible that some city residents pay a very high price because of this public sector unionization.
in politics, this leads to a strategy wherein city residents who live in "electorally unimportant" areas (i.e. poor areas with lower voter turnout) receive lower levels of government service than city residents in areas that vote a lot.
I've not seen an example of this. And I do not consider being a member of the union to be clamping down.
One key obstacle, officials say, is the contract with DWP’s largest union, IBEW Local 18. The agreement requires that managers negotiate with the International Brotherhood of Electrical Workers before hiring contractors. Initially, the department is supposed to attempt to fill any internal vacant positions, Howard said. The contract also obligates managers to offer IBEW workers overtime to fill some of the need.
IBEW business manager Brian D’Arcy declined to be interviewed for this story.
Forgive my crass reply, but I have only seen unions become weaker and weaker in the U.S. And even when they were "strong" they didn't protect labor interests against increases in labor supply. Globalization has wrecked a number of industries, which unions were powerless against. The strongest card in the hand of any laborer is their scarcity.
Also, nobody (including myself) is advocating for "playing gatekeeper". I merely made a post warning people of what's coming. If you look closely my advice was to the individual - invest in yourself. Don't count on unions, or governments playing gatekeeper, to protect your current salary.
Furthermore, if tech is about disrupting everything, including the nature of work itself (through on-demand) and even the nature of human relations itself (through social media), then surely some attempt could be made to better labor relations by inventing a better type of union. It's especially rich to hear "no, it can't happen, it's always failed in the past" comments wrt labor unions come from workers who work in an industry that's supposedly all about innovation.
I wasn't trying to attack you for your original comment, in any case. You actually offer good advice. But my general sentiment is that we shouldn't try to restrict the labor market - it seems as wrongheaded as trying to fight gentrification by limiting house construction just because some of those units will be luxury condos instead of affordable housing - and that lowered wages could be fought by the presence of a tech union that protects tech workers.
There have always been 'blue collar' engineering jobs. When I started in the tech industry I was making $13 an hour writing HTML and a bit of SQL here and there.
There are probably tens of thousands of "Software Engineers" putting together PHP sites, doing front end JS work at an entry level, hacking together some minor software customizations. I think you're right, this will become more prevalent. The world needs a lot more engineers to do this kind of work.
>There is a disinterest among business to pay higher, and higher salaries.
Evidence points to the contrary; salaries have skyrocketed in the last ten years... have you been paying attention?
I think the industry is overdue for consolidating on some language around the various kinds of software jobs. Nurses, doctors, surgeons, anesthesiologists, PAs, medical technicians, orderlies, general practitioners, obstetricians, pediatricians, pharmacists, etc. could all be "medical engineers". But broadly understood names for the different roles clarify expectations for each.
I'd be fine with that. $60k is a solid living. 1/4th of a pretty nice house. In Wisconsin.
I'd take that salary today - in Wisconsin. But I'm not sure you could even get your own room for that in the Bay Area, where they'll inevitably require you to be to earn it.