The STEM Crisis is a Myth (ieee.org)
139 points by RougeFemme on Aug 31, 2013 | 145 comments

"Wages for technologists have not increased" is often evidenced by claims about averages. By the same token, one could argue persuasively that devs do not age, because a year goes by and the average dev gets a few months younger.

The explanation is that the number of devs (and many related trades) is increasing rapidly, largely driven by hires of young neophytes rather than retraining of e.g. 50 year old lawyers. The neophytes are younger and have lower wages than the mean dev. This brings the average down faster than raises and aging bring the average up.
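A toy calculation (numbers invented purely for illustration) shows the mechanism: every individual gets a raise, yet the average falls once cheaper newcomers join the pool.

```javascript
// Made-up salary numbers, in $k, purely to illustrate the averaging effect.
const mean = xs => xs.reduce((a, b) => a + b, 0) / xs.length;

const yearOne = [100, 100, 100, 100, 100];   // five incumbent devs
const yearTwo = yearOne
  .map(s => s + 10)                          // every incumbent gets a $10k raise
  .concat([60, 60, 60]);                     // three new grads join at 60

console.log(mean(yearOne)); // 100
console.log(mean(yearTwo)); // 91.25 -- the average fell, though no one took a cut
```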

You can verify this if you look at data for particular career cohorts, either individual people or fixed experience levels. In the 4 years I've been on HN the quotes I hear for "generic CS undergrad from Stanford" are up more than 25% (80k to 110k or so, starting salary offer). Companies that I work with complain bitterly that their competitors are bidding irresponsibly. The going wages for interns are substantially in excess of what I expected to earn in mid career when I graduated not even 10 years ago.

Also, anecdotally, of the couple hundred people whose professional statuses I'm routinely exposed to I don't think I've ever heard "Boo, got to take a pay cut" except for making the BigCo to startup or self-employment transition.

>In the 4 years I've been on HN the quotes I hear for "generic CS undergrad from Stanford" are up more than 25% (80k to 110k or so, starting salary offer).

Just looking at Stanford grads is far from representative. In fact Stanford is possibly the most extreme outlier!

The US median salary for a fresh CS grad in 2013 was $59,221 [1]. Adjusted for inflation, that's $41,323 in 1998 dollars [2], which at that time was a pretty unremarkable salary for a fresh CS grad (or anyone who just knew HTML, for that matter).

Barring a more methodical argument, I think it's more likely that your perception of the market is skewed by "generic CS undergrads from Stanford" you rub shoulders with than it is that the widely cited averages of the entire market are skewed by a sudden youthfulness of the workforce.

1) http://www.forbes.com/sites/susanadams/2013/01/11/starting-s...

2) http://www.bls.gov/data/inflation_calculator.htm

Is starting salary the best thing to look at though? We are all aware of the 'non-programming programmer' problem, and therefore graduate salaries will be accordingly risk-adjusted from the employer's POV. Maybe a better indicator would be salaries for roles requiring 5 years of professional experience, where the danger of hiring someone completely inept should be more remote. Or maybe you need to look at a more sophisticated measure like the range of salaries on offer for programmers in the same age group.

Wow, really? That median was $65k/year when I graduated in 2011.

You and the IEEE are sampling different universes. Specific parts of the universe of "devs and related trades" currently have high demand and rising wages.

However, as a Ph.D. holder in quantum optics and semiconductor manufacturing, I would be happy to introduce you to a dozen physicists and EEs who are unemployed or who work for half my salary. There is no apparent shortage of physicists. My impression is that the biologists have it even worse.

The problem in this conversation is the notion that (e.g.) "web and mobile software development" is a subspecies of STEM. This premise gets sillier by the year, as more and more babies grow up teething on their parents' old iPhones. Many of the best developers I know have no training in science or engineering: They're musicians or historians or artists, or they have no formal schooling beyond high school. Web development is as much about project management, self-direction, self-education, patience, consistency, pattern recognition, memory, design sense, decisiveness, experience, practice, persistence in the absence of immediate positive feedback, equanimity when faced with a thousand distractions, and – above all – communication as it is about scientific or mathematical knowledge.

And, while the combination of skills that make a great developer is evidently rare – or so the market conditions seem to say – it doesn't serve anyone to pretend that this represents a shortage of physicists or biologists or even of computer scientists. [1] It's a shortage of developers, and a particular subspecies of developer at that.


[1] Well, anyone who doesn't run a school for STEM, anyway.

It sounds to me as if you are saying average wages are not increasing because of the influx of young developers. Isn't that exactly the point the article makes, that the non-increasing wages indicate that there is no shortage of tech workers?

It also seems unlikely that the decreasing averages don't affect the wages of older tech workers. They compete with the young, cheap tech workers after all - why should companies continue to pay high salaries to old tech workers if they could replace them with cheap young tech workers?

> why should companies continue to pay high salaries to old tech workers if they could replace them with cheap young tech workers?

Because young workers are not a substitute for old workers.

You could argue that four fresh graduates are a substitute for a single good senior dev, but in many cases even that is not true. Thus, old guys continue to find good jobs for good money.

This worry is as old as software. The cheap hordes will replace you. It used to be cheap hordes from overseas, and now it's cheap hordes fresh out of school. They're on their way. But somehow they never seem to arrive.

> They're on their way. But somehow they never seem to arrive.

I wonder if that's because the demand keeps increasing. Software is eating the world. Sure, supply is increasing from overseas, and young people, but demand is growing more.

Non-increasing wages are a statistical mirage, not a truthful depiction of on-the-ground reality. The overwhelming experience of actual, countable persons is that their wages have increased substantially. The only reason this does not raise the aggregate wage is that too many people have gone from $0 to a number which is, relative to the average, low. (You occasionally see this in larger economic statistics, often as a result of immigration, by the way. "Wages are frozen since the 70s!" is not all that true -- wages are way up for Mexicans who are now Americans, mildly up for Americans who were already Americans, etc.)

Although your argument could be correct, I am unconvinced:

Everyone sees their wages increase substantially, as they gain experience. This does not mean that wages are increasing.

Inflationary effects should be taken into account - wages should at least increase with inflation. Everyone seems satisfied to receive a 5% pay increase, even if that is a freeze in real terms.

If wages were indeed frozen since the 70s, your actual countable people may not notice due to the above two effects. I gather that you understood that, but I thought it worth making clear.

Your argument boils down to the idea that wage figures are biased by the fact there are more new entrants into STEM careers than there were in the past. Do you have evidence for that? I suggest (based on the graduate figures) this may not be the case.

On your point about Mexicans creating low paid work: I would assume that immigrants are now doing jobs that Americans used to do - so it's not that they have created additional low paid work that biases the statistics, it's just that they have taken over a portion of the low paid work. So, I think this should not be taken into account in the stats.

It's unclear whether immigrants are doing jobs that Americans used to do, or whether additional low paid work was created. I lean towards the latter, but I don't know how to prove it.

However, the data does show that when you restrict the data to Americans with American parents, incomes have risen for everyone outside the top quintile.


Not really, influx does not automatically imply lack of shortage. I think there is a common impression of quite the contrary: a demand bubble, at least in my local SFBA tech scene. Replacement may not be happening because of a demand spike and/or other reasons having to do with cheap young tech workers not being the right fit, or not being perceived as a right fit, etc.

There are many factors at play, but wages I've seen in the US have remained flat, or even dipped once one factors in inflation. I make less than I did in 1999, in spite of the fact that I've been promoted. Part of this is the use of offshore talent, and H1-B visa workers. I've got people from both camps on my team now.

It might be in your best interests to secure an opportunity to renegotiate the wages you receive, either with the firm you currently work for or any of the numerous ones which would be happy to take your call.

Thanks for the suggestion. In fact, I've already done that, and am aware of what talent can charge for my line of work.

I seriously question your conclusion that offshore talent and H1-B visa workers lower your personal salary by a huge amount.

It might just be a pretty argument that is very useful for politicians and labor unions (or the like).

Actually, the opposite can be argued. If those workers get a lot less, the team's cost as a whole drops, making room to pay you a raise. Except for external factors like management just taking the cost savings and putting them somewhere else. But then again, those savings could have been taken out of your paycheck ...

No. This is simply supply and demand. The more workers available for positions, especially ones willing to work for less, the larger the supply, and the prices I can charge for my work go down. Management isn't going to give me a raise because they saved a buck on someone else. In fact, they're more likely to try to find a way to get rid of me and replace me with someone who charges less.

People aren't paid based on what their companies can afford to pay them though, they're paid what the market will bear.

I think there are a couple of factors at play. Labor markets especially for highly skilled jobs tend not to be exactly the best examples of "free market" and "perfect competition". There seems to be a lot going on... imperfect information, differentiation, "cartels" on both sides...

Agreed, but wouldn't those factors also affect your scenario in which lower-paid workers free up cash to raise salaries? I only meant to argue that the "what the market will bear" model is closer to reality than the scenario you presented, not that it is a perfect fit.

On the other hand, as you age, your salary increases and as you compare with 10 years ago, you omit inflation.

Averaging same-age, same-years-of-experience, inflation-adjusted developer salaries across years would make better sense.

Anecdotal evidence, and looking at outliers like Stanford grads (not even 15yr vets, mind you), should be tossed aside.

Neither represents the overall technology picture. Wages have stagnated, so "Boo, got to take a pay cut" won't happen, but you may not get a raise for 1-2-3 years, which in effect is a pay cut.

My experience is that it's very hard to find good developers, or university graduates who want to become developers.

The CS grads that I've interviewed fall into three groups:

1) They were good at Math, spent 4 years avoiding anything involving computers, and their career goal is to find a cushy middle-management job.

2) Arrogant CV-fillers who think that their HTML/JS 'skillz' make them God's gift to the world. They then end up working for Big Consultancy Firm on Sharepoint (because the money is good).

3) The rare gem, who loves computers, loves coding, and usually gets snapped up instantly by Google/Microsoft.

As someone who doesn't have a degree, I find the level of CS graduates from (top!) universities disheartening. I've seen people who couldn't implement a simple sorting algorithm in pseudo-code (or even explain how they'd do it!).

Things I can do with my eyes closed; you know, because I have actually read my algorithms book cover-to-cover.

The cause is clear: developers can earn very good money at the moment. It's a good career move. So it's attracting the lazy chancers who just see the $$$$$.

Whatever happened to pursuing a career doing something because you, you know, love doing it?

There are things from a formal CS background I wish more people mastered. Things like understanding how JOINs work, basics of memory management and understanding that two nested for loops is likely to be worse than a single one, even in the cloud. But knowing how to implement a specific sort isn't one of these. It's simply never come up outside of an exam.
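To make the nested-loop point concrete, here's a hypothetical example (mine, not from the thread): checking a list for duplicates with two nested loops versus a single pass with a Set.

```javascript
// Two nested loops: O(n^2) comparisons.
function hasDuplicateQuadratic(xs) {
  for (let i = 0; i < xs.length; i++) {
    for (let j = i + 1; j < xs.length; j++) {
      if (xs[i] === xs[j]) return true;
    }
  }
  return false;
}

// Single pass over a Set: O(n).
function hasDuplicateLinear(xs) {
  const seen = new Set();
  for (const x of xs) {
    if (seen.has(x)) return true;
    seen.add(x);
  }
  return false;
}
```

Both return the same answers; only the growth rate differs, and no amount of cloud hardware changes the shape of the curve.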

I'm shocked that a proper CS course wouldn't teach them. We covered Joins (and the relational maths behind them) in first year, and "nested for loops being bad" would be a necessary by-product of learning anything about complexity.

Oh, I don't mean to say these aren't taught (though being taught something and mastering it isn't quite the same).

What I mean is that those are the parts of a CS background that have actually been useful in my everyday coding life, and that I would expect from candidates I interview. As opposed to knowledge/implementation of different types of sorts, which is also part of a CS background, but has low practical use in my everyday coding.

> the parts of a CS background that have actually been useful in my everyday coding life

Someone fresh out of school would have no idea how to discern that from anything else in the wide, wide world of CS unless they have experience working in the industry (from, say, a summer internship or co-op terms).

Hell, mastery of basic DVCS operations (say, git pull, git commit, git merge, although equivalent experience with hg, perforce, or TFS would be reasonable) isn't a guarantee, even if students spent four months in a lab "using" such a tool.

The people who "do best" in university get told by their professors to become graduate students. Most are not optimized to teach the skills you seek -- if you picked a random fresh graduate and they happened to have those skills, it's probable that either they taught themselves or they learned them at another employer.

mm, I agree with you there. I guess it's because sorts come up so much in a CS curriculum that they're considered to be magnificently important. In other words, they're just really handy bodies for imparting central ideas (like complexity), and people mix up the messenger with the message.

I think that's kind of the point. If radicalbyte is looking for (3), they will probably know the information because it is interesting.

I love coding, but if you make me implement a quicksort in the interview I'll probably be too nervous and fuck it up. Have you tried having more interactive interviews, or handing out projects?

I just like doing cool things, honestly. And I know a few people that are just like me.

We tailor the interview to the applicant: so someone coming from (GOOD!) Uni should be able to verbally explain a sorting algorithm. We don't ask them to code - it will take too long and a discussion is much more valuable (and almost impossible to pass by memorizing a couple of algorithms).

Heck, I passed a harder version of the test (including the theory, I'm a senior/lead/old-fart so I was expected to implement everything on top of the discussions) and I left formal education at 17 :)

They don't have to implement a working version, just give a good outline and be able to answer questions about it :)

For experienced people we go further, and do ask for some coding.

You've got 5 years experience in SQL? Then solve this simple problem which requires using JOINs and aggregates.

You're a javascript expert? Great, then demonstrate that you know the syntax and know what a closure is.

Java expert with a focus on patterns? Then you should be able to implement one of: Singleton/Factory/Command.

We get more people who have 10 years of experience with C# and are self-proclaimed experts, but who don't know the difference between the heap and the stack, than we do CS students applying for the wrong job... but that's for a different thread :)
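For what it's worth, a minimal sketch of the closure question mentioned above might look like this (my own illustration, not the interviewer's actual test):

```javascript
// A closure: the inner function keeps access to `count`
// even after makeCounter has returned.
function makeCounter() {
  let count = 0;
  return function () {
    count += 1;
    return count;
  };
}

const counter = makeCounter();
counter(); // 1
counter(); // 2 -- `count` lives on inside the closure
```

Each call to makeCounter produces an independent counter with its own private state.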

Here's my problem with using quicksort at interview:

A developer who writes an implementation of the quicksort algorithm as part of their job should usually be removed from their post for incompetence.

The interview process should be a two-way procedure which also allows the candidate to understand what is expected of them in a role (and make their decision about the job accordingly). If regurgitating quicksort is expected, then this is a fair test: otherwise, this is not how you should be presenting the job.

Ah, that seems pretty reasonable. I think my biggest problem is that when I'm interviewing I'll try to answer too quickly before thinking through a problem completely.

But I love discussions too, and if it's something I'm comfortable with, I could keep talking about the topic for as long as the interviewer wants.

I think there's value in recognizing when you don't know the answer to something. I had an interviewer ask me the difference between call and apply in JavaScript, at the time I wasn't sure, so I started to answer him... I explain that I knew you could use both to set the context in which the function was called, but eventually I just said: I'm not sure. And the interviewer gladly explained it to me.
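For reference, the difference the interviewer explained boils down to this (my own example, not his):

```javascript
// Both call() and apply() invoke the function with an explicit `this`;
// they differ only in how the arguments are supplied.
function greet(greeting, punctuation) {
  return greeting + ", " + this.name + punctuation;
}

const person = { name: "Ada" };

greet.call(person, "Hello", "!");    // arguments listed one by one
greet.apply(person, ["Hello", "!"]); // arguments bundled in an array
// both return "Hello, Ada!"
```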

In my case I got an internship where they gave me a 24 hour project to demonstrate my skills. I found that way less stressful than any interview I've ever had. And after the internship the company was so happy that they offered me a full time job!

Oh, that's exactly what I want to hear: everyone has gaps in their knowledge and it takes guts to admit it. As long as you can see that, and know where to gain the knowledge, then that won't be a problem.

Unless the missing knowledge is something vital and general, like not knowing what HTML is.

In the future I want to use internships to find younger engineers - it lets you get to know people a bit, and keeps new ideas flowing through the company.

How can you mess up quicksort? Pick a pivot, lower values on one side, higher values on the other. And recursively rinse and repeat.
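That verbal outline maps almost line-for-line onto a deliberately simple sketch, e.g. in JavaScript:

```javascript
// Pick a pivot, lower values on one side, higher on the other, recurse.
// Not in-place and not optimal (the in-place version is where the
// index-juggling traps live), but it mirrors the verbal outline directly.
function quicksort(xs) {
  if (xs.length <= 1) return xs;
  const [pivot, ...rest] = xs;
  const lower = rest.filter(x => x < pivot);
  const higher = rest.filter(x => x >= pivot);
  return [...quicksort(lower), pivot, ...quicksort(higher)];
}
```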

I get really nervous and I make stupid mistakes. Every single time I've had an interview and I'm asked to code I end up making stupid mistakes because I try to answer too quickly or because I don't sit down to properly think about the problem. It's definitely something I need to work on.

Thankfully I got an internship that let me do a small project to demo my skills, instead of hazing me. And after my internship they were so happy that they offered me a full time job.

Quicksort is amazingly easy to screw up. There's a lot of opportunity for off-by-one errors in all the index manipulation. I coded quicksort as an interview practice exercise in plain C a few years ago and it took quite a number of iterations until getting it all correct.

I once messed up a string size parser in an interview by going len instead of len - 1 in measuring the number of characters displayed.

It is stress. Introverts like myself, I imagine, are collectively horrible under intense pressure from random strangers who can dictate your future for months or years.

Maybe the real lazy person is you and the way you interview. What does writing mergesort on a whiteboard have to do with writing Web software in 2013?

Nothing, but I think the implication is that it is a signal that can be used to predict which of the three categories the person falls into. People who fall into the third category (the desirable one) are likely to be able to implement mergesort on a whiteboard because they have a genuine interest and paid attention.

That being said, other, more practical questions may exist that retain the signaling characteristic.

With respect, I'd like to add a fourth category:

4) Actual computer scientist. If you ask him how to write mergesort, he will ask for a book to look it up in, same as if Einstein was asked for the value of pi. If you ask him to describe his favorite algorithm or the details of his current research, he will gladly tell you in extensive detail.

The problem with hiring those of us in category (4) is that it's hard to tell if we're bullshitting you, since after all we operate at a far more advanced and specialized level of expertise than most software development jobs actually use or require.

Sorting algorithms in general aren't constants, though. The implementation can have significant impact on running time, memory use, etc. And there are an infinite number of them with various tradeoffs you can tune.

I wouldn't expect an interviewee to know sorting algorithm X because they may just have forgotten. But why wouldn't they be able to come up with a sorting algorithm, and have some feel for the behavior? Seriously, selection sort is about as easy to come up with as fizzbuzz. If they can't think through a simple problem and come up with a solution, is it wrong to question their problem-solving skills?
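As a rough illustration of how little machinery selection sort needs (my own sketch):

```javascript
// Repeatedly find the smallest remaining element and swap it to the front.
function selectionSort(xs) {
  const a = xs.slice(); // copy so the input isn't mutated
  for (let i = 0; i < a.length - 1; i++) {
    let min = i;
    for (let j = i + 1; j < a.length; j++) {
      if (a[j] < a[min]) min = j;
    }
    [a[i], a[min]] = [a[min], a[i]];
  }
  return a;
}
```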

Actually, if you asked in an interview how to sort, I'd try to come up with quicksort or mergesort. I wouldn't give you insertion sort or selection sort because I'd feel terminally embarrassed to go with so bad an answer.

Is it terminally embarrassing to say "this is bad, but..." and pass the test, then discuss (maybe write out) more advanced options? If all they care about is selection sort, you've saved time (often a good quality). If they care about more, you've demonstrated something, and opened the door for more.

Depends. I've been in interviews where they let me damn well know I was losing points for giving an algorithmically sub-optimal answer.

Sounds like (a) fun interview(s) :/ Alas, they exist. I guess it's too much to hope they don't.

Where did he say he was writing web software?

The author of the parent post probably assumed it from radicalbyte's "about" section.

Doing what you love, without getting paid a lot for it, is for social workers and teachers and other people whose jobs don't make someone else a ton of money.

Social workers can build parks or public housing or feed people, and in each case there is increased traffic through an area creating a boon in income for local business, there are public utilities companies that profit off more people in homes, and there are farmers and food distributors who see increased demand from food kitchens, even if the margins are thin.

Likewise, teachers lead to the student often becoming more productive and wealthy by knowledge transfer.

There is value in almost anything you do. We just have a social organization where some things are overvalued and others are undervalued.

Tuition from schools that don't really teach?

I'm not in the US, but from what I've seen here and in other boards, the whole STEM Crisis is actually a HR Crisis.

No wonder you can't hire anyone if you want rocket scientists to build your world-changing photo sharing app.

It is an HR crisis on two points -

1) There is a business preference to find the best candidate for the lowest rate (practical sense) which leads towards hiring the youngest/cheapest resource available who meets the criteria.

2) Existing companies refuse to invest in building up / educating employees. In the old days you hired based on merit and potential, but you also provided training to make sure employees adapted with technology. These days, there is no training and employees are expected to adapt on their own (there are certainly arguments for keeping on top of things, but if you have a full time job and other obligations, external interests might not be technical/career related)

Unfortunately, because of (1) and no requirement for companies to commit to (2), there is a push to allow for more non-domestic workers through programs like H1-B and related visas.

Personally, I think most engineers should hone their skills and continue to learn and evolve. That said, some situations don't allow that, and when coupled with companies' refusal to hire/train based off potential, we end up in the mythical "STEM crisis".

I'm not talking about the founder-visa issue here, I am talking about the normal boots on the ground. For them, the end of training and the push for the cheapest option have created a toxic environment for a certain segment of the tech community.

I was thinking of more than two points. The two you mentioned, plus looking for overqualified candidates, plus:

1) Requiring more experience than necessary or even plausible on certain technologies. 5 years of experience in X. When X is 2 years old.

2) Focusing too much on a specific technology stack. We use X, Y and Z. You lack Z? Bye. Or we use Z 1.2, you've only used 0.9? Bye.

3) The 3 day interview, the brain-teaser interview, the work-for-free interview, the you-only-got-1% psychometric test interview.

Regarding your point #2, Mexico recently implemented tax incentives for companies hiring first-time employees. That's something other countries should consider.

Specifically regarding #2, I would like to see an emphasis on training and any company requesting a foreign worker visa pay into a pool that is dedicated towards training domestic technologists in transition. Unemployment is one thing, but, not everyone has availability to learn/transition into new skills.

Again, my argument is based off the norm that companies invest much less in their employee development than they used to - the onus is strictly on the employee these days. Which is the other extreme.

I don't understand this whole parochial line of reasoning. We shouldn't protect local capital and products with tariffs, and neither should we protect local labour.

Free trade (and movement of labour and capital) is better for our economy, and better for the rest of the world. The people on the other side of the fence are human, too!

The kind of free trade you're talking about is better for buyers but as I'm an employee and not an employer, I'm more concerned about the sellers market. Complete, unrestricted labour movement means all wages average out. I'd rather not be getting paid the average between the lowest paying employer out there and the highest, so I'm all for setting up boundaries between them.

You are arguing that distorting the market to increase pay for developers is a good thing for you.

Some work can be easily moved around the world. Some can't. For instance, workers on your local railway can force up wages because the railway can't move, and local people need it. Software can be written anywhere, so if US wages are high, it will be written elsewhere.

And you do see this. US wages, and Japanese wages, are higher than in Europe for developers. Here in the UK, a lot of engineering is essentially R&D outsourced from the US to a place with cheaper STEM workers.

At the same time, I feel wages can be at current levels because developers don't realize how much they are worth. A big part of this is that it's hard to link revenues, even at a software company, to an individual developer's achievement. If profit is a pie(chart) carved up between different workers and the shareholders, then the worker who has the best visibility into how the money gets there, and how they contributed, will do best. Hence sales/marketing, management, and finance all tend to do better, because they get to see the budget.

Despite what you have heard, you can't cheaply outsource most development work; many projects outsourced to India actually end up costing more than they did when they were in the US. Granted, the developers cost a lot less, but communication barriers often drive external costs through the roof.

I'd draw a distinction between outsourcing, where you buy programming as a service from a third company, and relocation of a department, which is common in large companies.

So for instance, in Basingstoke, an area where land is cheap just outside London, Sony, Huawei, and ST-Ericsson (a semiconductor major) all do high-grade R&D work. Friends who work for those companies tell me that the work is run from a head office in America, Japan, or China, but done in the UK because of good IP laws, and lower-cost workers than in Japan or America.

Similarly, AMD have a huge R&D presence in India, and a lot of foreign talent is willing to relocate there because although wages are lower than in the US, they can enjoy a very high standard of living.

I think the outsourcing you are talking about (a small, non-IT business buys a custom warehouse management system, programmed as a service) has little resemblance to the actual movement of jobs within large companies that I discuss.

This (movement for whatever reason) has happened to whole industries. The UK used to be a center for semiconductor manufacture, but lost that status for 2 reasons. One was a series of catastrophic strategic decisions at UK semiconductor companies. But the other was that the UK government stopped giving tax breaks to the same extent, and were outbid by Singapore, which was willing to provide tax breaks, infrastructure, and an easy ride through city planning.

The reason semiconductors moved was that the industry involves huge capital investments, and would make them where the deal was best (otherwise they would be outcompeted by a company that got a better deal).

In software, the primary cost is definitely wages, so I'd presume large companies will be very sensitive to wage prices, and go the place the trade off is best.

Unrestricted labour movement will mean you get paid average if you are average.

The positions with higher pay should be open to the candidates with better skills, wherever they happened to be born.

> Unrestricted labour movement will mean you get paid average if you are average.

There are quite a lot of assumptions in this statement. You assume an efficient market (something that does not and probably cannot exist at scale, if at all). You assume information equality, which is the exact opposite of actual practice. You assume the people who make the hiring decisions have some means of separating the average from the above or below average. In practice, development skill is completely separate from and unrelated to the skill of marketing oneself. Ability derives from the former, salary from the latter.

So no, unrestricted labour movement most definitely does not mean you get paid average if you are average (not that I care about this anyway: I think all developer pay should go up across the board). What it will mean is employer perception will be that they should be able to get even better talent at even lower prices. And since employers can wait literally years to fill positions and people can rarely go more than a month without a job, this will certainly drive salaries down even further.

Of course, one could imagine that eventually everything would become more efficient and companies should recognise that better developers cost more. But how many companies actually care about having better developers? For the majority of IT jobs, the hiring company doesn't care about developer quality, they care about costs. If development takes longer, so what. Everyone else in their market will be working under the same constraints so it doesn't matter.

By that argument, we should put up trade barriers for all goods and professions. What makes you special?

But THERE ARE trade barriers for other professions. How many H1B CEOs are in the US? How many lawyers or doctors are on H1B visas here (or their work outsourced off-shore?)

Software development appears to be the one the most exposed to cheap competition.

Yes, and those trade barriers are bad, too. (Ask any economist.)

So the fence benefits you, but the people on the other side of the fence are human too...

As if their only choices are between working in my country's economy or starving to death.

The issue is, those people "on the other side of the fence" can have dramatically lower costs of living. A completely free labour market for software development would mean that nearly all software developers would have to live in some extreme low cost location just to survive.

And who does this benefit, exactly? Not the software developers: the majority of them would take a massive reduction in salary, or lose it entirely. Not consumers, because that's us, the ones who just got a massive pay cut or lost a job. Nope, just executives who are already making far more than their share.

In other words, globalisation of the labour force helps those who need the least help.

(2) is seen by the company to be in its best interest, because otherwise the trained employees can find better work elsewhere or demand more money.

Or ads, as Jeff Hammerbacher said better: "The best minds of my generation are thinking about how to make people click ads. That sucks."

Another way to look at it is that the best minds are thinking about how to optimise capitalism.

They are doing that and in effect we are all getting a better more efficient society. Plus the best minds aren't working on world peace because people vote for people with the best hair to do that job.

That only really makes sense if you think that ads are providing information that's relevant to people's best interests and is going to be fed into fairly rational decision making, rather than exploiting cognitive biases as an attack vector.

I struggle to believe that our current models of advertising can in any way lead to overall societal efficiency. We can do much better.

Another way to look at it is that the best minds are thinking about how to optimise capitalism.

If this is true, we should all be mortally terrified and running around like headless chickens, screaming to the heavens to save us.

Unless you think of a dynamic ad market as an efficient market signals system critical to the economy.

Ie get the most interesting market messages to the right people, while not bothering the rest.

Not that I do anything related to advertising, but when done intelligently it can actually be pretty important.

The "while not bothering the rest" part effectively doesn't work, though. People who will never buy anything from an ad in their lives still get ads served to them, because the cost of serving ads is roughly zero.

The fact that ads are paid per click and not per impression creates more ad impressions, not more efficient ones.

I think there are a lot of people who claim to not be influenced by advertising but actually are. Just because you’re not clicking on ads doesn’t mean they’re not succeeding at influencing your behavior:


Somebody has to pay the bills for self driving cars and fancy eye-wear.

Ads can't really be that "somebody". If everything is paid for by ads, what exactly are those ads advertising?

I thought we were talking about Google.

It almost feels like VCs are putting together teams of people overqualified+tempting enough for MicroFaceGoogHoo! to want to pay millions to acquihire them, and then telling them to do various money-burning attention-grabbing "peacocking" things until it happens.

Sometimes the illusion wavers for a moment, when the product part fails to happen (e.g. Color) but everything else still does.

That's... a surprisingly good hypothesis.

It's also true in the US. We've gone through several transitions, from mainframes all the way to smartphones, bringing technology to more people each time, and my anecdotal experience is that the average HR worker's understanding of computing has actually decreased (in that I face forms of ignorance now that I did not in the past).

"That report argued that the best indicator of a shortfall would be a widespread rise in salaries throughout the STEM community. But the price of labor has not risen, as you would expect it to do if STEM workers were scarce."

The huge disparity of pay between different STEM disciplines should clearly show where the demand is. Biologists compete over shrinking government grants, while petroleum engineering students receive job offers of $90K starting. Treating STEM as a homogenous bloc will hurt the less in-demand fields while failing to treat the shortfall in others.

One of his points was that it is extremely hard to predict where the next boom in the greater STEM field will be. Several years ago the internet boomed, then bust, then boomed again. Meanwhile, petro rose while aerospace fell... It's precisely because STEM skills cannot cross fields (you try getting a rocket scientist to build you a photo-sharing app) that we get moments of massive oversupply, then terrible shortage, and then massive oversupply again.

Cannot cross fields? My major was Me/Ae, took no CS courses, worked for Boeing designing airplane gearboxes and hydraulic systems, and yet I had no trouble transitioning to a CS career.

When I took mechanics/fluids/electronics/materials classes, I discovered that the math was all the same, except that the EEs liked to use 'j' instead of 'i'.

This speaks to how quickly what we're taught in school goes out of date. You were able to transition to CS and compete with same-age CS graduates because much of what they were taught in school had become obsolete, while what you learned on the job became more and more valuable and relevant.

> ... except that the EEs liked to use 'j' instead of 'i'.

And that only because of the coincidence that 'i' refers to current in electrical equations.

It also addresses the issue that, with respect to getting useful results from a computer, one's mathematical knowledge and ability is now more important than one's knowledge of a computer's inner workings.

The subject matter becomes obsolete, but the mental processes and skills stay relevant.

Too much of education is aimed and evaluated for data storage rather than brain training.

Very true. Americans are taught what to think, not how to think.

In other countries it's worse rather than better. China comes to mind. In Germany I'm not hearing a lot of "mind training matters" noise out of education either.

> In other countries it's worse rather than better. China comes to mind.

Yes, or Japan, where being different or standing out in any way is seen as the height of rudeness. Sumimasen, arigatō gozaimashita ("excuse me, and thank you very much").

I guess we have to face the fact that the ability to think for oneself, and think critically, is rather unpopular, with rare exception.

I was very fortunate to attend Caltech, where the focus was on how.

Programmers can cross fields, as can most engineers.

One of the things that seems suspect to me about this article is the conflation between "STEM" and "Computer Science / Software Engineering / Computer Engineering".

Nowhere in the article is "STEM" defined precisely, but the author clearly notes that the NSF numbers include health-care workers, psychologists, and social scientists. Even if we replaced "STEM" with "Engineering" in the entire article, it's still a fallacy to say "there are 0.3 million engineering jobs and 11.7 million engineering graduates -- there can't possibly be a shortage of engineers." I wouldn't automatically assume that someone with an undergraduate degree in Chemical Engineering or Civil Engineering is qualified to work at <insert software company here>[0], in the same way that I would be completely unqualified to run an oil field or draft up a plan for my city's mass transit system. Sometimes people do have extra skills atop their degree that land them in a job outside of their field, but that's not true for everyone.

If anything, I think the whole "STEM shortage" sounds like a plea to US post-secondary institutions to make their CS/EE/CE/SE graduating classes bigger -- perhaps at the expense of other STEM programs.

Edit: [0] I'm assuming a software company small enough that it's not doing city planning on its own. I wouldn't be surprised if there actually were Civil Engineers working at the likes of Microsoft and Google, trying to figure out the master plan for their main campuses, but that's not my point.

No, I think the issue is that software developers have reached a limit of how low a salary they are willing to work for, and not enough of them have a limit low enough for what companies are willing to pay. So rather than cave in, the companies are playing politics, in the hope of flooding the market to drive prices down nationally.

If that is indeed the rationale, I don't think having more bodies in the SF Bay Area is going to help. (More people means housing prices will go up, and thus the rational price accepted by half-decent developers should go up. It's easy to spend 70% of an entry-level Google salary on housing in the city right now.)

I wonder if buying up a bunch of houses in Detroit, and offering one as a starting bonus (along with gigabit Internet and a nearby Wal-Mart) would be enough to entice people to work for less.

Anecdotally, my brother, with a Mech Eng degree, has been unemployed ever since he graduated a year and a half ago, despite looking constantly. He's married with a kid and lives with his wife's parents in a spare bedroom.

The STEM shortage is nothing more than employers looking for cheaper workers (either salary-wise, or without any obligation to train them in the necessary skills).

If employers were looking for cheaper workers, your brother would have taken the cheapest job available, wouldn't he?

After all, working as a Mech Eng even at minimum wage should for him be preferable to flipping burgers at minimum wage or being unemployed, especially with a family...

It depends. Countries with actual safety nets often provide enough unemployment money that one isn't forced to take some ridiculous "just be employed" job while looking for their next real job.

In the Netherlands we have enough unemployment money that one isn't forced to take some ridiculous "just be employed" job. But the longer you are unemployed, the harder it will be to find a new job. A hole in your resume doesn't look good.

Of course. There is always the driver to actually work. In places that lack a safety net you have to take some "flipping burgers" job which you still can't really report on your resume so you end up with no time plus that ugly hole in your resume.

It has been shown that humans prefer pay for work over the same pay for no work (within reason). Many other species do too. Except cats.

Also there is lots of room between an "adequate" engineering salary and minimum or no wage...

Figuring out how to get underachieving academics into paying, or even well-paying, jobs should be a higher priority in politics.

I am going to challenge you to find a single peer-reviewed study that concludes what you just said. Almost all recent claims about how paying less is somehow better come from some random HR person's anecdotes. Studies either do not exist or are flawed in some obvious way.

I did not say "paying less is better". That sounds like a straw-man argument.

I did say that "pay for work is (within reason) preferred over pay for no work", all else being equal.

The term is "contrafreeloading", and is for example explained in Dan Ariely's coursera course: https://class.coursera.org/behavioralecon-001/lecture/57

I respect your argument about how what you said is different from the example I gave. I am not going to comment on this person or the lectures. I am skeptical about any "cool" theory that can't be traced back to a peer reviewed study. People need to realize that really knowing something is very hard. Extraordinary claims require extraordinary evidence.

Type "contrafreeloading" into scholar.google.com. Instant flood of peer-reviewed studies.

By the way, "this person" in the lecture happens to be one of the preeminent researchers in behavioral economics...

I am sorry, but I can't find a study that concludes "humans prefer pay for work over the same pay for no work". You are just trying to overwhelm me into reading a vast library of articles. I found a few studies where the "work" was lever-pushing, and the study was with children, who as they age become less and less stupid, exactly as expected (by me) http://www.amsciepub.com/doi/abs/10.2466/pr0.1981.49.3.859?j...

So please try to refer me to a study that I can apply in a general office environment with people aged 30 and a real-world job so we can check the claim.

The only people who claim there are plenty of talented developers to hire are people who have never tried to hire one in the last three-plus years.

I hire developers anywhere in the world to work on cutting edge infrastructure software and it's still tough. But not as tough as restricting us to the Bay would be.

I'm a freelancer who is available and I don't have the impression companies are desperate for software developers. I get requests from recruiters, but they offer significantly less money than a couple of years ago, and they seem to be able to find other developers when I decline or don't express great enthusiasm or years of experience with whatever frameworks they require. (Berlin, Germany)

If you can't attract high-quality candidates then either you are low-balling salaries or your company does something shady. Sorry to be blunt but that's the truth.

Sigh... Could you please add the salary that you are looking for? I'm going to guess it's too low.

To be fair from his point of view he would still be correct - there are not enough developers available to push the price below his threshold.

Tough. That's the real world. Complaining that "there aren't enough good cars available to buy" when all I want to spend is €500 will not get much sympathy.

If you say "not enough x" the question is always "for what?". That usually isn't even answered in those articles. Not enough developers to defeat cancer and world hunger? Or what?

"plenty of talented developers.."

From my experience in the Bay Area and India, I never saw a shortage of resumes for open positions. The pool was large enough even if you assume people apply to multiple companies. The shortage, though, was for talented folks meeting the expectations of managers and co-workers.

So train them. Give them a general aptitude test, tell them you're taking a chance and hope they'll stay for at least three years, and give them mentors and a couple of months to come up to speed or they'll be let go.

What you describe is very close to how hundreds of thousands of grads are hired into the Indian software service industry: an aptitude test, followed by training and plenty of time to come up to speed and be evaluated. It has worked out fine for them. Layoffs are very rare.

Startups are a different story as time to market is crucial and you have almost no slack.

Maybe this is a problem with startup culture, perhaps technologists have it backwards.

In other industries, new startups seem to usually come from people with experience, who have been around the block and have some idea of what is required to ramp up a company in the sector.

In technology it seems that if you haven't founded a company in your 20s then you have "missed the boat".

Yeah but to do that you would likely be offering less money than they would get elsewhere and so as soon as they could they would jump ship.

We hire talented developers on an ongoing basis. I don't see any shortage at all.

Offer $1,000,000 salary. Solved.

And until those new recruits enter the workforce, tech companies like Facebook, IBM, and Microsoft are lobbying to boost the number of H-1B visas—temporary immigration permits for skilled workers—from 65 000 per year to as many as 180 000.

So, assume a $30K salary differential, times 115K jobs = $3.45 billion in savings.

Well, that will pay for some lobbyists...
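The back-of-the-envelope arithmetic above can be sanity-checked in a few lines. Note the $30K salary differential is the commenter's assumption, not an official figure; the 115K comes from the proposed cap increase (180,000 − 65,000) quoted from the article:

```python
# Sanity check of the comment's H-1B savings estimate.
# Assumptions (the commenter's, not official figures):
#   - annual salary differential per H-1B hire: $30,000
# Figures from the article: cap raised from 65,000 to 180,000 visas/year.

current_cap = 65_000
proposed_cap = 180_000
salary_differential = 30_000  # assumed, per worker per year

extra_visas = proposed_cap - current_cap
annual_savings = extra_visas * salary_differential

print(f"Extra visas per year: {extra_visas:,}")
print(f"Implied annual savings: ${annual_savings / 1e9:.2f} billion")
```

Run as-is, this reproduces the 115,000 extra visas and $3.45 billion figure in the comment.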

Duh? There has never been a STEM shortage, at least in the US. Ever. There has only been duplicitous propaganda on the part of companies that want to raise the H-1B limit in order to push down wages.

"The government of India has said it needs to add 800 new universities, in part to avoid a shortfall of 1.6 million university-educated engineers by the end of the decade."...

That's wildly optimistic. The reality is that the unemployment rate for Indian engineering graduates now ranges from an optimistic 50% up to 80%.

[1] http://www.livemint.com/Industry/HCWB4sLvFBxfIFyNBYtqOP/Degr...

[2] http://articles.economictimes.indiatimes.com/2013-06-18/news...

Unsurprisingly, different groups define "shortage" differently. Employers call the shortage real because they can't hire employees cheaply. Employees call the shortage a myth because they can't find a position with a desirable salary.

Although difficult to analyse, we must also consider the quality of STEM degrees from different institutions. There's a world of difference between a first-class degree in Physics obtained from Cambridge compared to, say, Staffordshire. Some universities in the UK have unemployment rates for graduates of over 25%. http://www.telegraph.co.uk/education/educationnews/10159647/...

The problem is that many STEM majors are not the kind of STEM workers there is demand for. In my country, a big share of people who have a CS degree barely know how to write real code. I think it's because (1) the education is too theoretical, and (2) the free education system (no barrier means a lot of people study without having any real interest in programming).

2 seems unlikely. If education is free, why not learn whatever you want? If education is insanely expensive, then unless your family is rich, you have to try to make sure whatever you study can recoup those costs.

Because people don't know what they want to do so they just jump on any education. I've been there. It's so easy to start university that literally everyone does it and what you actually read is secondary.

In fact, many people choose bogus courses like "hat science" just to get study allowance and access to the other benefits. So it's not surprising people pick CS thinking "it's good money so why not".

Part of the problem is that we put people through an extraordinarily difficult educational process (CS or similar) for what's largely become a skilled vocation. 90% of developers, in 90% of their work, will never have to implement an algorithm of any kind that matters. They won't have to understand higher maths, computational theory, language theory, etc. They have to glue this piece from here with that piece from there and write some documentation that they did so.

On the flip side, spending four years learning about computability theory instead of the latest frameworks in foo language likewise doesn't prepare a CS major for employability. Their first employer will have to spend lots of time and money training them in the vocational skills they'll need.

Universities' attempts at recognizing this and educating students for this reality give us majors like Software Engineering and Information Systems which don't exactly match what the market needs either.

What's needed is to take a relatively smart, but ignorant person off of the street, and in 4 years spit out a mid-level developer, with specialties in web development, server development, etc. Their coursework should be nothing but learning languages, APIs, development environments and frameworks and how to coherently document their work.

For those specializing in web development, their senior project should be to build, from scratch, an entire e-commerce site, with product catalogs, promo codes, wishlists, shopping carts etc, including setting up all the servers, and gluing them together. The entire thing should be meticulously documented and launchable tomorrow.

Similar senior projects should go to those specializing in game development (like the senior projects out of digipen), desktop software development (make a basic office suite), server software development (write a database and web server from scratch with full APIs), etc.

In very specific markets this has already happened. Video Game development, for example has spawned vocation focused education like digipen.

CS is both over and underkill for most jobs. And the result is that we interview for CS skills, for a job that doesn't really need them.

My personal anecdote supporting the idea of a STEM crisis:

I'm not a developer. Last year, I was learning programming. I also had a track record of starting and successfully executing projects. Small personal ones, but they made money.

I tutored a student for the LSAT, and he thought I was smart. When he heard I was learning programming, he introduced me to his sister's boyfriend, who was a manager at a local hardware/software firm.

The guy met with me, was interested in turning me into a developer and offered me a decent salary to do so.

He said he couldn't hire enough competent developers.

I ended up pursuing another opportunity (large commute at this job + I wanted to build my own stuff), but this impressed on me the high value currently given to STEM expertise. I didn't have any! Just an interest was enough.

Edit: I had learned the basics of programming, having gone through most of K&R and a couple of Udacity courses. But I had no programming skills that were worth paying for at that particular moment. Nothing that could immediately make anyone money.

No one even talks about a crisis of too few English majors. That seems to be beyond dispute. So maybe there is a bit more of a kernel of truth to the idea that some (perhaps not all) STEM occupations have a limited supply of capable workers.

Unlike English majors, software job requirements change practically quarterly. Companies (mostly start-ups) always appear to look for a minimum of 2 years' experience in some latest, hottest tech/framework, which has barely even been available that long. If someone out there happens to have that skill, s/he will ask for a decent (but not outrageous) salary/rate for a (temporarily) rare skill. Companies looking are not willing to pay that (the CEO needs to get his $2M bonus, after all). So there is your "limited supply".

Maybe, just maybe a couple of passionate employees could be quickly trained? What a radical idea....

limited supply of capable workers, yes. 100% agree. But if you have an oversupply of workers in general, out of which it's difficult to tell which ones are capable, then what?

A fair question. Use research-based hiring procedures to sift through the many persons with degrees (or even without degrees) to find the smaller number of persons who can actually do the job. Key idea: use a work-sample test to hire for any job.

You can't really do that for biology or chemistry.

STEM seems to me to be very much not zero-sum, so I think the more the merrier.

It can be worse than zero-sum. If you dilute the workforce with idiots, you can wind up with a situation where projects requiring insight get completely derailed, and moreover, actually competent people get passed over time after time and quit the endeavor in frustration. This is what is happening in the sciences.

> If you dilute the workforce with idiots

This seems to be the sort of problem that markets are pretty good at dealing with. I wouldn't worry about it.

The market is heavily distorted by governments wanting to fund science as a source of national pride. As you well know, the number of PhDs correlates to national greatness.

"They" have been crying about a lack of STEM (it used to just be scientists and engineers) for at least 30 years.

I don't listen to it any more. When I walked out with my doctorate, jobs were very scarce in the science/engineering fields (although software jobs were increasing).

From what I can gather, the soothsayers simply do a linear extrapolation of current jobs and candidates, and show a shortage. They do no real forecasting by trying to reason about future needs.

So, I don't pay attention any more.

I'm generally happy to see STEM funding in education[1]; really, I am happy to see any funding in education. But the "STEM shortfall" (if there is one) isn't because of a lack of funding for specific education programs; it's because of a lack of demand in those fields. If companies are having trouble finding proper candidates for jobs (they tell me that they are), they either need to look at their selection process (which IMO is pretty terrible) or offer more money; probably a little bit of both. It requires more discipline and sacrifice to earn a STEM degree vs. many of the other degree tracks, so many of these students change plans to a simpler track that gets them more money for their effort.

[1] Disclosure: I am a STEM educator, and I am a little bit worried that this effort might even flood the market for STEM grads lowering wages.

Grade inflation (related to political pressure about the STEM shortage narrative) is such that a larger proportion of STEM graduates than ever before are mostly useless.

Maybe the homogeneity of STEM labour is what the myth is.

I don't understand what a "shortage" entails.

Wow, my friend just launched a STEM job board on Monday. If there is a transition from "Alarm" to "Boom", this seems like a great business opportunity.
