
U.S. Software developer wages fall 2% as workforce expands - biotech_anon
https://www.computerworld.com/s/article/9239308/Software_developer_wages_fall_2_as_workforce_expands
======
hristov
This does not necessarily mean that wages are falling. It could be that a lot
of the new jobs created are "beginner" jobs with lower wages that pull the
average down, but the existing jobs still pay what they used to or more.

~~~
zanny
What is / will drive wages down is that becoming an (experienced) programmer is getting easier: the per-case work needed is shrinking, the barrier to entry is lower (compare writing C++ in 2002 to 2012, and the resources available), and most use cases are best served by high-level dynamic languages like Python or Ruby, which require significantly less work investment to get results compared to old favorites like Java or C.

 _Most_ people don't need a tech savant. As that becomes more apparent, and programming becomes a game of Legos, putting blocks together rather than having to mold every block yourself, I expect the mean salary to drop.

~~~
guard-of-terra
What's up with C++ in 2002 and today? If you knew C with classes in 2002 (MFC style) you were good; if you knew GoF patterns you were elite.

Today you have to know the STL, auto/smart/shared pointers, all kinds of right casts, you have to know boost (a huge code base), and on top of that you are expected to know C++0x (or whatever it's called now?) with its lambdas, autos and stuff, while still understanding the whole "classic" C++.

(But at least nobody cares about MFC and ATL anyway)

And don't forget templates of templates of templates which crept into many
code bases. And 64-bit (or ARM) is now reality which you should account for.

If anything, you have to know ten times more in C++ in 2012 compared to 2002.

~~~
ababababa2
Your example is flawed.

C++0x is easy to learn. For example, auto is just the ability to say "auto
myValue = 5.0;" and have the compiler deduce the type (double, in this case).

Lambdas in C++ are fairly easy. Right casts? Trivial [especially with auto].

Templates of templates of templates are easy as well.

64-bit? ARM? The whole point of C/C++ is to spare yourself assembly. You
shouldn't worry about 64-bit vs 32-bit as long as you use the proper types.

~~~
zanny
This; my point was that you can use C++ make_shared (and eventually C++14
make_unique) instead of manual memory management, you can get away with a lot
more reference passing instead of raw pointer manipulation, the std now has an
actual hashmap implementation, auto and lambdas make things faster, etc.

Modern C++ vs second iteration standard C++ in the wild west days when I was
first learning about it is an entirely different affair.

I also contest having to know boost: I work on some KDE projects and boost is
a dependency in only around 1 in 4 that I have found. Though Qt in many ways
becomes the surrogate stack to learn.

------
JOnAgain
I really hate DOL stats. They have a menu of jobs, and you have to slot people
into them. Often you have jobs that are halfway between 2 and you don't know
which is "better", so you just pick one (e.g. I write a lot of Excel macros,
does that make me a programmer or an analyst? To me, it's obviously not a
programmer, but I wouldn't be surprised if that's not applied universally).
Also, categorizing people is often left to the discretion of the employer, and
sometimes they have an incentive to lie (R&D tax credit anyone?). These stats
include tech support and also capture a wide range from "Web Developers" to
"Computer and Information Research Scientists".

<http://www.bls.gov/soc/2010/soc151130.htm>

------
textminer
Am I the only one driven crazy by reporting that states an ambiguous "average"
for something asymmetric like salary, when the choice between median and mean
as a measure of centrality makes all the difference in the world?

~~~
elchief
You are not alone, amigo.

------
jebblue
The Dice salary survey says salaries went up. I wonder which is right:
<http://media.dice.com/report/2013-2012-dice-salary-survey/>

~~~
ISL
To compare, you'd need to know the statistical and systematic uncertainties of
each measurement.

------
ardit33
I think this is due to an expansion of entry-level jobs, i.e. Code/App
Academy, people learning Ruby on Rails and taking entry-level jobs. Since
tools are getting better and better, there is an expansion at the lower end
of engineering. They should have measured people with the same level of
experience (e.g. 5 years of experience) and seen how they stack up.

In my personal experience, and from what I have seen around for experienced
devs, expect 5%-6% increases annually (a few percentage points above
inflation). It is never linear though. You usually see this increase as a
modest 3%-4% yearly at your current job, and then, if you switch jobs every 3
years or so, another 10% or sometimes more from your new employer. This is for
experienced engineers (5+ years). If you are very early in your career you
will probably see higher jumps.

~~~
riggins
_Code/App Academy, people learning Ruby on Rails and going to entry level
jobs_

I kind of doubt it's Code Academy.

This article as of Jan 2013 said Udacity (which IMO is more rigorous than Code
Academy) has only placed 20 people and Coursera only a 'handful'.

[http://online.wsj.com/article/SB1000142412788732433920457817...](http://online.wsj.com/article/SB10001424127887324339204578173421673664106.html)

 _About 350 companies have signed up to access Udacity's job portal in recent
months, though it has placed just about 20 students so far._

 _But the company (Coursera) matched only a handful of students in its months-
long pilot_

------
arasmussen
This title seems very misleading. There's a difference between "U.S. Software
developer wages [falling] 2%" and the average falling by 2%. Since the demand
for software engineers is so high, perhaps companies have been lowering the
bar for how qualified an engineer needs to be in order to be hired. Less
qualified generally means paid less. That's just a theory but it's what I
think is happening.

I've heard from industry sources that the same engineer is worth as much as 7%
more than he was last year (not including the experience he gained since last
year). I'd believe it. I think the qualification bar is just lowered because
the demand is so high and supply so low.

------
mkumm
I wonder how much of the $2k decrease could be attributed to geography? I
think we are seeing more opportunities for technical jobs throughout the US,
lowering the percentage of developers that need to be paid a Silicon Valley
wage.

~~~
bicx
I really wish they would factor geographic location into their stats. I live in
TN where around half that $99k average wage is the norm for CS grads, and
that's considered a good salary. Cost of living is much lower here than in
Silicon Valley, so it partially evens out. A country-wide average is really no
help to anyone with such disparity.

------
guest
U.S. Software productivity falls 200% as legions of the technically illiterate
conspire to make it look like they are doing work they can't actually do.

------
charlesjshort
So does this mean the U.S. needs to issue more work visas?

~~~
skylan_q
Yes, because there is a shortage of X workers until wages hit the federally
mandated minimum.

~~~
nandemo
Two related thoughts that always occur to me in these discussions:

* Many (I wouldn't say most, but at least a sizeable minority of) American programmers and IT workers display reservations about the policy of issuing visas such as H1Bs to a large number of foreign workers, because it's perceived as unnecessary, or as catering purely to the interests of megacorps. I wonder: do those software developers have a problem with buying products manufactured in China or other countries with low wages? Buying stuff made in China, including from American companies that outsource to China, effectively lowers American blue-collar wages below the mandated minimum.

* It seems a lot of people on HN are favourable to policies allowing working from home. Most companies still don't have such policies, but I suspect that if telecommuting ever becomes commonplace (e.g. due to software development process or technological changes), it will depress salaries far more than the measly 60~80k H1Bs per year have ever done. For every H1B holder there are certainly many others who are skillful enough programmers and speak English, but cannot work in the US due to the limited number of visas, or don't have a degree, or don't want to take the risk of working under the constraints of H1B, or simply don't want to move to the US for any old reason.

~~~
nknighthb
I think you're overly focused on global economics.

H1B abuse presents _local_ economic problems. Wages for programmers working in
San Francisco are higher than wages for the same programmers in, say,
Lexington; both are higher than in China, and all are related to local cost of
living.

If a company abuses H1Bs to import cheap programmers from China to San
Francisco and keep paying them Chinese rates, San Francisco programmer wages
are depressed and SF programmers are disadvantaged relative to the entire
local San Francisco economy.

Note I keep saying "abuse". The problem people fear with H1Bs is that their
nature opens them to abuse. Not abuse by the immigrant programmers, but abuse
by the corporations employing them, which can use the conditions of the H1B
program to essentially hold immigrant workers hostage in below-market-rate
jobs.

I think you'll find that if you talk to programmers reasonably informed about
the nature of H1B visas, most will have no general objection at all to
programmers immigrating to the US from China, India, or anywhere else. Only to
the particular circumstances of the H1B program, which break local market
forces.

~~~
nandemo
Outsourcing manufacturing also depresses local salaries. It's just that it's
not very visible anymore. I think it's fair to assume that, out of all
unemployed Americans, there are some who would happily take a factory job
paying $X; but $X is too high compared to the equivalent Chinese worker
salary.

I totally understand that some programmers feel more competition is not in
their best interest. If they would just say they are protecting their turf,
then I'd understand. Doctors and lawyers limit the number of licenses, certain
trades restrict their jobs to unionized workers, etc.

I'm just wondering if their objections have other grounds, e.g. moral or
public policy principles, and in that case whether those principles would
apply to e.g. blue collar workers (or phone technical support, or farm workers
etc), and whether they would be willing to pay more for US-made products in
order to support American-based manufacturing.

> If a company abuses H1Bs to import cheap programmers from China to San
> Francisco and keep paying them Chinese rates,

I think you're resorting to unnecessary hyperbole here. There's public data on
H1B salaries and they're far above median Chinese salaries.

In any case, my question above concedes the assumption that programmer
salaries are depressed to some extent.

~~~
nknighthb
> _Outsourcing manufacturing also depresses local salaries._

No, it moves jobs, generally entire categories of jobs, to lower-cost areas.
If someone wishes to continue working in that type of job, they must do it in
a place with lower cost of living.

> _I totally understand that some programmers feel more competition is not in
> their best interest._

That has nothing to do with anything I said, and in fact I pretty clearly
articulated that such a view has nothing to do with the fear some people have
of the H1B program.

> _I think you're resorting to unnecessary hyperbole here._

I think you're reading things into my comment that are not there, because you
wrongly assume _I_ oppose H1B visas.

------
brown9-2
What is the margin of error for a statistic like this? Seems like 2% could be
within measurement error, unless the DOL data is counting every single
software job in the US.

