

Ask HN: What are the current (IT) industry skills employment cash cows? - clyfe

What do you think are the (IT) skills that employers currently pay loads of cash for and are widely sought?

Spring, LINQ, DB scaling, BPMN, others?
======
nostrademons
Anything where you play buzzword bingo is not going to be lucrative in the
long term. The technology's popularity attracts wannabes, which increases the
supply of available labor, which drives down wages.

I'd go with something that requires specialized knowledge: domain-based
knowledge (bioinformatics, geology, finance), systems programming (compilers,
OS, distributed computing), or one of the AI subfields (information retrieval,
machine learning, data mining, computer vision, etc.)

~~~
reinhardt
Another area of specialized, albeit less glamorous, knowledge: legacy app
programming (Cobol, RPG, Fortran, etc.)

------
Daishiman
SAP implementer, if you want a real, honest-to-God cash cow. It will drain
your soul. You will curse ABAP until your last breath. But your pockets shall
never be empty.

------
unoti
Some skills are more accessible to you than others. So a good answer is
different for different people. I recommend thinking of 5 or so things that
really interest you, and then checking what kinds of opportunities are
available for those. I did that exercise a while back, and was surprised to
find that there appeared to be a tremendous demand for people with data mining
experience using Hadoop. This kind of approach has the further benefit of
giving you an idea of what kinds of places might want to hire you when you
hone your new skills. It'd be a shame to become skilled in compiler design,
for example, and then not have much of an idea of who would want you.

------
noname123
Nah, bro. Spring/LINQ are all work that could easily be done by Java/.NET
shops in India. Any enterprise work is easily outsourced to India.

Stuff that pays high six figures does so because it can't be outsourced:
computer security (especially if you can get a security clearance); geology
simulation for oil exploration (high-level domain expertise); quantitative
modeling and trading system design for HFT (domain and quant skills required);
risk modeling in actuarial science.

All of these are if you want to work for BigCo. You could also make a lot of
money consulting on some obscure corporate package such as SAP modules.

------
earl
Hadoop

Every idiot out there wants to spray some "big data" retardation all over
whatever they're doing, so they throw together a cluster -- a handful of
machines, if one is being generous -- so they can use hadoop too!

Meh. It's not big data until you need a couple hundred cores.

However -- it's an opportunity to cash in for devs. Of course hadoop is a
needlessly difficult programming model for most people, but never mind that.
My .02 is: if you can _possibly_ solve your problem by scaling up instead of
out, then scale up. Hadoop works well precisely for perfectly parallelizable,
no-communication-required problems, which are few and far between. For
everything else, it's a giant pita to shoehorn the solution into hadoop. Of
course, most managers are retards, so spending maybe $25K on a 256GB RAM,
16-core Xeon machine stuffed full of 15k rpm SAS or SSD drives is out of the
question, but wasting months of dev time to sprinkle Real Big Data (TM)
bullshit on stuff is worthwhile. Sigh. 15 years into this industry and still,
everybody views salary expenses / opportunity costs of dev time and equipment
as completely separate things.
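To make the scale-up point concrete, here's a minimal sketch (toy data, purely illustrative): an embarrassingly parallel count done with Python's multiprocessing on one big box. The "map" workers share nothing, which is exactly the class of problem hadoop is built for -- and which a single fat machine often handles fine.

```python
# Sketch: an embarrassingly parallel job (word counting) handled by
# scaling *up* on one machine with multiprocessing, rather than
# scaling *out* to a Hadoop cluster. Toy data for illustration.
from collections import Counter
from multiprocessing import Pool

def count_words(chunk):
    # "Map" step: each worker counts its own chunk independently --
    # no communication between workers is required.
    return Counter(chunk.split())

if __name__ == "__main__":
    chunks = [
        "big data big hype",
        "big cluster small data",
        "scale up before you scale out",
    ]
    with Pool() as pool:
        partials = pool.map(count_words, chunks)
    # "Reduce" step: merge the partial counts on the driver.
    total = sum(partials, Counter())
    print(total["big"])  # -> 3
```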

I'm grumpy today.

 _However_ , hadoop has a steep learning curve for anything but toy problems,
and I get 2-3 requests a week on linkedin just because of my employer and
having hadoop on my resume. So it's great for devs. I know people charging
hundreds of dollars as their consulting rates.

Edit: just checked Silicon Mechanics to make sure my hardware cost wasn't too
far off. If you're doing mostly numerical stuff, parallelization is hard, so:
256GB RAM, 2 of the fastest, highest-cache QPI procs, and 6x Intel 160GB SSDs
in a 2U: less than $30K delivered.

~~~
rubinelli
Personally, I don't think Hadoop's model is all that complex (obviously, if
you can solve the same problem with a quick script, you'd have to be insane
not to), but it's nowhere near the tools Enterprise Java devs are used to in
terms of polish. Creating a new cluster, for example, is a pain compared to
the usual "unzip, copy datasource XMLs and jars, run start" for application
servers.

~~~
earl
Hadoop is extremely complex for anything requiring communication. My
favorite example is ranking: say you have a set of <set (string), id (string),
score (float in [0,1])> tuples and, for each unique set, you want to rank ids
by their scores. This is trivial on a single core. Now try doing it on hadoop:
the first thing you'll try is doing it all on the head node, but that won't
work on more than a trivial amount of data. Next you may try partitioning by
score, but scores aren't distributed uniformly. Partitioning by set won't
work since a single set can hold more data than a single node can possibly
handle.

It took me a week's worth of work to write a distributed ranker that will work
on billions of ids in hundreds of sets; the algorithm is not obvious. On a
single node you'd write a for loop that did a sort.
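For comparison, the single-node version really is just a sort plus a loop. A sketch with made-up data (one global sort keyed by set and descending score, then a group-by to assign ranks):

```python
# Single-node version of the ranking task: given (set, id, score)
# tuples, rank ids within each set by descending score.
# Toy data for illustration.
from itertools import groupby
from operator import itemgetter

tuples = [
    ("fruits", "apple", 0.9),
    ("fruits", "pear", 0.4),
    ("fruits", "plum", 0.7),
    ("cars", "bmw", 0.3),
    ("cars", "audi", 0.8),
]

# One global sort: by set name, then by score descending.
tuples.sort(key=lambda t: (t[0], -t[2]))

ranks = {}
for set_name, group in groupby(tuples, key=itemgetter(0)):
    for rank, (_, id_, _) in enumerate(group, start=1):
        ranks[(set_name, id_)] = rank

print(ranks[("fruits", "apple")])  # -> 1
```

The whole difficulty on hadoop comes from the fact that this global sort no longer fits on one machine.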

Those are the sort of obstacles that the hadoop programming model introduces.
Again, if you haven't used it for real work, you'll be surprised at how few
tasks are truly parallelizable without communication. The model is very
simple, but accomplishing tasks that don't quite fit in the model is
difficult.

Another example is matrix multiplication. There is no way to segment the data
to eliminate internode communication. It can be accomplished in an iterative
fashion, but it's decidedly nontrivial. The same goes for most optimization
problems -- I don't think there are even known algorithms for
communication-free parallelization of those.
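You can see why in a map/reduce-style sketch of the multiply (toy 2x2 matrices, purely illustrative): every partial product is keyed by its output cell, and in a real cluster that keying forces a shuffle, because A's entries and B's entries for the same cell start out on different nodes.

```python
# Sketch of matrix multiplication in a map/reduce style, illustrating
# why the data can't be partitioned to avoid communication: every
# A[i][j] must be joined with every B[j][k], so partial products have
# to be shuffled between nodes. Toy 2x2 matrices for illustration.
from collections import defaultdict

A = {(0, 0): 1, (0, 1): 2, (1, 0): 3, (1, 1): 4}
B = {(0, 0): 5, (0, 1): 6, (1, 0): 7, (1, 1): 8}
n = 2

# "Map" phase: emit partial products keyed by output cell (i, k).
# On a cluster, this key is what forces the cross-node shuffle.
emitted = defaultdict(list)
for (i, j), a in A.items():
    for k in range(n):
        emitted[(i, k)].append(a * B[(j, k)])

# "Reduce" phase: sum the partial products per output cell.
C = {cell: sum(parts) for cell, parts in emitted.items()}

print(C[(0, 0)])  # 1*5 + 2*7 -> 19
```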

