

All of IT - eldelshell
http://www.eldelshell.com/all-of-it.html

======
PMdev
Hi Eldelshell, your blog shows that we're lucky to have you on our new team,
since you approach things from an abstract perspective as much as from a
practical one. Looking forward to meeting you, "A PM-ing Developer". ;-)

------
switch33
TL;DR: Ask open questions, give people code to discuss, talk with them about
how they would find answers or improve it. And spend a little of your own time
learning some IT if you expect to hire anyone decent.

The problem is that there are many limiting factors in interview questions and
in what you are actually testing for. Determining a person's aptitude for
technology is a very subjective process. Do you test their resourcefulness by
asking what sites they visit and learn from? Do you test their actual
programming ability with a live coding exercise that may be too specific a
case? Do you ask them about terminology, which just shows they may know some
buzzwords?

Google for a while gave up on asking algorithm questions, supposedly for a
certain subset of the jobs they had.

As IT gets more specialized and the gap between what different people know
widens, some form of domain knowledge becomes important. That can be quizzed
through terminology, though at what level is up for debate. A more open
question format would probably be better.

I think presenting the interviewee with code, having them look it over, and
then asking for their input is probably a better fit. It's like math: everyone
wants to just use a calculator, and while school makes you work in your head
at first, eventually everyone in a work environment uses a calculator anyway.

Computer questions should be the same: candidates should have access to a
computer. Then quiz them on how to reason about some code x and how it could
be changed to be more like y. These need to be problem-solving questions that
exercise critical thinking but are also reasonable in scope. Some may touch on
terminology, but that shouldn't be the main focus.
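As a sketch of what that "reason about x, move it toward y" format might look
like (the function and its names are invented for illustration, not taken from
the comment), an interviewer could hand the candidate a working-but-clumsy
function and ask them to explain it and propose a more idiomatic version:

```python
# Hypothetical interview prompt: version x works, but the candidate is
# asked to explain it and point out the awkward parts (manual indexing,
# O(n^2) membership checks against a list).
def unique_emails_x(emails):
    result = []
    for i in range(len(emails)):
        e = emails[i].strip().lower()
        if e not in result:
            result.append(e)
    return result

# One possible version y, discussed with the candidate: normalize each
# entry once, and use a dict to deduplicate in O(n) while keeping order.
def unique_emails_y(emails):
    return list(dict.fromkeys(e.strip().lower() for e in emails))
```

The point isn't the specific refactor; it's hearing the candidate reason aloud
about readability, ordering guarantees, and complexity, with a real computer
in front of them.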

It also doesn't help that in many cases employers don't know what to ask for
in their tech jobs. If someone wants to hire a web developer, they may just
copy, paste, and rewrite the requirements from a random posting they find
online. I imagine this happens 90% of the time. The reason job posts are such
crap is that when someone posts a new opening, they probably just update a
template they already had. It's technological ignorance at best!

What makes it worse is that specialization across different sectors of
computing (DevOps, machine learning, security programming, etc.) only confuses
companies more. And as computer literacy in the population improves while code
gets more and more automated, it becomes an increasingly difficult question
what aspiring programmers joining the workforce actually need to learn.

A small, good DevOps team can easily be more productive than a developer team
three times its size, if those developers know nothing about DevOps.

A few good machine learning engineers can solve real problems with data mining
that may save a big company millions, problems a group of programming interns
wouldn't know how to approach.

A few good pentesters can find holes you completely missed and need to patch,
preventing your company from losing tons of credit card numbers through an
exposed point of sale, or from losing important company data.

One funny way to gauge experience in IT security is simply to ask: are we
getting more secure or less secure? Answering "less secure" suggests they have
actually worked in the IT security field. Researchers at universities, on the
other hand, may not realize how prevalent the problems in the security
industry are, yet they may still be good fits for specific research.

It is hard to judge which specialties are necessary, and it is also hard to
assess skill level. There are many interns now who possess skills that may be
far more useful than people realize.

Many companies would rather hire a student fresh out of college than a PhD
student, because the former may have genuine interest in some of these things,
along with sample projects or a presence in communities, even without
specialized skills.

One of the biggest faults, though, is that we have tons of jobs that use no
technology at all and involve zero programming, while others are entirely
programming. The future should include jobs where people do both: project
managers and DevOps people should be developers as well, not just managers but
coders. This only applies to certain technology-focused companies, but it's
becoming harder to argue for a company not being technology-focused.

For the reasons above, I've actually been teaching myself lots of different
technologies well before trying to get a job in IT. IT still changes too
rapidly to learn "everything", but people can benefit from learning how big
companies like Yahoo, Reddit, Facebook, Netflix, LinkedIn, etc. deploy their
infrastructure to handle large amounts of data. They can also learn security
and machine learning, along with other specialized skills, which can get them
a job or, even better, help them start a startup. With the ease of setting up
Software as a Service, it's becoming more approachable to start a startup than
to get locked in the horrid cage of a mismanaged company.

