
Measuring software engineering competency - ferroman
http://www.savvyclutch.com/measuring-software-engineering-competency/
======
mbesto
I do technology due diligence on behalf of investors for a living, so I live
and breathe this type of stuff on a weekly basis.

All companies build software differently. Some have automatic deployment, some
don't. Some have strong testing procedures, some don't. Just because a company
doesn't use CI doesn't necessarily make them "worse". It may just be
indifference to the indoctrination of the "SV mindset".

The more important answers are not binary yes/no but rather "why aren't you
using CI?". Common answers are:

\- I'm not sure what CI is

\- We don't have enough unit tests to justify it

\- We're a small team and it doesn't really justify the effort to set up

\- We're working on setting it up and should be live in the next 3 months

You can tell a lot about engineering competency and leadership from those
answers.

~~~
deegles
Any advice on getting into this field? Seems like it would be pretty rewarding
(mentally) work.

~~~
mbesto
Honestly - I stumbled into it. I really don't have any good advice other than
know the right people.

------
ryanbrunner
This has 3 big problems that make it a poor substitute for the Joel Test:
there are way too many questions, some of the questions don't have a
universally accepted "good" answer, and the questions have too much ambiguity
and wiggle room.

One nice feature of the Joel Test (not that I saw it being used in reality -
but as a mental model anyway) was that you could easily categorize companies
into places you want to work and places you don't. A perfect score? You want
to work there. More than 2 things they don't do? You don't want to work there.
1 thing? Maybe look into it and see how important it is to you.

With this, your scores could be all over the map. What's more, the questions a
company misses on might be ones that aren't that important to you (having a
library), or even where a 'no' might be preferable to you (daily stand-up).

Even once you get past all that, many questions aren't easily answered
objectively. What's a short iteration? I've worked in places that touted 2
weeks as a remarkably short iteration, and others who bemoaned how long that
was.

~~~
jcadam
> _A perfect score? You want to work there. More than 2 things they don't do?
> You don't want to work there. 1 thing? Maybe look into it and see how
> important it is to you._

That's all well and good if you have the luxury of picking and choosing from
multiple offers. Here in the real world (i.e., not in SV), getting a decent
offer (if you're not entry level) that's at least equal to your current pay
generally takes 6 months to a year of hard interviewing. If I demanded that a
prospective employer score even 50% on the Joel Test, I'd be perpetually
unemployed. Which is probably why employers generally get away with providing
sucktastic working environments for software developers.

I find it especially disheartening that the only one of Joel's 12 'tests'
that's pretty much a universal 'yes' these days is _uses source control._

~~~
ryanbrunner
I'm not from SV, but I am from a tech hub, so I'm sure our experiences are
pretty different. Regardless, adding granularity and ambiguity to the score
doesn't help much in your case either (particularly since things like source
control were removed, probably because "it's a given" - which might not be the
case outside of startups).

~~~
jcadam
While I haven't seen a shop that completely eschewed source control in a long
time, I did once work at an especially dysfunctional company where only one
designated person (the QA) was allowed access to the svn repo.

So individual programmers were basically forced to work without any of the
benefits of an SCM. The designated repo master would place a zip of the latest
code in a folder, the programmers would copy it to their home areas, and when
we wished to 'commit' something, we'd copy our files to a staging area and put
in a request with the QA to do the actual commit.

 _Can't I just commit to a branch and let the QA merge the branches into the
trunk later?_ _No._

I need to edit a file, time to: cp file.c file.c.bak1

~~~
snaily
If all you want is structured save points, git runs locally! See also git-svn
and its ilk, which allow you to interface with a different SCM while you keep
using a familiar git interface.

Assuming you're entrusted to install software locally, that is.

~~~
jcadam
> _Assuming you're entrusted to install software locally, that is._

BWAHAHAHAAA! No.. :(

At my current job, I've been waiting on a software install request (you know,
just an IDE, a small thing) since early February. Coding in Notepad++ till
then (w00t). Been waiting on a RAM upgrade (stuck at 4GB) for... well, since I
started back in October.

Programming as a job sucks. Can I do something else and just keep coding as a
hobby (at home, where I have decent tools and don't have to ask permission for
every $%@! little thing)? I swear, if I could get paid just as well to bag
groceries/serve coffee/etc. and not have to deal with an
antagonistic/uberpolitical IT dept run by Vogons, I would do it in a
heartbeat.

~~~
aidenn0
Can you relocate? I have seen shops like this, but if you can relocate that
expands your options by several orders of magnitude.

One example: where I work we have our choice of windows/mac/linux (though if
you want to run a linux other than debian, ubuntu or redhat/centos, you're on
your own as far as IT is concerned); workstation upgrades are every 3 years.
I'm due for a new one in May, so I only have 8GB of ram right now. Every desk
has at least two monitors, and every office has a door[1].

This is not in SV, nor is Fog Creek (where Joel wrote the "Joel Test").

It costs nearly $200k to employ a software developer after taxes and benefits,
so not being willing to shell out a few thousand a year on tools that will
give a performance benefit (even if it's just making that person happier in
their job) is wasteful.

1: There have been points in the past where cubes were used as a stopgap while
we were finding more square-footage, including when I started. The person
getting me set-up apologized for putting me in a cube.

------
mcguire
A) Actually, we have plenty of ways of measuring software engineering
competence.[1][2][3]

B) Weirdly, all of them are heavily process focused and none of them have much
interest in the ability to write functioning code, because

C) Software engineering, as a field, is built on the belief that _anyone_ can
write software _if properly managed._ [4]

[1] [http://cmmiinstitute.com/](http://cmmiinstitute.com/)

[2]
[http://www.sei.cmu.edu/certification/](http://www.sei.cmu.edu/certification/)

[3]
[https://www.computer.org/web/education/certifications](https://www.computer.org/web/education/certifications)

[4] and that would make software development much cheaper.

~~~
mannykannot
Re C): Indeed - and the belief extends to the proposition that to properly
manage it, you do not need much understanding of what the detailed design and
coding aspects of development actually entail, so long as you know Software
Engineering.

~~~
mcguire
That is an extension of the old 'managers need to know how to manage, not how
to do things.'

------
martijn_himself
It seems to me that more and more ceremony is being added to software
development which distracts from the actual work. This only benefits 2 groups
of people: people that don't like the actual work but still want to fulfill a
role in the process, and the agile 'industry'.

~~~
s73ver
Some of that is needed, though. If you're going to just shut one or two people
in a room for 9 months, then you probably don't need it. But today usually
you're going to have bigger teams, and the business is going to need more
visibility into the project, while also giving developers a degree of autonomy
and ownership of it.

When the business is willing to let it work, Agile can be kinda nice. But if
the business doesn't buy in, if they keep interrupting, changing priorities
and tasks mid-sprint, then it's just going to make everyone miserable.

~~~
marcosdumay
> Some of that is needed, though.

The most important thing you need to ensure is that developers talk with end
users. That is one of the biggest points of Agile.

Now, if you look again at the article, you'll notice this one interaction is
completely missing. Not only that, but Scrum de-emphasizes it too by creating
middlemen, and most formal "Agile" methodologies don't even think about it.

------
quantumhobbit
I'd settle for a place that stuck to the original Joel Test.

After all, most of the CI/CD stuff in this list is covered by his original
one-step-builds rule. If you have that, then scripting the CI/CD stuff is
trivial. Similarly, his test covers testing.

Everything else in here is just disguised Scrum.

~~~
ferroman
True, but not many teams follow these in practice. We just want to make it
more specific and modern.

~~~
vog
_> True, but not many teams follow these_

Isn't this a very good indicator that the Joel test is still sufficient?

Only once almost all companies pass the Joel test with 12/12 points [1] will a
modernization be needed to distinguish further between multiple offerings.

[1] or 11/12 points, see
[https://news.ycombinator.com/item?id=14078069](https://news.ycombinator.com/item?id=14078069)

EDIT: Fix typo.

~~~
ferroman
It is relevant, but we try to make it more specific.

~~~
vog
Sorry, typo. I meant "sufficient", not "relevant".

------
NumberSix
Conspicuous by its absence is:

Does your software work?

These criteria are all rituals and processes, rather than the end result.

~~~
ryanbrunner
They're signals that indicate a well-functioning development team, and they
can be researched with little effort and answered objectively.

"Does your software work?" is almost impossible to answer objectively, and
doesn't help you determine if the software is going to work 2 years from now
(which a good deal of development best practices work to achieve). You might
as well replace the test with "Is this company awesome?"

~~~
NumberSix
In general, the way to objectively determine if software works is to give it
to those pesky end users, have them use it, and tell you if it works or not,
and how well it works. Lacking end users, have testers and QA folks who play
the role of end users evaluate it.

In some areas of software development, such as heavy-duty
algorithmic/mathematical programs (encryption, video compression, computer
graphics), there are pretty rigorous, objective ways to measure whether the
program works and how well, without human testers or QA people. It's usually
best to do both even in these cases, just in case there are some subtle issues
that the performance metrics don't capture.
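
To make the idea concrete, here is a minimal sketch of such an objective,
metric-driven check in Python, using the standard zlib module as a stand-in
codec (the function name is made up for illustration):

```python
import zlib

def compresses_correctly(payload: bytes) -> bool:
    """Objective check for a lossless codec: decompressing the
    compressed payload must reproduce the original bytes exactly."""
    compressed = zlib.compress(payload, 9)
    return zlib.decompress(compressed) == payload

# No human tester needed - the property either holds or it doesn't.
samples = [b"", b"hello world", bytes(range(256)) * 100]
assert all(compresses_correctly(s) for s in samples)
```

A lossy codec would swap the equality check for a numeric quality metric
(e.g. a PSNR threshold), but the principle is the same: the answer to "does it
work?" comes out of a measurement, not an opinion.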

On the other hand, predicting the future is noted for being rather hard.
Predicting correctly whether a program will need to be changed in two years,
in what way, and whether the program can in fact be changed easily two years
from now is speculation, a matter of opinion, rarely objective at all.

~~~
ryanbrunner
You're right that there are classes of application that can be said to "work"
objectively.

For the majority of business-facing SaaS applications (to name an example),
"working" is an elusive target. If we gave it to those pesky end users, and
there's more than 5 of them, I guarantee you'd hear multiple answers about how
well it works, and even whether it works at all.

I'm not saying that looking at how well your product works for people isn't a
noble endeavour. But for the purpose of what this is supposed to be - an
easily obtainable, objective measure of what it's like to work for a
company - it's horrible.

------
jt2190
A little additional context: Joel's "test" was just a quick-and-dirty way for
a job candidate to assess whether or not a company had competent software
engineering practices. Assessing your own team's practices could (and should)
go far more in depth, since all the messy details are right there. The spirit
of this article is good, but I wonder if we can formulate a good test that
applies universally to all teams and carries enough detail to help that team
improve.

------
Androider
A lot of companies get the basic source control, builds, bug tracking and
writing code during interviews parts right, but tend to skimp on these aspects
of the Joel test:

\- Do you fix bugs before writing new code?

\- Do programmers have quiet working conditions?

\- Do you use the best tools money can buy?

~~~
vog
While I agree with most questions, I always found number 9 to be a bit
strange:

 _> 9. Do you use the best tools money can buy?_

I hope I'm not doing Joel too much of an injustice, as I always loved reading
his writings back when his "Joel on Software" blog was active.

However, this item on the list always sounded to me like an attempt to promote
their FogBugz tool, not objective advice.

Recognizing that there are many excellent Free Software tools, especially in
the software development area, I'd rephrase it as:

 _> 9a. Do you use the best tools available?_

or maybe:

 _> 9b. Do you invest in your tools?_

which means, depending on the exact tool, one or more of:

\- buying a proprietary tool

\- using a Free Software tool, and donating money to the project

\- using a Free Software tool, and providing bug fixes and/or new features

\- having one or more team members dedicated to improve the tooling and
infrastructure

~~~
Androider
Frequently the objectively best tool is Free or open source software (which
doesn't mean it's priced at $0, although often it will be). But many times
it's not, and that's when companies can become extremely penny-wise and pound-
foolish.

Hiring someone to exclusively babysit a Jenkins instance is incredibly
expensive. Paying for Travis CI/Codeship/Gitlab CI is really cheap in
comparison. Having developers fill out purchasing orders and waiting for
software or hardware is very expensive.

I like to call it the "IntelliJ test": can I requisition IntelliJ ($499) and
have it the same day (week? month?), or will the company flinch and hem and
haw at the absolutely inconsequential price of the software compared to the
expensive developer time they're paying for?

~~~
oblio
It's not always cut and dried. I was the "Jenkins babysitter" for a lot of
years.

At scale I don't think most off-the-shelf CI/CD tools hold up. You will need
dedicated people to take care of them.

Of course, if all you have is 100x plain software projects which don't depend
on one another and there's no sort of other interaction between them, by all
means, go for SaaS CI/CD.

If there's any kind of orchestration needed... it's better to hire a
professional to do it than to force 40+ developers to do it piecemeal between
their other tasks, which will often have a higher priority due to management
demands.

To rephrase your statement, I think you should get the best tools that are
realistically affordable for your process. On top of that you should also get
the best supporting cast for your process since often tools on their own don't
cut it.

------
jondubois
This test completely misses what it means to be a senior (effective) engineer.
The real difference between a mid-level engineer and a senior engineer is that
the mid-level engineer will mechanically apply the same strategies to all
projects without thinking. All the points mentioned in this article (CI, one-
step deployment, daily status check-in meetings, etc.) are in fact not
necessary for ALL projects.

The quote "Those who only have a hammer tend to see every problem as a nail"
is a good summary of the junior/mid-level mindset.

I don't know the author, but based on the rigidity of the article, I would
guess that they've only worked for big companies. I would argue that most of
these rules are only effective in the context of a very large company; in
literally every other context, many of these rules are inefficient.

Big companies are all about risk mitigation; they are willing to sacrifice
speed and agility in exchange for stability, certainty and visibility. But
this is actually a luxury that only big companies can afford, and it should
not be taken as a rule of thumb.

------
mlashcorp
I may be in the minority, but I find the 50% unit test coverage requirement
not useful, and sometimes harmful. Caveat emptor - it depends a lot on the
project, how often you are changing the code, and the complexity of said code.

~~~
ferroman
What do you mean? Is it too low or too high?

~~~
quantumhobbit
It is less a question of whether the percentage is correct than whether the
tests are useful. I've seen plenty of useless tests (testing getters and
setters in Java) that assert nothing related to the code's functionality but
exist solely to boost coverage. That is why asserting a strict coverage
percentage is dangerous.

Better to just do real TDD in the first place.
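
To illustrate the point (a hypothetical Python sketch rather than Java; the
class and test names are made up), both tests below raise line coverage by a
similar amount, but only the second one asserts anything about behaviour:

```python
import unittest

class Account:
    """Tiny illustrative class - the domain rule is the positive-deposit check."""
    def __init__(self):
        self._balance = 0

    def get_balance(self):
        return self._balance

    def deposit(self, amount):
        if amount <= 0:
            raise ValueError("deposit must be positive")
        self._balance += amount

class CoverageOnlyTest(unittest.TestCase):
    def test_getter(self):
        # Marks the getter's lines as covered but asserts nothing
        # about behaviour - pure coverage gaming.
        Account().get_balance()

class BehaviorTest(unittest.TestCase):
    def test_rejects_non_positive_deposits(self):
        # Pins down a rule the code must actually honour;
        # a regression here makes the suite fail.
        with self.assertRaises(ValueError):
            Account().deposit(-5)
```

A coverage report counts both tests the same way, which is exactly why a bare
percentage target is a weak proxy for test quality.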

~~~
jghn
This. Slavish fetishization of a specific code coverage target is indicative
of an underlying problem, IMO, and that problem is far greater than having
relatively low code coverage.

It is far better, IMO, to go in with an understanding of where your potential
hot spots are than to simply add a test to everything. Sure, in an ideal world
we'd have 100% coverage of everything, but this field is about tradeoffs, and
sometimes writing tests simply isn't worth the time it takes in the long run.

------
throwaway729
A Joel List is a quick-and-dirty list you can use to assess the competence of
an organization. I don't think this list achieves that goal.

#1 and #2 (CI/CD) are fair additions to Joel's list, but I'd argue they are
already encapsulated by "do you make daily builds?". In most shops, if you
make daily builds, then you do CI/CD.

#3 = Joel's #4

#4,#5,#8,#10,#11,#12,#14,#15 are all "Do you SCRUM/TDD?". If that's the kind
of place you're looking for, great. But there are many competent code-oriented
organizations that do not SCRUM. So these don't really belong on a Joel List.
(Also, "We don’t know the better way to make sure that code does what it’s
supposed to, then to have another code [author means unit tests] that runs it
and check results" just isn't true. We know better ways, and sometimes they're
even relevant to a list like this. "Do you use any form of static or dynamic
analysis (e.g., types, valgrind, quick-check style tools, linters, etc.)" is
on my personal "Joel Test".)

That leaves "do you have a library?". IMO workplace libraries are close to
useless as signals (everyone has one), and rarely useful in practice (unless
you're curious how PHP code was written in 2003 or really want to brush up on
complexity theory).

As an aside, it's kind of depressing to me that we still make these lists.
Back in the 90's, software engineering was still a relatively young craft with
relatively few experts. Joel was part of a surprisingly small group of people
who: 1) had a career's worth of experience developing software for
microcomputers in high-level languages; and 2) had deep and successful
experiences across several organizational roles in different types of
organizations (coder, manager at MSFT, CEO at Fog Creek). The existence of
managers who were in charge of software engineers but had no engineering
experience wasn't surprising at all, given the youth of the field. Hence the
Joel Test.

The world is a very different place today. There are a lot of people with this
level of experience. Joel Tests aren't ubiquitous in other engineering
domains, and hopefully they'll eventually die out in software as well. Not
because the items on them aren't important, but because experienced Engineers
manage Engineers.

~~~
ferroman
You can generalize it, but the point was to make the list more specific. And
yes, the world is a very different place today. The number of software
engineers doubles every year, so at least half of them are new. We are
engineers, and we should try to measure our competency and systematize the
things that we use. Of course, this list is not absolutely universal, but we
should at least try to think about the standards that we want to meet.

------
BeetleB
>Do you contribute to Open Source?

Of all the Merits, this is the one I disagree with. Its merit is primarily
"joining like-minded folks" (i.e. a cult) rather than any inherent merit in
and of itself.

How many great developers do you know who do not have this "merit"?

~~~
azrazalea
I know way more great developers who contribute to open source than great
developers who don't.

~~~
Dayshine
Of course you do, they publicize themselves by contributing to open source...

------
djb_hackernews
The problem with this test and the Joel test is that an employer can check all
of the boxes (essentially: do you follow modern software practices that were
revolutionary 20 years ago, and are you "Agile"?) and it can still be a toxic
or less-than-optimal environment.

I did like the questions around OSS and sharing expertise. I'd like to see
more questions that address recruitment anti-patterns (diversity, ageism,
disclosing previous salary, etc.) and tech organization anti-patterns (an
actual career path on par with management, non-transparent equity grants,
etc.).

What would the questions be such that even, say, Google didn't look so good
answering them?

------
JCDenton2052
Such tests are not very informative when they are disconnected from the type
of companies/industries one is looking at. This is a criticism I have of the
Joel test, too: at the time he wrote it, it seemed to me a good way to
evaluate prospective software houses. After all, he worked in one. Yet I dare
say most of us will probably spend at least part of our careers working in
in-house IT departments. Different rules.

It is not impossible to find companies that fulfill all the requirements,
although they will likely know how attractive their working environments are
and will filter candidates accordingly. An interview I had with such a small
software house two months ago confirmed this. I gave them the Joel test, which
they had never heard of before, and they scored a perfect score. Dedicated
testers, usability testing, quiet working environments (like a library, the
team lead said; no need for headphones). Predictably, they were extremely
picky as to who they let in.

The ones much less likely to get good test scores? 1) Government IT, by and
large. 2) IT at any non-tech company below a certain size. 3) Non-tech
corporations (and even some tech ones). One notable one I was aware of used
Excel for bug tracking, was full of red tape, and their main technical test
was a 20-question multiple choice.

~~~
s73ver
I'm not quite so sure what your point is. The examples you gave of places that
don't do so well on the Joel test still sound like places most of us would not
want to work.

------
TeeWEE
Do you have automated end-to-end tests in DSL?

Seriously? What is considered a DSL?

~~~
ryanbrunner
And what's more, why does it matter? I assume they mean something like
Cucumber, which, while I'm OK with it, doesn't really add any benefit over
well-organized tests written in the language your code is in.

~~~
ferroman
Because it allows you to keep the domain logic somewhere. These days people
don't write documentation, or the documentation is always outdated. Having
high-level tests in a DSL does more than just test - it gives you information
about how your application behaves from the user's perspective, in a more
general sense. So you will have fewer issues when new people join the project,
or when the project owner changes. And it forces you to focus on the goal
during feature implementation. In my experience, new features often have
behavior that is not obvious, and sometimes conflicts with the logic of other
features. These tests let you see those conflicts before implementation.
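
For illustration only, here is a minimal sketch of the idea in plain Python,
with hypothetical step names standing in for a Gherkin-style DSL - each step
reads like the user-facing behaviour it describes:

```python
# Hypothetical plain-Python stand-in for a Given/When/Then DSL.
# A real setup would use a tool like Cucumber or behave; the point
# here is only that the scenario reads as domain behaviour.

class CartScenario:
    def __init__(self):
        self.prices = {}
        self.cart = []

    def given_a_product(self, name, price):
        self.prices[name] = price
        return self

    def when_the_user_adds(self, name):
        self.cart.append(name)
        return self

    def then_the_total_is(self, expected):
        total = sum(self.prices[item] for item in self.cart)
        assert total == expected, f"expected {expected}, got {total}"
        return self

# The scenario doubles as living documentation of the domain rule:
(CartScenario()
    .given_a_product("book", 10)
    .given_a_product("pen", 2)
    .when_the_user_adds("book")
    .when_the_user_adds("pen")
    .then_the_total_is(12))
```

Whether this reads better than an ordinary well-named test is exactly the
disputed point in this thread.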

~~~
ryanbrunner
I disagree, but I get where you're coming from - which is exactly the problem
with this list. The Joel Test was great because everything on it was
universally accepted as something every developer would want where they
worked. With this, different developers are going to have different opinions
of many things on the list, making the "score" lose its usefulness, since now
every time a company has less than a perfect score, I need to figure out why
(is it because they don't have a library? I don't care so much about that. Is
it because they don't have CI? I care a lot about that.)

------
kazinator
_Do you follow the shibboleths of my religious sect?_

------
Silhouette
I think I get -9 for one current project, though I'm not sure because I don't
understand several of the weird ones.

Then again, the client likes that project, because their customers also like
it. It solves a problem for them that no-one had solved in a similar way
before. I don't think we've had a single major bug reported against that part
of the system by any customer in over five years, and typically that includes
a multi-month lab evaluation by each customer before deployment.

So, does the development process on that project suck or not? :-)

------
dwheeler
If you're looking for a set of basic criteria for well-run open source
software projects, please check out the CII Best Practices Badge:
[https://bestpractices.coreinfrastructure.org/](https://bestpractices.coreinfrastructure.org/)

Full disclosure: I lead the project.

Constructive comments very welcome!

------
obstinate
Never even heard of half the terms in "obligations". Somebody should tell my
employer that I'm not competent.

~~~
samtolife
Eventually someone will :)

~~~
mcguire
Probably the new intern with the 'information technology' major.

------
kensai
If you really want to improve the Joel Test, fine. The suggestions are even
right. But please, KISS. The original test had 12 items; this one has 20. That
should be the upper limit, more or less, for any improved version. Otherwise
it is probably too long or detailed to be of practical use.

~~~
solatic
It's also way too specific and biased. Rules like "do you have end-to-end
integration tests?" aren't always obligations for all teams, whether because
your team is doing embedded work or because you're doing something better
(like consumer-driven contracts). Other rules, like "do you have a primary
communication channel?", are at times even counter-productive (in particular
when dealing with a variety of customers who have their own preferred methods
of communication, which you must accommodate). Daily status meetings are
sometimes unnecessary if you have a small enough team sitting in its own room,
or if you involve team members more tightly in planning and review practices;
indeed, daily status meetings often clash with more important practices like
flexible scheduling and not interrupting flow.

------
suhith
It would be great to have some compiled data on various companies/teams and
the obligations + merits that they do or do not meet.

