
I'm not sure I'd agree with all of these. For example, the ones about automated testing or building internal tools sound a lot like "Do you like to work the same way I do?" and if the answer is no then the dubious assumption is that you know more about what this potential employer does than all of their existing staff. The list seems reasonable for the most part, though.

Aside from obvious questions about the kinds of projects you'd be working on and the skills and technologies involved, I used to rely on a few simple tests:

1. What is the employee turnover like, and if people have left recently, why? [This is a reasonably objective proxy for what the rank and file staff really think, and they collectively know this potential employer much better than me.]

2. Did they ask brain teaser type questions or do a pop-psych personality test in the interview? [Do their hiring practices suck, in which case even if they hired me, would I want to work with anyone else they hired? Also tends to indicate an unhealthy assumption of superiority and desire to conduct one-sided interviews, which is a red flag in its own right.]

3. Will they show me a sample of their production code and of the related documentation, and if so, what is it like? [If they won't and don't have a very good reason, my assumption is that they aren't confident their code will impress, which is not a good sign. If they will, what I see is about as good a marker for whether they're any good technically as any benchmark you can fit into an interview.]

4. What does the car park look like on the way in? [Far from scientific, but a handful of executive cars in reserved spaces and then nothing but old models in the few remaining spaces is rarely a good sign.]

5. Does the contract impose on my life outside of work? For example, would I need permission from the employer to take on an unrelated side job that had no bearing on my ability to work properly for them? Would they be claiming IP rights to any work I did unrelated to that employment, so that for example I couldn't release a hobby project as Open Source if I wanted to? [Any attempt to control life completely outside work is a fairly reliable red flag for a generally over-controlling attitude and/or a show run by HR/legal rather than sensible management or the people I'd actually be working with.]

I totally agree with this. I personally do not use unit tests at all.

Why? Because I do assert-driven testing. My test code is integrated into the actual business logic. So no need to contrive scenarios.

So you do do tests, just not specifically separate unit tests.

That's totally different from an organisation that doesn't do tests at all (yes they exist).

yes, but if someone asks me in an interview if we do unit testing, the answer is really a no...

so maybe the takeaway is: if people don't do what you expect, maybe dig a little deeper before running away screaming ;)

Without scenarios, how do you run your code before it goes into production?

i use the combination of assert-driven development + UI/UX testing. (manual, but that could be a valid candidate for automation in my projects)

so I'm not saying there are no scenarios, but they all end up being the customer facing scenarios, and verifying those meet expectations. Also just developers making sure the functionality "works" naturally catches the technical bugs (because asserts in production code would cause the app to crash)

You might be interested in `contracts' which are basically assertions on steroids. See https://en.wikipedia.org/wiki/Design_by_contract (or http://blog.racket-lang.org/2012/11/contracts-for-object-ori...) for an introduction.
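To make the connection concrete, here's a minimal sketch of the idea in plain JavaScript. This is not the Racket or Eiffel contract API; `withContract`, its error messages, and the `safeSqrt` example are all hypothetical, purely to show preconditions and postconditions wrapped around a function.

```javascript
// Hypothetical contract wrapper: check a precondition on the arguments
// and a postcondition on the result, throwing on any violation.
function withContract(fn, pre, post) {
    return function (...args) {
        if (!pre(...args)) {
            throw new Error("precondition violated for " + fn.name);
        }
        const result = fn(...args);
        if (!post(result, ...args)) {
            throw new Error("postcondition violated for " + fn.name);
        }
        return result;
    };
}

// Example: a square root that requires a non-negative number
// and promises a non-negative result.
const safeSqrt = withContract(
    Math.sqrt,
    (x) => typeof x === "number" && x >= 0, // precondition
    (result) => result >= 0                 // postcondition
);
```

Unlike scattered inline asserts, the contract lives at the function boundary, so blame for a failure falls clearly on either the caller (precondition) or the implementation (postcondition).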

You might also want to explore QuickCheck, which helps to generate these scenarios.
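The core idea behind QuickCheck is easy to sketch by hand, without its actual API: generate many random inputs and check that a stated property holds for all of them. Everything named below (`checkProperty`, `randomIntArray`) is illustrative, not part of any library.

```javascript
// Hand-rolled property-based check in the spirit of QuickCheck:
// run a property against many randomly generated inputs and
// report the first counterexample found, if any.
function checkProperty(property, generator, runs = 100) {
    for (let i = 0; i < runs; i++) {
        const input = generator();
        if (!property(input)) {
            return { ok: false, counterexample: input };
        }
    }
    return { ok: true };
}

// Generator for small random integer arrays.
const randomIntArray = () =>
    Array.from({ length: Math.floor(Math.random() * 10) },
               () => Math.floor(Math.random() * 100));

// Example property: reversing an array twice gives back the original.
const result = checkProperty(
    (arr) => JSON.stringify([...arr].reverse().reverse()) === JSON.stringify(arr),
    randomIntArray
);
```

Real QuickCheck-style tools add shrinking (minimizing a failing input), but even this sketch shows how the "scenarios" get generated rather than contrived by hand.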

Could you expand on what you mean by assert-driven testing? I'm not finding any good explanations from a Google search. Sounds interesting.

My GUESS was that he/she means that the code has asserts in it that check if something is not right at runtime. I personally would not call that a replacement/alternative for unit tests. In fact, it may be counterproductive ... a small error in the code would bring down an entire application.

i would counter this by saying that a small error in code SHOULD bring down the entire application.... during development. otherwise the error is not visible to developers at the point where they can most easily fix it.

of course, asserts should not crash the app once it's in production. instead they should be channeled to a log that gets sent back to the production support team for analysis.

but during development, I'd say crashing during small errors is a great thing.

Assert driven testing means that the tests are part of the actual code that is running. It's helpful because when an assert fails, the stack trace is right at the position where it failed. In many ways it's an easy way to incorporate testing into your application without the need for an additional framework.

A simple example can be done by writing a single function in the root scope or some global function (javascript)

    function assert(assertion, description) {
        if (!assertion) {
            console.log(description); // could also have other application error handling code
        }
    }

Then in a function in your application:

    function myFunc() {
        var x = 1;
        assert(x == 1, "x does not equal 1 in myFunc()");

        // Continue code...
    }

This is obviously a very simple example and can definitely be iterated upon, but it gets the basic point across and I'm on my phone so writing code isn't the easiest :-P

Some systems have something like this built in, such as Node, whose built-in assert module has more features.


Sounds like Design by Contract.

i think that's right, but maybe more "agile" in that there is no explicit contract being fulfilled, just the developer ensuring valid input and output to the functions they implement.

I find it interesting that you decide that assumptions about software process maturity are dubious but are happy to make assumptions about the lack of utility of "brain teaser" questions or personality tests.

I find it interesting that you decide that assumptions about software process maturity are dubious

You appear to be making your own questionable assumption, which is that the techniques mentioned before necessarily imply a more mature software development process. If a development team didn't use automated testing, I'd certainly be interested to know why, but I can immediately think of multiple plausible alternatives that might have proven to be more effective for that team depending on the circumstances. Unlike the use of brain teasers and popular psychology to assess applicants' value as employees in technical fields, there is a significant body of empirical data to back up the effectiveness of some of those alternative testing strategies.

I'm very interested in this. Can you name some of these strategies?

Two strategies I was thinking of were technical peer reviews and formal proofs. Of course, these aren't mutually exclusive either with each other or with automated testing, and these are all generic terms that each cover a multitude of specific implementations.

All three have a strong track record of finding bugs when implemented well. All three also add significant overheads, so there is a cost/benefit ratio to be determined. The relative costs of implementing each strategy will surely vary a lot depending on the nature of any given project. The benefit for any of them would likely be significant for a project that didn't have robust quality controls in place, but you'd get diminishing returns using more than one at once.

I could easily believe that skilled developers had evaluated their options and determined that for their project some other strategy or combination of strategies provided good results without routine use of automated testing and that the additional overhead of adding the automation as well wasn't justified.
