

Ask HN: Best way to get people to participate in a survey - sdesol

I need to quantify how time is wasted in the software development process, so I need to do a survey. I work in the Ottawa region, so there are a lot of high-tech companies for me to query, and I was wondering what the best way to approach them is.<p>I was thinking about offering each company something like a $100 gift certificate that could be given out to winning employees who participated in the survey.  I figure if I buy the certificates in bulk, I should be able to work out a discount.  Or would it be better to just pay a company to do the survey?<p>On a side (and begging) note, if you can explain how you are wasting time, I would greatly appreciate that as well.  I'm focusing on how you waste time answering questions related to your software changes.  These can be questions from verification teams, management, senior management, executives, etc.  I'm also looking for examples where a lot of time was wasted fixing something because poor communication left people not understanding what was changing.<p>Thanks
======
DanielStraight
You are already starting with two preconceived notions. First that you need to
quantify how time is wasted in software development. Second that you need to
do a survey. I don't know what you're trying to achieve, but those may not be
necessities (well unless of course what you're trying to achieve is exactly
and only raw data on how time is wasted in software development as would be
obtained by a survey).

The use of a survey in particular seems questionable, especially when you
consider the simple fact that all survey answers are lies to some extent,
whether the responder intends them to be or not. Is there perhaps an
opportunity for direct observation? Can you check how time is spent by looking
at the commit logs of open source projects? Can you track stock values over
time to see which period of time was least utilized? The list goes on.
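The commit-log idea is easy to prototype. Here's a minimal sketch (the function name, the 24-hour threshold, and the sample timestamps are all hypothetical) that takes commit timestamps, as produced by `git log --format=%cI`, and reports idle gaps longer than a threshold:

```python
from datetime import datetime

def commit_gaps(timestamps, threshold_hours=24):
    """Given ISO-8601 commit timestamps (e.g. from `git log --format=%cI`),
    return the idle gaps, in hours, that exceed threshold_hours."""
    times = sorted(datetime.fromisoformat(t) for t in timestamps)
    gaps = []
    for earlier, later in zip(times, times[1:]):
        hours = (later - earlier).total_seconds() / 3600
        if hours > threshold_hours:
            gaps.append(round(hours, 1))
    return gaps

# Hypothetical log: three commits in one day, then a four-day stall.
log = [
    "2011-05-02T09:00:00",
    "2011-05-02T13:30:00",
    "2011-05-02T17:00:00",
    "2011-05-06T17:00:00",
]
print(commit_gaps(log))  # [96.0]
```

Long gaps don't prove wasted time, of course, but they at least point at periods worth asking about in a follow-up interview.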

I'm not saying a survey is necessarily a bad idea, but they have lots of flaws
(such as lying and not providing the right answers (or questions)).

~~~
sdesol
The product that I'm working on is designed to improve how we communicate and
access information. It's based on a system that I designed and implemented for
a very large software company where errors and missteps tend to have a
cascading effect.

It was fairly easy for people to not know what was going on around them due to
the complexity of the software. The problem was also exacerbated by people
having different skill sets and social skills. You might be surprised, or
maybe not, but people will waste time trying to find an alternate way to get
an answer if they know the person who could answer it isn't sociable.

What I'm hoping to better understand is how we spend our time answering
questions like "what did you do?", which requires e-mail and/or meetings
and/or reports, and whether these types of questions could have been
self-answered if the person asking had a way to figure it out themselves.

The main selling point of my product is that I make it very easy for
non-developers like managers, senior managers, verification, etc. to
understand what is changing. It also makes it easy for developers who have to
deal with a very large code base.

In order for me to show customer value, I'll need to quantify how buying my
tool will help their bottom line.

I do understand that lying would be an issue, but I also think that if I can
come up with a good set of questions, those who are frustrated will be more
than glad to provide honest feedback.

The challenge that I'm faced with is defining all the questions, which is
something I can work on. But in order for me to show customer value, I'll need
to quantify time wasted with real-world metrics. Time is money. And in this
market, if you can't save somebody money, they probably won't listen.

~~~
DanielStraight
If I understand correctly, your product helps people keep track of what's
going on in the development of a project.

That may be even more susceptible to lying in surveys. First of all, people
don't want to admit that they don't know things, especially things directly
related to their job. I would have a hard time admitting I had no idea what
was going on with a project I was working on [of course, I'm the only
developer where I work, so that's not usually an issue ;)]. Also, those
frustrated by not knowing what was going on are likely to exaggerate the
impact of that lack of knowledge ("I couldn't do anything at all for 3 weeks!"
or something like that).

This is a case where I think the best selling points may be testimonials from
previous customers. Another thing you might consider is offering an audit
service to evaluate how much time you think your product could save.

I'm not trying to discourage by any means, but surveys are hard to do right,
so I'm just trying to suggest other options that you can use to supplement and
strengthen the results of whatever survey you end up doing.

For me, my likelihood of doing a survey is [assuming I'm not unintentionally
lying too much ;)] based more on the quality of the survey and how much I care
about the product/process I am being surveyed about. I would be more than
happy to take an interesting survey about Web.py (a Python web development
framework).

The problem is most surveys are neither interesting nor about interesting
things. If your survey involves rating statements Disagree Strongly, Disagree,
Neither Agree Nor Disagree, Agree, Agree Strongly, you've already failed...
epically. If your survey involves rating a long list of things on a scale of
1-10, you've failed epically. In fact, if your survey involves answering a
long series of similar questions of ANY kind, you've failed epically. When I
see a survey like that, I go into 5 mode (or 3 mode if it's 1-5). The answers
are meaningless after about the 3rd question, and I'm not convinced that
people can rate things at the granularity of a 1-10 scale anyway. If you must
have a scale, use 1-4. There's no meaningless middle answer to fall back on,
and it's small enough that the answers may be accurate (what's really the
difference between a 6 and a 7, or a 2 and a 3, on a 1-10 scale?).

Surveys should be dynamic and look like they were actually written by people,
not machines. Ask questions in different ways. Ask open questions and give
plenty of room to answer. Don't try to limit answers to an unreasonable set of
options. Don't ask too many questions (I think about 20 should be considered
the limit).

~~~
sdesol
I hear what you are saying. I'll definitely need to come up with something
that can translate into dollar signs. Thanks for the input.

