Show HN: Wisdom-of-crowds survey tool that finds hidden consensus at scale (opinionx.co)
94 points by danielkyne 51 days ago | 26 comments



After talking with a bunch of user researchers, we realised that a lot of the research process is stupidly clunky. User researchers often run a survey with open-ended questions, manually group the responses into themes, and then turn that into a normal multiple-choice survey based on what gets said most. Overall it can take a full week of work and is pretty frustrating.

How our tool makes this process better:

- By analysing user votes we surface the most important topics/opinions, not just the topic with the most submissions (there's a rough sketch of this kind of vote-based ranking after this list).

- You get qualitative input alongside quantitative analysis, making product prioritisation way easier.

- By letting users submit new opinions you can discover new insights you never considered before.
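
To make the vote-based ranking idea above concrete, here is a minimal sketch in the spirit of pairwise wikisurveys such as AllOurIdeas: opinions are compared head-to-head and ranked by win rate rather than by how often they were submitted. This is an illustration only, not OpinionX's actual algorithm.

  # Rank opinions by how often they win pairwise "which matters more?" votes,
  # not by how many times they were submitted. Illustrative sketch only.
  from collections import defaultdict
  from dataclasses import dataclass

  @dataclass
  class Tally:
      wins: int = 0
      appearances: int = 0

  def rank_opinions(pairwise_votes):
      """pairwise_votes: iterable of (winner_id, loser_id) tuples."""
      tallies = defaultdict(Tally)
      for winner, loser in pairwise_votes:
          tallies[winner].wins += 1
          tallies[winner].appearances += 1
          tallies[loser].appearances += 1
      # Score = share of comparisons won; an opinion submitted once but
      # consistently chosen over others outranks one submitted often but rarely picked.
      scores = {op: t.wins / t.appearances for op, t in tallies.items()}
      return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

  votes = [("fix onboarding", "dark mode"),
           ("fix onboarding", "more integrations"),
           ("dark mode", "more integrations")]
  print(rank_opinions(votes))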

We're in the process of setting up a freemium subscription model. Right now it's free to create your first survey [1] and we've got a sample survey going too if you want to just play around with it from a participant perspective [2] (it asks for your email but feel free to put anything in, it doesn't verify it).

[1] https://app.opinionx.co/sign-up

[2] http://app.opinionx.co/5fc62b9f9891c20db5dc2d14


My problem with most surveys is vague questions.

Do you like meeting new people? Well, it depends on the circumstances, the situation, and what kind of people they are.

Do I always learn something new? Nope. There's no way any human being can always be learning. So why even ask such a question?


This is often combined with multiple-choice answers that are completely unambiguous. My favorite is that in this scenario, there's never an "Other" box and you're not allowed to leave the question blank. I figure the researchers deserve what they ask for, so I try to pick the most ridiculous answer from those provided.

  What is your favorite color?
  [ ] 1.  Seafoam
  [ ] 2.  Taupe
  [ ] 3.  Periwinkle
  [X] 4.  African Swallow
(if only they'd offered "Blue, no Green!")


I think it makes sense if you see the vague questions as proxies for cultural markers. There's a type of person who says "Yeah, I learn something new every time I meet someone! People are all so interesting and unique!". What the survey really wants to know is whether you're that type of person.


> "Yeah, I learn something new every time I meet someone! People are all so interesting and unique!".

If you want to know if people feel that way, would it not be better to ask about that issue directly, such as by asking how much they agree with the above sentiment? It might not be a very reliable way to find out, but I am skeptical that asking "do you always learn something new?" is a better way.

There's also something circular about the idea that "there's a type of person who...", if those types are coming from the surveyor's interpretation of the replies to the questions they asked, especially if the questions themselves were framed on the assumption that there's a type of person who...


Yup.

Ask me if I like meeting new people and I'll say yes.

Ask me if I like meeting new people in a gym, when all I want is to break some sweat, have a shower and go home and I'll say no.

Go figure.


This is frustratingly common with traditional surveys. In our conversations with user researchers, we found that researchers often know what they want to ask, but they end up breaking it into loads of different (often vague) questions in order to retrofit their research objective into survey format. Because you're less limited in scope with an open-ended survey like OpinionX, you can set the direction and question of a survey much more directly. At the end of the day, it's up to the researcher to write the best question possible, but we feel OpinionX gives them more breathing room and lets them ask more directly if needed.


This reminds me of pol.is, which is used in various places to develop rough consensus in multi-stakeholder situations.

https://pol.is/ https://github.com/pol-is/

Taiwan uses it for multi-stakeholder decision making to find points of agreement. Their application of it to the Uber vs. taxis situation was quite interesting.

https://debconf18.debconf.org/talks/135-q-a-session-with-min... https://blog.pol.is/pol-is-in-taiwan-da7570d372b5 https://blog.pol.is/uber-responds-to-vtaiwans-coherent-blend...


pol.is was certainly an inspiration. We were really interested in the power of wikisurveys broadly, including pol.is and AllOurIdeas. When we started working on OpinionX, we couldn't find any wikisurvey tools that weren't really complex to set up (especially if you didn't have a technical background), so we figured we should try and democratise the wikisurvey model for broader use in a self-serve SaaS tool.


I've tried to set up the open-source version of pol.is but never succeeded, so I think there is a definite market there for you.

I'd encourage you to write some case-study blog posts about the analysis tools on offer. pol.is's blog posts on that topic were the most interesting thing about it for me, especially the principal components analysis with all the graphs.
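
For context, pol.is builds a participants-by-statements vote matrix and projects it with PCA so that opinion groups become visible. A minimal sketch of that general idea (illustrative only, not pol.is's real pipeline):

  # Participants x statements vote matrix (agree=1, disagree=-1, pass/unseen=0),
  # projected to 2D with PCA so opinion groups become visible, then roughly
  # clustered. Toy example, not pol.is's actual analysis code.
  import numpy as np
  from sklearn.decomposition import PCA
  from sklearn.cluster import KMeans

  votes = np.array([
      [ 1,  1, -1,  0],   # participant 0
      [ 1,  1, -1, -1],   # participant 1
      [-1, -1,  1,  1],   # participant 2
      [-1,  0,  1,  1],   # participant 3
  ])

  coords = PCA(n_components=2).fit_transform(votes)             # 2D map of participants
  groups = KMeans(n_clusters=2, n_init=10).fit_predict(coords)  # rough opinion groups
  for pid, (xy, g) in enumerate(zip(coords, groups)):
      print(f"participant {pid}: group {g}, position {xy.round(2)}")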


I would love to know how my survey responses change the recipient behavior.

If I say that I love meeting new people, will my gym invest into expanding the lounge area instead of buying a new barbell?


Neither, it’ll change their ad copy!


Why do I need to submit an email address and agree to terms for the demo?


Hi Peter, sorry about that. I understand this isn't the ideal setup for a demo. Right now, all OpinionX surveys are automatically set up to collect email addresses (mainly because we haven't implemented a caching system to let participants re-join existing surveys).

We've just linked a normal OpinionX survey as our demo for now, as we haven't built something better suited to a landing-page demo yet. It's on our to-do list though; we know it's not the most user-friendly way to try it out.


Got it, thanks for clarifying that.


How does this tool fight the natural tendency towards answer herding?


What is answer herding? No results on Google.


People will just upvote the answer that they see is most popular already (whether because it’s popular or because it’s the first visible choice to vote on)


Participants don't see the voting results for any statement before they vote, which limits potential groupthink/herding. Also, opinions are presented in a different order for each participant, so any bias from the first statement someone sees gets balanced out as voting grows.
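
As a toy illustration of those two mitigations (per-participant shuffling and hidden tallies), something along these lines would do it; the helper below is hypothetical, not OpinionX's code:

  # Each participant gets an independently shuffled statement order, and only
  # the statement text (never the running vote tally) is sent to the client.
  import random

  def statements_for_participant(statements, participant_id):
      rng = random.Random(participant_id)   # stable per-participant shuffle
      order = statements[:]
      rng.shuffle(order)
      # Strip vote counts before anything reaches the participant.
      return [{"id": s["id"], "text": s["text"]} for s in order]

  stmts = [{"id": 1, "text": "Onboarding is confusing", "votes": 42},
           {"id": 2, "text": "Pricing is unclear", "votes": 7}]
  print(statements_for_participant(stmts, participant_id="user-123"))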


I like the idea of rethinking surveys.

1) Have you thought about voting schemes besides multiple choice? E.g. ranked choice, weighted preference, exclusion, multiple-round voting. There's a field of economics called "social choice theory" that you might have come across; it would be a good area of research. (A rough ranked-ballot tallying sketch follows this comment.)

2) It would be interesting if people could predict survey outcomes, compare predictions with colleagues, and compare predictions against outcomes.

3) How are you thinking about anonymity? It affects what people are willing to reveal. We use Google Forms for surveys at work and you have to be logged in with your company email address. When a survey collects email addresses, it says so, but when that is turned off, the survey isn't labelled as anonymous. It would be nice if anonymity/privacy were clearer, or if anonymity were an option for the respondent.
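
As a side note on point 1, one of the simplest social-choice schemes beyond plurality voting is a Borda count over ranked ballots; a minimal, purely illustrative sketch:

  # Borda count: each ballot ranks the options, and an option earns
  # (n - position - 1) points per ballot. Illustration of a ranked-ballot
  # scheme only, not a suggestion of how OpinionX does or should tally votes.
  from collections import Counter

  def borda(ballots):
      n = len(ballots[0])
      scores = Counter()
      for ballot in ballots:
          for position, option in enumerate(ballot):
              scores[option] += n - position - 1
      return scores.most_common()

  ballots = [["speed", "price", "design"],
             ["price", "speed", "design"],
             ["speed", "design", "price"]]
  print(borda(ballots))  # [('speed', 5), ('price', 3), ('design', 1)]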


What happens when there are a lot of unique submissions? Will people get fatigued reading all the options?


Great question! Right now it's open-ended, so participants can leave whenever they want (not overly high-tech!).

We've got loads of plans for how we can improve this over time though: using NLP and word embeddings to spread participation across all the core themes, and driving the order of opinions with decision trees, k-means clustering and matrix factorisation, so that we're showing the right opinions to the right people and getting a predictable view of the entire population of users.
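
To make the clustering idea concrete, here is a rough sketch that groups opinion embeddings with k-means and picks one opinion per theme to show next. The embed() function is a stand-in for a real sentence-embedding model, and none of this is OpinionX's actual implementation:

  # Embed opinion text, group opinions into themes with k-means, then sample
  # one opinion per theme so each participant sees broad coverage instead of
  # many near-duplicates. Sketch only.
  import numpy as np
  from sklearn.cluster import KMeans

  def embed(texts):
      # Stand-in embedding: replace with a real sentence-embedding model.
      rng = np.random.default_rng(0)
      return rng.normal(size=(len(texts), 16))

  def pick_representatives(opinions, n_themes=3):
      vectors = embed(opinions)
      labels = KMeans(n_clusters=n_themes, n_init=10).fit_predict(vectors)
      picks = {}
      for opinion, label in zip(opinions, labels):
          picks.setdefault(label, opinion)   # first opinion seen in each theme
      return list(picks.values())

  opinions = ["Onboarding is confusing", "Sign-up flow is unclear",
              "Pricing is too high", "Plans cost too much",
              "Needs a mobile app", "No iOS support"]
  print(pick_representatives(opinions))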


This is super smart.

It would be cool to see a content site with this mechanism.


Thanks man! There are a lot of use cases for a platform like this, so we're focusing our efforts on user researchers, but we agree that it can be (and already has been) used in very interesting and different ways.

We've already seen people use it for large-scale needs assessments, school curriculum co-creation, office Christmas party planning, startup problem statement creation, pizza company market research and more :)


Reddit comment threads have upvotes and downvotes; that's sort of an open-ended survey with a voting mechanism.


100%. When trying to explain OpinionX in the earlier days (before our first MVP), people often found that comparing it to Reddit helped them understand what we were building. It was probably the most common comparison we got.



