Hacker News
Don't overreact to weak signals (themarketplace.guide)
83 points by rognjen on Oct 25, 2022 | 35 comments



This is good advice.

Unfortunately, it's really hard, because the opposite statement is also good advice: Don't underreact to weak signals. I honestly couldn't tell you which I see more of. Overreactions are intrinsically easier to notice, since they're big and showy, but I kinda suspect there may be more underreaction in the world.

Unfortunately, having the precisely correct level of reaction is really hard, and I don't have any solid snappy advice for it, except that it's probably one of those things irreducibly tied to experience and wisdom, not just raw intelligence.


Well, usually there are lots of signals. Overreacting means you might do something misguided and expensive. Underreacting means you miss a signal or two and delay responding to a big risk until the information is certain. The pay-off is asymmetric.

If you have a process for spending money in place and people with too much free time on their hands then overreacting might be OK. Too much research never hurt anyone so "overreacting" by investigating thoroughly is probably not wasting your time. But if your reaction process is throwing money at ideas you'll probably just get scammed.


It also depends on what climate you're in.

Overreacting to signals in a bull market was probably the right move with the abundance of opportunities.

The opposite is of course true now.


This is generally good advice. Poor management tends to overreact to these sorts of signals which helps them create an illusion of busy-ness but creates havoc within their team.

I’ve seen teams spend days consumed by their overreaction, make unnecessary changes to the product, and then shrug a week or month later when everyone collectively agrees that the change should be undone. Bizarre.


And I've seen product management in particular overreact at several different companies besides ours.


I’ve seen this especially with teams looking for product market fit

You have a handful of customers using a product. One point of data comes in. The team, hungry to build a product customers want, jumps on that one point of data and builds a whole feature around it. Nobody uses the feature.


Or takes forever building the feature and/or takes forever getting it right, and the one customer who wanted it gives up and leaves in the meantime. Or even worse they hang on with some tiny engagement that turns out to not actually bring in a lot of money, but still sucks up 15% of developer time (not to mention customer support reps & business development people constantly having to keep them happy) and adds extra value for no other customers.


Having the team use their own product for the same reasons as customers is a great investment in reducing these issues. Ideally everything from signing up as a new customer through to daily tasks.

I try to do this at every place I work. Usability issues become completely obvious very fast.

The idea struggles if customers keep wanting to adapt your product to their business. It's best if the customer is able to adapt their business to the product.


In general, this is excellent advice. But we were running a marketplace for food ingredients, and while my co-founder had lots of experience in the industry, there was no realistic way to dogfood it.


We were looking for PMF when I realized that our sample size was entirely too small to be considered a signal.


My initial reaction when reading the title was like: "Yeah, that's good advice when going on a date!" After reading the first few lines, I realize that the topic is completely different, yet somehow the same.


You beat me to this comment... advice to my past self on an exciting pseudo-date.


Should I submit it to a dating website advice column, I wonder...


Yes, go ahead! Also, a Youtube tutorial would be awesome!


It's interesting to consider this together with Jeff Bezos's famous quote: "The thing I have noticed is that when the anecdotes and the data disagree, the anecdotes are usually right. There is something wrong with the way that you are measuring it." And my understanding is that he frequently forwards customer feedback to product teams - is this not "Considering one instance of user feedback to represent a pattern"?

Perhaps the synthesis of these ideas is not to overreact to weak signals, but to be sure you're aware of them.


I think that's just a story about how Amazon is always trying to quantify performance and getting it wrong. Seriously, Google quantifies a lot of things too but they don't screw up. If Bezos still hand picks examples and that's the only way real change can happen, that's probably why Amazon has a reputation as a meat grinder.


Contrast with: https://markxu.com/strong-evidence which claims that much of the evidence you interact with is strong.


Evaluating the strength of evidence is one thing. Another is correctly identifying what the evidence supports, regardless of strength.

Another common family of mistakes is overgeneralization or overapplication: taking evidence to mean something deeper or broader than it has any business commenting on.


Interesting, thanks. I'll update the article to add it as a counterpoint.


I don’t think the article I linked is disagreeing so much as saying something different but related. But I’m glad that you got to see it.


Would have liked this more if the advice was wrapped into a story of something that actually happened.

The way this advice is presented, it reads as rather abstract.

PS: I have read the examples. They don’t make it better. There’s also no real story behind them.


I certainly don't think this is bad advice, but without some discussion of how to assess the strength of a signal or the appropriate reaction size, I'm not sure it's useful advice. Knowing not to overreact to weak signals seems like the easy part. The hard part is knowing the difference.


Your downvotes are unjustified. This is good feedback. I'll update the article to talk about sample size etc.


Thanks. I spend a lot of time working around pretty thin data, so I'm always interested to hear how people think about extracting insights when you don't have the luxury of doing proper, robust experimental design or sophisticated statistical analysis. I'll keep an eye on the page for updates.
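Not a substitute for proper experimental design, but one quick sanity check that helps with thin data (this sketch is my own, not from the article): put a Wilson score interval around whatever proportion you're looking at. With one or two data points the interval comes back absurdly wide, which makes the weakness of the signal explicit before anyone overreacts to it.

```python
import math

def wilson_interval(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """95% Wilson score interval for a binomial proportion.

    Behaves sensibly even for tiny samples, unlike the naive
    p +/- z * sqrt(p * (1 - p) / n) interval, which collapses to a
    point when all observations agree.
    """
    if n == 0:
        return (0.0, 1.0)  # no data: the proportion could be anything
    p = successes / n
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return (max(0.0, center - half), min(1.0, center + half))

# One customer out of three asked for the feature:
lo, hi = wilson_interval(1, 3)
# The interval is roughly (0.06, 0.79) -- far too wide to call a pattern.
```

With n=3 the "signal" is compatible with anything from a niche request to a majority need, which is usually the whole argument you need against building a feature around it.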


Large business decisions need the context of the business and team, right?

For example, an early team/company may only have a handful of datapoints and will tend to rotate with every piece of new data that comes in. Survival mode.

A large company with hundreds or thousands of employees needs to react differently, because rotating on most new single datapoints causes so much thrash. However, a good leadership team will know when that one new datapoint is worth the thrash of a quick rotation.

Great leadership and great management carry the context.


Perhaps more dangerous is over-reacting to data that was recently gathered (recency/availability bias).

I’ve seen founders come out of discovery interviews wanting to pivot right away. In this case it wasn’t necessarily the strength of the signal; rather, it was how recently that data point was gathered.


This is also a very good point -- the most recent feedback isn't always the most relevant.


Classic HN off-topic post, sorry.

Was a particular CMS or static site generator used for this page? It reminds me a lot of software documentation pages, and I never considered before that this kind of site layout and design would be good for a blog.


Based on a quick search through the source, it looks like a statically generated page based on the Jekyll theme "Just the docs." [1]

[1] https://jekyll-themes.com/just-the-docs/


This is correct.


Judging by the site's source, it's a Jekyll theme.

And one that has a few bugs in it:

  </p> </p>

  <a target="_blank" rel="noopener" href="mailto:...?subject=...&quot;...&quot;&body=..."> ... </a>


Thanks for pointing that out. I'll look into it.

If you mean the mailto link, that shouldn't be a bug; it should pre-populate some content. Lemme know if it doesn't.

Edit: Couldn't find where the </p></p> happens -- lemme know if you can screenshot it...


The counterargument to this of course is that humans are prone to the Normalcy Bias [imaginary link to wikipedia here]


Sorry for the meta commentary but this format always cracks me up.

[catchy title]

[completely anodyne observation]

[completely anodyne observation]

[common knowledge]

Here at throbbing rooster dot io we’ve really learned this lesson thanks for reading.


I guess you've learned this lesson yourself so it's not useful for you ¯\_(ツ)_/¯



