Hacker News

Yeah, it is weird that work and quality are assumed to sit on a linear scale, considering how well known Pareto/power-law distributions are. These distributions are also extremely common; we even codify the sentiment in the 80/20 rule, or in the saying that 20% of the time is writing code and 80% is debugging it. What's interesting is that the effect is scale-invariant. You see it when comparing countries by population, but the same general shape appears when looking at the populations of states/regions/cities within any given country (zooming in for higher resolution), even down to the street level.
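The 80/20 claim above is easy to check numerically. A minimal sketch, assuming a Pareto (Type I) distribution with shape alpha = log(5)/log(4) ≈ 1.16, which is the textbook value that yields the classic 80/20 split (that parameter choice is my assumption for illustration, not something stated in the comment):

```python
import numpy as np

# Sketch: power-law samples concentrate most of the total "mass"
# in a small fraction of items -- the 80/20 rule.
rng = np.random.default_rng(0)
alpha = np.log(5) / np.log(4)  # ~1.16, the shape giving an 80/20 split

# NumPy's pareto() draws from the Lomax distribution; adding 1
# gives a standard Pareto I sample with x_min = 1.
samples = rng.pareto(alpha, 100_000) + 1.0

samples.sort()
top20 = samples[int(0.8 * samples.size):]  # the largest 20% of items
share = top20.sum() / samples.sum()
print(f"top 20% of items hold {share:.0%} of the total")
```

Because the tail is heavy (the variance is infinite for alpha < 2), the sample share fluctuates quite a bit around the theoretical 80%, which is itself a nice illustration of how noisy these measurements are.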

> Obviously every situation is different, but people seem to be pretty okay with relinquishing agency on these things and just going along with whatever local maxima their org operates in. It's not totally their fault, but they're not blameless either.

I agree here, to the letter. I don't blame low-level employees for maximizing their local optima. But there are two main areas (among many) that just baffle me. The first is when this is top-down: when a CEO and board are hyper-focused on benchmarks rather than on the thing being evaluated, unable to distinguish the two (benchmarks and metrics are guides, not answers). The other is when highly trained and educated people actively ignore the situation. An over-reliance on metrics, refusing to acknowledge that metrics are proxies, abandons exactly the nuance those people were explicitly trained to look for, which is what meaningfully distinguishes them from less experienced people. I've been trying to coin the term "Goodhart's Hell" for this more general phenomenon, because I think it is a fairly apt and concise description. The phenomenon seems widespread, but I agree that the blame weighs more heavily on those issuing the orders. Just as a soldier is not blameless for participating in a war crime, the ones issuing the orders deserve the harsher critique due to the imbalance of power/knowledge.

Ironically, I think we need to embrace the chaos a bit more. By that I mean recognizing that ambiguity and uncertainty are inescapable, not abandoning metrics altogether. Modern society has gotten so good at measuring that we often forget our tools are imprecise, whereas previously the imprecision was so apparent that it was difficult to ignore. One could call this laziness, but considering that it's systematic, I'm not sure that's the right word.



> An over-reliance on metrics, refusing to acknowledge that metrics are proxies, abandons exactly the nuance those people were explicitly trained to look for, which is what meaningfully distinguishes them from less experienced people. I've been trying to coin the term "Goodhart's Hell" for this more general phenomenon, because I think it is a fairly apt and concise description.

McNamara fallacy or quantitative fallacy.[1]

[1] https://en.wikipedia.org/wiki/McNamara_fallacy



