Is it unintuitive?

The concept of diminishing returns applies to everything any living thing does. I think it's very natural, even in the 'our brains are wired for this' sense.


Yes, it's counterintuitive to many people that "the optimal amount of $BAD_THING is non-zero" for a rather large set of possible values of $BAD_THING.

"You can't put a price on human life" is one common occurrence of this. I thought it was just a throwaway phrase, or a wish of how things could be, but I've regularly run into people who don't understand why we would ever fail to spend absurdly large amounts of money to save a single life.

I have had people say "Is the only reason we aren't doing something about X money?" for various forms of X. I personally spent over an hour walking one such person through the concept of opportunity costs, the fact that money is representative of value, &c., with the conversation ending with them still certain that the US Government could just print a trillion dollars and solve the problem.

I have talked to people who honestly think that every person with depression should be forcibly committed to prevent them from committing suicide, and who respond to "but only a fraction of those people have suicidal ideation, and only a fraction of those commit suicide" with an "if it saves just one life, it's worth it!" So forcibly committing 8% of the adult population is not too high a cost in their minds.


Well, the optimal quantity for many things is actually zero.

It's the edge cases -- the things that are really hard to get rid of and where we get some useful benefit from the related activity -- where it's nonzero.

Of course, those edge cases are our biggest world problems-- for example, pollution, because pollution abatement is hard and the industry and commerce that produces pollution is beneficial... or fraud, because fraud protection is hard and the industry and commerce that provides opportunity for fraud is beneficial.


because they somehow have this idea that they own the rights to control others. Once you consider that you can force others to do anything, regardless of reason, you can begin to rationalize anything, and it will be "for the greater good" or "for their own good". They probably even sincerely mean it too.

some kind of bald man once quoted someone: "With the first link, a chain is forged..."


> because they somehow have this idea that they own the rights to control others

The flip side, of course, being the folks who believe they somehow have no responsibility for how they use their rights to impact others.

Society can only function with at least some balance between these two extremes. Some of us need a little bit of chain.


You cannot commit crimes against others; that's about it. If someone comes to my house dying of cold, they have no right to demand I help them. I would be an asshole if I didn't, and I think people SHOULD help, but you have no right to demand I do (and again, not saying I wouldn't, just that nobody gets to be entitled to it).


Unfortunately, we encounter far more complex scenarios, from "can the car dealership dump engine oil into the creek behind their maintenance bay?" to "we sold a product that provably killed thousands of people, but none in a way that can cause us direct individual liability".

Societally, we've largely decided we're all better off without a pile of frozen bodies at our door.


> Once you consider that you can force others to do anything, regardless of reason, you can begin to rationalize anything, and it will be "for the greater good" or "for their own good". They probably even sincerely mean it too.

Eh, I know lots of people who are neither anarchists nor totalitarians, so I'm not sure this is true.


I said you CAN begin to rationalize... I didn't say most would do it about everything, but as is clearly evidenced here, many would about a great many things.

"Red cars are more involved in traffic accidents, I therefore think you must be a murderous lunatic if you get a red car, and we cant have that, so lets forbid it"


> Yes, it's counterintuitive to many people that "the optimal amount of $BAD_THING is non-zero" for a rather large set of possible values of $BAD_THING.

How much "earth is sterilized" risk is reasonable per year?


It is unintuitive. It hits some primal part of our brain, the same one isolated in that capuchin monkey experiment where two monkeys get unequal rewards for the same task. People get either really excited or really pissed off when they see people getting away with something; at the same time, when we read stories about bureaucracy --- a perennial bête noire on HN --- we rarely think in terms of the fraud we should be accepting. We just think fraud is bad, and anti-fraud is also bad.


I think it is in part rooted in the difference between committing fraud and detecting fraud. They're entirely different things, and the misunderstanding results from conflating them. It is indeed very interesting how deeply hardwired this sort of thing is.


It is indeed unintuitive in practice.

One of the key factors is that a large part of the cost is opportunity cost. People tend to get all tangled up when thinking about such counterfactuals. We want to have our cake and eat it too. And if we can't, we persist in trying to find ways to believe that we can, and are surprised that the world continues not working that way.


The optimal amount of airline catastrophes is non-zero.

The optimal amount of pollution is non-zero.

The optimal amount of [insert almost anything bad] is non-zero.


The optimal amount of X, under the assumption that it costs more and more resources to reduce X, is non-zero. It's this middle part, the part you omitted, that turns an unintuitive statement into an obvious one.
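To make that middle part concrete, here's a minimal sketch with made-up cost curves (linear harm, 1/x abatement cost; the coefficients are arbitrary):

    import numpy as np

    # Total societal cost of tolerating amount x of a bad thing:
    #   harm(x)      = h * x   (damage grows with x)
    #   abatement(x) = c / x   (suppression cost diverges as x -> 0)
    # Minimizing h*x + c/x:  h - c/x**2 = 0  =>  x* = sqrt(c/h) > 0.
    h, c = 2.0, 8.0                       # hypothetical coefficients
    xs = np.linspace(0.01, 10, 10_000)
    x_star = xs[np.argmin(h * xs + c / xs)]
    print(x_star, np.sqrt(c / h))         # both ~2.0: the optimum is non-zero

Any abatement curve that blows up near zero gives the same shape; only if driving X all the way to zero were free would zero be optimal.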


Those are all true statements, right?

The ideal amount of anything bad is zero. But the optimal amount is going to be higher than that, given we don't have unlimited resources to spend.


Again, this is totally misleading.

The optimal amount of airline catastrophes is zero. It is also impossible.

Should all humans spend all of their effort and all of their resources to lower the amount of airline catastrophes? I believe that pretty much everyone finds it reasonable to say no.


Why is that zero? I don't want to die in a plane any more than the next person, but is zero really optimal? It loses the fewest lives and is the least catastrophic, but is it optimal? The question becomes: what are we optimizing for? If the optimization equation is for lives lost, I can make that zero real easily by just stopping air travel entirely. If no one travels by air, then no one can die by air. But of course that's not a useful solution at all. So we're not optimizing only for lives not lost; we're optimizing for people being able to travel. Loss of life is tragic, but not the be-all and end-all. The gross reality we don't want to face is that there's a price on a human life; the only question is how much.

Would you take a $10, 10,000-mile flight across the world with a 1% chance of dying? How about a $100 flight with a 0.1% chance of dying?
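For a rough sense of scale, here's a back-of-the-envelope expected-value check. The $10M "value of a statistical life" is an assumption for illustration (US regulators use figures in that general ballpark):

    # Expected fatality cost of each hypothetical flight, using an
    # assumed value of a statistical life (VSL).
    VSL = 10_000_000
    for price, p_death in [(10, 0.01), (100, 0.001)]:
        print(f"${price} ticket: expected fatality cost = ${p_death * VSL:,.0f}")
    # $10 ticket: expected fatality cost = $100,000
    # $100 ticket: expected fatality cost = $10,000

At a 1% death rate the $10 fare is a terrible deal; real commercial aviation risk is orders of magnitude lower, which is why cheap fares and small-but-non-zero risk can coexist.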


Optimum is zero, because if you could wave a magic wand to make the number of accidents zero, it would be a good thing to wave the wand.

The optimum is non-zero only if there are enough costs associated with making it zero.

That's why asking for the optimal amount of fraud is misleading. It omits the costs. Once the costs are taken into account (i.e., it is clarified what the question means), the answer is obviously above zero.


"Optimum is zero, because if you could wave a magic wand to magically make the amount of accidents zero, it would be a good thing to wave the wand."

You are conflating optimum (highest-value outcome across all variables) and ideal (highest-value outcome for one variable). The ideal number of any bad thing is zero. I can't, off the top of my head, think of any bad thing for which the optimum number is zero. Extinction-level events, perhaps.


> Is it unintuitive?

Yes, when we look at it from both an individual level and a societal level. There's a very, very strong aversion to loss, and many decisions (including my own) are based on the concept of not losing something. Many times this leads to choices that are sub-optimal (for the thing that's being optimized for).

By extension, many people apply this thinking to businesses and higher levels as well.

