
‘Strong Opinions, Weakly Held’ Doesn't Work That Well - shadowsun7
https://commoncog.com/blog/strong-opinions-weakly-held-is-bad/
======
jrockway
> this is not how the human brain works

To some extent, to succeed in the modern world, you have to override your
instincts with a thought-out response to things. Whatever heuristics we've
built up over thousands of years are interesting, but some of them don't apply
anymore. (If you were a prehistoric human and you encountered a beehive, you'd
eat all the honey. But now you can get hundreds of beehives worth of honey in
one trip to the grocery store, and your body's not going to tell you to not
eat it. It will tell you the opposite -- this tastes great! You just have to
learn that it's bad for your health and choose not to do it, no matter how
good it would taste.)

Basically, the human brain is a neural network trained on very old data.
Fortunately, it is also very adaptive and can ignore that training data with
some conscious effort. You need to operate your brain in that mode more now
than you did 10,000 years ago.

"Strong opinions weakly held" means "argue your point but sometimes you're
going to lose." Evolution might not have rewarded losing over the course of
millions of years, but whatever meeting you're arguing in doesn't have that
kind of staying power. You can lose the argument and the human race will
survive another generation. Whatever argument you're having probably doesn't
matter in any meaningful sense. It's not life or death.

~~~
austincheney
I have found that in software that is wrong. If you want to succeed in
software you need to do what is most popular, which largely follows innate
human behavior. Unlike many other professions there is no licensing,
accreditation, or common ethic, which means there is no standard baseline of
competence.

This becomes visually clear when you take the difference between success and
capability and plot that difference on a graph, such as a bell curve.

[https://news.ycombinator.com/item?id=23768137](https://news.ycombinator.com/item?id=23768137)

~~~
zbentley
> If you want to succeed in software you need to do what is most popular which
> largely follows innate human behavior

I totally disagree. Many of the most valuable, respected, and successful
software engineers I've encountered have evinced deep distrust of what is
currently popular in the field. Their avoidance of (and advocacy for continued
avoidance of) new/trendy software fads, particularly on the frontend, enabled
excellent engineering, and instilled a powerful attitude of pragmatism and
productivity among their colleagues.

It's anecdata so take it with a grain, but in organizations big and small I've
observed that some of the most successful and capable engineers are
distinguished by antipathy towards popularity-driven engineering.

~~~
austincheney
> Their avoidance of (and advocacy for continued avoidance of) new/trendy
> software fads, particularly on the frontend, enabled excellent engineering

I absolutely agree with that. You are describing an expert, who is not likely
to be successful or respected, particularly among their front-end web
technology peers. Expertise is not frequently revered by other developers,
particularly when compared with compatibility. That distinction is even more
evident when applying for jobs elsewhere.

~~~
zbentley
Maybe I've just been lucky with where I've worked, but I've often seen those
people get significantly rewarded for their anti-fad efforts--both in terms of
money/promotions and respect from their colleagues.

I've seen the opposite too, but not as often.

~~~
austincheney
A complete disdain for jQuery is what allowed me, as a full-time JavaScript
developer, to be promoted to senior and nearly double my salary. That is only
because I was working, at the time, in a niche area where jQuery failed
spectacularly in production. I was deemed an expert and rewarded for it, but
that is astonishingly rare. Now you cannot get hired, much less attain any
kind of success, without something like Angular or React, even if you can
write superior code without those in half the time. That is why I only look
for jobs that primarily deal with writing for Node, but even still,
competence is not well appreciated when interviewing.

------
lambdatronics
My take on "strong opinions, weakly held:" It's about combating your own
confirmation bias by being willing to do the work of re-examining your
conclusions in the light of new evidence. It's hard to do b/c we get
emotionally invested, and also because once you make a conclusion an
assumption, it becomes implicit & fades into the background -- so it's harder
to question.

On the flip side, it's also about trusting your own reasoning above the crowd
-- you are thus able to pick up the $20 bill on the ground instead of being
sure it's fake b/c nobody else has picked it up already.

~~~
godelski
Honestly I DON'T trust my own reasoning. I can do this adversarial process as
described in this article on my own, but that's incomplete. People outside of
me have information and ways to view things that I don't. Others work as a
discriminator: you update your opinions and views as more information comes
in and challenges your own ideas. It is because I don't trust my own
reasoning that I read textbooks, seek out experts, and try to get as many
views as possible, because no one has the complete picture.

~~~
zimpenfish
> Honestly I DON'T trust my own reasoning.

I have this and it's a real problem for software development because everyone
expects a senior to be gung-ho about their solutions and perfectly confident.
Whereas I'm always asking for second and third opinions about things and
people (largely management but occasionally other developers) view that
negatively.

~~~
q3k
I think this might be due to a difference in optics. What you see as 'asking
for a second or third opinion' might be seen by others as lack of
independence, leadership and ownership.

Almost everywhere, senior developers are expected to self-manage and make
difficult decisions on their own, unprompted, and just present working
solutions to the business. It's okay to take time on this - solving it on
your own, consulting literature, peers, etc. But looping management into
technical decision making might make you look like you can't make decisions
on your own, or need micromanagement. Management, business and product care
about things being done - not technical approaches or having to choose
between multiple solutions.

~~~
blaser-waffle
I agree with this. Even if you as an engineer aren't 100% sure, sign-offs
from other groups or extra opinions are something for a project manager or
product manager to fish around for. There are some optics to consider there
too -- e.g. the implication that the PM thinks you're a fool -- but having
them fish around for buy-in and alternatives sells better than you doing it
yourself, which can make you look weak or unprepared.

Context matters here too: if you know Alice and Bob on Team [X] are just
great at crypto or DB work or whatever, then getting their feedback before
implementing a feature might make sense / be a standard due-diligence thing.

~~~
zimpenfish
> sign-offs from other groups or extra opinions are something for a project
> manager or product manager to fish around for

Not always possible - my current gig, for example, has neither project manager
nor product manager.

------
redelbee
I think a better headline for the post would have ended with “... doesn’t work
that well for me.” The author tried the “strongly held” strategy, found it
difficult, and decided to justify another strategy of asking “how much are you
willing to bet on that?” instead.

Why not both? After all, deciding “how much to bet” is probably one kind of
fairly strong opinion. It’s at least strong enough to bet on. Betting strategy
changes as information changes, so maybe you could construe that bet (or
strong opinion) as being loosely held.

I agree that pithy phrases shouldn’t be used to justify “strongly held bad
opinions” but how are we even deciding what a “bad” opinion looks like?

In the end we probably all want to come to the most correct conclusions and be
willing to change when presented with new information. How we get there
probably doesn’t matter as much as a majority deciding it’s worth the time and
effort to do so in the first place.

~~~
kashyapc
On how to tell a bad opinion apart from a good one, the ancient Stoics
(lately I'm quoting them more, as I'm immersed in their writings) have some
thoughts. For the longer version, you have to read their works[1], but to
give an extremely simplified version, without butchering the concept:

The Stoics have this notion of 'impression' and 'assent'. It goes like this:
an impression of walking strikes you. But only after you tell yourself "yes,
it is fitting for me to walk", thus giving your 'assent' (agreement) to it,
will you actually go for a walk. Of course, we know that we don't actually
verbalize like that, as it all happens too quickly. Their goal here is to not
evade responsibility for shaping one's own judgements, opinions, and even
emotions "in accordance with reason".

Thus, the Greek philosopher Epictetus' favourite way of describing the Stoic
project is: "making correct use of mental impressions".

- - -

The same technique of impression/assent is also used, along with others, to
diagnose "passions" (Greek, _páthos_ —it's a loaded word that is used to
categorize many emotions, including the debilitating ones). Thus, for the
Stoics, the cause of any "passion" is an "error of judgement". What sort of
error? A mistaken system of values—there's a ton more to this, but I have to
skip it for brevity's sake. FWIW, there are some reading recommendations on
this topic in a thread here[1].

[1]
[https://news.ycombinator.com/item?id=22990579](https://news.ycombinator.com/item?id=22990579)

------
gav
There's a type of exercise where you're not asked "I want to do $x, how do I
achieve that?" but instead are asked "I don't know what to do, can you find
out?", where the scope might be anywhere from "pick a new ERP" to "reorganise
the entire worldwide operations to be more effective at all things digital".

If faced with such a wide-open question you could do some research and start
asking all the questions you think of, but you're then just hoping to narrow
in on something by luck. The "Strong Opinions, Weakly Held" method works well
here: if within the first week you can learn enough to form some opinions,
you now have some ideas to test against and disprove. You can start to decide
what information is important and what isn't, rather than trying to gather
all the possible information and synthesize it later on.

If you have an opinion such as "you should formulate a strategy to sell direct
to the consumer instead of relying on distribution alone" you have a starting
point and have narrowed things down from "how do we sell more things?". You
might not have one opinion, you might have five. It takes experience to come
up with opinions quickly when faced with limited data and potentially a large
problem space.

It's hypothesis-driven decision making. It does require both iteration and the
willingness to let your original opinions go--kill your darlings.

~~~
yodon
You may be interested in reading about "wicked problems" which are a well
studied limiting case of the situation you describe.

------
DoreenMichele
I was unaware of the origin story for this phrase. I've mostly seen it used to
duck actually arguing with people, a la this paragraph:

 _More generally, “strong opinions weakly held” is often a useful default
perspective to adopt in the face of any issue fraught with high levels of
uncertainty, whether one is venturing a forecast or not. Try it at a cocktail
party the next time a controversial topic comes up; it is an elegant way to
discover new insights — and duck that tedious bore who loudly knows nothing
but won’t change their mind!_

I think the original intent of "strong opinion" is _make a decisive conclusion
without hemming and hawing._ In other words, if it looks like a duck and
quacks like a duck, say "Well, I believe it's a duck" but allow for the
possibility that you are wrong and be willing to admit that _if evidence comes
forth showing you are wrong._

I don't operate that way. I have a high tolerance for ambiguity and I am more
comfortable with the answer "I don't know" than most people seem to be. Most
people seem to have a tremendous need to categorize things and I think this is
useful for such people: Go ahead, categorize it. Just don't be overly
committed to categorizing it. _Be willing to change your mind about it._

Most people fail at the "Be willing to change your mind about it" part.

Most people seem to use this phrase not as a rule of thumb for how to think
their way through something -- which can take work and it helps if you go
ahead and deal with whatever is in front of you and then take the next step --
but simply to deflect fightiness in online forums.

I'm happy to debate with people, but a lot of argumentation on the internet
isn't really intellectual debate trying to tease out the merits of an idea. It
gets personal. It gets ugly. It is actually fighting, not debating, and we
call both "argument" and do a poor job of distinguishing the two things.

So I have seen this phrase, but it was consistently used to basically say
"Don't @ me!" In other words, "I want to go ahead and speak my mind in public
to satisfy some need of mine, but I don't really want to deal with other
people not agreeing and all that. I just want to say a thing and that's it."

I think the original idea has some merit -- go ahead and state firmly what you
think it is _but be willing to change your mind_ \-- but that's not what most
people seem to use the phrase to mean. Not at all. And what it has come to
mean is pretty lame.

~~~
GlennS
The first time I saw this phrase was in some sort of "qualities of an
effective leader" type article, but targeted at techies.

Don't know if that's the real origin though.

Edit: I should've read the article first. The actual original is much less
tiresome.

------
recursivedoubts
I prefer "weak opinions, strongly held", as in, "I don't know. And I am pretty
sure you don't either."

Won't sell me any books, but it's been my experience.

~~~
barrkel
On the other hand, when faced with a fork in the road, it's important to
choose. Often it doesn't matter which - but if you vacillate, and especially
if you are leading - then you can fail by not choosing.

So, strong opinions: decisively go down one road; weakly held: turn around if
information suggests it looks like the wrong road.

You can only update your opinions if you engage with reality, which demands a
certain conviction in those opinions.

~~~
BeetleB
> On the other hand, when faced with a fork in the road, it's important to
> choose. Often it doesn't matter which - but if you vacillate, and especially
> if you are leading - then you can fail by not choosing.

The number of times people _think_ they are at such a fork where non-action is
worse than any action is an order of magnitude more than in reality.

In my experience, most of these cases are reflective of a social issue. The
problem at hand doesn't require one make a decision any time soon, but if you
don't, you are viewed negatively. So the social systems around you push you to
make a decision even when one is not needed.

Almost every time I've heard someone use the phrase "fence-sitter" it is this
scenario. At my work, we have surveys where we rank things on a 5 pt scale: 1
is strongly against, 5 is strongly in favor, and 3 is neutral. There's a
significant block of people who keep pushing HR to make it a 4 point scale so
there is no neutral option. I've had several discussions with them, and
they've never been able to give me a good reason, beyond statements like "You
have to have _some_ opinion!" and some negative comment about fence sitters.

But yes, for certain things (e.g. investing for retirement), it's probably
better to pick a safe option than not pick anything.

~~~
CarbyAu
Not sure where I heard it but: "No decision is a decision." I.e., decide to
put off your decision until later.

People agitating for everyone to "voice their opinion now" are often looking
to form a mandate from those opinions.

~~~
dsr_
In the lyrics of Rush's _Freewill_:

If you choose not to decide, you still have made a choice

~~~
cutemonster
Maybe it's important to be clear with the others that indeed one has made a
choice: to wait and learn more.

Rather than seeming uncertain, one can seem certain that there's not enough
data.

------
ZephyrBlu
> In my experience, ‘strong opinions, weakly held’ is difficult to put into
> practice

I 100% agree when interacting with other people, but I think it's still
valuable for your personal growth if you're intellectually honest with
yourself.

"How much are you willing to bet on that?" is definitely a smart question to
ask other people though.

------
owenversteeg
This is one of these HN posts where the article is excellent and the comments
are meh. Go read the article, everyone!

------
xtiansimon
> “Why does the framework not work very well?“

Framework? That’s just silly. It’s a turn of phrase that helps you change your
attitude or approach a problem from a new direction.

“If at first you don’t succeed...” I suck. I’m never going to try again! Haha.

I didn’t know the origin of the phrase, and that’s a testament to its
creativity.

As a life long learner, I found it useful in ways not related to the origin
story. It helps me to overcome imposter syndrome.

I know a little bit about a lot of things, and a lot about a very few things.
And I love solving problems with design thinking. Go for it. Feel confident
about your knowledge if you have done the work, but know there are others who
know more.

All that gets nicely summarized by Strong opinions, weakly held.

~~~
zbentley
> Framework? That’s just silly. It’s a turn of phrase that helps you change
> your attitude or approach a problem from a new direction.

While you're welcome to interpret it however you wish, one of the main points
of the article is that it _is_ a methodological framework laid out by Paul
Saffo[1], but that most people ignore the framework and focus on the
catchphrase.

1\. [https://www.saffo.com/02008/07/26/strong-opinions-weakly-
hel...](https://www.saffo.com/02008/07/26/strong-opinions-weakly-held/)

------
notacoward
> it is quite difficult for the human mind to vacillate between one strong
> opinion to another.

For the specific subset of people in tech, I don't believe this is true. How
many times have you and someone else had a strong disagreement about the cause
of a bug or design of a system, conclusively determined that one answer is
correct, and then had the other person come back (usually after a small delay)
acting as though they'd agreed with you all along? I've been seeing this for
over thirty years. Half the time, the other person even tries to claim they
came up with the idea on their own and everyone else was slow to pick it up.
Same thing is frequently evident right here. It's _easy_ for some people to
switch from one strong opinion to another, and the popularity of "strong
opinions weakly held" makes it even easier.

I also think that SOWH is behind a lot of cargo culting and conspiracy
theories. People want to get credit for being the _champion_ of an idea, even
if they don't fully understand it or it has low odds of being correct, and the
appeal increases with the challenge of convincing others. After all, if
they're wrong they can just switch sides and claim they'd been on the right
side all along. If they're right, it's an _epic_ victory (in their own minds
at least).

Strength of belief is not inherently virtuous. It should be proportional to
strength of evidence, not armor worn for the sake of a silly maxim. Strong
belief in SOWH itself is an example of faith over empiricism.

~~~
tomaskafka
The difficult thing is to change an opinion to which you attached your
identity.

Switching from Python to e.g. Node is easy as long as I don't see myself as
"the Python guy".

------
DarkWiiPlayer
I don't like the idea all that much. It seems close to how I operate
instinctively, but it somewhat neglects the coexistence of mutually exclusive
ideas, which, I believe, is rooted in the poor choice of the word "opinion".

The reality of human perception is that we never have absolute knowledge and
can only operate on a framework of assumptions; when two possible assumptions
are mutually exclusive, we often pick the most likely candidate and focus on
that scenario.

This seems to be a reasonable methodology throughout most of human evolution,
where decisions more often needed to be immediate.

In a modern world, though, a far more helpful mental model seems to be a
superposition of scenarios and adequate responses to any of them. In
conclusion, the only reasonable principle on which to make decisions would be
maximizing the probability of being adequately prepared for the outcome of a
situation. This can be simplified as "be prepared for as many likely outcomes
as possible" or, more precisely: act in such a way that maximizes the sum of
the probabilities of all the outcomes you are adequately prepared for.
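That last rule can be written compactly (my notation, not the commenter's): if $P_a$ is the set of outcomes you are adequately prepared for after taking action $a$, and $p(o)$ is the probability of outcome $o$, then choose

```latex
a^{*} = \arg\max_{a} \sum_{o \in P_{a}} p(o)
```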

EDIT: I probably should have read the entire article before writing all that;
just one paragraph after where I had stopped the author actually makes a very
similar point.

------
gnicholas
I understand what “weakly held” means, but what does the “strong opinions”
part mean? I couldn’t get this from the post.

What would be an example of a weak opinion and a related strong opinion? Is
this just the difference between “the 49ers will win most of their games this
year” and “the 49ers will win the Super Bowl this year”?

If so, why is the latter preferred?

Edit: thanks for the downvote, perhaps you can help me understand what is
meant by the term, or why you found my comment to be inappropriate?

~~~
renewiltord
Examples below.

Weak opinion: I cannot conclude anything about why my car won't start.

Strong opinion: I think my battery is dead and that's why my car won't start.

Strongly held: I tested the battery and voltage and current are within good
thresholds. I still believe what I believed before.

Weakly held: I tested the battery and voltage and current are within good
thresholds. I have adjusted my belief about what I believed before.

WOSH: I cannot conclude anything about why my car won't start. You've tested
my battery as dead. I still cannot conclude anything about why my car won't
start.

WOWH: I cannot conclude anything about why my car won't start. You've tested
my battery as dead. It could be the battery, maybe. Hard to say.

SOSH: I think my battery is dead and that's why my car won't start. You've
tested my battery as functioning fine. It's definitely the battery, though.

SOWH: I think my battery is dead and that's why my car won't start. You've
tested my battery as functioning fine. The reason why my car won't start is
not my battery; I think it won't start because my starter motor is broken.
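The four combinations above can be sketched as a toy belief-update rule (a made-up illustration of the taxonomy, not anything from the article; all names are invented):

```python
# Toy model: an agent holds a hypothesis about why the car won't start.
# "Weakly held" means a contradicting test triggers a decisive switch to
# the next strong hypothesis; "strongly held" means the belief survives
# the contrary evidence unchanged.

def next_hypothesis(current, test_contradicts, weakly_held, fallback):
    """Return the agent's belief after seeing a test result."""
    if test_contradicts and weakly_held:
        return fallback  # move decisively to the next strong guess (SOWH)
    return current       # either the test agreed, or the belief is immovable

# SOWH: the battery test fails, so switch to the starter motor hypothesis.
print(next_hypothesis("dead battery", True, True, "broken starter"))
# SOSH: same failed test, but the belief survives anyway.
print(next_hypothesis("dead battery", True, False, "broken starter"))
```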

Downvotes occur for all sorts of reasons. Please don't introduce noise into
the discussion. It's annoying to other readers for all sorts of reasons. I am
often tempted to downvote anyone who complains about them.

~~~
gnicholas
Thanks for the examples. Is it possible for a weak opinion to be something
other than "IDK why X is happening"? That is, can it be an affirmative opinion
about some causal relationship or future event?

Honestly, if someone asked me whether the WO example above was strong or weak,
I would tend to say it is strong because it is a categorical statement. To me,
a weak statement would be "I think my car won't start because there's either a
battery problem or a starter engine problem or a wiring problem". That doesn't
take a strong stand or eliminate many possibilities.

~~~
renewiltord
I think in the context of the original statement it was a heuristic for
searching for truth, i.e. "to know, form hypotheses then subject them to
falsification tests".

So yes, in the meaning of SOWH, all WO are ones that leave you unable to
progress. Your reading of 'categorical implying strong' is reasonable, imho,
in just the meaning of the word 'strong'. It's just not what I believe was the
intended meaning in the statement of SOWH.

~~~
gnicholas
Interesting, so the SO part of SOWH just means "have opinions that can be
validated"? That seems very weak, no pun intended. Would anyone go around
advocating for people to have WO, using that definition? I guess if that's
what it means, then SOWH is pretty vacuous.

Perhaps it's my background (law, economics, logic) but I assumed that "strong"
referred to "the strong form" of an argument or hypothesis (such as the
efficient market hypothesis). I never understood why someone would say it's
better to embrace strong forms of arguments, since these tend to be more
extreme (and IMO, things tend to be wrong more when they are taken to
extremes).

~~~
renewiltord
Despite your belief that it is vacuous, WO is a pretty standard mode of
operation for many. It usually manifests as "We just don't know enough. Let's
collect more information." or "There's not enough to go on. Let's wait it
out." etc.

i.e. unstructured search is more common than hypothesize-falsify, despite the
latter being established epistemology since the early 1900s or before.

------
jakeogh
There is an entire industry devoted to giving people strong opinions about
stuff they have not researched themselves.

They hide behind the false idea that people are unable to evaluate evidence:
they will chew it for us. As proof of this, they offer up endless anecdotes
of others believing things that the viewer is sure are either true or false.
The most effective 'bias hacks' are easily shown to be false; all that
matters is that the viewer believes that "other" people hold that opinion.

A real test is who is willing to discuss it without getting emotional and
derailing. The side that needs insulation does that for a reason.

Gaming confirmation bias is perhaps the 2nd most effective tool used to
manipulate mass psychology.

Propaganda and Manipulation: How mass media engineers and distorts our
perceptions
[https://www.youtube.com/watch?v=Pfo5gPG72KM](https://www.youtube.com/watch?v=Pfo5gPG72KM)

------
hn_throwaway_99
I perhaps haven't been using the phrase as originally intended, but I've
always used the phrase "strong opinions, loosely held" when it comes to hiring
and building a team, and I've found it incredibly useful.

The "strong opinions" part for me means hiring someone who not just has a lot
of experience on some topic or technology, but they understand it at a very
deep level. The strong opinions come from a place of perhaps having been
burned hard by a particular technology or process (or, contrarily, loving a
technology or tool for some reason), and being able to point out the 10 little
details that turn out to be big issues in real-world usage.

The "loosely held" part to me means being able to trust that (a) you don't
know everything in all situations and (b) most importantly you're willing to
really listen to other people on the team and are open to the idea that you
may be wrong.

~~~
brongondwana
We recently hired somebody who had this on their resume:

"I understand the wins and pitfalls of Agile methodology".

I mean, how can you go past that - for succinctly describing a pragmatic view
of the world and showing real hard-won battle scars at the same time!

------
jonahbenton
It's terrible as a meme, and terrible as a culture guidepost for a big group.

It is only effective in the context of a closed small trusting group making
decisions.

One needs to make decisions, often with laughably insufficient information. So
make one and watch carefully and be able to reverse if evidence tells you
differently. Only works if you already have a strong trust culture and
relatively equal power.

Using that policy as a leader of a larger group, with a necessarily weaker
trust culture and unequal power, fails terribly because it comes across as
capricious and irresponsible.

~~~
lambdatronics
Yeah, it can devolve into the Bryan Caplan syllogism:

1\. Something must be done

2\. This is something

3\. Therefore, this must be done.

------
jermier
I do this on Hacker News sometimes. I just state my opinion, however weakly
held, and wait for it to be torn apart, where I learn a lot and all my biases
are revealed to me. That's how learning works: you challenge your own
assumptions, or let your assumptions be challenged by others.

~~~
gklefnbkon
That does work to educate you, but has the side effect of distorting the
discussion here. For instance, I'll see a bunch of comments recommending a
particular approach for writing software. I might come away thinking most of
HN approves of that approach, and so there must be something to it. But what
if all those comments are just people trying out the idea, in hopes that other
people will tear it apart?

I don't think there's anything wrong with qualifying your opinions. You can
still get people to challenge you on them. Isn't that why people start their
sentences with "for the sake of argument" or "playing devil's advocate"?

~~~
cutemonster
What about adding "??" after such "show me how I'm wrong" statements? Ppl
would still reply I think, but fewer/no one would be fooled?

------
Dangeranger
I've found this is a phrase loved by those who enjoy argument for its own
sake, and who don't have a very deep understanding of a particular subject
matter.

~~~
chillacy
I like how the author put it:

> Saffo’s original idea is so quotable it has turned into a memetic
> phenomenon... ‘Strong Opinions, Weakly Held’ turns into ‘Strong Opinions,
> Justified Loudly, Until Evidence Indicates Otherwise, At Which Point You
> Invoke It To Protect Your Ass.’

------
skybrian
I think an approach of collecting questions rather than answers works well. If
you're really interested in a question then you should try to answer it, but
be wary of accepting the first answer you see or think of.

I have questions that I've thought about for years, and some tentative
hypotheses to go with them.

Inspired by: [http://kiriakakis.net/comics/mused/a-day-at-the-park](http://kiriakakis.net/comics/mused/a-day-at-the-park)

------
xref
Honestly I rarely remember the full details of why I chose a particular tech.
I did the deep research, picked what I believed to be the best tool,
implemented and deployed it and now that tool is my baseline.

If you asked me a couple years later “why the hell did you choose that?!” I
probably couldn’t give you many of the details that formed my opinion
originally, and couldn’t vociferously defend it by arguing minutiae of spec
sheets. But I do know that if you want to sell me on something new it’s gotta
beat that baseline tool, not whatever opinions I might have had at the time
of choosing.

~~~
modernerd
Architecture Decision Records (ADRs) are intended to help with that:

[https://adr.github.io/](https://adr.github.io/)

[https://github.com/joelparkerhenderson/architecture_decision...](https://github.com/joelparkerhenderson/architecture_decision_record)

[https://github.com/joelparkerhenderson/architecture_decision...](https://github.com/joelparkerhenderson/architecture_decision_record/blob/master/adr_template_by_michael_nygard.md)

They're overkill for small personal projects but can be helpful in big teams
because it won't just be you wondering, “why the hell did they choose that?”.
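For anyone who hasn't seen one, a minimal ADR loosely follows Michael Nygard's template linked above; the project and details below are invented for illustration:

```markdown
# 7. Use PostgreSQL for the orders service

## Status
Accepted

## Context
We need a datastore for order records. The team already operates
PostgreSQL, and the access patterns are relational and transactional.

## Decision
We will use PostgreSQL rather than adopting a new document store.

## Consequences
Operational burden stays low. Horizontal write scaling, if ever
needed, will require additional work (e.g. partitioning).
```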

~~~
xref
This is great, thanks for the links!

------
heisenbit
For me the „strong opinion weakly held“ is a way of mental prototyping:

> Allow your intuition to guide you to a conclusion, no matter how imperfect —
> this is the “strong opinion” part. Then – and this is the “weakly held” part
> – prove yourself wrong. Engage in creative doubt. Look for information that
> doesn’t fit, or indicators pointing in an entirely different direction.
> Eventually your intuition will kick in and a new hypothesis will emerge out
> of the rubble, ready to be ruthlessly torn apart once again. You will be
> surprised by how quickly the sequence of faulty forecasts will deliver you
> to a useful result.

The author of the blog post, however, finds that it fails for guiding
investments gradually, breaking down when new information is discovered along
the way. But that isn't its purpose. It isn't meant to guide small day-to-day
adjustments; its purpose is to explore a poorly known landscape for strategic
decision making, gathering enough solid, non-trivial insight to make a
well-founded strategic decision. There are better tools for operational
management once one has committed to a direction.

Strong opinions often help me escape analysis paralysis. They also help me
surface my premature judgements and transcend them.

------
bena
I feel like a lot of these types of adages come back to try to convince humans
to do one thing: Be Wrong.

And I don't mean, deliberately make incorrect decisions, I mean, allow
yourself to have made a mistake.

Some people get the reputation that they act like they know everything. You
can even see some of those accusations creep up in various threads on this
topic.

There are two major types of people who get that reputation. People who will
never admit they're wrong and people who will admit they're wrong so fast, you
don't even notice they've changed their stance.

The first type of people think the second type is just doing what they're
doing. But no, the second type is more than willing to be wrong. They know
that it's going to happen. It goes back to Socrates, knowing you know nothing.
Accepting that your knowledge is incomplete. There are no stakes for being
wrong if you don't put them up. The second type are also the type to not worry
about laying blame about who is wrong. They just want to correct the mistake.

Be more concerned about _what_ is wrong rather than _who_ is wrong.

------
naveedn
A great counterpoint to this argument comes from Allen Holub in his talk
“#NoEstimates”.

The system of making bets does not translate well when communicated to
business folks and managers — which is why it’s largely non-existent for story
pointing in agile environments. People are overly optimistic; they hear 80%
and think 100%. Doing percentages isn’t foolproof either.

------
tunesmith
I've thought along these lines recently and I think a better formulation for
clear communication of opinions is:

1) State a complete argument with premises, reasoning, and conclusion

2) State whether you believe the argument to be true

3) Check your stated reasoning to the point that you believe it to be
logically valid

4) Check your stated premises to the point that you believe them to be true.

I believe that by sharing our premises and our reasoning, we are not only
offering some humility, but we're also inviting respectful engagement. We
express accountability by stating our beliefs. That way, if any of those are
challenged, it means they're being challenged on the merits and it helps us
update our own point of view.

By stating an argument in that manner, that can be compared to stating it
"strongly", even if you are also welcoming input on the truth of your premises
or the validity of your reasoning, which can be compared to "weakly held".

------
ChrisMarshallNY
This reminds me of this old (full-fat clickbait) article from Cracked:
[https://www.cracked.com/article_19468_5-logical-fallacies-that-make-you-wrong-more-than-you-think.html](https://www.cracked.com/article_19468_5-logical-fallacies-that-make-you-wrong-more-than-you-think.html)

------
dwd
It unfortunately doesn't work because people in general don't get that
knowledge progresses and that what was previously considered true may no
longer be - and that is how it should work, because we learn and grow as a
civilisation.[1]

But people don't accept that what they were told previously is no longer
correct. A good example is the current arguments around mask wearing. The
original advice not to wear masks has stuck, and a lot of people think either
you lied originally or you don't know what you're talking about - ideas made
worse by being promoted for political gain.

[1] [https://en.m.wikipedia.org/wiki/Half-life_of_knowledge](https://en.m.wikipedia.org/wiki/Half-life_of_knowledge)

~~~
jakeogh
Why not wear the masks forever?

~~~
dwd
In places like Japan you are expected to wear a mask if you're sick, so
there's no reason that shouldn't become standard in the West.

More important could be that workplaces require staff to either stay home and
not infect the office when they have the flu, or wear a mask if they have to
go in.

~~~
jakeogh
Nobody is suggesting people go to work when they are sick. That's not new to
the West; odd to suggest it is.

Fake masks:

[https://stpauls.vxcommunity.com/Issue/Us-Experiment-On-Infants-Withholding-Affection/13213](https://stpauls.vxcommunity.com/Issue/Us-Experiment-On-Infants-Withholding-Affection/13213)

~~~
dwd
If you're a casual employee and don't have any sort of paid sick leave you
either turn up or forgo the money (and potential future work). There are also
those who pride themselves on never taking sick leave and "soldiering on", but
invariably infect their coworkers. Here's some survey stats:

[http://www.nsf.org/newsroom_pdf/Flu_in_the_workplace_(final)...](http://www.nsf.org/newsroom_pdf/Flu_in_the_workplace_\(final\).pdf)

------
dack
While I want to believe this post, I also wonder how long they have been doing
the new technique (did I miss them mentioning that?). I ask because I wonder
if "percentage confidence" is now the hard part, since humans are terrible at
estimating such things.

------
asveikau
> For instance, Steve Jobs was famous for arguing against one position or
> another, only to decide that you were right, and then come back a month
> later holding exactly your opinion, as if it were his all along.

Unrelated to the larger theme of the post, but I absolutely hate it when
people do this. I thought it sounded like a silly parable until I saw people
do it in real life. At which point I concluded it was some kind of
characteristic of narcissism.

To this type of person, an idea isn't good until it's theirs. They will trash
your idea, and trash you. When your idea seems convenient they will take
credit and fail to cite you as an influence. That's ok, one can suppose, if
not for the next part. Likely they have forgotten you gave them that idea in
order to help them. They still think you are trash. They will have no qualms
trashing your next idea with personal attacks just like the last one.

~~~
ofrzeta
I have this "theory" that one trait of successful people is forgetfulness. In
my experience they tend to forget their mistakes and also forget the source of
their successful ideas.

~~~
RealityVoid
I tend to suffer from this, only in the _other_ direction: I forget when I
give something, add something, or help with something, and I forget conflicts
with people and slights really easily. My mistakes and failures I remember
vividly. Does that set me up for anti-success, I wonder?

~~~
mercer
It can be a recipe for serious anxiety issues, I suppose.

------
karmakaze
The best way I can see this play out is with development. At the start you
might have an intuition, but there could be several good contenders for
direction. Instead of analyzing them all in talk or on paper, pick one, any
one, and move forward as if you'd picked the right one. Along the way, look
for signs that perhaps this isn't the best or even second-best choice. Keep
going anyway until you have a clear idea of what's wrong, at which point you
should know which direction is better and what to change to go that way.

It closely follows "ship early, ship often". These are not hard things to do;
it just takes some practice and honesty. Just don't tie any personal stakes to
the initial direction.

------
OliverJones
Of course, "strong opinions weakly held" is a refactoring of the scientific
approach to knowledge. Karl Popper taught that scientific hypotheses
necessarily must be disprovable. Hypotheses that aren't disprovable
(falsifiable) are useless in science.

Insisting on falsifiable hypotheses is a difficult personal discipline, not to
mention collective discipline. "Nothing is true unless it might be false!"
Wait, what? How can a company make decisions based on that sort of
epistemology? How can a government set policy? It's much easier to make
decisions when we have the illusion we're sure about the facts.

Maybe the only sensible path is to hold all our opinions, personal and group,
lightly.

------
kerkeslager
TL;DR: Strong opinions, weakly held doesn't work because of Anchoring Bias[1].

Obviously if you fall prey to anchoring bias, you're doing the "weakly held"
part wrong, but I think that _almost everybody_ does this wrong, even people
who know about anchoring bias and do their best to guard against it. I've
known about anchoring bias for maybe a decade, recognized it in my own
beliefs, taken steps to attempt to address it (i.e. meditation and reading
opposing opinions to my own) and while I'll give myself the credit that maybe
I'm a bit better than an average person at changing my own beliefs now, I'm
still objectively very bad at it, frequently coming across places where it's
clear looking back that I was wrong for years due to anchoring bias. It's much
easier, I think, to permeate everything in my own belief system with a
fundamental level of doubt, and only form really strong opinions with
overwhelming evidence. But even that is only somewhat effective. Anchoring
bias is a pretty powerful foe.

[1]
[https://en.wikipedia.org/wiki/Anchoring_(cognitive_bias)](https://en.wikipedia.org/wiki/Anchoring_\(cognitive_bias\))

------
klagermkii
I think it's difficult to approach a problem with a Depth-first Search
mentality without using "Strong Opinions, Weakly Held". If one goes in with
"weak opinions" it's easy to find oneself constantly backtracking and checking
the other nodes early on in the chain of assumptions and doing a mental
Breadth-first Search instead.

That's not to say that it's the right approach for every situation, but for
problems that will have a lot of dependent unresolved assumptions "Strong
Opinions, Weakly Held" is sometimes necessary to maintain focus to break
through the problem.

------
gridlockd
What I like to do is just put out provocative views that I don't necessarily
hold, just to see what happens.

If there's something epistemologically wrong with them, some smart person will
probably correct you, for free.

If there's something emotionally wrong with them, some indignant person will
probably chastise you.

Either dimension is important.

Message boards are good places to do this. Always keep in mind:

"Communication occurs only between equals"

[https://en.wikipedia.org/wiki/Celine%27s_laws#Celine's_Secon...](https://en.wikipedia.org/wiki/Celine%27s_laws#Celine's_Second_Law)

------
yellowstuff
I think you can draw an analogy between probability and counting. Negative
numbers seem weird and very different from positive numbers, but for many
applications 0 is not a particularly interesting boundary, and you benefit
from not treating negative and positive numbers as two different cases. EG, I
could owe you $2 or -$4.

In probability, 50% feels like it should be a special boundary. But often
there's a lot of benefit to not treating it that way, and treating a 49%
belief more like a 51% belief than like a 10% belief.

It sounds trivial written out, but how often do people behave that way?

------
kazinator
The fix:

> _Use Probability as an Expression of Confidence_

If the probability is just some made-up number reflecting a hunch, it's no
better at telling us when to take an action or drop an opinion. It's just a
way of talking about how weakly we are holding the opinion, not the real deal
from statistics.

The concept still provides no concrete framework for assigning the
probability, and for taking action. E.g. do you abort the plan when the
probability of success feels like it has dropped to 65%, or does it have to
feel like 35%?

~~~
qznc
You just have to compute the expected value. Let's say we play a game where
you have to invest $10.

If you have a 65% chance to win and winning means you get $2 on top of your
$10, then abort the plan. The expected value is $12×65% = $7.80 which is less
than your investment.

If you have a 35% chance to win and winning means you get $30 on top of your
$10, then continue. An expected value of $40×35% = $14 is worth it.

In reality, the tricky part is often assigning the numbers (How much does it
cost to switch to Rust? How many errors does the stricter type system catch?),
but the framework is available.

Ok, also look up the Kelly criterion because you can go bankrupt even with a
high expected value.
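The arithmetic above, plus the Kelly criterion, can be sketched in a few lines. The numbers are the ones from this comment; the helper names and the Kelly framing are mine, purely for illustration:

```python
def expected_value(p_win: float, payout: float) -> float:
    """Expected amount returned by a bet that pays `payout` on a win."""
    return p_win * payout

def kelly_fraction(p_win: float, net_odds: float) -> float:
    """Kelly-optimal fraction of bankroll to stake: f* = p - (1 - p) / b."""
    return p_win - (1 - p_win) / net_odds

stake = 10.0

# Game 1: 65% chance to get $2 on top of your $10, i.e. $12 back.
ev1 = expected_value(0.65, stake + 2)   # 7.80 < 10.00, so abort
# Game 2: 35% chance to get $30 on top of your $10, i.e. $40 back.
ev2 = expected_value(0.35, stake + 30)  # 14.00 > 10.00, so continue

# Kelly says even a positive-EV bet shouldn't take your whole bankroll.
# Game 2 has net odds b = 30/10 = 3.
f_star = kelly_fraction(0.35, 3.0)      # ~0.13: stake about 13% of bankroll
```

The comparison against `stake` is the abort/continue rule from the comment; the Kelly fraction is the extra guard against going bankrupt on a high-EV bet.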

------
oh_sigh
This viewpoint leads to blog posts like "Why you MUST do X to be a good
engineer (2015)", and then "Why I was wrong to say you MUST do X to be a good
engineer (2019)"

------
rdiddly
It has always been a terrible name for "take a guess, then try to come up with
a better guess." It's not rocket science. Although rocket scientists probably
use it.

------
huffmsa
On the contrary, I've found that one of the worst things you can do when
demonstrating or deploying a machine learning model is to show the underlying
prediction confidences.

You can be right 95% of the time, but critics will remember a single incorrect
but high-confidence error and fault the whole system for it.

Which would be reasonable if that erroneous prediction had fat-tail
consequences, but in the cases where I've had this happen, it has not.

------
InternetPerson
Paul Saffo writes, "I will force myself to make a tentative forecast based on
the information available, and then systematically tear it apart."

That's similar in spirit to scientific method: Formulate a hypothesis and then
set out to disprove it. (Overly simplified, of course.)

So, there's definitely something here. But in my experience, the phrase
"Strong Opinions, Weakly Held" is mostly just used to excuse bad behaviors.

~~~
thisrod
> That's similar in spirit to scientific method

Yes! I think that science also demonstrates how Bayesianism, the author's
proposed remedy, misses the point somewhat.

What probability do I put on my belief in quantum mechanics? Zero! It has to
be wrong, because it doesn't account for gravity. On the other hand, I'd bet
on it for every question it can answer.

Scientific theories are the strongest opinions of all. If you can imagine how
the laws of physics, the antiquity of Earth, or natural selection could
possibly be wrong, you don't understand them. But, historically, those
theories replaced previous theories, which were strongly accepted in their
day, and did turn out to be wrong.

There is no way to imagine how some of these theories could possibly be wrong,
yet it's almost certain that some of them are. This is a deeper kind of
uncertainty than probability can describe.

------
oxfordmale
This hits the nail on the head:

In simpler terms, ‘strong opinions, weakly held’ sometimes becomes a license
to hold on to a bad opinion strongly, with downside protection, against the
spirit and intent of Saffo’s original framework.

Beliefs are very hard to change, as the human brain has a perception bias in
favour of currently held beliefs. Data-driven goals are a much more objective
way to test your beliefs and drive business decisions.

------
cel1ne
"Strong opinions, weakly held" makes sense in exactly one situation: when you
are definitely an expert in a topic, you should have strong opinions about it.

But since everybody is sometimes wrong about something (though that should be
very seldom in the expert case), you should be able to let go of one of your
opinions when you get counter-arguments.

------
hugey010
I've always been this kind of person, opinionated but backed by reasons, and
most people find it abrasive. There's nothing I like more than having someone
change my opinion, because that means they taught me something. The catch is,
most people are either unwilling to teach such an opinionated person, or have
nothing of value to teach.

------
rezeroed
Ah, good. I've always thought this saying to be utterly stupid. I was geared
up for a rant, but now I see from the article that the guy who came up with it
has done a poor job of distilling his approach into a catchy saying. His
approach is basically agile; his catchy saying sounds more like bluster and
bravado until you look stupid.

------
dwighttk
There is a sort of financial epistemology that a lot of people like because it
tells them they are successful and therefore right. It also allows them to
talk about testing hypotheses and get results in hard numbers. And it even
works as far as it does, but I’m more and more skeptical that it explains
everything.

------
082349872349872
"Skate to where the puck is going to be" is an example of SOWH working well. A
weak opinion (waiting to get clearer data) would result in skating to where
the puck has been. A strongly held opinion would result in skating to where
the puck isn't going at all, and has never been.

------
douglaswlance
To answer the question of "when should you change your opinion?":

All new data should change opinions, even if it's a small piece of data that
affects your opinion only on the margins, it is still a change toward an
opinion that will better map onto reality.

~~~
alexpetralia
Bayesian updating!
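The parent's rule, that every piece of data should shift an opinion at least on the margins, is exactly what Bayes' rule does. A minimal sketch (the likelihood numbers here are invented purely for illustration):

```python
def bayes_update(prior: float, p_evidence_if_true: float,
                 p_evidence_if_false: float) -> float:
    """Posterior P(H | E) via Bayes' rule."""
    numerator = p_evidence_if_true * prior
    marginal = numerator + p_evidence_if_false * (1 - prior)
    return numerator / marginal

belief = 0.5  # start undecided about some hypothesis
# Three weak but favourable observations, each twice as likely if the
# hypothesis is true (0.6) as if it is false (0.3):
for _ in range(3):
    belief = bayes_update(belief, 0.6, 0.3)
# Each small observation nudged the belief; none was decisive on its own.
```

Even marginal evidence moves the posterior a little, which is the point: there is no threshold below which new data is allowed to change nothing.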

------
karlkatzke
> It is quite difficult for the human mind to vacillate between one strong
> opinion to another

The author has apparently never met someone on the autistic spectrum with ADHD
and who has enough life experience to know how to be wrong gracefully.

~~~
smegma2
Not sure what you're getting at here, but this sounds like a rare occurrence,
i.e. difficult.

~~~
anarchop
You are obviously completely wrong. But on the other hand...

------
renewiltord
Ultimately, truth-seeking is a personal thing and you should tune your methods
to the machine that is you.

tl;dr SOWH is good. Don't claim it for yourself. Sloppy Priors are good. Be
careful of not adjusting on evidence. The bet trick is very good. Be careful
about your utility function.

Personally, among the people I know the following doesn't happen:

> _In such cases, the failure mode is that ‘Strong Opinions, Weakly Held’
> turns into ‘Strong Opinions, Justified Loudly, Until Evidence Indicates
> Otherwise, At Which Point You Invoke It To Protect Your Ass.’_

And that is mostly because it's really easy to claim you're operating in SOWH
and really hard to determine if you actually are. So most truth-seekers apply
the same techniques of epistemology to it as they do to other things that
require auto-evaluation with no objective truth set: you trust weighted
external input more than your own. Am I charismatic? Am I intelligent? Am I
whiny? It is hard for me to say. It is far easier for me to find that out from
people I can identify as trusted on the subject.

I remain convinced that operating in SOWH is empowering (at an 80% certainty,
har har). I believe that the Bet Trick and the Sloppy Prior Trick are both
clever techniques to improve your search for truth as well.

The Bet Trick is very good. My personal danger is that I do not apply linear
utility (i.e. an unlikely win makes me much happier than a string of losses).
So if I predicted that the US would be open by August and I'm right that makes
me way happier than if I predict repeatedly that median summer temperature is
greater than median winter temperature in Saskatoon. I'm okay losing $200
repeatedly on the former for the win.

The Sloppy Prior also has the trick that it allows you to examine your
reaction to evidence. Your Sloppy Posterior to the weakest of evidence must be
different from your Sloppy Prior! If it isn't, your cognition is currently
failing you. The only problem is that the sloppiness gives you room to avoid
having different posteriors.

That last part I find very hard in almost-certain and almost-never situations:
if a set of instrument measurements show that temperatures across the Earth
are the same as they were 40 years ago, then it's highly likely that the
measurements are broken, but it is not certain, so my posteriors for AGW given
that evidence should drop. But they don't unless I am conscious of this.

------
CarbyAu
"Strong Opinions, Weakly Held" \- I missed this being a thing. Seems like
terrible wording for something that already had English words.

Science. Devil's advocate. (Although that is not obviously a better phrase...)

------
hejja
the truth is probably more nuanced

objectively considering information is further than most people ever get. most
people are just out here trying to win arguments

so let's assume you're open minded

you could weave components of this new info into what you already know, and
discard the rest

but what do you keep?

does the most convincing empiricism trump everything? are personal values
involved?

that's up to you

------
physicsguy
I think that 'Strong Opinions, Weakly Held' often makes people intolerable to
work with, to be honest.

------
JohnBooty

        More generally, “strong opinions weakly held” is 
        often a useful default perspective to adopt [...]
        Try it at a cocktail party the next time a 
        controversial topic comes up
    

Ah, yes. We've all met this guy at parties.

"[Technology X] is the worst thing ever and people who use it are setting the
industry back by ten years. Prove me wrong!"

Don't be that guy.

The guy who's even worse than a regular blowhard: the blowhard that doesn't
even necessarily believe what he's saying; he's just trying to stir up debate.

The social equivalent of a message board troll, essentially.

If you want to incorporate "strong opinions, weakly held" into your own
internal decision-making and opinion-forming process, cool. Not a bad way to
go. It's also appropriate in many contexts with others - i.e. spitballing
sessions where everybody is tossing out possible solutions to a problem
they're trying to solve, or hell, maybe a drinking session with friends where
you're all three or four drinks deep and shooting the shit, being silly.

But at "cocktail parties" or other casual social situations like hallway or
lunch table conversations? _Yeeesh._ Don't be that person, where you
obnoxiously dump your "strong opinions" onto others. A lot of people don't
enjoy debate for debate's sake. (I actually tend to enjoy it - but many do
not.)

------
legerdemain
In my experience, "strong opinions weakly held" is a style of workforce
management, not a thinking style. In brief:

\- Project total confidence to your employees. At any given moment, your
course of action is the best one available.

\- Announce changes instead of proposing them. Announce them at the last
possible moment, or even retroactively.

\- Never mention past iterations you have discarded. Now is now.

------
skrebbel
tl;dr: it doesn't work that well because it's difficult.

To be honest this feels a bit to me like saying that cryptography doesn't work
that well because it's difficult. The article itself even has _examples_ of
people for whom it has worked well (e.g. Steve Jobs).

I recognize the argument that there are people who don't realize they're bad
at it and still quote the "strong opinions, weakly held" idea, but that means
it's misused, not that it doesn't work well.

Again, the crypto analogy holds up. We all know some examples of crypto rolled
by people who didn't realize it was too difficult for them. But that doesn't
mean crypto doesn't work.

~~~
robertlagrant
> To be honest this feels a bit to me like saying that cryptography doesn't
> work that well because it's difficult. The article itself even has examples
> of people for which it has worked well (eg Steve Jobs).

No, the article specifically does not say that. It says the principle doesn't
generalise well across people, for exactly the same reason cryptography
doesn't. We have "don't roll your own crypto"; this article is the equivalent
for "strong opinions weakly held".

------
Godel_unicode
> ...make a tentative forecast based on the information available, and then
> systematically tear it apart, using the insights gained to guide my search
> for further indicators and information...

This is literally the scientific method. Come up with a falsifiable premise,
then attempt to falsify it. Arguing that science is hard is pretty irrelevant.
Most of the failings present in this article are really just evidence of not
doing the second half. You should be looking for evidence that you're wrong,
not evidence that you're right.

This is later borne out by the proposal of adding confidence intervals and
dates by which assumptions should be robust against being disproven. You'll
find this type of language on experimental design in lots of scientific
fields.

This article is about someone's journey to rediscover the thing their source
already knew (using scientific rigor to develop a hypothesis into a theory,
gradually bringing you nearer an understanding of the underlying truth), and
then call it something else because the original distillation of the idea into
a soundbite flew past them.

~~~
godelski
The article was strange because the author says the method doesn't work
because people don't hold opinions weakly, i.e. don't update them upon new
evidence, and then dives into questioning what new evidence even means. I've
never heard the phrase "strong opinions, weakly held", but during my education
and work I've always been told: "Stick to your guns and be confident. You came
to your conclusions for a reason. But keep your mind open, because you can
never have the full picture."

As I see it, if the method is wrong because people don't use it correctly,
then don't throw out the method; update it to be more usable. Which is exactly
the systematic tearing-apart and rebuilding process that the author is
actually engaging in himself. Adversaries to an idea are a good thing.

------
kgwgk
> It is quite difficult for the human mind to vacillate between one strong
> opinion to another

The author has apparently never met a Trump supporter.

------
godelski
This is such a strange article. The tl;dr is "strong opinions weakly held
doesn't work because people don't do it." Then the author states some pretty
strong opinions about how that's not how the brain works. I also believe that
he's talking about two different ideas.

The first idea: having strong opinions but they aren't held strongly.

I have never read Saffo's work or even heard of these people, but the tactic
described here is something I use frequently; it has been very successful for
me as a scientist, and it is what many other scientists I know do. The key
issue in his complaint is that people do not hold their ideas weakly enough.

Here's how I see it. Every opinion I have is just wrong. You cannot have all
the data and all the relevant facts, so no matter what conclusion you make it
is incomplete. The question is just "how wrong." If it is a little wrong, no
worries, but if it is a lot wrong, big worries. The strength of your opinion
should be proportional to this evidence. Essentially we're all making Fermi
estimates [0], but as time goes on they become better and better. But that doesn't
mean you aren't possibly missing some piece of key information. So you should
always be open to changing your opinion. You can also hold an opinion strongly
but change it with updating evidence. If you don't update your opinion to
account for new evidence you are just a bad and stubborn scientist.

Second idea: method for developing good ideas

The second idea is about building up, tearing down, and repeating the process.
Teams I've been on have used this method successfully to develop new theories
and new products. You don't have to account for everything, but it provides a
good baseline. This kind of model development really only belongs in the
initial stages. You use it to figure out what to test and probe. Before you
start a million-dollar experiment you'd better have some good ideas and
explanations for why you're doing what you're doing. This is essentially
creating a red team and a blue team. You can do this as a group or
individually (harder, because you have to accept cognitive dissonance). This
adversarial process can be highly successful in reaching good conclusions
(hell, it's analogous to what a GAN does). The issue arises when someone is
really stubborn about their conclusions. But the big reason this works is
that by the time you're submitting a proposal you've already answered
basically any question anyone can ask of you. This is because a reviewer
SHOULD be trying to find reasons to reject your proposal, because you don't
want to waste money.

So here's how this works in the real world: I develop opinions based on the
evidence that I have. I stick to my guns because I didn't form these opinions
willy-nilly, even amid a lot of self-doubt (focus on the adversarial benefit,
and it is okay to be wrong). And, the key part: when someone presents new and
compelling evidence, you update your model. But it is perfectly acceptable to
determine that this evidence is irrelevant or an outlier. I know this is hard
for many people, but it isn't that hard if you just accept the relativity of
wrong [1] as a fundamental principle. In my undergrad, studying experimental
physics, it is hammered into you to account for the error of your measuring
tools. The next logical conclusion is to account for the error of your most
important measuring tool: you. If you accept that you aren't perfect and
can't have perfect knowledge (i.e. "the map is not the territory"), this is
not that hard. But then again, I'm considered weird, so I'm completely open to
being wrong.

[0]
[https://en.wikipedia.org/wiki/Fermi_problem](https://en.wikipedia.org/wiki/Fermi_problem)

[1]
[https://chem.tufts.edu/AnswersInScience/RelativityofWrong.ht...](https://chem.tufts.edu/AnswersInScience/RelativityofWrong.htm)

------
aeternum
The article is spot on. As humans it is incredibly difficult to avoid being
committed to your decision, a big problem for 'strong opinions, weakly held'.
I think Elon's advice is much better and I try to exercise it regularly:
"Assume you are wrong. Your goal is to be less wrong."

~~~
oska
How's that going with Elon's pronouncements on Covid-19? I only saw lots of
him assuming he was right against clearly better informed medical opinion.

~~~
jakeogh
Excellent really (Table 1, col 4):
[https://www.cdc.gov/nchs/nvss/vsrr/covid19/index.htm](https://www.cdc.gov/nchs/nvss/vsrr/covid19/index.htm)

CDC Influenza and pneumonia deaths by influenza season and age: United States,
2008–2015: [https://www.cdc.gov/nchs/data/health_policy/influenza-and-
pn...](https://www.cdc.gov/nchs/data/health_policy/influenza-and-pneumonia-
deaths-2008-2015.pdf) (these are not estimates, see the footnote)

    
    
      2015-2016
      Flu: 7,961
      Pneumonia: 131,858 
      All: 1,769,940

------
gklefnbkon
"Strong opinions, weakly held" sounds reasonable in theory. In practice it
means: always act like you're right, until you change your mind, then pretend
you held the other view all along. At least that's how I see it used on this
site, whenever people praise Steve Jobs, Linus Torvalds, etc., for being
dismissive and rude.

Yes, leaders have to make bold choices, and they can't waffle. But that
doesn't require you to be rude to others, or to act like you're always right.
It doesn't require you to take a confrontational style to discussion, where
you assert your beliefs loudly and expect others to fight you on it.

Can we just admit that for some people on this site, technology and
entrepreneurship plays into a power fantasy?

