
Ask HN: When have you taken a decision in code outside your domain of expertise? - mprev
I'm writing a book about the role of software developers in the global economy.

One of the book's themes is that developers hold a strange kind of power: we get to make decisions in code that affect end-users, but only other developers (and sometimes not even then) can really hold that code to account before it goes into production. Seemingly mundane decisions in code can have profound consequences.

I'm gathering stories from people who've had to take decisions like this, especially where it was in a domain in which they had no experience.

I'd love to hear from people on HN who have stories to share. I'm also interested in hearing from people who dispute that this is even a thing.
======
mattlondon
This happens all the time, I've found, particularly in "agile" processes. You
are coding away implementing something and realise there is some corner case
or edge condition that was not considered in the original design and/or UX
spec. Stuff that only becomes obvious once you are staring at the code you've
just written and are thinking "What should we do if this is null?"

 _So you unilaterally implement something to handle that condition._

Especially in agile projects, with tight deadlines and an ethos of continual
refactoring, rather than block further work on that feature while you wait
for the product owner/business analysts/UX team/etc. to come up with an answer
and get back to you, you check in your "best guess" implementation and move
on, with a TODO or bug left open to revisit it.
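
In code, that unilateral "best guess" often looks something like this (a
hypothetical sketch; the function, ticket, and field names are all invented):

    def format_shipping_address(user: dict) -> str:
        # Edge case the spec never covered: what if there's no shipping address?
        # TODO(BUG-1234): confirm the desired behaviour with the product owner.
        # Best guess for now: fall back to the billing address, else empty string.
        return user.get("shipping_address") or user.get("billing_address") or ""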

A lot of the time (maybe 75%+ in my experience) the developer's instinct tends
to hang around as the final solution, even if the developer is not an expert
in the context of the users of the application they are writing. Such
expertise is rare in my experience: generally developers are developers, and
not likely to coincidentally be experts in the subject area of the
application's use cases unless it is some niche area. People writing software
for surgery robots are probably unlikely to be expert surgeons too, I would
imagine? Not impossible, but I'd want people to be an expert in surgery-robot
programming, or an expert in surgery, and not a half-arsed,
kinda-ok-done-a-bit-before level of skill in either area!

~~~
lettergram
I had an interesting case a few years back... I was supposed to follow the
spec exactly. However, in all the UI/UX diagrams my team and I were given,
there were no "close window" or "back" buttons. This included things like
pop-up notifications. I went ahead and implemented them anyway, then sent them
to the design team to ask if I should remove them (it took under an hour of
work). I was later reprimanded for not following the spec, BUT they kept my
implementation and design.

This is one experience that sticks out because I was reprimanded. However, I'd
say on every project I've been on, at least 10% of the final product was
designed & implemented by engineers on the spot.

Beyond UI design decisions, engineering can have a major impact on UX. Every
single function can make or break the experience. That's why we, as engineers,
wield a lot of power.

~~~
Faaak
I would've gone against them: either you reprimand me and we take out the
feature, or we keep it and then you thank me.

~~~
PretzelFisch
At that point it's best to find a new employer. It sounds like there was some
weird ego game being played.

------
unnouinceput
My power? Total access to the entire VISA and Mastercard databases of
real-life citizens' credit and debit cards. The US company I wrote an entire
solution for (not just a simple application) also had a part where processing
payments was a requirement: read the user's CC, take his credentials, put it
in the database, start the transaction pre-authorization process and later the
finalization. So to protect the user data from prying eyes, I encrypted the CC
data that was read by the magnetic card reader. But for the proof of concept I
used a simple encryption scheme, which was never meant to be used in
production. Countless mails were exchanged between me and the manager
regarding the encryption scheme, to upgrade to a modern one, like once per
month. Nevertheless, this weak encryption entered production despite my many,
countless by now, warnings. Eventually things fell apart between the upper
management of the company and my manager, and a civil suit ensued. In the end
the FBI was involved too, and I had to write an affidavit for them regarding
this. I offered all my mail exchanges to them, which proved that, while I was
not a US citizen, I had more privacy concerns than the usual US citizen and
businessman. Dunno what happened in the end, as I exited the project around
2014, but looking at their site it seems that my code is still in production.
Talking about why so many holes and security failures happen: I know
first-hand how "careful" the average US manager is with sensitive data.
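
To illustrate the gap (this is not the actual scheme from that project, just a
sketch): a proof-of-concept "encryption" versus the kind of authenticated
encryption that should replace it before production.

    import base64
    from cryptography.fernet import Fernet  # third-party: pip install cryptography

    def poc_scramble(card_number: str) -> bytes:
        # Proof of concept ONLY: base64 is encoding, not encryption.
        # Anyone with database access can reverse it instantly.
        return base64.b64encode(card_number.encode())

    # What should ship instead: authenticated encryption, with the key
    # held in a key-management system rather than in code or the database.
    key = Fernet.generate_key()
    fernet = Fernet(key)
    token = fernet.encrypt(b"4111111111111111")
    assert fernet.decrypt(token) == b"4111111111111111"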

~~~
epberry
How often does an average US manager deal with a civil suit and the FBI?

~~~
unnouinceput
No idea. This was my only such experience in 10+ years of freelancing.

------
kstenerud
The rule of thumb is: Describe the problem, ask for clarification, offer a
default solution.

Most often, people just won't be interested in the problem, and will ignore
you, hoping you and the problem go away, at which point your default solution
wins and is documented. But every now and then, it'll set alarm bells ringing
up and down the chain of command, and THAT is why you bring it up.

Of course it's also important to get a feel for what kinds of issues should be
brought up, and what issues should just be quietly solved. This skill is half-
technical, half-political, and comes with experience.

~~~
TravHatesMe
> and what issues should just be quietly solved

As a pedant, it sometimes makes me uncomfortable to make these decisions
"quietly".

While I wholeheartedly agree that this is an important skill for all
developers, it raises the question: why should the dev be responsible for
deciding when an issue needs to be brought up? Why should the dev be making
these quiet decisions? Is this not evidence of incomplete requirements?

There should be protocol here instead of relying on the dev's subjective sense
and opinion.

Generally speaking, I feel like this goes outside the bounds of the
developer's responsibility. Not every dev has honed this skill; it could be
dangerous. I would choose to err on the side of caution and apply your rule of
thumb above in almost all situations.

~~~
kstenerud
Yes, it's not ideal, it could be dangerous, someone could launch nuclear
missiles by mistake.

But the reality is that we live in an imperfect world, where imperfect things
can and do happen all the time, and we have to deal with them. You can't have
a rule for everything; when you do, nothing can get done because the rule
makers can't anticipate everything, and often get the things they DID think
about wrong (rule making is remarkably similar to program design, with the
same drawbacks and limitations).

So our imperfect world demands that we exercise our own judgment in deciding
what to do. If you don't trust your developer's judgment, you shouldn't put
them in charge of things that can cause a lot of damage. The alternative is a
rule for everything, which is guaranteed to collapse under its own
bureaucratic weight.

~~~
ufmace
> The alternative is a rule for everything, which is guaranteed to collapse
> under its own bureaucratic weight.

That's an important point that's too often missed. There's a cost to creating
rules, maintaining them in a list of rules somewhere, having people actually
read the whole list and apply those rules to real situations, handling appeals
processes when you discover something that the rules handle badly, etc. You
need some rules, but they can't ultimately eliminate the need to trust people.

------
BossingAround
From my experience, it's more of an accident. For example, one senior backend
developer was introducing a new feature, and to do that he created a POC UI.
It looked fine. At the time, the project had very little UI and was focused
more on developers, so nobody used the UI a lot. Little did he know the POC UI
would not only ship, but become the de facto face of the product for the next
6 years: when other UI pages were added, they'd adopt his simpler design.
After a while, pretty much all the UI followed his design.

~~~
Aeolun
Nothing more permanent than a temporary solution, eh?

------
notjustanymike
I'm a UI developer who specializes in designing and developing tooling for
SaaS products.

My last company was in ad tech, and our UI was for setting up ad campaigns. A
big campaign consisted of 1 campaign, 30 line items, 900 tactics, and 2,000
creative assets. We offered a managed service, meaning the account managers
were in-house and I could observe them work.

When you're working with quantities like this, every UI choice is hit by a
multiplier equivalent to the campaign size.

My favorite was a request for table sorting. Makes sense: users want to sort
2,000 creative assets, and sorting is something every UI should have. However,
asking why they were sorting revealed that they were trying to identify
"orphan creatives", assets which had no assigned tactic.

They'd open each creative in a new tab, and assign it a tactic. Also not a big
deal, until you multiply that action by 2,000.

They'd also need to spot check the assets to ensure they weren't accidentally
assigned to incorrect tactics, a feature that no one thought to request.

Ultimately, the request from our AMs and Product Management was: "Please add
sorting to tables." What I ended up building took over a week, and took the
shape of a nested folder browser that allowed bulk actions on multiple
entries. All because of a sort request that was hiding an issue.
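
For the curious, the underlying check was trivial once someone asked why
(hypothetical records and field names):

    # Hypothetical creative records; the real ones came from our campaign data.
    creatives = [
        {"id": 1, "name": "banner_a", "tactic_id": 17},
        {"id": 2, "name": "banner_b", "tactic_id": None},  # an orphan
    ]

    # "Orphan creatives" are simply assets with no assigned tactic --
    # a set-membership question, not a sorting one.
    orphans = [c for c in creatives if c["tactic_id"] is None]
    print(orphans)  # [{'id': 2, 'name': 'banner_b', 'tactic_id': None}]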

So what was the impact? We had lots of large campaigns, which could take up to
two stressful days to set up. The new tool minimized errors and took at most
an hour (thanks to some smart generator tools I added later on). We had 5 full
time technical account managers who went from extremely stressed during
Christmas to fairly calm. Errors decreased, resulting in better campaigns.

One thing remained, and that was the muscle memory senior account managers had
developed for setting things up incrementally. I learned that when you build a
tool people use for hours a day, every small task is important. The motions
embed themselves in a user's brain. The mistakes or inefficiencies of a UI,
seemingly insignificant during development, can become someone else's rote
action, stressor, and even source of unhappiness.

I don't affect the global economy, but questioning a sort feature made five of
my coworkers happier.

~~~
Drdrdrq
Great story! It is incredible what kind of impact a competent developer can
have if they are given the chance and if they understand the problems users
are having. A little automation can eliminate the majority of users'
repetitive and mundane work, leaving them time and energy for the more
important aspects of their jobs. Everyone wins, including the company's
customers.

~~~
WrtCdEvrydy
The 5 Whys... I take it to the extreme: just ask why until people explain the
actual issue.

------
speedplane
I had to add a search engine to an app and I didn't know much about search
engines. But whatever, it had to be done and I was the one to do it. So, you
educate yourself. You read about all the options, pros/cons, and try to
anticipate future needs. Switching search engines isn't quite as difficult as
switching databases, but it's still hard, so I knew it was a big decision to
make. After
learning about the subject, weighing the options, I made a decision. Looking
back, it was still the best decision.

The moral of this story is that if you're tasked to do something outside your
expertise, you make it your expertise.

~~~
LeonM
>The moral of this story is that if you're tasked to do something outside your
expertise, you make it your expertise.

I don't think the OP is talking about learning some search engine, library or
programming language. With 'outside of your expertise', he meant outside of
programming.

Say you are writing some medical software that has to diagnose patients based
on some inputs. You, as a programmer, are not a doctor and you can't "make" it
your expertise within the time constraints of the project.

~~~
Drdrdrq
Still, the rule is the same - you make it your expertise. You can't
competently develop something unless you know how it will be used.

> Say you are writing some medical software that has to diagnose patient based
> on some inputs. You, as a programmer, are not a doctor and you can't "make"
> it your expertise within the time constraints of the project.

Either you need to partner with someone who knows the domain (a doctor) and
discuss every detail with them, or you have lots of learning to do. :)

~~~
LeonM
> Either you need to partner with someone who knows the domain (a doctor) and
> discuss every detail with them, or you have lots of learning to do. :)

Right, so your answer from before wasn't correct.

And sure, if you give me 8 years I can study medicine and make the expertise
my own, but in reality there is not a single customer on the planet who is
willing to wait 8 years and pay millions for some programmers to become
medical professionals.

~~~
speedplane
>> Either you need to partner with someone who knows the domain (a doctor) and
discuss every detail with them, or you have lots of learning to do. :)

> Right, so your answer from before wasn't correct.

No, it's still correct. Regardless of whether you learn something on your own
or consult with others, you still need to become an expert in the subject
matter. Using outside guidance can help accelerate that process, but you still
need to build that expertise yourself.

------
astazangasta
Pretty much all of the code I write is outside of my domain of expertise; my
training is as a biologist. I have a computational background only as a result
of my own (lifelong) amateur interest. This is the case for most people in
biology doing computational work, since there are not many good programs for
integrating study of biology and computing (despite the fact that biology is
now 100% dependent on computing and statistics to understand experimental
results).

As a result I ended up taking on the task of creating software infrastructure
to support biology work and fill in these holes. This means I'm creating
applications from the ground up, handling every aspect of it - front end, back
end, authorization, calculation, storage, deployment, etc. I have zero
training in any of these things, which is sometimes harrowing. I make the best
choices I can, but ultimately I think I'm pretty hampered by my limited
understanding of the available methods. For example, I can write CSS, but I
don't know how to write a grid layout engine using flex. I have read some
Bruce Schneier books, but I don't know how to design or audit login protocols.
I at some point learned how to use a relational database, but I don't know all
the fancy new map/reduce-type datastores that might be more appropriate to my
work. Etc.

I suspect that if you look in any domain outside of computing, you'll find
people like me who are writing code by making things work without much
specific training.

------
swalsh
When I write engines that implement business rules, I NEVER just "take the
liberty". If it's not in the spec, then I either get clarification, or I throw
an exception. In my opinion it's better for the program to fail.

------
maxxxxx
I see that all the time with internal systems. Let's say the devs didn't add
an option to export data to a file. This omission can later on trigger other
departments to create very expensive workarounds, but due to internal
organization/politics it's almost impossible to change the original system to
add an export option.

Especially with new stuff, often nobody in the organization has any expertise,
so it's common for the devs to take a first cut at the problem.

Another common thing is that even if the devs ask for clarification they get
none, but the deadline keeps ticking, so you just take your best guess.

------
arendtio
Probably not exactly what you are looking for but the topic reminds me of the
cookie expire date.

About 15 years ago I learned about cookies and their expiry date. At the time
it was totally up to you as a developer if you wanted to have a login that
lasted 10 minutes or three years. While relevant for security, it was just a
number you had to define. So it didn't feel like a big thing.

When I learned about concepts like 'remember me' I was a bit surprised, as in
my world it was just about increasing the number for the cookie lifetime. That
is not entirely true anymore, as modern 'remember me' implementations are more
complex (e.g. to support re-authentication for modification of data), but the
core principle is still the same: using a long-lived cookie for
authentication.

So what was just a simple number back then has become a complex topic with
legal implications nowadays.
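
To show just how small the decision was at the code level, here is a sketch
using Python's standard-library cookie class (the token and the lifetimes are
made up):

    from http.cookies import SimpleCookie

    SESSION_LIFETIME = 10 * 60              # 10 minutes, in seconds
    REMEMBER_ME_LIFETIME = 30 * 24 * 3600   # 30 days -- just a bigger number

    cookie = SimpleCookie()
    cookie["session"] = "opaque-token"
    cookie["session"]["max-age"] = REMEMBER_ME_LIFETIME  # the entire "feature"
    cookie["session"]["httponly"] = True
    cookie["session"]["secure"] = True
    print(cookie.output())  # emits the Set-Cookie header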

~~~
ape4
That's a decision I have made without a spec many times.

------
hayksaakian
Look at the series of articles titled "Falsehoods programmers believe about
X". Those articles exist because someone at some point made an intuitive
assumption about how the world works, which was convenient at the time but
eventually ended up biting them in the backside.
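
The canonical example is names; a sketch of how the convenient assumption
fails (illustrative, not from any particular article):

    # Falsehood: "everyone has exactly one first name and one last name".
    def split_name(full_name: str) -> tuple[str, str]:
        first, last = full_name.split(" ")  # convenient at the time...
        return first, last

    split_name("Ada Lovelace")  # fine
    try:
        split_name("Gabriel Garcia Marquez")
    except ValueError as e:
        print(e)  # too many values to unpack -- the assumption bites back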

------
daneyh
As someone who works in capital markets and sees more and more power moving to
the dev, I totally gel with this theme. Sounds like a really good idea for a
book. Any way that I can follow along from home... a release/blog?

------
cjfd
At my previous job I wrote a quotation wizard that customers could use to get
quotes. They could select what options they wanted, what kind of maintenance
contract and for how many work places and that kind of thing. In the end the
quotation wizard would calculate how much this would cost.

For the bigger decisions I would consult the sales person but I also made
quite a few of the smaller decisions myself. It did turn out that some of the
things that I had decided were not that much according to how the sales
process actually would go in practice. In particular regarding maintenance
contracts. For instance, my quotation wizard would allow a maintenance
contract to start at any date while in practice they always start at the
beginning of the month.

~~~
TeMPOraL
> _For instance, my quotation wizard would allow a maintenance contract to
> start at any date while in practice they always start at the beginning of
> the month._

I'd say you made the right call; in my experience, a statement like "in
practice they always start at the beginning of the month" is, in practice,
quickly followed by "except when they don't".

~~~
travisjungroth
Yeah, in my opinion people often enforce business logic way too strictly. My
rule of thumb is “If your boss told you to, would you?” If yes, the data model
should support it.
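
In schema terms that means keeping the convention out of the hard constraints;
a rough sketch (the names are invented):

    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class MaintenanceContract:
        start: date  # the data model accepts any date

    def check_start_date(contract: MaintenanceContract) -> None:
        # "Contracts always start on the 1st" lives in a soft check,
        # not a schema constraint, so the exceptional case stays possible.
        if contract.start.day != 1:
            print(f"note: contract starts mid-month ({contract.start})")

    contract = MaintenanceContract(start=date(2019, 3, 15))
    check_start_date(contract)  # prints a note instead of rejecting the row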

------
cosmie
Not directly as a developer, but I've wrangled with this a few different
times. The most impactful:

I was standing up an analytics function at a growing company. There were
several internal systems that had been organically created to handle different
parts of the company's internal processes, with little structural support in
the form of project or program management. Each system was developed by a
single, but different, developer. The developers knew their systems really
well, but there was little formal documentation on anything.

As part of creating an analytics/BI function, I had to start digging into the
data models of the different systems and ETL'ing the data into a data
warehouse, and I was the first person outside of the original developers to
really dig into the databases for their systems. Each developer used company
terminology for their entities, which made it relatively easy to intuit the
data model. But each one had been left to interpret business needs and
definitions themselves, and had done so differently. And neither of them
matched how the actual business users defined those things and processes, or
how the business presumed the software was defining them. Yet because they
were using the same terms, each developer was projecting their implementation
logic for a given process or entity onto the other system, without actually
confirming it.
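
A made-up miniature of the kind of mismatch I kept finding:

    # Two systems, same term, different implicit definitions (invented example).
    def is_complete_in_system_a(item: dict) -> bool:
        return item["status"] == "done"

    def is_complete_in_system_b(item: dict) -> bool:
        # System B also required QA sign-off -- silently excluding work
        # that System A (and the business) already counted as complete.
        return item["status"] == "done" and item.get("qa_approved", False)

    item = {"status": "done"}
    print(is_complete_in_system_a(item), is_complete_in_system_b(item))  # True False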

I spent well over a year finding inconsistencies and having them corrected,
and getting system processes aligned with what the business actually expected
to be happening. By the end of it, we had improved our production
efficiency[1] by about 30-40%; that output had previously been silently lost
in the void because the two systems had subtly incompatible definitions of
what was valid and what wasn't. Funnily enough, the primary value that came
out of the analytics and BI team I stood up wasn't the actual analytics work,
but rather the operational discipline and system refactoring that was
necessary as a prerequisite to support the analytics.

[1] What was being produced was digital products, not physical. And there
wasn't previously any end-to-end analytics in place, so it was non-obvious
that stuff was getting lost in the mix.

------
tootie
I work in the digital agency space and have had projects across a huge variety
of domains (e-commerce, finance, health care, retail) and different types of
deliverables, including web, mobile, kiosks, IoT, and VR.

Any time a developer on my team has the power to deliver something without
checks and balances, it's a red flag. We always determine the expected
behavior before writing code and always check that behavior before delivery
with at least one non-coder, and usually more than one. A decision in code
that affects the experience is always a bug, and it usually gets caught before
any user sees it.

So, I'm honestly not familiar with the situation you're describing.

------
hyporthogon
On the business logic side, speaking mainly from experience in enterprise
software consulting (and some related research work): the domain is often
complex enough, and the knowledge of how the business actually works (as
opposed to what your TOGAF/Zachman diagrams tell you) is tacit and distributed
enough, that the code you're writing is often the first time that everything
in a particular process/subdomain has been made explicit. (This is especially
true during 'digital transformation' at e.g. old manufacturing companies,
where individual IT systems have been pretty nicely decoupled at a technical
level and work together only via people systems.)

In these situations, certain 'core' parts of the code quickly become the
_only_ accurate spec. (Whether or not it's worth updating the spec docs is a
management decision. But the test scripts will pass if the code is correct,
and under sufficient time and money pressure the docs tend to stay stale.)
Other developers then treat these 'core' parts of the code (usually some
fairly high-level classes, but usually something more concrete than an
interface) as the _true_ documentation of the business requirements. If the
company respects developers enough, this means that the developers who worked
on those 'core' bits of code are also treated as domain experts in future
business discussions.

On the purely technical side: sheesh, how much heat is generated by
horrifyingly algorithmically inefficient or vastly I/O-wasteful or just
redundant design (for instance religious/unnecessary use of immediate-mode
GUI) -- stuff that quite possibly the IT managers don't care about at all
(because of e.g. cheap horizontal scaling and inadequate measures of software
project success)? The heat is bad for ecological reasons (locally at least),
but also intrinsically (why are you destroying information, O Information
Worker?? -- and again e.g. Toffoli gates fix this only locally). Based on code
I've seen and, sadly, written (laziness, time pressure and all that) -- there
must be many, many orders of magnitude of unnecessary heat/information-
destruction happening because of purely technical decisions that on-the-ground
developers (not even architects/designers, I mean the people that write the
stuff that gets compiled/interpreted) make. @OP if you know some way of
measuring this I'd love to hear more.

~~~
sansnomme
By immediate mode GUI do you mean the game dev variety or the Web browser DOM
type i.e. React etc.?

------
schoen
> I'm also interested in hearing from people who dispute that this is even a
> thing.

My intuition is that it's definitely a thing, but I appreciate that you're
engaging with this question!

The first argument that I can think of on the other side is this:

Systems always effectively make decisions about how to handle every case, even
if the rules about how to handle some cases are tacit, implicit, ambiguous,
unacknowledged, disputed, or typically punted to some other system. Someone
might be mad at a programmer for explicitly handling some situation that had
previously not been addressed explicitly (or just skeptical or curious about
whether the programmer did a good job), but the programmer's solution might
not be worse or a more inappropriate exercise of power or judgment than
whatever was happening before.

This ties in to a lot of other issues about formalizing procedures and
interactions. You might want to look at James C. Scott's _Seeing Like a State_
and perhaps Michael Polanyi's _Personal Knowledge_ for examples of people who
are skeptical about doing this -- but I'm sure there's another side there.

------
kissgyorgy
Almost every day, I guess? I mean, writing new code always involves seemingly
subtle decisions which will affect users sooner or later.

------
Insanity
That always happens. You work on something new, so it is outside your
expertise. Then you learn and it becomes expertise. :)

------
mistrial9
This is an interesting premise, but... in many economically driven situations,
a programmer is a team member, who has a technical lead, who has a project or
product manager, who answers to management via objectives. The details of the
code are serious, but they sit within a context, because economic activity can
be very social, and also rule-based.

This makes the problem different: instead of a coder directly writing an
IF-THEN sort of decision, the intended outcomes of code behavior are
controlled. BUT if the game rules are such that deception, or more often
exerting control over others, is profitable, then a very powerful system is
being built to execute a morally ill process.

There are many, many divergent cases; however, this sequence is very much at
the core of quite a lot of economically driven programming, IMO.

------
Dowwie
I recommend you focus on the management of technology rather than the mythical
powers of software developers.

~~~
bjornlouser
Working title “Over-Under: how the tech industry gambles by overpromising and
underdelivering, sometimes with disastrous results.”

------
contingencies
Perhaps in our modern environment it is quite rational to be more
fundamentally against the notion of well-delineated domains of expertise. All
should be open to questioning, in particular from fresh perspectives honed in
alternate experience. Everything should be questioned: perhaps not in every
project, but at least in each generation.

 _It went as an unspoken, unquestionable assumption that telephony was the
right model for data networking._ \- Van Jacobson

 _There are lots of "old and fundamental" ideas that are not good anymore, if
they ever were._ \- Alan Kay (2016)

 _Living in the present: man, you're just out of it._ \- Alan Kay (2017)

... via
[http://github.com/globalcitizen/taoup](http://github.com/globalcitizen/taoup)

~~~
mprev
I can see benefits to getting input from outside your area of expertise, but
what you're describing feels like the attitude that led to Theranos.

How do you avoid ending up in a Dunning-Kruger situation?

~~~
DoreenMichele
I'm pretty confident Theranos was partly a product of a company being founded
by an attractive young woman surrounded by powerful men cooing at her instead
of holding her accountable or giving her the kind of constructive criticism
they would give a male founder.

Women get personally attacked and dismissed a lot while no one tells them "x
needs to be done differently." They get inured to hearing everything
"mansplained." They start tuning out the ugliness.

We don't have well-developed paradigms for how to do this effectively.

~~~
blotter_paper
I think the "woman founder" bit might have impacted media coverage, and even
investor decisions, but to make a gendered narrative out of the blatant fraud
ignores Balwani's role in the affair. There was at least one high-ranking male
who knew exactly what was going on and where the bodies were buried. To me
this is a clear case of "fake it till you make it, and defraud investors in
the meantime hoping you actually make it."

~~~
DoreenMichele
I'm a woman. A lot of men find me attractive, though my youth is long gone and
it took a lot of my beauty with it when it left.

Men mostly either coo at me or treat me like an idiot who desperately needs a
heavy heap of Mansplaining.

I've been on HN nearly a decade. I appear to be the only openly female member
to have ever spent time on the leader board.

I am endlessly mocked and belittled for pointing out how differently I get
treated from the guys on the leader board. People go out of their way to make
it clear that expecting this to be a professional networking opportunity that
enhances my career and bottom line is point-and-laugh worthy, never mind the
overwhelming evidence that participation here routinely enhances the careers
and bank balances of countless men.

So that's the lens through which I view the Theranos debacle, where the world
imagined the company wasn't merely a billion-dollar _unicorn_ but, instead,
valued it at _ten billion_ and called it a _dekacorn_. Then, overnight, its
valuation dropped to zero.

I don't think the same scenario would have flown for so very long and gotten
so crazy out of hand with a charismatic man as the front person.

I desperately want good constructive feedback and mostly can't get it. I'm
quite confident that Holmes was largely starved of honest and factual feedback
about life, the universe, the company and how business is done.

The most recent article I read indicated she shacked up with one of the
investors. Her public narrative was that she was completely celibate out of
single-minded devotion to the company.

I have never seen an article that really questioned that. The media has bent
over backwards to be respectful of this known fraud.

I'm still waiting to hear that the real secret of her success was sleeping
with multiple powerful men who then backed the company in exchange. I think
the one other time I said that on HN, it was downvoted.

Sexual politics. We can't give the obvious answer about how a college dropout
with no business experience managed to "fake it" for so long without ever
making good on any of those empty promises, even after it came out that she
moved in with some old guy who invested millions in her company.

------
RantyDave
If (relevant) decisions are made and make it into production, then it's a
management failing; specifically one of testing, since at that point (at
least) you should know what the system _should_ be doing.

Of course, I'm in management-utopia la-la land here and, in practice, if
nobody objects then it goes in and stays. The vast majority of the time this
is fine and has no impact, but every now and then someone decides it's OK to
only read the angle of attack from one sensor and...
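
The classic defence against that last one is cross-checking redundant
sensors; a toy sketch (the threshold is invented):

    AOA_DISAGREEMENT_LIMIT = 5.0  # degrees; an invented threshold

    def angle_of_attack(left_deg: float, right_deg: float) -> float:
        # Read both redundant sensors instead of trusting just one,
        # and refuse to act when they disagree.
        if abs(left_deg - right_deg) > AOA_DISAGREEMENT_LIMIT:
            raise RuntimeError("AoA sensors disagree; disengage automatic trim")
        return (left_deg + right_deg) / 2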

------
altitudinous
Hmm, if I am an expert in a domain, then there is no decision to be made: it
is obvious what the right answer is to everything, so there is nothing to
communicate to others. All decisions are outside my domain of expertise! The
outcomes of these decisions are usually compromises based on the needs of a
stakeholder, a manager, or the least stable third party.

~~~
lugg
Spoken like someone at peak experience...

Real experts have nothing but questions about their own assumptions.

~~~
AYBABTME
"Real experts do $XYZ" where XYZ is your opinion.

~~~
deathanatos
While the OP may have inadvertently committed a "No True Scotsman!", I think
he's nonetheless correct: I think experience often comes with a better
appreciation of what's out there, what's still unknown to you. Before, you
might have taken X somewhat for granted, but if you delve into and research X,
you learn it's a complex interaction of A, B and C; now you have 3 things
you're taking for granted / need to learn. A fair number of people have felt
_stupider_ after learning, which is of course the opposite of how it should
be. Imagine having never seen the inside of a modern car hood, and you open it
for the first time. Perhaps all you need to do is "change the battery"
(simple, right? You've changed the battery in other things before) and upon
seeing the inside of the hood for the first time, one might reasonably be
overwhelmed by the amount of stuff crammed inside there.

Questioning your own assumptions, I think, falls out of repeatedly learning
that often things are not simple, and the experience making your own mistakes
and getting burned: it teaches you when to proceed (you don't want the project
to get bogged down with "analysis paralysis"), but with caution and the
knowledge that you made an assumption. (Whereas a less experienced person
might not realize they made the assumption at all.)

I would also point to Impostor Syndrome[1] as a sort of evidence for that,
though it's certainly possible for someone to be an expert and not feel that
way.

[1]:
[https://en.wikipedia.org/wiki/Impostor_syndrome](https://en.wikipedia.org/wiki/Impostor_syndrome)

------
rhacker
I think I'm less interested in those decisions having an effect on the global
economy and more interested in their concrete effects (a person dying because
of a poorly thought-out condition, a food delivery person who gets paid less
BECAUSE of a large tip, the decision NOT to mask passwords and its effects).

Maybe that _is_ what you mean by global economy, but that is probably more
interesting.

------
rb808
> we get to make decisions in code that affect end-users but only other
> developers can really hold that code to account before it goes into
> production.

Honestly, if that is in an important product then there is a serious
management problem.

It's also another reason why devs usually specialize in an industry, e.g.
healthcare, aeronautics, robotics.

------
z3t4
I can't remember any time where I made a "business" decision without being the
customer myself or asking the customer. _Even_ when I was the domain expert.

------
microcolonel
Sometimes, maybe rarely, the software people are the only ones who have a
different enough take on things to do something productive, even if they do
not start out as experts.

------
tempodox
What a funny question to ask around here. For true HNers, there is no such
thing as “outside their domain of expertise”. Witness the discussions in this
very thread.

------
sramam
Speaking from an end-user experience POV:

When I first switched to a mac from windows more than a decade ago, the
instant-on feature was one of the most delightful experiences.

I have often wondered what the cost of not having that on Windows meant to the
world, in lost productivity and greenhouse-gas emissions.

------
bjourne
Every single damn morning when I have to decide on what clothes to wear.

~~~
Drdrdrq
Always buy multiple sets of clothes, then wear them the whole month. This way
you only need to make a decision on the first of the month. :)

------
xiphias2
[https://www.lucidchart.com/techblog/2015/08/31/the-worst-mistake-of-computer-science/](https://www.lucidchart.com/techblog/2015/08/31/the-worst-mistake-of-computer-science/)

------
imauld
Every decision I make in code is outside my domain of expertise.

------
lugg
I think a lot of those are hard to see, and you're going to have a hard time
tracking them down from the horse's mouth. You'll have a better time looking
at the symptoms and tracing back to the source.

