
Learn Big-O and stop hacking your way through algorithms - danielwbean
https://triplebyte.com/blog/why-you-should-learn-big-o-and-stop-hacking-your-way-through-algorithms/?ref=hnpost
======
read_if_gay_
I don’t get the fuss about Big O. Back in my algorithms class, it always
seemed like one of the simplest things about algorithms. For example, I
think it’s way harder to come up with complex algorithms in the first place.
Is that just me, or is Big O just something people outside of university
never get around to learning, even though it’s simple?

~~~
hervature
I think there are phases to this.

1) Not knowing

2) Knowing Big O and thinking it is always important

3) Knowing Big O and realizing constants matter in practice so you stop
optimizing prematurely

Anyway, that was my evolution. I agree with most of what you say, except I
think I would change "complex algorithm" to "complex data structure."
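
To make phase 3 concrete, here's a minimal sketch (Python, standard library
only, numbers purely illustrative): insertion sort is O(n^2) and merge sort
is O(n log n), but on small inputs insertion sort usually wins on constants
-- which is exactly why production sorts like Timsort fall back to it for
short runs.

    import random
    import timeit

    def insertion_sort(a):
        # O(n^2) worst case, but tiny per-element overhead.
        a = a[:]
        for i in range(1, len(a)):
            key, j = a[i], i - 1
            while j >= 0 and a[j] > key:
                a[j + 1] = a[j]
                j -= 1
            a[j + 1] = key
        return a

    def merge_sort(a):
        # O(n log n), but recursion and merging cost more per element.
        if len(a) <= 1:
            return a
        mid = len(a) // 2
        left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
        out, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                out.append(left[i])
                i += 1
            else:
                out.append(right[j])
                j += 1
        return out + left[i:] + right[j:]

    data = [random.random() for _ in range(16)]
    # On 16 elements the asymptotically "worse" algorithm is typically faster.
    print(timeit.timeit(lambda: insertion_sort(data), number=50_000))
    print(timeit.timeit(lambda: merge_sort(data), number=50_000))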

~~~
megameter
There are a 4 and a 5.

4) Realizing that there is a Big O around the larger architecture and thinking
it is always important

5) Realizing that architecture is determined by the problem space, so
addressing the problem itself as directly as possible will converge on the
optimal Big O

Once you've done it over and over for years, it becomes clearer that Big O
just reflects your assumptions, and that fighting it at the level of the
specific algorithms and data is a thing to defer until an easy win is needed.
The only optimization effort greenfield code deserves, OTOH, is a quick
review of the language's built-in structures to pick the one that is
approximately best. Push the features, then as time permits build profiling
mechanisms to find the hotspots as the system gets production data.
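
For instance, a minimal Python sketch of what that quick review buys you
(assuming the need is a FIFO queue): the decision is just "list or deque?",
and the built-in settles the Big O for you.

    from collections import deque

    n = 100_000

    # Queue on a plain list: every pop(0) shifts the remaining elements,
    # so draining it is O(n^2) overall.
    q = list(range(n))
    while q:
        q.pop(0)

    # Same workload on the built-in deque: popleft() is O(1), so the
    # drain is O(n). Approximately best, no profiling needed yet.
    q = deque(range(n))
    while q:
        q.popleft()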

~~~
pmiller2
There's a 6 as well.

6) Realizing that Big O doesn't matter when you can guarantee your data is
always small.

Maybe this is more like 4.1 or 5.1, but it certainly does come up.

------
dvt
Oh look, another Triplebyte article that makes it to the top of HN while
being completely devoid of useful information -- and, in fact, quite wrong.

> The truth is, Big-O is just the name of the notation used to describe how
> efficient an algorithm is.

The truth is that this is absolutely, unequivocally, unabashedly WRONG. Big O
(there's no dash, first of all) describes the behavior of some function f(x)
as x tends to infinity (though we can also look at Big O as x approaches
arbitrary values). The "O" in Big O refers to the _order_ of the limiting
function -- e.g. logarithmic, quadratic, cubic, and so on. If you're going to
make a hubbub about truly understanding Big O, at least get the math bits
right.
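
For reference, the standard definition (the usual limit-at-infinity form, in
LaTeX):

    f(x) = O(g(x)) \text{ as } x \to \infty
    \iff \exists\, C > 0,\ x_0 \text{ such that }
    |f(x)| \le C\,|g(x)| \text{ for all } x \ge x_0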

> Learning about how they work without understanding the associated
> implications of time and space complexity misses the point.

Some fresh grad probably wouldn't realize how laughable this claim really is,
but seasoned developers _do_. 99.9% of the time you don't care about (or even
understand; some are quite hairy) the (micro-)optimizations found inside
standard library functions or data structures.

> It's like memorizing dates in roman history without having heard of Julius
> Caesar...

Yeesh, capitalize "Roman" please. What a low-quality trash-heap of an article.

> And on top of making learning easier, understanding the point of Big-O (and
> algorithms in general) will seriously help you avoid mistakes in interviews.

Ah yeah, here we go. Finally, we get to the only practical use case of Big O
unless you're a CS PhD: getting through interviews.

Overall, this is a zero-effort pseudo-marketing blog post that hasn't been
researched, edited, or proofread. It's all so Triplebyte can make the top of
HN, get a bit of traffic, and keep reinforcing horrible interviewing practices
that don't work. I swear, one of these days I'm going to write a book about
how to actually do a _good job_ of interviewing software engineers.

~~~
tzs
> Big O (there's no dash, first of all)

NIST disagrees [1]. It is in common use both with and without the dash in both
CS and math. Looking around in various books I downloaded from Springer when
they made nearly 400 books free for a couple of months in response to COVID, I
see several that include the dash.

Laczkovich and Sós, "Real Analysis"

de Berg, Cheong, van Kreveld, and Overmars, "Computational Geometry"

Lee and Hubbard, "Data Structures and Algorithms with Python"

Borthwick, "Introduction to Partial Differential Equations"

Paar and Pelzl, "Understanding Cryptography"

Haubold and Börsch-Haubold, "Bioinformatics for Evolutionary Biologists"

[1]
[https://xlinux.nist.gov/dads/HTML/bigOnotation.html](https://xlinux.nist.gov/dads/HTML/bigOnotation.html)

~~~
dvt
> NIST disagrees

I'd say whoever wrote the NIST article is wrong because there's absolutely no
grammatical reason to hyphenate, but that's a minor nit.

------
lotyrin
I end up giving hand-wavey explanations that abuse the terms of Big O,
because then it suddenly seems like it's a real, valid argument. What I
really wish is that I could just say it how it is and have people understand
and respond correctly:

"If the time the server spends is proportional to number of requests coming
in, and the number of requests coming in is proportional to user engagement,
and we expect user engagement to grow over time by a rate also proportional to
user engagement -- but the amount of server time we have available to serve
requests is constant (there's no designed solution to scale horizontally) then
we have a problem."

Not a surprise problem that suddenly appears on the day we run out of
resources, but a problem to solve today, right now, because we are already
over-committed by the nature of this design -- and not a problem we can solve
by making the single node any bigger to start out with...
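
Concretely, with made-up numbers (the specific values don't matter, only the
shape of the curve):

    # Hypothetical figures, purely to illustrate the flat-line-vs-curve bet.
    capacity = 1000.0   # server-seconds available per day (fixed)
    engagement = 100.0  # today's load, in server-seconds per day
    growth = 0.05       # weekly growth, proportional to engagement itself

    week = 0
    while engagement <= capacity:
        engagement *= 1 + growth  # dE/dt proportional to E => exponential
        week += 1
    print(f"capacity exhausted in week {week}")  # 48 with these numbers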

But many people are seemingly conditioned by education (whether that includes
CS or not) to only pull out certain techniques when a problem presents itself
in a pre-packaged, familiar way. So if I whip out Big O, folks are suddenly
able to see that they're on the losing side of a bet pitting a flat line
against a curve.

------
wampwampwhat
Is this an appropriate place to ask whether Triplebyte ever plans on staffing
more interviewers? I'm tired of getting marketing emails telling me to
schedule my interview when there hasn't been availability for coming up on 9
months now. It might make more sense to hire interviewers rather than blog
writers.

~~~
quickthrower2
Maybe they are but they need interviewers to interview the interviewers.

------
nikki93
It wasn't mentioned anywhere that the input is sorted, so you can't just
assume a binary search works.
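
In Python terms, the precondition the article skipped (a minimal sketch using
the standard bisect module):

    import bisect

    data = [3, 1, 4, 1, 5, 9, 2, 6]

    # bisect's O(log n) search is only correct on sorted input; run on
    # the raw list above it can silently miss or misplace values.
    data.sort()
    i = bisect.bisect_left(data, 4)
    print(i < len(data) and data[i] == 4)  # True, because we sorted first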

------
beervirus
This is just an ad for triplebyte.

------
trishankkarthik
BS. Big-O is designed by academics for academics. This is not how you would
teach high school students with an apprenticeship model.

