
Ask HN: As a developer what are your aha moments? - greenleaf3
To me the aha moments were as follows:

1. When I was introduced to QBasic

2. When I got to know how simple and amazing Lisp is

3. When I was able to code at the speed of my thoughts with VIM

4. When I got to know Express.js (after learning Django)

5. When I learned that everything in Smalltalk is a message, including if/else statements
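Point 5 can be sketched outside Smalltalk too. Below is a hypothetical Python rendering of the idea that booleans are objects responding to a conditional message, so there is no if/else keyword at the call site (in real Smalltalk you send ifTrue:ifFalse: to a Boolean; the names here are made up):

```python
# Booleans as objects that respond to a conditional "message":
# dispatch on the receiver's class does the branching, not a keyword.
class SmalltalkTrue:
    def if_true_if_false(self, then_block, else_block):
        return then_block()  # true ignores the else branch

class SmalltalkFalse:
    def if_true_if_false(self, then_block, else_block):
        return else_block()  # false ignores the then branch

answer = SmalltalkTrue().if_true_if_false(lambda: "yes", lambda: "no")
print(answer)  # yes
```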
======
haroldegibbons
My "aha" moment was realizing most of my ideas and most apps out there are
complete garbage. Not needed. Damaging, even. 99.9% of all of it.

For example, most "cutting edge" web apps are better off as PHP monoliths.
Facebook was a PHP file for a long time. But most apps in general should never
make it past being shell scripts, which are better off staying as spreadsheets
or better - text files which are better off as pieces of paper whenever
possible. And all paper is better off left as thoughts whenever possible, and
most thoughts should be forsaken.

~~~
ajb92
Ha, nice. I’m reminded of that quote by Jenny Holzer:

> Disgust is the appropriate response to most situations.

------
jdmoreira
1\. When I realized programming was hard

That’s it. For a long time I thought I was good at programming and I kept
making the same mistakes over and over. All nighters, leet coding, runtime
hacks, etc...

And then one day it struck me. It’s hard and my mind can’t keep up with
yesterday’s cowboy coding.

Slowly I started putting defense mechanisms everywhere - good type systems,
immutability, compile-time checks instead of runtime ones, never overworking,
and sleeping the best I can.

Life is much better now and I'll never go back to thinking I'm good at
programming.

~~~
quickthrower2
How did this work - genuinely curious? If you work as a team did everyone else
stop being a cowboy too then pay down that tech debt? Or is this solo work or
perpetual greenfield work?

~~~
jdmoreira
It’s a bit more complicated than I made it look, of course. My bit didn’t flip
in a day, and the reason I changed my ways is also that I've worked with a few
people who seemed to have figured this out already; just by observing them I
could figure it out myself.

Nowadays it really depends, some people for sure get it more than others. Some
just dismiss a lot of the safeguards I value as some kind of functional
programming hype. Teams are hard :)

------
02020202
Not AHA moments, but some pointers:

\- most tech is stupidly simple, with an incredibly complex description

\- SOA is stupid and a monolith is always the way to go (maybe with a handful
of simple microservices)

\- microservices are not SOA

\- event sourcing is almost never a good idea

\- raw bytes over the wire or in storage are not as complex and mysterious as
you might think

\- compiled languages are not "smarter" than interpreted languages, don't feel
like you are lesser of a programmer if you do php or javascript

\- new cool and shiny tech is cool and shiny for 5 minutes, until you
implement it, then you realize it brings a ton of new complexity and a ton of
wasted time and you should have stayed with what worked before and you realize
the people who promoted it were just trying to make a sale

\- mysql/mariadb will handle way more than you think

\- if you want to waste resources, use java/jvm

\- yagni should be tattooed on every programmer's hand so when he types on the
keyboard he is constantly reminded that he is wasting time with stupid crap
nobody will use

\- don't think in "what if" terms or try to predict the future, just code what
is needed right now, avoid being a smartass

\- you can charge more for your services the more essential you are for the
project and the harder it is to onboard new people, ie. your value increases
over time

\- if you charge less now, you will get paid less tomorrow

\- running your project entirely in the cloud will bankrupt you; use it to
gather usage data, then move to bare metal. cloud is cool and "in" but it will
eat your wallet, for no good reason whatsoever

\- single binary is always better than docker

... I could go on and on... but these are more guidelines than aha moments, so
that is it for me.

~~~
quickthrower2
With bare metal, are you not just paying more for staffing to maintain it?
Bare metal as cheaper sounds suspicious. Bare metal as essential maybe if you
are doing something like 3D graphics processing or mining or “big data”.

~~~
neurostimulant
> With bare metal, are you not just paying more for staffing to maintain it?

People always say this as an argument against bare metal servers, but don't
you still have that staffing cost even if you're using a cloud provider? Cloud
servers are more reliable than bare metal, but certainly not at a 100% SLA yet.
Shit happens regardless of cloud or bare metal, and we'll have to prepare
for it whatever type of servers we use.

Most issues I've had with both cloud and bare metal servers were connectivity
issues, which were addressed by the vendors or data center operators with me
doing nothing but refreshing the status page rigorously. I've never had
hardware issues, but that's probably because I always retire old servers after
4 years of operation.

~~~
machello13
> People always say this as an argument against bare metal servers, but aren't
> you still have that staffing cost even if you're using a cloud provider?

I think their point is that it may not be cheaper if you account for staffing
cost. Also, especially as a lone developer or a small team, it may be worth it
to pay for someone else to manage the metal instead of doing it yourself.

------
kleiba
1\. When I realized programming was easy

That's it. For a long time, when I thought of the hardest stuff to program, I
was thinking of computer games. Mind you, not the latest AAA billion-dollar
development budget, 300+ people on the team kind of games. No - given my age, I
was thinking about a single programmer doing code and graphics and sounds on a
C64 kind of games. To me, it was incomprehensible how one person alone could
make a machine, especially with those hardware limitations, do all
_that_.

And then one day it struck me. I had started to dabble with programming in my
early teens and then never stopped but, I guess, gotten better at it over
time.

Like with every process that's evolutionary rather than revolutionary, there's
a good chance of losing track of your progress. I remember clearly that after
finishing college, I really thought that everything I knew then about computers
and programming I had already known even _before_ I entered university. But
then one day, a freshman asked me a question about a programming assignment in
their class, and it was completely trivial to me. Yet, when I was a freshman,
I would probably have asked the same question of someone else.

And yet, that was not my aha moment. It still took many years for me to
realize that my idolization of game programmers was perhaps a bit much. Mind
you, I do realize of course that someone specialized in _any_ area will be
able to produce better code than someone who's not - for any definition of
"better". I'm still not a game programmer and never will be, so I still have
the highest respect for their profession. But I do realize now that there's
no otherworldly skill that separates the game coder from the crop of all
other programmers. Anything is within reach - what you need is not some innate
talent, it's just dedication.

Life is somewhat better now and I'll never go back to thinking I'm not good at
programming.

~~~
thorin
After watching the first few weeks of Handmade Hero [1] I realised my idea of
hard work and dedication was different to that of a lot of people I was trying
to emulate. He was doing this as a spare-time, free project, and the amount of
knowledge and experience on show was incredible.

[1]
[https://www.youtube.com/watch?v=Ee3EtYb8d1o](https://www.youtube.com/watch?v=Ee3EtYb8d1o)

~~~
kleiba
I really enjoyed watching that series too for a while, although it was
sometimes moving along a bit too slowly for my taste. I also hear it's not
really going anywhere even after a few years in the making now (?!).

One thing I always find amusing about such endeavours is the discrepancy in
the level of difficulty when presenting their content. Often you see (or
read) them explaining programming concepts at a kind of baby level ("imagine a
variable is like a box that you can put something in...") but then, when it
comes to video-game math, they have no trouble assuming people know advanced
concepts from linear algebra or even calculus. Cute.

~~~
thorin
Or as the kids say "that escalated quickly!"

Not surprised it didn't really go anywhere, but it's an excellent resource if
you have the time.

------
gonzo41
1\. No one working in tech actually wants to solve a problem and move on. The
problems are too fun to leave alone once there's an MVP. Enter stage left, all
the framework churn. Adoption of graph databases that don't fit the problem.
All because the developers are bored and not business aligned.

2\. Knowing the business is key to having personal buy in for work. IF you
work in a bank writing bank software, understand the bank and banking so you
understand the context of the software. It's a real 10x thing to know why a
requirement is a requirement.

3\. The software you write will live a lot longer than you planned. Your
experiments from 1. will haunt you.

~~~
growlist
> 1\. No one working in tech actually wants to solve a problem and move on.
> The problems are too fun to leave alone once there's an MVP. Enter stage
> left, all the framework churn. Adoption of graph databases that don't fit
> the problem. All because the developers are bored and not business aligned.

There's another problem arising out of this for those of us that DO like to
solve problems and move on - we're seen as not 'proper' techies if we're not
completely obsessed with technology, able to bore for hours, with a github
full of projects etc etc...

I like using technology to solve problems. I really don't give a stuff about
the intricate subtleties of a versus b technology unless it absolutely
matters, because for the vast majority of situations the obvious, pragmatic
choice will do the job just fine.

~~~
gonzo41
Oh yeah, big time. The current craze at my work could be summed up by the
phrase "how long's your pipeline".

It really irks me because we as 'tech' people don't think about time and money,
and that spending them in one direction stops you from spending them in
another. It's very frustrating.

------
DanielBMarkham
1\. That programmers were a _market_ , we weren't just a bunch of nerds
sharing cool stuff with one another. That speaker at the conference? The one
talking about Wheezle-snort 7.0? Yeah it sounded awesome, but it was _supposed
to_. It's a sales pitch, even if the software is free (Actually, especially if
the software is free)

2\. That programming and making stuff people want are two entirely different
things, although programmers always assume that whatever you're building, it's
the right thing. You can spend your entire career in programming, learning all
sorts of goodness about things like Erlang innards, and never really
understand what your job is.

3\. That all of that mousing around, learning a new IDE every two or three
years when I started, was a complete waste of time. I automatically assumed
that the cooler and shinier the programming environment, the easier it would
be to code and the better my code would be. But no: grep and awk work (mostly)
the same way now as they did 30 years ago, and any time I spent learning which
hotkeys did what in some long-lost dev stack was a complete waste of
time.

4\. Conversely, that UX beats internal architecture, every time. If folks are
having a blast using your app, you win, even if it crashes every five minutes
(How they could have fun if it crashed every five minutes is a good question.
You might want to ask them)

5\. The more smart people you throw at a programming project, the bigger mess
you end up with. I know when I say that people are thinking of "The Mythical
Man-Month", but it goes deeper than that. Even if you somehow manage to stay
on-schedule, human communication around innovation stops working at a certain
scale, and that looks like a hard limit. There are ways around it, using
things like Code Budgets, following CCL, layers and DSLs, but nobody does
that, so it doesn't matter. We, as an industry, have absolutely no idea how to
staff or run projects.

ADD: One thing that was quite profound that I discovered late: if you code
well, the simple _act_ of programming can tell you something about the way
you're reasoning about the problem that you couldn't learn any other way.
Programming is by no means a simple one-way street where ideas come out of the
head of the programmer and end up as bits on the tech stack. It's very, very
much bidirectional. Our programs influence us as coders probably much more
than we influence them.

~~~
scandox
> You can spend your entire career in programming, learning all sorts of
> goodness about things like Erlang innards, and never really understand what
> your job is.

Sometimes I feel like my job is to protect other programmers from finding out
what their job is.

------
lioeters
A few that come to mind:

\- Implementing small languages, and how it inevitably leads one to a deeper
understanding of all the layers that make computing and programming possible

\- The Make-a-Lisp project and the language-agnostic conceptual design at the
heart of it -
[https://github.com/kanaka/mal/blob/master/process/guide.md](https://github.com/kanaka/mal/blob/master/process/guide.md)

\- Declarative nature of React (the view as function of state); using
centralized unidirectional state management and actions

\- Test-driven (and, similarly, type-driven) development

\- TypeScript - Benefits and costs of type definitions, dynamic/gradual and
structural typing; power of a language integrated with the editor environment

\- Virtual machines, QEMU, and later Docker - Commoditized, reproducible
builds for applications and their environments

\- Build pipeline scripts
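The "view as function of state" bullet above can be sketched language-agnostically. Here is a hypothetical Python rendering of a centralized store with unidirectional updates (the action and field names are made up, not React's API):

```python
def reducer(state, action):
    # the only way state changes: a pure function of (state, action)
    if action["type"] == "increment":
        return {**state, "count": state["count"] + 1}
    return state

def view(state):
    # the view is derived from state, never mutated directly
    return f"Count: {state['count']}"

state = {"count": 0}
state = reducer(state, {"type": "increment"})
print(view(state))  # Count: 1
```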

------
cafemachiavelli
1\. More of a CompSci aha: Hard problems are hard. Sometimes I get frustrated
over getting stuck in a bad local optimum that I can't easily get out of. But
then if I take a step back, I notice that the problem is NP-hard, that I'm not
gonna solve it by throwing random heuristics at it and I need to either change
the constraints to make it P, lower the input size, or lower my expectations.
Simplest example: deciding what goes on which shelf in my home is NP-hard.
Solution: get rid of stuff more liberally -> (n' < n) -> f(n') is much easier
than f(n)
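To make the (n' < n) point concrete, here's a toy calculation: assigning each of n items to one of k shelves gives k**n candidate arrangements, so shedding items shrinks a brute-force search exponentially (the numbers are purely illustrative):

```python
def assignments(items, shelves):
    # brute force must consider every way to put each item on a shelf
    return shelves ** items

print(assignments(10, 3))  # 59049 arrangements for 10 items
print(assignments(6, 3))   # 729 after getting rid of 4 items: 81x fewer
```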

2\. Doing NAND to Tetris the first time. It taught me not only how computers
work, but how powerful recursive layers of abstraction can be. I had
absolutely no idea how my system looked on RTL, but I was still able to build
it.

3\. Also Lisp. I wish Nand to Tetris had picked something more lisp like in
the second half to show how simple and powerful it is.

4\. When I'm coding something, alternating writing and testing is much
quicker than writing a bunch and then debugging it. For bigger projects,
setting up CI/CD early can similarly save headaches.

5\. The functional big three (map, filter, reduce), but for me even more so
closures. I had gotten stuck trying to hardcode coefficients for a polynomial
until I noticed I'd end up with a lot of duplicate code, which was what I was
trying to avoid with FP in the first place. Then I realized I could just put
the polynomial function itself in a closure and call it with the coefficients
I wanted, when I wanted.
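The closure trick described above can be sketched like this (the function names are hypothetical; Horner's rule is just one way to do the evaluation):

```python
def make_polynomial(*coeffs):
    """Return a function that evaluates the polynomial whose
    coefficients (highest degree first) are captured in a closure."""
    def poly(x):
        result = 0
        for c in coeffs:
            result = result * x + c  # Horner's method
        return result
    return poly

# No hardcoded coefficients: each call builds a new, reusable function.
square_plus_one = make_polynomial(1, 0, 1)  # x**2 + 0*x + 1
print(square_plus_one(3))  # 10
```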

------
melvinroest
Writing a computer graphics engine during university taught me a couple:

1\. I can use math, intuitively, even if I don't know how the exact
calculation works.

2\. I can use a debugger and replay it again and again to understand what's
happening.

3\. I can diagram a high level design and not have the complete context in
mind and still produce a 10K LoC OpenGL powered computer graphics engine.

After that project, programming never felt impossible to me anymore. After
that project, I always had some confidence of being able to learn whatever I
needed, as long as I have enough time.

Here is the engine (2013):
[https://www.youtube.com/watch?v=PH6-dLvZEiA&t=1s&ab_channel=...](https://www.youtube.com/watch?v=PH6-dLvZEiA&t=1s&ab_channel=Melvin)

~~~
ivars
I had a similar confidence boost when I developed a Vulkan graphics engine.
People talk about this library or that framework being hard to learn and use,
but when you have struggled through Vulkan and 3D engine quirks, some of them
seem like a walk in the park now.

So to answer the original question, my epiphany is that something being "hard"
is a relative thing and that "hardness" should never be a barrier that keeps
me from reaching my goals.

------
superasn
My aha moment was that it is 100x better to focus on one project and make it
the absolute best instead of trying 10 things.

With these new and easy frameworks it's incredibly easy and quick to create
new projects but creating and launching something is just the tip of the
iceberg.

The main work comes after that, i.e. traffic, leads, conversions,
optimization, etc., which unfortunately can only be done with good old tedious
hard work and laser focus.

~~~
layoric
100x this and thank you for posting.

------
brainwipe
1\. Seeing a company do agile right by keeping the customer so close they're
almost in-house. All the other stuff is meaningless.

2\. Realising I didn't have to code for faceless corporates shifting cash
around; I could work for a small company actively trying to make people's
lives better.

3\. Breaking neural networks out of a predefined topology and letting them
grow.

4\. If it changes during runtime, put it in the database. If it changes per
environment, put it in config. Everything else in code.

5\. That programming is a craft and you must treat it like one; just bashing
out code to fill a brief isn't enough.

~~~
seahyc
Curious about 1 and 3! For 1 especially: are the customers literally sitting
in the planning meetings? I always find it tough to have customers make the
time investment; curious to know what models might work.

------
hirundo
When I learned about the repl. For me 90% of debugging is about figuring out
how to break at the crucial line and then shining the repl light on it. 75% of
writing new code is about trying stuff in the repl and then stepping through
the code in the repl and testing everything.

For me that quick feedback loop makes coding as fun as gaming.

~~~
debuggingnoob
Would love to see a video of what this actually means. I'm not sure I've ever
written code for which I can use a REPL in any meaningful way so I must be
doing it wrong. Videos of actual real world coding/debugging (not examples
that are too simple and therefore not real world) would probably be very
enlightening.

I mostly do graphics stuff, and there are 70 lines of setup until I get to the
1 line that executes something based on the 70 lines of setup. But even in
non-graphics code, my programs are 1000s of lines, all of which have to
execute, whereas a REPL is one line at a time. So yeah, a video would really
help show how to apply this.

~~~
ladberg
It's easier in some languages than others, but I would say the most common
case is in Python (partially due to ease of dropping a REPL, partially due to
lack of compile-time checks). The one line of code you _always_ should have at
your disposal is:

    
    
    from IPython import embed; embed()
Basically, if you have an error and can't figure out what's happening within a
few seconds, stick that line in before the error and Python will give you a
REPL to play around with. It's like GDB but way better.
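If pulling in IPython isn't an option, a stdlib-only variant of the same move (assuming Python 3.7+) is the built-in breakpoint(), which drops you into pdb at that exact line:

```python
def divide(a, b):
    # breakpoint()  # uncomment to inspect a and b in pdb before the error
    return a / b

print(divide(6, 3))  # 2.0
```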

------
mcv
When I realised that programming is just taking a big problem and chopping it
into a couple of smaller problems, and then repeating that until your problems
are trivial.

The first time (1993) I was logged in on a remote machine and downloaded a
file to there from another remote machine. It was just magic that the machine
I was touching had nothing to do with that file.

When I hacked together a mailinglist system in shell script.

When I realised that C is basically just all memory locations.

When I understood closures.

------
semicolonandson
Becoming proficient with tools that allow me to pry into running processes and
see what they are doing under the lid

e.g.

lsof to see files and ports opened up by a process

strace to look at what system calls to the kernel a program is making (and
ltrace for library calls)

pstree to see subprocesses spawned

gdb to inject myself into infinite loops of programs written in C-based
implementations (e.g. Ruby)

I'm a collector of aha-ish moments so if you want more, my YouTube account is
linked to in my profile.

------
namuol
1\. Even though it requires precision, the vastness of the solution space
effectively makes programming a mushy, tangled mess of technology.

2\. End of list.

~~~
scollet
ah....

a..

------
baron816
I went from Ruby to JavaScript development. First, I’d like to say I think
starting with Ruby/Rails is a bad idea. Ruby uses a lot of higher-order
functions, but it isn't super clear from the syntax how that all works.
Higher-order functions in JS are much clearer, IMO, on account of having to
use parens to call a function (in Ruby, you don't need parens to call a
function; you can just name the function and delineate the arguments with
spaces).

So with Ruby, I was just pattern matching—I never had an intuition of what
lambdas were doing until I moved over to JS.

The Feynman Method was another aha moment. Learning how to learn, in my case,
by writing down stuff in my own words as if I were teaching someone else was
probably the most important thing. That helped me develop a much deeper
understanding of important concepts.

Similarly, not trying to learn everything was a big deal. Don’t try to learn
all the niche technologies you see in job listings expecting you need to know
those things in order to qualify for the position. If you just do dozens of
tutorials, you’ll end up knowing a lot of things very superficially. It’s much
better to know a handful of ubiquitous, related technologies very well and
have a strong foundation of programming fundamentals. Those things can
transfer over to other technologies, when you need to pick them up.

~~~
Doctor_Fegg
> First, I’d like to say I think starting with Ruby/Rails is a bad idea.

It's telling that comments like this inevitably involve Rails, not plain Ruby.

Rails's "magick" is obfuscatory and at times counter-intuitive. Ruby, the
language, is pretty straightforward. I wish more people spent time playing
with Ruby before diving into Rails.

------
Regenschirm
\- My team lead showed me code which I wrote just two years ago, and at first
I didn't believe that I had written something that crappy.

\- When I realized that someone might quit his/her job during the trial phase.
It had not occurred to me that both sides are checking whether it's fun/good
for them.

\- The main reason why I'm successful is that, despite my character flaws
(which have gotten better over the last 15 years), good software engineers are
in very strong demand, and when I look at how hard it is to get good people, I
might just never have a really hard life

\- Soft skills are crucial: taking responsibility, being on time, being
reliable, taking action when it matters without hesitation

\- You had a salary negotiation or a discussion and something was decided? You
can still come back to it 1-x days later and say 'You know, I thought about
it and I'm not happy with the outcome at all. We need to discuss this topic
again.'

\- Don't complain if shit is shitty. Either change it, try to change it,
accept it, or quit. Stop telling others that it's shitty while doing nothing.

\- Estimation is bullshit: it never works, never aligns, no one really
retrospects it, and if you ever find a team where it works, your team gets
dismantled for whatever reason and you have to start at zero. Prioritise for
relevance, optimise how you work, accept the outcome.

\- Never accept a deadline. Without a deadline, your manager can't come back
and say 'you promised', which leads to you doing overtime for a mistake your
manager made: he/she mismanaged!

\- Do less but better. Whatever you do shoddily now will come back

\- Not doing something, because you actually figured out what the other person
needed/wanted, is more often than not the better result, as long as it doesn't
lead to writing more code

\- Understanding how you program a computer game, for three main reasons: 1.
memory allocation can fail 2. how the game loop works and how to program it
3. randomness

\- Sentences to know:

\-- I can try to get it done, but I can't promise

\-- I have a date tonight, I can't stay (if they insist:) I have expensive
tickets for <event>... (pressuring you into doing overtime just because is
mismanagement)

\-- No

~~~
greenleaf3
> \-- No

This is one thing I need to learn to say. I have been doing lots of stuff
other devs want and hating myself for not being able to say no.

------
daxfohl
Powershell? (I've done windows most of my career, but "shell" would apply
anywhere).

It always seemed so lame compared to neat new functional languages or
distributed actor frameworks. And, not being much of an ops person, I almost
wanted _not_ to be any good at powershell (or any shell) so I wouldn't get ops
assignments.

After getting familiar with it, though, it's improved my workflows and
ultimately my quality of life. I'm not only more fluent in ops now, but I also
get to spend less time on it, which was the rationale for _not_ learning it in
the first place (write a script once and never have to remember what I did
when creating a new environment).

------
gitgud
Learning the [1] _Unix Philosophy_. Writing & Using small tools that can be
piped together... was a huge AHA moment for me

[1]
[https://en.wikipedia.org/wiki/Unix_philosophy](https://en.wikipedia.org/wiki/Unix_philosophy)

~~~
greenleaf3
Yup, this was one of my aha moments too.

------
medymed
When I found that interfaces/abstraction were way more critical to understand
than recursion or obscure algorithms for most real projects. Half of the
battle of starting (or understanding) any project is clarifying the interfaces
within which it should be nestled.

------
philihp
My favorite all-time aha moment was coming from a CVS world and learning how
to use git. Oh, you just hash the entire lot of the files together and use
that as a version for the repo? Aha! The change set should be for all the
files together!
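That idea can be sketched in a few lines of Python. This is a simplification (real git hashes blobs, trees, and commits separately), but the effect is the same: one hash versions the whole set of files, so a change set naturally spans all of them:

```python
import hashlib

def repo_version(files):
    # hash every file's name and content together, in a stable order
    h = hashlib.sha1()
    for name in sorted(files):
        h.update(name.encode())
        h.update(files[name])
    return h.hexdigest()

v1 = repo_version({"a.txt": b"hello", "b.txt": b"world"})
v2 = repo_version({"a.txt": b"hello", "b.txt": b"world!"})
print(v1 != v2)  # True: changing any one file changes the repo version
```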

------
ralphc
Mid 90s, when I was introduced to Java after C++. With the combination of
garbage collection vs malloc/free, and just seeing that Java gave you 80% of
C++'s features with 20% of the complexity, I knew that it was going to be big
and displace C++ as the language for company and enterprise development.

------
tootie
I think the simplest realization is that the compiler is always right. Yes,
technically, compilers can have bugs but 99.99% of the time the bug is yours.
I used to stare at non-working code and think "This should work!" But that's
never true. If it doesn't work, I made a mistake.

My next realization was that a well-defined and clearly communicated product
definition is 10x more important than good coding.

------
qw
1\. Talking to other developers.

Sometimes using "uncool" technologies and languages is ok, and they often have
a good reason.

2\. The concept of "innovation tokens" if you solve a business need

[https://mcfunley.com/choose-boring-technology](https://mcfunley.com/choose-boring-technology)

3\. There is no silver bullet.

Learned the hard way after switching to the new "flavour of the month"

------
sedeki
1\. When I realized that there's a thing called "computer science", and you
can actually study things related to programming. It's not just ad hoc tricks
written by experienced and bearded C programmers. You see, I'm self-taught
from an early age, and learnt programming via skimming "VB for Dummies" when I
was 8 y.o. or so. I didn't know anyone at all that knew how to program. It
would take another decade before I met someone who knew how to program.

2\. When I read K&R at the age of 14. Before that, I had just followed random
tutorials that I found on the internet. From that point on, I went straight to
the source (no pun intended) when it comes to learning new stuff.

------
ChicagoDave
1\. Polymorphism

2\. Vendors are often pushing bad architecture

3\. Architects often push things for the wrong reasons

4\. Companies often push risk to their vendors, avoiding collaboration,
inherently increasing risk

5\. ORM's hide business logic, preventing a company from understanding its
business and adapting to change

6\. Relational Databases are an operational anti-pattern

7\. Graph databases are excellent tools for common problems like security,
social media

8\. Document databases are excellent tools for Event Stores and CQRS

9\. Cloud native development (FaaS) is awesome

10\. Domain-Driven Design and Event Storming are excellent disciplines to add
to a corporate development group

11\. Corporations inherently don't understand Agile because they have to
measure everything through strong process definition

~~~
attil-io
> 6\. Relational Databases are an operational anti-pattern

Could you please expand on this? What is wrong with Relational Databases and
what would be the "correct" pattern?

~~~
ChicagoDave
The keyword in my statement is _operational_. Relational databases are
excellent for analytics and reporting. But in an operational system with well-
defined boundaries, it's more than likely that only portions of the larger
system are required for any transactional behavior. In a modern architecture,
you could simply write all transactions to an event store (write only) and use
CQRS and a relational cache store for reads.

Another aspect of all of this is the misconception that we should design
systems based on a relational data model. This only leads you through the
whole impedance mismatch of ORM's and how you serialize/deserialize your
bounded contexts or objects. We should not do this at all. We should design
systems on those bounded contexts and determine the data store accordingly.
Operationally, a Document Database or Event Store is going to be the best
tool.

We can fire change events at a data warehouse to store data in a relational
manner for analytics and reporting, but none of that is necessary for our
operational system.

In your operational system, if you need any sort of complex join, you've
already created a poor design.

I will say this. This did not occur to me until I'd gone through the process
of developing a complex system using Domain-Driven Design. Until you've done
that, the old patterns will remain like concrete.
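The write/read split described above can be sketched minimally like this (the names are hypothetical; a real event store would persist events and version its streams):

```python
events = []  # the operational write side: append-only, never updated

def append_event(event):
    events.append(event)

def balance(account):
    # the read side: fold the event log into a query-friendly view,
    # no joins needed inside the operational boundary
    return sum(e["amount"] for e in events if e["account"] == account)

append_event({"account": "acme", "amount": 100})
append_event({"account": "acme", "amount": -30})
print(balance("acme"))  # 70
```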

~~~
zamubafoo
If I'm reading this correctly, then it's not relational databases/relational
data models that are the problem, but instead the normalization across bounded
contexts within a problem domain?

~~~
ChicagoDave
Partly yes. But it’s also the realization that we tend to start with an ERD or
domain model before even looking at boundaries. We’re product-oriented. And
vendors don’t help. Look at how hard everyone is pushing containers.
Containers have uses, but there’s no rational explanation for the effort
behind their marketing push.

Simpler is better. Boundaries need to be respected. Tools should be selected
based on need, not desire.

------
LaundroMat
While I was reading "Eloquent JavaScript", I came across this absolute eye-
opener (to me at least):
[https://eloquentjavascript.net/07_robot.html#p_MtO6TwqB5I](https://eloquentjavascript.net/07_robot.html#p_MtO6TwqB5I)

In this chapter the author demonstrates that it is wrong to solve a problem by
creating a series of objects that do stuff. He shows that the right way is to
abstract the problem into its most essential elements. Do not try to emulate
reality with code; make the code its own abstract thing that solves the issue
instead.

This chapter and its code felt like pure poetry to me.

------
trumpeta
1\. Use ADTs instead of OOP, always (a.k.a. composition over inheritance)

2\. No ORMs

3\. No Frameworks, use libraries to build up your project

4\. Use strong type systems for modelling (make invalid states not
representable)
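
A small sketch of points 1 and 4 combined (my example, not the commenter's): a discriminated-union ADT makes states like "success that also carries an error" structurally unrepresentable, so the type checker enforces the modelling:

```typescript
// An ADT for a remote request: exactly one variant can hold at a time,
// so "loading with data" or "success with an error" cannot be expressed.
type RemoteData<T> =
  | { state: "loading" }
  | { state: "failure"; error: string }
  | { state: "success"; data: T };

// The switch narrows the union: each branch only sees the fields that
// actually exist on that variant.
function describe<T>(rd: RemoteData<T>): string {
  switch (rd.state) {
    case "loading":
      return "still loading";
    case "failure":
      return `failed: ${rd.error}`;
    case "success":
      return `got: ${JSON.stringify(rd.data)}`;
  }
}
```

Contrast this with a single object holding optional `data` and `error` fields, where every combination, valid or not, is representable and must be checked by hand.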

------
bfuclusion
When I stopped coding and started thinking. It turns out hacked-together stuff
is slower to produce than thinking overnight and getting it right.

------
dinkleberg
Finally understanding the appeal of statically typed languages after doing a
project in Go using GoLand.

I always thought I loved the freedom of Python and JavaScript (and I still do
to some extent; you’ll have to take Django from my cold dead hands). But the
power of static types becomes super clear when you’re using a great IDE.

I can hit a hotkey when hovering over any function and it’ll show me the docs
and take me to the full page if I want. With another hotkey it’ll autofill
parameter options. It’ll auto-import the libraries I need. It’ll complain if
I’m using the wrong type.

It gives me so much more confidence that the code I’m writing will actually
work. It’s slightly more of a pain to write, no doubt about that, but the
payoff is huge.

I was burnt by Java in the past and can’t stand how every project seems to end
up like this
[https://github.com/EnterpriseQualityCoding/FizzBuzzEnterpris...](https://github.com/EnterpriseQualityCoding/FizzBuzzEnterpriseEdition).
But Go has shown it doesn’t have to be like that.

~~~
JoeAltmaier
Whoa, I was with you up to the last line. Statically typed languages have been
around for decades. Go is a very, very recent addition. It hasn't shown squat.

~~~
dinkleberg
Should’ve said “it has shown me”. I know there are many great choices out
there, it’s just the first one that has sunk in with me.

~~~
JoeAltmaier
Fair enough.

------
Raed667
When react/redux finally "clicked" in my head. Coming from an MVC world it was
quite a paradigm change.

~~~
el_dev_hell
Same here. The first time I added a spinner animation while secondary data was
being fetched was an eye opener.

Although, I have a growing dislike for Redux.

~~~
Raed667
What don't you like about Redux (genuine question)? I find myself using
useState and useReducer a lot more recently, but I still think Redux, or at
least the pattern of having actions/reducers/dispatchers/selectors, is the
best way to approach state in the UI.

------
jacobwilliamroy
When I learned what a kernel was, and the implications of proprietary closed
source kernels, I was instantly radicalized to the cause of gnu and libre
software.

------
Oras
I always thought programming was my passion, since I never get tired or lose
motivation when I code, until I realised that my passion was DIY and
programming was just a tool to create things.

Starting from there, I don't take any programming language seriously anymore.
Whatever makes the job easier and faster to accomplish in context.

------
temporallobe
When I finally understood that browsers only understand HTML, JavaScript, and
CSS. Yes that’s a simplification, but it’s essentially true. Early on in my
career when I started more front-end development, I was under the impression
that there was somehow much more going on under the hood, i.e., more
languages, executables, ways of setting styles and layouts. When this finally
clicked with me, it was all so much clearer and immediately less intimidating.
I’m still surprised to this day that even many experienced developers think
that browsers can natively interpret SCSS, TypeScript, or whatever templating
language they’re using — heck, some people even used to call jQuery a
“language”.

As a relatively new Clojure developer, the meaning of the parentheses was the
ah-ha moment. I imagine it’s the same for other Lisp-y languages.

------
Antoninus
Making the realization that I only do this for money and attaching my self-
worth to my output or current role is futile.

------
mekster
What makes money is good business sense paired with shitty developer skill,
not good developer sense paired with shitty business skill.

Of course, being purely technically talented is one way to go as I have chosen
to be.

------
philholden76
Not really an 'aha' moment, more of a dawning realisation as I gained
experience.

When starting out as a developer, I think there is a tendency to see the
particular language/framework/syntax that you're using as all-important.

Over time, and with experience, you realise that the language and syntax are
just the "fine detail" of how you solve problems as a software developer: as
your understanding deepens, it's as if a kind of abstraction happens in your
brain, because you stop thinking so much about the fine details of language
and syntax, and start to worry about things like managing complexity and
optimising design etc.

And at that point, the realisation for me was that I can apply that experience
to any language/syntax/framework, which frees me up to pick the best way of
solving any particular problem, and to not be stuck in a rut with any
particular technology.

An added benefit is that a lot of the debate over stuff like "language X is
better than Y", or "this code style is correct and that one is wrong", becomes
unimportant, because you're thinking at a level that's not limited by
specifics.

------
manjana
Trying to master non-framework CSS and realizing how dead simple styling a
website can be when you are using a framework instead of trying to reinvent
the wheel all over again.

~~~
adventured
What is your favorite CSS framework/s?

~~~
manjana
Bootstrap - my one and only experience with a CSS framework. Love it, but
can't compare to anything yet.

I'll probably give Bulma, Tailwind or another popular framework a go for my
next project - let me know if you can recommend any others.

~~~
darkhorse13
You can give Halfmoon a try:
[https://www.gethalfmoon.com](https://www.gethalfmoon.com)

Disclaimer: I built it.

~~~
searchableguy
Were you inspired by tailwind?

~~~
darkhorse13
Definitely yes!

------
vincent-manis
\- when I read Clark Weissman's `LISP 1.5 Primer' in 1968

\- when I finished a student programming assignment a year later and realised
that a well-chosen set of procedures constituted what we'd now call a DSL

\- when I ported Martin Richards' very clean BCPL compiler and realized that
you could write efficient code in something that wasn't Fortran

\- when I read the Thompson/Ritchie paper on Unix

\- when I completed my first reading of SICP

~~~
greenleaf3
SICP made me fall in love with Lisp even more. It taught me how powerful the
tandem of function and closure can be.

------
bilinualcom
When I actually understood the idea behind functional programming, especially
being able to think in terms of recursive functions rather than loops.

------
logicslave12
When I watched a world class software engineer build a real time data
processing platform using only functional programming.

At a major company, over ten teams were automated in a year.

Beautiful and useful abstractions around data and data processing tasks that
provided extreme value.

I could never replicate it, but it reminded me of the power of software and
the extreme ability that some have to wield it.

~~~
quickthrower2
If someone is given the freedom to think and apply it, the results can be
amazing. Most companies don’t support this though. This was my experience
early on in my career - hardly anyone wants to let coders think; instead it’s
about fitting into the processes. Not saying I’m that clever, just common
sense stuff. Like, err, let’s not get 3 people to simultaneously build 3
similar screens in the app.

------
BiteCode_dev
Programming is using conventions. Lots and lots of them.

Conventions for syntax, names, logic, API, structure, vocabulary and so on.

Conventions are by nature arbitrary, influenced by culture, history, social
behavior and a whole lot of human weirdness.

Don't try to learn all the conventions first, it comes faster with practice
and exposure to it. Solve the problem, then find the conventions you need to
apply the solution in your context.

The beginner's paradox is that they need to learn a handful of specific
conventions (e.g. part of a language, one paradigm and a few libs) to start
working on solving a problem, so it's a frustrating experience.

There is no easy way to build up the rest of the conventions from what you
already know, because they're artificial.

It's also what leads people to say "don't learn languages, learn to program".
Which makes no sense to you at all, until, well, you finally know how to
program. But you got there by learning conventions on the way.

------
majky538
Spending so much time in meetings, especially the awesome agile scrum
standups. Even while working at probably my 5th company, business processes do
not work well and people deal with the same problems of work organization. For
me as a developer, this kills my productivity and annoys me when it's just too
much in a week.

------
animesh
1\. When I understood everything that happened in a custom d3 chart I created,
including d3 patterns, animation, and SVG, everything else just paled in
comparison.

2\. When I gave into Power BI finally and fixed a report someone else created,
it felt so good. I don't think there is anything that compares to it and I
only scratched the surface.

~~~
Raed667
This is on my bucket list! So far I have managed to get by either using
simpler libraries such as chart.js or adapting the data of an already-made
chart.

What would be a good resource to help me get to that aha moment?

~~~
animesh
This was when d3 was v3.5. I did a lot of experimentation but also
simultaneously read multiple blog posts for the smallest things.

Off the top of my head, I can remember these:

1\. [https://github.com/d3/d3/wiki](https://github.com/d3/d3/wiki)

2\. Jerome Cukier blog posts

3\. dashing d3 js

4\.
[https://alignedleft.com/work/d3-book](https://alignedleft.com/work/d3-book)

------
Jtsummers
I've had a few, but I'll pick one in particular. I got started with
programming via BASIC (various forms), none of which supported recursion. I
didn't really _know_ what it was, but knew that I just wanted to call into the
same routine again from itself or via mutual recursion (again, didn't know the
term). This didn't work as expected, however. Either the program wouldn't
compile/execute (threw an error at the recursion) or the recursion just
corrupted the data (each call shared the same local data, so they'd clobber
each others' work). While playing around on a TI calculator I programmed
something (can't remember the details) where I created a stack using the list
data structure. I then looped (rather than recursed), pushing data elements
onto the stack and popping them off. The program quit when the stack was
empty.

Later, in college, we were learning lower level programming details (like what
C translated to in assembly and how it managed calls and the stack frame).
Despite this being my third CS course in college, I hadn't really _grokked_
recursion yet. But I had a flashback during one of the classes to the TI-BASIC
programs I'd written using a stack, and realized I'd recreated recursion (but
manually). After that recursion and loops were synonymous in my mind (as they
should be, at least in many cases) since I knew how to translate between them.
Whenever I saw someone managing a stack and looping until it was empty, I knew
both that it could be and how it could be translated into recursion (and vice
versa).

It seems to be one of the hardest topics for many of my colleagues (especially
those without a CS degree, so lacking practice with recursion) to understand
or ever use. But I can usually get them to understand it once I draw a few
things out on paper and show the two solutions to a problem (recursive or
iterative). This doesn't mean they like recursion, most still avoided it, but
they started to understand that it wasn't magic, it was just the computer
managing the same data structure they were manually managing.
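
The loop-plus-stack translation described above can be sketched like this (an illustrative example, not the original TI-BASIC program):

```typescript
// A small recursive data structure to sum over.
type Tree = { value: number; children: Tree[] };

// Recursive version: the language's call stack does the bookkeeping.
function sumRecursive(t: Tree): number {
  return t.value + t.children.reduce((acc, c) => acc + sumRecursive(c), 0);
}

// Iterative version: a loop plus a manually managed stack.
// Push work instead of calling; pop until the stack is empty
// instead of returning.
function sumIterative(root: Tree): number {
  const stack: Tree[] = [root];
  let total = 0;
  while (stack.length > 0) {
    const node = stack.pop()!;
    total += node.value;
    for (const child of node.children) stack.push(child);
  }
  return total;
}
```

Both compute the same result; the iterative one just makes explicit the stack that recursion manages for you.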

------
znpy
I was studying operating systems and computer architecture in university (this
included a tiny amount of assembly programming) and was interning at a company
that used scala and learning scala. At the time there was quite a fuss about
the fact that the jvm did not support tail call optimization (at least until
Java 7 IIRC?).

So on one side i was programming in scala and on the other side i was "hand-
compiling" small C functions into m68k assembly...

The aha-moment came when I was hand-compiling a recursive function down to
m68k assembly and I saw that I could completely eliminate ALL the recursive
calls by re-arranging some register values and some values in the stack frame,
inserting a small preamble in the assembly and then at the end of the assembly
routine jump back to said preamble instead of making a recursive call.
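
The same transformation can be shown at the source level rather than in m68k assembly (a sketch of the general technique, not the original exercise): a call in tail position becomes a parameter update plus a jump back to the top of the routine, which is what the preamble trick achieves.

```typescript
// Tail-recursive factorial with an accumulator: the recursive call is in
// tail position, so no work remains after it returns.
function factRec(n: number, acc: number = 1): number {
  if (n <= 1) return acc;
  return factRec(n - 1, acc * n); // tail call
}

// The tail call eliminated: reassign the "arguments" and loop back to the
// top, mirroring the jump-to-preamble rewrite in hand-compiled assembly.
function factLoop(n: number): number {
  let acc = 1;
  while (n > 1) {
    acc = acc * n;
    n = n - 1;
  }
  return acc;
}
```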

------
meheleventyone
Realising that 90% of programming advice is unsupported junk.

~~~
nicklaf
Heh, I initially read this as "realizing that 90% of programs are unsupported
junk".

~~~
meheleventyone
Haha, definitely this too.

------
jariel
1) Complexity is very expensive

2) Written code can take a long time to stabilize and is thus expensive to
change and maintain

3) Ostensibly orthodox legacy systems are full of anti-patterns.

4) How easily a small problem can turn into an infinite number of other, small
problems, and the intuition required for where to 'draw the line'.

------
Megalepozy
Finding Anki after 17 years of programming and committing to insert every new
piece of data into it and practice daily.

For me there is no other way of learning anymore, and it serves as my real
memory. Basically it gives me the confidence that something I learn now will
still be known years later.

~~~
blazeWayne
Can you share a real example? How do you use it? What tools do you employ?

~~~
Megalepozy
Sure thing! (and sorry for the late reply :))

Anki is a program which allows you to learn/memorize using a spaced
repetition technique, where newly introduced and more difficult "ideas/notes"
are shown more frequently, while older and less difficult ones are shown less
frequently, in order to exploit the psychological spacing effect. More about
the technique at
[https://en.wikipedia.org/wiki/Spaced_repetition](https://en.wikipedia.org/wiki/Spaced_repetition)

I use it for everything that I learn which I also want to know in the future,
this range from CS related knowledge like a new algorithm that I learn to an
English word which I don't know to even remembering the main concepts from
books that I read or names of people that I meet.

The flow goes something along the lines of:

1\. Learn something new till you actually understand it

2\. Summarize it into a note or a few notes with proper questions (in a way
this stage reminds me of the Feynman method)

3\. Find/create some pictures, if applicable, as adding visuals enhances
memory capabilities (by create and find I mainly just take screenshots or save
pre-made images... ;))

4\. Insert it into Anki!

Admittedly this flow demands way more time and energy than just
listening to or reading some content, but after so many years of programming I
found that I had learned so much and forgotten most of it... so I prefer to
learn better and slower. Honestly, after doing this for around 1-2 years now,
I have to say that it's absolutely amazing (I tell anyone who is willing to
listen, not only programmers).

There is also a daily practice where you need to pass through some notes (this
is the spaced repetition part); this usually takes anywhere from 5 to 45
minutes a day (for me, depending on how intensely I've been learning
recently).

You can download Anki for free, with no subscription fees, at
[https://apps.ankiweb.net/](https://apps.ankiweb.net/) It also supports
Linux/PC/Mac/Android/iOS (the iOS version costs money).

------
agentultra
Some more recent ones of note...

0\. When I learned how to model systems in TLA+

1\. When I figured out how to structure programs using monad transformers...
sort of the moment where I started reasoning algebraically about programs on
my own

2\. When I learned how to specify propositions in types and verify my designs
and programs

------
shivenigma
1\. When I finally understood how the event loop in JavaScript works, I had an
aha moment about the language's design.

2\. When I read about how garbage collection works in the V8 JavaScript
engine, I had an aha moment about how hard things are just one layer below my
working area.

3\. When I started learning Rust, had another aha about how the language is
designed without GC. I never thought that was possible in a language.

These may be simple things for many, but I have no educational background in
any of these, so I'm amazed by things that people have just gotten used to.
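
The event-loop behaviour behind point 1 can be demonstrated in a few lines (a standard illustration, not from the comment): promise callbacks (microtasks) run before a zero-delay `setTimeout` callback (a macrotask), even though the timeout was scheduled first.

```typescript
// Record the order in which the three pieces of work actually run.
const order: string[] = [];

function demo(): Promise<string[]> {
  return new Promise((resolve) => {
    // Macrotask: queued first, runs last.
    setTimeout(() => {
      order.push("timeout");
      resolve(order);
    }, 0);
    // Microtask: runs after the current synchronous code, before timers.
    Promise.resolve().then(() => order.push("microtask"));
    // Synchronous code: runs immediately.
    order.push("sync");
  });
}
```

Running `demo()` yields the order `sync`, then `microtask`, then `timeout`.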

------
dipiert
1) Some people don't share your passion for software. They consider it just
another job and aren't interested in improvement as long as they can do their
daily programming tasks. That's ok.

2) It's not enough that an idea makes sense to you for you to communicate it.
People usually hate change; you need to understand all the details before
opening your mouth.

3) I remember many a-ha moments regarding how the Internet works while reading
"Computer Networks" by A. Tanenbaum.

4) When I was finally able to exit vim.

------
l0b0
1\. My first job out of university, when I discovered you can do _interesting_
things with programming.

2\. Learning XForms, which can do most of what JavaScript is used for in a
tiny amount of code.

3\. Learning JetBrains IDEA, the only IDE that I ever enjoyed using.

4\. Learning red-green-refactor TDD. Now refactoring is something I do as a
matter of routine, not dread.

5\. Understanding the fractal complexity of Bash. It's weird how a language
can make stream processing and parallelization basically trivial, while making
things like looping reliably over non-trivial sets of files PhD-level tasks.

6\. Doing pair programming, and then keeping it up for four years because it
was brilliant.

7\. Installing Arch Linux, the first distro where things weren't constantly
broken because of version drift.

------
daxfohl
Watching a coworker and his Zettelkasten notes. Junk that he did three years
ago: he could pull it right up, find everything related in a couple of
keystrokes, and reproduce the results. Unlike my lame attempts to keep notes
that I can never find again.

~~~
jhonnycano
Do you remember what software he used? I have been looking for a solution
with this "methodology" but haven't been able to find anything satisfactory
for me. Thanks in advance!

------
lonelyasacloud
How easy I am to distract when I think I have something important to say ...
gosh dang it ;-)

------
stunt
When I realized I can use a simplified version of my take on Agile and Scrum
with a simple Trello board, and apply that on my side projects. And that could
help me to build them twice as fast and actually finish them too.

------
tehlike
The moment when I first wrote my cgi/perl thingy and deployed my first
website.

------
Havoc
Docker & associated devops stuff like docker-compose. Had a bit of that "speak
the right incantations and magic appears out of thin air" vibe to it as with
learning to do basic programming.

------
vbsteven
That everything in software, and most hardware/electronics, is just logic all
the way down, with different levels of abstraction.

This is why I love software so much and struggle with people. Software is all
logic and mostly deterministic, while with people feelings are involved (which
could be argued to be also deterministic, but then we get into philosophical
discussions about free will).

But on the software/hardware side, any bug can be resolved by digging through
the layers of abstraction and figuring out where the logic error is.

------
Rerarom
1\. When I first learned to program (in Pascal)

2\. When I learned event-driven programming using Windows Forms in C# and I
was able to create programs that resembled the ones I used

3\. When I took a course in POSIX, programmed using fork and pipes and learned
about stuff like race conditions

4\. When I spent a year learning everything there is to know about coding
(including assembly, lisp, smalltalk, rust) and realized I would never feel as
happy as I felt during 1-3 because I had changed as a person

------
david_m
1\. Figuring out and loving functional programming in JS/Node.js, coming from
a Java/OO background

2\. Hating working on a Java/Angular/OO project after 5 years of FP

------
erik_seaberg

      [1]> (+ (expt 2 32) (expt 2 32))
      8589934592
    

Numeric overflow is a _choice_. Even a 16-bit program that might have had a
32-bit ALU could simply decide to support larger numbers rather than producing
something meaningless and dangerous.

    
    
      def retry[T](block: => T): T = ???
    

Half of what I missed about macros is being able to control (re)evaluation of
an expression, and call by name handles that really well.

------
abetusk
I'm not sure if these are too meta, but here are ones that I think are
relevant:

\- The free/libre/open source community cares deeply about fundamental
problems of our society and is trying to provide legal and technical tools to
help take steps toward a better world. When I was younger, I thought
"suckers! They're giving their compiler away for free!" It took me a while to
internalize the free software ideals and even longer to become an active
proponent.

\- Corporate software, especially Microsoft, is in the business of creating a
walled ecosystem, charging end consumers for their product and charging
software developers to be part of that ecosystem. The first 'a-ha' moment was
when I realized they, and others like them, were a racket, or at least trying
to be one. The later 'a-ha' moment was when I realized there was a viable
alternative to this game.

\- Most (but not all) programming language flame wars about which language is
better boil down to whether the developer prioritizes speed of program
execution or speed of program development. See [1]. Newer language wars center
on safety and bug creation, so maybe this point is dating me.

\- Programming languages are more about human psychology than about some
mathematical proof of correctness. Programming languages are designed as a
compromise between how a computer needs to be told to operate and how humans
communicate. All the theoretical and mathematical infrastructure around
language design is there to justify decisions once the language has passed the
"psychology" test. This is the same idea behind JPEG, MP3, etc. compression:
Fourier and wavelet transforms don't inherently create a saving; the benefit
only comes when we can throw away higher-order coefficients because the human
perceptual system isn't as sensitive to them as it is to lower-order ones.
That is, JPEG and MP3 are specifically tailored to the human perceptual
system. The math is there as the building block, but which bits/coefficients
to throw away is determined empirically. To belabor this point a bit more:
programming language discussions arguing over functional vs. imperative, type
safety, etc. that don't try to define measurable metrics of success,
preferably by testing their hypotheses against data collected on actual usage,
are expressing an opinion only.

[1]
[https://web.archive.org/web/20200407014302/http://blog.gmarc...](https://web.archive.org/web/20200407014302/http://blog.gmarceau.qc.ca/2009/05/speed-
size-and-dependability-of.html)

------
quickthrower2
You can get paid more and have an easier time.

You can’t possibly know what a company you join will be like until you
actually have been working there 12 months.

For large code bases you can’t really rearchitect anything. You are stuck with
how it works. Maybe on small scales you can refactor.

Don’t blindly apply design patterns. SOLID is good as a thinking framework
rather than a code review gate.

Marketing isn’t what you think it is until you study it somewhat. E.g. it’s
not glossy ads!

~~~
etripe
> You can’t possibly know what a company you join will be like until you
> actually have been working there 12 months

Does it actually take you 12 months, or did you mean that as a measured,
sensible statement? Or are you perhaps looking at it from a bi-directional
loyalty or advancement perspective and not just general culture?

For me, that would only apply to companies that I interpret as
middling/unimpressive at first glance. The really good or bad companies stick
out much more, so I can usually tell by the hiring and onboarding processes
and the first couple of tasks you're given, even as a consultant.

~~~
quickthrower2
I don’t see how you can know the team dynamics and politics (all companies
have politics) before joining somewhere unless you have a friend working
there. And even then, what they pick up on might be different from what you
would. I’ve seen a company I haven’t worked for, but which was on my radar, go
from a hero company that employees loved to a mass walkout soon after it was
bought out.

Buy outs are a big factor plus reshuffles of management and teams.

Not only that, I’ve had excellent vibes at places in the first 6 months only
to find out later that the asshole factor was high. Also, even without any of
this, stuff just changes, and I don’t see how that is avoidable. Change is
constant! YMMV.

~~~
etripe
Please note I wasn't saying _before joining_, either. I agree on your points,
though.

------
kangnkodos
When I first started compiling code, there were often pages and pages of
compiler errors. I felt that I had to read all the errors every time, and then
somehow the most important information would magically emerge from taking in
the big picture.

I learned to focus in on just the first compiler error, and ignore all the
rest.

Read the first error. Resolve the first error. Recompile. Repeat.

This is just one example of breaking big problems into smaller problems.

------
baq
1\. computer science is maths, software engineering is PowerPoint.

2\. in software engineering it's always a people problem no matter what they
tell you. see point 1.

------
yodsanklai
There were so many... I learned programming by myself with limited resources
when I was a kid. Eventually I got a formal education and there were a lot of
aha moments!

For instance:

1\. Writing a Scheme interpreter in C

2\. Abstract data types and encapsulation

3\. Functional programming and recursive algorithms

4\. Grokking OOP and design patterns (this took me _a long time_ )

5\. Understanding how processes and scheduling are handled in an OS

More recently, not really an "aha" moment, but Git has been a game changer.

------
josgraha
my aha moment as a developer

\- I would learn some technology to a point where it does what I want

\- I take what I learned to some new technology but find myself doing things
from what I learned in other technologies

\- this puts me in a strange loop where reality wasn't lining up with my
expectations or I would do things that were more work than necessary because
of learned habits

\- the aha moment was when i started learning the theory of the thing I was
working with

* it was key to getting out of this rut

* turns out this applies to any theory from abstractions all the way down to computational theory, type theory, automata theory, software analysis theory, hey if i ever get there maybe even software synthesis

in a nutshell i would summarize this aha moment as "you can keep doing the
same old tricks and eat the cost or you can always dig deeper and see what
costs can be avoided."

~~~
josgraha
this might be a bit vague so to put another way perhaps I learned to zoom in
before zooming out, and once I have done something, it never hurts to zoom in
again and see what can be gleaned from the assertions I was making about the
world at the time. in practice some things I have learned to ask myself are

\- what is the shape of the input data (requirements, configuration,
dependencies are also data) of the thing I'm working with and what is the
shape I'm looking to produce as output of this software (could be any
combination of side effects, screens, or just data)?

\- on a larger feature or set of features i look for a domain language to be
discovered or did I create a domain language and does it hold true?

\- what are the fundamental assumptions I was making and could they be
improved from first principles?

\- what is the expected and unexpected behavior of what I am building today or
what I built in the past and why? (learning opportunities)

\- the takeaways: my assertions are only as good as my understanding, and
understanding requires detail; attention to detail is only as good as my
checklist, and my checklist is only as good as the questions I'm asking. This
helps create a habit loop where I can hopefully improve outcomes with each
iteration based on deeper introspection

------
chubot
Using shell as a REPL for C.

I learned C on Windows, and before I learned any dynamic languages. And before
I had ever written a unit test.

I knew all the rules, but I was not good at making a reasonably sized, correct
program in a reasonable amount of time.

\---

But then I developed a good workflow for writing Python, shell, etc., and then
went back to writing C, and it helped immensely.

C is a dynamic language in many respects anyway!

------
phendrenad2
When I realized that my code won't last forever, and in fact shouldn't. My
code will solve a problem now, but in 5-10 years someone else will rewrite it.
The company may pivot, get acquired, or merge. So build a solid foundation,
but don't plan for it to become the next world wonder.

------
7sigma
1\. lisp

2\. realising that it's more about delivering than dreaming up the perfect
abstraction (get it done).

3\. what you think the user wants vs what the user thinks they want vs what
the user actually wants vs what the user actually needs.

4\. that there are always tradeoffs

5\. building a product by yourself (whether on the side or starting up) will
give you invaluable experience.

------
stack_underflow
1\. The first language I ever used/learned was batch, to script things in
Windows. When I learned Python shortly after, I recall it taking quite a bit
of convincing myself to accept/use the "magic" of control flow automatically
jumping back after a function returned, as I was so used to manually wiring up
all my gotos (more of a painful experience than an 'ah-ha', I guess...)

2\. When I was going through Tim Roughgarden's Algorithms course and saw the
derivation of the runtime complexity of mergesort and finally
understood/visualized what a logarithm actually does (in school it was just
taught as some rote function to help you manipulate equations of the form
y=b^x)

3\. Learning how TCP works from the bottom up. I think the biggest aha moment
was when the textbook I was reading explained the TCP algorithm as a state
machine that's just running on the client and server machines with the rest of
the underlying network just forwarding packets, i.e. "pushing the complexity
to the edge of the network".

4\. Working through the nand2tetris project resulted in a lot of "oh X is just
basically this at its core"

5\. When going through a textbook explaining how relational database engines
were implemented and seeing that they're essentially just using disk-persisted
hash tables and self-balancing search trees to build indices on columns and
make `select`s O(1)/O(log) time (I wasn't taught this in my uni's database
course and assumed there was some fancy magic going on to make queries fast)

6\. Realizing that I could just do a form of graph search/dependency
resolution when learning a new codebase/trying to understand how a function
works. I think before seeing someone do this in front of me I would usually
just panic at the thought of "thousands of lines of code" rather than just
"keep iteratively diving into the functions being called". Whenever I'm
learning a new language, the first thing I'll do is set up the LSP plugin in
vim so that I can quickly navigate up and down call graphs. Tbh I don't
understand how some developers claim to not need this and instead just
manually grep+open file in order to "jump to definition".
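"Keep iteratively diving into the functions being called" really is just graph traversal. A sketch, with a hypothetical call graph standing in for a codebase:

```python
from collections import deque

# Hypothetical call graph: function -> functions it calls.
# "How does handle_request work?" becomes a reachability query.
calls = {
    "handle_request": ["parse", "route"],
    "route": ["auth", "render"],
    "auth": ["check_token"],
    "parse": [], "render": [], "check_token": [],
}

def reachable(fn):
    """BFS: every function you'd hit by iteratively diving into calls."""
    seen, queue = {fn}, deque([fn])
    while queue:
        for callee in calls[queue.popleft()]:
            if callee not in seen:
                seen.add(callee)
                queue.append(callee)
    return seen

print(sorted(reachable("handle_request")))
```

An LSP's "jump to definition" is just letting you walk the edges of this graph one at a time.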

7\. Forcing myself to derive the rotation logic for AVL trees. I was curious
whether, given just the high-level properties an AVL tree must maintain in
order to guarantee O(log n) lookups, I would be able to figure out all the
rotation cases. It was a very rewarding exercise and something I plan on
writing a blog post about (eventually...)
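The payoff of that derivation is how small each rotation turns out to be. A sketch of the single right rotation (the Left-Left case):

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def rotate_right(y):
    """Left-Left case: y's left subtree is too tall, so y's left child x
    becomes the new subtree root and y becomes x's right child; x's old
    right subtree moves under y. In-order traversal (and hence BST
    validity) is preserved."""
    x = y.left
    y.left = x.right
    x.right = y
    return x

def inorder(n):
    return inorder(n.left) + [n.key] + inorder(n.right) if n else []

# unbalanced left chain: 3 <- 2 <- 1
root = rotate_right(Node(3, Node(2, Node(1))))
print(root.key, inorder(root))  # 2 [1, 2, 3]
```

The other three cases (RR, LR, RL) are this rotation mirrored or composed with its mirror.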

(edit)

8\. Learning about the log data structure and how state can be replicated by
replaying/executing this stream of transformations/updates.
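The log idea fits in a few lines: any replica that applies the same ordered stream of updates converges to the same state (operation names here are illustrative):

```python
# State as a replayable log of updates: applying the same log in the
# same order always reproduces the same state.
log = [
    ("set", "x", 1),
    ("set", "y", 5),
    ("incr", "x", 2),
    ("del", "y", None),
]

def replay(log):
    state = {}
    for op, key, val in log:
        if op == "set":
            state[key] = val
        elif op == "incr":
            state[key] += val
        elif op == "del":
            del state[key]
    return state

print(replay(log))  # {'x': 3}
assert replay(log) == replay(log)  # replicas replaying the log converge
```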

~~~
bear8642
>Working through the nand2tetris project

Done various sections of this and would heartily agree with how illuminating
working through the project's been

------
sys_64738
When I realized Emacs can do everything.

------
lazharichir
1\. When I learnt TypeScript after having used mostly PHP and JS until then –
not sure how I lived without stronger typing

2\. When I started reading programming and software engineering books after
having just "done it" – this brought so much thought and structure to what I
used to improvise

------
dave_sid
My favourite Aha moment is the chorus of The Sun Always Shines on TV. Gives me
shivers down my spine.

------
sebastien_b
In the '90s, as a teen, I was reading the Amiga ROM Kernel Manuals to try to
learn programming on the Amiga.

The section on graphics and UI described BOOPSI - an object-oriented way of
constructing UI elements with inheritance, etc. Had never been exposed to that
before and it blew my mind.

------
SargeZT
1\. Objects finally 'clicking' into place for me.

2\. Being an early adopter of Tulip (later renamed asyncio) and gaining an
understanding of the event loop and concurrency without threads.

3\. Understanding that all code is just a particular representation of some
S-expression.

------
adontz
Microsoft Excel is a functional language IDE where each cell is a function
expression.
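That observation can be sketched as a toy evaluator where each cell is a function of the sheet (a hypothetical mini-model for illustration, not how Excel is actually implemented):

```python
# Each "cell" is a function of the sheet, evaluated on demand,
# much like a spreadsheet formula.
cells = {
    "A1": lambda s: 3,
    "A2": lambda s: 4,
    "B1": lambda s: s("A1") + s("A2"),  # =A1+A2
    "B2": lambda s: s("B1") * 2,        # =B1*2
}

def get(ref):
    return cells[ref](get)

print(get("B2"))  # 14
```

Recalculation, dependency tracking, and caching are what a real spreadsheet adds on top of this core.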

~~~
ravivyas
+1 to this, but I came upon this indirectly.

My biggest "OK Moment" (1) was non developers building tools via Spreadsheets
Sheets when they do not have a tech team or the team's bandwidth, Which makes
them IDEs. They also function \- independent micro data stores \- a data
exchange mechanisms

1: [https://ravivyas.com/2020/07/07/stack-the-bricks-with-ok-mom...](https://ravivyas.com/2020/07/07/stack-the-bricks-with-ok-moments-rather-than-bet-on-aha-moments/)

------
globular-toast
When I realised it's not magic.

The first computer I had access to was my family's Windows 95 PC. I learnt to
write HTML in notepad and see it rendered in Internet Explorer. This was the
beginning of my career, but so much of what was happening was accepted by my
brain as simply "magic".

I would later chip away at that stack of magic and learn how more and more
things worked, but even while being an accomplished programmer I still had
this feeling that magic existed. It wasn't until I learnt to build my own
computer from discrete logic components (thanks, Ben Eater) that I finally
felt like I understood it. Computers are just machines.

I've since revisited textbooks (like Knuth) and the history of computing
(starting with Babbage) and feel like my eyes are no longer obscured by my
preconceptions about what a computer is.

------
tibbydudeza
Dad's HP-67 ... I could program it ... he later got an HP-41C. Dad bringing
home his HP-85 for my school vacation, for "work". Turbo C. Unix. What is new
is actually old ... the "Mother of All Demos".

------
scott31
When I was introduced to Arc, it was the missing piece that completed LISP for
me

------
cryptica
When I realized that the entire software industry is a farce; the only purpose
of which is the creation of jobs in an unnatural supply-side economy driven by
unequal, preferential access to easy money.

------
christophilus
Learning Clojure. All of the FP theory I’d learned in university finally
clicked and made sense. Also, it challenged my notion that the only sane
languages were statically typed.

------
tmaly
If you are good at communicating, sometimes you do not even need to write a
line of code.

Understanding what someone really needs should be a skill taught in CS.

------
austincheney
Most recently: HTTP/2 has unpredictable performance problems when not used
for typical request/response traffic or binary streams.

------
endori97
Reactive programming coming from OOP was it for me.

------
switch007
I'd be interested to know what the 'aha' was when learning Express (and how
you find it compared to Django)?

------
leeman2016
1\. Finding out about WPF after years of work on WinForms, in that you can do
MVVM stuff, control templating, vector graphics, etc.

2\. React and Typescript

3\. Jetbrains IDEs

------
neillyons
After going through Ben Eater's 8bit computer video course on YouTube. He
builds a computer on breadboards. It is amazing.

~~~
mentos
Yeah, I would say the same about nand2tetris for me. I only discovered it
after college; I wish I had found it sooner.

------
RickJWagner
When I purchased the most amazing problem solving device I've ever had. A
personal whiteboard.

~~~
eclectric
How do you retain the learnings? I have a blackboard for the same purpose but
am not able to commit to it for this reason.

~~~
RickJWagner
Yes, that's a downside. I bought a large sheet of whiteboard material for
about $12 and cut it into quarters. If I have some subject that's really
important, I'll let that one stay for a while.

I probably should transfer the really good findings to a notebook, or maybe
take a photo. But I haven't yet.

------
HeadHonchoSP
1\. That you only need VIM :qa

------
cutler
My discovery of Clojure followed by my discovery of Ruby's Lisp roots.

------
throwaway_pdp09
Realising that functions could be first-class objects, that is, they could be
passed around just like integers or strings. Suddenly a whole new world of
possibilities to simplify code magically opened up.
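A minimal sketch of what "passed around just like integers or strings" means in practice:

```python
# Functions treated as values: stored in a dict, handed to other
# functions, and returned from functions.
def twice(f):
    """Return a new function that applies f two times."""
    return lambda x: f(f(x))

ops = {"inc": lambda x: x + 1, "double": lambda x: x * 2}

print(ops["double"](10))                     # 20
print(twice(ops["inc"])(10))                 # 12
print(list(map(ops["double"], [1, 2, 3])))   # [2, 4, 6]
```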

------
yitchelle
The less code I write, the fewer bugs it will contain.

------
scollet
Networking and byte packing. Still amazes me.

------
m00dy
the moment that I realised what a syscall is.

------
slmjkdbtl
(gamedev) When I switched from Rust to C.

------
username90
Most of the time spent developing is spent making decisions.

To me technical debt is therefore defined as how many decisions you have to
make in order to create a feature. Clean is when you don't have to make many
decisions to get things done.

Example decisions, I'd say I spend at least 90% of my time developing on these
decisions:

\- What feature would be good to have?

\- Is the feature worth the effort to build?

\- Is the feature worth the compute costs?

\- What language/framework should we use for this feature?

\- How should we structure persistent data related to this feature?

\- Where should the code for this feature live?

\- How should we test this feature?

\- How performant should this feature be?

\- What name should this helper function/variable have?

The more of those you have to think about while developing, the slower you
will make progress. Therefore the main productivity hack is to write down
guidelines, roadmaps, or design documents for all of those so you don't have
to think much about them while developing. This means: don't be a manager
when coding; let someone else do that work, or do it before you start
programming.

Things you can do to reduce mental cost of above decisions:

\- Product roadmap with features that would be good to have.

\- Discussions in the roadmap related to how much value said feature will
provide and the effort to produce it.

\- Discussions in the roadmap related to how expensive the feature will be to
run.

\- General guidelines on what language/framework you use.

\- Have a very good architectural document describing how you structure
persistent data.

\- Have list of example commits showing where to put code for different
features.

\- Have a well documented testing strategy with examples pointing to commits
with good integration and unit tests.

\- Have guidelines on how long typical actions are allowed to take, like "a
page update should take 100ms at most".

\- Try to write code where you don't need a lot of long superfluous names;
namespaces and anonymous functions are your friends.

\- Lastly, as much as possible, try to provide reasonable defaults for shared
code. If you have to make 20 configuration decisions in order to use a
library then you won't save a lot of time using it, and likely people will
just copy the configuration from other places anyway, since making 20
decisions is too much work. For example, let's say your library has a flag
that can speed up processing 2x in some cases but adds extra overhead most of
the time. You could think that forcing the developer to decide in each case,
to ensure we aren't missing any performance improvements, would be a good
thing, but in reality a 2x performance improvement rarely matters. So the
cost of having every developer make this decision outweighs the performance
benefit. Instead, have it as an optional config that they can set when they
actually need the performance.
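The last point can be sketched in one function (names and both "paths" are hypothetical stand-ins; the two branches deliberately produce identical results, since only performance is supposed to differ):

```python
# Reasonable default: the fast path is opt-in, so most callers make
# zero configuration decisions; the rare caller who measured a
# bottleneck flips one flag.
def process(items, fast_path=False):
    if fast_path:
        return [x * 2 for x in items]  # stand-in for the optimized path
    return [x + x for x in items]      # stand-in for the simple path

print(process([1, 2, 3]))                  # no decision needed
print(process([1, 2, 3], fast_path=True))  # same answer, opted in
```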

------
non-entity
Not sure I can pinpoint it to a specific time or anything, but as of late
nothing feels like "magic" anymore. Many years ago there were all sorts of
classes of software, patterns, etc. that looked and sounded arcane and
complex, and I thought I'd never understand how they worked. Now nothing
really feels like that. I can typically get an "aha" moment by doing a bit of
reading and understand, from a very high level, how something works. That's
not to say I'm a good programmer who can write and implement anything,
because that's far from the truth, but I can typically understand stuff and
lose the intrigue. Kinda sucks, though, because it's killed the motivators
that got me into the craft.

~~~
daxfohl
I agree with this. Lots of aha's come from "hey why isn't this working",
digging into library implementations or system internals, and figuring it out.
It's stuff that's annoying at the time because you're working on something
else, but each time it happens you learn more about what's going on under the
hood. And eventually you realize these things you picture as dark arts are
actually pretty straightforward, and roughly how you'd have guessed they were
implemented if you'd thought about it hard enough.

