Ask HN: As a developer, what are your aha moments?
125 points by greenleaf3 14 days ago | 184 comments
To me, the aha moments were as follows:

1. When I was introduced to QBasic

2. When I got to know how simple and amazing Lisp is

3. When I was able to code at the speed of my thoughts with VIM

4. When I got to know Express.js (after learning Django)

5. When I learned that everything in Smalltalk is a message, including if/else statements




My "aha" moment was realizing most of my ideas and most apps out there are complete garbage. Not needed. Damaging, even. 99.9% of all of it.

For example, most "cutting edge" web apps are better off as PHP monoliths. Facebook was a PHP file for a long time. But most apps in general should never make it past being shell scripts, which are better off staying as spreadsheets or better - text files which are better off as pieces of paper whenever possible. And all paper is better off left as thoughts whenever possible, and most thoughts should be forsaken.


Ha, nice. I’m reminded of that quote by Jenny Holzer:

> Disgust is the appropriate response to most situations.


I've been a lurker for 5 years.

Congratulations, you just wrote my favourite comment on Hacker News.

Thanks for the laugh.

You certainly make a strong case.


I was really into learning new tech and new algorithms in my free time, and one day I got the aha moment you describe; since then I have started reading fiction.

Truth! Low tech or no tech is much easier to deal with if you are forced to.

Where’s the fun in that?


Very poetic! But true.


1. When I realized programming was hard

That’s it. For a long time I thought I was good at programming and I kept making the same mistakes over and over. All nighters, leet coding, runtime hacks, etc...

And then one day it struck me. It’s hard and my mind can’t keep up with yesterday’s cowboy coding.

Slowly I started putting defense mechanisms everywhere - good type systems, immutability, compile time instead of runtime, never overwork and sleep the best I can.

Life is much better now and I'll never go back to thinking I'm good at programming.


I figured it out through a different incident. I was in my first year as a programmer, and I believed I was a pretty good one.

I remember I was hanging around in a small computer book store, and there was a big copy of 'Code Complete 2' there. I had totally no idea what it was, I just opened it out of interest, and I remember I flipped to a chapter about how to write a good 'if else'. And I was like 'holy crap, I wasted my entire life, I never thought that way, I must be the rookiest rookie in the entire programmer world'. I immediately purchased the book, and it indeed helped me a huge deal in my career.


Paraphrased famous CS quote from somewhere: "The sooner you acknowledge that you cannot fit the entire thing into your head, the sooner you can start factoring that acknowledgment into your design."

similarly: realise that i can only think about a small number of things at once. so try to structure things to separate concerns so that i get to avoid thinking about nearly all things when working on any given part of the system. a lot easier said than done.


> good type systems, immutability, compile time instead of runtime, never overwork and sleep the best I can

I agree with you on this.


> I started putting defense mechanisms everywhere

Just weeks ago, the software I had just finished writing complained to me at first run that I had forgotten to configure a setting. Defensive coding ftw.

> never overwork and sleep the best I can

Your best work can be done while not at your desk. Don't be afraid to take frequent breaks.


How did this work - genuinely curious? If you work as a team did everyone else stop being a cowboy too then pay down that tech debt? Or is this solo work or perpetual greenfield work?


It's a bit more complicated than I made it look, of course. My bit didn't flip in a day, and part of the reason I changed my ways was that I'd worked with a few people who seemed to have figured this out already, and just by observing them I could figure it out myself.

Nowadays it really depends, some people for sure get it more than others. Some just dismiss a lot of the safeguards I value as some kind of functional programming hype. Teams are hard :)


Not AHA moments, but some pointers:

- most tech is stupid simple, absurdly simple, just with an incredibly complex description

- SOA is stupid and a monolith is always the way to go (maybe with a handful of simple microservices)

- microservices are not SOA

- event sourcing is almost never a good idea

- raw bytes over the wire or in storage are not as complex and mysterious as you might think

- compiled languages are not "smarter" than interpreted languages, don't feel like you are lesser of a programmer if you do php or javascript

- new cool and shiny tech is cool and shiny for 5 minutes, until you implement it, then you realize it brings a ton of new complexity and a ton of wasted time and you should have stayed with what worked before and you realize the people who promoted it were just trying to make a sale

- mysql/mariadb will handle way more than you think

- if you want to waste resources, use java/jvm

- yagni should be tattooed on every programmer's hand so when he types on the keyboard he is constantly reminded that he is wasting time with stupid crap nobody will use

- don't think in "what if" terms or try to predict the future, just code what is needed right now, avoid being a smartass

- you can charge more for your services the more essential you are for the project and the harder it is to onboard new people, ie. your value increases over time

- if you charge less now, you will get paid less tomorrow

- running your project entirely in the cloud will bankrupt you; use it to gather usage data, then move to bare metal. cloud is cool and "in" but it will eat your wallet, for no good reason whatsoever

- single binary is always better than docker

.. I could go on and on... but these are more guidelines than aha moments, so that is it for me.


With bare metal, are you not just paying more for staffing to maintain it? Bare metal as cheaper sounds suspicious. Bare metal as essential maybe if you are doing something like 3D graphics processing or mining or “big data”.


> With bare metal, are you not just paying more for staffing to maintain it?

People always say this as an argument against bare metal servers, but don't you still have that staffing cost even if you're using a cloud provider? Cloud servers are more reliable than bare metal, but certainly not at a 100% SLA yet. Shit happens regardless of whether you use cloud or bare metal, and we have to prepare for it whatever type of server we use.

Most issues I had with both cloud and bare metal servers were connectivity issues, which were addressed by the vendors or data center operators with me doing nothing but refreshing the status page rigorously. I've never had hardware issues, but that's probably because I always retire old servers after 4 years of operation.


> People always say this as an argument against bare metal servers, but don't you still have that staffing cost even if you're using a cloud provider?

I think their point is that it may not be cheaper if you account for staffing cost. Also, especially as a lone developer or a small team, it may be worth it to pay for someone else to manage the metal instead of doing it yourself.


You can just rent bare metal (or "root servers", as they were once called). Then you just have to worry about the occasional hardware defect, but at small scales that basically never happens.


Of course bare metal is cheaper, that is literally the reason cloud providers can be profitable.


TCO. Total Cost of Ownership. Bare metal + staffing costs should be compared to cloud + its staffing costs.


> event sourcing is almost never a good idea

Wait, isn't event sourcing the fundamental principle of Redux? And Redux is the most popular UI state management library out there. Why do you feel event sourcing is not a good idea?


Think he means event sourcing in a distributed transaction context, not a front-end context: https://microservices.io/patterns/data/event-sourcing.html
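
To make the overlap concrete, here's a minimal sketch of the shared idea (hypothetical account domain, TypeScript only because it's compact): current state is just a fold over an ordered log of events, which is what both a Redux reducer and a server-side event store rely on.

    // A minimal event-sourcing / reducer sketch (hypothetical account domain).
    type AccountEvent =
      | { kind: "Deposited"; amount: number }
      | { kind: "Withdrawn"; amount: number };

    interface Account { balance: number }

    // The "reducer": derive the next state from the previous state and one event.
    function apply(state: Account, event: AccountEvent): Account {
      switch (event.kind) {
        case "Deposited": return { balance: state.balance + event.amount };
        case "Withdrawn": return { balance: state.balance - event.amount };
      }
    }

    // Replaying the whole log rebuilds the current state from scratch.
    const log: AccountEvent[] = [
      { kind: "Deposited", amount: 100 },
      { kind: "Withdrawn", amount: 30 },
    ];
    const current = log.reduce(apply, { balance: 0 }); // { balance: 70 }

The debate is really about whether that append-only log should be your operational system of record, not about the pattern itself.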

Spot on! Agreed with everything! Some add-ons:

- whatever cool code you write, if you can't sell it, it is useless

- every dev should work as a support tech for some period. Only then do you understand what your customer needs and the challenges of debugging

- Before adopting any new tech, always, always look at how difficult it is to debug when shit breaks

- keep things simple, so you can have peaceful sleep

- You will never learn everything, accept this. Learning is a continuous process. Just learn to stay curious, that's all.

- learn more and more CS fundamentals instead of getting certs in over-hyped products

- if you want to learn coding, code. don't watch video tutorials.


> whatever cool code you write, if you can't sell it, it is useless

code can be valuable in more ways than just ‘someone will pay money for this’.


1. When I realized programming was easy

That's it. For a long time, when I thought of the hardest stuff to program, I was thinking of computer games. Mind you, not the latest AAA billion-dollar-budget, 300+ people on the team kind of games. No - given my age, I was thinking about the single programmer doing code and graphics and sound on a C64 kind of games. To me, it was incomprehensible how one person alone could manage to make a machine, especially with those hardware limitations, do all that.

And then one day it struck me. I had started to dabble with programming in my early teens and then never stopped but, I guess, gotten better at it over time.

Like with every process that is evolutionary rather than revolutionary, there's a good chance of losing track of your progress. I remember clearly that after finishing college, I really thought that everything I knew then about computers and programming I had already known even before I entered university. But then one day, a freshman asked me a question about a programming assignment in their class, and it was completely trivial to me. Yet, when I was a freshman, I would probably have asked the same question of someone else.

And yet, that was not my aha moment. It still took many years for me to realize that my idolization of game programmers was perhaps a bit much. Mind you, I do realize of course that someone specialized in any area will be able to produce better code than someone who's not - for any definition of "better". I'm still not a game programmer and never will be, so I still have the highest respect for their profession. But I do realize now that there's no other-worldly skill that separates the game coder from the crop of all other programmers. Anything is within reach - what you need is not some innate talent, it's just dedication.

Life is somewhat better now and I'll never go back to thinking I'm not good at programming.


After watching the first few weeks of Handmade Hero [1] I realised my idea of hard work and dedication was different from that of a lot of the people I was trying to emulate. He was doing this as a spare-time, free project, and the amount of knowledge and experience on show was incredible.

[1] https://www.youtube.com/watch?v=Ee3EtYb8d1o


I really enjoyed watching that series too for a while, although it was sometimes moving along a bit too slowly for my taste. I also hear it's not really going anywhere even after a few years in the making now (?!).

One thing I always find amusing about such endeavours is the discrepancy in the level of difficulty when presenting their contents. Often you see (or read) them explaining programming concepts on a kind of baby level ("imagine a variable is like a box that you can put something in...") but then when it comes to video games math, they have no trouble assuming people know advanced concepts from linear algebra or even calculus. Cute.


Or as the kids say "that escalated quickly!"

Not surprised it didn't really go anywhere, but it's an excellent resource if you have the time.


1. No one working in tech actually wants to solve a problem and move on. The problems are too fun to leave alone once there's an MVP. Enter stage left, all the framework churn. Adoption of graph databases that don't fit the problem. All because the developers are bored and not business aligned.

2. Knowing the business is key to having personal buy in for work. IF you work in a bank writing bank software, understand the bank and banking so you understand the context of the software. It's a real 10x thing to know why a requirement is a requirement.

3. The software you write will live a lot longer than you planned. Your experiments from 1. will haunt you.


> 1. No one working in tech actually wants to solve a problem and move on. The problems are too fun to leave alone once there's an MVP. Enter stage left, all the framework churn. Adoption of graph databases that don't fit the problem. All because the developers are bored and not business aligned.

There's another problem arising out of this for those of us that DO like to solve problems and move on - we're seen as not 'proper' techies if we're not completely obsessed with technology, able to bore for hours, with a github full of projects etc etc...

I like using technology to solve problems. I really don't give a stuff about the intricate subtleties of a versus b technology unless it absolutely matters, because for the vast majority of situations the obvious, pragmatic choice will do the job just fine.


Oh yeah, big time. The current craze at my work could be summed up by the phrase "How long's your pipeline".

It really irks me because we as 'tech' people don't think about time and money, and that spending them in one direction stops you spending them in another. It's very frustrating.


1. That programmers were a market, we weren't just a bunch of nerds sharing cool stuff with one another. That speaker at the conference? The one talking about Wheezle-snort 7.0? Yeah it sounded awesome, but it was supposed to. It's a sales pitch, even if the software is free (Actually, especially if the software is free)

2. That programming and making stuff people want are two entirely different things, although programmers always assume that whatever you're building, it's the right thing. You can spend your entire career in programming, learning all sorts of goodness about things like Erlang innards, and never really understand what your job is.

3. That all of that mousing around, learning a new IDE every two or three years when I started, was a complete waste of time. I automatically assumed that the cooler and shinier the programming environment, the easier it would be to code and the better my code would be. But no. grep and awk work (mostly) the same way now as they did 30 years ago, and any time I spent learning which hotkeys did what in some long-lost dev stack was a complete waste of time.

4. Conversely, that UX beats internal architecture, every time. If folks are having a blast using your app, you win, even if it crashes every five minutes (How they could have fun if it crashed every five minutes is a good question. You might want to ask them)

5. The more smart people you throw at a programming project, the bigger mess you end up with. I know when I say that people are thinking of "The Mythical Man-Month", but it goes deeper than that. Even if you somehow manage to stay on-schedule, human communication around innovation stops working at a certain scale, and that looks like a hard limit. There are ways around it, using things like Code Budgets, following CCL, layers and DSLs, but nobody does that, so it doesn't matter. We, as an industry, have absolutely no idea how to staff or run projects.

ADD: One thing that was quite profound that I discovered late: if you code well, the simple act of programming can tell you something about the way you're reasoning about the problem that you couldn't learn any other way. Programming is by no means a simple one-way street where ideas come out of the head of the programmer and end up as bits on the tech stack. It's very, very much bidirectional. Our programs influence us as coders probably much more than we influence them.


> You can spend your entire career in programming, learning all sorts of goodness about things like Erlang innards, and never really understand what your job is.

Sometimes I feel like my job is to protect other programmers from finding out what their job is.


A few that come to mind:

- Implementing small languages, and how it inevitably leads one to a deeper understanding of all the layers that make computing and programming possible

- The Make-a-Lisp project and the language-agnostic conceptual design at the heart of it - https://github.com/kanaka/mal/blob/master/process/guide.md

- Declarative nature of React (the view as function of state); using centralized unidirectional state management and actions

- Test-driven (and, similarly, type-driven) development

- TypeScript - Benefits and costs of type definitions, dynamic/gradual and structural typing; power of a language integrated with the editor environment

- Virtual machines, QEMU, and later Docker - Commoditized, reproducible builds for applications and their environments

- Build pipeline scripts


1. More of a CompSci aha: Hard problems are hard. Sometimes I get frustrated over getting stuck in a bad local optimum that I can't easily get out of. But then if I take a step back, I notice that the problem is NP-hard, that I'm not gonna solve it by throwing random heuristics at it and I need to either change the constraints to make it P, lower the input size, or lower my expectations. Simplest example: Deciding what goes in which shelf in my home is NP-hard. Solution: Get rid of stuff more liberally -> (n' < n) -> f(n') is much easier than f(n)

2. Doing NAND to Tetris the first time. It taught me not only how computers work, but how powerful recursive layers of abstraction can be. I had absolutely no idea how my system looked on RTL, but I was still able to build it.

3. Also Lisp. I wish Nand to Tetris had picked something more lisp like in the second half to show how simple and powerful it is.

4. Realizing that when I'm coding something, alternating writing and testing is much quicker than writing a bunch and then debugging it. For bigger projects, setting up CI/CD early can similarly save headaches.

5. The functional big three (map, filter, reduce), but for me even more so closures. I had gotten stuck trying to hardcode coefficients for a polynomial until I noticed I'd end up with a lot of duplicate code, which was what I was trying to avoid with FP in the first place. Then I realized I could just put the polynomial function itself in a closure and call it with the coefficients I wanted, when I wanted.
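
For point 5, a tiny sketch of the closure idea (made-up names, and TypeScript rather than whatever I was actually using at the time): capture the coefficients once, get back a reusable polynomial.

    // Closure sketch: the coefficients live on inside the returned function.
    function makePolynomial(...coeffs: number[]): (x: number) => number {
      // coeffs[i] is the coefficient of x^i.
      return (x: number) =>
        coeffs.reduce((sum, c, i) => sum + c * Math.pow(x, i), 0);
    }

    const f = makePolynomial(1, 0, 2); // f(x) = 1 + 2x^2
    const g = makePolynomial(5, -3);   // g(x) = 5 - 3x
    console.log(f(2), g(2));           // 9 -1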


Writing a computer graphics engine during university taught me a couple:

1. I can use math, intuitively, even if I don't know how the exact calculation works.

2. I can use a debugger and replay it again and again to understand what's happening.

3. I can diagram a high level design and not have the complete context in mind and still produce a 10K LoC OpenGL powered computer graphics engine.

After that project, programming never felt impossible to me anymore. After that project, I always had some confidence of being able to learn whatever I needed, as long as I have enough time.

Here is the engine (2013): https://www.youtube.com/watch?v=PH6-dLvZEiA&t=1s&ab_channel=...


I had a similar confidence boost when I developed a Vulkan graphics engine. People talk about this library or that framework being hard to learn and use, but when you have struggled through Vulkan and 3d engine quirks, some of them seem like a walk in a park now.

So to answer the original question, my epiphany is that something being "hard" is a relative thing and that "hardness" should never be a barrier that keeps me from reaching my goals.


My aha moment was that it is 100x better to focus on one project and make it the absolutely very best instead of trying 10 things.

With these new and easy frameworks it's incredibly easy and quick to create new projects but creating and launching something is just the tip of the iceberg.

The main work comes after that, i.e. traffic, leads, conversions, optimization, etc which unfortunately can only be done with good old tedious hardwork and laser focus.


100x this and thank you for posting.


1. Seeing a company do agile right by keeping the customer so close they're almost in-house. All the other stuff is meaningless.

2. Realising I didn't have to code for faceless corporates shifting cash around; I could work for a small company actively trying to make people's lives better.

3. Breaking neural networks out of a predefined topology and letting them grow.

4. If it changes during runtime, put it in the database. If it changes per environment, put it in config. Everything else in code.

5. That programming is a craft and you must treat it like one; just bashing out code to fill a brief isn't enough.


Curious about 1 and 3! 1 especially, are the customers literally sitting in their planning meetings? Always find it tough to have customers make the time investment, curious to know what models might work.


When I learned about the repl. For me 90% of debugging is about figuring out how to break at the crucial line and then shining the repl light on it. 75% of writing new code is about trying stuff in the repl and then stepping through the code in the repl and testing everything.

For me that quick feedback loop makes coding as fun as gaming.


Would love to see a video of what this actually means. I'm not sure I've ever written code for which I can use a REPL in any meaningful way so I must be doing it wrong. Videos of actual real world coding/debugging (not examples that are too simple and therefore not real world) would probably be very enlightening.

I mostly do graphics stuff and there are 70 lines of setup until I get to the 1 line that executes something based on the 70 lines of setup. But even in non-graphics code my program is 1000s of lines, all of which have to execute, whereas a REPL is one line at a time, so yeah, a video would really help show how to apply this.


It's easier in some languages than others, but I would say the most common case is in Python (partially due to ease of dropping a REPL, partially due to lack of compile-time checks). The one line of code you always should have at your disposal is:

    from IPython import embed; embed()
Basically, if you have an error and can't figure out what's happening within a few seconds, stick that line in before the error and Python will give you a REPL to play around with. It's like GDB but way better.


That is my aha moment too, but I have generalized it to: never start real work until you can set a breakpoint in your IDE, where you can try things in the console.

I know there are people that can write code for a couple of days without running it (seen this with some Java colleagues I've had), and then it almost always all works when run, but I am just not that kind of developer.


When I'm in the middle of my program and need to change or implement some kind of algorithmic behavior, I often find that trying stuff out in a REPL (node) helps me understand and examine the problem much faster and in a more flexible way.


Out of curiosity, what language(s) are you working with?


Aka interactive window for c# devs using VS


When I realised that programming is just taking a big problem and chopping it into a couple of smaller problems, and then repeating that until your problems are trivial.

The first time (1993) I was logged in on a remote machine and downloaded a file to there from another remote machine. It was just magic that the machine I was touching had nothing to do with that file.

When I hacked together a mailinglist system in shell script.

When I realised that C is basically just all memory locations.

When I understood closures.


Becoming proficient with tools that allow me to pry into running processes and see what they are doing under the lid

e.g.

lsof to see files and ports opened up by a process

strace to look at what system calls to the kernel a program is making (and ltrace for library calls)

pstree to see subprocesses spawned

gdb to inject myself into infinite loops of programs written in c-based implementations (e.g. Ruby)

I'm a collector of aha-ish moments so if you want more, my YouTube account is linked to in my profile.


1. Even though it requires precision, the vastness of the solution space effectively makes programming a mushy, tangled mess of technology.

2. End of list.


ah....

a..


I went from Ruby to JavaScript development. First, I'd like to say I think starting with Ruby/Rails is a bad idea. Ruby uses a lot of higher order functions, but it isn't super clear from the syntax how that all works. Higher order functions in JS are much clearer, IMO, on account of having to use parens to call the function (in Ruby, you don't need parens to call a function; you can just name the function and delineate the arguments with spaces).

So with Ruby, I was just pattern matching—I never had an intuition of what lambdas were doing until I moved over to JS.
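
To make the parens point concrete, a tiny made-up sketch (the double helper is just for illustration): in JS/TS the difference between passing a function and calling it is visible right at the call site.

    // Parens make "pass the function" vs "call the function" explicit.
    const double = (n: number) => n * 2;

    const a = [1, 2, 3].map(double);          // passing the function itself
    const b = [1, 2, 3].map(n => double(n));  // calling it inside another function
    // In Ruby, a bare method name can be a call with no parens at all,
    // so the same token can mean either thing depending on context.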

The Feynman Method was another aha moment. Learning how to learn, in my case, by writing down stuff in my own words as if I were teaching someone else was probably the most important thing. That helped me develop a much deeper understanding of important concepts.

Similarly, not trying to learn everything was a big deal. Don’t try to learn all the niche technologies you see in job listings expecting you need to know those things in order to qualify for the position. If you just do dozens of tutorials, you’ll end up knowing a lot of things very superficially. It’s much better to know a handful of ubiquitous, related technologies very well and have a strong foundation of programming fundamentals. Those things can transfer over to other technologies, when you need to pick them up.


> First, I’d like to say I think starting with Ruby/Rails is a bad idea.

It's telling that comments like this inevitably involve Rails, not plain Ruby.

Rails's "magick" is obfuscatory and at times counter-intuitive. Ruby, the language, is pretty straightforward. I wish more people spent time playing with Ruby before diving into Rails.


I agree with your sentiments about Ruby/Rails. Rails also applies the philosophy of convention over configuration, so newcomers can’t usually just intuit what’s going on — you have to know their specific conventions, which also seem to change with each major release. And one of my pet peeves about Ruby as a language is the optional use of parentheses for function definitions and calls, making it harder to read through code quickly. In C-style syntax languages, it’s very obvious at a glance where functions are being called. Not so with Ruby.


> Similarly, not trying to learn everything was a big deal.

This is one of my recent decisions. Previously I would jump to every shiny framework and try to learn every fancy programming language. This drained my energy so much I had to stop doing it. Instead I have decided to learn ubiquitous tools, e.g. math, algorithms and data structures, etc.


- My team lead showed me code which I wrote just two years ago, and at first I didn't believe that I had written something that crappy.

- When I realized that someone might quit his/her job during the trial phase. It had not occurred to me that both sides are checking if it's fun/good for them

- The main reason I'm successful is that, despite my character flaws (which did get better over the last 15 years), good software engineers are in very strong demand, and when I look at how hard it is to get good people, I just might never have a really hard life

- Soft skills are crucial: taking responsibility, being on time, being reliable, taking action when it matters without hesitation

- You had a salary negotiation or a discussion and something was decided? You can still come back to it 1-x days later and say 'you know, I thought about it and I'm not happy with the outcome at all. We need to discuss this topic again'

- Don't complain if shit is shitty. Either change it, try to change it, accept it, or quit. Stop telling others that it's shitty while doing nothing.

- Estimation is bullshit: it never works, never aligns, no one really retrospects on it, and if you ever find a team where it works, your team gets dismantled for whatever reason and you have to start from 0. Prioritise for relevance, optimize how you work, accept the outcome.

- Never accept a deadline. Without a deadline, your manager can't come back and say 'you promised', which leads to you doing overtime for a mistake your manager made: he/she mismanaged!

- Do less but better. Whatever you do shitty now will come back

- Not doing something, because you actually figured out what the other person needed/wanted, is more often than not the better outcome, if it doesn't lead to writing more code

- Understanding how you program a computer game, for three main reasons: 1. memory allocation can fail 2. how the game loop works and how to program it 3. randomness

- Sentences to know:

-- I can try to get it done, but i can't promise

-- I have a date tonight, I can't stay (if they insist:) I have expensive tickets for <event>... (pressuring you into doing overtime 'just because' is mismanagement)

-- No


> -- No

This is one thing I need to learn to say. I have been doing lots of stuff other devs want and hating myself for not being able to say no.


> - Do less but better. Whatever you do shitty now will come back

As with everything else in life. This was a life aha moment.


- Don't complain if shit is shitty. Either change it, try to change it, accept it, or quit. Stop telling others that it's shitty while doing nothing.

Thank you.


Not sure I can pinpoint it to a specific time or anything, but as of recently nothing feels like "magic" anymore. Many years ago there were all sorts of classes of software, patterns, etc. that looked and sounded arcane and complex, and I thought I'd never understand how they worked. Now nothing really feels like that. I can typically get an "aha" moment by doing a bit of reading and understanding, at a very high level, how something works. Not to say that I'm a good programmer who can write and implement anything, because that's far from the truth, but I can typically understand stuff and so lose the intrigue. Kinda sucks though because it's killed the motivators that got me into the craft.


I agree with this. Lots of aha's come from "hey why isn't this working", digging into library implementations or system internals, and figuring it out. It's stuff that's annoying at the time because you're working on something else, but each time it happens you learn more about what's going on under the hood. And eventually you realize these things you picture as dark arts are actually pretty straightforward, and roughly how you'd have guessed they were implemented if you'd thought about it hard enough.


Powershell? (I've done windows most of my career, but "shell" would apply anywhere).

It always seemed so lame compared to neat new functional languages or distributed actor frameworks. And, not being much of an ops person, I almost wanted not to be any good at powershell (or any shell) so I wouldn't get ops assignments.

After getting familiar with it though, it's improved my workflows and ultimately quality of life. I'm not only more fluent in ops now, but I also get to spend less time on it, which is what the rationale for not learning it was (write a script once and never have to remember what I did when creating a new environment).


Learning the [1] Unix Philosophy. Writing & Using small tools that can be piped together... was a huge AHA moment for me

[1] https://en.wikipedia.org/wiki/Unix_philosophy


Yup, this was one of my aha moments too.


When I found that interfaces/abstraction were way more critical to understand than recursion or obscure algorithms for most real projects. Half the battle of starting (or understanding) any project is clarifying the interfaces within which it should be nestled.
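
A throwaway sketch of what I mean (hypothetical ArticleStore names): once the boundary is written down, each side can be reasoned about in isolation.

    // Nail the boundary down first; implementations can come and go behind it.
    interface ArticleStore {
      save(id: string, body: string): Promise<void>;
      load(id: string): Promise<string | undefined>;
    }

    // One trivial implementation; the rest of the project only sees ArticleStore.
    class InMemoryArticleStore implements ArticleStore {
      private data = new Map<string, string>();
      async save(id: string, body: string) { this.data.set(id, body); }
      async load(id: string) { return this.data.get(id); }
    }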


My favorite all-time aha moment was coming from a CVS world and learning how to use git. Oh, you just hash the entire lot of the files together and use that as a version for the repo? Aha! The change set should be for all the files together!
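
A crude sketch of that realization (not git's actual object format, just the idea): hash every file, hash the sorted list of those hashes, and that single digest identifies the whole snapshot.

    import { createHash } from "node:crypto";

    // Content-address one file, then the snapshot of all files together.
    const sha1 = (s: string) => createHash("sha1").update(s).digest("hex");

    function snapshotId(files: Record<string, string>): string {
      const entries = Object.keys(files).sort()
        .map(path => `${path} ${sha1(files[path])}`);
      return sha1(entries.join("\n")); // one hash that stands for the whole tree
    }

    // Change any file and the snapshot id changes with it.
    console.log(snapshotId({ "a.txt": "hello", "b.txt": "world" }));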


Mid 90s, when I was introduced to Java after C++. With the combination of garbage collection vs malloc/free, and just seeing that Java gave you 80% of C++'s features with 20% of the complexity, I knew that it was going to be big and take over C++ as the language for company and enterprise development.


I think the simplest realization is that the compiler is always right. Yes, technically, compilers can have bugs but 99.99% of the time the bug is yours. I used to stare at non-working code and think "This should work!" But that's never true. If it doesn't work, I made a mistake.

My next realization was that a well-defined and clearly communicated product definition is 10x more important than good coding.


1. Talking to other developers.

Sometimes using "uncool" technologies and languages is ok, and they often have a good reason.

2. The concept of "innovation tokens" if you solve a business need

https://mcfunley.com/choose-boring-technology

3. There is no silver bullet.

Learned the hard way after switching to the new "flavour of the month"


1. When I realized that there's a thing called "computer science", and you can actually study things related to programming. It's not just ad hoc tricks written by experienced and bearded C programmers. You see, I'm self-taught from an early age, and learnt programming via skimming "VB for Dummies" when I was 8 y.o. or so. I didn't know anyone at all that knew how to program. It would take another decade before I met someone who knew how to program.

2. When I read K&R at the age of 14. Before that, I had just followed random tutorials that I found on the internet. From that point on, I went straight to the source (no pun intended) when it comes to learning new stuff.


1. Polymorphism

2. Vendors are often pushing bad architecture

3. Architects often push things for the wrong reasons

4. Companies often push risk to their vendors, avoiding collaboration, inherently increasing risk

5. ORM's hide business logic, preventing a company from understanding its business and adapting to change

6. Relational Databases are an operational anti-pattern

7. Graph databases are excellent tools for common problems like security, social media

8. Document databases are excellent tools for Event Stores and CQRS

9. Cloud native development (FaaS) is awesome

10. Domain-Driven Design and Event Storming are excellent disciplines to add to a corporate development group

11. Corporations inherently don't understand Agile because they have to measure everything through strong process definition


> 6. Relational Databases are an operational anti-pattern

Could you please expand on this? What is wrong with Relational Databases and what would be the "correct" pattern?


The keyword in my statement is _operational_. Relational databases are excellent for analytics and reporting. But in an operational system with well-defined boundaries, it's more than likely that only portions of the larger system are required for any transactional behavior. In a modern architecture, you could simply write all transactions to an event store (write only) and use CQRS and a relational cache store for reads.

Another aspect of all of this is the misconception that we should design systems based on a relational data model. This only leads you through the whole impedance mismatch of ORM's and how you serialize/deserialize your bounded contexts or objects. We should not do this at all. We should design systems on those bound contexts and determine the data store accordingly. Operationally, a Document Database or Event Store is going to be the best tool.

We can fire change events at a data warehouse to store data in a relational manner for analytics and reporting, but none of that is necessary for our operational system.

In your operational system, if you need any sort of complex join, you've already created a poor design.

I will say this. This did not occur to me until I'd gone through the process of developing a complex system using Domain-Driven Design. Until you've done that, the old patterns will remain like concrete.


If I'm reading this correctly, then it's not relational databases/relational data models that are the problem, but instead the normalization across bounded contexts within a problem domain?

Partly yes. But it’s also the realization that we tend to start with an ERD or domain model before even looking at boundaries. We’re product-oriented. And vendors don’t help. Look at how hard everyone is pushing containers. Containers have uses, but there’s no rational explanation for the effort behind their marketing push.

Simpler is better. Boundaries need to be respected. Tools should be selected based on need, not desire.


Could you please elaborate or point to some theory/resources? What is Domain-Driven Design? What are the "bounded contexts"?

Google it. Eric Evans wrote a book 15 years ago. It’s kind of a big deal.

How exactly do ORMs hide business logic?


I've seen implementations of ORM's where the complicated joins of relational operations are embedded in the ORM definition. It's done for performance, without considering the impact on future business process review.

This inherently hides that logic from external review.

One of the things we need to enable as architects is to make sure the systems we build are adaptable. Any "framework" that begins to hide logic through efficiencies is bad architecture.


While I was reading "Eloquent JavaScript", I came across this absolute eye-opener (to me at least): https://eloquentjavascript.net/07_robot.html#p_MtO6TwqB5I

In this chapter the author demonstrates that it is wrong to solve a problem by creating a series of objects that do stuff. He shows that the right way is by abstracting the problem into its most essential elements. Do not try to emulate reality with code, but make it its own, abstract thing that solves the issue instead.

This chapter and its code felt like pure poetry to me.


1. Use ADTs instead of OOP, always. a.k.a composition over inheritance

2. No ORMs

3. No Frameworks, use libraries to build up your project

4. Use strong type systems for modelling (make invalid states not representable)
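
A small sketch of what I mean by point 4 (made-up RemoteData example): encode the states as a tagged union so impossible combinations simply don't type-check.

    // Instead of { loading: boolean; data?: string; error?: string }, where
    // nonsense combinations are representable, model the states as an ADT.
    type RemoteData =
      | { state: "idle" }
      | { state: "loading" }
      | { state: "loaded"; data: string }
      | { state: "failed"; error: string };

    function render(r: RemoteData): string {
      switch (r.state) {
        case "idle":    return "nothing yet";
        case "loading": return "spinner";
        case "loaded":  return r.data;             // data only exists here
        case "failed":  return `oops: ${r.error}`; // error only exists here
      }
    }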


Finally understanding the appeal of static typed languages after doing a project in Go using Goland.

I always thought I loved the freedom of python and JavaScript (and I still do to some extent, you’ll have to take Django from my cold dead hands). But the power of static type becomes super clear when you’re using a great IDE.

I can hit a hot key when hovering over any function and it’ll show me the docs and take me to the full page if I want. Hitting another hotkey it’ll auto fill parameter options. It’ll auto import the libraries I need. It’ll complain if I’m using the wrong type.

It gives me so much more confidence that the code I’m writing will actually work. It’s slightly more of a pain to write, no doubt about that, but the payoff is huge.

I was burnt by java in the past and can’t stand how every project seems to end up like this https://github.com/EnterpriseQualityCoding/FizzBuzzEnterpris.... But Go has shown it doesn’t have to be like that.


Whoa I was with you up to the last line. Static typed languages have been around for decades. Go is a very, very recent addition. It hasn't shown squat.

Should’ve said “it has shown me”. I know there are many great choices out there, it’s just the first one that has sunk in with me.

Fair enough.

When I stopped coding, and started thinking. Turns out hacked together stuff is slower to produce than thinking overnight and getting it right.


When react/redux finally "clicked" in my head. Coming from an MVC world it was quite a paradigm change.


Same here. The first time I added a spinner animation while secondary data was being fetched was an eye-opener.

Although, I have a growing dislike for Redux.


What don't you like about Redux (genuine question)? I find myself using useState and useReducer a lot more recently, but I still think Redux, or at least the pattern of having actions/reducers/dispatchers/selectors, is the best way to approach state in the UI

When I learned what a kernel was, and the implications of proprietary closed source kernels, I was instantly radicalized to the cause of gnu and libre software.


I always thought programming was my passion, as I never get tired or lose motivation when I code, until I realised that my passion was DIY and programming was just a tool to create things.

Starting from there, I don't take any programming language seriously anymore. Whatever makes the job easier and faster to accomplish in context.


When I finally understood that browsers only understand HTML, JavaScript, and CSS. Yes that’s a simplification, but it’s essentially true. Early on in my career when I started more front-end development, I was under the impression that there was somehow much more going on under the hood, i.e., more languages, executables, ways of setting styles and layouts. When this finally clicked with me, it was all so much clearer and immediately less intimidating. I’m still surprised to this day that even many experienced developers think that browsers can natively interpret SCSS, TypeScript, or whatever templating language they’re using — heck, some people even used to call jQuery a “language”.

As a relatively new Clojure developer, the meaning of the parentheses was the ah-ha moment. I imagine it’s the same for other Lisp-y languages.


Not really an 'aha' moment, more of a dawning realisation as I gained experience.

When starting out as a developer, I think there is a tendency to see the particular language/framework/syntax that you're using being all-important.

Over time, and with experience, you realise that the language and syntax are just the "fine detail" of how you solve problems as a software developer: as your understanding deepens, it's as if a kind of abstraction happens in your brain, because you stop thinking so much about the fine details of language and syntax, and start to worry about things like managing complexity and optimising design etc.

And at that point, the realisation for me was that I can apply that experience to any language/syntax/framework, which frees me up to pick the best way of solving any particular problem, and to not be stuck in a rut with any particular technology.

An added benefit is that a lot of the debate over stuff like "language X is better than Y", or "this code style is correct and that one is wrong" become unimportant, because you're thinking at a level that's not limited by specifics.


Making the realization that I only do this for money and attaching my self-worth to my output or current role is futile.


What makes money is business sense with shitty developer skills, not developer sense with shitty business skills.

Of course, being purely technically talented is one way to go as I have chosen to be.


Trying to master non-framework CSS and realizing how dead simple styling a website can be when you are using a framework instead of trying to reinvent the wheel all over.


What is your favorite CSS framework/s?


Bootstrap - my one and only experience with a CSS framework. Love it, but can't compare to anything yet.

I'll probably give Bulma, Tailwind or another popular framework a go for my next project - let me know if you can recommend any others.


You can give Halfmoon a try: https://www.gethalfmoon.com

Disclaimer: I built it.


This is really great.


Were you inspired by tailwind?


Definitely yes!


- when I read Clark Weissman's `LISP 1.5 Primer' in 1968

- when I finished a student programming assignment a year later and realised that a well-chosen set of procedures constituted what we'd now call a DSL

- when I ported Martin Richards' very clean BCPL compiler and realized that you could write efficient code in something that wasn't Fortran

- when I read the Thompson/Ritchie paper on Unix

- when I completed my first reading of SICP


SICP made me fall in love with Lisp even more. It taught me how powerful the tandem of function and closure can be.


1) Some people don't share your passion for software. They consider it just another job; they are not interested in improvement as long as they can do their daily programming tasks. That's ok.

2) An idea making sense to you is not enough to communicate it. People usually hate change; you need to understand all the details before opening your mouth.

3) I remember many a-ha moments regarding how the Internet works while reading "Computer Networks" by A. Tanenbaum.

4) When I was finally able to exit vim.


When I actually understood the idea behind functional programming, especially being able to think in terms of recursive functions rather than loops.


1. When I finally understood, for the very first time, how the event loop in JavaScript works, I had an aha moment about the language design.

2. When I read about how garbage collection works in V8 Javascript engine, I had an aha moment about how hard things are just one layer below my working area.

3. When I started learning Rust, had another aha about how the language is designed without GC. I never thought that was possible in a language.

These may be simple things for many, but I have no educational background in any of these, so I'm amazed by things that people have actually gotten used to.
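
For point 1, a tiny sketch of the moment it clicked for me (plain JS/TS, nothing framework-specific): synchronous code runs to completion before anything queued on the event loop gets a turn.

    // The call stack always empties before queued callbacks run.
    console.log("start");

    setTimeout(() => console.log("timeout (macrotask)"), 0);
    Promise.resolve().then(() => console.log("promise (microtask)"));

    console.log("end");

    // Prints: start, end, promise (microtask), timeout (macrotask),
    // even though the timeout was scheduled with a delay of 0.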


Programming is using conventions. Lots and lots of it.

Conventions for syntax, names, logic, API, structure, vocabulary and so on.

Conventions are by nature arbitrary, influenced by culture, history, social behavior and a whole lot of human weirdness.

Don't try to learn all the conventions first, it comes faster with practice and exposure to it. Solve the problem, then find the conventions you need to apply the solution in your context.

The beginner's paradox is that they need to learn a few specific conventions (e.g. part of a language, one paradigm and a few libs) to start working on solving a problem, so it's a frustrating experience.

There is no easy way to build the rest of the conventions based on your knowledge of what you know now, because it's artificial.

It's also what leads people to say "don't learn languages, learn to program". Which makes no sense to you at all, until, well, you finally know how to program. But you got there by learning conventions on the way.


When I watched a world class software engineer build a real time data processing platform using only functional programming.

At a major company, over ten teams were automated in a year.

Beautiful and useful abstractions around data and data processing tasks that provided extreme value.

I could never replicate it, but it reminded me about the power of software and the extreme ability that some have to wield it.


If someone is given the freedom to think and to apply that thinking, it can be amazing. Most companies don't support this though. This was my experience early on in my career: hardly anyone wants to let coders think, instead it's about fitting into the processes. Not saying I'm that clever, just common sense stuff. Like, err, let's not get 3 people to simultaneously build 3 similar screens in the app.


Could you share the video you mentioned?


Spending so much time in meetings, especially the awesome agile scrum standups. Even now, working at probably my 5th company, business processes do not work well and people deal with the same problems of work organization. For me as a developer, this kills my productivity and annoys me sometimes when it's just too much in a week.


I've had a few, but I'll pick one in particular. I got started with programming via BASIC (various forms), none of which supported recursion. I didn't really know what it was, but knew that I just wanted to call into the same routine again from itself or via mutual recursion (again, didn't know the term). This didn't work as expected, however. Either the program wouldn't compile/execute (threw an error at the recursion) or the recursion just corrupted the data (each call shared the same local data, so they'd clobber each others' work). While playing around on a TI calculator I programmed (something, can't remember the details) but created a stack using the list data structure. I then looped (versus recursed) but pushed a data element onto the stack or popped elements off of it. The program quit when the stack was empty.

Later, in college, we were learning lower level programming details (like what C translated to in assembly and how it managed calls and the stack frame). Despite this being my third CS course in college, I hadn't really grokked recursion yet. But I had a flashback during one of the classes to the TI-BASIC programs I'd written using a stack, and realized I'd recreated recursion (but manually). After that recursion and loops were synonymous in my mind (as they should be, at least in many cases) since I knew how to translate between them. Whenever I saw someone managing a stack and looping until it was empty, I knew both that it could be and how it could be translated into recursion (and vice versa).

It seems to be one of the hardest topics for many of my colleagues (especially those without a CS degree, so lacking practice with recursion) to understand or ever use. But I can usually get them to understand it once I draw a few things out on paper and show the two solutions to a problem (recursive or iterative). This doesn't mean they like recursion, most still avoided it, but they started to understand that it wasn't magic, it was just the computer managing the same data structure they were manually managing.
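
To illustrate the translation I mean, a small sketch (made-up tree-sum example): the same traversal once with an explicit stack and a loop, and once with recursion. They're the same algorithm; the second just lets the call stack hold the bookkeeping.

    interface TreeNode { value: number; children: TreeNode[] }

    // Loop + explicit stack, the TI-BASIC style.
    function sumIterative(root: TreeNode): number {
      let total = 0;
      const stack: TreeNode[] = [root];
      while (stack.length > 0) {
        const node = stack.pop()!;
        total += node.value;
        stack.push(...node.children); // the "recursive calls" we manage by hand
      }
      return total;
    }

    // Recursion: the call stack is doing exactly what the array did above.
    function sumRecursive(node: TreeNode): number {
      return node.value + node.children
        .map(sumRecursive)
        .reduce((a, b) => a + b, 0);
    }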


1. When I understood everything that happened in a custom d3 chart I created, including d3 patterns, animation, and SVG, everything else just paled in comparison.

2. When I gave into Power BI finally and fixed a report someone else created, it felt so good. I don't think there is anything that compares to it and I only scratched the surface.


this is on my bucket list! I have managed to get by using either simpler libraries such as chart.js or adapting the data of an already-made chart so far.

What would be a good resource that helped you get to that aha moment ?


This was when d3 was v3.5. I did a lot of experimentation but also simultaneously read multiple blog posts for the smallest things.

Off the top of my head I can remember these:

1. https://github.com/d3/d3/wiki

2. Jerome Cukier blog posts

3. dashing d3 js

4. https://alignedleft.com/work/d3-book


I was studying operating systems and computer architecture at university (this included a tiny amount of assembly programming) and was interning at a company that used Scala, so I was learning Scala. At the time there was quite a fuss about the fact that the JVM did not support tail call optimization (at least until Java 7 IIRC?).

So on one side I was programming in Scala and on the other side I was "hand-compiling" small C functions into m68k assembly...

The aha-moment came when I was hand-compiling a recursive function down to m68k assembly and I saw that I could completely eliminate ALL the recursive calls by re-arranging some register values and some values in the stack frame, inserting a small preamble in the assembly and then at the end of the assembly routine jump back to said preamble instead of making a recursive call.
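
Roughly what that looks like above the assembly level (a made-up factorial in TypeScript): the tail call becomes "update the arguments and jump back to the top", which is exactly the register and stack-frame shuffling I was doing by hand.

    // Tail-recursive form: the recursive call is the very last thing that happens.
    function factTail(n: number, acc: number = 1): number {
      if (n <= 1) return acc;
      return factTail(n - 1, acc * n);
    }

    // What tail call elimination turns it into: overwrite the "arguments"
    // and jump back to the start instead of growing the stack.
    function factLoop(n: number): number {
      let acc = 1;
      while (n > 1) {
        acc = acc * n;
        n = n - 1;
      }
      return acc;
    }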


Realising that 90% of programming advice is unsupported junk.


Heh, I initially read this as "realizing that 90% of programs are unsupported junk".


Haha, definitely this too.


1) Complexity is very expensive

2) Written code can take a long time to stabilize and is thus expensive to change and maintain

3) Ostensibly orthodox legacy systems are full of anti-patterns.

4) How easily a small problem can turn into an infinite number of other, small problems, and the intuition required for where to 'draw the line'.


Finding Anki after 17 years of programming and committing to putting everything new I learn into it and practicing it daily.

For me there is no other way of learning anymore, and it serves as my real memory. Basically it gives me the confidence that something I learn now will still be known years later.


Can you share a real example? How do you use it? What tools do you employ?

Sure thing! (and sorry for the late reply :))

Anki is a program which allows you to learn/memorize by using a spaced repetition technique, where newly introduced and more difficult "ideas/notes" are shown more frequently, while older and less difficult ones are shown less frequently, in order to exploit the psychological spacing effect. More about the technique at https://en.wikipedia.org/wiki/Spaced_repetition

I use it for everything that I learn which I also want to know in the future; this ranges from CS-related knowledge like a new algorithm that I learn, to an English word which I don't know, to even remembering the main concepts from books that I read or names of people that I meet.

The flow goes something along the lines of:

1. Learn something new till you actually understand it

2. Summarize it into a note or a few notes with proper questions (in a way this stage reminds me of the Feynman method)

3. Find/Create (by create and find I mainly just do screenshots or just save pre-made images... ;)) some pictures (if applicable), as adding visuals enhances memory

4. Insert it into Anki!

Admittedly this flow demands way more time and energy than "just listening to/reading" some content, but after so many years of programming I found that I had learned so much and forgotten most of it... so I prefer to learn better and slower, and honestly, after doing this for around 1-2 years now, I've got to say that it's absolutely amazing (I tell anyone who is willing to listen, not only programmers).

There is also a daily practice where you need to pass through some notes (this is the spaced repetition part); this usually takes anywhere from 5 to 45 minutes a day (for me, depending on how intensely I have been learning lately).

You can download Anki for free with no subscription fees at https://apps.ankiweb.net/ It also supports Linux/PC/Mac/Android/iOS (the iOS version costs money).


Some more recent ones of note...

0. When I learned how to model systems in TLA+

1. When I figured out how to structure programs using monad transformers... sort of the moment where I started reasoning algebraically about programs on my own

2. When I learned how to specify propositions in types and verify my designs and programs


1. My first job out of university, when I discovered you can do interesting things with programming.

2. Learning XForms, which can do most of what JavaScript is used for in a tiny amount of code.

3. Learning JetBrains IDEA, the only IDE that I ever enjoyed using.

4. Learning red-green-refactor TDD. Now refactoring is something I do as a matter of routine, not dread.

5. Understanding the fractal complexity of Bash. It's weird how a language can make stream processing and parallelization basically trivial, while making things like reliably looping over non-trivial sets of files a PhD-level task.

6. Doing pair programming, and then keeping it up for four years because it was brilliant.

7. Installing Arch Linux, the first distro where things weren't constantly broken because of version drift.


How easy I am to distract when I think I have something important to say ... gosh dang it ;-)


Watching a coworker and his Zettelkasten notes. Junk he did three years ago, just pulling it right up, finding everything related in a couple of keystrokes, and reproducing the results. Unlike my lame attempts to keep notes that I can never find again.


Do you remember what software he used? I have been looking for some solution with this "methodology" but have not been able to find something satisfactory for me. Thanks in advance!

When I realized I can use a simplified version of my take on Agile and Scrum with a simple Trello board, and apply that on my side projects. And that could help me to build them twice as fast and actually finish them too.


The moment when I first wrote my cgi/perl thingy and deployed my first website.


Docker & associated devops stuff like docker-compose. Had a bit of that "speak the right incantations and magic appears out of thin air" vibe to it as with learning to do basic programming.


That everything software and most hardware/electronics is just logic all the way down with different levels of abstraction.

This is why I love software so much and struggle with people. Software is all logic and mostly deterministic, while with people feelings are involved (which could be argued to also be deterministic, but then we get into philosophical discussions about free will).

But on the software/hardware side, any bug can be resolved by digging through the layers of abstraction and figuring out where the logic error is.


1. When I first learned to program (in Pascal)

2. When I learned event-driven programming using Windows Forms in C# and I was able to create programs that resembled the ones I used

3. When I took a course in POSIX, programmed using fork and pipes and learned about stuff like race conditions

4. When I spent a year learning everything there is to know about coding (including assembly, lisp, smalltalk, rust) and realized I would never feel as happy as I felt during 1-3 because I had changed as a person


1. Figuring out and loving functional programming in JS/Node.js, coming from a Java/OO background

2. Hating working on a Java/Angular/OO project after 5 years of FP


my aha moment as a developer

- I would learn some technology to a point where it does what I want

- I take what I learned to some new technology but find myself doing things from what I learned in other technologies

- this put me in a strange loop where reality wasn't lining up with my expectations, or I would do things that were more work than necessary because of learned habits

- the aha moment was when i started learning the theory of the thing I was working with

* it was key to getting out of this rut

* turns out this applies to any theory from abstractions all the way down to computational theory, type theory, automata theory, software analysis theory, hey if i ever get there maybe even software synthesis

in a nutshell i would summarize this aha moment as "you can keep doing the same old tricks and eat the cost or you can always dig deeper and see what costs can be avoided."


this might be a bit vague, so to put it another way: perhaps I learned to zoom in before zooming out, and once I have done something, it never hurts to zoom in again and see what can be gleaned from the assertions I was making about the world at the time. in practice, some things I have learned to ask myself are

- what is the shape of the input data (requirements, configuration, dependencies are also data) of the thing I'm working with and what is the shape I'm looking to produce as output of this software (could be any combination of side effects, screens, or just data)?

- on a larger feature or set of features, is there a domain language to be discovered, or did I create a domain language, and does it hold true?

- what are the fundamental assumptions I was making and could they be improved from first principles?

- what is the expected and unexpected behavior of what I am building today or what I built in the past and why? (learning opportunities)

- the takeaways: my assertions are only as good as my understanding, and understanding requires detail; attention to detail is only as good as my checklist, and my checklist is only as good as the questions I'm asking. this helps create a habit loop where I can hopefully improve outcomes with each iteration based on deeper introspection


  [1]> (+ (expt 2 32) (expt 2 32))
  8589934592
Numeric overflow is a choice. Even a 16-bit program that might have had a 32-bit ALU could simply decide to support larger numbers rather than producing something meaningless and dangerous.

  def retry(block: =>T): T = ???
Half of what I missed about macros is being able to control (re)evaluation of an expression, and call by name handles that really well.
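
A minimal sketch of how that can look (my own illustration, not from the snippet above; `fetchPage` is hypothetical): because `block` is a by-name parameter, the whole expression is re-evaluated on every attempt rather than once at the call site.

  def retry[T](times: Int)(block: => T): T =
    try block
    catch { case _: Exception if times > 1 => retry(times - 1)(block) }

  // usage: the expression itself is retried, not just its first result
  val page = retry(3)(fetchPage("https://example.com"))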


I'm not sure if these are too meta, but here are ones that I think are relevant:

- The free/libre/open source community cares deeply about fundamental problems of our society and is trying to provide legal and technical tools to help take steps toward a better world. When I was younger, I thought "suckers! They're giving their compiler away for free!" It took me a while to internalize the free software ideals and even longer to become an active proponent.

- Corporate software, especially Microsoft, is in the business of creating a walled ecosystem, charging end consumers for their product and charging software developers to be part of that ecosystem. The first 'a-ha' moment was when I realized they, and others like them, were a racket, or at least trying to be one. The later 'a-ha' moment was when I realized there was a viable alternative to this game.

- Most (but not all) programming language flame wars about which language is better boil down to whether the developer prioritizes speed of program execution or speed of program development. See [1]. Newer language wars center on safety and bug creation, so maybe this point is dating me.

- Programming languages are more about human psychology than about some mathematical proof of correctness. Programming languages are designed as a compromise between how a computer needs to be told to operate and how humans communicate. All the theoretical and mathematical infrastructure around language design is there to justify decisions once the language has passed the "psychology" test. This is the same idea behind JPEG, MP3, etc. compression. Fourier transforms and wavelet transforms don't inherently create a saving; the benefit only comes when we can throw away higher-order coefficients because the human perceptual system isn't as sensitive to them as it is to lower-order coefficients. That is, JPEG and MP3 are specifically tailored to the human perceptual system. The math is there as the building block, but which bits/coefficients to throw away is determined empirically. To belabor this point a bit more, programming language discussions arguing over functional vs. imperative, type safety, etc. that don't try to define measurable metrics of success, preferably by testing their hypothesis with data collected on actual usage, are expressing an opinion only.

[1] https://web.archive.org/web/20200407014302/http://blog.gmarc...


When I first started compiling code, there were often pages and pages of compiler errors. I felt that I had to read all the errors every time, and then somehow the most important information would magically emerge from taking in the big picture.

I learned to focus in on just the first compiler error, and ignore all the rest.

Read the first error. Resolve the first error. Recompile. Repeat.

This is just one example of breaking big problems into smaller problems.


You can get paid more and have an easier time.

You can’t possibly know what a company you join will be like until you actually have been working there 12 months.

For large code bases you can’t really rearchitect anything. You are stuck with how it works. Maybe on small scales you can refactor.

Don’t blindly apply design patterns. SOLID is good as a thinking framework rather than a code review gate.

Marketing isn’t what you think it is until you study it somewhat. E.g. it’s not glossy ads!


> You can’t possibly know what a company you join will be like until you actually have been working there 12 months

Does it actually take you 12 months, or did you mean that as a measured, sensible statement? Or are you perhaps looking at it from a bi-directional loyalty or advancement perspective and not just general culture?

For me, that would only apply to companies that I interpret as middling/unimpressive at first glance. The really good or bad companies stick out much more, so I can usually tell by the hiring and onboarding processes and the first couple of tasks I'm given, even as a consultant.


I don’t see how you can know the team dynamics and politics (all companies have politics) before joining somewhere unless you have a friend working there. And even then, what they pick up on might be different from what you would. I’ve seen a company I haven’t worked for, but which was on my radar, go from a hero company that employees loved to a mass walkout not long after it was bought out.

Buy outs are a big factor plus reshuffles of management and teams.

Not only that, I’ve had excellent vibes at places in the first 6 months only to find out the asshole factor was high later on. Also, even without any of this, stuff just changes, and I don’t see how that is avoidable. Change is constant! MMMV.


Please note I wasn't saying _before joining_, either. I agree on your points, though.

1. computer science is maths, software engineering is power point.

2. in software engineering it's always a people problem no matter what they tell you. see point 1.


There were so many... I learned programming by myself with limited resources when I was a kid. Eventually I got a formal education and there were a lot of aha moments!

For instance:

1. Writing a Scheme interpreter in C

2. Abstract data types and encapsulation

3. Functional programming and recursive algorithms

4. Grokking OOP and design patterns (this took me a long time)

5. Understanding how processes and scheduling are handled in an OS

More recently, not really an "aha" moment, but Git has been a game changer.


Using shell as a REPL for C.

I learned C on Windows, and before I learned any dynamic languages. And before I had ever written a unit test.

I knew all the rules, but I was not good at making a reasonably sized, correct program in a reasonable amount of time.

---

But then I developed a good workflow for writing Python, shell, etc., and then went back to writing C, and it helped immensely.

C is a dynamic language in many respects anyway!


When I realized that my code won't last forever, and in fact shouldn't. My code will solve a problem now, but in 5-10 years someone else will rewrite it. The company may pivot, or get acquired, or merge. So build a solid foundation, but don't plan for it to become the next world wonder.

1. lisp

2. realising that it's more about delivering than dreaming up the perfect abstraction (get it done).

3. what you think the user wants vs what the user thinks they want vs what the user actually wants vs what the user actually needs.

4. that there are always tradeoffs

5. building a product by yourself (whether on the side or starting up) will give you invaluable experience.


1. The first language I ever used/learned was batch, to script things in Windows. When I learned Python shortly after, I recall it took quite a bit of convincing myself to accept/use the "magic" of control flow being able to automatically jump back after a function returned, as I was so used to manually wiring up all my gotos (it was more of a painful experience than an 'ah-ha', I guess...)

2. When I was going through Tim Roughgarden's Algorithms course and saw the derivation of runtime complexity for mergesort and finally understood/visualized what a logarithm actually did (in school it was just taught as some rote function to help you manipulate equations of the form y=b^x)

3. Learning how TCP works from the bottom up. I think the biggest aha moment was when the textbook I was reading explained the TCP algorithm as a state machine that's just running on the client and server machines with the rest of the underlying network just forwarding packets, i.e. "pushing the complexity to the edge of the network".

4. Working through the nand2tetris project resulted in a lot of "oh X is just basically this at its core"

5. When going through a textbook explaining how relational database engines were implemented and seeing that they're essentially just using disk-persisted hash tables and self-balancing search trees to build indices on columns and make `select`s O(1)/O(log n) time (I wasn't taught this in my uni's database course and assumed there was some fancy magic going on to make queries fast)

6. Realizing that I could just do a form of graph search/dependency resolution when learning a new codebase/trying to understand how a function works. I think before seeing someone do this in front of me I would usually just panic at the thought of "thousands of lines of code" rather than just "keep iteratively diving into the functions being called". Whenever I'm learning a new language, the first thing I'll do is set up the LSP plugin in vim so that I can quickly navigate up and down call graphs. Tbh I don't understand how some developers claim to not need this and instead just manually grep+open file in order to "jump to definition".

7. Forcing myself to derive the rotation logic for AVL trees. I was curious whether, given just the high-level properties an AVL tree maintains in order to guarantee O(log n) time lookups, I would be able to figure out all the rotation cases. It was a very rewarding exercise and something I plan on writing a blog post about (eventually...)

(edit)

8. Learning about the log data structure and how state can be replicated by replaying/executing this stream of transformations/updates.
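
A toy illustration of that last point (my own sketch in Scala, not from any particular system): if state is just a fold over the event log, replaying the same log always rebuilds the same state.

  sealed trait Event
  case class Deposited(amount: Long) extends Event
  case class Withdrew(amount: Long) extends Event

  // State is a pure function of the log; replaying reproduces it exactly.
  def replay(log: Seq[Event]): Long =
    log.foldLeft(0L) {
      case (balance, Deposited(a)) => balance + a
      case (balance, Withdrew(a))  => balance - a
    }

  replay(Seq(Deposited(100), Withdrew(30), Deposited(5)))  // 75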


>Working through the nand2tetris project

I've done various sections of this and would heartily agree with how illuminating working through the project has been.


1. When I learnt TypeScript after having used mostly PHP and JS until then – not sure how I lived without stronger typing

2. When I started reading programming and software engineering books after having just "done it" – this brought so much thought and structure to what I used to improvise


My favourite Aha moment is the chorus of The Sun Always Shines on TV. Gives me shivers down my spine.

When I realized Emacs can do everything.


In the 90’s, as a teen I was reading the Amiga ROM Kernel Manuals to try to learn programming on the Amiga.

The section on graphics and UI described BOOPSI - an object-oriented way of constructing UI elements with inheritance, etc. Had never been exposed to that before and it blew my mind.


1. Objects finally 'clicking' into place for me.

2. Being an early adopter of Tulip (later renamed asyncio) and gaining an understanding of the event loop and concurrency without threads.

3. Understanding that all code is just a particular representation of some S-expression.


Microsoft Excel is a functional language IDE where each cell is a function expression.
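
One way to see it (a toy sketch of my own, not how Excel is actually implemented): each cell is an expression, and formula cells are just functions of other cells.

  // A tiny "spreadsheet": every cell is a thunk; A3 plays the role of =A1+A2.
  lazy val cells: Map[String, () => Double] = Map(
    "A1" -> (() => 2.0),
    "A2" -> (() => 3.0),
    "A3" -> (() => cells("A1")() + cells("A2")())
  )

  cells("A3")()  // 5.0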


+1 to this, but I came upon this indirectly.

My biggest "OK Moment" (1) was non developers building tools via Spreadsheets Sheets when they do not have a tech team or the team's bandwidth, Which makes them IDEs. They also function - independent micro data stores - a data exchange mechanisms

1: https://ravivyas.com/2020/07/07/stack-the-bricks-with-ok-mom...


When I realised it's not magic.

The first computer I had access to was my family's Windows 95 PC. I learnt to write HTML in notepad and see it rendered in Internet Explorer. This was the beginning of my career, but so much of what was happening was accepted by my brain as simply "magic".

I would later chip away at that stack of magic and learn how more and more things worked, but even while being an accomplished programmer I still had this feeling that magic existed. It wasn't until I learnt to build my own computer from discrete logic components (thanks, Ben Eater) that I finally felt like I understood it. Computers are just machines.

I've since revisited textbooks (like Knuth) and the history of computing (starting with Babbage) and feel like my eyes are no longer obscured by my preconceptions about what a computer is.


Dad's HP-67... I could program it... he later got an HP-41C. Dad bringing home his HP-85 for my school vacation, for "work". Turbo C. Unix. What is new is actually old... the "Mother of All Demos".


When I was introduced to Arc, it was the missing piece that completed LISP for me


When I realized that the entire software industry is a farce; the only purpose of which is the creation of jobs in an unnatural supply-side economy driven by unequal, preferential access to easy money.


Learning Clojure. All of the FP theory I’d learned in university finally clicked and made sense. Also, it challenged my notion that the only sane languages were statically typed.


If you are good at communicating, sometimes you do not even need to write a line of code.

Understanding what someone really needs should be a skill taught in CS.


Most recently: HTTP/2 has unpredictable performance problems when not used for typical GET/response patterns or binary streams.

I'd be interested to know what the 'aha' was when learning Express (and how you find it compared to Django)?

Reactive programming coming from OOP was it for me.


1. Finding out about WPF after years of work on WinForms: you can do MVVM stuff, control templating, vector graphics, etc.

2. React and Typescript

3. Jetbrains IDEs


Going through Ben Eater's 8-bit computer video course on YouTube. He builds a computer on breadboards. It is amazing.


Yeah, I would say the same about nand2tetris for me. I only discovered it after college; I wish I had found it sooner.


When I purchased the most amazing problem solving device I've ever had. A personal whiteboard.


How do you retain the learnings? I have a blackboard for the same purpose but am not able to commit to it for this reason.


Yes, that's a drawback. I bought a large sheet of whiteboard material for about $12 and cut it into quarters. If I have some subject that's really important, I'll let that one stay for a while.

I probably should transfer the really good findings to a notebook, or maybe take a photo. But I haven't yet.


1. That you only need VIM :qa


My discovery of Clojure followed by my discovery of Ruby's Lisp roots.


Realising that functions could be first-class objects, that is, they could be passed around just like integers or strings. Suddenly a whole new world of possibilities to simplify code magically opened up.
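
For example (a trivial sketch of my own, in Scala):

  // Functions are values: store them, pass them, return them.
  val double: Int => Int = x => x * 2
  def twice(f: Int => Int): Int => Int = x => f(f(x))

  List(1, 2, 3).map(double)  // List(2, 4, 6)
  twice(double)(5)           // 20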


The less code I write, the fewer bugs it will contain.


(gamedev) When I switched from Rust to C.

Networking and byte packing. Still amazes me.


the moment that I realised what a syscall is.


Most of the time spent developing is spent making decisions.

To me technical debt is therefore defined as how many decisions you have to make in order to create a feature. Clean is when you don't have to make many decisions to get things done.

Example decisions (I'd say I spend at least 90% of my development time on these):

- What feature would be good to have?

- Is the feature worth the effort to build?

- Is the feature worth the compute costs?

- What language/framework should we use for this feature?

- How should we structure persistent data related to this feature?

- Where should the code for this feature live?

- How should we test this feature?

- How performant should this feature be?

- What name should this helper function/variable have?

The more of those you have to think about when developing, the slower you will make progress. Therefore the main productivity hack is to write down guidelines or roadmaps or design documents for all of those so you don't have to think much about them when developing. This means don't be a manager when coding; let someone else do that work, or do it before you start programming.

Things you can do to reduce mental cost of above decisions:

- Product roadmap with features that would be good to have.

- Discussions in the roadmap related to how much value said feature will provide and the effort to produce it.

- Discussions in the roadmap related to how expensive the feature will be to run.

- General guidelines on what language/framework you use.

- Have a very good architectural document describing how you structure persistent data.

- Have a list of example commits showing where to put code for different features.

- Have a well documented testing strategy with examples pointing to commits with good integration and unit tests.

- Have guidelines on how long typical actions are allowed to take, like "a page update should take 100ms at most".

- Try to write code where you don't need a lot of long, superfluous names; namespaces and anonymous functions are your friends.

- Lastly, as much as possible, try to provide reasonable defaults for shared code. If you have to make 20 configuration decisions in order to use a library, then you won't save a lot of time using it, and likely people will just copy the configuration from other places anyway, since making 20 decisions is too much work. For example, let's say your library has a flag that can speed up processing 2x in some cases but adds extra overhead most of the time. You could think that forcing the developer to decide in each case, to ensure we aren't missing any performance improvements, would be a good thing, but in reality a 2x performance improvement rarely matters. So the cost of having every developer make this decision outweighs the performance benefit. Instead, have it as an optional config that they can set when they actually need the performance.
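
A hypothetical sketch of that last point (the names `process`, `Record`, and `Report` are made up): ship the common case as the default and expose the knob only for callers who have measured a need.

  case class Record(value: Int)
  case class Report(total: Int)

  // Hypothetical library call: the 2x "fast path" exists, but is off by default,
  // so most callers make zero decisions.
  def process(data: Seq[Record], fastPath: Boolean = false): Report =
    Report(data.map(_.value).sum)  // same result either way; fastPath only changes how

  process(Seq(Record(1), Record(2)))                    // the common case: no decisions
  process(Seq(Record(1), Record(2)), fastPath = true)   // opt in only where it was measured to matter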



