We've been doing CRUD in our industry for decades. How can we not just say "this is how you do CRUD, we're done w/ that now". We've been doing data serialization for decades now. How can we not just say "this is how you serialize"?
There are communities where this is the case. Why have we abandoned them? Why have we abandoned that knowledge and experience to reimplement things in language X or using platform Y?
We might not like to hear it, but my guess is it's a culture problem. They say the way to get ahead at Google is to build a new successful product. Is that the same thing we're doing? It's easier to get ahead by building a new Z framework than to become a core committer on X framework from 10 years ago? Are most X frameworks run by toxic communities? Is there something specific about software that means tenured projects become less and less useful/maintainable/understandable over time?
There's something in here that's specific to SWE. I don't know exactly what it is, but I think we should figure it out.
Software has so many more possibilities to explore than carpentry, which is constrained by our current physical technology. It's far better to encourage engineers to explore these diverse possibilities than to encourage conformity and allegiance to some singular path that everyone is supposed to agree on and work towards. You would simply miss a lot of different innovations by grinding away on the same path. Communities that do so just stagnate.
Sure, you can create a lot of fuss around it, but I feel we create that fuss out of ego, because we want to be seen as having come up with new ways.
The reason to not conform is ego. Software is perhaps the cheapest ego boosting tool ever created.
I worked at an agency that produced CRUD apps at a rate you wouldn't believe. Every task was correctly estimated to the nearest hour. Add xyz entity 2hrs, add xyz frontend widget 3hrs, change deployment pipeline 4hrs etc. This was possible because they picked a tech stack and stuck with it.
I've also worked at companies where doing the same task could be 2 or 3 days. A place where no task can be estimated smaller than 1 day. The reason being the infrastructure, deployment pipeline, tech stack etc is overcomplicated. Way too much overhead.
Unless you are building some massive scalable solution all you need for BE is Spring/Django/.Net and an SQL server with a single backend dev who knows his stuff. Frontend you might need to change frameworks more often but still you can go a solid 2-3 years building momentum before needing to switch.
Especially on the .NET side.
A general history of CRUD in .NET:
- Basic ADO.NET (Not too different from JDBC/ODBC, direct commands)
- First Gen ORMs; Linq2Sql (functional but only on SQL server, and missing some features)
- Entity Framework (4-6) /NHibernate. Lots of people wound up hating this, so they went to
- Dapper. Dead simple; Takes SQL and only maps the results back. Everyone loves it.... Similar abstractions are created over Linq (linq2db, SqlFu) as well, with less (but happier) adoption.
- EF Core is released. Everyone switches back over again.
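For reference, the thing Dapper-style micro-ORMs actually do is tiny: execute raw SQL and map the rows back to objects, nothing more. A rough sketch of that idea, translated to Python with the stdlib `sqlite3` module (the `query` helper and the `users` table are made up for illustration, not Dapper's actual API):

```python
import sqlite3

def query(conn, sql, params=()):
    """Dapper-style helper: run raw SQL, map each row back to a dict.

    No query generation, no change tracking -- the row mapping is the
    only abstraction, which is roughly why people found Dapper refreshing.
    """
    cur = conn.execute(sql, params)
    cols = [c[0] for c in cur.description]
    return [dict(zip(cols, row)) for row in cur.fetchall()]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (name) VALUES (?)", ("Ada",))

rows = query(conn, "SELECT id, name FROM users WHERE name = ?", ("Ada",))
```

You write the SQL yourself; the library only saves you the row-to-object boilerplate.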
The whole thing is silly.
We're verging towards this already: "No Code", PaaS, FaaS, Zapier, etc. I'd be super surprised if there were lots of CRUD jobs in the industry in 10 years.
That being said, I assume that the first few decades of carpentry didn't undergo as many changes as software has in its first few decades. My theory is that software changes so quickly because it can be bootstrapped. When framing a house, you can learn from the process so that you can make the next frame better by changing the process, but the output of that process (the house frame) doesn't directly affect the next time you attempt it. On the other hand, you can write software that invents an entirely new process for editing software (e.g. a compiler or interpreter), which you can then use to write software that might not have been possible before. You can then repeat this process with the new software, creating yet another paradigm for writing software, and so on. More generally, when a process produces a tool that can then be used to do the process in a new way, the process will be able to evolve much more quickly than if it can only be updated with the output of other processes.
A Kurt Vonnegut quote comes to mind:
"Another flaw in the human character is that everybody wants to build and nobody wants to do maintenance."
I'd love to go back to old code with the benefit of deeper domain knowledge and greater understanding of my tools and be able to make products even better. However, it's hard to square that against making +20% earnings by helping build a new chat app.
Is that really the case? Forums like this look down on maintenance a lot, but I find that real-world companies do so much less.
People talk about it like there's something wrong when, at any given time, a few microservices are being rewritten. But I would expect that for a sufficiently large machine, on any given day a few parts are being replaced.
It's changing requirements. When you build a house, people don't come in 6 months later and ask you if you could make one small change by placing some jet engines on the walls so the house can fly somewhere else during the summer. It's just a small change, right?
The problem is that in code, it often is a small change. Or at least, it is possible to make one quick adjustment to satisfy this new use-case. But often, these small changes are also a hack which doesn't fit into the previous overall design and which would've been implemented in a completely different way had the requirement been designed for in the first place. Now, one of these "small changes" doesn't tend to kill the product, but years or even decades of them do. That's why refactoring exists in software engineering, but not really in home building. Well, in some sense it does exist, in the form of renovating. But nobody thinks it's a good idea to completely renovate a house 25 times around an architecture that just doesn't work anymore for what it's being used for.
If you build a piece of software for exactly one well specified use case and only use it for that, it'll probably run really well forever. But (almost) nobody does that.
If someone wants a custom built order and inventory management system, that’s like asking a carpenter to build a custom 4 story house from some napkin sketches.
The whole reason computers are valuable is because they automate away all of the rote, repeated, predictable stuff. The unpredictable part of SWE is not comparable to carpentry, it’s more easily compared to architecture/engineering where the problem statements are vague and most of the job is getting agreements on what the thing will actually be. The carpentry part of programming is mostly predictable.
Anyone is free to compare software development with any engineering field, which typically have to solve large problems.
Thus if you feel carpentry is not a good comparison, then look into civil engineering.
And no, the key factor is not the 'power' of software tools. The key factor is stuff like processes and standardization.
Sometimes it feels like software developers struggle or even completely oppose adopting and establishing processes and standards of doing things. Hell, the role of software architect is still in this very day and age a source of controversy, as is documenting past and future work.
We're talking about a field that officially adopted winging it as a best practice, and devised a way to pull all stakeholders into its vortex of natural consequences as a way to dilute accountability. The field of software development managed to pull it off with such mastery that even the document where the approach is specified is not a hard set of rules but a vague "manifesto" that totally shields its proponents from any responsibility when its practices deliver poor results.
If an entire field struggles with the development and adoption of tried and true solutions to recurrent problems, then it isn't a surprise that basic problems like gathering requirements and planning are still the bane of the whole domain.
What's standard in building a bridge? You have some physical constraints (length, what the bridge is crossing), material properties, environmental constraints (temperature, weather, wind, what soil you are building on), and what kind of traffic. Then there are standard 'shapes' (though it's your choice of suspension or whatever). You then have a ton of standard tests that you run to check that the bridge is fit for purpose. But it's not like the bridge is built out of legos, and even if a lot of standard subcomponents are used, the assembly will still end up being fairly unique due to every location being different.
Software does in fact have tons of standardization. No one thinks of processor arch when doing web dev. Or DB implementation. Or how you modify the DOM (there are a handful of libraries to choose from, similar to a handful of bridge designs).
How do you make a CRUD app? You can do some arthouse project, or you can just use Rails or various Rails-like frameworks. They're all mostly equivalent.
How do you serialize data? JSON (before that, XML, I guess). Yes, you can do something different, but you can also build an apartment building out of reclaimed wood.
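The boring, standardized path really is that short. A minimal Python sketch of a JSON round-trip (the `record` here is an arbitrary made-up example):

```python
import json

record = {"id": 42, "name": "Ada", "tags": ["crud", "serialization"]}

# The unexciting, standardized path: serialize to a string, parse it back.
payload = json.dumps(record)
restored = json.loads(payload)
assert restored == record
```

That's the whole point of a settled standard: the interesting decisions live elsewhere.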
The real uncertainty lies at the User Interface, which really isn't engineering driven, it's fashion and art and stylistic architecture. So yes, the way websites look tends to change and be fuzzy, but so do clothes and no one complains about that.
I think software people both overestimate the standardization of physical engineering and underestimate the complexity of physical engineers' decisions; presumably they're not just following a recipe.
TLDR: When software standardizes a tool or process it becomes invisible, an import and forget somewhere in the pipeline of tools we use. This makes it seem like there's a lot of churn. But the churn is a bit of froth on the top of really solid foundations. Yes we're always working in the churning part, but that's because the foundational part requires almost no interaction at all.
Now there is software that tightly follows specs and standards, and you typically find it in critical systems, such as medical and aerospace. But there are orders of magnitude more software projects than non software engineering projects because they require so little to instantiate. There is almost no barrier to entry with software, and no BOM, and no supply chain.
Perhaps it would help to only call a subset of software projects as "engineering" - that would solve the problem. Not all software needs to be engineered. I don't need to engineer a script that downloads some videos for me or my personal website. And that's not a bad thing.
By contrast, I’m still writing code more or less the same way I was 10 years ago, with mostly the same tools, and have not seen “order of magnitude” level of anything contributing to my productivity.
This particular angle is explained in the article
>> This is ego distraction in action. Self comparison determining effort. If we feel like we’re ahead we continue to put in the effort. If we feel like we’re not, we determine it’s not worth the effort.
The reason people would prefer working on a newer project/framework/whatever is that there is a higher chance they might be able to contribute meaningful code / support. I am admitting to that, and I am sure many have similar thoughts. It is purely guided on where one thinks success is achievable.
Also keep in mind - progress is being made. Python is clearly more productive than Perl. Or Django vs CGI/FastCGI. So 15 years ago, if those were my two choices for two projects, I would have taken the path of Python. Not just because it was new & shiny then.
Fast forward a decade, Go is clearly more productive than many things that came before. Kafka is clearly easier to manage than home-grown queues via databases and flat files. So why should I stick to old process?
The problem I feel is the lack of arriving at any standards for anything basic. We have 10 message queues, but limited interoperability. We have 50 popular databases, but no easy migration. We don't even have universal support for Parquet in all languages even though it has been around for a while. When can I grep a Parquet file? Even something as simple as linking Azure blob storage and Amazon S3 together can't be done without arcane and inefficient copying.
New languages are popular. Why are they popular? "Because they are better." But in every other domain of software we also say "The best technology doesn't always win." Why would languages be any different? What if Go is, in fact, Worse is Better? And if it's a Worse is Better, then what is the Right Thing?
Ultimately, I think most programmers, given enough experience, eventually settle on a style and propel the style through the language, not the other way around. And to that end, there can always be new languages so long as there are styles of coding that remain unaddressed.
But this is counterbalanced by the assumption of a rationalist project existing: that code is made to be shared, and to be shared, it must be standardized.
If one looks at the hardware/software ecosystem, it is not rationalist in the slightest, though. It is a merciless field of battle where vendors maneuver against each other to define, capture, control, and diminish standards. The small ones seek to carry a new standard; the large ones absorb the standard into their empire.
Software bloat is a result of this: everything must go through a compatibility layer, several times, to do anything. Nobody understands the systems they use. With each wave of fresh grads, another set of careers is launched, and they join in on the game and add more to the pile.
In that light, rational standards do not exist. They are simply the manifest ego of "us and them", and therefore are mostly a detriment for all the reasons that ego is a detriment.
There exist several examples of excellent feature scaling from small codebases: VPRI STEPS, various Forth, Lisp, and Smalltalk systems, project Oberon, and microkernels such as Minix. The quality they all share is an indifference to standards: they are sometimes used where convenient, but are not an object of boasting.
Therefore I currently believe that developers should think of standards as reference material, not ends in themselves - that is, you use one if you can't come up with a better way of getting the result.
Say, there are two approaches for a problem - how do we decide which one to go with? In the last 10 years I have not seen a single case where the decision was made based on something other than the subjective opinions of a person or a group of people. "Past experience", "this is how it's done here", "this is the only way I know" and countless other reasons - all of those are subjective and cannot be used for objective comparison of approaches.
You could say "days to implement" or "money spent" is such a metric - but then, there are no reliable ways to mathematically calculate this measure for any code you plan to write, and then prove it in advance.
To put it another way - there is no standard unit of code/system correctness by which we could measure what we are actually doing or plan to do. Until one emerges, we are bound to continuously reimplement the same things over and over again, justifying it by nothing other than our projections, prejudices and boundless ego.
Complexity. Understanding a legacy codebase is pretty much a small-scale research project. You need to gain domain knowledge, become familiar with the team, and get acquainted with the codebase and its history before you'll be able to reliably tell bad code from clever solutions to tough problems. The longer a codebase is developed, the more there is to learn and retain in your head. It very quickly becomes just too much, which means onboarding people takes a lot of time, and day-to-day development also involves being extra careful, or creating obscure bugs - both of which make the project take longer.
> They say the way to get ahead at Google is to build a new successful product. Is that the same thing we're doing? It's easier to get ahead by building a new Z framework than to become a core committer on X framework from 10 years ago?
Yes and no. Not every one of us plays the office politics. Some of us code because we like it. The yardstick then is one of personal growth, the ability to comprehend and build increasingly complex, powerful and beautiful systems, or automate mundane things faster and faster.
But, regardless of the "core drives", one thing is true: building a system from scratch is a much faster way to learn about the problem domain than trying to understand someone else's system and maybe contributing a patch somewhere. We learn by doing. That's why there's so many half-baked libraries for everything out there. Yes, there is ego involved - particularly for people who go out of their way to make their half-baked libraries seem production ready - but a big part of the picture is still that programmers learn by writing code and building systems.
(The difference from most other professions is that people there can build stuff xor share stuff - not both at the same time.)
And it's not my area, but this seems to be true in construction as well? The building codes change, and available materials and components change, as do their relative prices. Maybe not as fast, but fast enough to make older books out of date.
Also, side note: with respect to carpentry, books from 50+ years ago on wood working techniques, framing, joinery, etc. are perfectly relevant today. And many of my grandfather’s tools are still in use in my workshop.
Humans are pretty good at physics. At the layer of abstraction where carpenters work, our predictive ability is solid.
What fields of science are the primary judges of "good software"?
> Programs must be written for people to read, and only incidentally for machines to execute
> -- Harold Abelson
So it is pretty much _all_ psychology and cognitive science.
Humans are not yet that good at cognitive science because brains are complicated. There is real disagreement about how Working Memory operates -- and Working Memory is core to why modularity matters!
As an analyst, can you explain this bit?
I keep hearing things like "that's not actually a software development job, just CRUD", "we're done with doing CRUD" etc. But it seems like between the application and the DBA all the CRUD is taken care of, wouldn't the developer just work on the application itself? And isn't saying "we don't do CRUD anymore" somewhat akin to saying "we don't do [+-*/] anymore"? How can you have persistent data without CRUD? I must be missing a piece of the puzzle in this discussion.
The data that we manipulate has business meaning and there are consequences for the users that arise from how we model things. Consider the genre of articles like "Falsehoods Programmers Believe About Names". There is ridiculous complexity here, for those willing to see it, but some people get tired of it.
The "it's not _actual_ development" framing is usually directed at applications which "only" allow users to perform basic actions on some data, basically UIs for manipulating a database. It is absolutely real development (in my view), but less sexy than AI/ML, big data, etc, etc.
You are correct that every application (with some sort of data persistence) needs CRUD. But how CRUD is implemented, for better or for worse, depends on the requirements of the application storing the data. For (most) relational databases, the low-level "how do I CRUD" is well defined: standard SQL queries. But if I use NoSQL, or flat files, or something; it changes.
The definition of CRUD also varies depending on the layer of abstraction within an application or the perspective of the user/developer. For example: from a DBA's perspective, CRUD is SQL queries. From a UI, CRUD might be a JSON API or GraphQL endpoint. From a server-side application, CRUD might be a specific ORM library.
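To make the DBA's-eye view concrete, here is a minimal sketch of the four CRUD operations as plain SQL, using Python's stdlib `sqlite3` and a hypothetical `items` table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")

# Create: insert a row
conn.execute("INSERT INTO items (name) VALUES (?)", ("widget",))
# Read: fetch it back
name = conn.execute("SELECT name FROM items WHERE id = 1").fetchone()[0]
# Update: change it
conn.execute("UPDATE items SET name = ? WHERE id = 1", ("gadget",))
# Delete: remove it
conn.execute("DELETE FROM items WHERE id = 1")
remaining = conn.execute("SELECT COUNT(*) FROM items").fetchone()[0]
```

At the other layers mentioned above, the same four operations get re-expressed as HTTP verbs, GraphQL mutations, or ORM method calls - the semantics stay the same, only the surface syntax changes.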
Mapping state to the database is to web dev what applying paint to the canvas is to painting. It’s how you do it that counts. Saying otherwise is overly reductionist.
Frameworks exist that abstract CRUD away. But you end up sacrificing UX and / or flexibility.
One of the many reasons why CRUD is way harder than its reputation credits it with.
I think it's mostly a class thing though. Test automation is similarly looked down upon even though it is often much harder to do right than regular coding.
There is a definite pecking order when it comes to programmer roles and it's not necessarily related to difficulty (although it correlates very strongly with pay).
I think that might have to do with this complexity, but also: software has so many ways of doing something, even within the same language -- and that gets permuted across, say, five different languages (Python, Rust, PHP...). It's impossible to say the "right" way to do it because there are multiple ways to achieve a valid result that's readable, AND there is a margin for disagreement about what counts as "readable".
I was just thinking today about how to teach someone to be better at debugging.
We're prone to tediously repeat the same conversations over and over and take the cosmetic approach rather than the fundamentals-first way of doing things.
Early in my software engineering career I would constantly and painfully wonder if I was actually capable of fixing a certain bug or solving a new or difficult problem. But then after working hard on a solution, 99% of the time it would work out. After going through this process of debilitating self-doubt and eventual success over the course of years, it has become much more manageable.
I still sometimes panic when initially faced with a very difficult programming problem, but I can put those fears to rest much more easily by saying, "ok, I've solved hard problems before. I may not know how to solve this particular problem yet, but I feel confident that I will be able to figure it out just like I did in the past with difficult problems X, Y and Z."
At the risk of sounding pedantic, part of leaving the beginner phase — and the true value of experience — is developing a kind of armor against those feelings of inadequacy (of course you don't want this to go too far into feelings of overconfidence or an inability to reflect when things do go wrong).
I also think it's the responsibility of more senior engineers to recognize when a more junior teammate might be having those self-doubts and be empathetic while helping them build up their own successes.
The following is just my experience, but it's a bit of an odd one! So I thought I'd share for fun :)
I have this too and I only have 1 year of work experience.
What helped me was doing a course where I needed knowledge I didn't yet have. The course was about analyzing binaries and malware; I didn't know any C and almost no x86. I did the course as a challenge, and to date it has been the most difficult programming challenge of my life. Teaching yourself two prerequisites while following a normal course load, all while feeling insecure and strongly suspecting you're not intelligent enough, was tough, for me.
I've worked at 3 companies in that little 1 year of experience (2 times as a freelancer), and nothing has come close yet. I'm waiting for the day work finally gets tougher, but I've heard from people who are actually experienced full-stack devs of 4+ years that that course was way harder than anything they have ever done.
So long story short: do super hard courses. If they're not the hardest courses of your life, then it isn't hard enough.
Give yourself some time in between too. It is all about the fun.
Everyone on the team should be allowed to make mistakes or bad choices. Juniors, seniors, managers... we are all people, we're all engineers, we're a team! A cult of perfection is limiting to everyone. The manager's job is to recognize everyone's contribution, no matter how small.
Too often, teams project a higher bar than is actually reachable. Sure enough, it leads to a sense of inadequacy, for absolutely baseless reasons.
It’s the years of applying Zen Buddhism, scheduling your chores, or staring at the mirror telling yourself you’re a great person that change you.
I know because I recently recovered from a major depression and anxiety, and everything that I’ve done that has actually helped, like lying to myself in the mirror, or convincing myself no-one on the train was actually judging me, took 6+ months to have a real lasting effect.
It’s the same with distractions. Just look at your screen time spent on your smartphone today. It’s probably a couple of hours by the time you go to bed. Like it is for the rest of us. Most of that time is frankly wasted, you know it. I know it. But reading a self-improvement article about how cutting down screen time is healthy for us isn’t actually going to change our behaviour one bit. Maybe for a day or two, but not next week and certainly not next month.
This is well put, and I think part of the reason so much self-improvement material is drivel. Generally, I've noticed that some of the most pathological people are the most into 'self-improvement' as an idea. That being said, their brand of 'self-improvement' generally does not extend beyond reading and quoting books by various gurus.
On the flip-side, those I've met who are actually highly motivated and disciplined, have never picked up one of those guru books.
Reading up on something is one thing, and in many cases, it's an important first step. There's no way to start using a new language without reading something. That being said, simply reading is not enough. On top of that, what you read has to be actionable. The self-improvement platitudes are not actionable. Reading a book on Python does not turn you into a python developer. Why should reading a guru book turn you into one?
I've noticed that some of the people who spend the most time paying attention to their blood sugar are diabetic.
Work without reading about it: might work, and often does.
Reading about it without doing the work: way less useful.
Work without research often is actively harmful in addition to failing and wasting time & resources. Research at least improves knowledge while not wasting other resources besides time.
However, I felt the context of this conversation was self-improvement. In this particular context, it's easier to get things done without reading any motivational books/articles (in fact, most people get things done without reading about how to self-motivate), and the contrary -- reading self-improvement articles -- doesn't mean anything if you don't do the actual work.
Let me quote the initial post of this subthread, which is the sentiment I agree with:
> "The thing about working on yourself is that it’s actually work. Reading an article, or a book on behaviour, self-improvement and what else doesn’t actually change you any more than reading Harry Potter does."
I wouldn't try it with lab work, though pioneering work sometimes was that way ;)
Our tech lead is a bit of a diva. He is smart, but basically he just programs and doesn't bother with much else. He bangs out code quickly, but it can be buggy, and it's usually the rest of the team that fix the bugs, keep the infrastructure running, and write the tests. He is good at tricky algorithmic stuff. His code is fairly well organised. I don't find his abstractions particularly good. The REST API he created is terrible (poor abstraction) and not RESTful: a lot of it uses POST for everything, and 200 success responses contain errors. No tests. Terrible at explaining his work to other people. Poor at listening.
Give me a good team player with average ability over a good programmer that lacks the other skills any day.
Python's PEP 20/Zen of Python is one of the best guides for craft, imo. It works well for individual programmers as well as for teams.
Programming -with intent- is the best way to get better.
If I code up a 10k LOC main.cpp with stringly typed data structures, I'm not really better at programming, am I?
It's like that saying: practice doesn't make perfect, perfect practice makes perfect.
Programming is not literally just typing, as we all know, nor is it simply getting a Thing to compile. A lot of it is educating oneself on different types of data structures, algorithms, math, architectural practices, and so on. Expanding our workbench of tools, as it were.
And -then- putting that into practice when actually programming.
I'm not good at Rust merely because I've worked with Rust a lot; I've also read books on Rust, and I've read many web articles on Rust (found from Rust Weekly) and various libraries, etc.
I think you are; you're better for having done that than you were before. It might not have been the best improvement you could have gotten out of the time, but still.
The relevant XKCD is this one: https://xkcd.com/1414/
Basically it goes: If you just keep working, will you always keep making progress towards where you want to be?
I don't necessarily mean "if you just put in the effort then you'll succeed," which I do not believe in. People talk about "practice with purpose." You have to know the parts that you need to improve on and correct them if your actual intention is to get better at something. I believe that works better than taking any arbitrary action at all, with the same goal in mind.
So the problem is not knowing whether writing that 10k LOC program actually helps or not. I forget things I've done. I lose interest.
Then I extrapolate from this and think, then there must be some spectrum of things in between that are not practically useful, and if I keep doing them then I will not improve in the ways that I want. I will believe that maybe writing a stringly typed C++ application is just reinforcing bad habits that I will have to expend extra effort to undo later. I then believe if that's the case then I ought to not do that thing at all if I believe it's just going to hinder my progress.
The problem is that this mindset costs me a lot of my action, because I figure if what I'm doing is not beneficial for my skills then I'd better get something else. A lot of the time that "something else" is something less challenging, all the way to nothing productive at all. So I end up believing I'm just coddling myself in an attempt to avoid "wasting time" not really improving.
I think this kind of fallacy stems from a fear of banging my head into a wall expecting to get better at some point without knowing if I'm actually on the right track. At least if someone knowledgeable teaches you they could suggest so. And that fear stems from placing too much value on intellectual success as opposed to enjoying the process. If you only enjoy something on the condition you improve, then it discourages you. I've been discouraged a lot.
It could also be due to divorcing enjoyment of something from improving at it. I simply always care about improving, and if I don't see improvement then I'll lose interest. But some say that people who enjoy things just improve on the basis of doing it at all. I just can't seem to get myself to believe it, though.
2. Bias towards action. If you want to start running, just go run. Don’t read about it. Don’t sign up for a race. Don’t buy better shoes. Just go run for a while (or write some code or say all the Spanish words you already know out loud).
3. Spend 10-20% of your training time (do not go outside of this range) on improving your training. This is when you watch that video about your activity. People naturally gravitate towards 0 or 100% of time in planning. “A little bit” is the best but rarely done.
4. Check in with someone better than you on a regular schedule to make sure your training is progressing well. Weekly is very good. This could be a coach, mentor, partner, something like that (not an accountability buddy).
It's because we lack confidence in knowing if our system of improvement is going to work and we don't want to waste time.
I think you kind of need to Let Go and enjoy exploring or maybe just take structured online classes that you pay for.
You are applying it when you change as a result of reading, and especially by using what you learn. Change is essential; if you do not change anything, you are not applying it.
It is easier to just (passively) read something than to apply it. The problem with reading (or watching videos) is that it can be used as an excuse for procrastination, since it is far easier to do something passive than something active.
The most interesting thing is that the problem is not in the reading itself. I worked with a kid whose parents were worried because he used video games to procrastinate. They put the console out of the kid's reach, and then the kid would just stare at the wall for hours, daydreaming.
So my advice is for you to start applying what one book about procrastination says. Select just one good book and start applying it to your life.
It is very important that you just decide and pick one. I don't know, "The Now Habit", for example.
Write down in a journal the difficulties you face and your emotions while doing so, and work on it consistently.
I think, though, that the author makes a really valuable core point: Most challenges are hard not because of the subject but because of our approach and perspectives. I can't think of anything important in life that doesn't benefit from the exploration of metacognition.
If you get stuck, you tell yourself in whatever way you want, and honestly, some version of the following: "I don't understand this thing that is happening, but I know there is a cause. It does not happen without cause."
Honestly, it's a bit odd, and I don't know if that's the best way to express it in English. Nevertheless, several people have come back to me and told me that it has helped them.
My initial inspiration, and hypothesis, is that the simple acknowledgement that I don't understand the problem, and that the problem, despite my lack of understanding, still follows the laws of cause and effect, somehow temporarily halts our brain's tendency to protect our ego at almost any cost, logic be damned.
I started trying this out after puzzling about why it's unreasonably common to figure out the answer to something only moments after you get up from your desk to go ask someone else for help, even when you might have worked with it for hours. It had to have a reason, although I don't know exactly what it is!
Well, you've heard the advice about looking at a problem from a different point of view, right? Usually this is intended in the sense of changing the context or reframing the problem, and it works, but it takes effort because we all have our default go-to mental models. It turns out that changing the mode of your thinking (e.g. visual vs. kinesthetic) is just as helpful, and the act of trying to phrase the problem verbally is usually just different enough from merely thinking about it (even, I believe, if you are mostly a verbal thinker) to do the same trick.
Hence "rubber duck debugging" where you solve the problem by describing it to a rubber duck rather than another human.
Seconding this, a quote I have from a past computer science teacher of mine is: "Someone with a brain wrote this code, so you - as someone with a brain - can understand it". Definitely helps me when I'm really stuck on a problem.
Every effect has a cause, and when debugging your job is to know your codebase well enough to be able to quickly pinpoint that cause.
One thing I would add is that intrinsic motivation seems to be framed and activated very differently depending on one's personality. I find myself performing best when I'm on the edge of failure, trying to catch up to the high performers, and when recalling past times when I overcame failure or adversity. Comparing myself to the group described as demotivating in the post is the best motivator for me. And then there are little tweaks to one's environment (for me it's coffee, exercise, occasional travel, specific movies and music) that I find end up making an enormous difference in motivation, focus, and overall mental state. I suspect this has a lot to do with personal physiology and the environment in which you grew up.
With that caveat, the post is incredibly thoughtful and helpful, and I really enjoyed reading it.
The issue for me is that I really struggle to turn this theory into effective practice. Each time after reading "The Practicing Mind", I have tried to cognitively remind myself, whenever I was frustrated, to stop and look at the problem as a beginner would, to drop my ego, etc.
The problem is that it would sort of help, temporarily. I'd find myself a little bit better at getting a solid day of work done, but not dramatically better. After a week or so, I'd forget to even do the exercises, and I'd be back to struggling.
What honestly helps more than anything, the "magic bullet" really, is pharmacology (Adderall). For me, it somehow calms me down. I don't feel more energy; I feel tranquil, and able to let defeat roll off my shoulders.
Sadly, taking Adderall is not a sustainable solution. Amphetamine is a neurotoxin which raises blood pressure. Not to mention, I don't like being "tranquil" for anything other than my work. I like my 'normal' state of semi-uncontrolled energy, which is great for exercising and video games. I'd like to be able to turn this feeling on or off, and taking a medication doesn't allow for that.
So I tend to seesaw between three states: 1) Struggling at work, barely getting by, quality of life sucks. 2) On medication, happy at work, feeling productive and peaceful, but wanting to get off the medication. 3) Off medication, using "Beginner's Mind", but finding my ability to implement it in a strongly effective way absent.
What helps for me the most is intentionality. To literally set my intention for a day or for a problem right before I jump in. So if I know I'm about to jump into a tricky problem I literally take a few seconds to remind myself of the attitude I want to bring and even exactly what I want to focus on.
So this would be things like "Don't try to judge difficulty (easy/hard), just go wherever it takes me" or "Don't be afraid of the amount of work". One that's super helpful for me is deliberately separating the "understanding" part of a problem from the "solving" part, so I'd tell myself "I'm just trying to understand what's going on right now; solve later". Etc., etc.
Hope this helps.
As always ymmv, and good luck :)
Learning to program as a kid was probably one of the most exciting developments in my life up to that point, and I expect that's true for many people on this forum. I originally attributed this to programming's usefulness, and the mathematical beauty of watching all the pieces fall into place when solving a problem. And those were surely both important motivators, but, looking back, the primary motivator was the pure power trip of it. Programming is extremely powerful (software is eating the world, after all), and I could immediately sense that, and that power was the biggest high I got from it.
Throughout my teens and twenties, I didn't really consider this, and just followed the high, and it led me to develop skills and a successful career as a programmer. For me, it was a positive feedback loop, where the more I put into programming, the better I got, and the bigger the ego boost. Unfortunately, though unsurprisingly, it got to a point where my inflated ego started getting in the way of my personal relationships, and even my self perception. I considered myself a great programmer, but not a very good person. I became quite self-loathing for many years, but I've noticed that's healed up after moving away from programming as a primary job responsibility, and my personal relationships have benefited, too.
I still love programming for the beauty of it, and I still dive into little personal programming projects a few times a year. Part of me wishes I did so more often, but I'm held back because the only way I've found to get through a project of any duration longer than a few days is to basically develop delusions of grandeur about it. Programming is fun and beautiful, but very hard, too, and somehow without the promise of the conference talk, or the influential git repo coming out of it, there's just too much friction. So, more often than not, these days, I simply don't bother. I guess with my current middle-aged testosterone levels, I'd rather keep my family and friends than be king of the world.
(That said, if anyone out there finds this relatable but has been able to push through and develop a healthier, less ego-reliant relationship with programming, I'd love to hear about it!)
Before getting into programming, I was a somewhat accomplished guitar player. By the time I was 20, I had played in a bunch of bands, recorded several albums, and gone on tour. As a result of these early successes, I developed a big ego about myself as a musician.
I realize now that the main thing driving my musical career was that ego. I enjoyed playing, but getting better at my craft was not my primary driver. Instead, it was that I wanted to be famous and rich and noteworthy and desirable. For me, playing guitar was inexorably linked with becoming a certain kind of person and gaining status.
Now any time I pick the guitar back up for more than a day or two, I quickly get lost in delusions of grandeur. I start thinking about how I'm going to change my whole lifestyle to "be a great guitar player" and playing itself takes the back seat to fantasizing about gaining power and status. Try as I might I can't just casually play guitar for its own sake—kind of like how you have trouble programming without the promise of a conference talk or an influential git repo coming out of it.
For me the solution has been to avoid playing music, and to focus on programming (and my family/friends) instead. I think the groove of ego I carved out as a guitarist is just too deep to allow me a healthier relationship to music. As a programmer, I don't have that same narcissistic false-self to live up to. I just enjoy it and want to get better because it's fun.
Maybe the solution for you could be to take up a creative pursuit other than programming?
I suspect that sort of thing won't be much help to you, however, because for me, my motivations have been relational more than power-based.
That, and I find enough joy from programming just from the utility of things I've worked on, rather than needing it to be a vehicle for influence.
Still, I'm very interested to see where things go for you.
I think this part is definitely true for me...
I have let "notice when you are confused" and "understand the impact of your work" and "make sure you are building the right thing" and "make sure you know stakeholder needs" get kinda etched into my identity. I keep wanting to _understand_ the systems I work with and I keep getting distracted by noticing problems with its UX or implications to business process.
I can turn that voice off with deliberate effort, but I don't know how to get it to stay off.
Does anyone else have any methods for more permanently-silencing UX-worries and just cranking out code?
About giving up on projects and how the ego plays into that, I don't think in such black and white - giving up = bad, persevering = good. Sometimes you need to give up in order to find a better approach. There are reasons why this instinct is present in our species (something to do with the exploration exploitation trade-off). We can't paint over it with self help advice.
Comparing yourself to others is bad? Why? It's an evolutionary advantage to learn from the experiences of others. By making comparisons you can calibrate your values. Competition is a great motivator. Having a role model can be a fast way toward improvement. Comparison between peers is like a second-order metric, whereas first-order metrics rely only on the self.
The advice about not comparing yourself to others is useful only in a limited setting - where you devalue your accomplishments and have nothing to gain from it. But when comparison motivates you to improve, then it's actually not bad. Also when comparison prompts you to take action and avoid a crisis you could be spared a lot of suffering. Comparison can act like an alarm. Another function of comparison is to make groups more cohesive - if they form a common culture they can function better - so aligning oneself to the group can be beneficial for all.
Don't be put off by the title, it's a wonderful read no matter if you think you're gifted or not.
I read that book this year.
It definitely contributed towards resolving some unresolved childhood trauma, and I'm grateful for it, but it was no walk in the park.
My counterpoint was that while I enjoyed the book's results, the reading process was the polar opposite of wonderful.
I felt my anecdata might be helpful to those who might pick up the book.
“Whenever distress or displeasure arises in your mind, remind yourself, “This is only my interpretation, not reality itself.” Then ask whether it falls within or outside your sphere of power. And, if it is beyond your power to control, let it go.”
I need to teach myself to take the ego hit and that in the long term it'll pay off more than independently struggling on a problem.
I try to counter this by reaching out myself when I need help, which is pretty regularly. I'm hoping my younger colleagues will see that even the old fart needs help sometimes and isn't afraid to admit that he doesn't know something.
I disagree with this [at least for me personally]. Being a good programmer is just straight up not part of my identity. There are millions of people who are much, much better at it than me, and that's totally fine.
The reason it's hard to work on hard problems is that they're hard! Sometimes programming can be really difficult, or a slog of an activity. It's the same with mastering any skill: learning to play guitar, becoming an Olympic athlete, whatever. You can't Zen Buddhism your way out of the fact that you're going to spend years and years practicing until your fingers bleed, or until you're completely exhausted, etc.
Many Zen Buddhists know how long and hard practice for mastery is. Practicing meditation is in many ways skill acquisition as well, which is why it is called a practice.
You’ll of course be spending years and years practicing programming, and the insight that ego identification gets in the way and one must practice beginner's mind is a simple yet deep understanding that comes from years and years of practice.
You can't just 'of course' this! I mean, you can, but that's the whole point. If you, or anyone reading this, cared enough about being a great programmer, you wouldn't be on this site in the first place. Which is fine; I enjoy wasting time on here as much as anyone. But the people who are actually really good at programming? They're not reading blogs about ego. They're not writing blogs about ego. They're programming.
Look at what Fabrice Bellard has accomplished in the last 20 or so years: https://bellard.org/ (QEMU, FFmpeg). I will never even be close to the level he is at. I'm much closer in relative skill to the person who just wrote their first hello world yesterday, and I've been programming for 15 years or so. And that's totally fine with me. Programming is not my only interest in life.
For the blog author, it seems they're searching for a reason that they're not as good at programming as they think they should be. I mean, he's already a Staff Engineer at CircleCI. He's not someone who's been programming for a year. It's very possible that he's pretty close to being as good as he'll ever be. Sure, he'll keep improving, but he'll never be Fabrice Bellard. If he were, he already would be, and he wouldn't be writing a blog about why he's not.
So what I would say to him is: that's fine! Life is not just programming. "The Second Truth is that this suffering is caused by selfish craving and personal desire." You wrote an article about how your ego gets in the way of you becoming a better programmer, but it's your ego that makes you want to be a better programmer in the first place!
The point of the article isn't that there is a shortcut around having to put in the time, it is that ego sabotages your efforts to put in the time, and if you're somehow coerced into spending the time anyway can prevent you from receiving the expected benefit (eg. just staring at the code on the screen probably won't help, unless the problem is literally a typo).
Sure, but what I'm saying is that that statement is almost meaningless in the grand scheme of things. Is not letting your ego get in the way a necessary part of reaching mastery? Absolutely. But no matter how much you change your thought process or analyze the problem, you still need to eat the proverbial whale.
Think of all the people who've ever played basketball and have had any aspirations of joining the NBA. For the vast, vast majority of them, they just were never going to be good enough. They didn't have the talent. It didn't matter how much they practiced, how much they got their ego out of the way, how badly they wanted it. Only about 3,000 people have ever played in the NBA.
Now, I'm not saying you need to be in the NBA to be a "successful" basketball player, or whatever you want to call the equivalent of that for programming. What I'm saying is that everyone has ceilings for how good they can become at something. Maybe on the relative scale, the best you can ever be is a pretty good programmer, and that's it. No shame in that!
After all, part of Zen Buddhism is accepting who you are and your limitations.
The teaching is that all beings are capable of enlightenment and outlines a path to accomplish that. The concept of "you" is part of the problem and meditation on sunyata can offer insight to that.
Then why did you call yourself "good", why not just a programmer? ;-)
Thinking this way is very liberating because it means everything is your fault, but you are human and humans make mistakes, so that means this is a learning experience. If your mindset is that we are constantly learning, no mistake can ever really touch your ego.
I often find the solution to many of my problems is to go back and practice from this mindset. I coincidentally went back and reread selections from this book a couple weeks ago, as I found my ego had been creeping into many facets of my life recently and I needed to go back and be reminded to practice with this mindset.
From what I see in others' comments on HN: "Points and Counterpoints."
No article or idea works for everyone, as we are each a complex cocktail of ideas and impressions. If some idea resonates with you, you have found your type of idea, so enjoy it; otherwise, don't resist the idea, just wait for the next one that might work, or might not. Thank you for sharing, in any case.
You have a problem, you discover a potential solution and it seems to be working. You get excited and you try to make sense of it and you want to tell everyone because it is such a game changer! Then some time passes by, the emotions fade and you arrive at a new perspective - that your life hasn't changed that much or if it has, now you have a new set of challenges and a new potential solution will come your way sooner or later.
It never ends, unless you at one point recognize that it never ends and cease wanting to be better and wanting to understand it all so much. It's not that you purposely cease trying or wanting, it's that you relax the wanting, because you know the problems will be there tomorrow and the day after, no matter how much you try.
That's when 'it's the journey, not the destination' finally sinks in and life takes on a new quality :)
For some people, it happens when they are reminded of death and the inevitability of it all, for some when they've burned through their health enough that they can't do it anymore, for others it just occurs to them one day - I can't keep up with this bigger, better, faster, stronger culture and frankly, I don't want to, either.
So when the author was talking about how being an expert is really just a matter of becoming a great student, I was quickly reminded of my golf game, where I often find myself with the lowest handicap I've ever had without actually feeling like I'm improving. I shave a stroke one day. Then another, and another, and before I know it I'm a 3 instead of a 10.
That being said, I wouldn't say I have an ego problem in coding myself, because tbh I have always felt like a bit of an imposter. I think my imposter syndrome has actually ended up being a good thing over my career as a programmer. It seems to have kept me grounded and, as the author suggests is a good thing, kept me in the forever-a-student mentality.
This happens in a more literal sense as well. Since we compare ourselves to each other, it makes sense that more experienced engineers love feeling like they're ahead and beyond newer engineers, and it bleeds into behavior.
For example, engineers love asking candidates obscure questions in interviews - as if not knowing a specific JVM quirk makes the candidate less of an engineer.
Let's also not forget the severe elitist attitude some engineers have when interacting with others. It's almost like everyone else is trash. For all the work he's done, which I admire, Torvalds was seriously toxic to interact with.
It all ties back to us wanting to compare ourselves to each other.
Same experience here, and probably for a lot of devs. It took a good 10 years or so to break out of this. If I couldn't get something working that I expected to work, I silently took it out on myself. Must not be good/smart enough. And so I'd beat my head against the wall. Once I started letting broken code or an unfixed bug wait until the next day and saw that the sky didn't fall, and that I could eventually fix the issue, I started to trust myself more, and just let things be. Some days, everything works. Some days, nothing works. Your build fails and you spend hours updating some obscure library, and you don't get feature X done that day. It's really your decision whether or not you let this affect how you feel.
If I was guessing at a cause, I'd put it more on the teachers/school. That they have it in their mind that there should be a range of performance in the students and they enforce that idea. Eg. if everyone gets an A in their course they start making it harder until they get the distribution they expect.
When I was younger I got into computer programming. For the first ten years (1987-1997) I thought I was hot shit because I could do things with computers that no one else I knew could even understand (with the exception of a family friend who was a programmer in aerospace). Then I ran into other programmers online and realised that there were other, much better programmers in the domains I was interested in (strangely, I never got into programming games; I always liked utilities and 'productive' stuff).
So I doubled down and resolved to be the best programmer I 'knew' again, except this time I knew hundreds, or over the years thousands, of programmers: an impossible treadmill.
Sometime in my late 20s/early 30s (so ~2007-2008) I realised that not only was I never going to be the best programmer I knew, I really didn't know much about programming in the general sense, if you look at the whole field (no one does really, except the odd person). So I reframed it: I was going to be a better programmer than the me of a year before, and I'd focus on the other skills I'd let languish over the years, what I'd often derided as 'soft' skills. (I don't think I was ever an arse-hole, but I was the guy who'd sit in the corner muttering with the headphones blasting thrash metal.)
In the end what I realised was that after all this, I like programming, I like providing value and when it comes to work the best thing I can get is feedback from a user whose life I've improved by making whatever I've touched that little bit better.
If I can do that then it was a good day.
The freedom from all this is that I learnt to play again. If I'm interested in functional programming I'll go poke at that for a bit; if I'm interested in algorithms I'll go poke around over there. Free from the self-imposed need to compete, I get to satisfy my own curiosity and nurture the devs on the team I run.
With 7 billion people on the planet, it's statistically unlikely you are ever going to be the best, and even if you are, it's likely in only one dimension.
I noticed that the programmers I normally really admire are all older than me and seem to be excited/happy about technology and wondered how they kept that enthusiasm for so long in an industry where so many seem miserable and I think I can hazard a guess now.
Oh and because the universe loves a punchline, I have a dev on my team now who is determined to prove himself the best programmer, never says a word and listens to thrash metal all day while muttering, he's talented so I'm curious to see how he figures it out.
> “I cannot say this too strongly: Do not compare yourselves to others. Be true to who you are, and continue to learn with all your might.” ― Daisaku Ikeda, Discussions on Youth
This article feels like it gives me the how - thank you.
100% spot on that the external distractors are easier to manage than the internal ones. A buzzing phone, tempting social media websites, and loud rooms all tend to be relatively easy problems to fix. As for internal distractors, I feel like telling a personal story after reading this. There are two internal distractors I've recently noticed myself struggling with:
1) A busy mind.
I often find my brain meandering on ideas or conversations completely unrelated to the work I'm trying to do. Daydreaming, imaginary arguments, and unnecessary tangents all tend to creep in (esp in the afternoon for some reason). I'm glad this post touched on Zen Buddhism and the beginner's mind. At risk of proselytizing, I have to say the best way I've found to manage a busy mind is through meditation. Consciously setting aside 10-15 minutes everyday to practice letting go of thoughts has helped build a (tiny) mental muscle which I can sometimes use to bring my focus back on the things in front of me.
2) Alcohol.
This is a bit of an external distractor, but also an internal one. In college, I was able to stay up all night drinking and coding. No longer! I find it amazing how insanely less productive I am even after a single glass of beer. I now get tired shortly afterwards and have immense difficulty focusing. Perhaps, as the article mentions, the alcohol is wrapping up my ego in the task at hand. I don't have a drinking problem, but I now solve this by consciously deciding how to spend my next couple hours. "Am I going to grab a drink and take an extended break (perhaps for the rest of the day)? Or am I going to grab a water/tea and continue working?" Gone are the days when I could reliably reach the Ballmer peak (https://xkcd.com/323/).
For controlling distractions, other hobbies like music or something else apart from programming keeps me ticking.
For internal distractions, I agree with the post: separate myself from the problem I am trying to solve.
I've found that I almost can't program anything sensible after even very small amounts of alcohol; even half a beer or 25ml of vodka is enough. It's either alcohol or programming for me. I don't drink too often, but when I do, even after large amounts I don't get a hangover, and I remember having memory loss only once.
> 2) Alcohol
Have you had your B12 levels checked? You might want to try a supplement.
Then most of these articles usually say the recipe is to have the adult very closely observe the child and hit it with a stick any time it wanders off the "desired" path.
How do you expect it to work between a real parent and a real child? I think it would fail miserably and annoy both sides. I'd suggest coming up with some nasty parental tricks instead. Also adjust them after some time, because they do tend to stop working.
It's extremely hard to make money being an independent software developer. There's a lot of noise and money in the market. It's hard to compete with marketing from big companies. You have to work for a corporation or a startup with funding. You have to be part of an ecosystem. When you are accepted to a program like YC, you win an entry to an ecosystem.
Can you be a software artisan nowadays? Can a small team develop and sell software without having an ecosystem behind it? I've seen some examples of this, like Ruby on Rails, 37signals. But they are rare exceptions.
I'm currently working on an open source project. Let's see how long I can be independent for. Check out my project :)