1. > You’re going to forget what your code does in a few months. Make it ridiculously easy to read.
Sometimes, all you need is a one-liner in sed.
2. > No matter how many managers are screaming at you, the first step is to reliably replicate the bug
3. > Sooner or later, you will meet some older man who has been around for a while. At some point, this man will lecture you about the Laws of Programming. Ignore this man entirely.
The first has to do with knowing yourself and having years of experience with a diversity of tools. (lessons: be reflective and work on a diverse toolbox)
The second has to do with experiencing and learning from team/org dynamics and to some degree how to operate under pressure.
The third is tricky - the key red flag to watch for is "the Laws of Programming" or "you don't have enough education" or similar put-downs. But the older programmer who is always rolling up his/her sleeves to jump into the bug-fixing process and help - follow this person around like a puppy and watch everything - these guys are epic.
edit: one more thought on my note about the third item: look for these people across the org - technical or business - look for the people who jump in to constructively help when it's needed. (very Mr Rogers)
I think the third was an attempt at some humorous self-deprecation.
Problem is, I've literally had that conversation (when I was the young guy... never walked away feeling good).
I guess then I have been a Zen programmer from the beginning. Maybe it was because first I was into writing, and my favorite book is the Elements of Style. I relish deleting code, but you have got to understand I'm not deleting features, security, or anything.
It is really hard to explain how it's possible to someone who has not known it first hand, how you can subtract things yet lose nothing --- or even gain something. The simplest example of how you can delete code and not affect features is deleting code that was commented out by the last hooligan, commented out because he did not use version control and was trying different ways to solve the problem. The next most obvious example is code that is never run: functions that are defined but never called or, trickier, branch conditions that will never be true. But most of all I guess I have just worked with a lot of people who subscribed to the copy-and-paste method of code re-use. It's easy to pull out the repetition into functions.
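To make that last point concrete, here's a minimal Python sketch of pulling copy-and-paste repetition out into a function. The names and the validation rule are hypothetical, just to illustrate the shape of the refactor:

```python
# Before: the same validation logic pasted twice, with the field
# name changed by hand each time (imagine a dozen of these).
def validate_name(record):
    value = record.get("name", "").strip()
    if not value:
        raise ValueError("name is required")
    return value

def validate_email(record):
    value = record.get("email", "").strip()
    if not value:
        raise ValueError("email is required")
    return value

# After: the repetition pulled out into one function. Same features,
# less code to read and keep in sync - deletion with nothing lost.
def validate_required(record, field):
    value = record.get(field, "").strip()
    if not value:
        raise ValueError(f"{field} is required")
    return value
```

The before/after pair does exactly the same work; the diff is pure subtraction.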
One of my favorite examples of less-is-more is outside the field of programming, in the field of sports cars. The Lotus Elise weighed 1,600 pounds, so even though its engine had just 118 horsepower, it could go 0 to 60 in 6 seconds. In contrast to heavier sports cars, which solved the speed problem with the more obvious solution of dropping in a bigger engine, the Lotus also had these advantages: more miles per gallon, faster braking, nimbler cornering. As the founder, Colin Chapman, said, "Adding power makes you faster on the straights. Subtracting weight makes you faster everywhere."
How do you shave weight while retaining rigidity? Through another minimalist technique: reducing pieces. The chassis was of one piece, a "monocoque." Since things break at joints, instead of reinforcing the joints, you just do away with them. Steve Jobs, another minimalist, used the same technique in MacBooks. By cutting them out of one piece of aluminum, the frame can be thinner and lighter yet still stronger than competitors.
Long story short, always comment weird shit and use the correct bug ID.
> Il semble que la perfection soit atteinte non quand il n'y a plus rien à ajouter, mais quand il n'y a plus rien à retrancher.
> It seems that perfection is attained not when there is nothing more to add, but when there is nothing left to take away.
> Arrakis teaches the attitude of the knife - chopping off what's incomplete and saying: 'Now, it's complete because it's ended here.'
That things happen in cycles. Little is fundamentally new. The time you're in now, your experiences, are not all that different from previous people's.
That young people (definition expands as you get older) really are clueless. BUT, that is really really important. Without naive optimism and youthful exuberance (God, young people have so much energy, don't squander that) very little "progress" would be made.
Process matters, a lot.
Writing code is not about getting the computer to do something. It's about getting that future developer to understand what this code is supposed to be doing.
Optimization of software almost never matters. Optimization of development often does.
Meh. I've gotten to the point where as long as variables are not named 'a', 'obj', 'thingie', etc., I'm good. I've gotten into enough arguments about naming that turned into bikeshedding that I just don't care. I'll make my best effort at good names, but I will go with whatever your review notes say. If I need to understand code later, I'll rename as I go, make the code change, and then revert the renames.
I've generally worked on legacy systems and on any reasonable sized system there will be multiple names for the same business concept embedded all over the place. You just have to roll with it or be perpetually frustrated.
But that doesn't mean it's unimportant! I've found that when I struggle with properly naming things (outside of anonymous functions), it usually means I need to stop and properly think about what I'm doing. I've found this to be the case for both naming functions/methods and naming 'data' (which is why I've warmed a bit to the idea of explicit typing).
This is one of the primary mechanisms by which software ages. It is also the real justification for comments and internal documentation.
1. It's much easier to figure something out by looking at code samples and other people's code than by reading documentation.
2. Clever one-line code doesn't look very clever 3 to 6 months down the line and often leads to "what was I thinking?" moments.
3. Perfectionism, including premature optimisation, is the programmer's biggest enemy.
Bonus: programming doesn't make you money, marketing it does.
This one I find the hardest to accept out of pretty much any advice I've ever read on HN. All rational evidence points towards this being true, and yet all my being, all the ethics I have, scream loudly that it ought not to be so.
People like stereotypes, basically, and I'm trying to (sparingly) lean into the 'autist programmer' stereotype even if it sometimes feels like 'hamming it up'.
EDIT: specifically, it means I dress down to a fault with some clients, let my inner geek out at 'social events', get too technical if I know there's someone present who can translate and 'explain me', etc. Whether this behavior is just another form of manipulation and self-marketing, or just undoing years of trying to mask my 'true nature' is something I often contemplate.
The first one is basically why the original documentation for WordPress worked so well. Say what you like about the software or its questionable code quality, but almost every page of documentation had a neat example or two that people could tweak to figure out a function or class or what not.
What's more, if you look at the most widely used pieces of software for development purposes, you'll notice that almost all the popular/beginner-friendly ones do the same thing. jQuery, Bootstrap, React... every one of them has documentation filled with simple examples.
And for the last one, you could even expand on it a bit more and say '[name of skill] doesn't make you money, marketing it does'.
Because at the end of the day, it's not just programming in which this is the case. It's anything from art to writing to design to creating any form of media. You don't need to be good to succeed in any of these fields, you just need to be 'okay' enough that you can deliver something and marketing savvy enough to sell it to an audience.
Not realising this has likely sunk thousands of startups and companies, while understanding it has propelled many poorly executed products and outright scam artists to fame and fortune.
I don't envy young people who get into the business now. A lot of them will probably never get the chance to work on something uninterrupted for a long time. Instead they are being micromanaged daily, have to justify every little thing they do and don't get much opportunity to make mistakes and learn from them.
I guess most industries get boring once they mature. Working on cars between 1900 and 1950 was probably much more fun than working on modern cars in a modern car dealership.
That could be an illusion caused by the fact that the number of programmers in the world has been observed to double every five years. It may be that all the older programmers actually do stay programmers, but there are always more and more new young programmers around, which makes old programmers seem like anomalies.
Another anecdotal point; we are rolling a new app; the lead tech is 64 y/o and uses React Native and Node (different company than the above so different tech choices; not my choice). He does the app and the API almost all by himself (I do code reviews on them). He gets paid well, works from home and is fast.
So, what, you've literally never met anyone over the age of 40? 50? 60? in IT? I'll bet if you think about it for a moment you probably have. But if you haven't you need to look no farther than Pike and Thompson working at Google for two examples of older people working on interesting problems at a top tier company.
Regardless, I don't share in the cynicism and burnout that some have expressed in this thread.
Even if I see some ideas recycled from 20-30 years ago I'd argue the implementations are novel. Frankly the fact that IT keeps changing is what makes it fun for me.
Well, there are 60 year old exotic dancers too.
I don't think most of us can follow the example of some superstars.
I agree about unions. The older I get the more they make sense to me. The tech industry is living off exploiting young people's enthusiasm with the promise of a big payoff that materializes only for a few.
Can't agree more with this. When I rail against the tech industry, it isn't as a Luddite, but as someone rallying against exploitation and subjugation.
Which was precisely what the Luddites were doing too. They weren't anti-tech. They were anti-exploitation. In these days, such exploitation often involved use of weaving machines, so the latter got destroyed in the process, but it was never about technology itself.
Also a shelf life, just how much abuse your body can take.
The cost of maintenance and construction is going to skyrocket in the next ten years.
I guess what I'm saying is: Maybe it's time to revisit the blackboard. Rethink how a neural net or intelligent model should be implemented. Retry and rethink the "set in stone" basics. Also, throw out the entire discipline and try to come up with novel approaches.
OTOH, certainly worth the watch even if just from this point to the end, easy listen at 1.5x speed.
But as soon as there were new fresh CS degree graduates, companies preferred to hire those since they were cheaper. This produced two effects: 1. The CS graduates were, for some reason or other, mostly young men, so the gender balance tanked. 2. The new CS graduates, fresh out of school, were neither disciplined nor mature, and so required a lot of micro-managing. Both these effects were made more or less permanent by the steady growth which you summarized, and the second of these two effects is the change which user MrTonyD lamented above.
No need to double-negate this word.
This is not directly in response to your comment, but I see three types of people.
Some people don’t know better or care to know better or weren’t paying attention or don’t mind because they know what I meant.
Some people care enough about language to debate usage.
Some people just want you to follow “The One True Way”, which is coincidentally whatever they think.
You do the 24-hour call-outs with 15-minute response times for key business processes (a live monthly billing run, for example).
But that's not always true, especially at smaller dev shops. However, smaller shops sometimes like to imitate the big corporations; it makes them feel more important, even though it's not necessary.
In both cases, having broken builds in your production pipeline is an indication of a process or policy problem more than anything else, especially in 2018 when build hardware is cheap and CI platforms are everywhere.
(P.S. I've worked in both types of orgs. Ones where the builds had several teams of people working on them, where broken builds could mean huge amounts of lost or blocked work hour by hour and fixing builds was a very high priority task, and where there were huge systems and extensive policies designed to forestall build breaks getting checked into important branches. All of those systems and policies were built out of a history of chaos, where builds were almost always broken (if you have thousands of devs, and none of them take build breaks seriously, then there will almost always be one break checked in every day), schedules were unreliable, etc. So much so that it was burning people out. On the other hand I've also worked in places where a build break can be handled by filing a bug and letting it go through the normal process of triage and fix over a period of many days. Which process is appropriate is entirely dependent on context.)
This way master can always be green, because build errors will crop up in branches - only once the tests pass and review is complete does the bot push to master. Because builds are generally done from master, this prevents landing broken code that could break others' development flow.
The end result is that all build failures have been addressed in the development branches. Bot will refuse to merge code that fails tests - and the workflow itself will guide towards a desirable process.
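The gating logic such a bot applies can be sketched in a few lines of Python. This is a simplified illustration, not any particular bot's implementation; the helper callables (`tests_pass`, `review_approved`, `merge_into_master`) are hypothetical stand-ins for real CI and code-review integrations:

```python
def try_land(branch, tests_pass, review_approved, merge_into_master):
    """Merge `branch` into master only when tests are green and review
    is complete, so master itself stays green."""
    if not tests_pass(branch):
        return "rejected: tests failing"
    if not review_approved(branch):
        return "rejected: review incomplete"
    merge_into_master(branch)
    return "merged"
```

The point is that the bot, not a human, holds the merge button: a branch that fails either check simply never reaches master.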
Humans need to be involved in the code review step. Even if everything is green and covered it still can be wrong or unmaintainable.
They still are unreliable. The only difference I see is that I spend a lot of time reporting status instead of doing actual work.
The other way code doesn't live in a vacuum, which I've seen too many people fail to understand for far too long, is that code is the tip of your personal pyramid. The foundations of that are your mental health, your physical health, your family and friends, your professional connections, your civic connections, etc. If you don't look after those things, it will affect what code you write and how efficiently you write it, or possibly whether you can keep coding at all. Think about what it means to be doing this for 20+ years, from young adulthood through having a family and even to the empty-nest period. Don't burn your bridges or burn yourself out in five years. Learn to do what you do sustainably over the long haul.
I've been programming since the 80s, it all existed & exists to 'serve my needs' alone. (Well, I have written a few things to help other people.) So obviously that doesn't seem right to me. Maybe you meant 'commercial/professional code' or something. I wonder what % of the LOC ever written were to serve the programmer and no-one else.
That's like saying programming language theory stopped 20 years ago and all the conferences and papers on the subject since then are fraud.
Take Rust for example. It has several cool features, many of which are copied from other languages. Only an idiot would sit on the sidelines and say "oh what's so great about a package manager like cargo, Ruby has had that for a decade". A smarter person would realise that this is exactly how great ideas spread, and what's important is the combination of ideas and features that the language chooses. For Rust it's zero-cost abstractions, a good package manager, no data races, generics and sum types. Maybe this is the exact combination that's perfect for your use case.
I'm no Rust fanboy either, I just chose it because it's the newest one I could think of. I'm sure I could make a similarly strong case for many other popular languages.
Obviously there's plenty of new stuff that is properly new, especially in a field as young as CS. But there's also a ton of stuff that just keeps being rediscovered. Looking back instead of stumbling upon the same idea eventually can give a programmer a huge advantage.
Android JIT and UWP deployments can be traced back to how mainframes work, especially IBM z and IBM i ones.
IDEs although quite powerful, still lack several common features from Xerox PARC research.
A good example is how Flutter gets sold to developers. What would the audience think if the same examples were shown using Smalltalk, Common Lisp/Interlisp-D/Dylan instead?
LLVM is a great development, especially in what it brought to the table for C and C++. Yet the design of LLVM sounds quite familiar if you read about the PL/8 research compiler for IBM RISC.
Everyone is now running containers on their laptops; well, so were IBM and Unisys on their mainframes.
Now we have the cloud and browser as UIs, when I started working professionally we had MS-DOS terminals with Clipper applications connected to Novell Netware servers.
So yes, there is a lot of important work being done, but most of the time it does feel like we are going in circles.
Specifically, it's important to remember that, while the new idea may share its fundamental underpinnings with an older idea, the older idea also succeeded or failed for specific reasons. Sometimes the "new" idea is actually the "old" idea with the big problems fixed, in which case I'd say there's a good chance the new version makes it further.
The most important one is that the team is more relevant than whatever company one works for, and whatever promises it might make.
Work friendships carry on wherever one goes; companies are all about profit, even when they state otherwise, and things do change when bad times arrive. Been there too many times.
Soft skills are quite relevant; it is not always about cool technologies and whatnot. Especially useful when working on UI/UX or getting to know what is actually happening at the company.
Always be curious to learn about new technologies and programming paradigms, but when time comes to launch products that are supposed to be around for years to come, stick with the vendor tooling of the target platforms.
The new shiny thing might be a box of surprises when requirements start changing all the time and it is time to go live.
Picking on the last one: use whatever technology is appropriate for the problem, as what matters is delivering a working solution that solves a client's issue, not exhibiting source code in the Louvre.
The (well, a) key to good program structure is reducing coupling between components.
Comments should say what the code is intended to do, not how it does it.
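To make the distinction concrete, here's a small Python sketch (the function and its backoff policy are hypothetical, chosen only to contrast the two kinds of comment):

```python
def retry_delay(attempt: int) -> int:
    # Bad comment (restates the "how"): "returns 2 to the power of
    # attempt, capped at 60".
    # Good comment (records the "what" and "why"): back off
    # exponentially so a struggling service gets breathing room
    # between retries, capped at about a minute.
    return min(2 ** attempt, 60)
```

The mechanics are already visible in the code; only the intent needs writing down.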
Good luck to the great and zen programmers in creating anything or finding a job.
It might make more sense to say we have good moments where we write good code, great moments where we unlock key insights without writing code, and zen moments where we make things better by just deleting code. Hopefully as we practice over the years, the great and zen moments happen more often. But most of the time we have to be content with merely the good.
Whenever I write software I try to picture those future conversations and what we need to do to avoid them. Having a few mildly unpleasant conversations about resources, quality, or time needed early on saves on far more unpleasant ones later. It's hard to do this with conviction unless you have actually been there.
I would agree with both of these as rules of thumb though, chances are a new programming language is a dumb idea and that your idea has already been done in some library. But new innovations still do exist, they are just rare.
- there's a sweet spot between "new, cool but unproven" and "boring but works" and it is different each time
- find the right level of abstraction to work with. Too high and you'll lose flexibility; too low and you're doing work you shouldn't.
- cut the crap and don't waste time with what you don't need. Focus on the product and on shipping the product. Your customers don't care about the latest js lib unless your product is significantly better than the competition for using it.
That's a very good point that took me a long time to fully comprehend. The concept of code abstraction is usually introduced with a promise of it increasing flexibility. But the truth is, abstractions increase flexibility along one dimension, by severely constraining you in many others. As long as you only need to move around along the abstraction-favoured dimension, things are fine, but if you find yourself needing flexibility elsewhere, it turns out that rewriting dumber/less abstract code is much easier and faster than digging yourself out of a wrong abstraction.
 - I sometimes feel there is a "law of conservation" kind of thing hiding in there: that abstractions merely move complexity around rather than destroying it. Someone must have explored this idea before; does anyone here know of such attempts?
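A tiny Python sketch of the "flexibility along one dimension" point. Everything here (the repository class, the backend protocol) is hypothetical, just to show the shape of the trade-off:

```python
# The abstraction makes swapping storage backends trivial
# (the favoured dimension): anything with a fetch(table, key)
# method will do.
class UserRepository:
    def __init__(self, backend):
        self.backend = backend

    def get(self, user_id):
        return self.backend.fetch("users", user_id)

# An in-memory backend for tests; a SQL or HTTP backend would
# plug in the same way.
class DictBackend:
    def __init__(self, tables):
        self.tables = tables

    def fetch(self, table, key):
        return self.tables[table][key]
```

But the moment you need something the interface didn't anticipate, say a join across two tables, you're fighting the abstraction instead of writing the obvious two-line query: the flexibility you gained on one axis was paid for on the others.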
Crunch is a choice, a failure of management, and irrational. Organization and process matter.
Staying late is not being a hero. It is being a sucker. It makes little difference to the result. It amounts to virtue signalling.
Never cease learning. (Youngsters know that too, but it is important.)
As for staying long in one company - imo it depends on the company.
Local refactors in code you're touching anyway tend to work great, though. Since you're changing it, you've built enough understanding to refactor it correctly.
1. Your code is temporary, but your attitude towards programming is not.
2. There is no such thing as a perfect solution - a solution can be implemented in multiple ways, so there's no need to criticize one solution over another.
3. One size doesn't fit all
4. Learn to appreciate the creativity of youngsters and nurture them.
5. Brilliant programmers drive a product & culture, not a specific program.
As someone mentioned earlier, these tips don't just apply to programmers - they apply to any profession.
Maybe it's not necessarily tied to 20 years in programming, but to stay long in this field, one needs to develop an ability to recognize and withstand burnout, both personally and at the team level.
Especially while young and seemingly having limitless energy, attention, and desire to succeed.
Hell yes. I was bitten by this 2 months ago. Updated the server to Ubuntu 18.04.0 and had a week of hard disk crashes (randomly). I just could not work out what the issue was. Midway through the week 18.04.1 was released with two filesystem bugs fixed. Made all the crashes disappear. Never making that mistake again!
Those in positions of power (the managers) have never been those who programmed for 20 years. They either never programmed at all, or were early successes, either through luck or, more often than not, connections.
As a result, they don't know and don't care about the things known by those who have programmed for 20 years; it's not their problem.
Loose Coupling and High Cohesion
At all levels, macro and micro. It applies at the function level and to global distributed systems.
For example, each of the letters in the "SOLID" Object Oriented patterns boils down to one or both of these principles.
Whenever I'm looking for ways to improve something, I start by thinking of it in these terms.
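For instance, the "D" in SOLID (dependency inversion) is loose coupling in miniature. A minimal Python sketch, with all the class names being hypothetical examples:

```python
# The report job depends on a tiny interface, not a concrete mailer,
# so the two components can change (and be tested) independently.
class Notifier:
    def send(self, message: str) -> None:
        raise NotImplementedError

class ConsoleNotifier(Notifier):
    def send(self, message: str) -> None:
        print(message)

class ReportJob:
    def __init__(self, notifier: Notifier):
        self.notifier = notifier  # coupled only to the interface

    def run(self, rows: list) -> None:
        # The job's single responsibility (high cohesion): produce the
        # report summary; how it gets delivered is someone else's job.
        self.notifier.send(f"report ready: {len(rows)} rows")
```

Swapping `ConsoleNotifier` for an email or Slack implementation touches nothing inside `ReportJob`; that independence is what "loose coupling" buys you.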
Every technical decision is trivial in comparison.
2. If your little new feature causes you to consider modifying the database, GOTO 1.
That's not so hard to see. The hard part is figuring out what simplicity is.
More detailed answer: https://xkcd.com/1205/
This one does: https://xkcd.com/1319/
>Sooner or later, you will meet some older man who has been around for a while. At some point, this man will lecture you about the Laws of Programming. Ignore this man entirely.
That said, like everyone else I'm going to add my most salient bits:
1. Keep It Simple, Stupid
2. Programming is a lifelong study of the difference between what you said and what you meant to say.
3. Reflect reality in your modelling.
4. If you have to get creative, you might be doing it wrong.
5. Build tests first when you can.
6. You're coding for two audiences: The computer and the future programmer. Favor the latter, because the computer doesn't need help understanding your code.
7. There are two hard problems in programming: naming things, cache invalidation, and off by one errors.
This week I finished reading "Startide Rising", a 1983 (!) novel. There was this gem hidden in it:
"You don’t have conversations with microprocessors. You tell them what to do, then helplessly watch the disaster when they take you literally."
If I want my coffee I have to spend quite a bit of time instructing said assistant to open doors along the way rather than just walk into them (thankfully there's an OpenDoorEx library for that!). I'll also have to carefully come up with an exhaustive list of interactions that my assistant might have with the barista (and agonize over the fact that I can never be sure that this external API is fully covered). And that's just some of it.
But when I've done all that work, my assistant can get me a coffee in seconds, because they're ridiculously fast, powerful, and precise. And over time the work I put in up-front will pay off pretty quickly, because it takes them seconds rather than an hour to get me my coffee, twice a day.
I suppose the fact that I sort of enjoy coming up with these ridiculously precise instructions is why I'm a programmer.
Of course in practice the OpenDoors library will break, or one of the doors on the way suddenly works differently. The new barista might have a speech impediment, or the items on the menu might change. Or my boss wants coffee too so now I have to teach my assistant to open doors with their elbows.
To normal people all of this is purely maddening, but for me it's 50% frustrating, and 50% an excuse to get to come up with new instructions!
Fun fact, in "The Uplift War" (1987!) I found what looks like a reference to cryptocurrencies:
"The money inside was GalCoin, untraceable and unquestionable throughout war and turmoil, for it was backed by the contents of the Great Library itself."
Operations is a thing, and it's harder than coding. Especially with a customer on the phone.
Your customer's expectations will always exceed the functionality you have provided. Get over it.
All the biggest mistakes occur in the first days of a new project.
Testing shows the presence of bugs, not their absence. If you think your TDD (test-driven development) method produces correct code, then you will be disappointed.
Nobody reads Brooks anymore.
You'll never meet "the future programmer" so I don't see how this is true. They will never know you existed, you'll never know they existed. The tree may or may not have fallen in the forest.
Heck most of the stuff you work on never gets seen again.
And I've never seen someone punished for not coding for "the future programmer".
Do we want real truths?