First, the title: written in the past tense makes it seem like he believes the journey is complete. The article itself doesn't convey that message, fortunately. But I think it's worth noting that a critical skill for a professional is constant education - don't be satisfied that you're already "better", get better every day.
I also strongly disagree with the suggestion that you should ignore the form of your code. Code is read many, many times more than it is written, so improvements to your own and your teammates' understanding can pay huge dividends.
In particular, the advice to "not worry about" DRY and duplicate freely reveals a common misunderstanding. DRY shouldn't be taken literally; it is a recognition that repetition is evidence of some greater underlying truth. By duplicating freely rather than thoughtfully avoiding duplication, you're just creating opportunities for bugs.
I believe the spirit of what he's trying to get at is not letting concern for DRYness stand in the way of working things out. I've seen this in practice many times, and I even approach complex projects in much the same way: I'll typically not worry about DRYness until I've implemented something a couple of times, sometimes in different ways, to get a feel for the best approach. Once I've figured that out, I clean things up.
I've seen many junior devs get so worried about the quality of their code, and how they should do things, and omgosh, DRY, that they become paralyzed and don't just start writing code. I've been mentoring someone for nearly 18 months, and I'm always trying to push him to just start coding and get something working; then we come back together to discuss and evaluate his work.
I think you've misunderstood what James has said, and I feel compelled to push back on any suggestion that his advice regarding DRY and duplication is borne of a misunderstanding or a lack of recognition of the value of DRY. Nor is he someone who I've ever seen stop learning. I keep up with his work, and I'm always impressed by the different things he picks up just because something about it interests him.
TBH, junior devs are often hammered constantly throughout training about this stuff. They are drilled on design patterns before having even coded anything significant.
It's not surprising they can only breathlessly parrot software principles when they begin.
James's suggestions about ignoring code form, DRY, etc resonate with me because I'm someone who is usually inclined to spend too much time on those things. I consciously try to reduce the effort I put into those things and instead focus on trying to get the code to actually do something!
That is: Make it pretty and readable and DRY once the code works. Until then my code can have several ugly and improvised things I've left for later.
I'm a bit skeptical of that as a general principle.
I want code to be a consistent representation of my mental model of the solution at all times. I use code to help me think, remind me of my thinking and communicate with others. It's not just a series of commands that make a machine do something.
As long as any ugliness is not in conflict with this purpose it's fine. I can totally live with functions that are a bit messy and probably need cleaning up later.
1) Make it work
2) Make it maintainable
3) Make it fast
All three should be on your mind when developing something, but if you have to choose between 2 and 1, go with 1.
My rule is simpler: never write instant legacy code. If it feels like something you won't understand a month later, it likely is legacy code.
If it reads badly (and I'm not talking about this or that brace or whitespace choice), then it also provably is in that category.
(On C2 wiki, a proper SpikeSolution)
If you hit a case your design cannot cleanly handle, it is time for redesign, but often a tiny amount of design constrained by use cases will expose a good design quickly.
Why can't you just keep working until you consider your task done?
Development time is inherently very hard to predict, so strict deadlines - even ones set by people who understand the technical issues - don't make a lot of sense. There are many books on this, and I won't try to summarize them here.
Still, if you have a deadline, you can use your time however you want before the deadline, right? So if it's better/faster to clean up the code after you made it work than to try to type everything perfectly the first time, what's stopping you?
Or is it that the deadlines are so tight that you only have time to write terrible code?
But when I see developers with the self-expectation of clean code take on a task where the architecture is not self-evident, where the architecture probably won't be right the first time, then they often freeze up. Or they move their efforts into the abstractions they do understand, instead of the task they are trying to accomplish.
I'm talking about one single commit.
If I spend 4 hours on a story, the last half hour is probably spent cleaning up things. That's when it's easiest to do, because now that the problem is solved, I understand things the best, and the code is not in flux.
I do not have to ask a manager for permission to do this. If you do, you should really try to find a job where you're trusted to make these simple technical decisions.
If the code is working but is badly designed, that can take longer.
Occasionally, the new feature requires larger refactorings to redesign preexisting code. That can take days, but is of course essential to do.
Adhering just somewhat to principles from the beginning shouldn't make this too time consuming.
1. That's a good way to break it.
2. Typically there is no extra time given to clean up code, write docs, or write unit tests.
In my (admittedly anecdotal) experience, devs who tend to overemphasize formatting and DRY are already self-aware about it, whereas those who give it far too little care and attention regularly shoot themselves in the foot by overlooking simple mistakes that get camouflaged in poor formatting and excessive repetition. My concern is mostly that the latter group would see this article as evidence that they can safely ignore code review comments and such.
Formatting, DRY, and good code organization don't solve business problems, but they do make solving those problems (and keeping them solved) a lot easier.
Secondly, I believe that with practice, DRY can become second nature. Back in the mid-90s I wrote a lot of COM code in straight C (for reasons). That ends up being extremely verbose, so repetition carried a real penalty, and I learned to carry a good enough mental model of the code base to know when I could avoid it.
 plonjeur -> plongeur
While he explicitly said to ignore fluffs, he followed it up with `Another way to say this is "use your time wisely"`. So I interpreted it as to just not spend too much time on fluffs, rather than completely ignoring it (like code styles as you mentioned). He created Prettier, so I'm sure he knows the value of code styling.
He came back four hours later and said that he couldn't stand it and he had refactored the code and fixed/written some tests.
Perfect. I learned long ago that there are ways you can start cleaning that will induce (some) others to participate.
He spent hours writing tests while I was working on an adjacent part of the code that had worse bugs in it. The work got done and a big piece of the code (this was a project I supported but didn't maintain) didn't, in the end, have my edit history all over it.
And PEPs cover many things.
There's definitely a balance, and if you know the type of problem you are solving ahead of time you can probably abstract early. But if you're just kind of feeling around in the dark, which seems to be relatively common in complex business applications (for instance), I much appreciate an implement-test-reflect-refactor cycle.
It is a very tough sell to most customers.
The quick cycle does not scale with project size.
The classic line "Make It Work, Make It Right, Make It Fast" may have communicated this concept in a better way (with each step explained).
How should he have formulated the title then? I think it is logically correct.
"Beta abstraction" is a useful mechanism for this. I wrote a post capturing some discussions in my org.
edit: I wrote "as you said", but I expanded on what you said instead.
Properly anticipating issues takes experience.
The way abstractions are usually laid out is by trying to generalize a concept, usually along simplistic lines, without attempting to substantiate it. Beta abstraction provides a route that can be substantiated, which may involve a bit of repetition before the abstraction surfaces. For example, the Pixar cars don't form a class hierarchy with a "car" base class; there is only one class, "car", and the variety is generated by configuration (I need to pull up a reference for this, as I read it long ago). This may not be an obvious step for someone not in the domain.
Even now, though, I continually doubt myself. The point is that this feeling doesn't go away, so just try to ignore it, keep hacking, and keep building experience.
1. Make a ~/git/scratch repository.
2. Whenever you see a code snippet in an interesting blog post, don't just read it. Copy it into a subdir of ~/git/scratch and run it. Write shell scripts to automate the process of running it. Prove to yourself on your computer that it works.
The first few times, it may be a little onerous. But eventually you will fall into a groove and it will take 60 seconds or less each time.
I don't promise to understand it on the first pass. Just the act of downloading it and running it gets it into your brain. Half the time you end up hacking on it anyway, and other times, you don't understand it, but when you see something related later, the light bulb will go off in your head -- "that's similar to something in my ~/git/scratch repo". And then you can go from there.
I don't know why but having a running code snippet really makes it feel "ready at hand" and you will learn faster. It somehow primes your subconscious. I feel like there are a lot of people who read Hacker News a bit passively, without retention.
Here's a good blog post along those lines, with code: http://journal.stuffwithstuff.com/2013/12/08/babys-first-gar...
A much bigger commitment, but with correspondingly bigger benefits: something I found helpful was to reimplement around 10 different things I use in maybe 500-1000 lines of Python each. I like the new "500 Lines or Less" book -- I was doing this 10 years ago!
Once you implement some class of program, you have a very good idea of how the "real" libraries you use are implemented. That helps you build better systems and write better code.
Examples: A template language, a pattern matching language, test framework, protobuf serialization, a PEG parsing language, Unix tools like grep/sed/xargs, an event loop library based on Tornado, a package manager, static website generator and related tools, a web server, a web proxy, web framework, etc.
In addition to writing something from scratch, I also find tiny code on the Internet and play around with it, like tinypy, OCamlLisp, femtolisp, xv6, etc.
Write and maintain your own code for a significant amount of time. When you see bugs repeating themselves, you have no one to blame but yourself. If you come back to code you wrote a few months earlier and can't make sense of it fairly quickly, it's probably time to refactor that part. Think about the design up front, and about the implications of doing things one way versus another.
I think that working in constrained environments early in my career gave me lasting instincts for both efficiency and maintainability/sustainability. I wrote a lot of code on Win16, GeoOS, Epoc32, and a variety of embedded systems. Debugging tools weren't good and the dev-test-repeat cycle could be arbitrarily long, so you learned to avoid a certain sloppiness.
I've been having fun with PICO-8 on a Pocket CHIP; it has some similar constraints - the keyboard is awful and the screen is small - so there are similar incentives, with the added bonus that making a cute game is fairly accessible.
When you say just the basics, what are you referring to? Syntax? Pointers? Dealing with garbage? I've got a vague idea about it, but never had to really do anything with it so I feel now is a good time to mess around with it.
Great read though, always a pleasure seeing James mentioned (someone that definitely inspires me).
So things like:
- viewing primitive objects, structs and arrays as blobs of bytes arranged between address X and address Y
- a pointer as an address of that blob. Understanding what it means to copy that blob vs. to pass around the pointer: calling by value/reference not in terms of the semantic effect, but in terms of what happens underneath.
- the stack and how it's typically used, esp. in recursion. Local vars vs. heap-allocated vars. How control flow leaves its trace on the stack. How arguments are passed and return values are returned.
- complex data objects viewed as blobs of memory pointing to each other vs. pointers. Concept of "owning" such a blob and how ownership is passed. Understanding how your language's runtime keeps track of the blobs if you don't have to, and what are common pitfalls.
- clear understanding of the difference between int8, int32, int64. Strings as null-terminated or counted arrays. Bitops.
"Learn C" is just a useful way to force you to internalize all of the above, because you can't properly "learn C" without doing that. But it's the above that helps you back in your favorite language. That, and perhaps the fact that C gives you a feeling what it's like when you can look at a line of source code and understand immediately what happens in the machine (broadly) when executing it. No hidden effects. C++ doesn't have that (constructors you don't know about when looking at the line, exceptions etc.) That "local clarity" isn't the most important thing in the world, but if you feel and appreciate it, perhaps you'll strive for local clarity back in your favorite language, too.
it's kind of a shame because aesthetically speaking it's so full of bad choices, but it is still very influential.
yes, absolutely this. most memory managed languages don't give you pointers at all and let the GC and objects implementation handle that completely. however, understanding the details of reference, dereference, and memory addressing is good for developing insight into how things are implemented at a lower level and what the performance characteristics of those implementations will be.
> Dealing with garbage?
C is a manual memory management language - it doesn't even have smart pointers or optional GC. To avoid memory leaks you have to be extremely diligent about error handling and cleaning up/deallocating on every possible program branch. this will really hammer home a few points about rigor that are easy to gloss over in memory managed languages.
however, it's a pain in the butt and I'm really really glad I don't have to do manual memory management on a daily basis. still, learning a bit about this has made me a lot more sensitive to possible memory leaks that turn up even in higher level languages.
It's tough to say. Given a choice between being a JS ninja capable of hammering out http://www.track-trump.com/ in a week, or a generalist with a wide variety of skills, the former is so much more valuable from a monetary standpoint that it'd be hard to turn it down.
It's less satisfying, and you end up less capable in certain respects, but it all depends which axis you want to optimize along.
Greg is the JS ninja I was referring to. https://twitter.com/sama/status/822500368797966336
I don't see anything there besides some content presentation stuff. Am I missing something more interesting? When I think of "JS ninja" I think of sites with really complicated UI, like multipart forms with lots of validation, or interesting map layer based tools, or browser games or something like that.
- you can recover a little from all this learning. Your brain needs rest too.
- you are really productive and hence are the best professional you can be.
Otherwise you spend your time eternally on projects you are not an expert at, which is not really fair to your client/employer.
> I usually stay 6 months to 1.5 years each on average
That's right around the point where you can increase your scope and responsibility within the team.
On the medium-large sized projects I work on, you won't be making any major changes until the 12-18 month mark.
I'd say it's a good strategy for a junior dev but very limiting in the long term.
It seems like that'd exclude anyone who wants to make a name for themselves, which usually means excluding all the best people.
The two best developers I know have spent 12 years at Microsoft and 6 years at Google respectively.
What makes you think you need to switch jobs every 12-18 months to make a name for yourself?
And what makes you think the best developers care about making a name for themselves?
In other companies, creativity is considered an asset. Most of the important work done at my previous company was largely completed by one very productive intern, which was surprising to discover. And at each of the companies I've worked at, I was given freedom to determine direction and implementation of the projects I was given, as long as the results were excellent.
Both approaches have merit, but if I were to bet on one, I'd choose the company that's liberal in granting freedoms but also willing to fire someone if they turn out not to be able to deliver. This has generally been a recipe for success at most startups, for example.
It's more cutthroat, but in a different way: the former is cutthroat politicking, whereas the latter is based entirely on a developer's capabilities. I'd rather be in an environment that rewards effectiveness rather than alliances. And when your effectiveness can only be demonstrated within the very narrow scope and boundaries set by your boss, then people who care about being effective tend to migrate elsewhere.
Most of this can be summed up as "The idea of paying your dues is anachronistic."
I'm not talking about "paying your dues".
It takes time to get up to speed and to build trust when you are performing high impact work on a medium-large project especially if you don't have prior domain knowledge.
> Most of the important work done at my previous company was largely completed by one very productive intern
> liberal in granting freedoms but also willing to fire someone if they turn out not to be able to deliver.
I suspect we simply deal with different sized systems.
The idea that an intern could do most of the important work implies you work for small companies.
In comparison my current project has ~90 devs.
Likewise you have this idea that you can quickly determine a good choice from a bad one. I'm still cursing design decisions I made 3 years ago that everyone thought were great at the time. If I had left after 6 months I would still be patting myself on the back.
Again that seems to apply better to small companies.
I think you're right that if a system is massive, there won't be many major changes to it. But that's true regardless of who has authority. It's always possible to solve smaller business problems with self-contained projects that can then be integrated into the larger system. But not if an environment is set up to prevent someone from doing this by forcing them to work within the constraints of the existing monolith.
(It's possible to do this without creating a microservice, in some cases. The important part is simply to be allowed to experiment with alternate solutions, as long as it's not interfering with your main duties.)
Again I think we are simply dealing with different scales of systems.
Aren't these the end-goal companies anyway? Next is potential start-up?
Granted, there are some things that require so much time that you need to do them at work to really get a grasp of them, but that isn't everything.
No. This is the advice you'd expect from someone in their early thirties who has been coding and doing nothing else their whole adult life.
Your doubts should fuel your learning and temper your decisions. They are the voice in the back of your head and you can make it work for you instead of against.
Mastery isn't working harder, it's working smarter. It involves retraining your instincts to match your rational understanding of the domain. You get the shape of a problem and you naturally gravitate toward a reasonable solution without having to intellectualize the whole process first. If challenged, you of course have to walk back your intuition and build a step by step case, not just a rationalization, but that's fine because you've trained for that.
Study something else, anything else. Preferably with a teacher. What you're going to learn about the process of learning (and teaching) will be profound. It will make picking up and putting down new things easier, which will keep you from getting in a rut later.
Source: 40-something who spent his precocious twenties only coding. I learned a hell of a lot but I didn't know everything.
I think specifically it helps to deep dive on something that is much older than programming. A craft, an art, a sport, or a physical pastime (dancing, martial arts, kayaking). People have been teaching that stuff for centuries, and it turns out they actually know a few things that we only pretend to know. Two things in particular stand out.
Cross-training is one. The notion that pushing harder, white-knuckling everything, is a virtue in software development is rarely ever challenged. Not good yet? Just keep doing it.
The second is that we don't get that rules exist in a context. Always do this. Never do that. As you mature it is partly your responsibility to figure out when you are no longer part of that context. When you are ready, you will understand why you had to follow the rule and why you are now free of it. Look at the backlash against DRY. A bunch of people who've been using it for 10-15 years figuring out it's not all roses. See also: 'it depends...'
It was especially fascinating because the first sentence was something like "E-Prime refers to a version of the English language..." instead of "E-Prime is a version...". The article had all sorts of interesting quirks that really highlighted the diffs.
The sort of advice followed by programmers when building systems that will accumulate heavy technical debt that someone else will eventually have to sort out.
You will not become a better programmer this way.
The main thing is communication: duplicate, but put comments in both places noting the duplication, so the copies can be merged if they haven't diverged after some time.
This is also why I dislike "no comments in code" principles. Communication is key, and there are some things only plain English can convey; if all of your communication is code, you aren't communicating enough!
Make sure you get good rest too - all the learning in the world won't help you if you cannot retain what you learn. If you're in a pressure cooker of a job, consider transitioning to a less stressful one to give yourself space to work on yourself, so you're better prepared for the long haul and set up for success.
Lastly, focus on being a problem solver - this means if you see a point of friction/inefficiency/tricky situation, that is something to be solved. Your success rate in solving these problems is ultimately what drives stable success. This means practice as much as possible applying your mind to these problems, and if you fail, be ruthless about figuring out what you did wrong and iterate. Very few people can get away with being excellent there with minimal effort.
If you do want to leave the comfort zone, this is a must.
And we really do not care for the framework du jour, even if we use some.
- develop critical thinking
- take time to learn
- force yourself to understand (avoid cargo cult)
- accumulate non-redundant experience
this allows you to write simpler and cleaner code, which is high-quality code.
and lots of coders think clean code means isolating code into many different pieces, but that is far from the truth. clean code to me generally means fewer lines of code, fewer files, and fewer places to look for your stuff. it's elegant and efficient. you should put effort into separating less and grouping more. i cringe every time some dev breaks up a simple method into a dozen pieces when there isn't much justification to begin with.
so please try writing your own framework as an exercise. even in a crowded sea of frameworks, coming up with your own will be the best learning exercise you can do imho. it doesn't even have to be a framework - just something along those lines that will lead you to think outside the box.
To add a few more points on this: work on side projects. Let me tell you how a side project can help. For example, I was very comfortable using React in my projects. I thought of building a simple preview container (for different sizes, from desktop to mobile to tablet) where the user can drag and drop a few components, position them, etc.
I'm in the initial phase of this and, guess what, it's really helpful. Making yourself uncomfortable at times (in this case by using React to position components on the fly, which I had never done before) helps a lot.
I am really starting to wonder what other careers are possible as this kind of sucks. I am especially bad at bullshitting so I don't really get anywhere.
The somewhat good news is that job listings are not a great way to find jobs anyway.
I have gotten, and have known many others who have gotten, jobs they did not meet the description for at all. It's just a matter of going out and meeting people. If you can meet someone face to face, they won't sweat the details in the job description. No bullshitting required.
My other piece of advice would be to code with other people as much as you can, ideally from very different programming backgrounds.
Other than that becoming proficient in C and systems programming (Thanks xv6 book) was a major game changer for me.
Step 2: Update the documentation the next time through.
There is no step 3! ;)
Many good devs have bad attitudes, which makes it a bit harder for junior devs to value their skills, but it prevents them from being idolized too much.
Also, I always think it's a good sign if devs I look up to say things I find bad, because it shows I still see them as humans and not as infallible idols.
> Don't devalue your work
This is a hard one.
On the one hand, if you work with too many non-technical people, they tend to overvalue your work. I've met quite a few mediocre devs who were sold to me by managers as the best devs ever. They simply always "delivered", which some devs don't. But finishing your work is a bare minimum in my eyes, not the "best thing ever".
On the other hand, if you only work with highly skilled devs, you can start to think you can't do anything right. In the end you've got skills worth mad money to non-technical people, but you think you'd never get a job again if you lost your current one.
> Don't feel pressured to work all the time.
This is hard, especially for us devs who think of programming as their hobby.
I started freelancing 2 years ago and have taken about 4 weeks of holiday in that time. I worked many weekends - not because of "crunch time" but because I liked what I was doing - but I found out it really takes its toll :\
Now I try to do 3-6 month projects with a 1 month holiday after each, and weekend work only during actual "crunch time".
> Ignore fluff.
This is really hard, because fluff is fun.
But it comes at a price: fluff is everywhere.
It gives me a nice feeling reading about other devs who just don't get async/await, observables or destructuring. Not because I think they are idiots, but because I think "This seems to be hard and I already know about it!"
But yes, I probably poured days into learning observables and probably can't use them in my next projects.
> Dig into past research.
This is a nice thing, because most people don't do this.
I got a big book on HCI research from the last 50 years or so, and I always find nice solutions to my problems there, since many web and mobile problems have already been solved in experiments on research devices that never went mainstream.
> Take on big projects. Get uncomfortable.
Also: Let your life depend on it ;)
If you need to pay the rent with a project, you're much more inclined to "really" finish the thing and "really" learn the hard parts you need to understand before you can implement the solutions, which you need to "ship".
(Okay, letting your life depend on it isn't that good of an idea, but if money is involved it's often easier for me to see things through to the end. You should always have enough money saved up to survive a failed project or two, so you can also logically justify leaving your comfort zone.)
Please share, which book is this?
by Julie A. Jacko and Andrew Sears
WHAT? C is beauty, C is art, C is clean and concise. What is this guy talking about? Nobody complains about C; it's C++ they complain about, you idiot.
And yet, his blog repository, which is, in his words, "just a stupid simple server that indexes posts off the filesystem", has 1193 stars. Why would anyone star anything like that if not because they idolize jlongster?
I didn't point out a contradiction, just noted that he has fans who idolize him. The contradiction is theirs.