John Carmack: Career Advice (twitter.com/id_aa_carmack)
274 points by tosh on Dec 18, 2020 | 169 comments



David Epstein's Range has a lot to say about this topic. Carmack's idea of 'deep' work [0] is environmentally dependent. It works for him and for the places he has been at. But it is not a universal rule for learning in all environments.

Epstein uses Tiger Woods and Roger Federer as his headline case, but goes into many other forms of learning. Woods' golf is a kind learning environment; the rules are clear, the feedback is quick, the skills are straightforward. Federer's tennis is an unkind learning environment; the rules aren't as clear, the feedback isn't as quick, and the skills are more murky. A lot of tennis is the mind-game aspect and in elite tennis, you don't get the same person very often.

With programming and development, the environment Carmack talks about is a kind learning environment (rules are stated, feedback is a compile away, skills are practicable). Contrast this to the business/marketing people whose environment is unkind (unclear rules, long feedback cycles, skill overload).

Epstein's point is that the way to become great at something is to know what environment you are in. This gives you the keys to success. In kind learning environments like programming, golf, or chess, the key is practice and drilling, to do it until you can't be bad anymore. In unkind learning environments like tennis, jazz, or marketing, the key is to learn as many things as you can as broadly as you can, to gain a reservoir of ideas to use.

[0] not Cal Newport's idea of 'deep work', to be clear.


> In kind learning environments like programming

I'd agree this is the case up to a point with programming, but some of the most important aspects of the job are in the things that are unkind, like how well your design will hold up under future maintenance. Oftentimes there are no feedback mechanisms whatsoever for whether you've done a good job. Or they're slow and coarse and can be ignored or justified away. It's easy to just hop onto the next project and leave the mess you've made behind.

The more subjective stuff like how to approach problems as well as what to write and not to write in the first place -- these are what separate people who can technically do the job from those at higher levels. And often even a successful application here ends up getting overlooked by the very fact that it Just Works and doesn't make a lot of noise.

This is just one aspect of the unkindness. There's also how effective you are, how reliable your systems are, how productive you are, whether you pick the right things to invest in learning about, how well you solve problems, how well you debug, how well you coordinate with peers, how well you are able to understand and navigate existing systems, and how well you communicate technical ideas. Many of these things compound in positive or negative directions over time. I've yet to see any good materials teaching most of this stuff, and I suspect plenty of people will go their entire careers without ever really figuring out or being aware of a large chunk of it.


Whether the description of programming as kind holds really depends on what you are doing:

are you building things that all sorts of different systems need to integrate with, where you don't necessarily know what the technical specifications of those systems are = unkind

are you designing a database schema and writing queries against it = kind

are you doing frontend development (or something similar where fast paced change means new technology, new best practices etc. all the time) = unkind

are you doing text processing analysis of structured data following well understood schemas = kind

I mean really, they gave two examples from sports, one kind and one unkind - but for programming it is just kind?


  are you designing a database schema and writing queries against it = kind
I worked on a project using MongoDB. The initial implementation was easy and quick to iterate on. After a few years it was evident that it was not sustainable. Two years of painful effort were required to squeeze out MongoDB.

  are you doing frontend development (or something similar where fast paced change means new technology, new best practices etc. all the time) = unkind
React has already won; front-end tech has been stable for the last 4 years. It is an easier part of my job. Front-end development requires high upfront learning, but it gets boring at some point.

  are you doing text processing analysis of structured data following well understood schemas = kind
I know some pipelines on structured data that are hell. I worked with XML (FpML) and the systems that produce/consume it; it would make most people cry. It is very easy to create massive technical debt where data is flowing. Any change needs to be done from both ends. It is not like front-end work, where I can refactor components as much as I want.

Because programming is such a diverse field, I would be careful with labeling anything kind/unkind.


>Any change needs to be done from both ends.

If the change needs to be done on both ends it is probably an integration problem, which I labelled unkind.

The choice of the words kind/unkind is probably not well thought out, because kind implies easy, but there is nothing in the description of the term that means it is an easy thing to do.


Kind because there is an obvious path forward. Unkind because there is uncertainty over time. In programming there are no objective measures for code quality, cost of maintenance, or even the productivity of individuals.

I would argue that programming is fundamentally unkind. You can only learn a small fraction of the stack and operate on assumptions. It is very hard to learn from other people's experiences, as it is hard to quantify complexity. If someone is preaching microservices, you can only understand the tradeoffs by having specific knowledge/experience. That's why we still have people using PHP for new projects: they are so entrenched in their local maximum.


Epstein goes through many other examples of learning environments in the book, most of which I found to be very entertaining and informative. To be clear, Range is not an academic article, and is very much in the 'pop-psych' genre like Gladwell or Taleb. That said, it's well worth a Christmas vacation to read through. Gates had it on his best books of 2020 list for a reason (though it was published in 2019). Pick up a copy from the local library (if open/online) or buy it yourself:

https://www.amazon.com/Range-Generalists-Triumph-Specialized...


to expand on this I sort of have to expect, if this whole kind/unkind thing is to mean anything at all, that any reasonably large area of human endeavor (sports, programming, politics, law, music, etc. etc.) will have some sub-fields that are either kind or unkind, although I suppose some areas (like politics or business) might skew extremely towards the unkind.


> Federer's tennis is an unkind learning environment; the rules aren't as clear, the feedback isn't as quick, and the skills are more murky. A lot of tennis is the mind-game aspect and in elite tennis, you don't get the same person very often.

The Inner Game of Tennis [0] for folks interested in learning more about this.

[0] https://www.amazon.com/Inner-Game-Tennis-Classic-Performance...

PS Novak's probably more clutch than Rog.


Novak also has to deal with hostile crowds a lot more, who are in denial that he’s on track to surpass Roger and Rafa. He’s possibly under-appreciated because he doesn’t have the flair of the other two, but rather a robotically consistent, precise, and relatively low risk style.


Anyone who has played some tennis can attest that it's more mental than physical, especially when you're playing against opponents of the same level. No magic about that.


This is an interesting perspective; from a technical view it seems like the difference between writing a library and debugging a system.

Programming a library referencing an RFC seems kind, the rules are already there.

Debugging a performance issue in a large distributed system is unkind, you need a lot of existing knowledge and ideas to know where to start.


I don't know, I think debugging a perf issue in a large distributed system still falls under kind work to me, assuming you actually have a way to tell when it's fixed. Yes, you need a lot of knowledge, but the rules and outcomes are still clear. Compared to writing a novel or painting, it's very clear when you've reached success, and it's hard to argue that you haven't been successful.

IMO the "unkind" work that a typical engineer does is things like writing specs. The work itself does not give you any feedback about if you're making progress or doing a good job.

Of course it's all on a sliding scale, and if your perf issue takes four weeks to surface after each deploy then that's a much less kind environment. But, in general, anything that you can put clear metrics around is much closer to the "kind" bucket to me because those metrics give you a clear path for improvement and iteration (the major risk is getting stuck in some local maximum).


> I don't know, I think debugging a perf issue in a large distributed system still falls under kind work to me, assuming you actually have a way to tell when it's fixed. Yes, you need a lot of knowledge, but the rules and outcomes are still clear. Compared to writing a novel or painting, it's very clear when you've reached success, and it's hard to argue that you haven't been successful.

The same goes for poker: you know when you've won a hand. However, winning a hand doesn't mean you learned something; similarly, successfully debugging a large performance issue doesn't mean you learned something either. The learning we actually care about as software engineers isn't kind at all.


> Programming a library referencing an RFC seems kind, the rules are already there.

Depends.

Is there a reference implementation? A format/protocol validator? An interoperability matrix? What's the ratio of SHOULDs and MAYs to REQUIREDs, SHALLs, and MUSTs? Does the RFC reference other RFCs? What are the answers to these questions regarding those?

I mean, I wouldn't ever describe the task of implementing a SOAP[0] or WebDAV[1] library as 'kind'.

Oh, and if you're working at a sufficiently large org, be prepared to battle it out with a competing internal implementation for the coveted position of 'company standard'.

Note that you (or your boss) may not actually want to win that battle, as internal customers will make all sorts of unreasonable demands[2], and implementing them all will cut into your real work and negatively affect your evaluations and career advancement. Also, refusing to implement those requests will get you labelled as "not a team player" and negatively affect your evaluations and career advancement.

Fun times!

[0] https://www.shlomifish.org/humour/by-others/s-stands-for-sim...

[1] https://www.slideshare.net/mobile/tobyS/webdav-the-good-the-...

[2] https://dilbert.com/strip/1995-11-17


Epstein kinda treats things as a binary: un/kind. But I think there is a spectrum of 'environmental kindness'. Troubleshooting is a good case of this. If all you do is debug a single system, then you'll get really good at it, via practice. It starts out environmentally unkind, but ends up being environmentally kind via repeated use. While if you have to debug multiple systems and add in new systems, then you stay in an environmentally unkind situation.

Being a mechanic on your own car is one thing, but working in a shop with new cars and problems coming in all the time is another.


I was nodding along until the last sentence of:

> In unkind learning environments like tennis, jazz, or marketing, the key is to learn as many things as you can as broadly as you can, to gain a reservoir of ideas to use.

I think this very much applies to kind (programming) too. Sometimes the biggest programming level-ups require broadly knowing about this, that, and the other thing.

For example, you can practice writing file upload code until your eyes bleed, but if you're unaware of the implications of uploading to local disk in a containerized or Heroku-like platform, then a lot of what you know suddenly falls apart. Your entire basis for what you've been doing might need to be rewritten or drastically altered (creating a "storage" abstraction, etc.).
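
To make that concrete, here is a minimal sketch of the kind of "storage" abstraction I mean. The names and the object-store client are made up for illustration; the point is only that upload code depends on an interface rather than on the local filesystem:

  from abc import ABC, abstractmethod
  from pathlib import Path

  class Storage(ABC):
      """Hypothetical interface so upload code doesn't care where the bytes end up."""

      @abstractmethod
      def save(self, key: str, data: bytes) -> None: ...

      @abstractmethod
      def load(self, key: str) -> bytes: ...

  class LocalDiskStorage(Storage):
      """Fine on a single long-lived server; breaks on ephemeral containers."""

      def __init__(self, root: str) -> None:
          self.root = Path(root)
          self.root.mkdir(parents=True, exist_ok=True)

      def save(self, key: str, data: bytes) -> None:
          (self.root / key).write_bytes(data)

      def load(self, key: str) -> bytes:
          return (self.root / key).read_bytes()

  class ObjectStoreStorage(Storage):
      """Placeholder for S3/GCS-style storage; `client` is an imaginary SDK object."""

      def __init__(self, client, bucket: str) -> None:
          self.client = client
          self.bucket = bucket

      def save(self, key: str, data: bytes) -> None:
          self.client.put_object(self.bucket, key, data)

      def load(self, key: str) -> bytes:
          return self.client.get_object(self.bucket, key)

Upload handlers that only ever see Storage can move from local disk to an object store as a configuration change instead of a rewrite.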

With software development it's super valuable to get a broad understanding of many parts of the stack, not just individual features. I say "feature" here because in a golf analogy a feature would be putting within 6 feet, chipping under 20y, or using a driver. In golf you can individually learn these things in isolation and your overall game improves in the end. In programming this isn't as clear cut IMO.


> programming as work
> rules are stated

I'm not sure where you work but I'd love this kind of environment.


Here are some ways to learn deeply and be relevant:

1. Pick a data structure (such as a hash table or LSM-Tree) then read all the literature there is to read, every single paper that's great, following the best conferences year after year, and implement a 10x faster or more scalable version for the std lib of your favorite language (a minimal sketch of the "implement it yourself" step follows this list).

2. Pick a fault model (such as storage faults, network faults, cryptography faults) then read all the literature there is to read, every single paper that's great, following the best conferences year after year, and write a fault injection or fuzzing harness to break some of the most respected storage/network/cryptography systems (for examples, see the work done by Remzi and Andrea Arpaci-Dusseau on storage faults, Kyle Kingsbury on Jepsen, and Guido Vranken on Cryptofuzz: https://github.com/guidovranken/cryptofuzz).

3. Pick a software field (such as web applications, mobile applications, native applications, file formats such as Office Open XML, or protocols such as SMTP, MIME, HTTP, QUIC) then read as many CVE reports and bug bounty reports as you can find, and then start participating in bug bounty programs within this field. Pick a target and give yourself a goal, e.g. DoS, RCE or read/write access, and do the work to make it happen. Chain as many steps as you can. Automate and enumerate. You'll find a way in if you keep at it. There's nothing like crafting an exploit to change the way you think about programming.
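
Returning to the first suggestion: here is a deliberately tiny open-addressing hash table in Python, just to show the shape of the "implement it yourself" exercise. It is a toy (linear probing, power-of-two sizing, no deletion), nowhere near the 10x goal, and every sizing choice in it is an arbitrary assumption:

  class OpenAddressingMap:
      """Toy hash map: linear probing, power-of-two capacity, no deletion."""

      _EMPTY = object()

      def __init__(self):
          self._slots = [self._EMPTY] * 8
          self._size = 0

      def _probe(self, key):
          mask = len(self._slots) - 1
          i = hash(key) & mask
          while self._slots[i] is not self._EMPTY and self._slots[i][0] != key:
              i = (i + 1) & mask          # wrap around the table
          return i

      def __setitem__(self, key, value):
          if (self._size + 1) * 3 >= len(self._slots) * 2:   # keep load factor < 2/3
              self._grow()
          i = self._probe(key)
          if self._slots[i] is self._EMPTY:
              self._size += 1
          self._slots[i] = (key, value)

      def __getitem__(self, key):
          slot = self._slots[self._probe(key)]
          if slot is self._EMPTY:
              raise KeyError(key)
          return slot[1]

      def __contains__(self, key):
          return self._slots[self._probe(key)] is not self._EMPTY

      def _grow(self):
          live = [s for s in self._slots if s is not self._EMPTY]
          self._slots = [self._EMPTY] * (len(self._slots) * 2)
          self._size = 0
          for key, value in live:
              self[key] = value

From here the literature takes over: Robin Hood probing, SIMD-accelerated probing a la Swiss tables, and so on.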

As you gain experience in data structures, storage/networking/cryptography, and security, you'll find this translates well to most software engineering work. You'll gain a speed/safety/security way of thinking, you'll have fun being curious and learning along the way (and hopefully you'll earn a bounty or two and get some CVEs under your name).


Let’s say I’ve picked a data structure. How would you suggest identifying the great papers and best conferences related to it?


A few ideas:

1. Add "abstract" to your search query to surface papers.

2. Search for "... reading list". For example, Heidi Howard maintains https://github.com/heidihoward/distributed-consensus-reading...

3. Read blogs like "The Morning Paper" (https://blog.acolyer.org) but skip fields that are outside your scope. You don't have time to follow more than one or two (or three) major fields.

4. Use Google Scholar to find the most cited papers, or to find papers that build on papers you think are good.

5. Keep an eye out for the conferences where these papers were presented. Then read the other papers that were also presented.

6. When you come across an amazing paper, read other papers by the same authors or supervisors.

7. If you're lucky you might also find good "survey" papers that cover and reference the state of the art.

8. Lecture notes from Stanford or MIT or another university can also be a great way to get a big picture of the evolution of techniques for a given data structure or problem. For example, these lecture notes are just brilliant for getting started with stuff around memory hierarchies: https://www.eidos.ic.i.u-tokyo.ac.jp/~tau/lecture/parallel_d...

These are a few tricks that I find useful. What else?


Yeah similarly “review” “survey” in google scholar will work. Identify major authors (they keep cropping up). Find the big textbooks. See who those kinda people cite, follow that trail.


Start with the foundational paper(s) on the topic. Then use google scholar or your bibliography tool of choice to see who's cited it. It takes a little work but just burn down the list reading abstracts as well as noting how heavily each of these papers has in turn been cited. Those are likely the most influential derived works. It's common for grad students to do survey papers as well, so keep an eye out for those as they often give you a great roadmap to the current state of things.

Also pay attention to authors. If someone has done an influential paper on a topic, it's likely there will be additional work or tech reports in that area on their homepage or with their research group.

To find conferences/venues just note where the more recent papers are published.


>following the best conferences

There are conferences for hash tables?



Talking about hash tables at a conference and a conference dedicated to hash tables wouldn't be the same thing.


Can anyone elaborate on what Carmack is suggesting here? To specialise in some tool early in your career? Or something else? I don't quite get it. (Or at least I don't get why it would be such a good idea.)

Carmack is usually great to listen to, by the way; check out some of his interviews or talks if you haven't before.


I think his argument is fairly simple:

1. Learning (fundamental) stuff deeply is the way to become great.

2. But knowing how stuff works fundamentally at an abstract level (e.g. being able to write your own toy OS) is not directly economically useful (and/or very time consuming to get to that stage). So it won't get you a job.

3. So for starters learn just one thing that people actually use (say git) deeply at a concrete level. That is economically useful, because most devs will only have fairly superficial git skills, so you becoming the go-to person for git will provide economic value to the company.

4. At the same time, this is an effective way to bootstrap a more abstract deep understanding. E.g. if you really master git, you will also learn a fair amount of abstract concepts that go beyond the concrete tool (deep understanding of git implies the ability to implement at least a toy version yourself, which will teach you useful stuff; in particular if you can implement your own version of git you are probably already better than 90% of programmers).

5. Be mindful that you will be blinkered at this stage (because you only know a single thing and thus lack a basis for comparison), so don't become opinionated yet.

I think this a great early career strategy as long as you pick the right thing to deep-dive into. E.g. if you picked AspectJ or Agent Learning or CASE tools or the semantic web around 15-20 years ago (at the peak of their hype-cycle), this possibly wouldn't have worked out so hot. In particular, learning some things like Spring or Freudian Psychology will probably harm rather than help your intellectual development.

Making a good pick is hard if you lack experience; I'd say if in doubt pick something really everyone in your line of work uses but most people have not mastered and where there is value in mastery (and you can see how mastery might tie into things you want to learn more about at a fundamental level). Also, preferably pick something that has been around for at least 5 years (unless you are quite confident in your nose for trends).


This closely matches advice I've been giving to new hires and team members for years. When you join a new team look for the gaps, what isn't getting done or what will benefit the team most and fill that gap. You're not going to compete with the current guru of this or that, so learn from them for sure but that's not going to mark you out.

The only downside is that there are often reasons those gaps are there, but then it's just time to roll up your sleeves and get to work. It's an approach I've taken every time I've joined a team and it has worked well for me.


I'd be a bit wary of gaps in some environments. As you suggest, those gaps often exist for a reason.

From my experience, a lot of those gaps are essentially integration efforts between two systems: one person built something independently of the other, and the two were supposed to be integrated (or are now desired to be) yet were never planned or designed appropriately to integrate. In the end, you need to become a master of the disjoint pieces that formed that gap in order to connect them, and you have to deal with two 'masters' as opposed to one.

My suggestion is to find gaps that aren't integration efforts, if possible, that are instead extension efforts. This allows you freedom of creativity for some greenfield development while avoiding a bunch of legacy code and technical debt that will swallow you whole. It also gives you a starting leap from something developed by a master to work from.

I can't count how many times I've joined different teams to assist and found a pile of integration efforts (my sample may be biased, YMMV). This seems to be a more common modern phenomenon where people need to pump out something to look good/productive and then coast away before the challenging effort of making things work together occurs. I would say this issue is exacerbated by the mass adoption of "agile" where integration is always an afterthought since business leaders are often the dictators.


> My suggestion is to find gaps that aren't integration efforts, if possible, that are instead extension efforts.

Yes! This can go even further and the technical integration work can easily slip into soft/people/team/project/management integration. Here is a talk which warns against doing this kind of glue work too early in your career:

Video: https://www.youtube.com/watch?v=KClAPipnKqw

Summary article: https://noidea.dog/glue


Excellent video and summary, thank you for the links.

I explain this same concept to people at less software savvy businesses frequently, so it's nice to have an external reference to point to. Going to borrow her nomenclature.


> This seems to be a more common modern phenomenon where people need to pump out something to look good/productive and then coast away before the challenging effort of making things work together occurs. I would say this issue is exacerbated by the mass adoption of "agile" where integration is always an afterthought since business leaders are often the dictators.

For me it is the latter over the former. I would love to do a 3 point business feature every sprint and spend the rest of the time cleaning up code. Between the POs pushing work onto my plate, as well as everything else I have to do (releases, code reviews, fixing regressions etc.), cleaning up old code is a luxury at the moment.


The above explanation is great but I think the example you selected (git) is rarely the right thing to choose as for most companies git isn't where the money comes from (notable exceptions are companies like GitHub or GitLab). Sorta like the difference between working in the IT department vs Development (for a software company).

Maybe a better choice is something like CSS: there's actually a lot to know about it and a lot of companies need people who are good at it for their core product (even though it's not as deep a topic as say being very knowledgeable/skilled in functional programming).


It's been surprising to me to see my wife's deep knowledge of git, SVN, and perforce be so cherished in her organization. Surprising because she works for a place that is routinely featured on the frontpage of HN, where very nearly every single person on HN has used their software sometime in the last ~30 years. It was my view, originally, that these people ought to already have the answers to the questions they're asking. I was wrong.

She and one of her colleagues are the de facto git gurus in her entire department of over 100 people. She's ended up owning tooling in their build pipelines that is quite complex, using libgit, etc... That tooling was eventually exported around to the rest of the engineering teams (~1,000 total engineers). Now she gets random, challenging git (and perforce, to a lesser degree, and she stepped in to help with some SVN issues at one point in time, though those projects have moved over to perforce, I believe) questions from the entire engineering organization.

I do agree with you, though, that git is probably the wrong thing to focus on and that she and her colleague are outliers. But, it's interesting to see nonetheless.


At least in pre git days many large companies had people whose entire job it was to merge and resolve merge conflicts. They were not part of the project teams. They were part of the build and deployment team.

Granted in any sane company this should just be done by the devs themselves. But as the OP said, in most teams knowledge about versioning, branching and version control is very very limited. Even if all you know is how to do a rebase of a longer lived feature branch easily in git (skipping the commits from the other branch) you will be the wizard that everyone goes to. Many many devs will not even attempt a regular rebase. It's black magic for them.

If all you desire is a cozy job in a large company with braindead processes and policies this sort of thing is probably enough to be tipping the scales towards: well we shouldn't lay off that guy coz he's the only one that can work the version control magic AND he codes like mad.

If you wanna stick out in a company filled with John Carmacks you will have to do better, sure.


Managing distributed version control at scale is one of the core challenges facing nearly every growing software company. Performance issues, the increasing incidence of merge conflicts, preemptive CI, build system integration... this is definitely a topic you can go deeply into, and make an impact at many companies.

On the flip side, you are now a source control/build system engineer, and it's pretty easy to get typecast as such (given the outsize impact your specialist skillset will have).


>the example you selected (git) is rarely the right thing to choose

It's clearly just an example, but whether it's a bad one or not is debatable. Git is used by most shops, and most people have challenges with it at one time or another. Further, it is IMO an excellent example of how choosing the right data structure for a problem can have a profound impact on the resulting application. You could do worse for broadening your knowledge of CompSci than by a deep study of Git.


Thanks for the comprehensive explanation!

I also find it quite interesting that you mentioned git specifically, because I’ve been going back and forth for the last few days with myself about whether I should spend the rest of Winter Break and next semester (I’m an undergrad with one semester left) learning git internals deeply. So, this felt somewhat validating of the perspective that I should learn about git more deeply so I can have enhanced practical knowledge, which could translate into more economic value for the company I’m working with after graduation.

Once a decision is made in this regard, I then have to figure out the best way to actually learn it. I’ve considered trying to follow the mailing list[0] and contribute code to the git project, thinking that would force me to learn it very deeply. But at the same time, I feel like trying to go that deep might have diminishing returns and might stress me out. I like the idea of being an open source contributor to such a huge project and gaining more experience that way, but I also worry about the potentially high time cost.

I imagine I could alternatively read the Pro Git Book[1], the git documentation[2], and/or some other specific git internals resource someone has curated, but I haven’t decided what would be best yet.

[0]: https://git.wiki.kernel.org/index.php/GitCommunity

[1]: https://git-scm.com/book/en/v2

[2]: https://github.com/git/git/tree/master/Documentation


You might like to try Write yourself a Git [0] (discussed here previously [1]). YMMV, but I find the best way to learn something deeply is to get hands on. Less of a chance of convincing yourself you understand something that you really don’t.

For less of a time commitment, Git from the inside out [2] is a really nice explanation of the internals, from initializing a repo and the files that creates in the .git directory, all the way to pulling from and pushing to remotes.

[0]: https://wyag.thb.lt/

[1]: https://news.ycombinator.com/item?id=19386141

[2]: https://maryrosecook.com/blog/post/git-from-the-inside-out


Wow, thank you so much for pointing this out!

I know from my (albeit limited) development experience that I definitely retain concepts I’ve encountered via a hands-on approach better, but I hadn’t made the leap to _this_ sort of hands-on approach! I’d just resigned myself to the long slog of trying to get a commit merged into the master branch haha.

Once again, thank you! I’m going to check out both of those resources!


I agree with the sibling comment that trying to implement a limited version of git (but one that can work on a real git repo) would almost certainly be a great way to learn git (there are multiple "implement git yourself in X" projects for different languages X that you can find online).
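
To give a feel for how approachable the core is, here is a sketch of the very first step of such an exercise: storing a file's contents as a loose blob object the way git does (header, SHA-1, zlib, .git/objects/xx/yyyy...). The function name is mine; the format is git's:

  import hashlib
  import zlib
  from pathlib import Path

  def write_blob(repo_root: str, content: bytes) -> str:
      """Store `content` as a loose git blob object and return its object id."""
      store = b"blob %d\x00" % len(content)    # git's header: type, size, NUL
      store += content
      oid = hashlib.sha1(store).hexdigest()    # the object id is the SHA-1 of header+content
      path = Path(repo_root) / ".git" / "objects" / oid[:2] / oid[2:]
      if not path.exists():
          path.parent.mkdir(parents=True, exist_ok=True)
          path.write_bytes(zlib.compress(store))   # loose objects are zlib-deflated
      return oid

You can sanity-check it against `git hash-object` and `git cat-file -p` in a throwaway repository; trees, commits, and refs build on exactly this layer.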

There is no harm in reading through Pro Git (which is a fine git book) either, but generally, and I wish I had been more acutely aware of this myself when I was at school, if you really want to learn something you have to either implement it yourself or use it in anger in a realistic setting. It is useful to read about stuff to get more of a feel for what's out there, but you won't get a good understanding of anything unless you actively engage with it, and in a setting that practically matters to you. From personal experience: it's easy to trick oneself into a superficial sense of knowing something and then come up short when called upon to either implement it or use it to solve a real problem. So my number one piece of advice would be: don't fall into this trap; do (hard for you) things in a way you can't cheat yourself (and get them working first before you make them pretty).

So give it a try, the winter break should be long enough to get it done, and if you find it's beyond you at the moment, you can always scale back your ambitions and tackle it once you've grown in ability.


> generally, and I wish I had been more acutely aware of this myself when I was at school, if you really want to learn something you have to either implement it yourself or use it in anger in a realistic setting.

This is so true! Only towards the end of my formal education have I begun to learn the same thing. In many ways, I’m still biased towards just reading things (partly because implementing something, running into issues I feel I’m not knowledgeable enough to solve, and failing really bugs me), but I’m gradually learning that failure is just part of the learning process, and doesn’t imply I’m incapable of doing something.

I’m definitely going to try it out over the break! :)


Since you mentioned diminishing returns, stress, and time cost, it sounds like you want to balance practicality with interest. Being frank, don't do it. I'm a software engineer and I have had incredibly talented coworkers, some of whom are the type that everyone goes to for help, who barely know how to use git without a GUI, if at all.

If you want to spend a whole winter break + semester (which is a very valuable amount of time), is there anything else you're interested in that would also line up with your career interests? That might be a win-win for the academic exercise and practical benefit.


This is a very good question and I'm going to think about it more. During this past semester, I started keeping a Trello list of potential projects/readings I could work on to consolidate things, but it's still hard to pick from. The Git internals thing I mentioned was one thing in that list. Since I'm interested in open source software and the communities that form around projects, another idea I had is developing a "book club" where people meet and read the code and discuss the architecture of popular open source projects. I created a basic site for this a few weeks back, but haven't advertised it since it's not acceptable for public viewing yet.

The main issue for me is that my interests are a bit too broad. Should I learn web dev from my job by day, then try to implement an OS in my free time by night? I know I can't learn everything, but yet it feels like there's so much I really _do_ need to know to be a competent developer.


> I know I can't learn everything, but yet it feels like there's so much I really _do_ need to know to be a competent developer.

There really isn't anyone on the planet who is truly a "full stack developer" and competent from the front-end all the way down to the OS and bare metal. No one.

Most developers stick to a particular layer, and are only familiar with adjacent layers to the extent of being able to have arguments with those developers. A 5x-ish developer is competent in those adjacent layers as well. A 10x developer either has mastery of their chosen layer, adds additional competency layers, adds competencies on the connecting tech between chosen layers (ie. not just front-end + backend, but networking too), has relevant domain competencies, or some combination.

So, just pick an entry point, learn as much of it as you want, check out whatever is adjacent, follow your interests, and periodically evaluate whether you need/want to change course or dig deeper into technologies, tools, domains, etc. wherever you happen to be. Even if you end up wandering quite far from where you started, the experience and knowledge you gain on the way isn't likely to be "wasted", unless it is mastery of some in-house tech or tool that literally no other organization uses.


I disagree, I can easily imagine somebody having full metal to UI knowledge. Obviously not knowing every programming language, all types of electrical engineering applications, etc. But it's totally feasible to be able to have a sense of understanding from metal to front-end at least to an extent.


Note that I am speaking of "competence", and you're mentioning "knowledge" and "sense of understanding". Not really the same thing.

Separately, "the stack" is inherently polyglot, and even aggressively pruning the selection wherever possible and eliding many formats, tools, and protocols still gives you an absolutely minimal set like JavaScript, HTML, CSS, HTTPS, TCP/IP, SQL, C/C++, Assembly, & VHDL.


You can replace what I said with competence, and I still claim it. I think it's fully possible for people to be competent from metal to frontend, and I'm sure there are at least a few out there who are.


We may be differing in defining competence, but regardless: agree to disagree?


This is what I would do. Treat it like an engineering problem: break it down into smaller, simpler parts, and start with small steps.

You value practicality as well as personal interest/fulfillment. Either pick one and start there or pick both and consider the options there. If you want to be practical, learn the most marketable skill that you're at least somewhat interested in: Java, AWS, Docker, Javascript are all incredibly marketable. If you want to go for interest, just pick the thing you're most interested in. You could also try and juggle two deep dives, at the cost of more time and less depth, but maybe that works for you.

Don't worry about becoming a senior dev in a semester. Pick something and focus on it. That could be an AWS certification, building something in Java, etc. Once you get through that, then decide your next step.


I'm sure many will come down hard on my comment and disagree. But speaking as someone who teaches at a university and also works in industry and is involved in hiring, I don't think becoming an expert in git is worth your time. At this stage in your career you should spend your time mastering algorithms, data structures, and a compiled language like Java or C++. I would put emphasis on learning how to use your language of choice idiomatically (e.g., iterators, streams, the standard library and its core data structures, etc.). In my experience, the best way to do this is to practice Leetcode every day. Doing one question a day (a 30-60min commitment) will put you leagues ahead of your peers. Combine this with reading a major book on your language (e.g., Effective Java or Modern C++ Design, etc.).

Without getting sidetracked about the merits of the technical interview, it is currently a fact of life. In my experience most undergrads struggle to solve even the most basic problems, and even if they come up with a solution, they are unable to code it in any language of their choice. If you are coming out of university as a "git expert" and can't code up a basic solution, you will get passed over every time.

Most teams (at least in my experience) are not struggling to solve git problems (although they certainly pop up). So while you could add some "value" there, overall you aren't adding a whole lot of value. On the other hand, if you know your language and are a moderately competent coder, you can add a lot more value.


I'm pretty sure (well hoping) that all of the advice to go deep into one thing would always be in addition to actually being able to code. I completely agree with you that understanding git very deeply is not needed or useful to the level of detail suggested here in other comments.

The level of understanding of git you need to stand out is very very limited in my experience. Most devs in "regular" companies struggle to understand even the basic rebase. Even on a logical, abstracted level. Never mind how and why it actually works as well as it does.

The types of predicaments I see people get themselves into even with git vs. say Subversion are mind-boggling. I have never gotten myself into a situation that wasn't resolvable by simply making sure that everything I try is done after committing my changes. You can just always go back and retry. And just slapping a label (branch in git speak, sure) on a commit before force pushing after that rebase with lots of conflicts, so that I have a backup. And even that is not strictly necessary. I've fucked up conflict resolution only to notice when the build server tells me, and I had to go find a commit hash in my terminal output somewhere to resurrect it (I guess I was lucky I didn't hit an auto cleanup of dangling commits in between ;))


It wasn't clear to me that the other comments were starting from "know how to code," which is why I made my comment. If OP spends time honing their coding abilities then by all means learning more about git is a great plan.

I am really surprised by your comments about git at "regular" companies. If all we are talking about in this chain is understanding workflow and how to rebase, cherry pick, etc. then I completely misunderstood the discussion.

I have certainly gotten myself into some hairy situations. Since I avoid making massive commits, if things get too bad I have always been able to quickly resolve the issue by just doing a clean clone somewhere else and moving my changes over. As a last ditch effort it works quite well and does not take too much time (or stress :) ).


Oh I'm sure some other people took the discussion on various different ways. Such is communication between humans ;)

Btw. in case it helps you. No fresh clone in a different place needed. I think I know what kind of situation you mean and all you need is to cherry pick your commit on top of the branch you want instead of doing that merge/rebase that isn't working out. Takes even less time than cloning somewhere else and moving your changes over.

And in some cases what this sort of situation really just needed was an interactive rebase that skips the appropriate commits that already happened on the main branch. Suddenly a litany of seemingly unresolvable conflicts doesn't even exist in the first place. Many ways leading to Rome there.

I would encourage you to always work from just within exactly one repo with git. The whole "having another copy of the repo somewhere else" is something I have seen so much with svn but it just really isn't required with git at all. And even if you have to "do a fresh clone" you really don't have to. Just get rid of all files (rm -rf) except the .git directory (or copy it where you need it) and checkout again. I've used that a few times when I was having build issues and I wanted to make absolutely sure there were no generated files from either my IDE or the local build left anywhere that could screw things up.


Pretty sure you're getting downvoted because this sounds like the advice of a professor who hasn't spent much time in an industry setting: grind algorithmic problems to succeed, only to find out past the interview that the knowledge rarely gets used.

I've managed to not have to do a technical interview for all of my internships and jobs so far, largely due to my efforts networking and focusing on learning popular technologies (especially React). I've done the theoretical coursework and enjoy the problems, but the hour-a-day leetcode would not have been nearly as useful as learning a popular library and building connections.

And that said, I'd say git problems are the norm especially with newer devs. Having that one person on the team who is a git-master is invaluable when you've made a mess.


Hmm - that is surprising since I specifically mention I work in industry (and have for quite some time).

It sounds from your post like you may do front-end work. I work primarily on distributed systems for machine learning, so I can't speak to front-end work, but in my experience understanding basic data structures and algorithms is quite useful in day-to-day work. It is great you have gotten to where you are without doing technical interviews. On the other hand, every job interview I have had has had multiple rounds of technical interviews.

And I should also make clear that I am not saying become a competitive programmer or a Leetcode expert. For example, there isn't much value in looking at dynamic programming style problems unless you are interviewing at a top company. But spending 30min to an hour a day on easy to medium level questions will definitely sharpen your reasoning about algorithms and data structures. And like I said, in my experience those are used very often in day to day work (at least on the backend side of things). Also, to clarify, I am not saying you should do this forever.

As an anecdotal example, I recently reviewed a PR that had a lot of complex if-else statements that was dramatically simplified through the use of a set. The updated code was easier to read as well as understand. When I pointed this out, the engineer agreed and understood what was going on. But their initial instinct was to use the one tool they knew: arrays and chains of if-else statements. This is the kind of skill I am getting at - knowing enough about your language, data structures, and algorithms to know when to use the tools in your toolbox.
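
A contrived Python example of that kind of change (the status codes and the rule here are invented purely for illustration):

  # Before: a conditional chain that grows with every new case.
  def is_retryable(status):
      if status == 408:
          return True
      elif status == 429:
          return True
      elif status == 502 or status == 503 or status == 504:
          return True
      else:
          return False

  # After: the same rule as set membership - shorter, and the data lives in one place.
  RETRYABLE_STATUSES = {408, 429, 502, 503, 504}

  def is_retryable(status):
      return status in RETRYABLE_STATUSES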

I don't think it is unreasonable to expect a software engineer to understand the difference between tree-based and hash-based data structures, when you should use arrays, and pros/cons of linked lists, etc. Practicing this kind of stuff, which is very easy to do in Leetcode, is the fastest way to build this intuition (at least in my experience with distributed systems).

Edit: just wanted to add that understanding these things makes it extremely easy to reason about systems like Redis, memcached, Cassandra, Kafka, etc. If you understand the basics, then you start having these moments of clarity, thinking "oh this is just a big hash table!" etc.

Also, meant to add that the repetition of Leetcode-style questions is super valuable in learning the APIs of your language. Things like "how do I create a hash table? How do I populate it? How do I check if it contains a key?" This is all simple stuff, sure, but a lot of new grads aren't as familiar with the APIs. It's not a big deal, but it is also an _incredibly_ easy way to stand out.
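
For what it's worth, in Python those three questions are about a line each:

  counts = {}                                       # create a hash table (a dict)
  for word in ["to", "be", "or", "not", "to", "be"]:
      counts[word] = counts.get(word, 0) + 1        # populate it
  if "to" in counts:                                # check whether it contains a key
      print(counts["to"])                           # prints 2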


I think you're definitely right about how taking time to practice with algorithms and data structures using tools like Leetcode can give you that nudge of intuition that helps you reason about other systems.

In fact, I had a similar "simplification" moment during the summer after practicing Leetcode for a few weeks. There was a longer than needed function that returned a list of unique items, and realizing the intent of the code, I simplified it to a one-liner using built-in data structures. Small moments of recognition like that feel great!

Your point about becoming more familiar with your language's APIs is also a great one. After the aforementioned Leetcode practice, I didn't have to look up things half as much as I did previously about Python. It was a good feeling :)

Unrelated to the above anecdote, I recently started following the Backend Engineering Show with Hussein Nasser [0], and he makes a lot of these kinds of connections to popular technologies like you described.

EDIT: forgot to add the link!

[0]: https://open.spotify.com/show/55pPBm0l75K28dIqoHIQIc


So for technical interviews, yes you should absolutely grind algorithms, and not just any algorithms, but exclusively the stuff that comes up in leetcode, and yes you should do weeks of daily practice at leetcode or hackerrank if you want to get hired at a FAANG (or many other places). Up to a certain level getting better at leetcode problems will even improve your actual programming skills (BTW: I would be extremely interested to hear about any data concerning this improvement).

Beyond bandwagon effects I suspect these tests originally became so popular because they are an unproblematic way to select for people who are highly intelligent but also do as they are told.

This also points to a potential shortcoming of the "grind leetcode" strategy: it is applicable to the extent you meet the above characteristics and even if you do well enough to commend yourself as a high-spec corporate cog, and get that FAANG job, it does little to differentiate yourself against other high-spec corporate cogs (unless you are one of the tiny top fraction of competitive coders, maybe).

So with the exception of mastering algorithms (as taught in uni beyond leetcode needs) or mastering C++ or Java[+], I wholeheartedly endorse your advice, but becoming really good at some suitable and economically useful X is, I think, a more widely applicable strategy with a higher ceiling.

Also, I find it interesting that although I explicitly mention that my idea of learning git deeply would involve the ability to implement it (rather than memorizing the man-pages, say, and presumably validated by actually taking a stab at it) a few people pointed out that git is essentially too pedestrian to be useful. This is not the case, in my experience: both in that understanding git well will go way beyond the antiquarian and also in that lacking git skills are in fact a productivity drain at many or most companies.

[+] Algorithm courses at uni seem to gravitate towards what's neither interesting nor useful. Of course, go ahead and concentrate on C++ or Java if your interests require it (e.g. games programming). Otherwise, I suspect the best languages to master early are either Python or JavaScript. If you care about machine learning or science, master Python; if you care about web and app development or graphics, master JavaScript; otherwise pick the one you like better. Either will deliver much better bang for the buck in terms of skilling up and being able to do interesting things in a short amount of time than Java (which risks pulling you towards enterprise antipatterns unless you have good guidance) or C++ (which requires mastering an enormous amount of antiquarian knowledge to get anything done). I would maybe complement this with learning enough C to be comfortable with the heap, the stack, and pointers, and enough Rust, OCaml, or Haskell to have a glimpse at what a language with a proper type system looks like.


Nice explanation. MIT’s “The Missing Semester of Your CS Education” attempts to address some of your points https://missing.csail.mit.edu/


The course is decent but I really don't get how it's relevant here.


Heh, I was the "git guy" for a project, and it mostly made me want to try the alternatives like fossil and pijul...


> "Spring ... will probably harm ... your intellectual development."

I just started some client/contract work with Spring as the API/middle layer. I'm curious about your opinion on Spring, especially in this context of careers. A niche to be avoided?


It's paramount to keep #1 firmly in place. The fundamentals don't change but the implementations do constantly. If you focus only on the implementations you can very well end up a programmer with 20 years of 1 year experience.


Regarding specific picks, I would say things like git or SQL (if you want something "ageless") or docker, aws, react (if you want something trendy, but still proven) would be good choices.


Nice explanation.


He kind of explains it in the following tweets in the thread. I think he says, more or less:

"I'm usually advocating for undestanding deeply the problem and being an expert in the matter, not in the tools that solve it that come and go quickly. But in reality it's ok to be learn the tool, as it also brings you knowledge on the same matter, eventually. It may not be as comprehensive, but it's still desired. And people who understand tools that solve the problem are also needed and it may fill some real world needs."

I find it very relatable, as I'm kinda doing that at my work. I'm no expert in anything, but I solve problems that very skillful coders embarrass themselves at. Not that they are dumb. They are smarter than me, more experienced as well. But they often lack a wider view of the issue and don't take into account so many things: the OS underneath their code, the existence of the network, delays; they assume weird things about how other software works, and so on. They almost always neglect that people just lie sometimes, or will fight to the death to prove they tested correctly while I know they made trivial mistakes while testing. All people do. I just verify that once more. Sometimes coders ask me (after I help them with an issue) what apps I code at our company, while I don't code at all. It would take me a year to Hello World using company internal standards. I just troubleshoot a lot. I know the options. I read what you guys write here and try to understand what problems you have or don't have when using different approaches. So I can suggest new ways of debugging, solving issues, simplifying things.

I like my job. Except for the part when developers lie to me, but in general it's a cool job ;)


Interesting! I love debugging, writing (specs and docs), learning from others, teaching others (doing this part time rn), writing/tuning tools and libraries and refactoring. I sometimes wish I had more time for these things, I often have to overlook them in the face of time/budgets/features.


On one of my first programming jobs, they weren't quite ready for me, so they just told me to read all the manufacturer's documentation on the platform we were using. I managed to read most of it cover to cover. Within a month I was being recognized for doing things others who had been there a year weren't even aware were an option. I've always been the type to open up a new application and explore what every single menu option offers, as well as at least digesting the whole table of contents/index to the help file or documentation.


Probably not what he intended, but what I gained from his advice is that it's better to deeply understand a project than to just focus on building out features. I often fixate on the latter to the detriment of gaining understanding of the bigger picture. So whereas you could focus on one feature and get it done quicker, if you dive deep and understand more of the bigger system it's part of, you can not only go into features with better conceptual understanding, but you can even understand how changes will relate to other parts of the system and foresee potential consequences.

For any fan of Carmack, Doom (the game), or just video games in general, I recommend reading "Masters of Doom". Carmack's story honestly reads like a programming/general nerd rockstar in the book, and it's a lot of fun. Also helps me understand what kind of personality it takes to be considered one of the best ever at something: for much of his early life, Carmack was essentially awake just to program. His home was basically a mattress and a computer at one point, IIRC.


That was a fantastic read. It pretty much defined my standard for an engineering work ethic. There are other books he played a prominent role in that I enjoyed.

Dungeons and Dreamers was a side-by-side story and contrast between Carmack and Id's path vs Richard Garriott and Origin Systems.

The History of the Future features a much older John Carmack at the top of the industry, joining Palmer Luckey and a team of young entrepreneurs to build Oculus and VR. The entrepreneurial and technical ride is at least as wild as the one in Masters of Doom.


I believe he means that by better understanding the tools, one better understands the trade.


You said what I wanted to say but in 5% of my comment length. I would love to speak English that well.


True. Learning tools doesn’t have to be a negative thing.


With an added caveat that this is an indirect path to that knowledge, but perhaps a more job-compatible one.


It seems to tie in closely with the article from last week or so about "Blub Studies." Learn your tools, learn them well. Knowing the details of how real software actually works really compounds over time. And a huge amount of it transfers, because we just keep building newer and ever so slightly better mousetraps.

https://www.benkuhn.net/blub/


Here is my interpretation.

Carmack is saying learning something deeply, for example learning all features of a tool, helps in the long run, but in this approach it takes a while to provide value.

An alternate approach is to learn a subset that provides immediate value - in this case, learning only the subset of features of the tool that are being used at that time. That's more tractable and allows you to provide immediate value.

He is leaning towards the 2nd approach, because the first group of people tends to become opinionated without broad experience.

Elon Musk said something very similar. He said, let's say you need to use a wrench; if you start from the very basics it would be extraordinarily difficult to start working. Instead you can just start learning about the wrench and work backward from there. This way you are providing immediate value to the work at hand.


To me, it’s about being able to bring (direct) added value to a company. Focussing on specific tooling is an example of doing that. Learning deeply is important as well, but it usually brings value to the company in the long term (if at all).


> Can anyone elaborate on what Carmack is suggesting here? To specialise in some tool early in your career? Or something else?

I think he is overall suggesting that you should find your business value - your value as an employee to the business you work for. He seems to be specifically talking about "career advice", as advice for employment, so slightly outside of his own experience.

He is suggesting a shortcut to this business value is to know one tool deeply enough that you become the go-to person for that tool in your company (whether or not that is in your job description).


My take: learning the fundamentals of the web and programming is essential to become a good senior software engineering some day, but knowing React is what gets me hired at junior positions.


It's pretty interesting that there's so much variation in how we're understanding the advice.

I read this as John suggesting a stair step path to "learn deeply". If you can section off one defined area, a tool, it's possible to learn _that_ inside and out in a reasonable amount of time. Then you're useful and people want you around. From there you can expand outward or down or whatever.

This would be as an alternative to dabbling, learning the most commonly known things across a wide area.


"An investment in knowledge pays the best interest." - Ben Franklin


We understand things at a superficial level most of the time; we don't spend the time to reach mastery. Start as a generalist and learn as many things as you can and want to. Then pick one or two skills where you would like to go deep and gain mastery.


Interesting; my strategy has been almost the polar opposite of his ('learn widely', not 'deeply'), and it served me well so far. It's the whole discussion of being a specialist vs. a generalist, to some extent. Do you want to know stuff about a lot of things, or know a lot of things about some stuff?

TBH I also believe in knowing fairly deeply a few areas (you have to have _some_ depth too, not just breadth); but I suspect Carmack is not exactly a narrow-field specialist, with him founding Armadillo Aerospace and all... his knowledge might very well be both wider & deeper than mine :)


> Interesting; my strategy has been almost the polar opposite of his ('learn widely', not 'deeply'), and it served me well so far.

This whole kind of advice is not very useful in practice. Copying a recipe for success from a person who is wildly different in personality from you will not lead to the same outcomes.

If you've read Masters of Doom, it is evident that Carmack is not neurotypical. Few people have the ability to concentrate like him. There is no point in trying to become like him; it will only make you miserable. Don't fall for post-hoc rationalisation: he naturally gravitated towards this way of working and became successful. 200 years ago he might have been a scholarly monk, or a tinkering watchmaker.

You have to choose your own path. That doesn't mean you cannot learn from the mistakes of others, but what you learn will probably be more about the how and less about the what.


My number one tip for this strategy is: good generalists need deep knowledge of the fundamentals which apply to their general field.

If you know the fundamentals and the theory of programming/CS well enough, you can contribute to almost all kinds of programming projects (given some time to acclimate). Some examples of what I mean by fundamentals:

- data structures;

- computer architecture (no need to design your own hardware, but you should know how all the components interact);

- basics of complexity theory;

- at least one scripting language;

- knowing how to write low-level programs in at least two operating systems. For example, write a simple HTTP proxy in C by using just the C standard library and BSD sockets. If you don't want to use C, use another compiled language but avoid any fancy libraries for this kind of exercise.
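
A minimal sketch of that exercise (my own, not from the comment above): a single-connection TCP relay that listens on a port and shuttles bytes to and from a fixed upstream. LISTEN_PORT and the upstream address are made-up constants, error handling is deliberately terse, and a real HTTP proxy would also parse the request line/Host header and handle concurrent clients:

    /* Toy single-connection TCP relay using only BSD sockets. */
    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <stdio.h>
    #include <sys/select.h>
    #include <sys/socket.h>
    #include <sys/types.h>
    #include <unistd.h>

    #define LISTEN_PORT   8080        /* hypothetical */
    #define UPSTREAM_IP   "127.0.0.1" /* hypothetical */
    #define UPSTREAM_PORT 8000        /* hypothetical */

    int main(void) {
        int one = 1;
        int srv = socket(AF_INET, SOCK_STREAM, 0);
        setsockopt(srv, SOL_SOCKET, SO_REUSEADDR, &one, sizeof one);

        struct sockaddr_in addr = {0};
        addr.sin_family = AF_INET;
        addr.sin_addr.s_addr = htonl(INADDR_ANY);
        addr.sin_port = htons(LISTEN_PORT);
        if (bind(srv, (struct sockaddr *)&addr, sizeof addr) < 0 || listen(srv, 1) < 0) {
            perror("bind/listen");
            return 1;
        }

        int client = accept(srv, NULL, NULL);   /* handle one connection, then exit */

        struct sockaddr_in up = {0};
        up.sin_family = AF_INET;
        up.sin_port = htons(UPSTREAM_PORT);
        inet_pton(AF_INET, UPSTREAM_IP, &up.sin_addr);
        int upstream = socket(AF_INET, SOCK_STREAM, 0);
        if (connect(upstream, (struct sockaddr *)&up, sizeof up) < 0) {
            perror("connect");
            return 1;
        }

        char buf[4096];
        for (;;) {
            fd_set fds;
            FD_ZERO(&fds);
            FD_SET(client, &fds);
            FD_SET(upstream, &fds);
            int maxfd = (client > upstream ? client : upstream) + 1;
            if (select(maxfd, &fds, NULL, NULL, NULL) < 0) break;

            /* copy whatever arrived on one side over to the other side */
            int from = FD_ISSET(client, &fds) ? client : upstream;
            int to   = (from == client) ? upstream : client;
            ssize_t n = read(from, buf, sizeof buf);
            if (n <= 0) break;                  /* EOF or error: stop relaying */
            if (write(to, buf, (size_t)n) != n) break;
        }

        close(client);
        close(upstream);
        close(srv);
        return 0;
    }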

If you're good at all of the above you'll never run out of companies that want to hire you. You won't know everything, but you'll be able to learn most things quickly enough.


I agree with all the above, except:

> If you're good at all of the above you'll never run out of companies that want to hire you.

If you're good at all the above, you'll be hit with whiteboard monkeydance and the interviewer will walk away convinced you don't know how to write an if statement.


> TBH I also believe in knowing fairly deeply a few areas (you have to have _some_ depth too, not just breadth);

There's a specific term for this: it's called "being T-shaped". You have breadth at the top and then depth in one subject area.

I think the ideal balance probably varies from person to person. Some people really enjoy going deep on a subject (these are the kinds of people who get PhDs). I personally can't keep my interest on one thing for too long, so I end up with a lot of breadth, but I make sure there are a few subjects I revisit a lot so that I manage to build some depth.


To me it makes more sense to learn widely early on as a junior. You’ll get a sense of what you enjoy and what you’re good at over the years. Then you can focus on what you like as you become senior and have a foundation.


Depends on the needs of the team too. In most of my roles, having good breadth has been more important: to keep an eye on the big picture, to be able to contribute in various areas, and to tell when the juniors are doing something unwise.


Indeed. The most senior engineer on a team typically needs to have sufficient broad knowledge to understand (in detail) what everyone else on the team is (or should be) doing.

That said, Carmack clearly has wide-ranging enough knowledge to run rings around entire engineering teams - despite his preference for learning in depth.


Slightly unrelated: I love that Carmack uses backslash (\) to escape 'EOL' and signify that this will be a thread and not a standalone tweet. Much more programmerly than "1/n" and "Thread".


> Much more programmerly than "1/n"

On the other hand, null-terminated strings are avoided by most modern programming languages, which tend to explicitly store the string's length.


Twitter uses a static buffer size of 280 characters to avoid this problem.


I meant the scheme used to indicate it's part of a multi-part message.

the 1/n scheme indicates the length of the sequence upfront, whereas the backslash scheme doesn't.


> the 1/n scheme indicates the length of the sequence upfront, whereas the backslash scheme doesn't.

I've seen a variation of the 1/n scheme that literally uses "1/n", "2/n", etc. to avoid specifying the length up front. Usually the end of the sequence isn't indicated in that case.


Anyone who uses Twitter to post threads / long-form content is fundamentally misusing the platform - they should write a blog post (they’re free at Kmart) and link to it in a tweet. I don’t read Twitter threads because of this - how can one trust the judgment and opinion of someone who misunderstands the medium they are using to such an extent?


Doesn't the fact that it's John Carmack doing this somewhat disprove your theory?

There's something to be said for writing where your audience is. Carmack isn't a blogger and doesn't often write longer posts.


Tweet threads are normalised at this point, with support from Twitter as well (using the + button). Besides, people are more likely to read a thread than to click through to your blog post.


I have to disagree. More often than not, when I send someone a link to a thread, Twitter decides to show only the first post or few. There's a link to see the rest of the thread but there is zero indication as to whether this link will reveal 1 or 2 or 1,000 further posts. I forget the exact text, but it's something small like "See replies" or similar. A plain link, no icon. For someone unfamiliar with Twitter, this is completely undiscoverable and it has caused me to no longer send Twitter threads to people I know that aren't on Twitter because of the confusion it causes.

For me, I have no problem reading threads on Twitter. I look forward to the daily one from Foone. But if you put someone who hasn't used Twitter much before in front of this interface it becomes apparent very quickly how poor the UX is for threads.

Right now, we're discussing the various symbols people put in their posts (which are severely space-limited!) to indicate that there is a thread. Why is this necessary?


> For someone unfamiliar with Twitter, this is completely undiscoverable and it has caused me to no longer send Twitter threads to people I know that aren't on Twitter because of the confusion it causes.

For sharing purposes, you might try using one of the "thread unroll" services to get around this issue.


Indeed. What kind of person would use an environment in a way it wasn't intended to be used, especially for their own intellectual curiosity?

There should be a word for people like that...


I think it is a different use, not a "misuse". I am OK with it.


Fewer people are going to click a blog link; that's just facts.


Interesting. It's programmerly, but at the same time I can see it catching on with non-devs the way @ and # did.


FWIW I started using \ after noticing that Carmack did. https://twitter.com/theshawwn/status/1272706848957722624

It's neat how fast a good convention spreads.


To be honest, getting career advice from John Carmack is just like getting career advice from a lotto winner: not everyone has the luck/genius.


On the surface your comment makes sense, but if you read his Twitter thread, he's not saying "do what I did".

He's actually giving decent advice, which is: make yourself a source of knowledge on something within your chosen field.

Most people who are just starting out are not going to create the next amazing thing, but it's within reach for them to become a subject-matter-expert (or at least start that journey) so that they can use that knowledge to help them make things (and in the process support their peers and benefit from knowledge sharing).


Yeah, actually what he's saying is applicable at a wide range of IQ levels etc. Even if you're not a genius you sure can be the most knowledgeable person in town for x. It's just that x can't be too IQ-demanding. There are tons of people having good careers being this person in, for example, infrastructure construction (the physical version).


I did read that. Again, not everybody can follow it; by that I mean what he wrote in that thread, not what he did.


I had an Aural Skills (pitch training) professor in college who had perfect pitch. At first I felt kind of cheated, "how can you help me if you didn't have to go through this yourself?" But in a manner of speaking the professor still had to go through it themself and hone their skills, and their breadth of knowledge was much wider than their own personal experience in music training. This professor had helped countless students master pitch, and they could see much further across the landscape of music study than I could.


I strongly disagree with this sentiment. You'd be surprised how effective people that are "smart enough" can be. I've become less and less impressed with people's innate "genius" and more impressed with their work ethic and focus.

I've wasted so much time in my life trying to learn things top down... that is, learn bits and pieces of something to complete a task without fundamental knowledge of the thing I'm working on. I've noticed "geniuses" never work this way. They insist on knowing their fundamentals first and work up from there.

A genius' mental framework for problem solving is so well developed that it just seems like they won DNA lottery. Carmack is incredible at what he does, but there are likely many intellectual things that he is terrible at because he has no mental framework for solving those types of problems. It ultimately would take him thousands of hours to be great at those things like the rest of us plebs.


Depends on what part you listen to. Carmack's work ethic is legendary, and his ability to output high volumes of exceptional work is something a lot of people can aspire to, especially startup entrepreneurs. I certainly have taken those words to heart in my business.


One source for his legendary productivity: http://bookofhook.blogspot.com/2013/03/smart-guy-productivit...


> getting career advice from John Carmack is just like getting career advice from a lotto winner, not everyone has the luck/genius

To a large extent, career success is the result of a combination of hard work and luck. The more you have of one, the less you have to rely on the other.


I believe his advice is based on what he has seen in his colleagues and in jobseekers as much as on what he himself has experienced, if not more so.


My advice:

1. Make sure you do what you love to do.

2. Make sure you constantly improve and try to do as good a job as possible.

3. Don't get distracted.

In almost any field, there is always a need for people who have deep expertise and can do an excellent job. It doesn't matter if you are a developer or a lawyer or a salesperson or an actor. If you do a world-class job, your prospects are good.

It is hard enough to do something for entire lifetime. It is harder to do something you don't love.

It is hard to find strength to constantly improve in a field you just don't feel particularly interested in.


It's really good advice. My career has very little in common with Carmack's, but I think I achieved success with the same mentality.

I have non-techie friends who have toyed with learning to code, but they usually get discouraged after a few weeks. I try to encourage and mentor, but it rarely pans out. There's a huge disconnect between what people are taught and how career progression works in the modern world. It's enough of a gap that I struggle to connect "Eloquent JavaScript" or a similar 101 guide with actually getting someone a job.

I'm certain there's a framework out there that could help people learn and make a living in an emerging industry (outside of "work your ass off and don't get exploited"), but there's also so much garbage and bad guidance muddying these waters.


My problem is that what I love to do changes a lot, so I get burnt out and/or bored with a thing before I've mastered it (and certainly before I've spent a full year at a company doing it day in and day out).

Also, most of the things I genuinely love (like reading whatever strikes my fancy at the moment for pleasure) aren't things anyone will pay me to do.


John Carmack's path is highly unusual. I highly regard the person, but I would not take his experience as career advice.


Ah, the irony of career advice: we only care to hear advice from successful people, but the successful people are exceptions.

This is why I focus on generalized principles in my writing. All personal success is anecdotal, but we should not let that stop us from trying to discover generalizable truths by induction and deduction.


But his "highly unusual" success is lighting that strikes 5 times in a row. Maybe he's onto something. His lighting strikes consistently in completely different domains too.


Lightning strikes aren’t evenly distributed on the globe. Carmack is a skyscraper, the average human is a normal tree.


There seem to be a few comments on survivorship bias and luck. I want to address those. While, yes, it takes a lot of luck to be at the pinnacle of success, their advice shouldn't be discounted. No one lucks their way to the top of the mountain. You can't control the weather, but you still have to climb the damn mountain. It's a combination of wisdom, hard work, and luck. A lottery winner brings nothing to the table; most of them actually go broke again. It's hard to reach success and stay successful if you aren't decently competent.


I think of it like this: in the short term, luck might be sufficient to put you at the top of your field. Over the long term, luck is necessary but not sufficient.


Learn to understand all components around you.

When you actually understand and question everything, you understand why things are the way they are, and then you are able to just fix the critical things because you actually understand the system.

After a while the technical deep dives will not matter that much anymore, as things become systems; you become a systems expert and only need to look at technical details here and there.


Tools change, fundamentals don't. I disagree that you should modify what you learn in order to start contributing to a team as fast as possible. I think the advice he gave was bad.


The problem is that far fewer people get hired to work on the fundamentals than to operate the current industry-standard tools. Also many of the jobs that require fundamental skills seem to require steep academic credentials (e.g. a PhD from a big name university) that are sometimes more difficult to acquire than knowledge of the fundamentals itself.


I think people misunderstand the advantages of understanding fundamentals. Yes, very few are inventing new tools or protocols, or reimplementing known data structures or hash functions. But the point of mastering the fundamentals is to transform the way your brain thinks when it approaches all sorts of computing problems. It's a framework that allows you to think deeply about the correctness and speed of what you are trying to build, backed by decades of research.


Unfortunately, the optimal thing to get your career started is not deep study of fundamentals but "grinding leetcode".


Fluency in designing algorithms is a significant part of the fundamentals, though; if you can't do leetcode problems, then you aren't fluent with algorithms. Practicing them doesn't necessarily make you fluent, but being fluent means you can do them.

Edit: Since we are talking about Carmack here, you probably know he is, among other things, famous for designing custom algorithms for the games he made. So when he talks about deep understanding, it includes deep understanding of algorithms: how they work and how to make new ones. If you have that down well, then leetcode isn't hard at all.


That depends on what kind of a career you are working towards.


Of course fundamentals are invaluable. But a stable career is equally if not more important to most people, I think.


I don't think Carmack has ever been "hired" for a job since he was a teenager.

So you're right, he's talking about "this makes you desirable in my eyes", not "practical employability through the lens of HR".


I think his point was that good tools are built with deep fundamentals.

* If you've mastered git commands, you haven't followed Carmack's advice.

* If you've used git as a domain to understand key-value stores, Merkle trees, hash functions, and network theory, you have followed Carmack's advice (a concrete sketch of this follows below).

Or another example:

* If you've mastered using a gaming library to move characters around and memorized APIs, you haven't followed Carmack's advice.

* If you've mastered a game library to understand 3D transformation matrices, GPGPU, and physics simulation, you have.

A tool can point you to an aligned, coherent set of fundamentals which also work together to make a package useful for employment.
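
To make the git example concrete: much of what that bullet points at is content-addressed storage, where an object's ID is just the SHA-1 of a typed, length-prefixed payload ("blob <size>\0<contents>"). Here is a small sketch of that idea, assuming OpenSSL for the hash (link with -lcrypto); for the same bytes, the printed digest should match what git hash-object reports:

    /* Compute a git blob ID by hand: SHA-1 over "blob <size>\0<contents>". */
    #include <openssl/sha.h>   /* SHA1(), SHA_DIGEST_LENGTH; link with -lcrypto */
    #include <stdio.h>
    #include <string.h>

    int main(void) {
        const char *contents = "hello\n";      /* stand-in for a file's bytes */
        size_t len = strlen(contents);

        /* Build the object exactly as git does: header, NUL byte, then payload.
           (Assumes the whole object fits in the buffer -- fine for this demo.) */
        unsigned char obj[4096];
        int header_len = snprintf((char *)obj, sizeof obj, "blob %zu", len) + 1; /* +1 keeps the '\0' */
        memcpy(obj + header_len, contents, len);

        unsigned char digest[SHA_DIGEST_LENGTH];
        SHA1(obj, header_len + len, digest);

        for (int i = 0; i < SHA_DIGEST_LENGTH; i++)
            printf("%02x", digest[i]);
        printf("\n");   /* should match: printf 'hello\n' | git hash-object --stdin */
        return 0;
    }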


The only constant is change, which is why you always need to modify what you are learning and keep learning and growing. I've seen far too many aspiring developers being held back because they stubbornly held on to fundamentals. Companies don't care about "global variables" or technical debt, they have to make a profit, look after employees, survive pandemics, maintain ISO accreditation etc.


Thing is, when you master a tool deeply you also get the fundamentals. Let's use a deep learning example: when you master something like pytorch deeply, you also come to understand fundamentals like how automatic differentiation works, etc.
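
As a toy illustration of what "how automatic differentiation works" means at the fundamentals level, here is a forward-mode sketch with dual numbers in C. This is not how pytorch does it (pytorch records a graph and runs reverse mode), but it shows the core idea that derivatives propagate through arithmetic by fixed rules:

    /* Forward-mode automatic differentiation with dual numbers (build with -lm). */
    #include <math.h>
    #include <stdio.h>

    typedef struct { double val; double dot; } Dual;   /* value and d(value)/dx */

    static Dual dual_add(Dual a, Dual b) {
        return (Dual){ a.val + b.val, a.dot + b.dot };
    }
    static Dual dual_mul(Dual a, Dual b) {             /* product rule */
        return (Dual){ a.val * b.val, a.dot * b.val + a.val * b.dot };
    }
    static Dual dual_sin(Dual a) {                     /* chain rule */
        return (Dual){ sin(a.val), cos(a.val) * a.dot };
    }

    int main(void) {
        Dual x = { 2.0, 1.0 };                          /* seed dx/dx = 1 */
        Dual y = dual_add(dual_mul(x, x), dual_sin(x)); /* y = x^2 + sin(x) */
        /* Expect dy/dx = 2x + cos(x) = 4 + cos(2) ~= 3.584 at x = 2 */
        printf("y = %f, dy/dx = %f\n", y.val, y.dot);
        return 0;
    }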



I prefer Richard Hamming’s series:

https://www.youtube.com/watch?v=e3msMuwqp-o&list=PLctkxgWNSR...

I have given a talk with this title many times, and it turns out from discussions after the talk I could have just as well have called it "You and Your Engineering Career," or even "You and Your Career." But I left the word "Research" in the title because that is what I have most studied.


+1, one of many transcripts of a version of this talk: https://wcarss.ca/reading/you_and_your_research


Imo the best way is the painter's approach, starting with the broadest strokes. For software this means tool- and platform-independent knowledge that can translate to any tool first. Being a specialist without getting the big picture seems very Fordist and dystopian, gulag shit imo.


I’d restate this advice as: “be a specialist in several areas, starting with one. Don’t be a generalist; while that is sometimes valuable, it’s not always.”

It is good advice. I’m doing pretty well in my career at 43 as an individual contributor. Next year I’ll be earning $500k easily (and I don’t live in NY or SF). I started as a co-op student out of university, which helped me focus on individual specialties. One of my very first jobs was as a database design analyst. I wanted to be a developer, but I hit it off with this hiring manager in the interview.

When I started I had no idea about entity-relationship design, normalization, the great relational vs networked vs hierarchical database wars, data warehousing vs operational data stores, star schemas, reporting vs OLAP cubes, ETL tools, etc. I got very good at understanding a few of these tools [edit: specifically DataMirror, Informatica, Cognos, and Oracle RDBMS] very deeply, and became something of a technical expert on the topic of data warehousing, eventually meeting the pioneers of the field (Inmon, Kimball) and eventually working for one of them (Terdeman). This was over the course of 5 years, with a sidestep into coding a large scientific test and measurement system on a GemStone object database platform (a new area that benefited from my prior specialty). It was all a mix of serendipity and ability to focus.

My career moved away from data/analytics after that into middleware and REST architecture, and now cloud computing, and Kubernetes.

Tl;dr:

Knowing a few tools or subject areas super deeply comes across as having super powers. It really is about focus and curiosity.


Thanks for sharing, it's so hard to find data on this!

If you don't mind sharing more, is that working for a single US company or is it contracting / your own company?

I'm at 1/3 of your income but I find it increasingly hard to grow from here. To be honest I wouldn't mind being an individual contributor in the future, but I don't see my income rising much more than that (I'm in Europe and I literally don't know anyone making close to half a million as an individual contributor).

I guess FAANGs (and stock options) would be an option but my friends at Google / Apple / FB are not very happy. Mostly dull jobs unless you're in some lucky team, less flexibility (at least pre covid, now everyone is remote anyway), more politics, similar cash compensation but very valuable stocks (which is how you make money). Plus throwing away two months of my life to prepare for a dumb interview.

Instead I'm looking into kissing my IC career goodbye, starting a small tech business, and trying to grow that.


This is working as an employee at a single US company (I live in Canada). It was total compensation (salary, bonus, annual stock grants). I grew from 1/3 of this 13 years ago though I changed employers several times. Eventually the small raises / bonuses / stock grants kept piling up.

During this period for about 2 years I was an independent contractor and was at around $250k annually (could be higher, but you don’t get paid for vacation, conferences or marketing/sales efforts).

Building a small lifestyle business is another pathway to making a lot of money, though it too tends to take a few years. I have friends who have built niche industry software apps that give them about $1M-2M a year. Enough to hire staff to keep things running.


His point is about tools OVER areas: it's good to specialize in an area, but often that involves building your own tools from the ground up to really understand something completely. Specializing in tools, or as he calls it being a "tools master" at a company, provides more immediate business value because the tool probably has lesser-known features that exist for a very good reason.


Yes. And sorry if that wasn’t clear: I learned the areas as a result of prioritizing learning a tool very deeply, and then expanding from there.

You start with a tool and then build experience in the area so you can eventually make evaluative judgements.


Perhaps I’m missing something, but he moves past “learn deeply” as it doesn’t help with breaking in, but then offers “tool master”, and a tool master seems to me like someone that has done a fair bit of learning deeply for a specific thing.


well, I think "learn deeply" there means really learning from the ground up, while you seem to associate it with investing time (which you need to do anyways). I think he argues that for employment in IT it might be better to invest time in learning build systems or webframeworks or firewalls, than learning from the ground up (starting with C/TCP-IP). Bad analogy: if you build a house, you study engineering (or carrying stuff), rather than theoretical mechanics. Once you are involved in the business, the deeper levels will inevitable show themselves at some point.

I think this has its merits (and actually I think it suits me), but without any guidance your measurable and marketable progress is slow. For example, I build my scientific software stacks with Spack, debugging build failures. The net result is that I got into a random project of a colleague and fixed his C++ Makefile without having more than basic C knowledge.


Learn deeply basically means being able to make amazing tools from the bottom to the top. Be an expert in the tool means being able to make amazing things with it without necessarily being able to make the tool yourself. So low level versus high level.

For example, the majority of web devs probably couldn't write V8, but they can make JS sing.


Do whatever it takes to get your foot in the door somewhere, because it's impossible to know what works otherwise. Internships are a great hope.

From there, it's a totally new story/path.

It helps a little bit if you like what you do, but you don't need to 'love' it; if you do, you may get taken advantage of (e.g. in gaming).

Once on the inside, it should become clear which folks are valued, and some of them will be in positions accessible to you with a degree of focus.

Tech is a pretty good industry for exposing talent, almost like no other. Usually when people are good or their knowledge shines ... it stands out. At least more than in almost every other industry.


Could this all be summed up to "know your fundamentals"? I get the nuance to the rest of what he says, but I think it still falls under the "know your fundamentals" umbrella.


I think it's more like "learn a useful skill while you're learning the fundamentals, so you can contribute value in the meantime"


Career advice for Carmack: don't work for Facebook. You are much better than that. And Valve does better VR anyway.


The same thing Chris Hawks talks about on his YouTube channel: just build projects.


Generally I think advice from people who are far outliers in success is probably useless for normal to above average folks. It's like asking Lebron James how to succeed at basketball. Well first you need to be born a genetic freak with inhuman hand eye coordination and grow up to be 6'9" tall.


[flagged]


Hello! Feel free to DM me on twitter to talk. (I'm @theshawwn)

I noticed some of your other comments, like https://news.ycombinator.com/item?id=23737849. If you're 20-something, just remember that life is pretty much supposed to feel listless. No one really knows what they're doing until they figure out what they want. And figuring out what you want is often very hard.


Very kind of you to offer help where most of us downvote. This person does seem like they need someone to talk to.


I related with "my only friend is a few thousand miles away. We watch movies together and talk via discord." It's normal to end up in that situation. Or at least, I've been there.

I hope they'll take me up on the offer, since it's interesting to get to know people. I'd reach out to them directly, but there's no contact info in their profile.


Unfortunately they never reached out. But, wherever you are, I hope you're having a Merry Christmas, daodedickinson. Everyone should!

Don't worry too much about stuff. Things will get better. (Prozac also helped me.)


Agreed with sillysaurusx. Hang in there buddy. These times are particularly difficult for everyone (and of course some more than others!), and just know that the pain you're feeling does not have to be your pain. I think of it as the pain... of the human condition which we all share together. It makes it feel a lot less lonely. Brighter days will come, and you will be all the stronger when you get there.


50% connections, 30% luck, 20% hard work.

The best solution is not to reproduce until you can help your children with connections.


Survivorship bias. Not everyone has the opportunity to be a programming geek at the beginning of a new era of PC gaming.


It's always good to be aware of biases in people's viewpoints when considering their advice, but I'm not sure that's applicable in this case, because he's essentially advocating that people not do what he did.

It's also, as far as these things go, pretty mediocre and obvious advice. The majority of the games industry today is focused on using and customizing a reasonably small set of tools rather than writing their own.

Further, engines like Unity and Unreal actually have pretty big penetration outside of games, making them even more valuable skills to have.


Is it mediocre if it's true? Sometimes good advice is just truth stated simply.


Calling it mediocre is just calling it common or average and lacking in special insight, not that it’s false. It’s good advice that anyone in the industry would give to people trying to break in.


Not to get too pedantic, but I don't think something can be both "good" and "mediocre" (both descriptors you just used). But I'm not gonna get into an extended discussion over word choice :)


Heh, contextually you sure can. ;)


Have you read the whole thing? Because he actually refers to approaches very different from his own and says they're perfectly fine! He describes a modern way of creating software, with lots of tools, and says it's OK NOT to work the way he works.


New era of PC hardware and adoption, maybe. Carmack arguably created that new era of gaming by catalysing the rise of the FPS.



