Hacker News
The Eternal Novice Trap (feoh.org)
190 points by signa11 on Dec 29, 2019 | 73 comments


I know intelligent people who fell into this trap and can't get out.

I know a guy who is seeking enlightenment by writing web frameworks. He churns out new frameworks at an ever-increasing pace. Unfortunately, he never gets around to supporting what he has created, so he never learns how to make something that actually makes life easier in the long run. His frameworks are pure zen, but only to him. The rest of the team while away their days trying to convince management that its high regard for the guy's capabilities is completely misplaced.

I know a bunch of people who spend their entire time learning new technologies: new frameworks, languages, patterns. The learning is typically shallow, as it ends the moment the person knows how to do something with it, or the moment they find some kind of huge disadvantage. Then they switch to another language/framework/library/tool/paradigm, and the eternal cycle repeats.

Stick with a project/problem until completion. Don't be satisfied with the first 80% of the task, nor with the second 80%. You need to go all the way. Try to learn everything there is to learn about it. Try to figure out all the different ways to improve it, and then try to figure out why those might not be the best ideas. Go discuss it with your team and your architect. Listen to their problems, then figure out how your ideas do or do not help solve them. Understand why.

Try to make things so simple everybody can understand them. Try to be able to explain every decision and how those decisions interact with each other.


This is great advice, and it would be even better if our industry were not so hype-driven. Tell someone you are a JavaScript expert whose great past successes were with jQuery and Knockout.js: people will reject you, and you will have no job and no money.

And if you are one of those poor fools who depend on work for income, then good luck not at least partially following the trends, silly or not.


> I know intelligent people who fell into this trap and can't get out.

> Stick with a project/problem until completion.

It's not about intelligence. Sticking is extremely hard for ADHD people.

People with ADHD are mostly learning and working on new things eternally. Keeping that up makes them look like smart people or fast learners. However, if they kept focusing on one project without trying out new things, they would burn out very fast.


Amen!


FWIW this is my post. I didn't post it here because I'm not really involved with this community very much.

Any feedback would be very much appreciated.


It's well written and pretty much spot on. The difficult thing with advice is that one first needs to figure out where one is in the spectrum before determining whether it's applicable - i.e. overweight people could benefit from eating less, underweight people could benefit from eating more.


Great article. Gives one much food for thought.


To avoid the "eternal novice" trap, learn computer science fundamentals. If you can, choose projects and jobs which don't force you to burn time learning superficial material which will become obsolete in a few years.

I agree with this post (except for swearing allegiance to any specific programming language): learning programming languages is a poor way to prioritize. Learning algorithms, or math, or operating systems, or compilers, is more effective — and will make it easier to pick up a new language when you need to.


That won't teach you how to build maintainable software in the long run, though. It will give you a lot of groundwork, but one of the major issues I see in eternal novices is that they have never had to maintain software beyond 2-3 years. Maintaining software in the long run demands a certain discipline so it doesn't bite you in the ass, and many developers have simply never had to deal with something like that. Even the ones styled with "senior" titles.


I'm just learning this now. I have built lots of stuff over the years but I have always had to balance my time between actually coding, managing the coders, and running the business side of things.

Now, over the last few years we have actually experienced a lot of growth and our code has started to require some scale and I have had time to focus a bit more on the code.

I feel like I have learned more in the last 2 years than the 20 before that.

I've had to focus so much more on actually architecting the system, thinking about how the data is stored in the DB, creating a smooth developer experience, creating maintainable code, and actually tracking and logging tons of detail about the app so we can find and diagnose bottlenecks before they become a problem.

So in my case, it hasn't been so much about having to maintain it for more than a few years (I have projects I have maintained for 10 years now), but the fact that this one is experiencing some real scale has really laid bare my bad (and good) decisions and forced me to learn about and start making better more informed decisions.


A lot of it is just encountering new challenges you haven't faced before.

Learning to scale an application was certainly one for me. Another was writing a webapp where accessing the primary datastore was a 50ms round trip. Another was writing warehouse software that had to interface with hardware. And another was building a somewhat complex multi-system application that ran an ecommerce business.


Most days, I feel like programming is more like bookkeeping than anything else. Especially in legacy code. Managing complexity and maintaining consistency get hard.


This. I'm an electrical engineer and took the time to learn many of the fundamentals in my particular corner of the field. I constantly recommend others to do the same as it means you can do any analysis or study far easier. It's not just learning existing engineering software packages either. Once you know a little programming, Excel, or SQL, you can even build your own tools from scratch when the vendor applications don't do what you need. I've often found that my company has done something one way for years that is inefficient, slow, and less accurate than a process I can setup in a day or two. This all comes from the fundamentals. Learning these fundamentals isn't easy though.


"eternal novice" isn't a skill gap, it's a personality type, with shades of ADHD.


Yes and no. I'm very glad that I followed the Pragmatic Programmer approach early on in my career. It's partly why I learned Ruby back in 2003, and was exposed to Lua and Haskell and others around the same time (and built pretty neat production software in all of the above languages).

Over time it becomes less necessary to deep dive on new languages, but that doesn't mean you should altogether stop, it just means that you're more of an expert now and learning a new language to the point of mastery that you are used to in your day to day work is a lot bigger task.

What the author describes is the dilemma with any kind of knowledge acquisition. You always need to be learning more, but you have to balance that against being productive with what you actually know. As an expert or senior or whatever, I have a pretty good understanding of what the knowledge landscape looks like. As a junior there's a lot more utility in doing a broad survey before diving too deep.


I think this sort of misses the point of the Pragmatic Programmer’s advice: you should learn a new language every year for exposure to the ideas that language has (e.g. Prolog to get a sense of how logic programming works, J for array programming, etc.). This doesn’t necessarily mean learning the language well enough to write production code and it certainly doesn’t mean switch your primary go-to language every year.


Indeed. This can be extended to other things than just programming languages. Learn what a probability space is, code up a simple neural network, get an Arduino and try to connect it to a bunch of sensors and an LCD display, etc. This will not make you a mathematician, or a deep learning expert, or an embedded engineer, but is a little mind expanding.
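The "code up a simple neural network" suggestion above is smaller than it sounds. As a rough illustration (everything here is my own sketch, not from any library): a 2-4-1 network with sigmoid activations, trained by plain online backpropagation on the XOR truth table, fits in a few dozen lines of stdlib Python.

```python
import math
import random

random.seed(1)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

HIDDEN = 4
# Each hidden neuron: 2 input weights + 1 bias. Output: HIDDEN weights + 1 bias.
w_h = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(HIDDEN)]
w_o = [random.uniform(-1, 1) for _ in range(HIDDEN + 1)]

data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]  # XOR truth table

def forward(x):
    h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w_h]
    o = sigmoid(sum(w_o[i] * h[i] for i in range(HIDDEN)) + w_o[HIDDEN])
    return h, o

def loss():
    return sum((forward(x)[1] - y) ** 2 for x, y in data) / len(data)

initial_loss = loss()
lr = 0.5
for _ in range(5000):
    for x, y in data:
        h, o = forward(x)
        # Output delta: squared-error gradient times sigmoid derivative.
        d_o = (o - y) * o * (1 - o)
        # Hidden deltas, computed before the output weights are touched.
        d_h = [d_o * w_o[i] * h[i] * (1 - h[i]) for i in range(HIDDEN)]
        for i in range(HIDDEN):
            w_o[i] -= lr * d_o * h[i]
        w_o[HIDDEN] -= lr * d_o
        for i in range(HIDDEN):
            w_h[i][0] -= lr * d_h[i] * x[0]
            w_h[i][1] -= lr * d_h[i] * x[1]
            w_h[i][2] -= lr * d_h[i]
final_loss = loss()
```

It won't make you a deep learning expert, as the parent says, but writing the backward pass by hand once demystifies what the frameworks automate.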


> you should learn a new language every year for exposure to the ideas that language has (e.g. Prolog to get a sense of how logic programming works, J for array programming, etc.)

But why? what does that give you?


I can only assume this is revealed in the process or not at all.


I've been a musician for far longer than I've been a programmer. During the first 3-5 years of playing, I'd play for many, many hours. At my peak, I'd practice for up to 8-10 hours a day. Before school, during school, after school. I had my trusty one guitar, which worked for everything I needed.

Then I started earning money and began purchasing stuff. This avalanched into a full-blown addiction of constantly chasing and trying out new gear. My playing suffered, my practice routines died, my creativity went to shit.

Even though I was still quite skillful, I found myself playing/noodling the same stuff, caring about pretty much everything BUT the actual playing/music.

I'm now back to owning a couple of guitars, and focusing on what's important. Writing music, getting ahead, and improving.

And for some reason, that mirrors my experience with programming, too.

I'm not sure, maybe some people are just prone to that kind of stuff? Getting caught up with the superficial things.


Well I have an anecdote that works as a parallel:

During my first few years of surfing, I would try (and buy) lots of different surfboards. Also happened to me that as I earned money, I spent more and more on surfboards.

I moved countries a few years ago and sold all my boards back at home, and just got two boards here (one for small waves, one for larger waves). That turned out to be so much better for me. I don't spend so much time looking at every detail of the surfboard, instead I just go surf. I try new things on the surfboards I have, and it tends to be about techniques and less so about the boards themselves. Occasionally I will try my mate's surfboards and I think that they're nice, but not enough to justify spending more money on another board.

I don't believe it's 100% about proneness to get caught up in that kind of stuff. I guess for me it was more just trying to get the most out of the experience. Ironically enough I wasn't surfing as much before moving to a different country, whereas where I currently live I surf 2-3 times a week. Maybe there's some parallels there? As in the more you actually practice X, the more you're likely to find the subset of tools that actually work for you?

That said, I do still believe (and agree with the link) in a few times a year testing out new languages, frameworks, etc, so see what the fuss is about, and in the process learn new things that I might bring back to my current project or toolset.


It could be more general, that humans just get caught up in what feels good.


Tangentially related is the idea that we should not bite off any more learning curve than absolutely necessary during a project.

The technical risk of integrating some fresh technology has bigger, sharper teeth than most risk predators seeking to devour our success.


It’s done probably because that’s how the authors are making time to learn new things.


For a long time it was fashionable to write server side java using POJOs and Spring running in Tomcat or Jetty, and you could get very far just being very good at Java and Spring. You still can.

With the advent and surging popularity of multiple server-side alternatives to Java such as Javascript via Node.js, Python, Go, and others, Java is now viewed as "slow to startup" in comparison and therefore less suitable for cloud-scale deployments that must spin-up instances very quickly to handle the massive bursts of traffic that cloud-scale platforms have been designed to service.

I'm curious what people think is the best alternative to Java for cloud-scale deployments. Javascript - ala Node.js? Python? Go? Something else entirely?


I believe Go is eating Java currently.

If you look at companies they're all quite different though. Apple uses a lot of languages. Google uses a lot of C++. Facebook is a combination of things, but AFAIK their main platform is still written in Hack. Netflix is probably still mostly Java. Uber is Python + Go.

I honestly think Java is going to come back in fashion at some point after all the recent updates, along with all the newer JVM languages ala Scala, Clojure, and Kotlin.

If you're asking this for "what language should I learn?" I think knowing anything typed + Python is an extremely strong combo.


I’ve been working with ‘unfashionable’ php for quite a while, and I love it. I can create most things clients ask for. Thing is it is mostly a web/server language: talk to client, talk to db etc.

Recently I’ve been installing Python and I really like it, as it less server oriented and more computing oriented, and it has been a lot of fun. I think enjoying the journey learning a new language is also important!


Thanks for your reply. I think I agree somewhat with the article regarding the Novice Trap. I can learn them all, but I probably can't master them all, and it's probably a waste of time to try to. I think we are in the midst of a paradigm shift and it's not clear who the real winner is yet in terms of programming languages. I currently have the fairly unique opportunity of choosing pretty much everything about how I build the systems I'm working on, and so I'm looking for input as to which way to go. I'd like my choice to be the one that will be viewed as "the winner" 5 years from now.


These days I don't think you can get away with not knowing JavaScript, unless your thing is pretending to be a luddite; in which case, if you actually are one, you shouldn't be using computers at all.


What would you call "knowing JavaScript?"

I've written some Greasemonkey scripts for automating work-related tasks, and done some very basic editing to help others with visual stuff. I've fixed some folks' Node code, but that was just reading docs and applying a few lines of change.

There are a lot of positions that never really touch js. My most recent project was working a lot with Chef, which is just ruby all the time.


That's part of the spectrum of "knowing JavaScript" that I mean, since you're not afraid to get your hands dirty. But some "linguistically pure" programmers hold themselves above it all and refuse to learn it out of pride because it's deeply flawed and they don't want to sully themselves. Well guess what: every language (and CPU architecture) is deeply flawed and riddled with historical baggage, and programming is about getting your hands dirty and dealing with it anyway. Even if you're programming in Lisp or whatever your idea of the perfect language is, there's still a lot of tedious shit work and hacking you're never going to be able to avoid if you want to get the job done. The people who wrote the compiler that translates your favorite perfect language into x86 instruction codes had to get their hands really dirty with deeply flawed technology, and you have to respect them, because they didn't boycott the x86 because the PowerPC was more beautiful.

It's hard to outweigh the advantage of using the same language in both the client and the server, and right now, JavaScript is the only universally practical choice on the client side, so it rules both sides now, in spite of all of node's and npm's problems. WebAssembly will loosen JavaScript's monopoly as it gradually matures, but right now all other languages but JavaScript are second-class citizens on the client side, and it will be that way for a while.


Transpiling TypeScript to ECMAScript5 using Webpack (as is done in Angular and React) seems to be how folks are solving this. It does strike me as a rather convoluted solution, but it’s a solution.


This completely dismisses every developer out here writing drivers, hardware controllers, Operating Systems, compilers, etc. Look around as you drive to work. See the boxes along the road? See the automatic doors on the buildings? See the lights? There’s an entire world of software engineering out there that’s not some CRUD web app.


In my opinion the minimum toolset that practically covers most use cases and gives you a gigantic pool of career opportunities is: C(++), Java, Python, JavaScript, and Bash. With those there is nothing you cannot do, from web to direct hardware programming.


It's the concurrency model. The industry has decided threads just don't scale, and it's moving away from the two most popular languages that rely on multithreading (Java and C#) in favor of Node.js (event loop), Go (goroutines), and I guess Elixir (I'm not sure what its concurrency model really is; the actor model, somehow, I think).


Threads may not scale, but thread pools can. And that concept has been in Java and C# for a long time.

I agree with your point about the industry now favouring the event-loop model of concurrent programming. I know C# recently introduced async/await syntax to match the idioms of JS (probably to appeal to JS-comfortable devs), but the concept of built-in event queues is still not supported. (I may be mistaken, as I only read a bit about this feature of C# in late 2017.)
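For readers who haven't used both models, the difference being debated here can be sketched in a few lines. This is an illustration in Python rather than Java or C#, with `ThreadPoolExecutor` standing in for a thread pool and `asyncio` for an event loop; both run 50 concurrent "requests" that each sleep 10 ms, and both finish far faster than the half-second a serial version would take.

```python
import asyncio
import time
from concurrent.futures import ThreadPoolExecutor

N = 50

def blocking_request(i):
    time.sleep(0.01)   # stand-in for blocking I/O (DB call, HTTP request, ...)
    return i

async def async_request(i):
    await asyncio.sleep(0.01)  # stand-in for non-blocking I/O
    return i

# Thread pool: concurrency via many OS threads blocking in parallel.
start = time.perf_counter()
with ThreadPoolExecutor(max_workers=N) as pool:
    pool_results = list(pool.map(blocking_request, range(N)))
pool_elapsed = time.perf_counter() - start

# Event loop: concurrency via one thread interleaving suspended coroutines.
async def main():
    return await asyncio.gather(*(async_request(i) for i in range(N)))

start = time.perf_counter()
loop_results = asyncio.run(main())
loop_elapsed = time.perf_counter() - start
```

The trade-off the thread is arguing about: the pool pays per-thread memory and scheduling cost but lets ordinary blocking code scale; the event loop is cheap per task but requires the whole I/O stack to cooperate by yielding.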


I think these languages are thriving in the cloud because they not only have a favored concurrency model, they have it mandatorily and so they spun a cohesive ecosystem.

Take Java, for example. You can make an event loop and use it in a web server (the Play framework is just that). That doesn't mean you can retrofit all your dependencies to do the same. It also doesn't mean business logic inside that server will necessarily use it instead of spinning up OS threads, nor that it can even plug into the same event loop. And it means you have just introduced opposing views into your ecosystem.


"The industry has decided ..."? Which industry is that? Judging by the vacancies on Indeed.co.uk there are an order of magnitude more opportunities for Java/C# developers than those using Node.js or Go. As for Elixir I think it's fair to say that the industry has largely ignored it.


Thanks for clarifying. This really is a paradigm shift that is happening "under our noses" so-to-speak. I don't want to be the guy who designed his new system using a technology stack that's been shunned.


Java isn't that slow. I have multi-gigabyte installs that take 5 seconds to load up. 5 seconds isn't make-or-break territory when it comes to scaling a deployment.


I have Java apps that are up and running in well under a second. It's reflection-heavy frameworks that drag out the startup - avoid those, and Java can be pretty quick.

Maybe never quick enough to use in a CGI style environment, or for command-line tools you want to call from a loop in a script (at least, without AOT). But plenty quick enough for any normal server-side use.


I agree with you, but 5 seconds is a noticeable lag that I would like to avoid if possible, since I have the option to choose the language I use. I've been writing java for more than 20 years so I would love to continue to use Java, and I think it's actually likely that upcoming versions of Java will fix this, but I want to make the best decision here, not the most comfortable one.


[flagged]


We've asked you repeatedly not to do personal attacks on HN. If you keep doing it, we're going to ban you. I don't want to ban you, so would you please review https://news.ycombinator.com/newsguidelines.html and fix this?


Thank you for your informative reply. I will try to be more entertaining in the future if it pleases your highness.


Please don't respond to a bad comment with another, even when provoked. That only makes the thread worse. Instead, the site guidelines ask you to flag it, or email hn@ycombinator.com in egregious cases.

https://news.ycombinator.com/newsguidelines.html


There was a discussion last week comparing Go to Java for short lived processes...

https://news.ycombinator.com/item?id=21871645

I have had a lot of success using Node.js as the middle layer to route requests to a running JVM for all the computational work.


Thanks for this, it's inspired me to simply do some benchmarking to reach my own conclusion.


I don’t think the term POJOs is used anywhere outside the Java community.



You're absolutely right there. "Plain Old Java Objects" - POJOs as opposed to "Enterprise Java Beans" EJBs.


It’s all in the definition of “learn” in the sentence “learn at least one new programming language a year.”

If you calibrate the definition of “learn” appropriately for your skills and goals that advice is great.


There's no way to properly learn a language in a year. Even for an advanced programmer who understands many concepts, getting up to speed and understanding the idioms takes notably longer. I'm talking about someone with a few years of in-depth experience, which usually means focused practice, not coding the same thing over and over or just playing around. You cannot learn those things at work, nor by doing algorithmic challenges; maybe someone could actually make a set of such exercises.

An advanced programmer coming from another language can fall into many traps and produce non-idiomatic (usually overcomplicated) code.

The same goes for learning a big enough framework.


There is definitely enough time to master a new programming language in a year. This is not rocket science; programming languages are purposely designed to be easy to grasp. I'd say less than a month for most languages, perhaps more only if it's a change of language family.

Most people can learn to speak a new (human) language from the same "larger family" (e.g. French from English) in less than 25 weeks.


> There is definitely enough time to master a new programming language in a a year.

I've been using Go for middlin' size projects for several years, and recently wrote about a 10K SLOC program in it. I still don't believe I have mastered the language, not by a long shot.

I've been writing C since the late 1970s, and I still discover things about the language. And I don't know anyone who claims that you can master C++ in a year.


You can be reasonably competent quite quickly but “mastery” is something else and takes much longer. I have done C# for a decade now but I still discover new stuff all the time or finally understand things I knew about but didn’t really get.

Same with languages. You can be conversational pretty quickly but to truly understand a language takes many years.


Bear in mind that all programming languages are not created equal. The surface area for a language with a lot of history like C# is much larger than the surface area for a younger language like Go, for example. I've been writing production Go for about three years now, and I feel like I have a solid grasp on it. I've been writing production Python for more than twice that long, and I still learn new things every few weeks.


You cannot ever reach a level of mastery at which you'll never discover anything new. Even the people who came up with the language make errors from time to time.

But you can definitely reach a level similar to these people in a couple months full time. If you think the people who design languages have decades of experience, you will be surprised.

I personally would consider it a failure of the language if it really took more than a couple of months to master. We are not talking about computer programming in general here; we are talking about a specific programming language: a programming language is an _artificial_ construct whose sole intention is to be easy for humans to work with. If humans are generally terribly slow with it, then what is the point of such a language?


Perfect example. French and English are not in the same "larger family". French is a Romance language and English, while influenced by French, is a Germanic language. Learning it in 25 weeks depends on your definition of learning. You might be able to communicate basic ideas in 25 weeks but very few people are going to ever master a language in that time.

The same goes for programming languages. The syntax may be easy, depending on your language of choice, but mastering it is another story.


They are in the same "larger family" in the sense that they mostly use the same structure and alphabet. Compare with Korean, which will surely require more than a couple of months.

The CEFR/EU consider you only need around 800 hours to reach Advanced/C1 French as an English native speaker, which is around 30 weeks full time. That is way more than "communicate basic ideas", which is more like an A level. In fact, the oft-shared [1] map from the US's FSI considers that 24 weeks is enough for a US diplomat to reach _proficiency_ in French.

[1] https://img.theculturetrip.com/1440x/wp-content/uploads/2017...

I think people seriously underestimate how many hours a single month has. Programming languages are much easier to learn. Very likely you could even memorize the entire specification of your average programming language (save for a couple of exceptions) in that time.

Which I wouldn't do to learn a language, but just saying to prove the point.


Many of us are no doubt coding in the equivalent of a trader's pidgin: good enough to get the job done on a day-to-day basis, but not really mastery.

There are more than enough examples of people writing C or Java in other languages.


Well, this all depends on:

* Whether you're learning in your spare time outside of your work and other commitments

* Whether you're learning something radically different to anything you've used before to teach you a totally new way to think about solving problems

* Whether you want to reach the level of "yeah, I've used it" or "go ahead, technical interviewer, try to stump me, your large production codebase won't contain anything I haven't seen before"


There's a difference between mastery and whatever it is you think you're doing in a month. Learning syntax is simple enough, but learning to write software in the language's idiomatic style and using common libraries without looking them up every time is a longer task.

It's not like anyone would say the visitor who looks up words in a travel dictionary has mastered the language...


Month of what, though? How much time is that in actual activity-hours?


It's a nice idea, but I would contest the idea that it's even necessary. Maybe it's good for someone with the potential of being a 10x developer, but most people don't have that potential. Some of us don't even want to be a 10x developer.

With some exceptions, most languages people were using a decade ago are still around and still have paying jobs attached. There's no reason everyone should go berserk trying to eat, breathe, and think in code. Life is more than just computers. Personally, I'd rather spend that time tinkering, hanging out with friends and family, working on my side business, making moonshine, traveling, etc. Programming languages suck and I don't want to use them more.

> The advanced programmer in other language can fall into many traps and produce non-idiomatic (usually overcomplicated) code.

Kind of reminds me of when I came to JavaScript from Ruby and overcomplicated the code by treating it like Ruby.


I separate my “learning” into deep and broad. At work, I’ll take on new categories of assignments (broad) while maintaining a portfolio of profitable, hard problems in my domain (deep).

Personally, I’ll study new languages or cultures or technologies or games. But I’ll also develop—and necessarily, ditch—ones I’ve previously learned.

It’s easy to get caught up in learning lots of basics. Right after the basics is the hard part of integration, which leads to deeper understanding. It’s also easy to do the same thing every day, tricking yourself into the illusion of mastery, and risking becoming the best in a dying field.


> THE BRIGHT, SHINY, INFINITE RABBIT HOLE

Wow, what an apt metaphor.


Key insight IMO:

> You can talk the talk like a champ, and be up with the latest buzz, but in some corner of your mind you may recognize that your basic skills are fundamentally lacking.

Learning for learning's sake is admirable, and it's easy to snub practical application. But it's much like how you can only read a few thousand books in your life -- the modern world is so full of novelties that you must carefully choose what you learn. There is an inherent opportunity cost in all learning.

My personal razor for deciding what's not worth learning is: if I can't justify teaching others what I've learned, it's probably not worth learning.


I find it so hard to choose since almost everything is fascinating to me. I'm terribly afraid that if I commit to any one school of anything I will miss the truth buried elsewhere. I find it easy if I have a fixed end goal to pick the most pragmatic options but outside of that I'm lost.


This seems to be a curse of the Javascript world. Every few months there's a new "framework".

Also, for each new language, there's a new build system, with a different directory layout.


I think the main issue with trying to learn too many languages is that it is exceedingly taxing on your memory capacity. Being a regular developer in a language comes with a lot of context that you have to keep in mind. This gets hard to do if you are constantly switching between languages/frameworks all the time.

So there's definitely some balance that is good to strike between learning a greater number of concepts and paradigms vs going deeper into fewer languages/frameworks.


I don't see this being a trap for anyone working at a real company doing software development for a living. Managers won't let you recode the same thing over and over in various languages, and you'll somehow have to maintain the software you've developed, optimize it, debug it, etc., and that means going pretty deep into one technology.


I have this problem. The OP talked about languages but didn't mention frameworks, tools, cloud vendors, programming styles, etc. It sucks because you go to interviews where they only care about one language and one framework, and you don't know it all that well, same as the 99 others.


Can totally relate. I just stopped doing this recently.




