I bought a copy of "Sams Teach Yourself C++ in 24 Hours" at a CompUSA in 1999. The guy at the checkout looked at it, laughed, and told me about this article.
When I was 16 I was looking at programming books at Borders and a guy handed me a copy of "The C Programming Language"; changed my life.
So much of my career has been shaped by running into developers or just people interested in programming out in the world.
>> I bought a copy of "Sams Teach Yourself C++ in 24 Hours" at a CompUSA
Some of those books were laughably bad. But the thing I miss from back then was the feeling that the book was going to be the key to getting you started writing whatever crazy idea you had for a program in your head.
I think the last time I had that feeling was walking out of Microcenter with a new MacBook and an iPod Touch, ready to make the next Angry Birds...
> But thing I miss from back then, was the feeling that the book was going to be the key to getting you started writing whatever crazy idea you had for a program in your head.
Yes! I miss it too. For me, that book was Head First PHP & MySQL[0]. In the late 2000s I was living and working in Thailand for a family friend's small logistics company in the capacity of a BA, trying to move them into this century. We were trying to find a vendor for any system that would let us keep track of the movements of shipping containers within a container yard, and something sparked and I thought "hey, I did a semester of programming at uni, how hard can it be?!". Over the span of the next 6 months or so, this book helped me put together a solution that, scarily, they are still using to this day.
Fast-forward 15-ish years and here I am, still cutting code, albeit now in the capacity of a dev lead. After all that, I have the book to thank.
IIRC, 3 months to hack together a CRUD like app, another 3 to “ship” the “product”, and yes, quotes very much intended.
I did spend the next year or so adding features, bug fixes, etc. It was very much a case of learning on the job and I was quite fortunate to be in the position that I was. I think it was 2 or 3 years later I managed to land a dev role at Agoda (who had offices in Bangkok at the time)
I know exactly what you mean. Felt like the world was opening up.
I don't know who it was, it might have been one of the guys my dad worked with, but they said "You won't be able to write a game with a book like that, but you might be able to write a phone directory." Didn't faze me.
I remember someone asking me later on if I was going to write a game. I just looked at them blankly and said: "I'm going to write a phone directory".
I just learned today that "Sams Teach Yourself C++ in 24 Hours" was co-written by Jesse Liberty, one of my all-time favorite technical book writers. "Programming C#, 2nd Edition" was a great book.
I loved tech stores too. I still have all of the Apple pamphlets that CompUSA had sitting next to the display units.
CompUSA and Borders have a special place in my heart. I'm from a 3rd world country, so visiting them was a sacred ritual anytime I had the privilege to visit the US. I always brought an extra bag mostly for books, and sometimes a Sound Blaster, some extra memory or a US Robotics card, or a fancy Handspring Visor and some Dreamcast games :)
I actually didn't think that the Sams books were so bad, albeit a bit misleading if you think you're really gonna grok C or C++ in 24 hours.
I learned C first in high school because I found a pirated copy of "Learn C in 24 Hours" online, and I actually felt that they did a reasonably good job getting me started, though if I recall correctly some of their examples with pointers and memory management were actually incorrect. Still, it was enough to make programming interesting to me, and when I dropped out of college the first time I was able to salvage a somewhat decent career because of that interest.
Ah the Sams books. I once thought that writing tech books was going to be a career (having contributed a single chapter to “Microsoft SQL Server Programming Unleashed” by Papa, Shepker et al. I was one of the Al’s).
I think the “in 24 hours” thing was more a case of “I’m going to tell you everything I know about XYZ in 24 hours”. Still, I have so many fond memories from the late 80s and 90s spending countless lunchtimes at McGills or the Technical Bookshop in Melbourne (now since long gone), flipping through tech magazines and wondering which of the many weighty tech tomes I’d spend my precious hard-earned on.
> programming books [...] a copy of "The C Programming Language"; changed my life.
My particular life-changing book was the BASIC manual that came with the C64.
This was then reinforced when (~8 years later) I got Turbo Pascal, and it came with everything needed to get up to speed, then further reinforced when I started with Linux in 1995, and discovered that manpages (and info) were installed that basically got me up to speed on everything needed to write programs for that system. Later, late-90s, ISTR reading through the EGCS manual front-to-back.
I miss the days when the software came with all information required, instead of simply requiring you to search google for a product page or a manual.
I guess in about 5 years I'll miss the ability to easily google stuff (which I used to get in manuals) because all the relevant information got posted to discord and remained unindexed.
In early 2000s I learned sockets programming on Linux and FreeBSD completely from the manpages. Later I got Stevens' book and got through it very quickly, thanks to having already tried the manpages and written a few programs.
For me it was "Visual Basic Professional 3.0 Programming" by Thomas W. Torgerson, from Barnes and Noble in 1997. I had downloaded Visual Basic 3.0 from 100 different email attachments on AOL and wanted to learn how to make punters/progs.
I remember transcribing code from the book to attempt to play a sound file when my program opened. I remember being stunned when it actually worked!
What an honor it has been to make a career in software development since.
Haha yes, similar for me. It was word scramblers as games in AOL chat rooms that lead me to my love of programming... and of course Visual Basic 3.0. What an amazing product back then.
Bro you and I had the same origin story. Remember Rampage Toolz!?? I ended up learning a lot from that author. Winsock.
For me my first programmed action was adding a button to a form, coloring it yellow, and having it exit the program. I was absolutely shocked and geeked out.
Still just a hobby for me, but I have something in the works. I hope to share it with you all here soon.
> So much of my career has been shaped by running into developers or just people interested in programming out in the world.
This is probably less common now, but remember not to laugh at young girls in the programming section of a bookstore. In the 90s, that was a pretty standard response: no advice, just incredulity.
> remember not to laugh at young girls in the programming section of a bookstore
I can't believe anyone needs a reminder to act like an actual human being, much less not to belittle someone -- anyone! -- who is interested in learning anything constructive.
Unbelievable that anyone, ever, would act that way.
Sadly it’s almost always “learned” from (or actively “taught” by) others, mostly parents or peers who themselves learned it from parents.
Children/young adults aren’t even necessarily aware that their behaviour is inappropriate as it’s “normal” based on what they’ve seen happen around them their whole lives.
Adults should know better, but may do it anyway due to the human tendency to dig their heels in, or consciously/actively do it for a reason which really has nothing to do with the persecuted group.
Bigotry/racism/sexism persevere largely due to the above.
One of the hills I am prepared to die on is that "The C Programming Language" is the best programming language book I've ever read. Every developer should read it and learn C at least once, even if they promptly forget it all.
You’re not going to die on that hill. There are so many people that share your opinion that you might have trouble finding space, but you’re not going to die on that hill.
Looks great! Reminds me of some of my early classics!
I still firmly believe that the move to transparent bags is what killed them. For me it ruined my annual tradition of buying Christmas presents while family was next door in another store, but I’m sure for others it was being treated like a criminal. Those magnetic readers aren’t great but they aren’t in your face.
Once I stopped going in routinely I stopped going in at all.
The CompUSA that opened in Cambridge, MA, had a guard who insisted on checking your purchases after the registers, like you were a criminal, or implying it was a crime-heavy area.
Not the kind of insult or atmosphere anyone should have to put up with; and the generally affluent people of Harvard and MIT, between which the store was located, weren't accustomed to it.
> I actually spent a lot of time checking out the local vendors, but was generally dissatisfied, and used them only as a last resort. If you're in Cambridge, MA, USA, some vendors I looked at: PCs for Everyone (informative Web site, but long waits at their showroom, and they didn't have a floppy drive after I'd waited 30 minutes), MicroCenter (large superstore, mailorder generally has better prices, didn't have advertised CD-RW drives in stock, the three of the four things I bought there were somehow defective), BestBuy (very poor component selection, guards at the entryways), and CompUSA (smaller version of MicroCenter, guard at front door insists upon comparing every shopper's purchases to their receipt as they leave). So much for brick&mortar service.
> UPDATE 2018-12-10: MicroCenter brick&mortar has risen to the challenge, and is now my overall favorite source for PC parts.
Still better than Circuit City. I bought a graphics card there once. It was dead on arrival. I took it back to the store to exchange it, but that was their last one. I asked for a refund then, and they told me about their restocking fee. They kindly offered to give me 100% credit toward another card that was twice as expensive.
That was the only time I’ve been asked to “sir, please keep your voice down” and “sir, we need you to leave”. I still think I was on the right side of that one.
My credit card company eventually handled it for me.
Is family ever in the store next to Costco (which is a null pointer exception), such that meeting on foot with clear bags holding presents means they're no longer a secret?
Love reading the comments here and how these books changed lives.
For me it was dBASE III PLUS programmer's reference guide from 1987 and it imparted enough knowledge to start a programming career. Read that dog eared thing end to end and still have it in my book collection.
Still, I kind of feel like I want to print a fake “Teach yourself open-heart quadruple bypass surgery in 24 hours” sleeve and leave it on the shelf to see what happens.
I got started buying Borland C++ Builder in… 1997, I think, and read the book that came with it which was similarly named. It was quite bad, I remember the chapter on pointers more or less said “I can’t explain why you’d want to use them, but when you find a situation that requires them you’ll know you need them” or something like that.
I must say that for myself, I still learn/reference many things from programming books... mostly Perl and Java, but none of them promise to teach me programming in any time interval.
Rarely do I ask for help online, as I was ridiculed most of the time.
Nowadays, if paper is insufficient I consult with the AI, but I still refuse to use copilot.
I wonder who's going to write the "Teach yourself Rust from scratch in 21 days" book. If C and even C++ can be learnt effectively as a first programming language, there's little reason why Rust couldn't be. And if not Rust, maybe Golang could play that role.
Golang yes, but not rust. I think rust is a terrible beginner language. Despite having decades of experience it took me about a month to feel productive in rust and stop “fighting the borrow checker”. I learned Go in about 12 hours.
Let beginners cut their teeth on stuff like Python and Go. Rust can wait.
I completely agree but do feel it needs qualifying. The problems beginners run into aren't usually the same as the problems experienced devs run into when adopting a language new to them, but where I see the two overlap I know something is a serious hazard in a language.
Java as a first language: won't like the boilerplate but won't have any point of comparison anyway, will get a few NPEs, might use threads and get data races but won't experience memory unsafety.
Go as a first language: much less boilerplate, but will still get nil panics, will be encouraged to use goroutines because every tutorial shows off how "easy" they are, will get data races with full blown memory unsafety immediately.
Rust as a first language: `None` // no examples found
I think Go as a beginner language would be better if people were discouraged from using goroutines instead of actively encouraged (the myth of "CSP solves everything"), otherwise I think it needs much better tooling to save people from walking off a cliff with their goroutines. And no, -race clearly isn't it, especially not for a beginner.
And in one respect I've found Go more of a hazard for experienced devs than beginners: the function signature of append() gives you the intuition of a functional programming append that never modifies the original slice. This has literally resulted in CVEs[1] even by experienced devs, especially combined with goroutines. Beginners won't have an intuition for this and will hopefully check the documentation instead of assuming.
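A minimal sketch of that footgun, with made-up values, assuming a base slice that has spare capacity:

```go
package main

import "fmt"

// twoAppends shows how append can quietly write into the input's
// backing array when spare capacity exists, so two "independent"
// appends from the same base slice clobber each other.
func twoAppends() (int, int) {
	base := make([]int, 2, 4) // len 2, cap 4: appends happen in place
	base[0], base[1] = 1, 2

	a := append(base, 10) // writes 10 into base's backing array
	b := append(base, 20) // overwrites that same slot with 20

	return a[2], b[2]
}

func main() {
	x, y := twoAppends()
	fmt.Println(x, y) // prints "20 20": the second append clobbered the first
}
```

A functional-style append would have produced two independent slices ending in 10 and 20; Go only copies the backing array when capacity runs out.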
Yes but that's the thing, if you pass everything by value or read-only reference you don't have to deal with the borrow checker, and that's quite acceptable when you're first writing code. That's why I think Rust is almost a FP language in disguise with imperative features added after-the-fact - and it could be taught as such.
This only gets you so far. In today's software landscape, somebody is going to want to hit a HTTP JSON API to automate something in their second week, and they'll have a really rough time with how async Rust works today. Try explaining to a total beginner why they need Arc<Mutex<T>> and why the payload struct needs to be :Send+Sync+'static.
For all of Go's faults, "just send it over a channel bro" is easy to explain for trivial cases even if it often risks deadlocks and data races. To avoid the worst risks of this kind of code, Rust makes even trivial cases a big exercise. Go has an opportunity here to become more resistant to bugs without becoming less simple, but resistance to bugs doesn't appear to be a priority even for the Go team itself (e.g. even recently https://github.com/golang/go/issues/64474 )
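For anyone who hasn't seen the idiom the comment alludes to, the trivial case really is short. A sketch (the function name and payload are hypothetical):

```go
package main

import "fmt"

// fetch hands its result back over a channel instead of sharing
// mutable state with the caller; the buffer of 1 lets the worker
// goroutine finish even if the receiver were to go away.
func fetch() string {
	results := make(chan string, 1)
	go func() {
		// stand-in for some HTTP/JSON work
		results <- "fetched: 42"
	}()
	return <-results // blocks until the worker sends
}

func main() {
	fmt.Println(fetch())
}
```

The hard part isn't this happy path; it's what happens once several goroutines, timeouts, and shared slices enter the picture.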
You're not wrong, but that's precluding tons of concepts not suitable for beginners anyway. "Just .clone() everything" shouldn't need to be mentioned in an introductory book.
Yep; and cloning boxed objects everywhere often leads to slower rust code than the equivalent javascript or Go. And if you only teach a subset of rust without references, people can't read code snippets online, or most of the standard library documentation.
In comparison, Python, Go, Javascript and C# are all much easier to teach and leave students more able to make real software.
TRPL assumes familiarity with some existing programming language, such that much of its focus is how Rust differs from more common languages. A true "from scratch" book would probably first teach idiomatic Rust as something not too different from a pure functional language (in that the default approach in Rust is to pass objects by value and forbid all shared mutable state, just like FP languages do) and introduce interior mutability subsequently as a way of supporting more "imperative" patterns of coding.
I used to love standing at Borders looking at all the computer books. I still have my copies of Ivor Horton's Beginning C and Beginning C++. I never see that guy getting love in the programming author world, but that was my guy when I was a teen.
I wrote a few things for Wrox back then. Ivor managed to get his books onto the reading lists for a few US colleges, and that ensured a steady flow of royalties (and requests for updated versions). I was quite jealous.
My first ever programming book, a well-intentioned gift from my parents, was “Sam’s Teach Yourself C++ in 10 Minutes” - meant of course as a quick reference but always conceptually funny to me because… it’s C++.
The same magic at "computer fairs (faires)". That same magic when you happened across that other nerd in the college/school lab doing something cool and subversive.
Boy I had the same book that I borrowed from a friend. I was truly in love with that book and went over it a few times. My friend ended up giving it to me as a present, something that truly shaped who I am now.
Unfortunately "The C Programming Language" has enough typographical errors in it to seriously undermine its utility as a book with which to learn programming.
What are the typographical errors? I didn't think it had any.
It does use features of C89 that may not compile after C99, but those aren't weird.
But, as someone who actually did try to use K&R to teach themselves to program, I agree it's not ideal for that. It's definitely not the best programming book ever, which some people say. The exercises are pretty good, though.
Looks like I downloaded a bad copy... the following has an error at the top of page 87 (there are a few other errors in the same chapter but I don't have my notes handy)
This essay holds a special place in my heart, since I first read it as a teenager when I was just starting to learn to code.
Re-reading it now, I was surprised to see references to Malcolm Gladwell, since I didn't remember Outliers becoming a thing until much later. Then when I saw the reference to Ratatouille, I realized the article had been updated since its posting in 1998. The original is still available on archive and is significantly shorter: https://web.archive.org/web/19980206223800/https://norvig.co...
Respect to Peter Norvig for continuing to edit his posts over the years.
I resent Gladwell to this day, possibly because I was assigned to read him in high school. He is quoted too much -- the 10,000 hours thing is just a meme at this point, for me anyway.
Oh I agree 100%, the 10,000 hours thing is a cliché and the popular interpretation that 10,000 hours of an activity equals mastery has been thoroughly debunked. I roll my eyes whenever I see it mentioned.
I’m giving Norvig a pass here because I don’t remember it being such a cliché in 2008, and because I appreciate the point he’s making about the importance of practice. (Early in my programming education I often felt that I might just not be cut out for programming, and I wish more articles at the time had emphasized that it takes a lot of practice to become a good programmer).
The breaking point for me was when someone linked me to his podcast where it's clear that he either hasn't done much diligence or is obviously ignoring information to tell a better story. It's painful and made me really question anything he puts into his books.
I can see myself now, reading this in the university library computer lab a decade ago. And now I've taught myself programming. The journey was much more difficult and winding than I was imagining it back then!
I am at times more of a book learner, but find that attitude is often more helpful for non-CS disciplines that change slower (eg, math/physics).
A recent negative book example for me is the Quantum Programming book from O'Reilly. I found that it did not discuss quantum circuits in a detail that helped me really understand what I was doing -- though I suppose that is a conceit of quantum computing. Perhaps I will return to it later (I am working through Nielsen/Chuang now, which is very theoretical, but explains things very clearly)
My policy is that a book is nothing more than a learning tool, which a hobby project can also be (perhaps more effectively due to the experience gained).
Then again, knowledge is power, and books are great at pointing you in the right direction -- assuming you found the right one for your needs, of course.
I know some people who won't open a book unless they know they can read the whole thing, which I think is a ludicrous attitude.
I did just order O'Reilly's Generative Deep Learning book, and am hoping to get something out of that, and if I only retain a handful of snippets to use in my career, that is profitable for me.
The least I can count on is that it will look nice on my shelf.
It's often hard to tell what information is truly fundamental and what will age badly. I look at many books of software engineering that were popular in the early 2000s, like The Pragmatic Programmer, Agile Software Development, or the Design Patterns book by the Gang of Four. Those books aren't talking about technologies that we expected to be superseded in 5 years. And yet, a lot of the text regarding object orientation and design patterns is outdated at best, and outright harmful at worst.
One can also look at, say, concurrency. A topic often seen as unimportant in the 90s, but ultimately taught at the lowest level, with mutexes, threads, semaphores and such. Those aren't going away, but how often is concurrent programming all about manually setting mutexes? We use higher order abstractions, but those change too. Many languages are encapsulating this in monads, whether people know their promise implementation is a monad or not. But that's not the only way, and the most popular way can change: Maybe languages in 10 years will be all about continuations and direct styles. Maybe it'll be something else. The fact that in the end, the same basic features from the 80s and 90s will exist at the bottom somewhere isn't that useful for most programmers.
And HTML generation... I was writing UIs before web apps were there. A lot of things that seemed fundamental went away when the browser was embraced. But will the browser live forever, or be superseded? I suspect that it will all get replaced, or be utter legacy, eventually. Will that eventually be 10 years? 30? It's very hard to say what will remain fundamental, and what will not.
Mutexes, semaphores, CAS, and threads are fundamental, and this doesn't change when you embrace monadic effect systems or continuations, even if some of the details change (e.g., obviously, if a computation can jump threads, then you need implementations of mutexes, semaphores, etc., that can jump threads).
Also, programming languages in 10 years from now will be pretty much the same languages we have today. There might be a newcomer or two, but it will be niche, as incumbents don't really change and only grow.
As a reminder, for people thinking that technologies change too fast, POSIX is still here and as relevant as ever.
Just for the moment, consider the idea of implementing the requirements shown in the figure using a network of connected microcomputers, one microcomputer per bubble.
Structured Analysis and System Specification, Tom DeMarco, 1978.
Data structures, algorithms, algorithm analysis, various discrete math topics (set theory and number theory, a bit of graph theory, are usually included in a typical CS undergrad curriculum), models of concurrency, models of computing (lambda calculus, Turing machines), complexity classes, Chomsky hierarchy, type theory (some might consider this more advanced, varies by school and its lean towards practical or theoretical CS), systems of logic.
If software engineering whiteboard interviews are anything to go by, strictly hash tables. Hash tables are the only thing that matter, unless of course the interview process is broken...
Bonus points if some kind of list requirement comes up and you deftly maneuver the conversation back to using your hash table with array indexes as keys.
Dan Grossman's "Programming Languages" series on Coursera was important in helping me connect the dots. And that was after only completing the first of three parts of the course...
When a link has had significant discussion in the last year, the software will normally redirect the submitter to the previous thread rather than allow a repost through.
For what it's worth, I think this helps HN feel fresh. This is the first time I've read this article, and I enjoy the comments left by my fellow cohorts of this year's read, just like it was something new.
Long lived historic articles can shape multiple years worth of discussion. I think it's even more fascinating to read the comment history and see what the current thinking is, if it has changed compared to previous years, etc.
It's just for fun and interest, as the other replies have pointed out. I've added my standard disclaimer to the comment above. Reposts are fine after a year or so! This is in the FAQ: https://news.ycombinator.com/newsfaq.html.
Thank you for the link to the FAQ! Unless I blacked it out, I don't think I have read it before. Now that I know it exists I can see it clearly in the page footer, alongside the Guidelines... I will go and read those, but I also wonder if this should be reposted from time to time too.
It's so that people can see the previous discussion if interested. 'about once a year' does happen to be the frequency at which reposts don't count as dupes on HN but that's somewhat incidental.
For what it’s worth I check the site fairly frequently and this is my first time seeing this. It would be interesting if there was a way to mark links as seen so they could be removed from your feed in the future for people who don’t care for reposts.
I am quite impressed that the ancient Amazon.com link [0] on the page (with quite a few non-trivial query params) still returns relevant results today. A good case of Cool URIs don't change [1].
I’ve had a number of people approach me, at work and elsewhere, asking how they too can get one of those programming jobs. I always mention the timeline I was on, having started learning as a kid, making it a major hobby, getting a CS degree, internships, etc. and people are surprised and disappointed there’s no quick path.
Or is there? Has anyone here done one of those 0 to 1 bootcamps with success?
I've been programming professionally for 8 years now and came out of a bootcamp. I had taken some programming courses in high school and college for fun but my knowledge was very limited.
The bootcamp stayed true to the name. Core hours were 9-5 but many of us would get in early (usually 7am) and leave late (sometimes 10pm). I would also go in on weekends, sometimes taking Sunday off. I intentionally went to one in a different city so I wouldn't be distracted by a social life.
The focus was on Ruby, then JavaScript. I got a job as an intern in C#/.NET a few months after graduating. I took the intern position because it seemed a tough sell to get a junior dev position right away. I was promoted to junior dev after just 2 weeks.
The school is no longer around, bought by Kaplan and dissolved.
It differed quite a bit. There were a handful that put in the same effort as I did and the few I've kept contact with are doing quite well. I'd say that just over half of the students stuck to core hours and most of them didn't make it through the bootcamp at all. I recall that many of those that stuck to core hours and graduated had some programming knowledge already or were exceptionally smart.
About a quarter of those that graduated in my cohort are in the IT field, but are project managers or something similar.
The bootcamp was in 3 phases and you had 2 chances to test out of a phase or you'd be kicked out. Of my starting cohort of 16 people, 12 of us completed the bootcamp. Our cohort grew to 22 people though because there were people that had repeated phases to complete it.
Glad to see at least one other perspective more or less echo my experiences/observations. You can't discount motivation, and bootcamps are filled with either those champing at the bit to learn and explore, or those looking for the next gold rush. Adding to the challenge of differentiating is that sometimes they are the same person.
If you're skilled/smart/whatever you're gonna be fine, usually, so let's discount those and set them aside.
There's definitely a cohort of "moderately capable at plugging away" type developer that can successfully come out of a bootcamp, and it's usually something like this path:
bootcamp - headhunter agency that needs bodies - develop familiarity/skill with the toolset - hired directly via some other company.
Less "startup valley next great thing" and more "updating the Java application for a change in the tax code this year" style corporate programming (which is the vast majority of all programming).
I mean, I think I'm sorta smart, but nah, I have a 0-to-bootcamp success story and I studied languages at uni and didn't even pass A-level maths in high school. I did a 3 month bootcamp. I was determined to end it with a job, so I studied 12 hours a day, 6 days a week. By the end I couldn't walk down the road without being out of breath, but I had a job. My first year of employment was terrible. I was bad at everything, and my employer had agreed to take bootcampers without any preparation for how to deal with that, so they just kept me away from tickets, stunting my growth. After I wasn't renewed I spent a further 2 months doing intense studying and portfolio building and applied for my current job. They've supported me a bit and now, 8 months in, I'm a pretty competent junior I'd say. I feel like this is the level I should've been at in my first job, but as long as you're willing to work at shitty companies for a bit you can go from bootcamp to employed immediately and then work up to being good from there.
Edit: Whether I'm really worth the investment to these companies is another thing. I tend to think I'm not because I do sorta suck. But in terms of whether it's possible to go that path, be constantly employed and improve in time, yes it definitely is possible.
I dropped out of CS after 3 months, had never programmed before. I sat in my mom's basement for 8 months and taught myself the basics of JS and was able to get a job.
Since then, I've spent a metric fuck-ton of time programming (tens of thousands of hours to date). I've worked on compilers, 3D graphics, semiconductors and game engines.
I guess my experience kind of supports both sides of the coin. I was relatively easily able to get into industry with little experience (circa ~2012), although since then I've put in a ton of work to become a good engineer that people with interesting projects want to hire. Take that as you will :)
Haven't done one, but have hired out of them. Limited data conclusion: just like traditionally trained developers it's hit or miss. Best results: some other "classic training" (like physics or chemistry or engineering) with a desired career change == advanced junior with an accelerated learning curve. If you want to add motivated newbies to the parts of your software where pure programming is not as critical they can be great hires; the standard deviation is huge, but not sure if it's any larger than say a 2-yr diploma grad.
Did a 2-year program. There are people who graduated from it who to this day can't answer FizzBuzz; in that same group, there was a marine biologist who was looking for a change. I'm saddened that the former will probably be what people think of when they see our diploma.
If it makes you feel better, past one job or so out of whatever training/college, nobody actually cares, except perhaps the HR automatic resume sorter.
I did a grad scheme where a consultancy trained me for 3 months and then worked for them under contract for 2 years. I studied humanities at university and had never coded before.
The first 2 years were very hard and it felt like swimming upstream the entire time as I didn't have that basis. It was only at the end of the 2 year programme that I actually felt like I could provide value independently.
A lot of my knowledge is very applied and I've noticed that I lack the CS fundamentals which sometimes makes it a bit harder as I'm having to learn 'basics' as I go.
In fairness my role is a Data Engineer now, so it's a lot lighter on the more traditional CS areas like Data Structures and Algorithms.
I think one of the beauties of programming is that you can be self-taught. It doesn't necessarily help on the corporate ladder though, where they may expect some certificates.
Not sure if you'd count it as a bootcamp, but I did something similar.
I taught myself to program in BASIC aged 11, using the manual that came with the early-80s Dragon 32 home computer (a whole 32K of RAM). I wrote some simple games, worked out you couldn't really do anything cool without knowing assembly language, and completely failed to learn assembly language, as it was too hard for this 12-year-old on his own, armed only with a copy of https://colorcomputerarchive.com/repo/Documents/Books/6809%2...
By the time I was 13 I'd given up programming and basically barely touched a computer for the next 15 years, did a History degree, bummed around as a musician for a few years and then, courtesy of the UK dole office, did a 3 month, 5 day a week, 9-5 course in Visual Basic, followed by a 3 month work placement. The company I worked for employed me when the placement came to an end and I've been working as a programmer for the last 25 years - although the last Visual Basic I did was about 24 years ago...
I think I was lucky that I hit the London job market as the first dotcom boom was starting, it seemed to be pretty easy to get a job back then and no one seemed that bothered if you had a CS degree or not. (I still don't have one)
I have done some academic CS courses (via Coursera) and did learn some interesting and useful stuff - but I think the most important thing to have if you want to be a good programmer is a self-directed mindset: you have to want to go and figure stuff out and be able to learn on your own, using whatever resources you have to hand.
One of the things I like most about programming/software engineering is the fact that there is always more than one way to solve the problem, always scope for improvement and innovation.
You tell people who honestly want to get into professional programming that you have done this your whole life with the implication that this is the way? That anything less is “bootcamp” delusion? No wonder they get deflated. Of course they can’t go back in freaking time to when they were kids and get subliminal Programming Pearls from their wet nurse.
And it’s also just wrong. People get into professional programming with... less experience than that. You might have never met anyone in your particular bubble but they do exist.
> The key is deliberative practice: not just doing it again and again, but challenging yourself with a task that is just beyond your current ability, trying it, analyzing your performance while and after doing it, and correcting any mistakes. Then repeat. And repeat again.
I think this is the really important part, you have to challenge yourself and go outside of your comfort zones to keep learning.
I wonder how people’s learning habits will change with AI tools like GitHub Copilot. I used it for several months but randomly got logged out and am now finding myself valuing the slight inconvenience of looking up official documentation (and surprised how much more sluggish I felt for the first week).
Going through extra steps to learn something through primary sources and valuing being uncomfortable at times are important for my evolution as a programmer.
I started learning to code this year, and I keep thinking that I would have thrown in the towel if not for ChatGPT. For better or for worse.
The big difference for me is being able to struggle right up until I'm ready to give up, and then ask ChatGPT for insight. Usually my issue is a syntax one, and I have the concepts down (e.g. I was right to solve it using nested if statements, but forgot I needed to put a variable outside a function). This way I get that dopamine hit of being mostly right, and quick feedback on what I need to improve. If not for ChatGPT, I'd be left feeling like I just failed entirely and I'm not getting it at all, which I don't think is the case.
I think the same experience could be achieved with a good teacher, but then I'd need my schedule to overlap with theirs, and the feedback on problems would often still be delayed instead of instant.
Using ChatGPT to help you after you have been stuck and struggled is equivalent to asking a senior. A senior might hallucinate an answer too, but either way you will get pointed in a generally useful direction. That's usually all it takes.
But as a senior, I can’t imagine using an LLM at this stage for solving anything meaningfully complex.
A friend of mine in a senior role uses it all the time and says it has 2-3x'd his productivity. He architects everything using experience, but simple, time-consuming sub-routines are done via an LLM. He also uses it to create tests for his code and is quite happy with how it performs in these areas.
What language does he program in? ChatGPT and CoPilot have increased my productivity, sure, but I don't know if they've multiplied it by 2, and definitely not 3. I mostly program in Rust, and while they are good, they still often produce things that don't compile. Iterating back and forth to get something that works takes time, and it feels to me like sometimes doing it alone would've been almost as quick.
I could possibly be way off in my estimations, though. A true comparison would be having me do a task with and without it, but of course once I've done it once, the next time I will do it faster.
The big thing it's helped me with (also learning) is that my learning style is, after reading a new concept or thinking I get it, imagining fringe/edge cases, and trying to understand the battery limits of the concept and how it fits in with others to ensure I'm comfortable. I'm not explaining this well, but "what's the difference between this method and the one I learnt 3 chapters ago, they seem really similar. Why would I use one over the other, and why is there a need for both to exist?" would be a good example.
Without ChatGPT I'd just make sure I got an exercise right and move on, half cloudy and uncertain in my understanding. The static content on Codecademy obviously doesn't explain that when first teaching you, but ChatGPT lets you make ANY such comparison, and explains things exactly as you asked, complete with demonstration code blocks and said fringe examples where, in the above example, one method or the other would break or be suboptimal.
ChatGPT has been a blessing for learning C. How do you calculate the conjugate for a complex double when the compiler doesn't support complex numbers? ChatGPT will give you the correct answer together with math examples. One example out of many. Then I verify the result by looking at other people's code - and now I know what to look for.
I wouldn't use Copilot. I've noticed how it distracts people in screen recordings. They pause to see if what is proposed makes sense, and it usually doesn't. Distractions are the last thing I need.
> ChatGPT will give you the correct answer together with math examples.
Actually, back in the day, Google would have given you a tutorial with the correct answer together with examples too. You don't need LLMs for this, just honest search. But honest search is gone.
I wonder how long till ChatGPT gets enshittified too...
Yes, Google isn't so good nowadays, or my problems have become sufficiently advanced to not be a quick search away.
What might save us from ChatGPT enshittification is freely available LLMs. It probably won't be long before we can have a code companion running locally. It's probably already here.
In my experience if you search for an advanced topic it's likely to drown you in beginner's tutorials for anything even remotely related.
My pet peeve: I do embedded Linux on custom hardware, and every time I need to check something I'll be up to my neck in tutorials telling me how to enable, say, SPI on a Raspberry Pi in user space, even if I was explicitly searching for a totally different SoC and a recent reference for the kernel functions :)
Can't use any of that, because they all use complex.h (not supported by my compiler) and I have an interleaved array of complex numbers where 0 = real, 1 = imag, 2 = real, 3 = imag etc.
Dear ProbablyRealPersonGPT, please provide the necessary code in C with a short explanation
Yes. They already got the answer, thanks. Pointing out links after the fact is laughable.
Some of you are so set on the idea that an LLM can't possibly be even slightly helpful that you'd rather trudge through pages of documentation just to look for one small thing it would have told you right away, if only you wanted it to.
I personally think learning how to use ChatGPT and the like IS going out of the comfort zone.
To make an analogy to music, I see it like brass players at the advent of the valve.
It’s a new way of playing the instrument. The old ways and new ways aren’t congruent in all ways, but the new way does seem to have a higher skill ceiling
Yeah, I realised I'm not really learning by having such a cheat code. I was doing a Vulkan renderer, and the moment I stopped using ChatGPT I started making way more reasonable decisions on how to structure my code via abstractions and such.
You shouldn't take the easy route when learning lads
I don't even use autocomplete. I know that autocomplete is, like, table stakes for the developer experience these days, but I just can't stand videogame shit happening in my field of vision whilst I'm coding, and I've learned to relish the extra effort of typing the whole name in every time, it's like a vocabulary word I can wield with power and precision, rather than relying on the machine to finish my thoughts for me.
> it's like a vocabulary word I can wield with power and precision, rather than relying on the machine to finish my thoughts for me.
I like that point of view and generally code for fun that way. My biggest use for autocompletion is when working with unfamiliar Java APIs on a tight timeline.
I've worked with a fair number of less experienced, ChatGPT-focused engineers and they aren't all that different from the Java engineers of yore who were completely helpless outside of an IDE and could only work by extending existing code. These latter engineers were the exemplar "blub" programmers from PG's famous essay [0].
But clearly using an IDE does not make one a bad programmer any more than using ChatGPT will in the future. The bigger issue is the field is awash in people not really interested in programming. There's nothing wrong with this as everyone has to make a living, but this has a far bigger impact on the quality of engineers out there than the nature of the tools used.
Not long ago I was at a hip Bay Area startup and I don't think any of my coworkers, senior or otherwise, spent a second of their free time thinking about programming. For me I program for a living because it's great to get paid to do my hobby during the day. Getting started in the field during the shadow of the dotcom bust, the majority of the senior engineers that inspired me were likewise obsessed with programming and would be programmers even if it paid minimum wage. I don't think I would have necessarily become a programmer today since I wouldn't have been near enough the flame to ignite my own spark.
I think solving one's own problems with code will become more accessible to the wider population, and coding will become a bit more demystified. If you ask me, it will become a sort of literacy, like reading alphanumeric code.
But programming will remain the domain of CS, which requires a deeper understanding and proper study to perform well. So, while I agree with you, I don't think all of this is that bad. It must be a positive thing that more and more people at least try to pick up coding and can solve some basic problems with it, and if anything, it will maybe motivate their kids to learn it properly.
I kind of wonder if it's similar to how I never learn how to get somewhere if I use Google maps. If I put away Google maps and use my brain then I can learn the route after doing it once.
As a software developer with some tutoring/TA experience, I'm concerned about whether the next generation of devs will actually be able to code, if all they had to do was use a Copilot to complete their assignments, especially at the freshman/sophomore level. They'd be automating themselves out of usefulness at that point.
Then again, we've had similar conversations about "iPad kids." As an iPad (along with everything else) user, I find it's not a bad approximation of a laptop -- you do have a filesystem, for example, and apps continue to be more fully-featured (in an Apple-approved way, of course).
I use copilot myself, but mostly as a smart autocomplete for setting variables or other minor things that consume time better spent contextualizing through problem solving and interacting with teammates. It's not a substitute for being an engineer.
This is true, but I think it's better put this way: The key is making it a hobby.
Expertise comes with experience and experience will only come if you keep coming back. Some people can force themselves to toil endlessly, but here's some wisdom for mere mortals: start by searching for a part of software that you can truly love, then cultivate it.
It is also a good idea to place the hobby part of programming in a related, but not the same, field/tech stack as your day job.
If you simply do more work in your free time you might burn out. You want something that provides relaxation but still has synergy with your day job.
I've seen incredibly smart people in the coding space (definitely smarter than I) basically "smart themselves out of it" because they felt it wasn't challenging anymore. Or they felt tired of the "wrong" challenge, like "waiting for everyone else to catch up to them." (They were usually involved with bleeding-edge tech that ended up being the norm years later.)
If fascination is motivating, I guess be happy that you're not too smart? ;)
Do you need a person teaching you directly? Then you need a mentor (to guide you) or to go back to school.
If you just need someone to have done the legwork and established the exercises and presentation of material for you, there are a myriad of options: books, blogs, videos, open courses, paid courses, etc.
The trick I've found to mastering a field without having a teacher standing over me and guiding me (and even when I have a teacher) is to learn to construct my own courses and exercises. These are "synthetic". I look at multiple university courses on a topic and grab one or more books, and then work through them in an order that fits my way of learning. If they have a lot of exercises, I do them. If they don't or their exercises are poor (happens), I have learned (over the years) to construct my own exercises.
A major lesson from years of doing this is that there are no perfect teachers, courses, books, or other resources that will, on their own, teach me a topic to mastery. I have to assemble it myself to fit my way of learning and needs as a learner. That part requires introspection along with trial and error. I attempted courses and books that taught me nothing (or barely anything). After a bit of floundering with that, I took a step back and figured out what resources were effective and ineffective for me, what they had in common or were missing, and started assembling my own "courses".
That's a very useful skill to develop, and it transcends disciplines. Even if you do have an actual teacher with you this is still helpful (perhaps even essential) to be able to go beyond what they're teaching you and to cover over their weaknesses as a teacher (either general weaknesses or weaknesses with respect to you).
Nothing beats that feeling of starting the day thinking “I have no idea at all how to do this” and finishing the day with a proof of concept for 80% of the system.
I had no idea it was originally talking about a Pascal book, thanks for the link! Fun to read old versions and see what changed over time (language recommendations, for example).
Great article. At 8 hours a day, 40 hours a week - 10,000 hours of practice will take 250 work weeks to achieve. That’s about 5 years of nearly non-stop programming.
Realistically, it takes closer to 10 years to achieve that goal.
Realistically, it depends how much you enjoy programming. If you program a lot, you can easily hit 3k hours a year.
Most days I program 12 hours a day, most weeks 6 or 7 days a week. On a day I'm feeling very motivated I'll program for 18 hours. I've been doing this for close to 10 years. Occasionally I'll take a month 'off' and gear down to 3-4 days a week, still long days. By my conservative estimates, I hit about 3500h/year.
I work this much because I fucking love programming, and there's nothing I'd rather do (other than my daily surf in the mornings, of course :)
That’s great to hear. I said “realistically” because most people don’t program for 12 hours a day. I love programming, but I love other hobbies too. Everyone has the same 24 hours in a day :)
The 10,000 hour heuristic isn't particularly accurate, it's just popular.
FWIW, 10 years professional experience seems to be about right, give or take, to make a solid developer. I don't think I've ever met anyone who really got there in 5. Some people shave a few years off by having done it obsessively as a teen also, but at some point that typically overlaps anyway.
Of course there is also the old saw about 1 year, 10 times being a problem. I've met plenty of developers with >10 years experience that weren't that solid, too...
I have a little over ten years of experience, and when I read code that I wrote at 1-2 YoE it's shamefully bad, but after that it starts to get pretty good. I don't think anyone would notice a huge change between my code 5 years ago and today; I've mostly improved on soft skills.
Though it probably helped that I had two mentors at a job 2 years in who were ruthless about reviewing my code.
Oversimplified: 1-2 years of not knowing what you don't know. Then 3-5 learning how to write code, then 3-5 learning how to really write software. Most of that last learning curve happens outside the editor. That's a little too linear and non-overlapping, but the essence is true, I suspect.
> At 8 hours a day, 40 hours a week - 10,000 hours of practice will take 250 full weeks to achieve. That’s about 5 years of nearly non-stop programming.
Isn't this basically a software job? Like, get a junior role and work your way up over 5 years doing 40 hrs a week.
If the only programming you do is generic commercial work, you won't get very good, especially if it's at the same job or on the same kind of project.
You can learn more about programming by giving yourself new and challenging projects at home. Doing both personal and professional work is the fastest way to progress while learning lessons from both. (commercial work teaches you the "professional" side of software engineering, which has almost nothing to do with programming)
> If the only programming you do is generic commercial work
Congrats, you've found an edge case.
I've spent my whole life working in "commercial" settings, I've learned everything doing that, and it's all programming. I really don't know which places you work at where a software engineer doesn't do programming. That's the whole job, 8 hrs a day.
Fuck going home and slogging through another 4 hrs of software, unless you have the energy, in which case good for you, you can fast-track.
This is true in any profession. Building intuition comes from doing more and more with incrementally difficult challenges. There comes a point at which one can make connections and apply what you know from one area into another, and creativity kicks in.
I can't source the video that I watched 5-6 years ago [0] but it made a point about the increasing numbers of programmers and how the majority had less than 5 years of experience. It made sense in the context of IT/web continually having a larger involvement in our lives.
Can safely say after 20 years that there's maybe a bunch of information I don't need to know anymore that people <5 years experience probably would never need. OTOH experience is experience and knowing the constructs of things and their reason for being is always helpful.
And there are plenty of less experienced programmers than me who can do many things I can't even imagine.
TBF it's a pretty wide field, this Turing complete stuff.
After 40 years of programming, 33 of that professionally, I can’t think of anything that was a waste to learn other than scrum.
Syntax is something I never learned, I programmed with man pages and manuals open at all times. But every language and technology I learned taught me something useful about a fundamental problem in computing and systems. The way we address problems gets wrapped in different clothes but the problems themselves are the same.
There are no shortcuts, and I know for a fact that as a programmer I'm orders of magnitude more capable than my 5-years-in self. I'm excited to learn more and do new things, more now than ever, because they will unlock more of the craft to me. The only thing I worry about is my body and mind failing me at some point, not that I'll run out of things to learn or be lapped by those younger than me.
I feel most of my time as a professional software engineer has been spent learning stuff that I didn't need to. And I would argue that the majority of debugging time goes into debugging external code: libraries, the language itself, company-provided platforms and DLLs, experimentally figuring out how to use a system or tool that isn't documented, and more. It's pretty miserable. It'd be like being a carpenter or mechanic but spending the majority of your time debugging, fixing, and communicating with the manufacturers of broken or poorly working tools.
If only I got to spend my time writing interesting code and learning interesting things.
All of these things are the craft of software. If you don't enjoy figuring out how to efficiently debug and understand things that are complex, you're probably in the wrong field.
The point isn't that the day-to-day is full of brilliant insight, but that through that process, year over year, you become really adept at the things you're struggling with, you see patterns more clearly, and you can cut through the struggle faster and focus more on doing cool things with that ambiguity.
Frankly, today is better in these respects than it's ever been. The tooling for insight and debugging is mind-bogglingly powerful relative to 30 years ago. But that doesn't make struggling in your own ignorance, without a clear guide through, any more fun. That feeling of desperate helplessness and idiocy is the feeling of LEARNING, which is compounded by ambiguity. It's easy to feel angry with X for not doing Y (i.e., lack of documentation) in your situation, but the right answer is to buckle down, figure it out, and focus on developing skills to make it easier the next time you have to learn.
“When art critics get together they talk about Form and Structure and Meaning. When artists get together they talk about where you can buy cheap turpentine.”
― Pablo Picasso
>> It'd be like being a carpenter or mechanic but spending the majority of your time debugging, fixing, and communicating with the manufacturers of broken or poorly working tools.
What do you think a mechanic does ALL DAY? Very few carpenters start with a raw blank of nice wood and craft something amazing. It's all interpreting plans (poor spec), framing (boilerplate), figuring out an addition (bolting features onto a shaky platform), correcting things that have failed (maintenance), addressing things that were built wrong (bug fixes), or fixing shoddy renovations (tech debt). Sounds EXACTLY like the majority of software development.
Are you implying that they are fixing their tools rather than cars? I don't know any mechanic that would put up with that and would replace their tools with tools that actually work.
Except mechanics' tools are fairly simple devices based on thousands of years of standardization. I think a better example would be someone working in a lab on experimental devices. Software tooling is often absurdly complex and relatively new, with no clear standard. In 100 years, I'll wager, things will not be like this.
Yes, exactly, because that's actually supporting my point. That is their job, so they spend all day doing their job instead of diversions along the way.
I once bought whatever jamb saw they had available at one of the big box stores.
It was basically unusable. Fundamentally flawed design. There's no possible way anyone tried it and went "yes, this is fit for purpose". Worse, the complexity of its design which caused so much of the trouble (it had a single-sided blade that could be spun around—the locking mechanism was simply flawed, I didn't get a bad one, it just could not work well) could have been entirely avoided by spending barely more money on manufacturing. I mean it was also very dull but at least it would have been usable, if slow.
I managed one jamb with it, taking probably 10x as long as a version of the tool fit-for-purpose would have, busting up my knuckles plenty, and making an uglier cut besides, so I could move on, and ordered a different one for the rest, since the store didn't have any other models.
That's about the median quality of all tools and libraries I encounter in software, in my estimation. Size of vendor or how "smart" their employees allegedly are (LOL. LMFAO.) doesn't seem to much matter, either. And usually it's like that crappy jamb saw being the only jamb saw available at the store, except there's no option to skip the big box store and look for acceptable ones online, and if there are any other options they're somehow even worse.
Luckily 99% of the tools for "real" work I've bought haven't been anywhere near that bad. Even the cheap ones. Almost all of them are more-or-less sensibly designed and actually do what they're supposed to without causing a huge amount of trouble.
Most of an average day programming, for me, metaphorically, is swearing at this fucking piece of shit jamb saw while my knuckles bleed and I'm fucking around with its brokenness instead of getting anything done that I actually wanted to.
On the other hand the tools are mostly very simple and highly standardized. The simple tools are applied to the problem through a combination of applied experience and constructed jigs. I guess the equivalent in the software world would be classic non-fat Unix as your base tools, and building your own libraries and toolkits as your jigs (no or very limited third party deps), while aggressively avoiding complexity. You’ll get reliability this way; of course, there are serious tradeoffs.
Don’t misunderstand me, though. I agree that the modern software treadmill is horrible and needs to be tamed.
You know, I'd actually like to add to my earlier comment.
> Yeah, it’s not quite the same.
I actually don't know quite why I wrote what I did, because the concession really isn't true. I can only imagine we were discussing some romanticized view of the subject. My own experience, however, shows that reality is otherwise.
Cheap tools usually don't work well or last; you end up constantly complaining about them and constantly replacing them. This is true whether you're talking about power tools or simple hand tools. (How much do you pay for most of the software and libraries that you use?)
For the hand tools --- some things like hammers basically work, but other things, unexpectedly, do not. Handsaws and handplanes are two good examples of simple tools that are fundamentally broken from the manufacturer, and in many cases simply cannot be fixed due to design. Other tools can be salvaged, but don't work out of the box. You can shell out inordinate amounts of money to a company like Lie-Nielsen, or hope to get lucky on the purchase of an antique, but the typical products you might expect to work, don't.
For power tools, the cheaper tools might technically do a job, but they often function in ways which are detrimental to the work or to personal safety, and don't last long.
You might think the solution is to spend more money for quality tools, but this doesn't always work out either. I'm thinking of a DeWalt biscuit joiner and a Makita pin-nailer; both are well-respected brands for "real contractors" who rely on their tools and expect them to work. The Makita pin-nailer doesn't load its nails right, so I have to futz with it after every nail --- which kind of defeats the purpose of the tool (it didn't do this originally, but it didn't take much use before the problem developed). The biscuit joiner, despite being a well respected model, cuts slots that have so much slop in them that there's no point in using the tool at all.
Now let's talk about machines briefly. My radial arm saw (no longer a common tool, but once quite common both in homes and on jobsites) cannot be calibrated to proper specifications. One of the adjustments cannot be made to spec and it throws off several of the others. The result is a tool that, while useful, is only half-functional. I have theories, but don't know exactly what is causing the issue (I got it second-hand); I would need to completely tear it down and grok the entire design, then rebuild it, to properly fix the issue. By the way, the calibrations it does have need to be periodically checked and corrected (the RAS takes a lot of flak for this, but I find this is true of any machine with adjustments).
This is usually about the time when the internet gets up in arms and tells you to throw away the RAS and use a table saw, so let's talk about my table saw --- the fence can't be calibrated to lock parallel to the miter slots. I don't think the fence is broken by design, but I don't know exactly what's wrong. The calibration process is logical, it just inexplicably doesn't work. I suspect one of the fence rails may be slightly out of alignment. I could fix this by building my own high-quality fence system --- I have the skills --- but this is analogous to throwing out your third party library or tool and writing your own.
In all of the above cases concerning power tools, the answer is maintenance/repair, but in all cases (aside from building a new fence), it is not the work I want to be doing, nor is it work in which I am experienced, nor is the necessary work obvious through casual investigation.
Of course, I have plenty of tools that work great and never give me trouble, other than periodic maintenance, but it's the problem tools that stand out. Software is the same for me; most of it works fine, some of it breaks down due to bit rot (maintenance required), some of it is broken from the manufacturer (out-of-box repairs required), and some of it is just piss-poor in design (see hand tools above).
I still hate the treadmill just as much as you appear to, but romanticizing the life of a tradesman is probably a mistake. Entropy is everywhere, poor design and unfortunate occurrences are everywhere. I don't know if you have any personal insight into the life of a tradesman, but they deal with all kinds of BS, from all angles, that just isn't commonly discussed outside the field in question.
===
Addendum:
I keep mentioning software libraries, but those really aren't tools. They're more analogous to prefab construction components, so in that case we could talk about items such as pre-hung doors and prefab cabinetry that is supposed to save time and require less skill to install. That's great, until you discover that the item was assembled out of alignment at the factory, defeating the entire purpose. I could go on ad nauseam, but I imagine you probably get the point by now...
God, it's all just miserable and depressing when you really start digging into it, isn't it?
It's part of the job, but it shouldn't have to be. If software engineers just did things properly, then we wouldn't have to deal with each others' messes.
> I can’t think of anything that was a waste to learn other than scrum.
Believe me, I have plenty of criticisms of that entire culture but it has a lot of good ideas. What in particular (besides the obvious overwrought ceremony) do you find not worthwhile?
It has no good ideas. All the ideas that appear good are taken from agile, which 20 years ago WAS good. But then the process managers came in, the management seeking clarity on delivery dates, and engineers who got into it for a good job seeking ways to avoid doing real work because they're actually not into engineering. It's a morass of everything awful about stupid management cargo-cult BS meant to emulate, but not understand, excellent engineering culture, with roles meant to keep project managers employed, and, overriding everything, stupid "rituals" to try to measure velocity and delivery dates for leadership's need to control. It delivers none of what agile promises because it is satisfying the same needs that make waterfall so slow.
It's completely wild to me to have been in the field long enough to remember when agile was revolutionary, and then to see it slowly become twisted by decades of mismanagement into something that people hate today.
For most engineers today "agile" just means using jira, 30-60 minute long "stand-ups", and a perpetual panic about not getting enough done in 2 weeks.
Yeah, I think kanban is the right model for almost every development team. For teams embedded in a massive orchestrated project spanning many teams, waterfall actually is pretty reasonable.
The idea, and especially the unconditional forcing of it down people's throats, that you can run a marathon faster by running it as a series of 100m sprints. That idea is wrong, as has been shown many times in practice, yet even the slightest doubt of it is squashed. Scrum is basically an ideology/cult, and as is typical for such, it is built on a lie: it is sold to the rank-and-file as performance-improving, whereas its real goal is improving reporting to the manager at the cost of the rank-and-file's performance; the manager's convenience at observing status, even if things are really moving much slower as a result. It is similar to how in a religious cult people are sold on supposedly reaching nirvana/etc. while in reality the guru/pastor/etc. abuses them financially/morally/physically/etc.
Scrum forces you to package your work into daily sound bites. It pulls your focus elsewhere and reframes your work in a wasteful way, where you are always cutting corners because of how it affects your 30-second daily advertisement.
Their "sound biting" of a complex issue resonates. It can be difficult to even explain what I'm doing to several people in the bandwidth of 2 minutes sometimes.
Most charitably, it's just a quick, lossy status report and forced request for help with blockers.
I feel like something is wrong if you feel you need to cut corners to deliver a "good-looking" report in standup. You shouldn't cave to such pressure.
I was referring to "advertisement" part not the "daily" part. Stand ups can be a way for people who know / want to game the system to prosper in the eyes of some idiot scrum master and managers, while our guy here gets smoked and made to look like he isn't productive.
You shouldn't cave, but if that's the culture around you, you're in for an uphill battle.
I have plenty of complaints about scrum the process, but don't think it's much different than the results of scaling up and standardizing any process across teams of varying levels of skill and commitment.
> I can't source the video that I watched 5-6 years ago but it made a point about the increasing numbers of programmers and how the majority had less than 5 years of experience
I believe that was in this Bob Martin talk — The Future of Programming
It is a rapidly advancing field because it is young, so it makes sense that we’d accumulate a lot of ephemeral information.
It feels like we’re still in the alchemist days. Eventually the chemists will come along. They’ll systemize our ad-hoc observations and jettison a ton of them. Until then… Newton was an alchemist and he still made some pretty big contributions.
I agree with the first part, but not sure about the second. I believe software is much more engineering than science. The challenge is in organizing the bits. Most of the things we learn, debate and design in software are more or less arbitrary structures created to facilitate our own cognition. The hard science behind computers is much more cleanly abstracted away, compared to, for example, chemical engineering. For that reason, I don't believe software advances in the same way that natural sciences do. It's much more social in the sense that groups rally around different ideas and implementations, and wherever that energy goes also shapes the new ideas that bloom on top.
Put another way: I think almost everything in software is arbitrary. Certainly some ideas are better than others, but on balance everything is a tradeoff, and I don't really expect some powerful new discoveries or ideas to reshape the industry towards more structure and efficiency. I think it's more likely that AI replaces human programming entirely (although I don't think that's likely either).
There are layers here. The purest theoretical CS is like mathematics in general. Also necessarily structured around cognition, but seemingly groping for something more fundamental or universal.
I'd agree lots of software is like engineering in that it is further tilted towards the practitioners and marketplaces and their limitations. It has more focus on cost-effective repeatability and risk management. But traditional engineering is more rooted in some physical reality. I.e. civil engineering has a basis in physics and materials science, because those drive failure modes and design criteria. The more social aspects, say, evacuation routes for fire safety, developed within an existing practice that had already been formalized to manage risks of structural failure etc.
But, lots of software is far detached from such a physical realm. It is more like legislation or contract law, or the arts. It's all about what people want to say or hear and display to each other. Massive influence from the cognitive and social context. It may be colored by technique and available materials or resources, but thanks to Moore's Law and friends, these practical limits have less and less influence on the shape of the outcome. Significant failure modes can be rooted in the vagaries of psychology, social sciences, or even political sciences.
However, software diverges from law or arts when we attempt to scale it up. We want to reach speeds that cannot be supervised and quality controlled by a human operator, or volumes of support tasks that cannot be managed by an affordable human workforce, or impactful decision-making and effects that cannot be adjudicated by a higher court. You need something like engineering here to manage risk, but there is no physical basis for building codes etc. Not only the new software products, but even the elements of risk are social constructs.
But I think the metaphors are broken as one pushes too hard "up the stack", into AI replacing knowledge workers and analysts. Our current best social systems completely depend on human adjudicators in hierarchical layers of oversight. Courts that can comprehend the big picture and resolve differences when our other abstract social activities go wrong. I am nervous about a future where tech behaviors become too complex for such oversight. Or worse, where naive practitioners think they can solve it with another layer of unproven tech. Instead of turtles all the way down, it becomes some faith-based electronic oracles all the way up?
A better analogy for software engineering is search. Given our current position and a desired change, we search for a solution in the fog. Analogies to other problem domains often fall short as search doesn’t feature as prominently.
The chemists are already here, the Knuths and such - it's just being ignored or isn't directly "relevant" yet.
It's like building buildings without "building science" - which, surprisingly, we have actually been doing in many ways to the current date.
There aren't really formal schools of "be a carpenter and build a house" - it's all very master/apprentice even if it is not formally so everywhere. And since things like "energy usage" of the building didn't really matter until recently, many buildings just kept being built the way they always have.
Now things like "can we keep this building comfortable at much less energy usage with some additional up-front costs" are coming to the forefront, and how buildings are built is slowly changing.
Compared to that, software is very in its infancy, because the real only bottom line items are "does it mostly work" and "did it not cost infinity dollars more than it made".
It's not in its infancy. The fundamental problems regarding algorithms and data structures, OS theory and formal language theory are well understood and have articles on Wikipedia. We know Amdahl's law, the CAP theorem and the ABA problem.
The problem is the explosive decentralised development of tools, languages and frameworks under a hyper-competitive neoliberal economic system. Someone invents a language to lock programmers to their platform. Another programmer creates a tool to transpile that language to their half-assed dialect of another language. This leads to a huge problem of entropy and inefficiencies at all levels.
It's a term thrown around by various people to refer to "building houses with something slightly later than 1950s designs".
When people hear "civil engineering" they think "skyscrapers, bridges, roads" - this is more like "if you wrap a house in insulation, it's insulated better, but that can also cause moisture capture, which destroys it" type engineering.
Because programming can be a means to an end. You can do both: Learn enough programming to do useful things for yourself -- maybe not in 24 hours, but 24 weeks is not unthinkable. And spend longer to learn it as an art if you manage to get over the initial "hump" and are still interested.
From the perspective of pulling some C++ programmer in 1993 into 2024, and dropping them into a large C++ code base consistently written with the latest C++ idioms.
(And yes, this is humorous exaggeration. But the name of C++ is apt. A C language dedicated to accumulating features.)
It's a post-increment operator. If it was called ++C you'd have a point :)
I just don't understand how someone could have been working in C++ and not picked up the largest changes even just by osmosis. My code-base "upgraded" to C++11 about a year ago, but I can still read C++17 and am not intimidated by C++20 fragments. YMMV I suppose.
(Giving you the benefit of the doubt that you may have taken the comment literally; if so, I'd maybe apologize for insulting someone with 41 years of experience in a field that is still young, whilst having no knowledge of exactly who you are talking to.)
Teaching yourself programming is important on many levels as the job is a continual learning type of job. What you learned 5 years ago might apply today or you may have to learn something completely new to stay on top.
It's never been easier than it is now.
For example, if you want to learn how to make games...
There's a guy on youtube called clearcode. He will teach you how to make games, in a Bob Ross voice...pretty much from scratch. You can also use ChatGPT to help you out.
The resources for learning are so much better. The Socratic method of just asking questions is now possible with AI.
I think Sam Altman said that he is seeing a 3x productivity increase from AI helping out programmers.
Personally, I’d like to see the data/research behind the “3x” claim to buy it. Especially when it is hard to imagine Sam Altman as an unbiased observer when it comes to benefits of AI.
One angle to these multipliers would be this:
A generalist programmer can take on other software-engineering tasks beyond just programming with the support of ChatGPT, like DevOps, DBA work and test automation. For a small team or company, that actually saves or postpones FTE hires.
I agree there are many more great resources but also many more terrible ones which will take your money or attention and give you barely anything in return.
I've always been fond of this advice. I think it correctly describes some of the activities I would expect you to do and experience before you will feel like a deep expert in programming.
I also think it's worth saying, "you don't need to be an expert in programming to try it! Start by tinkering!" (Other comments have been downvoted for describing what tinkering might look like, but any tinkering is valid! Try running llama.cpp on your MacBook! Quantize a model yourself!)
Ultimately, you'll become good if you do it a lot, and you'll do it a lot if you have one or more hobby projects that you're motivated to work on. I have a few friends who are completely self taught (one going the Arduino / electronics / hardware route, the other going the web app and tools for personal use route), and the keys to their success is that they have projects they like working on. And they've kept at it for years.
They don't know everything, and they don't necessarily have great foundations, but it's not too hard to learn things on an as-needed basis these days. Both of them find information very differently than I do, which is also valuable for me to learn and see.
There are other ways to motivate yourself: taking a class with homework that gets you coding (the success of this strategy depends on the person!), finding an accountability buddy who you discuss your projects with, finding an open source project you're interested in (start by adding comments, fixing typos, or looking for a good "first timer" GitHub issue), or doing Recurse Center (although their job placement program may have limited options for junior / entry level engineers).
At some point, you'll have beaten your head against a problem (how can I order these things correctly? How do I get this interaction to work correctly? Why is my component re rendering in an infinite loop?) and you'll watch a video or read a blog post explaining it and you will truly understand the issue. It will be common, and you might have encountered it in a 201 class, but your first-hand experience will help it stick.
Another totally valid way to learn programming is to be good enough to get a job [1] and then be paid to figure it out day after day, and ideally have experienced programmers mentoring you. I've seen people go through bootcamps and get a lot out of it, but I think the quality is highly variable.
[1] unfortunately, while I think this was a really good path ten years ago, the bar continues to be raised (new grads with internship experience can be very good, companies are not hiring as aggressively today as they did in the world of zero percent interest rates)
I'm self taught, started with IRC bots in Tcl in the late 90s and making websites with Netscape and FrontPage in the early 00s. Dreamweaver and Fireworks too. Got business degrees by the mid 00s, but making things with code kept me going.
Got a US remote job early 10s (I'm in the EU), and by mid 10s I was called "software engineer" by my peers and bosses.
I have just fairly recently upgraded my LinkedIn bio from "Web developer" to "software developer", as I can't even call myself an engineer.
Honestly, I have no idea what I'm doing other than making a real effort not to do stupid shit (and RTFM).
And then again you ask the run-of-the-mill SAP consultant how long it takes until an updated data record is available to a connected system and the answer will be something like 86.400.000.000.000ns
Peter is a gem who really seems to understand the structure of reality slightly better than just about anybody, and is a nice person to boot.
I guess I spontaneously picked up on the 'challenge yourself a little, improve, repeat' strategy a long time ago. I often implement things over a decade or more, through a series of small increments. I also decompose many problems recursively and use a sort of A* approach to make progress, sometimes revelling in some detail for months or more at a time before solving it, then backing up the stack to the larger problem I was working on.
For example, I'm building an automated microscope. I work with folks who buy $1M scopes just to speed up their science. I don't want a $1M scope- I want a $1K scope that does 30% of what the $1M scope does. To do this, I've learned how to design and 3D print components, integrate motors, controller boards, etc. Eventually I reached a point where improving the illuminator (the LED light that provides the light to the sample, which is then picked up by the camera) was the most important step, and so I took a deep dive into LEDs and the electronics required to support them.
This has meant putting the scope down and instead creating a series of designs for PCBs that incorporate increasingly sophisticated electronics and higher-power LEDs. I set a challenge for myself that is beyond my ability: design and have manufactured a working constant-current driver, and assemble the PCB myself using surface-mount components. When I started, I knew nothing about constant current, or SMD, or designing PCBs. I started with the simplest possible designs: copying a reference design for a controller, cloning a board I already have, incorporating low-power LEDs onto a board. Each step along the way, adding something slightly more challenging.
When I do this I fail a lot. Some days I get a PCB made to my design after a week of waiting (JLCPCB is AMAZING) and within 5 minutes realize I made a fundamental mistake. Other times, a board works perfectly and I "level up": I can now take everything I learned in the process, and use it to pick up the next challenge. Sometimes I get frustrated and depressed- not being able to figure out something that should be straightforward, and then I either rubber duck it, or ask a simple/stupid question on reddit, which typically unblocks me.
Today, I expect to receive my next constant-current board design. If I assemble that and it works, I can then proceed to building a board to host a high-power LED. That will introduce all sorts of new problems (heat management) that involve going into KiCad, thinking about stuff, making some experiments, sending a design to JLCPCB, waiting a week, and then assembling a bunch of boards, most of which will burn out (high-power LEDs are tricky; if they get hot, they fail faster). There's an opportunity to buy some thermistors (little temperature-measuring devices) and put them on the board to see how well my heat-spreading design works.
At the end of all this, I'll have a world-class transmission light microscope that can track tardigrades for hours at a time (itself an enjoyable delve into modern computer vision techniques), and I've talked to the world's leading tardigrade researcher, who wants to incorporate my ideas into his research to make tardigrades a model organism.
By the way, if I had stayed in academia, I would NEVER have had the time, money, or energy to pursue this; I'd be stuck working on my funded research. NOBODY wants to give me money to design scopes that are roughly where the state of the art was in the 1970s. But if I keep this up, in a few years I'll be ready to go play with the big boys and girls in the robot biotechnology labs with their $1M toys.
Bringing this back to Peter, I had the chance to work with him on a project (attempting to disprove the Beal conjecture by finding a counterexample). He did all the brilliant math and we wasted a bunch of CPU (and I mean A BUNCH) trying to find counterexamples. I like how when he wrote https://norvig.com/beal.html he wrote in the nicest possible way that I was wasting time and energy.
Just doing something for 10 years doesn't make you an expert. I started coding 11 years ago, as a kid, but my knowledge became so diffuse across the topics I was interested in that I never really became an expert in any of it.
By "doing something for 10 years", I think the implication is you're actively working on that subject area, regularly, for 10 years. Not per se occasionally working on something every now and then. You can become a really good Chess player if you consistently play for 10 years--you figure out all the strategies and shortcuts, and it would be unusual if you weren't really good compared to someone playing for a year (assuming they're not some exceptional learner). But if your metric is just "I've played it occasionally since 10 years ago" then indeed, you probably didn't develop much breadth or scope as you were constantly forgetting and relearning as opposed to compounding your knowledge.
As for programming, I don't think programming is all that hopelessly complex and broad a field, but it can seem that way to a beginner. Most computer science concepts translate very well to other parts of the field, and the core programming constructs and libraries don't change much at all. How many ways can you configure a website, a mobile app or a database? Your instinct might be to think about all the different libs you can pull in, all the different programming languages, etc. But they all do roughly the same thing, they all compile down to the same stuff. You just have to develop the skill of understanding the fundamentals as opposed to getting lost in a sea of high level abstractions.
I agree writing a variety of software will have similar base ideas, but still, having a good foundation in a topic doesn't make you an expert. I've barely used Java, but if I had to I could learn enough to write it pretty quickly; that doesn't mean I understand at all how the JVM and Java's GC work, or what a factory is.
It's not about "doing something", it's about intentionality, i.e. you're working to advance your knowledge and expertise with some form of a plan, even if that plan is "try a bunch of stuff and see what sticks, then double down in that area".
Also, broad-based knowledge can be the expertise; think of a general contractor whose skill is tying all the specialists together. The best software managers I've ever had came from skilled-generalist backgrounds vs. incredibly deep specialists.
The idea that you should find some narrow niche in which you are so passionate that you dedicate the rest of your life to attaining mastery is only valid for such a small part of the population and, in my eyes, a little sad.
Yeah, it's nice feeling like I have a foundation in enough topics that I actually feel like I would know vaguely where to start looking to solve most problems. Definitely not an expert though.
Maybe, or maybe I became used to and ossified in bad strategies. I hope not, but I think I have learned that learning has to be intentional. Meaning, for me: writing stuff down, trying to find other solutions to the same problem rather than just going with whatever my first idea is, etc.
I think it does, just not in the things you're measuring.
Sitting on a couch for 10 years will make you an expert. Not an expert in sitting on couches in general but an expert in sitting on your couch during the period you were there.
24 hours? Oceans of time! I had "Teach Yourself C++ in 10 minutes".
I had done some MSX-BASIC, but after we got a "real" PC I wanted to learn a "real" and modern language. This is what they had at the local bookshop.
The "10 minutes" is done by explaining what C++ is and then it declares "there, in the last 10 minutes I explained things and you now know what C++ is". Ehh...
I didn't understand a lot of it. Chapter 5 or so is templates. It's pretty thin, and just rushes past things and never takes the time to really explain anything. It may be somewhat suitable if you're experienced in other languages, but it's absolutely not suitable for beginners. Visual Studio also didn't help (at the time I thought you needed VS to program on Windows – it was 1999, we didn't have internet, and I was 14, so what did I know?)
Aside from being borderline scams, these books are worse than useless: they're actively harmful. As a result of this book I gave up programming for years, thinking I just didn't have what it takes. It wasn't until years later, when I discovered this "Linux" and then "FreeBSD" thing, that I learned you can "just" write programs with "just" a text editor, and that things like Perl and Python and C exist.
If you see one of these things at the bookshop you should steal it and throw it out. Haha, only serious.
Reminds me a lot of when I was 13 and had learned Visual Basic 6 and wanted to make a game, so my mother bought me a book on how to make games. “How to make games” she said excitedly as she handed me the book. “How to Make Games” it indeed did say. It also said under it “in C++ and OpenGL”. I had no idea what I was doing or what was going on. Enter high school. An elective called “Computer concepts and programming” was the only computer class outside of “Keyboarding”. It was taught by a female teacher who used to write punch cards for a living. She introduced us to C. All of a sudden, that book from a few years ago clicked in my brain and next thing I knew, I was writing games in C++. They were crap. Horrible performance. But I made them from scratch.
Fast forward another decade and then a lot of these Packt publishing style books on making games and learning coding - except for the holy grail of book series - GPU Gems… if you ever want to feel stupid, go read some GPU Gems articles. Today there’s a lot of choices - PBR - Vulkan - anything Eric Lengyel.
I still feel like it’s a trap. All these “Teach yourself…” should really be titled “Become obsessed with…”
Because we live under an economic system that says you must produce value for capital or die.
This is all very good advice. AND, I think we would have many fewer bad programmers and many more good ones if fewer Knuths were spending eight to ten hours a day mopping floors at fast food restaurants for $7.25 an hour, or writing garbage JavaScript for fly-by-night startups for $65,000 a year.
Can you name any social structure in which this is not true? Even medieval monks who are participating in a completely different society have to do the chores.
> Knuths were spending eight to ten hours a day mopping floors
I don't believe this happens and I know quite a few eccentric and unsocial smart people. Much more likely is for them to spend the next 10 years at a relative's house speedrunning Mario.
> and I know quite a few eccentric and unsocial smart people.
Selection bias. People who have been able to distinguish themselves as “smart” are already a narrow sample, as the results of many UBI initiatives suggest.
> I am, somehow, less interested in the weight and convolutions of Einstein’s brain than in the near certainty that people of equal talent have lived and died in cotton fields and sweatshops. — Stephen Jay Gould
> Can you name any social structure in which this is not true?
That's a rather disingenuous framing; I think we could probably do quite a lot to reduce the extent to which it's true without any particularly radical changes to our society. When we do start talking about radical changes, there are a lot that are not just likely to work but actually pretty well proven, like a UBI, improved labor rights, well-regulated and well-supported unions, wealth taxes, and so forth.
The possible alternatives are not "if you don't have a job you starve on the street" and "fully automated luxury gay space communism".
> Much more likely is for them to spend the next 10 years at a relative's house speedrunning Mario.
Despite the disdain which you seem to hold for speedrunners (why?), a lot of really interesting hacking comes out of the speedrunning community. Some very interesting applications of deep learning had their start in the TAS community.
I'd rather ten people "waste" my tax dollars on speedrunning or art I don't like than one person be prevented from making art that's important to me (or doing some cool math, or whatever) because their boss's boss wanted a percent of a percent towards another yacht.
You're most likely getting downvoted because, the modern let's-hot-glue-a-bunch-of-dependencies-together approach notwithstanding, programming has gotten significantly more complex over the years. Can you muddle through with YouTube videos and SO? Sure, if you want the career equivalent of staggering around in the dark stepping on rakes.
The idea that one needs MIT-level training with SICP and TAOCP is elitist and sickening. You start by writing small and useful tools that help you explore the language features. You compare your code to other people's code solving a similar problem. You look for ways to make it shorter, more readable, more efficient by reading SO and watching YouTube.
Nobody says you do. In fact, I agree that a lot of the paths people who have never mentored a programmer their entire life think you have to take are bullshit. (Teaching is not mentoring. Mentoring is mentoring).
However, that doesn't mean that it doesn't take a lot to become a decent programmer. It does. And not just in terms of time, but perhaps most importantly in terms of intellectual maturity.
Nobody said an Ivy League education is (or should be) a requirement to write code. That said, without some form of guidance a fledgling developer has no way to discern good code from bad, or to grasp the architectural or security implications of some random-ass code snippet they lifted off of SO. So again, unless your plan is to spend the bulk of your career repeating other people's mistakes in an effort to learn the hard way, getting some kind of formal training should be a primary goal. I'm entirely self-taught, and let me tell you from experience: rediscovering a design pattern from first principles years after the book was published isn't as heroic as it sounds.
Exactly. The newer teaching materials and resources are great, but they don't change the advice in the article about how to learn programming. In fact, they might make it harder in some ways because of the temptation to take shortcuts.
You're downvoted but learning material has improved a lot since then. Spaced repetition + today's resources should remove a couple years from the 10.
People can even practice programming on their phone with termux. Anyone can program anytime instead of having to wait to use the family computer for just an hour like back in the day. It's so easy to get mileage in.
The complexity of frameworks has increased, but the fundamentals haven't changed and the learning material on the fundamentals is more available and free.
This Medium article[1] argues there is a decline of genius.
> I think the most depressing fact about humanity is that during the 2000s most of the world was handed essentially free access to the entirety of knowledge and that didn’t trigger a golden age.
But I believe we are in a golden age, innovations are made every year, life keeps getting better. Change is so frequent that it has become mundane and we don't appreciate it.
Many people are taking advantage of the quality of learning resources, and it's speeding up the development of everything.
> People can even practice programming on their phone with termux. Anyone can program anytime instead of having to wait to use the family computer for just an hour like back in the day. It's so easy to get mileage in.
It may be like that for some, but generally in the 90s computers were widely available and those interested were usually allowed by their family to use them as needed.
There were tutorials. There were digital manuals. It was not that different.
In countries like America, yes, those 90s computers were widely available.
But now practically everyone, no matter the country, has a phone. The availability is much higher. There are more ways to learn: videos from Harvard and fun games are the new introductions instead of manuals.
Sure, for a few countries (perhaps Brazil, India, China). That is not really related to technological progress, though, but to economic development in those countries.
Everyone else either had that access (Europe, Australia, Japan, even Russia) or is still too poor to have time or is lacking the necessary fundamental education to be able to focus on it even now (for example, much of Africa, unfortunately).
Ironically, your advice of watching YouTube videos and reading SO posts sounds more like those "Teach Yourself in 24 Hours" books. There are no easy shortcuts to truly learning any topic.
I only read one of those (Teach Yourself Microsoft Access 97 in 24 Hours). It was 25+ years ago, obviously. First of all, it's about 24 hour-long chapters with exercises to follow. That's a good amount of material for picking up something like MS Access, and the point isn't that you'll learn everything within 24 hours from now, but after 24 hours of learning spread over a week or so.
(You might laugh at MS Access, but it was useful at least 3 times in my career, mostly in the 25-to-15-years-ago range.)
All of what you just said is an exact modern version of the same shortcuts. The net difference is probably minimal whether you had a crap book, watched a pile of YT in short order, or shortcut your way through someone else's project or SO.
Comprehension takes time. Thinking for yourself, making smart architectural decisions, working in an ambiguous stack, and so on.
The nature of the shortcut for experience is the same. Experience crushes anything you learned elsewhere. My best engineers are senior engineers with a fuck load of experience for a reason.
Actually, no. What you describe is a superficial culture in which people mistake instant gratification for actual knowledge and experience. That might work if you are content to end up a lifelong junior dabbler, sitting at the end of a Jira pipeline and doing what you are told. It is not how you become a principal, or someone people call when they have problems for which no obvious blueprints exist.
Oh please. All of these time estimates, from 24 hours to ten years to 10,000 hours are completely bogus.
The “24 hours” figure is marketing copy designed to unify and differentiate a brand of technical manuals.
Anders Ericsson, whose violinist study the 10,000 hours rule was based on, rejected it as an oversimplification built around an arbitrary number, noting that half of the violinists in his study fell short of that figure. The ten years figure is derived from this flawed rule.
The linked article is well-written, but the comments are giving “kids these days” insecurity and mid-life crisis.
SO answers, SEO-driven blog posts and YouTube videos are not good learning tools. They provide shallow answers to specific questions. They don't teach you deeper lessons or give a nuanced understanding, and they don't give the experience of practice. I believe programmers have become worse over time due to depending on them.
Thanks.