Current Software Engineers Have No Deep Knowledge (Jonathan Blow) [video] (youtube.com)
59 points by adrian110288 7 months ago | 65 comments



Such a long-winded way of saying, "Get off my lawn! …kids these days."

Having been in the industry for 30 years and alive long enough to run programs for 45, I can point to innumerable examples of badly written shareware games. Hell, I can give examples of then-AAA games where you learned to avoid certain patterns because they would either crash or grind to a crawl for extended periods of time.

Then there are the business apps that used janky proprietary storage formats and silently lost data, sometimes piecemeal and sometimes in whole. For a time, crashes in many apps were normal. The standard advice was just to remember to save often so you didn't lose as much when it happened.

How long did we use RCS and CVS and Subversion before modern alternatives like Bitkeeper, Mercurial, Git, et al. entered the scene? You're telling me Visual SourceSafe was deeper knowledge than Fossil?

We have some of the best programmers alive today. And the worst. That's what happens when the talent pool keeps getting larger. It's not a trend of quality; it's just a furthering of the mainstream.


There's nothing in your comment that really makes much contact with his views.

I think he sees peak software in the 80s, with often terminal interfaces and highly reliable systems; or with arcade machines and the like. In the era where your disk might include the operating system your program ran on, rather than relying on one already present.

30 years is exactly the length of time he's talking about; software jumped off a cliff in the 90s, not after. So when you frame "modern" version control as a comparison between git and SourceSafe, I think you're exactly trapped making comparisons in the "crap vs. terrible" era of software.


> I think he sees peak software in the 80s, with often terminal interfaces and highly reliable systems; or with arcade machines and the like.

Everybody thinks their childhood was a golden age when everything was better… part of growing up is realizing that things just seemed better because you weren't aware of the problems. Jonathan Blow was born in 1971; he wasn't doing in-depth code quality reviews in the 80s.


I think he was at UC Berkeley at a time in the 90s, and dropped out into industry, when he would have been exposed to a rosy, elitist, R&D-tilted view of programming, where one might both hear myths and have social contact with some of the 70s-80s giants.

This would be a very different experience from bumping around in the equivalent of b2b/enterprise software of that era...


Well, I have a vast collection of retro machines. In many aspects things were definitely better.

First of all, everything was orders of magnitude simpler.


I don't recall adults in the 1980s considering the advent of arcade games to be "definitely better" or "orders of magnitude simpler."

Every generation establishes a new baseline for "normal". Older generations always think they're wrong and complain about what has been lost, usually ignorant of many things gained along the way.

You will find accounts in ancient Greek writings complaining that the younger generation is lacking.

https://history.stackexchange.com/questions/28169/what-is-th...


45 years absolutely includes the 1980s. Entirely includes the 1980s. And it's not like the 70s had universally phenomenal assembly or BASIC being written. Computing wasn't better when processors were doing 1,000 operations per second. They were just doing less.

Today, developers have to contend with attack vectors early developers would have been utterly baffled by. In the 1980s for the vast majority, they weren't coding in an actively hostile environment just to make a todo list app. When you connected via acoustic coupler, there wasn't a single thought to whether you'd be actively port scanned within 30 seconds.

We literally can't use operating systems from earlier decades without erecting rigid firewalls to isolate them first. ANYTHING network aware more than 20 years old (and often newer) would be compromised within 60 seconds of connecting directly to the internet. The only reason 1980s computers weren't absolutely wrecked with viruses and worms was because you had to buy your software from a brick and mortar shop and install it from floppies. (And even then they had occasional supply chain attacks.)

Programmers in the 70s and 80s were complaining about how "kids these days" used C instead of having the intimate knowledge of assembly and even binary that they had. How C programmers just took the easy way out without caring about efficiency and precious CPU cycles. Same energy. And the same fallacious conclusions.

Software developers don't change. The environments they have to code for do. Growing pains and maturity, not lack of skill or commitment.

For the record, RCS was 1982. CVS arrived a few years later and was obviously better. SVN in the 2000s was better than CVS. There is no metric by which RCS and SCCS before it (1973) were superior examples of software.


I mean, if you literally did what you said, then that would mean using telnet and not encrypting your traffic, in which case of course you are going to be compromised... that was not designed for networks that were open and accessible to the public.

I'm not sure you have made a point that at all addresses anything Jonathan said. You've constructed a straw man version of the arguments in favor of less abstraction and more older-style systems, which is rarely intended to mean using exactly the same network protocols from the 1980s.

And at the time, those old programmers who criticized programming in C vs. assembly were objectively correct. It was true that hand-written assembly was significantly more efficient until very recently -- I've seen common examples of this well into the 21st century, and there still exist many cases where it is true. In the 80s and 90s it was the norm, and it was understood by most everybody (including the Unix developers, despite having transitioned to C) that optimizing compilers could not compete in truly performance-sensitive contexts. This did not really begin to change until everyone consolidated on x86. Until then, people were simply willing to make the sacrifices; you couldn't have your cake and eat it too.


You can pattern-match these concerns to "kids these days" all you wish. But there really are such days. There are days when the kids are doing it worse, and to imagine there aren't is foolish.

These are the days when your operating system is hostile to you, sells your data, and injects ads into its basic operation. These are the days when the main distribution channel of software is HTTP, with dynamic-memory languages that require build systems to deploy. These are the days when people read a book and think themselves junior engineers.

It is all well and good deploying your cynicism in this fashion, thinking yourself wise, but in the end your view is just another old wives' tale. The tale of "the elder shouting at clouds" is just the tale of elders shouting at elders shouting at clouds. If one analyses the world with such infantile approximations, one will find oneself likewise dumb.

You aren't engaging with the quite serious catastrophe that has become modern software development. It is hard to imagine software engineers of the 70s and 80s, though imperfect, doing with modern technology what we do with it.

The Linus Torvaldses of the world, the jblows, the K&Rs, and so on, are people born of a culture of ruthless simplicity. Give the 70s the technology of today and it's a hard sell to suppose that group really invents React, the HTTP stack, and so on.

The inability to see this is the real naivety here. Cheap cynicism doesn't win you any points.


Linus wrote the first draft of the Linux kernel as a college student in late 1991.

19-NINETY-1

Linux is 100% a product of the 90s and after, and is one of the biggest software quality success stories of all time. It has only improved since then.

Initial public release of PostgreSQL, the best open source database around today, was 1996. It has only improved since then.

The C spec was improved from C89 to C99. Implementations appeared for C89 in the 1990s, and both were notable improvements over original K&R.

First release of Python in 1991. Improving steadily since then.

The web browser is arguably the single most impressive app of all: graphical layout and executable code from network-loaded resources. It is THE killer app of the internet. As far as the average person in the world is concerned, if browsers disappeared, the world would stop. The security scrutiny browsers have received has exceeded that of ANY other piece of software that has ever existed, and multiple browsers have been independently created.

Blender: 1994. Improving since then.

Games have never been more technically sophisticated, visually impressive, environmentally immersive, and narratively deep than they are today.

I do have a soft spot for Star Control II, but again that was released in 1992.

This isn't cynicism. This is an honest appraisal of my lived history of computing.


Again, I don't think you're making contact with jblow's point, nor the context of my reply. There's nothing in what you've said that he'd disagree with.

His claim isn't the obviously absurd idea that no good software is written today, or has been over the last 30 years. Such a position is trivial to refute.

His issue is that the complexity of operating systems today, the emphasis on bad developer tooling (e.g., GC+OO over just providing better memory-handling tools), the internet as a means of app distribution, and the influx of "below-junior" engineers have together created a perfect storm of terrible software.

Something disguised by radical advances in hardware. When a modal popup in vscode steals my cursor focus while I'm in the middle of deleting a file (etc.), it's hard to disagree.

Everything is 10-100x more complex than it needs to be, and most developers are stacking plates higher and higher (e.g., Electron) on a terrible foundation.

This isn't a point that can be engaged with on the basis of a superficial analysis or a mere reductio ad absurdum ("here are counterexamples...").

How many developers write postgres, and how many write vscode? How much has been spent on the latter vs. the former? (and so on).

I'm giving Linus as an example of the earlier programming culture; jblow himself is alive today, after all. The claim isn't that nothing of this culture survives -- it's that its power and influence have collapsed.

It started to collapse when hardware stopped disciplining programmers in the mid 90s.


Gatekeeping.

The barriers to entry have dropped to the point of disappearing. It's why the industry is so vibrant.

"The odds are good, but the goods are odd."

The core group of CS & CE graduates are as good as or better than their forebears. That hasn't changed. What has changed is the number of folks without CS & CE degrees entering the field.

You cite VS Code modals yet scoot past a huge truth: without Electron, Microsoft would never have invested as much in an IDE with a consistent UI and feature set across Windows, MacOS, Linux, and more, including both amd64 and ARM architectures.

You complain about an annoying modal without recognizing that without Electron, you wouldn't have the app! Or it would at best be Windows-only, or perhaps a limited-features version for MacOS, like their Office suite.

Then of course there's the realization you can embed VS Code in a browser with essentially feature parity with a desktop install.

Electron paved the way for VS Code to be installed and run on a Chromebook. A Chromebook!

Do you get how amazing that all is? How the VS Code marketplace is filled to the brim with tools for every programming language and subsystem you want or need? Many are crap and many are solid gold.

There are undisciplined developers, sure. There are undisciplined plumbers, roofers, auto mechanics, electricians, medical doctors, CPAs, school teachers, and on and on.

I've met CS PhDs who couldn't code their way out of a paper bag outside of academia. I've met folks who never graduated high school perform incredible feats of software engineering. And all points in between (which is where I have been in my career).

It's not collapsing. It's becoming normalized in our society. This is what that looks like.


You're explaining the decline, not arguing against it. That MS chose to invest vast amounts of time and money in TypeScript & VS Code shows they could just as easily have invested in a fast, native, cross-platform technology instead.

The network effects that the internet has created around terrible technologies explain the tactical choices made by each actor here to worsen the quality of software dramatically.

The issue is the net strategic loss in the choices that were not made. Cross-platform games cost less than the investment made in VS Code -- and consider what Valve has done with the Steam Deck.

Let's be clear: highly performant, reliable, cross platform, native feature-ful software was always an option; and often a relatively easy one to exercise.

The modern software culture made deliberate tactical decisions not to pursue that path. The idea that we should be amazed an IDE runs on a chromebook is frankly obscene.

We should be outraged that we live in a world where laptops can run x86 CPUs and yet not run arbitrary x86 software. We should be outraged that there was ever a world in which a generic piece of hardware would arbitrarily constrain generic software.

All this apologia of yours, I think, really exposes how far our expectations have fallen, and does nothing but show Mr. Blow to be on the money.


> Let's be clear: highly performant, reliable, cross platform, native feature-ful software was always an option; and often a relatively easy one to exercise.

Okay. Got it. You have no idea what you're talking about. I challenge you to name these "relatively easy" cross-platform libraries and apps with feature parity.

No, seriously. List these UI libraries that seamlessly integrate with the target OS.


If you think they can't be named for engineering reasons, then, er, I don't know what to tell you.

How many games run on Xbox, PlayStation, PC, Windows, Mac, etc. that include an entire UI library with native font rendering, events, buttons, windows, etc.? Lots! Indeed, with multiplayer, online updates, networking...

This is not an engineering challenge; indeed, their absence, given Electron/vscode/et al., is a sign of how bad things have got.

The engineering "talent" at Microsoft now consists of teams of script kiddies building javascript UIs on top of javascript UIs with teams smaller than, say, No Man's Sky and achieving much less.

The situation is, really, that bad.

Apple brought iOS apps to MacOS with relatively little effort; Steam likewise, via Proton, did a great deal more in bringing entire non-Linux game libraries to Linux.

Making a stupid cross-platform UI library is absolutely trivial compared to either. Many game devs do it for every game they make. Yet companies like MS -- indeed most software companies -- don't have enough talent available, or the incentives, to do it.


> How many games run on Xbox, PlayStation, PC, Windows, Mac, etc. that include an entire UI library with native font rendering, events, buttons, windows, etc.?

And each required a non-trivial amount of engineering effort, AND they are not common libraries shared between game vendors. A lot of duplicated effort. In addition, they may look the same, but there are A LOT of platform-specific differences that must be maintained separately.

> Apple brought iOS apps to MacOS with relatively little effort

Because they spent literally decades bringing their dev tools and libraries in sync with one another. You're highlighting the last lap of a marathon. It should be noted that it only applied to a single platform, Apple's. There was absolutely no cross-pollination with non-Apple targets.

> Steam likewise, via Proton, did a great deal more in bringing entire non-Linux game libraries to Linux

Again, the last lap of a marathon, building on decades of Wine work before it. And even then, there are huge swaths of games that have not been ported.

> Making a stupid cross-platform UI library is absolutely trivial

Yes, Java AWT was one example. Java Swing was yet another. wxWidgets, Qt, and others as well. Only a minority of devs use these, because they are either a pain in the ass to use, have uncanny-valley aspects when deployed, or just plain don't look good outside of a Linux environment (where almost everything looks like crap by default).

Have YOU actually built and shipped a cross-platform app with a GUI? Do you have personal experience in this area, or are you talking about theoreticals and trade magazine articles? Because I have. It's a royal pain in the ass. Electron was far easier to deal with, bloated and slow as it is.

It's why Tauri is so attractive to me now. Far fewer resources while still maintaining a common, familiar cross-platform API.


> How long did we use RCS and CVS and Subversion before modern alternatives like Bitkeeper, Mercurial, Git, et al. entered the scene?

For Subversion, about a year.


Not really. A company that adopted SVN early stayed with it for 10 more years, until it was clear that git was the way and they were struggling to find people willing to work with SVN.


Still using SVN here along with git -- not for coding, but for storing CAD/EDA data. Large file support is the key here. Git LFS unfortunately is not very good, and as soon as your repos grow into the tens of gigabytes, git is not very fast anymore.


At my first (and only) IT role, in an EDA shop (both mech and electrical), one of my first asks was to get an SVN server running. I was like, what? You sure? Yes, they were sure.


I get tired of listening to this kind of thing pretty quickly. There's very little here that's falsifiable or of substance, very little to actually confront in an intellectually rigorous way.

But I'll add this: it's especially uninteresting to hear comparisons between gamedev and webdev. The goals and constraints are totally different.

Of course you're going to see a thousand very similar frameworks and a million similar npm packages, and of course none of them are going to be incredibly optimized or performant in the way a video game would be. That's not the goal. The intent is to ship things quickly and iterate on what you've shipped as necessary, so that you can test the value prop and your assumptions in the real world quickly.

It's completely opposed to how video games are engineered, which are delivered (if not in a shrink-wrapped box) as individual major releases that might come once a quarter or even once a year. Video games have to meet strict 60fps performance targets; they often push the boundaries of the platforms they're built on.
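(As a quick, hand-wavy illustration of how strict a fixed frame target is: the whole game -- input, simulation, rendering -- shares one per-frame time budget. A trivial Python sketch:)

    # Fixed frame-rate targets translate to hard per-frame budgets that
    # input, simulation, and rendering must all fit inside together.
    for fps in (30, 60, 120):
        print(f"{fps} fps -> {1000 / fps:.2f} ms per frame")
    # 30 fps -> 33.33 ms per frame
    # 60 fps -> 16.67 ms per frame
    # 120 fps -> 8.33 ms per frame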

So what's with all the fake shock? Why do we pretend to be surprised that the output and the process is different when the goals are different?

Get real. None of this performative whining from up on some pedestal helps anyone, except those who are so insecure about their engineering ability that they need validation from some imagined authority.


"Video games have to meet strict 60fps performance targets" is certainly... an opinion, and it would be interesting to understand how you came to have that opinion.

Just to give us an idea, can you tell us the last say, half a dozen new video games you've played?


Look, I'm trying to give the guy some leeway for being a massive stick in the mud. I think some people call this "steelmanning"?

Fine, it's rare that "60fps" is actually mandated by publishers or console manufacturers (even if there's at least one instance to the contrary[1]), but it's at least a fact that your title is unlikely to pass console certification if it isn't at least somewhat stable[2] and that titles have been retracted for instability in the past[3].

Whether you have to meet 60fps or 30fps or just remain somewhat playable is entirely beside my point, which is that making mistakes in gamedev code can lead to lost sales, while in webdev being too conservative to ever make mistakes can lead to opportunity cost. That's where the difference in engineering strategy comes from, and why are we fake-surprised that they're different when it's obvious?

As an aside: I'm not sure why you're asking me to name six new video games I've played when even my most avid gamer friends barely have time for half that.

1: https://www.polygon.com/2016/3/17/11256142/sony-framerate-60...

2: "A playable experience varies per title, but generally means no severe drops in frame rate, no freezes, impasses, bugs causing major progression hindrances, or graphical corruptions." https://learn.microsoft.com/en-us/gaming/gdk/_content/gc/pol...

3: https://www.theverge.com/2020/12/17/22188007/sony-cyberpunk-...


Sure, I don't really think Jonathan needs "Steelmanning" at this point, but it's not a bad exercise.

For (1), that's an engineer (not the decision maker) saying that it's a requirement and then almost immediately back-pedalling, and it's not for Sony's mainline console but specifically for their niche VR product. I do not get the sense from this that if I were an AAA publisher, Sony would in fact refuse permission because a heavily pre-sold VR title sometimes missed 60fps. Their engineers would be angry, understandably, but the engineers don't make business decisions.

I've only played one game on Sony's VR system for any significant time and it was Keep Talking And Nobody Explodes (which is excellent, if you can stand VR at all I strongly urge you to try KTANE with a few friends, only one VR setup is needed). I didn't notice any frame rate drop in KTANE, but then, that's not very surprising given the uh, intentionally sparse environment you play in.

"Somewhat stable" is a much easier bar to meet, but realistically what matters is just business. Cyberpunk made sufficiently many actual customers very angry and they demanded refunds. If your title is less hotly anticipated, do not expect the same pushback.

As to why I asked for six, it's because just one or two isn't very representative of your experience. Maybe I should have asked for three or four. I played a bunch of Mario Wonder this week, that's nice but while it's a very fancy 2D platformer it is just a 2D platformer.


laughs in cities skylines 2


I am often surprised by how much foundational knowledge goes unknown, even when it's used extensively at work. I can't really speak for game programming, though I imagine the prevalence of game engines has a lot to do with that. In regular SaaS back-end work, knowing how your SQL database works is pretty key. Ruby/Rails devs are now familiar with N+1 queries and how to fix them. It's common for the same devs not to know how indexes work, or why adding/removing a WHERE condition or changing an ORDER BY can have such drastic effects.
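(To make the N+1 and index points concrete, a minimal sketch using Python's stdlib sqlite3 -- the table and column names are hypothetical, and the same ideas apply to the databases typically behind Rails:)

    import sqlite3

    db = sqlite3.connect(":memory:")
    db.executescript("""
        CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT);
        CREATE TABLE posts (id INTEGER PRIMARY KEY, author_id INTEGER, title TEXT);
    """)

    # N+1: one query for the authors, then one more query per author.
    for author_id, name in db.execute("SELECT id, name FROM authors").fetchall():
        db.execute("SELECT title FROM posts WHERE author_id = ?", (author_id,)).fetchall()

    # The fix: a single JOIN the database can plan once.
    db.execute("""
        SELECT authors.name, posts.title
        FROM authors JOIN posts ON posts.author_id = authors.id
    """).fetchall()

    # The index point: without an index on posts.author_id, the WHERE clause
    # above means a full table scan; with one, it's a direct lookup.
    db.execute("CREATE INDEX idx_posts_author ON posts(author_id)")
    print(db.execute(
        "EXPLAIN QUERY PLAN SELECT title FROM posts WHERE author_id = 1"
    ).fetchall())  # reports a SEARCH ... USING INDEX idx_posts_author plan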

Since the video didn't get into detailed examples of what constitutes 'deep knowledge' (or I missed it), I'll describe it as lacking an accurate mental model. That's what I struggled with when first learning git; once I had a mental model close to how it actually worked, every new command I learned fit into it seamlessly. The part the video does talk about is cargo-cult behavior, which is exactly how the building and adjusting of a mental model gets avoided -- it's just building a memory bank of problem X, incantation Y. These folks will be the first to be replaced by LLMs. Build your mental models and adjust them!
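(For a taste of the kind of git mental model I mean, in Python: a blob's object id is nothing more than a hash of a tiny header plus the file's contents, which is why identical content is stored once no matter the filename.)

    import hashlib

    def git_blob_id(content: bytes) -> str:
        # git hashes "blob <size>\0" + content; the id *is* the content address.
        header = f"blob {len(content)}\0".encode()
        return hashlib.sha1(header + content).hexdigest()

    # Matches: echo 'hello world' | git hash-object --stdin
    print(git_blob_id(b"hello world\n"))
    # 3b18e512dba79e4c8300dd08aeb37f8e728b8dad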


Why is it surprising that beings with finite time capacity are unable to learn information that’s expanding at a daily rate greater than their capacity?

https://xkcd.com/1053/ is relevant. The world would do well by adopting a more positive perspective towards people who don’t know what we expect them to. Because I’m sure, they probably know something I don’t… that they expect me to know. And that’s how we all progress.


The fundamentals really aren't expanding that quickly though.

I think focusing on mastering the fundamentals in depth would pay off for any software engineer, with benefits that compound every decade of your career after putting in that upfront time cost and a fairly minimal maintenance cost.


That supremely depends upon what you define as “fundamental” and in what context. :)

Do some ideas have higher leverage than other ideas: certainly. Is it easy to tell a priori which ideas those are: not entirely.

Consider: log-structured merge trees (LSMs) were invented in 1996; Amazon S3 was released in 2006. Some might consider these fundamental ideas that have existed for 28 and 18 years, respectively. Have you mastered these ideas enough to understand how the cost structure of S3 enables you to implement a really economical distributed LSM at extremely low cost, because S3 doesn't charge you for bandwidth usage?
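(For anyone who hasn't met LSMs, a toy Python sketch of the core idea -- writes land in an in-memory buffer that is periodically flushed as immutable sorted runs, and reads check newest-first. A distributed version would park the flushed runs in object storage such as S3; everything here is simplified and the names are made up.)

    import bisect

    class TinyLSM:
        def __init__(self, flush_at=4):
            self.memtable = {}     # newest writes, mutable
            self.runs = []         # immutable sorted runs ("SSTables"), oldest first
            self.flush_at = flush_at

        def put(self, key, value):
            self.memtable[key] = value
            if len(self.memtable) >= self.flush_at:
                self.runs.append(sorted(self.memtable.items()))
                self.memtable = {}

        def get(self, key):
            if key in self.memtable:
                return self.memtable[key]
            for run in reversed(self.runs):          # newest run wins
                keys = [k for k, _ in run]
                i = bisect.bisect_left(keys, key)
                if i < len(keys) and keys[i] == key:
                    return run[i][1]
            return None

    store = TinyLSM()
    for i in range(10):
        store.put(f"k{i}", i)
    print(store.get("k3"))  # 3, found in a flushed run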


For sure, but I think even if you didn't consider those particular examples part of the fundamentals, with a solid command of basic algorithms and data structures (including your basic trees, e.g. B-trees) and some basic math, you could easily pick up LSMs and figure out how to exploit the S3 cost structure as needed.

The point of the fundamentals as commonly construed is to allow you to quickly pick up novel concepts as needed as they're almost always a slight extension of the fundamentals.


I would say focus on what's fundamental and pragmatic for your context (i.e., immediately relevant). Most of the recent stuff isn't that, unless you're the one doing the research or implementing it. I think LSMs are great and I'd gladly use a db that uses them, but I don't need to know how they work, only their properties, when selecting a db. If I'm using a relational DB, though, I'd damn well better know how indexes work.


In absolute terms, the number of people with deep knowledge here is probably higher than it's ever been. The ratio has changed, which is what I think he's actually lamenting. He mentions civil engineering; one thing I find myself often bringing up is that software engineering is a brand new discipline. The operating domain of software is immense, maybe infinite. So I'm less curmudgeonly than Jon is here: we've got barely a century or so of software experience as a species; there's gonna be slop.


Hillel Wayne offers a much better, more formally grounded survey of how software engineers compare to “real” engineers - https://www.hillelwayne.com/post/are-we-really-engineers/. He actually interviews civil engineers turned software engineers, among other “Crossovers” as he dubs them. Turns out, software engineering is as “real” as other types of engineering to crossovers who’ve done both. Software engineers just _think_ that other types of engineering are more real.

The likelihood is low that Jon has spoken to any meaningful number of civil engineers in making his comparison to the software engineering field. Rather, he reasons from some idealized version of what he believes civil engineering to be.


The “crossovers” all studied engineering. I think that makes this an incredibly biased sample. At best, I think one can say that those who studied engineering apply the engineering design process they learned in other engineering fields to their new jobs in software.

Beyond that, I think the survey would need to expand the pool to a representative group working in software. Possibly those who studied CS, those who studied other fields, self-taught without formal higher education, and include people who studied (and possibly are licensed) software engineering, to provide a control group.

Even within the engineering field, not everyone is an engineer or practicing engineering. There are different levels of education and credentialing and those people fill useful positions. For some reason everyone in software insists that they are doing engineering and are an engineer without having studied any engineering topics. (I’m not talking about having to cover the chemistry, physics, differential equations and other topics that aren’t core to software.)


Thanks for taking the time to read the article! My responses are keyed by paragraph number.

1. The study is comparative in that it answers the question of whether engineers from other fields consider software engineering "real". The interviewees answer more than whether they apply an engineering design process; they comment on what they see in the industry from others, and resoundingly agree that software engineers are abundant and no different from "real" engineers.

2. We agree, aspects of the study could be improved. Nonetheless, I think Hillel’s analysis serves its purpose in leading the discussion forward on whether software engineering is “real” in a more productive direction than Jon Blow’s comments do, necessarily, as the topic of this HN thread.

3. Part 1 of Hillel's article, linked above, addresses and agrees with your point directly, suggesting that we don't yet have a vocabulary vibrant enough to describe all aspects of the work people do with and on software.


I'll be honest, I've read the article several times. I'm way too dumb to get what the author's point is.

I think the video opens up more interesting lines of discussion. There are two points I think the video has that are relevant. One is the discussion on deep knowledge (there is a body of knowledge to learn about software engineering including fundamentals of the discipline). The other is really about the engineering design process (there are domain specific techniques for software that can be used, but it's common to all engineering disciplines).

As other commenters mentioned, the fundamentals are probably more accessible and better known now than at any other time. The minimum bar to enter the field has probably gone down substantially because of all the abstraction and tooling that exists, and because there are no standards for job titles, everyone wants to be called an engineer.

The software PE exam was discontinued in the US, but the fundamentals are probably pretty similar in Canada. https://www.egbc.ca/registration/individual-registrants/how-...

Sure, this is probably a little much and there are plenty of people working in the field that don't need to know much of it. That brings us to the vocabulary, which is already in place! Engineer, technologist, technician, skilled trades, and unskilled labor are different categories of jobs within the existing fields of engineering. These range from requiring no formal education to requiring years of formal education. Again, everyone wants to be an engineer and there is nothing in America stopping them from calling themselves one.


I'm not sure that there was an inrush of programmers because "it could make a lot of money". I can't see "Chad" (yes, I am being kind of offensive -- but to make a point), having no knowledge of or interest in math or programming, pursuing software development just to make a lot of money.

I don't doubt that a number of students who could have chosen among several STEM fields did in fact go the software route because it was lucrative. But those are going to be smart people -- not the JavaScript-only code monkeys the speaker is implying.

There's a spectrum of engineers that I have worked with.

Some are very, very technical, more like the civil engineers the speaker talks about. They care very deeply about optimizing for performance and about clean APIs. And, sure, there are engineers on the other end of the spectrum that are more focused on completing a task with perhaps a more pragmatic (?) focus.

Perhaps it's a good thing the industry has both kinds.

(Myself, I was closer to the latter example above — while I could intuit the efficiency of an algorithm, I never did take the college course that covered O(n^2), etc.)


> I'm not sure that there was an inrush of programmers because "it could make a lot of money".

This is all I've seen within the last 5 years.

For context, I've been working in-industry for ~8 years. I don't live or work in one of the big tech cities in the US nor one of the biggest within my state.

Within the last 5 years, I've worked at two companies and have seen 12 developers join. Of those 12, ten came from various local React bootcamps. When I've asked about their history with creating software, it's always the same response: "I'd never written any before <insert boot camp name>", and when asked why they decided to get into this industry, it's always "<insert tech influencer's name> said I could make great money with only a 3-month boot camp".

The other two are oddities, and also the two most recent hires I've seen. They both had ZERO programming experience before getting the job. No college/university, no at-home learning, no boot camps -- not even a basic printf("hello world") attempt. The company decided they could do 'on-the-job training' by letting them watch Udemy courses for the first 3 months, then they just threw them at the senior devs like a grenade with a loose pin.


> I'm not sure that there was an inrush of programmers because "it could make a lot of money".

I certainly saw that in college in the 90s. I was shocked at the number of people who didn't know anything about or even really care about software who were in computer science classes. Because the dotcom boom meant anyone could be a "webmaster," so computer knowledge would lead to a "good job."

Since then I've seen at least two more waves of folks chasing the money. Because millennials and zoomers have precious little hope of living debt free or even home ownership unless they follow what we used to think of as a "get-rich-quick scheme" like taking a 6-week bootcamp to try to get a tech job.

> Some are very, very technical, more like the civil engineers the speaker talks about.

> there are engineers on the other end of the spectrum that are more focused on completing a task with perhaps a more pragmatic (?) focus.

> Perhaps it's a good thing the industry has both kinds.

Would you say the same of any ("other") engineering discipline? It's a good thing those engineers at Boeing were worried about completing the task at hand instead of taking a deep, technical, holistic view, perhaps? I can't agree with you.


> I'm not sure that there was an inrush of programmers because "it could make a lot of money".

Literally everyone in my college class in the 90s was there for exactly that reason; they almost all sucked at it, even after years of work. Then, when I ran my first larger company from the end of the 90s to about '10, almost everyone we interviewed had gone into development because of the money; most didn't like it, would never do it as a hobby, and closed the door after 5 pm. All fine, but I'm not sure how you can say there was no inrush of people digging for gold and not really wanting to do the work.

There are many good people, but the majority simply don't care. I guess in some places you will find more of one kind than the other.


I absolutely can. I've met people whose families see studying computer science as an acceptable alternative to studying to be a doctor or lawyer. Software engineering has gained both in monetary value and in social value.


I think he's saying that we're losing knowledge of fundamentals, which is true. It's because corporations are commoditizing and devaluing applicants with university educations to save a buck. University educations are where people gain knowledge of the deeper fundamentals of computer science.


If you've ever watched Blow talk about higher education, you'd know he does not have a great opinion of it. As far as I remember, he dropped out of Berkeley before getting a degree, he's mostly self-taught, and he considers most of the curriculum of a comp-sci degree useless for "real" programmers -- where "real" probably means game dev, or something else similarly computationally intensive.


> I think he's saying that we're losing knowledge of fundamentals, which is true.

The more knowledge and different application domains a field has, the smaller share of that an average individual practitioner is going to know — or need.


You're going to want to know more stuff, to be able to change jobs more easily.


It's funny, because he also argues that much of the computer science curriculum is worthless advice on programming. He recently dove into parser theory as an example.


It's also because demand has been higher than the pool of workers with a CS degree.


This resonates, but I think it's overly pessimistic about the knowledge that software engineers collectively have. Actually there's a lot of deep knowledge and experience, just not enough to counterbalance the massive volume of people and ideas trying to get things done in the space.

To use his example of civil engineering, the reason that's a more solid technical discipline is that the difficulties and consequences are more visible and comprehensible to laymen. People understand it's hard to build the Hoover Dam, and the consequences if you fuck up. It's obviously a huge responsibility that is taken seriously, and it does not attract dilettantes just looking to get rich. Because projects are expansive, massive, and local, there's no scalability principle that favors an engineer's contribution. The only way to get rich off big construction or civil engineering projects is to run massive companies, because the actual costs are large and the biggest overall factor in success is good logistics and project management.

Software is just a whole different thing, because there is no physical locality or even logical locality. Code is trivial to copy and remix in any form, and heterogeneous executing machines can interact across the network. Goals are arbitrary and can be understood differently by every participant, if they are understood at all. Yet despite this, software can be deployed and made instantly available to billions of people at the push of a button. Zombo.com wasn't lying: you really can do anything you want with software. Impact can range from world-changing and unequaled wealth to a life spent toiling in obscurity on imaginary sand castles. It's no wonder software is a mess.


Ultimately, engineers exist to make things, usually to make money. Any trend in the knowledge required to do so is just a reflection of what sells. This is the reason some web developers barely know any theory: they don't need it to make a thing that makes money. The same reasoning explains why a game developer might need to know these things, if the money-senders (customers, publishers, whatever) demand it.

Just saying "kids these days" or chalking it up to education or a generational issue is silly, there's simply more software and faster computers, unless you make comically bad decisions the average paying customer isn't going to complain about a web app's performance. If you want to find someone who has this "Deep knowledge" of how computers work you needn't look further than someone who works with any sort of hardware limitation.


The comments here are a demonstration of his claim about there being a lot of people chattering in a “loud room.”

I see the same problem in software testing and I see the same outlook with the same heuristic for improvement: just do better in an obvious way.

The marketplace of software development and testing ideas is highly inefficient. Good ideas float to the top slowly and are easily buried again by transient and desperate fads.

I agree that the irrationality of the market may outlast any individual’s solvency. Still, anyone who chooses the path of craftsmanship and excellence can set an example that has some chance to make the world better.

The first time I wrote a comment like this in public was probably on comp.software-eng in 1994, unless it was on CompuServe prior to that. The loud room has been around a long time.


I don’t think this is limited to software engineers.

I feel like change is happening more rapidly, and there is more being created than ever before.

That makes it difficult for anyone to go deep on anything. We are spread thin across a vast landscape of languages, tools, and platforms -- with new ones being added every day.

On a similar note, I sometimes think about what it takes to be a movie or music “buff” these days.

I was born in the 80s. Being a movie or music buff back then meant gaining deep knowledge in a much smaller catalogue of work.

Thanks to advancements in audio production technology and access to affordable software for home studio producers (plus the rise of streaming and sharing platforms), the catalogue of music in the world exploded. The same applies to TV and movies. As a kid, it was possible for me to listen to pretty much the majority of my favorite genre (metal) and be a "buff" with deep knowledge of that genre.

If I was a kid today that would be much harder, I might have to stick to certain sub genres or focus on only the most successful artists.

To come back on topic: I used to have deep knowledge of a certain type of software -- Flash ActionScript. Who remembers that?

These days I just get by with what I need to get the job done.


This is almost all he ever talks about and it's getting kinda tiring.


That is not _all_ he talks about, yet he does it often enough, and with enough of a lack of empathy, that people get upset and amplify it to the point that it's most of what you see about him if you're a normal person.


Man who works on software requiring specific knowledge, and who runs in circles of people doing the same, laments that the other 90% of software developers don't know or value his specific knowledge.

I may not be a Jon Blow or a John Carmack, but I've been in programming long enough to think 'bullshit' the moment I hear anyone talk so generally about a field that was basically born yesterday and looks almost completely different today.


I don't disagree with any of this; however, the lack of breadth of knowledge is much more remarkable. Aside from security and hacking, I find software engineers lack bare-metal and low-level fundamentals in electronics and IT. Go ask your software engineer friends if they know how to change their IP address, or if they know how to replace a PCIe card in a PC without using an online tutorial.


> know how to change their IP address

My first job was as a sysadmin, and I was explicitly told that they hired me because my eyes didn't glaze over when the interview covered their static-IP network.

The thing is, that was then. Now, suggesting a static IP immediately begs the question of why you wouldn't use DNS to solve the problem instead.

DNS doesn't solve all cases, but it does now cover a significant number. The industry has changed enough that I no longer expect anyone without some background in networking-specific concerns to know this stuff.

I couldn't tell you if this is a net positive, but it definitely feels less relevant to my day-to-day than being competent with whatever DSL your CI pipeline uses
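(The contrast in a minimal Python sketch -- the address and name below are illustrative only:)

    import socket

    # Static IP: baked into config, has to be edited whenever the host moves.
    PRINTER_IP = "192.168.1.50"

    # DNS: the name stays stable while the address behind it can change.
    print(socket.gethostbyname("example.com"))  # resolved at call time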


Is your home printer using DNS?



DHCP. Boring, simple, and insufficient for professional work, imo. Not a problem for a largely static home network.

At work "the ricoh guy" manages the printers. You can access them by name (wins) or ip. I think that's a good enough solution, but it focuses all the static ip work to one small team (possibly 1 person) and to everyone else it's some characters you effectively copy paste once per laptop refresh. No networking knowledge necessary

To be clear, it's not that I think these skills lack value or practical application in lots of professional settings, it's that I think their relative position to other things has gone down


A typical modern home printer is connected to WiFi and uses that to reach the cloud where most of the actual implementation for printing lives. So, yes.

My mother's (5+ year old) all-in-one colour printer will cheerfully scan a page and email it to her, because although the sensors turning the page into image data are physically in the scanner, all the actual software she's using is "in the cloud", so sending her an email is no harder than turning it into a downloadable ("saved") PDF.


Could you explain in simple terms how knowing how to change the PCIe card in your computer without looking up instructions has had an effect on your ability to write software?



That does not answer the question at all.


A lack of overlap between one's profession and an abutting profession indicates a lack of skill in, or time spent with, it: e.g., the chef who can't sharpen a knife, the chemist who has no knowledge of physics, or the software engineer who doesn't know the difference between a file's size and its size on disk.


We're getting more specialized as a society, and we have to accept the fact that not everyone knows everything anymore.



