Hacker News
Ask HN: Do you feel the quality of SWE has gone down?
58 points by throwaway_45 64 days ago | 89 comments
It feels like a lot of engineers nowadays don't seem to have a good CS background. They don't seem to understand things like caches, paging, virtual memory, CPU pipelines, algorithms, or other things pretty important to CS.

I know we have a lot of bootcamps and people are joining because it pays decently, but is this necessarily a good thing for the industry?

When we have the next industry crash (like the dot-com crash), will these people stick around?

I honestly can't remember a time in my career when I've looked at a codebase and thought to myself "what this dev really needs is a more thorough understanding of CPU pipelines and virtual memory". Ed Yourdon once wrote something like "No project was ever cancelled because the developer couldn't address the serial bus". I'm much more concerned with bootcamp graduates not understanding good class design or appropriate unit testing than I am about any lack in low level mechanics.

I don't know if SWE is getting better or worse, but I do think that most SWE operates at a much higher level of abstraction than it used to, and I think that's a good thing.

I'm in computer graphics, and find myself thinking "this dev really needs to better understand memory layout and computer architecture" all the time when I look at even basic code.

A perfect example of this is looping over some range of pixels: many will just go ahead and write for (x) for (y) setPixel(x, y, f(x, y)). There's no logic error here, but it's still terrible when dealing with images in the usual memory order idx = y * width + x.
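For what it's worth, here's a minimal Python sketch of the two traversal orders over a flat row-major buffer (the names fill_rowmajor/fill_colmajor and the per-pixel function f are just illustrative, not any real API):

```python
# Row-major layout: pixel (x, y) lives at index y * width + x,
# so rows are contiguous in memory.

def fill_rowmajor(width, height, f):
    """y outer, x inner: walks the buffer sequentially."""
    buf = [0] * (width * height)
    for y in range(height):
        for x in range(width):
            buf[y * width + x] = f(x, y)
    return buf

def fill_colmajor(width, height, f):
    """x outer, y inner: jumps `width` elements per write.

    Same result as above, but on large images this strided access
    pattern is what makes the naive loop order slow in a systems
    language (a cache miss on nearly every write).
    """
    buf = [0] * (width * height)
    for x in range(width):
        for y in range(height):
            buf[y * width + x] = f(x, y)
    return buf
```

Both fill identical buffers, which is exactly the point: the bug is invisible to tests and only shows up as cache behavior.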

It seems like many people have no idea how even basic abstractions (such as the aforementioned pixel indexing example) work, and have no performance expectations because they don't code in any systems languages.

People who are aware of such issues and can still structure large codebases well seem to be getting rarer, at least to me. In the 90s there were so many incredible demoscene programmers, and now... hmm...

(A related thing I wonder about is: where is the von Neumann or Newton of our times? There are more people around than ever, nutrition and medicine are better than ever, poverty is globally lower than ever, ...)

I'm currently studying EE and would like to get into embedded programming/semiconductor design, could you explain the optimal way to loop over the pixels if not with two nested for loops? Would it be something related to how in memory it's effectively a 1-D array?

It's a really deep problem actually! Doing y then x is better but it's far from "optimal".

These parallel programming course notes have an example of using z-order curves which is slightly better yet: http://ppc.cs.aalto.fi/ch2/v7/

And the best solution requires something like Halide https://halide-lang.org/ to find the best traversal order.
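For the curious, here's a toy version of the z-order idea (the course notes linked above use fast bit tricks; this bit-by-bit morton_index is just to show the interleaving):

```python
def morton_index(x, y, bits=16):
    """Interleave the bits of x and y into a z-order (Morton) index.

    Walking pixels in increasing Morton order visits them in nested
    2x2 blocks, so neighbors along *both* axes stay close in memory,
    which is friendlier to caches than plain row-by-row traversal
    for 2-D access patterns like filtering.
    """
    idx = 0
    for i in range(bits):
        idx |= ((x >> i) & 1) << (2 * i)      # x bits -> even positions
        idx |= ((y >> i) & 1) << (2 * i + 1)  # y bits -> odd positions
    return idx
```

The first four pixels in Morton order are (0,0), (1,0), (0,1), (1,1): one complete 2x2 block before moving on to the next.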

The loops should be inverted: for y, then for x, which accesses the memory linearly instead of with a stride of one row per step.

This presentation (PDF) explains why: https://www.aristeia.com/TalkNotes/ACCU2011_CPUCaches.pdf

At the very least, the inner loop should be over X, since that is how the pixels are packed.

It is precisely the linear ordering in memory. Whether X-then-Y or Y-then-X is best depends on whether the data is stored row- or column-major.


Honestly, who has time these days?

People that care about their craft?

Yeah buddy I get home tired from work and the commute, crack open a cold one, and surf late 2000s forum posts on installing gentoo on a PS3.

There are many healthy activities to take up rather than play with deprecated/obscure tech. Not to be judgemental of course, I draw my line at home repairs.

When it comes to computer graphics, "deprecated/obscure" tech is the tech that is comprehensively documented from the ground up. There is some documentation about internals of the Raspberry Pi SoC and GPU, but sadly not enough to make it a compelling demoscene target.

I'm not saying you should. Enjoy your free time. I just think it's weird to shame people that care more by implying they have too much time on their hands. Everyone has a level of caring that's appropriate for them.

There are a lot of things you can learn from writing demos that are still incredibly valuable. I wouldn't expect 99.9% of developers to know those things, but if I saw it on a resume? That'd totally jump out at me, and I'd definitely want to interview that person.

Don't shame expertise just because you choose to spend your time differently.

I appreciate what you're writing about demoscene programming, but the GP comment does not seem to me to be shaming. It most likely was just about the commenter not having time themselves. HN has a guideline for cases like this:

"Please respond to the strongest plausible interpretation of what someone says, not a weaker one that's easier to criticize. Assume good faith."


I agree with the rule, but I only accused him of shaming after he drew a broad, insulting caricature of a community I'm involved in and actually understand (the demoscene) by suggesting that it just involves people "browsing late-2000s forums and installing Gentoo on PS3s". I think if you said that to any demo coder they'd tell you to eat sand. It's not even remotely close to what that scene is about. If there's a criticism to be made here, perhaps it's about not making assumptions about groups you know nothing about?

I made no assumptions other than what he explicitly said, which was insulting and displayed ignorance of a community he clearly does not understand. My original comment was also only "People who care about their craft?", which was simply saying that people get into demo coding because they care about doing something worthwhile (getting the most out of their machines in an artistically satisfying and interesting way).

There's a common social anti-pattern in most tech forums of hand-waving away any complex domain knowledge that the person doesn't personally use as "useless". I have no regrets about stating the contrary.

I am amused that I am the one getting a warning about this though. I'm not the one that made a drive-by ambiguous comment that could easily be read as singling out a fun community as being a waste of time. I just pointed out that that's what he just did.

I see now that the problem was that you pointed it out in a way that made sense to you, but not to the rest of us who lack all that background information. If you had responded in the first place with the explanation you've posted here, it would have been much easier to understand what the issue was. Now that I read this, I get your point, and I also see how the problem started with the GP comment, not yours. But this was not at all clear to me before, and I'm sure it wasn't to many readers either.

HN has been super impressed and favorable to the feats of demoscene programming for over a decade now, so hopefully there's a lot more of that than this.

I'm sorry it came off as shaming.

I don't know of any project that was ever cancelled because of poor class design or lack of unit tests either. Sure they were hack jobs but if they met the needs of the customer then they were successful. Likely successful enough to be worth hiring you to fix those shortcuts. And allow you the free time to complain about craftsmanship on a public forum.

I absolutely have. A bonkers class hierarchy and no unit tests means that velocity will bog down to a crawl or even start running in reverse.

Absolutely. But to management it can look like another reason.

For example it could look like “Not enough revenue to pay all the devs”. But you only need a big team because it’s basically spaghetti code and there’s a lot of fires to put out.

I’ve seen this.

It’s also really hard to convince managers, even technical ones who are not in the code, that there is complexity to deal with. “But it’s just a [something that prima facie sounds easy], it should be easy!”

It depends on the industry vertical. If you have heisenbugs in your software, let alone poor design / lack of unit and functional tests in one industry, it may not matter. But it may end up killing someone in another industry.

A bootcamp person can learn most of these things. It just seems that some of our recent hires don't want to learn this stuff, or maybe it doesn't really matter for React devs.

And these things do make a difference: L1 cache is faster than L2, which is faster than L3, which is faster than main memory, which is faster than disk, etc. You want your code to keep its working set within those limits to make things faster.

Or, for example, if you start hitting virtual memory and paging to disk, you might want to switch algorithms: merge sort instead of quicksort if you don't have a lot of RAM and have to go to disk, but quicksort if you have 128GB of memory and mostly randomized data.

This is a kind of trivial example, but I think this stuff is somewhat important.
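A toy sketch of that idea in Python (purely illustrative: a real external sort writes each sorted run to disk and streams the merge, whereas here in-memory lists stand in for the runs):

```python
import heapq

def external_merge_sort(items, chunk_size):
    """Sort data that 'doesn't fit in RAM' in two passes:

    1. Sort each chunk_size-sized chunk independently (one chunk is
       what fits in memory at a time).
    2. K-way merge the sorted runs, which only ever needs one element
       per run in memory at once.

    This streaming merge step is why merge sort wins once you have to
    go to disk, while in-place quicksort wins when everything fits.
    """
    runs = [sorted(items[i:i + chunk_size])
            for i in range(0, len(items), chunk_size)]
    return list(heapq.merge(*runs))
```

heapq.merge does the k-way merge lazily, so in a real on-disk version each run could be a generator reading its file a record at a time.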

Yes, I do think about this occasionally at work for math-heavy (linear algebra and in-house convex optimization) problems. I cringe when I see a tight loop written with Java’s stream/functional API. It’s not a zero-cost abstraction, you know!

Because, and your acronyms hint at this: software engineering is not the same as computer science anymore, and really hasn't been for decades now.

Software engineering _relies on_ computer science, but the fundamentals that computer science is concerned with are about as far removed from software engineering as pure maths is removed from physics: no modern software engineer needs to have a deep understanding of the hardware-level behavioural fundamentals underpinning the solutions that CS already gave us to problems associated with that hardware. The work of a modern software engineer instead relies on the solutions to those problems working.

And sure, you can become a specialist, and dive into those subjects if you want to, but progress in any field is made by solving problems to the degree that the next wave can simply rely on the solutions being "a given": yesteryear's software engineers built the tools that today's software engineers rely on, without having to question that those tools get the job done.

And while in exceptional cases, they won't, and you might need a computer scientist to look at the tool and go "ah, that's because: ..." (and that can be the same person, applying a different discipline to the problem), it is that evolution of tooling that necessarily takes us further and further away from what at any point in time is "fundamental". The things you list are still fundamentals, but of a different field entirely by now.

I strongly disagree on not needing to know the fundamentals. Can you get a job without them? Sure. Can you even create a successful product without them? No doubt. Will your product take up 50GB of RAM, have un-debuggable hitches, erratic performance, and weird bugs when the 5000 dependencies you have change subtly on an update? YUP. Will you have a bunch of new developers that replace your old developers because of high turnover that want to rewrite it two years down the line, but don't realize they don't have the skills to do it better than the original people did? Probably!


As a software engineer going on fifteen years, I roll my eyes at the claim of elitism. Over my career, I’ve witnessed tons of similar disasters caused by not grasping CS fundamentals.

It isn’t elitist to expect a building architect to understand physics, a doctor to understand biology, or a developer to grasp CS. It doesn’t need to be a barrier to entry, but it will likely be a career blocker down the line.

You have a pretty negative take. There are fewer software engineers with the skills you mention. But it's kind of the wrong question. Without these skills, individuals are still capable of offering tremendous value to tech companies. I work at a solid company with a great team. Those with solid CS fundamentals have certainly been profoundly helpful when designing systems that need to be performant and reliable. But other, passionate engineers with liberal arts backgrounds have added insane value to the product as well. It's about curiosity, intelligence, and thinking about what's needed to get the job done.

Could not agree more. For a product company, having software engineers with diverse backgrounds can be a lot more valuable than having only super strong CS theorists (but you should have them of course). As long as the engineers are intelligent, passionate and productive, diverse backgrounds can only be a good thing. Now this is probably different for research oriented careers, but most of us are building products for customers in some form or another.

I actually doubt that there are fewer software developers who have those skills. The software industry has grown by leaps and bounds every year for decades. I suspect there are more than there were even 15 years ago.

What has also happened is that tools have improved at roughly the same rate, and the tasks to which we apply software development have also grown exponentially. So there are jobs that exist today that did not exist 15 years ago, and they just don't need these skills.

No, I don't think the quality has gone down. I think this outlook is a way of gatekeeping engineers who don't have a formal CS background. I happen to be one of them.

Most people who go to a bootcamp (I did not, but have hired a developer who did) do not end up working in a role that requires understanding of the five topics you mentioned above, with the exception of cache and maybe algorithms. It's just not what most bootcamps are targeted for.

I have seen a lot of terrible code from both sides, and don't believe the quality of the code to be a function of the developer's level of formal education.

I think you can get that knowledge from anywhere, and university education is only one path. That being said, that knowledge is REALLY important. I don't care if you come from a boot camp or not, but I wouldn't want to hire a developer that doesn't understand those 5 things (plus a lot more).

I mean, you're using "gatekeeping" as a pejorative, but gatekeeping is super important in any profession. Doctors and lawyers have a lot of it too, but would you really want to go to a doctor without a medical degree, or a lawyer who's been disbarred? You might say what engineers do isn't as important, but if you're running a software business that employs 50 people, that business shutting down because its engineers can't cut it is a fairly impactful thing to a lot of people.

There's nothing wrong with having novices at work. We need novices and apprentices, they're the lifeblood of our industry. The problem is that our novices don't know they're novices, and now we have novices teaching novices and telling them that a lot of important stuff doesn't matter. Or you have novices hiring novices, and now you have bloated engineering organizations that take a ton of time and manpower to do things that should be simple, and the entire industry gets a black eye for it.

Yes, clearly I am not advocating that anyone of any experience should be allowed to do any job. OP is positing that there is a decline in software quality because of developers who don't know those CS concepts. I don't agree.

If you're saying you wouldn't hire a developer who didn't know those concepts in your domain, OK.. I don't know what your domain is. Software is a huge field as I'm sure you know and there are many domains in which the requirements include understanding these CS concepts.

If you're saying you wouldn't hire a developer who didn't understand CPU pipelines in ANY software engineering context, I strongly believe you would be missing out on some highly skilled & capable people.

> I think this outlook is a way of gatekeeping engineers who don't have a formal CS background.

If a driving school doesn't teach parallel parking, pointing out that deficiency is not gatekeeping.

> It's just not what most bootcamps are targeted for.

If you're making $35,000 a year in the service industry, making $70,000 translating Photoshop files into HTML is life changing. If making that transition is your only goal, great.

But I think many people enter bootcamps with more ambitious goals. They'd like to move up the career ladder, take on more responsibilities, tackle more difficult problems, and receive commensurate compensation.

People I've talked to who come from non-CS backgrounds said they hit a wall years into their careers, having to play catch-up on the job. In my experience, CS didn't help me as a junior engineer; by the time I was senior, those concepts were invaluable.

When people with CS gaps hit a wall in their career, we have two options: we can give them more responsibilities anyway, which sets them up to fail, or we can identify the gaps early and help them. That's the opposite of gatekeeping.

Gatekeeping may not be the best word, and I think you misunderstood what I was trying to say. OP is not merely pointing out a deficiency: they are suggesting that there is a decline in software quality because of these non-CS people entering the market. Pointing out a deficiency and blaming said deficiencies for a trend of bad software quality are two very different claims. It suggests that one cannot produce good software without a CS background, with which I disagree as an absolute statement. That may be a slight leap, but I think OP made a pretty broad suggestion.

I haven't and wouldn't suggest that people be moved to a position with responsibilities over their heads. If you want to move up and you need to learn more, of course you need to find a way to learn the required concepts. I did this and continue to do it.

My point about bootcamps is most people who finish them aren't out there getting jobs that require knowledge of CPU pipelines. If that IS happening, someone is really bad at hiring.

Depends on the axis.

In terms of working software that is somewhat dependable, we're probably better than ever before.

In terms of using system resources efficiently? Unmitigated disaster. So. Much. Bloat. Chat clients should never use gigabytes of RAM.

In terms of average talent level? It seems like it's getting worse every decade. I was talking to some recent CS grads about Turing completeness and they had no idea what I was talking about. How can you have a CS degree and not know about Turing completeness? Also, so many developers refuse to understand how their tools work. There are so many developers terrified of C. C is warty, but to be frank, if you can't write some system-level code, you're going to be a weak developer. If you don't understand how your machine works, and you're unwilling to leave the world of HTML and JavaScript, you're always going to be a novice.

I have a theory about the "10x developer" thing. I think so many software developers are essentially "expert beginners" (https://daedtech.com/how-developers-stop-learning-rise-of-th... ) that when someone comes along with experience and competency they look "10x". But in truth, I think it's more that the industry has a lot of 1/10th developers, so the 1x guys look like rock stars.

I also think in terms of project management we've gone in a really wrong direction, and the rise of the expert beginner is not unrelated. Early agile had the right mindset, but it's morphed into a gross, ineffective caricature of itself that operates on a factory metaphor. The expert beginners need all the micromanaging that comes with scrum, the project managers are more than happy to provide it, and the people who know what they're doing get dragged down into it because management thinks every programmer is as bad as the average.

> I was talking to some recent CS grads about Turing completeness and they had no idea what I was talking about. How can you have a CS degree and not know Turing completeness?

Having studied CS, I agree that you should know about the concept, but does it really give the normal Software Engineer an advantage in their day to day? Most likely not. It always depends on what you are working on, but the body of CS knowledge is huge and it requires a lot of work to keep all the concepts fresh in your mind if you are not applying them. I'd rather have people with a strong grasp of software architecture, algorithmic complexity, system design and networking protocols. On the other hand a game developer should be strong in other theory areas that they need for the job. Machine Learning engineers have again different focus areas etc.

> There are so many developers terrified of C.

I don't think even an above-average developer can make C work reliably. You absolutely should be terrified of C. And C++. Now, if you had said Rust, or Haskell, I might see a point to what you're talking about.

I agree that the developers we hire nowadays don’t know as much about virtual memory, CPU pipelines etc. But they don’t need to.

I feel that the Open Source ecosystem has made building software a lot easier.

Also, the sheer number of software engineers has gone up. 20 years ago, those who chose to become software engineers were often very passionate about it. Today, it’s like any other trade; you get a larger variety of people.

My last point is that organisations have also matured and know more about what to expect from software engineers. When I started programming professionally I had a lot more time to finish any given feature, today everything needs to get out the door faster.

Overall, I think we are building better software today.

I am not a developer, so take this with a huge grain of salt, but this comment sounds a bit like old school photographers saying iPhone and Instagram are not for “real photographers.”

Yep. Exactly.

“How one can be a car mechanic without knowing how to blacksmith and forge car parts manually, from metal ore”.

I think that's right. Tangentially related: I've been thinking a lot lately about how tone often (but not always) implies some kind of distortion in thinking, to the degree that you can _anticipate_ that some position is incorrect without necessarily being able to intelligently articulate why.

I am a developer with a CS degree and think you hit the nail on the head with that observation. There are also plenty of other things developers have to know about now that you didn't have to 20 years ago - security, privacy concerns over data, devops etc.

It's very lazy thinking to resort to a "kids these days" mentality of modern software development.

As a software engineer with CS background, I first wanted to disagree, but have to say that the analogy checks out.

In 1994, my new boss turned to me and said "I've been despairing that no young people seem to have the hacker spirit any more. No sense of technical adventure."

So I reminded him that Socrates was supposed to have said "The children now love luxury; they have bad manners, contempt for authority; they show disrespect for elders and love chatter in place of exercise."

He got over it.

> It feels like a lot of engineers nowadays don't seem to have a good CS background. They don't seem to understand things like caches, paging, virtual memory, CPU pipelines, algorithms, or other things pretty fundamental to CS.

Yes but nearly every software developer job is web stuff these days and in the web world, they don't need it. (Cue all the HN posters saying "college is a waste of money" and "it's just a piece of paper".) All they need to be able to do is glue together libraries and frameworks created by people who do happen to have good CS backgrounds. You really have to go out of your way to find jobs that actually require knowing "cache, paging, virtual memory, cpu pipelines, algorithms", etc.

> When we have the next industry crash (.com crash) will these people stick around?

It won't crash, because everybody needs a web site these days, but it will become commoditized because the bar to entry keeps getting lower and lower. I've seen some comments already on HN saying that software pay is gradually becoming bimodal.

There are so many things to do with computers that have nothing to do with the web, where native code is incredibly important. There's probably even more of those jobs than there are web dev jobs, it's just that hacker news is a bit of an echo chamber.

Kind of a half-assed estimate but summing up web/mobile vs everything else (excluding QA) from the counts from this article https://learning.linkedin.com/blog/tech-tips/the-american-ci... gives us about 13% of software jobs being non-web. I lump in mobile with web since a lot of mobile apps are cross platform hybrid apps using web stuff.

Personally, I think even that estimate is too high; I'd be surprised if the true number wasn't somewhere closer to 5%.

Go on then, go look for them.

Go search on job boards and find all those mystical jobs.

They are the tiny minority.

The last job I had was doing graphics coding for a video game, and before that it was C++ CAD programming. They’re not hard to find at all. They probably seem rarer because the barrier to entry is much higher, but I think that's a good thing.

Selective hearing from you.

You claim there's MORE than web dev jobs. Go to a job site, type in developer, and count the web dev jobs. Now count the non-web dev jobs.

Worse still, you're repeating this and replying to me even after ThrowawayR2 gave you a hard number to go off (a mere 13%).

Not only do you have web businesses, but also almost all enterprise apps are now web apps. I'd guess that those are the vast majority of jobs in the market, but again you probably won't believe that.

They're all web dev jobs because it's easy to deploy. 20 years ago, I used to work in places that had to roll out desktop apps to everyone. It was a nightmare, as well as a security nightmare as they had to have the db open to the whole network.

I've worked with a number of bootcamp graduates and I've found the experience extremely refreshing. There are perspectives, skillsets and domain expertise that can come from having transitioned into an engineering career later or from other fields, that more conventional computer scientists often don't/can't have. Whilst there are situations where pre-requisite CS knowledge is crucial, my experience is that there are equally many where that doesn't factor heavily into your ability to effectively work in a technical role.

I don't mean to be disrespectful (but will risk doing so, as I thought the original question was negative), but ironically, as an employer I would place a premium on the ability to communicate effectively and respectfully, unlike the way the question above was asked.

N.B. I'm a "vanilla" CS grad (although I wouldn't claim to have in-depth knowledge of the areas you've listed twice).

1. Yeah, people seem to be copy-pasting incoherent difficult-to-read stuff everywhere.

2. No seriously though - what you've written touches on CS and completely ignores designing programs. You can be great at CS and still write crappy programs.

3. Though it's unlikely to have an industry-specific crash, recessions are inevitable and then people (competent or not) lose their jobs.

4. I guess my whole point is: even if these things were true, so what? Each individual person has the choice to go into whichever field they like. Each person is also free to spend as much time and effort as they want on improving their CS and software-engineering skills. This knowledge isn't somehow exclusive to degree holders; there are so many free resources on CS and SWE. And as I mentioned, the degree doesn't guarantee that you'll be able to create good software.

It hasn't gone down; it has expanded and, as a result, diluted the general base of knowledge. At the same time, that isn't as much of a problem as you'd think, as most 'SWE' isn't really that; it's more of a cookie-cutter basic CRUD/feed/CMS setting where real in-depth knowledge isn't required to make a product work 'good enough' and keep a job.

There are still plenty of people that actually do know the theory and actually do work on low-level stuff, but at the same time the reality is that the olden days of engineering aren't coming back; they are what is currently often referred to as the '10x engineer' type of work. It doesn't scale, it doesn't work well with others and it doesn't return on investment all that well.

Unless you need someone who works on hardware, kernels, compilers, runtimes or severely constrained constructions (query planners, memory managers, transaction engines etc.) it really doesn't matter as much as it used to.

I'm sure the absolute number of software engineers who have a background in CS is increasing. However, because of the massive increase in developers without a solid background in engineering or CS, the feeling in the industry matches the OP's sentiment (which I've shared for a while now). The ratio of quality developers to unskilled developers is worse than ever. This is what I've been referring to as the blue-collarization of software development. I believe that building software is going to become much more similar to being an auto mechanic in the decades to come. There is absolutely a need for, and an availability of, very skilled software developers, but the day-to-day work is going to be much more accessible to those without the rigorous CS background that one needed 10 years ago.

I've worked on a lot of old code bases that were built by this previous generation of engineers, and let me tell you, their code was awful. Software engineering has learned a lot over the last decade.

Sure, there are some things that aren't emphasised as much, but most of them aren't super useful. JavaScript, CSS, and mobile development are way more useful today than algorithms, CPU pipelines, etc.

Most of the last round of unicorns you could build without any of the items you mentioned, but you couldn't even get started without an understanding of modern web or mobile development.

This is coming from someone who is by no means an expert on any of those things, but who took a bunch of courses on them in college and has only used them a handful of times in the last decade of professional software development, to eke out very small performance improvements.

It looks like these things don't matter anymore for a lot of companies, or at least they are not valued. What's valued is creating something that sort of works, in the smallest amount of time possible, to get the customer to pay and let management tick the boxes in the Excel sheet.

The Java app at my current company does 23k database queries, uncached, for a directory listing with 100 files in a folder. Two devs have worked there for years, and nobody bothered to look up the rather well-documented API of the upstream software we use, which handles that task with a single query.
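I don't know that app's schema, but this is the classic N+1 query anti-pattern. A hypothetical sqlite3 sketch of the shape of it (the files table and listing functions are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE files (id INTEGER PRIMARY KEY, folder_id INTEGER, name TEXT);
    INSERT INTO files (folder_id, name) VALUES
        (1, 'a.txt'), (1, 'b.txt'), (1, 'c.txt'),
        (2, 'unrelated.txt');
""")

def listing_n_plus_one(folder_id):
    """Anti-pattern: one query for the ids, then one query *per file*.

    With 100 files (and a few per-file metadata lookups each) this is
    how a single directory listing balloons into thousands of queries.
    """
    ids = [row[0] for row in conn.execute(
        "SELECT id FROM files WHERE folder_id = ? ORDER BY id",
        (folder_id,))]
    return [conn.execute("SELECT name FROM files WHERE id = ?",
                         (file_id,)).fetchone()[0]
            for file_id in ids]

def listing_one_query(folder_id):
    """The fix: fetch the whole listing in a single query."""
    return [row[0] for row in conn.execute(
        "SELECT name FROM files WHERE folder_id = ? ORDER BY id",
        (folder_id,))]
```

Both return the same listing; the difference is round trips to the database, which is exactly what an ORM or a careless loop tends to hide.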

Personally, I'm frustrated, and I'm looking to learn more CS to get a job that values quality over quantity.

I'm not sure quality has gone down as much as the need for programmers has drastically expanded. Now pretty much all businesses have the need for custom software to some degree, and many of those businesses do not have a big budget.

What do you get when you have millions of businesses who need ten million dollars' worth of software but can only spend 100k? You get lots of shitty developers coming out of the woodwork to meet this demand, and you get what you pay for.

In the 12 years I've been doing this, I've just noticed that all the good software developers are working at the places with big bucks, and all the shitty developers are picking up the scraps.

I'd beg to differ on this point. Those who are driving this hiring exercise are more often than not degree-holding. And that is where we have fallen: school != expertise, in any regard.

I would generalize it more: people don’t seem to understand the domains they’re getting into. I know devs who are expected to do a lot of Linux work who flounder on the command line. I know devs who write “services”, but don’t know how to administrate the server their service runs on. I know devs doing database work who can’t write JOINs.

I think this is partially because of how democratized the field has become. Any asshole with a computer and some time can write code. I don’t think this is fundamentally a bad thing. But after a certain point, it stops being a hobby for some, and turns into a job. The line is blurry, but at some point you have hobbyists writing software-as-a-service with no knowledge of things like security. I don’t want to gatekeep, but I also don’t want dilettantes getting my identity pwned because they took a JavaScript course and thought that qualifies them to write a SaaS product.

Admittedly, it never hurts to understand the computer at a fundamental level. I think anyone who does automatically has a leg up on anyone who doesn’t. I expect my mechanic to know how engines work, even if they just change my oil. You certainly can change oil without being a mechanic, but you lose out on some depth of knowledge.

So yes, I think there’s rampant dilettantism in the field right now. Short of licensure and/or some laws with teeth, I don’t know how to prevent it. You can nail a lot of legs to a dog and make it an octopus, and there’s still apparently money to be had there, so the octopus keeps moving.

I've been coding for at least two decades, and possibly a lot longer if I want to stay vague about my age. It's definitely not getting worse imo. First, a CS education is nice to have but being a great engineer and having a degree aren't highly correlated. What makes a great engineer is someone who loves programming, and someone who builds lots of things. You learn by doing. You cannot stop people like that from learning and improving.

There are certainly far more programmers every passing year. Does someone need to be a highly trained engineer to build a small marketing website? Likely no. Is it a bad thing for our industry to have such a high demand for programmers with varying degrees of ability and training? I don't think so. What I envision instead is a future where basic programming is like driving a car. Lots of people will know how to do it.

I follow many great engineers and can even look at their code and learn from them. They are building things the best of us thought would be impossible a couple of decades ago. It's a super easy choice: if I could choose to go back to any point in time in the history of software engineering in order to experience the highest levels of quality, I would choose to be here, now.

Most engineers have always been terrible. Been true for decades.

But perhaps with demand for engineers really high, the economy at full employment, and the big tech companies growing while explicitly optimizing their hiring practices for things other than "best SWE skills", more of the not-very-competent folks are getting into places you don't expect to find them.

Yes, I do, and I think a lot of it has been driven by hard and fast (reckless) development of really odd frameworks like React Native:

  class AnimatedCollapsible extends React.Component {
    state = {expanded: false};
    render() {
      return (
        <View style={{overflow: 'hidden'}}>
          <TouchableOpacity
            onPress={() => {
              LayoutAnimation.configureNext(LayoutAnimation.Presets.spring);
              this.setState({expanded: !this.state.expanded});
            }}>
            <Text>
              Press me to {this.state.expanded ? 'collapse' : 'expand'}!
            </Text>
          </TouchableOpacity>
          {this.state.expanded && <Text>I disappear sometimes!</Text>}
        </View>
      );
    }
  }

Look at this code. How is this okay? When were framework developers taught that mixing business logic and UI is a good idea? Seriously?

> When we have the next industry crash (.com crash) will these people stick around?

I feel like the people who jumped into boot camps to make a quick buck will move on to the next hot thing

I'd like to think that the people who are in the industry because they actually enjoy the challenges and problem solving will stick around

SW development isn't real engineering, just as computer scientists aren't real engineers. That said, we do have a generation of coders who are proficient at googling for terms and then copying/pasting. They hit brick walls when it comes to debugging and testing code.

I strongly disagree with your understanding of software engineering. In my experience, the best engineers adapt as layers of abstraction permit them to no longer be as concerned about the lower level considerations, such as the ones you noted. The engineers most useful to a business are generalists who have an enormous breadth of knowledge and can specialize at will, where and when it becomes necessary. Much of the knowledge you’ve acquired in the past, will and should atrophy as it becomes less useful. You need to stop having a fixed impression of what a software engineer is and realize that the definition is constantly evolving.

Yes. In my last job, most people were self-taught with no formal education, and it was a real hassle to explain basic concepts. Discussions didn't really progress, because everything had to be explained in detail; they simply didn't understand the concepts. This was exhausting and draining. There was also a naive belief that everything on the internet is true.

At my current job, people have a much better understanding of how things work and how to apply them. Discussions are simpler and shorter. There's a huge difference in how the work is done and in how code reviews go, because they are much more spot-on and not haphazard or random.

There are at least two factors in motion in this question: the exponential growth of abstraction layers and the career trajectory of the observer. Up to a point somewhere in midlife, the observer will have increasing knowledge of more and more of the layers, whereas the average SWE will have knowledge of only the top few. Hence the observation of a quality decline. Wait a few more years until the ever accelerating layers are coming tetris-like too fast for you to handle with your declining brain power and the observation will reverse.

A lot of the things you mentioned are abstracted away from most Software Engineers nowadays. Of course it's good to know how these things work, but for most SWE jobs it is not necessary to have a strong grasp of them. The world of software has grown tremendously, many companies need rather simple software that is built with high level frameworks and tools. On the other hand there are still people building compilers, game engines, routing algorithms or work on embedded systems where all the things you mentioned are much more important.

Backend code has moved a little. Java's popularity might even be waning.

I don't know about anyone else, but most of my time is spent wrangling enormous scripts in tools like CloudFormation to deploy relatively small amounts of code.

What's happened with JavaScript on the frontend is nuts though. I can't follow what those guys are up to anymore and can only assume they have a better understanding of caching, algorithms, etc with the explosion of code there compared with the days before even jQuery.

No. It's never been easier to make high quality software. I can run my own cloud environment, make quality user interfaces, integrate with third party providers to trivially do previously highly messy things like accept payments or send text messages, all as a solo developer.

Not because I'm particularly good but because a lot of the foundational work is now available with `git clone`.

There's a lot of crap around of course, but that's always been the case.

CS has expanded so much that unless you are a genius you can't really know it all, so corners have to be cut. Depending on where you position yourself in the stack, deep understanding of hardware may or may not be important.

As for algorithms: for most of the stuff, what people really need is a basic grasp of the complexity of the common containers and the access times to various parts of the system; beyond that, just learn on the go.
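That "basic grasp of container complexity" can be demonstrated in a few lines. A quick Python sketch (not from the thread, just an illustration): membership in a list is a linear scan, while a set hashes its elements, so the same check is roughly constant time.

```python
import timeit

# Membership check: O(n) for a list (front-to-back scan),
# ~O(1) for a set (hash lookup).
n = 100_000
as_list = list(range(n))
as_set = set(as_list)

# Look up the worst-case element for the list (the last one), 200 times each.
t_list = timeit.timeit(lambda: (n - 1) in as_list, number=200)
t_set = timeit.timeit(lambda: (n - 1) in as_set, number=200)

assert t_set < t_list   # the gap keeps growing as n grows
print(f"list: {t_list:.4f}s  set: {t_set:.4f}s")
```

Nothing here requires knowing how CPython's hash tables are laid out in memory; it's exactly the kind of working knowledge the comment above is describing.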

I think what has gone down is the value of the aforementioned skills. Aside from caching and algos, very few engineers I know at FAANG are using any of these. I have a formal CS background, but in the industry these topics have been abstracted away to the point where you don't need to worry about them 99% of the time.

If you're building extremely complex or expensive systems, then sure, but most businesses aren't.

Just on your last point: I think that when we have the next economic crash, some of these people, and some talented engineers as well, won't stick around in software engineering. Careers in technology have stages, and not everyone stays a developer their whole career. It will be impossible to measure exactly what percentage of the people without "a good CS background" leave.

I'm only irked by toy projects like terminals on Electron, or by reinventing the wheel instead of chaining shell commands.

On paging and CPU architectures: these aren't universal or usually exposed by standard libraries. Relying on them only serves to obfuscate your codebase and tie it to concrete hardware.

Just optimise your hot paths, let Intel/AMD/whoever make better optimising compilers for what they engineered.

Has quality gone down? No. What you're talking about with people not knowing these various things and going to bootcamps, those are all developers. Developers are not engineers, no matter how many times HR mixes up the terms.

It's like asking if the quality of mechanical engineers has gone down because mechanics don't know x, y, and z.

Recently I had the pleasure of working with people in a project who I (CS background) hired myself, and I now couldn’t care less about how much they know about memory paging. The most important traits to me are:

1. Intrinsic motivation to be productive and solve problems

2. Ability to communicate well, especially via specifications and Slack.

Anything else can be learned in a few months max.

Nah, I don't think so. Software engineering has always been a field where the barrier to entry is quite low, and where a lot of people didn't need a CS background to get started. And for many of them, they still don't. Web development, for instance, requires more knowledge of how browsers work than of the computers they run on.

Even so, the number of mediocre/terrible software engineers and web developers doesn't seem to have changed much overall. We all know of old school desktop software that was incredibly poorly written. We all know examples of games which were incredibly poorly written. There have been badly coded, badly designed websites from people with little experience in the field since the web first became a thing.

Plus people joined because it paid decently back then too. The dotcom boom brought in a lot of people who only cared about the wage slip at the end of each month. The early days of gaming had tons of people jumping on the hype train for money, whether it was the home computer scene in the 80s in the UK or the early console gaming one when Atari was still a big player. Or perhaps for every generation since.

As for whether they'll stick around if it crashes? It depends. It may very well not crash at all. And while a certain percentage will leave if it does, others will stick around and learn more instead.

So no, I don't think the quality has gone down. There have always been people from informal backgrounds, a knowledge of CS hasn't ever been necessary or mainstream overall, and mediocre to terrible programmers and developers have been a thing since the field began.

I'd say it's the quality of SW Engineers has stayed roughly the same, but the quality of SW Engineering has gone up. I worked at eBay back in 2000ish when it was all C++ and there were still plenty of kids that didn't understand pointers very well etc. Now at least they don't have to as much.

The need to do good bold things has gone way down. The need to understand the machine has gone way down.

I'm way more concerned with that. Longer term, how SWE is supposed to maintain itself and train new generations when the field looks so homogenous and we are so well served is a prospect that does keep me up some nights.

Being somewhat proficient at both sides of CS, practical and theoretical, requires a lot of effort, and no school is going to teach it all.

Making constant progress (learning) on the modern, everyday-useful things while not giving up on things like security, protocols, lower-level stuff, and hardware requires a lot of discipline.

Things like cache, paging, virtual memory, and cpu pipelines are not important to "CS" or most software development. Based on your list of priorities, I'd guess you have much to learn about what goes into good software engineering. It's not a bunch of facts about CPU's.

>> cache, paging, virtual memory,cpu pipelines, algorithms or other things pretty fundamental to CS

Algorithms is the only thing you mentioned that has anything to do with Computer Science. The other stuff is Computer Engineering.

No, not at all, quite the opposite, actually.

It's also much easier today to write good high quality code than it was when I started professionally, which was ~15 years ago.

However, I never worked with a boot camp grad, only people with degrees.

I haven't read anything convincing that it has gone up (no new paradigms, no new architectures, no novel technology, just endless horizontal scaling). So, statistically, it can only have gone down.

What about big-O (Ordo) notation, complexity theory, context-free grammars, type theory, information theory, and lambda calculus? Topics that are really important to Computer Science? :p

Can you suggest a product update in the last 8 years that is clearly better than its previous version?

Yes. The quality of software has definitely gone down. Then again, so has the quality of everything else, from engineering to urban planning to infrastructure maintenance. A possible explanation for this phenomenon is that the global mean IQ has also fallen quite a bit over the last 50 years and continues to fall[0].

[0]: https://www.fourmilab.ch/documents/IQ/1950-2050/

That's quite curious, because previously I thought the average intelligence was rising, to the point where the IQ had to be renormalized regularly for the mean to stay at 100.

This source [0] seems to agree with that view. What's the deal here?

[0]: https://ourworldindata.org/intelligence

My feeling is that the quality of software engineering has gone up dramatically in the past decade.

> It feels like a lot of engineers nowadays don't seem to have a good cs background. They don't seem to understand things like cache, paging, virtual memory, cpu pipelines, algorithms or other things pretty fundamental to CS.

Pipelines and caches and paging and virtual memory are stupidly complicated in modern processors. If you claim to understand these things and you don't either work at the company or have an NDA with the company so that you can implement drivers, you're probably full of shit.

What I can't stand are the "highly-ranked" schools that introduce students to a very basic, abstract (and outdated) notion of these topics, so that students enter the workforce overconfident that they have understood them. You haven't understood the topic, and having a rough notion of it can oftentimes be worse than knowing nothing at all.

tl;dr: Modern processors are proprietary IP and you should be skeptical of anyone who claims to deeply understand it but doesn't work for the company making it. You do not need to understand how one works to be a great software engineer.
