Computer science as a lost art (2015) (rubyhacker.com)
247 points by jxub on July 30, 2018 | hide | past | favorite | 141 comments

Well stated.

I recently went through (the recordings of) MIT’s intro course for electrical engineers, in which at one point the professor says students may wonder why they have to do all this calculus and learn FET models and so on; in real life, don’t you just wire chips together? And he points out that MIT degrees are for the people who make the chips.

> he points out that MIT degrees are for the people who make the chips.

You never know when background knowledge and first principles might come in handy. One of my favorite practical-engineering YouTubers has a story about going on a boat trip. The new coffee maker on board was freaked out by the noise from the inverter and kept shutting itself off. A total disaster! There would be no coffee the whole trip. So he turned on the blender while making the coffee, and the coffee maker started working. How did he know? He knew what kind of motor was in the blender, and knew its windings would increase the inductance of the circuit powering the kitchen appliances, filtering out the higher frequencies put out by the cheap inverter.
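For intuition, that explanation boils down to the impedance of an inductor growing linearly with frequency (|Z| = 2πfL), so extra series inductance passes the 50/60 Hz fundamental but chokes the inverter's high-frequency noise. A back-of-the-envelope sketch; the 10 mH winding inductance is a made-up illustrative number, not anything measured from a real blender:

```python
import math

# Hypothetical motor-winding inductance, 10 mH (illustrative only).
L_henries = 0.010

# Mains fundamental vs. typical cheap-inverter switching harmonics.
for f_hz in (60, 3_000, 20_000):
    z_ohms = 2 * math.pi * f_hz * L_henries  # inductive impedance |Z| = 2*pi*f*L
    print(f"{f_hz:>6} Hz -> |Z| = {z_ohms:8.2f} ohm")
```

At 60 Hz the added impedance is a few ohms; at the inverter's switching harmonics it is hundreds to thousands of ohms, which is the low-pass behavior the commenter describes.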

This guy isn't an electrician. ("Elekchicken") His day job is just to put pieces of "industrial lego" together -- just like how so many programmer jobs now are mainly about gluing libraries together. But he never shies away from knowledge of first principles, and he demonstrates all the time why such knowledge is valuable.

Background knowledge and first principles also come in handy with more mundane work.

Arguably our most prominent contribution to OSS (outside of our own projects) was in React. We're credited in the codebase where the approach was used: https://github.com/facebook/react/blob/v16.0.0/src/renderers...

As discussed in the original PR (https://github.com/facebook/react/pull/4400), the original checksum used a naive adler-32 algorithm, but with some basic math you can find a much more efficient implementation that eliminates most of the mod operations without risking a hidden deoptimizing overflow.
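The PR itself has the details; the general deferred-modulo trick is easy to sketch, though. This is a generic Adler-32, not React's actual checksum; the NMAX chunk bound is the standard zlib value, chosen so the running sums can't overflow 32-bit arithmetic before the next reduction:

```python
MOD = 65521  # largest prime below 2**16

def adler32_naive(data: bytes) -> int:
    # One mod per byte for each running sum: correct but slow.
    a, b = 1, 0
    for byte in data:
        a = (a + byte) % MOD
        b = (b + a) % MOD
    return (b << 16) | a

def adler32_deferred(data: bytes) -> int:
    # Defer the mod: reduce once per chunk instead of once per byte.
    # NMAX is the largest n with 255*n*(n+1)/2 + (n+1)*(MOD-1) < 2**32,
    # so in 32-bit arithmetic the sums still can't silently overflow.
    # (Python ints are unbounded; the chunking shows the 32-bit-safe bound.)
    NMAX = 5552
    a, b = 1, 0
    for start in range(0, len(data), NMAX):
        for byte in data[start:start + NMAX]:
            a += byte
            b += a
        a %= MOD
        b %= MOD
    return (b << 16) | a
```

Since modular reduction commutes with addition, both versions agree; the deferred one just does ~1/5500th of the mod operations.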

It's not every day that you get to use your background knowledge, but when you do it feels great!

I'm intrigued. What channel? I greatly appreciate this type of intuition / problem solving. AvE is one of my favorites.


Ah, must have missed that video :P Thanks!

Share the channel please


It’s kinda like that Asimov story, Profession:


This is the best, most succinct, answer to do you or don’t you need an eng degree I’ve ever seen. Thank you

Do you want a 200k/yr 50hr/week job? If so, get the bestest degrees.

If you want a 110k/yr to 1B/yr job, self taught with some confidence is my recommendation.

Jeff Bezos went to Princeton, Zuckerberg and Gates to Harvard, Sergey Brin and Evan Spiegel to Stanford. The bestest degrees seem like a decent option for the xB/year category too.

With the exception of Bezos, everyone on your list dropped out of their degree before finishing it - although to be fair to Brin, he already had a degree from Maryland before dropping out of his PhD.

Going to schools like that is more about having the right pedigree and connections. It’s much easier to get VC funding or get hired into a tech company as a Stanford dropout than as a triple doctorate from Croatia.

Dropping out doesn't mean they didn't already get a lot of benefit from their schooling. Just that they realized they didn't need the piece of paper and could do the rest on their own.

The fundamentals especially are where I think most of the value of a uni degree lies.

Yeah, there's a huge difference between "so-and-so left a PhD program to start a company based on his research" and "so-and-so read a web framework tutorial instead of going to college."

Dropped out because they had companies to run, though. That's a bit different.

I've worked for years as a self-taught developer before going back to get my CS degree, I've taught at a bootcamp, and I've hired bootcamp graduates.

Bootcamps can be valuable, but they are in no way comparable to a 4 year degree from a decent CS program. The top performing bootcamps are either functioning as an extended job interview that you have to pay for, or they are very good at selecting experienced students who only need a 12 week course to be ready to be productive developers. In my opinion, the reason we've seen bootcamps close or fail to expand is that there is a limited supply of these types of students.

For the vast majority of people a 12 week course, no matter how intensive, is a good introduction, but a lot of training is still necessary to be useful. If you are prepared to invest in that training, they can be great hires. However, you need to be aware that it's likely going to be months before you get real productive work without hand-holding. It takes most people a lot longer than 12 weeks to be comfortable with the basics of moving up and down through levels of abstraction.

I'm a self-taught web dev, and I want to further my CS education but can't go back to school for various reasons.

I've seen various syllabi, e.g. teachyourselfcs.com, and though they seem legit (and I've dabbled in some courses) I don't quite see the application/direct benefit professionally.

Let me phrase it another way; when I got started it was easy to see why I needed to learn front-end and back-end to make a web app (for a CRUD job). Now I want to go further, but where?

I think it would be helpful to see what jobs I could get by furthering my CS fundamentals, and not just "Senior Software Engineer".

So, as someone with your unique perspective, what do you recommend? Should I really slog through ye olde CS curriculum in hopes that one day I'll be able to apply some of it? Can you recommend another approach?

Again, put another way, some of these "top performing bootcamps" sharpen your React skills and whiteboarding skills, which have a career/market value. But they don't appeal to me because they don't seem academically/CS focused.

I hope this duality/constraint came across.

Any guidance appreciated.

Knowing CS allows you to build unique solutions from first principles. In cases where you're tied to applying solutions that have already been decided there will be less opportunity to use your CS knowledge. I think the majority of the time for most jobs you're just applying solutions, so it's much clearer how this would benefit you professionally.

Depending on the kind of work you do, the minority of the time where it would be beneficial to know CS might have a big impact both on the product and your reputation in the company. This could help you move up to higher, better paid positions, but it also might not. Teaching yourself CS is a big time investment and if you just want to maximize your salary there's probably better ways to do it.

I think it's really only worth it if on some level you enjoy it and find it interesting. You can seek out jobs where they use more CS, but again this doesn't guarantee you'll advance professionally. But if you're the type of person who enjoys programming as a creative activity, I think CS can be very rewarding because it opens you up to what's possible.

>I think it would be helpful to see what jobs I could get by furthering my CS fundamentals, and not just "Senior Software Engineer".

This is going to be difficult, because getting a job other than Software Engineer (or something related in management or sales) often requires credentials.

However if you'd like to get a higher paying software engineering job at a company like Google, CS fundamentals can definitely help you get through the interviews.

>Should I really slog through ye olde CS curriculum in hopes that one day I'll be able to apply some of it? Can you recommend another approach?

Yes to the first, no to the second. CS fundamentals build on each other. It's difficult to understand algorithmic analysis if you don't understand calculus (at a basic level), and it's difficult to understand automata if you aren't familiar with set notation from discrete math etc...

Learning on your own, it's tempting to avoid the boring stuff and focus on what's fun or seems useful to you right now. The problem is you don't really know what's going to be useful later on.

You already know how to code, so if you are really interested in learning the fundamentals, first I'd make sure you have a basic grasp of high school math through calc I. Then I'd go through a good discrete math textbook (I'd recommend one that's CS focused). After that I'd work through a data structures and algorithms textbook, then a programming language concepts textbook, a computer architecture/organization textbook, and finally a theory of computation/automata book.

You'd have a decent grasp of the fundamentals by then. After that if you want to go further, I'd go through books on networks, operating systems, compilers, and functional programming.

I'm also a self-taught web dev, trying to learn more about the fundamentals. I've been trying to get through this course, and while the going is slow, it has been pretty incredible:


Not OP but if you're looking for a way to learn CS fundamentals thru practice I would strongly recommend trying to write a Gameboy emulator.

The project is challenging but has a pretty rewarding feedback loop. Along the way you'll understand how high-level concepts like graphics are implemented on top of hardware, computer architecture, CPU behavior, memory access, interrupts, timers, and so on. It'll also challenge your software design skills and project planning. It might not be the best project if you are more interested in CS algorithms, though, but I can't really think of anything that has a more enjoyable feedback loop.

previous hn discussion: https://news.ycombinator.com/item?id=17134668

If you want to learn the fundamentals of CS (and then some), https://en.wikipedia.org/wiki/The_Art_of_Computer_Programmin... is probably going to contain everything you want.

I always enjoyed teaching what I know.

Why don't you send me an email (on my profile) or a DM on Twitter @siscia_? Twitter, I believe, is the fastest.

Anybody else, feel free to do the same :)

Computer science graduates also need a lot of training, if not more. A boot camp graduate has at least been exposed to modern tech stacks enough to come in as a junior. Too many CS grads have too little practical experience to hit the ground running on even the simplest real world projects. Yes, I have a CS degree.

>a boot camp graduate has already been exposed enough to modern tech stacks to at least come in as a junior.

From my experience that is not true. Bootcamp graduates (obviously barring additional experience) tend to be qualified to start at the intern level.

>Too many CS grads have little practical experience

Compared to veterans yes, but compared to new bootcamp graduates I disagree.

What CS grads can lack vs bootcamp graduates is the kind of thing they can pick up in a few weeks (as evidenced by bootcamps only lasting 6-12 weeks)--the reverse usually isn't true.

Practically, I've almost never seen this. Just like I've never interviewed a CS graduate who couldn't use a for loop, despite all the horror stories that people like to bring up. Even the most theoretical CS programs tend to require several sizeable projects.

Bootcamp grads can be a good find if you're willing to train, but in general that's because they are cheaper than CS grads, and for some level of work they tend to be good enough, not because they are better programmers.

There are obviously many exceptions to this, and I'm only comparing CS grads to new bootcamp grads assuming similar backgrounds, ages, etc. One advantage bootcamp grads do have is that they tend to be a little older/more mature, but that really has nothing to do with bootcamp vs university.

In my experience CS grads tend to be far, far stronger out of the gate than boot camp grads. Boot camps tend to get you to intern level, generally.

Depends on where you get the CS degree.

I'm currently doing CS and one of the core requirements for finishing the degree is to have at least one semester-long internship at a corporation in something that relates to your degree. I did backend development for half a year, and the rest of the courses are fairly real-world focused once they have dug through the theoretical meat.

It should be a requirement to have each student complete a modern full stack.

Some schools have pitiful results for what they say is a degree.

Alternatively, many CS grads never end up working in or near the web stack, and that requirement would in no way better prepare them for their profession.

If you wanted to suggest that they be required to have a significant capstone project (a full stack system, a compiler or VM, an operating system (rudimentary or not), firmware/BIOS, an HPC application, etc.) then I would agree, but to say that all CS grads should be good web devs is rather narrow-minded when looking at the value of a computer science curriculum.

Why not have a curriculum that focuses on the full stack?

The full stack consisting of assembly, compiler design, C, networking, operating systems, embedded development, data structures and algorithms, and database development and design (including an understanding of the algorithms/optimizations used).

At least one class in functional programming and at least one in an OOP language.

And for all that is holy, please teach classes where they have to do presentations and learn interpersonal skills, plus a few business classes. I am so sick of working with people who argue about technology but can’t come up with a business case for their idea.

And then let them choose a “track” based on their interests, after having, for lack of a better term, a “career day” where people from the industry come in and give talks about what they do and the requirements to get a job there.

If that curriculum means cutting out some core classes, so be it.

> The full stack consisting of assembly, compiler design, C, networking, operating systems, embedded development, data structures and algorithms, databases development and design (including an understanding of the algorithms/optimizations used).

> At least one class in functional programming and at least one in an OOP language.

I think it's pretty common for CS degrees to have all/most of this in scope already? At least I had all of it, with the exception of embedded development.

Yeah... maybe I'm out of touch (started undergrad in 2002) but... that sounds like a CS degree to me! Mine didn't cover embedded at all, but if you took the Advanced OS course, you ended up writing an x86 bootloader in asm at the start, which could eventually jump to main() for your scratch-built OS.

You are essentially describing the computer engineering curriculum that we have at Virginia Tech and I imagine a number of other schools have as well. Modern Computer Engineering curricula are more or less old school CS programs in that they focus on everything from user space and below.

At least from looking at Georgia Tech, there is a difference between the Computer Science curriculum and the Computer Engineering curriculum. The Computer Engineering curriculum seems more practical.

It’s often argued here that the purpose of a CS degree is “not to get you a job” but to “become a better citizen of the world” and if you just wanted to learn something to get a job “you should go to a trade school”.

> It’s often argued here that the purpose of a CS degree is “not to get you a job” but to “become a better citizen of the world”.

I can't recall ever hearing that argument advanced for CS degrees on HN (and I read a _lot_ of HN postings). Perhaps you're thinking of liberal arts degrees?

The usual rationale for CS degrees is that one learns the underlying theory behind the systems and tech stacks that they use, making them a more capable developer.

Here is one discussion that I was involved in.


What's the point of a chemical engineering degree? It's to learn to be a chemical engineer, but it's not a trade school degree. Far from it.

And there's degrees in chemistry, for those whose goal is to study the way atoms interact with each other. That's different from chemical engineering, where the goal is to efficiently produce the desired molecules at scale, without blowing up the factory. Two different degrees, with two different focuses, in the same general area.

In the same way, I think that computer science needs to be a separate, different degree from software engineering. At the moment, CS is trying to be just one area, but I suspect that it's often inadequately preparing those who are going to go on to be software engineers - who are in fact the large majority of CS grads. (They may also be inadequately preparing those who intend to stay in theoretical computer science, but I have even less information about that.)

Getting a CS degree as vocational training to be a developer is a pretty terrible idea in my view (and I have a CS degree). A physics or mechanical engineering degree is hardly vocational training for being a plumber, so why would you expect a CS degree to be vocational training for being a developer?

It should help you get a job. No one says that a mechanical engineer shouldn’t be employable somewhere after they graduate.

I was involved in the hiring and filling of > 50 developer positions at a company, while being a tech lead.

We tried hiring a few people with just bootcamps. Only a few (so hardly a representative sample), but none of them worked out. As soon as they had to try something even the slightest bit different than what they'd done in the bootcamp they were lost. There were people with degrees in unrelated fields who then did a bootcamp who were good, and almost all of the CS/CE/EE people we hired were good.

I'm not saying this is always the case, but the two years of CS fundamentals seem to be valuable, AND the two years of unrelated core classes seem to be valuable. It might just be how it forces you to engage with and learn things you don't care about (because there will be times in any job you have to do that), or the people skills of having to learn to deal with professors and other students, or the pattern of constant learning and adapting it ingrains upon you, or something else entirely, but per the link, I don't think a bootcamp should ever be viewed as sufficient preparation for a career in development. It's fine in tandem with other things, but it's extremely limiting on its own.

There’s something to be said about spending years building complex things from the ground up.

My fav set of classes changed the way I think about things in CS. It was at UNSW in Australia.

Fundamentals - we started with C. I hated C segfaults but realized the power of pointers, structs and how memory kind of worked. We wrote a little 8-bit machine code and built a little VM.

Data structures and Algos - hashes, b-trees, graphs, etc. It was eye-opening to hear how these had been invented before I was born and that the latest and greatest databases still use their variants. We built a crawler and search engine with our own hand-written database. That’s where I learnt about mmap. A lot of fun.

Compilers - The whole, lexer, parser, checker, emitter pipeline. Write your own subset of C to jvm compiler. We did a simple lisp interpreter too.

Advanced graphics - we built our own little 3D game engine using only OpenGL calls.

Microcontrollers - literally programmed in assembly to operate a simple lift. Learnt about electrons and gates and how they make machine code work. Connecting compiler knowledge with microprocessor internals was a holy mind opener moment.
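The lisp-interpreter exercise from the compilers course above is small enough to sketch here. This hypothetical toy handles only integers, arithmetic, and `if`, nothing like a full course project, but it shows the tokenize → parse → evaluate pipeline in miniature:

```python
import operator

def tokenize(src: str):
    # Pad parens with spaces so split() does the lexing.
    return src.replace("(", " ( ").replace(")", " ) ").split()

def parse(tokens):
    # Recursive descent: a "(" opens a nested list of sub-expressions.
    tok = tokens.pop(0)
    if tok == "(":
        expr = []
        while tokens[0] != ")":
            expr.append(parse(tokens))
        tokens.pop(0)  # drop the closing ")"
        return expr
    return int(tok) if tok.lstrip("-").isdigit() else tok

ENV = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def evaluate(expr):
    if isinstance(expr, int):          # self-evaluating literal
        return expr
    if expr[0] == "if":                # special form: (if cond then else)
        _, cond, then, alt = expr
        return evaluate(then) if evaluate(cond) else evaluate(alt)
    fn = ENV[expr[0]]                  # function application
    return fn(*[evaluate(arg) for arg in expr[1:]])

print(evaluate(parse(tokenize("(+ 1 (* 2 3))"))))  # 7
```

Adding lambdas, environments, and a code emitter is roughly the arc such a course follows from here.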

There’s something to be said about “we’re gonna build mystery thing X from the ground up and connect theory to practical”.

Sucks that Australia has very poor VC funding and startup ecosystem. UNSW produces some phenomenal graduates that are quickly stolen by US tech giants.

For a counter example, I'll say that we've got 3 Bootcamp graduates on our team at work and they've all been very good. They've all moved into languages beyond what they learned in their courses. One has become a major platform contributor and has been a driving force in decisions being made. Another has been working with hardware and system deployment. They're all very driven, I've been quite impressed.

Now the caveat to my statements there are that all three already had degrees, in disparate fields unrelated to CS/CE. It's an example of capable individuals being able to learn practical skills in anything.

Those aren't really counterexamples. They corroborate this finding by the OP:

>There were people with degrees in unrelated fields who then did a bootcamp who were good

I also think bootcamps are not very selective once you get in. In my experience it's very hard to fail out of them and there's a huge amount of hand-holding. Compare this to a CS101 course based on SICP and it's easy to see how CS degrees select for people who are able to thrive when thrown into the deep end.

What I miss in those kinds of discussions are the intangible benefits of having studied a subject in depth to acquire a degree. The person that entered university is different from the one that came out of it. The way to tackle problems, to think scientifically, the ability to see the broader picture are some of the advantages of good education that are easy to dismiss since they are not immediately visible.

There has been a similar thread on HN where someone with a bunch of degrees said: "I haven't used anything from my studies in my work". But this person might be blind to the fact how the education shaped her mind. Understanding goes beyond mere knowledge.

My degree continues to inform me, years later, because we looked at some of the theory and at a lot of generally applicable ideas, not just tech specifics. In some ways it's frustrating that we didn't look at more tech specifics... but in others, I was given the time and the tools to pick things up quickly.

I've worked with a couple of good people without degrees, I'm not saying it's impossible, just that I've found it very useful. I don't think I'm alone in this!

(Also, c'mon, a degree is not just about the qualification or the education, it's about meeting peers and forming life-long bonds, and having fun too :)

-- edit -- Oh, and on-topic, I think it's good to test your boundaries - reverse engineering an embedded board with a multimeter, a line-levelling serial adaptor and a soldering iron, and then getting it to boot linux with some custom kernel bring-up code, was one of the most rewarding things I've done in recent years. Learning a little bit more about something that's a black box at the edge of your understanding is always good.

> There has been a similar thread on HN where someone with a bunch of degrees said: "I haven't used anything from my studies in my work". But this person might be blind to the fact how the education shaped her mind.

I tried to read through my uni notes on metric spaces a couple of years ago. I don't even understand them anymore. Littered with idiot comments like "obviously -> " despite it being nothing of the sort.

University me was a douchebag. :(

I'm guessing you don't use metric spaces in your day to day work. Take a look at notes from a subject you are actively using, and I think much of it would be obvious.

Yeah, I think it's also wrong. Most of the time I don't need any of my university knowledge at work. But it has helped me out many times just knowing that there is a better algorithm, or knowing how something works below, or knowing some extra math.

Same here. For me, often the most useful thing I bring away from my university CS classes is the knowledge that various problems have been solved, and even if I don't know those solutions (either I forgot them or never actually learned them), I know where to start looking.

I take this a step further and just always start with the assumption that a solution already exists. I've found that to be very rarely wrong in practice.

So, for the dogshed builders among us, what might be the recommended pathway to learn some architecture--beyond the obvious academic options?

I'm sure this has been covered to death on HN already, but if anyone has a link bookmarked and feels like sharing?

Read "Structure and Interpretation of Computer Programs" carefully (it's available online for free), preferably doing the exercises. You'll get a solid start.

Along those lines, the accompanying MIT class (MIT 6.001 Structure and Interpretation, 1986) is available on YouTube[0] and MIT OCW[1]. It's worth watching, if only for the following:

"I'd like to welcome you to this course on Computer Science. Actually that's a terrible way to start. Computer science is a terrible name for this business. First of all, it's not a science. It might be engineering or it might be art. We'll actually see that computer so-called science actually has a lot in common with magic. We will see that in this course. So it's not a science. It's also not really very much about computers. And it's not about computers in the same sense that physics is not really about particle accelerators. And biology is not really about microscopes and petri dishes. And it's not about computers in the same sense that geometry is not really about using surveying instruments."

[0]: https://www.youtube.com/watch?v=2Op3QLzMgSY [1]: https://ocw.mit.edu/courses/electrical-engineering-and-compu...

SICP is a Lisp textbook. This will be controversial, but I think it has little application outside of the Lisp world, however much it is venerated.

Yeah, it only teaches you about term rewriting, boolean logic, iteration vs. recursion, algorithm complexity, higher-order functions, data structure design, closures, generics, statefulness, environments, mutability, concurrency, stream processing, modularity, interpreter design and implementation, lazy evaluation, nondeterminism, logic programming (i.e. search/constraint), low level computer architecture, memory models, and the design and implementation of virtual machines, garbage collectors, and compilers. Just a total waste of time unless you're doing Lisp.
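To pick one item from that list: the stream-processing chapter translates to any language with closures or generators. A sketch of SICP's infinite prime sieve (section 3.5), with Python generators standing in for the book's delayed cons cells:

```python
from itertools import islice

def integers(start=1):
    # An infinite stream of integers, produced lazily.
    n = start
    while True:
        yield n
        n += 1

def stream_filter(pred, stream):
    for x in stream:
        if pred(x):
            yield x

def sieve(stream):
    # Sieve of Eratosthenes over an infinite stream, as in SICP 3.5:
    # take the head as prime, recursively sieve the non-multiples.
    head = next(stream)
    yield head
    yield from sieve(stream_filter(lambda x: x % head != 0, stream))

primes = sieve(integers(2))
print(list(islice(primes, 8)))  # [2, 3, 5, 7, 11, 13, 17, 19]
```

Nothing Lisp-specific about the idea; only the notation changes.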

SICP for python: https://wizardforcel.gitbooks.io/sicp-in-python/content/

LISP was the medium, not the content

I think the best way to learn architecture is to try (and initially fail) to do some projects which make demands on your architecture abilities. I'd recommend writing some frameworks or engines (without worrying about fully completing and releasing them!). Where I learned the most about designing reusable systems was building a game engine, then actually building a game or two on it and seeing what was painful and what worked and taking notes, then building another engine based on my experience, and repeating. I built three engines by the end, and made maybe four or five (little) games off of them.

I never released any of them and none are useful for anything any longer. None probably reached more than 75% completeness. But by trying to build games on them I saw with such clarity what worked and what didn't, and I got deep practice in thinking about more 'general' forms/concepts/structures. I wrote a framework for writing 3D model loaders afterward, and the lessons carried over. I was utterly lost when starting the first game engine; with the model loading framework I had to think deeply, but I knew where I was going.

Here's something to consider. If you try to just start writing 'a game engine' (or whatever engine/framework), that could mean almost anything (from 1 day of work to 20 years of work). But if you consider some concrete ideas for particular games you'd like to make, think about what they have in common, and then try writing a piece of software that takes care of the common parts, you'll have constrained the problem to something workable, with definite goals and a way of testing your success afterward.

This is at the heart of any sort of 'abstraction' you develop: you want to code several things which have a bunch of overlapping aspects, so you write something that captures the common parts, and then arrive at the several things you actually need by instantiating your abstraction with different parameters (I mean that in a very general sense, not with specific reference to OOP, though it provides means of instantiating abstractions with various parameters too, of course).

I'd say being able to do the above effectively is the cornerstone of being able to do architecture well. A bunch of other stuff comes after, like knowing when to not create abstractions, and all sorts of knowledge about doing things well for systems in particular domains (e.g. architecture for an interactive 3d simulation vs. web apps have very different patterns that have surfaced as effective).
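A toy illustration of that instantiate-the-abstraction move, with hypothetical names: two concrete renderers share a skeleton, which gets captured once and parameterized:

```python
def render_health_bar(value):          # concrete version 1
    return "[" + "#" * value + "-" * (10 - value) + "]"

def render_mana_bar(value):            # concrete version 2, same skeleton
    return "[" + "*" * value + "." * (10 - value) + "]"

# The overlapping part, captured once. The differing bits become
# parameters; each concrete renderer is an instantiation.
def make_bar_renderer(fill, empty, width=10):
    def render(value):
        return "[" + fill * value + empty * (width - value) + "]"
    return render

render_health = make_bar_renderer("#", "-")
render_mana = make_bar_renderer("*", ".")
print(render_health(3))  # [###-------]
```

The same judgment call applies in miniature: with only two variants, the abstraction may not yet pay for itself, which is the "knowing when not to create abstractions" point.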


I didn’t do CS in university. Following the above got me up to speed. I’ve seen it several times on HN. This time it’s me promoting it.

The best introduction I’ve ever read is Michael Sipser’s Introduction to the Theory of Computation. It is eminently readable. It somehow manages to be logically rigorous while remaining approachable. I couldn’t recommend a book more strongly for someone such as yourself. As a bonus it’s not even particularly long.

That is not a book on architecture. I like the book too, but it just doesn't have anything to do with architecture.

+1. It was a terrific read when I took my Computability course, very, very well-written.

Depends on what you mean by architecture. But if you mean what you'd learn in a Computer Architecture and a Computer Org class, this course/book is good.


I meant it in the context of the article, where the metaphor was made between a dogshed-builder--a person who can code--and an architect--a person who understands CS and uses it to build things using code.

Appreciate the rec!

Georgia Tech’s OMSCS program has been great so far, outside of the super rigid proctoring policies they have for testing that cost me 2 letter grades in my database class.

Some classes are better than others, but it’s extremely valuable to have a program that you’re following with deadlines and measures. If you have a desire to learn the stuff, you’ll get a lot out of it.

When I get busy personally or at work, I kinda phone it in and just do enough to make grades, but when I have the latitude to dive deep I’m learning a ton.

It’s also very affordable, fitting within most workplace education allowances, if you have one.

The Architecture of Open Source Applications[0].

[0]: http://aosabook.org/en/index.html

It means that a person can get the little things done while knowing very little. But it also means that this person probably will never learn enough to get the big things done.

To be honest, I get secretly frustrated with the lower-level people who now exist in giant hordes. (I rarely tell anyone that.) To me, they are like people who have decided to learn 5% of their field in order to get a few things done, have some fun, and make a living.

These people use tools to create little applications for everyday use. But remember: The tools themselves are also software. But they are a level of software far beyond anything these people could dream of creating. They use languages, editors, compilers, and operating systems; but they don't have the first clue about how to create any of these things or even how they really work.

The most disturbing thing to me, based on interviews I've conducted, is that this seems to include some large fraction of people graduating with a Computer Science degree from supposedly top tier schools with high GPAs that supposedly mean something.

If you want to make really interesting exciting things that have never existed before, if you want to make a tiny little difference in the industry and change the world just a little bit, then you do need that degree. If you want to make the tools and libraries that the lower-level people use, you do need that degree.

The tools and libraries aren't sentient AI yet. If you want to use the tools and libraries at a high level, then you really need some knowledge of how they work. The disturbing thing I might be seeing is that something like 40% of graduates from even good schools have that Computer Science degree yet really only have that 5% knowledge, and have been led to think that they know more.

I think the mistake a lot of people make is the split between "you need a degree" and "degrees are worthless." Degrees are largely what you make out of them.

I have been going through a bachelor's in computer engineering, and the curriculum and professors make every effort to provide as much value as possible, yet most students do just enough work to get the grade they want in the class.

In this same vein, students so often take the required courses, look for the easiest (instead of the best) teachers, and pick the easiest, lowest-effort options among the in-major classes they can choose between (tech electives) rather than building a useful base of knowledge for their future careers. Many intentionally avoid useful classes because they are hard and don't want to risk damaging their precious GPA. I find this accounts for a lot of the grads with very high GPAs but fairly limited knowledge outside of the basics. Students end up putting the cart before the horse, focusing on being best suited for getting a job rather than being well prepared to do it.

The true value of a CS/ECE degree comes from the classes that you take not the degree itself. Much of that material (particularly the fringe optional courses) can be very difficult to grasp on your own and having a professor dedicated to assisting in your understanding of that material is extremely valuable.

Degrees are largely what you make out of them.

Agreed. I think a lot of people take CS to get a GPA, network, get a good list of impressive sounding internships, and work with buzzword-compliant libraries.

What this is missing is that programming is not one job, it is multiple different jobs with different required skillsets and aptitudes.

Making tools and libraries is not necessarily that complicated. It is often simpler than making a large business app, because requirements tend to be clearer and more stable, and because you can copy the design of an existing tool. It does require knowledge of algorithms that a large business app does not. Being able to make a compiler will not make you succeed in creating a good, finished web app, whether large or small; finishing a web app requires a good grasp of UI and visual design that is completely irrelevant to compiler coding. Likewise, being good at algorithms, even difficult ones, and being good at maintainable design are not the same.

It is possible for one to be objectively harder in some sense or to require a rarer aptitude. Nevertheless, these kinds of laments tend to rely mostly on "the kind of things I like and am good at are the real tech, and everything else is for weaker programmers" attitudes.

The point of CS is not to make you highly skilled in operating systems or languages or even web apps. It is to give you an overall idea of those areas, so that you can choose what to focus on and are able to learn any of them once you decide.

The point of CS is not to make you highly skilled in operating systems or languages or even web apps. It is to give you an overall idea of those areas, so that you can choose what to focus on

My point is that those overall ideas are lacking in substance and understanding. People should have at least enough background knowledge to keep themselves out of trouble and off of others' toes. So many CS grads seem to lack even that.

> people graduating with a Computer Science degree from supposedly top tier schools with high GPAs that supposedly mean something

Who claimed that a high GPA from a top tier school meant something? Top tier schools are notorious for giving As to everyone. The point is that you got in.

Surely that's a transitive point. If the fact that they got in is supposed to be meaningful, then surely the high GPA is just some arbitrary slice of that meaningfully distinguished group. Yet they're still often clueless.

I would suggest that if someone isn't interested in technical fundamentals, they should consider a degree in human computer interaction and design. The things you can create with shallow technical knowledge continue to become more commoditized, but understanding problems that people have and designing a solution that makes them happy is a good way to create value.

We've reached an era where the average worker's serviceable time long outlives the competitive edge they've gained from their education/training in their formative years. The accelerating pace of economic and technological change is faster than ever, and this condition is unprecedented in human history.

I've become more and more convinced that this is the defining problem of our times: we're becoming victims of our own success. The author of this post feels like a dinosaur, and I would bet that many young people in our field who give in to their natural instincts and specialize in something will emerge on the other end feeling the same, at a much younger age than the author, and maybe unable to find equal or better work than before.

In other professions, the difference is more stark, and I think this is a major catalyst for the political/populist zeitgeist of the day. Entire industries have disappeared in a historical blink of an eye, and their former struggling workers are up in arms fighting powerful forces of nature trying to turn back the clock and stay relevant / valuable.

Bringing this back to CS, it's interesting to use this lens to determine whether the degree is worth pursuing anymore. On the one hand, it's fundamental and it encompasses the building blocks of how computers work and what they can do. On the other hand, programming techniques haven't changed very much and are quickly becoming commoditized and more accessible. As the author notes, it's true that you don't need to know as much as you used to, to build a useful program anymore. Like it or not, that's a fact, and economic forces are exploiting this more and more.

I think our human-being wiring is optimized to learn when young, and then "grow up" and become efficient at repeatedly applying our skills to obtain the expected outcome. Increasingly, I feel like the winning (or at least a better) strategy is to stay "young" as much as possible, since the chance you will need to reinvent yourself seems to only rise. This sounds great when you're actually young, but as time passes you get worse and worse at it, despite needing to remain "young" and malleable, and despite the mounting competition from actual young people.

So given all this, saying people "need" a CS degree seems like punching and kicking at giant waves you'll never beat. And I say this as someone who deeply loves both CS and academia. Stay "young" as best you can and try to keep riding the next wave you can find.

Disclaimer: My only formal training in this field was TAFE in Australia, which involved an 18 month course and is roughly analogous to a trade school or community college, before that I was a high school drop out.

> We've reached an era where the average worker's serviceable time long outlives the competitive edge they've gained from their education/training in their formative years. The accelerating pace of economic and technological change is faster than ever, and this condition is unprecedented in human history.

I think when change is this fast understanding the basic building blocks is more important than ever. These don't change quickly, some haven't changed since the industry was born. So much technological change is just reinventing concepts that have existed for decades and once you realize you're staring at an old concept in a new package keeping up is much easier.

The question then is what educational format teaches these fundamentals the best. For some of them it probably is a computer science course, but for others it might not be. One of the best classes I had was building our own database (TAFE was pretty hands-on), and from what I've seen this was a lot better than how it's taught in many universities. We had to start at the file level and think through the various steps to make a half-decent database, like what is required to handle index lookups efficiently, how to retrieve records in order, etc. It gives you a much more intuitive grasp of what steps a DBMS has to go through on your behalf. In my first real job after graduating I had to explain to someone with a CS degree why storing dates as strings was inefficient and was making our monthly billing take half a day to generate instead of half a second.
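To give a feel for the date-as-string problem, here's a minimal sketch (hypothetical data and field names, Node.js): with string dates, every range check means re-parsing every row on every scan, and a naive string comparison of "DD/MM/YYYY" values doesn't even sort correctly.

```javascript
// Hypothetical illustration: billing records with dates stored as
// "DD/MM/YYYY" strings, so every query has to parse each row.
const records = [
  { id: 1, date: "15/01/2018", amount: 100 },
  { id: 2, date: "03/02/2018", amount: 250 },
  { id: 3, date: "28/02/2018", amount: 75 },
];

// Convert a "DD/MM/YYYY" string into milliseconds since the epoch.
// Note a plain string compare would be wrong: "03/02/2018" < "15/01/2018".
function parseDmy(s) {
  const [d, m, y] = s.split("/").map(Number);
  return Date.UTC(y, m - 1, d);
}

function billingForFebruary(rows) {
  const start = Date.UTC(2018, 1, 1); // 1 Feb 2018
  const end = Date.UTC(2018, 2, 1);   // 1 Mar 2018
  return rows
    .filter((r) => {
      const t = parseDmy(r.date); // re-parsed for every row, every scan
      return t >= start && t < end;
    })
    .reduce((sum, r) => sum + r.amount, 0);
}

console.log(billingForFebruary(records)); // 325
```

With dates stored as numbers (or a proper date type), the filter is a direct comparison the database can index; with strings, the parse cost is paid per row per query, which is roughly the half-day-vs-half-second difference described above.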

Foundational knowledge is important, but the where, when, and how we obtain it could do with a shake-up. You can produce a lot of valuable output without an upfront 3-4 year investment, but there don't seem to be many opportunities to gain that knowledge after becoming a full-time worker.

Very nicely said.

> If you want to make really interesting exciting things that have never existed before, if you want to make a tiny little difference in the industry and change the world just a little bit, then you do need that degree. If you want to make the tools and libraries that the lower-level people use, you do need that degree.

I wonder if the author considers Node.js to be really interesting and exciting and never existed before. Ryan Dahl doesn't have a CS degree (but does have a mathematics degree).

Another (pretty cliché) example: Bill Gates never finished his degree and went on to create many great, exciting and interesting things.

I think exceptional genius combined with exceptional work ethic can overcome any shortage of credentials or education, period. But for the great majority of folks, a formal CS education will give you a great advantage over a bootcamp graduate, or even a self-taught hacker. The thing is, if you're smart enough and contrarian enough, you're not going to listen to anyone's advice on this topic anyway... so for those of you who do care what other people think, I think it's best to get a CS degree. Boot camp just doesn't cover enough bases. It's important to learn fundamentals, and the fundamentals change much less often than languages or frameworks or platforms.

> I wonder if the author considers Node.js to be really interesting and exciting and never existed before.

Node.js was neither new nor interesting, and certainly was never "exciting" (note that this is coming from someone who now does a lot of node.js development).

To start with, it is highly relevant to note that node.js ignored fundamental things that were learned in CS going back to at least the 80s involving the duality of events and threads, leading to an entire generation of people who actually believed that callback hell was somehow a good thing to encourage :/.

The big thing, though, is that it definitely wasn't hard to do, and it certainly wasn't the first project to do it :/. I mean, my own personal website was built using a JavaScript on the server framework I threw together in a weekend using Rhino and Jetty years earlier, and I had myself gotten the idea from people who really knew what they were doing: the people working on Apache Cocoon.

In Cocoon, they not only had correctly handled the callback hell problem, they had generalized it so far you could write programs on the server that made "requests to the browser" in the form of rendering a page that were expressed as function calls that would return when the user clicked links and submitted forms, inverting the normal flow of control to make it easier to build complex interactions.

So yeah: they had all of this stuff working almost a full decade before node.js existed at all, much less before it was finally able to use async/await to manage callbacks. When they ran into evented hell, they didn't sit around for years building shitty workarounds: they implemented continuations for Rhino and contributed them back so they could do it correctly.
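For anyone who hasn't lived through this, a tiny sketch of the events-vs-threads duality being described (all function names here are made up for illustration): the same two-step sequential logic, first as nested callbacks, then flattened with async/await, which reads top-to-bottom like the continuation style Cocoon had years earlier.

```javascript
// Stand-ins for async I/O, using Node's setImmediate to defer work.
function readConfigCb(cb) {
  setImmediate(() => cb(null, { db: "users" }));
}
function queryDbCb(db, cb) {
  setImmediate(() => cb(null, [{ name: "ada" }]));
}

// Callback style: control flow marches rightward with every step,
// and error handling must be repeated at each level.
function loadCb(done) {
  readConfigCb((err, cfg) => {
    if (err) return done(err);
    queryDbCb(cfg.db, (err2, rows) => {
      if (err2) return done(err2);
      done(null, rows[0].name);
    });
  });
}

// The same flow with promises + async/await: sequential again.
const readConfig = () =>
  new Promise((res, rej) => readConfigCb((e, v) => (e ? rej(e) : res(v))));
const queryDb = (db) =>
  new Promise((res, rej) => queryDbCb(db, (e, v) => (e ? rej(e) : res(v))));

async function load() {
  const cfg = await readConfig();
  const rows = await queryDb(cfg.db);
  return rows[0].name;
}
```

The second version is essentially sugared continuation-passing, which is the point: the "new" solution recovers the sequential style that was understood decades earlier.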

The real thing to realize is that sometimes shitty things can have more impact than great things, and things that are neither interesting, exciting, nor new can have a greater impact than things that were all three, if they have better community management or business acumen behind them.

However, we should call a spade a spade, and not pretend that those people are as good with software as someone who has spent years studying foundations, in the same way that the world's greatest software developer shouldn't pretend to be a great business or marketing person because they threw together a good enough website using a template and made some sales of their product on some app store.

I'd imagine not, I wouldn't either. Node is just a few basic apis on top of Javascript. The real magic is in V8, which was created by engineers with advanced degrees and deep understanding of language-theoretic and compiler concepts.

Germane meditations from the pioneer of kvetching about the standard substandard approach to programming: https://www.cs.utexas.edu/~EWD/transcriptions/EWD10xx/EWD103...

Bootcamp vs CS degree really has to do with what sort of work you want to do, which is a missing variable in this article.

There's plenty of programming work out there that doesn't require any deep understanding of CS. You're not going to be creating algorithms when using existing frameworks to write yet another web thing, phone app, or internal businessy database-based system.

A bootcamp can get you started doing practical things. Yes, you won't have deep knowledge, but really you don't need deep knowledge for most employable work. Code doesn't need to be hyper-optimized at the scale you're working, and it's easy to learn common pitfalls & best practices from applied practice, reading, and mentorship.

And I say all this as an oldish fart who understands the chain from designing bespoke high level language environments down through to transistors. We don't need to count bytes & clock cycles anymore; people can let the machine & its provided environment simply work for them and learn the top-level interface.

> He's a freshman at Kennesaw State right now, but he really struggles with the idea of taking two years of classes that he has very little interest in.

If it is just the idea of having to take a load of liberal arts classes that perturbs your son, and not the low-level courses like chip design, logic, algorithms and data structures, calc and stats, one alternative to consider is studying internationally. English universities, for example, offer a bachelor's in computer science in three years. Unlike an "8- to 16-week full-day immersive course that focuses solely on technology," they have a curriculum almost exactly the same as a US computer science course, minus the 3 English classes, 3 history and political science classes, 2 economics courses, and an art course that a college like Kennesaw has as graduation requirements: Kennesaw State Curriculum (http://ccse.kennesaw.edu/cs/docs/BSCS_2016-2017.docx). For comparison, look at the University of Bristol's curriculum: [https://www.bris.ac.uk/unit-programme-catalogue/RouteStructu...]

I know this is not an option for everybody (many people need to stay close to home for personal or financial reasons), but it is definitely something to look into. With regard to finances, English university for international students, even with one year less of study, still costs a lot. However, I am pretty sure that the course structure is similar at most European universities, some of which offer really low fees to international students.

I wish more programmers took a few writing classes so they would appreciate how to write an email with the correct punctuation. I've seen far too many emails where the author didn't appear to understand how to formulate a complete sentence or understand where to put a paragraph break.

I wish more freelancers took a class in business accounting so they'd have an idea of how to do it and what a good (or bad) contract looks like... or understand the value of their time. There are far too many that decide to become "freelancers" and yet have no idea on how to do the basic business items that come with being a freelancer.

I wish more programmers took a class that had a public speaking component. Reading powerpoint slides as a team presentation is boring. The work environment isn't just "I write code" but also a transferring of knowledge from one person to the rest of the team.

I wish more programmers took some classes in history, or physical sciences - things outside the major. I've had more than a water cooler conversations where a person doesn't understand how the length of the day impacts the temperature, or is surprised at the similarity of events today and those of thirty some-odd years ago. This concerns me, not for the skills of work, but rather the understanding of the world outside of the office.

To these things, English composition, human communication, contemporary economy, arts and culture, political science and history... oh, those are excellent class titles to help fill out those "I wish" items.

I wish people would understand those things can be picked up before a degree or outside of a degree.

Going to a university in the UK I was free to study the subject I was interested in and wanted to understand. I was able to fully immerse.

My schooling prepared me for the rest.

I have a BSc in Computer Science. I managed to almost completely avoid taking any papers outside of the science and engineering departments. The only non-STEM paper I took was a commerce paper about eCommerce.

Looking back now, I actually regret only doing STEM papers. I got stuck in a bubble of dealing with the same people (or the same types of people) in all my classes, and really got no breadth to my degree.

I'm not saying that people should be forced to take arts papers for a science degree, but in retrospect I think that I should've taken some.

English (and NZ and Aussie) universities have 3-year courses because high school kids pass minimum national requirements exams before they enter. US high school graduation is not uniform across the country, or even across states; in essence, those minimum requirements ensure that students entering uni have already met what in the US would be the General Education requirement.

That's not why they have 3-year vs 4-year courses. The 3 vs 4 year debate is a legacy of philosophical differences about what qualifies as a liberal education, differences far older than standardized national secondary curricula.

Universities in the US have entrance requirements for classes completed in high school. They do the same thing English universities do when admitting American students: they require that specific general education classes were completed in high school.

Because of this, even though the US doesn't have national standards, virtually every university-bound student will have already completed the same general education classes that students from countries with 3-year universities would have. Universities here don't force students to take English, math, and history classes because some substantial population of college-bound high school students haven't taken them.

Well yeah, they let people test out if they can already meet the requirement.

CLEP tests may be the answer. At many schools, you can get out of many liberal arts classes if you can pass a two-hour test on the topic. You get college credit as if from the class, too. And the tests are far cheaper than the classes.

As soon as higher-level programming languages such as COBOL, Algol, and FORTRAN came out, many clerks from the mid-1960s onward learned programming without knowing about the hardware guts or theory. Thus, the layering of specialties had already begun.

I'm not that sold on this. Programming in ASM isn't really "harder" than programming in Haskell, it's just slower - it requires discipline, but not the ability to grasp abstraction that some more modern languages need.

I think that issues like knowing "how do compilers actually work" or "this thing is actually the halting problem, let's stop" are more relevant than how removed from latches and memory controllers one is.

> Programming in ASM isn't really "harder" than programming in Haskell, it's just slower - it requires discipline, but not the ability to grasp abstraction that some more modern languages need.

Reminds me of this comment: https://news.ycombinator.com/item?id=17403233

That story reminds me of a guy who made a CRUD application (tracking, reports, & statistics) out of MS-Excel VBA. I was asked to start supporting it, and did a lot of similar head scratching. Amateur programming can be worse than no automation at times.

He wasn't bitter, however; just confused about my assessment about it being time-consuming to support. I had to explain that typical programmers use abstractions to make code maintainable and not rely on our ability to read, well, spaghetti code fast. I've met some programmers quick at deciphering spaghetti code, but I confessed I wasn't one of them. I used the analogy of taking a custom-built amateur car with a custom-built engine to a generic auto-mechanic and expecting them to figure it out as fast as a regular car. That story seemed to click.

Assembly offers just as much capability for abstraction as any other language, and I think that it demands more "ability to grasp abstraction", because you pretty much have to build those abstractions yourself.

Awkwardly, I agree on the facts, but the sentence doesn't feel right. Would you agree with this rephrasing: ASM is simple in terms of the number of abstract concepts needed to define its syntax and semantics (and as far as I know this is good, since it's intended as a mother-of-all lingua franca), but of course every language out there is Turing complete (and has some mechanism for syscalls); so in the end the only way to build Haskell-like abstractions in ASM is actually to code up GHC and then code in Haskell. Which I wouldn't call programming in ASM at all (just like I'm not commuting circuits by typing on this keyboard). Nor do I think this is actually what ASM is used for (when written by humans). My guess is that ASM is used for programming close to the metal (crucial parts of firmwares, drivers) in situations with simple logic, in which abstraction would actually get in your way (who cares about types or functions when you just need to write some values into some memory location).

I don't necessarily disagree, but typically an organization wouldn't let a non-degreed employee use or touch ASM (fair or not). It looks dodgy from a P/R perspective.

I also forgot to mention that many engineers and scientists used FORTRAN and BASIC back then without knowing much about the guts of computers. It's largely why such languages were made.

> without knowing about the hardware guts or theory.

And so it should be (for the vast majority of the HW's users). The better the work done by the people below, the more hermetically the abstraction above can be interacted with.

College is only one way (albeit a good one) to get a firm grasp of CS. I'd argue putting in 1000s of hours of work is another.

I know someone currently doing a CS MS so she can get into tech. She is good at math, so all of those classes are free As for her. She has other people help with her programming homework assignments, sometimes even having them do the entire thing for her. I know because she told me this herself, and even asked me to do some for her. As long as she gets near 100% on homework, it's nearly impossible to get less than a B in any programming class. She has an adderall prescription so she can cram for tests, which have way more multiple-choice questions than should reasonably be expected.

She's currently on her second internship. They're both at employers that don't screen candidates on actual programming ability (they just looked at GPA, resume/application, and then a soft interview), and the current one has a reputation for being a very meh internship, though good resume padding. The last time I helped her, her code was fine for someone who had just started learning two years ago, but I don't think she is going to progress to the point that you'd expect someone with a Master's to be at simply because she isn't doing her own homework.

I don't have a CS BS or MS, but there have been a few times where I feel like I need to get one just in case the market tanks again and they become a significant hiring criteria. But at the same time, I have to wonder just how many people currently enrolled in MS CS programs throughout the nation are doing something similar, and devaluing the worth of the degree (on paper, to potential employers) to the point that some could even look at it negatively.

I see pros and cons to both sides (4 year university vs boot camp). I have a CS degree, whereas our front end developer came from a boot camp.

For my part, I have found the underlying theory to be helpful in ways I couldn't have comprehended while at school. Understanding binary made understanding octets in IP addresses and subnet masking much easier. Taking a class that involved programming sorting algorithms by hand in C++ was very beneficial, even though I have no need to do this in my day to day work. Learning about logic gates has even been helpful. Basically, I'm better equipped to have a fundamental understanding of how software and hardware works, even if it's a very basic understanding. What I lacked coming out of school, though, was having a clear road map of how to just build something in a modern stack on day one at a job.
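As one small example of how the binary intuition pays off (an illustrative sketch, not from any particular class): a /24 subnet mask is just 24 one-bits, and "are these two hosts on the same subnet?" is a bitwise AND and a comparison.

```javascript
// Pack a dotted-quad IPv4 address into an unsigned 32-bit integer.
// ">>> 0" forces JavaScript's signed 32-bit result back to unsigned.
function ipToInt(ip) {
  return ip.split(".").reduce((acc, o) => ((acc << 8) | Number(o)) >>> 0, 0);
}

// A /24 mask is 0xFFFFFF00: the top 24 bits set, the bottom 8 clear.
// Two addresses share a subnet when their masked network bits match.
function sameSubnet(a, b, prefixLen) {
  const mask = prefixLen === 0 ? 0 : (0xffffffff << (32 - prefixLen)) >>> 0;
  return (ipToInt(a) & mask) === (ipToInt(b) & mask);
}

console.log(sameSubnet("192.168.1.10", "192.168.1.200", 24)); // true
console.log(sameSubnet("192.168.1.10", "192.168.2.10", 24));  // false
```

Nothing here is beyond a bootcamp graduate, but without the binary background the mask is a magic string of 255s rather than an obvious bit pattern.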

My compatriot is in the opposite boat. He came out of boot camp with a clear understanding of how to build web applications using Angular. He could hit the ground running, and did from day one. However, he lacks the underlying theory that helps to understand how things work. Does he need these things to do his job? No, but I do believe it makes for a more well-rounded developer to have this knowledge. Fortunately, he's got a great attitude and aptitude, so he's been picking these things up as he goes.

I'd rather see something more in the middle, where one can get the theory coupled with the real-world programming skills. Maybe my CS program is to blame, and others exist that do a better job of this. Looking back, my senior "full-stack" project was very limited. I would have benefited from a little more meat to the project, and also having some more of the ancillary things taught, such as anything to do with networking in a more practical rather than academic way.

I mostly disagree. Software, like electronic engineering, is about abstraction, but it differs in a critical way: It's self-modifiable. Kids can think they're making computer games using little apps, but what they're doing is more akin to making a map for Starcraft than it is to actually making a game. If anything I'd argue that getting a CS degree or similar (math, engineering, philosophy) will arm your mind with the tools it needs to really compete over the coming decades.

If you want a simple middle class life a bootcamp is perfectly fine. Lots of people make money writing CSS. There is nothing wrong with it. But I would never tell a bright youngster that CS degrees (and similar) are a waste.

I don't think they were saying CS degrees are a waste at all, in fact I think the message is exactly the same as yours. Going to a bootcamp teaches you to use some specific tools and let you make basic projects. Getting a CS degree lets you learn how those tools work and even how to make better ones if you're good enough.

I'm missing the part where this is in disagreement with OP. It actually seems to be saying almost exactly the same thing.

You could write almost the exact same post about a philosophy degree

Bob Martin has a great talk https://www.youtube.com/watch?v=ecIWPzGEbFc, which illustrates a lot of history of computers.

He asserts that the number of devs is doubling every 5 years, which means that half of all developers have < 5 years' experience. The industry has lost much of its scientific discipline, which has to return, or regulations will force more structure.

Anyway he's a great speaker and this is one of my favorites.


I think what people writing articles like this tend to miss is that it's much easier to be super deep in a field when the field is limited and low-entry but you're already in it. There's not really as much going on, and there's not much else to do but learn C or some text editor on a super deep level or whatnot. What else are you going to do? Look at the stuff "deep" people are generally into; it mostly revolves around POSIX one way or another. And databases, but nobody wants to talk about that.

But today, there are hundreds of languages, a whole bunch of frameworks per language, various tools, constantly changing standards, etc. The available landscape is absolutely staggering. If you want to deeply focus, you need to pick what to deeply focus on, which is a rather tough choice and a questionable one, because the thing you focused on might become obsolete.

> if you want to make a tiny little difference in the industry and change the world just a little bit, then you do need that degree

And what sense does THIS make? Among the people who I know who _do_ deeply get into some specific CS topic, many are those who do not have degrees, because they're often people who are not fans of structure and ended up doing what they want, as opposed to what might be beneficial for career purposes.

This just seems to be heavily misguided elitism.

If you really want to know why the quality of software, and basically everything else, has gone down, just look at market incentives and you'll find that to be an utterly boring question.

> This just seems to be heavily misguided elitism.

Not really. There is a bit too much meet-you-at-the-bottom communism in the modern ethos of computer science. When you can measure difficulty and time-to-skill in a field, isn't it a little odd that we don't have a lot more measured elitism in computer science? The reality is the hardest most complex stuff is literally solved and produced by an elite few. All of which could be taught or learnt but no one gives a shit. It's a bit like magic is slowly dying from the universe, and the wizards keep suggesting it might be worth holding onto. But then they get called out as an elite minority only interested in furthering their arcane agenda, whilst everyone else is using the accessible modern "technology" built from the original magic and cannot fathom why anyone should give a shit about magic anymore. Wizards only appeared special to get a cushy job next to the king in their own tower, right? Technology is just as good as magic, because it was built from magic!!!! So elitist. The problem comes when the technology fails, or doesn't do something that's needed and cannot be changed without changing the base magic that the technology started from. If there are no more wizards and no more magic, you're never going to be able to create new base technologies. The only hope is the magic-making technology everyone is currently working on, called Machine Learning. Then all the wizards can be virtualised and controlled like slaves; even if it's provable ML isn't actually magic, it's close enough... we hope.

> The reality is the hardest most complex stuff is literally solved and produced by an elite few. All of which could be taught or learnt but no one gives a shit.

The elite few don't really want anyone joining them, so nobody does. Look at the state of academia, and look at the lack of training in jobs. Nobody wants anyone to be elite, so people don't bother; there's no benefit in it. Your problem is that you think you're important, that you think the most useful contribution from a person is what they do personally, but all that does is advance you, personally.

The thing is, the elite are often much more worried about being elite than about what they're actually doing. Once you see that, you know the incentive is corrupt. It's a status thing for them, and how dare anyone challenge their status. That's really all this is. It's hardly about the advancement of technology; if it were, those people would be out there teaching, or trying to address the information overload, not looking smug. It's elitist because it utterly disregards most of human experience and presents yours as superior; your entire argument ultimately derives from that view, and pretty much everything you say after that could be anything at all, as long as it supports the idea that you're superior. That's why elitism is bad: it's destructive to one's conception of reality.

I know plenty of people who don't look at things as magic but who also don't consider themselves "wizards". Maybe you should try getting off your high horse, talking to people, and figuring out what they actually do all day; you'll understand how silly everything you're saying here is. But as with everything else, it's easier to sit on top than to try to understand.

All this is snobbery, and likely unearned snobbery at that; I can't say the correlation between perceiving yourself as elite and actually being elite is very good at all.

I, for one, am really looking forward to seeing the projektirSolver in next year's SAT competition and the projektirNET in next year's ILSVRC!

> What else are you going to do? Look at the stuff "deep" people are generally into, it mostly revolves around POSIX some way or another. And databases, but nobody wants to talk about that.

Self-driving cars (controls, perception, planning, safety, sensor design, localization, mapping, integration, security, etc.), human-competitive NLP/image classification, advanced robotics (repeat list from cars here but for things in the air, off-road, on the water, in the water, in orbit, in deep space, ...).

Those are all examples of real things that real people get to work on every day.

Operating systems topics don't even scrape the top 20 of stuff I think of when I think of deep expertise.

And even if we limit ourselves to the sort of things you mention, most people who work deeply on languages, tools, and standards view these as manifestations of their deep exploration rather than the focus or subject of the dive.

For example, TensorFlow. The framework very much is the product. But even if some other framework won the day tomorrow, the people who worked on TensorFlow would not have "wasted" their time thinking deeply about how to build the system.

This is why researchers whose original contributions were made in the 70s and 80s nonetheless continue to establish themselves as desired experts in new technology trends (e.g., Leslie Lamport and cloud computing, or Martin Abadi and ML frameworks). Because they were focused on ideas and fundamental problems. The problems never disappeared, just changed form. And ideas have a lot more staying power than their manifestation in code.

Most people working deeply on systems today are not "revolving around POSIX in some way". See the proceedings of OSDI. And most deep experts choose other topics, most of which your post doesn't mention: graphics, programming languages (making them and analyzing programs written in them), compilers, security, robotics, user interfaces, NLP, ...

Your definition of "expert" seems to revolve around using things, mostly things based on ideas and techniques that were well-understood already 20 years ago and that are related to building a particular type of software system. Which, if anything, seems to deepen the author's point.

Someone gets to fill the AI research labs, staff the self-driving car companies, work at NASA, build core infra at large tech companies, and build the foundation for the next 20 years of trendy growth areas.

It's possible to get to those places without a degree, of course, but a degree is by far the path of least resistance. And in most of these cases, learning the material from the degree isn't optional; you're probably going to have a hard time doing that controls engineering job at a self-driving car company if you never made your way through a calc sequence, some physics, and an algorithms course.

It's also worth noting that very often, building WordPress plugins pays more than doing all of those things I mentioned. I guess it's all about what you want to spend your life doing, which is exactly what the author says at the end of the article.

> Self-driving cars

Self-driving cars are new. You've missed my point. The OP is from an older age. There wasn't a lot back then, so when choosing what to focus on, there weren't a lot of choices.

Funny you mention graphics: what do graphics usually involve? C and C++. Back to POSIX. Operating systems? Same. Robotics? C. Basically, if you picked C, you're good.

On the other hand, how often do you hear about a Java expert? Someone who knows the intricacies of the GC? All about the JVM? Not that often. They exist. But it's not hip. They chose "poorly".

These days, there is a lot more choice, so what are you going to pick? On what is that choice based? What do you do if you picked the wrong thing? This problem is very modern and didn't exist to such a degree before.

> Self-driving cars are new. You've missed my point. The OP is from an older age. There wasn't a lot back then, so when choosing what to focus on, there weren't a lot of choices.

RALPH, a 1990 Pontiac Sport minivan, drove across the US 98% autonomously. In 1995.

Much of the control theory, robotics, and AI work that enabled the current self-driving gold rush was invented decades ago.

The Dartmouth workshop was in 1956. Dearth of choices? Please! Those days provided an enormous surplus of choices, almost all of which were good ones! People from that older age invented reinforcement learning, image classification, natural language processing, OOP, hell, even the notion of a pointer! And the list goes on.

Today there are far fewer choices than there were back then because so much has already been done.

That is, of course, assuming you're in the business of "doing things no one else has done before" as opposed to the business of "following a well-trodden path".

So, I guess if your view of the world is confined to "using things other people already invented and explained to me", you might consider the 1950s and 1960s a bleak period when no one knew how to do anything. As opposed to the cusp of a century-long period of continuous innovation...

Again, as the article says, I guess it boils down to what you want to get out of a lifetime of work.

> Funny, you mention graphics, what do graphics usually involve?

shapes, views, raytracing, texture mapping, lighting, color, filtering, scaling, reconstructing, visual surfaces, grids and voxels, octrees, kd-trees, partition trees, polygonal rendering, ...

And that's just the undergrad stuff, not the cutting edge.

Oh yeah, knowing C/CUDA/OpenCL is nice. But when compared to deep expertise, it's a rather trivial time investment and is completely orthogonal: an implementation detail, not the fundamental content.
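To underline how language-agnostic those fundamentals are, here is a toy sketch (mine, in Python, purely for illustration) of the ray-sphere intersection test at the heart of raytracing. The algebra is the content; the language is incidental.

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Nearest positive hit distance along a unit-length ray, or None on a miss."""
    oc = [c - o for c, o in zip(center, origin)]       # vector from ray origin to sphere center
    b = sum(d * v for d, v in zip(direction, oc))      # projection of oc onto the ray
    disc = b * b - (sum(v * v for v in oc) - radius * radius)
    if disc < 0:
        return None                                    # ray misses the sphere entirely
    t = b - math.sqrt(disc)                            # nearer of the two roots
    return t if t > 0 else None

# A ray fired along +z toward a unit sphere centered at z = 5 hits at t = 4:
print(ray_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # → 4.0
```

Porting this to C or a CUDA kernel is mechanical; deriving and reasoning about it is the part the degree-level material covers.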

> Operating systems?

Kernels, scheduling, device drivers, caching, distributed systems, energy models, timing attacks, and the list goes on.

Of course, knowing C is essential, but that's the easy stuff when compared to wrapping your head around a modern OS, or even a tiny piece of a modern OS.

> Robotics?

SLAM, sensor fusion, filters, actuation for various types of novel actuators, PDEs and ODEs, optimal control, stability and robustness, system identification, model-predictive control, motors, servos, simulation, etc. And that's just the software side.

Oh yeah, knowing C is nice. But when compared to deep expertise in robotics, it's a rather trivial time investment and is completely orthogonal: an implementation detail, not the fundamental content. Many of the fundamental ideas and techniques in robotics pre-date C by decades.
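To give one concrete flavor of "filters" from that list: a minimal 1-D Kalman filter (a toy sketch in Python; the noise constants q and r are made up for illustration, where real systems tune them against sensor models) is a handful of lines, but knowing when and why it works is the actual expertise.

```python
def kalman_1d(measurements, q=1e-4, r=0.1):
    """Smooth noisy scalar measurements; q = process noise, r = measurement noise."""
    x, p = measurements[0], 1.0          # initial estimate and its variance
    estimates = []
    for z in measurements:
        p += q                           # predict: uncertainty grows over time
        k = p / (p + r)                  # Kalman gain: how much to trust the measurement
        x += k * (z - x)                 # update estimate toward the measurement
        p *= 1 - k                       # uncertainty shrinks after the update
        estimates.append(x)
    return estimates

# Noisy readings of a constant true value of 5.0 converge toward it:
noisy = [5.3, 4.8, 5.1, 4.9, 5.2, 5.0, 4.7, 5.1]
print(round(kalman_1d(noisy)[-1], 1))  # → 5.0
```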

> On the other hand, how often do you hear about a Java expert? Someone who knows the intricacies of the GC? All about the JVM? Not that often. They exist. But it's not hip. They chose "poorly".

I know a few true, honest-to-god Java experts. They all make insane amounts of money (even by SFBA SE standards) and love their work. Turns out Google has quite a bit of Java code and a metric shitload of money.

You think graphics, robotics, and OSes are just "C and POSIX". That's not true. C and POSIX aren't even table stakes. They're the thing you pick up in a few weeks or maybe a semester so that you can spend several years obtaining the table stakes -- see the list above. Then you need to build true expertise on top of that.

The path from "I know C" to "robotics expert" or "graphics expert" is at the very least a multi-year path. And that's assuming you're bright and have your full work day (and then some) to dedicate to following advances and building your own.

I've worked with plenty of competent developers who don't have degrees. Most crud jobs don't need degrees anymore.

But any serious business that's going to churn a lot of data, needs fast pipelines, or needs to invent entire new markets or ideas will heavily rely on people with patience and training in the scientific process.

Very fun to read and sounds very true. In my experience though, I found no correlation between programmers with a CS degree and being a good developer/architect.

I will say that those of us who didn't graduate (including me; I dropped out) often feel we have something to prove and will work harder.

If you require programmers to have a deep understanding of computer science then you will never have enough programmers. This is good for those people with the qualifications but not necessarily good for the rest of society which ends up with an unmet need.

I dropped out of a Computer Science program at KSU. It’s been great.

What have you been doing since then? Are you pursuing self-study? I'm curious.

I am always working on side projects that usually become businesses or lead to a consulting/full time opportunity. I’m currently working at Apple as a tech lead.

I worked my entire way through school doing IT (6 months) then software engineering (2.5 years) before I was offered my first full-time position by some coworkers who went to a startup. The full-time offer also lined up with the end of my term as president of my fraternity, so it was time to get out.

> If you want to make really interesting exciting things that have never existed before, if you want to make a tiny little difference in the industry and change the world just a little bit, then you do need that degree. If you want to make the tools and libraries that the lower-level people use, you do need that degree

I don't know about you, but I need some solid evidence that you did anything close to that so I can consider you seriously.

As a person who never had a degree but whose knowledge goes above and beyond what 90% of the market needs and well into CS territory, this article insults me. It's also a painful reminder of similar prejudices in people who are looking to hire. I often end up doing the kind of basic development the author talks about and I'm not happy about that.

Why not just take a CS degree? Because I'm a poor fit for the education system. That's why I dropped out of high school in the first place.

Also, I feel like I've paid my dues already. I've been learning about computer software (and hardware and electronics) for 27 years. I haven't stopped at merely what I needed to know to do my job. I have done a lot of self-study. I routinely roll my own libraries and write embedded code, and I submitted a patch to the Linux kernel a few years back. I also design analog and digital circuits in my spare time.

I feel it's not about having a degree at all, because I'm living, breathing evidence of that. I've met people with CS degrees who can barely write a line of code. Maybe they didn't go to a good college. Maybe they did, and it's possible to pass the exams by cramming (followed by forgetting).

Saying that you can't do advanced stuff without a CS degree is snobbery.

I don't think that's a charitable reading of the article. The author is not saying that you can't have this knowledge without a degree, he's saying that few people in the field today have this knowledge, because they don't have the degree. The two assertions are different: there are things a degree teaches you; if you have the degree, you probably know them. While you can't judge whether any given individual has the same knowledge without the degree, you can:

1) Expect that 1,000 people with the degree will mostly have that knowledge;

2) Expect that 1,000 people without the degree will mostly not have that knowledge.

I disagree strongly with your point #1. "This knowledge" specifically is deep full-system understanding, as quoted in the article:

> "They use languages, editors, compilers, and operating systems; but they don't have the first clue about how to create any of these things or even how they really work."

And yes, he is asserting that people should get this from a degree:

> If you want to build doghouses, just pick up some skills with hammer and nails, and then go for it. If you want to be an architect who designs and builds skyscrapers, then go get a degree in architecture first.

The full depth of applied understanding comes from personal interest & experience. Whether or not a person with such interest pursues a CS degree is completely orthogonal.

Plus, these applied low- & mid-level computational practicals have nothing to do with Computer Science; they are programming, architecture, and engineering. People generally do not complete CS degrees with any specific imparting of these 3 facets, unless they use their university time to pursue their own interests & ambitions in the field. And again, such people can and do pursue those outside of university, especially in their pre-university age exploratory years, and on-the-job experience with real systems.

> The full depth of applied understanding comes from personal interest & experience. Whether or not a person with such interest pursues a CS degree is completely orthogonal.

It's not orthogonal, but rather highly correlated. Combinatorics, graph theory, computer architecture, etc., will all be part of a university CS curriculum. Someone who has a CS degree will probably know those things (or at least recall them after a brief refresher).

Combinatorics and graph theory have zero to do with this practical knowledge. I'll grant that computer architecture classes introduce some relevant concepts, but that foundation can be had anywhere: its subject matter, history, and details are widely discussed in the open online as a persistently current practical concern.

People like the author, with a degree and lots of experience under their belt, really overestimate what the degree specifically gave them, vs what they learned through decades of experience as they developed their craft. When it comes to practical, applied programming and skills of abstraction, informed by deep knowledge of what goes on under the hood, vanishingly small amounts of that come from university education.

Again, I will be careful to separate out those who do actual Computer Science on the job from this practical craft of quality programming. The former is much more rare, but is a separate field.

The author did disclose his bias up front and was responding specifically to a question about a traditional degree vs. a bootcamp. I personally still think that general education holds a lot of value. Of course it’s not for everyone, and I think it’s a problem that it is marketed as such. Nonetheless, there is a lot of value in liberal arts, in learning to think not just about software but other things. Good developers, scientists, mathematicians, artists, etc., etc., pull inspiration and insight from fields beyond their own. And we don’t apply software in a vacuum, we apply it to real problems in some field, typically other than software. I think a broad knowledge is part of where the “advanced stuff” comes from, and degrees are tailored to impart some of that over, say, a four week coding course.

Absolutely agreed. I eventually finished a CS degree, but what I learned from it was a little speck next to what I'd learned outside of it (even measuring right after graduation).

I believe the only thing of lasting value I got from the degree was that I saw more of a 'roadmap' of CS topics that I didn't know existed before, which were a starting point for me learning more about them on my own. HN has done that more successfully than the university since though ;)

It also makes the opposite error: claiming that you can do advanced stuff with (just) a CS degree.

In the years since the author graduated, there has been a shift in some CS degrees, to meet market needs. A key difference is that then, a CS degree wasn't about programming.

Those CS degrees were just the starting point, letting you know what is out there and giving you a taste of how to approach it. Part of that was contact with research academics. (Though some recent graduates miss the point and believe that because they now know something, they now know everything.)

Interesting. I didn't read any snobbery into the article. From the tone, my take is he's talking about an interest and willingness to acquire CS knowledge and skills. He framed it in terms of degree or not because he was responding to a question about somebody who's in college wondering what to do.

I'm the other way.

CS grads are lazy programmers with the mentality that someone else will do it.

I am self-taught and can usually rig a solution to a problem in a few hours. A CS grad will tell you it's impossible.

Maybe they need to do hackathons, but it's like they have been taught to doubt instead of taught to think.

He calls it a rant, but I would say it's just being honest.

My Alma mater on HN? Weird!

Curious to know what's weird about KSU showing up on HN. Is it an anomaly?

> If you want to make the tools and libraries that the lower-level people use, you do need that degree.

No, you need to somehow invest more time to build more features than another sucker out there. That's almost entirely it. Period. Shit is just fast enough these days, and if your industry cares about performance, well then maybe understanding something about caching that can be learned in less than a day from a blog article will help you with 80% of your problems.
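To make the caching point concrete with a toy sketch (hypothetical Python; the "blog article" version of caching is often just memoization): once the slow path runs, repeat calls are free.

```python
from functools import lru_cache

call_count = 0

@lru_cache(maxsize=None)
def expensive(key):
    """Stand-in for a slow query or computation."""
    global call_count
    call_count += 1
    return key * 2

expensive(21)                      # first call does the work
expensive(21)                      # repeat call is served from the cache
print(expensive(21), call_count)   # → 42 1
```

That is roughly the 80% case the parent means: one decorator, no degree required, and most of the win is already captured.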

There's plenty of work out there that craves better solutions, and a degree is absolutely not even a nice-to-have at this point. Let me repeat: there are fundamentally basic applications and software solutions that various industries are dying to have exist, with millions of dollars on the line if you know which industries are in question, that simply take a damn long time to implement; but every individual piece is so far removed from even a basic comp sci 101 algorithms class that you're literally just talking about business logic at that point.

That’s all true. The flip side is that there are also problems that do actually require specialized training. An example that comes to mind for me from a few years ago involved real-time simulation of a constellation of satellites, with accuracy on the order of meters. I ended up implementing RK4 as a solver, and had to use pretty complex differential equations (second-order effects matter at that resolution).
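For readers unfamiliar with RK4: the classic fourth-order Runge-Kutta method advances an ODE one step by blending four slope samples. A minimal scalar sketch of the general method (mine, in Python; not the parent's production code, which handled systems of equations):

```python
import math

def rk4_step(f, t, y, h):
    """One classic 4th-order Runge-Kutta step for dy/dt = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Sanity check on dy/dt = y, y(0) = 1, whose exact solution is e^t:
f = lambda t, y: y
y, h = 1.0, 0.01
for i in range(100):                 # integrate from t = 0 to t = 1
    y = rk4_step(f, i * h, y, h)
print(abs(y - math.e) < 1e-8)        # → True
```

The method itself is a page of undergrad numerical analysis; knowing which solver, step size, and error model fit a meter-accurate orbital simulation is where the training comes in.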

The project also involved real-time bitstream generation and modulation at 10 MS/s. That was a mixture of understanding DSP and some clever hacks to get the performance we needed on the hardware we had. Oh, and concurrency without race conditions, because we needed every core we had to make it all work.
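As a hand-wavy illustration of what bitstream modulation involves at its very simplest (a toy BPSK sketch in Python, nothing like the parent's 10 MS/s system): each bit selects the phase of a carrier waveform.

```python
import math

def bpsk(bits, samples_per_symbol=8):
    """Toy BPSK modulator: bit 0 keeps the carrier phase, bit 1 inverts it."""
    wave = []
    for bit in bits:
        phase = math.pi if bit else 0.0          # 180-degree shift encodes a 1
        for n in range(samples_per_symbol):
            t = n / samples_per_symbol           # one carrier cycle per symbol
            wave.append(math.cos(2 * math.pi * t + phase))
    return wave

wave = bpsk([0, 1, 0])
print(len(wave))                 # → 24
print(wave[0], round(wave[8]))   # → 1.0 -1   (phase flip on the 1-bit)
```

Doing this in real time at megasamples per second, with pulse shaping and without dropping samples, is where the DSP background stops being optional.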

Yes, there’s lots of business problems that can be solved by programming without much computer science. But there’s also a huge pile of problems where it’s not even clear that it’s possible to solve using current tech. I, personally, much prefer the latter, but to each his or her own.

This is completely untrue. I have in the past created, and continue to create, web components that "hard CS" folks would scoff at, and also quietly use in their own projects if my work weren't owned by my employers.

I've built chunking hi-def megapixel web camera services a decade ago, when browsers were limited to 20 MB uploads. I've suggested building simple structured data protocols based on XML for a high-rate foreign exchange brokerage's client-server architecture. We wanted to secure traffic between the client and server, so I suggested self-signed certificates.
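The chunking idea is simple enough to sketch (hypothetical Python standing in for the original service, which would have lived in browser-side JavaScript): split the payload below the upload limit and reassemble it server-side.

```python
def chunk(data: bytes, limit: int = 20 * 1024 * 1024):
    """Split a payload into pieces no larger than `limit` bytes."""
    return [data[i:i + limit] for i in range(0, len(data), limit)]

payload = b"x" * 50_000_000            # a 50 MB upload
parts = chunk(payload)
print(len(parts))                      # → 3
print(b"".join(parts) == payload)      # → True
```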

I'm not stupid, but I'm not a "10x developer" either. I work as a software developer because it's the most lucrative, accessible, and interesting opportunity for a non-credentialed individual like myself. I've met folks with master's degrees in CS who unfortunately didn't merit a basic certificate in the same.
