I recently went through (the recordings of) MIT’s intro course for electrical engineers, in which somewhere the professor says students may wonder why they have to do all this calculus and learn FET models and so on — in real life don’t you just wire chips together? And he points out that MIT degrees are for the people who make the chips.
You never know when background knowledge and first principles might come in handy. One of my favorite practical-engineering YouTubers has a story about going on a boat trip. The new coffee maker on board was freaked out by the noise from the inverter and kept shutting itself off. A total disaster! There would be no coffee the whole trip. So he turned on the blender while making the coffee, and the coffee maker started working. How did he know? He knew what kind of motor was in the blender, and knew its windings would increase the inductance of the circuit feeding the kitchen appliances, filtering out the higher frequencies put out by the cheap inverter.
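The rough physics can be sketched with a back-of-the-envelope formula (the numbers below are made up, and real boat wiring is messier than a textbook filter): extra series inductance L working against a load resistance R forms an R-L low-pass with cutoff f_c = R / (2πL), so the high-frequency harmonics of a cheap modified-sine inverter get attenuated while 50/60 Hz mains passes through.

```typescript
// Cutoff frequency of a simple R-L low-pass filter: f_c = R / (2*pi*L).
// Frequencies well above f_c are attenuated at roughly 20 dB/decade.
function rlCutoffHz(resistanceOhms: number, inductanceHenries: number): number {
  return resistanceOhms / (2 * Math.PI * inductanceHenries);
}

// Made-up illustrative numbers: a 10-ohm load with 10 mH of added
// winding inductance has a cutoff around 159 Hz -- mains frequency
// passes essentially untouched, kHz-range switching noise does not.
const cutoff = rlCutoffHz(10, 0.01); // ≈ 159 Hz
```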
This guy isn't an electrician. ("Elekchicken") His day job is just to put pieces of "industrial lego" together -- just like how so many programmer jobs now are mainly about gluing libraries together. But he never shies away from knowledge of first principles, and he demonstrates all the time why such knowledge is valuable.
Arguably our most prominent contribution to OSS (outside of our own projects) was in React. We're credited in the codebase when the approach was used: https://github.com/facebook/react/blob/v16.0.0/src/renderers...
As discussed in the original PR (https://github.com/facebook/react/pull/4400), the original checksum used a naive Adler-32 implementation, but with some basic math you can find a much more efficient implementation that eliminates most of the mod operations without risking a hidden deoptimizing overflow.
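The trick is easy to sketch (this is an illustrative version, not React's exact code): Adler-32 keeps two running sums mod 65521, and because those sums can't exceed the 32-bit range within a bounded chunk of input, the expensive `%` can be deferred to once per chunk instead of twice per byte.

```typescript
const MOD = 65521;
// zlib's bound: the largest chunk length for which the running sums,
// starting from worst-case values, still fit in 32 bits.
const NMAX = 5552;

// Naive Adler-32: two mod operations per input byte.
function adler32Naive(data: Uint8Array): number {
  let a = 1;
  let b = 0;
  for (let i = 0; i < data.length; i++) {
    a = (a + data[i]) % MOD;
    b = (b + a) % MOD;
  }
  return ((b << 16) | a) >>> 0;
}

// Deferred-mod Adler-32: identical result, but `%` runs once per
// NMAX-byte chunk. (JS numbers are exact integers up to 2^53, so the
// 32-bit bound is comfortably safe here.)
function adler32Chunked(data: Uint8Array): number {
  let a = 1;
  let b = 0;
  for (let i = 0; i < data.length; i += NMAX) {
    const end = Math.min(i + NMAX, data.length);
    for (let j = i; j < end; j++) {
      a += data[j];
      b += a;
    }
    a %= MOD;
    b %= MOD;
  }
  return ((b << 16) | a) >>> 0;
}
```

Both produce the well-known checksum 0x11E60398 for the ASCII string "Wikipedia"; the chunked version just hits the modulo thousands of times less often on large inputs.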
It's not every day that you get to use your background knowledge, but when you do it feels great!
If you want a 110k/yr to 1B/yr job, self-taught with some confidence is my recommendation.
But the fundamentals especially are where I think most of the value of a uni degree is.
Bootcamps can be valuable, but they are in no way comparable to a 4 year degree from a decent CS program. The top performing bootcamps are either functioning as an extended job interview that you have to pay for, or they are very good at selecting experienced students who only need a 12 week course to be ready to be productive developers. In my opinion, the reason we've seen bootcamps close or fail to expand is that there is a limited supply of these types of students.
For the vast majority of people a 12 week course, no matter how intensive, is a good introduction, but a lot of training is still necessary to be useful. If you are prepared to invest in that training, they can be great hires. However, you need to be aware that it's likely going to be months before you get real productive work without hand-holding. It takes most people a lot longer than 12 weeks to be comfortable with the basics of moving up and down through levels of abstraction.
I've seen various syllabi, e.g. teachyourselfcs.com, and though they seem legit (and I've dabbled in some courses) I don't quite see the application/direct benefit professionally.
Let me phrase it another way: when I got started it was easy to see why I needed to learn front-end and back-end to make a web app (for a CRUD job). Now I want to go further, but where?
I think it would be helpful to see what jobs I could get by furthering my CS fundamentals, and not just "Senior Software Engineer".
So, as someone with your unique perspective, what do you recommend? Should I really slog through ye olde CS curriculum in hopes that one day I'll be able to apply some of it? Can you recommend another approach?
Again, put another way, some of these "top performing bootcamps" sharpen your React skills and whiteboarding skills, which have a career/market value. But they don't appeal to me because they don't seem academically/CS focused.
I hope this duality/constraint came across.
Any guidance appreciated.
Depending on the kind of work you do, the minority of the time where it would be beneficial to know CS might have a big impact both on the product and your reputation in the company. This could help you move up to higher, better paid positions, but it also might not. Teaching yourself CS is a big time investment and if you just want to maximize your salary there's probably better ways to do it.
I think it's really only worth it if on some level you enjoy it and find it interesting. You can seek out jobs where they use more CS, but again this doesn't guarantee you'll advance professionally. But if you're the type of person who enjoys programming as a creative activity, I think CS can be very rewarding because it opens you up to what's possible.
This is going to be difficult, because getting a job other than Software Engineer (or something related in management or sales) often requires credentials.
However if you'd like to get a higher paying software engineering job at a company like Google, CS fundamentals can definitely help you get through the interviews.
>Should I really slog through ye olde CS curriculum in hopes that one day I'll be able to apply some of it? Can you recommend another approach?
Yes to the first, no to the second. CS fundamentals build on each other. It's difficult to understand algorithmic analysis if you don't understand calculus (at a basic level), and it's difficult to understand automata if you aren't familiar with set notation from discrete math etc...
Learning on your own, it's tempting to avoid the boring stuff and focus on what's fun or seems useful to you right now. The problem is you don't really know what's going to be useful later on.
You already know how to code, so if you are really interested in learning the fundamentals, first I'd make sure you have a basic grasp of high school math through calc I. Then I'd go through a good discrete math textbook (I'd recommend one that's CS focused). After that I'd work through a data structures and algorithms textbook, then a programming language concepts textbook, a computer architecture/organization textbook, and finally a theory of computation/automata book.
You'd have a decent grasp of the fundamentals by then. After that if you want to go further, I'd go through books on networks, operating systems, compilers, and functional programming.
The project is challenging but has a pretty rewarding feedback loop. Along the way you'll understand how high-level concepts like graphics are implemented on top of hardware, computer architecture, CPU behavior, memory access, interrupts and timers, and so on. It'll also challenge your software design skills and project planning. It might not be the best project if you are more interested in CS algorithms, though, but I can't really think of anything that has a more enjoyable feedback loop.
previous hn discussion: https://news.ycombinator.com/item?id=17134668
Why don't you send me an email (on profile) or a DM on Twitter @siscia_? Twitter, I believe, is the fastest.
Anybody else, feel free to do the same :)
From my experience that is not true. Bootcamp graduates (obviously barring additional experience) tend to be qualified to start at the intern level.
>Too many CS grads have little practical experience
Compared to veterans yes, but compared to new bootcamp graduates I disagree.
What CS grads can lack vs bootcamp graduates is the kind of thing they can pick up in a few weeks (as evidenced by bootcamps only lasting 6-12 weeks)--the reverse usually isn't true.
Practically, I've almost never seen this. Just like I've never interviewed a CS graduate who couldn't use a for loop, despite all the horror stories that people like to bring up. Even the most theoretical CS programs tend to require several sizeable projects.
Bootcamp grads can be a good find if you're willing to train, but in general that's because they are cheaper than CS grads, and for some level of work they tend to be good enough, not because they are better programmers.
There are obviously many exceptions to this, and I'm only comparing CS grads to new bootcamp grads assuming similar backgrounds, ages, etc. One advantage bootcamp grads do have is that they tend to be a little older/more mature, but that really has nothing to do with bootcamp vs university.
I'm currently doing CS and one of the core requirements for finishing the degree is to have at least one semester-long internship at a corporation in something that relates to your degree. I did backend development for half a year, though the rest of the courses are fairly real-world focused once they have dug through the theoretical meat.
Some schools have pitiful results for what they say is a degree.
If you wanted to suggest that they be required to have a significant capstone project (a full stack system, a compiler or VM, an operating system (rudimentary or not), firmware/BIOS, HPC application, etc.) then I would agree but to say that all CS grads should be good web devs is rather narrow minded when looking at the value of a computer science curriculum.
The full stack consisting of assembly, compiler design, C, networking, operating systems, embedded development, data structures and algorithms, and database development and design (including an understanding of the algorithms/optimizations used).
At least one class in a functional programming language and at least one in an OOP language.
And for all that is holy, please teach classes where they have to do presentations and learn interpersonal skills, and a few business classes. I am so sick of working with people who argue about technology but can’t come up with a business case for their idea.
And then let them choose a “track” based on their interests - after having, for lack of a better term, a “career day” where people from the industry come in and give talks about what they do and the requirements to get a job there.
If that curriculum means cutting out some core classes - so be it.
> At least one class in a functional programming language and at least one in an OOP language.
I think it's pretty common for CS degrees to have all/most of this in scope already? At least I had all of it, with the exception of embedded development.
I can't recall ever hearing that argument advanced for CS degrees on HN (and I read a _lot_ of HN postings). Perhaps you're thinking of liberal arts degrees?
The usual rationale for CS degrees is that one learns the underlying theory behind the systems and tech stacks that they use, making them a more capable developer.
And there's degrees in chemistry, for those whose goal is to study the way atoms interact with each other. That's different from chemical engineering, where the goal is to efficiently produce the desired molecules at scale, without blowing up the factory. Two different degrees, with two different focuses, in the same general area.
In the same way, I think that computer science needs to be a separate, different degree from software engineering. At the moment, CS is trying to be just one area, but I suspect that it's often inadequately preparing those who are going to go on to be software engineers - who are in fact the large majority of CS grads. (They may also be inadequately preparing those who intend to stay in theoretical computer science, but I have even less information about that.)
We tried hiring a few people with just bootcamps. Only a few (so hardly a representative sample), but none of them worked out. As soon as they had to try something even the slightest bit different than what they'd done in the bootcamp they were lost. There were people with degrees in unrelated fields who then did a bootcamp who were good, and almost all of the CS/CE/EE people we hired were good.
I'm not saying this is always the case, but the two years of CS fundamentals seem to be valuable, AND the two years of unrelated core classes seem to be valuable. It might just be how it forces you to engage with and learn things you don't care about (because there will be times in any job you have to do that), or the people skills of having to learn to deal with professors and other students, or the pattern of constant learning and adapting it ingrains upon you, or something else entirely, but per the link, I don't think a bootcamp should ever be viewed as sufficient preparation for a career in development. It's fine in tandem with other things, but it's extremely limiting on its own.
My fav set of classes that changed the way I think about things in CS. It was at UNSW in Australia.
Fundamentals - we started with C. I hated C segfaults but realized the power of pointers, structs, and how memory kind of worked. We built a little 8-bit machine code and a little VM.
Datastructures and Algos - hashes, B-trees, graphs, etc. It was eye-opening to hear how this had been invented before I was born and the latest and greatest databases still use their variants. We built a crawler and search engine with our own hand-written database. That’s where I learnt about mmap. A lot of fun.
Compilers - The whole, lexer, parser, checker, emitter pipeline. Write your own subset of C to jvm compiler. We did a simple lisp interpreter too.
Advanced graphics - we built our own little 3D game engine using only OpenGL calls.
Microcontrollers - literally programmed in assembly to operate a simple lift. Learnt about electrons and gates and how they make machine code work. Connecting compiler knowledge with microprocessor internals was a holy mind opener moment.
There’s something to be said about “we’re gonna build mystery thing X from the ground up and connect theory to practical”.
Sucks that Australia has very poor VC funding and startup ecosystem. UNSW produces some phenomenal graduates that are quickly stolen by US tech giants.
Now the caveat to my statements there are that all three already had degrees, in disparate fields unrelated to CS/CE. It's an example of capable individuals being able to learn practical skills in anything.
>There were people with degrees in unrelated fields who then did a bootcamp who were good
There has been a similar thread on HN where someone with a bunch of degrees said: "I haven't used anything from my studies in my work". But this person might be blind to how the education shaped her mind. Understanding goes beyond mere knowledge.
I've worked with a couple of good people without degrees, I'm not saying it's impossible, just that I've found it very useful. I don't think I'm alone in this!
(Also, c'mon, a degree is not just about the qualification or the education, it's about meeting peers and forming life-long bonds, and having fun too :)
-- edit -- Oh, and on-topic, I think it's good to test your boundaries - reverse engineering an embedded board with a multimeter, a line-levelling serial adaptor and a soldering iron, and then getting it to boot linux with some custom kernel bring-up code, was one of the most rewarding things I've done in recent years. Learning a little bit more about something that's a black box at the edge of your understanding is always good.
I tried to read through my uni notes on metric spaces a couple of years ago. I don't even understand them anymore. Littered with idiot comments like "obviously -> " despite it being nothing of the sort.
University me was a douchebag. :(
I'm sure this has been covered to death on HN already, but if anyone has a link bookmarked and feels like sharing?
"I'd like to welcome you to this course on Computer Science. Actually that's a terrible way to start. Computer science is a terrible name for this business. First of all, it's not a science. It might be engineering or it might be art. We'll actually see that computer so-called science actually has a lot in common with magic. We will see that in this course. So it's not a science. It's also not really very much about computers. And it's not about computers in the same sense that physics is not really about particle accelerators. And biology is not really about microscopes and petri dishes. And it's not about computers in the same sense that geometry is not really about using surveying instruments."
LISP was the medium, not the content
I never released any of them and none are useful for anything any longer. None probably reached more than 75% completeness. But by trying to build games on them I saw with such clarity things that worked and didn't, and I got deep practice in thinking about more 'general' forms/concepts/structures. I wrote a framework for writing 3D model loaders after, and the lessons carried over. I was utterly lost when starting the first game engine; with the model loading framework I had to think deeply, but I knew where I was going.
Here's something to consider. If you try to just start writing 'a game engine' (or whatever engine/framework), that could mean almost anything (from one day of work to twenty years of work). But if you consider some concrete ideas for particular games you'd like to make, think about what they have in common, and then try writing a piece of software that takes care of the common parts, you'll have constrained the problem to something workable, with definite goals and a way of testing your success after.
This is at the heart of any sort of 'abstraction' you develop: you want to code several things which have a bunch of overlapping aspects, so you write something that captures the common parts, and then arrive at the several things you actually need by instantiating your abstraction with different parameters (I mean that in a very general sense, not with specific reference to OOP, though it provides means of instantiating abstractions with various parameters too, of course).
I'd say being able to do the above effectively is the cornerstone of being able to do architecture well. A bunch of other stuff comes after, like knowing when to not create abstractions, and all sorts of knowledge about doing things well for systems in particular domains (e.g. architecture for an interactive 3d simulation vs. web apps have very different patterns that have surfaced as effective).
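A minimal sketch of that instantiation idea (all names here are hypothetical): factor the shared loop out once, and each "game" becomes an instantiation of the abstraction with different parameters.

```typescript
// The common part, written once: a fixed-step loop over some state.
interface Game<State> {
  init: () => State;
  update: (state: State, tick: number) => State;
}

function run<State>(game: Game<State>, ticks: number): State {
  let state = game.init();
  for (let t = 0; t < ticks; t++) {
    state = game.update(state, t);
  }
  return state;
}

// Two concrete "games", each arrived at by instantiating the
// abstraction with different parameters.
const counter: Game<number> = { init: () => 0, update: (n) => n + 1 };
const doubler: Game<number> = { init: () => 1, update: (n) => n * 2 };
```

`run(counter, 5)` steps to 5 and `run(doubler, 3)` steps to 8 - the shared loop neither knows nor cares which game it is driving, which is the whole point of the abstraction.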
I didn’t do CS in university. Following the above got me up to speed. I’ve seen it several times on HN. This time it’s me promoting it.
Appreciate the rec!
Some classes are better than others, but it’s extremely valuable to have a program that you’re following with deadlines and measures. If you have a desire to learn the stuff, you’ll get a lot out of it.
When I get busy personally or at work, I kinda phone it in and just do enough to make grades, but when I have the latitude to dive deep I’m learning a ton.
It’s also very affordable, fitting within most workplace education allowances, if you have one.
To be honest, I get secretly frustrated with the lower-level people who now exist in giant hordes. (I rarely tell anyone that.) To me, they are like people who have decided to learn 5% of their field in order to get a few things done, have some fun, and make a living.
These people use tools to create little applications for everyday use. But remember: The tools themselves are also software. But they are a level of software far beyond anything these people could dream of creating. They use languages, editors, compilers, and operating systems; but they don't have the first clue about how to create any of these things or even how they really work.
The most disturbing thing to me, based on interviews I've conducted, is that this seems to include some large fraction of people graduating with a Computer Science degree from supposedly top tier schools with high GPAs that supposedly mean something.
If you want to make really interesting exciting things that have never existed before, if you want to make a tiny little difference in the industry and change the world just a little bit, then you do need that degree. If you want to make the tools and libraries that the lower-level people use, you do need that degree.
The tools and libraries aren't sentient AI yet. If you want to use the tools and libraries at a high level, then you really need to have some knowledge about how they work. The disturbing thing I might be seeing, is that something like 40% of graduates from even good schools have that Computer Science degree, yet really only have that 5% knowledge, yet have been led to think that they know more.
I have been going through a bachelor's in computer engineering, and the curriculum/professors make every effort to provide as much value as possible, yet most students do just enough work to get the grade they want in the class.
In this same vein, students so often take the required courses, look for the easiest (instead of the best) teachers, and try to pick the easiest, lowest effort classes out of the in major classes they can choose between (tech electives) rather than focusing on building a useful base of knowledge for their future careers. Many students intentionally avoid useful classes because they are hard and don't want to risk damaging their precious GPA. I find this accounts for a lot of the grads with very high GPAs but with fairly limited knowledge outside of the basics. This ends up with students putting the cart before the horse by focusing on how to be best suited for getting a job rather than how to be well prepared for it.
The true value of a CS/ECE degree comes from the classes that you take not the degree itself. Much of that material (particularly the fringe optional courses) can be very difficult to grasp on your own and having a professor dedicated to assisting in your understanding of that material is extremely valuable.
Agreed. I think a lot of people take CS to get a GPA, network, get a good list of impressive sounding internships, and work with buzzword-compliant libraries.
Making tools and libraries is not necessarily that complicated. It is oftentimes simpler than making a large business app, because requirements tend to be clearer and more stable, and because you can copy the design of an existing editor. It does require knowledge of algorithms that a large business app does not require. Being able to make a compiler will not make you succeed in creating a good finished web app, whether large or small. Finishing a web app requires a good grasp of UI and visual design, which is completely irrelevant to compiler coding. Likewise, being good at algorithms, even difficult ones, and being good at maintainable design are not the same.
It is possible for one to be objectively harder in some sense or require rarer aptitude. Nevertheless, these kinds of laments tend to rely mostly on "the kind of things I like and am good at are the real tech, and everything else is for weaker programmers" attitudes.
The point of CS is not to make you highly skilled in operating systems or languages or even web apps. It is to give you overall idea about those areas to make you able to choose what to focus on and to make you able to learn any of these ideas once you decide.
My point is that those overall ideas are lacking in substance and understanding. People should have just enough background knowledge to keep themselves out of trouble and off of toes. So many CS grads seem to lack even that.
Who claimed that a high GPA from a top tier school meant something? Top tier schools are notorious for giving As to everyone. The point is that you got in.
I've become more and more convinced that this is the defining problem of our times- we're becoming victims of our own success. The author of this post feels like a dinosaur, and I would bet that many young people in our field who give in to their natural instincts and specialize in something will emerge on the other end feeling the same, at a much younger age than the author, and maybe unable to find equal or better work than before.
In other professions, the difference is more stark, and I think this is a major catalyst for the political/populist zeitgeist of the day. Entire industries have disappeared in a historical blink of an eye, and their former struggling workers are up in arms fighting powerful forces of nature trying to turn back the clock and stay relevant / valuable.
Bringing this back to CS, it's interesting to use this lens to determine whether the degree is worth pursuing anymore. On the one hand, it's fundamental and it encompasses the building blocks of how computers work and what they can do. On the other hand, programming techniques haven't changed very much and are quickly becoming commoditized and more accessible. As the author notes, it's true that you don't need to know as much as you used to, to build a useful program anymore. Like it or not, that's a fact, and economic forces are exploiting this more and more.
I think our wiring as human beings is optimized to learn when young, and then "grow up" and become efficient at repeatedly applying our skills to obtain the expected outcome. Increasingly, I feel like the winning (or at least a better) strategy is to stay "young" as much as possible, since the chance you will need to reinvent yourself seems to only rise. This sounds great when you're actually young, but as time passes you get worse and worse at it, despite needing to remain "young" and malleable, and despite the mounting competition from actual young people.
So given all this, saying people "need" a CS degree seems like punching and kicking at giant waves you'll never beat. And I say this as someone who deeply loves both CS and academia. Stay "young" as best you can and try to keep riding the next wave you can find.
> We've reached an era where the average worker's serviceable time long outlives the competitive edge they've gained from their education/training in their formative years. The accelerating pace of economic and technological change is faster than ever, and this condition is unprecedented in human history.
I think when change is this fast understanding the basic building blocks is more important than ever. These don't change quickly, some haven't changed since the industry was born. So much technological change is just reinventing concepts that have existed for decades and once you realize you're staring at an old concept in a new package keeping up is much easier.
The question then is what educational format teaches these fundamentals the best. For some of them it probably is a computer science course, but for others it might not be. One of the best classes I had was building our own database (TAFE was pretty hands-on), and from what I've seen this was a lot better than how it's taught in many universities. We had to start at the file level and think through the various steps to make a half-decent database, like what is required to handle index lookups efficiently, how to retrieve records in order, etc. It gives you a much more intuitive grasp of what steps a DBMS has to go through on your behalf. In my first real job after graduating I had to explain to someone with a CS degree why storing dates as strings was inefficient and was making our monthly billing take half a day to generate instead of half a second.
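A tiny hypothetical illustration of the problem (not the actual billing code): string dates neither sort correctly nor compare cheaply, while numeric timestamps do both.

```typescript
// Parse a US-style "M/D/YYYY" string into milliseconds since the epoch.
// Note Date.UTC takes a zero-based month index.
function parseUsDate(s: string): number {
  const [m, d, y] = s.split("/").map(Number);
  return Date.UTC(y, m - 1, d);
}

// As strings, ordering is lexicographic, so "9/..." compares *greater*
// than "10/..." -- and fixing that means a parse per comparison, per row.
const stringOrder = "9/1/2019" < "10/1/2019"; // false: '9' > '1'

// As numbers, ordering is correct and each comparison is a single op.
const numericOrder = parseUsDate("9/1/2019") < parseUsDate("10/1/2019"); // true
```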
Foundational knowledge is important but the where/when and how we obtain this knowledge could do with a shake up, you can produce a lot of valuable output without an upfront 3-4 year investment, but it doesn't seem like there are a lot of opportunities to gain it after becoming a full time worker.
I wonder if the author considers Node.js to be really interesting and exciting and never existed before. Ryan Dahl doesn't have a CS degree (but does have a mathematics degree).
Another (pretty cliché) example: Bill Gates never finished his degree and went on to create many great, exciting and interesting things.
Node.js was neither new nor interesting, and certainly was never "exciting" (note that this is coming from someone who now does a lot of node.js development).
To start with, it is highly relevant to note that node.js ignored fundamental things that were learned in CS going back to at least the 80s involving the duality of events and threads, leading to an entire generation of people who actually believed that callback hell was somehow a good thing to encourage :/.
In Cocoon, they not only had correctly handled the callback hell problem, they had generalized it so far you could write programs on the server that made "requests to the browser" in the form of rendering a page that were expressed as function calls that would return when the user clicked links and submitted forms, inverting the normal flow of control to make it easier to build complex interactions.
So yeah: they had all of this stuff working almost a full decade before node.js existed at all, much less before it was finally able to use async/await to manage callbacks. When they ran into evented hell, they didn't sit around for years building shitty workarounds: they implemented continuations for Rhino and contributed them back so they could do it correctly.
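To make the events-vs-threads point concrete, here's an illustrative toy (not Cocoon's or node's actual APIs): the same two-step sequence in nested-callback style, then flattened with async/await, which is essentially a compiler-managed continuation at each suspension point.

```typescript
// Callback style: each sequential step adds a level of nesting, and
// error handling must be repeated at every level.
function fetchUserCb(id: number, cb: (err: Error | null, name?: string) => void): void {
  setTimeout(() => cb(null, `user-${id}`), 0);
}
function fetchOrdersCb(name: string, cb: (err: Error | null, orders?: number[]) => void): void {
  setTimeout(() => cb(null, [1, 2, 3]), 0);
}
function orderCountCb(id: number, done: (err: Error | null, n?: number) => void): void {
  fetchUserCb(id, (err, name) => {
    if (err) return done(err);
    fetchOrdersCb(name!, (err2, orders) => {
      if (err2) return done(err2);
      done(null, orders!.length);
    });
  });
}

// async/await: the same steps read top to bottom; the runtime turns
// the function into a state machine resumed at each await point.
const fetchUser = (id: number) =>
  new Promise<string>((resolve) => setTimeout(() => resolve(`user-${id}`), 0));
const fetchOrders = (_name: string) =>
  new Promise<number[]>((resolve) => setTimeout(() => resolve([1, 2, 3]), 0));

async function orderCount(id: number): Promise<number> {
  const name = await fetchUser(id);
  const orders = await fetchOrders(name);
  return orders.length;
}
```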
The real thing to realize is that sometimes, shitty things can have more impact than great things, and things that are none of interesting, exciting, or new can have a greater impact than things that were all three of those things if they have better community management or business acumen behind them.
However, we should call a spade a spade, and not pretend that those people are as good with software as someone who has spent years studying foundations, in the same way that the world's greatest software developer shouldn't pretend to be a great business or marketing person because they threw together a good enough website using a template and made some sales of their product on some app store.
There's plenty of programming work out there that doesn't require any deep understanding of CS. You're not going to be creating algorithms when using existing frameworks to write yet another web thing, phone app, or internal businessy database-based system.
A bootcamp can get you started doing practical things. Yes, you won't have deep knowledge, but really you don't need deep knowledge for most employable work. Code doesn't need to be hyper-optimized at the scale you're working, and it's easy to learn common pitfalls & best practices from applied practice, reading, and mentorship.
And I say all this as an oldish fart who understands the chain from designing bespoke high level language environments down through to transistors. We don't need to count bytes & clock cycles anymore; people can let the machine & its provided environment simply work for them and learn the top-level interface.
If it is just the idea of having to take a load of liberal arts classes that perturbs your son, and not the low-level courses like chip design, logic, algorithms and data structures, calc and stats, one alternative to consider is to study internationally. English universities, for example, offer a bachelors in computer science in three years. Unlike "8- to 16-week full-day immersive courses that focus solely on technology", they have a curriculum almost exactly the same as a US computer science course, minus the 3 English classes, 3 history and political science classes, 2 economics courses, and an art course that a college like Kennesaw has as graduation requirements: Kennesaw State Curriculum (http://ccse.kennesaw.edu/cs/docs/BSCS_2016-2017.docx). To compare, look at the University of Bristol's Curriculum: [https://www.bris.ac.uk/unit-programme-catalogue/RouteStructu...]
I know this is not an option for everybody - many people need to stay close to home for personal or financial reasons - but it is definitely something to look into. With regards to finances, English university for international students, even with 1 year less of study, still costs a lot. However, I am pretty sure that the course structure is similar at most European universities, some of which offer really low fees to international students.
I wish more freelancers took a class in business accounting so they'd have an idea of how to do it and what a good (or bad) contract looks like... or understand the value of their time. There are far too many that decide to become "freelancers" and yet have no idea on how to do the basic business items that come with being a freelancer.
I wish more programmers took a class that had a public speaking component. Reading powerpoint slides as a team presentation is boring. The work environment isn't just "I write code" but also a transferring of knowledge from one person to the rest of the team.
I wish more programmers took some classes in history or the physical sciences - things outside the major. I've had more than a few water-cooler conversations where a person doesn't understand how the length of the day affects the temperature, or is surprised at the similarity between events today and those of thirty-some-odd years ago. This concerns me, not for the skills of work, but rather for the understanding of the world outside the office.
To these things: English composition, human communication, contemporary economy, arts and culture, political science, and history... those are excellent class titles to help fill out those "I wish" items.
Going to a university in the UK I was free to study the subject I was interested in and wanted to understand. I was able to fully immerse.
My schooling prepared me for the rest.
Looking back now, I actually regret only doing STEM papers. I got stuck in a bubble of dealing with the same people (or the same types of people) in all my classes, and really got no breadth to my degree.
I'm not saying that people should be forced to take arts papers for a science degree, but in retrospect I think that I should've taken some.
Universities in the US have entrance requirements for classes completed in high school. They do the same thing English universities do when admitting American students--they require that specific general education classes were completed in high school.
Because of this, even though the US doesn't have national standards, virtually every university-bound student will have already completed the same general education classes that students from countries with three-year universities would have. Universities here don't force students to take English, math, and history classes because some substantial population of college-bound high school students hasn't taken them.
I think that issues like knowing "how do compilers actually work" or "this thing is actually the halting problem, let's stop" are more relevant than how removed from latches and memory controllers one is.
Reminds me of this comment: https://news.ycombinator.com/item?id=17403233
He wasn't bitter, however; just confused about my assessment about it being time-consuming to support. I had to explain that typical programmers use abstractions to make code maintainable and not rely on our ability to read, well, spaghetti code fast. I've met some programmers quick at deciphering spaghetti code, but I confessed I wasn't one of them. I used the analogy of taking a custom-built amateur car with a custom-built engine to a generic auto-mechanic and expecting them to figure it out as fast as a regular car. That story seemed to click.
I also forgot to mention that many engineers and scientists used FORTRAN and BASIC back then without knowing much about the guts of computers. It's largely why such languages were made.
And so it should be (for the vast majority of the hardware's users). The better the work done by the people below, the more hermetically the abstraction can be interacted with from above.
She's currently on her second internship. They're both at employers that don't screen candidates on actual programming ability (they just looked at GPA, resume/application, and then a soft interview), and the current one has a reputation for being a very meh internship, though good resume padding. The last time I helped her, her code was fine for someone who had just started learning two years ago, but I don't think she is going to progress to the point that you'd expect someone with a Master's to be at simply because she isn't doing her own homework.
I don't have a CS BS or MS, but there have been a few times where I feel like I need to get one just in case the market tanks again and they become a significant hiring criteria. But at the same time, I have to wonder just how many people currently enrolled in MS CS programs throughout the nation are doing something similar, and devaluing the worth of the degree (on paper, to potential employers) to the point that some could even look at it negatively.
For my part, I have found the underlying theory to be helpful in ways I couldn't have comprehended while at school. Understanding binary made understanding octets in IP addresses and subnet masking much easier. Taking a class that involved programming sorting algorithms by hand in C++ was very beneficial, even though I have no need to do this in my day to day work. Learning about logic gates has even been helpful. Basically, I'm better equipped to have a fundamental understanding of how software and hardware works, even if it's a very basic understanding. What I lacked coming out of school, though, was having a clear road map of how to just build something in a modern stack on day one at a job.
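As a small illustration of the point about binary and subnet masking (the addresses here are just hypothetical examples), a mask is nothing but a run of one-bits, and finding the network is a single bitwise AND:

```python
# Sketch: why understanding binary demystifies subnet masks.
# A /24 mask is just 24 one-bits followed by 8 zero-bits.
ip   = 0b11000000_10101000_00000001_01000010   # 192.168.1.66
mask = 0xFFFFFF00                              # 255.255.255.0, i.e. /24

network = ip & mask                 # AND keeps the network bits
host    = ip & ~mask & 0xFFFFFFFF   # the remaining host bits

def dotted(n):
    """Render a 32-bit integer as dotted-quad notation."""
    return ".".join(str((n >> s) & 0xFF) for s in (24, 16, 8, 0))

print(dotted(network))  # 192.168.1.0
print(dotted(host))     # 0.0.0.66
```

Once you see the mask as bits rather than four magic decimal numbers, "why is 255.255.255.0 a /24?" stops being a mystery.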
My compatriot is in the opposite boat. He came out of boot camp with a clear understanding of how to build web applications using Angular. He could hit the ground running, and did from day one. However, he lacks the underlying theory that helps to understand how things work. Does he need these things to do his job? No, but I do believe it makes for a more well-rounded developer to have this knowledge. Fortunately, he's got a great attitude and aptitude, so he's been picking these things up as he goes.
I'd rather see something more in the middle, where one can get the theory coupled with the real-world programming skills. Maybe my CS program is to blame, and others exist that do a better job of this. Looking back, my senior "full-stack" project was very limited. I would have benefited from a little more meat to the project, and also having some more of the ancillary things taught, such as anything to do with networking in a more practical rather than academic way.
If you want a simple middle class life a bootcamp is perfectly fine. Lots of people make money writing CSS. There is nothing wrong with it. But I would never tell a bright youngster that CS degrees (and similar) are a waste.
He asserts that the number of devs is doubling every 5 years, which means that half of all developers have < 5 years of experience. The industry has lost much of its scientific discipline, which has to return, or regulations will force more structure.
Anyway he's a great speaker and this is one of my favorites.
I think what people writing articles like this tend to miss is that it's much easier to go super deep in a field when the field is limited and low-entry but you're already in it. There's not really as much going on, and there's not much else to do but learn C or some text editor on a super deep level, or whatnot. What else are you going to do? Look at the stuff "deep" people are generally into; it mostly revolves around POSIX some way or another. And databases, but nobody wants to talk about that.
But today, there are hundreds of languages, a whole bunch of frameworks per language, various tools, constantly changing standards, etc. The available landscape is absolutely staggering. If you want to deeply focus, you need to pick what to deeply focus on, which is a rather tough choice and a questionable one, because the thing you focused on might become obsolete.
> if you want to make a tiny little difference in the industry and change the world just a little bit, then you do need that degree
And what sense does THIS make? Among the people who I know who _do_ deeply get into some specific CS topic, many are those who do not have degrees, because they're often people who are not fans of structure and ended up doing what they want, as opposed to what might be beneficial for career purposes.
This just seems to be heavily misguided elitism.
If you really want to know why the quality of software, and basically everything else, has gone down, just look at market incentives and you'll find that to be an utterly boring question.
Not really. There is a bit too much "meet you at the bottom" communism in the modern ethos of computer science. When a field has measurable difficulty and time-to-skill, isn't it a little odd that we don't have a lot more measured elitism in computer science? The reality is that the hardest, most complex stuff is literally solved and produced by an elite few. All of it could be taught or learnt, but no one gives a shit.

It's a bit like magic slowly dying from the universe while the wizards keep suggesting it might be worth holding onto. But then they get called out as an elite minority only interested in furthering their arcane agenda, while everyone else uses the accessible modern "technology" built from the original magic and cannot fathom why anyone should give a shit about magic anymore. Wizards only appeared special to get a cushy job next to the king in their own tower, right? Technology is just as good as magic, because it was built from magic! So elitist.

The problem comes when the technology fails, or doesn't do something that's needed and cannot be changed without changing the base magic the technology started from. If there are no more wizards and no more magic, you're never going to be able to create new base technologies. The only hope is the magic-making technology everyone is currently working on, called Machine Learning. Then all the wizards can be virtualised and controlled like slaves; even if it's provable that ML isn't actually magic, it's close enough... we hope.
The elite few don't really want anyone joining them, so nobody does. Look at the state of academia and look at lack of training in jobs. Nobody wants anyone to be elite, so people don't bother, there's no benefit in it. Your problem is that you think you're important, that you think the most useful contribution from a person is what they do personally, but all that does is just advance, you, personally.
The thing is, the elite are often much more worried about being elite than about what they're actually doing. Once you see that, you know the incentive is corrupt. It's a status thing for them, and how dare anyone challenge their status. That's really all this is. It's hardly about the advancement of technology, because if it were, those people would be out there teaching, or trying to address the information overload, not looking smug. It's elitist because it utterly disregards most of human experience and presents yours as superior; your entire argument ultimately derives from that view, and pretty much everything you say after that could really be anything as long as it supports your idea that you're superior. That's why elitism is bad: it's destructive to one's conception of reality.
I know plenty of people who don't look at things as magic but who also don't consider themselves as some "wizards". Maybe you should try getting off your high horse and talking to people some and figuring out what it is that they are doing all day and you'll understand how silly everything you're saying here is. But as with everything else, it's easier to sit on top than to try to understand.
Snobbery is all this is, and likely unearned; I can't say the association between perceiving yourself as elite and actually being elite is very strong at all.
Self-driving cars (controls, perception, planning, safety, sensor design, localization, mapping, integration, security, etc.), human-competitive NLP/image classification, advanced robotics (repeat list from cars here but for things in the air, off-road, on the water, in the water, in orbit, in deep space, ...).
Those are all examples of real things that real people get to work on every day.
Operating systems topics don't even scrape the top 20 of stuff I think of when I think of deep expertise.
And even if we limit ourselves to the sort of things you mention, most people who work deeply on languages, tools, and standards view these as manifestations of their deep exploration rather than the focus or subject of the dive.
For example, TensorFlow. The framework very much is the product. But even if some other framework won the day tomorrow, the people who worked on TensorFlow would not have "wasted" their time thinking deeply about how to build the system.
This is why researchers whose original contributions were made in the 70s and 80s none-the-less continue to establish themselves as desired experts in new technology trends (e.g., Leslie Lamport and cloud computing or Martin Abadi and ML frameworks). Because they were focused on ideas and fundamental problems. The problems never disappeared, just changed form. And ideas have a lot more staying power than their manifestation in code.
Most people working deeply on systems today are not "revolving around POSIX in some way". See the proceedings of OSDI. And most deep experts choose other topics, most of which your post doesn't mention: graphics, programming languages (making them and analyzing programs written in them), compilers, security, robotics, user interfaces, NLP, ...
Your definition of "expert" seems to revolve around using things, mostly things based on ideas and techniques that were well-understood already 20 years ago and that are related to building a particular type of software system. Which, if anything, seems to deepen the author's point.
Someone gets to fill the AI research labs, staff the self-driving car companies, work at NASA, build core infra at large tech companies, and build the foundation for the next 20 years of trendy growth areas.
It's possible to get to those places without a degree, of course, but a degree is by far the path of least resistance. And in most of these cases, learning the material from the degree isn't optional; you're probably going to have hard time doing that controls engineering job at a self-driving car company if you never made your way through a calc sequence, some physics, and an algorithms course.
It's also worth noting that very often, building wordpress plugins pays more than doing all of those things I mentioned. I guess it's all about what you want to spend your life doing, which is exactly what the author says at the end of the article.
Self-driving cars are new. You've missed my point. The OP is from an older age. There wasn't a lot back then, so in choosing what to focus on, there weren't a lot of choices.
Funny you mention graphics: what do graphics usually involve? C, C++. Back to POSIX. Operating systems? Same. Robotics? C. Basically, if you picked C, you're good.
On the other hand, how often do you hear about a Java expert? Someone who knows the intricacies of the GC? All about the JVM? Not that often. They exist. But it's not hip. They chose "poorly".
These days, there is a lot more choice, so what are you going to pick? On what is that choice based? What do you do if you picked the wrong thing? This problem is very modern and didn't exist to such a degree before.
RALPH, built into a 1990 Pontiac Trans Sport minivan, drove across the US 98% autonomously. In 1995.
Much of the control theory, robotics, and AI work that enabled the current self-driving gold rush was invented decades ago.
The Dartmouth workshop was in 1956. Dearth of choices? Please! Those days provided an enormous surplus of choices, almost all of which were good ones! People from that older age invented reinforcement learning, image classification, natural language processing, OOP, hell, even the notion of a pointer! And the list goes on.
Today there are far fewer choices than there were back then because so much has already been done.
That is, of course, assuming you're in the business of "doing things no one else has done before" as opposed to the business of "following a well-trodden path".
So, I guess if your view of the world is confined to "using things other people already invented and explained to me", you might consider the 1950s and 1960s a bleak period when no one knew how to do anything. As opposed to the cusp of a century-long period of continuous innovation...
Again, as the article says, I guess it boils down to what you want to get out of a lifetime of work.
> Funny, you mention graphics, what do graphics usually involve?
shapes, views, raytracing, texture mapping, lighting, color, filtering, scaling, reconstructing, visual surfaces, grids and voxels, octrees, kd-trees, partition trees, polygonal rendering, ...
And that's just the undergrad stuff, not the cutting edge.
Oh yeah, knowing C/CUDA/OpenCL is nice. But when compared to deep expertise, it's a rather trivial time investment and is completely orthogonal: an implementation detail, not the fundamental content.
> Operating systems?
Kernels, scheduling, device drivers, caching, distributed systems, energy models, timing attacks, and the list goes on.
Of course, knowing C is essential, but that's the easy stuff compared to wrapping your head around a modern OS, or even a tiny piece of a modern OS.
SLAM, sensor fusion, filters, actuation for various types of novel actuators, PDEs and ODEs, optimal control, stability and robustness, system identification, model-predictive control, motors, servos, simulation, etc. And that's just the software side.
Oh yeah, knowing C is nice. But when compared to deep expertise in robotics, it's a rather trivial time investment and is completely orthogonal: an implementation detail, not the fundamental content. Many of the fundamental ideas and techniques in robotics pre-date C by decades.
> On the other hand, how often do you hear about a Java expert? Someone who knows the intricacies of the GC? All about the JVM? Not that often. They exist. But it's not hip. They chose "poorly".
I know a few true, honest-to-god Java experts. They all make insane amounts of money (even by SFBA SE standards) and love their work. Turns out Google has quite a bit of Java code and a metric shitload of money.
You think graphics, robotics, and OSes are just "C and POSIX". That's not true. C and POSIX aren't even table stakes. They're the thing you pick up in a few weeks or maybe a semester so that you can spend several years obtaining the table stakes -- see the list above. Then you need to build true expertise on top of that.
The path from "I know C" to "robotics expert" or "graphics expert" is at the very least a multi-year path. And that's assuming you're bright and have your full work day (and then some) to dedicate to following advances and building your own.
But, any serious business that's going to churn a lot of data, needs fast pipelines, needs to invent entire new markets or ideas will heavily rely on people with patience and training in scientific process.
I will say that those of us who didn't graduate (including me, I dropped out) often feel they have something to prove and will work harder.
I worked my entire way through school doing IT (6 months) then software engineering (2.5 years) before I was offered my first full time by some coworkers that went to a startup. The full time offer also lined up with my term as president of my fraternity ending, so it was time to get out.
I don't know about you, but I need some solid evidence that you did anything close to that so I can consider you seriously.
Why not just take a CS degree? Because I'm a poor fit for the education system. That's why I dropped out of high school in the first place.
Also, I feel like I've paid my dues already. I've been learning about computer software (and hardware and electronics) for 27 years. I haven't stopped at merely what I needed to know to do my job. I have done a lot of self-study. I routinely roll my own libraries and write embedded code, and I had a patch submitted to the Linux kernel a few years back. I also design analog and digital circuits on my spare time.
I feel it's not about having a degree at all, because I'm living, breathing evidence of that. I've met people with CS degrees who can barely write a line of code. Maybe they didn't go to a good college. Maybe they did, and it's possible to pass the exams by cramming (followed by forgetting).
Saying that you can't do advanced stuff without a CS degree is snobbery.
1) Expect that 1,000 people with the degree will mostly have that knowledge;
2) That 1,000 people without the degree will mostly not have the knowledge.
> "They use languages, editors, compilers, and operating systems; but they don't have the first clue about how to create any of these things or even how they really work."
And yes, he is asserting that people should get this from a degree:
> If you want to build doghouses, just pick up some skills with hammer and nails, and then go for it. If you want to be an architect who designs and builds skyscrapers, then go get a degree in architecture first.
The full depth of applied understanding comes from personal interest & experience. Whether or not a person with such interest pursues a CS degree is completely orthogonal.
Plus, these applied low- & mid-level computational practicals have nothing to do with Computer Science; they are programming, architecture, and engineering. People generally do not complete CS degrees with any specific imparting of these 3 facets, unless they use their university time to pursue their own interests & ambitions in the field. And again, such people can and do pursue those outside of university, especially in their pre-university age exploratory years, and on-the-job experience with real systems.
It's not orthogonal, but rather highly correlated. Combinatorics, graph theory, computer architecture, etc., will all be part of a university CS curriculum. Someone who has a CS degree will probably know those things (or at least recall them after a brief refresher).
People like the author, with a degree and lots of experience under their belt, really overestimate what the degree specifically gave them, vs what they learned through decades of experience as they developed their craft. When it comes to practical, applied programming and skills of abstraction, informed by deep knowledge of what goes on under the hood, vanishingly small amounts of that come from university education.
Again, I will be careful to separate out those who do actual Computer Science on the job from this practical craft of quality programming. The former is much more rare, but is a separate field.
I believe the only thing of lasting value I got from the degree was that I saw more of a 'roadmap' of CS topics that I didn't know existed before, which were a starting point for me learning more about them on my own. HN has done that more successfully than the university since though ;)
In the years since the author graduated, there has been a shift in some CS degrees, to meet market needs. A key difference is that then, a CS degree wasn't about programming.
Those CS degrees were just the starting point, letting you know what is out there, with a taste of how to approach it. Part of that was contact with research academics. (Though some recent graduates miss the point and believe that because they now know something, they now know everything.)
CS grads are lazy programmers with the mentality that someone else will do it.
I am self-taught and can usually rig a solution to a problem in a few hours. A CS grad will tell you it's impossible.
Maybe they need to do hackathons, but it's like they have been taught to doubt instead of taught to think.
No, you need to somehow invest more time to build more features than another sucker out there. That's almost entirely it. Period. Shit is just fast enough these days, and if your industry cares about performance, well then maybe understanding something about caching that can be learned in less than a day from a blog article will help you with 80% of your problems.
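To make the caching point concrete, here's roughly the "blog-article level" of caching that covers a surprising share of real workloads: memoize a pure, expensive function. (The function name and the stand-in work are hypothetical; the decorator is Python's standard `functools.lru_cache`.)

```python
from functools import lru_cache

# Minimal sketch: cache results of a pure, expensive function.
@lru_cache(maxsize=1024)
def expensive_lookup(key: str) -> str:
    # Stand-in for a slow database or API call (hypothetical).
    return key.upper()

expensive_lookup("report")                 # computed once
expensive_lookup("report")                 # served from cache
print(expensive_lookup.cache_info().hits)  # 1
```

That's a day of reading at most, and it's often the difference between "too slow to ship" and "fast enough that nobody asks."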
There's plenty of work out there that craves better solutions, and a degree is absolutely not even a nice-to-have at this point. Let me repeat: there are fundamentally basic applications and software solutions that various industries are dying to have exist - millions of dollars on the line, if you know which industries are in question - that simply take a damn long time to implement, but every individual piece is so far removed from even a basic comp-sci 101 algorithms class that you're literally just talking about business logic at that point.
The project also involved real-time bitstream generation and modulation at 10MS/s. That was a mixture of understanding DSP and some clever hacks to get the performance we needed on the hardware we had. Oh, and concurrency without race conditions, because we needed to use every core we had to make it all work.
Yes, there’s lots of business problems that can be solved by programming without much computer science. But there’s also a huge pile of problems where it’s not even clear that it’s possible to solve using current tech. I, personally, much prefer the latter, but to each his or her own.
I've built chunking hi-def megapixel web camera services a decade ago, when browsers were limited to 20 MB uploads. I've suggested building simple structured-data protocols based on XML for a high-rate foreign-exchange brokerage's client-server architecture. We wanted to secure traffic between the client and server, so I suggested self-signed certificates.
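The chunking idea from that camera service is simple to sketch: split a large payload into pieces that each fit under a per-request size limit, so the server can reassemble them. (The helper and the 45 MB payload are hypothetical; the 20 MB figure follows the story above.)

```python
# Sketch: split a payload into upload-sized chunks.
LIMIT = 20 * 1024 * 1024  # 20 MB per request, per the old browser limit

def chunk(data: bytes, limit: int = LIMIT):
    """Yield (index, total, piece) tuples small enough to upload."""
    total = (len(data) + limit - 1) // limit or 1
    for i in range(total):
        yield i, total, data[i * limit:(i + 1) * limit]

payload = b"x" * (45 * 1024 * 1024)  # a hypothetical 45 MB recording
pieces = list(chunk(payload))
print(len(pieces))  # 3 requests instead of one rejected upload
assert b"".join(p for _, _, p in pieces) == payload
```

The (index, total) pair is what lets the server know when it has everything and in what order to stitch it back together.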
I'm not stupid, but I'm not a "10x developer" either. I work as a software developer because it's the most lucrative, accessible, and interesting opportunity for a non-credentialed individual like myself. I've met folks with a Master's in CS who unfortunately didn't merit a basic certificate in the same.