There is more harm in the notion that there is anything at all that absolutely everyone needs to know to succeed than there is benefit from the contents of such lists themselves (which boil down to "everyone needs to know what I know").
To succeed, all you really need to know is enough to be able to make meaningful contributions to society in any way whatsoever. This is all there is to it, and there are thousands of individual paths that lead there. In fact, many of the most successful people specialize early, devote themselves to a single project or group of projects, and spend the rest of their professional lives on it. Do you think John Carmack knows much about visualization, databases, or cryptography? Does Linus Torvalds care about Scala, Haskell, or Prolog? People love what those guys have created, so they have succeeded; it is as simple as that. Nobody examined them for knowledge of 25 unrelated areas; instead they built themselves a useful toolkit of skills for their particular goals. It is not worth spending your life chasing some abstract platonic ideal of an uberprogrammer who knows everything about anything - in fact, such an attitude is detrimental to success.
Ooof, very much agreed. I studied electrical & computer engineering in college, and went on to work in software instead. I was a bit overwhelmed by most of this list, even though I consider myself a fairly successful software developer.
And then I got to the computer architecture section. Given my background and my time away from ECE, I've forgotten more about comp arch, semiconductor physics, and microprocessor design than most CS graduates I know have ever learned. And I think that's completely ok. Even most people I know (including past-me) who work on software for embedded systems don't really need all of that, though a healthy dose is helpful.
> Given my background and my time away from ECE, I've forgotten more about comp arch, semiconductor physics, and microprocessor design than most CS graduates I know have ever learned.
I feel you on that one, man. I bombed a lot of interviews because I didn't memorize all the trivia. I've implemented a basic CPU in VHDL, wired up a whole computer out of a single microcontroller, and I know all about caches and TLBs and ILP and all kinds of low-level stuff. But those questions never seem to come up in interviews.
I expect they come up all the time, in interviews for sufficiently low level positions.
There are far more relevant things to ask in the vast majority of software interviews, though. Ignorance of the existence of memory caches or their impact isn't a good thing in an application developer, of course, but knowing VHDL or semiconductor physics isn't a good predictor of success in that role.
If you're writing a web-app knowing how the TLB works is just as relevant as knowing how a red-black tree works. Which is to say, not very.
> but knowing VHDL or semiconductor physics isn't a good predictor of success in that
Sure, I'll grant that. But I would also argue that a lot of what people do in interviews isn't a great predictor of success either. Data structures and algorithms are important in some jobs, but knowing how to use git, automated deployment, testing, and shipping products or features seems a lot more relevant to job success than the "how to tell if there's a loop in a linked list in linear time with constant space" question that everyone seemed to love 5-10 years ago.
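For anyone who hasn't seen it, that interview chestnut is usually answered with Floyd's tortoise-and-hare algorithm. A minimal sketch in Python (my illustration, with a hypothetical `Node` class):

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.next = None

def has_cycle(head):
    """Floyd's cycle detection: O(n) time, O(1) space.
    A slow pointer advances one node per step and a fast pointer two;
    if the list loops, the fast pointer eventually lands on the slow one."""
    slow = fast = head
    while fast is not None and fast.next is not None:
        slow = slow.next
        fast = fast.next.next
        if slow is fast:
            return True
    return False
```

Cute, but as the comment above says, it tells you very little about whether someone can ship a feature.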
I also believe a lot of interview questions are poorly conceived. I can't, however, see why anyone would be surprised to not get many (any) detailed hardware questions in a typical software interview process. And absolutely, there are important questions you must ask about process, testing, deployment, tool sets.
But understanding algorithms and data structures is fundamental to the job, even if you spend most of your time not thinking about it at all. This is important, and failing to realize it is a prime source of hard-to-analyze, hard-to-extend, hard-to-test, and hard-to-maintain code.
An application developer (I'm leaving aside the can of worms opened by "web-app") absolutely should understand at least roughly how a red-black tree works. They shouldn't have to be able to implement one properly for you on a whiteboard, and they shouldn't need to remember all the details, or even the definition, off the top of their head. However, given a description and an implementation (perhaps partial), they should be able to work out and describe how it works, why it works, and say something sensible about the issues in properly implementing and testing one. Bonus points for knowing a couple of places they come up in practical use. Obviously, some exception for the most junior positions.
Even if they get the details wrong, it can be fine. But not being able to speak sensibly about such a data structure is a significant red flag. Done right, questions like this can reveal a lot about a candidate's ability to do the job, even if they never directly do this. (The caveat being "done right": this sort of question isn't pass/fail; it's the discussion that is useful.)
On the other hand, understanding the details of a TLB really doesn't seem to me to have the same sort of halo effect -- getting into details on this in a software position (with certain exceptions) seems to be more of a waste of time.
> An application developer (I'm leaving aside the can of worms opened by "web-app") absolutely should understand at least roughly how a red-black tree works.
Thanks for completely dodging any kind of substantive point I might have had. I can move the goalposts such that my opinion is always right and I can always win, too. But I generally try not to because that isn't the point of HN, at least as far as I can tell.
Here's my point. There are many, many software jobs. Only a fairly small subset of them need serious data structures and algorithms work. But interviews tend to focus on data structures and algorithms brain teasers, regardless of the actual day-to-day of the job.
I say "fairly small subset" here because that's the actual truth. There's a huge swath of programming that can get by magnificently with nothing more than arrays and hashtables.
I had a friend interview with Google to work in their uptime group or something like that. He had to solve some brainteaser about generating an in-order list of numbers that meet a certain criterion in O(n) time and constant space. I get that someone somewhere thought it was a good idea. But if you're tasked with keeping failures from happening, being able to solve that brainteaser doesn't say anything about your suitability for the job. Being able to imagine disaster scenarios and how they'll play out seems a lot more important to me.
Now I'm not Google so who is to say that my opinion is right and theirs is wrong. Perhaps they're happier with the subset of folks who can do both. But it seems like they might well be missing out on the people who can imagine how bad it could get and help plan for that but who didn't take advanced data structures and algorithms.
> I had a friend interview with Google to work in their uptime group or something like that. He had to solve some brainteaser about generating an in-order list of numbers that meet a certain criterion in O(n) time and constant space. I get that someone somewhere thought it was a good idea. But if you're tasked with keeping failures from happening, being able to solve that brainteaser doesn't say anything about your suitability for the job. Being able to imagine disaster scenarios and how they'll play out seems a lot more important to me.
I think Google's approach to hiring has generally been "let's hire really smart and overqualified people and then they'll be suitable for any job".
I really didn't mean to dodge anything. By that comment I meant that web-app and application is a fuzzy line, and I'd rather not distinguish between them.
Given that, I don't accept your premise. There is a huge swathe of programming that can get by day-to-day using nothing more than arrays and hashtables. And, all else being equal, in my experience this work is consistently done much better by people who can understand and reason through what is going on in a slightly more complex data structure, or describe the function of a recursive algorithm, etc. Completely avoidable design errors are often made by people who don't understand why what they are doing is algorithmically silly, or who pick inappropriate data structures. This is all so avoidable.
I was very explicit that I don't think you should give "brain teasers" and expect people to "perform" them in an interview. And I was very explicit that I think there are lots of other areas that are at least as (and often more) important to get into in an interview situation.
However, any developer with reasonable fundamentals can reason about a red-black tree, even if they've never seen one before.
So yes, I don't expect you to prove the big O of the search time, but I might ask you why we would want to balance a tree at all, and what the balancing algorithm is achieving once you've been shown it. And I'm not going to ask anyone to implement one from scratch on a board, but I might show you an implementation and ask for your thoughts on testing and corner cases to analyse.
These are perfectly applicable questions (perhaps excepting entry level), and vastly more relevant than how a TLB works, which was my point. To be honest, I'd probably use something even simpler, but don't feel R-B to be unfair.
I even stated that you don't have to get these answers "right" but the discussion is useful. If you get the details wrong but say sensible things, that's great. If you can't reason anything out about it even with some handholding, it's a bad sign.
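To make "why balance a tree at all" concrete: inserting keys in sorted order into a naive BST degenerates it into a linked list, so lookups become linear. A minimal sketch of that unbalanced baseline (my illustration, deliberately not a red-black tree - it's the failure mode that motivates one):

```python
class BST:
    """Naive unbalanced binary search tree."""

    def __init__(self):
        self.root = None  # nodes are [key, left, right] triples

    def insert(self, key):
        if self.root is None:
            self.root = [key, None, None]
            return
        node = self.root
        while True:
            child = 1 if key < node[0] else 2
            if node[child] is None:
                node[child] = [key, None, None]
                return
            node = node[child]

    def height(self):
        def h(node):
            return 0 if node is None else 1 + max(h(node[1]), h(node[2]))
        return h(self.root)

t = BST()
for k in range(100):  # sorted insertions: the worst case for a naive BST
    t.insert(k)
print(t.height())  # 100: effectively a linked list; a red-black tree
                   # would keep the height at O(log n) here
```

A candidate who can reason their way to this failure mode, even without knowing the red-black rebalancing rules, is exactly the kind of discussion I mean.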
So I feel you've completely mischaracterized my response, as what I'm talking about has nothing whatsoever to do with your friend's description of their Google interview.
Now to your broader point - would I prefer a candidate who was an ace at algorithms but couldn't reason about system failures and risk mitigation over someone who could "imagine disaster scenarios and how they'll play out" but can't talk sensibly about a simple data structure?
Honestly I wouldn't particularly want either of them on my team, so I hope there are better candidates.
Well, you've addressed a lot of what I said and done so fairly reasonably. I think I've mostly been complaining that there are a lot of companies who value brain teasers over thinking ability and you say you're not in that camp. Fair enough.
> Honestly I wouldn't particularly want either of them on my team, so I hope there are better candidates.
It'd be nice, but Google is hiring up lots of people and it's getting hard to get everything you want. Better, faster, cheaper. Pick two.
I agree, this list is way overboard. It is far too broad and too deep in too many areas. You need to implement RSA? But then just a couple vague paragraphs on databases?
I've tried to answer this question as the conjunction of four concerns:
What should every student know to get a good job?
What should every student know to maintain lifelong employment?
What should every student know to enter graduate school?
What should every student know to benefit society?
For all four points listed, focusing on some checklist that covers everything is not really the optimal strategy. For graduate school you may need to pass some broad exam, but afterwards, in turning towards research, you want to be much more picky about what you choose to learn and stick to learning mainly (but not necessarily exclusively) things that are likely to help your particular goals.
You should think of learning as investing: learn things that you are likely to have some use for in the future, or that you simply enjoy learning. Things you learn but have no long-term use for, or passion for, you will promptly and almost completely forget, and the time and effort you spent learning them will simply be lost.
For getting a good job/lifelong employment this list seems way too long and broad.
I'd argue better time is spent actually building stuff in your area or language of choice, networking, taking internships and jobs, and practicing—not reading books about unrelated subjects.
I'm still in my early 20s, so I could be wrong, but I don't think you guarantee lifelong employment by just having a laundry list of subjects you read about.
Could not agree more. We really need to move away from the notion of doing 'IT' just for the sake of it, and instead use our computer knowledge to solve a problem (make sure the problem is real first).
Which is why most of the awesome programmers I've met were NOT IT majors: they were self-taught, and actually used the computer as a tool to solve a problem.
I can see how a list like this could develop, but unfortunately it seems to have become the academic version of a job ad that wants 20 years of experience in every tool and technology ever invented. No-one would stand a chance of coping with such a broad and deep syllabus within the time frame of a degree course, but I worry that the target audience for this list won't necessarily realise that and so the current presentation could actually discourage people from studying CS.
I think part of the problem is having so much emphasis on specific tools and standards rather than general principles and theory. Hopefully CS students do pick up some familiarity with practical examples along the way, but I don't think that should be the priority for undergraduate level CS teaching. I'm particularly wary of putting so much emphasis on "old school" tools that aren't necessarily very good by modern standards but remain popular because of momentum and ecosystem effects.
As a result of that emphasis on tools, I'd say sound theoretical foundations seem to be given less weight than they deserve. Some examples of terms that I would have liked to see on any modern undergraduate CS syllabus or reading list are "comparison sort", "halting problem", "ACID", "transaction", "distributed system", "CAP theorem", "concurrency", "deadlock", and "lock-free", as well as a lot more specifics on basic data structures and algorithms.
- More on databases (including non-relational). We all need to know a lot of this for day-to-day work, yet it barely made the list.
- More on writing clean, readable code (this isn't on anyone's list, damn it!) How to name variables appropriately, how to comment, how to not write 150-character lines, etc. Fail them if they can't produce code that other students can quickly read.
- Corporate survival skills (read your email before sending, come to meetings prepared with what you need to say, don't criticize people directly especially in front of others, etc)
- More on concurrency
- No machine learning or robotics - these are specialties without much use for the rest of us (though they are fun)
- Reduce the number of useless but interesting languages
It seems like for every problem out there, at least some tiny bit can be solved by machine learning.
General pattern recognition (identifying peaks, predicting disk outages, etc.) at a minimum seems like a requirement that will show up in the next few years.
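Though to be fair, the baseline version of "identifying peaks" doesn't even need machine learning - a comparison against neighbors gets you surprisingly far (my toy sketch, not anything from the article):

```python
def find_peaks(series):
    """Return indices of local maxima: points strictly greater
    than both of their immediate neighbors."""
    return [i for i in range(1, len(series) - 1)
            if series[i - 1] < series[i] > series[i + 1]]

print(find_peaks([0, 2, 1, 3, 5, 4, 4, 6, 0]))  # [1, 4, 7]
```

The ML part comes in when the signal is noisy and "peak" needs a learned definition rather than a hard-coded one.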
I graduated 10 years ago and never used calculus, differential equations, linear algebra, or physics in my job.
I'm not saying this to say they aren't useful. But does anyone find them generally useful for their job? If so, what for? I'd like to know what I'm missing here, if anything. I don't want to go through a refresher just to forget it in another couple of years.
I left college about 12 years ago and I use physics and linear algebra nearly every day, along with trigonometry and a bunch of other branches of math.
The thing for me is that I didn't use it until 4 years ago and now I find myself going through the bibliography of classes I took more than a decade ago to try and 'remember' all this stuff.
As for what I use it for? Right now I'm doing a bit of 3D stuff in the browser (JavaScript and three.js). The library takes care of many details, but one of the first things I had to do was print a trigonometry cheatsheet (sin, cos, arctan...) and put it on the wall.
I'm not sure this is relevant to computer science, but as a mechanical engineer, I've used differential equations and topology quite a bit in my coding.
Topology has an important role in CAD and CAE software (particularly when dealing with manifold/non-manifold objects), where it's important to understand the relations that various geometric entities have with one another (whether you're coding a tool or using one). There is also a good amount of research currently being done on automatic CAD model simplification for analysis (FEM/CFD) based on geometric simplification (various methods exist, ranging from topology to Fourier analysis [1]). Meshing theory also relies quite a bit on topology. [2] (But I don't think anyone denies the importance of topology in CS.)
As for differential equations and calculus, they are heavily used in FEM [3] and CFD [4], and there is therefore a need to evaluate these equations numerically in an efficient manner. They are also heavily used in controls (for instance, in a quadcopter PID controller [5]). So in that case, differential equations describe something you then model in your code through numerical integration. Numerical methods, however, rely on understanding both calculus and differential equations, as a given numerical scheme may be better adapted to a particular type of DE than others [6][7][8][9] (sorry, couldn't find a comparison of these schemes in a single link, so I'm citing a few).
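To illustrate what "numerical integration" means in code (my example, not one of the cited schemes): even forward Euler, the crudest method, shows how a differential equation becomes a loop - and why understanding the scheme matters, since its error shrinks only linearly with the step size:

```python
import math

def euler(f, y0, t0, t1, steps):
    """Forward Euler for y' = f(t, y).
    Error is O(h): halving the step size roughly halves the error,
    which is why better schemes (Runge-Kutta etc.) exist."""
    h = (t1 - t0) / steps
    t, y = t0, y0
    for _ in range(steps):
        y += h * f(t, y)
        t += h
    return y

# y' = y with y(0) = 1 has the exact solution y(1) = e
approx = euler(lambda t, y: y, 1.0, 0.0, 1.0, 1000)
print(approx, math.e)  # ~2.7169 vs ~2.7183
```

Picking the right scheme for a stiff versus non-stiff DE is exactly where the calculus background pays off.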
Signal processing and statistical analysis may also be used in conjunction with differential equations, such as in turbulence modeling in CFD. RANS (Reynolds-Averaged Navier-Stokes) equations essentially average the NS equations over time in order to locate areas of varying turbulence strength [10]. LES (Large Eddy Simulation) filters small eddies out of the NS equations in order to be left with only the large eddies, thus requiring less computational power [11] (the filtered eddies are modeled separately for the sake of energy balance).
Finally, I would argue that calculus is also used in optimization code and in neural networks with gradient-based descent [12][13].
I hope I didn't go off topic with this post. All I was trying to say is that in some fields of computer science, you do have to come up with algorithms that evaluate differential equations using numerical methods, and an understanding of calculus and DEs is definitely a plus. I'm sure finance has its share of DEs to model, requiring the same understanding.
EDIT 2:
Optimization methods can be used to find optimal solutions to differential equations [A]. I talked about gradient descent earlier, but there's also the simplex method [B], which makes heavy use of linear algebra and topology to navigate a solution space in order to optimize a solution.
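For completeness, the gradient descent I mentioned fits in a few lines - here minimizing a simple quadratic (an illustrative sketch of the general idea, not tied to any of the cited solvers):

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Follow the negative gradient downhill with a fixed learning rate."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# Minimize f(x) = (x - 3)^2, whose gradient is 2(x - 3); the minimum is at x = 3.
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(x_min, 4))  # 3.0
```

The calculus shows up in knowing the gradient in the first place, and in understanding why a fixed learning rate can diverge on badly conditioned problems.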
Math and physics are beautiful and tasteful. So there's a liberal-arts-ish component here: much as you'd take a fine arts class as part of the liberal arts portion of your education, you study them to develop at least some minimal good taste and common sense.
It's a very vocational list in general and needs some liberal arts, not just to make life worth living, but to develop good taste as a worker's tool. There is a slight glimmer in the list where it's advised to learn a little technical communication. But learning a little graphic arts wouldn't hurt. Or a little political history might help navigate corporate structures. Or a little persuasive public speaking for those presentations. Or some art, so your diagrams and whiteboard talks don't look awful.
Yet going the opposite direction: if we're currently graduating degree holders with a very small minimum of liberal arts who nonetheless can't fizzbuzz, then this hyper-vocational, extremely large curriculum isn't going to fit, or we're going to graduate people who can't fizzbuzz, fail to integrate e to the x, and also fail to appreciate Hamlet in its original Klingon. I'm not sure failing even more generalist stuff is a better overall outcome. Maybe it could be a six-or-so-year vocational apprenticeship. Or maybe if high school education didn't generally suck. Maybe.
My two latest research projects required all of those things. More specifically, I tried (and failed) to model movement in an environment as a particle in an electric field. Defining this model required particle physics, and optimizing the parameters required differential equations. A sub-section of the project (not done by me) defined a grammar, which required some properties over a semiring, which I remembered from algebra.
Well, the first part of the project got scrapped, because it turns out people behave like a particle only under trivial conditions (or I modeled the problem wrong, that's also possible). The idea was that, since we know which target is attracting a pedestrian's attention, we could model both as a particle being attracted to a target (and being repelled by distractors). Whatever little remained of that idea ended up in another paper[1]. As for the grammar, the best I have is this paper[2].
I apologize for not having anything less dry to read than papers - as I'm not looking for another job, I don't feel the need to get off my ivory tower too often.
For the first 10 years of my career, I never used them. In my current job (working on hydrological modelling software), every now and then I wish I could remember high school maths.
Interesting that the list includes "not being a lone wolf", yet goes on to state you should be capable of doing pretty much everything necessary to complete a project on your own. If you're capable of working well in a team, you don't need to be able to do all the things in this list. You can hand off things to people who are better at them. That's a good skill to have. That makes you very employable.
A good team member plays to his strengths, but a wide breadth of ability is still valuable. It means you can understand what your teammates are doing. It's not requisite and sometimes is unreasonable to expect in multi-domain projects, but that doesn't mean it isn't valuable.
Similar to philosophies on management. A good manager does little hands-on work himself, but he will generally make for a better manager if he could hypothetically fill the shoes of each of his reports. Not that he should ever actually do that, but it means he understands what they are doing.
Every time I read one of these "what computer science should be" posts, I keep coming back to the old quote, "computer science is no more about computers than astronomy is about telescopes". I feel there are many computer scientists who wish they had studied engineering instead, but assume the career is the one that needs to change. This author IMHO tries to solve that by making sure that CS contains everything, which I think is a bit too much.
I think I'd cut most of the more technical subjects (robotics, computer graphics, system administration), moving them to an elective basis. I have no issue with a Comp. Scientist that doesn't know Racket or Smalltalk, but if I don't see formal proofs there will be blood.
Having produced a laundry list of skills ranging from implementing cryptography protocols through being an expert UX/UI designer, skilled in robotics, AI, data visualization, and having proficiency in 11 programming languages across 3 paradigms (not counting the 12th, which one should implement themselves), the author humbly notes that "My suggestions are limited by blind spots in my own knowledge."
This upsets me.
"CS Majors" (as described in this article) should first learn that life is not a checklist, but rather a random sequence of experiences & events.
Just be passionate in what you do, no matter the field you are in.
I would agree that four years is an optimistic timetable to learn everything here to the degree that you would want to.
I also disagree on the relative ranking and importance of the items, but I don't feel experienced enough to really argue with them on the basis of my knowing better. Maybe in a few years.
I think a few of the points could be expanded. For example 'Robotics' gives the wrong impression and I'd say that you can get a lot of conceptual bang for your buck just from understanding the operation of basic electronics components like capacitors. It's not necessarily 'computer science' in the strict definition but being able to get an idea of how electronics and hardware items work is frankly more powerful than some of the items on this list like being able to make pretty visualizations.
Moreover, in that long list of cool languages, not a single one was a scripting language like Ruby, Perl or Python. Having one of these in your toolkit is pretty much essential and it feels like a glaring omission.
I dislike how these articles make it seem like you need to be a super-human coming out of college. You don't. As long as you are perceptive, intelligent and can learn on the fly you'll be fine going into most entry level jobs in the industry.
As far as the list itself, it is highly debatable. Depending on your job you won't ever need to know most of this.
> As far as the list itself, it is highly debatable. Depending on your job you won't ever need to know most of this.
Any comprehensive list of "things to know" will include things that not everyone needs to know, but it's still important to know "of" a lot of things. That way, when you encounter a new problem, you'll be reminded of something you heard about, but don't really know, and can look it up on Wikipedia.
Yeah. My general objection to this article would be that, for the dedicated autodidact, you'll be better off taking this advice on a longer timescale and as a guide instead of a checklist you need to cross off before you're Minimally Qualified. In particular, be sure to optimize the curriculum towards your own interests and goals instead of prioritizing somebody else's recommendation by fiat.
I earned my CS BS degree almost 30 years ago. My school kind of served as a training ground for mainframe programmers. We had plenty of assembler and COBOL, but not much in systems architecture and design.
There is a lot that has become much more relevant since then: Security, networking, and distributed systems. But if I could have taken one more course, I'd really like to have studied graph theory.
Agreed. As computing professionals we should always be learning. Whether that is graph theory, a new programming language, or anything else for that matter.
On the subject of technical communication, formal oral presentations are not enough -- frequent practice engaging in short defenses of small ideas would benefit students immensely when it comes to working on teams in industry.
"Scala is a well-designed fusion of functional and object-oriented programming languages. Scala is what Java should have been."
The only thing I disagree with. Scala is the only language I know of that several startups have decided to give up on. How can you call C++ a necessary evil and recommend Scala, which is basically the JVM's C++?
Anyway, that list is pretty long - I don't think I know any CS major with that skillset, or even half of it. You'll still be employable, don't worry - but it's a good long-term guide.
I've gotten to the point where I won't write new JVM-based code in Java anymore. I think many companies that have tried and discarded Scala used too much of it. It has a lot of advanced features, and I'm not particularly good at using most of them (and some of them I probably don't even know exist).
If people start off thinking of Scala as simply a "better Java", and only move to use its easier-to-understand features, I think it'd be a lot easier to hire people to work on Scala codebases and/or train people in the language.
You mention the key points - in order to use it successfully, you have to agree to use a common, safe subset of the language (e.g. no implicits), similar to C++. You then have to make sure everyone adheres to this, if possible by using automated tools. As a startup, you often don't have the luxury to be able to invest in that kind of infrastructure. You have to train people to use the language correctly, which is a huge time sink. Onboarding still is a huge problem.
Backwards compatibility is another big factor that has bitten people once too often. Compiler bugs, slow compile times, and a bad community attitude come on top of this.
>> If people start off thinking of Scala as simply a "better Java", and only move to use its easier-to-understand features, I think it'd be a lot easier to hire people to work on Scala codebases and/or train people in the language.
Yes, absolutely, and I agree it can be used effectively in that case. But the language attracts people who like to think of themselves as wizards and want to master the advanced stuff, too. After all, the whole thing started basically as a vehicle for Martin Odersky's papers, implementing every paradigm known to mankind.
Considering all those downsides - is Scala still worth the tradeoff? I do understand the appeal, but there are so many languages out there right now. You need a stable and reliable platform to base your business on, and hearing - for several years - about people moving away from it because it is lacking in that regard seems like something a responsible CTO should not ignore.
This is the reason language wars are looked down upon on HN - someone's going to feel offended. It's pointless. But be reminded that ad hominem is not welcome here either. You are welcome to add to the discussion if you are knowledgeable in the subject. You have not done much to factually counter any arguments brought up so far, and if your goal is to persuade anyone of your contrary position, you have done that goal a disservice.
This is not about being "offended". This is me being bored to death by the umpteenth dude – you – thinking that repeating random stuff written on the internet adds anything to the debate.
It has made people shake their heads 5 years ago, and it hasn't changed since then.
There is no need for persuasion. Those who are motivated enough to try the language can discard the nonsense on their own, and those who keep spewing the same old and tired stuff are not the people I want to deal with anyway.
5 years ago, Paul Phillips was still a core contributor to Scala. Anyone seriously considering Scala should listen to his talk, titled “We’re doing it all wrong”, and take a look at Policy, his fork of the Scala compiler.
In hindsight I shouldn't have posted this, language wars never lead anywhere. For an overview of why some people deem it unsuitable for big, long-term projects, this thread is a good starting point: https://news.ycombinator.com/item?id=9393551
A lot of this stuff isn't strictly necessary for many people. But there is a very distinct change in the quality of your thoughts and approaches to solutions after learning advanced mathematics and the theoretical concepts that inform modern systems and abstractions. For instance, Haskell is far easier for someone with prior experience with discrete structures, category theory, and recursion than for someone coming from a purely object-oriented background. Both will be able to write effective code, but there's no question that the mathematically inclined individual will write more elegant, more concise, and qualitatively better code than the other.
Being self-taught, different experiences bring different problem-solving approaches and not _everyone_ has the need to know Haskell or Racket to self-identify as a programmer.
This is exactly the kind of article that prevents people from considering a software engineering career.
Now I have one more anecdotal confirmation for my hypothesis that all advice is bullshit.
I notice that the section on Networking has nothing to do with building cross-company relationships with your peers and future business contacts.
Nothing is said about interviewing or salary negotiation.
Suspiciously absent is the section about how to balance doing something correctly against doing it profitably--or how to stop arguing with your manager before you get fired.
I have to assume that someone majors in computer science as a means to land a job in the field. We all know that hobbyists don't need the expensive degree credential. The vast, overwhelming majority of work now available for software professionals is to take someone else's shit code and make it work better. A very tiny fraction of it is producing all-new code that is done right the first time. As a new grad, you are probably not going to get that work, and even if you do, your inexperience in the real world will probably still make it shit code that someone else will have to improve.
You do need a resume, even if you have a stellar portfolio, because the majority of available jobs are filled by people who don't know what a github is, or how much their mechanic charges to fix it.
It is blindingly obvious that the writer is an academic, who has never held a developer position in a startup, a mid-cap, a megacorp, or a government.
In the real world, the only skill you absolutely need is to learn exactly what you need to be productive. You don't just make yourself "T-shaped"--you also cultivate the ability to make yourself "Pi-shaped" by quickly growing a new stem of deep knowledge.
Prefacing this: I am only 20 years old, so I have some time to figure it all out, but I feel like I am missing out a fair bit by not pursuing a degree in computer science. A lot of the content, especially the focus on algorithms, seems really interesting to me.
Straight out of high school I did not have the math grades for CS, so I went to the American equivalent of community college and am on track to get a diploma from an IT-related course that's basically going to amount to a web dev job in the future. If I decided to go that route for, let's say, 5 years to save some money, would it be a hindrance career-wise, because of a large time gap in the resume, to take the 4 years off to get a CS degree from a decent school, say Waterloo?
The biggest point of fear is that I would not be able to complete the degree, realizing it was too far above my cognitive abilities.
Time spent in university isn't usually seen as a gap. Most employers would admire that you chose to go back to school so you could really understand the fundamentals.
Right now, I'm prepping for a Computer Science degree. I find I just can't study mathematics at a pace that's as fast as a Uni course goes. I really find I need to understand the underlying concepts of the math, so it takes me a bit longer but I seem to be able to hold a lot more info, and my progress is getting quicker.
As an example, it took me some time to grasp that the determinant of a square matrix measures a signed volume: a 1×1 matrix gives a length, a 2×2 matrix gives the area of a parallelogram, and a 3×3 matrix gives the volume of a parallelepiped. After that it became easier to understand how to get the determinant of an n×n matrix.
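That area interpretation is easy to check numerically. Here is a minimal sketch in Python (`det2` is just an illustrative helper, not from any library): the absolute value of the 2×2 determinant matches the area of the parallelogram spanned by the two row vectors.

```python
# The determinant of the 2x2 matrix [[a, b], [c, d]] is a*d - b*c.
# Its absolute value is the area of the parallelogram spanned by
# the row vectors (a, b) and (c, d).

def det2(a, b, c, d):
    return a * d - b * c

# Rows (3, 0) and (0, 2) span a 3-by-2 rectangle: area 6.
print(abs(det2(3, 0, 0, 2)))  # 6

# Shearing the second row to (5, 2) keeps the same base and height,
# so the area (and the determinant's magnitude) is unchanged.
print(abs(det2(3, 0, 5, 2)))  # 6
```

The shear example is the geometric reason row operations of the form "add a multiple of one row to another" leave the determinant unchanged.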
Same with vectors and trigonometry.
I'm finding that self study with a little help from Google is really paying off!
> I've tried to answer this question as the conjunction of four concerns:
> ∙ What should every student know to get a good job?
> ∙ What should every student know to maintain lifelong employment?
> ∙ What should every student know to enter graduate school?
> ∙ What should every student know to benefit society?
I'm having difficulty articulating why, but I can't help feeling that this approach goes against the idea of education as a good in its own right; that the things that every computer science major should know are those things that are fundamental to the field, defined without reference to e.g. whether it'll get you a good job.
My degree is history, and it would be difficult to provide any sensible answers to those questions for that subject, except perhaps #3. There is an argument that studying the humanities provides transferable skills in critical thinking, which I would dispute; apart from the importance of questioning and evaluating sources, and a large pool of counter-examples for many arguments of the form "people have always done X", most of my (limited) abilities in critical thinking derive from my amateur digressions into science and logical fallacies. The only quantifiable benefit to society I can identify is the nuggets of interesting information I can sometimes throw into discussions and comment threads (my favourite being "Gandhi wasn't a pacifist and the British were bombed out of India", which I don't get to use often enough).
But a society without students of history (or French literature, or ancient Japanese ceramics) would be poorer; so would one without computer scientists, even if the field had no practical applications. Human knowledge advances best with a broad and deep body of ideas to bounce off each other. (No, I can't provide a concrete defence of that statement.)
The things every computer science major should know are those things that, if he or she did not know them, would make it ridiculous to call him or her a computer science major. Ditto for every other formal field of study.
I am really happy that, with an industrial engineering background and thanks to all the resources I have been able to find online (Stack Overflow and blogs like this), I have been able to follow all of these steps independently to become a programmer.
The things I'm still missing (and trying to improve) have to do with managing servers and parallelism. These are highly advisable, but maybe too daunting to include in a list meant for beginners...
The author is trying to shove too many things into a CS curriculum that just don't belong there. CS is for studying the theory and nature of computation, so why is cutting and crimping a network cable on the list? System administration, user experience design? No way. This isn't a vocational program to get someone a job, it's Computer Science!
What is a good book for C++11 and beyond? I took a lot of C++ before '11 was implemented, but have been focused on other languages recently, so I feel like I'm a bit behind the modern development. The references he points to are all 10+ years old, so I strongly doubt I'll find any C++11/14 in them.
Concerning visualization I'd rather recommend Wilkinson's "The Grammar of Graphics" instead of Tufte's "The Visual Display of Quantitative Information", since the former contains much more practical examples and advice.
Thanks for the reference. I was rather disappointed with Tufte's works, given all the hype. There were too many specialised "arty" graphs, and not enough standardised methods.
This is a lot more practical, hands-on, and engineering-focused than most CS degrees, which are more about math. Sysadmin is really IT; it isn't CS at all.
Fail article... no, you don't have to know all these things. Mathematics for 3D Game Programming and Computer Graphics??? OMG! That's totally not what everyone should know; that's a very "narrow" field of software engineering, suitable for those who really like it... jeez, just don't read it.
That's a very solid book that deals with things not just restricted to 3D programming. It covers vectors, matrices and linear transformations, which are useful in other areas of Computer Science.
Although an excellent book that covers these things is The Manga Guide to Linear Algebra, by O'Reilly.
> Although an excellent book that covers these things is The Manga Guide to Linear Algebra, by O'Reilly.
Thanks for that recommendation! I'm reading the sample chapter now, and it's a great review. I'm not a huge manga fan but this is a nice way to break up complex ideas.
It's not engineering, though it's also not quite not-engineering either.
A lot of terminology and philosophy from engineering carries over to CS, but for most applications, the quality and thoroughness of real engineering does not, nor is it needed.
Computer science, or software engineering if you prefer, is a strange hybrid of a lot of things. On the pure development side, there is a lot of engineering-like stuff going on. In academia and research, there's a lot of science. And everyone is doing heaps of math. Unfortunately, English doesn't have a good word that encompasses all this, so we end up with poor terms that each only describe parts of the whole.
As a software practitioner, I find it hard to believe the distinction is a question of degree of quality and thoroughness.
CS largely involves the design and implementation of machines and automata, in many cases (self-learning ones, for example) far more complex than those of "real" engineering. I can't see how this is not engineering, so I'll have to respectfully disagree.
Some CS involves that. Some involves fitting pieces together, or following an existing pattern: something more akin to carpentry than engineering.
My point was not to denigrate CS, but in fact to say that it is something larger than just engineering. Sometimes what we do is engineering, sometimes it's science, sometimes it's math, and sometimes it's carpentry.