If you just teach yourself, you'll likely miss lots of topics and won't challenge yourself as much.
Conversely, I've met plenty of students who don't really care about the topic and are just taking the class to get a job.
So do both! Take the class and teach yourself. In other words, pursue your hobbies and use code wherever those hobbies call for it. Learn the things you need to achieve your hobby goals, all while taking CS classes and, as much as possible, making those two things overlap.
A big clue going in, I suppose, was that the CS degree entry requirements demanded that applicants have mathematics A-Level (i.e. to have spent, under the typical UK system, about a third of their age 16-17 full-time education exclusively on mathematics). It was in no way intended to be, and clearly was not, a degree teaching people how to be software engineers or how to program beyond being able to put the logic, in the chosen notation, on the screen.
This was back at around the turn of the millennium.
Mostly it informs the student of their current level of understanding. 70% of lecturing is marking assignments. That’s what you don’t get when you teach yourself.
Honestly, it needs to be. Employers are not going to be impressed with grads who've not encountered basic things like version control by the time they've graduated, and it's irresponsible to market a degree as a stepping stone to, e.g., a software engineering job without covering these basics properly.
The theory is still incredibly important - mathematics, algorithms, performance & time complexity, data structures etc. It underpins everything we do in industry. But there really needs to be some element of practical software engineering too because the vast majority of the UG cohort are there to get a job.
That's true now. It certainly wasn't true in the past; especially with larger companies who would hire people with the intention of training them.
I wonder if there will be a bifurcation in education. To some degree (excuse the pun), there already is; software engineering "boot camps" and the like that teach software engineering rather than CS. Maybe we'll see that split reach upwards to the formal education level; a more vocational college for the software engineers, a more mathematical alternative for the informatics enthusiasts.
Nothing beats real world experience, and the sooner you can put theory into practice, the better.
Additional learning on your own is, of course, also a great idea. Pick a project you’d like to do, and then do it. Or find an interesting open source project, explore the code base, and try tackling a bug report or two.
This is effectively what I did. I found a job at a small company programming part time while I was still in school. Class was very much this is how things should work, and the job was this is how things actually work when you're coding for clients with unrealistic schedules just trying to make rent. I learned a lot in both scenarios.
The other thing I would add is to take additional classes outside of CS/math. Most people in CS are there because they live and breathe computers. Getting exposure to other subjects is how someone can stand out and/or get that first idea for their own business.
This description comes with an important disclaimer. You learned how things worked at a particular company and for the one or more projects that you were assigned. There's no reason to think any of the knowledge you pick up in a specific job will be valued outside the company, or even in a different team within the company. The PHP you learn at one company won't help with a different company using the latest JS framework, and the skills you learn at a small company might not even get you an interview at a large corporation.
This is not to say there is anything wrong with your answer. I see similar statements all over the place, but seldom does anyone clarify the part about how specific those skills are. "I have a degree in CS from Stanford" is not specific at all.
I learned to code while in high school (in 1981), and had summer internships in a computer facility. My mom was teaching CS at an extension campus of a state university at the time. She was the one who didn't think a CS degree was necessary for becoming a programmer. Her students were getting good jobs after 1 year in her class. And the programming work that I observed at my internships seemed like it would be ultimately boring.
So I went to a liberal arts college and got a double major in math and physics. At the college, it seemed like the people doing "cool" things with computers were in the physics department. I kept learning more about programming, and also taught myself electronics. And then I went to grad school in physics. Granted, this was all before the Interwebz.
A lot of people whom I've worked with over the years were programmers with degrees from all over the place: Natural science, philosophy, music, and so forth. The programmers with non-traditional backgrounds tend to be happy to be earning more than they were in their original fields, but also tend to do non-traditional work. At my workplace, any job involving math goes to a handful of "math people" who happen to have math or physical science degrees.
Survivor Bias Alert: I can't say that I discovered a magical career formula. Many of you are vastly richer than me. I'm reminded of the saying: "You go to war with the army you've got." I don't know if I can ever be self disciplined enough to sit in front of a programming screen all day. I do a great deal of programming, but my background allows me to do enough non-programming to keep me from losing my sanity. And my interest in electronics supports a fairly rewarding side-business.
Becoming a programmer at age 19 isn't a panacea. My friends who did that, many of them were swept into the vortex of alcohol and consumer debt. A couple didn't survive. For some, going to college is a way to break away from their hometown culture, and try a new life.
A lot of people (though not everyone) have the opportunity to go to university (typically only once in their lifetime), to learn more about the mysteries of the universe, or about complex humanities, or about our history, art and anthropology, all of it from the leading experts in those fields. The fact that so many people seem to end up doing their subject because they think that's what they need to get a job to start a career they may or may not even have 10 years down the line seems a little sad to me.
I've seen too many people, both when I was in school myself and subsequently when interviewing recent grads, who knew all the theory but could barely pass a FizzBuzz test. You just don't have to write enough code in most CS programs to become competent at it, unless you also do it for outside reasons.
That being said, personally I thought for a long time that I didn't need to study CS and could teach myself to code (which is true), however I do notice that sometimes I lack a deep fundamental understanding of CS, which I regret.
Or should people who don't do coding related hobbies not do CS? Which is a special requirement just for this major. People in civil or biomedical engineering don't apply their majors to their hobbies.
I wouldn't be surprised if you could apply modeling and wood quality simulations into a software program to help your woodworking, too.
My hobby is fiction writing. But hey, Scrivener exists for god's sake.
Or you can tackle the really hard problems if you want. Make a CNC lathe for wood working. Maybe create an app to layout your cuts in the most efficient way.
I don't know enough about crochet to suggest anything... but I bet there are problems to be solved.
As far as why you wouldn't use an already-existing app? Why would you build a birdhouse from scratch when you could just go buy one?
E.g. Designing Weave Patterns Using Boolean Operations... https://www2.cs.arizona.edu/patterns/weaving/webdocs/gre_bol...
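To give a flavor of the Boolean-operations idea (a toy sketch of my own, not the construction from the linked paper): a weave draft is just a grid of over/under decisions, and simple Boolean conditions on row and column indices already produce recognizable fabrics.

```python
# Toy weave drafts via Boolean conditions on (row, column) indices.
# A cell is True where the warp thread passes over the weft.
# Illustrative only - not the method from the linked paper.

def plain_weave(rows, cols):
    # Plain weave: over/under alternates like a checkerboard (XOR of parities).
    return [[(r + c) % 2 == 0 for c in range(cols)] for r in range(rows)]

def twill_weave(rows, cols, shift=1):
    # Twill: each row's over/under run is offset, producing diagonal lines.
    return [[(c - shift * r) % 4 < 2 for c in range(cols)] for r in range(rows)]

def render(grid):
    # '#' = warp over, '.' = warp under.
    return "\n".join("".join("#" if cell else "." for cell in row) for row in grid)

if __name__ == "__main__":
    print(render(plain_weave(4, 8)))
    print()
    print(render(twill_weave(4, 8)))
```

Swapping in other Boolean combinations of the indices (ANDs, ORs, different moduli) gives satins, herringbones, and so on.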
I started working on web development before my CS classes started, back in 2001. Web was just not part of the curriculum, and when web related classes were introduced, they were using Java for the job. I was working with PHP, so I absolutely hated the idea that I’d have to go home to learn to code in a different language (because the professor didn’t teach it at all) to be able to complete the homework, while I could be learning more about what I could actually use in my daily job. When all the agile trend started, TDD, etc. it was even worse. Professors were stuck in the past or in academia, not having a clue of what was happening in the market.
Eventually I dropped out.
I agree with this; I didn't take this approach and I'm regretting it (in a way). I'm in the computer science camp and just trusted that my education in it would be fine. Instead, I studied other things on the side (working as a TA and studying psychology were my side projects, and in my final years web dev, which is how I know what I missed in my early years).
The trade-off is a tough one though: studying psychology and being a TA made me a much more well-rounded person. I have so much more confidence in my social skills. I would not have gotten that if I'd taught myself coding on the side as well, not even if I'd done freelance projects.
Instead, when I learned them I was bored out of my mind, and it was such a shame that I was. I shouldn't have been bored; I should've been fascinated, because it is fascinating. Unfortunately, I couldn't see it at the time.
But hey, at least now I can!
Because, just like computer science, psychology taught me something formal that I needed to learn but wouldn't have been able to learn on my own, which is: how to detect nonsense when people talk about the science of human behavior. I needed a lot of formal training on human behavior, so knowing how to distinguish a good vs. a bad learning source is of paramount importance, especially because only about 1 in 20 resources is decent and the other 19 will make you worse off.
Because it taught me general industry concepts simply by reading the front page, while also keeping my interest high in other topics.
In short: IN GENERAL - Those with SE degrees (or other non-CS degree, or no degree) can be excellent developers (good code style, good unit test coverage, aggressive PR reviewers, etc).
However, the art of computer programming - or computer system design -- is often in formally defining the problem at hand using computational structures (such as graphs), understanding the properties of that model, and creating a new algorithm over that model to solve it. Then developing a software system to mirror the properties of that formal system you designed.
Those who did not spend dozens-to-hundreds of hours agonizing over their coursework in algorithms, abstract algebra, linear algebra, and "foundations of computing" (i.e., computational complexity theory, computability theory, language hierarchies, etc.) can, I've found, lack this critical aspect of system design. Instead, they design systems that lack unity and are often simply a collection of routines to handle various inputs, rather than having the knowledge to think "Hm, maybe this whole thing I'm working on is a special case of a well-studied problem." What this means practically is that there is often a simpler, better-defined, and more testable solution lurking behind the veil of ignorance.
Sure, there are many many teams and projects for which the scale or complexity are not sufficient to warrant some kind of formal analysis and modeling -- but many do.
For as much time as is spent in school on theory, one still needs to spend (at least) an order of magnitude more time going deep in a theoretical domain to understand it well enough to apply it competently. The difference in the amount of self-study required to grok many classes of complex systems design between someone with a CS degree and someone without falls below the noise floor. No matter what, you will need to do inordinate amounts of self-study to acquire theoretical competency.
A CS degree is essentially a theoretical survey course. It tells you these domains exist and gives a little bit of high level context, but if you want to become an expert in those theory areas then you are still learning almost entirely on your own and/or from other experts on the job.
I am reminded of PG's famous example of the "Blub" programming language. Here's the relevant quote:
> As long as our hypothetical Blub programmer is looking down the power continuum, he knows he's looking down. Languages less powerful than Blub are obviously less powerful, because they're missing some feature he's used to. But when our hypothetical Blub programmer looks in the other direction, up the power continuum, he doesn't realize he's looking up. What he sees are merely weird languages. He probably considers them about equivalent in power to Blub, but with all this other hairy stuff thrown in as well. Blub is good enough for him, because he thinks in Blub.
In the above, the "Weird Language" can also be a formal modeling of the problem domain and solution.
(Note: He should not be presuming the developer is "he")
While a good manager should be slotting the right people into the right projects, an important meta-skill for all professionals of whatever background is self-awareness of your own skills and weaknesses and using that awareness to play an active role in determining where you work and what you work on.
In almost all these cases, developing a simple model grounded in some kind of common CS formalism, playing with the idea, and then explicitly constructing the system around that model would have made the application much more understandable, with a lot fewer mysteries, and reliably used by the customer in much less time. Even small systems containing simple logic, with a few input and outputs, can yield extremely complex emergent behavior (Game of Life being first obvious example).
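On that last point, the Game of Life makes a nice demonstration: the entire rule set fits in a dozen lines, yet it yields oscillators, gliders, and even universal computation. A minimal sketch in Python:

```python
# Minimal Conway's Game of Life: tiny rules, complex emergent behavior.
# Live cells are stored sparsely as a set of (row, col) coordinates.

from itertools import product

def step(live):
    # Count live neighbors for every cell adjacent to a live cell.
    counts = {}
    for (r, c) in live:
        for dr, dc in product((-1, 0, 1), repeat=2):
            if (dr, dc) != (0, 0):
                counts[(r + dr, c + dc)] = counts.get((r + dr, c + dc), 0) + 1
    # A cell is alive next tick if it has 3 neighbors, or 2 and was alive.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

if __name__ == "__main__":
    blinker = {(1, 0), (1, 1), (1, 2)}  # a period-2 oscillator
    print(step(blinker))        # flips to a vertical bar
    print(step(step(blinker)))  # back to the original
```

Three rules, one function, and the behavior of even a small starting pattern is genuinely hard to predict without running it.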
I have met many computer science majors who are incapable of writing software to save their lives, let alone design complex systems.
I also think that if you get hands on experience first, you learn to appreciate the science later on and therefore learn it with more gusto.
You are severely underestimating the number of corporate/enterprise/SAAS CRUD jobs out there that don’t have to do anything that complex.
I arrived at university having already worked for a couple years in open source and a bit of contracting. I worked as a professional programmer throughout my undergraduate degree. I was bored by most of the introductory programming courses. The liberal arts, being around other young adults, the theoretical CS parts, the electrical engineering bits, and some of the project work were the only valuable parts, but that was plenty of value to me.
Going to university to learn to program is like going to university to learn refrigerator repair or how to play the guitar. Vocational and on-the-job training will go deeper into the craft. If you only want vocational training, go to a vocational school, get a private teacher, or teach yourself. Don’t go to university just to learn a skill. You certainly can, but it’s not an efficient way to do that. People study music in university to become better rounded musicians, not to learn an instrument. Same with CS.
I know right? Stupid plebeians expecting to get something that will help them earn a living out of spending three or four years learning something and foregoing the earnings they could have made in that time. If you don’t know that university is for signaling your social class rather than learning skills that will be useful in earning a living why are you even there?
At least the nerds who think university is about learning don’t have the stink of trade about them.
If that’s your issue, I completely agree with you. That’s why, in my comment, I actively encouraged people to do vocational training (boot camps) or teach themselves. If you want to learn to program, those are much better ways than to do a CS degree.
That doesn’t make CS worthless. It’s just different. The CS education provided by a university isn’t meant to train you to be a programmer. It’s broader than that. And it’s also, in some ways, deeper because it dives into CS topics that you won’t ever need for work.
I think that’s totally fine. The vocational and university systems have coexisted for a long time and serve different purposes. Pushing people into the university system that are better served by vocational training is an issue. But throwing away the university or pressing it to become purely vocational training is, I think, both unnecessary and deeply harmful to society.
CS isn’t worthless. CS is awesome, just like Physics or Philosophy, for those who care about them. Those who have any interest whatsoever in intellectual pursuits are at an absolute maximum 10% of the population. Many more need to get a job and for many of them a university degree is an almost absolute prerequisite. To suggest they forgo it is to do harm to them because there’s the possibility they might listen.
As a society, almost everywhere in the first world, we overspend massively on education, because while it may be socially wasteful it’s individually rational. It’s a signaling arms race.
Universities have never been primarily for those who had a burning interest in the topic matter. If they became that, forsaking their roots as upper-class vocational schools and finishing schools, they would be vastly smaller than they are now, in terms both of faculty and student body.
My thanks for such a cordial reply to an intemperate message.
If your goal is to maximize your earning potential, there's far more effective ways of spending 3-4 years.
So you’d either have a graduate studies only university, like the European University Institute in Florence or Rockefeller University or a research institute like the Max Planck Gesellschaft, RAND, SRI International or the Institutes for Advanced Study.
The idea that research is even a part of the core mission of universities is at most 200 years old. They’ve always been trade schools for theology, medicine and law, and latterly finishing schools for the upper classes, but the research university emerged in Germany with Wilhelm von Humboldt.
Most people don’t decide upon goals and then look for the most effective ways to pursue them. They look for the socially approved and known good ways of getting what they want. For the huge majority of university students that means they’re at university because they need a degree to get a job. People with money and people who attended selective universities may think otherwise because they have other or better options but most people know that if they want a decent, respectable living they better get their certificate of middle class membership, their Bachelor’s degree.
Outside of maybe Caltech, no university sends the majority of its students on to graduate study aimed at producing researchers. They are funded by governments whose voters would be apoplectic to be told education was an afterthought to faculty research. The median college graduate might be capable of doing a Master’s degree, but there is no way more than 10% of those who matriculate as university students are capable of becoming researchers, if that.
Whatever universities might be, if their purpose is research then they’re a terrible waste of resources.
What makes you think that the best way of training future researchers isn't having the current researchers train them? This is the model currently used across most universities I think. As part of my degree, I was taught entirely by people who are active researchers in their fields.
> For the huge majority of university students that means they’re at university because they need a degree to get a job.
Yes, and I'd argue this is a negative trait. People should be going to university because they want to learn a thing, not because it's a necessary hoop to jump through to work in a field (unless that field is academia). Fields that require specialist education (law/medicine/etc) already have specialist institutions that do this vocational training.
> The median college graduate might be capable of doing a Master’s degree but there is no way more than 10% of those who matriculates as university students are capable of becoming researchers if that.
But how does the university find that 10%? You presumably can't select effectively, so what better way than to run a 3-4 year program for those who are interested to see if they remain interested enough and are good enough to become a researcher. It doesn't have to have a high conversion rate to be effective.
Sorry, but this assertion is so incredibly wrong. At Warwick, where I did my undergraduate degree, 95% of students in my CS cohort were in a professional job 6 months after finishing the course - they're not staying around to do research! The courses are marketed based on earnings potential after graduation. Research grant income is dwarfed by tuition fee income. That's just how it is.
It's just laughable to suggest that universities aren't interested in teaching.
>If your goal is to maximize your earning potential, there's far more effective ways of spending 3-4 years.
I agree with this and it's one of the regrets I had.
This is in no way contradictory to my original assertion. Of course universities recognize that they're going to lose people after their undergraduate course is finished. But more good-quality people applying means more good-quality people on the course, which ultimately means better researchers. Obviously they'll market based on post-degree earning potential, because that attracts better people.
Given a choice between an undergrad leaving for the real world and staying to do a postgrad, they'd always prefer the latter. Warwick would love that 95% figure to be 90%, or lower. An undergrad course where 100% of people left academia after they had completed it would not survive long.
> It's just laughable to suggest that universities aren't interested in teaching.
I worded this badly, but this is not what I suggested. Teaching is a core part of research, because teaching is how you get better researchers. My point is that universities are not interested in furnishing software companies with better quality developers.
Mind you this was 15 years ago, but I came out of it with a decent enough working knowledge of C, C++, VB.Net and Java, and a basic understanding of SQL, PLScheme and functional programming. On top of that there was a solid year of data structure classes.
Trade schools have been largely demonized and shut down in the last couple years, but I can say honestly I would not be where I am today without it.
I never would have cut it in college; I did really mediocre work in grade school and had trouble focusing on things I didn't care about. However, in trade school I flourished, largely because I actually cared about what I was learning. I graduated with a 4.0 and I make six figures now.
I genuinely see college requirements as a horrible way to filter useful but unbalanced people like myself out of the work force.
I would not recommend this career track if you're the kind of person who wants to climb a career ladder and go into management on the business side of things; there are definitely useful skills to acquire on the legal and soft-skills side. Having any degree will just check a box and help people without a clear view of what they want out of the future. It can also help if you want to put yourself through being just another number at a FANG organisation.
Too stubborn and arrogant to allow someone to tell me I'm incapable of being a success without a piece of paper.
- Time and space complexity (Big O, Little O, big Omega)
- Algorithms and data structures (linked lists, heaps, trees, graphs, search algorithms, sorting algorithms)
- Artificial intelligence (neural networks, evolutionary computation, decision trees)
- Electrical engineering logic gates and circuit analysis (gives a good basic understanding)
- Discrete maths (De Morgan's laws)
- Database design
The database design course I did was heavily centered around relational databases but some of the core concepts of this course ended up being very valuable for any kind of DB and also for areas outside of the DB. It taught me a lot about useful software engineering concepts such as identifying functional dependencies in the data, achieving good separation of concerns, establishing a clear data flow, one-source-of-truth principle, indexing of data, etc...
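To make the first of those concrete: a functional dependency X → Y holds when no two rows agree on X but differ on Y, and checking a dataset for violations is only a few lines. The table and column names below are made up for illustration:

```python
# Sketch: checking whether a functional dependency X -> Y holds in
# tabular data. The rows and column names are hypothetical examples.

def fd_violations(rows, x_cols, y_cols):
    """Return triples (key, first_value, conflicting_value) for rows
    that agree on x_cols but differ on y_cols."""
    seen = {}
    bad = []
    for row in rows:
        key = tuple(row[c] for c in x_cols)
        val = tuple(row[c] for c in y_cols)
        if key in seen and seen[key] != val:
            bad.append((key, seen[key], val))
        seen.setdefault(key, val)
    return bad

if __name__ == "__main__":
    orders = [
        {"order_id": 1, "customer": "ada", "email": "ada@example.com"},
        {"order_id": 2, "customer": "ada", "email": "ada@other.com"},
    ]
    # customer -> email is violated: same customer, two different emails.
    print(fd_violations(orders, ["customer"], ["email"]))
```

A violation like this usually signals either dirty data or a schema that needs normalizing (the dependent attribute belongs in its own table).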
The other day one of our network guys asked me what a Red-Black Tree is. I couldn't explain it. I got an A in my algorithms and data structures class, but I haven't used it so I forgot it.
1. Beyond a few general requirements, most CS programs do let you focus on what you’re interested in, and to much greater depth and pragmatic application than trade school. “Focusing on things you don’t care about” is a very minor part of college technical programs unless you make up your mind to solely care about some small niche.
2. “I never would have cut it in college” is not a free pass from needing a well-rounded education, appreciation of art, literature, business, ethics, etc., and good writing, communication and organizational skills well beyond what’s taught in trade school. This is becoming increasingly critical in business software settings, where developers are often working directly in tandem with product managers.
I could see trade school working for very specific niche areas, like DBA for a single specific enterprise database system, or breadth of knowledge in devops compatibility across multiple cloud vendors.
But I can’t see trade school as a reliable way to get necessary experience for general software architecture applied to business problems and fundamentals for the social coordination of multi-system software projects.
You personally might have had the raw talent and gumption to parlay trade school into that sort of career skill, I’m just saying it’s not likely to work out as well for a random student in today’s job market.
Not that few. Here's what mine has:
- Writing x 2
- Discrete Mathematics (a few concepts in there are really needed - the rest is mostly not)
- Public Speaking
- Humanities x 4
- Calculus x 2 (I doubt even 10% of SW folks use this)
- Linear Algebra (as few as calculus use this)
- science course (e.g. chemistry, bio, phys - you pick) x 2
- Non tech seminars x 2
- Probability and Statistics (I'd argue one should take this, but I'd still wager less than a quarter of developers use this)
What's worse: the majority of the above are in the first two years, so if you don't have the discipline to do courses you feel are irrelevant, you just won't get to the fun stuff. Only 6 out of the first 20 courses are related to computers.
> “I never would have cut it in college” is not a free pass from needing a well-rounded education, appreciation of art, literature, business, ethics, etc., and good writing, communication and organizational skills well beyond what’s taught in trade school. This is becoming increasingly critical in business software settings, where developers are often working directly in tandem with product managers.
Look - I think I gained quite a bit from the well rounded education I got. And my general advice is that people should go for it. However, I also have the reality of experience: Most of my peers who have degrees are poor at writing, poor at communication, poor at ethics, and poor at business skills. I see nothing differentiating them from trade school folks in this regard - and these all matter in the work place.
So yes, I gained quite a bit from those courses. But the reality is most grads do not.
writing: it makes a huge difference to me if some open-source project I'm supposed to be using has good documentation or not, and there are a lot that don't. Poorly written specifications are also a huge waste of everyone's time. Of course you need the right kind of writing class to make a difference here.
Linear algebra: there are different ways to teach this; the prove-stuff-from-first-principles approach which I agree is more suited to mathematicians, and the here's how you use matrices for real stuff which I used a lot in my last job. The moment graphics become involved, it's a big topic.
Prob/stat: this really depends what field you're in but for machine learning it's essential, and that's quite hot right now. I mean this in the sense of understanding what's going on and being able to interpret results, not in the sense of being able to copy a tensorflow command from stack exchange.
public speaking: if you want to rise in the organisation, or just get your ideas heard at team meetings, then it's not a complete waste of time. There are lots of more important factors, but being able to get a point across well is a small factor in your favour.
discrete mathematics: there's only a little bit of this that you'll actually need, but I got from lowly intern into proper projects at one company because I fixed a bug that had been causing random weirdness for ages: it turned out someone had implemented Java .equals() in a non-transitive way for one domain class, and that was messing with the ORM.
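To make the transitivity point concrete, here's a small sketch of the same class of bug in Python (the original was Java's .equals(), but __eq__ fails the same way; the class and values are made up):

```python
# Sketch of non-transitive equality, the kind of bug described above.
# "Equal if within tolerance" is symmetric but NOT transitive, which
# silently breaks anything (sets, ORMs, dedup logic) that assumes
# equality is an equivalence relation.

class FuzzyPrice:
    def __init__(self, cents):
        self.cents = cents

    def __eq__(self, other):
        # BUG: "equal if within 1 cent" is not transitive.
        return abs(self.cents - other.cents) <= 1

a, b, c = FuzzyPrice(100), FuzzyPrice(101), FuzzyPrice(102)
print(a == b, b == c, a == c)  # a==b and b==c, yet a != c
```

With a == b and b == c but a != c, any code that chains equality comparisons (deduplication, identity maps, cache lookups) can give different answers depending on comparison order, which is exactly the "random weirdness" such bugs produce.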
The problem with a lot of universities is they want to teach CS in a way that turns out CS researchers (or at least grad students), which is an overlapping but not the same skill set as good developers.
Linear algebra, discrete math & probability really are needed all the time.
I can only repeat what I said:
> Linear algebra, discrete math & probability really are needed all the time.
One can understand O(N) enough to do over 90% of SW jobs without taking these courses. You just need a bit of dedicated instruction for it.
And while linear algebra is crazy important for some disciplines, it's simply not needed for over 90% of SW careers.
And while I do think a good understanding of statistics and probability can really help you, I also see that most CS majors take this course and have forgotten almost everything they learned in it by the time they graduate. I think some parts of the SW industry are such that you really would benefit from it - roughly 25%. I also think that percentage will increase with time, so I would encourage everyone to learn it - degree or trade school.
Don't get me wrong. I spent over a decade in university doing math-heavy stuff. I love it. But I can also say from my experience in industry that very little of this differentiates good from poor performers. Or rather, the main factor is motivation and internal drive. Those who have it learn it with or without a degree. And those who get a degree (even with a good GPA) but lack the motivation forget it ridiculously quickly. I still recall my shock in my first industry job, where almost everyone had graduated with a good GPA from a top school and had forgotten most of their linear algebra, discrete math and probability within 2 years of working. And more importantly, they fought hard to avoid doing such problems.
If we were talking, say, electrical engineering (electronics/control theory/EM/communications), then I would heavily tout the degree. It's a ton more math heavy than a CS program, and it requires a lot more discipline to do that on your own, and I've not heard good experiences from trade school on those.
I'm of the opinion we need to split school in a way where young people focus first on how to be useful, then be useful for a period of time. From here, humanities are more relevant once one has enough wealth and time to appreciate it.
I’m in the same boat as you; I myself am a former CS student now approaching 40. Up until my late 20s - early 30s I was really trying to absorb almost anything CS-related that I regarded as interesting (for example, I used to spend time reading lambda-the-ultimate even though I had never written a compiler and most of the stuff written there was way over my head), but since then I’ve started realizing that reality (and human interactions) is a lot more complex and difficult to understand than anything CS-related. I have since started using some of my CS “skills” to further understand said “reality”, as part of some personal projects of mine.
If someone doesn't start studying the humanities until they have "enough wealth and time to appreciate it", that would be a disastrous outcome for the future of humanity, and frankly also something of a wasted life for people who pursue it in that order.
Well, one needs calculus for statistics.
I didn't know this and I find it incredibly surprising. I've literally never heard anyone say anything bad about trade school ever. In fact, I almost always hear about it in contrast with college, which is said to be an expensive waste of time that turns kids into gay communists.
It's sad, but inevitable with tuition and living fees as they are set now. I do some contracting for a university in the UK, and I honestly don't blame the students who treat the university as a service provider and the degree as a financial investment. They're paying customers and deserve a high quality service with good teaching and good facilities. There's no room for variability here, the CMA will rightly fine universities who have misled students or failed to fulfill their obligations.
If the degrees were free at the point of use (as used to be the case here), I'd totally be behind the idea of studying for the sake of broadening one's horizons.
Fact is, they've gotten increasingly more expensive and the students with sense are going to make sure they get a return on what will (for some) likely be a working lifetime of additional tax or a huge upfront cost.
You need to be in a privileged position to suggest that dropping probably around £45k on 3 years of a degree should solely come down to a desire of "immersing oneself in an environment of learning, inquisitive students and knowledgable professors full-time". The "simple financials" are a barrier to today's students - even with the UK's maintenance and tuition loans.
You're quite right it's £27,750 for home/EU students currently, but I'm including maintenance/living costs in my figure too which realistically will probably be about 5-6k a year. Some universities make working part-time more feasible than others, which can help with this aspect.
If you're international, the total can be closer to £90k after three years (based on a band 2 UG course @ Warwick + £5k per year maintenance/living costs).
>I think we shouldn’t complain: it’s much less than what top US schools charge
I'd respectfully disagree - aiming to be better than the US sets an incredibly low bar. Just like their healthcare, US education is ridiculously expensive and some of the (private) loans seem almost predatory. We should be looking at Europe, where in quite a few countries tuition is a few hundred euros a year. Or thinking back a few years here, when tuition was £3k or free entirely.
>the tuition loan conditions are extremely manageable
It's definitely a forgiving loan in terms of repayments - but my loan accumulated interest at ~6.3% whilst I was still studying with no proper income. That interest rate is worse than a bank loan.
If you want to take advantage of not having to repay the loan, you're essentially betting against your future earnings potential.
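For a rough sense of the compounding involved, here's a back-of-the-envelope sketch. The figures are illustrative assumptions drawn from this thread (£9,250/year tuition, ~6.3% interest, annual compounding over a 3-year course); real UK Plan 2 loans accrue interest monthly on each instalment as it's disbursed, so treat this as an approximation only:

```python
# Rough estimate of the loan balance at graduation, before any repayments.
# Assumes each year's tuition is borrowed up front and interest compounds
# annually -- an illustrative simplification, not the real repayment rules.
def balance_after_study(annual_loan: float, rate: float, years: int) -> float:
    balance = 0.0
    for _ in range(years):
        balance = (balance + annual_loan) * (1 + rate)  # borrow, then accrue
    return balance

total = balance_after_study(9_250, 0.063, 3)
print(f"owed at graduation: ~£{total:,.0f}")  # vs £27,750 actually borrowed
```

Even under this crude model, several thousand pounds of interest accrue before the graduate earns their first salary.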
Further, the idea that learning outside of school is "teaching yourself" is flawed. You're using books written by others (sometimes the same books that are being used in universities). There are communities - online and in real life - that you have access to, often full of people much more dedicated and experienced in the field than university classmates. If you really want something that's similar to a university class, there are MOOCs, coursework, and lectures available.
The education system gets 12 years to expose almost everyone to great art and science; some of them go on to further study, and most people in both groups never engage in anything resembling the pursuit of knowledge for its own sake once their studies are over.
Nothing wrong with that. My life would have been better if no one had tried to expose me to team sports when I didn't care. I wouldn't have been under the impression that I hate exercise when I just find team sports grating.
People who love knowledge have their entire lives to pursue it. The people who want a job, and to watch their three hours of TV a day or spend time with their children, friends or spouses, have to endure 12 years of boredom in K-12 and then four more if they need a degree to get the job they want. They're paying in time and suffering. They don't deserve pity.
The reality is that we all learn continuously throughout our lives. A college should simply be a place that provides (a) low-cost temporary housing for those who need to devote more time to pursuing knowledge than to a job, (b) access to as many learning resources as possible, and (c) optional classes/mentorships/tutoring for additional cost.
We should work to keep this idea alive, but in order for it to persist, the model has to change. The juxtaposition of that high-minded ideal with the harrowing reality of the financial cost is too jarring. We can give all the lip service to collegiate education that we want, but if we slap a 100K price tag on it, it's no longer an ideal.
Furthermore, with new online courses you can learn advanced topics far more easily than ever before.
I've met a few decent self-taught programmers. Most are awful. Over and over again the bootcamp grads and people who have never attended school end up being the worst performers on my team.
A LOT of the people I have worked with who do not have degrees have bad attitudes and are hyper-vigilant about "no degree" microaggressions.
I use the stuff I learned in university all the time. I didn't go deeply into debt to attend a private university, instead I worked during the day and went to school at night.
There is such a massive glut of people trying to get into the entry level programmer jobs. It's an easy decision for me: Throw the people who do not have a 4 year degree or relevant work experience into the trash. I would assume that a LOT of other people are doing this.
Also, even though I've worked for various "top" companies, I've never once had to write my own binary tree, hashmap, Dijkstra's search, etc. Note, I also think FAANG-style interviews are a waste of time, which is why at my current company I make all applicants do a take-home test analogous to the kind of work they will be doing if they get the job.
leetcode definitely helps but won't guarantee being hired. There are many things we look for that it won't help you with.
And the unfortunate fact is that merely having experience doesn't mean you are a competent programmer. We need some way to check for that, and onsite coding is the best way we have right now, even though it's imperfect. Sometimes companies ask more realistic problems ("implement this API on a laptop w/ access to docs") so it's not always so artificial, but the artificial problems make it easier to have a level playing field.
Over the last few years I have kind of seen the light of day. When you're hiring at scale, you do need a standardized process. On the other hand, while I haven't spent a day studying any really complicated algorithms, I've spent just as much time learning how to talk the talk of a "cloud native enterprise architect", and it took me a while to realize how much of a hypocrite I was being for saying that I would never jump through the seemingly artificial hoops of studying algorithms.
That's not entirely true. I spent a year or two maintaining a bespoke compiler/IDE/VM for Windows Mobile.
It says that they can be gamed and that they aren't perfect. It doesn't mean they're entirely useless, entirely wrong, or that they generally don't select for the profiles they want to select for.
> Do you really think they are all the best and brightest?
Having worked at FAANG, non-FAANG with a lot of ex-FAANG, and non-FAANG, generally speaking, yes, I do believe that.
They want you to want the position. They want you to explicitly study and prepare for the position.
Sure, you could go from zero to ready to interview at Google over the course of a year. But I feel like most experienced software developers would only have to study for a month or so to really brush up on interview questions.
I have a previous comment on this site about reading. That is actually what I look for, intellectual curiosity and a desire to continue learning and growing.
I'll just quote myself:
What I look for in a developer: READS BOOKS. (Audiobooks count.)
That's the only thing. I'm sorry, if you are not reading and studying to keep up, you are getting left behind. There are so many brilliant people writing amazing books on a huge array of subjects. If I could get every one of my developers to read ONE book on software design a year, I would die happy and the entire industry would be 10 years ahead.
They don't even have to be technical books. I just want to see intellectual curiosity and a commitment to self improvement.
- 0: In the vein of Clean Architecture, The Pragmatic Programmer, The Mythical Man Month, Designing Data-Intensive Applications, The Google SRE book, etc
But how would you know that? I thought I was a pretty decent programmer when I started college - I had been programming since I was 5! But a five-year engineering degree gave me some much-needed structure around the things I already knew, and filled massive gaps in knowledge I didn't even know I had.
If you find learning in a structured environment fun and can get someone else to pay for it, a masters may be right for you :)
Per my experience working in a big tech shop, most of my engineer co-workers DO have a STEM degree. I don't think a CS degree is required in that sense, as long as the background is relevant. Only one of them came with no college degree, but he fits in well and is decent at his work.
Hiring is difficult as always, and a degree is indeed a powerful indicator. As long as the supply of candidates isn't too tight, and as hiring processes become increasingly streamlined, its importance will only grow.
At another job, a theoretically inclined colleague was grilling a junior candidate on DDD and hexagonal versus onion architectures because that interested him. I had to point out that wasn't relevant at all for a fledgling programmer's ability to write a basic web API.
Self-selection is a big problem, whether it be through cultural values, personal interests or the topic at hand: degrees. Let's please stick to what a degree is: an authority argument and a semi-reliable proxy for intelligence. The rest is incidental, as you can be lazy and disorganised with a degree, or possess a broad theoretical background and be intelligent without.
I take issue with bootcamps being lumped in with self-taught - they are not remotely the same thing. I would probably be more suspicious of bootcamp grads than anything else, from experience of various commercial training over the years. Someone truly self-taught - rather than winging it - is approaching it in a very different way.
Being old, most of the seniors in my early career had no CS degree or even degree at all. A fair selection of whom could give anyone in today's world a run for their money.
The TL;DR of which is, of course, most applicants regardless of background will be pretty dire... Which is why hiring remains non-trivial.
I've met CS grads who can't code FizzBuzz, and I've met self-taught programmers who work 1/10th as hard as I do and are 10x as productive. There's no correlation whatsoever, and the industry reflects that; the pay scale is exactly the same.
An apprentice or even a bootcamp grad can learn these skills on the job or in an online MOOC without a degree.
What the author fails to mention is the amount of student debt that comes with a CS degree. In some cases a CS degree isn't worth the debt if the curriculum doesn't align with the market or if the university is less prestigious. It's no good graduating with no experience plus £50k in debt and then having to convince employers that you are "qualified" for the role.
Given that most if not all of these CS and soft skills can be learned online for free, students utilising free courses can find work debt-free with the same skills as a CS grad, with the added bonus of hands-on experience if they are apprentices. Unless you are after a research position, it is economically better to teach yourself for a typical developer position these days.
As mentioned in a further comment below - I can understand why code is slow (e.g. Big-O for CPU or memory), what structures and algorithms to use in different scenarios (Data Structures & algos), how things are working (Computer Architecture, Programming Languages [theoretical construction of languages], Compiler design), and Automata. Absolutely, these can be learned by oneself - but I find they rarely are. I know what is happening at a network layer, OS layer, memory layer, application layer, and system layer, and code layer.
I can grok a new language quickly, by learning some very key aspects of its language design from my Programming Langs class (the "meta" of ProgLang design), etc.
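As a tiny illustration of the "why code is slow" point - data-structure choice changes the asymptotics - the same membership test is O(n) on a list but O(1) on average for a hash set. A sketch, not a rigorous benchmark:

```python
import timeit

data = list(range(100_000))
as_list, as_set = data, set(data)
needle = 99_999  # worst case for the list: it scans from the front

list_time = timeit.timeit(lambda: needle in as_list, number=200)  # O(n) scan
set_time = timeit.timeit(lambda: needle in as_set, number=200)    # O(1) hash lookup

assert set_time < list_time  # the asymptotics show up even at this size
```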
But where I really perceive my limitations from lack of formal education is where math comes into play. I get the gist of how cryptography or machine learning works, but I don’t think I’d be able to work in those areas for a living even if I spent some time brushing up on my skills and doing self-study. I never developed the intuition for the relevant math, and that makes it extremely difficult to pick up the material just by reading. And doing an entire self study course sequence of the missing material seems like too much of a hill to climb.
In my opinion, the best way to develop an intuition for the math is through manual calculation (by hand) and by implementing the algorithms/math in code - wax on, wax off.
Since everyone making a living in those areas is reusing libraries now many people's knowledge is limited to things they have heard repeated, they don't necessarily have an underlying intuition about the math.
Like, knowing that automata and complexity theory exists and remembering roughly how it works means that if something that needs that comes up, at least you have a sense of what you forgot about it.
For instance, having taken a database class where we worked in the relational algebra has helped me fluently write okay SQL even if I couldn't write out a sentence in the relational algebra to save my life.
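That correspondence is quite direct. As a sketch (with invented table and column names), the relational-algebra expression π_name(σ_dept='eng'(employee ⋈ department)) maps onto projection (the SELECT list), selection (WHERE), and join:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE department (id INTEGER PRIMARY KEY, dept TEXT);
    CREATE TABLE employee (name TEXT, dept_id INTEGER REFERENCES department(id));
    INSERT INTO department VALUES (1, 'eng'), (2, 'sales');
    INSERT INTO employee VALUES ('ada', 1), ('bob', 2), ('eve', 1);
""")

rows = db.execute("""
    SELECT e.name                                          -- projection (pi)
    FROM employee e JOIN department d ON e.dept_id = d.id  -- join
    WHERE d.dept = 'eng'                                   -- selection (sigma)
    ORDER BY e.name
""").fetchall()

print(rows)  # [('ada',), ('eve',)]
```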
Yep. For me, the biggest reason I think a degree is necessary is because it is comprehensive. Self taught people are going to inevitably have multiple holes in their understanding. And worst of all, they will be ignorant of where those holes lie and how large they are.
> Yep. For me, the biggest reason I think a degree is necessary is because it is comprehensive. Self taught people are going to inevitably have multiple holes in their understanding. And worst of all, they will be ignorant of where those holes lie and how large they are.
This is not my experience. Self-taught people are often much more aware of the potential for knowledge gaps, because they've had to learn what to learn. Part of self-learning effectively is appreciating the complexity of a topic and choosing what to focus on, rather than just being led through it by a teacher.
Their default position is one of humility, rather than the often common attitude of "I have a CS degree so I know what is happening at a network layer, OS layer, memory layer, application layer, and system layer, and code layer". (Hint, you don't).
I can see the "I want to earn more money" boot-camp types having holes in their knowledge, but those who read engineering books and stay up all night coding always seem to have the deepest understanding of all because they live for it.
At that time there was no formal courses, and their knowledge probably covers everything from assembly language all the way up - because most people who weren't just playing games started tinkering with BASIC then moved to assembly.
That low-level understanding of the whole system is invaluable, and really hard to teach yourself these days when computers are so complex and mysterious.
In 1984, when the Mac came out, I was a starving artist in Denver. I had a lot of college at that point, but no degree. I had foolishly wanted to learn everything interesting, so I had a year of art school and three years of coursework in biology, physics, and mathematics, but I was working in labs and retail by day, and by night as part of a songwriting partnership that ultimately went nowhere.
I was fascinated by MacPaint and wanted to learn how to do that. I had some FORTRAN and some BASIC from coursework, and a little FORTH from a friend, but they hadn't captured my imagination. MacPaint did.
I couldn't afford a Mac. They were 2500 dollars. In 1984 that was half of my annual income. (Yes, I realize that's unbelievably low. Starving artist.)
With help from the woman who became my first wife, I got hold of a Commodore 128. I started by typing in BASIC programs from Compute magazine. One of them was an assembler, and that's where things got really interesting.
I did end up writing an absolutely terrible paint program, but what really obsessed me for months was writing variations of Conway's life. The one I spent a lot of time on was a version that ran multiple sets of rules on the same field at the same time.
The color of a live cell was determined by the set of rules that governed it. Live cells of different colors could see each other for the purposes of liveness analysis, which is how the rulesets interacted. They were otherwise independent of each other. I spent countless hours poring over whatever references I could find about how to make things faster and more comprehensible, and tweaking the rulesets to see what happened. I would leave a program running overnight to see what it looked like in the morning. Often it was a boring disappointment. Sometimes it was spectacular. The rush of the spectacular ones kept me going.
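For reference, a single-ruleset step of the classic B3/S23 Life (the baseline that a multi-colour, multi-ruleset variant like the one above builds on) can be sketched in a few lines; the variant would additionally key neighbour counts by colour:

```python
from collections import Counter

def step(live):
    """One generation of Conway's Life over a sparse set of (x, y) cells."""
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1) for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # Born with exactly 3 live neighbours; survives with 2 or 3.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

blinker = {(0, 1), (1, 1), (2, 1)}   # a period-2 oscillator
assert step(step(blinker)) == blinker
```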
I wrote other programs, too, trying to understand how window and graphics systems and compilers and interpreters worked. I worked in bookstores and read everything I could get my hands on about programming. In those days, retail bookstores paid crap wages, but allowed the staff to borrow all the books they wanted. It was a good deal, from my point of view.
In the fall of 1987, my mother sent me an ad that Apple Computer ran in the Mensa Bulletin, looking for hires. I applied and, to my utter astonishment, Apple hired me, and that started my profession as a programmer and a technical writer.
The way I got the job prepared me well to take advantage of Apple for learning more. I came into the company ravenous to learn all I could about programming. The company had three institutions perfectly designed for an autodidact like me.
First was the corporate technical library, a huge collection of technical books available for employees to borrow. Their policy was that if you wanted to borrow a technical book and they didn't have it, they would order two copies: one to lend, and one for you to keep. I got a lot of expensive technical books for free.
Second was the software library. They wouldn't give you a copy to keep, but they would order software that you wanted to check out, and they would let you renew the checkout for as long as you needed. They also maintained copies of basically every piece of software Apple had ever developed, released or not. My first hands-on exposure to Smalltalk was a check-out copy of Apple's implementation of Smalltalk-80.
The final pillar of my self-education was Apple's competitive intelligence lab. It was essentially a hardware library with instances of more or less every interesting hardware architecture you could imagine. They had a Cray Y-MP. I learned Lisp and emacs on Unicos running on the Cray. I used to sign up for time on the Xerox machines to play with Smalltalk and Interlisp and the Star Office system.
I agree with those who say that teaching yourself leaves holes. I've been identifying and filling in those holes ever since 1987. On the other hand, I can't say I regret how my natural curiosity ended up leading me into what became my profession. Undoubtedly, I would have been better prepared if I had taken a disciplined approach to learning programming from the beginning.
On the other hand, if I'd had that kind of discipline in the 1980s, then I probably never would have discovered that I liked programming in the first place.
That sounds idyllic.
Can is the key word here. Most people don't have the discipline to go through a CS curriculum on their own.
I would even go as far as saying that being able to self-study is an essential life skill, up there with learning how to read and write. That's how important it is. You need to learn how to do it. Personally, I think over time it will become easier to self-study topics due to advancements in edtech.
I worked for years as a programmer before going back for a degree, and I've hired and worked with many people with and without the degrees.
I've yet to run into someone who self studied their way through the equivalent of a CS degree. I'm sure they exist, but they're pretty damn rare.
What you are talking about is essentially programming as a trade. That’s not what a CS degree or any degree is for, really.
The best programmers have a deep understanding of how computer software works.
That's a value judgement that is not reflected in the market, fwiw.
CS moves slowly, tooling moves fast.
Edit after thought: Of course, when to use an array vs a linked list in your specific language (for example) might not matter if you're doing web apps of medium scale that one can throw more EC2 instances at.
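To make that tradeoff concrete with a small sketch: inserting at the front of a contiguous array (Python's list) shifts every existing element, O(n) per insert, while a linked structure (collections.deque) does it in O(1). Whether the gap matters depends entirely on scale:

```python
import timeit
from collections import deque

N = 10_000

def list_front():
    xs = []
    for i in range(N):
        xs.insert(0, i)   # shifts every existing element: O(n) per insert

def deque_front():
    xs = deque()
    for i in range(N):
        xs.appendleft(i)  # pointer update: O(1) per insert

list_time = timeit.timeit(list_front, number=3)
deque_time = timeit.timeit(deque_front, number=3)
assert deque_time < list_time
```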
IP, TCP, DNS, and HTTP were covered in classes I took.
As for what's actually on the wire: I haven't studied how Ethernet actually works. I assume it's kind of like I2C, but I don't know more than that and haven't needed to.
I never said or implied that college was invaluable for teaching networking. I just answered a question about what was taught.
As for debt, there's no need for crippling debt though if you go to a state school and are planning on working as a programmer.
- tons of self study while I designed hardware and wrote code on my own
- school, which started showing me more advanced things
- an internship, which exposed me to the real world
I was into computers years and years before college. This was the 1970s, when books on computers were few and far between. I had not heard of Knuth (and if I had, his books might have chased me away) and most of my learning about computers was from Intel datasheets and articles from BYTE and Dr Dobb's Journal. I read about Smalltalk and Pascal and FORTH (and wrote my own FORTH) and C (and mucked around with Tiny C) but didn't really know anything.
School taught me some theory (starting with Knuth and such), but the whole time I was taking classes I was doing my own self-study on things the school didn't offer (mostly I was studying how to write LISP interpreters and compilers, and then how to write video games). School taught me how to teach myself.
The internship taught me about politics and how to work with people, how to work with physical machines (from DEC-10s and large PDP-11s to microcomputers), and Unix systems programming ("Here, kid, photocopy my copy of the Lions notes, read that and also K&R, and have fun").
I was taking grad-level courses when I realized that I was having more fun writing video games than anything else, and I was not looking forward to spending my last year of school doing the boring graduation requirements courses I'd blown off in favor of just taking more CS. So I dropped out, moved to Silicon Valley and still, 40 years later, don't have a degree, and I'm still reading papers and learning new stuff all the time.
You go to the school and get a map with all the roads, one-way routes, walks, parks, etc.
If you learn by yourself, your map may not contain some of those routes and you may not know that you can actually go some other way or you may go a way that has a dead end.
On the other hand, it is just a map. You have to walk it yourself and university cannot make you walk all the map. But at least you know (almost) all the paths and their requirements.
It's difficult work, but not complex.
I'm not surprised every thread started by a person who hires grads undervalues their education, because they probably don't have any work to assign to them that contains any complexity.
This doesn't say much for us as an industry. The simple and avoidable difficulties of 'real world' programming are not much to brag about.
The only difficulties we face nowadays in programming are almost always down to buggy libraries and frameworks that have nonexistent documentation.
I remember when I was starting in the work world, that you could buy a whole wall full of paper bound books for your IBM machine, or Sun, and when you looked up a function in there it was well documented and correct.
Nowadays, I suppose we're too cool to do something like that, better to move on fast and go break something somewhere else.
To any recent grads, I'll tell you a secret. If you want to move up the ladder in this industry, the most important thing is to keep outrunning your mistakes.
Build a system, brag about it, move to another role and let some other poor sucker maintain the stupid thing.
This is why we are in the mess that we are in right now.
I don't think that going to college is necessary or detrimental. It's probably good that people from 18 to 21 stay somewhere safe and learn something outside their parents' home.
That said, there is very little that a CS course can teach you that is both practically useful on the job and that you can't easily learn by yourself. That's because a large part of coding is not about knowing the most advanced and obscure data structures or algorithms.
Which brings me to the point: neither of those options teaches you how to develop a product feature. That's what people still learn on the job today, and it's why knowing how to code is merely necessary, but certainly not sufficient, to be a good developer.
There are great developers who are self-taught and self-sufficient, sure, but the majority end up coding small potatoes or being cogs within a soul-crushing enterprise with zero agency.
And while most CS grads have little coding skill, they have the ability to learn far beyond that, because while they don't know the hows, they know the whys.
So even if their core skill starts out similar, their long-term trajectories are wildly different, which is something to consider too.
All the rest that you are saying applies equally to any college student or merely to any intelligent, learned person who constantly improves their intellectual skills.
1. I gave deadlines
2. Students did group assignments 50% of the time
3. They learned to work with Git and how to communicate
I studied a CS degree and learned less collaboration than my students did.
Based on my experience I must conclude, with regret, that this article is too unnuanced and therefore not very informative, if at all.
The author himself can email me for a discussion on email@example.com since I'd like to understand his views better and perhaps he might benefit from mine.
Edit: I emailed the author.
> Certainly there is some level of prestige that comes with having a degree. This comes from the level of communication required to simply navigate a degree. Also it is, in a way, a validation stamp that says… “Yes - you have the skills to be a Software developer”.
Prestige is only worth something in an inefficient social system. I don't want to encourage prestige or signaling as it will make a system worse off in the long run. Prestige does not hold any inherent value as the metric of why something is prestigious may be for completely the wrong reasons.
Moreover, the level of communication required to navigate a degree is too heterogeneous to measure and then state as an opinion that it is always more difficult than a bootcamp. Navigating my psychology bachelor's, my CS bachelor's with a business minor, and my game studies master's was easy. Navigating my CS master's degree was tough. They are all university degrees, and it was fully dependent on my personality. The reason navigating my CS master's was tough is that I'm not that formal in my thinking.
Likewise, I had a couple of people who really disliked full-stack development and wanted to be front-end developers but they were stuck in a full-stack bootcamp.
Navigating those 3 months was a really tough thing to do for them. They talked with me and I adjusted their curriculum to a front-end one as much as possible. And also, front-end wasn't my specialty, so I couldn't optimize it as much as I would've liked. Other teachers may not be so nice (I've met them) and they knew they were taking a huge risk telling me this. They were vulnerable, and made the situation more complicated for me and themselves as well, which I was fine with but it isn't easy. I had an easier time at my psych. bsc, CS bsc. and game studies msc.
Having tough coursework across diverse esoteric subjects (where you cannot easily stack overflow your way into a solution) was easily the best learning experience of my life. Standing before massive "walls" that looked impossible to climb, and being forced to figure out how to do it on my own, is the most important skill I have. Some CWs which I thought were particularly tough and taught me a lot:
- Kernel for target non-widespread platform (where all you have is the processor's manual), capable of IPC, written from scratch in C.
- Both a pathtracer and a rasterizer without using a graphics library. (you can read about the advantages or disadvantages of each one, but writing both makes you appreciate each technique independently)
- Some sort of compiler or interpreter (parser too).
- HPC problem that requires non-trivial parallelisation and vectorisation.
- Some sort of adversarial attack on a target "real world" platform that uses deprecated encryption (e.g. RSA).
- Individual research project.
ALL of these tasks are possible to do by yourself, independently, but very hard to see through to the end without solid infrastructure and support staff around you. I imagine it takes a very highly motivated individual to do the same things outside of academia.
The biggest benefit of attending uni is learning how to learn and tackle complex problems.
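For a sense of scale, the "compiler or interpreter" coursework item above starts from something like this toy recursive-descent evaluator for + and * over integers (a deliberately tiny sketch; the coursework versions are orders of magnitude larger):

```python
import re

def evaluate(src):
    """Evaluate integer expressions with +, * and parentheses."""
    tokens = re.findall(r"\d+|[+*()]", src)
    pos = 0

    def peek():
        return tokens[pos] if pos < len(tokens) else None

    def eat():
        nonlocal pos
        tok = tokens[pos]
        pos += 1
        return tok

    def atom():   # atom := NUMBER | '(' expr ')'
        if peek() == "(":
            eat()
            val = expr()
            eat()  # assumes well-formed input: consumes ')'
            return val
        return int(eat())

    def term():   # term := atom ('*' atom)*
        val = atom()
        while peek() == "*":
            eat()
            val *= atom()
        return val

    def expr():   # expr := term ('+' term)*
        val = term()
        while peek() == "+":
            eat()
            val += term()
        return val

    return expr()

assert evaluate("2+3*(4+1)") == 17
```

The grammar comments double as the parser's structure, which is what makes recursive descent a popular first compiler exercise.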
My experience has been that it has taken a lot of effort and time to try to learn things, and keep up with everything. Also, I think my resume just gets straight up tossed out a large fraction of the time compared to CS majors.
Maybe there's other issues, but I've had people look over my resume and things like that.
Additionally, my lists of technology are pretty weak (I'm trying to work on it, but my company doesn't use a lot of mainstream technologies if any).
I did not have any internships in CS.
I don't know how bootcampers and others get jobs at FAANG companies in 6 months, because I can't even get a non-technical phone screen with them.
I even did triplebyte without too much success.
That being said, I accept there are probably other issues and things I need to work on, and I'm trying to work on them as much as I can.
But, I have spent a lot of time reading about topics, asking my friends questions, reading conversations in group chats that are technically oriented and looked up things from them to learn. It has been a hell of a journey.
Trying to move to a company that uses more mainstream technologies. So I can have a resume that will actually get me calls back.
This is a bit of a ramble.
Also, considering the article: I don't think I really had to learn any of those things in school. Maybe if you do clubs and things, but I don't think it's much of a result of the curriculum IMO as far as teamwork/communication go. I know CS majors have senior design, where they work in teams, but I think a lot of them suck at the teamwork part.
Current weaknesses of mine: socket programming, messaging, multithreaded applications, and SQL (beyond basic queries) / database schema design (we don't use SQL at work).
I still barely know what dependency injection, factories etc are. I only learned what a "god object" was today.
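(For what it's worth, dependency injection is a smaller idea than its name suggests: a component receives its collaborators instead of constructing them itself. A minimal Python sketch, with made-up class names purely for illustration:)

```python
# Without dependency injection: the class builds its own dependency,
# so tests can't substitute a fake and the storage choice is hardwired.
class HardwiredReport:
    def __init__(self):
        self.store = {}

# With dependency injection: the collaborator is passed in from outside,
# so any object with a dict-like interface works (real DB wrapper, fake, etc.).
class Report:
    def __init__(self, store):
        self.store = store

    def save(self, key, value):
        self.store[key] = value

# A test can inject a plain dict in place of a real datastore.
fake_db = {}
Report(fake_db).save("total", 42)
print(fake_db["total"])  # 42
```

Factories are the same spirit one level up: something else decides which concrete object to construct, so the calling code doesn't have to.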
I'm trying really hard. I want to have a really good job since I know the benefits in terms of rewards of me studying hard are HUGE.
Plenty of CS graduates have very poor communication / social skills. It's very unfortunate but it's the sad reality.
As somebody with computer science knowledge, but no official university degree, I can testify to the importance of going through such a curriculum. I'm eternally grateful to MIT and their OpenCourseWare initiative, which allowed me to develop my understanding of the subject. This, without a doubt, made me a better software developer. All of this for free.
Physically attending university has two main advantages, as the article points out: discipline, for people who can't work when not pressured (i.e. by deadlines), and of course networking (meeting with companies on campus, etc.).
But the majority of the skills listed are not intrinsic to computers, computer science, or software at all.
"There are many skills that Developers now require besides just coding. They need time management skills, organisational skills, people skills, translating skills, negotiating skills.. the list goes on. These skills are often gained through completing an undergraduate degree."
Most people will get experience in exercising and developing those skill sets with almost any degree program.
"Certainly there is some level of prestige that comes with having a degree... it is, in a way, a validation stamp that says… “Yes - you have the skills to be a Software developer”."
Umm... not really. We've probably all met people with CS degrees who were not really all that capable of doing day to day development.
Coming back to this, one of the defenses I've heard from CS people who couldn't do day-to-day development all that well is "well... that's not what CS is for - it's really for diving deep into the theories behind problems..." - or some similar claptrap. On its own - yeah, there's a class of problems that are served well by people with a CS focus. Just don't try to drop them into a "developer" role and assume that they'll be "just as good or even better" than someone who's simply been doing development, perhaps with minimal schooling or just self-taught. One skill isn't really a superset of the other, but some people tend to think of them as such.
IMO the advantages of CS college are this:
- You can reach a higher salary.
- It's easier to learn hard math/CS ideas in school than on your own.
- It's easier to make friends in college than elsewhere.
I'm not sure about "reach". From what I've seen it's more of a "You can start with a higher salary". After a decade of work barely anyone cares about your university. There are some exceptions of course - R&D departments at corps like degrees for example.
They end up using a list where a map is needed, etc.; i.e., there are foundational gaps in their knowledge. A CS program ensures that there is at least an attempt to build a foundation.
This all also applies to “many CS graduates”. My systems-programmer wife has given up asking basic systems questions to new grads in technical interviews because she just expects to get blank stares at this point.
Stuff that was expected knowledge of anyone programming computers in any capacity up through the 1990s is now a mystical secret known only to gurus.
Someone who takes hard college courses in operating systems, database implementation, networking, distributed systems, etc. can obviously learn a lot of useful stuff. But those are often not required for a CS degree.
That same thing that drives people to be good at programming happens with or without schooling. And most of it happens to be experience.
You can either pay for experience, or you can get paid to get experience.
At the end of the day, bad programmers from whatever walks of life are going to use lists where maps should be used, and not understand the algorithmic complexity of the code they write.
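The list-versus-map point is concrete and easy to demonstrate: membership tests on a list scan linearly, while a set (or dict) hashes to the answer in roughly constant time. A rough Python sketch (sizes and repeat counts are arbitrary):

```python
import timeit

items = list(range(100_000))
as_list = items           # O(n) membership test: scans element by element
as_set = set(items)       # O(1) average membership test: hash lookup

# Searching for an element near the end forces the list to scan everything.
t_list = timeit.timeit(lambda: 99_999 in as_list, number=100)
t_set = timeit.timeit(lambda: 99_999 in as_set, number=100)

print(t_list > t_set)  # True: the linear scan is dramatically slower
```

Same code shape, same `in` operator, wildly different cost curves as the data grows.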
Processes, threads, and memory allocations are all system-level operations, and a programmer's depth of knowledge of them is highly dependent on where in the stack they are working.
In fact, I would wager that most self-taught programmers are a cut above the average college grad, and way more suited to actually getting work done. They have already shown they are self-motivated and don't require hand-holding to get the job done. Besides, they will also be less stressed from not having all that debt.
Perhaps the author would be even more in favor of a history degree and teaching yourself how to code than a CS degree.
I'm not knocking college degrees. I have one myself. However, those who are self-taught tend to take far more initiative and invest far more time in studying and understanding how things work than those who just got a degree. They seem to be intrinsically drawn to the craft and don't just do it for grades or accolades. Software engineering is an integral part of who they are and they are brilliant because of their unquenchable fascination with it. At least that has been my experience.
I spent several years doing BGP and related router support; when I passed my CCNP I was forced to learn a lot of interesting broad topics ranging from multicasting to obscure details of low level packet switching.
When I was very young in the 80s and introduced to 1st edition K+R C I was unimpressed with pointers; why bother with all this? Later, I found pointers pretty interesting and useful for data structures and low level driver "stuff".
This is hardly a problem solely for CS. Philosophy has the same issue. "Self-taught philosopher" almost certainly means maybe one favorite author and maybe one book named something like "The Philosophy of (insert pop culture movie title here)" and little more. Even a pretty lame intro-to-philosophy class will be much more broad.
An enforced broad education of a topic can be very useful. That seems to generally be missing in online education. A lot of effort over the years has been put into very tiny topics like "how to teach kids the derivative of x to the y power" but very little effort seems to be put into "how to decide what is learning enough of calculus, how to measure it, and how to pinpoint lacking areas"
A CS degree is learning and not just a bit of paper. It will give you fundamentals and breadth, but not necessarily depth, because it has to cover so much ground. It doesn't teach you how to code; it teaches how the systems work.
Self learning can give you the depth in an area that interests you, and provide the experience to debug and live with code you produce or to take on existing systems by contributing or interacting with OSS.
They are not mutually exclusive... If a CS degree is available to you, do both.
If you're really passionate about the science of computing, do CS. But if you're really into particle physics, anatomy or medieval history, do a degree in that (and learn to code).
Long term everyone is teaching themselves to code, or picking up skills on the job. Maybe I'm an idiot, but almost every skill that I have has gone almost 100% in and out of cache over the years. Having had a decent foundation at one point is probably helpful, and struggling through a highly mediocre CS PhD (the program at CMU was excellent, but I was pretty shit at the research side) gave me some lifelong skills and interests. However, almost everything that I can do semi-competently now is a product of repeated independent study and learning on the job.
Many of the skills that I now think of as bread-and-butter weren't even a thing when I was an undergraduate or early grad student. SIMD programming was pretty much non-existent on mainstream processors (although I learned some nice parallel prefix stuff in a parallel algorithms course). Computer architecture is hugely different. C++ is nearly unrecognizable - when I started C++ programming, half the language wasn't even really there (no templates, and exceptions exploded in weirdo ways) and the emphasis of how people programmed back then was totally different.
In the end, everyone will either be a self-directed learner or GTFO into management.
The skills you will get from university can be deeper, but I've met plenty of people who managed to escape CS degrees without learning anything profound. They can also be 'wider' - I learned a lot of useful stuff in pure mathematics (graph theory and combinatorics). There's nothing stopping a truly inquisitive person from picking up a lot of this on their own, but the structure of university is good for a lot of folks.
Agree on the CS side. I received my undergrad CS degree almost 20 years ago now. It definitely laid a foundation for future learning, but beyond that it's hard to know at this point. I was also close to a minor in business, and what I learned in finance, economics, and accounting has been useful at nearly every stage of my career. The one thing I wish I had done more of in school was literature and writing. Through experience I've learned that writing (really communication in general) is the most important aspect of most jobs.
I have these at work.
I have this at work.
I do this at work.
I have this from doing a good job at work.
Sounds like I'll be fine without a CS degree.
The kind of full-immersion one-on-one mentoring you get as a junior member of an effective team is going to have a much larger impact than anything you do in your formal education. And that's a good thing too, because I've worked with numerous CS grads who:
• are weak communicators, especially in writing
• are not effective in a team, either for interpersonal or technical reasons (i.e. writing unmaintainable code)
• are not particularly fast or effective, especially in situations where projects and priorities are not fully specified
Now, to be fair, some fresh grads are great on all these points. Just as I've worked with ones who are incompetent, I've also worked with some who are as effective as senior engineers right from the start. But that is not especially common, and almost always a function of things they did on top of getting a CS degree. It's hard to reason about counterfactuals, but I am sure that the same fresh CS grads who are incredibly effective would still be roughly as effective even if they had taken a different path to becoming a professional programmer.
And even if there is any systematic difference between junior programmers with and without degrees, I am absolutely certain that it gets completely erased after several months of working and mentorship in a professional context.
What I think I missed in a CS degree was the study of lower-level fundamentals: operating systems, compilers, databases, algorithms. Things that you don't necessarily get exposed to at a deep level building out a Ruby on Rails web app. Teaching yourself to code, you can still seek those out too; that was just my particular path into software development.
I think the article could be strengthened by mentioning and focusing on some of those things.
Granted not everyone does that and my motivation was never just to get a job.
The second part, which I don't really concur with, is the soft skills being taught during a CS degree. I happen to know plenty of new programmers who constantly overcommit and/or get fed up with small details. I think that these skills can only be acquired at work with the help of good mentors.
What it does mention:
Struggling as a community; learning independently plus from people who are learning independently of you; networking with people who are committed to the same career path as you; and confidence that comes from having a degree. (I'm paraphrasing extensively)
Since this is something that all college degrees offer, especially under bad professors and terrible academic programs, I think the lesson here is that demanding goals combined with little guidance make for successful students and career paths when backed by prestigious degrees.
The purpose of a degree is to gain knowledge (yes!) but, more importantly, to work and get connected with people in the field, and to get proof of your ability and the work you put in, in the form of a degree.
Now, if you want to progress further in the field (maybe you're thinking about teaching or doing research), this is going to be pretty important.
On the other hand, if the only thing you are going to do is programming for a big co, then this is, in my opinion, pretty much a huge waste of time.
You can teach yourself all the skills needed to be a better developer than a CS degree would ever make you, and in a shorter time, while working for a big co and getting paid at the same time!
Understand that programming in itself is a pretty basic skill when compared with the other skills necessary to be a good developer:
- debugging complex systems,
- dealing with complexity in large systems,
- getting good at very specific software stack that is used for your project,
- learning how to communicate with your team, your boss, your stakeholders,
- and so on.
Learning those things takes more time than learning how to program and learning them can only be done while working for a company.
On the other hand you can learn to program on your own and if you are really committed, just get a copy of CS curriculum and do it on your own.
There is some worth to having a CS degree (it gets you into interviews more and maybe a bit higher salary). But I think this is false thinking.
First, you only need to find one job. If you exclude all companies that require a CS degree, there is still a huge number of good employers with rates just as good to offer.
Second, your CS degree also means you just lost about 4 years of real-world experience (I assume you would still need to spend about a year learning on your own before you could realistically start work as a programmer). This additional experience would count as much as or even more than the CS degree, and it would also be a huge economic advantage for you (getting paid instead of getting into debt?).
This is very important if you are working for a big company. As a self-taught programmer, having never dealt with large systems, I had to struggle a lot early in my career.
any degree will get you those skills. project planning, deadlines, networking, dealing with people, etc. college will always give you access to people.
what is missing is basically easy & sometimes free access to knowledge. god i miss having access to nearly any library and all these books, and all these publications you can print for free. i still have a trove of PDFs on a USB stick somewhere...
and also all the discounts you get on most software. poor student me would have never had the monies to buy full visual studio back then...
The only one of the advantages listed in this article that seems true to me, is the Prestige of having a CS degree. There are plenty of employers who will require a degree, and a few which prioritize a CS degree.
In general, CS majors spend their first few years unlearning all of the theoretical constructs they were taught in college. 90+% of the programming which needs to be done in the real world is CRUD with a few special features. Rarely are those special features related to anything taught in school, they are usually horrible hacks that are related to the legacy system your new system has to connect to.
There are, of course, exceptions, and for a few employers a CS degree is a big advantage. But, unfortunately, much of the teamwork and communication skills which the author lists as advantages of a CS degree are again something where the CS major has to get over their schooling and learn to communicate in the language that everyone else on the team understands.
Which, of course, nearly every CS major learns to do, they're smart people. But the only thing they really get out of the CS degree that matters in the real world, is the prestige, in 90+% of cases.
Do you have any examples of these theoretical constructs? I find most of the things I learned in my CS degree useful for commerical software development and can't think of anything I had to 'unlearn'.
Of course, if you're dealing with truly big data, like a FAANG company or something, this may not be true at all. But the reality is, 90% of the software needed is satisfied by a standard CRUD setup, and you will not need to use any of the algorithms learned in school. This can be frustrating for a new graduate, who feels an urge to do "real" development and get past the simple stuff. It is mostly simple stuff, and should mostly stay that way.
So basically, anything that wasn't CRUD (sort algorithms, processing that avoids just loading it all into memory, etc.) will usually be overkill. But, where you work it could always be different.
The problem with "teaching something yourself" is that you don't really know in advance what is important and how things are connected. If you knew already, you wouldn't have to learn it.
Curiously, one of the best developers that I have the pleasure of knowing doesn't have any degree. If I could go back in time, I would never have wasted 4 years of my life going to a university for a CS degree.
Many CS degrees do not teach you how to code; they might give you some starting help, but they expect you to learn coding in parallel with doing the degree.
Also, IMHO, learning time pressure, coding, and teamwork works best in a good company, where there is some time to give feedback to and teach the junior. But how many of the companies hiring juniors do that? In how many companies will you not only learn very little, but potentially even mislearn things? Are you able to judge this aspect when you're just getting started as a junior?
My recommendation is to do a part-time degree, spending half the week at university and half at a company as a junior programmer. Well, at least if the universities you are interested in support this (many German universities do). Note that I explicitly do not mean a dual study program.
PS: There are other benefits of a CS degree which only matter in some cases, e.g. learning how to read CS papers, typical CS notations, a better understanding of basic algorithmic things, etc.
Not saying CS degrees aren’t creative, IMO they just seem a little more prescriptive in their solutions to problems.
I don't know if you can do that; maybe yes, surely yes, and a lot more besides. The thing, anyway, is that I'm almost a lawyer (working, but without the degree yet) with absolutely no academic preparation in tech. Just the internet, manpages, and intuition.
If your intention is just to code to have work and pay debts while obeying someone, sure, a CS degree is useful... but if you want to code to solve things that you actually need, no, not at all: systematizing all the knowledge into a scholarly scheme would just be lost hacking time.
On the other hand, I haven't had much work as a lawyer, lol, whatever.
If I am hiring an engineer, and I have to sort through which 10 candidates out of 200 resumes to interview, the CS degree is one heuristic of many to make the list manageable. If a colleague I respect hands me a resume and says “This is one of the best people I’ve ever worked with” I won’t even look at the degree or anything else on the resume.
Several of the absolute best engineers I’ve met are self taught without the CS degree. That isn’t incompatible with saying the median engineer with a CS degree is stronger than the median without. (And many with the CS degree are self taught too)
They meet deadlines, they are incredibly good at communication and collaboration and they have pretty good networking. Most of these traits come from the fact that they needed to develop them in order to succeed in learning by themselves.
It is a pretty limited view of the world to think that only college can bring you this. Immersing yourself in a coding bootcamp for some people means leaving the jobs they need to survive in order to have a better job in the future. I can’t imagine how being on college can teach more about meeting deadlines, teamwork, communication and perseverance than that.
I wish this article provided more facts to back its beliefs up.
What is important is to understand what you want to do first since CS is very broad. If you can find a subset within CS, such as web development, app development etc, then things become easier.
It enables you to get a job/internship quicker; boot camps, self-learning, a bachelor's degree: so many paths!
Once I got my internship at a company located in a tech hub city, the connections I got there and the ability to write it on my resume was all I needed. I went back home and started my masters while working remotely but I very quickly realized that no one asked for a degree anymore, references / portfolio was all I needed.
I continued taking my masters, but only doing the courses I enjoyed. :)
I do believe in higher education. If you want a broader and deeper education than what you get in high school, with new ways to approach learning and applying your education to larger projects, a degree gives it to you. It also gives you the academic credentials to pursue higher degrees. And the point that some prestige can come with a degree is true.
There are reasons why a degree is a good thing. But for the most part, this article doesn't accurately explain what they are. If you just want to learn to code, get a job, and do it well... you don't need a degree for that.
In many accounting related areas, having a degree is not important. The accountants, finance and management related executives who need the programmers didn't need CS degrees to create their Excel spreadsheets and macro, so why should they think a CS degree is important?
Try getting a programming job in an engineering domain without a CS degree. They would rather have a physics or maths major and teach them programming rather than a person with programming experience without a degree in that field.
It all depends on that programming tasks are required. You certainly don't need a degree to code for AWS or Kubernetes, but you will need it in hard STEM areas.
I taught myself to code, and sold my first commercial project at the age of 14. And learned a lot.
I then went on to get a CS degree (in addition to physics, math and EE degrees - all undergrad). And learned a lot. (The structured program of learning about Databases, Compilers (and more about compilers!) and Operating Systems was far above what I would have (could have?) learned on my own. Data Structures and Algorithms - that is straightforward enough if you already know how to code.)
And then went on to sling a LOT of lines of code. And learned a lot.
Now - I don't think ANY of those (except maybe the last one) teaches one to be a systems engineer, which one needs to be for large systems, but they are all excellent building blocks for when you get there!
In my case, I taught myself how to code over a few years.
I got a job with Georepublic and I love it. I’ve learned a lot.
But, in my case, I had the skills that the author mentions already. Time management, communication, etc.
All that came from my previous careers ( Navy, then teaching.)
And quite frankly I’m not really interested in studying computer science. I studied criminal justice and psychology.
For my Masters, I’ve decided to study Geoinformatics (which is directly related to the kind of work I’m doing at Georepublic).
So, in my opinion, I feel like if someone is getting into programming, they should study something related to the field they are in. Of course, this is assuming the person is older, changing careers, etc.
The CS degree ain't going to teach you how to code; you'll have to do that yourself anyway.
What the CS degree will help you with is handling high complexity and/or harder problems. Which can be invaluable in anything except the most simple software.
So... ideally you'd get both :)
All of that can be learned during your first internship where you get paid instead of paying.
I would argue college delays maturity and does not foster it. Hanging out with a bunch of people who are the same age, same life stage, etc. doesn't exactly add life experience. Continuing to do "hoop-jumping" work just like we all did in high school also doesn't seem to be novel. Perhaps it's the keggers and binge drinking that bring the maturity the author references?
As for prestige, if it's a top-10 program, sure; otherwise (and often even then), don't waste your money.
Disclosure: Current engineering lead with multiple Degrees (1 Summa Cum Laude) and a bootcamp CS education at one point (which wasn't all that valuable).
I would contend there are 6 classes' worth of truly important information for practitioners to be as well informed as your top 20% CS undergrad.
I would say:
1. The standard Data Structures and Algs class.
2. Discrete Math (if needed)
3. Systems design
4. A Nand To Tetris type survey class
Bonus: An ethics class. Because the tech world needs more people with functioning moral compasses.
CS grads have typically been strong on opinions, weak on everything else. The whole program is just too "meta"--and it is a huge negative!
You can give an idea to an autodidact, or a math grad, or a real-science grad, and they will approach it with an open mind. Give the same idea to a CS grad, and God help you if it conflicts in any way with their indoctrination.
In our last hiring round, we filtered a lot of applications down to 5 individuals we wanted to interview. During the interview they were all asked to write their own implementation of FizzBuzz in whatever language they desired. 3 out of 5 completely failed to arrive at a working function; the 4th was able to, but with massive help. The 5th was hired. All had 4+ years of education in CS.
I recently had a technical test, which I failed because I was depressed by how boring it was, so I missed many easy answers (e.g. asking about private/public fields in OO).
Getting someone to regurgitate something they've been explicitly taught and memorized, instead of having them solve a problem they haven't been (does any CS course explicitly teach FizzBuzz?), does not seem like a good job-selection criterion.
> I recently had a technical test, which I failed because I was depressed by how boring it was, so I missed many easy answers (e.g. asking about private/public fields in OO).
If you're not willing to engage with a test because you think it's beneath you, you're going to have a bad time.
For my own anecdote: I'm taking responsibility for my own "tastes", but I still find many HR processes somehow inadequate if not insulting; it's a shallow game, and considering the amount of effort and learning invested, I just could not suffer it. Nobody likes to be gatekept by people who know less.
And how is that different from getting someone to regurgitate problems they learned by studying LeetCode?
If you can’t code fizzbuzz even after being told about the mod operator, how can you be expected to solve harder business problems once given the requirements?
It isn't. Which is why the trick is to find a problem that the person hasn't heard of and ask them that.
> If you can’t code fizzbuzz even after being told about the mod operator, how can you be expected to solve harder business problems once given the requirements?
I'm frankly bored of interviewing CS graduates who can give me a perfect definition of modulus but can't do fizzbuzz.
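For reference, since the thread keeps invoking it: the standard FizzBuzz reduces to a few lines once you know the mod operator (a Python sketch; interviewers' exact variants differ):

```python
def fizzbuzz(n):
    # Classic FizzBuzz: multiples of 3 -> "Fizz", multiples of 5 -> "Buzz",
    # multiples of both -> "FizzBuzz", everything else -> the number itself.
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:          # divisible by both 3 and 5
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out

print(fizzbuzz(15))  # last entry is "FizzBuzz"
```

The point of the exercise is exactly that it tests ordering of conditions and basic control flow, not memorized trivia.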
Most jobs don’t require hard CS.