I wonder if it’s prepared me better than a CS degree for the actual work I do.
What did I learn with my RELS degree? Hermeneutics: understanding ancient texts written by many different authors in their original historical contexts, texts which have often been deeply redacted over time. Anyone who's worked on any production code can see the benefit of being trained in that kind of thinking.
The classes I took on theology and philosophy helped train my brain to organize ideas. I had no trouble understanding object oriented programming — Plato would’ve loved it too, I think.
Classes on ethics have come in handy too.
Oh and just, we did a TON of writing in college. Lots and lots of writing, lots of research papers. I had to learn something new, read all about it, and put together a document carefully explaining the idea, with lots of evidence and citations. That skill has come in SUPER handy for everything from bug reports to API clients, to customer-facing documentation, to internal memorandums.
"understanding ancient texts written by lots of different authors in their original historical contexts, often which has been deeply redacted over time"
That's brilliant and obviously useful for software engineers. Don't forget that you've probably also studied holy wars in considerable depth, too.
I'm a bit worried that the coworkers, the writer of that comment, and everyone else in that thread all somehow don't know that one line of sed in a git hook would fix that problem.
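For what it's worth, the "one line of sed in a git hook" idea can be sketched roughly like this as a pre-commit hook (saved as .git/hooks/pre-commit and made executable). The `*.py` filter and the four-space width are illustrative choices, not anything prescribed in the thread:

```shell
#!/bin/sh
# Sketch: normalize indentation in staged files before each commit.
# For every staged Python file, replace tabs with four spaces and
# re-stage it, so the repository never sees mixed indentation.
for f in $(git diff --cached --name-only --diff-filter=ACM -- '*.py'); do
    sed -i 's/\t/    /g' "$f"   # replace every tab with four spaces
    git add "$f"                # re-stage the normalized file
done
```

The reverse direction (everyone reads tabs locally, repo stores spaces, or vice versa) works the same way with clean/smudge filters, which is presumably what the "customise their local environment" comment below is getting at.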
Easily the most trivial objection I've ever seen in the never ending spaces/tab debate.
Many people I know customise their local environment to suit.
I felt a disturbance in the force. It was as though 1000 angry hn users cried out in pain.
It is amusing to see how directly those skills taught in a religious context apply towards being a good engineer. Perhaps they're more universal than many realize?
I often used to do this to save myself from the sinking feeling of having made a bad choice and wasted precious time I could never get back. But let's face it... you can make this argument with pretty much any degree, be it art, humanities, literature or whatever.

But no, it didn't prepare you better than an actual CS degree. All the things you described could be done perfectly well as part of a CS degree too. Most good CS programs are much more than just code and equations, and these days they even include a bit of ethics and humanities. The point is that the time you could have spent learning actual CS material, strengthening your foundation and acquiring advanced skills, was instead spent memorizing and rationalizing virtually unrelated minutiae about largely outdated and mostly unverified beliefs.

It's one thing if you loved this field and wanted to study it anyway, but quite another if you ended up there without knowing what you were getting into. It's better to understand this, so that if someone comes to you for advice, or you are making these decisions for your child, you don't repeat the same mistakes.
I would say it prepared you differently. The argument is between specialisation and generalisation. It could be argued that a large proportion of a CS degree is useless for any one particular application. Who cares about assembly when you write Python all day? How does that computer graphics module help you with a back-end server? Especially when most people have forgotten what they studied 5, 10, or 20 years ago.
Ah, but one might argue there is transferable knowledge in these areas! Your knowledge of assembly helps you to know what's actually happening in the computer and why that bug happened; or that algorithm for graphics is very similar to this one for a server. These are very valid arguments, and a similar thing is true for an even broader education. Deciphering ancient religious texts gave him a skill set similar to the one needed for programming, and potentially different from most other programmers'. So much of life is learning how to reason in different ways; they often come in handy, and different experience leads to diversity of thought, which gives an edge in solving new problems.
I am very much a proponent of generalisation and then specialisation (along with variety in this path) for optimal performance.
> instead spent memorizing and rationalizing virtually unrelated minutiae about largely outdated and mostly unverified beliefs.
That’s not what my religious studies degree was about. It wasn’t expensive Sunday school. They didn’t teach apologetics. They taught scholarship, empiricism and critical inquiry. Philosophy and theology were taught from a historical/critical/contextual perspective, not a devotional/didactic one.
I remember asking a RELS professor about his personal religious beliefs once (in private) and getting a stern talking-to about how deeply inappropriate that was.
I appreciate and agree with your comment generally, but your characterization of my degree is not factual.
I repeatedly encounter engineers who struggle to write even the most basic JIRA tickets. It was a real eye-opener on how many technical people, while brilliant, tend to take writing skills (or communication skills in general) for granted.
Funny thing is that it's deemed unimportant when they have to do it themselves, but appreciated when they're presented with well-documented tooling/libraries.
A lot of technical people don't care to be good at writing.
Not that they actively want to be bad at it, but they do not see it as a priority, they hate doing it, and they are unwilling to put effort into being decent at it.
In more extreme cases, they will even make somewhat absurd excuses to justify it. My favorite is, "We shouldn't write documentation. It's harmful because it just goes out of date." True, documentation gets out of date, but you can plan to maintain it, or you can just mark it as out of date so people aren't misled.
Many are bad at communicating in general, e.g. selectively ignoring/missing parts of an email, incomplete sentences, typos. Some also have the nasty habit of skimming stuff and replying to what they thought they read.
I've seen this from interns to CTOs and remain baffled by its prevalence.
> Many are bad at communicating in general, e.g. selectively ignoring/missing parts of an email, incomplete sentences, typos. Some also have the nasty habit of skimming stuff and replying to what they thought they read.
I've seen this as well. It really makes one wonder how they're able to code properly since programming is, in a sense, also the expression of ideas as text.
Well, it's human nature to try and efficiently summarize. You may have to adjust your communication methods to account for that. For example, I almost always try to write emails that contain one central point.
I have a colleague who's a brilliant developer, but trying to wade through his codebase is like going through a session of Chinese water torture. And the 'comments' in his code are more mind-boggling than explanatory.
My philosophy degree has been extremely helpful throughout my software engineering career, for many of the same reasons. Clear, concise writing is crucial in code reviews, documentation, product requirements documents, and all sorts of ad hoc communication (Slack, emails, wiki comments, Jira cards, etc). This is doubly true for remote work.
Just like writing code, written communication is a difficult skill requiring lots of practice and feedback. A liberal arts degree provides that in spades!
philosophy degree here too. in general, the ability to approach a problem from multiple angles is helpful, not only in software, but (imo) for life in general.
I'd started with software well before university (5th grade - maybe 6th?) and there were not many resources. Our school had a computer, but no classes as such - the staff weren't really even sure what to do with the 3 we had. HS - there were some "computer classes" - intro to BASIC sort of things. I'd already been programming (mostly BASIC, a bit of z80 and 6502) by the time those classes were available.
CS was a thing in university, but after taking just one class (some Pascal class), I was generally put off doing it "professionally" by the difficult social nature of the people in the main computer departments. I was not a social butterfly, and they were really off-putting (and I may not have been very helpful either). I never clicked in that class or the lab, so I dropped that idea as a profession, but fell into it years later accidentally.
Long-time follow-up: I'd programmed (BASIC, a touch of Z80 machine code, etc.) since the early 80s, and after school just started applying (via newspaper classifieds) to anything that sounded vaguely computerish. I got a letter back saying "I've never met anyone with a philosophy degree before - come on in for an interview". And I started doing low-end work for a company reselling OS/2 stuff...
> Oh and just, we did a TON of writing in college. Lots and lots of writing, lots of research papers. I had to learn something new, read all about it, and put together a document carefully explaining the idea, with lots of evidence and citations. That skill has come in SUPER handy for everything from bug reports to API clients, to customer-facing documentation, to internal memorandums.
IMHO the worst bottleneck in both practical computing and even much of computer science is the struggle to explain/document the work in, say, English.
Funny, I'm similar. I have an English Degree (Fiction Writing emphasis) and minored in Religious studies. It's proven to be a positive in so many ways, mostly because I am an excellent communicator and able to explain complex subjects to non-tech folk. I've also been told that I stand out as totally different from every candidate they've interviewed, prior. I started programming on my own at age 11 and have been passionate about coding my entire life.
I just posted some rambling thoughts in this thread about not having a CS degree (or any real degree) and being successful as a self-taught software engineer.
I briefly mentioned how my non-formal studies of religion and philosophy have helped to inform me - mainly in the context of "machine learning".
Your thoughts about how such studies apply to CS in general, though, are interesting. I've noted that there are extreme overlaps between CS and many, many other areas of study, which I believe both inform and are informed by CS (whether consciously known or not).
We may actually be doing a disservice to ourselves (and perhaps to the world) by tying (though rightly, I suppose) Computer Science so tightly to Mathematics.
Of course, that could just mean that mathematics also in turn informs and is informed by those subjects, I suppose...? Which is arguably true from what I understand of mathematics and the history of mathematics!
I guess what I am getting at, though, is that we (humanity, that is) have somewhat "enclosed" CS as a subset or adjunct of mathematics. Many don't seem to understand or "see" that so many other subjects and topics are involved. Not having that understanding may be hindering our advancement (both in CS and perhaps even as a species).
Yes. But CS degrees are not about learning coding. We have coding schools for that. They're about learning the intersection of computing and math. But having "science" in the name sounds cooler, so people take that instead.
Anyone with a CS degree will be able to tell you there's a huge difference between being a programmer/coder, and being a Computer Scientist.
Yes, most people can learn to code without a degree but to be able to fully understand the scientific/mathematical foundation of why things are the way they are, and to use those skills to create something scientifically new, probably requires a degree.
> I had no trouble understanding object oriented programming
Indeed, Object-Oriented Programming is harder to learn for programmers who have experience with other languages already (no pun intended) ;-)
When you know a language like C/C++/Java and you learn about OOP, you think you understand it, but in fact you don't. So while functional programming was probably the easiest course for me at university, it took me 1.5 semesters to actually understand OOP. In the end it is not that complicated; it's just that you have to think more at the meta level than at the implementation level. And I think, for someone who doesn't have access to the implementation level, it's easier, because it doesn't get in the way.
These programming paradigms are really all about impedance match with a model that's already in some human's brain -- going from the brain-space model to code with the least amount of transformation. So if you're struggling to get to the mind-model, something is wrong! It definitely doesn't make sense to begin looking at the implementation details. They're only there to facilitate that simple transformation from thoughts to code.
OOP is a concept. It doesn't care about the implementation and neither should you. So if you are talking about integers, don't care about the implementation which might have limitations like MAX_INT, care about the idea of having a number which you can increase and decrease. That is a concept and it can be implemented in different languages and with different limitations.
Next, understand the idea of 'Everything is an Object'. This means that you have a class hierarchy and at the top is the class 'Object' (so everything can do, what 'Object' can do). Objects can receive so-called 'Messages'. When an Object receives a Message, it calls the corresponding method. So for example, in Smalltalk (the mother of Object-Orientation) the expression '1 + 2' means, you have an Object '1' (always the first thing) which receives a message '+ 2' and when the method '+' is being called, it returns an Object ('3').
I think that is part of the secret sauce that many tutorials are missing. You can read a lot about the class hierarchy (inheritance, class variables, instance variables, ...), but few seem to care to explain what 'Everything is an Object' really means.
So your OOP program is basically sending different messages to different objects.
Finally, what makes OOP so powerful is that certain capabilities are enabled by placing your classes in the right spot in the class hierarchy: an object of a class doesn't just inherit all the attributes, it can also be used as any of its superclasses (polymorphism). That way you can write code that works with a wide variety of objects.
I am sure I missed a few things, but for me, understanding the clean syntax of Smalltalk and the implications of 'Everything is an Object' changed my perception of the OOP concept.
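For readers who know Python better than Smalltalk, the "everything is an object receiving messages" idea and the polymorphism point above can be sketched roughly like this. The class and method names here are made up for illustration; this is not Smalltalk's actual semantics, just the shape of the concept:

```python
class Obj:
    """Root of a tiny class hierarchy: 'everything is an object'."""

    def send(self, selector, *args):
        # A 'message' is just a selector name dispatched to the
        # corresponding method on the receiving object.
        return getattr(self, selector)(*args)


class Num(Obj):
    def __init__(self, value):
        self.value = value

    def plus(self, other):
        # '1 + 2' read as: the object 1 receives the message 'plus 2'
        # and answers a new object, 3.
        return Num(self.value + other.value)


class Text(Obj):
    def __init__(self, chars):
        self.chars = chars

    def plus(self, other):
        # Same message, different behavior, by virtue of where this
        # class sits in the hierarchy: polymorphism.
        return Text(self.chars + other.chars)


print(Num(1).send("plus", Num(2)).value)          # 3
print(Text("ab").send("plus", Text("cd")).chars)  # abcd
```

The last two lines make the polymorphism concrete: the same `plus` message works on any object whose class answers it, so code written against `send` doesn't care which concrete class it's talking to.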
Frankly, and in retrospect, I consider this difficulty a sign of the unsoundness of OOP. While with typed FP you can get entangled in some seriously complex mathematical models, in OOP it's just a bunch of metaphors and mental models you have to convince yourself are true. The effort poured into memorizing one rarely transfers to unrelated frameworks or different languages, or at least transfers at such a high level that it doesn't help much anyway.
OOP is easier to use when describing a mental model as-is, without the analytical deconstruction that (typed) FP would demand.
You could certainly see it that way, but I found that it is a good example to illustrate the intention of having a very simple syntax and processing logic (up to the point where arithmetic expressions are not what you expect them to be).
I'm 39 years into a coding career with no CS degree. But as a compulsive auto-didact (probably a common syndrome among HN readers) it hasn't been a barrier. Self-learning is a continual, daily requirement for coders. If you can continue your education as needed for this job you probably also have the skills and interest to learn from scratch.
I studied the same books and did the same exercises as my brother the CS major and do feel that my training would be incomplete without that. But I don't feel disadvantaged by not doing that within a class structure.
The main thing I lack is access to government jobs, which routinely require credentials I don't have. But I've probably had a more diverse and satisfying career as a big fish in small private sector ponds.
Not everyone can learn coding without externally imposed structure. But those who can't probably have an ongoing problem in keeping up with the state of the art.
> Not everyone can learn coding without externally imposed structure.
That's actually not what a CS degree is primarily about. You learn to code in maybe the first 2 or 3 classes. After that, it's assumed that you can translate ideas into code and you start learning about different areas of computer science.
If you just need to learn to code for a job, there are bootcamps that can teach you that in a fraction of the time and cost.
> But those who can't probably have an ongoing problem in keeping up with the state of the art.
Personally, I wasn't able to learn to code in any meaningful way before college. I had tried to learn from books and online sources but never got beyond basic scripting. After working my way through a bachelor's degree and PhD, I don't have much trouble keeping up with the state of the art now.
Good comment. I can code, and I am knowledgeable about computing in general terms, but I am not a computer scientist as I haven't benefitted from the rigour of formal training. (I am a lawyer, fwiw).
There's no real fixed meaning to computer scientist, so you can call yourself one if you like. I considered myself one when I was doing computer science research for my PhD, but I don't consider myself one now that I'm working as a software engineer.
I suffered the opposite experience. I can’t learn in an academic setting. If I’m faced with a problem (read: motivation) I’ll learn what I need to solve it.
You're lucky. When I worked for the federal government (USDA/USFS), there was no way I was going to get access to non-Windows servers for deploying my projects. When we brought up the idea of using AWS, they told us it was impossible as they had no way to handle recurring monthly billing (obviously, a lazy excuse). The only Node project I managed to get deployed ran behind Apache on Windows Server 2003... and it performed as horribly as one might expect an event loop to perform behind a threaded proxy.
I think it's possible, but when bidding for contracts I think it matters. When I worked at a smaller company that did government contracts, my lead was always telling me I should work on getting a master's, as it looks more attractive.
>a custom build of the open-source search engine Solr, a highly responsive UI engineered on top of React, a high performance distributed brokerage system, and cloud-based hosted services with Kubernetes in Amazon Web Services. The primary programming languages are Python, Java, and JavaScript.
I'm in a similar situation. Dropped out of uni because it bored me, easily got a dev job (I was really lucky), worked for years as a developer, and then an architect.
I eventually did study part time for a degree, mainly out of belief that it might help with future career opportunities, but also because I might enjoy it.
I got a 1st, with honours. TBH, the only part I truly enjoyed was the final project. The rest was mildly interesting at best, and a bore at worst. I found 80% of it easy, 10% difficult (maths), and 10% challenging (the final project, but only because I made it challenging).
In retrospect, I don't think it's really made any difference to my career, as I'd proved myself long before getting the degree. I don't necessarily regret it either, though. Maybe the most useful thing was getting me into the groove of reading academic papers, which has benefited me greatly, both work-wise and in my personal life (health issues).
The part about needing to continue your education daily is something a lot of non-CS/IT/programmers people don't understand. I know a lot of friends who want to get into a tech career (say, webdev) who don't get this. They think all they need to do is get a CS degree, or take a course and they'll be set for life.
A lot of people IN the industry don't get it either (especially, but absolutely not exclusively, folks who come from alternate backgrounds).
I've been a consultant at several companies where other software engineers REFUSED to learn anything unless:
A) it was during work hours
B) the company provided a structured environment to learn the thing (usually in the form of paid 3rd party trainers coming on site).
Needless to say, those teams were far from successful.
There are also more and more folks coming in for the money (nothing wrong with that) expecting it to be a 9-to-5 job. It absolutely CAN be a 9-to-5 job given the right environment/company/structure, but no one in their right mind would pay an average west/east coast software engineer salary for a 9-to-5 employee. The extra responsibilities, continual learning, possible on-call, etc., are all baked into those crazy six-figure salaries everyone drools over.
There's totally a time and place for a more typical schedule. Just expect to be paid accordingly. In the short term the market hasn't quite adjusted to the idea that not all software engineering roles are the same, so there are plenty of people making 200k+ for doing essentially clerical work (that just happens to involve code), but it's just a matter of time before that changes (it's already happening).
From what I've seen it is individual, and also depends on the area of CS. There are lots of people who are, for example, very good at JS, CSS and HTML, doing fantastic UI work without having a CS degree. And there are some with a CS degree who can't really solve real tasks effectively. Some people learn only what is needed to pass an exam; others continuously want to learn more and be better.

I, for one, started a degree but dropped out due to lack of time and plenty of work opportunities. While school taught me some useful stuff, and could have taught me even more, I continued on my own. I am an introvert, I don't hate math, and I have a lot of patience. I learned so much over the years: C, Python, PHP, bash, vim, Linux administration, Docker, Kubernetes, MySQL, Go, HTML, CSS, Vue, Flutter, HTTP, CORS, CSP, OpenGL, linear algebra in graphics, mutexes, atomic ops, Atmel AVR programming. I also learned about free software, GNU history, the Mozilla/Netscape story, Gopher. The majority of this was self-taught, because my life is CS; I'm reading or doing something CS-related all day because nothing interests me more. Yesterday I discovered Dgraph and I'm excited to learn it.

There are math-heavy and scientific problems that require more school-type skills, but most real-world work is surprisingly not like this. On the other hand, lots of people lack CS history; I've heard some modern web devs don't even know what an HTTP header really is.
Hey, founder of Dgraph Labs here. If you are interested, we would love to consider you for a role at Dgraph. Feel free to reach out to me at Manish at Dgraph.io.
> The main thing I lack is access to government jobs
I guess my question would be which government? I do government contracting (US), and degrees are not required. The labor categories are typically written so that if you don't have a degree, you just need an additional 4 years of experience. On almost every contract I've worked on I've met a developer without a degree.
It depends on the position. It's a lot harder to get work as a government employee without a degree in the US, but contractors seem to have more flexibility. I worked for a government agency while I was in the military, and I can't even get an interview for the exact position I was doing for a couple of years because I don't have a degree. I know a few people who've missed out on jobs because the government position has a hard requirement for a degree, even if it's not relevant to the position (like computer science for a security position).
A decade ago, when I was on the job market, I saw lots of ads for government IT jobs, seemingly all listing CS degree requirements, so I didn't bother applying. Maybe it wasn't a hard requirement then, maybe it has changed, or both. I have an ethical preference for market transactions, so I didn't push on that door as hard as I might have if other opportunities weren't readily available.
Can you elaborate on some of these books you are referring to? I'm a self taught developer in his first real, institutional engineering role and I am worried I'm in over my head sometimes and would love to read some "must haves".
I got into this stuff in early high-school, and those were just "intro to C++" books.
The book that kind of changed my life (and a bunch of other people's as well) is one I bought when I was 19 called "Structure and Interpretation of Computer Programs" (SICP for short). It's a bit heavy, but if you know the basics of programming it's not too difficult. This book really teaches you a lot of the fundamentals of software, and to me is the must-have book in compsci. It's available for free on MIT's website.
Then I just went on ebay and looked up "discrete math textbooks" and "discrete structures textbooks", and bought a few of the cheaper ones.
Then I started finding individual topics that interested me, which largely dealt with distributed computing, weird abstract math, and video processing. For that, I bought the book "Programming Distributed Computing Systems" by Varela, "Certified Programming with Dependent Types" by Chlipala, and random books on Fourier Transforms from eBay.
This was over the course of a few years, intermixed with about a million different blogs and tutorials to get better. I've found that if you stick with the more "theory-heavy" languages like Coq, Idris, or Haskell, you end up picking up a lot more of the compsci concepts you might learn in school, just because there aren't a million people constantly yelling "OMG YOU DON'T NEED MATH FOR <insert language here>".
Yeah, agreed about the language choice. With math-y languages, the overall quality of presentation and material is higher than what you would get if learning the same thing in Python or JS.
As others have mentioned, there are many folks working in the gov world without CS degrees. This is especially true for older workers who graduated at a time when CS departments weren't as commonplace at every college as they are now. However, if it's credentials you are after, governments love certifications as well.
38 years here, I had undergrad in Chem and part of a masters. Had no college classes or experience when I got my first programming job. Always worked at the leading edge of whatever I did and always had success. Was never an issue. Today I wonder if I would have even gotten an interview.
Since these articles seem to be putting a cash value on the word "success" (or at the very least an Instagram buzz value), I would point out that IMHO most of this success comes from people skills.
You can have a degree or not; even bad developers can hop from one job to another, making more and more money in the process, given they're good enough at manipulating recruiters/clients into hiring them. Making money does not mean your product is good, useful or well-designed. It means you're a good salesman.
Congratulations, you're making money by manipulating someone into giving you their money. Awesome! Thanks for playing, please leave.
I guess the point of the website is to encourage people without a CS degree to learn programming. That's great. The distinction between CS graduates and non-graduates seems very unfruitful IMHO, so encouraging people to pick up a computer is cool.
What I would suggest is to stop promoting sensational stories like the kid making $15k a month. This is purely manipulative and dishonest; this kind of journalism is plain wrong. Even if the numbers are true (which I highly doubt), encouraging kids to drop out of high school to study CS on their own is wrong. Getting a degree is the best chance you have of becoming good at it. You can still be creative this way.
What's more, maybe encouraging good engineering (or craftsmanship, whatever you want to call it) instead of Instagram buzz and money would make tomorrow's software a little less frustrating to use and a little less bloated.
I would love to see a story, just once, with the title "this guy made a beautiful piece of software, and it's awesome"
> Getting a degree is the best chance you have of becoming good at it. You can still be creative this way.
I think you're going to see these kinds of arguments made ('Don't attend University. Learn [x] instead/on your own.') and each day the argument is going to hold more merit. The university industrial complex continues to expand and become ever more out of reach for a large section of the population, or worse: it's becoming more and more fiscally irresponsible for most people to attend higher education, taking on the mountains of debt necessary to make it possible.
So the sensational stories of survivorship bias about the one-off people making exorbitant amounts of money can definitely go... but a university degree is becoming more and more impractical, and I expect it to get much worse.
Maybe in the US college debt is a problem, but there are a lot of countries where education is not as expensive. I attended engineering school for €200 a year, plus help from the government for accommodation.
That degree got me my first job. And the next. I started with zero debt and I do not come from money. I am very grateful for that. I would probably not have attended a 5 year college if I had had to get into debt.
> but a university degree is becoming more and more impractical
In the US*
All that means is that either alternate methods (eg: trade school) will become much more sophisticated, or that the US will fall behind countries that have solid, but much cheaper CS degree programs (helllllllo Waterloo)
This is turning into a "I'm self-taught and I can code just as well as the next CS graduate" debate.
I totally agree with that proposition. The degree is not a sine qua non condition of competence.
My point was that success stories about kids making a lot of cash on a self-taught developer skill set are pure and simple lies. They hide the truth from those who really want to get into the IT industry and make a living with it. That truth is: work on your engineering skills if you want to build cool things.
I said make a living. Not get crazy rich. Engineers don't get crazy rich. Salesmen (and salewomen) do.
People who did not have the resources or desire to go to uni would get a lot more encouragement from real developer tales: stories about smart engineering, beautiful UIs, dedication to a craft and passion, or even sly hacking.
Instead they read stories about skilled salesmen. The stories are not even that good; Glengarry Glen Ross and The Wolf of Wall Street are much better.
"encouraging kids to drop out of high school to study CS on their own is wrong. "
What about encouraging people to learn CS on their own who have no way to go to university (lack of formal education, money, time ...)?

For those people, what usually matters is not academically valued software but making a living. And many people dream about making a living in IT, so they want to read such success stories; that's why you see so many of them.
Of course encouraging people to try and get in the IT industry even without a degree is good too. I thought I said so in my comment. I am in no way a college elitist. I know not everyone can go to uni.
My whole point was that the truth of what it is to code for a living is hidden by those kinds of sensational articles. Those who want to get into the industry are not encouraged to work on what matters (i.e., engineering skills instead of fancy frameworks) and end up disappointed with their opportunities or their skill set. CS programs tend to push students toward developing an engineering skill set.
I would really like to see people without a degree that want to improve to be encouraged to do the same instead of being encouraged to read 3 wordpress tutos and start selling their skills by manipulating small business owners into spending money they don't have.
I trained developers with and without a CS degree and I always nudge them into improving engineering skills.
You see success stories of people becoming actors, musicians, or popular YouTubers but not once in a million years would I encourage a kid to pursue that path full time without them knowing what they're getting into. I feel like there are too many success stories in programming without enough stories of failure or hard work.
At some point the saturation for (entry-level) programmers with degrees will be enough that (entry-level) programmers who are self-taught will be in zero demand.
This assumes that a) the number of CS graduates keeps growing at a steady rate, and b) that the market puts a significant amount of value on a degree. Personally, I don't think either of those is true, especially the latter. With the cost of education continuing to rise in the US, I actually believe the value of a CS degree is trending towards zero, considering that the cost of the alternatives (code school and self-teaching) is much lower in terms of both money and time. And anecdotally, I've noticed that software engineers are in such demand right now that even companies that officially require a CS degree are easing that requirement, opting to instead focus on the potential of the candidates at face value.
Doubtful, you're ignoring the fact that self-taught (entry-level) programmers cannot simply be replaced by (entry-level) programmers with degrees (or vice versa).
Both have their up- and downsides, and treating them like equals ignores what makes each of them unique.
There will always be a market for self-taughts, if you don't understand this, you don't understand what makes a self-taught special.
As an employer, I want the best. All other things held equal, someone trained from some kind of institution will give me more confidence for employment than someone self-taught. Self-taught shows initiative, but institutional learning applies a baseline training employers can trust.
Agree 100%. Not to mention, having a degree opens up a lot of opportunities in case you ever want to leave the field. Many people get sick of coding after doing it a while due to its highly repetitive nature. Having a degree essentially gives you a pass to enter a myriad of fields that still require one.
I'm not sure that makes complete sense. I don't have a degree in CS as I was self taught and actually bothered to follow online courses and read documentation. If I get sick of coding and want to move elsewhere I still have the option to go to university.
I don't see why you would want to get a degree on the off chance you won't end up liking the job and want to go into a different field.
In my situation I learned how to code at a young age, and I wanted to go to university to fill in the blanks. The first semester of my computer science BA was a course on Facebook and the internet, which cost 3,400 GBP for just those few months. Then and there I decided I wanted nothing more from this course.
In my situation I thought I learned to code at a young age. That's a great way to phrase it too. If one (cough, OP's article) is using phrases like "learned coding [in x months]" then they have the same misunderstanding that I did when I packed up and went off to college thinking I was hot shit at 16 with 4 years of BASIC and PHP web dev under my belt. In my case it took college to expose me to a whole universe of new ideas and really get the point across that "coding" isn't a thing you can "learn" in some number of months. Given that nerdy teenagers are notoriously both cocky and stupid, I suspect that this is a common experience.
Of course there are the Peter Deutsch wunderkinds and the people who get a really good secondary education and can jump into CLRS on their own without hand holding from college, but I don't think there are very many of them out there.
And adult learners are a whole different matter for whom the decision to go to college for CS is radically different.
It's also called burnout. Often you will spend years slaving over one program, making improvements and changes for different cases while simultaneously maintaining it, installing it, and documenting it.
After years and years of this, 8 hours, 5 days a week, it gets really monotonous.
You'll thank yourself when you think maybe I can get into journalism, or HR, or accounting, or any X-degree field at a different company without even having to take more college because you already went.
There are also bad universities; you went to a bad one, or you didn't continue far enough past general ed.
> You can have a degree or not, even bad developers can hop from one job to another, making more and more money in the process, given they're good enough to manipulate the recruiters/clients into hiring them. Making money does not mean your product is good, useful or well-designed. It means you're a good salesman.
Good developers can also hop from one job to another and make more and more money in the process.
Job security is a thing of the past, whether you went to school or not.
The "ideal" developer isn't some kind of monk who dedicates his entire life to one firm.
Making money doesn't mean your product is bad, useless or badly designed. It also doesn't necessarily mean you're the best salesman.
Sometimes experienced developers without a CS degree and without sales skills still get paid a lot because:
A) the requirement doesn't call for advanced computer science knowledge
B) the requirement does call for a significant amount of practical experience and business knowledge
C) it's a highly profitable industry and the specific work being done will generate the company profit and/or cut costs
It can be perfectly legitimate to perform jobs like this without having a CS degree or ever intending to get one.
It can be perfectly legitimate to work your way up the ladder of such jobs and jump from job to job in the process.
Some of us want to raise a family and/or be able to retire one day. Why should we put our careers on hold to pursue a degree?
The difference between manipulation and influencing is whether or not something underhanded has occurred. You can sell yourself while being totally honest.
So many things in business are sales transactions, and it's better to come to peace with that early on. Sure, we're engineers but if you want your value to be recognized you have to learn to be a businessperson too.
Indeed, and the fact that unskilled engineers with good business skills can and do climb to the top proves my point: those success stories happen because the kid is a good salesman, not because he taught himself software engineering.
This does not prove you can make money by learning some WordPress overnight, but the article pretends exactly that. The average worker with average sales talent will have to present real skills to an employer to get hired. Those real skills take time to develop. It's evidently possible to get them outside of uni, but it takes time and effort. Guidance by experienced people reduces the time and effort needed to learn a skill. That's the whole point of schooling, by the way.
Those articles say "I taught myself how to code and became rich by offering the world an awesome product". They should say "I'm a very good salesman and I became rich by getting people to buy my software".
That software is maybe good, maybe not. Mostly not. The second law of thermodynamics teaches us that you can't beat 30 years of experience with overnight schooling, except if your father is Zeus or someone like that.
It's fun when people assume a CS Degree means you are a developer. Practically all development is self-taught, a CS degree is only handy for those moments where you dig into some abstract theory, which in most development jobs hardly ever happens.
You mostly spend (or waste) time on tools, frameworks and interaction (both with humans and other systems). Lots of common development is rather mundane compared to all the beautiful theory behind it. Which is not automatically bad, but also doesn't require a CS Degree to get going.
CS is more than abstract theory. It gives you a foundation that you are otherwise unlikely to get. Self-taught means learning things useful for your existing job or things you find interesting. I work almost exclusively on the frontend, but my CS degree gave me enough foundation that I'm fairly familiar with compilers (I wrote one in college for a required course).
I don't necessarily enjoy writing compilers. But the fact remains that I can write a compiler. I can program in C or C++. I do understand pointers. None of this may hold any real value, but I wouldn't be able to do any of them if not for the degree.
> But the fact remains that I can write a compiler. I can program in C or C++. I do understand pointers. None of this may hold any real value, but I wouldn't be able to do any of them if not for the degree.
I liked your initial point, but the argument weakens... People absolutely have learned all the above things (and very well), on their own, outside of a CS degree, and I bet you could've.
I wasn't advocating that people don't. There are absolutely vastly better compiler programmers than I am that have never taken a formal class.
My point was that I wouldn't have learned it because I don't enjoy it. And sometimes, professionally, we get to do stuff that isn't fun. The nature of being self-taught is heavily biasing the things that interest you. College forces you outside of the box. Outside of the box happens to be those things for me, but they'll be other things for other people.
As a self-taught developer I can say your point assumes self-teaching is all about following your interests. This is simply not true. Self-teaching is a combination of following your interests and figuring out your areas of weakness and attacking them. Part of my self-learning involves browsing college degree requirements or graduate programs to identify must-know topics; I learn them regardless of whether I have a particular interest. If college is the only thing forcing you to think outside the box, what hope is there for your long-term learning?
My point assumes people bias their learning towards their interests. That isn't an opinion, that's just human nature. That's not to say you can't learn things that don't interest you, but if you aren't going to do it formally you certainly have to be driven by filling those gaps and most people aren't.
Learning to think outside the box and learning about specific specialties that you don't find interesting aren't even remotely the same thing. If you read what I wrote, you'll notice the example areas I gave. None of them were "thinking outside of the box", they were fields that didn't interest me.
The value they bring is that I learned a common language. I can talk to ML engineers or Compiler Engineers in a common language because I have experience in those fields that I wouldn't had I taken a different route. That isn't even remotely a requirement for my field. It's just useful for discussions with other people.
Maybe you'll spend hours and hours learning compiler design and implementation so you can have a friendly conversation with a colleague at lunch, but I'm not going to. Formal education forced that on me. That's been healthy for my professional networking, but that's about it. I still consider it useful, but had I skipped the CS degree I wouldn't do it. Nor would most people.
At the same time, people who are self-taught typically have more time and energy to attack things in more depth. My CS classes are pretty 'standard', and in my free time I can go on long tangents where I really learn. I learned about assembly programming and the basics of how CPUs work from self-learning, rather than getting a very simplified understanding from a class. There are many things in CS that cannot be learned in one semester, and people who are self-taught can take things at their own pace and jump between topics, something busy college students have little time to do.
No, I'm speaking about human nature. Most people don't learn things they dislike because it fills a knowledge gap. How many people do you know with hobbies they hate?
It's natural to gravitate towards things that interest you. Some people are driven by gaps and learn things regardless, most people do not. That's not even controversial.
Would just like to say that this is a fantastic point, and really makes the case for a formal education. It takes a special kind of self-motivation (which I readily admit I do not have) to diligently learn those uninteresting-but-necessary parts of the CS curriculum.
There is actually a difference between being self-taught as you describe it, and being an autodidact.
The latter is able to identify holes in his own knowledge, and find appropriate material to educate himself on. That ability is what defines an autodidact in the first place.
This is also the very same thing you get taught in basically all sciences: how to acquire knowledge about a subject matter you yet know nothing about.
So, an autodidactic programmer without a degree will most certainly have similar knowledge to a CS graduate, simply because that is how such people "tick".
It's just that not every self-taught programmer is an autodidact. And in that case I agree with your argument.
Exactly this. The way I think of it: It expanded the space in which I can search for solutions. I have enough underlying detail of the computer science map to pinpoint where fruitful design opportunities lie, and what patterns are likely to fit, /and/ which concepts would make a poor mapping for a given problem, leading to bad abstractions.
> But the fact remains that I can write a compiler. I can program in C or C++. I do understand pointers.
EDIT: Note - I see your replies and reasoning with others on what you meant; I do agree with you there, too.
---
I know you can do all of them without a CS degree.
Where a CS degree really helps, though - and this is something that I know I lack (but real-world experience has taught me enough that I can get by) - are what could be almost considered CS fundamentals:
* Data Structures
* Sorting Algorithms
* General Algorithms
Fortunately, in the day-to-day realm of most programming, you don't need to implement any of them from scratch, and most of them you can learn the basics of (or "best common examples") fairly quickly, which is where I am at mostly.
The problem happens when you run into a situation that isn't seemingly addressed by that level of knowledge (which again, thankfully, in most day-to-day work is extremely rare); by having that prior exposure via a CS degree, you would at least potentially know some words or phrases to look up to refresh your memory on concepts.
Further, you would also have the knowledge (or refreshing of knowledge) to be able to understand any new developments that may have occurred since you were taught (and so could easily read any research papers on the topics).
If I were in such a situation, I would need to break out some Google searches on the various topics (i.e. structures and/or algorithms), use whatever understanding I have to pick out potential candidates to study in more depth, and maybe hope to run across more recent stuff that may be better (and pray I could understand it enough to implement it).
The one upside in this day and age, though, is that for all of these, for virtually any language I currently understand, someone has likely written an implementation of it - which can be extremely helpful both in the solution to a problem, and in understanding that solution.
But a CS degree would make things much simpler, were it something I had time to pursue. I could probably find the time, but the cost vs benefit ratio no longer works out to my favor, beyond self-edification, given my age and current life obligations.
The failing here is that you have put a (rather low) ceiling to what an auto-didact can achieve. There are no hard limits to a self-taught scholar of computing, certainly not things as elementary as data structures and algorithms.
Writing a compiler is just a side effect of showing it could be useful. Hard programming doesn't equal CS; it's the science part of computer science that matters.
I'm about 3/4 of the way through writing a book on implementing interpreters and compilers [0]. My day job is as a software engineer at Google working on the Dart language.
My highest attained degree is a high school diploma from a shitty public high school in southern Louisiana. I went to LSU for a year and a half and dropped out.
No, and my post never implied anything like it. On the contrary, my point was that I was forced into it by degree requirements. I'd know nothing of compiler design or implementation today without being forced to learn it. It's a valuable skill to be sure, but not one I'll ever use professionally because I find it frustrating and boring.
And that's my point. If you're self-taught you have to be willing to learn things you find completely uninteresting (or boring, hard, whatever) if you want to be as well-rounded as a CS graduate. Most successful self-taught programmers do exactly that. Most people don't.
> If you're self-taught you have to be willing to learn things you find completely uninteresting (or boring, hard, whatever) if you want to be as well-rounded as a CS graduate.
Not necessarily, you could also just find all that CS stuff interesting.
The CS program taught me the history and context for Computer Science. It taught me the fundamental physics that make a gate work and how to put those gates together to make a logic circuit. It then built on this one step at a time, walking us through each layer from how machine code manipulates bits to how to properly evaluate the run time of a complex algorithm. CS told me the comprehensive story.
Earning my BS and then my MS did not increase the amount of money people were willing to pay me to build things. However, I can generally tell when I'm working with someone who is entirely self taught and I assume others with degrees can tell also.
1. "I can just tell that person is self-taught." (No, they went to a contemporary dotcom feeder vocational CS program, and spent all their time adding the latest framework keywords, and drilling for the leetcode whiteboard interview.)
2. "It hasn't occurred to me that other person might be self-taught." (Yes, they are.)
Further, consider the books on which the curriculum is based. Diligently going through them will often teach you more than the structured class based on them. When Coursera first came out I was very excited to take CS classes from top universities. Then... I was surprised at how generally poor quality the lectures and material were. Outside a few standouts, I think you are vastly better off reading quality, well-reviewed books and papers than taking a course or watching lectures. (Unsurprisingly this felt true of my actual degrees as well.)
At my University, CS mostly meant learning to program and hack on things unless you specifically did the theory track. Only two of the required classes were theory classes. Compilers and operating systems had some theory in it, but it was mostly about learning how these things work by building your own.
I sometimes feel like I have a leg up based on when I learned computers (self taught mid 80's to early 90's). At that time, everything was more low level, and there were fewer, yet higher quality books on the market. So I started off with W. Richard Stephens, K&R, Pike, Sedgewick, and others that you either had to really get it, or you wouldn't get anything. Not that this was the best way to learn, but you knew coming out of it what you learned.
I have a CS degree (in fact, I have both a BS and a Master's degree in CS) - I'm still "self taught" in the sense that I learned to program the same way you did, by reading Stevens, K&R, Knuth, etc. outside of my coursework. Don't get me wrong - I learned a lot of interesting stuff: I never would have learned calculus if it hadn't been a degree requirement, and it helps to understand a lot of AI, but learning to program is sort of a "side effect" of academic computer science.
I think that's one thing that's making me think twice about doing a CS degree now e.g. by night — a lot of what you learn at university isn't in lectures, it's in the ample free time that you have between lectures.
With limited free time, it's hard to know whether it's better to self study or do a degree (or even just follow along a curriculum online)
I go between trying to work through things systematically to get a good grounding where I have knowledge gaps, and jumping to what I think will help me best on my job at this point in time.
It would probably be better to stick to one approach or the other to make more focused use of time, but I'm not sure which way is best! :)
I also have both a BS and a Masters in CS. I took a required software engineering course in either 1999 or 2000. I think that there may not be a single thing taught in that course I would agree with now.
I learned a lot of useful things, like how to write a compiler. But I would say that in the sense of what I really do day-to-day to "program", I'm effectively self-taught too.
(I walk the walk, too; I can and have hired people without degrees, and it's been fine.)
That was similar to my experience on a CS course - you were expected to do a lot of development, but nobody really specifically taught you about it and you were expected to pick that up by yourself as you went along.
In this field, you're always going to have to learn things on your own... The trivial example is new technologies that weren't available when you were in university.
That's how my CS program was (late 90s) as well. You were expected to be somewhat proficient already in C/C++ programming going into the major. Much of the program was mathematics courses. It was brutal. They eventually saw the high drop out rate and re-configured the major. I envy the kids who get to go through the major now. The courses look much more interesting and it's comprised of more actual software engineering coursework.
I graduated with a masters in 2013 and my courses involved a lot of calculus, and some courses on theoretical foundations of computing ++
Still learned most of how to be a developer on my own, but I find immense value in the education. I learned to see how nothing in computing is magic, just hard work. In 2014 they changed the algorithm introduction course from C++/java to python, which I reckon is a good thing.
I finished my CS degree in '06 and it still was more about math than programming (C++, no Java). To graduate I had to take single/multi-variable calculus, linear algebra, vector geometry, discrete mathematics, differential equations, combinatorics and numerical methods. When I started there were 600 in my class; when I graduated I think there were only 60!
I started programming properly in 2010 when I was 12 or 13. Before that I did some small things in Inform 7, and now I remember it, Visual Basic, but I don't really consider that 'proper programming' as it was mostly cutting and pasting code with little understanding. I started with Inform 7, Python, and a little bit of Processing and Lua. I did three or four Ludum Dares and at some point I made most of a game in Lua with Love2d (around October 2012ish).
Maybe a few years later I started delving into 'hacker culture', and I picked up C with K&R (since it was almost universally recommended and easy to read), I also at some point picked up UNIX Power Tools (recommended in Linux Format) and The Practice of Programming. At some point after that I became really interested in Compilers and Operating Systems.
When I went for my first internship three years ago, the interviewer said I knew more about the things we talked about than most other bona fide engineers they interviewed. The university-hired interns who sat next to me couldn't read the Erlang manual or documentation because it just didn't click with them, which, while I can appreciate on some level that people have different sources of information that suit them best, is still astonishing to me.
Consider that programmers started out as secretaries learning how to program the machine from the specification and the schematics given. A lot of the programmers I have met, probably wouldn't be able to cope with such a task, I include myself in that category.
I agree with you about the quality of books. A fair chunk of the books on my bookshelf from that era are 80s/90s to early 00s because those books are just an order of magnitude better written, with a high signal to noise ratio.
Can you imagine a modern language like Swift or Rust being taught to the same level, in the same space as K&R? I can't!
> I agree with you about the quality of books. A fair chunk of the books on my bookshelf from that era are 80s/90s to early 00s because those books are just an order of magnitude better written, with a high signal to noise ratio.
This is a common misconception. There were plenty of shit books then, too, you just won't find people remembering and recommending them. Whereas we haven't, collectively, discarded the shit books of today, yet.
I don't think that's quite the case. I think there are more books in the 2000s written about basically nothing. Look at the "For Dummies" series of books. They're accessible but the content they teach could be condensed into one or two blog posts with nothing lost except the material is easier to read in a single sitting and much easier to reference.
Those books are still quite useful today: I learned C from The C Programming Language, for example, and I’d still recommend it as the best book to read to learn the language (though a nice supplement that covers how modern compilers work would be useful to accompany it).
I started with MASM-style assembly, and transitioned to higher level languages when I got frustrated.
I think this angle has made a huge positive difference in my ability to reason about my work. I think if you start with high level languages, it's easy to ignore what you're actually doing.
Same here, with the added benefit of there being a handful of really good resources to learn from. The manual of my C64 pretty much taught me all the basics in a very approachable manner (I was 7 when I started).
Nowadays, there are so many resources (mostly low quality) that people waste too much time trying to decide, or learning things wrong.
I had a similar experience with C64 and its User and Reference manuals at that age. I was thinking about that a couple of months back, and found that they're still out there as PDFs of the originals:
I think it's pretty well known that you can work as a successful developer without a CS degree. However, as someone who is currently working through a masters in CS (after working in software for the last decade), I would say that CS courses (some, not all) provide an invaluable foundation that would be difficult to obtain otherwise (of course you can self-study CS). Sure, you don't need to know how the OS works to write a web app that rakes in cash. But there's something beautiful about peeling back the layers of abstraction: you gain an appreciation of how things work under the hood.
You can "peel back the layers of abstraction" without a CS degree, all it takes is time and (admittedly rare) dedication. Webservers, operating systems and books on theoretical CS are all out there, you just have to be interested.
It's possible to learn most things through self-study. But the vast majority of people will only learn the parts that they find fun and interesting.
They generally won't wrestle with the tough sections in a systematic fashion, the way you are forced to in a decent CS program.
For most people (even most professional developers) the only practical way to gain the equivalent knowledge you'd get with a CS degree, is by getting a CS degree.
I worked for years as a self taught professional developer before going back btw, and I've hired/interviewed many boot camp graduates, self taught programmers, and degree holders.
Like whatshisface said above, quality self-study is rare. As you point out, people will learn what is fun/interesting, but I will also add that they will just learn what they need to get whatever job done.
I have a friend who does webdev (mostly WordPress, some Drupal) who did not go through a CS program, though he took a few web dev classes in college. He does good work, but he has a very loose grasp on fundamentals. He showed me a side project he was doing (a web-based clicker game) and after looking at the code, I suggested he try adding some classes to reduce his code duplication. After explaining the basics (and some quick research on how classes work in JavaScript), it kind of blew his mind. He had never used classes in his code because he just had no reference point.
We've all seen the utter messes that recent CS graduates have made with classes, unnecessary inheritance and over complicating problems, while being unable to do fundamentally basic logic, you must know the opposite problem is equally true.
CS majors who cannot program applying academic theory without understanding.
I'm old enough to remember graduates rolling their own sorting algos, or building classes 6 levels of inheritance deep, while simultaneously creating massive if/else nesting with tons of duplicated code, unable to understand how to use basic ideas like functions and recursion, even though I'm sure they probably had a whole one lecture on the subject, mixed in with whole modules of pointless compiler lectures.
Because that's what their CS degree taught them.
How both CS graduates and non-graduates really learn is by seeing what other programmers do in the industry, or by making their own mistakes.
I agree about the difficulty of self-study. I remember, in my student days, asking the lecturer why we had to study some of the uninteresting CS topics. They were quite boring, but they really paid off in the long run.
No, you don't need a degree. All you need is a resolution to learn the things that you're missing. I covered C++ in all its hairy glory, assembly, SIMD, data structures and algorithms (implementing them, not just using them) and concurrent and lock free algorithms and data structures. Some of the data structures I created perform better than anything in the published literature - but I couldn't be bothered to write papers. I might do them as an in depth blog post one day.
I went way beyond what any CS grad does, and it cost me nothing in money and less time than a CS degree. If you're motivated you can beat a university CS education - it's actually really not that high a bar.
I agree with you that most people won't do that. I was home schooled, so I have a different attitude about learning than most people, and that's made all the difference. But it's fundamentally a problem of motivation and goals, and you don't need university for either.
> some of the data structures I created perform better than anything in the published literature - but I couldn't be bothered to write papers.
You can probably imagine why many people wouldn't believe you. "My work is amazing but I am too lazy to publish it" is a peak stereotype for self-educated people. My prior on this being ignorance rather than genius is super high. And if it is true that you are a once in a generation genius who could outpace the research community then your experience is completely useless for others, given that you'd be so much smarter than a typical person who is doing self-education and trying to get a job.
It's not really that hard. For example, I made a queue that outperforms the lynx queue. My fundamental instinct after reading that paper was that there's no way faults outperform bounds checks on a modern, speculating, out-of-order CPU. I ran some tests and yes, my instinct was correct. Maybe the result wouldn't hold under peer review for some reason; I just tested a few more ideas/criticisms with a colleague and moved on. I don't personally benefit from trying to publish the result, and I have enough demands on my time.
I'm not a genius, unless you count an online IQ test I took in my early twenties, but I'm deeply skeptical of those. It's true I'm well above average though, so perhaps my experiences don't generalize. I don't believe that though. I still think it's motivation and goals that count.
Haha, well I'm not sure I can honestly deny an accusation of being a genius, at the same time I'm not sure I can affirm it either - and just to talk about it seems to imply a lack of humility. Irrespective, there are lots of kinds of intelligence and I'm blessed with some of the less useful forms of it that happen to test well.
So, in reality, you have no idea if your idea is truly better and it looks like you just ran some specific test cases and assumed it applied.
Still paper worthy, but far from thinking that you made anything better than published literature.
And the goal isn't to "personally benefit" it's to know that your ideas survive scrutiny and contribute something meaningful to the world.
You're making a lot of assumptions there, about which you're mistaken. I don't need to defend myself here. If that's what you want to believe, that's fine.
The only assumptions I'm making are what you explicitly stated. And the cardinal rule is: what's stated without evidence can be dismissed without evidence.
Either way, if you don't want to defend yourself, that's fine.
Because you've claimed that you made substantial strides in algorithms that nobody else has supposedly done. Why not get a free PhD out of the deal? If anything, you can show it to a professor and they should be able to recognize it. After all, you could be advancing the field directly.
Honestly, if you think writing a two to three page paper is "a lot of effort", I doubt you've done what you've claimed.
You overestimate the value of a paper. It's worth approximately $0. But would cost maybe a week of time or about $6000 if you only count what I can sell my time for. Terrible deal.
Yet your ideas could get you millions in publicity, fame, etc. to spend literally one week to advance the field.
The fact that you actually argue against this is laughable, not to mention incredibly selfish.
What makes you think the idea is significant enough to get “millions in publicity”, etc.?
He’s not claiming he invented cold fusion, just something that would be an iterative step in improving the state of the art, probably worth one paper in a decent but not super prestigious journal. Where do you get the idea that it’s worth a free Ph.D, fame and fortune, and all that?
Because I’m aware enough of the CS world that I just know that an iterative improvement in an obscure data structure is not enough to become rich and famous.
This. It's an iterative improvement on a data structure that isn't actually used the way the benchmarks use it; the benchmark environment is totally artificial, with little practical significance. It warrants a blog post more than a paper. Nobody is going to get rich or famous off of it. If someone wants to write it up as a blog post, I'll give them the code.
Why do you assume it is an iterative improvement? You can get a free PhD for a lot less. You can get rich and famous for a lot less. You don't have to make cold fusion viable.
This is absolutely true, however in reality (= in practice) very few non-CS degree devs _will_ reach the same level of knowledge.
When you have to learn all that stuff to get your degree, then you just have to learn it. Period. Also people who are full time students have more time to spend on these topics, but when you are actually working 8h+ per day and your career is mostly around JS frameworks it becomes much harder to invest time into these things.
Obviously this doesn't apply to everyone, but I'd say it applies for most.
I don't disagree with you, but I don't completely agree with your implications either. In my experience, most people with a CS degree won't attain that level of knowledge either. Even at the master's level, a number of people don't understand the material. It's sad. I really feel this is a failure of universities not forcing students to learn, but I know there are politics etc. that factor into how hard they can make courses, and what they can teach.
There's a difference between learning specifics, and learning habits of thought.
The two best things a CS degree can do is expose you to ideas you wouldn't otherwise know about, and to force you to work on hard projects that require a combination of multi-level analytical thinking and research.
Both of those are excellent training for at least some aspects of being developer.
But that doesn't mean the details are inherently useful. There are very few situations where you will be expected to write a compiler. So in that sense compiler theory itself is optional - far less useful than the experience of having to handle a complex set of data structures and relationships, which could in theory come from other kinds of projects.
Academic CS also tends to miss out a lot of useful practical skills. It won't teach you much about management (from either side), salary negotiations, office politics and co-worker relationships, or business theory.
It may not even teach you how to write good clean code that's easy to read and maintain.
So IMO the ideal CS degree doesn't exist. The ideal degree would be a good mix of theory with plenty of industry practice - possibly with some standardised requirements that would lead to a Chartered Developer qualification that was better at guaranteeing a working blend of practical skill, theoretical understanding, and analytical talent than current degrees seem to be.
> The two best things a CS degree can do is expose you to ideas you wouldn't otherwise know about, and to force you to work on hard projects that require a combination of multi-level analytical thinking and research.
I agree that should be the case, and sadly enough, I've seen a number of people graduate college with degrees in CS and not have that. There have been a number of times when I'd mention some non-esoteric concept that should have been covered and the response is something like "huh"? I'm not talking about things like "Oh you don't understand how to implement Redux?", it's more things like "Ok, you need to compute the intersection of these two arrays." You are right, they probably have been exposed to these concepts, but they have no idea how to actually do it. More importantly, they grasp so little, they didn't even know where to start. The saddest one I saw was a student that was wicked smart, and the school didn't challenge him enough to struggle through any projects. When he came to intern, he was lost, because he'd never been actually challenged. (He was from a major public university too.)
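For anyone following along, the array-intersection exercise mentioned above has a standard linear-time answer; here's a minimal JavaScript sketch (the function name and sample data are just illustrative):

```javascript
// Intersect two arrays in O(n + m): load one into a Set,
// then keep the (deduplicated) elements of the other that appear in it.
function intersection(a, b) {
  const inA = new Set(a);
  return [...new Set(b)].filter(x => inA.has(x));
}

console.log(intersection([1, 2, 3, 4], [3, 4, 5, 5])); // [3, 4]
```

The naive nested-loop version works too, but is O(n * m); the point of the interview question is usually whether the candidate reaches for the hash-based approach at all.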
Don't get me wrong, I disagree: there are a number of great CS programs at both public and private universities, and the good ones are exactly like you mention (both theoretical and practical application), but there are a lot that aren't.
There's a huge difference between taking the easiest path through a degree (mostly Cs, the easiest electives you can take etc...), and someone who pushed themselves while in college. That's why if the degree is a big part of someone's resume, I ask for a transcript.
Theoretical and CS fundamentals has never been an issue for me (no degree), because I find them interesting. Probably more interesting than most of my coworkers. I think you'll find this is the case for many people who end up in this industry out of passion rather than educational path. A year unemployed during the .com crash gave me lots of time to work on open source hobby projects, where I wrote my own virtual machine, compiler, etc. That was fun. Wish I could afford to do that again.
Honestly, what I find I'm missing is mostly class status and a piece of paper.
I'm at a point where I don't want to do (web) app development anymore and want to switch to something more interesting (to me) so I'm not stuck doing something I hate for the next x decades. Unfortunately, the fields I want to get into (low-level embedded stuff, firmware development, etc.) seem to almost require a degree (or even a Masters or PhD). I assume this isn't mostly because of the level of EE knowledge required. Unfortunately going back to school is out of reach now.
It actually is because of the theory required. If you go low enough, more happens in your head and less in your tooling. It's why there are low-level and high-level languages.
The lower you go, the more you need to know (and do) yourself. That is where CS degrees do help, because without the benefit of layered architectures of shared libraries, systems and code in general you need to know what those layers did in order to know how to do the work without them. That said, low-level doesn't always mean the same thing. Some people think writing software in C is low-level, but when I think low-level I mostly think of assembly on bare metal with no OS or anything like that.
I have coworkers doing that without a degree. They did however show up with experience.
Personal projects will do. You could work on Open Source software. Obvious choices: qemu, valgrind, FreeRTOS, RTEMS, Linux (kernel), SeaBIOS, gcc, clang, ghidra, binutils, MAME, OpenOCD, gdb, dosemu, dosbox, FreeDOS, Wine, SDCC, dolphin-emu, Xenia, coreboot
Your comment goes to the heart of the debate: does understanding what’s going on behind the scene of your tools help you become better at using them? I think for the vast majority of jobs it does not, but for a few it absolutely does.
For me, understanding how a compiler and the CPU/RAM work, and how a programming language is translated into machine code, means that I fully understand what programming is.
It allows me to make sharper categorizations whether something is mathematical, architectural, security, programming, framework related or a best practice.
This again gives me a good feeling of whether something will be easy/quick to learn.
If so, though, then a CS degree in general is sort of pointless, isn’t it? But if you can say that about a CS degree, what degree could you really not say it about? I suppose a medical degree is “meaningful” in the sense that you can’t get access to bodies to dissect without attending medical school, but if a CS degree is a waste of time, then pretty much every degree is a waste of time.
It seems like you’re saying it’s nice to know the layers of abstraction but it sounds like you don’t use it in real life.
As a non CS degree developer I can’t really see anything that I’m missing because of not having the degree. I have a successful business, get hired for freelance jobs for a good salary, can build anything I want, ...
Would love to know what one would get out of having the degree versus self study.
The original comment is about engineering competence and having a comprehensive understanding of the subject, which is not just limited to running a business and getting a monthly paycheck to pay the bills.
Some benefits that it will give you:
- It will actually let you move into different positions within the tech/IT industry when you have a wider/deeper understanding of how things work.
- As someone said already in this thread: "allows me to make sharper categorizations whether something is mathematical, architectural, security, programming, framework related or a best practice."
- You'll be better at your job. Maybe you don't need to know what's happening under the hood every day, but there are and will be days when you need to, even if you've only developed JS frontend apps your whole career.
When you actually say "I can build anything I want", then (although I don't know you) I'm pretty sure that you can't. People who get that deeper understanding of things also understand how complex some things are and how complex some things can get.
Self-study versus earning a degree is a red herring. While there are some advantages to studying at an institution, the degree is simply there to tell others that you have studied a particular curriculum. My only concern with self-study is that a lot of resources are the educational equivalent of get-rich-quick schemes, but that says more about the people who create those resources than the learners themselves.
As for knowing the theoretical basis of computer science, that will have value in some parts of industry and very little value in other parts of industry. While someone in your position may have a high degree of success working in the upper layers of abstraction, someone has to develop, advance, and maintain the lower levels of abstraction that you depend upon. None of that is meant to say that you need that theoretical knowledge to be successful, rather it is important for some people to have that theoretical knowledge to ensure the success of the industry.
Engineers with degrees work for the contractors. They minimize materials and cost. The result doesn't always remain standing, even without an earthquake:
If you had a CS degree, you'd know you can't build anything you want cough halting problem cough.
I'm surprised this got down voted since it's 100% true.
I completely agree; Hacker News has really helped me personally. Tons of interesting articles over the years on LLVM or the inner machinations of garbage collection. Picking the brains of clever CS friends helps a lot too!
I work with a twenty-something year old right now. He graduated from high school. Had his own non-tech company for a while after high school. Decided he wanted to go into coding. One of the best damn coders I've worked with in 25 years. Gives the toughest, most consistent code reviews. He's the type of programmer you can build a startup around.
In another comment a few days ago, I mentioned a 10x engineer I knew. Graduated from high school. Went into the military. Came out and coded. He is the best coder I've worked with.
College education is not necessary to be a successful programmer. You do need to have some predisposition to coding, but you definitely don't need a degree.
That doesn't matter. If there's one thing my years in this career has taught me, it's that the client(s) don't GAF what language or framework or system you used to solve their problem(s).
What they care about is if you have given them a solution that solves their problem(s). They don't care how you got there. They don't care what technology you used. Just as long as you solved the problem(s) they were having, and that you did it in the time/budget allotment for the job.
That said, as the solution provider you should have found out both their current use cases and potential future use cases, in order to select the tools you'll need for the immediate solution, and ideally for any updates or improvements that may be needed later on.
Because they will care, for instance, if the code you wrote was in a language that won't allow them to scale up quickly and easily (assuming you wrote the rest of the system to allow for this of course), or has some other kind of issue that prevents the client from moving forward with a change.
The last thing a client wants to hear is "complete rewrite using different language/framework/system" just to implement an extension or upgrade path (the best you can usually get by with is a refactor of the existing code base - but you better have a damn good reason and plan for that refactor).
You said a lot of words and assumed a hell of a lot of things that were not contained in my question.
I simply asked a simple question out of curiosity, JFC. Does curiosity matter?
For the record, all code is ephemeral so since you're going to end up throwing it away anyway one day, it doesn't fucking matter what language you start with. Pretty much any language can back-end an API until a certain (large) number of users. #ProblemsWe'dLikeToHave
Twitter threw away Ruby and went to Scala; Facebook is slowly replacing PHP with... whatever, pretty much anything is superior to PHP; plenty of other companies have had to do partial or full rewrites of core functionality on a new stack. Paul Graham himself had his Lisp code tossed in a rewrite.
While the evidence provided is anecdotal (that is to say: outliers exist in successfully learning anything outside of a formal education), there are languages designed to be more friendly and more fun for newcomers (Ruby, for example) while others are designed to translate more directly to the instructions a machine might execute. More anecdotal evidence indicating that one language might facilitate informal learning better than another would be interesting or helpful.
I'm an engineer. No CS degree. I coded for fun on various side projects from 2006-2012. I didn't consider myself very good.
I worked at a couple startups from 2012-2013. During that time, I became very proficient in Coffeescript and a couple popular frontend frameworks. I was very productive in terms of pumping out lines of code (that somehow resulted in a working product).
However, I didn't know how to write a for loop. If you had asked me what "for...in" was, I wouldn't have had a clue. And forget about asking me what setTimeout has to do with the call stack.
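(For anyone else fuzzy on that last point: setTimeout never interrupts running code; its callback is queued, and the event loop only runs it once the current call stack has emptied. A minimal illustration:)

```javascript
// Even with a 0 ms delay, the callback is queued behind the
// currently executing code and runs only after the stack empties.
console.log("first");
setTimeout(() => console.log("third"), 0);
console.log("second"); // logs before the queued callback
```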
The problem with being self taught, at least in my experience, is that you end up being very strong in whatever areas it is that interests you, and whatever areas of CS are relevant to the projects you're working on day-to-day.
In 2013, I began to realize my (glaring) deficiencies as a programmer. That was the point when I started spending all my free time doing online CS courses. They helped a lot with filling in the knowledge gaps that I didn't even know I had.
The main reason I'm posting this is to give a shoutout to Project Euler.
Project Euler (https://projecteuler.net/) is an amazing way to test yourself (and learn) CS concepts. If you've done a coding bootcamp and want to do a "gut check" to see how much you learned (or how much you have left to learn), I'd highly recommend Project Euler.
(I'm happy to say that in 2019, I can write for loops all day long...)
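To give a flavour of the site, Problem 1 asks for the sum of all multiples of 3 or 5 below 1000; a straightforward for-loop sketch in JavaScript (the function name is my own):

```javascript
// Project Euler, Problem 1: sum the natural numbers below `limit`
// that are multiples of 3 or 5.
function sumOfMultiples(limit) {
  let sum = 0;
  for (let n = 1; n < limit; n++) {
    if (n % 3 === 0 || n % 5 === 0) sum += n;
  }
  return sum;
}

console.log(sumOfMultiples(1000)); // 233168
```

The problems ramp up quickly from there; later ones reward knowing number theory and better algorithms, not just loops.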
> The problem with being self taught, at least in my experience, is that you end up being very strong in whatever areas it is that interests you
This is absolutely key. One of the main advantages of a solid undergraduate curriculum is that it forces you to work your way through things that are important, but don't interest you as much. And this path is designed and supported by people with a much better broad view of the field than you have.
Now, you'll still always gain knowledge faster in the areas that interest you most, but if you only indulge this approach you will have significant gaps.
One thing I always recommend to people self-studying (when they ask) is to find someone you trust who really knows the area and ask them for recommended courses/books. Then follow that advice.
To some degree MOOCs etc. try to do this for you by designing course progressions, but there are limitations and constraints there that can leave you spinning your wheels.
Being disciplined about both breadth and depth in a new subject is difficult. One of the underlying themes of a post-graduate degree, particularly a PhD, is to give you tools to do this for yourself repeatedly. It's not impossible to gain these skills properly with little or no formal education, but it is quite difficult.
Another way of putting this: a primary function of a curriculum - university or otherwise - is to convert unknown unknowns (things you don't know exist) into known unknowns (things you are aware of but not fully versed in) and known knowns (things you fully grok). This is why we shouldn't replace college with nothing: this is an important function.
Self-study isn't exactly nothing. I do find that there are a lot of people, especially web developers, who don't understand a lot of the basics, or even the language they're using.
I got into development from doing design work, also self taught back in the 90's. I wanted to do something... spent a very rapid weekend learning JS from a very large book, then applied what I needed the next monday. Spent the next weekend finishing that book and the next month swallowing three other large books on the subject (two more on JS, one on HTML). From there was VB5 then Access & VBA... then I started working as a developer. From there I learned more about databases and data structures, dabbled in C/C++ then circled around to VB6 when it came out. From there around the end of 2001 (after 9/11) I was unemployed and no jobs to be found. While crashing at a friend's house I learned C# with the command line compiler and another large book (didn't have VS). Since then more databases and db types, more C# and when it came out Node.js (had done a lot of classic ASP in JS along the way too).
I spent about 5 years working in eLearning, writing simulations of systems for learning/training as well as courseware. Really enjoyed that time in my life as I was constantly taking in domain knowledge as well as a very varied environment. Unfortunately after a while all the context switching and constant intake took a toll and I took a few more boring jobs doing corporate/banking work.
In any case, you'd be surprised how much you can learn without a formal education, or anything really structured at all. To this day I tend to take in new stuff rapidly and get bored once I've figured out the hard parts. Currently working on learning Rust and Kubernetes; taking a short break on Rust as the async syntax gets implemented and settles in.
I think my statement was too strong; I definitely don't mean to suggest that nobody who has never followed a curriculum of any sort can ever be successful. What I mean is that in general, as a policy, it is useful to have some curriculum to shed light on the unknown unknowns.
But to respond to "self-study isn't nothing": that's true, but your self study didn't replace college, it replaced self study. What I mean is that everyone has to do that self study and job training targeted at the specific stuff they're doing. Self study is necessary to expand "known knowns" in targeted areas, which is really necessary. It's just not as good at illuminating "unknown unknowns" as a curriculum put together by a person or group of people who know the breadth and history of a field.
I think it depends on the person and the experience. It's easy to say that a particular course was helpful, because it often is. But what's the opportunity cost? What other things didn't you learn because of all that time learning about, say, OS schedulers?
Also, it assumes that you have access to a very high quality education. In lots of cases, the teaching might be mediocre (perhaps even at elite universities that care more about research grants than education).
There's a lot of value in having a standardized base of knowledge, where you learn all the things that a practicing engineer should know. But in reality, the sum of things you "should" do takes more than a lifetime. And you have to find some ways to differentiate your knowledge and expertise if you want to innovate.
Yes, nothing is perfect. But in general, it is hard to be a good guide to yourself in a new area. You can do it, but it is a skill that needs to be developed and most people don't have it (or rather, they typically have the depth-first part but not the breadth-first part, if that makes sense).
I wasn't even talking about the course quality (which as you note, is very variable) but about the curriculum.
One thing you can do as an individual teaching yourself CS, say, is to try and follow the curriculum of a known good CS program. There are issues with this, too, but at least you are getting some guidance.
The opportunity cost issue is a good point. One problem with following undergrad programs, say, is that only part of what they are doing is teaching you material, the other part is learning-how-to-learn stuff. If you are an independent learner of sufficient skill, some courses will be way too slow for you.
It would be optimal in some sense to have high level "key concept" material that you can survey efficiently and then decide when and where you need to dig deeper. It's a bit of a chicken and egg problem though, because if you are insufficiently skilled you will do poorly at the second part...
As mentioned in another comment for this post, Cracking the Code Interview is probably a good place to start in terms of reviewing some key concepts, assuming you don't have a CS degree and are largely self taught.
Of course, given the libraries and systems already available, you don't practically need a lot of this knowledge, but it does help when certain portions of an application just don't perform well, or when you hit really weird side effects of race conditions on static properties, etc. I think it's also important to know the language and tools that you are working with as well as possible. It's not always possible to know everything, but you should probably read and complete at least one book on the language you are using. You can hack away in any given language for years, but until you've actually read a comprehensive book cover to cover, you won't necessarily understand some concepts; better still, you'll learn that the language does something for you that you've been doing the hard way.
I've been really involved in teaching people with non-traditional backgrounds (e.g. no cs degree) to program lately, and this has been my biggest struggle. There are people who are working in the field as a web dev and have been for years and can't do simple stuff like iterate. They have no understanding of what the stuff they're actually doing means or does. It's just copy and pasting snippets they see online and fiddling with it til it works.
I've taken to offering three pieces of advice to these people:
1) Go through the basic language tutorial of whatever language you use (e.g. how to declare variable, conditionals, loops)
2) Go read through the essentials guide to your languages (will be slightly higher level stuff)
3) Go read the sections on Data Structures, Concepts and Algorithms and Knowledge Base in Cracking The Coding Interview skipping the problems that aren't answered in chapter. Why? Because it's a solid primer for CS Concepts that people just don't pick up unless they have to, and most importantly short enough (< 100 pages) people will actually go through it. Large CS books with the dense writing intimidate people and so they never follow through.
I've had a lot of success with this method. It's not a formal cs education by any means but I've found it's enough to get people past the constant beginner part.
It is funny that people constantly mention these supposed developers who have jobs but cant program. Where are these jobs? I have a recent cs degree, did resume coaching, and worked through a couple coding interview exercise books. Still having trouble finding work. Show me one of these mythical jobs poorly trained programmers get, I will blow them out of the water. In reality I think most of those jobs have been outsourced and no longer exist.
Apply at a bank outside of a tech hub, or at any job outside of the major US cities. There are plenty of positions full of people riding "expert beginner" status, blissfully unaware of how far behind they are, because top performers leave for the hubs and the better opportunities offered there.
Boring corporation jobs are fantastic in these cases. The performance bar is low, and due to inertia within the org, you can usually learn faster as an IC and bounce somewhere better rapidly, versus being somewhere like a startup where the treadmill moves too fast for your own personal development needs.
Sounds like he’s talking about defense contractors. Many contracting firms just need “butts in seats” that they can bill out. Mediocre developers are better in some regards there: they bill out more hours since they’re slower, don’t mind 10+ year obsolete tech, and they don’t complain as much, which could endanger relations with the bureaucracy (customer.)
Often "approved" by people who have very little idea what they actually need in a hire. And even then, they're measured on how well they can keep things staffed, and they often request requirements for roles that are unreasonable or downright ridiculous at the pay rates they expect. So they end up with people who are good at formally ticking the right boxes, but have no Godly idea how to enable the mission, and no awareness of the broader purpose behind the tasks set out for them.
It depends heavily on your COR and government leads, but the good ones are few enough that they don't get to set the norm. There are enough people who are sufficiently checked out that contracting/staffing firms can get away with murder. Indeed, it might just be impossible for them to do a good job given the sorts of requirements they're expected to adhere to.
I worked at a defense contractor for years. Some of the guys I knew could barely program, despite years of experience, but looked good on paper. Approving them individually doesn't mean that they're not a "butt in seat" kinda person.
It's always a real shock to get an elite CS degree and then be told we aren't as smart as the people who got their job because they were friends with someone at the company and just happened to know Java.
> However, I didn't know how to write a for loop. If you had asked me what "for...in" was, I wouldn't have had a clue. And forget about asking me what setTimeout has to do with the call stack.
I'm self taught, I started with the C64 in the early 80s. By the time I was 15 I was working in assembly, had an excellent knowledge of what we would now call embedded software development and a working knowledge of algorithms.
I'm glad that you took the step to get a proper education. With the modern internet it's far easier, and the quality is incomparably better than my experience of the UK educational system. Between Coursera, MIT and the vast number of excellent books, it's a dream.
It's an eye-opener to me that someone can be self-taught and not understand something as basic as loops. It has been my experience hiring developers that the self-taught type are considerably stronger than those who are purely college educated. I've also noticed that the self-taught type who consider themselves software engineers all went through a two year period of intense studying of computer science and software engineering in their own time. The combination of a real working understanding of software creation with passion is an extremely strong combination.
What I've noticed is that your type don't get scared of technical changes because you're basically able to learn anything. The "college career" type tend to scare easily and also jump into management the second they get a sniff of it.
All of this said, if I was 18 now I would strongly advise myself to do 3/4 STEM A-levels (Math, Physics and something else) and then go study Computer Science at a university which has a good program. By good program - one where I'd write a compiler from scratch, learn the theory (finite automata etc), machine learning, linear algebra and hopefully something fun like building a 3d engine or game engine or something. You can do it all yourself but a three-year program certainly makes life a lot easier, plus you get to make friends and network. AND you have a nice piece of paper.
I'm self-taught since age 16. I've been able to write a for-loop since age 16. Maybe it helps that I learned in a pre-Google, pre-Stack Overflow world. Any beginning programming book will teach for-loops.
I do have gaps in my knowledge, but if I lose internet access I can still be productive. I mostly google for how to optimize when some API is slow, or to remind myself of a method name.
I just wanted to pipe up because the top comment is "I couldn't write a for-loop" and I don't want all of us who are self taught being lumped into that category.
At my last job I regularly had to help colleagues (some who held degrees) solve issues. Strangely they had trouble starting from scratch on something new vs maintaining old code. They found the latter easier. They also sometimes chose the wrong data structures (e.g. looping through an array every time you want an item with id=x instead of just using a hashtable)
> (e.g. looping through an array every time you want an item with id=x instead of just using a hashtable)
Huh, I had almost the exact same experience with senior devs here - they were checking if elements were in a (huge) list in nested loops, instead of using a set. Switching brought the runtime from around 2 hours to 2 minutes.
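That kind of speedup is easy to reproduce. A rough sketch in Python (the sizes here are made up for illustration; the real case was presumably much larger):

```python
import time

n = 20_000
haystack_list = list(range(n))
haystack_set = set(haystack_list)

# O(n^2) overall: each membership test scans the list element by element
start = time.perf_counter()
hits_list = sum(1 for x in range(n) if x in haystack_list)
list_time = time.perf_counter() - start

# O(n) overall: each set lookup is amortized constant time
start = time.perf_counter()
hits_set = sum(1 for x in range(n) if x in haystack_set)
set_time = time.perf_counter() - start

print(hits_list == hits_set)  # True: same answer, very different runtime
```

Same result either way; only the data structure changes, and with it the asymptotic cost, which is exactly the hours-to-minutes difference described above.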
> The problem with being self taught, at least in my experience, is that you end up being very strong in whatever areas it is that interests you, and whatever areas of CS are relevant to the projects you're working on day-to-day.
Which is not necessarily a bad (for whatever definition of bad) thing.
For the startups you worked at, your ability to write code was far more valuable than your understanding of underlying CS concepts.
Kudos to you for having the maturity to acknowledge the gaps in your education and to address them in a manner that complements your life.
> The problem with being self taught, at least in my experience, is that you end up being very strong in whatever areas it is that interests you, and whatever areas of CS are relevant to the projects you're working on day-to-day.
I think this is the case even if you receive a formal CS education. I picked the most difficult or interesting classes and got As in them, and barely passed everything else in order to graduate in 4 years. Many students avoided the hard professors to protect their GPA, so it was really easy to register for them since 25% of students might drop the class.
Probably the most important thing a formal CS education does is expose you to CS fundamentals, but in my experience you end up having to be self-taught in a university setting anyway. Most of the professors I had were more interested in research than in lecturing - many lectures were completely incomprehensible. And even with amazing lecturers, I would still have to spend hundreds of hours reading and practicing on my own.
One of those classes I barely passed was algorithms, since my other workload was too great. I eventually had to self-study this subject years later to pass the tech interview torture chamber.
College was mostly an exercise in self-learning or learning how to learn for me - something I am still reaping the benefits of today.
I'm a self-taught developer with Haskell in my arsenal, although that being said I have an MSc in theoretical physics which probably helps me appreciate the power of its paradigm.
Are you sure he was never at a university? And why are you drawn to Haskell?
For me, I only had to deal with it a bit at university. I found the concept interesting, but I could not imagine non-math people using it as a language of choice for anything.
And I think imperative languages are way more easy and clear to read and write.
I always imagined that you have to be a math nerd to prefer Haskell, but apparently this guy is not a math guy, yet loves Haskell... so ok, good counterpoint.
> And I think imperative languages are way more easy and clear to read and write.
Because it's what you've had exposure to. Perhaps they're even objectively easier to read and write, I don't know. It's also completely beside my point.
My point was that I want my programs to work as desired. That's orthogonal to how easy it is to read and write. I don't want runtime exceptions where I could've gotten a compile-time error.
But I had exposure to math functions much earlier. And I never liked their syntax. I did understand the math, but the math language made it harder for me.
And null pointer exceptions? They only happen to me very rarely, when I quickly hack something together, and then it is an "oh, forgot - and fixed" problem. The problems I do struggle with are non-reproducible race conditions and the like, and I doubt Haskell could help me with them.
Or I struggle because I do not really understand my problem, or a certain library... or because I misunderstood existing code.
So how on earth are correct programs orthogonal to how easy it is to read and write them?!? Did you ever have to use someone else's code?
Or your own that you wrote 5 years ago (or sometimes 5 days)?
With any bigger project it is all about how easy it is to read and write them.
> And null pointer exceptions? They only happen to me very rarely, when I quickly hack something together and then it is a "oh forgot - and fixed" problem.
And you never run into NPEs in production? It's something you always discover during development? How?
> The problems I do struggle with are non reproducable race condition fun etc. and I doubt haskell could help me with them.
In a pure system, thread race conditions are impossible. You can still get race conditions for your external effects, which is unavoidable.
The parent said they didn't know what for-in was, which implies they were using a language like C for their side projects which only has for (i=0;i<n;i++) loops.
My CS degree (2002-2006) did not explicitly cover "for x in y" iterators. But it did cover an understanding of algorithms and data structures well enough that I could implement one.
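For concreteness, the two loop styles being contrasted in this subthread look like this in Python (the list is just a stand-in):

```python
items = ["a", "b", "c"]

# C-style: index arithmetic, then subscripting
result_indexed = []
for i in range(len(items)):
    result_indexed.append(items[i])

# for-in style: the iterator protocol hands you each element directly
result_for_in = []
for item in items:
    result_for_in.append(item)

print(result_indexed == result_for_in)  # True
```

Anyone who understands the first form can pick up the second in minutes, which is presumably the point: the fundamentals transfer even when the syntax is new.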
That was probably before they were really in style; e.g. C# didn't add iterators (yield) until ~2005 with v2.0, and your curriculum probably lags a bit behind the industry state of the art unless it was Ivy League.
It has not been very long since CS degrees were taught entirely in C89 or C++98, and there are a lot of us who won't touch Python or Java unless forced to.
exactly - I had a similar learning curve (and no CS degree) where I could never remember what certain methods did but I knew exactly what to google for the answer
Is Project Euler better than LeetCode for "gut checking"?
I've probably done 300 or so LC, and do some every week, mostly as an insurance policy for technical interviews if I ever opportunistically interviewed in a short time frame.
They're geared towards different goals. LC is for interviews: solving small-scale problems of the kind you might get asked in an interview or come up against every day in a software job. PE is more about curiosity and personal intellectual development. It asks something that you might never have heard about and might never see again but is often very interesting, and will stretch your ability to develop your own algorithms or to apply learnt methods to unfamiliar scenarios.
I'd recommend LC to someone who has an interview in a month, PE to someone who has one in a year.
Speaking of LeetCode, don't overlook the discussions for each problem. I think a lot of people just solve the problem, and then move on to another problem.
In the discussions many people post their solutions, and those often have bugs in obscure cases that the LeetCode grader doesn't hit. Another common thing is for the posted solutions to always give the correct answer, but miss the time or space constraints that the problem statement asked for.
I've found that reviewing those other solutions to find those issues to be quite instructive. Also, there will sometimes be a solution in there that is better than mine, which is also instructive.
This is interesting, but I don't think it's the norm for self-taught devs to not recognize a for..in. I would find that surprising. (I'm also a self-taught developer, and now I work at a larger corporation.)
Right on! Plenty of people can teach themselves to code without a university education. But the degree gives you a broad exposure to parts of CS that you may never explore as a self-taught developer. And it gives employers confidence that you have the skillset to handle whatever they throw at you.
> And it gives employers confidence that you have the skillset to handle whatever they throw at you.
Blindly having confidence in someone with a degree vs. someone with no degree is a bad mistake in this industry, IMO, sadly it happens.
It really comes down to the person, self-taught or not. If you don't pursue continuing education (which is a must in this industry), whatever CS knowledge you learnt with your degree will only take you so far.
Also, said "broad exposure" can be, well… self-taught as well: algorithms, data structures, OS, etc. are all things you can learn and "master" with no CS degree.
With all that said, I do agree CS education is essential to become a better developer. As a self-taught developer myself, learning CS has made me a way better developer for sure, and I would advise all self-taught devs to do the same; it will pay off immensely, and best of all there's no student loan to repay.
I think you’re missing the point your parent was making...
In a good CS program you will be presented a series of challenges below your “depth”....
“wait, you want me to WRITE a data structure? I usually just use a good one from a library”
“wait, you want me to fix a compiler? I have only ever run a compiler”
“wait, you want me to write code that CREATES processes out of nothing? I am used to letting the OS create processes”
... etc. These provide you a series of epiphanies: “wow, I can build a compiler from scratch, that means I could futz with LLVM if I had to”.
Ideally, these programs are designed to take you all the way down to the bottom of the machine. For some students, the end result is confidence in their ability to “dive in” to a problem anywhere in the system.
If your point is that you can teach yourself that outside of school—absolutely. But... well, in my case I doubt I ever would have. It wasn’t fun, I was pushed to do things I would never have followed through on if I was just casually teaching myself about programming languages or operating systems.
And if your point is that students can get away without learning the material—well, also yes. Of course.
But you are wrong to dismiss a CS degree as just another few things to learn. This business of “get all the way down to the bottom and challenge yourself at every step” is kind of the whole point of a CS degree, and the world outside is not going to encourage you to do it the way your profs will.
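As a toy example of the "write a data structure yourself" kind of exercise, here is what a linked-list stack might look like in Python; normally you'd just reach for `list` or a library, which is exactly the point of being made to build one:

```python
class Node:
    def __init__(self, value, next_node=None):
        self.value = value
        self.next = next_node

class Stack:
    """A LIFO stack backed by a singly linked list."""
    def __init__(self):
        self._top = None

    def push(self, value):
        # New node points at the old top; O(1)
        self._top = Node(value, self._top)

    def pop(self):
        if self._top is None:
            raise IndexError("pop from empty stack")
        value = self._top.value
        self._top = self._top.next
        return value

s = Stack()
s.push(1)
s.push(2)
print(s.pop())  # 2
print(s.pop())  # 1
```

Trivial once you've done it, but the confidence that you *can* do it is what the coursework is really teaching.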
I have seen plenty of CS grads who can barely code or put together a real solution to a problem. There is no guarantee that someone who was able to muddle through a CS degree can do these things. In almost all cases, you really need to evaluate whether an individual shows the aptitude to solve the problems you need solved and crank out good, well-thought-out, practical solutions that fit the scale of the business problem. The degree paper is not all that important.

I would give bonus points to a candidate who came from an accomplished but different background than CS, one that shows they are capable of success/mastery in multiple areas, that they are adept at learning and researching new material, and that they have the matching technical prowess to spearhead a real project. For instance, I have known many engineering (of the physical paradigm) types who are self-taught with no CS degree whom I would trust to tackle a project over any random CS-degree candidate.
Self taught, I remember loops were the last thing I learnt as a kid when I was working on my own Tibia servers using Lua. I remember the day it finally clicked. It helped having a forum where you could ask more experienced people to explain it plainly.
Self taught from ages 9-18, Qbasic -> C -> C++ -> PHP
I did go to college and learned Java and UML and a few other esoteric things - until I got into my data structures classes. If a self-taught software engineer feels like something is missing, take a few data structures classes. That helps a lot.
But yeah, self taught can take you pretty much anywhere unless the job reqs prevent it.
Nice to see that reference to Project Euler. Although I study Computer Engineering, which is very close to Computer Science combined with an Electrical Engineering degree, I really focus too much on the subjects that interest me.
BTW, I reinforce a lot of aspects of analysis, optimization and programming with Project Euler by doing polyglot programming with some friends: https://github.com/DestructHub/ProjectEuler
I agree. I'm happy to have some CS students point me to areas which are useful even if they are not interesting for hobbyists. There's also a lot of value in learning academic methods of investigation and research (if you are at a good university). A lot of self-taught people learn a lot about computers or some programming languages, but a CS degree, like any degree, also teaches a lot of other valuable competences which can be hard to come by from your own inspiration/initiative.
Another thing is that following and completing a degree is practice in taking a multi-year project from start to finish, which is also very valuable in life in general: being able to see the fruits of your current work in the distant future, and having that motivate and drive you.
A lot of people I meet who are self-taught will give up more easily because of this. Not that they are quitters, but they tend to lean towards their interests, which is not always in their best interest :)
Formal bs/ms ee here as well. Started off wanting to do hardware and looked at programming as an afterthought. After a few years of working in embedded and realizing how many gaps there are in my knowledge, and how beautiful the field is, considering a doctorate in CS.
I think a lot of people learn a given language by hacking away at something they either want to create or modify. For example, in web development, I'd say most developers haven't ever read a JavaScript book cover to cover.
I've tended to take the opposite approach, read first so I can at least grasp what I need to look for, then start applying. This of course is a mixed bag for rapidly changing/evolving tech and newer languages.
I take the same approach you do. When mentoring new developers I also start them off by assigning them tasks to read the docs for the core of the language we are working with.
I wonder if it’s prepared me better than a CS degree for the actual work I do.
What did I learn with my RELS degree? Hermeneutics: understanding ancient texts written by lots of different authors in their original historical contexts, texts which have often been deeply redacted over time. Anyone who's worked on any production code can see the benefit of being trained in that kind of thinking.
The classes I took on theology and philosophy helped train my brain to organize ideas. I had no trouble understanding object oriented programming — Plato would’ve loved it too, I think.
Classes on ethics have come in handy too.
Oh and just, we did a TON of writing in college. Lots and lots of writing, lots of research papers. I had to learn something new, read all about it, and put together a document carefully explaining the idea, with lots of evidence and citations. That skill has come in SUPER handy for everything from bug reports to API clients, to customer-facing documentation, to internal memorandums.