For high tech industries, this is even more pronounced. I apply very little of what I learned in college to my everyday activities (though I do love a good compilers or garbage collection nerd-out). This is okay with me: I know that without college I wouldn't be who I am today and probably not as prepared to make my mark on the world.
The more cynical of us will say that CS programs teach for one thing: post-graduate courses. This is, to a degree, true. However, the Vim trick that I picked up at work today is NOT the sort of thing I would have been happy with a professor 'challenging' me on during my courses.
You become a good programmer through experience, lots of boring little things that add up, and working with good programmers (which almost categorically disqualifies professors and fellow students).
Want to learn to program? Work up! Always be the worst person on the team. Find people better than you and listen.
Edit: Also, good programmers: part of your job is to make people around you better. Don't shoo away the wide-eyed and eager of us (me).
In the past year since leaving school, I've had more opportunity for maturity than I ever did in college. I've had the freedom to explore, and I've used that to travel to all 7 continents and broaden my view of the world. I've had the freedom to discover, and now I'm fortunate enough to have found a passion I can pursue with my full attention. I never had the luxury of doing that in school - there's no time to step back and wonder why you're doing what you're doing when you're too busy being pushed through the grind of 'doing'.
I've had the freedom to become personally responsible. Being devoid of extrinsic pressures (must get good grades, build resume, get job, etc.) for the first time in my life means that I've had to replace those driving forces with intrinsic drive and accountability.
I've been living on my own in SF since June, and that's been a wild ride of being liable for every aspect of my own survival and well-being :). Being an entrepreneur means I have to account for all 24 hours of each day on my own, as I don't even have an employer to provide a cover story for 8-9 hours every day. I'm not saying I account for all those hours nearly as well as I could - in fact I certainly don't (case in point) - but this is offering me the opportunity to learn to be accountable, and that's exactly the point.
College is a pseudo-diverse atmosphere. Sure, people might be differently colored or have exotic accents, but everyone's there for the same purpose, is the same age, and has more or less the same mindset, or at the most one of a select few mindsets. Real diversity is living in a strange juxtaposition of homeless crackheads and billionaire tech entrepreneurs (woohoo San Francisco!), or seeing firsthand definitive proof that polio has in fact not been eradicated and is still alive and well in the marginalized corners of the world.
Even if we see these things in college, chances are our mindsets have been so tunnel-visioned that their full impact doesn't register with us. The insular bubble of college is pretty hard to penetrate.
Challenge-wise, being an entrepreneur is far harder than being a college student. And thanks to the cost of college, so many of us might never have the opportunity to take the entrepreneurial path (or any of a number of other paths) unless we take it in lieu of the college path.
And yes - college might very well be a mold that shapes you into the person you'll be, but it's a cookie-cutter mold, and it's not unique to you. Life is a variable endeavor, and the experiences we use as a mold with which to shape ourselves should similarly be variable. College is probably a wonderful mold for some. For others like myself, there are almost certainly better molds out there waiting for us to find them.
University teaches you the things that you cannot learn on your own.
Unfortunately, most discussion about college education is rationalizing rhetoric - college is such a broad term, and can vary from diploma mill to top-level educational institution. Anyone willing to argue for or against it in such vague terms is just rationalizing their beliefs. Second, as a person who went to college to get my degree after working in the industry for years, I noticed a lot of people don't realize that the time you go to college is the same time you start to grow up, become independent, think differently, and associate with different kinds of people. All those changes people attribute to the "college experience" in classic lines such as "college changed my way of thinking", "college taught me how to learn", or "I started socializing with different people" are actually natural changes you will experience even if you don't go to college at that age, so I think people arguing for college from reflection are actually misattributing the cause.
I would probably benefit from some mentoring, but that's not forthcoming.
I think a lot of this is the same for coding. You can learn whatever you want to/need to on your own. What a university gives you ideally is a level of mentoring and collaboration that can bring you to the next level (yes that includes, say, learning assembly, something I have taught myself the basics of but never gotten to the point where I feel comfortable in it).
For all this though, I agree that college is supposed to teach you the one thing you can't learn on your own: critical thinking. I don't see how one can learn critical thinking on one's own, or at least not well. Not saying it can't be done outside of college. Just saying it can't be done in isolation.
From the sound of the original article though, it sounds like the author didn't feel like he was getting this, and that's kinda scary.
Bottom line: in my experience, the reason most programmers are self-taught with regard to writing good code is not that no one tries to teach them. It's that most of them don't think they NEED to know (i.e. don't place enough importance on it) and are willing to lose marks for it until they get to a job and have no choice but to, you guessed it, teach themselves. There are people who try to make the best of their education, and these can turn into very good coders.
(Note: obviously this is a generalization - there are plenty of students who try to write good code.)
There are just some things you'll never learn, for example, until you have to interact daily with code you wrote a year ago.
In university, I think the longest assignment I had was in the ballpark of a couple thousand lines of code, and you could pretty readily keep it all in your head as you solved the problem: there's not too much abstraction or delegation of responsibilities, no interfacing with 3rd parties, few library dependencies, it's all written in a single language, etc, etc. And you wrote it all from scratch by yourself or in a small team with (likely) equally inexperienced peers, so you know all the clever tricks that were employed.
On the job, I deal with a company code base that's over a million lines of code and uses a very messy database, maintained by dozens of people of varying skill levels over the last decade, wrought from an unholy amalgamation of 3 different programming languages spread over several servers and reliant on a half dozen "support utilities" to keep applying band-aids to bugs that "aren't cost effective to fix".
Writing good code that interacts with bad code is one of my hardest professional endeavors.
PS: At best a comment only says what someone thinks the code does. Program for long enough and you will get bitten by this.
I had been on courses, bought books, read tutorials etc. but at the end of each forced exercise I still could not build a decent rails app. Then, after coming back from the US 6 months ago, I decided I wanted a GroupMe clone for the UK. So I built one.
The SMS is disabled on that version as I have no intention of subsidising people's group messages, but the fact is it WORKS and I finally learnt to code.
The controller is fat, the model is thin, there are no tests, there are lots of lengthy if-else statements, there are lots of bugs but I do not care because I finally learnt to code.
I have built little scripts for managing my Kindle (https://gist.github.com/1404068) and scripts to get my Instagram pics (https://gist.github.com/1399696) and I love it.
I feel like I can finally call myself a rank amateur nuby Hacker. It feels good.
If you are interested in advancing that project, you may want to consider sending the SMS messages via email, which would be free. Each user would need to define their gateway when they set up their account. Many carriers have SMS gateways, which give you an easy way to translate a phone number into an email address that delivers by SMS.
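The translation itself is just string assembly once you know the carrier's gateway domain. Here's a rough sketch (in Java, purely for illustration - the domains below are commonly cited US examples, and a UK-focused app would need its own carrier table, so treat them as placeholders):

    import java.util.Map;

    // Illustrative sketch: look up a carrier's email-to-SMS gateway domain
    // and build the delivery address from the user's phone number.
    public class SmsGateway {
        // Placeholder table of well-known US gateways; UK carriers differ.
        private static final Map<String, String> GATEWAYS = Map.of(
            "att",     "txt.att.net",
            "verizon", "vtext.com",
            "tmobile", "tmomail.net"
        );

        // Translate a phone number plus carrier into an email address that
        // the carrier delivers as a text, e.g. "5551234567@vtext.com".
        public static String toSmsAddress(String phoneNumber, String carrier) {
            String digits = phoneNumber.replaceAll("\\D", ""); // strip formatting
            String domain = GATEWAYS.get(carrier.toLowerCase());
            if (domain == null) {
                throw new IllegalArgumentException("Unknown carrier: " + carrier);
            }
            return digits + "@" + domain;
        }
    }

Sending ordinary email to that address then arrives on the phone as a text, which is why the approach costs nothing beyond your normal email infrastructure.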
How many hours did you spend per day, roughly, and how many all told?
Is there something that you really want that is not available? That could be the driver you need to get you excited about firing up terminal.
I recently wrote a ton of Node.js example scripts (chat, group chat, notifications, interaction with PHP, etc), because I had never worked with Node before. I knew PHP, a bit of Rails, and C, but it really feels good to learn a new language or way of programming. I didn't need books or anything, just a bit of example code.
Obviously, since I knew JS, Node wasn't too difficult, but it's an extension of what I previously knew so it took a while to get used to. I also learned a lot more about JS in the process. Teaching yourself to code has amazing results.
I am also on Twitter: http://www.twitter.com/ricburton
That skill is neither expected nor rewarded in academic careers. So unless they moonlighted as hackers or in a company, they never learned to code as students, and certainly won't as staff. They certainly won't teach it.
Here's an article that explains it via the 'theory of the leisure class': http://lemire.me/blog/archives/2011/06/06/why-i-still-progra...
"I believe that the rejection of programming as a lower activity can be explained by the Theory of the leisure class. In effect, we do not seek utility but prestige. There is no prestige in tool-making, cooking or farming. To maximize your prestige, you must rise up to the leisure class: you work must not be immediately useful. Thus, there is more prestige in being a CEO or a politician, than in being a nurse or a cook. Scientists who supervise things from afar have more prestige. Programming is akin to tool-making, thus people from the leisure class won’t touch it. People will call themselves engineer or analyst or developer, but rarely “programmer” because it is too utilitarian."
Just like computer science is not about computers, it's also not really about programming. Certainly there's overlap. I have a Ph.D. student friend who works on proving the correctness of programs. He doesn't write code, he writes proofs. Of course he'd be more rusty in C than a full-time C dev.
But I also know Ph.D. students who build proof-of-concept software. Yeah, their programs aren't optimized in the way that live software is optimized, but they can certainly code.
And I really have to wonder where you went to school if "most" of the TAs and profs "can't code" and what kind of standard you have. That was not my experience at all when I went to school.
Finally, I didn't realize how many teachers and TAs couldn't code while I was a student--although I had serious doubts about some of them; I fully realized it when I was a PhD student and TA, with a background as a start-up developer.
I don't see it as a huge issue, though: as the article points out, many developers successfully learn to develop on the go. In my opinion, school is there to teach you the foundations which would be very hard to learn on the fly as a junior developer: maths, formal reasoning, hardware architecture, algorithms, maybe a bit of formal semantics. As I recall, those happened to be the most interesting lectures, given by the most awesome professors; it's probably no coincidence.
Programming is also an unusually marketable hobby skill. With many hobbies you can invest a lot of time and effort and produce beautiful things, but that's liable to gain you only praise; in programming, it can gain you a really good job. This is a pretty smart investment: for a modest personal expenditure you might net a high income, so it should not be surprising that people often teach themselves.
Besides, when I hire someone I like seeing evidence of side projects that reveal a certain passion for doing good software work. I wouldn't hire an artist or designer without seeing a portfolio either, and software has a lot of things in common with art and design.
I would wager a majority of the best programmers are also self-taught, though. Not because College is a bad experience or sets you up for failure, but because it's really internal motivation and curiosity that makes a programmer grow, not something they learned in a CS course once upon a time. And you've got to be pretty damn curious to make a career of programming without any formal introduction or education.
Programmers are artists. You can't teach art.
Computer scientists are mathematicians. Math must be learned.
The two together are Engineers.
There's one thing I'd like to point out. You can teach the techniques of art, which are used for creating art. You can teach the techniques of programming, which are used for creating software. You can't teach how to create something that's never been created before.
That goes for creating ground-breaking art, as well as ground-breaking software.
Often, it's a set of disparate and interdisciplinary techniques, seemingly unrelated, that are used together to create the cutting-edge.
I completely agree. The article might be right that it doesn't usually happen. But we should ask what the causes are instead of simply accepting that "it can't be taught".
Art (like programming) is the creative application of technique, and you can teach technique.
I joined a small team at a startup. Those guys did their best to make me produce quality code as soon as possible. They were reviewing my code several times a day and teaching me what to do to make it efficient, easy to understand, and well designed. Early on they even made me read "Clean Code", and I am eternally grateful for this.
Now I think part of being self-taught is actually having to find teachers on your own. Software engineering is always about knowledge - whenever you code, you're probably solving a problem that hasn't been solved before. Otherwise you'd just use some library to do the trick. It means that programming is about inventing new stuff, and so when you tell someone about your code, you're actually teaching them about it. Whenever you join a new project, someone has to teach you how to work with their system. If you're an independent contractor building software for your client, you have to learn their ways of doing business.
High tech jobs are all about knowledge transfer, one way or another.
During college I didn't learn that much from my professors when it comes to software engineering in the trenches. I did learn a lot during college, though. It was a time of meeting other people who were passionate about technology. We could exchange our insights and do college projects our way. We would give each other feedback on our work, and so we were actually teaching each other.
I think the above applies to meetup groups, code retreats and similar initiatives. Programmers gather together to learn. Software engineering is all about knowledge transfer, and you have to want to learn, but it doesn't mean there's nobody to teach you. You just have to find your teachers on your own.
You are self-taught, but never alone!
I learned programming at work. I was working in a startup that I joined shortly after the internet bubble burst, in 2002, as their first employee. I was taught, beyond syntax and flow, to find my own "Way of the Programming Fist", as I like to name it, and more importantly to keep an open mind for other ways of programming. The key, and possibly the only lesson that is still relevant today:
"When in doubt, remember that your Way of Programming is complete, which means you can eventually build anything with it, anything. Whether you want to display a barchart from a static printer spool log or light up a bulb with a handclap, your Way of Programming can eventually do it. For every goal, there are probably faster, stronger or easier Ways, but these are never 100% better: there's always a tradeoff. For that reason you don't need to learn the other Ways - but make sure you acknowledge their existence, learn and remember these pros & cons your whole life."
You're talking about "Quality Code" - I didn't get it through people looking at my code, but through me looking at other people's code. My brain designed some abstract "compiler" of sorts, used in my very mind to parse an alien piece of source. For me, "Quality" in that domain is the amount of work I need to do to understand the "Way of the Programming Fist" of the coder, and use it in turn for my own programs. It has a lot to do with conventions, design patterns, formatting, naming and of course the programming language's "family", whether it's procedural, functional, object-oriented or whatever.
A decade later, it feels like it's still only the beginning, but I'm now proficient enough to understand the pros and cons of some of the various conventions, design patterns, algorithms and programming languages I've met along the way and the way they fit with various goals and requirements. I'm mature enough to make my own choices, and I actually have my own programming business.
I would absolutely describe myself as a self-taught programmer, but like the above poster said, I would have gone absolutely nowhere on my own. Programming is all about knowledge transfer - "Here's what I use to do this, and here are the benefits, here are the tradeoffs" should be every programmer's favourite sentence.
Your professors taught you unit testing? In terms of learning the "practical" side of programming, I think that right there gives you a leg up on most CS students.
When I've taught programming to beginners, I try to demonstrate good code, and when I see ugly code I point them in a better direction. I have, however, had students who were stubbornly focused on the output of their code and ignored all my advice. No matter how ugly and hard to debug their code was, they refused to believe it mattered.
But to say that all programmers are self-taught is a little short sighted. Beyond the basic syntax, there are difficult concepts and idioms in many languages or programming paradigms that can be quite hard for a person (who is not familiar with such concepts, idioms, paradigms) to discover by him/herself. These things include pointers in C, reasoning about correctness of programs, recursive techniques (dynamic programming, etc.), and many functional programming concepts. Good teachers can make these things a lot easier to learn.
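As one tiny illustration of the kind of idea that's hard to stumble on alone (this example is mine, not the commenter's): the core of dynamic programming is just caching the answers to overlapping recursive subproblems.

    import java.util.HashMap;
    import java.util.Map;

    // Memoized recursion: dynamic programming in miniature. Naive fib(n)
    // recomputes the same subproblems exponentially many times; storing
    // each answer the first time makes the whole computation linear.
    public class Memo {
        private static final Map<Integer, Long> cache = new HashMap<>();

        static long fib(int n) {
            if (n < 2) return n;                    // base cases
            Long hit = cache.get(n);
            if (hit != null) return hit;            // reuse a stored subproblem
            long result = fib(n - 1) + fib(n - 2);  // otherwise solve recursively
            cache.put(n, result);
            return result;
        }

        public static void main(String[] args) {
            System.out.println(fib(60));            // instant; naive recursion would crawl
        }
    }

The trick looks obvious once someone shows it to you, which is rather the point.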
Interestingly, this guy was a math professor, not a CS professor. But he was very particular about code.
Besides, even after college I learned a tremendous amount about programming from my colleagues and peers over the years; programming isn't one of those things you learn, and then you're done learning it. It sounds like in this case, college didn't adequately cover the basics, and the author decided (to their credit!) to drive their own learning; but I promise there's a great big crowd of helpful peers and teachers out there when you're ready for them.
Though now that I reflect for a second I have to acknowledge that it's all that matters to the stakeholders, and not necessarily to the coders (nor future maintainers).
My school does offer a class called "Production Quality Software" but it's a graduate course and if you want to take it as an undergrad you only get 3 credits.
IMO the situation is not good.
From what I've seen thus far, most school assignments are pretty easy to stumble through without really understanding what you're doing. As much useful stuff as I've learned in class, it's been personal projects that have really solidified things and taught me the most valuable lessons.
I took a course on data structures. It was taught in Java, but it was very clear that Java was not the point. What did I learn from it? Not to write linked lists in Java - I've never had to do that again - but to understand the basics of how to think about storing data in your programs.
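For concreteness, the classroom exercise in question looks roughly like the sketch below (illustrative, not the actual course assignment). The Java is incidental; the lasting lesson is the trade-off it embodies.

    // A minimal singly linked list -- the sort of exercise the course used.
    public class SinglyLinkedList<T> {
        private static class Node<T> {
            T value;
            Node<T> next;
            Node(T value) { this.value = value; }
        }

        private Node<T> head;

        // Insert at the front: O(1), no shifting of existing elements.
        public void addFirst(T value) {
            Node<T> node = new Node<>(value);
            node.next = head;
            head = node;
        }

        // Find a value: an O(n) scan, the price paid for cheap insertion.
        public boolean contains(T value) {
            for (Node<T> cur = head; cur != null; cur = cur.next) {
                if (cur.value.equals(value)) return true;
            }
            return false;
        }
    }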
For me, courses are great for grasping theory.
You go further with self motivation and self-learning since there're only so many class hours, and there're a lot more topics in programming.
The Software Engineering course I took at university was a pretty good introduction to "programming". One of our required texts for the course was Code Complete by Steve McConnell. We were required to read all the chapters that focused on programming style.
This is mostly a criticism of academic CS (which aims to produce systems programmers and researchers). The underlying implication that programming is somehow unteachable is bogus.
Whatever you study in school, you will never be 100% prepared to work.
That being said, code/program design IS a very important skill that seems to be hard to teach in school.
Working with and learning from other people is way more effective (than just writing code) in figuring out how to identify good and bad code. And I think formal CS education usually gives people the right tools (like algos, design patterns, data structures, etc.) to understand and work with others.
I wouldn't be surprised if, 10-15 years down the line, these mediocre corporate programmers (they can only survive in a large organization, where you don't have to pull your weight nearly as much as on a 10 person team) become managers from hell.
Unfortunately, some are also self-taught in grammar and punctuation. (I had to. My snark quota is done for the day.)
It's telling that CMU is considered such a great tech school. Even the best don't know how to teach programming (yet?).
We print out thread libraries (~1000 lines of code) and kernels (~4000 lines of code, prints on 70 sheets of paper or more), and then we go over it with a red pen. We think it's some of the best feedback people get at any school. For the kernel, each group gets a one-on-one meeting with one of the TAs (or the professor) to go over the whole kernel.
I've graded some really abysmal project 0s, but by the time they get to the bigger projects we've stamped out some of the bad habits, and I'm happy to read code written by those students, because they learn the most and produce good results by the end.
The philosophy is that code is for people to read: the next guy to maintain the code in 6 months (or just you, 24 hours later, with no sleep in between), your reviewers, and so on.
My personal opinion is that university is not a place to learn tools (i.e. programming), but to learn to apply knowledge to form solutions to complex problems. We learn about algorithms, data structures and techniques because that is what we need to know to learn to solve hard problems. Programming, while not separate from this (i.e. you can hardly build anything without knowing how to use a screwdriver) is something that is assumed knowledge. Teaching it at university would waste valuable time in courses that are already too packed to give rigorous treatment of all topics covered.
I don't think we should assume the students have the knowledge 1) because they obviously don't have the knowledge and 2) it doesn't match the methodology of every other applied science. I don't assume that students are competent to work in a biology lab before college. Why should Computer Science get a free pass from teaching the lab work techniques that all other applied sciences are expected to teach?
To do otherwise is to require that incoming students already know how to program. That is, I think, indefensible. If students want to test out of the introductory course, fine. But it must exist, just as the intro to physics course must exist for physics majors.
It compares self-taught and college taught programmers to street fighters and martial artists.
I learned simple HTML when I was 11 and have been teaching myself ever since. I'm now in my second year of a degree in CS, and what I've found is that the language and the code itself is not important. What's important are the concepts. Types, objects, methods, the theories behind programming, and all of those less tangible things are what matters, because that is the basic foundation of programming.
Programming is different than other skills. It's as much of an art as it is a science. The science is relatively unchanging, but the art keeps changing. Your code is always evolving, and there's always the potential to make yourself obsolete if you don't continue to teach yourself. I focus on the web, and in working in web design in particular I'm seeing a trend that speaks to the idea that you must always be teaching yourself. There are oodles and oodles of web design/development firms out there that have been around since the late 90's, and you can tell they have been around since the late 90's. Their work looks dated in terms of design and their techniques look like they're right out of 1997! The guys working there used to be young and up on the latest trends, but it's obvious they've stopped learning and young guys are passing them by. Even people with less than half their experience are writing better code and have prettier output.
I think you can tell who got into programming just to get a job and who got into it out of passion by looking at their work. If it continually improves they've got passion, if it plateaus at a certain point then they're just employed and nothing more.
In the end being self taught is a requirement and never an option in this field. Once you have the foundational knowledge of programming then everything else is just a matter of keeping up with new tech and learning some new syntax every so often.
I'm self-taught. I'm trying to filter for confirmation bias here.
I dated a girl (it was awesome but it didn't last) who was a C.S. grad student but wasn't the type to "autodidact" at anything. She was also a stunning musician, an entrepreneur, and an investor.
Ok, basically I'm trying to paint the picture of someone who is clearly my superior in many areas, and understands C.S., but isn't self taught.
I couldn't pretend to know all the reasons, but she wasn't in it just for the job or the money. She seemed a genuine hacker but just not in the same sense as me.
I also made another point about how some programmers never try to branch out beyond what they're comfortable with coming out of the gate (i.e. they stop learning) which could have made it seem like I was saying something else.