Demonstrably wrong. Developers (like anyone) are happy to pay for things if they can identify value. The problem is with the seller, not the buyer: people keep trying to sell things to developers that they can more easily build themselves. But today's platform as a service companies (like Twilio and others named in the piece) are providing a stack that can't easily be replicated by nerds working in isolation.
It's these kinds of forward projections that cause booms and busts, i.e. people assuming that housing prices will increase at a steady, regular rate over time. Just as we saw that housing prices don't move in a steady way, Facebook won't steadily decline over the next 3 months either.
Since it's possible to earn a degree online today, I'm not sure this is the case. Top-tier universities face a significant innovator's-dilemma problem here: they want to enforce their monopoly on the granting of degrees, and they don't want online programs (which can deliver comparable content to students in a more scalable way at a fraction of the cost) to cut into their tuition revenue.
I'm not sure that beating the traditional university is the objective or that it's even necessary. Remember that the essence of a disruptive product is that it's a little bit worse and (at least) a little bit cheaper than the incumbent product. The question, then, for disruptive education plays (of which my company is one) is: which part of the traditional experience are you going to take away in order to achieve scale? Most .edu startups are taking away instructor interaction and emphasizing lectures, which is exactly the opposite of what most students actually need.
No argument there. Like I say in the post, I'm all in favor of the goodness of the basics. I just think that giving undergrads all theory and very little practice is not optimal, and they should have the opportunity to go minimal on theory if it makes sense for them. Very few universities offer such a choice today.
I know. We lacked practice at the university too. I don't think you can blame the university for that, though. Practical things are better taught by people who actually use them on the job, not by PhDs who write scientific papers. That's why I took a long-term internship at an actual company during my second year, which later turned into a full-time job.
if the university purports to teach this stuff, they should hire people who actually know how to teach it. if their tenured PhDs aren't cutting it, they should look elsewhere. (this is how i got my university teaching job.)
It doesn't. The major purpose of a university is to produce scientists who do research, not software engineers who write crud apps. We, the software engineers, are just a byproduct.
> they should hire people who actually know how to teach it
They can't and they shouldn't. Why would someone want to teach at a university when (s)he has enough skills to work at a company for big fat compensation and, possibly, equity? The only way to attract those people to universities would be to offer much higher salaries, which would consequently drive up the cost of education (in my own country it wouldn't even work, because education is "free", i.e. mostly paid for by the government).
And for what - to teach the students something they could have easily picked up on their own by poking around the internets or participating in internships? That is not optimal.
If you study formal grammars in class and don't try writing your own little lexer/parser at home, it's your own fault, don't blame the university. It has taught you everything you need to know to write a parser, and it's your job to put it to use.
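For instance, the kind of toy lexer/parser I mean fits in a page. Here's a minimal sketch (my own toy arithmetic grammar, not from any particular course) of a tokenizer plus a recursive-descent evaluator:

```python
import re

# Tiny grammar:
#   expr   -> term (('+' | '-') term)*
#   term   -> factor (('*' | '/') factor)*
#   factor -> NUMBER | '(' expr ')'
TOKEN_RE = re.compile(r"\s*(?:(\d+)|(\S))")

def tokenize(text):
    """Split input into (kind, value) tokens: numbers and single-char operators."""
    tokens = []
    for number, op in TOKEN_RE.findall(text):
        tokens.append(("NUM", int(number)) if number else ("OP", op))
    return tokens

class Parser:
    """Recursive-descent evaluator: one method per grammar rule."""
    def __init__(self, tokens):
        self.tokens = tokens
        self.pos = 0

    def peek(self):
        return self.tokens[self.pos] if self.pos < len(self.tokens) else (None, None)

    def eat_op(self, *ops):
        """Consume and return the next token if it's one of the given operators."""
        kind, value = self.peek()
        if kind == "OP" and value in ops:
            self.pos += 1
            return value
        return None

    def expr(self):
        value = self.term()
        while (op := self.eat_op("+", "-")):
            rhs = self.term()
            value = value + rhs if op == "+" else value - rhs
        return value

    def term(self):
        value = self.factor()
        while (op := self.eat_op("*", "/")):
            rhs = self.factor()
            value = value * rhs if op == "*" else value / rhs
        return value

    def factor(self):
        kind, value = self.peek()
        if kind == "NUM":
            self.pos += 1
            return value
        if self.eat_op("("):
            value = self.expr()
            assert self.eat_op(")"), "expected closing parenthesis"
            return value
        raise SyntaxError(f"unexpected token: {value!r}")

def evaluate(text):
    return Parser(tokenize(text)).expr()
```

An afternoon spent writing something like this will cement the formal-grammar material far better than another lecture would.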
Where I spent most of my undergrad (Georgia Tech), we had a system called Threads, which gave students tracks in which they could focus their study of computer science.
These tracks were generally all theoretical (the exceptions being some classes in platforms and devices). Even my friends who had a love for software engineering ended up learning the thinking behind each process rather than blindly memorizing whatever process happens to be practical right now.
The way I treated academic learning is very similar to the process described above by 10098. The learning you experience in college should be some form of aggregate of past ideas, trials, and tribulations. This is how I (as a customer) have found most universities treat their undergraduates: rather than tailoring them to become masters of a specific problem, they try (albeit not always successfully, and there are plenty of people in universities today who don't enjoy, want, or have this mindset) to build the next generation of people who can find flaws in our current society, in the hope that they'll be motivated to fix them.
That being said, you are making the argument that software engineering is a vocational skill. While I'm not denying that much of what I've seen in my limited time in the industry involves skills akin to any other vocational profession, I don't believe emphasizing vocational skills helps students. It might, in the short term, give them a strong basis for obtaining a software-related position. It might also let them perform really well in their jobs at that moment. It does not, however, enable them to question the process they've learned, nor does it help them find problems within a process.
It should also be noted that most companies aren't looking for a master software engineer. They test for intelligence and social skills rather than how strict the candidate is about testing practices or which design cycle they prefer. Graduates with those biases also limit their own employment prospects, because they'll end up fighting the work environment already established by the company's views on the subject rather than adapting to it.
did you actually read the post? i pre-rebutted both of your arguments.
i understand quite well what a CS program is, since (as i pointed out in the post) i've been a student in three university CS programs and taught coding at a fourth.
i'm saying that most of what goes on in academic CS departments is not what we need as a society and it's not what most undergrads hope to get into when they enroll. most undergrads get into CS to write software, not to learn "theory" or "compiler design" (although, again, as i pointed out in the post, if this happens to be your bag, more power to you -- but this isn't really about you, it's about the mismatch between what university CS programs do and what people and society really want or need.)
software engineering is obviously teachable (i've been doing it for 20 years) and that's true even if you personally didn't have any good software engineering instructors or mentors. it looks like you and i are in agreement that rapid change in software engineering makes universities not the optimal place to teach software engineering (that was the point of my post). but it doesn't follow that because universities are bad at this, that software engineering is "not teachable".
I do think cube123 read your post, and I think a valid conclusion from reading it is that you have missed the point of what a CS program is.
I suspect that this comes partly from your own history, and partly from the way in which CS 'grew up' as a discipline.
The point of your post is that being employed as a programmer, and learning programming, is not what a Computer Science degree is about. Try getting an undergraduate degree in Electrical Engineering; it's similar in that you get lots of theory on lots of things but little practical experience.
The leap you make which causes me to discard your argument is this one, "but this isn't really about you, it's about the mismatch between what university CS programs do and what people and society really want or need." You have taken on the role of speaking for society and yet you haven't successfully made a case that you can accurately represent what society wants.
You can make an argument that there is a need for a training program between high school and employment that teaches people how to write programs to solve problems. You can call that Computer Engineering, Applied Programming, Programming Technology, what ever. Such programs exist, both in the 'for profit' University world and elsewhere. You can argue that such programs should be structured more along the various processes for producing reliable, testable code, and you do some of that in your education project (always great to put your money where your mouth is like that).
But the opportunity to teach folks who 'just want to code' does not disqualify CS as being a valid course of study, just like 'Accounting' doesn't disqualify 'Mathematics' as a course of study. So your central thesis that 'more universities should shut down their CS programs' fails the 'sniff' test.
Now if you said 'More universities should offer applied-programming-type degrees' and used your points about how that is what many people want to do, that would be a reasonable conversation to have. Do we want to elevate what have been things like ITT Technical College programs into a more general-purpose degree program? Something between 'JavaSchool' and 'CS'? I can see arguments for and against.
But if you are going to blurt out things like "Most undergraduates and professionals actually want to learn applied software engineering, not 'computer science'" you really should try to develop some foundation for that claim. What evidence do you offer that this claim is valid? Some study of college exit exams? Some survey of recent CS graduates? A self-selecting poll on Reddit? It's all well and good to wonder whether most folks just want to code, but using that as a claim in your argument that universities should restructure their CS programs, and expecting your readers to 'buy in', requires that you provide some basis for it.
Your post makes a bunch of claims, four of them in big bold font, for which you provide no supporting evidence or structure at all around why the reader should believe them. Because of that, your message is lost.
I agree with you that it is an interesting topic and as we've moved 'programming' into a more general skill requirement based on the explosion of 'programmable' devices, you might be more successful making the argument that we need to offer a better high school programming class. (much like Typing was offered in the 70's as a way to provide a generally useful skill to High School students).
Just a terminology note: 'Computer Engineering' is an already-existing term for the discipline of making computers -- designing CPUs and building hardware in general (our CE program has a strong security focus, so they do a lot of RFID ASIC stuff and some FPGA crypto stuff) and low-level programming like compiler backends (SPARC register windows are pretty!), microcontrollers, and interfacing with other hardware. And whatever Hennessy and Patterson are doing.
So you shouldn't name gutted code monkey courses Computer Engineering :)
>i'm saying that most of what goes on in academic CS departments is not what we need as a society and it's not what most undergrads hope to get into when they enroll. most undergrads get into CS to write software, not to learn "theory" or "compiler design" (although, again, as i pointed out in the post, if this happens to be your bag, more power to you -- but this isn't really about you, it's about the mismatch between what university CS programs do and what people and society really want or need.)
I don't think that catering to what undergrads want is a good thing, at all. The majority of undergrads are 17 to 21 year old kids that really don't know what they want to do. They don't know the industries they want to get into. I certainly didn't, and my experiences at UIUC turned me away from going into game programming to a totally different space. And I wasn't a good programmer out of college. I was pretty terrible. But work has taught me a LOT, most of which I wouldn't understand without the theory basis.
>software engineering is obviously teachable (i've been doing it for 20 years) and that's true even if you personally didn't have any good software engineering instructors or mentors. it looks like you and i are in agreement that rapid change in software engineering makes universities not the optimal place to teach software engineering (that was the point of my post). but it doesn't follow that because universities are bad at this, that software engineering is "not teachable".
Like I said in my post, the difficulty of teaching programming and SE in general is that the field is changing at an incredibly rapid pace. Languages have evolved: C/C++ were the de facto languages for a while, then Java gained quite a bit of popularity. Recently, C# seems to have taken a large share of the mindshare.
20 years ago, a lot of SE practices were highly structured, highly documented (and highly time-wasting) systems; IBM's RUP is one example. 10-15 years ago, less structured systems like XP started showing up and gained quite a bit of popularity. In the last few years, Agile has become very popular. I learned about RUP and XP in my SE courses, but the place I work at doesn't use any of the above.
So what would the program teach? Just the most current, up to date stuff? Or would you try to teach a bit of everything?
The problem with teaching just the brand new shiny is that you end up with the Java mills from the 90's. They're not teaching programming. They're teaching Java. And it may be worthless in a year. While that may just be what the student wants, it's not what society needs.
> I don't think that catering to what undergrads want is a good thing, at all
that's not what i proposed at all. what i propose is to bring curriculum into the 21st century, and to call out the fact that cargo cult thinking and poor organization on the part of universities are standing in the way of that.
I don't quite understand what you mean by that. Dijkstra's algorithm is the same both in 20 and 21st centuries, ditto for red-black trees. Are you proposing universities should be teaching iOS programming and web development with ruby instead of data structures and complexity theory?
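To that point, a heap-based Dijkstra written today looks essentially the same as in any textbook from decades ago. A minimal sketch (the adjacency-list shape and names here are my own convention):

```python
import heapq

def dijkstra(graph, source):
    """Shortest-path distances from source, given a graph of the form
    {node: [(neighbor, weight), ...]} with non-negative edge weights."""
    dist = {source: 0}
    heap = [(0, source)]          # priority queue of (distance, node)
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue              # stale entry already superseded
        for neighbor, weight in graph.get(node, []):
            candidate = d + weight
            if candidate < dist.get(neighbor, float("inf")):
                dist[neighbor] = candidate
                heapq.heappush(heap, (candidate, neighbor))
    return dist

graph = {
    "a": [("b", 1), ("c", 4)],
    "b": [("c", 2), ("d", 6)],
    "c": [("d", 3)],
}
# dijkstra(graph, "a")["d"] is 6, via a -> b -> c -> d
```

That code would have been idiomatic twenty years ago and will still be correct twenty years from now, which is exactly why it belongs in a curriculum.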
I don't think they should be teaching that at all. It's a bigger waste of time than teaching differential equations. At least diff. equations are there forever and might one day come in handy if you end up writing software that models physical processes. Ruby, on the other hand, will be replaced by some other trendy language in a few years. Plus, if you already know a couple of programming languages with different paradigms, picking up Ruby (or Python, or PHP, or Java) is really easy to do on your own; you can start writing production code in a matter of weeks.
This isn't about whatever language is the flavor of the month, although seeing universities teach a language that is less than 20 years old would be nice. It's more about CS programs that produce graduates who have no practical ability to code at all, in any language.
Language proficiency, by and large, is the easiest thing for anyone to pick up. Again, new CS grads are, on average 21 or 22 years old. I was programming C/C++ for around a decade at that point, and I was still a terrible programmer. Work taught me a lot more on how to write good C code.
Programming languages are tools. Knowledge of a language is not the final goal of a CS education, because it's the same as teaching a mechanical engineer how a drill works.
Also, the tools are based entirely on the theory of computing. And once you understand the theory, understanding what the tools are doing, and how they work, becomes much easier.
My problem with your position is that what you are proposing does not require a 4-year degree. I think a 6-month to one-year vocational program would be of more advantage to a student than a 4-year degree in what amounts to a vocational education. The problem is that no one respects a vocational education and no one views it as a competent replacement for a 4-year degree (even if the student is better prepared for the job). So it is a catch-22: you are wasting the student's time with a 4-year vocational degree, since the stuff they learn in year one is out of date by graduation, but the perception of anything shorter is that it is a fly-by-night education. I don't disagree with your premise that students are not being served, but maybe the answer is improving vocational education and its perception, not butchering university academia; the latter has been tried and was not very successful.
There's no engineering without a solid theoretical background. You want to do civil engineering, they'll make sure you thoroughly understand Newton's laws; electrical, Kirchhoff's. And that's just the start. In fact, you're much better prepared to build big bridges if all you've studied is theoretical physics than if you had personally built a crossing over every ditch on campus.
Wait -- no engineering is possible? None at all? Really? Then how do you explain the success of high school dropout Benjamin Franklin?
The point of my post is not that theory is worthless (which I call out several times in the post). It's really that there is more than one way to learn this stuff and that most undergraduates pick out a sub-optimal path for cargo-cult reasons (such as the "halo effect" of elite universities which I also called out in the post).
> most undergrads get into CS to write software, not to learn "theory" or "compiler design"
The flip side is the undergraduate who enrolls in a CS class, discovers (so it wasn't their bag before enrollment) that "compiler design" is far more exciting and intellectually challenging than developing typical business/CRUD apps, only to find out after graduation that such jobs barely exist. Yes, let's dispense with academic CS departments.
Maybe the message you tried to convey is that attending mediocre CS courses is much worse than attending good SE courses. That I agree with, but I do not agree that SE should be totally devoid of theory.
This is like saying every plumber needs to know how to mine copper and smelt it so they can manufacture their own pipes. There are different kinds of software engineers, and the kind that you happen to be may not necessarily be the kind that's in demand by every employer.