Re: basic skills. I think we have a big problem with schools. Why don't they teach how to test? Why don't they teach how to debug? Really, what are they teaching that is more important than these basic skills?
They do teach those things, but a classroom context cannot replicate the reality of large-scale, long-lived, multi-developer projects. There's just not enough time in a college degree to give any realistic approximation of that. Besides, what does the academic world know about it? They're not qualified to put together that curriculum, and what's the point anyway when you can learn it in the trenches at any company in the private sector?
Instead, CS focuses on theory. A distillation of the field's most important theoretical foundations is actually a pretty useful thing to get, because you won't find it anywhere but at a university. You can Google all these things, sure, but first you have to know what they are. That foundation will serve you over the course of a career in which best practices for testing and debugging, and the platforms themselves, will change dramatically. A couple of decades down the line you won't remember the details of what you learned in this class or that, but sometimes just a tangential awareness of something, such as the existence of dynamic programming, gives you the hook you need to Google around and find an efficient solution that an untrained programmer might never know to look for.
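To make the "hook" concrete: dynamic programming caches answers to overlapping subproblems so an exponential recursion becomes polynomial. Here's a minimal illustrative sketch (coin change, a classic textbook example, not something from the original post) where the obvious greedy approach gets the wrong answer and the DP one doesn't:

```python
from functools import lru_cache

def min_coins(coins, amount):
    """Minimum number of coins from `coins` summing to `amount`, or -1.

    Dynamic programming: each subtotal is solved once and memoised,
    so the recursion runs in O(len(coins) * amount) time.
    """
    @lru_cache(maxsize=None)
    def solve(rest):
        if rest == 0:
            return 0
        best = float("inf")
        for c in coins:
            if c <= rest:
                best = min(best, 1 + solve(rest - c))
        return best

    result = solve(amount)
    return result if result != float("inf") else -1

# Greedy (largest coin first) would pick 12+1+1+1 = 4 coins;
# the DP finds 5+5+5 = 3 coins.
print(min_coins((1, 5, 12), 15))  # -> 3
```

Someone who has merely heard the phrase "dynamic programming" can find all of this in five minutes of searching; someone who hasn't may never think to look.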
Computer Science courses must maintain a strong focus on theory. An entire CS course could be conducted from Knuth's TAOCP with nothing but paper and pens, because Computer Science is a highly specialised extension of Applied Mathematics into computing environments. These courses produce researchers. Programming, software development and project management are out of scope.
Software Engineering courses are meant to cover the real-world application of the Computer Science discipline, the Project Management discipline, and a broad variety of other topics relating to business and management. These courses produce software developers and software project managers who mostly sit in modern interpretations of the cubicle farm, developing the 150th interpretation of a business productivity app.
Experience levels
3-5 years at university is not enough time to gain expertise in the wide range of areas that make someone a brilliant hacker. The best hackers graduating from university tend to have 10 or more solid years of diverse and highly creative experience through hobby involvement (open source development), school experiences and perhaps relevant paid work.
Work experience tends to lack the diversity and creativity required to develop and maintain impressive hacking skills. Changing jobs every 2 years or maintaining a strong open source/hobby interest can counteract this problem.
Suggestions for higher education courses
1. Treat Computer Science in its purest form -- as an extended Applied Mathematics course.
2. Encourage students to complete a Computer Science and a Software Engineering course together. This approach ensures that students graduate with a decent understanding of essential theory (graphs, sets, complexity, algebra, etc.) and have a job prospect at the end.
3. Replace toy problems and short assignments in Software Engineering courses with longer real-world experiences. Open source projects, student-owned start-ups and short stints of relevant paid work are a much better environment for applying theory and practising software project management principles.
4. Make it clear to students that learning should be a lifelong goal. Diversity of experience across completely different fields (example: marine biology, tourism and software development) is of particular importance.
There is no substitute for hard-won experiences, but debugging is a skill that can be taught. The academic world (well, some part of it) knows about debugging and is surely qualified to put together a curriculum. Really, go read "Why Programs Fail".
A lot of what I know about debugging I learned in the trenches, but I also learned a lot from that book. My question is why I had to discover this book myself instead of being taught from it.
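For readers who haven't seen the book: one of the teachable techniques it covers is delta debugging, i.e. mechanically shrinking a failure-inducing input until only the part that matters remains. Here's a toy sketch of the idea (a much-simplified minimiser, my own illustration rather than the book's ddmin algorithm, and `fails` is a stand-in test oracle):

```python
def simplify(data, fails):
    """Shrink a failing string input while `fails` still reports failure.

    Repeatedly tries deleting chunks of the input, halving the chunk
    size each pass; keeps any smaller input that still reproduces the bug.
    """
    chunk = len(data) // 2
    while chunk >= 1:
        i = 0
        while i < len(data):
            candidate = data[:i] + data[i + chunk:]
            if candidate and fails(candidate):
                data = candidate      # smaller input still fails: keep it
            else:
                i += chunk            # this chunk is needed: move past it
        chunk //= 2
    return data

# Toy oracle: the "bug" triggers whenever the input contains "<>".
print(simplify("abc<def>ghi<>jkl", lambda s: "<>" in s))  # -> "<>"
```

The point isn't this particular snippet; it's that debugging has a body of systematic, teachable technique behind it, which is exactly what a curriculum could cover.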
In a CS class I took, we built a project over the course of a month. The TAs then introduced 5 bugs into the project, locked us in a room for 3 hours, and we were each required to find and identify the 5 bugs in that time. If you failed, you did it again a week later for a lower grade.
In my CS program they taught debugging, but the majority of what I learned about it happened in the lab. Working on the problem sets was a microcosm of debugging that transferred quite well to the professional world, but I suspect not all CS curricula have as rigorous a programming practice as mine did (University of Minnesota, based on the MIT curriculum: Scheme, C, Java, C++, Perl).
I know people who are getting "web developer" certificates and degrees and who have never heard the words "testing" or "source control". In 2012. Sad.