> There’s No Feedback Loop Between Industry & Universities
Stuff the industry; CS isn't vocational training.
Most of the coding that goes on in the industry is mind-numbingly dull from a CS perspective.
The idea that the industry should play the fiddle and CS should dance to the tune is ridiculous.
This article fails to define what it means by "stagnating". "Stagnating" isn't a suitable single word for describing the memory of an experience (such as an undergrad program) which wasn't what one expected.
A CS department stagnates when it doesn't publish papers, I would think. Just like a political science department or English department. (An English department isn't said to be stagnating because it doesn't teach people how to speak English.)
As far as the undergrads go, they would be poorly served if the focus of the program was to teach them some application stack du jour. Even if that's what the industry wants today, that may change by the time they graduate.
Civil engineers have to take physics, and geology, and surveying... and crucially a PE exam after graduation. One is not really an engineer until that's out of the way.
Yes, testing the ability to do tasks related to the actual job is refreshing.
Similarly, fizzbuzz is generally massively more useful than whatever silly CS problems people are asking in interviews.
I use fizzbuzz and character counts (histograms) as my primary weapons.
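For what it's worth, those two screens amount to something like the following (a minimal Swift sketch; the exact prompts and expected answers obviously vary by interviewer):

```swift
// FizzBuzz: print 1...100, substituting Fizz / Buzz / FizzBuzz.
for n in 1...100 {
    switch (n % 3, n % 5) {
    case (0, 0): print("FizzBuzz")
    case (0, _): print("Fizz")
    case (_, 0): print("Buzz")
    default:     print(n)
    }
}

// Character count (histogram): map each character to its frequency.
func histogram(of text: String) -> [Character: Int] {
    var counts: [Character: Int] = [:]
    for ch in text {
        counts[ch, default: 0] += 1
    }
    return counts
}

print(histogram(of: "hello world"))  // e.g. ["l": 3, "o": 2, ...] (key order varies)
```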
If they pass that, then since I'm usually looking for iOS developers, they get a simple app with poor table view scroll performance and are asked to improve it.
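That exercise usually comes down to the same couple of fixes. A hypothetical sketch (the class, cell identifier, and slow thumbnail call are all invented for illustration), assuming the sample app allocates a new cell per row and does expensive work synchronously in cellForRowAt:

```swift
import UIKit

// Hypothetical sketch of the kind of fix the exercise is after:
// reuse cells instead of allocating a new one per row, and keep
// expensive work (here a pretend thumbnail render) off the main thread.
final class ItemListDataSource: NSObject, UITableViewDataSource {
    var items: [String] = []

    func tableView(_ tableView: UITableView, numberOfRowsInSection section: Int) -> Int {
        return items.count
    }

    func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
        // Reuse: dequeue instead of creating UITableViewCell(style:reuseIdentifier:) on every call.
        let cell = tableView.dequeueReusableCell(withIdentifier: "ItemCell", for: indexPath)
        cell.textLabel?.text = items[indexPath.row]

        // Slow work goes to a background queue; only the UI update hops back to main.
        DispatchQueue.global(qos: .userInitiated).async {
            let thumbnail = ItemListDataSource.renderThumbnail(for: indexPath.row)
            DispatchQueue.main.async {
                // Guard against the cell having been recycled for a different row.
                if tableView.indexPath(for: cell) == indexPath {
                    cell.imageView?.image = thumbnail
                }
            }
        }
        return cell
    }

    private static func renderThumbnail(for index: Int) -> UIImage? {
        // Stand-in for whatever the sample app does synchronously per cell.
        return nil
    }
}
```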
Agreed; in Canada, software engineering is a degree that teaches you software methodology, design patterns, project management, QA/testing, and general algorithms.
And as you said, there is already vocational training: technical colleges. The question then, of course, is whether companies will hire people who only have those degrees -- some will and some won't, probably. But education geared toward a particular industry is already there. Maybe the feedback should happen there.
There is another important kind of learning -- self-learning. With a GitHub account and free time you can prove to some companies that you know a lot more useful things than you could with a 4-year degree from an Ivy League school.
But let's take the link between academia and industry a step further (mostly for fun):
There used to be company towns. Say a coal company or an automotive plant would come in and set up stores, schools, hospitals, everything. They would even issue their own pseudo-currency, which could only be used in their company stores.
So maybe there could be a GoogleLifePlex, or FacebookTown. Families live there, have children, all services provided. Special stores, education, healthcare, book clubs. Students are taught programming using technologies from Google or Facebook only and are groomed to be the next engineers.
CS has its fundamentals -- a queue is still a queue, a binary search tree is still a BST, a graph is still a graph, and it is useful to know how to traverse it. So those parts should stay there. Obviously there is exciting new work happening (distributed systems, CRDTs, others..) but a lot of those are still built on fundamentals.
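For instance, a breadth-first traversal is still the same handful of lines it has always been, queue and all; a generic sketch, not tied to any particular course:

```swift
// Breadth-first traversal over an adjacency list: a queue and a graph,
// exactly the fundamentals in question.
func breadthFirstOrder(from start: Int, in adjacency: [Int: [Int]]) -> [Int] {
    var visited: Set<Int> = [start]
    var queue: [Int] = [start]   // array used as a queue; fine for a sketch
    var order: [Int] = []

    while !queue.isEmpty {
        let node = queue.removeFirst()
        order.append(node)
        for neighbor in adjacency[node, default: []] where !visited.contains(neighbor) {
            visited.insert(neighbor)
            queue.append(neighbor)
        }
    }
    return order
}

// Example: edges 0->1, 0->2, 1->3
print(breadthFirstOrder(from: 0, in: [0: [1, 2], 1: [3]]))  // [0, 1, 2, 3]
```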
But there are also classes tied to specific languages and technologies that are often not as relevant anymore. When I went to school in the early 2000s we still had to take two assembly language classes for a CS degree. Those were very useful, but it probably could have been just one class, with the freed-up slot going to another networking and/or protocols class.
There were other instances of that. There was almost a whole AI class wasted on expert systems. Those were cool, but in the 1980s, not in the 2000s. There was even a class taught in COBOL; it was an elective, but still... (I didn't know what it would be and signed up for fun; after the first class almost half the people dropped it, including me.)
Another major driver of change, or the lack of it, is faculty. Remember, in the US faculty are tenured for life. They can't easily be fired for not picking up the latest technology. They can still be teaching COBOL and expert systems. So that is what some do -- they just teach whatever they wrote their PhD thesis on, so you get a mix of whatever happened to be awesome anytime from the late 60s up to 2010 (depending on how long the tenure process takes). I personally saw some faculty do a fantastic job picking up the latest interesting stuff, trying to stay relevant. Some, not so much.
I went to two of the top 5 CS schools in the US for undergrad and grad. As a current web dev, it was a total waste of time as far as helping me be useful in the industry. It gave me some street cred on my resume with bigcorp & people who don't know how to recruit, but that was about it. I ended up learning most of what I needed on my own, and not through a formal structure such as that of a university. I still frequently go back to the same texts we used in class, but I get a lot more from them now that I actually have a pragmatic need for that knowledge, as opposed to cramming something really abstract into my skull in 3 months.
I'm sure academia would have been a lot more useful if I had really needed specialized knowledge in some specific area. Even then, I doubt I couldn't have taught myself everything, or found someone who could.
I did get a lot out of team projects, though; those were the closest I got to the real-world experience of working in teams.
Software engineering is a craft. At some point we need to ask ourselves if we aren't served much better by turning our education into craft schools / apprenticeships etc. over the current awkward model of trying to stuff vocational training into academician-training programs that were never meant for that in the first place.
A program like Digipen/Guildhall or some well executed long-term bootcamp is fairly in line with what would work best for creating productive industry contributors. I remember one of my program directors being very clear about it: "We don't make software engineers, we make computer scientists who can go into research". I don't think that part is ever obvious to people going into CS programs.
CS is not Software Engineering, and should be treated as such. People who do nothing but coursework will be lousy developers, and people who don't get proper CS instruction will have a harder time becoming proper software engineers. The problem is, getting a CS degree is by far the most prominent and common option available. Considering the number of people diving into CS to become programmers, I wish there was a bigger focus on _software engineering_.
My program was lackluster from a development perspective: no real projects, the coding-centric courses all had really poor execution, and there was a lack of instruction on how to actually program and design; however, it was excellent for CS fundamentals and foundations. I ended up okay because I spent a lot of time in internships and just learning on my own, which balanced out the abstract stuff from school, but I witnessed quite a few people graduating who could barely understand how to work with APIs.
EDIT: Just to add some background: I graduated last fall, got into one of the big, popular software companies in the U.S., and reaffirmed that the 4.5 years I spent getting my degree did not seem worth it. It wasn't a complete waste; the foundation is really important for software engineering, but there was a lot of cruft and "took a class because of requirements" that went on.
If all you ever wanted to be was a webdev, you shouldn't have gotten a cs degree, just like if all you ever wanted to do was be an electrician you shouldn't have bothered with the EE major.
"We don't make software engineers, we make computer scientists who can go into research".
I don't think that part is ever obvious to people going into CS programs.
It wasn't to me. The frustration I felt for 4 years is beyond belief. I couldn't stand it any longer by my fourth year. Stupid of me for dropping out at the eleventh hour.
Now I wonder why companies aren't pushing for programming over the science.
The claim you always hear is that "people with a CS background have much stronger fundamentals".
I question that claim and believe that the people perpetuating the meme have never actually hired and mentored software engineers themselves.
In my experience there's very little correlation between being academically great and being highly productive in an unstructured environment such as that of an early-stage startup. In fact, if you didn't go to school and still managed to learn a ton about your craft and the underlying principles, I'm much more impressed.
Couldn't agree more; every time I'm asked to invert a binary tree in an interview, I wonder exactly what kinds of problems this company is actually solving.
Like do they not have a general algorithm for this in a library?
Are they looking for more efficient ways of inverting trees?
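For reference, the whole exercise fits in a few lines once you've seen it; a throwaway sketch (node type invented for illustration):

```swift
// The interview chestnut: swap every node's children, recursively.
final class TreeNode {
    var value: Int
    var left: TreeNode?
    var right: TreeNode?
    init(_ value: Int, left: TreeNode? = nil, right: TreeNode? = nil) {
        self.value = value
        self.left = left
        self.right = right
    }
}

func invert(_ node: TreeNode?) -> TreeNode? {
    guard let node = node else { return nil }
    let oldLeft = node.left          // hold one side while we overwrite it
    node.left = invert(node.right)
    node.right = invert(oldLeft)
    return node
}
```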
> I still frequently go back to the same texts we used in class, but I get a lot more from them now that I actually have a pragmatic need for that knowledge
So you knew that there was something useful you could know about, and you knew where to find the information, and you were able to comprehend and absorb and use it.
So not a total waste of time. In fact, pretty much exactly the point of tertiary education.
I'd argue that knowing where to look up information is not worth a couple of hundred thousand $ and half a decade in a classroom.
As I mentioned, most of my comprehension was from learning about things on my own, not because someone in the faculty explained it to me. In fact, it was really difficult to get any kind of mentorship at these schools, given how all instructors were researchers who always seemed very unhappy having to teach courses.
College is definitely too expensive, and few would pay that much money just for the info gained from coursework (any of the coursework... not just CS). It also seems counterproductive that society expects everybody to go to college.
Unfortunately, nowadays college is more about providing a network + campus recruiting opportunities + some qualification on a resume.
I always feel like I'm in a tiny minority when reading people talk about the practical irrelevance of CS coursework at universities. I loved learning the theoretical aspects of CS: discrete math and algorithms, compilers, using Haskell, proving stuff...
But I think I would be a bad-to-mediocre programmer in the real world (I ended up doing something different).
All of that stuff is very useful, but you don't need to be in school to learn it. Dipping into the theory doesn't suddenly stop once you get out of school, in fact it should accelerate.
Maybe my experience is unusual but I've found the opposite to be true. Since leaving school I have very rarely had to use theory or refer back to it. I still learn more but that's only because I find it interesting and I only do it in my spare time.
So, part of the problem is that computer science in a lot of schools (and I'm a bit familiar with both Rice and University of Houston here, having interacted a lot with students and teachers at both) is really some weighted sum of: computer engineering, software engineering, computer science (theory), vocational programming (lol game development), computational/applied mathematics and statistics, and maybe web development.
Depending on the program, these are weighted differently -- for example, Rice has some excellent computer engineering (compilers, networking) and (for a while anyway) software engineering projects (here's a customer, here's your team, organize and deliver this semester, don't fuck up but we'll guide you). It also had some good coverage of the purer CS theory as well as applied math (the robotics and graphics courses are pretty awesome for this). Next to no web development was covered (which makes sense to me).
As a result, you can take a Rice grad and drop them into basically any project and expect them to be okay--they could get up to speed on the vocational aspects or whatever weird algorithmic stuff you needed.
What I'm concerned about, though, is that a lot of programs seem to be adjusting to meet the gold rush mentality of the current software era, and are neglecting to teach fundamentals and theory that make the difference between a middling-decent junior/intermediate developer and somebody suited to really do senior and architecture work.
Our industry, incidentally, is doing nobody any favors. This is at least partly because of the number of well-meaning and excited amateurs who continually reinvent technologies and sharpen axes. This pretty much guarantees that any school trying to be merely vocational has its work cut out for it, and will be out of date rather quickly.
Businesses and companies want and expect universities and colleges to train CS grads to program in certain languages. But that's not the goal or purpose of a 4-year degree. That's what a 2-year associate's degree in application development or programming from a technical or community college is for.
What my 4-year CS degree gave me was a better understanding of the macro view of computer systems. It makes me consider "just spend $$$ on more memory" rather than "spend $$$$$+ tweaking the code". It allows me to quickly find issues related to computer systems, from hardware/server setups/designs and networking to specific hardware and software interactions. It's viewing the forest instead of the individual trees.
Very much this. CS programs force you to look at the various theoretical aspects of CS, such as algorithm analysis, formal verification, type theory, etc. The idea is that you will come out of it with the raw tools such that if you WANT to go into industry, you will have the fundamental skills to be able to learn whatever it is you need to learn. Hopefully you learn some programming chops along the way.
4 years of web dev is useless. 4 years of iOS and Android development is useless. 4 years where you learn systems programming, functional programming, distributed systems, OO, concurrency, parallel algorithms, algorithm analysis, compilers, networking, that all pays off in spades. Not to say people without these degrees can't do that. Not at all. But having the fundamentals instilled in you helps mold your mind for when the next thing comes along.
The quoted complaint that the CS curriculum didn't assist one with their startup is an odd one. Startups are meant as vehicles for rapid growth with an end goal of a high exit, and usually based on top of some form of end-user application that can be tiered, sold per unit or commissioned to advertisers.
In contrast, doing CS research (such as research operating systems), despite its high technical importance, frequently has no opportunity for direct end-user monetization and is totally inappropriate for a startup structure. It can yield concepts that may lead to huge cost-saving benefits once implemented in the wild, but in and of itself it has no market value, only possible utility.
In all honesty, I think the communication failure mostly falls on the industry's shoulders, not on academia's. The industry has time and time again demonstrated that it values short-term conceptual complacency and having more of what it knows, as opposed to any truly large paradigm shifts (in software, at least).
Computer Science isn't software engineering. I know lots of good computer scientists that are terrible software engineers (and vice versa). So expecting a Computer Science degree to teach you to be a software engineer is like expecting an English degree to teach you to be a writer. It's useful, but not essential.
The key thing is that software engineering is a vocational skill. It can only be learned on the job working as part of a team (very little software is developed solo). So if your degree course claims to teach you how to be a software engineer but doesn't have at least six months working in the field, then it's worthless on that front.
If industry wants better software engineers, it should invest in them itself rather than getting someone else to do the job for it.
I've always considered "Computer Science" and "Computer Programming" to be different things.
In school I hated the classes where I had to mathematically prove functions and algorithms. Felt like the biggest waste of time ever. That's CS. I wanted to learn how to make cool shit. CP if you will. Substitute another term if you like.
In hindsight I'm glad I took the CS classes. But I also recognize that a strict CS degree will not necessarily prepare someone for a job, while a strict CP degree (again, substitute the term if you wish) may not give everyone the theory and structure they ought to have.
The assertion that 20% of a random sampling of Carnegie Mellon professors have industry experience -- in the sense of the industry that graduating undergraduates "go into" -- is, IMO, questionable.
I'm inclined to wonder how much of that experience came in the form of internships or jobs in industry research labs (which is technically industry, but not the same "industry" their students are going into).
I wouldn't worry that computer scientists have had a hard time figuring themselves out. Academia is a kind of a dinosaur anyway, though there are a lot of smart people in it, cranking out good work. I just don't think Academia does them justice and it will be obvious very soon. Scholars who embrace decentralized teaching/learning will have such a strong information asymmetry advantage they will make the academic output of universities look like crayon drawings.
You can argue that the academics will adapt and reform their institutions in the decentralized space, but I would argue it won't really be Academia as we know it.
This is a pretty naive viewpoint, probably written by someone early in their career. CS programs should be teaching fundamental skills: algorithms, software design and engineering, computer theory. In industry it's much better to have a team member with a firm grasp of the fundamentals than not, even if the latter is an expert on the framework du jour. The one with strong fundamentals will grasp the framework in a matter of weeks. Not so much for the reverse... additionally, the poor-fundamentals programmer will pollute the shared codebase with bugs and performance gotchas for years.
The greater problem for CS education is improving the curriculum and outcomes for incoming students who haven't already been coding since youth. Women and minorities often don't have that advantage. In my undergrad, I saw tenured research profs catering courses to the top of the curve, mostly privileged males who already had a strong background in STEM fields & programming.
We should differentiate between computer science and software engineering. They are somewhat overlapping, but SE obviously places more emphasis on software. And I'd be alright saying CS focuses more on hardware.
Is CS really more focused on hardware? In my experience that hasn't been the case or at least a key distinction. I've found CS has more focus on theory and algorithms where SE is more about dealing with code base complexity.
Fair enough. I jumped the gun. SE has much much less concern about hardware. CS applies algorithms and protocol design to practical uses in hardware, networking, computer systems, and software.
Hiring CS grads to create CRUD apps is like hiring EE grads to wire your house. Even if the EE grad could figure it out, odds are it wouldn't be up to code.