

Ask HN: Help redesign a Comp Sci major - mitguy

I have the task of refocusing the Comp Sci program at a small liberal arts
college, with an even smaller Comp Sci department. The traditional
MIT/CMU/Caltech theoretical program just doesn't work for these students. The
emphasis should be on employable computing skills that can be delivered almost
entirely by one prof. What should I teach? What can be cut? What would you
call the major?
======
EnderMB
I'm absolutely shocked that people are okay with this.

One of the biggest problems I've found with recent graduates is that the
programs they've come from have been nothing more than trade schools for Java
and PHP developers, and most of them lack even a basic underlying knowledge
of things like algorithms, paradigms, and how to write efficient code. This
is not computer science; this is basic programming, and it is no better than
what you'd get from the numerous video lectures on the web.

In short, employable skills are best left to employers, and academic subjects
are best left to academia.

If it were my choice I'd make Computer Science a purely theoretical course
from the outset. If these kids don't have a background in Math give it to them
so that they can read through the likes of Introduction to Algorithms and
TAOCP with no issue. Don't even let them touch code unless it's to illustrate
a theoretical aspect of CS.

Once they've got a solid year of CS behind them and they're all of a
sufficient standard in the theoretical aspects of CS then they can be
introduced to programming. However, I'd steer clear of the employable
languages and force them into Python, Lisp, Haskell, Prolog, R, and co. to
reinforce the theoretical background.

Of course, you need these graduates to be employable, so why not offer a
course titled "Internship" during the second year? Let them spend a few days
each week working at a real company, with real developers who can teach them
real skills. I graduated from a modest CS programme, and thanks to my
numerous internships I was offered interviews left, right, and centre. I
learned more about programming in an eight-week internship than I did in an
entire year at university, despite most of my second year being about
learning to program. And because I had real-world experience in employable
languages the university didn't cover, like C#, I walked into a job while
others in my class with better grades struggled.

You get the idea. Please stop turning academia into a trade school and teach
these kids real CS. If you want them to be employable offer a mandatory course
where students have to work in a real business and are forced to pick up
programming from people that actually know what they are doing.

~~~
mitguy
I do like the idea of an internship!

Real-world experience cannot be overvalued. That said, I'm still
working with the parameters in my original post. Small. Liberal Arts. College.
This isn't a huge program, I've consistently had between 12 and 20 students in
the major, of all grade levels and wide differences in skill levels,
experience, exposure, and mathematical prowess.

I know it may be hard for others to understand what my student landscape looks
like, but after more than a decade, I feel I understand them. But a year of
strictly theoretical CS, and I'm looking at 2 to 7 students in the major, and
filling out my teaching schedule with writing classes.

Look at a newly built house and you'll see all levels of craftsmanship at
work. If every contractor is a framer, because framing is the fundamental
skill of building, your interiors look pretty basic, and let's not even
mention plumbing made out of lumber!

What I'm trying to say is that I've tried my own version of CS snobbery and
high-mindedness, and it isn't working. I'm looking to design a computing
major that doesn't produce only framers.

I appreciate your input. Please reread this post and my original post, try to
put yourself in my position, and then send me some more suggestions!

~~~
EnderMB
Like many students, I studied at a modest university with what I was told was
a good CS programme.

It turns out that the reason it's "good" is that it teaches students how to
become the next batch of Java and PHP developers for local businesses. The
actual CS content was minimal, and bizarrely, a lot of classes from the "CS
school" were off-limits to my CS degree, because learning XML was more
important than Discrete Math.

I spent half a year studying at a top 5 university and the difference was
overwhelming. The facilities weren't much better, but the lecturers were far
more helpful and the content was not watered down. Perhaps it was because it
was a Masters programme, but the lecturers were happy to help, even when you
were hopelessly stuck. Sadly, for financial reasons, I couldn't stay, but
I'll definitely return for my Masters when I finish at my current job.

It still amazes me that a lot of students don't opt to take internships during
the summer. I worked for eight weeks after each academic year and everything I
picked up was invaluable. The best part about it was learning what you were
good at and what you sucked at. It taught me languages I'd never learn at
university and it taught me exactly what is expected of a developer in the
real world. Most importantly, I was employable right off the bat. My degree
almost didn't matter, because the skills I gained during those eight-week
stints were what got me my job.

I appreciate that you are constrained in what you can do, but I still feel
that a practical, trade-oriented approach to computing sits awkwardly in an
academic setting. If possible, could you highlight the exact problems you
faced when trying the typical theoretical route?

------
codegeek
"employable computing skills "

I think this is a very broad phrase, though I'm not trying to undermine your
question. The problem with it is that it varies by job. Some people have
mentioned HTML/JS etc., which are surely good, but not all tech-related jobs
need that.

Some things that I strongly feel should be part of every CS curriculum,
especially in the context of your question, i.e. employability, are the
following:

\- Understanding real-world projects and their lifecycle. This should include
practical experience delivering software and managing the entire process.
Some universities already have a "software engineering" class for this.

\- Related to the first point, teach them the value of teamwork. No
real-world project can be done alone, and the bigger the employer, the more
likely you will deal with cross-matrixed teams (yeah, teach them what
cross-matrixed means :)

\- Teach them about distributed computing and how software is deployed/managed
in a large scale environment.

\- Teach them that no one cares if you build cool stuff. People care about
what problem it solves. It could mean anything from keeping your
company/manager happy to solving real world client problems.

\- Lastly, tell them that whatever they learn now, they will learn a heck of
a lot more when they are actually out in the real world, and that will make
them realize they did not know anything. This does not, however, mean that
the learning is useless; it is just the tip of the iceberg.

~~~
mitguy
Thanks! Good points, CG.

I know the "Give a man a fish" adage, but trying to teach my students to be
Global Anglers hasn't worked well for a decade (given our students, numbers,
faculty, and facilities). I'm not shooting for trade school either, just
something that's very mindful of your iceberg analogy.

Tip: come, learn some skills, peek at the landscape.

Iceberg: go get a job (or jobs) and learn the rest of what you need to build
the solutions that keep your company/manager happy.

------
jfaucett
I think that unless you're a computer scientist or engineer at Google,
Facebook, etc., most of the theory you run into at the university level
you're never gonna use. Having said that, some "theory" has been important
for me as a developer, but really it boils down to two subjects: discrete
math and algorithms. I would try to emphasize those two areas in the program.
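To make that concrete, here's the kind of toy exercise I mean, where a
discrete-math idea (a loop invariant) justifies an algorithmic claim
(O(log n) search). The code is just an illustrative sketch, not part of any
actual curriculum:

```python
# Binary search: the invariant "if target is present, it lies in a[lo:hi]"
# is the discrete-math half; halving the range each step is the algorithms
# half, giving O(log n) comparisons.

def binary_search(a, target):
    """Return an index i with a[i] == target, or -1 if absent.
    Precondition: a is sorted in ascending order."""
    lo, hi = 0, len(a)
    while lo < hi:
        mid = (lo + hi) // 2
        if a[mid] == target:
            return mid
        elif a[mid] < target:
            lo = mid + 1   # everything at or before mid is < target
        else:
            hi = mid       # everything at or after mid is > target
    return -1

if __name__ == "__main__":
    data = [2, 3, 5, 7, 11, 13]
    print(binary_search(data, 7))   # -> 3
    print(binary_search(data, 4))   # -> -1
```

Asking students to state the invariant and argue termination is exactly the
discrete-math-meets-algorithms overlap I mean.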

As far as employability goes, I think a strong understanding of networking
(TCP/HTTP) and data/databases is really the core (i.e. the data and
application layers) of most development these days.

Also, I know it's kind of experimental, but if possible I would actually
teach Go as a programming language instead of the usual C/C++/Java (for the
systems-level / networking stuff). Go in tandem with JavaScript (both server
and browser) for web applications lets you deal with just about every
programming concept there is at a more abstract level, and both languages
are not overly difficult to grasp.

So, to put these thoughts together: maybe you could center the program
around networked software development? This lets you teach things like
WebKit, browsers, JS, networking, servers, and databases while staying in
one (though very broad) medium. I think this could be possible for one prof.
For instance, in "interface design" you're in the browser and delve into
WebKit and user interaction; for "computer graphics" you can go into WebGL.
For "databases" you're still dealing with server/client architectures, and
you can examine DB structures across different types of web applications.
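On the databases side, even a single file with no infrastructure is enough
for a lab one prof can run. A rough sketch, using Python's bundled sqlite3
module only because it's zero-setup (the schema and names are invented for
illustration, not taken from any real course):

```python
# A minimal "data layer" lab: an in-memory SQLite database, two related
# tables, and a join-with-aggregation query -- the bread and butter of the
# server side of any client/server web app.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute(
    "CREATE TABLE posts (id INTEGER PRIMARY KEY, user_id INTEGER, title TEXT)"
)
conn.executemany("INSERT INTO users (name) VALUES (?)", [("ada",), ("alan",)])
conn.executemany(
    "INSERT INTO posts (user_id, title) VALUES (?, ?)",
    [(1, "Hello"), (1, "Again"), (2, "First post")],
)

# Count each user's posts: students see a LEFT JOIN and GROUP BY in action.
rows = conn.execute(
    """
    SELECT u.name, COUNT(p.id)
    FROM users u LEFT JOIN posts p ON p.user_id = u.id
    GROUP BY u.id ORDER BY u.name
    """
).fetchall()
print(rows)  # -> [('ada', 2), ('alan', 1)]
```

The same exercise ports directly to whatever language the programme settles
on; the point is that the whole client/server data story fits in one small
assignment.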

I'd love to have had the option to do a pure web-engineering CS program.
Maybe some of my ideas will help. Good luck!

~~~
rhsanborn
You may never use most of the things you learned, but they almost certainly
had an impact on how you think and develop. I'm surprised how often
seemingly silly theory-of-computation ideas come up; maybe they don't force
me to change my solution, but they get me to think about it at least a
little differently.

~~~
jfaucett
Yes, maybe I should have mentioned proofs/theory of computation separately;
I kind of just included them under the general "algorithms" umbrella term :)

------
rhsanborn
I agree with several of the posts below, but I have to take it a step
further. I'm not sure you can or should call this a major in the traditional
sense. It cheapens the idea of a degree if we teach people a trade-school
curriculum and give them degrees. I've encountered a lot of these graduates,
and it's the reason a lot of people from private institutions (Baker,
Davenport, Phoenix, etc.) get passed over for jobs.

If people really want a degree in MIS or CS, then the curriculum should
reflect that. If they can't handle the material, then I think the college
should look at the possibility of offering certificate programs in specific
areas.

A student who understands theory can generally go out into the world and
adapt. A student who understands how to make a web page using specific
languages and specific steps, and can't go much beyond that, hasn't learned
the things necessary to be given an associate's or bachelor's label.

Question: are the students capable but lacking the prerequisite knowledge?
Maybe the college needs to consider an entrance exam and remedial classes on
basic computer use, theory, and logic. If they aren't actually capable, then
I stand by what I said above: the college should consider certificate
programs like "Web Programming with .NET and JavaScript".

------
iamshep
I run a 13-person IT shop at a Big Ten university. If I were a single-person
shop, what would be different? I wouldn't write code. I would be all about
integrating off-the-shelf software into my systems. I would very likely be
stitching university databases and third-party apps together. I would be a
call center and a support shop. I would be doing a lot of reading logs,
Googling errors, and general troubleshooting. I would be more than
functionally literate on Macs, PCs, Unix, and mainframe machines. I'd know a
reasonable amount about networking, and even more about SQL.

If you were going to roll this into a minor, I'd call it something like
Applied Information Technology Management (or some mix of such buzzwords).
(Note: James K. badgered me into posting.)

------
tjr
Some sort of class on embedded programming. Lots of software development jobs
are for embedded programming, which is in many ways a world apart from web
applications. Things like avionics software, automotive software, medical
device software...

Recently the JPL C coding standards were posted around the web, as people
showed a bit of interest in the software running the recent Mars rover.
Discussing that sort of thing could make for interesting course material.

I don't know of any curriculum offhand, but maybe make it some sort of
project-based course where students design and build real-time applications
on some sort of real-time Linux system?

------
brycecolquitt
If these are the type of students I'm thinking of, they really don't want to
become computer scientists, or software engineers, even. They want to learn
how to code so they can build stuff, namely web apps. I'm assuming that since
they don't want/can't handle a theoretical program, yet still _think_ they
want to learn Comp Sci, they fit this description.

So I'd teach towards that:

1) An understanding of the technology stack for a web app

2) JavaScript

3) Ruby or Python, with Rails or Django

~~~
mitguy
Yes, I think web apps are a significant part of it. Looking at projections
of software sales, web apps and web-app support are on the horizon for
almost everyone.

------
probinso
It sounds like you want to separate software engineering and computer
science. Computer science is different course material.

~~~
mitguy
Agreed!

But I'm looking for a hybrid. I don't want to abandon my theoretical roots,
but I'm missing more than I'm hitting, with my students as of late.

Plus, you can have Computer Science without Software Engineering, but the
reverse isn't true.

------
ghenne
Good idea! Some things to teach...

\- HTML/JavaScript programming

\- Practical SQL/SQLite usage

\- Project management

\- Visual Studio .NET

~~~
mitguy
I think .NET seems a bit heavy for what I've got in mind. I also probably
don't understand .NET as well as I should.

Can I get some more clarification why .NET should (or shouldn't) be included
in a modern, practical curriculum?

------
tjr
Teach this class: <http://philip.greenspun.com/seia/>

~~~
mitguy
Yes, this is definitely a course type on the radar!

It echoes some other responses, and also some of my thoughts.

MIT Press makes it that much cooler!

------
ghenne
Some things not to teach:

Don't do: numerical analysis, compiler design, OS programming, real-time
coding

~~~
mitguy
As much fun as Org/Arch is, I'm thinking we can leave most of that and ASM
behind, as well.

Not totally behind, but committing a semester to them as a course is what
hasn't been working.

