How to make CS courses better (sophiechou.com)
42 points by thetabyte 1582 days ago | 55 comments



The article completely lost me at:

> "These should be the heart of the course, and is what matters in “the real world”, anyway. No one’s going to give you a bonus for remembering the difference between inheritance and polymorphism: let’s face it, you’d take the 5 seconds to Google the definitions and move on."

Not that OOP is the be-all, end-all, but within that realm, if you have to Google that, you never understood the concepts in the first place.

It sounds to me as if the problem is more with the writer's alma mater, and not "computer science curriculum" as a whole.

In fact, I feel a giveaway is:

> "IF YOU MUST MAKE STUDENTS TAKE EXAMS, STOP ASKING LANGUAGE-SPECIFIC SYNTATICAL QUESTIONS"

I recall very little of this. There were probably a few courses with some of it, but only where it was relevant, e.g. testing that you actually grasped pointers in a low-level-focused course that covered C.

As an aside, this is kind of what saddens me about the current, Twitter/blog-centric world. I'm sure the writer has some valid frustrations, but rather than directing those into a general "How to improve CS courses" article that misses the mark because CS programs vary so much, why not have a discussion with the department heads? There has been too much of a shift toward being publicly vocal with complaints, rather than trying to sort them out privately with the actual stakeholders.


>It sounds to me as if the problem is more with the writer's alma mater, and not "computer science curriculum" as a whole.

From OP's description, I would agree. I attend a run-of-the-mill state college, but thus far have been pretty pleased with the CS courses -- even though they center around Java. I can't remember a single assignment that was graded on writing definitions of polymorphism and inheritance (although we did have in-depth class discussions about inheritance vs. composition), nor were we ever quizzed on language-specific syntax. Every assignment we had was basically "build [x] using the tools covered during the week." The homework for the inheritance week was building a simple console game. Grades were based on the code that you cranked out.

I do wonder why they're based around Java, though. I don't really understand where it fits in the world right now. I know a lot of companies use it, and it's still one of the highest-paying languages out there, but... is that just for legacy reasons? Just because it's already deployed in so many places?

Additionally, I know it grew up with the web, but now, with the speed of modern computers and high-level languages like Python/Ruby/etc., what would the advantage of choosing Java be? If you need something faster, why not just hop to C++?

I've gotten a little off track now... but it's just something I've been wondering. And for the record, I'm not deriding Java -- I rather enjoy writing it -- I'm just curious where it fits into the modern world.


> > "IF YOU MUST MAKE STUDENTS TAKE EXAMS, STOP ASKING LANGUAGE-SPECIFIC SYNTATICAL QUESTIONS"

> I recall very little of this.

I recall quite a lot: on every exam where I had to produce a program, minor syntax errors like forgetting a semicolon at the end of a line would cost me some points. I think this is an artefact from the old days of batch processing, when syntax errors were actually costly.


I agree. I'm a third-year CS student, and 90-95% of what I know I learned from in-class projects or projects on my own. I didn't learn how to write good code by studying for exams.

The best class I ever took was a graduate-level course called Internet and Web Systems. It was entirely project-based and, more importantly, the projects were entirely spec-based.

For example, the first assignment was to write a webserver in Java that met the HTTP 1.1 spec. The prof didn't care what algorithms or data structures we used; all that mattered was that the server performed as desired when subjected to tests. These were not unit tests that exercised specific method interfaces, but real-world tests like 100,000 concurrent connections, or checking for proper response codes in uncommon situations.
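
To give a flavor of what "meeting the spec" starts from, here's a deliberately minimal sketch (my own, hypothetical -- the actual assignment demanded far more of HTTP 1.1, and a single-threaded accept loop like this would fail the 100,000-connection test without a thread pool):

    import java.io.*;
    import java.net.*;

    public class TinyServer {
        public static void main(String[] args) throws IOException {
            ServerSocket server = new ServerSocket(8080);
            while (true) {
                try (Socket client = server.accept()) {
                    BufferedReader in = new BufferedReader(
                            new InputStreamReader(client.getInputStream()));
                    String requestLine = in.readLine(); // e.g. "GET / HTTP/1.1"
                    String status = (requestLine == null) ? "400 Bad Request" : "200 OK";
                    String body = "hello\n";
                    String response = "HTTP/1.1 " + status + "\r\n"
                            + "Content-Length: " + body.length() + "\r\n"
                            + "Connection: close\r\n\r\n" + body;
                    client.getOutputStream().write(response.getBytes("US-ASCII"));
                }
            }
        }
    }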

The last project in that class was to build a search engine -- an entire search engine: a crawler, an indexer, PageRank, a frontend, etc., all distributed over a DHT on AWS. I learned so much about disk/memory management and concurrency while building that project. In the end it was graded on two things: the robustness of the architecture we invented, and the professor's experience when using the search engine. These criteria gave us a lot of freedom to learn and explore new programming techniques.
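
For readers unfamiliar with it, the PageRank piece alone is a nice exercise. Here's a rough, in-memory sketch of one iteration (my own simplification; the course version was distributed over the DHT, which is where the hard parts live):

    class PageRank {
        // One power-iteration step. links[i] lists the pages page i links out to.
        static double[] step(double[] rank, int[][] links) {
            final double d = 0.85;                     // damping factor
            int n = rank.length;
            double[] next = new double[n];
            java.util.Arrays.fill(next, (1 - d) / n); // the "random jump" term
            for (int i = 0; i < n; i++) {
                if (links[i].length == 0) continue;    // dangling pages ignored here
                double share = d * rank[i] / links[i].length;
                for (int j : links[i]) next[j] += share;
            }
            return next; // iterate until the ranks stop changing
        }
    }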

TL;DR - I agree, projects are the way to go.


That's a massive project, but I agree most learning in CS happens with actual implementation. There's a massive difference in knowledge between being able to say "I've studied that" and "I've built that".

It appears the professor, Dr. Andreas Haeberlen, is deeply connected to some of the tech titans and has structured the course around influence from his own network [1]. Bravo. We need more similar course offerings across schools.

Internet and Web Systems is quite interesting in combining aspects of data structures, web apps, distributed systems, information retrieval, and cloud computing in a single course. Its undergrad daughter course Scalable and Cloud Computing also seems valuable. Some excellent paper and tutorial links are included on their course webpages [2][3].

1: http://www.thedp.com/article/2012/11/company-sponsored-compu...

2: http://www.cis.upenn.edu/~cis455/

3: http://www.cis.upenn.edu/~mkse212/


[meta]: 3 "tl;dr" rules:

1) Try not to need it at all.

2) Write it first, not last.

3) Omit "tl;dr". The summary is enough by itself.


When I taught an intro-to-programming course, I finally realized the value of exams: they're direct feedback on what students understand.

Yes, projects are more important, but it's possible for students to squeak by on projects while still having fundamental misconceptions. For example, I discovered that many of my students could not read code, and imagine how it would flow at runtime. These same students could write code with loops, but if I gave them code with a loop in it, they wouldn't "see" the runtime behavior. This was fundamental, and I spent time in class teaching them how to read code.
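
A concrete example of the kind of thing I mean (a made-up snippet, here in Java): most of my students could write this, but many couldn't answer "what does this print?" without running it.

    int total = 0;
    for (int i = 1; i <= 4; i++) {
        total += i * i;   // after each pass, total is 1, 5, 14, 30
    }
    System.out.println(total);  // prints 30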

At the same time, everyone got some of the more abstract things (like choosing appropriate data types to model real-world concepts) that I had figured were harder. I stopped stressing those in lecture because no one seemed to have a problem with them.

If it were feasible to give students a sit-down assessment that was not graded and could give me this kind of feedback, I would probably be in favor of it. But, if you don't grade something, students aren't going to put their full effort into it. Hence, exams.


I'm not sure that your last statement is true:

> But, if you don't grade something, students aren't going to put their full effort into it. Hence, exams.

I'm now in college, but when I went to high school my physics teacher would give us weekly quizzes that covered the material we were learning. These quizzes weren't graded in the recorded sense -- they were still marked and handed back by our teacher, but the grades were never entered -- and were instead used to assess what we had learned well and what we needed more work on. Everyone seemed to give these quizzes a decent effort, for pride reasons.

Now, part of the success of this approach may have been the quality of the students I went to school with (I attended a selective private school), but I would think you'd have similarly disciplined and focused students in the CS departments of most universities, so I don't see why this approach wouldn't work there as well.


I was teaching all non-CS majors, and it was a summer course. Most of my students either had jobs or were taking other classes. I have no data to back it up, but my intuition is that if the exam had not been graded, they would have, rationally, not put their full effort into it.


"I discovered that many of my students could not read code, and imagine how it would flow at runtime."

In what language, and what sort of code? I cannot really blame a novice C++ or Java programmer for not being able to see what some snippet of code will do at runtime. Some languages are just less amenable to being run by a human brain than others.

"These same students could write code with loops, but if I gave them code with a loop in it, they wouldn't "see" the runtime behavior"

That is because loops are not very informative about program behavior, except in the simplest cases.


Simple loops in C++, such as computing an average. "Blame" isn't the point. I was very surprised that students could not read the same kind of code they could write, and I spent time teaching them how to do so because of this realization. Had I relied solely on projects, I would not have learned this and adapted my teaching.


I agree with everything in the article. I'd add one more thing: encourage students to use an IDE. Whenever you write code, you should be able to compile, run, and set breakpoints with a single click or keystroke.

The first thing I do when I help new CS students is show them how to use an IDE. One guy I taught Xcode to went from an F one semester to an A the next.

Side note: don't force new students to use vim. It just adds one more layer of complexity to an already complicated subject. If they want to become keyboard ninjas, let them do that on their own time.


This is a great idea and I wish I could upvote it more.

My initial instinct was something like "Naw man, everyone should use vim or emacs and a CLI debugger," but I recall being almost magically more productive upon learning to use the debugger in MSVC and having a really difficult time learning to use vim and gdb.


I actually stand on the opposite side of this debate. Not using an IDE forces you to remember and think about your language and development environment. You learn the methods and APIs you call much better when you have to research them than when you scroll down an auto-completed list of method names -- "Oh, this name looks close enough, let's try it!" versus going and looking them up in the Javadocs or Python docs or [your language of choice] docs. When the docs are a good read (like the Javadocs and Python docs, and maybe Boost C++'s too), you get the description, the examples, and sometimes even recommendations on how to use the code.

That said, switching to a whole other environment just to compile can be a bit of a chore. Some text editors come with hotkeys for compiling and running. When I was in high school, we used TextPad: Ctrl+1 to compile Java, and Ctrl+2 or 3 to run programs (depending on whether they were applets or applications).

In college, I used nano/pico and eventually emacs. I'm still slower in some modern IDEs when trying to manipulate text as fast as I did in emacs -- though autocomplete and jump-to-definition makes my coding faster, in general.

At the end of the day, though, we're two people with anecdotal evidence that strongly supports our side of the argument, in our minds.

Addendum, regarding debugging: I find that for the kinds of problems new programmers run into, println debugging is generally sufficient, so that's what I'd recommend.
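
For example (a toy Java sketch): one println inside the loop shows the state at every step, which is usually all a beginner-sized bug needs.

    int[] xs = {3, 1, 4, 1, 5};
    int max = xs[0];
    for (int i = 1; i < xs.length; i++) {
        // println debugging: dump the loop state each iteration
        System.out.println("i=" + i + " xs[i]=" + xs[i] + " max=" + max);
        if (xs[i] > max) max = xs[i];
    }
    System.out.println("max=" + max);  // prints max=5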


Good counterpoints.

Perhaps it just depends on what you're trying to teach. If you're trying to teach students language-specific APIs, then you wouldn't want them to have auto-complete. If you're trying to teach students how to program in a language-agnostic way, then the APIs themselves don't matter as much as the theory behind them does.

I agree with you on reading and understanding the docs. The problem with memorizing APIs, however, is that they change -- even the C++ standard library was substantially overhauled for C++11. What is the value of memorizing something that will be deprecated in two years (or less)? Understanding it has much greater long-term value.

At my college, I've talked to a lot of students who are reluctant to try new languages because they've become reliant on the syntax and APIs of one particular language. If I challenge them to read some code in language X, they just respond with, "But I don't know language X" -- as if that were a valid excuse.

Rather than being taught HOW something is done in a particular language, they should be taught WHY.


That would be teaching the students a tool, not a workflow. Well, using an IDE also has a workflow... whatever.


Many of my nerdy friends are trying to learn the basics of programming, as a vocational skill. They're not interested in computational science, which is often what's taught.

In college classes, these friends often encounter the situation where a class is taught by one professor each term, and the intro classes tend to receive the less desirable professors -- the ones whose assignments don't line up with the lessons, who teach programming history before techniques, etc. This leaves them either suffering through a difficult subject with little help, or waiting years until one of the "better" professors teaches an intro class.

With Codecademy, it's easy to get discouraged because of the lack of people around -- you can go onto the forums, but there are no humans in proximity to commiserate or discuss hard problems. Further, some of the problems are broken -- while you and I can just "look in the back of the book" aka read the forum, others see this as "cheating" in the same way that reading a GameFAQ is "cheating" at a game.

What seems to be needed is something similar to Codecademy, but with lab/office hours -- where people can work at their own pace but in the same space as others at their same level, and people with more advanced skills, who can give advice and mentor. I've envisioned this as a hackerspace for some time, where the mentors are also working on projects, and may even receive advice from other members. Like a mix of Montessori schools and Valve.


Stephen--

It would be great to hear your thoughts on CodeHS.

> but with lab/office hours -- where people can work at their own pace but in the same space as others at their same level, and people with more advanced skills, who can give advice and mentor

This is literally a description of what we are trying to build at CodeHS.

> With Codecademy, it's easy to get discouraged because of the lack of people around -- you can go onto the forums, but there are no humans in proximity to commiserate or discuss hard problems.

Yup. And every beginner gets stuck. If you like our approach, contact us at team@codehs.com!


It sounds like a great program for high schoolers. What about us older people?


Older people can use it as well -- we've had students as young as 5, 9, and 10, and as old as 80. Right now we are focusing on HS, but any beginner can try it.


CodeHS is well on its way to that, providing help from live tutors and feedback on coding projects -- http://news.ycombinator.com/item?id=4835649.


I don't know why you chose Codecademy as an example of an online course but left out edX, Udacity, and Coursera, which all have decent discussion forums and hackers of all abilities.


With Codecademy, you can just begin lessons. My experience with edX, Udacity, and Coursera is that the class is structured, so you can't work at your own pace.


Harvard's CS50x has a very loose structure: it's open until April 23, and one can complete the exercises, problem sets, and lectures on one's own time.

MIT's 6.00x has a strict weekly structure.

Udacity's courses have all eliminated the schedule and are now always-open, self-paced.

Most of Coursera's courses are structured.

If you want your friends to actually become programmers, I'd recommend that you push both an intro course and Codecademy for language practice. Codecademy by itself isn't going to get you anywhere; the tutorials don't provide a solid enough foundation to actually implement anything meaningful. As a beginner programmer I went through every Python tutorial in under 30 hours. I used them as Python language practice, which worked well as a supplement to the core intro courses. Both Udacity and 6.00x use Python, so they pair well with Codecademy's Python tutorials.


> Udacity's courses have all eliminated the schedule and are now always-open, self-paced.

I didn't know that. Thanks. I'll mention it to them.

> If you want your friends to actually become programmers, I'd recommend that you push both an intro course and Codecademy for language practice.

That's great advice. I can't teach my friends because I'm not really a programmer -- I was a business major turned cloud engineer who's still teaching himself.

> Codecademy by itself isn't going to get you anywhere.

I gained a great amount from finishing about a quarter of Code Year. I feel like I grasp the fundamentals of programming and could soon become an intern at a startup. I wish my friends could learn as easily as I do...


I bet you'll learn a lot trying to teach your friends - it's a great way to solidify concepts in your own mind.


I've learned that I'm bad at teaching my friends.


Actually, you can view Coursera's courses whenever you want and work at your own pace.


Unfortunately, that is not true for every Coursera course. Most of the ones I've seen run on a schedule, with weekly assignments, and classes that aren't open yet are closed for access, even if they have already run.


I do agree about Java: it strikes me as an especially bad language for teaching, because most of the learning has to focus on Java's view of how object orientation should work and on various JVM-specific things.

It has a pretty steep learning curve before you can start writing interesting programs, compared to Python or C. C has its own difficulties, of course, but a lot of them are things that are actually difficult -- such as how to deal with memory management -- rather than fluffy problems about interfaces vs. abstract classes.

OTOH, there are a lot of Java jobs out there, which is perhaps mainly due to Java getting into universities/colleges in the '90s, when it had the Kool-Aid factor.

I'm not so sure about making all grading project-based, though. The problem is that while plagiarism is usually punished severely, there is the gray area of "helping your neighbour," which gives a bonus to persuasive types who are good at getting other people to "help" them an awful lot.

Of course the argument could be made that this is good preparation for industry.


I agree that weighting projects heavily would benefit those who get unfair levels of help. That's probably the main reason it isn't done already.

Maybe that could be helped by the author's suggestions on how to make the written tests better. Or by having timed programming exercises as part of the test.


> C is a much, much better “lower” level language for really grasping the way programming works, and Python is a much, much more fun language if you want to lower the barriers to entry and get students making things right away.

I've never seen it put quite like this, but it seems correct and important. The languages with lower barriers to entry also tend to be easier to explore computer science in. There are some languages, like C or assembly, that make it easy to understand how the machine works, and some languages, like Python, JavaScript, Lua, or Scheme (in descending order of how much they get in your way), that make it easy to try new concepts -- and a student could very well leave university knowing no languages from either category.

I was lucky enough to circumvent most of these issues by using my mandatory CS courses as an annoying supplement to my programming contest preparation and game programming projects.


Just because a prof at your school runs a programming class poorly does not mean that all CS curricula need to be changed. In every programming class I have taken, projects have taken precedence over exams, and I feel like general concepts (recursion, abstraction, design) have been stressed over nitty-gritty syntactical details.


Agreed. Schools don't really know how to teach programming. The focus should be on doing projects, not sitting in lectures. People should just get together in groups and work on coding, with someone to ask questions to if they're stuck. I doubt schools will change, but there are alternatives, like coding bootcamps.


Your institution is doing something wrong if you're being taught "programming" while majoring in CS.


"Computer Science is no more about computers than astronomy is about telescopes."


I wish they'd call it Computational Science, since that's what it is. Computers are only involved for convenience, just like with geometry and lithography.


I do think that both approaches are legitimate, but conflating engineering with mathematics just causes unnecessary confusion, and feeds pointless Internet flamewars over semantics.


I would suspect that most people majoring in computer science are interested in developing software, and only take courses on subjects like calculus and the theory of computation because they need to.


Eh...excuse me?


See jfb's quote.

Also, I went to Rutgers as well.


If you aren't learning programming, you might as well study art history.


I have a project due for my second-year CS class. It's in C++, and it had a lot of potential to be a fun project: we have to implement a game of Hearts on the console. But they've stripped out all the fun parts -- no shooting the moon, no breaking hearts -- and I don't know why, since it's not as if they would be hard to implement. We also had a chance to write a fun AI to play the game, but they over-specified how the AI is to play. We have no creative freedom at all, and it's rather demotivating.

Another point: this class is sort of a "learn C++" course, and I think it suffers from a lack of motivation; everything feels somewhat contrived. I wish it had some overarching goal that we were working towards -- it would give more context to what we are learning.


Do you have prior programming experience? If you do, and you're in a lower-level class, you may be frustrated by a class that is designed to teach students things you already know.


Over the course of my college career, I pleaded with three different computer science and engineering chairs to please, please, please update the curriculum to focus on project-based learning and more recent developments in software engineering. My requests fell on deaf ears. Every chair wrapped himself in the "we're preparing you for the workforce, not training you for a job" rhetoric. The reality is that the field is rapidly becoming portfolio-based: employers want to see what you've made, and there's absolutely no reason that theory can't be layered on top of practice.

I'm interested in the trajectory of App Academy, Catalyst Course, and others. I'd wager most graduates of those ~6 week programs end up better programmers than graduates of many 4 year institutions.


> No one’s going to give you a bonus for remembering the difference between inheritance and polymorphism: let’s face it, you’d take the 5 seconds to Google the definitions and move on.

The point of those questions isn't to test whether you know the definitions of polymorphism and inheritance; it's to test whether you understand the concepts. Anyone can regurgitate the textbook definitions, but can they explain why polymorphism should sometimes be avoided? Not if they haven't invested some time in understanding both the theory and the applications of OO principles.
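
To illustrate the distinction such a question is actually probing (my own minimal Java sketch, not from the article):

    // Inheritance: Square and Circle reuse Shape's describe() code.
    // Polymorphism: the area() call inside describe() is dispatched at
    // runtime to whichever concrete class the object really is.
    abstract class Shape {
        abstract double area();
        String describe() { return "area = " + area(); }
    }
    class Square extends Shape {
        final double side;
        Square(double side) { this.side = side; }
        double area() { return side * side; }
    }
    class Circle extends Shape {
        final double r;
        Circle(double r) { this.r = r; }
        double area() { return Math.PI * r * r; }
    }
    // Shape s = new Square(2);
    // System.out.println(s.describe());  // "area = 4.0" via dynamic dispatch

A student who can explain why describe() works for both subclasses -- and when such a hierarchy is overkill -- understands more than one who can recite the two definitions.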


I do not fully agree with his Java point, because C is too hard for beginners: you just confuse students with hard details.

But Python is a candidate. The argument for Java is the combination of the IDE and the compiler: if you make a typing mistake, the IDE screams at you, which is helpful for beginners.

After all, is Java really that bad? Of course there are people misusing Java with bad design practices, but that is just one more reason to teach object-oriented programming properly.


I had some background in programming before taking the intro CS course at my college, but not much. That intro class is done in C, and most students who take it seriously, myself included, think this is a good choice. I suspect C seems more difficult than it is to those who are used to high-level languages, because they are accustomed to concise programs, while C requires many more lines of code. Someone new to coding isn't used to the concision of other languages, so this doesn't bother them.

And aside from pointers, there really isn't anything in C that is harder than in any other programming language. I personally found pointers easy to learn, which I think was the result of having them explained clearly and then being forced to use them in my code.


OP is a girl :)


I agree that Java is not a good language to start off with, as you do not get a good understanding of what is really going on at the machine level. But to say to stop teaching it outright is ridiculous: whether it be the growing Android market or enterprise web development, Java is still in wide use. Java is also a prime language for learning/using OOP design principles.


There is no reason to subject someone learning how to program to Java. The amount of conceptual scaffolding required to do even the most basic tasks is stupidly large. It has a place in a survey class, I suppose, where professors with COBOL experience are approaching superannuation, but for all that's holy, keep it away from young minds.
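
As a rough illustration (my own example): echoing one line of input back requires a class, a static method, a String[] parameter, and an import before the learner writes any logic of their own; the Python equivalent is a single line.

    import java.util.Scanner;

    public class Echo {
        public static void main(String[] args) {
            // the actual task is only these two lines
            Scanner in = new Scanner(System.in);
            System.out.println(in.nextLine());
        }
    }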


Java has two strong points: it's a great example of OOP, and it forces you to understand types. I find that if students don't learn these things early on, they never get around to learning them; then, when they get to C or Python, where types are more of a suggestion, they have a really hard time. Java is stricter and harder to learn, I agree, but the fact that it makes you understand OOP and types just to use it properly makes you a far better programmer, in my opinion. I think Java should definitely be used as a starter language.


> it forces you to understand types.

If understanding types is so important, then why not go all the way and use either ML or Haskell, or at least Scala?

Most freshmen who stumble through a Java-school curriculum will not understand types at all, and based on confusion among a number of colleagues just yesterday (all Java programmers) about what a "sum type" is, it's clear to me that Java doesn't encourage developing a rigorous understanding of types, even for professional programmers.
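
For the curious, here's roughly what a sum type has to look like when hand-encoded in Java (my own sketch): a value that is either an Ok or an Err and nothing else. The encoding is verbose, and the compiler can't check that a case analysis is exhaustive -- which may be part of why the concept doesn't stick.

    // Result = Ok(int) | Err(String), as a closed class hierarchy
    abstract class Result {}
    class Ok extends Result { final int value; Ok(int v) { value = v; } }
    class Err extends Result { final String msg; Err(String m) { msg = m; } }

    class Demo {
        static String show(Result r) {
            // in ML or Haskell this is a one-line pattern match
            if (r instanceof Ok) return "ok: " + ((Ok) r).value;
            if (r instanceof Err) return "error: " + ((Err) r).msg;
            return "unreachable";  // the compiler can't rule this case out
        }
    }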


Also, it's not a very interesting demonstration of OOP. Smalltalk or Ruby would be a more useful tool for teaching OOP -- iff teaching OOP is actually a goal.


> you do not get a good understanding of what is really going on at the machine level

I'd argue understanding "what is really going on at the machine level" shouldn't be the primary goal of your first programming language.

Java has a huge barrier to entry before one starts programming. Huge barriers to entry don't belong in a 141 course.



