> "These should be the heart of the course, and is what matters in “the real world”, anyway. No one’s going to give you a bonus for remembering the difference between inheritance and polymorphism: let’s face it, you’d take the 5 seconds to Google the definitions and move on."
Not that OOP is the be-all, end-all, but within that realm, if you have to Google that, you never understood the concepts in the first place.
It sounds to me as if the problem is more with the writer's alma mater, and not "computer science curriculum" as a whole.
In fact, I think the giveaway is:
> "IF YOU MUST MAKE STUDENTS TAKE EXAMS, STOP ASKING LANGUAGE-SPECIFIC SYNTACTICAL QUESTIONS"
I recall very little of this. There were probably a few courses with some of it, but probably where it was relevant, i.e. testing that you actually grasped pointers in a low level focused course that covered C, etc.
An aside: this is kind of what saddens me about the current, Twitter/blog-centric world. I'm sure the writer has some valid frustrations, but rather than channeling them into a general "How to improve CS courses" article that misses the mark because CS programs vary so much, why not have some discussion with the department heads? There has been too much of a shift toward voicing complaints publicly to the world at large, rather than trying to sort them out privately with the actual stakeholders.
From OPs description, I would agree. I attend a run of the mill state college, but thus far have been pretty pleased with the CS courses -- even though they center around Java. I can't remember a single assignment that was graded based on writing definitions of polymorphism and inheritance (although we did have in-depth class discussions about inheritance vs composition) nor were we ever quizzed on language specific syntax. Every assignment we had was basically "build [x] using the tools covered during the week." The homework for the inheritance week was building a simple console game. Results were based on the code that you cranked out.
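(To make the "inheritance vs composition" discussion concrete, here's a toy sketch of my own, not from the course; all the names are made up. The same game entity is modeled both ways: once with behavior baked into a class hierarchy, once with behavior held in a component.)

```java
// Inheritance: behavior is baked into the class hierarchy.
class Entity {
    int hp = 10;
    void takeDamage(int amount) { hp -= amount; }
}

class Player extends Entity { }

// Composition: behavior lives in a component the object holds.
class Health {
    int hp;
    Health(int hp) { this.hp = hp; }
    void damage(int amount) { hp -= amount; }
}

class ComposedPlayer {
    final Health health = new Health(10);
}

public class Demo {
    public static void main(String[] args) {
        Player p = new Player();
        p.takeDamage(3);
        System.out.println(p.hp);          // prints 7

        ComposedPlayer cp = new ComposedPlayer();
        cp.health.damage(3);
        System.out.println(cp.health.hp);  // prints 7
    }
}
```

Both do the same thing here; the difference shows up later, when you want an entity that reuses Health but isn't an Entity subclass.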
I do wonder why they're based around Java, though. I don't really understand where it fits in the world right now. I know a lot of companies use it, and it's still one of the highest-paying languages out there, but... is that just for legacy reasons? Just because it's already deployed in so many places?
Additionally, I know it grew up on the web, but now, with the speed of modern computers and high-level languages like Python/Ruby/etc., what would the advantage of choosing Java be? If you need something faster, why not just hop to C++?
I've gotten a little off track now... but it's just something I've been wondering. And for the record, I'm not deriding Java -- I rather enjoy writing it -- I'm just curious where it fits into the modern world.
> I recall very little of this.
I recall quite a lot: on every exam where I had to produce a program, minor syntax errors like forgetting a semicolon at the end of a line would cost me some points. I think this is an artefact from the old days of batch processing, when syntax errors were actually costly.
The best class I ever took was a graduate level course called Internet and Web Systems. It was entirely project based and, more importantly, the projects were entirely spec-based.
For example, the first assignment was to write a webserver in Java that met the HTTP 1.1 spec. The prof didn't care what algorithms or data structures we used; all that mattered was that the server performed as desired when subjected to tests. These were not unit tests exercising specific method interfaces but real-world tests, like 100,000 concurrent connections or checking for proper response codes in uncommon situations.
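For flavor, here's roughly the shape that assignment starts with (my own toy sketch, not the course code, and nowhere near spec-compliant): read the request line, send back a bare-bones HTTP/1.1 response. Everything hard about the assignment (persistent connections, chunked encoding, concurrency) is what's left out.

```java
import java.io.*;
import java.net.*;
import java.nio.charset.StandardCharsets;

public class TinyHttpServer {

    // Build a minimal response; kept separate from the socket code so
    // the logic can be exercised without a network.
    static String respond(String requestLine) {
        if (requestLine == null || !requestLine.startsWith("GET ")) {
            return "HTTP/1.1 501 Not Implemented\r\nContent-Length: 0\r\n\r\n";
        }
        String body = "hello\n";
        return "HTTP/1.1 200 OK\r\nContent-Length: " + body.length()
                + "\r\nConnection: close\r\n\r\n" + body;
    }

    // The accept loop: one connection at a time. A real server would
    // need a thread pool for those 100,000 concurrent connections.
    static void serve(int port) throws IOException {
        try (ServerSocket server = new ServerSocket(port)) {
            while (true) {
                try (Socket client = server.accept()) {
                    BufferedReader in = new BufferedReader(new InputStreamReader(
                            client.getInputStream(), StandardCharsets.US_ASCII));
                    String requestLine = in.readLine(); // e.g. "GET / HTTP/1.1"
                    client.getOutputStream()
                          .write(respond(requestLine).getBytes(StandardCharsets.US_ASCII));
                }
            }
        }
    }

    public static void main(String[] args) {
        // Demo without opening a socket: show the response for one request line.
        System.out.print(respond("GET / HTTP/1.1"));
    }
}
```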
The last project in that class was to build a search engine. I mean an entire search engine: a crawler, an indexer, PageRank, a frontend, etc., all distributed over a DHT on AWS. I learned so much about disk/memory management and concurrency while building that project. In the end it was graded on two things: the robustness of the architecture we invented, and the professor's experience when using the search engine. These criteria gave us a lot of freedom to learn and explore new programming techniques.
TL;DR - I agree, projects are the way to go.
It appears the professor, Dr. Andreas Haeberlen, is deeply connected to some of the tech titans and has structured the course around influence from his own network. Bravo. We need more similar course offerings across schools.
Internet and Web Systems is quite interesting in combining aspects of data structures, web apps, distributed systems, information retrieval, and cloud computing in a single course. Its undergrad daughter course Scalable and Cloud Computing also seems valuable. Some excellent paper and tutorial links are included on their course webpages.
1) Try not to need it at all.
2) Write it first, not last.
3) Omit "tl;dr". The summary is enough by itself.
Yes, projects are more important, but it's possible for students to squeak by on projects while still having fundamental misconceptions. For example, I discovered that many of my students could not read code, and imagine how it would flow at runtime. These same students could write code with loops, but if I gave them code with a loop in it, they wouldn't "see" the runtime behavior. This was fundamental, and I spent time in class teaching them how to read code.
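The kind of reading exercise I mean looks something like this (an invented example, not one of my actual class problems): before running it, the student writes down what it prints, tracing the variables by hand at each iteration. Writing this loop is easy; seeing its runtime behavior is the skill being tested.

```java
public class TraceMe {
    static int mystery(int n) {
        int total = 0;
        for (int i = 1; i <= n; i++) {
            if (i % 2 == 0) {
                total += i;  // only even values of i contribute
            }
        }
        return total;
    }

    public static void main(String[] args) {
        // Hand trace for n = 5:
        // i=1: odd,  total=0
        // i=2: even, total=2
        // i=3: odd,  total=2
        // i=4: even, total=6
        // i=5: odd,  total=6
        System.out.println(mystery(5)); // prints 6
    }
}
```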
At the same time, everyone got some more abstract things (like what are the appropriate data types for modeling real-world things) that I figured were harder. I stopped stressing that in lecture because no one seemed to have a problem with it.
If it was feasible to give students a sit-down assessment that was not graded and could give me this kind of feedback, I would probably be in favor of it. But, if you don't grade something, students aren't going to put their full effort into it. Hence, exams.
> But, if you don't grade something, students aren't going to put their full effort into it. Hence, exams.
I'm now in college, but when I was in high school my physics teacher would give us weekly quizzes covering the material we were learning. These quizzes weren't graded but were used to assess what we had learned well and what we needed more work on. Everyone seemed to give them a decent effort out of pride; they were still marked and handed back by our teacher, but the grades were not recorded.
Now, part of the success of this approach may have been the quality of the students I went to school with (I attended a selective private school), but I would think you have similarly disciplined and focused students in the CS departments at most universities, so I don't see why this approach wouldn't work there as well.
In what language, and what sort of code? I cannot really blame a novice C++ or Java programmer for not being able to see what some snippet of code will do at runtime. Some languages are just less amenable to being run by a human brain than others.
"These same students could write code with loops, but if I gave them code with a loop in it, they wouldn't "see" the runtime behavior"
That is because loops are not very informative about program behavior, except in the simplest cases.
The first thing I do when I help new CS students is show them how to use an IDE. One guy I taught Xcode to went from an F one semester to an A the next.
Side note: don't force new students to use vim. It just adds one more layer of complexity to an already complicated subject. If they want to become keyboard ninjas, let them do that on their own time.
My initial instinct was something like "Naw man, everyone should use vim or emacs and a CLI debugger," but I recall being almost magically more productive upon learning to use the debugger in MSVC and having a really difficult time learning to use vim and gdb.
That said, switching to a whole other environment just to compile can be a bit of a chore. Some text editors come with hotkeys for compiling and running. When I was in high school, we used TextPad: Ctrl+1 to compile Java and Ctrl+2 or 3 for running programs (depending on whether they were applets or applications).
In college, I used nano/pico and eventually emacs. I'm still slower in some modern IDEs when trying to manipulate text as fast as I did in emacs -- though autocomplete and jump-to-definition makes my coding faster, in general.
At the end of the day, though, we're two people with anecdotal evidence that strongly supports our side of the argument, in our minds.
Addendum, regarding debugging: for the kinds of problems that new programmers run into, I find println debugging is generally sufficient, so that's what I would recommend.
Perhaps it just depends on what you're trying to teach. If you're trying to teach students language-specific APIs, then you wouldn't want them to have auto-complete. If you're trying to teach students how to program in a language-agnostic way, then the APIs themselves don't matter as much as the theory behind them does.
I agree with you on reading and understanding the docs. The problem with memorizing APIs, however, is that they change. Even the C++ standard library was just overhauled by C++11. What is the value of memorizing something that will be deprecated in 2 years (or less)? Understanding it has much greater long-term value.
At my college, I've talked to a lot of students who are reluctant to try new languages because they've become reliant upon the syntax and APIs of one particular language. If I challenge them to read some code in language X, they just respond with, "But I don't know language X"-- as if that were a valid excuse.
Rather than being taught HOW something is done in a particular language, they should be taught WHY.
In college classes, these friends often encounter the situation where a class is taught by one professor each term, and the intro classes tend to receive the less-desirable professors. These are the professors whose assignments don't line up with class lessons, who teach programming history before techniques, etc. This puts them in the position of having to suffer through learning a difficult subject with little help, or waiting years until one of the "better" professors teaches an intro class.
With Codecademy, it's easy to get discouraged because of the lack of people around -- you can go onto the forums, but there are no humans in proximity to commiserate or discuss hard problems with. Further, some of the problems are broken -- while you and I can just "look in the back of the book", aka read the forum, others see this as "cheating" in the same way that reading a GameFAQs guide is "cheating" at a game.
What seems to be needed is something similar to Codecademy, but with lab/office hours -- where people can work at their own pace but in the same space as others at their level, plus people with more advanced skills who can give advice and mentor. I've envisioned this as a hackerspace for some time, where the mentors are also working on projects, and may even receive advice from other members. Like a mix of Montessori schools and Valve.
It would be great to hear your thoughts on CodeHS.
>but with lab/office hours -- where people can work at their own pace but in the same space as others at their same level, and people with more advanced skills, who can give advice and mentor
This is literally a description of what we are trying to build at CodeHS.
>With Codecademy, it's easy to get discouraged because of the lack of people around -- you can go onto the forums, but there are no humans in proximity to commiserate or discuss hard problems.
Yup. And every beginner gets stuck. If you like our approach, contact us at firstname.lastname@example.org!
MIT's 6.00x has a strict weekly structure.
Udacity's courses have all eliminated the schedule and are now always-open, self-paced.
Most of Coursera's courses are structured.
If you want your friends to actually become programmers, I'd recommend that you push both an Intro course as well as Codecademy for language practice. Codecademy by itself isn't going to get you anywhere. The tutorials don't provide a solid enough foundation to actually implement anything meaningful. As a beginner programmer I went through every Python tutorial in < 30 hours. I used them as Python language practice, which worked well as a supplement to the core Intro courses. Both Udacity and 6.00x use Python, so they work well with Codecademy's Python tutorials.
I didn't know that. Thanks. I'll mention it to them.
> If you want your friends to actually become programmers I'd recommend that you push both an Intro course as well as Codecademy for language practice.
That's great advice. I can't teach my friends because I'm not really a programmer -- I was a business major turned cloud engineer who's still teaching himself.
> Codecademy by itself isn't going to get you anywhere.
I gained a great deal from finishing about a quarter of Code Year. I feel like I grasp the fundamentals of programming and could soon become an intern at a startup. I wish my friends could learn as easily as I do...
It has a pretty steep learning curve before you can start writing interesting programs, compared to Python or C. C has its own difficulties, of course, but a lot of those are things that are actually difficult, such as how to deal with memory management, rather than fluffy problems about interfaces vs. abstract classes.
OTOH there are a lot of jobs out there in Java which is perhaps mainly due to it getting into universities/colleges in the 90s when it had the kool-aid factor.
Not so sure about making all grading project based though. The problem with this is that whilst plagiarism is usually punished severely there is the gray area of "helping your neighbour". This gives a bonus to persuasive types who are good at getting other people to "help" them an awful lot.
Of course the argument could be made that this is good preparation for industry.
Maybe that could be helped by the author's suggestions on how to make the written tests better. Or by having timed programming exercises as part of the test.
I was lucky enough to circumvent most of these issues by using my mandatory CS courses as an annoying supplement to my programming contest preparation and game programming projects.
Also, I went to Rutgers as well.
Another point is that this class is sort of a "learn C++" course, and I think it suffers from a lack of motivation; everything feels somewhat contrived. I wish it had some overarching goal that we were working towards as motivation. It would give more context to what we are learning.
I'm interested in the trajectory of App Academy, Catalyst Course, and others. I'd wager most graduates of those ~6 week programs end up better programmers than graduates of many 4 year institutions.
The point of those questions isn't to test whether you know the definitions of polymorphism and inheritance. The point is to test whether you understand the concepts. Anyone can regurgitate the textbook definition, but can anyone explain why polymorphism should sometimes be avoided? Not if they haven't invested some time in understanding both the theory and applications of OO principles.
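To make that concrete with a toy example of my own (not from the article): polymorphism means a caller can work with a `Shape` without switching on its concrete type, because dispatch happens at runtime. Understanding it means knowing both what this buys you and what it costs (one logical decision gets spread across many classes, which can make the control flow harder to read).

```java
abstract class Shape {
    abstract double area();
}

class Circle extends Shape {
    final double r;
    Circle(double r) { this.r = r; }
    double area() { return Math.PI * r * r; }
}

class Square extends Shape {
    final double side;
    Square(double side) { this.side = side; }
    double area() { return side * side; }
}

public class PolyDemo {
    static double totalArea(Shape[] shapes) {
        double sum = 0;
        for (Shape s : shapes) {
            sum += s.area(); // dynamic dispatch: no switch on the concrete type
        }
        return sum;
    }

    public static void main(String[] args) {
        System.out.println(totalArea(new Shape[]{ new Circle(1), new Square(2) }));
    }
}
```

Regurgitating the definition is easy; explaining why you might prefer a plain switch over this hierarchy when the set of shapes is fixed and the set of operations keeps growing is the part that shows understanding.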
But Python is a candidate; the argument for Java is the combination of the IDE and the compiler. If you make a typing mistake, the IDE screams at you, which is helpful for beginners.
After all, is Java that bad? Of course there are people misusing Java with bad design practices. But that is just a reason for teaching object-oriented programming.
And aside from pointers, there really isn't anything in C that is harder than in any other programming language. I personally found pointers easy to learn, which I think was the result of having them explained clearly and then being forced to use them in my code.
If understanding types is so important, then why not go all the way and use either ML or Haskell, or at least Scala?
Most freshmen who stumble through a Java-school curriculum will not understand types at all, and based on confusion among a number of colleagues just yesterday (all Java programmers) about what a "sum type" is, it's clear to me that Java doesn't encourage developing a rigorous understanding of types, even for professional programmers.
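For what it's worth, a sum type is just a type whose values are exactly one of a fixed set of alternatives, which ML and Haskell express directly (`data Result = Ok Int | Err String`). In Java it has to be simulated; here's a common pre-Java-17 encoding of my own (sealed interfaces make this more direct in modern Java), which is probably why the concept stays invisible to many Java programmers:

```java
abstract class Result {
    private Result() {}  // private constructor: the nested cases below are the only alternatives

    static final class Ok extends Result {
        final int value;
        Ok(int value) { this.value = value; }
    }

    static final class Err extends Result {
        final String message;
        Err(String message) { this.message = message; }
    }
}

public class SumTypeDemo {
    // The moral equivalent of pattern matching, done with instanceof.
    static String describe(Result r) {
        if (r instanceof Result.Ok) {
            return "ok: " + ((Result.Ok) r).value;
        }
        return "err: " + ((Result.Err) r).message;
    }

    public static void main(String[] args) {
        System.out.println(describe(new Result.Ok(42)));        // prints "ok: 42"
        System.out.println(describe(new Result.Err("parse")));  // prints "err: parse"
    }
}
```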
I'd argue understanding "what is really going on at the machine level" shouldn't be the primary goal of your first programming language.
Java has a huge barrier to entry before one starts programming. Huge barriers to entry don't belong in a 141 course.