This methodology is suspect, and the headline is extremely deceptive. Note that Guo has conflated CS0 (meaning: CS for non-majors) with CS1 (meaning: intro to the CS major). He is upfront about this fact and does it intentionally, but comments about "first officially-sanctioned exposure in college" and such make it sound like he's unaware that most CS majors never take a CS0 course; and among CS instructors, a phrase like "introductory language" without further qualification is usually understood to refer to CS1.
If we then look at the table of actual data to filter a little bit, the story changes. There are certainly some schools, big ones, that use Python in CS1. But a lot of the Python representation is coming from CS0 at schools where CS1 is taught using Java.
(I can also clarify one of his other comments: he says of the AP exam using Java that "it's unclear whether that will change anytime soon." This is untrue. It will not change in the next 3-4 years at the very minimum---and that would be the time frame if the College Board said right now today that a change was happening. In fact, though, although the idea's been mentioned, there has been no study or apparent motion on that front, so a more realistic earliest-date-of-change would be around 2019 or 2020. And I wouldn't count on it, given how many CS1 courses are still taught in Java.)
Author here. Thanks for your comments blahedo. I'm a bit short on time right now, but I will convert some parts of this response into more coherent form and edit my article accordingly to clarify.
I'll address each of your paragraphs in turn:
1.) Re: CS0 vs. CS1. Like you mentioned, I was very upfront about the methodology for this analysis. And I am very aware that most CS majors never take a CS0 course. But please also be aware that a minority of CS majors do take CS0 before moving on to CS1, usually because they feel ill-equipped to take CS1 right away. I harp on this point because even though this is a minority population, it's a very important one since those are students who, without CS0, would not major in CS. These students are more likely to be female or from underrepresented minority groups. On a related note, a larger and larger fraction of non-CS majors are taking CS courses because they recognize the value of programming and computational thinking. (Again, these non-CS majors are more likely to be female and from underrepresented minority groups.) So I would argue that CS0 is just as important an "introductory programming" course as CS1, if not more so, due to the rise of the non-software-engineers-who-want-to-learn-programming population. [Edit: And I think that a CS department's choice of language to offer to the CS0 student population is still very significant, and a decision that is not made hastily.]
2.) Re: Python CS0 and Java CS1. You are absolutely right, and in my original article, this sentence appears in the discussion section: "Some schools have fully switched over to Python, while others take a hybrid approach, offering Python in CS0 and keeping Java in CS1."
3.) Re: AP exams using Java. Thanks for the clarification. I will remove "it's unclear whether that will change anytime soon." from the article.
[Also, as to cyorir's comment that the chosen data may have been cherry-picked, please rest assured that I tried my best NOT to cherry pick. If I had cherry picked, wouldn't I have made Python look even better? ;) Any incomplete data is strictly due to human error or oversight on my part, and I gladly welcome any correction requests via email.]
Thanks for responding! Do you still have in your notes the breakdown for each course whether it's CS0 or CS1? That would be good to put into the table in some form so the rest of us can make our own judgements about what that might mean. I don't disagree it's important, for many of the reasons you mention; I only really disagree with its characterisation as a majority intro language.
Indeed, I expect that when they're broken out, Python will be the overwhelming top choice for CS0, and a respectable second in CS1; but then, that's not as big a change as you might think. I'd be surprised if Java was ever a majority or even a plurality in CS0 courses. One subgroup of what you're calling CS0---the ones focussed on programming, often called a "service" course and targeted at engineering and science majors---have long been using things like C or Matlab, and never really had a strong movement towards Java. The other subgroup, what I'd call the "true" CS0 (i.e. based on what the ACM/IEEE curriculum guidelines used to call CS0) may have had no programming at all, or if they did, a short unit in a relatively lightweight language, certainly not Java.
Basically, the things that make for a good language in either kind of non-majors course are not always the same as what makes for a good language in an intro-for-majors course. That doesn't make the non-majors course unimportant, and it doesn't mean that decisions are made hastily, but it's still not super helpful to conflate the two.
great suggestions. sadly i am limited by time and bandwidth. this was a weekend hack that i did by myself, so it's unclear how much more rigorous i can make the analysis. i'll try to incorporate some lightweight edits for clarification, though.
wow great questions! right now www.pythontutor.com is my ongoing effort to support Python CS0/CS1. REPLs are important for live programming via trial-and-error. see:
http://pgbovine.net/teaching-librarians-programming.htm
ML (OCaml and SML) seems to get short shrift in your analysis. CS51 at Harvard is in OCaml. Brown's cs017 is OCaml, Scheme, Scala, and Java. Penn's CS120 is OCaml (I think. It definitely used to be, but I can't find the course online anymore. You can reach out to Benjamin Pierce to ask, though.) And CMU has an SML-based FP course as part of their intro sequence.
None of this is to deny the fact that Python is highly popular as an intro language.
Thanks, will look into those and update accordingly. However, Harvard CS51 doesn't count as CS1 since CS50 is already its CS1 course. (But please correct me if I'm wrong.)
(arg can't reply to your reply) this is actually quite important ... let me keep looking into this. sorry there's a giant backlog of (often angry!) emails in my inbox right now ... might not be able to sort thru them all :)
i wish the blogging platform supported some version-control friendly format.
[Edit: I made a typo about Penn CS120 ... I mistakenly found another course named CS120 that wasn't at Penn. Also, Harvard CS51 calls itself "Computer Science II" [1], implying that it's a CS2 course. It probably has CS50 as a prereq, tho I haven't double-checked yet.]
It depends on what CS1 really means. I do think cs51 is comparable to CIS120 at Penn, and I think the latter is pretty clearly CS1. That said, I find it hard to really figure out what the precise criteria are. It's pretty clear that Harvard encourages concentrators to take cs50, but I think the same course at other institutions would often be skipped by concentrators.
@yminsky -- yep, taxonomies r hard :) but i'm trying to go as much as possible by what each department proclaims for itself rather than trying to calibrate across departments. If Harvard calls a class "Computer Science II", I'm going to consider it as CS2. (fwiw I consider Harvard CS50 a hybrid of CS0 and CS1 since students voluntarily split themselves into two tracks called "less comfortable" and "more comfortable", respectively.)
I would have thought cs50 is a cs0 course, and cs51 is the cs1. cs50 is the concentrators-and-non-concentrators course that's all flash and fun. What do you think is the cs0 at Harvard?
As for your point about Penn's CS120, I guess I don't fully understand the criterion. From the way you described it, CS120 seems like a CS1 course: not for someone who has had zero programming, but the very first course taken by most CS concentrators.
FWIW, I spoke to Benjamin Pierce about the structure of the course, and he said he didn't want to use "Real World OCaml" because it's too early of a course: he has students who he thinks don't yet understand things like what a scope is. From the sense I got from him, it's quite early in the curriculum.
That said, this is mostly minor quibbling. I think there's little doubt that "very early" courses, for some reasonable definition of "very early", lean towards Python. And reasonably so.
I'm not sure how you define CS0, but my understanding is that there are now three introductory CS courses at Harvard: CS1, CS50, and CS51. Note that I graduated in 2007, so my understanding may be dated.
CS1 is intended for non-concentrators or people with very little prior CS experience.
CS50 is the standard intro course that most CS concentrators (or people considering CS as a concentration) take. Although it's taken by non-concentrators, most of the non-concentrators taking the course do so because they're in some field that has a lot of exposure to programming (e.g. Applied Math).
CS51 is the "advanced" intro course and is usually not the first CS course a CS concentrator takes, if for no other reason that it's not offered until the spring semester.
I went to Penn and TAed for some lower level CS courses (but not 120).
CIS120 is the course for students who have some programming experience, and probably falls under CS1. I had taken AP Java AB before and was overprepared for the old (Java) version of the course. The current course teaches basic data structures and uses OCaml.
Penn's CIS120 is half OCaml and half a continuation of Java from 110 (the first course). Python is not found in the intro courses at Penn. Just Java and OCaml.
...it's a very important one since those are students who, without CS0, would not major in CS. These students are more likely to be female or from underrepresented minority groups.
I'm curious - would the group be less important if they were less likely to be female/non-asian minority? I get this impression from your text, though you don't actually state it explicitly. As a result I'm curious what your actual view is.
English isn't very good at expressing set logic, and I'm no mathematician :)
You can delete this sentence -- These students are more likely to be female or from underrepresented minority groups. -- and the meaning of the previous sentence remains unchanged. The key point is that CS0 is important for attracting people who didn't have prior background in computing and thus could have a latent interest in CS that would otherwise go unmet. Now, it turns out that a larger fraction of those students are from certain underrepresented groups, but even if that weren't true, it wouldn't make the case for CS0 any less important for the "group" that I think you're referring to, which is "students who have latent interest in CS but not enough prior exposure to know that they would love CS." (I wish I could've expressed that in an elegant mathematical equation!)
[Edit: when I wrote "even though this is a minority population", I didn't mean "minority" in the underrepresented minority sense ... I meant that this is a small population compared to all CS majors. Again, English != set logic]
My question was slightly different - would the helping the group of people taking CS0 be less important if that group was comprised entirely of asian males?
I couldn't tell if the sentence was meant to support your main argument ("we should help group X because lots of non-asian minorities") or if it was merely a curiosity that was irrelevant to the main argument ("ooh look, lots of non-asian minorities, statistically unlikely therefore interesting").
(argh can't reply to replies) -- @yummyfajitas -- no i don't think it would be fundamentally different, since the mission would still be to bring out people's natural potential and interest in a subject, to kindle a spark that might otherwise have gone unlit due to lack of early exposure.
[Edit: I'm not eloquent enough to make a convincing case at 2:20am, but I don't think that extra sentence was merely a curious non-sequitur, even if it could be cut without modifying the point of that paragraph. I can list out tons of points/counterpoints in my head ("what about redheaded people with thin left eyebrows? why not mention them? would that be a non-sequitur? what about the historical and socioeconomic implications behind this demographic group and its relationship to technological access? ..."), but I don't have enough expertise to give this topic a proper treatment. So I'll leave this thread by recommending the following two books:
I think this is a good point. And in this respect I feel like the chosen data may have been cherry-picked or are somewhat incomplete.
As an example, take Northwestern. At Northwestern there are several intro routes you could take, depending on your major. CS majors are not required to learn Python - it is used in some 300-level classes but not in very many 200-level classes (although that may change in the coming years). I've actually taken classes with projects using Python without ever taking classes teaching Python. CS majors will likely take C, C++, Java, MATLAB, or a variant of Scheme for their first language, depending on which sequence they take, whether or not they are double-majors, or course scheduling.
In contrast, non-CS majors may learn C++, Python, MATLAB, or a variant of Scheme as a first programming course, depending on why they are learning a programming language.
So while schools like Northwestern may offer introductory courses in Python, I think the truth is more something along the lines of "students may learn Python, or they may learn any of a number of different languages as their first language, depending on what their academic goals/requirements are."
Using Georgia Tech as an example, the two classes mentioned are the introductory courses for industrial engineers and business majors. The article excludes the introductory course for CS majors (1331), which is taught in Java, and the introductory course for most engineers (1371), which is taught in Matlab.
>the introductory course for most engineers (1371) which is taught in Matlab.
A fair point of warning is that, having been at Georgia Tech 2008-2011, most of my friends in engineering and physics freshman year hated CS 1371, and spent many a late weeknight complaining about Matlab. The common wisdom, often picked up too late, was that engineering students would have a better time if they switched into the Python course, iirc CS 1301.
For my part, I had placed out with AP credit, but took CS 2110 using a combination of LogicWorks (circuits), LC-3 assembler and C. I thought it was awesome; that course is probably the reason I have my current job, but many of my friends never considered taking another CS course, or only did so several years later, because they hated Matlab so much.
In my experience: ask any Computer Science professor interested in undergraduate education whether engineers should take an intro course using Matlab or Python, and they'll always say Python.
Ask any non-CS Engineering professor interested in engineering undergraduate education whether engineers should take an intro course using Matlab or Python, and they'll always say Python.
This argument seems pervasive across many institutions.
Author here: Please email course correction requests to philip@pgbovine.net ... I'll probably miss a few important corrections amidst the hundreds of comments.
As a clarification, a CS0 or CS1 course usually has no CS prerequisites, so if a course you're trying to mention has other CS prereqs, then it's probably not CS0 or CS1.
ha, feel free to DOWNVOTE so that this appears at the very bottom, so that it's visible :) but don't downvote so much that it disappears.
It says a lot about our understanding of our field that new languages generally do not formally test their syntaxes (and other "mental model" features) for usability.
If programming language selection for introductory university courses was based on pedagogical research on introductory computer science courses, I suspect Scheme would be nearly universal given its roots and nearly forty year track record.
But that's not how languages have typically been selected for 15 years or so. The direct vocational skills argument has held much more sway.
IMO Scheme is one of the easiest languages to learn. Have a look at SICP (http://mitpress.mit.edu/sicp), one of the best introductory programming courses available, written entirely in Scheme. They just _use_ the language, without actually _teaching_ it, and students just pick it up on the go. Sure, there are lots of corners omitted, but by the end of SICP, you've written your own Scheme interpreter / compiler combo that you can build upon.
It depends who you are. People who have the recursion muscle tend not to appreciate what an obstacle it is for people without it.
Basic use of scheme requires you to think recursively. If you have that, scheme gives enormous power for a small investment of learning, thinking and typing. If you don't have that, you need to deal with that before you can use scheme.
You could take a group of elite students with a math/physics background and have some confidence that their grounding in mathematics and physics will have prepared them, or that it will be a short and healthy jump for them.
But it would be bigger hurdle for other groups. Python is a useful because you can ease people into this kind of thinking. And people who aren't there yet can still get things done.
Regarding SICP, even in early chapters, it requires you to understand maths concepts that wouldn't be available to you even if you were a professionally-employed person returning to university. The Little Schemer path would be a better intro for groups without that.
There is a really useful criticism of Scheme implied by your comment. Idiomatic Lisp is difficult to write, and there is a willingness in Scheme-based curricula to infuse it from the beginning. Unlike implementing objects in Java, using recursion and avoiding mutation are stylistic choices in Scheme. Iterative looping and state variables are viable alternatives, and Scheme and other Lisps do mutation really well. The language has no prohibition on imperative style.
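To make the stylistic contrast concrete, here's the same task in both styles, sketched in Python rather than Scheme (the function names and the list-summing example are mine, purely for illustration):

```python
def sum_rec(xs):
    """Recursive style: no loops, no mutation, just a base case and a recursive call."""
    if not xs:                       # base case: empty list sums to 0
        return 0
    return xs[0] + sum_rec(xs[1:])   # first element plus the sum of the rest

def sum_iter(xs):
    """Iterative style: a state variable updated inside a loop."""
    total = 0
    for x in xs:
        total += x                   # mutate the accumulator
    return total

print(sum_rec([1, 2, 3, 4]), sum_iter([1, 2, 3, 4]))  # 10 10
```

Both are valid in Scheme too; the pedagogical question is which style a curriculum leads with.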
After the first lecture, I am pretty sure Scheme syntax is not an impediment. For someone with _no_ CS background, learning Scheme is, in every connotation, the right way to go.
Does anybody here know of any resources where I might read up on best practices in CS pedagogy? I'm very interested in helping to reform my university's intro-level CS program.
Best practices differ from context to context. MIT switched from Scheme to Python because a large segment of students in 6.001 were not necessarily going into computer science and were there because of a common core curriculum. Does it really make sense to teach such people to write language interpreters in an introductory course?
But every place isn't MIT, nor is every course a common core requirement for a college or department with diverse majors. Does a civil engineering student really need to implement lazy evaluation?
Felleisen's paper and subsequent book as well as the work of the PLT group show the current state of Scheme as a pedagogical language.
"Best practices differ from context to context. MIT switched from Scheme to Python because a large segment of students in 6.001 were not necessarily going into computer science..."
Nope, MIT has an introductory computing course that logically goes before 6.001-4 or the new 6.01-2. For a long while they didn't have one due to resource constraints; they dropped this service course around the time 6.001-4 was developed, but other departments had their own more domain-specific introductory courses (see below for more details).
6.001 was canned and Scheme completely purged from the undergraduate curriculum for political reasons, after the Department, and likely the School of Engineering and the Institute itself, panicked when the dot-com bust more than halved enrollment, after it had been 40% of the undergraduate class for decades. Oh, yeah, this was also after the Institute had blown a quarter billion dollars on a not-very-functional new building for the EECS research labs.
"... and were there as because of a common core curriculum."
That ignores that MIT's EECS department has for a very long time insisted that EE majors have serious programming experience, and CS majors the same for hardware: 6.001 programming, 6.002-3 hardware, 6.004 both, and all majors had to take all 4. This was true but less coherent in the '70s before that common core was developed, and it's still true with 6.01-2, which combine both. MIT's EECS department started as an EE department; this is not an unusual approach for such, vs. ones that started from the math side or otherwise independent of EE. If you want pure CS, you shouldn't go to MIT or U.C. Berkeley, but to CMU or Stanford (or so I gather about the latter three).
"Does it really make sense to teach such people to write language interpreters in an introductory course?"
Yes, if you want them to really understand what it's all about. Since it's Lisp, that's also much less intense than it normally is, essentially all the colossal parsing messes with other language families can be skipped thanks to its S-expression syntax, and you can focus on the very basics.
As for e.g. Civil Engineers, MIT's Civil and Environmental Engineering department has its own 1.000 and 1.001 courses for its type of computing, and as I remember that was true way back when 6.001-4 was developed. In the late '70s, 4 or so engineering departments got together and got a VAX-11/780, which they used for their own education etc. purposes, while EECS later procured first a DECSYSTEM-2060 (which had a CPU twice as fast, but a max of 1 MiB address space), then, in addition, HP 68000 workstations (eventually Project Athena took over this sort of duty, I think).
Finally, whatever the pedagogical merits of Scheme and/or SICP/6.001, Python is now winning due to fashion rather than merit (although it does have some merits compared to other languages that could and are often used like Java).
And this is exactly what everyone needs in the beginning.
Simple and clear syntax where you are able to produce results in a short amount of time to keep you interested.
I just remembered some people I learned with who struggled a long time to grasp the concept of pointers and how to use them, back when we were taught C/C++ as an intro language to programming.
Yes. At Pasadena City College, at least, most CS classes are taught in C++. AFAIK the creation of even one Python class is still mired in political nonsense.
Thank you for that link. I finally understood why Guido added the colon before an indented block (second paragraph). It's not needed by the parser; it's only there to make it easy for programmers to see that a block is about to start. Actually: "after early user testing without the colon, it was discovered that the meaning of the indentation was unclear to beginners". So it's aimed at beginner programmers. That's why Guido inflicted that useless and silly-looking colon on all of us. How about making it optional in the future?
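For readers who haven't followed the link, a minimal illustration of the colon in question (the `sign` function is just a made-up example):

```python
# Every compound-statement header in Python ends with a colon before its
# indented block; per the linked discussion, the parser doesn't strictly
# need it, but user testing showed it helped beginners see where a block begins.
def sign(n):
    if n < 0:            # colon marks the start of the indented block
        return -1
    elif n > 0:
        return 1
    else:
        return 0

print(sign(-5), sign(3), sign(0))  # -1 1 0
```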
A lot of the recent Rust syntax changes are based on applying proposed changes to the Servo codebase and analyzing the changes subjectively across the entire sample. In cases where new semantics help with inference of types or annotations, the metric becomes quite objective.
But there is some weird stuff in Python syntax. Like, why do you create a dict with {} but access it with []? It always felt to me like an organic language, not one that was planned with consistency in mind (which is fine, of course).
The access pattern is probably the right aspect to make consistent (at least in the context of duck typing). If the access isn't generalized across various types they might as well use named methods.
Declaration doesn't reflect usage; it's not a helpful principle in general. You create tuples with () but access them with [], and the fact that [a, b, c] is a list has very little to do with the fact that you use [] to index into it. The {} syntax for dictionary literals is reasonably common; Ruby's looks fairly similar, and Javascript uses almost exactly the same syntax, which many people will be familiar with thanks to JSON. Are there any other popular syntaxes for dict literals?
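A short sketch of the construction-vs-access point (variable names are mine):

```python
# Construction uses a type-specific literal syntax...
t = (1, 2, 3)            # tuple literal: parentheses
lst = [1, 2, 3]          # list literal: square brackets
d = {"a": 1, "b": 2}     # dict literal: curly braces
s = {1, 2, 3}            # set literal: also curly braces, but no colons

# ...while access is uniform across indexable types: square brackets,
# dispatching to each type's __getitem__. This uniformity is what makes
# duck typing work for subscripting.
print(t[0], lst[1], d["a"])  # 1 2 1
```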
I've always been of two minds about this. My intro course at MIT was in Scheme, a class that has since been renamed and is now taught using Python. On the one hand, it was a great course, and the FP was an awesome mindfuck (previous programming experience: TI BASIC). Though I took it all for granted at the time, it definitely set me up to think about programming better for years, and I still reread the book sometimes. The downside was that for a little bit, I couldn't really program in anything really practical, because by 2001 Scheme kinda wasn't. In other courses, I found myself crunching data with Perl that I only sort of understood, or writing C programs in situations where C probably wasn't the ideal tool, because it was the only other language I really knew. The thing where people say "once you know all the theory, you can learn a language in a couple days" isn't really true, and anyway, you probably don't have that couple of days. Point to Python.
So the question is really: can you use Python to get most of the benefit of teaching using Scheme while at the same time giving the student a practical tool for future use? I haven't taken the new Python courses, so I don't know how well the material bends to the language, but I remain skeptical of the "most of the benefit" part. Maybe I'm just a curmudgeon, but I sort of think they should keep Scheme and then just have a short (e.g. over IAP) course on practical programming techniques using Python (or whatever). Or maybe someone should rewrite SICP in Clojure? Finally, I know Scheme has continued to evolve, and I wonder if it's really still true that Racket is impractical as a day-to-day language.
MIT grad here. I took the new course (6.01) 5 years ago, but I'm reasonably familiar with SICP. They are _very_ different. 6.01 has an emphasis on robotics and AI, probably because it's supposed to be an intro to EE as well as CS. SICP is more focused on the semantics of programming languages and the power of abstraction. 6.01 teaches some of the latter, but with some OOP mixed in. But it also teaches circuits, discrete signals stuff (e.g., the z-transform), probability, graph search, state estimation, etc. It's a breadth-first intro to EECS, whereas SICP is more of a depth-first intro to the building blocks of computer programs.
Telling me that the class I think of as "new" has been around that long makes me feel old. Point taken that they're just fundamentally different classes, not just the same class using a different language medium. 6.001 had some OOP in there too, actually (you ended up building your own OOP impl in Scheme; it was kinda neat), but otherwise I agree with your characterization of SICP. I definitely see the value of 6.01's approach (especially since the other 6.0x classes are also different from the 6.00x classes), but I can't help but feel like that really solid exploration of programming fundamentals is crucial, and something important has been lost. So maybe this is less about Scheme and me just lamenting the loss of 6.001. (I do hear you can take 6.001 over IAP, and of course you can always learn it on your own, so I get that a lot of this is just me being attached to a formative experience.)
I'm also an MIT grad (class of 1992). I took 6.001 back in the Scheme days. I loved it, and have found that the lessons learned in that class have stayed with me for a very long time. I now teach professional programming classes, and it's a rare lecture that doesn't use some of what I talked about there.
The irony is that I teach a lot of Python programming classes. And despite that, and my love of Python, I think it's a shame that they no longer use Lisp (or Scheme, or Clojure) for the equivalent of 6.001.
Python is easier to understand and get into. And Python has some amazing features that you want, such as first-class functions. And it's a real-world language, which you can't say about Clojure, even with its growing popularity. And Python is my strong recommendation for everyone's first language, because it's so easy to get into.
But there are some things that are just easier to understand in Lisp than in other languages. Implementing Lisp in Lisp (metacircular evaluator) is a fabulous way to think about programming languages that is harder in Python than in Lisp. (Not impossible, but harder.) Lisp allowed us to think about programming in all sorts of new ways, because it handled so many of those paradigms so well.
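As a rough illustration of why the metacircular-evaluator exercise is heavier in Python than in Lisp, here is a deliberately tiny s-expression evaluator sketched in Python. Nested Python lists stand in for s-expressions, and only numbers, variables, `if`, `lambda`, and application are handled; all names here are my own, not from any course:

```python
import operator

# Minimal global environment: symbols mapped to Python callables.
GLOBAL_ENV = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def evaluate(expr, env=GLOBAL_ENV):
    if isinstance(expr, (int, float)):   # numbers are self-evaluating
        return expr
    if isinstance(expr, str):            # symbols are variable lookups
        return env[expr]
    op, *args = expr
    if op == "if":                       # (if test then else)
        test, then, alt = args
        return evaluate(then if evaluate(test, env) else alt, env)
    if op == "lambda":                   # (lambda (params) body) -> closure
        params, body = args
        return lambda *vals: evaluate(body, {**env, **dict(zip(params, vals))})
    proc = evaluate(op, env)             # otherwise: apply operator to operands
    return proc(*(evaluate(a, env) for a in args))

# ((lambda (x) (* x x)) 7)
print(evaluate([["lambda", ["x"], ["*", "x", "x"]], 7]))  # 49
```

Even this toy version needs explicit type dispatch and environment plumbing that Lisp gets nearly for free from its uniform s-expression representation, which is the point being made above.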
So I still think that the intro CS class should be in a form of Lisp, because it opens your mind up to possibilities. But if you're not going to go that route, then Python is a terrific first language, and I'm quite happy to see it getting the attention it deserves.
It's relatively new, but I'm really excited about it, and that is what I'm learning, it allows you to get the benefits of learning lisp and python at the same time. It's compatible with python in both directions, so you can use it in any practical project - with django or with machine learning libraries, etc.
That does seem like a nice option, and sort of what I was going for with my Clojure suggestion. And I suspect that mostly any Lisp will do for SICP's purposes.
What is the purpose of CS101 for CS majors? Is it to teach a tool you can use in your future courses? Then Java, Python and C++ are good choices.
Or is it to teach the fundamentals of programming? Then Scheme is the ideal choice, but they should map concepts to mainstream languages. I.e., here's how OO works in Java, now we'll simulate it in Scheme. If more people took a course like this, we'd have fewer language fads IMO.
Clojure would be a great language for teaching, keeping Lisp and SICP while giving students a tool that compiles to something running on the JVM. Teaching Python definitely has the advantage of being more market-ready, even though Ruby is nicer and friendlier, probably better at OOP, and has better support for lambda functions.
SICP is such a great book, it is a must have for programmers...
I think Python is the perfect first language for beginners. Yeah, we all know it's not fast or low-level, nor the best language for teaching about types or advanced CS topics, etc. But it is extremely readable and easy to understand; it makes coding fun, and once students are engaged and know the basics, it is much easier to teach more advanced concepts like typing and memory allocation.
1) I hope they are teaching Python 3.x instead of 2.7.x.
2) A good portion of the schools cited in the Appendix are not "computer" schools. UC Davis for example, is the last place I would attend for a CS degree (I know many students at UC Davis -- it's much better suited for other degrees).
3) My school (CSU Sacramento) does not even offer any Python courses, but there's a lot of Java and C courses (depending on your major concentration).
I think it's important to teach what's relevant in the current day. Everyone likes to parrot the "Python 3 has no support" line, but it's simply not true[1].
The developers have said multiple times there will be no 2.8 and the 2.7 line will only receive security fixes. Python 3 is the future of Python and not teaching it to students would be a disservice to them.
Sure. My point is that nearly everything you learn about writing a program in 2.x is relevant to writing a program in 3.x.
You do have to keep track of whether print is a statement or a function or how literals map onto types, and some libraries are imported using different names and some APIs have changed a bit, but if you understand all that jargon, it's not that big a deal (I'll repeat my disclaimer here about transitioning a significant code base being a different thing than managing the differences).
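A few of those differences side by side, written as they look in Python 3 (a sketch, not an exhaustive list):

```python
# print is a function in Python 3, not a statement (py2: print "hi")
print("hi")

# / between ints now yields a float; // is floor division
assert 7 / 2 == 3.5    # py2: 7 / 2 == 3
assert 7 // 2 == 3

# some stdlib modules were renamed (py2: import urllib2)
from urllib.request import urlopen  # renamed from urllib2

# string literals are Unicode text by default; bytes are a separate type
assert isinstance("héllo", str)
assert isinstance(b"raw", bytes)
```

As the parent says, none of these change the core concepts a student learns; they mostly change surface syntax and names.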
Well, yes, sure... core programming concepts are the same regardless of language used and/or version.
However, from my experience, most students don't venture off on their own until after becoming very familiar with a language. Not all students are experimental enough to hack on personal projects for fun or what-have-you. I've actually had a fellow student tell me the Java class file he emailed me for a project would only run on Java 6 since that's what the lab had installed -- obviously wrong.
The point being: students will internalize what they are taught. They should be taught on modern versions and concepts so they internalize modern ways of doing things. By teaching Python 2.x to students, you do them a disservice by teaching them the way Python "used to" do it, not how it "does it" now.
As an aside -- with all of the major Python libraries now ported to 3.x -- I see little to no reason anyone should ever learn Python 2.x, or use it (with the exception of the obvious legacy codebase support issue).
Every so often I check to see which of my 3rd party libraries has acquired python 3 support. It's edging up over the years, but it's still between 60-80% (depending upon the project). Not all of them small, either - some are ones you really wouldn't relish rewriting.
It has some neat features, but nowhere near enough for me to be prepared to dump a fifth to a third of the Python ecosystem.
I'm sure python 3 is coming eventually, but I think it's still a looooong way off.
Well, from my perspective, students should learn on current gen tech, be it Python 3.x or Java 8, C++11, etc.
I agree that simply learning some programming has no bearing on the language version, etc... however, as students get familiar and start to experiment on their own, they may learn "the old way" of doing things.
There is no downside to teaching Python 3.x, there might be a downside to teaching 2.7.x.
Yeah, there is a downside to 3.x. The transition has failed, from what we can see. It was released in 2008. It's 2014 and it's still used by a (vast) minority of people in the Python community.
Programming isn't one of those fields where, as in most things in life, 'the latest version' is always the right choice.
The best way to explain it... it's like saying everyone should adopt Windows 8, that there's no downsides.
It's hard to explain that to someone who isn't in the trenches using it, because it goes against people's natural urge to be 'updated'. Unfortunately, this transition hasn't worked out; no one is saying it, but it has pretty much failed and turned into a 20-year transition. Red Hat is supporting Python 2.7 until 2027, the core dev team until 2020. Guido van Rossum works at Dropbox, which started working this year on a new JIT compiler (Pyston) that targets Python 2.7.
There are actual performance regressions in Python3. I'd wait this one out for a while longer.
Python 2.7 is only 4 years old, and the stability of the language has actually been a boon, rather than churning out flawed release after flawed release as many Python 3 versions have.
To be honest, if someone forks Python2.7 and continues that branch, that's where I'll be going. Not to 3. That will be the end of 3, I assure you. That will be a -huge- downside to teaching Python3. :)
> Short version: Python 2.x is legacy, Python 3.x is the present and future of the language[1]
This should be enough of a reason for classes to move to 3.x.
Regarding your "in the trenches comment", remember, most of the folk on HN are involved in software engineering in some capacity, myself included.
No doubt it has taken a long time to move to 3.x, and it may take a while still. That isn't an argument for teaching students a language dialect that, according to its maintainers, is legacy.
> In particular, instructors introducing Python to new programmers may want to consider teaching Python 3 first and then introducing the differences in Python 2 afterwards (if necessary), since Python 3 eliminates many quirks that can unnecessarily trip up beginning programmers trying to learn Python 2. [1]
As I mentioned before, most (all?) of the major Python libraries have now been ported to support 3.x, so this leaves little reason to start out learning 2.x (unless you are setting out to maintain an old codebase, but we are talking about academics here and new beginners, so that point is moot).
One good reason to use 2 for learning is that any job would be using 2 -- a 95%+ chance. It's not worth debating you on this if you don't work in Python for your day job. There are just so many misconceptions in what you've said.
From the way it sounds, you don't, and Python 3 is hardly a dialect. The challenges of Python 3 aren't for me or you; they're for library maintainers, and for anyone with years of software built on 2.x (which is nearly all of us). I assure you, moving from 2 to 3 as an end user is not a challenge. I have used Python 3 since 3.0 -- it's no different 'dialect' -- but when I last did (3.4), I found major performance regressions in basic functionality (JSON processing), and I think most of us have given up on Python 3 (there are plenty of reasons). I'll revisit the idea of moving to it in another 5 years.
These issues are going unchecked in Python3 because there are no users. This is a major problem and plowing forward with Py3's development is a huge mistake. It remains to be seen what the consequences of this are.
It's simply propaganda to insist everyone is taught on Python3 as if it's the future of the language. You know it is a logical fallacy to appeal to authority. Python.org is taking a political stance, not one based on the real world. Python 2.x is the present. If you don't think so and refer to Python.org you're not programming in Python for a living. There's a major disconnect between the majority of the users of Python, and the core dev team that run that website. No one knows what the future of Python is yet, TBH Python.org should not be presenting 3 that way. In reality it's the experimental branch.
If you are using Python, then use 3 if you wish. Trust me, no one cares. You won't have support from most major distros or online hosts; Google App Engine, Azure, etc. are all on 2.x. You can't be doing too much, though, if you're not running into missing libraries and roadblocks.
The silent majority will move to 3 if and when it ever makes sense. More likely we'll either go with a fork or simply move on to another language for the type of work we do. Porting and testing are very expensive; why bother with that for no gains? We'll simply move to something else, and the majority of the userbase will be gone.
There are only gains from using 2, but think what you will. Everyone is free to use what they want, including schools. I'm just explaining why it's not as simple and straightforward as you put it.
If it were that easy, there wouldn't be any controversy.
>>>> it's still used by a (vast) minority of people in the Python community.
You sound so certain. Tell me exactly the number of people using Python 3? Not a percentage or something else without meaning. Real numbers. Don't know eh?
"1) I hope they are teaching Python 3.x instead of 2.7.x."
Not me. Are you a professional Python programmer in your day job? I don't think it matters which version they're teaching. As an end-user programmer, the difference really isn't that significant (not worth a breaking change in the language, at least). But whether they learn 2 or 3, they'll be able to move to the other without any major issue.
I would recommend anyone pick 2 over 3 at this point in time. Everything that says 'Python' supports 2, so there are just never any worries or concerns there.
UC Davis student here. To be fair, ECS10 is not a class that CS majors usually take--it's primarily an elective, and taken by non-majors as a way of satisfying science and engineering general elective credit. The CS majors start with ECS30, which uses C; ECS10 isn't even a required class for the Computer Science programs.
As for whether or not they are computer schools, that criterion was determined through USN&WR, as mentioned in the article, which is where your criticism of not being a computer school should lie. Davis is listed at #34 though, so there may be a sharp decline after the first 10, first 15, etc...
Very interesting (CSU Sacramento student here). Our CS majors start with CSC10 (or is it 15?), which is C, then CSC20, which is Java, then CSC35, which is Assembler (GAS/Linux). From there it depends on your focus, but a large number of classes favor Java, and a few C/C++.
For reference, the ranking used by the author is from US News & World Report[1]. While it's far from objective, it certainly holds enough merit such that UC Davis being the last school you would attend for CS sounds ridiculous.
Bachelor's-level students do not conduct research, so it seems odd to rank schools' bachelor's CS degree programs based on stats about their graduate CS programs (this thread is about beginner programming courses, i.e., undergrad students). As a footnote, at least at my school, the graduate-level professors are not the same ones teaching undergrad classes.
Grad school rankings tell you how strong the professors are. Both professors and lecturers teach courses. The strength of the department also tells you the quality of the lecturers it's able to hire.
Just as a contrasting opinion and at risk of sounding defensive -- I'm a UCD Alum (BS in a different field) who took the lower div CSE track there.
ECS10 (Intro to CS using Python) was my first formal CS class. Took it as a filler elective and was hooked. It was definitely geared toward first-time programmers and used Python to ease into fundamental CS/Programming concepts. I found it successful in that.
Following that was ECS30, which is in C and is the first major-track CS class. The difference in difficulty/workload was substantial, mostly due to the introduction of things like pointers, dynamic memory allocation, etc. But the intro class helped a lot. From ECS30 on, it's primarily C/C++ for OOP/software design, data structures, systems, algorithms, etc. I don't think Python is used much, if at all, in the major-track CS/CSE classes I can remember.
I'm a little surprised at the dismissive tone toward UCD's CSE program relative to CSUS. I've audited classes at CSU in consideration of applying for a master's there, and while not outright disappointed, nor was I impressed.
I got the feeling that the CSUS programs are not as geared toward fundamental research and "CS" to the same extent as UCD and are slightly more skewed toward developing employable skill sets. That's not a bad thing, just a different goal. I also disliked the general push toward MS stacks.
From peers at Cal that I took summer sessions with at UCD, the feeling was that UCD was less intense: smaller class sizes, better professor access (especially in classes taught by excellent lecturers like Sean Davis, who devote lots of time to students), and less competitive grade curving. I didn't hear anyone say it was easier, simply that it felt more collaborative. Though that's not hard to imagine when they were in freshman CS classes at Cal with nearly 1,000 students split across two simulcast lecture halls.
Being in the shadow of Cal and Stanford makes it harder for UCD to get noticed, but it has world-class faculty, smaller class sizes, and great options for research. I have friends and acquaintances who ended up at Google, Twitter, MS, SpaceX, etc. Is it MIT, Stanford, Caltech? Perhaps not. Is it better known for its bio sciences? Sure. But it's far from the 'last place' I would matriculate into for CS.
Just presenting (an admittedly verbose) alternative opinion.
Python was my first programming language 10 years ago. Right now, I'm using it to teach programming to an English major. He's using it to make text based games. That dating sim was sorta insulting.
All imo...
Python as the "101" language is a pretty good idea for universities in general because many fields do data crunching of sorts and Python is fairly widespread in the scientific community.
As long as signup to the class is open for all (and not just CS-students) there will be healthy benefits for other departments (economics, biology etc.)
Aside from that I've always kind of defended high level languages as entry level teaching tools. The closer to pseudocode/natural language the better. Once you start to think about solving problems like a programmer you can switch languages relatively painless anyway.
With the advent of RubyMotion etc. I'm starting to wonder if there would be some value in starting to teach programming on mobile platforms in a high level language as 101 (i.e. the first ever code they write is mobile code). From my experience the "app classes" are always popular because there's a certain cool factor in being able to show a friend stuff you did on your own mobile device.
If you'd tailor those classes to low-level phones it would be pretty helpful for certain areas of the world where many families have at least one cell phone but often no computer. Writing the code on the cellphone itself is pretty painful but at least one could write it at a university/school with a computer and carry the stuff that was built home.
Honestly, I'm not surprised, but I believe this is bad news. Learning C programming (first) was probably the most advantageous thing that has ever happened in my life of programming. When you understand C you understand how the computer functions, and in turn can write better software. It prepared me for virtually all programming and has enabled me to understand concepts correctly, the first time, and has made my life significantly easier.
I think C makes a great second language. Use Python to teach if statements, variables, control structures etc - then later use C to teach low-level algorithms, pointers and how computers actually work.
I don't think there's any harm in learning Python first. I think there may be harm in learning C first: it's much more likely to frustrate people and potentially even put them off programming.
This is the same reason I dislike Java as a teaching language: having to tell people "don't worry about what public static void main(String[] args) means just yet" when teaching "hello world" isn't a great introduction to programming.
>This is the same reason I dislike Java as a teaching language: having to tell people "don't worry about what public static void main(String[] args) means just yet" when teaching "hello world" isn't a great introduction to programming.
This is exactly why I have always thought that the move to Java for intro courses was a terrible idea.
C/C++ were never ideal for beginners, but it was still possible to strip them down to building blocks without immediately resorting to "don't worry about this now."
>C/C++ were never ideal for beginners, but it was still possible to strip it down to building blocks without immediately resorting "don't worry about this now."
I agree -- except during that first C lecture when we read console input and were told, 'don't worry about the ampersand, just remember to use it for scanf variables, we'll get to that later'.
Those can be understood from a high level point of view in a few minutes time. I've sat through multiple classes that have done just that.
Java is so deeply OOP that there's just no way for somebody to grasp and understand the concept of e.g., a public method when they haven't even learned about variables or basic conditional logic yet. A total novice just plain isn't equipped to handle some of the concepts that are immediately thrown at them with a simple hello world.
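To make the contrast concrete, here is a complete first program in Python, with the Java equivalent reproduced as a comment (these sketches use Python, so the Java is shown inline rather than as runnable code):

```python
# A complete, runnable first program in Python -- one line, no boilerplate:
print("Hello, world")

# The Java equivalent requires a class, an access modifier, a static
# method, and an array parameter before any real logic appears:
#
#   public class Hello {
#       public static void main(String[] args) {
#           System.out.println("Hello, world");
#       }
#   }
```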
That's how it's done at the University of Central Florida for the introduction to programming course. You learn Python for the first six weeks, and then learn C for the remainder. It's nice!
To get a Turing-complete programming language, all you need are functions and function application. Pointers and low-level memory management are just an implementation detail :)
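As a toy illustration of that claim, Church numerals build numbers and arithmetic in Python out of nothing but function definition and application:

```python
# A Church numeral represents n as "apply f n times to x".
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

# Convert back to a Python int only to observe the result.
to_int = lambda n: n(lambda k: k + 1)(0)

two = succ(succ(zero))
three = succ(two)
print(to_int(add(two)(three)))  # 5
```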
Hey there! I've just started learning to code to build web apps (I know Fortran 90, but it doesn't fit very well in the browser, haha), and it took me a while to decide. I looked through an endless number of hacker websites, but in the end -- sorry for betraying the Python lovers -- I'm learning Ruby on Rails!
Since I've just started learning, please tell me if it's better to learn Python (I'm not looking for an easy language, but for a powerful and versatile one :) -- something to develop some ideas I have.
While Python has a healthy web dev ecosystem, Ruby's feels much larger to me. That's almost certainly because Rails is so wildly popular. And Rails is an excellent, mature framework. So for web dev, I would consider Ruby the winner.
Python is the clear winner for scientific computing. That's not really due to anything inherent in the language; it's an ecosystem thing. If you were using Fortran before, you might be working in a problem domain where Python dominates.
Both are excellent for miscellaneous scripting work. E.g. reading in a CSV file and doing something with each row; batch-processing a bunch of images; renaming 1000 files according to some ruleset; gathering some system data and sending nightly status emails.
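A minimal sketch of the first of those tasks in Python (the CSV contents here are invented, and inlined via StringIO so the example is self-contained):

```python
import csv
import io

# Read a CSV and do something with each row -- here, summing a column.
# In practice you would pass open("scores.csv") instead of the StringIO.
data = io.StringIO("name,score\nalice,3\nbob,4\n")
total = sum(int(row["score"]) for row in csv.DictReader(data))
print(total)  # 7
```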
In terms of syntax and features, they're very, very similar. Python has meaningful whitespace, which you may like or dislike. (I think it's good for enforcing proper formatting, but you're free to disagree.) Ruby has multiple ways of expressing the mathematical concept of a function (methods, blocks, and procs), which has its pros and cons. Both have metaprogramming facilities, though I find Ruby's more pleasant. If I remember correctly, it was in large part the metaprogramming that made DHH pick Ruby for Rails.
I kind of see what you mean :) At my university they announced a Python course for developing Computational Fluid Dynamics programs (aerospace engineering), so my first impression of Python was less about web development and more about engineering/scientific programming, like Fortran -- although it's also a popular choice for web apps. After working with Fortran, I found Python a bit more attractive because of its indentation and syntax style. I also read that Paul Graham likes Python a lot more, haha.
But many other readings, and the fact that Twitter and Groupon were coded in Ruby, made the choice! I've read that Twitter is now moving to Scala, but since there are a lot more resources out there for learning Ruby/Python, I kicked it off the list.
I like Ruby; it sometimes seems a bit confusing when there are many ways to express the same thing, but its approach to natural language is helpful.
Thank you all for your quick responses! It's a pleasure to join the HN community!!
> While Python has a healthy web dev ecosystem, Ruby's feels much larger to me.
It's funny because I actually feel the opposite: Ruby is mostly a niche language for the web community, whereas Python is now mainstream in so many different areas (graphics, scientific, sysadmin, financial...).
I've yet to see a non-webdev tool embedding Ruby as a scripting language, while Python is now featured in office suites, spreadsheets and enterprise software. Anywhere simple and clear syntax is more valued than metaprogramming features, Python is likely to appear at some point.
I think we have almost the same opinion, actually. I said Ruby's "web dev ecosystem...feels much larger to me." I agree that Ruby's strongest niche is web dev, and I think it has an edge over Python there. Outside web dev, I don't see Ruby dominating any particular niche.
It's not a Rails vs Django feature comparison that gives Rails the edge. Convergent evolution means they're pretty much on par with each other all the time.
The difference is in the size of the ecosystem as a whole. The Ruby web dev world appears to have more (or more visible) participants. Which affects things like Stack Overflow, web-specific packages on Github, blog posts, etc.
It's not an order-of-magnitude difference, as far as I can tell. But still significant.
It was particularly those web specific packages on github that I was curious about. For example, is there a package for doing CAPTCHAs for rails that's better than all of the ones for django? Better admin apps? Better apps to help you integrate with bootstrap more easily? Or create REST interfaces?
These things would make a huge difference to productivity if one platform had more (and better) of these things, but I haven't noticed that to be the case.
Welcome to HN! Ruby is a good choice for webdev, but keep in mind that Python is also dominant in other domains of computing, in particular scripting and scientific computing. So, in my opinion, Python is more future-proof for a beginner than Ruby [activate language-flamewar-prevention-daemon], and if you later add a systems language (e.g. C) and a modern general-purpose language (e.g. Haskell or Scheme/Lisp) to your repertoire, you are in a very good position to tackle most kinds of software development problems.
Or Emacs Lisp, if you don't have a specific project in mind and/or aren't already heavily invested in a text editor. As a programming language in and of itself, it's not a great Lisp, but it's a great practical way to learn Lisp while customizing your text editor.
Either. They're not really that different from each other. I use 2, but I've used 3 with no issues. It will be an easy move as an end-user programmer when Python 3's time in the sun arrives (and it may never).
Ruby is great for webdev; RoR is insanely popular, so if that's your route you'll do fine. Ruby doesn't really do anything else well, however, while Python is pretty much a jack of all trades with a massive library ecosystem. It's used in scientific computing (NumPy), web dev (Django, Flask, etc.), game development (pyglet, pygame), and a ton more areas. I'd say learning Python is better in the long run.
I really like Python, but I think there's nothing wrong with continuing to learn Ruby and Rails. Be sure to learn both Ruby the language and Rails the framework; there's more to Ruby than Rails. You'll likely also want to at least get your toes wet with JavaScript so you can write code for the client side in the browser. While you're learning that, you'll be comparing it with what you've learned from Ruby and from your previous experience with Fortran.
After learning those picking up Python will be fairly easy. Again, you'll be comparing with what you know, finding differences and similarities.
Don't worry too much about finding the perfect language; it doesn't exist. Spend your time building nice things in languages and frameworks that are good at what they do. Be ready to try new things: you might like them, they might work well for a particular task, or they might have a great ecosystem for accomplishing specific things.
Ruby is perhaps more dominant than Python in web development. Not that Rails is necessarily better than Django, but there seem to be more Rails developers than Django ones. However, Ruby seems to be pretty much confined to web development and deployment tools; it does make great DSLs, thanks to its block syntax. In general, though, Python is more popular than Ruby.
One reason is that Python is used for system programming (e.g: the Linux apt system), desktop applications, big data (thanks to numpy) so you might find it a better investment for general purpose software development.
Another reason, which maybe explains the first one, is that Python looks more similar to C++/Java. Basically it's a simplification of those languages (the {} pair traded for a single colon, no type declarations, etc) plus some handy shortcuts.
Ruby is from a somewhat different lineage (Smalltalk) and it shows in many places (e.g. the do |variable| ... end blocks). It can be used to write more natural language looking programs and combining that with some metaprogramming sometimes both amazes and unsettles programmers that come close to the language for the first time.
I suggest that you learn a little of both of them, working on the problem you want to solve, and decide by gut feeling what you would like to work with. I went the Ruby way because I can't stand some of the syntax of Python, not that Ruby is perfect.
There's really not much difference between Ruby and Python. I prefer Python and would argue it's a little more versatile (at least in terms of library support - people do command line scripts and scientific computing in python whereas Ruby is quite webdev-focussed) but honestly, who cares. Pick one or the other, learn it, have fun with it, use it to solve problems. And then make your next language something radically different from either of them (Haskell?)
I would not worry too much about making the "wrong" choice between Ruby and Python. I found it fairly easy to learn Python after coding Ruby for a year. Have fun!
Python/Django and Ruby/Rails are both clean frameworks and have fantastic communities. When I did the "Hello world" on both I was drawn to Python/Django. When it comes to moving away from the framework I tend to like Python more than Ruby. But it's just personal preference.
In addition, I feel like Python tends to have a more consistent, if somewhat more verbose, API in libraries and frameworks such as web development frameworks. Ruby's syntax and language are more flexible than Python's, which is both a strength and a problem: it makes code harder to debug but faster to develop and prototype in.
When you say you've just started learning to code for building web apps, do you mean you
(a) already know HTML/CSS/JS and now want to learn a server side language, or that
(b) you are going from no experience straight to server side programming?
Regardless of whether you learn Python or Ruby, there is practically no way around having to learn HTML, CSS, and JavaScript for building web applications. You will be reading/writing HTML/CSS/JS in your Python or Ruby projects.
So if you have no prior HTML/CSS/JS experience, I would prioritize those skills first. Otherwise you will learn Ruby or Python then be disappointed to discover that browser technology is a giant hole in your skillset, impeding you from doing the useful things that got you to start learning in the first place.
You can do great things with just HTML/CSS/JS, especially since the browser environment supporting dynamic client-side applications in JavaScript has become so powerful.
Then when you're finally doing something that genuinely requires that logic and/or data be hosted on a server, managed by server-side tools, your knowledge of browser technologies will provide a solid foundation for stepping into that ecosystem.
If you already know HTML/CSS/JS, Ruby vs Python comes down largely to personal taste and whimsy. There are indeed differences, but the differences don't matter till much later, if ever. The greater danger is not that you will learn a language that isn't future-proofed, but that you could waste far more time picking what to learn than you ever might have wasted by picking the wrong language, especially since there isn't truly a wrong choice between two great languages. The debate between them is largely a religious war.
Since you seem focused on web apps, and have already started learning Rails, stick w/ROR. You're on a great path and your current choices won't hold you back. Good luck.
PS If you're interested in learning HTML/CSS/JS, I am thinking of teaching these subjects to beginners (no charge of course). PM me with a contact if interested.
Hi, I'm afraid it's (b). I have no web development experience. It all started last year when I had an app idea. I started developing it with some computer science students, but I realised that if I'm the creator of the idea and I can't build it, I can only be the commercial guy in charge of making money from the programs those students make! And that wasn't the purpose! So now I've gotten into web development from zero (coming from Fortran 90) to be able to create a prototype of the product exactly as I imagine it.
I've started with the ruby course on codecademy, it seems it's just like a fun introduction to the ruby syntax. When I finish the codecademy course I'm thinking about going into "onemonth.com" for learning rails, and codeschool after that..
Thank you! I'll get started with those languages as well :) I'm focusing on web apps, but I'll have to use JS to take the idea to a phone app (using Titanium from Appcelerator).
By the way! Thanks for the invite to the course! I'll contact you as soon as I find out how to send a PM haha
Most of the time (unless you choose something horrid like PHP) the language doesn't matter that much. Pick one (Python and Ruby are comfortable for learners) and stick to it while you learn the basics. Research on data structures, algorithms and the like. Have fun.
By the way, since it happened to me: if you find Rails to be too complex for what you need, there are more lightweight alternatives. Don't get me wrong, Rails will get you up and running fast; but when I was a beginner, it left me with the sensation that I didn't really know what I was doing. Using more lightweight frameworks, like Cuba or Sinatra, can help with that.
It is perfectly fine to learn Ruby; Rails is one of the nicest frameworks to get started with. I do prefer Python and Django, but Rails is pretty nice too.
I work at Monash University in Melbourne, Australia. Last year we moved our foundational course in data structures and algorithms from Java to Python, with great success. This is a course that used to be taught in C, then moved to Java. I tutored it as a Java course, and was in charge of adapting the tutorial (classroom exercises) and laboratory (programming assignments) materials when we started using Python to teach the unit.
The main content of the course hasn't changed. Complexity, Big O notation, sorting and searching are introduced in the first two or three weeks. Students work through the usual implementations of arrays, linked lists, stacks and queues (both array and linked-list based), trees, heaps, tries, etc. All these data structures are explained first as chunks of data with external functions for the operations; then a second time as objects with methods for operations, once object orientation is introduced in week 5 or 6. Iterators are introduced as an abstract interface to list-, array- and tree- walking. Where relevant, we explain every operation twice, showing the iterative and recursive approaches. The course ends with two weeks of lower level work, where the students hand-compile pseudo-Python to working MIPS assembly.
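The "explain every structure twice" approach can be sketched with the simplest case, a stack. This is an illustrative sketch, not the actual Monash course material:

```python
# First pass: a stack as plain data (a list) with external functions.
def make_stack():
    return []

def push(stack, item):
    stack.append(item)

def pop(stack):
    return stack.pop()

# Second pass: the same structure as an object with methods.
class Stack:
    def __init__(self):
        self._items = []

    def push(self, item):
        self._items.append(item)

    def pop(self):
        return self._items.pop()

s = make_stack()
push(s, 1)
push(s, 2)
print(pop(s))  # 2

t = Stack()
t.push(1)
t.push(2)
print(t.pop())  # 2
```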
The move to Python gained us, among other things, the freedom not to have to explain Java's generics and access modifiers. Many of our students have never programmed with a text editor before our unit, and dropping them into the big boilerplate world of Java micro-management was not good for them or for the course. We used to spend too much time on syntax and other details ("native types are like this, reference types are like that") that we should have been spending on data structures and algorithms.
With Python, we even have time to explain idiomatic Python iterators and generators, and to discuss Python scoping rules (which come up again when we work on local variables in their hand-compilation tasks). So we still teach some language specifics, but we don't feel shackled to teaching them. Teaching them idiomatic Python is also a good deed after the first three or four weeks spent in a not very idiomatic subset of Python, where we maintain explicit indices, update them by incrementing or decrementing them manually, use while loops with the condition checking the index value, etc., as required by the course content.
I don't have data on the results, because I was only the adjunct adapting the tutorial materials. The only difference I noticed was that students had less support from their IDE, and that some common errors (attempting to access attributes on None is the top one) that used to be picked up statically are now runtime errors, as expected. There didn't seem to be any big problems from Python's lack of static typing. Type errors are still caught, again by the runtime instead of the IDE, and, if anything, this is a better exercise for students, since fixing them requires a better understanding of what the program is doing.
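For readers unfamiliar with the error in question, a small sketch of how the None mistake surfaces at runtime in Python, where a Java IDE might have flagged the equivalent null dereference statically:

```python
# The most common student error mentioned above: using a value that is
# None as if it were an object with attributes.
node = None
try:
    node.value
except AttributeError as e:
    print(e)  # 'NoneType' object has no attribute 'value'
```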
If anyone is interested, I can ask the lecturer about the outcomes, but just from the fact that neither she nor anybody else teaching the unit has raised any issues the second and third times the unit was taught I can tell it's all going smoothly.
I do not know any Python (yet) - and my comment is probably off-topic. I spent a fall semester at Northeastern U, Boston in '97 and took a graduate course in Data Structures which was taught in C. I loved this course and finished top of the class (it helped that I was already familiar with C from an undergrad engineering course). I fell in love with C. Implementing deques, linked lists, trees etc. with C was a pleasure - I learnt a lot about memory management responsibilities, smart error handling and began to understand how programming can be beautiful. Many years later when I started learning and working with Java, I felt that if this course had been taught with Java, I would not have gained as much (I could be wrong). The simple steps of typing your code in the standard editor supported by the environment (we were using a flavour of Unix in school), saving your work, compiling on the CLI and seeing the results - we just had to focus on our algorithm and nothing else. Your sentence "Many of our students have never programmed with a text editor before our unit, and dropping them into the big boilerplate world of Java micro-management was not good for them or for the course" stands out for me in this context. Since I know zero Python, I wonder how it compares. This reminds me of Joel Spolsky's essay : http://www.joelonsoftware.com/articles/ThePerilsofJavaSchool...
Our unit is a first year unit, and runs for twelve weeks.
This means that we have ten weeks to take people who may have never programmed by typing text into an editor and teach them enough CS and programming that they can implement their own hashmap in Python and reason about its big O performance in one week, start programming in MIPS assembly the next week, and implement function calls in assembly and reason about stack frames the following week.
Anything we can do to make their learning curve less steep is good. You say that your Data Structures course was graduate level, and that you were already familiar with C from a lower course. These are two big differences between you and our first year intake.
Now, like Spolsky, I too have a bit of the Yorkshireman in me, and I wish our students would be forced to work with simple text editors and the command line, instead of having the IDE to do some of their lifting for them. I wish they were taught more maths too. And spelling, now that we are at it. And C, why not.
But when you say that in your course you "just had to focus on our algorithm and nothing else", it echoes exactly our intention when we picked Python for this unit. We wanted the language that would allow our students to focus on the fundamentals, and spend the least time and effort on the incidentals. We were already running the unit in Java, which was an inherited decision. I think Python was the best of our available choices.
I've commented before in a variety of places that if I were to design a first-year curriculum that would have suited me best, it would start with python in much the way that is discussed in this thread, focusing on the concepts and algorithms and general concepts of coding.
But the next course would be a hardcore introduction to C, embracing the language peculiarities instead of shunning them and trying to stick to concepts. I think at least once in an undergrad's career, preferably sooner than later, they should go through the process of learning the nuts and bolts of C.
I've probably been doing my own thing for too long to know for sure whether C is still the best choice, all I know is that NOT having the sort of class I describe, early in my college career, made things far more difficult later on.
Indeed, I'm surprised by how often people make arguments that seem to assume the only options are C from the start, or no C at all.
When you're teaching systems programming, all the things that have to be done by hand in C are precisely the things the students are supposed to be learning. But for almost everything else, they're pointless details that obscure what you're actually trying to teach.
Teaching multiple languages also has the advantage of showing people how they're mostly not that different.
> When you're teaching systems programming, all the things that have to be done by hand in C are precisely the things the students are supposed to be learning. But for almost everything else, they're pointless details that obscure what you're actually trying to teach.
I had a course kind of like this. My intro CS class was taught in Java, and the following summer or semester I took a 1-credit course called something like "C (and C++) for Java Programmers".
Why does your data structure course cover assembly? I am just curious because in my school, they teach assembly bottom up, starting from digital logic and transistors and eventually get to C.
1. Historically, before it was taught in Java, this unit was taught in C, and it would have been a natural transition from building data structures via explicit pointer manipulation to explaining stack frames as a specialised data structure for handling scoped variables in function calls.
2. Despite the move to a higher level language (and, from assembly and machine code, even C is a higher level language), it's good for the students to get some exposure to the machine model, and what is going on deeper in the stack of turtles.
3. The lecturer who runs the course also has a bit of the Yorkshirewoman in her. So students learn a tiny bit of MIPS like they should eat their spinach and shower with cold water, uphill both ways, in the snow, or something.
Most of that article is rendered moot with this section:
"Now, I freely admit that programming with pointers is not needed in 90% of the code written today, and in fact, it's downright dangerous in production code. OK. That's fine. And functional programming is just not used much in practice. Agreed.
But it's still important for some of the most exciting programming jobs."
Pretty much saying that everything he just spoke to us is purely for the "exciting programming jobs". And let's face it, most of us do not have that unless we go out of our way to pigeon hole business problems into being fun and exciting.
Yes it's funny, these places are teaching Python for the same reason they taught Java: it's "easy" and will get you a job. Yet we slate Java-schools, and cheer Python because it's trendy.
The truth is, universities shouldn't care what industry does, because "real-world" computing is more about fashion than it is about any sort of science. And if you do it right people can learn any language they need (case in point, I learnt FORTRAN at college, now I do Python for a living and OCaml as a hobbyist).
Yes it's funny, these places are teaching Python for the same reason they taught Java
That's not what I'm seeing at all. What I'm seeing is discussion of specific traits about Python that make it especially suited for a beginner language. And I'm seeing these arguments made over and over and over again, convincingly, in a way that was NEVER done for Java.
If these Python classes were given to an industrial engineer or a chemistry major who just needs enough programming knowledge to crunch numbers/create scripts I would definitely get the approach and agree with it.
However, if you want your Computer Science majors to be able to create the next runtime like the JVMs, .NETs etc. or be able to do some device programming they need to learn C/C++.
A decade after going through a top-tier CS undergraduate program, what saddens me is that a lot of university students today think that Computer Science is cobbling together GitHub-sourced code with their choice of scripting language. Yes, the startup market is hot and sometimes that's all you need, but CS is much more than that. CS is about understanding the underlying models and concepts rather than a language. If you are totally missing how memory management is done, how data structures are laid out in memory, and how typing works, you are missing a big chunk of the code base out there in the world today.
I think the new wave of CS students, who are expected to solve larger problems, need to understand how billions of devices/apps are running in today's technology so they can create something better in the future.
"However, if you want your Computer Science majors to be able to create the next runtime like the JVMs, .NETs etc. or be able to do some device programming they need to learn C/C++."
98% of all developers don't actually need to write the next JVM, or .NET or something.
"think that Computer Science is cobbling github sourced code with their choice of scripting language."
This is the beauty of programming, and I'm sorry you can't appreciate that. Not everyone is willing to stab themselves in the eyeball trying to re-implement the .NET/JVM wheel or other low-level library.
It's the same reason the majority of us don't write assembler anymore. We've moved on and abstracted away from that arcane knowledge. Does that mean that arcane knowledge is irrelevant? For most of us, yes. For others, no.
"Yes, the startup market is hot and sometimes that's all you need but CS is much more than that. CS is about understanding the underlying models and concepts rather than a language."
Meanwhile, in the real world, CS is not really necessary for most business problems. No matter how much you wish to romanticize it as a discipline.
CS is about understanding the underlying models and concepts rather than a language.
Programming Language implementations are exploratory tools critical to understanding those underlying models and concepts. Indeed, in my experience the "CS is about understanding the concepts" logic is more often than not used as an excuse NOT to teach C.
Having recruited 3 dozen interns/jr.programmers from schools in the tri-state area and overseas in the past 5-6 years, you would be amazed how many students don't get these basics.
Those schools actually DO try to teach students about these concepts. How do I know? I personally have spoken to the deans of CS departments. Now imagine these same students, who no longer even hear of concepts such as garbage collection, pointers, etc. because it just works in their chosen language.
I believe the era of a single language for your career is long gone. Most people in their 20s and even 30s today will go through at least 4-5 different languages throughout their career.
Personally, I learned and enjoyed C++ 15 years ago but I do not find it practical for anything I do today. However, the same "impractical" knowledge allows me to get familiarized with a new language in a few weeks and start producing working software.
I don't quite care what language students learn as long as the concepts are understood. However, if the language they first start with hides most of these fundamental concepts, they no longer have any incentive to even learn how anything works.
My first language was Basic (in 1982). The second was assembly.
This is not terribly different than what I suggest. I'm saying in the modern world, Python fills the role of Basic and C fills the role of assembly.
The abstract idea is:
First Language: Clean and simple, to introduce the very idea of coding algorithms for a computer.
Step two: Low-level language requiring a lot of attention to detail. Should be useful for exploring low-level details about computer architecture as well as writing real programs.
Notably: Java doesn't fit very well into either category. There's a lot of overhead for pure beginners, and too many layers between the programmer and the hardware.
"we have ten weeks to take people who may have never programmed by typing text into an editor and teach them enough CS and programming that they can implement their own hashmap in Python"
Remember that if this is a 4-year curriculum, they may eventually be expected to write code for a hashmap in C or some other low-level language as part of a data structures or algorithms class. And often, a class like that will spend time on analysis and algorithm design tradeoffs. The best way to understand why accessing a hashmap can be done in O(1) time is to actually code one for yourself.
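A minimal sketch of such an exercise (my own, with assumed names, not any particular course's assignment): a separate-chaining hash map where, with a reasonable hash function and a bounded load factor, a lookup touches one short bucket on average, which is where the expected O(1) comes from:

```python
class HashMap:
    """Toy separate-chaining hash map, for reasoning about O(1) lookup."""

    def __init__(self, num_buckets=8):
        self._buckets = [[] for _ in range(num_buckets)]

    def _bucket(self, key):
        # hash() spreads keys across buckets; as long as the number of
        # entries per bucket stays bounded, get/put do expected O(1) work.
        return self._buckets[hash(key) % len(self._buckets)]

    def put(self, key, value):
        bucket = self._bucket(key)
        for i, (k, _) in enumerate(bucket):
            if k == key:
                bucket[i] = (key, value)  # overwrite existing key
                return
        bucket.append((key, value))

    def get(self, key):
        for k, v in self._bucket(key):
            if k == key:
                return v
        raise KeyError(key)
```

A real implementation would also resize when the load factor grows, which is exactly the kind of tradeoff (amortized cost of rehashing vs. bucket length) a data structures class wants students to reason about.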
> they may eventually be expected to write code for a hashmap in C or some other low-level language as part of a data structures or algorithms class (...)
> This may simply be a preview.
No, this is it as far as the core units go.
This is the first course on data structures and algorithms.
There is a second year course that deals with further data structures and algorithms: we teach amortized performance, proofs by induction, some randomized data structures (skip lists and maybe another), different tree-based structures (I remember red-black trees and Patricia trees), some graph search algorithms and a bit more that I forget now. And currently we are doing it in Python for the first unit, and in any language that the student and their tutor can agree on for the second unit.
Remember: these are just class exercises, not real implementations. Of course nobody is going to write actual functioning data structures in Python[1] and expect them to be usable for general purpose programming[2]. That would be daft!
I have a different experience tutoring (exercises and labs) in an engineering school in France for programming / CS beginners.
We moved from Java to Python this year and I miss a few things. Due to the absence of explicit typing, the students don't bother to understand types anymore, and it leads them to a lot of mistakes and misunderstandings: confusion between simple types, lists, dictionaries, instances of a user-created class, etc. Besides, I think the verbose and rigid syntax of Java forced them to understand what they wrote, and for a first language that was a good thing.
Overall I found that since it is easier to write code in Python, they rush to writing anything without understanding the algorithmic problem first. Thus they are less able to decompose a problem into sub-problems and write atomic functions to solve each of them.
Note that I think teaching Python as a first language is a viable option, but in our case the course needs to be intensively rewritten and the labs adapted. For instance, my latest point about algorithms is not a problem with the language: it could be resolved by having one part of the lab be pure algorithmics and then an implementation part.
I graduated from Monash Clayton a few years ago when the Java train was in full effect, and the focus on Java was a big source of frustration for me. I distinctly remember an algorithms & data structures assignment about Markov chains which had to be done in Java. I spent a good 40 minutes writing what was effectively boilerplate to load the provided files into a meaningful representation and then just snapped and did the entire assignment in Python in maybe 15 minutes. It's crazy how much Java gets in your way for small projects.
Even though it was made clear that Python submissions won't be accepted, my tutor was very understanding and gave me full marks after I stepped him through the code. If that tutor was you, I owe you.
Regarding "small projects," how useful are they in motivating learners to continue learning the language? Having robust tooling introduces overhead initially (and, to be honest, seemingly eternally), but also makes the impractical possible.
The goals of a first year CS class probably don't include motivating learners to continue learning any particular language; they may include motivating the learners to continue learning CS and/or programming in general.
My first year CS data structures class was in Scheme, as used to be popular. I enjoyed the class a lot, learned a lot, and also found it very challenging. I think Scheme was perfect for it. But I had no particular desire to continue with Scheme afterwards, which is fine. (Not because I have anything against Scheme, but on to other things.)
No, it wasn't me, though I did let anybody in my FIT2004 tutorials (the second data structures and algorithms course, for those who didn't go to Monash) do their work in Python. I only had one taker.
Your first sentence made me a bit sad but the rest of your post changed my mind. All good points, and it looks like the move to Python was the right thing to do.
It's important to remember we're talking about an introduction, here. Creating the spark of interest in students is the most important point in this class. There's always time after this to move them toward more expressive languages.
That's a weird sentiment to hold, that less expressive languages should be used to spark interest. If I were learning a first programming language again, I would want to learn a language that let me get going as quickly as possible. It takes a lot more to build something in C than in Python and Python can much more quickly prove to a person learning their first language that programming is worthwhile.
> That's a weird sentiment to hold, that less expressive languages should be used to spark interest.
Maybe you felt you had to jump in and defend high level languages, and maybe slash the other guy's tires on the way out, but this reads as trollish. Plenty of people working in C find it plenty expressive. Is it "more" or "less" expressive to choose to express different things? I don't think so. This comes off as uninformed C bashing, sadly very common and accepted around here.
The rest of your comment, that it is good for learning or as a first language, makes more sense to me.
At the University of Queensland, Python is used for mech engineering courses. One of the other considerations that I think helped give Python the edge was its simplicity, in that it can be 'programming for non-programmers' and helps a lot of newcomers meet the simple scripting requirements they have without needing to understand more complex programming concepts.
Secondly, it makes for an easy transition from Python to Matlab, and Matlab is used a fair bit throughout UQ's Mechanical and Mechatronics departments. They both share very similar syntax (the major issue for most mech engineers, who deal with little more than loops, is that Python is 0-indexed whereas Matlab is 1-indexed).
The above makes Python a great introductory language for non-programmer beginners like mech engineers. It is ridiculously simple and human-readable, and when you're outside an IT/EE department then you don't end up spending excess time explaining the quirks of a language to those who will never care for it. On the other hand, it still gives more than enough rope for those that do end up wanting to run with it, and they can end up doing some pretty fancy stuff with it.
Does Monash align there in that it's used in the non-IT related technical degrees?
> All these data structures are explained first as chunks of data with external functions for the operations; then a second time as objects with methods for operations, once object orientation is introduced in week 5 or 6.
Curious. Why is object orientation introduced in a data structures and algorithms course?
The most surprising thing I read was "Many of our students have never programmed with a text editor before" - what kind of CS students haven't tried programming before with a text editor?
I find that incredible. Long before I got to university I had been programming by myself with whatever editor I could find: edt, vi, etc.
Yes, many of us here at HN were programming before going to university, or instead of going to university, but at Monash we are dealing with our first year intake, not with the typical HN commenter.
I hedge that many of our students have never programmed "with a text editor" before because:
1. Many of them have used spreadsheets in high school to build quite complex formulas, and spreadsheet formulas are Turing complete, so they have programmed in some form or another.
2. I didn't want to discuss this here in order not to muddle the debate, but our introduction to programming course is now run on Scribble[1], a dialect of the Scratch family. So many of our students in this data structures and algorithms unit have previously done just 12 weeks of programming instruction, and understand many of the concepts, but our unit is the first time that they have to deal with (for instance) syntax errors in text files.
Many people here in HN were teaching themselves programming by age 6, but to repeat myself: you are not our first year intake. We teach the students we get. These are not bad students: they are often hardworking and smart, and do well and learn a lot in our unit.
They may be ignorant of programming, but that's also ok, because that's what we're there for. And, to go back to the OP's topic, using Python as a teaching language lowers the barriers for access to the concepts that we really want to teach: complexity and big O, searching and sorting, etc.
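Those target concepts really do translate into very little Python. For example (a standard textbook contrast, not Monash's actual materials), O(n) linear search versus O(log n) binary search fits in a few lines each:

```python
def linear_search(items, target):
    """O(n): may have to inspect every element."""
    for i, x in enumerate(items):
        if x == target:
            return i
    return -1

def binary_search(sorted_items, target):
    """O(log n): halves the remaining search range on each step.

    Requires the input to be sorted -- a useful teaching point in itself.
    """
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        if sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1
```

With so little syntax in the way, class time can go to the complexity argument rather than the language.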
It might be easier to use Groovy: for noobs it's basically the same as Python and runs on the JVM, and you can introduce Java gradually without changing environments.
I love Python - used it for years, but I am dumbfounded that people enter a CS degree without having ever first tried to write a program. That's like taking English lit without having read a book before.
> it might be easier to use groovy, for noobs it's basically the same as python and runs on the jvm, also you can introduce java gradually
Groovy (the JVM version) is marketed towards programmers who already know Java, and its main use cases (e.g. manipulating/testing Java objects, gluing together Java apps, and scripting in Grails) make the syntax terser but the programmer still needs to know what's happening underneath at the Java language level or they'll quickly hit a snag they can't solve.
The only use case where they don't need to already know Java is Gradle, where the script for a whole project is typically 20 to 100 lines long, usually copied and pasted from another project, and only using a tiny subset of the Groovy grammar, i.e. the DSL which is the least Java-like, so programmers won't be introduced to Java by using Gradle.
I even tried writing a Groovy(JVM) tutorial a while back for those new to both Java and Groovy [1] but found I had to introduce most Java and Groovy concepts at the same time. Groovy only gives some syntactic terseness via collection syntax, closures (though now obsolete with Java 8), and getter/setter syntax, nothing more.
I may have made this term up myself; I am on the go and can't check it. As I use the expression "lexical typing", it means that the type of a variable can be ascertained from examining the text of the program, without needing to run it. Python is strongly typed, but dynamically typed: you can only know for sure the type of a value by checking the value at runtime.
This is by analogy with "lexical scoping", where you know where a certain name will be bound just by looking at the program's text, while with dynamic scoping, the stack is what determines where a variable is bound, and you can only know about it at runtime.
I guess I should have said "static typing". I probably made the mistake because lexical scoping is also called static scoping, and I was writing in a hurry.
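A small example of the distinction being drawn (my own, not from the thread): in Python the type of a name can depend on which path the program takes, so it is only knowable for certain at runtime, even though every value is strongly typed:

```python
import random

def pick():
    # The program text alone doesn't pin down the return type:
    # it depends on which branch executes.
    if random.random() < 0.5:
        return 42           # int
    return "forty-two"      # str

value = pick()
# Strong typing: the value always has one definite type at runtime,
# so something like `value + 1` when value is a str raises TypeError
# when that line runs, rather than being rejected before the run.
print(type(value))
```

A static checker such as pylint or mypy can flag some of these cases by inferring types from the text, which is what the "caught by the IDE" comparison above is about.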
I would assume from context, it is the ability to declare a variable in a lexical scope to have a specific type in that scope. With static typing you would then have the ability to catch use of that variable with the wrong type and make that a compile time error.
That said, I believe that pylint could statically catch some of the type errors that are being referred to.
Good news! I would like to add that another advantage of Python is its usage for scientific computing (Math, Physics, Biology, ...), so students will hopefully choose Python to help solve problems for their math classes, e.g. using numpy and co. They can then get easily into IPython, which is IMHO a very important FOSS project.
At my university I feel that proprietary scientific programs and CAS like Matlab, Maple and Mathematica are still the preferred problem solvers for students in the hard sciences (except CS).
In the engineering fields (I am an ME), I am seeing steady growth in the use of Python and declining use of Matlab. The trend does appear to be more prevalent among younger engineers.
I've taught both, and IMHO Ruby is less suited for the job of an introductory language because a block (i.e. an anonymous function) is a hard concept for a beginner to grasp and a core part of idiomatic Ruby. Python's syntax simply has "less magic" and is more regular. Ruby's syntax seems to be suited for either folks who don't really want to know what's going on (just give me a DSL and leave me alone) or students who have one language under their belt and are looking to branch out.
That said, Ruby includes a lot of interesting concepts, and while I don't believe it's an ideal first language, those concepts certainly belong in a CS program.
The "Programming Language" course at coursera [1], which is directly adapted from a course at U. Washington by Dan Grossman [2] features SML, Racket and Ruby as three languages exhibiting different properties (FP, OO and static typing).
A block is not an anonymous function. Try using `return`, `next`, etc. in an anonymous function. Ruby has anonymous functions; they are called lambdas.
I'd say that blocks might be easier to grasp for a true beginner that hasn't been "contaminated" by the notion of anonymous functions as fundamental objects.
You are, of course, technically correct. However, I'd argue that at the level of a CS1 course, it's all just semantics. True beginners don't even understand the concept of a function, let alone an anonymous function. For a student that's struggling to understand what's happening (not just type and have faith), blocks turn into one of those 'down the rabbit hole' pieces of syntax that requires a bunch of hand waving and earnest assertions of eventual understanding.
If you've got an explanation that would work in week 2 or 3 of a CS1 course, I'd love to hear it. I never found one better than "It's shorthand for a function."
That's why I think the best introductory course is the usual C/C++ classes that build on a foundation of memory addresses, the stack, function calls, etc. If you spend just a few days talking about how an actual program runs, blocks are very easy to place on that foundation (haha, see what I did there?).
You can say that about lots of languages. Scheme is good for beginners for reasons X and Y, C or C++ is good for reason Z, and Haskell is good for reason W.
But at the end of the day, if you teach someone Python, they can turn around and write Python code to solve problems in their science classes with minimal friction, or they can go on to be software engineers, or write web scrapers, video games, whatever. Python's arguably not the best at any of those things, but it's good enough at all of them.
It's probably some combination of happenstance and various numerical libraries making Python relatively popular in scientific computing before Ruby became well known in the U.S.
Maybe many of the people who take these intro course are not CS professionals. With the numpy and scipy modules there are a number of uses for Python for science/other majors that don't involve building web pages.
Mainly the fact that Python was designed to teach programming and its community has pushed that aspect of it; Ruby's community, IMHO, tends to be more for cool, shiny things, and faster changes, which makes it less suitable as a teaching language.
In college our intro classes were in C++. While it's a tough language, I do wonder if it was advantageous to forcefully learn about 1) value vs. reference types via pointers and 2) how allocation maps into memory space. Not having that level of depth kinda worries me.
Sure, but a systems-oriented course is the right place to pound that into people. Not CS1.
CS1 should be an introduction to the nature of the discipline. I can think of many things more important and more representative of the nature of computing than the semantics of C.
For 90+% of the problems out there, a business programmer doesn't need to know anything about memory. Short of a few rules, I don't see why they can't go without learning about memory space, allocation, reference types, etc.
They will make mistakes, sure. But they will get the work done, and that's pretty much what most businesses need. Eventually, as problems creep in, the messes created by lack of knowledge get resolved/cleaned up.
Programming is incredibly easy and powerful, when made simple. The problem is that learning programming in C/C++ is anything but simple! And the only reason we'd think otherwise is because we already went through it, somehow.
I've seen non-technical minded people learn programming with C/C++ as their first language. They get frustrated because they don't know why certain things work magically while similar things don't. So they end up learning to memorize what works and what doesn't, and copy-pasting what the book tells them. Eventually kludging together a working solution to a problem in the book. With no hint of understanding why.
Wrong. Customers are now used to Google speed. They want a very speedy system and fast analytical decision-making. Batching is old news; the new system must be 800% faster because the customer paid for high-end core processors.
I can see this rationale as an argument if it were an associate's or 1-year degree, but for four years of study I expect real depth about what's happening all the way down: memory, etc.
For 4 years of study you don't need to learn C as your first language. You also get the benefit of being able to focus on the low-level stuff when you finally go over C, instead of having the language do double duty of teaching both low- and high-level stuff at the same time.
Then I'd say: what you expect is required and what is actually required to complete the task to a client's expectations could possibly be two vastly different things.
Besides, most of what is being asked of four-year-degreed individuals is not really worthy of it. Which is why I keep mentioning stuff to the effect of "to a client's expectation" to most of my comments on this.
My college programming intro class is C++, which IMO is a total nightmare.
It is not that I don't agree with your point; actually, I think it makes perfect sense. Eventually, anyone who considers him/herself a serious CS student has to get his/her hands dirty with those relatively low-level languages in order to have a thorough understanding of what a computer really is and how it works in reality.
Ironically, it is this very same reason that makes C++ as an entry point into programming a horrible idea. Because in order to understand it, IMO, a solid understanding of computer architecture is a MUST. Things like how memory is allocated, how the call stack works, and why I/O interfaces look the way they do are unrealistically hard to take in a single course for those newcomers with definitely zero programming knowledge, who would probably scratch their heads all night wondering why the following code snippet doesn't work (which wasn't valid back then in my freshman year; I don't know whether it is valid now):
And as it is an entry-level course, usually a lot of details are skipped, which makes it even more confusing. Speaking from my personal experience, it was really frustrating: I once wanted to quit programming for good.
Luckily enough, I got to know Python at that time. In a relatively short period of time, I could come up with a simple web crawler to scrape and download movie torrent files from the web. This was important to me because, for the first time, I realized I could do something both INTERESTING and USEFUL with a programming language, which in turn gave me the motivation to go back and look at those lessons I had run away from.
C++ makes it harder to work with functions/lambdas, and the memory management makes some easy things much harder (for example, list and string processing).
This is about an introductory language. No one is saying CS students shouldn't eventually learn those concepts, but why would you want to bog down an introductory course (with non-CS students, many of whom probably haven't written Hello World before) with that material?
Valgrind is a band-aid that is sadly only available to GNU/Linux developers with lots of RAM to spare. It's not fully available, if at all, when other systems need to be targeted.
I fully agree about static analysis.
My point is that at least with C++ you get the help of stronger type checking, RAII, and automatic memory management via the available collections.
But of course, the C compatibility is both an advantage in terms of adoption and a liability in terms of being able to write secure code.
It's sad that students never get to learn something like http://ocw.mit.edu/courses/electrical-engineering-and-comput... They could easily learn any other language afterward; learning just Java or Python prepares them only for a corporate job.
A few points that will likely get lost in the noise here:
1. It is weird to take the top graduate schools and ask about what they teach to undergrads. I'd rather see this done with the best national universities list - http://colleges.usnews.rankingsandreviews.com/best-colleges/... - as the basis for this study since that's where the top undergraduate students will end up.
2. I'd rather see trends than raw numbers - the fact that python has grown quickly over the last decade in terms of adoption is more interesting than it being "the top," to me.
3. It is sad that Java still has such a stronghold both in terms of being the intro language and being on the College Board curriculum. It is not the right language for beginners, or, really, anyone.
4. While I didn't take "CS1" in college (thanks AP exam) - I did take CS2 or whatever, and it was taught in scheme with SICP. Scheme was the great equalizer in that it was unlike any language anyone in the class had ever seen and that syntax no longer mattered - only concepts did. I hope it continues to keep its place in second-level courses.
5. It is unfortunate that garbage-collected languages are so heavily represented on this list. Memory management may be unpleasant, but the fact that we are programming on limited-resource devices should be acknowledged early and often. Forcing students to experience this first-hand serves two purposes: it teaches them abstraction (isn't this garbage collector great!) while maintaining a respect for the fact that resources (CPU, memory, power, network, disk, etc.) aren't free.
It would also be useful to include liberal arts colleges. I don't think it's representative of top institutions to consider only research universities. For example, Harvey Mudd has a top-notch CS program but wouldn't be counted under that metric, as it doesn't have any grad programs.
I'd be interested to figure out the whole intro sequence. For example, my college begins with Java for the first class, then the next two classes can be taken concurrently, which respectively consist of functional programming/basic CS theory in SML, and Data Structures/Advanced Programming (memory management!) in Java and C++. I'm sure a number of other colleges follow similar procedures.
Also - I don't think that "CS0" classes should be included on this list as most are not taught with the intention of further CS education. For example, we have an "intro" CS course that is taught within the context of cognitive science (in Python).
One of the classes offered in my major (biochemistry) is "Python Programming for Biology" which is generally the first exposure that most of my peers get to programming, and I tell anyone who will listen how important that class is. Especially in the sciences, Python has a plethora of scientific libraries and great resources to do just about anything that a non-career coder would need. Except in computational chemistry everyone's stuck with ancient libraries, and I recently met an anarcho-capitalist grad student doing all his research in Fortran. Cool guy.
I got the impression that nearly everybody writes computational chemistry libraries in Fortran. The libraries are no older than in any other field; some are quite modern and have wrappers for several languages, but they are always in Fortran.
OT: Since I don't have permissions for an Ask-type post I'll do this here.
The lines in this thread break extremely 'late' (http://imgur.com/IzmwWBo), causing me a lot of painful sideways scrolling - my screen only has a 1024x600 resolution, but all other threads I have open look normal (http://imgur.com/Q45ZDQ6).
What causes this and how can I get rid of it (FF30 with NoScript and Disconnect)?
CMU transitioned from Java to Python when I was an upperclassman there. I'm not sure what other similar places do, but CMU splits introductory courses into 2 parts - 110, 112. 110 is the basic programming with gradual intro to OOP and stuff. 112 is where the datastructures come in - it's a programming primer to implementing datastructures and algorithms, so to speak.
Here is the issue I feel goes unnoticed, since most people place out of 110 and never take it: CMU moved both courses to Python. While this helped 110 by making it more accessible, it also meant people leave 110 with a far weaker grasp of OOP principles than before. On the other hand, Python works great in 112 because you spend more time building data structures rather than running into the wall of Java's OOP layers over and over. It's also cleaner, so how a data structure works becomes more transparent. But since the curriculum assumes OOP mastery from that point on, with no software engineering course required, I always wonder whether removing Java from 110 (a course for people with no coding background) was a mistake.
They put the transition to a vote and Python won very clearly. But I always wished they had kept 110 in Java (or shifted to Scheme, or anything that follows a stricter regime than Python, so one gets comfortable with one way of abstracting problems, be it OOP or functional) and taught 112 with Python. It probably isn't worth discussing given how few CS majors take the course, but it always makes me wonder why we needed the same language for both courses.
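The transparency point is easy to see in code. This is not course material from 15-112, just an illustrative sketch: a singly linked list in Python is short enough that the structure itself, not the language, is what the student has to think about.

```python
class Node:
    """One cell of a singly linked list: a value plus a reference onward."""
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def to_list(head):
    """Walk the chain of references and collect the values."""
    out = []
    while head is not None:
        out.append(head.value)
        head = head.next
    return out

head = Node(1, Node(2, Node(3)))
print(to_list(head))  # [1, 2, 3]
```

The equivalent Java version needs a class declaration, field types, a constructor, and usually getters before the linked structure even appears.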
Speaking as one of the people who helped lead the push for moving to Python in 15-110 and 15-112, the reduction of OOP was extremely deliberate.
At the end of the day, Python won because it could be used to express programming concepts quite clearly in both an imperative and, to a slightly limited degree, functional manner, in ways that were appropriate for a variety of domains. And it did it in a way that substantially reduced the "boilerplate" syntax burden, which most of us take for granted, but is a large pain for newcomers to programming.
print "hello, world"
is very easy for a beginner.
Depending on what your background is,
[(i*3) for i in [1, 2, 3]]
may be a very clear way to multiply every element of an array by 3. Or it may be confusing as heck. :)
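The imperative/functional duality the parent mentions can be shown side by side; this is a generic sketch, not material from the CMU courses:

```python
numbers = [1, 2, 3]

# Imperative style: build the result step by step.
tripled = []
for n in numbers:
    tripled.append(n * 3)

# Functional style: a comprehension states *what*, not *how*.
also_tripled = [n * 3 for n in numbers]

print(tripled)       # [3, 6, 9]
print(also_tripled)  # [3, 6, 9]
```

Both produce the same list, so an instructor can introduce one style and later show the other as a direct translation.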
Our general hope is that by the time our current first-years make it through the program, there will be even less emphasis on OOP. C0, for example, introduces an additional component of formal reasoning, and my revision of 15-440 replaced C++ with Go, which has a modestly less "OOP"-centric flavor. The big drawback now is that the transition from 110/112 to the formality of C0 and the functional intro in ML is rather jarring. That's partly by design, to get people to think about computer science differently than what they might have learned about "programming" in high school, but we'll see if we can optimize it.
(I'm posting about this because I'll be teaching 15-112 with David Kosbie next Spring.)
I think the intense focus on OOP is a problem, and it is symptomatic of teaching students Java as their first language. Just as if you learn BASIC first you love GOTO, if you learn Java first you love classes and think everything should be a class. What especially concerns me is that some people are interested in whether a program or language is object-oriented, and endeavor to make their programs more object-oriented. Object-oriented programming is a design aesthetic, perhaps one of the most nebulous design aesthetics, and it should not be considered a goal or a dogma but rather a source of inspiration for design. Java is a particularly bad example because Java's brand of OOP conflates encapsulation, data structures, and polymorphism. The languages we'll be seeing in the 2020s and beyond will probably be more sensible about these things. Look at the new languages with the most mind share these days (Go, Rust, Swift) if you want an idea of where the wind is blowing.
The other problem is that the farther you get from being a freshman, the farther you get from understanding which concepts are necessary in an introductory curriculum. As a senior you might say that OOP must be taught to the freshmen because so many freshmen don't get it, as a junior developer you might want the program to teach the standard library, and as a senior developer you might want to teach people about testing. This is all wrong. The most important things to teach freshmen CS students are very simple. Control structures, data structures, references / pointers, and recursion. Never forget that while OOP is warm and fuzzy and makes you feel good, the number one priority of the intro CS courses is to get the freshmen to write programs that work correctly.
Or, since OOP is a design aesthetic, it is only useful once design becomes important. Design is not important for small programs that freshmen write. Let them get off the ground with correct programs, and teach them paradigms once they have enough context to understand them.
I learned BASIC first and started hating GOTO before I was even in my teens. Who likes GOTO, seriously?
> The other problem is that the farther you get from being a freshman, the farther you get from understanding which concepts are necessary in an introductory curriculum.
I don't know, the farther I've gotten from being a freshman, the broader my experience of other people learning programming has been (and, even though I've never been an instructor in a formal introductory course, the more experience I have teaching other people, including programming.) Even if my freshman classes were my first introduction to programming, I don't think I would have had a better idea what an intro course needed then than I do now.
I think that's another thing to note: around two thirds or more of the CS freshman at CMU actually take 15-122 Principles of Imperative Computation, which is taught largely in C0 (a type safe subset of C), and then later they transition to full fledged C.
At my university, Python was primarily introduced when teaching certain concepts: scripting & untyped programming, web application development, and deeper diving into list comprehension functions. It was not the first language taught in CS, but many were exposed to it. That said, it was the first language taught in non-CS/non-major courses, and I imagine several other non-CS/Engineering majors were introduced to the language later in their academic career.
Yale teaches CPSC 112 (CS non majors) in Java. Or at least, they used to. I think I heard this term they switched to Python. I would say that was a wholly good move. The only downside is that CS majors who take 112, and might prefer to learn Java in a class but Python on their own, do not get exposed to Java in any part of the curriculum.
The "introduction to CS" course, CPSC 201, is taught in MIT Scheme. It's interesting because for CS majors, the first class is functional programming. A sizeable minority (including myself) skip that class though, since it's only offered in the fall. I chose to skip it because I started the major late, already had programming experience, and Scheme isn't actually used outside of education. From what I gather, I missed out on some valuable lessons in simulating circuits and binary logic. My knowledge there is still weak.
After that course, the rest of the core curriculum is taught in C. Scripting languages like Python/Perl/Ruby, you are encouraged to learn on your own, and will often help in assignments. But they aren't explicitly taught as the focus of any course.
I absolutely hated some of my programming assignments in C, but I think learning systems programming in such a low-level language is timelessly valuable and makes picking up new, higher-level languages significantly easier, since you already know the tradeoffs of the underlying data structures and algorithms. For example, if I had not been exposed to pointers, I would not understand the difference in Python between "is" and "==" when comparing strings. In general I think learning C is a valuable, albeit masochistic, move.
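The "is" vs "==" distinction looks like this in practice. Note that whether two equal strings share one object is a CPython implementation detail (small-string interning), so this sketch deliberately builds one string at runtime to avoid it:

```python
# "==" compares contents; "is" compares identity, like comparing
# pointers in C rather than the bytes they point at.
a = "".join(["py", "thon"])   # built at runtime, so not interned
b = "python"                  # a literal, interned by CPython

print(a == b)  # True  - same characters
print(a is b)  # False - two distinct objects in memory
```

Someone who has seen C pointers immediately recognizes this as "same address" vs "same contents".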
I'm an undergrad at one of the outlier schools (UCLA) where a combination of the quarter system and a relatively conservative CS department make for an interesting undergrad curriculum.
Our "CS 1" is taught in C++ and is mostly dedicated to, well, C++. The entire first course is essentially dedicated to teaching you how not to blow your leg off while managing memory.
We don't even touch data structures, algorithms, composition, inheritance, etc until "CS 2," which is also in C++. When I first tried Python as part of a 1-week assignment in "CS 2.5" (a lab course), I was blown away and hooked pretty quickly. We don't get formal Python experience really until the extremely work-intensive upper division programming languages course. From my perspective, I think a large number of people are discouraged by CS 1 and how it's more of a programming class than a CS class. The only benefit of a C++ introduction is that it's easier for people to transition to C and lower level code for the operating systems class. Still, C is pretty well covered by our "CS 2.5" and "CS 3."
Other fun facts: our CS 0 (programming for non-CS majors) is split into three classes and is also in C++!
I accidentally took a C++ course at a community college during high school and was forced to skip "CS 1" when I enrolled for my first quarter. I felt so unprepared for "CS 2" that I audited a good number of "CS 1" lectures and did half of the projects for the class even though I wasn't taking it.
I went to Davis, which also teaches C in the CS1/first course for concentrators, with C++ after. It was really hard, I really liked it, and I've always felt it gave me an advantage over people who have no idea how the lower layers of the stack work.
I definitely saw it scare some people off, though. So if your only criterion is "what is the passing rate," it probably does worse than Python.
We had an intro to programming course at our college (NIT Srinagar) which was based on C. After some thought, that seemed perfectly alright, because most of my batchmates had never programmed before (and a significant number had no access to computers prior to college).
Since I'd been programming since I was 11, it seemed a little arduous to sit through classes explaining what a variable was, etc. But in the end, I understood they had to do what was best for the whole lot, and obviously couldn't teach something more advanced to an underprepared set of people.
It's very interesting. My introductory course was in C, but on my last year they already had switched to Java.
I think Python nowadays is really the better choice: although C has its own merits (and it should be taught at a later course, possibly together with embedded systems) , it introduces more complexity than necessary.
Python, with its readability and "batteries included" philosophy, is really the ideal language for an introductory course, not to mention that it has potential lifelong use as a "Swiss Army knife" type of language.
While I love Python, I feel like Java has more application for beginners. While I am a backend guy, Java is better right now for split frontend/backend work, since you can build both the client and the server for an Android app in Java. This lets you learn frontend and backend without having to know two languages.
I say this knowing I hate a lot about Java, but thinking about how I would teach concepts and techniques while making a student employable and fostering the ability to experiment on their own.
I think the simplicity, lax compiler, and readability of Python serve as an easier introduction to programming and will keep more students from dropping out early. Java certainly has its place in enforcing verbosity, OO, coding standards, etc.
I think the mission here is to ease people into coding, so if they don't want to progress any further, maybe they'll just manage a company's small PHP website or something and have the knowledge to do that.
I honestly foresaw this and advised my kids to learn Python; it's the new BASIC and the new Java-in-college. That does not mean Python is better than the others, but it does fit a niche in the education system somehow.
For EE major etc I would still recommend the plain C stuff though. For web-design etc Javascript might be a better course. Overall Python is a nice choice for the generic public.
Bit of a reversal from that "Python is dying" post a few weeks ago. But like I said then, one person posting an anecdote about people switching away from Python does not a trend make, while Python becoming the teaching language of choice at many universities does make a trend (to add to similar upward trends in job postings, high TIOBE rankings, etc.).
It's not surprising to me that Python has become the most popular intro language at universities. For someone who went a school with a poor computer science program, Python was a saving grace and landed me my first job as a developer. Unfortunately the school I went to did not have any courses on Python so I took it up in my free time.
This is completely misleading. Just because a course is offered at the 100 level doesn't mean it's an introductory course. These numbers don't mean a thing... (In case people care: yes, I did go to one of these schools, and at least for that school, this data is complete BS.)
UC Irvine has a couple of ICS intro classes. It was ICS 21 back in the day, taught first in C++ and then Java. It is now ICS 31 and is Python-based.
A language without types, chosen without extensive research? Then why not C/D/Pascal? I strongly believe that for universities SML/OCaml/Haskell, or even Rust, are far better choices. I prefer SML.
> I strongly believe that for universities SML/OCaml/Haskell, or even Rust, are far better choices.
Rust is a horrible choice right now, because it hasn't stabilized. When it has stabilized, it may be a suitable choice (at one point, it looked like it would be too large a language to be a great intro language, but it has been getting smaller since then, so I'm less convinced that it will remain unsuitable for that role).
Python is a pretty good intro language, as is Scheme, and Haskell might be, though I haven't seen any good intro-to-programming material built around it. And while I think a good intro should prefer a largely functional style (as the best Python- and Scheme-based courses I've seen do), I'm not convinced that a pure FP language is ideal for an intro to the field (though it's certainly a good thing to learn in the course of an education in the field).
The computing courses at my university use Haskell in the introductory programming course - apparently it works rather well as a vehicle for exploring algorithms and data structures.
The students move on to Java, C++, Perl, Prolog and others after that; not sure about Python.
> The computing courses at my university use Haskell in the introductory programming course - apparently it works rather well as a vehicle for exploring algorithms and data structures.
I think a strict language in the ML family would even be nicer, because students wouldn't have to reason about laziness.
Rust's killer feature is the interaction between the type system and memory management. I don't think that's the sort of thing you want to worry about in an introductory class...
It was a mistake to suggest it. But the purpose here is to learn something in parallel: pattern matching, types, or even the lambda calculus are good to introduce along with the language. Python is good if you have an algorithms/data structures course; students are not required to optimize for hardware, only algorithmically.
Python's syntax is a breeze to get ahold of without any experience, making it ideal for introducing people to CS and algorithmic thinking without overwhelming them with theory or syntax.
Also, a lot of colleges (at least the best ones) offer functional programming in a second semester intro course (Berkeley does Scheme I believe). My own institution uses SML.
Berkeley's intro class was entirely in Scheme up until a few years back (2011) when they switched to Python. It switched when the course's professor (Brian Harvey) retired, he wrote a great post of his thoughts on Scheme vs Python http://www.cs.berkeley.edu/~bh/proglang.html
Berkeley's first intro CS course uses Python for 2/3 of the course, then uses Scheme. The final project is writing a limited Scheme interpreter in Python. To further clarify, Java is used in Berkeley's 2nd CS class (followed by C & MIPS in the 3rd intro class).
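The "Scheme interpreter in Python" project is larger than fits here, but a toy evaluator conveys its flavor. This is not the actual course project; it handles only nested prefix expressions with `+` and `*`, represented as Python lists:

```python
def evaluate(expr):
    """Evaluate a nested prefix expression like ["+", 1, ["*", 2, 3]]."""
    if isinstance(expr, (int, float)):
        return expr                       # a number evaluates to itself
    op, *args = expr
    values = [evaluate(a) for a in args]  # recursively evaluate operands
    if op == "+":
        return sum(values)
    if op == "*":
        result = 1
        for v in values:
            result *= v
        return result
    raise ValueError(f"unknown operator: {op}")

print(evaluate(["+", 1, ["*", 2, 3]]))  # 7
```

The real project adds a reader (parser), environments for variables, and special forms like `define` and `lambda`, but the recursive core is the same shape.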
Python is definitely more friendly than those you've mentioned. I think in introductory courses, the number one goal is to get students interested, rather than throw every possible detail at them and intimidate them away.
I've seen Processing used in some intro courses for that reason, because the time-to-interesting-experimentation is fairly low (similar to the goal of Logo in the '80s, which was my own first language, although I think Logo was much better as a programming language). I don't think it's a common first language in intro-CS programs, but there are a number of curricula being built around Processing outside the context of CS majors.
Processing is a good candidate because it allows a novice to make something interactive and interesting in very few lines of code. The downside is that they wind up typing things they understand nothing about, e.g. `void setup() {…}`. And it's not like void will really make sense after two weeks. On the other hand, placing the mastery in the executing artifact, rather than the underlying code, might be a good thing.
I think pyprocessing (https://code.google.com/p/pyprocessing/) makes an excellent middle ground here. It implements much of the Processing API in Python, without making an effort to hide the underlying principles of the language (like Processing seems to do for better and for worse).
Unless your university is really weird, I can guarantee you will learn many more languages than just the intro language. When I got my computer engineering degree in the early 2000's I was exposed to C, C++, Java, Scheme & Lisp, SPARC & 68k assembly, and even a bit of Python. Learning Python as the intro language instead of C++ would have been much nicer for folks who were completely green to programming.
Seriously? What do you think it means to be a freshman? I am getting seriously tired of this attitude in the software community that you're only a legitimate programmer if you've been doing it since you were in diapers.
That's ridiculous. What level of skill do you expect in an introductory program aimed at freshmen? Because that description sure seems like it should be tailored to someone completely green to programming to me.
I assure you there are many that are. My first year at uni was filled with classmates that could barely operate a computer, there were many like myself that already had a good handle on programming and cs fundamentals but we weren't the dominant group. Most of the greens dropped out/switched majors after a year once they realized how hard it was. I hear from a lot of parents equating spending a ton of time on a computer with being tech savvy and a natural fit for computer science, this may be why most of them tried it out.
Well, some introductory CS courses are taken by people in other majors. At my university, all science majors (as far as I know) have to take at least one introductory programming course. In that context, it makes sense to use Python to introduce the concept of programming to people.
rust? please. maybe 3 years from now, once it matures. python is good for teaching programming because it's easy to read - a feature that not a single language you mention can brag about.
It's easy to read and it just makes more sense when someone only knows English.
I could show this to my 80 year old grandmother and have her understand what it does with minimal help on my part.
word_list = ["hat", "dog", "cat"]
for word in word_list:
    print(word)
The same example in Java isn't extremely hard to understand, but it definitely is a decent jump in complexity from the almost plain English syntax of Python.
EDIT: I forgot about Java's foreach construct, but even so, with Java you then have to explain what String[] means; the keyword "in" makes for an easier read than (String string : strings), and print beats System.out.println.
Types don't make things more complicated. Most languages have types, and I prefer dealing with them explicitly rather than letting an interpreter do random coercion like JavaScript does.
Types lead to cleaner programming: instead of shoving random stuff into an untyped list, a dev has to think about data structures.
Furthermore, as a non-native English speaker, I don't find Python more readable than statically typed languages like C# or Java, especially when a codebase grows larger. I wish Python were statically typed.
"Types lead to cleaner programming: instead of shoving random stuff into an untyped list, a dev has to think about data structures."
You're more than welcome to not think about what types you put into a structure/list. You'll quickly learn how to use that list/structure without hand-holding. For 90% of the problems out there, you don't need anything other than the basic data structures, and they're pretty straightforward to use without having to think about types.
Note. Please don't confuse Javascript's weak-typing with Python's typing, as Python is strongly-typed.
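A quick sketch of the distinction: Python refuses to silently coerce across types, which is exactly what separates its strong dynamic typing from JavaScript's weak typing (where `"1" + 1` would yield `"11"`):

```python
# Mixed-type operations fail loudly in Python instead of coercing.
try:
    result = "1" + 1          # JavaScript would coerce this to "11"
except TypeError:
    result = "refused to coerce"

print(result)  # refused to coerce
```

So an intro student still hits type errors early; they just see them at run time rather than compile time.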
> Types lead to cleaner programming,instead of shoving random stuffs into an untyped list,a dev has to think about data structures.
Once you've learned to think about types -- as most introductory courses, even those taught in dynamically typed languages, will teach you -- you can do that just fine in a language that doesn't put a gun to your head and force you to.
(And you can do so in ways that most statically-typed languages have too limited a type system to let you do, too.)
I'd say the above is pretty simple if explained after:
putStrLn "hello world"
You don't really have to go into Monads for this to be understood/used. You can just leave it at "you use functions ending with an M if they do something like print out to the screen or get stuff from a web page".
My first reaction was that this was definitely less understandable, but I realized there were two pieces to it. One is banal: mapM_ and putStrLn are not nearly as simple as `for` and `print` (the latter two have direct English meanings).
The second piece is interesting: Is something of the sort "<mapfn> <printfn> <list>" easier to explain to someone who's new to programming than "<foreach> <print>"? I've always thought the latter was easier to understand, but I'm curious which one someone who has no previous programming experience would grasp more immediately.
Right, I think that once the concept is taught of what mapM does/means it will be easier to keep applying than something like a foreach loop.
The reason being that "<mapfn> <somefn> <list>" is more declarative, whereas the foreach loop is ambiguous. Something we often take for granted is how hard it was at first to come up with what exactly we need to do or mutate inside that loop to build the right result.
If nothing else, there is interesting discussion to be had ;)
Who cares? If I'm a teacher, I don't want to have to update material extensively every semester or year because the language is changing rapidly. There are plenty of good, stable languages to use.
So we should teach the students old stuff because it's easier on the teacher? This isn't history, the tech world is changing daily with new languages and technologies being created. Sticking to one curriculum because it's easier on the teacher is ridiculous.
Java is about as stable as it gets; that foreach syntax is ten years old. And if your teaching material stuck to the old-style syntax it wouldn't be wrong, it would still work, it's just not as readable as the new style.
Indeed, the discipline of computer science is really the study of computation and the ideas we want to convey from a first language are things like abstraction, recursion, proof by induction, equational reasoning, sequential/parallel time-complexity. It's much easier to address these foundational concepts in a functional setting (Scheme, SML, etc) and then once undergrads know how to formally reason and do proofs about their programs introduce other languages.
>> the ideas we want to convey from a first language are things like abstraction, recursion, proof by induction, equational reasoning, sequential/parallel time-complexity.
Really? I thought good old imperative programming was the best starting point. Concepts like assignment, branching, looping. IMHO goto is actually useful at that point and structured programming is an abstraction that takes a little while to wrap your head around. Perhaps it's due to the way I learned that I think some things are more basic (no pun intended). Either way I don't think high level abstract concepts like functional programming are even relevant in the real world of software development - abstract stuff is CS, basic stuff is for everyone else who has to take a programming course.
I'd say just the opposite. Variables are hard, in a way that you don't even notice as an experienced programmer, but when we write
x = 1
x = 2
that's immediately very confusing for a beginner - are we saying that 1=2?
If we stick to immutable values and recursion, and functions that are, well, functions, we get something that corresponds quite closely to first-year mathematics, which makes it easy to learn at the same time.
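That correspondence is concrete even in Python; this is a generic illustration, not material from any particular course. A recursive function mirrors its mathematical definition line for line, with no reassignment to explain:

```python
# n! = 1 if n = 0, else n * (n-1)!  -- the code restates the math.
def factorial(n):
    if n == 0:
        return 1                      # base case
    return n * factorial(n - 1)       # recursive case

print(factorial(5))  # 120
```

A student who knows the definition of factorial can read this with no extra machinery, whereas the loop version requires understanding an accumulator variable being overwritten on every iteration.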
Most Python material that I've seen that is an intro course makes limited use of changing variable [1] values , being written in a mostly-functional style (generally, preferring, at least initially, recursion to iteration to achieve that -- even though recursion is generally the wrong approach in production Python code.)
Using a modern imperative language doesn't prevent you from adding complexity a bit at a time, and -- especially initially -- getting the advantages of teaching things in a functional style.
[1] but not object fields, as objects are taught as containers of mutable state, usually late in the course.
> preferring, at least initially, recursion to iteration to achieve that -- even though recursion is generally the wrong approach in production Python code
Exactly! Intro courses are usually - with good reason - taught in functional style. Which suggests that a language where that style is more natural and appropriate might be a better fit for those courses than Python.
(Though OTOH I wouldn't advocate a language where IO is anything more complex than a function call)
That's assignment AND reassignment. Mutable state is fundamental to how computers work, no matter how hard fans of the current functional programming fad try to deny it.
SML is great, but based on the UW paradigms course, I'd say it gets a lot of backlash from students wanting something common in industry.
Transitioning first classes from C to Go would make great sense, but it looks like a lot of schools have already replaced C with Java and C++ as first languages.
Personally, I do think there is merit in teaching a primarily imperative language first, if only to appreciate other paradigms later and get an inkling of what is really going on beneath your code.
What I took from the Coursera version of the UW course was that SML gets out of the way and let Grossman cover the material at a higher level of abstraction than language syntax. On the other hand, his Programming Languages class is not intended to be introductory.
I also thought SML worked quite well at that level, but I think the initial negativity on the forums about an unknown language reflected what to expect from a larger portion of 101 students.
I was in the second iteration. There were still plenty of people calling for a pony and M&M's for breakfast. I can only imagine the first iteration.
SML worked well because it's more or less dead. All the documentation is basically on a single unsexy website. There aren't a bunch of blogs cluttering up Google results or even much on StackOverflow. Racket also has a single Canonical source and good clear documentation and not much noise.
I can't say the same about the Ruby ecosystem, where I saw this crazy construct in lieu of a call to super posted in a forum question. The TA asked how the person arrived at it. I knew the answer because I had landed at the same StackOverflow page from Google earlier in the week. Which is not to knock Ruby. I understand why people describe it as beautiful.
1. C and Pascal don't have garbage collection (D, in fact, does by default).
2. An argument can be made for starting with dynamically typed languages before moving to static typing.
> 2. An argument can be made for starting with dynamically typed languages before moving to static typing.
As much as I like static typing, I agree that showing people run-time errors seems to do a better job of teaching than letting them know ahead of time via the compiler that their code will likely blow up.
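A small Python sketch of that point (hypothetical example): the type error is only discovered when the offending line actually executes, on the exact input that triggered it.

```python
# In a dynamically typed language the mistake surfaces at run time,
# not ahead of time via a compiler.
def add_one(x):
    return x + 1

add_one(1)        # fine
try:
    add_one("1")  # raises TypeError when this line runs
except TypeError as err:
    print("caught:", err)
```

A statically typed language would reject the second call before the program ever ran; whether seeing the failure happen is more instructive is exactly the pedagogical question above.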
Many of the better European universities teach FP languages in their introductory courses. Off the top of my head, Oxford and Edinburgh use Haskell and Cambridge teaches OCaml.
Doesn't Wadler teach at Edinburgh like Harper teaches at CMU? I'm sure many French schools are into OCaml also.
European computer science has traditionally been more theoretical than American computer science, but even in Europe you'll find that a lot of 101 courses teach in Python.
It's the language for an introductory course, so its purpose is more that of a gateway language than a be-all-end-all language.
Rust would probably be too strict and annoying as a first language. Maybe the same is the case for Haskell, but maybe it isn't such a big deal for those that haven't been "spoiled" by imperative languages and thinking already.
Help me out. I love the idea of Python as a first language. But do you think there's an issue teaching a whitespace-sensitive language as a first language? When students eventually move to Java/JavaScript/C#/Swift/whatever, will this be more difficult?
If you indent your code consistently, you'd never notice that python had syntactically significant whitespace. So the only reason it would ever cause you a problem was if your indentation was wrong. In many languages, programmers specify blocks using both braces and indentation. Python merely loses the redundancy.
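A minimal illustration (hypothetical example): in Python the indentation *is* the block structure, so there is nothing for braces to disagree with.

```python
def classify(n):
    if n % 2 == 0:
        return "even"  # indented: inside the if-block
    return "odd"       # dedented: outside it -- no brace needed to say so
```

In a brace language the same code carries the block structure twice, once in braces and once in indentation, and bugs appear when the two drift apart.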
Every intro to CS course I've ever been in has been in C++, it has always saddened me to see Java being the intro level course today. Now to see Python at the top of the chart makes my head want to explode.
I learned C++ first, then Java, then more recently Python.
I really feel like I'd have rather had Python or Java first instead, if only because being concerned about all of the strict formatting and structure in a C++ program puts a level of unnecessary complexity in front of the goal: to introduce a person to programming logic and solving problems with programs.
Those complexities are certainly important, mind you. But not as important as clearly opening the mind to the way a programmer thinks, IMO.
Because classic Scheme (or Racket) courses based on SICP are considered too complicated by new-normal (punk) standards.)
Show me a sane person who succeeded in grasping the few big ideas behind SICP after Berkeley's CS61A or another similar Scheme or Racket course and who regretted the time and effort spent?)
BTW, AIMA and PAIP are much better in CL than in Python.)
So Python is the new BASIC. Being widely adopted in universities could be a mark of ageing and obsolescence. Since universities normally teach things largely irrelevant to real life, shouldn't students look around for the emerging tech that will bring them a job five years from now?
PHP is all about magic. That's why it is popular in the first place. And its popularity is an accident we'll have to live with for decades, given the amount of PHP code deployed out there.
But if you are meant to be a CS student, you need a proper multipurpose language. You can do almost anything in Python, from 3D games to complex scientific computation.
PHP is merely a templating language, with enough good syntax to be bearable for advanced programmers.
PHP's popularity is quite unfortunate if you ask me. Not because it is easy to start with or because it's a templating language, but because it could have been way better regardless of its execution model.
It could have been cleaner and more elegant if its original author had a clue about how to design languages.
Most of PHP's syntax is due to Rasmus's inability to write a sophisticated parser.
They're majoring in CS not Web Development. PHP doesn't really do anything other than web dev which makes it a poor fit for a general purpose CS curriculum.
> PHP doesn't really do anything other than web dev
This is such a complete misconception. Command line apps written in PHP are becoming increasingly popular, not to mention the fact that PHP runs everywhere.
No reason to downvote me because I prefer PHP for web development. In Python you have to learn an entire framework just to do web stuff. With PHP everything is right there; it's that simple.
That would be very confusing for a beginner. With PHP you can write <?php echo "Hello world" ?> right in an HTML file and see it just by refreshing the page.
It is confusing if you don't know the difference between a page templating language and a web framework. If ease of writing "hello world" is what matters, Python:
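A minimal stdlib-only version of such a "hello world" in Python might look like this (a sketch; the class and handler names here are illustrative, not from the original comment):

```python
# A minimal "hello world" web page using only the Python standard library.
from http.server import BaseHTTPRequestHandler, HTTPServer

class Hello(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-type", "text/plain")
        self.end_headers()
        self.wfile.write(b"Hello world")

# To serve it on http://localhost:8000 :
# HTTPServer(("127.0.0.1", 8000), Hello).serve_forever()
```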
I agree, there's no reason to downvote you for that. However, php only comes with "everything" if you assume it is already installed with a web server. As far as I know, you can't use the php (dev) webserver to serve php scripts without additional configuration.
Contrast[1] with python:
mkdir cgi-bin
cat <<'eof' > cgi-bin/hello.py
#!/usr/bin/env python3
print("Content-type: text/plain")
print()
print("Hello, world")
eof
#try to run:
./cgi-bin/hello.py
bash: ./cgi-bin/hello.py: Permission denied
#fix error:
chmod u+x ./cgi-bin/hello.py
#test in console:
./cgi-bin/hello.py
Content-type: text/plain
Hello, world
#Ok, does it work on the web?
python3.4 -m http.server --cgi 8000 --bind 127.0.0.1
#open localhost:8000/cgi-bin/hello.py
So, how can I claim that this convoluted mess is "simpler" than php? Because everything is right there, and everything can be explained concisely (contrast with apache configs, getting the www user right, etc). You could build on what the students know of print() and string interpolation/concatenation to output html/use templates etc.
Is this how I would start teaching web programming in python? Probably not -- but it might be useful as one approach, assuming we've started with doing some stdin/stdout stuff, and showing how we can do the same on the web. Crucially, showing how if you know how to read a csv file with values, do some aggregation and format it as a text table, you can output it as html, as text over the web, or even as json paired with javascript. That again generalises to database access ... etc etc. It's python (and "invisible" c) all the way down.
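That progression might be sketched like this (hypothetical data and function names): one aggregation, three renderings.

```python
import csv, io, json

# Toy data standing in for a real csv file.
CSV_DATA = "city,sales\nOslo,3\nOslo,2\nBergen,4\n"

def aggregate(text):
    """Sum the sales column per city."""
    totals = {}
    for row in csv.DictReader(io.StringIO(text)):
        totals[row["city"]] = totals.get(row["city"], 0) + int(row["sales"])
    return totals

def as_text(totals):
    # Plain text table, as for a console program.
    return "\n".join("%s\t%d" % (k, v) for k, v in totals.items())

def as_html(totals):
    # The same data as an HTML table for the web.
    rows = "".join("<tr><td>%s</td><td>%d</td></tr>" % (k, v)
                   for k, v in totals.items())
    return "<table>%s</table>" % rows

def as_json(totals):
    # Or as JSON for a javascript front end.
    return json.dumps(totals)
```

The aggregation logic is written once; only the final formatting step changes between console, HTML, and JSON output.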
[1] Comparing with a fully wrapped-up install of webserver+php (either mod-php or fcgi or whatnot) would a) not be fair, because if you really learn python, you have all the tools to examine the whole stack (well, down to the OS level anyway), and b) a fair comparison wouldn't be with python but rather with something like web2py, which also bundles a lot of infrastructure.
YMMV and all that; my only point was that there isn't really anything "simple" about web development -- it's distributed development with not very clearly defined security, state, or even messaging semantics -- it's a mess. A useful mess, but a mess.
[edit: formatting, spelling. Also, I agree Flask is probably a much nicer place to start, if one isn't building on an introduction involving text/console I/O]
I know, I know, what I'm about to say is just an artifact of the Language Wars and will cost me karma. I can't help myself.
We should not be teaching impressionable students that whitespace has semantic significance. This is a fundamental design flaw of Python, and it's just a Bad Idea.
> This is a fundamental design flaw of Python, and it's just a Bad Idea.
That sounds like a gut reaction and not a reasoned argument. I was skeptical of it at first, but after coding some Python I decided all my reasons for hating it weren't valid. Turns out it's very practical and makes the code prettier. I don't like the language for other reasons, but the significant whitespace never gave me any problems, headaches, or weird spurious bugs.
The only negative thing about it for me was that hitting TAB in Emacs doesn't indent to the "proper" place (since it's impossible to actually determine the proper place). Instead, it would alternate amongst the valid tab points from most likely to least, which felt like a reasonable compromise to me.
Interestingly, I never hear anyone whine about Haskell's use of significant whitespace…
I was bitten by Haskell's whitespace - but I think that was connected to a bug with do-notation or some such syntactic sugar. In general I think significant whitespace is a great way to delineate blocks.
I guess you agree that "impressionable students" should be taught that collaboration is essential, hence clear formatting is necessary and precise indentation (aka whitespace) is critical.
However, when a language comes around that actively enforces and promotes these rules... you say it's a "Bad Idea". Could you elaborate on why you think it's so bad?
> I guess you agree that "impressionable students" should be taught that collaboration is essential, hence clear formatting is necessary and precise indentation (aka whitespace) is critical.
Not particularly, no. Or at least I think that should be the IDE's problem.
> Not particularly, no. Or at least I think that should be the IDE's problem.
You still have to tell the IDE what to do with your code, which you do with brackets, I guess?
So now your code is full of brackets and parentheses, which are naturally less readable (how many levels of parentheses do you use in natural language?), and their role is almost entirely to... help the IDE indent your code.
About as many carriage returns as I use in natural language. The act of inserting any sort of punctuation seems inherently approximate; an attempt to encode that which is usually transmitted in tone, rhythm, body language.
People, even before computers, invented several different types of formal logic notation just so that they'd have a language that was better suited to these sorts of puzzles. I don't think that code is a good approximation of natural language, nor do I think that it should try to be. Natural language is very poorly suited to expressing layers of abstraction, and even basic logical relationships.
Of course it's easier to write programs that resemble natural language, just as it's also easier to speak than it is to write ∀x[Yx → ...etc...], but that doesn't mean it's the best way to look at a problem.
But that's a bit beside the point. You can do it however you like, even with whitespace if that's your preference. The key is that the IDE has to be aware of the meaning of your code. Python's solution is to assert by fiat that whitespace is a syntactically meaningful unit in this sense, but it could be anything.
If you have the information there, then people can look at it and write it however they please.
You know what man, FUCK YOU. You broke the layout of the discussion for everyone and it's next to impossible to follow the conversation without a ton of horizontal scrolling. Take your trolling elsewhere, it's not welcome here.