Why SICP Matters (berkeley.edu)
312 points by wskinner 1690 days ago | 127 comments



I am revisiting SICP at the moment, because a friend is using it to learn programming. But I have to say, I am not nearly as impressed with the book as I used to be. Wadler's critique "Why calculating is better than scheming" (http://www.cs.kent.ac.uk/people/staff/dat/miranda/wadler87.p...) fully applies, and the book can be rather confused and ad hoc in some places, e.g. section 1.3. A stronger focus on data structures would have been useful, instead of ad hoc recursion schemes like the versions of `accumulate', `sum' and `product' the exercises in that section make you write, which are essentially unfoldr and foldr rolled into one, and would be better written as a composition of those functions.
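
To make the unfold-then-fold point concrete, here is a rough Python sketch (the helper names and shapes are mine, not SICP's exercises or Haskell's exact unfoldr/foldr signatures): first generate the list of terms from a seed, then fold it down with an operator.

```python
from functools import reduce

def unfold(step, done, state):
    """Generate a list from a seed state (a loose unfoldr analogue):
    repeatedly apply step to get (value, next_state) until done(state)."""
    out = []
    while not done(state):
        value, state = step(state)
        out.append(value)
    return out

def sum_range(a, b):
    # SICP-style `sum` over the integers a..b, as unfold followed by fold
    terms = unfold(lambda n: (n, n + 1), lambda n: n > b, a)
    return reduce(lambda acc, x: acc + x, terms, 0)

def product_range(a, b):
    # Same unfold, different fold: only the combining operator changes
    terms = unfold(lambda n: (n, n + 1), lambda n: n > b, a)
    return reduce(lambda acc, x: acc * x, terms, 1)
```

The point of the decomposition is that the term generation and the accumulation are independent, reusable pieces, instead of being fused into one bespoke recursive procedure per exercise.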

`How to Design Programs' (http://www.htdp.org/) leaves a better impression on me as a beginners' book.

If you can already program, then "Programming Languages: Application and Interpretation" (http://cs.brown.edu/courses/cs173/2012/book/) is a great follow-up.


Thinking about "Why calculating is better than scheming" vs SICP, I realize that there is an axis along which most programmers' ways of thinking can be placed, with two extremes:

- the mathematician-programmers: the thinking in terms of "what is" extreme, or mathematically inspired programming: people closer to this extreme like to think in terms of types and usually dislike dynamic languages; they also like to be able to properly reason about all aspects of a program, to have defined "states" they can think about. These people tend to like MLs, Haskell, OCaml, F#. The "mathematical beauty of perfect code" is their ideal and favorite metaphor. And I think they are right that teaching a Lisp to a beginner pushes one's mind away from "what is" oriented programming.

- the Schemers: the thinking in terms of "what transformations to apply to something (it doesn't matter what the 'something' actually 'is') to get something else or somewhere else" programmers: these people like to be able to transform everything, including code itself; they like homoiconicity and macros in general; they prefer thinking in terms of "how to compose/chain transformations" instead of types; they tend either not to know category theory or not to value it much; and they are ok with seeing code as an "organic entity" that can't always be reasoned about. These people tend to like Lisps, Smalltalk and dynamic languages. "Programming as magick", in the sense of "controlling the spirits (processes) inside computers", is their favorite metaphor. They also like to maintain a playful mood, and their "it's ok if paying customers get shafted every now and then" attitude may be one reason for businesses' dislike of "schemer" types and "schemer"-related technologies. As a side paradox, some "schemers" tend to also like OOP, though they understand it should not be overused (I consider Alan Kay, usually mentioned as the father of OOP, to be a "schemer"). I think that exposing a young learner to Lisp pushes one's mind in this direction, and this is why I love SICP. If you are a competent engineer you surely have a serious amount of math knowledge, so you can easily switch your mind to what-is/math mode and see the mathematical beauty when there is one to be seen, but most "mathematician-programmers" never seem to be able to love this "ruleless organic beauty" of the "schemer way".

I consider myself a "schemer" and I am biased to this side, but I understand and appreciate the value and arguments of the other - I'd rather fly in a plane with the control systems coded by a "mathematician-programmer" than by a "schemer" like me :)


Interesting.

I prefer the "programming as magick" metaphor, and would say I have a playful mood regarding programming, but I still like static typing. Having said that, I love higher-order functions and macros as well.

Defining it as a continuum implies I should be either on one end, or somewhere in between on the various traits, but I have a rather strong attachment to some traits from each side.

I think you're right that a lot of the traits you mention tend to be clustered together in programmers though.


...everyone grows to love static typing sooner or later, after endless debugging sessions in dynamic language codebases :)

But I think that higher-order functions don't belong to one extreme or the other - everyone uses them; even weekend coders who do JavaScript, or who have been exposed to PHP 5.4's new features, end up using them - they are something so basic that any language has them in one form or another; even C has function pointers. Macros, on the other hand, are a very different creature: they may seem similar to higher-order functions in Haskell, where you can use HOFs for what you'd use macros for in Lisp, but they are a whole different ball game, because you can't reason (formally, or informally but within the "what is" mindset) about code that transforms/generates code (or you can, but it's just TOO HARD). I guess this is why so many developers are afraid of them, and why language designers omitted them, or added them as an extra feature rather than as a core language-defining feature from day 0 (for example Scala - I bet the language would've been very different if macros had been there since version 0.1: more "usable", and with more core language features built on top of them...).


If you're critiquing the book in such a lucid manner, it sounds like you got good value from it.


I don't deny that. I liked it quite a lot when I read it in my teens, and learned a lot. But revisiting the book in detail has brushed off some of the nostalgia. It is a product of its times.


Wadler's critique seems mostly rooted in Miranda vs. Lisp, rather than critiquing the general contents of the book.

Is there another book that is like SICP in overall thematic content that uses Miranda? (Or I guess Haskell might be a more likely choice today?)


I found How to Design Programs very nice, and it uses Racket (a dialect of Scheme). HtDP was at least partially created as a reaction to the perceived shortcomings of SICP, including Wadler's critique. You can have a look for yourself at htdp.org. Its early focus on graphics and interactivity should jibe well with people yearning for the days of the C64 or ZX or BBC Micro.

For the more advanced reader, "Programming Languages: Application and Interpretation" has a focus similar to SICP's on the workings of programming languages; it's no accident they both have `interpretation' in the name. Read the book at http://cs.brown.edu/courses/cs173/2012/book/

I have also been impressed with Learn You a Haskell for Great Good (http://learnyouahaskell.com/), but it is not very much like SICP.


Fair enough, but try picking up any other computer science book from that era and I bet there isn't a single one that has aged as well.


Thinking Forth was published the same year and is still very relevant.

http://www.amazon.com/Thinking-Forth-Leo-Brodie/dp/097645870...


TAOCP by Knuth

The C Programming Language by Kernighan and Ritchie

The Practice of Programming by Kernighan and Pike


I still like K&R's C book, and used it to teach C to a friend last year.

The Practice of Programming shows more signs of its age. It is a good read, but it could do with an update.


"In my experience, relatively few students appreciate how much they're learning in my course while they're in it."

That quote describes me to a T.

The University of Minnesota taught a sophomore class on OO using SICP. At the time, C++ was hot (this was about 1991 or 1992) and I distinctly remember someone raising their hand and asking "Why are we learning this language that we will never use again? Why isn't an OO class taught in C++?" And I remember thinking to myself "Who cares what language it's in? I'm kind of excited about learning another language." And I truly was.

It wasn't until much later, after being in industry for a decade (currently starting my third decade now ;) ), that I recall thinking to myself: "You know, that book had a tremendous impact on how I write software and program in general. I need to pick that up again - what was it?" And I re-found it and am on my third reading of it as I write this, mostly just to refresh the ideas in my mind (since I mostly work in Java and JavaScript these days). The class took us through chapter 3 and then a mixed bag of topics from later in the book, so I've never truly read the whole book, but certainly enough of it to help me.

I didn't realize it was a famous book at the time, it was just our text book. I still didn't realize it was a famous book when I re-bought it a decade later. And now I know why it is a famous book, it is that good. And though the third re-read is very familiar, I still enjoy it and find it profound.


I'm intrigued that anyone would teach OOP using a language other than Smalltalk. As well as giving you the bits you need to build objects, it puts them together in a dispatch system and class library, fully thought out and tastefully designed, then gives you the tools to take it apart, study how it works, think about why parts of it are clumsy, and consider the consequences of fixing them. Perhaps graphical workstations were too expensive for universities in 1992?


I don't recall there being an open source graphical Smalltalk in 1992.


For those turned off by its mathiness, one of the coauthors explained, "There’s a tremendous amount [of math] in this 6.001 book. That’s partly because we were writing it for MIT. You want to talk about higher-order procedures, so the obvious thing you can do is you can express the idea of an integral. So that’s an obstacle for lots of places. Remember, at MIT the context we were writing in is every person who took the course had just had a semester of calculus."

http://www.codequarterly.com/2011/hal-abelson/
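
The integral example Abelson mentions is a natural higher-order procedure: a function that takes another function as an argument. A rough Python rendering of the midpoint-sum idea from SICP's section 1.3 (not the book's actual Scheme code) might look like:

```python
def integral(f, a, b, dx=0.001):
    """Approximate the definite integral of f over [a, b]:
    sum f at the midpoint of each slice of width dx, then scale by dx."""
    total = 0.0
    x = a + dx / 2  # midpoint of the first slice
    while x < b:
        total += f(x)
        x += dx
    return total * dx

# e.g. integral(lambda x: x ** 3, 0, 1) is close to 1/4
```

The point of the example in the book is that `integral` doesn't care what `f` is; the procedure itself is the reusable mathematical idea.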


Thanks. The mathiness has been an obstacle for me. It's not that I'm incapable of grokking the ideas, just that when I sit down to play with a language I haven't yet mastered, I don't also want to be learning tangential concepts. Still, I love it, and keep coming back to it. It's so readable, and Scheme changed my brain.


That is really a wonderful interview. Thanks for sharing.


You're welcome. You might also enjoy Rich Hickey's interview, as I did. http://www.codequarterly.com/2011/rich-hickey/

"And, once there [writing only code that matters], you are able to achieve a very high degree of focus, such as you would when playing Go, or playing a musical instrument, or meditating. And then, as with those activities, there can be a feeling of elation that accompanies that mental state of focus."


SICP videos are also great. The first part of the Lecture 1A, talks about "computer science". From the first moment, we learn that computer science is also not really much about computers.

"...when some field is just getting started and you don't really understand it very well, it's very easy to confuse the essence of what you're doing with the tools that you use..."

"...Well, similarly, I think in the future people will look back and say, yes, those primitives in the 20th century were fiddling around with these gadgets called computers, but really what they were doing is starting to learn how to formalize intuitions about process, how to do things, starting to develop a way to talk precisely about how-to knowledge..."


"...when some field is just getting started and you don't really understand it very well, it's very easy to confuse the essence of what you're doing with the tools that you use..."

So true of so many fields - photography, for one. I can't tell you how much camera gear I bought before I finally sold it all and started taking photographs.

As it is with computers. Languages mean nothing, it's the concepts that matter.


Gerald Sussman is a fantastic, enthusiastic teacher. It takes a little getting used to, watching him write all that code on a blackboard, but the videos are a great complement to the book.


SICP is a masterpiece. I took the old MIT 6.001 in Spring 1986 where we burned through the entire book in about 14 sleepless weeks. Sussman was the lecturer and Rod Brooks was my recitation instructor. I was like 19 years old at the time and it was the most intellectually beautiful thing I had ever experienced. I ended up acing the course and went on to get a PhD in CS. I've never read a better book on programming.


I wish that had been my experience with it. I did well in the class, but I didn't learn nearly as much as I should have. To be clear, that was entirely my fault, not the class's; I was lazy and got away with a lot of shortcuts. And even when I learned real stuff, I shrugged it off as "not actually how you build programs in the real world; can't I have a for loop?" (so, sooo dumb). After years in the industry and a slow, organic rediscovery of all the great concepts I should have taken to heart as a college freshman, I cracked open SICP and said, "holy shit, this is what I was being taught?" 6.001 really was incredible, had I only been smart enough to see it. Eat your veggies, kids.


I have to confess I just discovered you can order a relatively cheap copy (well, compared to Amazon) of SICP direct from MIT.

I have only skimmed so far, but combined with the lectures, I have to say that I really wish this had been my introduction to programming. Maybe it is just because of where I'm at, but it really feels like the first 3 lectures at least touch on some of the more elusive topics of programming. Sure, you will need to seek out more, but it is not hidden behind "but for this language, you have to do this" kind of crap.


Years ago, I was very interested in having the physical book to read, but too cheap to buy a new copy. Every once in a while I would check half.com and book shopping comparison sites looking for a cheap used copy. One day in December 2006, there was a rift in The Matrix and Barnes and Noble was selling the hard copy 2nd edition for ... $7.00 (new). I ordered two copies and as soon as they were physically in my hands, I tried to order more. Alas, the glitch had been fixed by then. Sometimes I wonder if there shouldn't be a general price scanning app that will alert on wild changes in the price of a popular book.


Like camelcamelcamel.com? I set up price alerts for big wishlist items; it's a great service.


I hadn't heard of that site, thanks! Back in 2006 or so, I was a heavy user of Addall.com for comparing prices across various book sellers. I would point people there and mention that it must be "Amazon's worst nightmare." Fast forward a few years and Amazon used books ate their lunch. Addall.com gave up so completely, their site often returns two hits for the same item, the short and long ISBN. There can't be much profit for Amazon to sell used books, but I suppose it is important to keep even the used book buyers "down on the farm."


You can also read the online version which is free!




> I have only skimmed so far

For the mind-blowing effect to occur, you must actually do the exercises. (-:


Yes, agreed, if you are reading this book, don't skip the exercises! You might need to refresh some math skills (just a bit), but it is worth it.


I was just going to post that they are on my agenda. I have actually toyed with the thought of putting together a "book club" to go through them. I will probably have to put this off till the Fall, though.


https://groups.google.com/forum/?fromgroups#!forum/reading-s...

is a reading group just starting up now if anyone is interested


I've put in my request to join. :) Apologies for leaving the body of the request blank. Didn't realize it was something that would get reviewed.


Link to the book at MIT Press [ Paperback | $49.00 | £33.95 ]

http://mitpress.mit.edu/books/structure-and-interpretation-c...


Amazon's got it for $46.55 (and it's eligible for Prime shipping) and MIT has it for $50; maybe I'm missing something?

http://www.amazon.com/Structure-Interpretation-Computer-Prog...


Only that my information is out of date. :) Last I looked, they did not have this as an option. Well, they did, but look at the hardcover cost; that is all I could find for a while.


When I was a student, my school decided to stop using Ada as its teaching language. They were hesitating between CAML Light and Scheme. They asked my opinion because I was one of the rare students who knew both of them well. I liked both, but I chose CAML Light. Scheme has a lot of parentheses, which makes it look very old-fashioned. Scheme's type system was also outdated. Programming in CAML Light was more fun for me; the syntax was simple and pleasant. Currying was also cleaner in CAML Light. 20 years later, CAML Light does not exist anymore, but I still think that my choice was right. My school chose CAML Light.


I went from SML, Caml Light, and OCaml to Common Lisp, and then to Scheme.

I prefer Scheme by far to any other language I've ever used (which includes a whole bunch of languages not listed above).

For me, Scheme is much more fun to program in than OCaml, etc. With OCaml I found myself constantly wrestling with the compiler to get my program to compile, and had problems deciphering its obscure error messages, which seemed to require taking university-level courses on type theory in order to be understandable.

With Scheme, my programs mostly "just work", and I can program the way I think.

Some people have problems with Scheme's parentheses, but I personally quickly learned to love them, because they make scope explicit and obvious. Through long and bitter experience, I've learned to value explicitness and clarity in programs above almost everything else. Scheme is great for that.

True, Scheme might not be as safe as OCaml and friends, but for ease of programming, speed of development, and sheer pleasure, I find it very hard to beat.


The most useful didactic feature I've found in ML-like languages--Haskell for me, but they're all somewhat similar--was pattern matching. Honestly, pattern matching is what really got me to understand recursion. It also makes the language feel much more declarative than using `if` and `cond` statements.



Hi,

>CAML Light does not exist anymore

French classes prépa[1] used it in their CS courses until this year (I have heard it will be dropped in favour of Python next September). So even though there hasn't been any development for the past 10 years[2], it still exists! And then there is OCaml, which is still lively.

Even though a CS graduate may say that S-expressions are the simplest syntax possible - not that he's wrong - I'm happy I discovered functional programming through OCaml, whose syntax feels way more natural (to me at least).

1: http://en.wikipedia.org/wiki/Classe_pr%C3%A9paratoire_aux_gr...

2: at least according to http://caml.inria.fr/caml-light/release.fr.html


"Scheme has a lot of parenthesis that makes it look very old fashion."

Might I argue that this is a huge advantage? No one knows Scheme, it's very obscure, and it looks nothing like other programming languages. This means that you can teach concepts using it, and pre-judgements about the language don't get in the way.

Then, I suppose the same could be said of CAML and its obscurity.


Which one is obscure?

    (define factorial (lambda (n) (if (= n 0) 1 (* n (factorial (- n 1))))))

let rec fact = function | 0 -> 1 | n -> n * fact (n - 1);;

There is a little more syntax to learn in Caml, but eliminating 90% of the parentheses makes the code far easier to read.


With syntax highlighting and proper indenting, the Scheme code is pretty easy to read:

http://img1.imagilive.com/0413/scheme-fact.png

You could even simplify it a little by getting rid of the lambda:

http://img1.imagilive.com/0413/scheme-fact2.png

In any case, I find toy examples like this less than convincing. In real-world programs, issues of style, design, and documentation usually trump most anything else, as far as clarity is concerned.

Also, I personally find the penchant for one-letter variable names in the OCaml/SML/Haskell world to be very confusing and obfuscating.

The Scheme way is to be more verbose and explicit. For me, that results in much greater clarity -- especially when looking at unfamiliar code or code that you've stepped away from for a few months.

Of course, there is also such a thing as being too verbose and explicit. But for me, Scheme is in the sweet spot between verbosity and terseness.


E.g. Haskell has enough syntactic sugar for an even simpler version:

    factorial n = product [1..n]


Something similar is achievable in Scheme; it just takes a little work: http://repl.it/InA (click "run session") :)


Oh, you don't even need macros for something similar. E.g. Python's range function works just fine.
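
E.g., the Haskell one-liner above translates almost directly (this assumes Python 3.8+ for math.prod; on older versions functools.reduce with operator.mul does the same job):

```python
import math

def factorial(n):
    # product over an explicit range, like Haskell's `product [1..n]`;
    # math.prod of an empty range is 1, so factorial(0) == 1 for free
    return math.prod(range(1, n + 1))
```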


I find the lisp easier to read.


Was the school in France?

These don't seem like very good reasons to choose one over the other; the choice probably came down to culture.


Can you elaborate? Is your school still using CAML Light, or have they moved on to some other language (O'Caml?)? How effective was it as a teaching language?


According to their site, they have switched to Python, followed by Java in the second year. As it was after I finished, I have almost no feedback.


Dynamic typing is outdated?!


Yes, since at least the '80s or so. Languages based on the Hindley-Milner type system (https://en.wikipedia.org/wiki/Hindley%E2%80%93Milner) are static and nice to work with, e.g. OCaml or Haskell.

I have some professional experience rewriting Python and Ruby programmes in Haskell. I make the same amount of stupid mistakes in Python as in Haskell, but whereas Python blows up at runtime / test time, in Haskell it's the compiler yelling at me. The nice thing: that's much faster to detect, and it also saves me writing about half the tests.

Don't get me wrong, dynamic typing beats inane static typing like C's or Java's. But good static typing beats dynamic typing.


I have experience with OCaml, and I don't find its static typing to be nice to work with at all.

I like the final result of safety, and how whole classes of bugs are excluded once I get my OCaml program to compile. But the process of getting my program to compile in the first place is pretty painful and not fun.

Then again, writing endless unit tests in a dynamically typed language like Scheme is not much fun either. But I don't have to write the unit tests until I've written some functional portion of my program (or even the whole thing) and am satisfied with its design and how it works. Then I could add unit tests or even rewrite it in a safe language like OCaml, if I wanted to.

As I said elsewhere in this thread, for fast prototyping and sheer pleasure of programming, I find Scheme very hard to beat.


The soon-to-be-released GHC 7.8 (Haskell) can defer type errors until runtime, to allow you to run your program even if part of it is broken, and you can also add "holes" in place of an arbitrary expression and the compiler will tell you the type of the expression you need to replace it with (of course, this will explode if run). I suspect GHC's error messages are better than OCaml's as well.


Actually, -fdefer-type-errors is already in GHC 7.6: http://www.haskell.org/ghc/docs/latest/html/users_guide/defe...

-XTypeHoles will be in GHC 7.8: http://www.haskell.org/haskellwiki/GHC/TypeHoles


That's good to know, and gives me extra incentive to learn Haskell some day.

On the Scheme side of things, I've heard that the newish (4.8.0 and up) versions of Chicken can perform flow analysis to catch some type errors at compile time, and optimize based on types.

There's also Typed Racket, and Chicken has a contracts egg that allows procedures to have pre- and post-conditions.

For some years now, I've heard predictions that in the future languages will allow their users to "dial up" or "dial down" safety features on demand. I guess the above features of Scheme and Haskell are some early steps along that path. We live in interesting times.


I wrote a simple OCaml compiler for a class in college. I particularly enjoyed using different operators for floating-point and integer math — brilliant usability there, really sold me on static types. Haskell does better with its type classes, except my four attempts to understand monads and arrows have, so far, met with rather mixed success.

Static typing has its place in some people's hearts, and I respect that, but saying that dynamic typing has been outdated since the 80s is (1) trolling, and (2) happens to also just be, like, your opinion, man.


If you want to have another go at it, I recommend Learn You a Haskell for Great Good (available online for free at http://learnyouahaskell.com/) for your Haskell learning needs. Don't stress about the Monads and Arrows so much. I know what arrows do, but can't spot their applications in practice. And still I get paid to write Haskell programmes.

Oh, and please excuse my snarky tone. Your ancestor comment seemed to ask for it. Yes, there are, even now, languages around that are even worse. But better ways have been around for ages. A similar example is garbage collection: it, too, has been around since basically the dawn of programming languages, but only made serious inroads into the mainstream in the last few decades.


Honestly, Python 2.x's dynamic typing doesn't make the experience of floating-point/integer maths much better.

The value of 3 / x changing depending on whether I pass 2 or 2.0 is the trade-off of the Python approach, and I honestly don't know which I prefer.
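
For reference, Python 3 later split the two behaviours into separate operators, which removes exactly this ambiguity:

```python
# In Python 2, `/` on two ints was floor division, so 3 / 2 == 1
# but 3 / 2.0 == 1.5. Python 3 makes the choice explicit instead:
print(3 / 2)     # true division, regardless of operand types: 1.5
print(3 // 2)    # floor division: 1
print(3 // 2.0)  # floor division still follows operand types: 1.0
```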


I agree. A type system is good if it allows you to express everything you want to express, if not (as in Java), it is a tool for oppression.


I'm a big fan of Clojure but I think this is true. I expect that in this decade we'll start seeing a shift to programming languages with so-called algebraic type systems.

Java, Python, Ruby, and Clojure will seem outdated just like C++ seems outdated now (not that it doesn't have its uses).


C++11 is not outdated. Except for backwards-compatibility artifacts like header files, it's actually quite modern and pleasant to use. If performance is a consideration, then, put bluntly, it's the only high-level language available.


hopefully Rust will fill this void, pun not indented


Type inference as in the ML languages can provide a similar development experience, with the added benefits of performance and tooling.

Additionally when doing big projects where the team does not care for unit testing, dynamic typing can turn out to be a big problem.

Having said this, it has its place; I do use dynamic languages a lot, just not in big enterprise projects.


I took Brian Harvey's SICP course; what I didn't realize until I read this article is that I was in the first class he taught, in 1987. I remember helping a friend in the course. We stared at five lines of code while I tried to explain it; he had a really hard time with it (some type of recursion, I assume). I still hold on to the lesson from that tutoring session: write code to be easily read by the average Joe. Python, Ruby, etc. are great because they are easier for more people to read. Languages are about people, not machines. P.S. I am still writing code today, and loving it.


Excellent article. I'm immensely glad and lucky to have had Brian Harvey as a professor, and this echoes exactly what he taught us and the general teaching at Berkeley: that the concepts of programming and computer science were of paramount importance, and that you should understand them in the abstract.

To this day, this is why I value a good theoretical CS education, why I value university education, and why I continue to look back on and use my supposedly theoretical and impractical CS education with great respect.

This is so true: "learning another programming language isn't a big deal; it's a chore for a weekend. I tell my students, 'the language in which you'll spend most of your working life hasn't been invented yet, so we can't teach it to you. Instead we have to give you the skills you need to learn new languages as they appear.'" Can you imagine how valuable it was to be told that from the start?

He's right about scheme too: part of the value of using Scheme or Lisp is its inherent crazy unfamiliarity to the great majority of students. The first thing when you see all those parentheses and weird paradigms is "woah, this is unlike anything I've seen before." And that's perfect. You don't get lost in things you think you already know—you learn concepts fresh and the language, because it's so weird and wild and totally abstract and you can't imagine using it for any real project—is as malleable and temporary as modeling clay. You get to make things with it that teach you about art, and how to look at things, how to see infinity—not about the properties of the materials in your hands.

For this reason among others, it's profoundly disappointing to me that Berkeley has switched their introductory language to Python. As great as it is, it's no Scheme, if only because its popularity and usefulness as a language detract from the underlying concepts. I have to think Brian Harvey would have fought this change, had he not retired and lost some influence.

"Every five years or so, someone on the faculty suggests that our first course should use language X instead; each time, I say "when someone writes the best computer science book in the world using language X, that'll be fine.'"

SICP was, in retrospect, a masterpiece of education. Someone sat down and thought about exactly these concepts and how best to elucidate them without getting in the way of the connections being made. The new trend toward applications-focused education feels to me like the wrong direction, and I don't think I would get as good an education today as I did back when abstract concepts and theoretical computer science were taught as the most important part.

Sure, it's fun and attractive to think about applying what you learn to the revolution at hand quickly, making mobile apps and designing robots and putting everything together into useful applications, but you can learn all that stuff after a couple semesters of awesome CS core theory anyway. And while you might get frustrated for those 2 years as an 18-year-old hotshot upstart startup-minded student, in 10 years you'll look back on it and think, "wow, that really was the most important part."


Sounds like Berkeley is making a huge mistake and losing sight of what a university is there for. They seem to have moved to a more practical than theoretical approach.

This is the type of approach that is more appropriate to a Technical College than a major University. This new approach almost seems like teaching civil engineers how to weld and rivet in order to build bridges, rather than deeply understanding the mathematical theory behind stress and vectors.


Yes, I think they're moving in that direction.

The intro courses, while in Python, still use many of the concepts and text of SICP, so it's not all gone. Many of the concepts translate well, and it's not the end of the world or even a slippery slope—I think that would be overreacting.

There's nothing fundamentally wrong with using a newer and more useful language to teach these concepts—after all, as Harvey always said, the language doesn't really matter. However, it's conceding to rationality just slightly, and you're right, a University, especially Berkeley in my mind, is a place where you're free to learn concepts and think theoretically without needing to find a real-world application for at least a couple years. They should keep that as a core value, and I certainly hope they do even as they use more modern languages.


Does MIT even use Scheme for their intro CS courses anymore? Harvey's article makes it sound like they switched to Python when they went from a "curriculum organized around topics" to a "curriculum organized around applications" - it seems to be the trend everywhere.


MIT did indeed switch from Scheme to Python for their introductory courses (though I've heard they still use Scheme in some more advanced courses).[1][2]

However, I recall that a little later some MIT graduate students started teaching their own version of the classic SICP-based course. I can't seem to find a link to it now, unfortunately. I wonder how that turned out, and if they're still teaching it now.

[1] - https://news.ycombinator.com/item?id=602307

[2] - https://news.ycombinator.com/item?id=530605


Brian Harvey was also teaching his own version of the old CS61A last I heard, but he may be retiring after this year. (I've been hearing about him retiring pretty much since I started at Berkeley 5 years ago, for what it's worth).


There's a variant of 6.001 that's taught in January IAP sessions:

http://web.mit.edu/alexmv/6.S184/


Actually switched entirely to Python and Java. In theory a lot of SICP is taught in the 6.005 Elements of Software Construction course, but since it uses Java, to use Pauli's phrase, for that purpose it's not even wrong.


Northeastern still uses Scheme, but I'm not sure who else does. A real shame.


Nope, Scheme is still being taught in the "Python class". The final project is writing a Scheme interpreter in Python, so students still get to learn Scheme.

http://www-inst.eecs.berkeley.edu/~cs61a/sp13/projects/schem...
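For a flavor of what such a project involves, here is a minimal sketch of a Scheme-expression evaluator in Python. This is an illustrative toy, not the actual CS61A project code: it only handles nested arithmetic, with no special forms, definitions, or error handling.

```python
import operator

def tokenize(src):
    """Split a Scheme source string into parenthesis and atom tokens."""
    return src.replace('(', ' ( ').replace(')', ' ) ').split()

def parse(tokens):
    """Read one expression from the token list, consuming it in place."""
    token = tokens.pop(0)
    if token == '(':
        expr = []
        while tokens[0] != ')':
            expr.append(parse(tokens))
        tokens.pop(0)  # discard the closing ')'
        return expr
    try:
        return int(token)
    except ValueError:
        return token  # a symbol

# A minimal "global environment": just the four arithmetic primitives.
ENV = {'+': operator.add, '-': operator.sub,
       '*': operator.mul, '/': operator.truediv}

def evaluate(expr, env=ENV):
    """Evaluate a parsed expression: numbers are self-evaluating,
    symbols are looked up, and lists are applied as calls."""
    if isinstance(expr, int):
        return expr
    if isinstance(expr, str):
        return env[expr]
    op, *args = [evaluate(e, env) for e in expr]
    return op(*args)

def scheme_eval(src):
    return evaluate(parse(tokenize(src)))
```

With this, `scheme_eval('(+ 1 (* 2 3))')` returns 7. The real project layers special forms (`define`, `lambda`, `if`), proper environments, and a read-eval-print loop on top of the same tokenize/parse/evaluate skeleton.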


s/Berkeley/most universities in the States/


It's also not just CS, though CS is the largest culprit because well... software is eating the world.

No one cares about getting an education. They just want a job.


But there are still areas, let's say for example civil engineering, where you need to know about building materials, some physics, and math to do your job. In CS, on the other hand, if you know how to put together a few web pages in Python and solve half a dozen brain teasers, you may get a job in a very good company.


You're missing my point, which was about priority order.

Most people are going to college hoping to get a job. They're learning because they're told they have to learn, so they do a bad job learning. In a field like civil engineering, you can't graduate unless you pass classes that are simply unpassable without learning some stuff. (I have a friend going through aerodynamics right now and I'm hearing about everything he's learning.)

This is kind of a microcosmic version of what we're seeing in general: we see people as workers first and human beings second. The universities and the kids are following suit.


We created elementary schools. The children who attended them surpassed their peers. Employers began to require them as a signal of competence. Eventually, being such a "universal good", elementary education became a public service of the state.

Then we created secondary/"grammar" schools. The children who attended them surpassed their peers, who had only attended elementary school. Employers began to require a secondary-school education as a signal of competence. Eventually, being such a "universal good", secondary education became a public service of the state.

Then we created universities...

---

Although the inductive step is valid, there's a problem in the assumptions: we already had universities for thousands of years before the introduction of elementary or secondary education!

The traditional "liberal education" of The University, where the upper-class and the cunning go to cloister themselves with one-another and thus boost their mutual productivity in all sorts of status-signalling arts (sounds sort of like TED, doesn't it?) has come crashing head-first into the rising bar of minimum-expected human competence. Success in secondary school no longer tells you anything about a person's class or cunning, and that's forced employers to look for increasingly-lofty-and-meaningless trust-signals. So "everyone who's anyone" expects to go to university now, from the spoilt valley-girl to the farm bumpkin.

Perhaps, in the end, if we want to preserve the "usefulness" of university, we'll chop off the undergraduate portion of it and call that "tertiary school" or something. Everyone gets to go, it occurs at community colleges, probably most of the material comes from Khan Academy and the like. Then the rich and the cunning can go to The University after that.


> Although the inductive step is valid, there's a problem in the assumptions

I'm really not sure who you're addressing. No one is claiming these assumptions.


If you know those things... and are fresh out of Stanford, MIT, or Berkeley


> No one cares about getting an education. They just want a job.

Internal vs. external motivation? Maybe some day, when programming is no longer needed for anything useful, it can really flourish as an art. Like painting, whose principal real-world reason-to-be was supplanted by photography. E.g., I don't understand the obsession with the quest to invent ever more optimal algorithms, because often the less optimal ones are more interesting and elegant.


Honest interest from someone who doesn't know: While I know this is happening at least some other places in the US (it happened to the one I went to and to at least one a friend attended), is it largely limited to this country, or are other countries seeing similar issues?


My university (Helsinki University of Technology) changed the freshman course from SICP to Java in the beginning of the century. The reason was Nokia, who wanted Java programmers straight from the school.

Nowadays they're using Python.

I'm very disappointed I didn't have a chance to study that course. I read the book last year and did some exercises on my own. I've never had as much fun with any CS book as I had with SICP.


Nokia. That company just keeps giving. Some universities in Finland fell into that very same trap a second time and moved towards the holy triad of C#, .NET, and Windows Phone. Utter, shortsighted idiocy.


It is the difference between encouraging plasticity (true creativity, etc.) and indoctrination (control), the latter of which is considered a must when cranking out armies of soldiers, clerks, and corporate programmers.


Thanks for the input. I just started learning Clojure and it's blowing my mind... I felt sorry for some friends in college that started with Scheme but now I'm realizing I missed something pretty beneficial.


"woah, this is unlike anything I've seen before." - it so totally isn't, if you know Ruby and to a lesser extent JS, both popular languages. It's a funky syntax coating on a fairly normal dynamic language, with about three quirks (tail calling, homoiconicity, and call/cc) all of which are more of interest academically than practically.


The funky syntax does enough to knock an 18-year-old freshman out of his one-track mind. That's all you need.


My university also started off the CS programme with Scheme in the mid 90's when I went, and it was a great leveller. Maybe half the kids had been programming a lot before starting uni, and the other half hadn't, and by starting with Scheme, everyone was at the same level, which built a lot of confidence among the ones that hadn't been programming before.

Five years later or so, the great teacher that ran the CS intro course quit, someone else took over, and switched to Java. Typical. :-/


I don't think "quirks" is the right word for tail calling and homoiconicity, and those are powerful features that I miss in practical situations all the time in lesser languages.


> JS

> a fairly normal dynamic language

You call this a "fairly normal dynamic language"?

0 == '' // true

0 == '0' // true

'' == '0' // false
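For context, those results come from the type coercion that loose equality (`==`) performs: `''` and `'0'` each coerce to the number 0 when compared against a number, but compared against each other they are both strings, so no coercion happens. Strict equality (`===`) skips coercion entirely:

```javascript
// Loose equality coerces operands of different types:
console.log(0 == '');    // true:  '' coerces to the number 0
console.log(0 == '0');   // true:  '0' coerces to the number 0
console.log('' == '0');  // false: both strings, compared as strings

// Strict equality never coerces, so mixed-type comparisons are false:
console.log(0 === '');   // false
console.log(0 === '0');  // false
```

This is why `===` is the usual recommendation in JavaScript style guides.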


I had a scary moment, looks like UofC still may do ten weeks of SICP:

http://www.cs.uchicago.edu/courses/description/CMSC/10500/99...


That's an intro course for non-majors. Scheme is used in the curriculum for majors (the 151 course), but it uses the How to Design Programs text, not SICP. The Honors curriculum uses Haskell as its functional language, with touches of some other bits.

On a personal note, I learned using SICP up at Northwestern, but I'd be hard-pressed to justify it to a non-engineering school. Much of the math used in it hits harder on the engineering-style calculus (compute all the things!) than a more discrete math-y style that would be more appropriate for CS programs with a closer affinity to a math department (or schools like the UofC, which does not have an engineering program at all).


I didn't discover SICP until after I had graduated from a good CS school. It's far and away the best programming book I've ever read, even for an experienced developer.


Yep... I was a coder for many years before I discovered it. I'm mad at my university for not forcing it on me!


Ha, this is exactly it! As an 18-year-old interested in computers, you just want to start writing iPhone apps and making the next big web service, but quite often those old geezers know better than you.

Catering to the things that 18-year-olds want has never really gotten us far in society.


I think that's the great thing about SICP and its younger, more sprightly cousin, The Little Schemer: they're both really good reads.


Which would you recommend to read first?


The Little Schemer.

Don't worry if the last few parts don't make too much sense... I went back and forth between the two quite a lot.


Personally, SICP. I found that the format of LS got old fast.


Not to mention the patronizing "Now go make yourself a baba ghanoush pita!" and "This space reserved for jelly stains."

Maybe the humor was just lost on me...


I studied ME, and my only computer course was Fortran, but over the years 70% of my work has become programming. For me, SICP was a great introduction to CS. Up to then I had the typical practitioner's view that computing is all engineering and no science, but SICP changed my view.


Slightly off-topic: you can download the epub version of SICP from here: https://github.com/ieure/sicp


I was working through the Coursera functional programming course on Scala, which is modeled on SICP. While the content is really good, it suffers from the same problems pointed out by Dr. Harvey. Getting used to Scala syntax takes time, while Scheme was a breeze.


Yeah, Scala is the new C++ - immensely powerful but there's ten different ways to do everything.

I like the language, but I'm simultaneously disgusted by it.


"there's ten different ways to do everything"

I believe this observation is a bit misleading. Indeed, Scala does offer some alternatives, but these are helpful in the transition from other languages with less powerful type systems. Idiomatic Scala is not really a hodge-podge of choices.


Trying to have a small language sometimes results in allowing (or having to allow it in the sense of not outlawing it) multiple ways to do the same thing, especially as the expressiveness increases.

As an example, I don't think Python's “there's only one way to do it”-approach would have a chance of surviving if the language was as expressive as Scala.

In the end, as long as Scala stays simpler than Java 8 and vastly simpler than F#, C# and C++, it's good enough for me.


Universities are communities of people from very different sub-disciplines, with every person considering their own sub-discipline or course the most important one, and as a result the curriculum is diluted and hurried to please the greatest number of people. It might have its benefits, but if training great computer scientists is really the goal, I think it is a bit of a pity you cannot go to a place where you could just do a mix of SICP, algorithms, and mathematics for a full three years, spending much more time with a single instructor on a single broad topic.


Actually, you can do something really close. The trick is to study mathematics and specialize from there, e.g. in algorithms or mathematical optimization.


But an average student might need exposure to other "common" paradigms from C, Python, or Java before being introduced to SICP, to "fully appreciate" the power of a functional language.


Pedagogically speaking, I think this is backwards. SICP isn't a course in functional programming; it's a course in programming. And the use of Scheme is critical, because it doesn't waste time on nonsense like classes and structs and other ephemera -- you just jump right into the conversation with the computer.


When I went to Berkeley, those with no programming experience were encouraged to take CS 3 (Simply Scheme) before taking CS 61A (SICP).


At Chicago in 1990, there were two intro classes; if you wanted to study CS (a math major at the time), you took 115/116, which were basically 6.001.


Agreed, that is what is so good about SICP.


Hi everyone, is there a kind of video/guided tutorial through Scheme and SICP? I have tried picking the book up on my own but have had a hard time retaining the information.


There is a video of the lectures from SICP, which was produced for HP. http://ocw.mit.edu/courses/electrical-engineering-and-comput...


The course webcasts of CS61A from berkeley for various semesters are online:

The ones prior to 2010-2011 used Scheme, and the course tracks SICP fairly closely.

http://webcast.berkeley.edu/playlist#c,s,Fall_2010,EC6D76F0C...


I'd like to add that the material covered is just hard to learn, and fairly mind-bending if you're used to traditional (procedural/OO) programming.

Take it slowly, don't be discouraged if it takes you hours to grasp a concept, and DO THE EXERCISES.


I wish my school had used this book. The first year of CS at my school was futzing around with operators and basic data structures. No tie-in to the big picture.


I started an SICP study group at work, where a few of us get together and watch the MIT video lectures given by Abelson and Sussman. People were a little skeptical at first, given that they consider themselves top-of-the-line l33t hackers and this is an introductory course, but after the 3rd lecture, nobody questioned the value of it.

It's a timeless course.


Here's the previous thread for this article, from 5 months ago:

https://news.ycombinator.com/item?id=4784827



