Programming and Programming Languages (brown.edu)
256 points by xearl on Dec 15, 2014 | 63 comments



His other book was PLAI which was pretty cool: http://cs.brown.edu/courses/cs173/2012/book/

I worked through it for a class, and I distinctly remember a moment where I thought "what the fuck - did I just write a working type inferencer?". It has a great way of simplifying things to the point where adding a complex new language feature isn't a distant theoretical proposition, it seems like a simple and straightforward next step. Of course, it helped that I had a great instructor.

I can't speak for this new book, but it seems like an interesting combo of intro to programming and programming languages. And it's in pyret, which is like the bastard child of python and plt-scheme: http://www.pyret.org/


PLAI is almost completely subsumed into PAPL. With each revision, though, the PLAI material gets reorganized and distributed across PAPL, so at some point it will no longer be an identifiable subset. The idea is that programming inspires new programming languages, while programming languages help us make sense of programs. Therefore, the two should be tightly intertwined. The first edition of PLAI tried one way to achieve this synthesis (but didn't really succeed as much as I'd have liked); PAPL is a fresh attempt at getting there.


Shriram, thank you for PLAI, and I'm looking forward to taking a look at PAPL as well. But one thing I don't like: types (static typing) for the implementation language are an afterthought in your books. This happened with PLAI, which only later switched to the "plai-typed" language (a good move). Now we have PAPL, but again using a dynamic language. As I can see, static typing is a planned feature for Pyret, so I suppose PAPL will switch to static typing in the future, but I think it would be better (for readers/learners, novice programmers) to start with static types in the first place.

EDIT: just skimmed through the text, you seem to be using "type-like annotations" mentioned in Pyret docs.
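For anyone unfamiliar with the idea: annotations of this kind document intent and can later be verified by a checker, but they don't yet cause programs to be rejected statically. As a rough analogy only (sketched in Python rather than Pyret, purely my own illustration, not the book's code):

  # Analogy only: annotations that record intent and that an external tool
  # (e.g. mypy) can check, but that don't statically reject a program.
  def square(n: int) -> int:
      return n * n

  print(square(4))        # 16

  try:
      square("oops")      # accepted before running; fails only at runtime
  except TypeError as e:
      print("runtime error:", e)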

Another thing to note - Pyret is much more readable/enjoyable (and I guess writable :)) than the previous lisp languages you've used. Thanks!


Thanks for your comments; criticism is always welcome! And for your kind words about Pyret, though I actually find parenthetical syntax a bit easier to write. It's still taking me some effort to get used to the irregularities of non-parenthetical syntax.

I don't know why you say types are an afterthought. HtDP teaches a profoundly _typed_ discipline of programming. The datatype mechanism in PLAI continues this with more enforcement. And plai-typed is anything _but_ an "afterthought", right?

PAPL is written as if types work. We have an internal type-checker that I've been using off-and-on this semester (but it's still in development). As soon as it's ready, all programs in the book will pass through it and the book will be almost completely typed.

I'm not going to get into an argument here about whether it is better to start with static types. Though I somewhat share @noelwelsh's viewpoint, I think the full situation is far more complex and demands actual research — HCI research — as opposed to just matters of opinion (including mine).


> And plai-typed is anything _but_ an "afterthought", right?

No, no, plai-typed is great. I meant that the PLAI book initially used a dynamically typed language (and I guess that is preserved in the printed edition), but you switched to plai-typed afterwards, and I believe you'll switch to a fully statically typed Pyret for this book as well, when it becomes ready :)

("shameless plug": Actually, it was me who notified you about the plai-typed effort and you further developed it, during your online PLAI class at Brown ;))

Agree about a typed discipline, but I meant specifically static typing for the implementation (host) language.

As for preferences, yes, Noel actually has a point: learning to debug type errors in a dynamic language is a skill that's useful in the "real world" (unfortunately :))


Aaah, YOU are responsible! Well, then I should publicly thank you for notifying me about that. It was a great addition.


My understanding is that it is useful for beginners to run their programs, complete with type errors, so they can see where they fail. Debugging type errors is difficult without a strong model of evaluation and type inference. Since beginners are learning these very things, type errors can be very confusing to them and hinder learning.

I'm sure Shriram will correct me if I'm wrong.


does this also subsume HtDP?


No, it doesn't. There is a _lot_ of detail to the design recipe of HtDP. Right now the recipe isn't written down at all, just assumed, in PAPL. Over time PAPL will provide a light introduction to the recipe, but it will never reproduce the full level of detail of HtDP. PAPL's intended audience is people who have a little bit of programming experience already — ideally through HtDP — and are ready to take their programming up to the next level.


By the way, you've summed up Pyret quite nicely (-:, though it is more than just that sum (with more goodies on the way).


Thank you - I wasn't expecting a response from you here! "Bastard child of python and plt" is a high compliment coming from me :-)

I keep hearing your name, first because Kathi was my beloved teacher and advisor, second because Justin worked with you, and third because I made Frank Goodman take your class. Maybe we'll finally meet one day!


You seem tied up to all the happiness in my life. Come by and visit sometime -- I owe you lunch. (-:


Will do!


I read the introduction and I really liked the following excerpt from The Structure of This Book:

  We will often build up programs incrementally, just as a pair of programmers
  would. We will include mistakes, not because I don’t know the answer, but
  because *this is the best way for you to learn*. Including mistakes makes it
  impossible for you to read passively: you must instead engage with the
  material, because you can never be sure of the veracity of what you’re
  reading.
I really like this idea, as techniques that make the learner actively think about what they're reading work better than books that don't employ them. It's for the same reason that I recommend the Head First series, even though for some books the pace can be too slow for my liking.

I’d also agree with the author that a good book for readers to learn from does not make a good reference book (and vice-versa).


At first glance this book appears in many ways to be a spiritual successor to SICP... does this hold up for anyone who has read it more thoroughly?


More like a spiritual successor to HtDP, which is somewhat opposed to the SICP philosophy, even though both use Scheme variants:

http://cs.brown.edu/~sk/Publications/Papers/Published/fffk-h...


Interesting...

It seems to me that FP is having quite a renaissance recently, which could limit the relevance of that paper's contentions today. I'm also seeing quite a backlash against OOP, especially against OOP as the One True Paradigm it was pushed as throughout the 00s; this can be seen not only in the growth of FP languages but also in the design of other new languages, like Go, that lack many OOP features.

Honestly, I would argue for the "Never Compromise" route for undergraduates regardless of current trends. I've found firsthand that mid-to-high-achieving high school students are able to grok SICP just fine, even if they haven't been introduced to calculus yet. But I haven't read HtDP or PPL yet, so I can't really pass judgement on which book would be better.


The paper's premise is a broad one. Its authors are hard-core functional programmers, but believe that eventually students have to learn something beyond just pure FP. Too often, introductory curricula are designed with a "screw what happens later" attitude, which ends up being self-defeating. The paper's point is that they should instead provide a gradual progression, and that this progression should be over at least two different time-scales.


> It seems to me that FP is having quite a renaissance recently

It seems that FP is always having a renaissance. I distinctly recall lots of people (including me) talking in 2001-2002 about the big FP renaissance just around the corner. I also recall a few old-school lispers rolling their eyes at us and saying that they were saying the same thing back in the early-to-mid '80s.


Doesn't mean it's not true. There is now hard-core industrial use growing, which has never quite been true in the past two decades. Not to mention, mainstream languages have mostly all now accepted the starting premise of FP (closures as first-class values). That's the first step; there are many more to come.
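To make that starting premise concrete, here's a minimal example (sketched in Python, though any mainstream language with closures now supports the same thing):

  # A closure as a first-class value: it captures a variable and can be
  # stored, passed as an argument, and returned like any other value.
  def make_adder(n):
      def add(x):
          return x + n            # n is captured from the enclosing scope
      return add

  add5 = make_adder(5)            # a function returned as a value
  print(add5(3))                  # 8
  print(list(map(add5, [1, 2])))  # [6, 7] -- passed as an argument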

Old Lispers' eyes are stuck in a permanent roll position, like the Agena-Gemini 8 dock.


I am curious what evidence there is that SICP is responsible for "[many introductory] courses quickly disappeared again due to shortcomings of the book and the whimsies of Scheme."


It's a comment based on our view of the historical record, having been present during the rise and fall of SICP. It is not a formal evidence-based claim.


Fair enough. And I hasten to add that I do not have any counter evidence. I think I heard rumors of the book when I was at gatech in the late 90s. Never took a course in that book, though. I wasn't in CS. CmpE was an interesting beast back then.

It seemed that the money was heavily pushing Java and that most curricula immediately jumped to where they thought the future money was. Sometimes they would back off and try to get back to a more theoretical or instruction-based methodology, but would always hedge with a language that was seen as friendly to industry.


@taeric, by "the book" do you mean SICP or HtDP? HtDP was used for a few years at Georgia Tech, and then that stopped. There are various views on why. (-:


Apologies for the ambiguous pronoun. Sadly, I wouldn't have known one from the other back then. When I found SICP, some of the things I recalled hearing from CS friends made sense. So, I assumed that was the book.

Curious if there are any expansions on why it was dropped.


How is HtDP opposed to the SICP philosophy? From my reading, both come off as espousing more or less the same ideas for building programs.


Here's a good quote from the paper: "... SICP suffers from a serious flaw. While the course briefly explains programming as the definition of some recursive procedures, it does not discuss how programmers determine which procedures are needed or how to organize these procedures. While it explains that programs benefit from functions as first-class values, it does not show how programmers discover the need for this power. While SICP introduces the idea that programs should use abstraction layers, it never mentions how or when programmers should introduce such layers of abstraction. Finally, while the book discusses the pros and cons of stateful modularity versus stream-based modularity, it does so without explaining how to recognize situations in which one is more useful than the other."


True; having discovered SICP while deep in college, I had already gone through many languages, paradigms and such, which gave me the grounding for SICP's ideas. For a newcomer it might be pure fluff.


One is that SICP had a fair amount of math (I believe caused by Electrical Engineer authors), so while you're trying to grok recursive logic, you also have to deal with arithmetic invariants (convergence with reals). HtDP had much less. Based on https://www.coursera.org/course/programdesign (which is inspired by HtDP), you deal only with simple data and logic at first, but focus on regular and precise analysis of ad hoc types to derive solid program structures. I felt anyone could follow the ideas in it.


SICP and HtDP are two very different books that have the unfortunate feature of using somewhat similar languages (which are syntactically very similar, hence causing the confusion).

I was recently re-reading SICP, and was amazed by how cleverly structured it is. It somehow goes for an entire fifth of a book without any data structure at all. In retrospect, I consider it something of a virtuoso achievement to have pulled that off while writing a book that is utterly compelling on every page. However, the only way it can achieve that is by focusing _entirely_ on what we call "domain knowledge". As the kind of person who has always loved the play of numbers, I was immersed fully and barely noticed what was happening when I read it as a student. To most students, however, this material is both uninteresting and difficult, and difficult material that isn't even relevant to the domain is especially uncompelling.
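To give a flavour of what I mean (a rough sketch in Python rather than SICP's Scheme, and only an approximation of the book's material): early SICP computes with plain numbers and higher-order functions, with no data structures in sight.

  # Roughly the flavour of early SICP, sketched in Python: higher-order
  # functions over plain numbers, no data structures at all.
  def fixed_point(f, guess, tol=1e-9):
      """Iterate f until the value stops changing (within tol)."""
      nxt = f(guess)
      while abs(nxt - guess) > tol:
          guess, nxt = nxt, f(nxt)
      return nxt

  def average_damp(f):
      return lambda x: (x + f(x)) / 2

  def sqrt(x):
      # sqrt(x) is a fixed point of y -> x/y, stabilised by average damping
      return fixed_point(average_damp(lambda y: x / y), 1.0)

  print(sqrt(2))  # ~1.4142135623...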

Of course, diving into data early means you need a lot more help structuring your programs. HtDP focuses precisely on that problem, _using_ the data to drive the structure (inspired by earlier efforts such as The Little Lisper). And this highlights exactly how very different the two books are. It is much like the comment about the US and UK: "two countries separated by a common language". The coincidence of using (almost) the same language in both books has, sadly, created a great deal of confusion on this point.
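And by way of contrast, here's a rough sketch of "the data drives the structure" (again in Python, not the teaching languages HtDP actually uses; just my own illustration of the idea): a self-referential data definition, and a function whose shape mirrors it, one case per variant and one recursive call per self-reference.

  # "The data drives the structure", sketched in Python: the shape of the
  # data definition dictates the shape of the function that processes it.
  from dataclasses import dataclass
  from typing import Union

  # A NumList is either Empty() or Cons(first, rest), where rest is a NumList
  @dataclass
  class Empty:
      pass

  @dataclass
  class Cons:
      first: float
      rest: "NumList"

  NumList = Union[Empty, Cons]

  def total(lst: NumList) -> float:
      if isinstance(lst, Empty):               # one case per variant
          return 0.0
      else:
          return lst.first + total(lst.rest)   # recur where the data recurs

  print(total(Cons(1.0, Cons(2.0, Cons(3.0, Empty())))))  # 6.0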


Wasn't HtDP written to address issues in SICP? Completely or partially? I always felt that HtDP shared more with SICP than just a language to express programs in.

> It somehow goes for an entire fifth of a book without any data structure at all

Funny, in the MOOC I linked, from what I remember, they spent a lot of time writing programs without compound data types too, only using naive conventions (item0, item1, ...) to let us recognize the logic behind the complex types in the problem data and let it sink into our minds. I don't know how close to the book this was, though, so I don't want to jump to conclusions.


> Wasn't HtDP written to address issues in SICP? Completely or partially?

Absolutely. (Though that wording suggests a stronger causal link than was really present. It was written to address a need, a need that happened to not be addressed by SICP.)


> One is that SICP had a fair amount of math (I believe caused by Electrical Engineer authors)

Errr, not really, both are CS professors, although Sussman is very interested in EE and teaching it.

Rather, it was written for MIT students, who arrive at a bare minimum ready to learn the calculus, and taking the SICP course 6.001 was strongly discouraged during your first term (it's a lot of work, even for students who know programming and Lisp).

And the core curriculum 6.001-4 that all EECS students had to take back then included two not-optional, math-heavy EE courses. MIT's vision of what a CS student should know includes a lot of EE, which is common in departments that started from EE (e.g. UC Berkeley), and less so in ones that didn't, e.g. those that started from Math.

That said, most any MIT student could take 6.001 and benefit; the use of math is simply based on the fact that it was a common body of material all the students taking 6.001 would know.


The paper describes it; it's only 14 pages and is quite readable.


What @pgbovine said. (-:


read the linked paper :)


With all due respect to @pgbovine, it is intended as a spiritual successor to both, but is also its own thing. It is being built atop the design methodology of HtDP, which SICP sorely lacks. It has some topical overlap with SICP. But it has an entire second textbook contained in there about the study of programming languages, which neither SICP nor HtDP has. Hence the title: P _and_ PL. It's two books in one, but (increasingly) intertwined.


The author is one of the big people in the Racket curriculum.


So that is why this looks like PLT's documentation format.


Yep, it's written in Scribble [http://docs.racket-lang.org/scribble/]. Which is magical for writing substantial documents. You notice the build time for your book has crept up to 30 seconds because you've forgotten to run the separate compiler in a while. You do it — which compiles all the chapters (taking a little over 30 seconds) — after which subsequent compiles are back down to 4 seconds. Good luck getting that with most other writing tools.


@shriramkmurthi may I know what license PLAI and PAPL are released under? I couldn't find it on the linked page.


PLAI's license is CC BY-NC-SA (3.0 US). I have not yet decided for sure with PAPL.

At any rate, I'm committed to having a full copy of the book available for free to all, as is also true of HtDP (we were one of the pioneers in this space) and PLAI (which has even more free/commercial options than HtDP).

You may assume at least a CC BY-NC-ND, which lets typical users have at it right away.


I have my own odd way of rewarding CC-licensed content:

If it has no DRM but is 'all rights reserved', and I really want a copy, I wait for a chance to buy it at a deep discount.

If it is released under a CC 'non-commercial' license, I buy a copy at full price.

If it is released under a CC license which allows commercial use, I pay extra for a copy (at sites like www.leanpub.com) or buy 2 copies.

At any rate, I would gladly pay for a copy (or two) of PAPL if the option were available and a CC license declared.

BTW, thanks for this resource!


With PLAI 1/e, I did something much simpler. I made a pay version of the PDF available, making clear it was also available for free, so people would only be buying it to reward the author. I'm pleased to say some number of people have done so with regularity. Search for "Electronic Version" on http://cs.brown.edu/~sk/Publications/Books/ProgLangs/2007-04....


Could you please change the body text color to something closer to #000, if not #000 itself? The current contrast ratio fails the WCAG AA and AAA standards for normal-sized text.


I like their definition of an object:

"The simplest notion of an object—pretty much the only thing everyone who talks about objects agrees about—is that an object is:

a value, that

maps names to

stuff: either other values or “methods”."

(I don't want to start a debate or anything, just thought it was a practical, minimal definition)
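To make it concrete, here's a literal-minded sketch of that notion (in Python rather than Pyret, and my own illustration, not something from the book): an object as a value mapping names to values and to functions that act on it.

  # An "object" in the minimal sense quoted above: a value that maps names
  # to other values and to "methods" (functions over those values).
  def make_point(x, y):
      point = {"x": x, "y": y}                          # names -> values
      point["dist_to_origin"] = (                       # name -> "method"
          lambda: (point["x"] ** 2 + point["y"] ** 2) ** 0.5)
      return point

  p = make_point(3, 4)
  print(p["x"])                  # 3
  print(p["dist_to_origin"]())   # 5.0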


Minimal definition: "behaviour and state".


It's surprising to me how few developers, even hardcore OO developers, are able to just give this minimal definition. It's the core of the entire paradigm, that an object is state and behaviour on that state, and yet everybody always gives some convoluted, hardly understandable-to-a-beginner definition. If you're going to be teaching people, "behaviour and state" should be your starting point, not "a value that maps names to stuff". Seeing how the latter maps to an OO language is not trivial, and it absolutely should be.


You should read the chapter. Apparently even Alan Kay doesn't quite agree with your view. Also please note that I did not offer this as a _definition_. What I wrote, quite carefully, was: “pretty much the only thing everyone who talks about objects agrees about”. I stand by that, because there are people who don't agree with what you say.


Alan Kay disagrees that objects exist to 'encapsulate state'. I agree with him. They also encapsulate behaviour. While it's difficult to get a precise overview of Kay's definitions (he never gave them explicitly), this page does a good job of collecting the various bits: http://c2.com/cgi/wiki?AlanKaysDefinitionOfObjectOriented

Notable for the "behaviour and state":

   3. Objects have their own memory (in terms of objects).
   4. Every object is an instance of a class (which must be an object).
   5. The class holds the shared behavior for its instances (in the form of objects in a program list)
But my point is that I don't agree that an object is "a value that maps names to stuff". I bet even Alan Kay would disagree, seeing as he was a proponent of the Actor model style of working, where objects exchange messages instead of method calls or values. In that case, there are no names to map; there is incoming data that matches a protocol. The object itself determines what behaviour to enact on its own state. Seeing an object as just values and methods is too simplistic, and at the same time too vague to mean anything.

And more importantly, as a first sentence to explain what an object is, it's misleading and will raise a lot more questions later.


Diving into the middle of a large and complex document and extrapolating from that is always a dangerous venture. The style of the book is to gradually reveal things as it goes along, often even starting with incorrect statements, because we learn best when we see missteps, not only from perfect solutions. Therefore, there's nothing wrong with "starting from" a statement that may not be where we end up. It's meant to be a pedagogic instrument, not a dictionary or other book of definitions. Every chapter is written in this style.


I agree, my gripes are not with the entire document, or even with the chapter in question. I also don't have a problem with starting from a statement that may be wrong in an informative way, it's a valid approach to learn a subject.

My problem is that it's not presented as a starting point or even implied that the definition might be wrong. It categorically states "The simplest notion of an object—pretty much the only thing everyone who talks about objects agrees about— is". If it had said "One notion of an object -one that many people who talk about objects agree on- is", all would be dandy. When you're stating things as fact in a document dedicated to learning, they should be solid.

My other gripe is that I've heard too many vague, biased or incomplete answers to the question "what is an object". These are structures people use every second of every day, but they've never learned a proper definition for them?


> My other gripe is that I've heard too many vague, biased or incomplete answers to the question

Well, "behaviour and state" is also incomplete, because it fails to mention anything about how you refer to parts of state and particular behaviours.

"a mapping of names to values and functions" on the other hand mentions both state ("values") and behaviour (functions) and also tells you that they have names.

> they've never learned a proper definition for it?

Well, there are many different valid definitions for objects, and even more for OO. There are people who don't know any of them for sure, but I strongly suspect that you've dismissed many good definitions as "vague, biased, incomplete" on the grounds of them not being your preferred one. You did it at least once just now, when you failed to realize that "a mapping of names to values and functions" is the same thing as "[named] behaviour and state".


How you refer to state or behaviour is not strictly relevant to the definition of an object, because it depends on how you implement it. It might fit the language it was used in, but it doesn't fit others. Behaviour might be reactive to state changes, or it might be perpetual in a nameless loop; it doesn't matter. In the same vein, referring to state as values and behaviour as functions is too narrow. The lifecycle of the object is part of its state too; an object's construction and destruction are part of its behaviour. That may sound like hair-splitting, but it can seriously trip you up and give you a wrong idea of what an object is in non-conventional settings. An object is more than a map of 'name' to 'value or function'.
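To make that concrete, here's a rough sketch (in Python, purely for illustration) of the kind of object I mean: state plus behaviour living in a message loop, with no user-visible mapping of names to methods at all.

  # State plus behaviour, with no name-to-method map: a message-processing
  # loop that decides what to do with its own state based on incoming data.
  def counter_actor():
      count = 0                       # the object's private state
      while True:                     # the object's behaviour: a loop
          msg = yield count
          if msg == "inc":
              count += 1
          elif msg == "reset":
              count = 0
          # anything else is simply ignored

  actor = counter_actor()
  next(actor)                 # start the loop
  actor.send("inc")
  actor.send("inc")
  print(actor.send("inc"))    # 3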

It's possible that my dismissal of many definitions is partly because they're not my preferred one. I'm human, so it's most likely the case. I like to think my preferred one at least came about through some investigation, and it's the only one that I've seen fit a wide enough variety of cases. For me, at least part of the validation comes from the fact that I was frustrated at the definitions people gave me long before I found one I liked. Too many conversations with college teachers, mentors and colleagues that went

  "So how would you define/describe an object?" 
  -"..."
  "OK, but what about [language/edge case/model]?"
  -"Oh yeah..."
I'll gladly have this conversation again, and again, and again, and I hope somebody can poke holes in the definition I prefer, and put me on the other side of it.


The difference between you and me is that I've seen enough different notions of object (and defined some of my own) that I don't believe there is any such definition. In class I discuss why, though it's difficult to transcribe such an interactive discussion; it is essentially the other side of "this conversation again, and again, and again".

I don't find your definition particularly useful, either: it's too encompassing to really be of much use _to me_. By this definition just about all computation is an "object", which doesn't really tell me anything interesting.

Also, as a general pedagogic principle, I'm opposed to definitions that are too vague to say much to a middle-of-the-pack undergraduate. I'd definitely put your definition of objects in that space. It sounds like good stuff—it's certainly "quotable"—but the typical student would be hard-pressed to actually explain it or put it to use.


It's definitely possible that there is no one good definition, but it's a good discussion to have. I applaud that you treat this with your class.

The usefulness is debatable. To me, it's been useful because it helped me see that, for example, a process can also be an object, and you can apply object oriented rules to interacting processes. The Actor model is an extension of that. Modelling a set of Microservices as an object model has also proven valuable.

How you communicate it to students is not an easy matter. I'm opposed to giving definitions as absolute truth if they aren't, even if you're doing it to prove a point. If you want to teach with different definitions, that is of course perfectly fine, but at least a mention of the debate could prevent some later problems. People tend to adhere to the 'laws' they learn for a long time, and if you teach them a definition that is incorrect in some cases, they'll have trouble with those cases.


> [...] An object is more than a map of 'name' to 'value or function'.

Ok, after that explanation I understand what you mean. I could argue that, for example, object construction and destruction semantics are not characteristics of an object itself but rather of a particular object system or "meta-object protocol" being used; or that it's almost impossible to talk about object destruction outside of a particular language context, because it relies on many different language-level features to work. But that would be too much hair-splitting for me :)

On the other hand I could argue that a set of functions is enough to express every possible type of behaviour and that a set of values is enough to express any possible state. But this in turn depends on the definitions of "value" and "function", which are highly language specific.

So, after thinking about it for some time, I think I can agree that "behaviour and state" is the best (possible?) definition of those which avoid diving into any implementation details and language specific features.

I'm not entirely convinced that such a definition is very practical though. You probably won't need that level of generality until you decide to implement your own object system. It won't help you (I think?) in OO design or OOP inside of some particular object system.

But anyway, you're right in that your definition is the best one in the category of "general" and "minimal" definitions of what objects are.

Sorry for ad personam in my previous comment and thanks for the discussion, it was enjoyable :)


The practicality is a very personal thing. For me, it's helped me see that a wider variety of things match the definition. Realizing that an Actor model, communicating processes, microservices, or nodes in a storage model can all be seen as objects, and applying some object-oriented modelling techniques to them, has been a very powerful tool for understanding and building them. This value is entirely personal to me, though.

No apologies necessary; it was absolutely fair to call me out on a bias on my end. Thank you too for the discussion :)


I'd like to learn from this book.

> If you ask Pyret for the value of this program, it will unsurprisingly tell you that it’s 3.

I am not sure how to do that. In [this tutorial](http://www.pyret.org/docs/latest/A_Tour_of_Pyret.html#%28par...) everything is a test. Is the program 3 supposed to be run as a test as well? Just running 3 in the editor does nothing. Where can I read up on Pyret syntax so I can follow this book?


Sorry about the confusion. Looks like you've already found https://code.pyret.org/editor (but the link is here for the information of others). You can run programs by typing them in the interactions area (a.k.a. REPL), which is the area on the right. To wit:

  > 3
  3
  > 1 + 2
  3


I see. I was typing my code on the left half of the window and expecting the result to show up on the right half. It works now. Thank you.


I know the answer to the following question is probably scattered in the comments below but, to get a concise/precise version of it in one place, I am going to ask it anyway: how does PAPL compare to SICP, HtDP, PLAI, and EOPL, considering there is quite some commonality amongst them? What does each book offer, what is the intended audience for each book, where do they overlap, and which would be a good one for a novice to read first?

Thanks in advance. :-)


This is great. I'm building an interpreter in F#, and the clear explanations here have resolved several small questions I had before.



