
This is a great article.

To me, the most powerful example is this: "It forced me to think of every such problem as a chain of the primitive list operations, maps, folds, filters and scans. Now I always think in these terms. I "see" the transformation of a container as a simple sequence of these operations. "

That's a higher level of thinking that is faster and yet at the same time more precise than thinking in terms of imperative loops.

Haskell has taught him to think more powerfully.

I recommend people study APL / J / K for the same reason: those languages have the right primitives, ones that make everything expressible as maps, folds, filters, scans, ranges, and the like.

Even more so than Haskell.

It doesn't have monads or the kind of abstractions that let you modify control-flow semantics. It doesn't even have the facilities to build abstract data types, which makes you work with less-abstract data and realize that although some abstraction is useful, most of what is practiced today is useless.

APL / J / K promote container abstraction and concrete, down-to-the-metal work at the same time.
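To make that style of thinking concrete, here is a minimal Haskell sketch (the function name `runningEvenSquares` is mine, purely for illustration) that solves a small problem as nothing but a chain of those primitives:

```haskell
-- Running total of the even squares in a list, expressed purely as a
-- chain of list primitives: map, then filter, then scan.
runningEvenSquares :: [Int] -> [Int]
runningEvenSquares = scanl1 (+) . filter even . map (^ 2)

main :: IO ()
main = print (runningEvenSquares [1 .. 5])
-- squares: [1,4,9,16,25]; keep evens: [4,16]; running sum: [4,20]
```

The whole transformation reads right to left as one pipeline, with no loop variables or mutable accumulators in sight.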


I've always wanted to dive into these guys. Is there a good introduction for fluent Haskell programmers somewhere?


I haven't done anything in Haskell, but I doubt that makes much of a difference.

Maybe more inspiration than introduction, but the "Game of Life in APL" video is a must-see (http://uk.youtube.com/watch?v=a9xAKttWgP4&fmt=18).

The description on YouTube points to a "Game of Life" tutorial at http://tryapl.org. I haven't tried it, but it looks nice.

For a more academic (as in "read about, don't play with") introduction, Iverson's "Notation as a Tool of Thought" (http://awards.acm.org/images/awards/140/articles/9147499.pdf) is a good starting point.


I don't know how I feel about APL now, but I have some funny APL stories. One of my friends was the Jimi Hendrix of IBM APL2; he was amazing. Unfortunately I can't tell the stories, but when he learns Haskell he'll be top-notch. Another was the Dweezil Zappa of APL: he learned the language as a high school sophomore or junior from Iverson himself in Yorktown Heights.


Learning other languages can broaden your repertoire in a similar way. Why is it more important to decompose problems into map/fold/filter/scan than other sets of operations?


Well, when you really boil it down, these are the fundamental operations.

At least, I believe that is true. Someone stronger in CS theory could come along and verify/correct that statement.

This paper [1], for example, appears to argue that fold (foldr in particular) is universal, meaning that any function that traverses a list and resolves it to a single value can be rewritten as a fold. In other words, you may have some algorithm that you think is not a fold, but if it moves through a list and reduces it to one value, then your algorithm is a fold. (I say the paper "appears" to claim that because, in all honesty, there's about 10-15% of the paper I didn't fully grok.)

I suspect that applies to map/filter/etc. After all, if you look at their definitions, they contain nothing but the essence of the operation (map/filter/fold/etc) itself.

So, map/filter/fold/etc are THE operations on lists.

[1] http://www.cs.nott.ac.uk/~gmh/fold.pdf
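That universality can be seen directly in Haskell: the usual list primitives can each be written as a foldr, as the paper shows. A small sketch (the names `mapF`, `filterF`, and `sumF` are mine, chosen to avoid clashing with the Prelude):

```haskell
-- Universality of foldr in miniature: the familiar list operations
-- fall out of foldr with the right combining function.
mapF :: (a -> b) -> [a] -> [b]
mapF f = foldr (\x acc -> f x : acc) []

filterF :: (a -> Bool) -> [a] -> [a]
filterF p = foldr (\x acc -> if p x then x : acc else acc) []

-- Reducing a list to a single value is foldr with (+) and 0.
sumF :: [Int] -> Int
sumF = foldr (+) 0

main :: IO ()
main = do
  print (mapF (* 2) [1, 2, 3])      -- [2,4,6]
  print (filterF odd [1, 2, 3, 4])  -- [1,3]
  print (sumF [1, 2, 3, 4])         -- 10
```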


Interesting. Going through the 4clojure problems gave me a taste of this, and the -> / ->> threading macros make it even simpler.


..to think of every such problem as a chain of the primitive list operations, maps, folds, filters and scans. Now I always think in these terms. I "see" the transformation...

You mean these functions - http://karma-engineering.com/lab/wiki/Tutorial5

Why Haskell, then?)


Type-safe pattern matching would be my reason, followed by separation of effects from pure functions, and the ability to create real data-structure types so we don't have to do things like "(define make-tree cons)" as in the last example on your page.

We can instead give tree a proper type, give operations over the type proper signatures which help to explain what the operations do, and get verification from the compiler at the same time that we aren't mixing things up.

It's hard to beat data Tree a = Node a [Tree a]
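To make that concrete, here is a minimal sketch of that very type with two typed operations over it (`size` and `flatten` are my own illustrative names):

```haskell
-- A rose tree with a real type; the compiler checks every use of it.
data Tree a = Node a [Tree a] deriving Show

-- The signatures document what the operations do, and mixing up a
-- Tree with a list of Trees is a compile-time error, not a runtime one.
size :: Tree a -> Int
size (Node _ kids) = 1 + sum (map size kids)

flatten :: Tree a -> [a]
flatten (Node x kids) = x : concatMap flatten kids

main :: IO ()
main = do
  let t = Node 1 [Node 2 [], Node 3 [Node 4 []]]
  print (size t)     -- 4
  print (flatten t)  -- [1,2,3,4]
```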


(define make-tree cons) is a very natural thing: it is called a synonym, just another word for the same thing in a new context. Moreover, this is the better candidate for a proper way.)

"separation of effects from pure functions" - this I cannot understand. Aren't these functions pure?

And another part of "properness": each node of a tree is a tree itself. Sometimes fewer types is more.)

Well, I'm old-fashioned, but I think that being able to put any kind of data into a list is a strength of a language, not a weakness, and that user-defined ADTs and compound data structures need no explicit typing; they are nothing but conventions.


Think again.

Types provide one important thing: static guarantees. Compile-time type checking can only work with strong, expressive data types, and ADTs are a very good way of enforcing that. (The same goes for Haskell's `newtype` keyword, which really is nothing more than a convention, if you will, a different way of expressing a type synonym, but with a static guarantee: given `newtype Name = Name Text`, you cannot use a `Name` wherever you can use a `Text`, yet you can still easily access the `Text` that `Name` wraps.)

This makes it much easier for your intent to be expressed directly in the code, namely in the data types. Just compare:

    makePerson :: Name -> Age -> Gender -> Address -> Person
    makePerson' :: Text -> Int -> Text -> Text -> Person
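A minimal, self-contained sketch of the difference (using `String` instead of `Text` so it needs no extra packages, and shrinking `makePerson` to two fields for brevity):

```haskell
-- Hypothetical wrappers: a newtype costs nothing at runtime, but it
-- makes passing a raw String where a Name is expected a type error.
newtype Name = Name String deriving Show
newtype Age  = Age Int     deriving Show

data Person = Person Name Age deriving Show

makePerson :: Name -> Age -> Person
makePerson = Person

main :: IO ()
main = do
  print (makePerson (Name "Ada") (Age 36))
  -- makePerson "Ada" 36              -- rejected: raw String and Int
  -- makePerson (Age 36) (Name "Ada") -- rejected: arguments swapped
```

With the primed version, swapping the three `Text` arguments would type-check and fail silently at runtime; here the compiler catches it.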


I really like your executive summary; these functional concepts genuinely help with composing and abstracting solutions. I'm pleased the penny has dropped for me too, and I'm trying daily to encourage my colleagues to spend some time learning these functional fundamentals!

