Becoming Productive in Haskell (mechanical-elephant.com)
354 points by pieceofpeace on Apr 24, 2015 | 203 comments



Scripting languages try to seduce you into just fiddling around until the output looks like something you want. While that quickly gives you some results, I think it's a huge roadblock in the mid to long term. Especially when programmers are only familiar with "easy" scripting languages, there are rarely insights about the general approach to the problem until the project has already grown into an abomination.

While fiddling around is still somewhat possible in Haskell, the language itself makes it quite difficult. Haskell kind of forces you right at the beginning to pause and think "Well, what is it that I'm actually trying to do here?" It lets you recognize and apply common patterns and implement them in abstract ways without having to think about what kind of values you actually have at runtime. In that way Haskell is the most powerful language I know.

Have a tree/list/whatever? Need to apply a function to each of the elements? Make your tree/list/whatever an instance of the Functor type class and you're done. Need to accumulate a result from all the elements? Make it Foldable.
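
For example, a minimal sketch of that idea for a made-up binary tree type (the type and instances are purely illustrative, and assume a GHC where Foldable is in the Prelude):

  data Tree a = Leaf | Node (Tree a) a (Tree a)

  -- fmap applies a function to every element while preserving the shape.
  instance Functor Tree where
    fmap _ Leaf         = Leaf
    fmap f (Node l x r) = Node (fmap f l) (f x) (fmap f r)

  -- foldr accumulates a result from all the elements, right to left.
  instance Foldable Tree where
    foldr _ z Leaf         = z
    foldr f z (Node l x r) = foldr f (f x (foldr f z r)) l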

Something depends on some state? Make it a Monad.

You either get a result or you don't (in which case any further computations shouldn't apply)? Use the Maybe Monad.
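
As a tiny made-up illustration, the first Nothing short-circuits everything after it:

  -- If any step produces Nothing, the remaining steps are skipped.
  halveTwice :: Int -> Maybe Int
  halveTwice n = halve n >>= halve
    where
      halve x
        | even x    = Just (x `div` 2)
        | otherwise = Nothing

  -- halveTwice 12 == Just 3, but halveTwice 6 == Nothing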

You need to compute different possible results? Use the List Monad.

Need to distinguish three different possible values that are different compositions of elementary types? Make yourself your own type and pattern match the behavior of applying functions.

Need to output in a certain way? Make it an instance of the Show class.
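
A toy sketch of those last two points, with a made-up type whose values come in three different shapes:

  data Contact = ByEmail String
               | ByPhone Int
               | Anonymous

  -- Behavior is defined by pattern matching on each shape of value...
  describe :: Contact -> String
  describe (ByEmail addr) = "email " ++ addr
  describe (ByPhone n)    = "phone " ++ show n
  describe Anonymous      = "no contact info"

  -- ...and output in a certain way comes from a Show instance.
  instance Show Contact where
    show = describe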

Most concepts that are used every day have some kind of idea behind them that is abstract and implementation independent. Haskell kind of forces you to reference those ideas directly. The downside is that you actually have to know about those concepts. However, knowing about such concepts also makes you a better programmer in other languages, so it's not like it's a bad thing.


The ability to fiddle isn't tied to static or dynamic typing; it's just that every static language platform I've seen, including the Haskell Platform, neglects to make fiddling easier.

When trying to build complex software in Haskell, I find myself spending a lot of time commenting/uncommenting swaths of code, just so I can get part of an algorithm to load in GHCi. It sucks. What I wish would happen is GHCi allowed me to load just the things that type check, and skip the rest, so I can fiddle. This is definitely possible. Refusing to compile on errors is great for production, but not while developing.

Software is built in pieces; if I'm working on one piece, another statically unrelated piece shouldn't prevent me from working. In this regard GHCi (and many static languages) makes developing more complex than dynamic languages do, but again it's not intrinsic.

I also wish that when I run my tests, it listed all the type errors, as well as running the tests on the code that does type check. Having more safety mechanisms in Haskell helps with writing correct code, but compiling doesn't mean the code works. Automated testing is still more useful for writing software that works. Haskell isn't as safe as many people think [1].

    sort a = a ++ a  -- it compiles, so it must sort
[1] http://hackage.haskell.org/package/base-4.8.0.0/docs/Prelude...


> What I wish would happen is GHCi allowed me to load just the things that type check

Use `-fdefer-type-errors` (should work with both GHC and GHCi), all errors become warnings, and if you try to use a function which was compiled with an error, you get a runtime error.
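
Roughly like this (MyModule.hs is just a placeholder):

  $ ghci -fdefer-type-errors MyModule.hs      # from the command line

  > :set -fdefer-type-errors                  -- or from inside GHCi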


Thus perfectly illustrating Cunningham's law.


Of course compiling successfully does not imply a correct program with the desired behavior. Nobody claims that GHC is able to verify the behavioral correctness of your program, and it can't tell you whether your function actually sorts a list or not. However, if you claim in your definition of "sort" that it takes a list and returns a list, then GHC can verify whether that is actually true. That is the power of types. They are not magic things that write correct programs for you.


IIRC, the Show class is supposed to be the reverse of the Read class -- so you should not use it to pretty print stuff.


It's not a pretty-printer. It's fine for printing stuff when you don't care what it looks like most of the time. Pretty printing is a vague term though. There are certainly many ways to pretty-print in Haskell, some of which are bi-directional pretty-printers/parsers.


More typically, "Show" is whatever you want it to look like in the repl and "Read" doesn't get used. It's still useful for Show to be essentially Haskell syntax though so you can copy and paste into the repl.


Rust learned from this split, and has both a Debug trait (which is the inverse of Read) and a Display trait for showing nice versions of things.


Yeah, technically I agree.

But I must confess, I used it in the past to have a convenient way to print out something like: 'Add "x" "y"' as "x + y". I didn't care about using read to turn it back into the internal representation since expression parsing is kind of difficult. I used Parsec instead. So I had show to output and a parser as the inverse.
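
Something like this, roughly (a reconstruction from memory, not the actual code):

  data Expr = Add String String

  -- Show used purely for convenient output, not as an inverse of Read;
  -- parsing went through Parsec instead.
  instance Show Expr where
    show (Add x y) = x ++ " + " ++ y

  -- show (Add "x" "y") == "x + y"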


A lot of things are supposed to work certain ways but in fact I never really used `Read` for anything more than reading simple user config in scripts. Usually my data is serialized/deserialized from more common formats such as JSON.


Requirements Uncertainty: Fast iteration has many benefits - especially when you (or your client) don't actually know what's required ("agile"); and when what is required changes quickly, because of changes in competitors, customers, technology, regulation etc.

But you're right, as projects get larger, a priori design and static types quickly become essential. And at that point, requirements are usually known and frozen.

I predicted that the natural resolution would be languages with both (i.e. optional static types, especially at important interfaces) - but while this feature exists, it hasn't taken off.

Instead it seems that performance is the main attraction of static types in the mainstream (Java, C#, Objective-C, C, C++); and ML-family and Haskell are popular where provable correctness is wanted.


To be honest, I think Haskell is the best language I know of for fast iteration as well. You can successfully create and manage far worse, yet working code when you've got a good type-driven guardrail. A common pithy quote, which I agree with wholeheartedly, is that "`IO` is the best imperative language out there today".

There's definitely a lot of missing documentation about this folk practice of "fast, loose, shitty Haskell" due to the strong culture of pretty code that's also enabled by Haskell. I remember seeing a video presented at CUFP that went into the merits here, though.

Essentially, this is a "tricky" concept because you want to design your types to be exactly as restrictive as you can afford without having to think too much. It probably requires a good grasp of the Haskell type system applied in full glory in order to bastardize it just right.

So, tl;dr?

I think types are the ultimate fast iteration tool, but this is not a well-documented practice.


"Civilization advances by extending the number of important operations which we can perform without thinking about them." A. N. Whitehead.

It's often said that a Lisp advantage is to be able to write sloppy - without fully understanding what's needed. I guess you can model that degree of "less constraints" in Haskell as well? Otherwise constant forcing "understand what you're doing" can sometimes be a burden.


> ML-family and Haskell are popular where provable correctness is wanted.

Haskell doesn't have a theorem prover; you may want to check out Idris [1]. Haskell gives you more assurance than, say, Java that you aren't going to get runtime errors, but not complete assurance. You'll still need automated testing for correctness, to make sure you're not getting garbage in, garbage out.

[1] http://dafoster.net/articles/2015/02/27/proof-terms-in-idris...


Absolutely, Haskell has a richness of concepts to it that's entirely in a class of its own... but that being said, Haskell is to programming what sex is to life (for a woman... because men's natural enjoyment of sex throws off my following analogy). Granted there is richness and joy in finally discovering it, one needs to realize it's important that one isn't forced into it too early. Haskell demands a lot of thinking up front and therefore requires a programmer of a certain logical / organizational maturity in order for the process not to be frustrating and painful (or even traumatizing). Haskell (like C, Lisp, and the arcane slog known as Erlang) should rarely be someone's first language.

And that's where Ruby, Python, Javascript, and to some extent Matlab come in. For whatever else people may say about them later (they don't scale, they're a roadblock, they're a mess, null is not a function, etc.), they were there for you when you were programmatically young and they introduced you gently into a world that's otherwise extremely complex.

After all, programming, like literally everything else, is 99% human and 1% logic, machines, data, "scaling", etc. Programs are written by people for people (incidentally they can also be read by a computer), so it's incredibly important that the 99% of that equation (you the programmer) doesn't become discouraged at the outset by an extremely elegant, expressive, but rather rapey language before you're ready for it. In that sense, it's absolutely okay to be "seduced" by an easy scripting language in the beginning. Eventually, though, when you start lamenting "undefined is not a function" and how that could be so easily avoided with proper type-checking, that's your body telling you that you're ready for Haskell now.


> And that's where Ruby, Python, Javascript, and to some extent Matlab come in. For whatever else people may say about them later (they don't scale, they're a roadblock, they're a mess, null is not a function, etc.), they were there for you when you were programmatically young and they introduced you gently into a world that's otherwise extremely complex.

Well, no, they literally weren't there for me when I was new to programming. (MATLAB existed then, but I wouldn't actually see it for more than a decade.)

And while I don't think they are bad languages for beginners, I don't see a clear argument presented as to why they are superior for that purpose (just a somewhat vulgar analogy that presumes that people share your subjective opinions about the languages involved.)


> (for a woman... because men's natural enjoyment of sex throws off my following analogy)

Pro tip: If your analogy needs a disclaimer that perpetuates gender stereotypes for it to work, then it's probably sexist.


The parenthetical statement by ffn, although a bit tongue-in-cheek, is largely "men want sex more than women." Which, judging by the statistical consumption of porn across the world, is resoundingly true. I get your point about not perpetuating gender stereotypes, but the whole spirit of that movement (feminism, equality, what have you) is to not perpetuate gender stereotypes where they are irrelevant - as in when sex is not directly involved (e.g. leading a corporation, programming a computer, playing with children, etc.)... But you will literally not find a field where sex is more directly involved than sex itself. Granted there are always exceptions, but the statement that "men want sex more than women" is sexist in the same way "men are physically larger than women" or "women have a higher % body fat than men" is sexist.

In other words, it's sexist in the sense that we recognize there is a biological difference between the sexes - we're not applying it to infer that men are automatically rapists, or that women are automatically unable to make executive decisions. So maybe instead of playing around with labeling terms that carry a lot of negative connotations, you could actually consider the circumstance and context of what is being said before you label.


so men want sex more than women because of some statistic that you didn't even bother to fully pull out of your ass? nice


Here's a study (scroll to page 6 for the table):

http://www.hawaii.edu/hivandaids/Gender_Differences_in_Porno...

Now combine that with some data from Christian Rudder's Dataclysm (just a link to a info-pic + summary article here):

http://www.bloomberg.com/bw/articles/2014-09-04/mining-okcup...

Men consume a lot more porn than women and hunt for casual sex a lot more than women. Actually, if you weren't so rustled, you could've just googled "consumption of porn by gender" and gotten a lot more results than the two I put up there. But yeah, way to not walk away and accept that someone else has a valid point, and feel free to continue loudly crying "no, your stats suck", "give more sources", while hiding behind a throwaway account and throwing out sensational accusations of "sexism!" for the sake of accruing karma on your main one.


Taking only graphic porn into consideration is a bias; women consume 'porn' in the form of romance novels.

"By and large, men prefer images and graphic sex sites; women prefer erotic stories and romance sites." - http://rescuefreedom.org/parallax/wp-content/uploads/2015/01...


For the record, that really isn't me. I didn't think your comment deserved a response so I ignored it. I don't have a way to prove it to you. Sorry.


I agree, except for the sex analogy. You could've easily gone with something like:

Haskell is to programming what bugs are to food. Both are functional, an acquired taste and look scary from the outside.


> they were there for you when you were programmatically young and they introduced you gently into a world that's otherwise extremely complex.

This may be true for you, but it is not true for everyone. You assume that learning Haskell as a first programming language would be more difficult, but you don't present any evidence to support that claim. People who have done so disagree with you.


>While fiddling around is still somewhat possible in Haskell, the language itself makes it quite difficult. Haskell kind of forces you right at the beginning to pause and think "Well, what is it that I'm actually trying to do here?"...

Wouldn't scripting languages allow one to gradually build that understanding? Suppose you end up with a lot of complex code? Ditch it and build it from scratch. That usually takes around 1/10th the time it took the first time, with much better results.

So I think by the time one can think up and build the perfect abstractions in Haskell, one can write 3 or 4 iterations of the program in a dynamic language. Each time with better abstractions and neater organization....


In my experience, to borrow seanmcdirmid's terminology below, one can use Haskell as a "program to think" language - and in fact Haskell brings some significant tools to the table that I miss when I'm using Python to feel around a problem, to the point that I often choose Haskell for this. In both languages, writing small pieces of code and examining their shape and their output can be helpful in growing my understanding of the solution space, but I find that Haskell tends to let me ask more complete questions with less complete code.


No. The debugging time, refactor/rewrite time of writing in scripting languages is substantially longer, harder work, and distinctly unpleasant compared to just thinking and using good tools to do it right in Haskell.

In Haskell, I'll often have a problem and just stare at my laptop and think for an hour. Then write a dozen lines of simple, straightforward code. The code is easy to test, and the problem is marked as "solved" instead of "seems to work" as happens in scripting languages.

Edit: auto complete fixes


Pausing for an hour to think about and understand your problem is important in any language. Probably more important than testing, honestly.


Indeed. I'd argue that strongly typed languages represent many problems in the type system, often letting you know very quickly that you'll have to spend that hour.


I now do all of my logic and data structure design on paper. It usually ends up looking like a series of incomplete sketches, as I very quickly iterate over wrong ideas that would have taken hours to discover if I'd coded up all the alternatives.

The final few sketches almost always end up simpler than the original idea seemed!

I also try to follow ESR's paraphrasing of Fred Brooks:

"Show me your code and conceal your data structures, and I shall continue to be mystified. Show me your data structures, and I won't usually need your code; it'll be obvious."

I think this concept is actually more important than the choice of language.


The nice thing about Haskell is that the type system lets you work at a higher level of abstraction. One of the problems with dynamically typed languages is that even though they let you create abstractions, they are bad at error detection. For example, if you have a function that expects non-null inputs and you pass null to it, the error is only going to be caught when you try to call a method on the null. On the other hand, in a statically typed language you get an error message pointing directly to the real source of the issue.

Another thing Haskell gets right is the support for parametric polymorphism (generics). You are forbidden from manipulating generic parameters other than by passing them around, so there is less room for error. These "theorems for free" are what make things like monads "tick".
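
A small example of how little room a generic parameter leaves: the functions below know nothing about their type variables, so the types nearly pin down the implementations (the names are just for illustration):

  -- Knowing nothing about a and b, about all you can do is shuffle them.
  swap :: (a, b) -> (b, a)
  swap (x, y) = (y, x)

  -- A total function of type a -> a can only be the identity.
  identity :: a -> a
  identity x = x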

That said, one thing that is in vogue right now is adding optional type systems and runtime contracts to scripting languages. It's still a bit of a research area, but I think it has a very promising future.


Different languages lead to different exploratory behaviors.

Haskell makes it very safe to change your code, but it adds some initial costs. Scripting languages make it very unsafe to change the code, unless you spend a lot of time writing tests, but then they stop being fast to iterate.


I think that's an interesting point, _if_ people actually do that. That said, I'm wary of the claim of 1/10th time and being able to iterate 3 or 4 times per every one iteration with Haskell. Sure, maybe when you're starting out, but once you become proficient I don't think that would be the case anymore. And, there's no guarantee that 3rd or 4th iteration will be as good as the well thought out Haskell code, since the first iterations may be prohibitively complex.


The point is that a scripting language is a "program to think" language, while haskell is often seen as a "think to program" language (at least when described as in the top level post). That you have to do more thinking and planning when using haskell (supposedly) doesn't help when the problem you are working on is not well understood and requires exploration (where you are forced to do exploration in your head...or on a whiteboard, rather than in code).


I strongly disagree. In Haskell the compiler helps me think a LOT more than other languages because it's checking more things for me. I don't have to explore things on the whiteboard, I can explore them in code and get very quick feedback about things I might have missed. I have built things in Haskell that I don't think I would have been able to build in other languages. The compiler is your friend, not your enemy. It's like having another developer there to bounce your ideas off of.

EDIT: Bottom line, I think Haskell also works well for people who "see programming as a cybernetic extension of their mind".


I didn't make a claim, especially that claim. If haskell requires a lot of up front thinking, then it might turn off those who see programming as a cybernetic extension of their mind (using the computer to help you think, vs. thinking to use the computer). I do not know if the premise was true, but was made by the top level post.


On your edit, I also didn't make that claim. All my premises are open.


I think part of it may be that it helps with thinking, but with a different kind of thinking than scripting languages?

There's that Perlis quote: "Show me your data structures, and I won't usually need your code; it'll be obvious". For me, a lot of thinking about programs involves thinking about the types of data involved, and there Haskell gives a language to talk about it. You can start writing down your datatypes, and the function types, directly in your emacs buffer (leaving the function bodies as just "undefined" at first). By contrast, if you are programming in some untyped language like Scheme, you have to do all that work inside comments---e.g. if you write a compiler you maybe start by writing a huge comment saying "this is the grammar I expect input expressions to follow". Having a type language around kind of helps by providing a notation.
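
For instance, a first sketch of a little compiler can be nothing but datatypes and type signatures with undefined bodies, and GHC will happily check that the shapes line up (all the names here are made up):

  -- A design sketch: shapes first, bodies later.
  data Token = TNum Int | TPlus | TLParen | TRParen
  data Expr  = Num Int | Plus Expr Expr

  tokenize :: String -> [Token]
  tokenize = undefined

  parse :: [Token] -> Either String Expr
  parse = undefined

  eval :: Expr -> Int
  eval = undefined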

I guess there is some other kind of exploratory thinking which untyped languages provide a good notation for? But in my life I have mostly worked in typed languages, so I don't have any concrete idea of what it is.


The quote is from Fred Brooks (author of "The Mythical Man-Month"), and it goes like this:

> Show me your flowchart and conceal your tables, and I shall continue to be mystified. Show me your tables, and I won't usually need your flowchart; it'll be obvious.

I don't think it's about exploratory programming. It's more about reading other people's code.


Wow, I got that extremely wrong huh. Thanks for the correction.


> Wouldn't scripting languages allow one to gradually build that understanding?

Lately I've been reading this[1] academic paper about End-User Software Engineering. While not an easy read nor a good introduction if you've never read about End User Development, it delves precisely into methods for building tools that allow just that, while adding opportunistic checks to correct bugs.

[1]http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.360...


Haskell allows you to express algorithms in a more error-proof way.

For really complex things, the implementation in a scripting language can introduce errors that are catchable in a typed language.

So you code your complex thing and it fails to perform to expectations (but it somehow performs, rather than just dumping a stack trace). Where does the source of the fault lie: in the complex idea itself, or somewhere in the nearly whole source code?

My rule of thumb is that I only write things in Tcl/Python/Bash that are no longer than 200-300 lines.


And what makes the model of Haskell superior to models developed in any other field like accounting, physics or chemistry?

I am sorry, Haskell is just a huge roadblock to get things done in the real world.

In the real world professionals need to juggle all sorts of models. Haskell just says "Fcuk you ! it's my way or the gonadway !".

I need to juggle between JSON, matrices, HTML, etc. Each of them has hundreds of exceptions.

You can say my model is imperfect, but guess what buddy, every model is. The only model that will work for all cases is prolly Einstein's equations, but even that has exceptions when dealing with black holes !

I tried writing a music library in Haskell, and Haskell makes it really hard to create rules that are exceptions to the model. Apparently the models developed by 1000s of years of music theory are not good enough for Haskell !

I cannot even imagine what it must be like to code chemical rules using Haskell that have hundreds of exceptions, or biological models ! Oh My !

I am sorry, Haskell just makes computation much more difficult. Apparently mutation is a crime even though god himself thought it was okay as a rule for everything in the universe.

my anecdotal experience.

((

btw I really like the concepts in Haskell. I read two of its famous books - LYAGH and RWH. And I use Haskell concepts almost daily in production. However the implementation of Haskell is not really ready for production or useful enough for the average developer. It's also not easy for the average developer to put food on the table using Haskell

))


I'm not sure what you were trying to do, but there's a whole chord recognition service written in Haskell: http://chordify.net/

Apart from that, I have written many production grade Haskell applications, and I cannot agree with you that Haskell gets in the way. I admit that when learning Haskell I sometimes had that feeling too, but basically that was just me thinking about the problem in too complicated a way or from the wrong perspective. Now that I am past that point, Haskell is super fun to write, very productive, and results in extremely maintainable code - it is just so easy to refactor anything you can imagine - and when it compiles again you are probably good to go!


I'm curious to find out what problems you ran into in your music library, but I suspect you were downvoted because of your tone. That being said, the reason I'm curious is because there is an entire book about writing a music parser, renderer, and player in Haskell: http://haskell.cs.yale.edu/euterpea/haskell-school-of-music/


> Apparently the models developed by 1000s of years of music theory are not good enough for Haskell

Perhaps not, but Haskell is good enough for them:

  Functional Generation of Harmony and Melody http://dreixel.net/research/pdf/fghm.pdf


You might want to know that there is a book related to Haskell and Music: http://haskell.cs.yale.edu/euterpea/haskell-school-of-music/


> And what makes the model of Haskell superior to models developed in any other field like accounting, physics or chemistry?

- Type-safety. Correctness. Speed.

> I am sorry, Haskell is just a huge roadblock to get things done in the real world.

> In the real world professionals need to juggle all sorts of models. Haskell just says "Fcuk you ! it's my way or the gonadway !".

- I don't know what this even means.

> I need to juggle between JSON, matrices, HTML, etc. Each of them has hundreds of exceptions.

- Haskell has great libraries for each of these.

> You can say my model is imperfect, but guess what buddy, every model is. The only model that will work for all cases is prolly Einstein's equations, but even that has exceptions when dealing with black holes !

- models?

> I tried writing a music library in Haskell, and Haskell makes it really hard to create rules that are exceptions to the model. Apparently the models developed by 1000s of years of music theory are not good enough for Haskell !

- It just doesn't let you do it incorrectly.

> I cannot even imagine what it must be like to code chemical rules using Haskell that have hundreds of exceptions, or biological models ! Oh My !

- The more complex, the better Haskell is suited.

> I am sorry, Haskell just makes computation much more difficult. Apparently mutation is a crime even though god himself thought it was okay as a rule for everything in the universe.

- Immutability doesn't make computation harder.

> my anecdotal experience.

> ((

> btw I really like the concepts in Haskell. I read two of its famous books - LYAGH and RWH. And I use Haskell concepts almost daily in production. However the implementation of Haskell is not really ready for production or useful enough for the average developer. It's also not easy for the average developer to put food on the table using Haskell

- You say you like these concepts, but it doesn't sound like you have the slightest idea what those concepts are useful for.


> Haskell just says "Fcuk you ! it's my way or the gonadway !".

I think it's more accurate to say that Haskell makes you annotate specifically which way you're doing it, and only combine ways when it's okay to do so.


I didn't try to say that Haskell is the best language for everything. I also wouldn't want to do everything in Haskell. But as you are pointing out, you learned a lot from Haskell and you are using its lessons in production.

The best a language can do is to fill a niche and to be very good at that particular thing. You should always use the language that is most suited for your problem, whatever that is. But there are many languages that let you get away with being a terrible programmer. Haskell just isn't like that and I want to point out that you can learn a great deal from being forced to think in more abstract ways, just like you did.

In programming, we often encounter the temptation to just mutate everything and to use side effects, since it's quite convenient to do so in the short term. In the long term, these things will come back and bite us. I argue that it is important for a programmer to have experienced what it is like to simply not have that option. After learning Haskell, I tried to avoid side effects in other languages as much as possible and to use them consciously. That was something I didn't even consider before learning Haskell. And, obviously, the fewer side effects you have, the easier it is to maintain or exchange parts of your program.

I currently use Haskell to calculate probably a hundred analytical derivatives for a large sparse array that is used in a simulator written in Fortran. And it's very good at that. For quickly writing some evaluations of output of this simulator I use Python, because Python is better suited.

Pick a language based on the problem. Don't just use one language because you know it. In my experience, Haskell is very well suited for a lot of mathematical stuff.

------

Off topic:

By the way, Einstein's field equations will work for gravity in the classical regime, not in all cases. But still, if you simply want to calculate how far you'll throw a ball, you really should take another model based on Newtonian gravity, or simply assume a constant gravitational force. Planetary movements are also fine with Newtonian gravity (except when you really need them accurately, e.g. for the precession of the perihelion of Mercury). However, GPS calculations are terribly inaccurate without general relativity (time flows differently if you are close to a big gravitational potential well). So, pick your model based on what you want to do, just like you pick your programming language based on what you want to do.


> I also wouldn't want to do everything in Haskell.

Examples, please? And why?


Lack of dependent types? :-P


I heard Haskell is a poor choice for real-time systems and numeric computation. It's kinda ironic that a language as "mathy" as Haskell is such a poor choice for doing actual math. Assuming these things are true.


This definition of "the real math" is quite arguable. Doing a lot of computations efficiently is a realm of modeling and statistics; that does not clarify the very nature of computation, the inner laws of formal systems. Haskell is much more about "what computation is" than about "how to do computations".

Besides that, from a mathematician's point of view, "efficient numerical computation" is necessary and useful practically, but a very ugly thing. Speaking of C-like types `int`/`long`/`uint32_t`: they are only a crude approximation of the natural numbers; they are a ring modulo 2^n, which we pretend to use as natural/integer numbers, silently failing when the range wraps around.

And that is not the end of the story: for integer numbers we can at least specify what mathematical model describes them (a ring modulo some power of two); for floating point numbers it is impossible: the set of possible IEEE 754 values is a very weird and irregular finite set (NaN, +Inf, -Inf, +0, -0, exponential distribution of point density, min/max bounds) with complex modes of failure. Associative law, distributive law, commutative law? The very equality check? Forget about it; floating point numbers have none of that.


Actual math is really a pain in the butt on any computer, in any language. Doing it all in binary efficiently is the problem.


If you have an array that fills a significant fraction of your memory (say, tens of Gigabytes), you don't have another choice but to use mutation (Haskell doesn't support that).

While quite fast, it is not in the league of low level programming languages. If you need ridiculous speeds, you don't have another choice but to use C, C++ or Fortran.

Python has a lot of very useful modules. If I can solve my problem with basically a few import statements and don't care about performance or anything, I find Python to be better suited.

Erlang's light-weight threads are a boon. Having a webserver written in Erlang and using Erlang as a server-side language, you can support a lot of sessions at once.


I write incredibly mutation-heavy code in Haskell on a nearly daily basis. I've even added compiler support for doing hardware prefetch to the most recent version of GHC: https://downloads.haskell.org/~ghc/7.10.1/docs/html/librarie...

There are many lovely mutable data structures in Haskell. http://hackage.haskell.org/package/vector-0.5/docs/Data-Vect... is one for unboxed C-struct style arrays,

and http://hackage.haskell.org/package/hashtables is a super mature, stable mutable hashtable library that is used, among other places, in Agda!

..... please fact check your feelings in the future :)


Sorry. I seriously didn't know. I only ever used the immutable parts of Haskell. Thanks for the correction.


Haskell does support mutable arrays. Your code that does the mutating will have to live in IO (or perhaps ST), but the downsides of that are exaggerated.


Oh. I didn't know that. Thanks for pointing that out.


Haskell strikes many as "bad for mutation" because introductory tutorials don't really cover it. This is because 1) mutation is not terribly idiomatic, 2) relying on mutation is often (not always) a bad design decision, 3) Haskell handles the immutable case really well, and 4) mutation involves some more complexity than it does in other languages. All of these are good reasons to avoid talking about mutation in a beginner Haskell tutorial, but when you need to address a problem where mutation is the best fit, you'll find that it actually works pretty well. Haskell doesn't make mutation difficult, it makes it explicit, which has upsides and downsides.


> If you have an array that fills a significant fraction of your memory (say, tens of Gigabytes), you don't have another choice but to use mutation (Haskell doesn't support that).

Haskell does support mutation, it just requires it to be controlled. Take a look at Data.Array.ST and Data.Array.IO
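
A rough sketch of what that controlled mutation looks like with Data.Array.ST (the function itself is made up for illustration):

  import Control.Monad (forM_)
  import Data.Array.ST (runSTUArray, newArray, writeArray)
  import Data.Array.Unboxed (UArray)

  -- Build a table of squares by in-place mutation inside ST; the result
  -- is frozen into an ordinary immutable unboxed array at the end.
  squares :: Int -> UArray Int Int
  squares n = runSTUArray $ do
    arr <- newArray (0, n) 0
    forM_ [0 .. n] $ \i -> writeArray arr i (i * i)
    return arr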

> Erlang's light-weight threads are a boon. Having a webserver written in Erlang and using Erlang as a server-side language, you can support a lot of sessions at once.

GHC provides a very similar thing in the form of "green threads".


> If you have an array that fills a significant fraction of your memory (say, tens of Gigabytes), you don't have another choice but to use mutation (Haskell doesn't support that).

Haskell supports mutation, although many short tutorials don't address it and even many longer tutorials don't do much with it.


I mean, there is Data.Vector.Mutable, though I haven't needed to use it yet.


I've personally found it very easy to handle "exceptions" in Haskell. Partial answers feel "bad" in Haskell because all the core functionality is clean enough to not need them, but on the other hand that makes for some of the best tools around for dealing with partiality.


> Partial answers feel "bad" in Haskell because all the core functionality is clean enough to not need them

Not to need them, sure, but not not to use them. (I swear that's the right number of 'not's.) I don't understand why something like Neil Mitchell's `Safe` isn't just the way that things are done by default.


Ugh, yeah, total agreement here. Historical accident, I suppose. I try to pretend like `head` and `tail` just don't exist.

You can always recognize a file where I'm doing list ops because I define `uncons :: [a] -> Maybe (a, [a])` at the top of the file, haha.
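
For reference, that one-liner is about this much code (base did eventually gain Data.List.uncons with the same type):

  -- A total replacement for head/tail: Nothing instead of a crash.
  uncons :: [a] -> Maybe (a, [a])
  uncons []       = Nothing
  uncons (x : xs) = Just (x, xs)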


Even better than safe, all this has been consolidated very nicely in the errors package.


I would assume that anything that has an underlying pattern would be describable with Haskell. It is a functional programming language after all. If you are not very mathematically savvy, it might not be for you.


I am learning Haskell. Could you describe an example or two of Haskell and music theory not working together? I am genuinely curious about this.


> I am sorry, Haskell is just a huge roadblock to get things done in the real world.

Languages tend to suffer from an Iron Triangle: quick to write, quick execution, quick to learn-- pick 2. Haskell takes a long time to learn but it produces very high-quality executables and, once you know it, it's very productive.

While "quick execution" may seem separate from the type safety which is also a major selling point of Haskell-- and, arguably, a bigger one-- they're actually tightly coupled. Safe code can be optimized more aggressively, and it's often for the sake of performance that unsafe things are done... so the fact that Haskell can be robust and generate fast executables is a major win.

> Haskell just says "Fcuk you ! it's my way or the gonadway !"

It doesn't, but I am going to start saying this. Thank you for the inspiration.

> Apparently mutation is a crime

Not so. Every program's main function has the type signature IO (), which means that it may perform mutation. You just want to get as many functions as possible not to involve mutation, because it's easier to reason about them. It's a similar principle to dependency injection, but more robust and clear.

> However the implementation of Haskell is not really ready for production or useful enough for the average developer.

I disagree. With Clojure and Scala, I've met people who've used them and moved away. Satisfaction rates seem to be about 60% with Scala (that is, 60% of teams or companies that make a major move to Scala are happy) and 90% with Clojure. I've never heard of anyone who's become unhappy with Haskell or rolled back on it.

One of the dangers of using Scala, for example, is that if your Scala deployment doesn't work out (or is sound but is blamed by the business for something unrelated), you can get stuck doing Java. Haskell, at least, doesn't have that problem.


The author's comments on noise ring true with me: every time I give Haskell a try I end up struggling with frustrating and opaque vocabulary, sometimes completely at odds with the way other languages use them: e.g. C++ also has functors, and they're completely unrelated to Haskell functors.

I really like the author's suggestion of mentally translating Functor to Mappable. Are there any other synonyms for other Haskell terms of art?

What I'd really like, I suppose, is a complete overhaul of Haskell syntax to modernise and clarify everything: make it use actual words to describe things (foldl vs foldl'? BIG NO). Put in syntax redundancy and visual space to avoid the word soup effect: typing is cheap, understanding is expensive. Normalise and simplify terminology. Fix the semantic warts which make hacks like seq necessary --- if I need to worry about strictness and order of evaluation, then the language is doing lazy wrong. etc.

Basically I want language X such that X:Haskell like Java:K&R C.

This will never happen, of course; the people who have the knowledge to do such a thing won't do it because they are fully indoctrinated into the Haskell Way Of Life...


As others pointed out, the way Haskell uses the term "functor" is related to the way mathematicians had been using it for at least a decade before cfront.

I agree that a shared vocabulary is important, but standardizing in a way that makes the mathematical writings on the topic more accessible seems a big win. Moreover, "functor" is a bit more precise than "mappable" - a functor is a mapping that preserves structure. In what sense? The math can guide you. In this case, it means the functor laws.

That's not to say that coming up with other associations to help ease understanding is a problem - I have no problem with saying, "for now, think of Functor as 'mappable'". The equivalent for Monad would probably be "flatMappable", and Monoid would be "appendable".
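
Spelled out on plain lists, those three intuitions look something like this:

  import Data.Monoid ((<>))

  mapped :: [Int]
  mapped = fmap (+ 1) [1, 2, 3]              -- Functor: "mappable"

  flatMapped :: [Int]
  flatMapped = [1, 2, 3] >>= \x -> [x, x]    -- Monad:   "flatMappable"

  appended :: [Int]
  appended = [1, 2] <> [3, 4]                -- Monoid:  "appendable"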


> As others pointed out, the way Haskell uses the term "functor" is related to the way mathematicians had been using it for at least a decade before cfront.

Rather a bit more than that. Eilenberg and Mac Lane's original paper defining the basic notions of category theory was published in 1945! http://www.ams.org/journals/tran/1945-058-00/S0002-9947-1945...


Thanks! I suspected that was the case, but the looser bound was much easier to be confident in with the level of effort I could spare.


For Monad, 'join' or the Kleisli arrow (>=>) might give better intuition than bind/flatMap - maybe something like "contextComposable": the Kleisli arrow in the Identity monad is just function composition, and for everything else, it's function composition within a context, combining contexts according to 'join'.
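
A toy illustration of that "composition within a context", using Maybe and a made-up lookup:

  import Control.Monad ((>=>))

  -- Two partial lookups composed with the Kleisli arrow; the Maybe
  -- contexts are combined along the way (any Nothing wins).
  parentOf :: String -> Maybe String
  parentOf "child"  = Just "parent"
  parentOf "parent" = Just "grandparent"
  parentOf _        = Nothing

  grandparentOf :: String -> Maybe String
  grandparentOf = parentOf >=> parentOf   -- grandparentOf "child" == Just "grandparent"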


I'm... hesitantly okay with "Functor == Mappable", but I really think Monad should be "Embedded DSL" and Monoid "mergeable". Or really Semigroup.

Monad is definitely abnormally difficult to humanize. The trio (T, ∀ a. a -> T a, ∀ a b. (a -> T b) -> (T a -> T b)) is really hard to nail down.


I like "embedded DSL" for Monad, although I think more specialized notions might provide more hooks to hang understanding on in particular cases.

I don't object to "mergeable" for Monoid, but I think I weakly prefer "appendable" since it seems to say a little more about how things merge (and of course the free monoid is exactly that).

Speaking again to the broader context, one thing I really like about Haskell's choice of naming these abstractions after the math is that this type of discussion has no bearing on which types adhere to the abstractions - we're not left arguing over whether Sum and Product are "really" appending, or set intersection is "really" merging. Integers are clearly a monoid under Sum and Product, and set intersection is clearly a semigroup but not a monoid (if our universe is open) because there is no identity.


It's interesting: I know one Cambridge theoretical physicist who is immensely successful in his field. I found it really odd that he worked 5 years as a professional writer between university positions. He thinks of good writing as his "secret weapon" that made him successful in academia. He is one of the most well-spoken people I have met, and he instilled in me a new respect for being a good writer, which I hadn't thought about before (I, like the HN crowd, thought programming was everything).

It's also interesting that pg is such an accomplished writer. I think programmers need to think of code and well-written documents as having the same importance.

Just my two thoughts.


> e.g. C++ also has functors, and they're completely unrelated to Haskell functors.

You can't blame that one on Haskell or the functional community - the term was already established before the C++ community decided to use it in spite of pre-existing definitions. They even ignored Prolog's pre-existing abuse of the term functor :-)

A few similar terminological accidents of history come to mind, where the original definition of some term is now obscure and a different definition popular:

- POSIX capabilities (as implemented in e.g. Linux), which are a security mechanism that has nothing to do with what security researchers have been calling capabilities since the 1970s

- Microsoft operating systems using the term "Format" for creating a file system, despite the fact that it has been impossible to actually format hard disks at the hardware level since the 1990s

- imperative programming languages abusing the term "function" to mean procedures with side effects

- "thunk" meaning a stub that emulates/bridges different calling conventions, instead of a call-by-name (or lazy) closure

- "Tea Party" used to refer to a fine rock band from Canada


Functors and monads are simply not going to change their names, partly because Haskell derives from math and those are what they're called over there.

But foldl' is horrible, I agree.


foldl' is a consistent and meaningful name. fold performs a fold without specifying an order[1], foldl folds from the left, and foldl' is a non-lazy version of foldl.

1: fold :: (Foldable t,Monoid m) => t m -> m


Is there a spec that leaves the order unspecified? Having it not be a right fold would be quite nutty. It is a right fold in implementation.


Ugg, I had to spend a fair amount of time coming up with a good example. Hope this is helpful!

The Monoid operation mappend is guaranteed to be associative, so the grouping is irrelevant. Data structures can fold in whatever way is most efficient for their structure.

It's true that lists and arrays are implemented as right folds; however, the fold implementation for sets is neither left nor right:

From Data.Set:

  fold = go
    where go Tip = mempty
          go (Bin 1 k _ _) = k
          go (Bin _ k l r) = go l `mappend` (k `mappend` go r)

  -- Here, I reorganized the code of `fold` to have the same shape as
  -- `foldl/foldr` so that you can see the difference in structure more
  -- clearly.
  fold2 = fold3 mappend mempty
  fold3 f z = go z
    where
      go z' Tip           = z'
      go z' (Bin _ x l r) = f (go z' l) (f x (go z' r))

  foldl f z = go z
    where
      go z' Tip           = z'
      go z' (Bin _ x l r) = go (f (go z' l) x) r

  foldr f z = go z
    where
      go z' Tip           = z'
      go z' (Bin _ x l r) = go (f x (go z' r)) l


I went through some examples to make 100% sure that fold has different behavior than foldl/foldr:

  fold      [[_ 4 _] 3 _] → f 4 (f 3 #)
  fold2 f # [[_ 4 _] 3 _] → f (f # (f 4 #)) (f 3 #)
  foldl f # [[_ 4 _] 3 _] → f (f # 4) 3
  foldr f # [[_ 4 _] 3 _] → f 3 (f 4 #)
Reductions:

  fold [[_ 4 _] 3 _]
  f    (go [_ 4 _])  (f 3 (go []))
  f    4             (f 3 (go []))
  f    4             (f 3 #)

  fold2 [[_ 4 _] 3 _]
  fold3 f                       #             [[_ 4 _] 3 _]
  f     (go [_ 4 _])            (f 3 (go _))
  f     (f (go _) (f 4 (go _))) (f 3 (go _))
  f     (f #      (f 4 #     )) (f 3 #     )
  f     (f #      (f 4 #     )) (f 3 #     )
  f (f # (f 4 #)) (f 3 #)

  foldl f                     #             [[_ 4 _] 3 _]
  go    #                     [[_ 4 _] 3 _]
  go    (f (go # [_ 4 _]) 3)  _
  f     (go # [_ 4 _])        3
  f     (go (f (go # _) 4) _) 3
  f     (go (f # 4) _)        3
  f     (f # 4)               3

  foldr f                    #                      [[_ 4 _] 3 _]
  go    #                    [[_ 4 _] 3 _]
  go    (f 3 (go # [_ 4 _])) _
  f     3                    (go # [_ 4 _])
  f     3                    (go (f 4 (go # _)) _)
  f     3                    (f 4 (go # _))
  f     3                    (f 4 #)


It's useful for more general folds over more general types. If you have a tree structure, then you might not want to fold from the left or the right, but instead in multiple places in parallel and then combine them at the end:

    * + (* + (* + (* + (* + (* + *)))))   versus
    (((((* + *) + *) + *) + *) + *) + *   versus
    ((* + *) + *) + (* + (* + *))


Oh God, right. The changes over the years have left me bamboozled. Is (.) fmap yet?


I mentally pronounce foldl' as "fold ell prime," and it never gives me any difficulty.


> sometimes completely at odds with the way other languages use them: e.g. C++ also has functors, and they're completely unrelated to Haskell functors.

Really, this should be considered as C++ perverting the existing terminology from category-theory for Functors.

> I really like the author's suggestion of mentally translating Functor to Mappable. Are there any other synonyms for other Haskell terms of art?

I think that there is a great deal to be said for leveraging intuition. But whose intuition? Who was Haskell designed by/for when Functor was first defined in the standard library?

> What I'd really like, I suppose, is a complete overhaul of Haskell syntax to modernise and clarify everything: make it use actual words to describe things (foldl vs foldl'? BIG NO).

The intention is admirable, but what does it cost to do it, and what is gained by doing it? It seems that the implication is that certain functions become immediately intuitive to people (what kind of people?) in certain contexts, and that, possibly by analogy, these contexts can be extended (how far?). I'm not saying that this is a bad goal, but rather than try to compromise in this manner, the Haskell community has often adopted terminology that is precise instead of intuitive.

Functors could have been Mappables, but how far would that analogy hold, and who is already familiar with maps in this context? Better to use an accurate term, and when someone unfamiliar with it learns it in this context, they will be able to apply it to many other contexts.

> Put in syntax redundancy and visual space to avoid the word soup effect: typing is cheap, understanding is expensive. Normalise and simplify terminology.

On the surface, I've always supported this - if only for the reason that I would always like to be able to pronounce a combinator when I'm talking to someone. The downside would be the combinatorial explosion of different subsets of names that people would learn for even one library. I'm not sure whether it would be a net plus or minus.

> Fix the semantic warts which make hacks like `seq` necessary --- if I need to worry about strictness and order of evaluation, then the language is doing lazy wrong. etc.

I think you will find that this is an unsolved problem. Better to allow people to be explicit when necessary instead of making the language totally unusable.

> Basically I want language X such that X:Haskell like Java:K&R C.

I think I understand the sentiment, but the analogy feels too shallow. For instance, I would make the following predictions from your analogy - Do they hold?

* Runs on a virtual machine instead of being compiled

* Extraordinary measures taken to make the language and binary formats backwards compatible

* More type-safe

* Fewer primitives

* More automated memory management

> This will never happen, of course; the people who have the knowledge to do such a thing won't do it because they are fully indoctrinated into the Haskell Way Of Life...

Indoctrinated is obviously a loaded term. I think you will find that nearly all Haskell programmers in any position to influence the development of the language are very open-minded when it comes to new ideas. Part of the reason why Haskell looks the way it does today is because it was intended to be a platform for experimentation.


> > sometimes completely at odds with the way other languages use them: e.g. C++ also has functors, and they're completely unrelated to Haskell functors.

> Really, this should be considered as C++ perverting the existing terminology from category-theory for Functors.

Only if you assume that category theory is the correct source of meaning for such terminology. But Wikipedia, for example, lists functor as being ambiguous - there's the category theory version, and there's the programming version, which is a function object. It lists a bunch of languages (C++, C#, D, Eiffel, Java, JavaScript, Scheme, Lisp, Objective-C, Perl, PHP, PowerShell, Python, and Ruby) that support some variant on this theme. It seems rather arrogant to say that the category theory definition is the one we should mean when discussing a programming language, rather than the one used by a large number of programming languages.


The definition of functor in category theory is from 1945 while most of today's mainstream programming languages are much younger than that. The people who wrote those definitions for C++, C#... simply ignored the existing terminology.


Fair enough. And yet, at the time (and even today, outside of the FP crowd), category theory is really far outside the scope of what most people consider programming, so it's hard to blame them for not looking there for terms.

For that matter, category theory borrowed the word from linguistics, and most definitely did not keep the same meaning.


I certainly wouldn't blame anyone for not being familiar with the terms. What I would blame people for is attacking Haskell's choice of terminology due to it not 'being like c++'. A generous interpretation of intent goes a long way, and if that is out of reach, at least do enough research to make sure you're not pointing out weaknesses based on a false premiss. Instead I commonly see people starting from "I don't find Haskell intuitive" and extrapolating to "Haskell's contributors are indoctrinated and people who defend its use of established terminology are arrogant". Maybe I should just ignore such points of view but they seem to be infectious.


Well, let me see if I can meet you partway. Haskell is grounded in abstract algebra and category theory; using words like functor to mean what they mean in category theory is therefore a reasonable choice for Haskell.

But Haskell is a programming language. Using words like functor to mean something different from what other programming languages mean by the term creates a barrier to understanding for (non-FP) programmers. (The other definition is rather well established in non-FP circles, which is by far the majority of programming.) And when Haskell proponents state that their definition is right because it's the one from category theory, non-FP programmers find that rather arrogant.


According to http://en.wikipedia.org/wiki/Function_object, many languages implement the "Function Object" pattern:

  * C
  * C++
  * C#
  * D
  * Eiffel
  * Java
  * JavaScript
  * Lisp
  * Scheme
  * Objective-C
  * Perl
  * PHP
  * PowerShell
  * Python
  * Ruby
Although many of the languages listed here implement the pattern, it's not clear that many of them use the "Functor" terminology. I've certainly never heard of it referred to as such in Javascript or Ruby, and most-often referred to as "Callable" in Java.

That being said, I think anyone would agree that terminology choices in Haskell can only fairly be accused of ignoring PL parlance that was in existence before the terms were adopted in Haskell...

With that being said, the "Functor" terminology timeline:

  ~ 1942 - Category Theory - http://en.wikipedia.org/wiki/Category_theory
  < 1991 - Haskell         - Notions of computation and monads (Moggi)
  > 1994 - C++ STL         - http://en.wikipedia.org/wiki/Standard_Template_Library
  > 1995 - Gang of Four    - http://en.wikipedia.org/wiki/Design_Patterns
  > 2000 - C#              - http://en.wikipedia.org/wiki/C_Sharp_(programming_language)
  > 2004 - Java Generics   - http://en.wikipedia.org/wiki/Java_version_history

(correct me if I'm wrong here)

Now clearly we can't accuse Moggi of arrogantly ignoring existing PL terminology, because it didn't exist at the time. So, should we then say that Haskell users should have abandoned the term once it started being used differently?... This seems unfair too, as it was already in use in the Haskell ecosystem by then. I really can't accept that Haskell users are arrogant simply for using a term they adopted very early on consistently and in line with its original definition. Maybe they are arrogant, but certainly not for that reason.

So maybe they are arrogant because they don't play well with others? How do they react to other people using the term in a different fashion? I have never seen a Haskell user complain about someone calling a C++ function-object a functor. Maybe it has happened, but I just don't see it coming up very often.

> And when Haskell proponents state that their definition is right because it's the one from category theory, non-FP programmers find that rather arrogant.

I've never seen a Haskell user bust up a conversation and chastise a bunch of C++ users for talking amongst themselves about function-object "Functors". That would be arrogant, but does that ever happen? Why would they do that? The only time I can see Haskell users forcing definitions down people's throat is in situations like this, where they are being berated for using their own terminology and decide to set the record straight. That being said, who's forcing? The replies could really only be accused of being informative.

What do you suggest Haskell users should do? Stop calling functors functors? Sheepishly demur and say "okay you're right, we're wrong" when someone says that functors are how C++ does them?

Really I couldn't care less about the terminology point as I don't believe it has ever caused any significant issues in terms of ambiguity. And I'd be surprised if anyone earnestly attempting to learn Haskell was slowed down because of these terms (slowed down more than if Haskell invented totally new terms). The only reason why I'm getting worked up is because of the "arrogant" label.

Now how did this thread of conversation start?

  > I really like the author's suggestion of
  > mentally translating Functor to Mappable.
  > Are there any other synonyms for other Haskell terms of art?
  > ...
  > What I'd really like, I suppose, is a complete overhaul of Haskell syntax

  >> As others pointed out, the way Haskell uses
  >> the term "functor" is related to the way mathematicians
  >> had been using it for at least a decade before cfront.
Is that arrogant? Surely it's more arrogant to come in as a beginner in a language and suggest that it change its terminology without attempting to understand why it uses it? Still, none of the Haskell users here accused david-given of being arrogant; they simply informed him of the provenance of the term. Because we love to help.

I guess I just wished that people would make sure that they are at least justified when using inflammatory language.

Sorry about the rant.


I was at that same Wikipedia article, but I got there from the "Functor" disambiguation page. I assumed that "function object" was a synonym for "functor" within the context of those listed languages. Perhaps that assumption was wrong on my part...

Your point about chronology is noted. I have no rebuttal.

> What do you suggest Haskell users should do? Stop calling functors functors? Sheepishly demur and say "okay you're right, we're wrong" when someone says that functors are how C++ does them?

Stop saying "we're right, you're wrong" when someone says functors are how C++ does them. Accept that C++, C#, Java, and the Gang of Four can use the term to mean what they mean without them being wrong. Ideally, recognize that, within the wider world of programming, the FP use of the term is the minority, and so some effort at translation to the majority terms may be appropriate.

That said, I'm well aware that I'm talking to the wrong person. Comments of the type that I'm complaining about occur on HN, but I don't think they come from you.

The only quarrel I could pick with what you said was your original comment, when you faulted C++ for not adopting the term from category theory. I had my timeline wrong in my first reply to you, but I still think that, since the roots of C++ are very far from category theory, expecting it to go there to find its terminology is a bit unfair.


you might like elm [http://elm-lang.org/]; it's inspired heavily by haskell, but the author thinks long and hard about finding the right names for things rather than just using the defaults from haskell or ml.


actually, Scala is in some ways similar in that it has higher-kinded types, allowing you to abstract over functors/monads/traversables

for-expressions in Scala are monadic comprehensions, and implicit parameters are analogous to typeclass constraints.


Great article. I do think a better "Getting started with Haskell" guide is Chris Allen's:

https://github.com/bitemyapp/learnhaskell

OP's article is still a great way of wetting appetite, and sharing insights; but moving on from there is better facilitated by Chris Allen's recommendations.

There is also the IDE issue; FPComplete has a web-based IDE that is good for beginners, and it is possible to set up Emacs to be a very helpful IDE (though this is by no means simple). With Haskell an IDE is really helpful: you see the errors as you type and can resolve them before running into a wall of compile errors.

Anyway: go Haskell. I'm looking forward to a less buggy future :)


I think you meant "whetting". :)


I don't know about that. Haskell can be rather sink-or-swim...

;-)


Thanks.. Lol! Maybe I've never read that word before, only heard it spoken.


I've been programming Haskell for quite a while now and find that many of these types of articles don't capture what I really value in the language. The author really captured what I love about Haskell excellently! It's super terse, (really) readable, and the barriers to entry people worry about are more in their minds than in reality. It's a great language and a great article!


I really like Haskell, but one of the main problems I've had (that I don't see many people cite) is that the libraries just aren't made for use under serious load/concurrency. Many of the people who have written these libraries, and who use them, are not using them in high-performance, memory-sensitive areas (production use at companies).

There are of course Haskell libs that are used in these environments, and the companies usually end up fixing them such that they're quite good. Most libs used by pandoc are likely to be great, and there are a few dozen others of the same caliber (it's useful to search around and see what libs are used by the few other companies using Haskell, since those have likely been vetted as well).

The other big issue with actually using Haskell is that all the knowledge your ops team has of running a production system is essentially null and void. All your existing knowledge of how to fix performance issues, null and void. Learning Haskell and becoming productive in it almost starts to look like the easy part compared to effectively running a Haskell service in production (dealing with space leaks, memory fragmentation issues, and GHC tuning for stack sizes, allocations, etc).


I've actually found the exact opposite. The library ecosystem is rich and mature. Haskell is, by default, "concurrency safe" because of referential transparency. You can safely "async" and compose almost any library in the Haskell ecosystem without worrying about shared memory etc underneath.

Also, a lot of the really common libraries like text, attoparsec (parsers), aeson, networking, etc are highly tuned for low latency and performance. Many use compiler rewrite rules and techniques called stream-fusion to compact a lot of the machine code away. Also aggressive inlining etc can be done.
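To give a flavour, here is a much simplified sketch of the kind of rewrite rule involved (the real rules in the base and vector libraries are more elaborate):

    {-# RULES
      "map/map" forall f g xs. map f (map g xs) = map (f . g) xs
      #-}

GHC applies rules like this during optimization, so the intermediate list between the two maps is never built.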

I'm sure there are some memory-heavy or poorly optimized libraries out there but that's certainly not the norm. I've had no problems with the libraries off-the-rack.


I actually thought that too, but I guess that's not the case. I helped write some HTTP2 frame-parsers for Haskell using attoparsec, but apparently it wasn't fast enough as the lib author later rewrote all the attoparsec code to use pointers to the underlying byte buffers.

https://github.com/kazu-yamamoto/http2/commit/0a3b03a22df1ca...

The stream fusion stuff is sweet, but not exactly unique to Haskell, since any language with good iterator/generator abstractions has similar constant-space memory characteristics.


I believe you're misunderstanding what stream fusion is. A compiler does not really need "good iterator/generator abstractions" so much as a guarantee of side-effect-free transformations in order to be able to de-forest the intermediate data structures. http://citeseer.ist.psu.edu/viewdoc/summary?doi=10.1.1.104.7...


I meant more that languages with iterator/generator abstractions usually have similar constant space usage. That was a drastic oversimplification of stream fusion, though, and fails to mention the other practical outcomes you describe, along with a variety of optimizations.

I found this posting a little more approachable to seeing the various optimizations possible with stream fusion: https://donsbot.wordpress.com/2008/06/04/haskell-as-fast-as-...


Haskell has a problem in that people think of parts of the basic libraries as deprecated, but won't actually deprecate them due to backward compatibility. Thus, people starting out will pick up plenty of slow and unsafe constructs, while people used to the language only reach for the fast and safe ones.

There should be warnings all over the Prelude and basic libraries documentation.


Yup, agreed. The #haskell channel on FreeNode has been great about providing feedback on what libraries one should use for performance.


Could you name some of these libraries? IME, most of the libraries that are needed for common things are very mature.


I tried to write a Haskell websocket server; the library is quite nice, but it leaked memory (space leak? fragmentation? some of both?): https://github.com/jaspervdj/websockets/issues/72

The author helped me narrow it down to some issues with how GHC by default allocates a stack space that is rarely enough, and once it starts growing the stack the RAM per connection gets pretty ridiculous. Using a higher default stack size helped remedy this some, but the per-connection RAM cost was still way higher than the Golang/Python servers I was comparing against.

So... separate project: I write a load-tester in Haskell for a websocket server. I need to issue some HTTP requests, and I see Bryan O'Sullivan made a nice library, wreq. I use it as described and quickly discover it uses ridiculous amounts of memory, because the docs didn't mention that you should always re-use the Session (the underlying http-client emphasized the importance of re-using the Manager): https://github.com/bos/wreq/issues/17

(I am sorry that this issue probably came off as a bit whiny there; I was very frustrated that such a gap existed in the docs)

So, my program is working pretty nicely, until I discover that it's not actually sending multiple HTTP requests at once (even though the underlying http-client lib has a thread-safe TCP connection pool). After browsing some code, I see the problem: https://github.com/bos/wreq/issues/57

The solution that was so far implemented seems equally weird to me.... letting different requests stomp over the Session's cookie jar... I forked it so that I could have multiple wreq Sessions use the same Manager, and now it finally works as it should.

I won't even go into how some of these libs have occasionally wanted conflicting dependencies, which leads to its own 'cabal hell' (googling for that is entertaining unless it's happening to you).

I've only been writing Haskell for a bit over a year now, but every time I write code with it, despite my love of the language, the libraries and runtime end up frustrating me.


Great comments, and even better bug reports. Just want to mention that the wreq tutorial[0] does actually say to always re-use a Session. Perhaps it should be stated more prominently, repeated, or even both.

    For non-trivial applications, we’ll always want to use a Session to efficiently and correctly handle multiple requests.

    The Session API provides two important features:

    When we issue multiple HTTP requests to the same server, a Session will reuse TCP and TLS connections for us. (The simpler API we’ve discussed so far does not do this.) This greatly improves efficiency.
0: http://www.serpentine.com/wreq/tutorial.html#session
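From memory, the Session-based usage looks roughly like this (exact names may differ slightly between wreq versions):

    import qualified Network.Wreq.Session as S

    main :: IO ()
    main = S.withSession $ \sess -> do
        _ <- S.get sess "http://httpbin.org/get"   -- first request opens the connection
        _ <- S.get sess "http://httpbin.org/get"   -- second request reuses it
        putStrLn "done"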


Yep, that was added after my bug report about it. Also fairly recently, he added the ability to do a request with no CookieJar at all.


Thank you! This is a really good comment.

Also, your bug reports are really solid.


> So, I started calling it Mappable. Mappable was easy for me to remember and was descriptive of what it did. A list is a Functor. A list is Mappable.

I wish there was a language or library that was willing to take the Haskell functionality and just give it all names like this.


This is actually possible with the ConstraintKinds extension in GHC.

    {-# LANGUAGE ConstraintKinds #-}

    type Mappable = Functor
    type NotScaryFluffyThing = Monad
This makes Mappable a synonym for Functor and likewise for Monad. (This is not necessarily a good idea; it goes against the principle of least surprise, but it'll work).


My code definitely needs more NotScaryFluffyThings in it.


Isn't aliasing types a core language feature?


Functor and Monad are technically not types but type classes (i.e. interface vs an implementing class), hence why simple/naive type aliasing is not enough.
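For reference, Functor in the standard library is declared (roughly) as a class, not a type:

    class Functor f where
        fmap :: (a -> b) -> f a -> f b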


That would be wrong. Functors also describe things that aren't necessarily "mappable". The actual definition of functor is more general than that.


A fork of Haskell with descriptive names and operators could be really popular, I think.


Actually wouldn't be a fork. It's just an alternative Prelude library. People do this occasionally. The result tends to be idiosyncratic.


In Scala there aren't any names for functors that I'm aware of. I just thought one day that "wouldn't it be practical if I could map an option instead of match-casing the meaning out?" and lo and behold it worked. Maybe we just don't need that much terminology.


That's because the Scala stdlib doesn't abstract over things that can be mapped over. If you want to do that, then you need a typeclass. Scalaz has a typeclass for mappable things and it's called Functor.


You need a term for "thing that I can map" if you want to be able to write a function that takes a generic "thing that I can map". Which I do.
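e.g. something like this (a made-up helper), which works on lists, Maybe, Either e, trees, or anything else with a Functor instance:

    incrementAll :: Functor f => f Int -> f Int
    incrementAll = fmap (+ 1)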


    > I wish there was a language or library that was willing
    > to take the Haskell functionality and just give it all
    > names like this.
I don't think this helps understanding that - for example - Either is also a functor.


`Either a` is a functor, not Either. I agree with the rest of your point, though. Maybe might be a better example - similar in spirit to Either, but without requiring an application to get a functor.


That Either is Mappable is, IMO, both clearer and more informative than that it's a Functor.


Maybe someone can correct me here, but that kind of approach seems ill-founded to me when after a couple examples, people are already talking about different things using the same terminology.

The article says "A list is a Functor". Now you're saying "Either is a Functor". But those two things don't have the same nature.

Maybe what the author meant was "The [] list constructor is a Functor"?

I'm not sure what is gained by garbling abstractions and reducing them to a subset of their potential interpretations.


I think the problem here is terminology. A functor is the abstract concept and it implies having the map operation with the right signature and obeying the right properties.

The best way to say it is "The list type 'forms' a functor" or "The Either type 'forms' a functor". The fact that they form a functor implies that their map operation has a fixed set of properties, and these properties are independent of what exactly the data structure does and how it works.
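Concretely, the properties are just the two functor laws every instance is expected to satisfy:

    fmap id = id
    fmap (g . f) = fmap g . fmap f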


"Either" actually isn't a Functor, at least in Haskell

In Haskell, a Functor actually consists of two parts: the type constructor itself (f), which 'transforms' a type a into the type "f a"; i.e. Maybe "applied" to Int gives "Maybe Int", a new simple type (let's handwave kinds away for now). In addition to that, the fmap function is required for Maybe to be a Functor. A Functor is defined by this ability to "add structure" to existing types together with the mapping operation.

Seen this way, Either is clearly not a Functor: "Either Int" is not a simple type. However, "Either Int" is a functor: Either Int String forms a simple type, and you can implement fmap. In fact, that works for any type, so "Either a" is the functor as usually defined in Haskell.
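Roughly, the standard instance fixes the Left side and maps over the Right:

    instance Functor (Either a) where
        fmap _ (Left x)  = Left x
        fmap f (Right y) = Right (f y)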


There's no subset, no garbling; Functor literally has one member, map (fmap in Haskell but that's a historical artefact).


More specifically, a pedagogical artifact. Originally, fmap was spelled map. The list-specialized map was added to make teaching easier.


Scala programmer here.

  class Container(val property: Int)

  val list: List[Container] = List(new Container(1), new Container(2))
  val mappedList: List[Int] = list.map(x => x.property)

  val option: Option[Container] = Some(new Container(3))
  val mappedOption: Option[Int] = option.map(x => x.property)
It works in exactly the same way...


Learning enough Haskell to feel "productive" is an incredibly good way to deepen your understanding of programming, even if you've been programming for years.

Two things, in particular, stand out for me when thinking about Haskell this way (as a "tool for thinking" language).

First, unless you're a mathematician, you probably haven't thought very deeply about algebraic data types, and how useful and expressive it is to build up a program representation from a collection of parameterized types. The article touches on this a little bit in noting that Haskell teaches you to think about data types first.

But it's more than just "data first," for me, at least. Grokking Haskell's type system changed how I think about object-oriented programming. Classes in, say, Java or C++ or Python are a sort of weak-sauce version of parameterized abstract types. It's kind of mind-blowing to make that connection and to see how much more to it there is.

Second, monads are a really, really powerful way of thinking about the general idea of control flow. Again, the most useful analogy might be to object-oriented programming. When you first learn to think with objects, you gain a flexible and useful way of thinking about encapsulation. When you learn to think with monads, you gain a flexible and useful way of thinking about execution sequencing: threads, coroutines, try/catch, generators, continuations -- the whole concurrency bestiary.

I think monads are hard for most of us to wrap our heads around because the languages we are accustomed to are so static in terms of their control flow models, and so similar. We're used to thinking about control flow in a very particular way, so popping up a meta-level feels crazy and confusing. But it's worth it.

For example, if you do much JavaScript programming, and are ever frustrated translating between callbacks and promises, having a little bit of Haskell in your toolkit gives you some mental leverage for thinking about how those two abstractions relate to each other.
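As a tiny illustration (names made up), the Maybe monad turns "stop at the first failure" into ordinary-looking sequencing:

    lookupAge :: String -> Maybe Int
    lookupAge name = do
        userId <- lookup name users   -- a Nothing here aborts the whole computation
        lookup userId ages
      where
        users = [("alice", "u1")]
        ages  = [("u1", 42)]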


Nice article. I find FP incredibly elegant and I'd like to learn Haskell, but every time I search, there are never any jobs in it, so it seems like Scala is the better choice for where I live...

Speaking of which, I found "Functional Programming in Scala" excellent for teaching someone with an imperative background how to "think functionally". Monads are explained in an easy to understand way. I can imagine that without reading that book I'd have been looking at a couple of years of coding before I started to see the abstractions, etc. By contrast "Learn You a Haskell" lost me part way through both times I tried to read it...


We used functionaljobs.com for hiring our latest Haskell developer (an ex-Scala guy) at Front Row. Check it out; it should at the very least have a few interesting openings on it.

Also, companies actively using and recruiting for Haskell are now starting to join the Commercial Haskell SIG, so if you want to poke around, you can find them here: https://github.com/commercialhaskell/commercialhaskell#readm...


If you wait for the jobs to come, you'll be late to the party. Very few people will want to hire someone who doesn't know Haskell for a Haskell development position. Learn it now so you'll be up to speed when more jobs start appearing. Or better yet, learn it and then create the Haskell jobs yourself.


Your last sentence is basically what we did. It's hard, but it's doable.


I don't think I have ever used a Haskell program written by someone else that wasn't GHC. Is that usual? Are there now a bunch of .debs for useful things, other than writing Haskell, that are actually written in Haskell? I'm not trolling; it's just that a good test of what something is useful for, once it's been around a while, is to ask "Well, what has it actually been used for?"


The Haskell wiki has a page called "Haskell in industry" [0] which lists companies and projects using Haskell in the real world.

Some notable ones include:

* Facebook Haxl, an abstraction around remote data access [1]

* Microsoft Bond, a cross-platform framework for working with schematized data [2]

* Google Ganeti, a cluster virtual server management tool [3]

* Intel Haskell research compiler, a custom Haskell compiler used internally at Intel Labs [4]

---

[0]: https://wiki.haskell.org/Haskell_in_industry

[1]: https://code.facebook.com/projects/854888367872565/haxl/

[2]: https://github.com/Microsoft/bond

[3]: https://code.google.com/p/ganeti/

[4]: http://www.leafpetersen.com/leaf/publications/hs2013/hrc-pap...


Anytime a technology needs to publish a list of "who's actually using this in the real world" the answer is not all that many (relative to other peer technologies). Most projects in those lists fall into the following categories:

1. It is just a small team or even one person using it and they're doing it because they really want to use that technology badly.

2. The project is some side research thing, or so trivially small that it could have been done using any technology.

3. It is actually just a tool or sub-system of the main system that was low risk enough.

4. The project is no longer operational, if it ever made it to that stage.


While your point isn't invalid, it is important to keep in mind that popularity is not a valid proxy for quality.

Also, he is focusing on large companies that have huge reasons they can't use Haskell, mostly related to internal resources. If you have several hundred Java engineers (for example), you literally cannot just switch to Haskell; it wouldn't work.


Completely agreed about quality. Popularity is highly correlated to actual utility though.

Lisp falls into the same category. High quality and very interesting but it will never, ever gain widespread use. Don't believe me? A half century of proof exists. Haskell is already at a quarter century.

Both are very cool and everyone should learn them to some degree because they will make you a better programmer but neither will ever be used widely. They just aren't appropriate for most general purpose programming tasks.


I would be wary of painting Haskell and Lisp with the same brush. Yes, at first glance they both appear to be "difficult" languages that are over the head of the average programmer. However, they take very different approaches.

Lisp gives the programmer maximum raw expressive power. This appeals to lone wolves and autodidacts, but it completely punts on the issues of standards, teamwork and maintainability.

Haskell, on the other hand, promises a direct solution to a huge swath of problems that are experienced across the board in software development today. The pitch is essentially an extension of the one Sun used to sell Java in the 90s: it makes your code safer and more maintainable. Except Java only really did that for memory management in a C-dominated world; its type system gives you barely anything beyond that, so you still have just as many NullPointerExceptions as you suffer failures from the lack of types in languages like Ruby. Haskell's type system gives you infinitely more meaningful safety, together with state-of-the-art functional abstractions to minimize the pain of acquiring it.

The only catch is the learning curve is steep, but as more and more programmers scale that wall, the benefits to performance and maintainability will become apparent to the pointy hairs. Lisp never really had an equivalent value proposition, except in a few narrow fields where its expressiveness and plasticity were key.


But beware lag time.

That is: Some languages aren't suitable for general use. Some aren't... and then they are. But popularity probably correlates with how suitable the language was at least two years ago, and maybe more like 10. (Call it 5 as a compromise.)

So popularity doesn't tell you that the language is unsuitable now. But I agree, there is a correlation. Programmers for the most part aren't stupid sheep, afraid to use something new.


Pandoc - lets you convert documents from/to many different formats - http://pandoc.org


There's git-annex, which is a relatively complex piece of software, runs on multiple platforms (Linux, OS X, Windows and Android) and is definitely useful.


This is a standard that Haskell is held to a lot, but I'm not sure how robust a test it is. I never use programs written in Java, for example, and I use a program written in Scheme every week. That doesn't say much about their relative popularity, let alone their quality as programming languages.

Presumably I use server-side applications written in Java, but I've no way of telling. If server-side counts then most people with computers indirectly use Haskell via Facebook's Haxl project.


There is also postgrest, providing a REST API for postgresql databases.

https://github.com/begriffs/postgrest


I daily use Xmonad, hledger and sometimes pandoc.


pandoc is probably the one I see discussed the most without people knowing it's written in haskell.


ShellCheck [0] is actually a pretty useful tool.

[0] http://www.shellcheck.net/


Shellcheck is excellent, especially since syntastic will use it without any configuration.


I've used Xmonad as my window manager for years. Recently I used pandoc to convert some Markdown files to LaTeX.


darcs version control (http://darcs.net/) (what most people should use instead of git).


From what we have learnt from the history of mathematics, you might have to give it a few hundred years before people realize its usefulness.

As a practical note, the fact that educated people use it is an indicator that it is useful.


> As a practical note, the fact that educated people use it is an indicator that it is useful.

Possibly. It could also be that they use it because it's interesting and informative rather than useful per se.

It could also be that it's useful in particular contexts in the same way that Feynman diagrams are useful.


The bank Standard Chartered[1] uses it, and instead of GHC they use their own compiler. There's some more info in this job posting[2].

[1] http://en.wikipedia.org/wiki/Standard_Chartered [2] https://donsbot.wordpress.com/2014/08/17/haskell-development...


I've mentioned this before, but there's a lot of music stuff written in Haskell - tidal[0] (DSL for live-coding electronic music, not the 'save Jay Z from penury for $20/month' initiative that has just clobbered its google results) and Euterpea[1] spring to mind.

[0] http://yaxu.org/tidal/ [1] http://haskell.cs.yale.edu/euterpea/


The Elm compiler is written in Haskell, and as others have mentioned, I use xmonad and pandoc.


A few of those will likely be server-side and not on your personal machine, but they're out there.


The prototype for seL4 is written in Haskell.


Whenever I try to be productive in Haskell I end up taking the research phase too far and end up over-my-head in category theory that I don't understand. I'm never productive in Haskell.


Haskell allows very "deep" abstraction - but what I find is that you should not overuse it. Abstract where practically useful, otherwise don't go for it. :-)


You might not be productive in the traditional sense of "delivering business value". However over the years of letting Haskell change my brain a little piece at a time, the results have accumulated. I wouldn't swap that deeper understanding for all the productivity in the world. :-)


Haskell is a great language for delivering business value. The typed FP paradigm encourages reusable code much more than any other paradigm I have tried.


Yes, absolutely agree with you. I wish I was writing Haskell day to day.

My point was that it takes a long, non-trivial amount of time to learn Haskell, during which you might feel "unproductive" by that measure. I feel that it's during that period that the magic happens :-)


Start by coding. Be prepared to rewrite your first big project once or twice (good thing is that you'll realize it needs a rewrite sooner, rather than later - anyway, don't start with something too big), but start coding. You won't learn it by reading.


excellent article, thanks.

it's comforting -for me- to see that almost everybody is going through the same phases while learning haskell. i believe that should say something to haskell community.

i've recently started learning haskell. it's been 25 days. (so says cabal) i was reading a book and struggling to build a web app. (why web app?) i was so close to quitting. later i decided this is not the way to learn haskell. one simply does not read the book and try stuff. that was not enough. at least for me. so i changed my method.

my new method of learning haskell is:

- read the book.

- find one or more mentors (i have two) that are really good at haskell and can answer all kinds of impatient questions you have.

- watch people doing and explaining haskell stuff.

- join #haskell-beginners on freenode and ask your questions.

- create something small first that you can turn into something big later.

online haskell resources are surprisingly deficient, however the #haskell-beginners community is awesome when it comes to helping n00bs like me, and "learn you a haskell" is an excellent book.

one more resource that i use as reference material is the "haskell from scratch" [0] screencast series by chris forno (@jekor).

before you begin, make sure you checkout chris allen's (@bitemyapp) "learn haskell" [1] guide.

we'll get there people, we'll get there. :)

[0] https://www.youtube.com/playlist?list=PLxj9UAX4Em-Ij4TKwKvo-...

[1] https://github.com/bitemyapp/learnhaskell


> After writing a parser with them, I began to understand other code that used them. I then started to understand the abstract nature of them…but that abstractness was a lesson for another day, not for starting out.

This definitely helped me too. I started out looking at functions and monads as two 'types' of function that could only be mixed in certain ways, and didn't bother with the gory details at first. IME it's only when you experience monads and their effects that the gory details make perfect sense.
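A sketch of what that mixing looks like in practice (made-up example):

    import Data.Char (toUpper)

    shout :: String -> String          -- ordinary pure function
    shout = map toUpper

    main :: IO ()
    main = do
        line <- getLine                -- monadic action in IO
        putStrLn (shout line)          -- pure result fed back into IO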


I've got a side project webapp I'd like to use to learn some Haskell. What is the most mainstream Rails-like web application framework out there? By Rails-like I mostly mean convention-over-configuration, with a strong ecosystem of plugins so I don't have to re-invent the wheel for auth, file uploads, etc. So far I've seen:

- Yesod

- Snap

- Happstack

- Scotty

- Spock

Right now I'm learning Yesod, but I don't feel confident that's really what I want. Which of these are closest to Rails? Which are closest to Sinatra?


Yesod would be the closest to Rails.

Scotty would be closer to Sinatra and Flask. Spock is similar to Scotty but comes with a few more built-in features like type-safe routing, sessions, etc.

I recommend Yesod but there are certainly some advanced metaprogramming features (routing, models).

Have you checked out the Yesod scaffold site? https://github.com/yesodweb/yesod-scaffold


Yesod is the closest thing to a Haskell version of Ruby on Rails.

Scotty and Spock are both Sinatra-like.

There's a lot of good info here: https://wiki.haskell.org/Web/Frameworks


Yesod's great. There's a bit of a learning curve, but it comes with a LOT of batteries.


Could you guys explain to me what Haskell is mostly useful for? For instance, can it be used to make a web app backend or a GUI app? Or is it mostly for mathematical calculations and such? I tried to pick up Haskell once, but I guess I just couldn't get it. I mean, I got the core concepts and wrote a bunch of starter code, like a prime checker and the like. But after going through several tutorial chapters, I still could not figure out how I would use Haskell in the real world.

I don't mean to criticize or anything, just mean to understand. There are so many people who are very passionate about Haskell that it makes me think that it must be worth while to learn. But I just don't get how it would be useful for things that I do most with programming: writing Web/Desktop/Mobile apps in Swift, Python, and PHP.

Also, can you recommend a good book or resource that uses real world examples to teach Haskell?


Haskell is a bit weird in that there is no niche where it's a total killer app with the best libraries for everything. What it's really good at is that it has a very solid type system and the core language is very clean, which encourages the use of powerful abstractions (for example: coroutine libraries for async IO, parser combinators, etc).

Out of the things you mentioned, server-side programming is where Haskell fits best. Server-side programming is more amenable to unusual languages because you get to choose your own platform, and there are plenty of mature web frameworks you can use (too many of them, I might say). It might be worth experimenting with writing code in a more type-safe language. Even simple things like algebraic data types are something I miss a lot when working in other languages.
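For example (an illustrative sketch), an algebraic data type plus pattern matching makes the possible cases explicit in a way most mainstream type systems don't:

    data PaymentMethod
        = Cash
        | Card String      -- card number
        | Invoice Int      -- net days

    describe :: PaymentMethod -> String
    describe Cash        = "paid in cash"
    describe (Card n)    = "paid by card " ++ n
    describe (Invoice d) = "invoice due in " ++ show d ++ " days"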


> Could you guys explain to me what Haskell would be mostly useful for. For instance, can it be used to make a web app backend or a GUI app?

Yes, and yes.

> Also, can you recommend a good book or resource that uses real world examples to teach Haskell?

The obvious thing to recommend here is Real World Haskell [0], which directly addresses some of the areas you raise.

Also, Write Yourself a Scheme in 48 Hours [1] is more in-depth and real-world than most tutorials (writing a Scheme interpreter isn't exactly a common real-world application, but its more real-world scale than most tutorials address, and it uses a lot of things that are of concern in many real-world apps.)

[0] http://book.realworldhaskell.org/read/

[1] http://en.wikibooks.org/wiki/Write_Yourself_a_Scheme_in_48_H...


Thanks, I think Real World Haskell was the resource I was looking for. I'll try to get through it in the next month and see how it goes.


It's worth noting that it was written quite a while ago, so while it's still a great book, there are portions that have a wee bit of code rot. In the event that one of those sections trips you up, I'd encourage you to visit the #haskell IRC channel!


Haskell is useful for just about everything that Python, Ruby, Java, C#, etc are useful for. For the last 5 years my day job has been writing Haskell. Most of that has been web apps, but there have been other things like machine learning applications, a nice command-line interface to Amazon Redshift, an automated ETL tool for a large production application, and most recently a complex interactive browser app (using GHCJS to compile Haskell to Javascript).


What confuses you about using Haskell for writing Web/Desktop/Mobile apps? System.IO exposes all the primitive input/output functionality one would expect in Swift, Python, or PHP, and there is an abundance of higher-level libraries for networking, parsing, graphics, etc. Haskell even has a fairly usable C FFI.
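A minimal sketch of the kind of thing System.IO gives you out of the box (file name made up):

    import System.IO

    main :: IO ()
    main = do
        h        <- openFile "notes.txt" ReadMode
        contents <- hGetContents h
        putStr contents
        hClose h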

Haskell is a general purpose programming language.


A popular introductory text like Learn You A Haskell doesn't introduce IO until chapter 9. It never gets beyond simple toy programs and the only further resource suggested (in the FAQ) is Real World Haskell.

RWH is well-written and covers some real-world tasks, but some of its examples are outdated enough that they don't even compile anymore (at least, I encountered that scenario a year ago or so) and Haskellers will frequently warn people that parts of it are out of date (see elsewhere in these comments).

I actually think one of the shortcomings of Haskell's approach to new developers is that it _is_ very much a general purpose programming language and sold as such. Other languages have extremely popular frameworks or applications which serve to attract newcomers. People teach Swift or Objective-C to write iOS apps, Java for Android apps, JavaScript to do web apps, Ruby to write web backends in Rails, C# to write games in Unity... hell, people learn Java to make Minecraft mods. The closest thing I can think of for Haskell is Xmonad, which doesn't exactly have mass appeal.

Someone else suggested "Write Yourself A Scheme" as a good practical introduction, and that in itself says a lot about who Haskell appeals to -- people who are interested in programming languages. The MLs and Haskell remind me of Brian Eno's line about how the first Velvet Underground album only sold 30,000 copies, but "everyone who bought one of those 30,000 copies started a band".


Got sidetracked by this:

> "We store memories by attaching them to previously made memories, so there is going to be a tendency for your brain to just shut off if too many of these new, heavy words show up in a sentence or paragraph."

That has always been my belief. I don't have anything else to back it up, only that my own speed of learning seems to increase for new subjects with time. The more I know, the easier new concepts seem. Very few things are completely new, unless I start delving into subjects I'm completely unfamiliar with. Say, Quantum Mechanics.

With most programming languages, I (and probably many here) can learn enough to start creating something useful in a weekend. Haskell always gave me trouble because it seems to take longer than that.

Then again, so does Prolog. I'll try yet again.


Nice article. I've been learning Haskell for about a month, solving kata on http://www.codewars.com/ using https://www.fpcomplete.com/ as my IDE. I'm finding it quite a learning curve understanding what library functions there are and how to use them. The code I write often ends up being quite different to the other solutions on codewars.

I'm missing Visual Studio. Are there any really good Haskell IDEs out there? For example, ones which allow debugging.


Haskell with Emacs is awesome! There are a few modes available that turn it into a powerful IDE. I don't remember which ones, as xmonad + yi (or vi) is enough for me now.


How've you liked actually using yi? My only experience with it was rather frustrating.


My main reason for changing is that Emacs was getting too slow and buggy. After trying a few hacks in 'init.el' and co, it was getting worse... Suddenly, Yi!

As I code only in Haskell, it's perfect fun for me. Now, maybe a good way to start is using/practicing their Vimgolf client. [0]

In Emacs, as I didn't use any other modes (except haskell-mode ...), I don't need their wonderful package managers any more.

[0] https://yi-editor.github.io/pages/vimgolf/


I have used Leksah and EclipseFP a few times; they are OK. I don't remember how good they are at debugging code.

http://leksah.org/

http://eclipsefp.github.io/


Unfortunately, the state of debugging in Haskell, last time I checked, was pretty dismal.

Vim + ghcmod + syntastic has a useful subset of the functionalities of an IDE.


Lamdu has some interesting features: http://peaker.github.io/lamdu/


Love the Bret Victor talk mentioned on the Lamdu page: https://vimeo.com/36579366 "Bret Victor - Inventing on Principle"


Kind of agree regarding refactoring with Python. While prototype building was easier with Python, I used to make a lot of changes and always had this nagging feeling that something was not right.


I really liked this article, in particular the sections around learning things like Functor and Monad (if you haven't read the article, don't worry, it's not a list of new explanations; that's why I like it).

A minor wording recommendation:

> better in every measure(lower complexity, speed, readability, extensibility)

Apart from a missing space before the parenthesis, this reads like there was lower complexity, lower speed ...


Could just change "lower complexity" to "simplicity".


Find more discussion on this article over at /r/haskell:

http://www.reddit.com/r/haskell/comments/33mnlc/becoming_pro...


> Composing functions out of other, smaller functions offers a big reduction in complexity. If they’re named well, this allows you to write functions that are easy to read.

there are only two hard problems in CS: cache invalidation and naming things - Phil Karlton


As someone who recently grokked some of the higher-order Haskell concepts, I found myself nodding at everything this says.

An RSS feed would be great.

Also, does anyone know what colorscheme this is using for the code samples? Looks nice.


Thanks for this, I really appreciate the section at the end on a suggested learning process. Looks like I've got my weekend planned!


OP: Thanks for this. I'm teaching a class on Haskell for my company's interns this summer and I'm trying to come up with a syllabus and a plan for it. This really helped.



