

Nuclear weapon statistics using monoids, groups, and vector spaces in Haskell - jackpirate
http://izbicki.me/blog/nuclear-weapon-statistics-using-monoids-groups-and-modules-in-haskell

======
jheriko
all this proves to me is that Haskell can make even the simplest of tasks
difficult to understand or follow. perhaps I'm missing the point... :/

~~~
rauljara
I wish I didn't agree with you. I'm still a Haskell novice, and I've enjoyed
the time I've spent learning it, but this post was just terrifying. All I
could think was how simple all that would be in R (or any statistical
package), or even Ruby (or any popular scripting language).

Of course, one can write overly complicated code in any language. I think the
purpose of the post was more to show off conceptually advanced techniques
rather than to actually analyze the dataset in a straightforward manner.

~~~
jackpirate
_I think the purpose of the post was more to show off conceptually advanced
techniques rather than to actually analyze the dataset in a straightforward
manner._

Correct. I admit it turned out to be a bit too much for one post, but I wanted
to use some real-world data (the nukes) to demonstrate the techniques.

Edit: I could be wrong, but I don't think R (or any other stats package)
supports "group subtraction" of distributions, which is how we calculate the
survivable nuclear weapons.
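To illustrate what "group subtraction" of distributions means, here is a minimal sketch. It assumes a normal distribution summarized by its sufficient statistics (count, sum, sum of squares); the names (`Normal`, `subtractDist`, `train`) are illustrative, not the blog post's or any library's actual API:

```haskell
-- A normal distribution represented by its sufficient statistics.
data Normal = Normal { count :: Double, s1 :: Double, s2 :: Double }
  deriving Show

-- Merging two training runs just adds the sufficient statistics.
instance Semigroup Normal where
  Normal a b c <> Normal a' b' c' = Normal (a + a') (b + b') (c + c')

instance Monoid Normal where
  mempty = Normal 0 0 0

-- The group inverse: negate every sufficient statistic.
inverse :: Normal -> Normal
inverse (Normal a b c) = Normal (-a) (-b) (-c)

-- "Subtract" one distribution's observations from another's.
subtractDist :: Normal -> Normal -> Normal
subtractDist x y = x <> inverse y

train1 :: Double -> Normal
train1 x = Normal 1 x (x * x)

train :: [Double] -> Normal
train = foldMap train1

mean :: Normal -> Double
mean (Normal a b _) = b / a
```

So `train [1,2,3,4] `subtractDist` train1 4` gives the same summary as `train [1,2,3]` — the inverse lets you remove data points without retraining from scratch, which is the operation used to peel the destroyed warheads out of the full distribution.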

~~~
rauljara
And actually, I want to amend that: on a second read-through it was
considerably easier to follow than the first.

It's just an article for Haskellers at least a little more advanced than me. I
probably should have read it twice before labeling it "terrifying".

------
dschiptsov
Is it more precise, does it use fewer resources, does it run faster than if
done with R or Octave?

What are the benefits, if any?

~~~
jackpirate
When analyzing data in stages, algebra saves you from repeating work you've
already done. It also makes implementing the library's back end easier.
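As a sketch of how the algebra avoids repeated work: if each batch of data is reduced to a monoid value, combining batches never touches the raw data again. The names below (`Stats`, `summarize`) are illustrative, not from the post:

```haskell
-- A summary that forms a monoid: merging summaries adds their parts.
data Stats = Stats { n :: Int, total :: Double }
  deriving Show

instance Semigroup Stats where
  Stats n1 t1 <> Stats n2 t2 = Stats (n1 + n2) (t1 + t2)

instance Monoid Stats where
  mempty = Stats 0 0

-- Reduce a batch of raw data to its summary once.
summarize :: [Double] -> Stats
summarize xs = Stats (length xs) (sum xs)

mean :: Stats -> Double
mean (Stats k t) = t / fromIntegral k
```

With this, analyzing in stages is just `summarize january <> summarize february`: February's arrival costs one `<>` on two small summaries, with no re-scan of January's raw observations.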

