>>Why didn't they hire 2-3 (more expensive) experts instead of 10 juniors? <<
For the same reason they don't hire 2-3 architects instead of one architect and 10 workers to build a small house.
Actually, hiring solely "experts" may lower productivity, because they're used to doing only certain kinds of things.
In startups? Then they're doing it wrong. Even enterprise-size businesses would be doing it wrong. If a task is repeated manually and costs significant developer time (= money), it has to be automated or refactored: http://xkcd.com/1205/
Repetitions are often a result of changing requirements, things that can't (normally) be automated. If you think that enterprises are doing it wrong and you know better than they do, you could become rich by showing them the right way of doing things, on one condition: you have to prove what you're claiming. :)
It's a myth that large companies have a lot of repetitive tasks. They know how to automate. No need to tell them what they already do.
As my postings are focused on the German market, I rarely see real Lean Startups here. It's still very common to build one big thing and hope that it succeeds. So you actually don't start from scratch more than once, unless some low-profile developers messed it up.
There's plenty of concurrency and parallelism out there, but there's not a single piece of important software that is written in a functional language. Many people claim many things, but when it comes to empirical evidence, it turns out that imperative, stateful programming is still the best choice.
Erlang has some limited and very niche uses in telecom, but even at Ericsson it was replaced to a good degree with C++.
>>Important to who? I use tools written in functional languages every single day.<<
To the market. There are all kinds of servers, from web servers to database servers to others. Almost none are made using functional programming. Then there is a lot of software out there which is made almost exclusively in C/C++, Java & .NET. Haskell & co. are only used in academia and by hobbyists (math people who can't learn proper programming).
Good luck porting a professional grade tool like Eclipse. :)
Not much of a surprise. There's not only way too much useless research, but also totally wrong research (bad data, close-to-zero use of the scientific method, reliance on fallacies like correlation = causation, and so on).
Rigor in science must increase, and funding should be directed to serious, experimental research based on the scientific method.
Java can offer absolutely everything Haskell can offer, while Haskell can't offer even 0.001% of what Java offers.
You see, there's a very, very good reason why you won't see any (even remotely) serious and useful software made in Haskell, and why you won't see it used by companies (which means that jobs are literally extremely close to zero).
Haskell is nothing more than an exercise in constructing a language based on certain mathematical concepts, mainly abstract stuff like category theory.
As a result, it is as cryptic, hard to understand, and useless in practice as those theoretical concepts (though the theoretical stuff might serve as a basis for some other stuff that might be useful; not so with functional languages, and especially Haskell).
Huge mistake on their part, typical of math people.
High-level programming languages have to be as close as possible to human languages (English), not to some obscure mathematical concepts and notation that most people don't care about. Actually, math notation itself (cryptic, inconsistent, ambiguous) is a horrible result of math people's communication handicap and ineptitude.
No. The math notation that is used across all branches of mathematics is consistent and unambiguous (the notation of formal logic, naive set theory, etc). The only ambiguity typically comes from traversing different branches of the discipline, which is inevitable considering how many branches there are and how deep they go.
Furthermore, mathematics is all about communication. The language of mathematics exists to codify concepts so that they can be talked about concisely. Being able to say 'group' instead of 'set with an associative binary operation with identities' is essential if you want to be able to build on top of that concept without taking an hour to read one theorem.
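For reference, Haskell's base library uses exactly this kind of naming: `Monoid` is the class of types with an associative binary operation (`<>`) and an identity (`mempty`), i.e. the structure described above (a group additionally requires inverses). A minimal sketch, assuming only GHC and base:

```haskell
-- `Monoid` in base is the named structure described above: an
-- associative binary operation (<>) with an identity element (mempty).
import Data.Monoid (Sum(..))

main :: IO ()
main = do
  print (getSum (Sum 3 <> Sum 4 <> mempty))  -- prints 7
  print ("foo" <> mempty <> "bar")           -- prints "foobar"
```

The one word "Monoid" lets the library state laws and write generic code (`mconcat`, `foldMap`) once, for every such type.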
The handicap in communication isn't on the mathematics end, it's on your end. You seem to expect them to be able to explain structures to you that took years of work to build by using the same language that you use to talk about sports or social events. The reality is that you are not the target audience of their communication, and they are okay with that. You should be too.
The weirdest assertion that you made is that high-level programming languages ought to be as close as possible to human languages. The two categories of languages exist to communicate fundamentally and widely different groups of concepts. Words represent categories of analogous concepts, and the relevant categories in human life are nothing like the relevant categories in programming. In Haskell, 'functor', 'applicative functor', and 'monad' are highly relevant categories. They pop up everywhere and can be leveraged with great benefit. In human life these concepts are far less common, and thus do not merit words in the common vernacular. Were we to use a programming language modeled on English, we would miss the benefit of these abstractions, trading them for categories like 'dog' and 'car' which have very little practical use in typical programming.
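To make the 'functor' point concrete, here is a minimal sketch (assuming only GHC and base; `shout` is a made-up example name): one function written against the `Functor` interface works unchanged on lists, `Maybe`, and `Either`:

```haskell
import Data.Char (toUpper)

-- `shout` is written once against the Functor interface (`fmap`)
-- and works on any structure that has a Functor instance.
shout :: Functor f => f String -> f String
shout = fmap (map toUpper)

main :: IO ()
main = do
  print (shout (Just "hi"))                       -- Just "HI"
  print (shout ["one", "two"])                    -- ["ONE","TWO"]
  print (shout (Right "ok" :: Either () String))  -- Right "OK"
```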
>>No. The math notation that is used across all branches of mathematics is consistent and unambiguous (the notation of formal logic, naive set theory, etc). <<
Nonsense. Actually, ambiguity starts with basic arithmetic.
Take multiplication, for example. We have several kinds of notation for it, which is inconsistent. In the case of juxtaposition it's ambiguous, because two or more juxtaposed letters don't necessarily imply multiplication. And I'm talking about arithmetic only; from there the ambiguity only builds up: cross product, dot product & crap.
>>>Being able to say 'group' instead of 'set with an associative binary operation with identities' is essential <<<
OK, but the word "group" should be used for no other meaning...
>>>The reality is that you are not the target audience of their communication, and they are okay with that. You should be too.<<<
As you can see, pretty much anyone is the audience of some math, along with its inconsistency and ambiguity; only the level and the amount of it vary.
>>>The weirdest assertion that you made is that high-level programming languages ought to be as close as possible to human languages. The two categories of languages exist to communicate fundamentally and widely different groups of concepts. Words represent categories of analogous concepts, and the relevant categories in human life are nothing like the relevant categories in programming. In Haskell, 'functor', 'applicative functor', and 'monad' are highly relevant categories. They pop up everywhere and can be leveraged with great benefit.<<<
False. Computers and software are mainly used to emulate real-world stuff (objects, actions, etc.) and to help people with real-world stuff in a more automated way.
They aren't used much to prove theorems or other math stuff. And pretty much no one cares about proving the so-called "mathematical correctness" of a program, a concept that doesn't even make sense in most cases.
This is an old misconception among FP advocates; even Dijkstra himself admitted that he was somewhat wrong about how computers would evolve and what they'd be used for. But the associated misconceptions live on.
A language close to human language also helps avoid errors. That's why you won't see functional languages in critical systems, but rather languages like Ada, which is probably the programming language closest to human language. The claims of clarity of FP languages are pretty much at odds with the evidence the real world provides.
A programming language can be created such that it resembles a natural language while avoiding that natural language's ambiguities. Anyway, the main idea is that FP languages are way too cryptic and ambiguous: they use too many symbols with multiple meanings which don't make any logical sense at first glance. If you have to go to great lengths to explain the meaning of a simple symbol, then its use is wrong in a language that is claimed to be general purpose, clear, easy to read and so on. Either that, or the language is not general purpose and/or doesn't have those claimed qualities (clear, etc.) in a general sense.
Or, you simply have not learned enough yet to understand the language. Programming languages are not designed to be easy to pick up with no prior knowledge, they're designed to be powerful when used by professionals who actually know what they are doing.
If you're going to make the claim that functional languages use ambiguous symbols, you're going to need to back that up with some examples. I find it exceedingly hard to believe that there is any ambiguity in the operators of a statically and strongly typed language like Haskell.
Or maybe the language is very poorly designed. There are many programming languages, from Basic to LISP to C to Java to Haskell to ... Brainfuck. Some of them are used in the software industry and some are not. There are many claims about many languages: language X is good because [insert some random ramblings], language Y is good because [...]
However, no language is adopted by the industry solely based on claims (and btw, I have seen some utterly ridiculous claims made by those who try to promote Haskell). Every once in a while, some companies try out new languages.
Very few such languages get adopted, and as you can see, functional languages are almost completely absent from the industry. And there's a very good reason for it: they're simply not suitable for producing professional-grade commercial software. If it were otherwise, someone would have figured it out. The funny thing is that the start-ups that try to use them (usually founded by FP advocates themselves) also fail one after another. But some people never learn. Furthermore, many companies forbid the use of functional style or features implemented in certain imperative languages. The code of good, proper languages for general-purpose software engineering is almost self-describing! Whatever is unclear should be sorted out quite easily using the documentation.
Those "professionals who actually know what they are doing" don't seem to exist when it comes to functional languages. The evidence is the very fact there's not a single piece of important commercial software written in such a language.
The question is rather: can such specialists exist at all? I'm afraid they can't, because the functional approach is fundamentally wrong.
Examples of ambiguity in FP?
What is the following line supposed to mean, and what part of it suggests anything about that?
a b c
How is ~ an intuitive replacement for minus? How is (* 5 5) supposed to be as clear as 5 * 5?
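For reference, those last two examples appear to come from Standard ML (where `~` is unary minus) and Lisp (where `(* 5 5)` is prefix application), not Haskell. In Haskell itself the reading of `a b c` is fixed by the grammar rather than ambiguous: juxtaposition is function application and associates to the left. A minimal sketch, assuming only GHC (`add3` is a made-up name):

```haskell
-- In Haskell, whitespace between expressions is function application,
-- and application is left-associative: `a b c` always parses as
-- `(a b) c`. There is exactly one reading.
add3 :: Int -> Int -> Int -> Int
add3 x y z = x + y + z

main :: IO ()
main = do
  print (add3 1 2 3)  -- parsed as ((add3 1) 2) 3; prints 6
  print ((*) 5 5)     -- Haskell's prefix form of 5 * 5 (cf. Lisp's (* 5 5)); prints 25
  print (negate 5)    -- Haskell spells unary minus `negate`; `~` is Standard ML; prints -5
```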
P.S.: dynamic typing and type inference are two awfully bad things, and either of them can lead to trouble in large programs.
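One point of reference on the type-inference half of that claim: Haskell's inference is static, i.e. it happens entirely at compile time, which makes it a different situation from dynamic typing. A minimal sketch, assuming only GHC (`double` is a made-up name):

```haskell
-- No type annotation: GHC infers `double :: Num a => a -> a`.
-- This is static inference, not dynamic typing: a call such as
-- `double "x"` would be rejected at compile time, not at run time.
double x = x + x

main :: IO ()
main = print (double 21)  -- prints 42
```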
This depends on the point of view. I expect to find rather higher standards in Haskell programs than in, say, Ruby, for exactly the same reasons you've mentioned, and I think it's an awesome feature, quite the opposite of a mistake.
And yes, I am a "real programmer" and don't use Haskell at work myself (for run-time performance reasons).
I don't think it's ever a mistake for something to push standards higher, even if those standards are so high that it might turn off some talented people.
I also disagree overall with your assessment of mathematical notation. But it doesn't really matter because Haskell doesn't actually use any math notation. It does use some math terminology, though, and overall mathematicians are absurdly pedantic about terminology. Mathematicians may not be great at telling jokes at parties (though you may be surprised!), but it's their job to make sure they are writing in consistent and unambiguous ways.