This release adds "binding operators," which make functors, applicatives, and monads more convenient to use. The "binding operators" are like OCaml's version of Haskell's do-notation, but IMO even better. In Haskell's do-notation, each binding in the form of `pat <- expr; next` desugars to monadic bind, with an extension to use applicatives instead where possible, but with OCaml's binding operators, there are three separate operators for functors, applicatives, and monads.
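For concreteness, here's a minimal sketch of the three-operator idea for the option type. The operator names follow the convention in the 4.08 manual; `add_opts` and `half_of_sum` are made-up example functions:

```ocaml
(* Sketch: binding operators for the option type.
   (let+) = map (functor), (and+) = product (applicative),
   (let*) = bind (monad). *)
let ( let+ ) o f = Option.map f o
let ( and+ ) a b =
  match a, b with
  | Some x, Some y -> Some (x, y)
  | _ -> None
let ( let* ) o f = Option.bind o f

(* Applicative style: x and y are independent computations,
   so (and+) can combine them without monadic sequencing. *)
let add_opts a b =
  let+ x = a
  and+ y = b in
  x + y

(* Monadic style: the second computation depends on the first. *)
let half_of_sum a b =
  let* s = add_opts a b in
  if s mod 2 = 0 then Some (s / 2) else None
```

Note that the `Option` module used here is itself one of the new 4.08 stdlib additions.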
The 4.08.0 release also adds the Fun, Option, and Result modules to the standard library.
I meant Haskell's do-notation sugar - it only works for monads (or applicatives, assuming ApplicativeDo and absence of dependencies on intermediate results). OCaml's new "binding operators" let you define your own variations of `let` and `and`, which resemble do-notation and therefore have the same cognitive convenience. Then, you can define `let` for functors, etc. You can basically have `let` sugar for any operation that takes a function as the second argument.
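To illustrate the "any operation that takes a function as the second argument" point: a sketch defining a `let@` operator (a name chosen here for illustration, not a stdlib one) as plain application, which gives let-sugar for resource-style combinators like `Fun.protect` (itself new in 4.08):

```ocaml
(* (let@) turns any callback-taking function into let-syntax:
   let@ x = f in body  ==  f (fun x -> body) *)
let ( let@ ) f k = f k

let log = Buffer.create 16

(* Fun.protect ~finally:g : (unit -> 'a) -> 'a, so it qualifies. *)
let run () =
  let@ () = Fun.protect ~finally:(fun () -> Buffer.add_string log "cleanup;") in
  Buffer.add_string log "work;";
  42
```

The body runs first, then the `finally` handler, so after `run ()` the buffer contains `"work;cleanup;"`.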
It doesn't get said very much but the ML ecosystem is full, complete, and very much production quality. It is a shame that there are not many projects or companies that make use of all that it has to offer.
And I have a similar feeling of sadness that Racket (or one of the other Scheme variants) isn't used more.
I've wondered whether one of the barriers is still that students' only exposure to these is usually using them in school for some contrived homework assignments, and dismissing the tools for "real work" before they've tried applying them. (And at least some schools are emphasizing having students ready for internships/interviews with popular languages now, so innovative languages can get even less attention.)
I'm always surprised that F# didn't take off. It's a great language, and in addition to the strengths of ML you have the .NET ecosystem and Microsoft behind it.
It took a while before it seemed that Microsoft was truly behind it. I tried looking into it when version 2.0 was released, and it was a pain in the ass trying to get it to work with Visual Studio 2010 Express. I was not a professional programmer at that time, so I didn't have access to Visual Studio Pro. That experience turned me off to the language, and I never really went back to look at it.
There are some hedge funds using F#. C# has historically been popular in certain parts of finance, and many of those shops are now exclusively using F# for new code.
Personally I find F# nice and all, but the lack of higher-kinded types rubs me the wrong way.
I really want to use OCaml for web application back-ends, but last time I checked, the ecosystem around that in OCaml is less than extensive compared to languages like Python, JS, Java, etc. It really is a shame. I need to set aside some time to really dive deep into OCaml and make a simple micro framework or something.
From my small amount of exposure, I can say that I love OCaml's expressiveness and flexibility.
There are some scattered pieces in the ecosystem for routing, sessions, etc., but I agree that compared to Python, JS, etc., the ecosystem might not look as cohesive or expansive for web applications. So there should definitely be room for new solutions in this space.
Did you check out F#? I've never personally used it, but I know it takes advantage of the .NET ecosystem (and with .NET Core, it is no longer tied solely to Windows).
One little thing that would improve the ecosystem is public CI support that is as simple as it is for other languages. Yes, e.g. Travis or Circle can do OCaml, but with considerably more effort than in other languages, where for personal projects I often have a one-liner.
I have not been using OCaml, but I have been using Reason / Bucklescript on Windows extensively. Other than some expectation that GNU tools like cp are available on the path, everything has worked beautifully.
I'm not entirely convinced that the let operators were a good idea. What's wrong with the ppx rewriter notation (let%bind = ...)? That's, after all, what it's for: provide custom extensions to the language syntax.
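For comparison, a sketch of both styles. The `Let_syntax.bind` desugaring in the comment is roughly what Jane Street's ppx_let produces (see its README for the exact form), while the runnable part uses only the new built-in operator plus `Result.bind` and `Option.to_result` from 4.08; `parse_sum` is a made-up example:

```ocaml
(* With ppx_let, a preprocessor pass rewrites
     let%bind a = e1 in e2
   into roughly Let_syntax.bind e1 ~f:(fun a -> e2) before the
   compiler sees it. The built-in operators need no such pass: *)
let ( let* ) r f = Result.bind r f   (* Result.bind is new in 4.08 *)

let to_int s =
  Option.to_result ~none:(s ^ " is not an int") (int_of_string_opt s)

let parse_sum s1 s2 =
  let* a = to_int s1 in
  let* b = to_int s2 in
  Ok (a + b)
```

One practical difference: binding operators are ordinary definitions checked by the type-checker, whereas a ppx is an extra build dependency that rewrites the AST.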
Quality of life improvements in the standard library are maybe the biggest deal for me (since I don't use any stdlib replacement like Batteries or Base). Stuff like the Int, Bool, Option, etc. modules, or filter_map... It's mind boggling how they've only been added now (but good thing they were).
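A quick sketch of a few of those additions in use (the values here are arbitrary examples):

```ocaml
(* List.filter_map: map and filter in one pass (new in 4.08). *)
let squares_of_evens =
  List.filter_map
    (fun x -> if x mod 2 = 0 then Some (x * x) else None)
    [1; 2; 3; 4; 5]

(* Option.value: unwrap with a fallback instead of pattern matching. *)
let port = Option.value ~default:8080 (int_of_string_opt "not-a-port")

(* Option.map composes nicely with the pipeline operator. *)
let doubled = int_of_string_opt "21" |> Option.map (fun x -> x * 2)
```

Before 4.08, each of these was a few lines of hand-rolled `match`, or a reason to pull in Batteries or Base.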
I wonder why people working in PL research aren't more interested in concurrency. Typeclasses, monads, etc. are great, but they still haven't reached the point where everyday people can expect to write provably safe programs (unless the code base is extremely small and you're using special software such as Coq), and while they make programs more concise and expressive, I think concurrency remains the elephant in the room.
Rust is the first PL I've heard of in a long time that actually tries to do something in that area since Erlang (which opted for strict message passing and a let-it-crash philosophy, but gave you the tools to deal with that approach).
Is it because it's too "down to earth" for PL purists, or is it simply because they don't have the mathematical or logical tools to deal with the problem?
Quite a lot of things have been tried since Erlang (1986). Concurrency has been a fairly well funded sector of compiler and PL research to the point of starving other arguably more important areas.
Futhark is one of the more interesting recent parallel languages IMO since GPU languages were in such a lull for a while. Too bad everyone is stuck with emitting opencl or going vendor specific though.
> Concurrency has been a fairly well funded sector of compiler and PL research to the point of starving other arguably more important areas.
Concurrency is enormously important in a world where instruction speed has saturated. Many problems are highly data-coupled, so you can't just separate these problems into loose tasks that you connect with a slow message pipe.
Of course, if you're just writing a web server or a chat server, then you might be lucky and you can get away with it. But please don't assume this holds for everybody.
Yeah, but that's "just" speed of execution, while for most things the bottleneck of making something possible is correctness and ease of development while keeping complexity in check. So yes, it's important, but still mostly less important than some other things :)
TensorFlow is interesting in this respect: leveraging a lot of parallelism and using a parallel PL under the hood, while presenting itself as a dev-friendly Python lib on the surface.
The problem is that engineering an efficient concurrent garbage collected runtime environment is hard. It may even be harder than writing the compiler itself, especially if you rely on llvm for the back end.
Python was already in the top 5 well before the whole AI craze and before pandas was out.
Python has a huge amount of introductory learning material that assumes it is your first language, while most FP languages (Ex: Clojure, Haskell, F#, Scala, OCaml) really struggle in this area. I really like FP, but there is a bit of a steep plateau when learning. If you look at some of the questions people ask on the Python Stack Exchange, you get the impression that millions are learning it as the de facto first language (I was one of those nearly a decade ago). I try to find similar paths to FP, and everything from the tooling to the lack of thorough introductory material keeps killing it for me.
Another issue is that from a pedagogy perspective, all the "building blocks" in FP are different from what many people already know. If you're used to the imperative/OO paradigm, you can move between languages by just learning the syntax for lists, dictionaries, while and for loops, branching, array access, file IO, and classes. When learning FP you have to learn similar, but different, concepts (pattern matching, monads, currying, discriminated unions, etc.).
The beginner books that do exist (e.g. Learn You a Haskell) are nice, but I've talked to many people (myself included) who, when they finish, say "I still have no clue how to program in Haskell". To give another example, I spent two weeks reading a Python book on building text games, and when I was finished I was like "OMG, I can do stuff". That book covered how all the main data structures could be used, with short and fun programs. It also included reading text files, string operations, pickling data, classes, modules, and use of the included IDLE IDE. It was great.
> Python has a huge amount of introductory learning material that assumes it is your first language, while most FP languages (Ex: Clojure, Haskell, F#, Scala, OCaml) really struggle in this area
I don't know how long ago you tried Clojure, but it is a lot easier to start with than Haskell or Scala. There are now more than a dozen books available (at both beginner and advanced levels). Clojure is much better than Python: it has an extremely nice, consistent standard library; it has a "true" REPL, with which you can evaluate almost any chunk of your code with no preliminary ritual (even the much-praised Jupyter doesn't feel as nice); Clojure is not statically typed, but it has Spec, which is totally awesome (the way you can derive property-based tests is almost mind-blowing); Clojure's stability is almost legendary; it makes concurrency simple; and it makes dealing with dependencies less painful.
It can seamlessly run on both front-end and back-end, and having live updates in your browser with a REPL connected to it feels like magic. Honestly, transforming data using Clojure is a pure joy.
I have a few Clojure books, but not knowing Java and the JVM very well, and then having to learn Emacs/CIDER and all the other tools, really kind of killed it for me. In short, some of the Clojure REPL advantages are better than Python's, but not by enough to justify learning the ecosystem.
I might try again later. I have "Clojure for the Brave and True", Carin Meier's book, and one of Fogus' books.
I really want to learn Clojure, but just need to sit down and put the time in. It's also discouraging to see people comment about some of these languages (Clojure and F#) as being on life support.
I can't say anything about F#, but Clojure is doing quite alright. It gathers more conferences and meetups around the world than Haskell, OCaml, F#, Elm, or Elixir. It has more podcasts (defn, The REPL, the ClojureScript podcast, Apropos, Cognicast), and there are other podcasts created and run by people actively using Clojure, where they talk about more than just Clojure. The Clojurians Slack and clojureverse.org are very active. New libraries and books come out regularly. My company was recently hiring; I gave a shout-out on Twitter and got DM requests from all over the world: Chile, Mexico, Brazil, Japan, India, Bangladesh, Jordan, Poland, Latvia, Ukraine, Russia, the UK, Germany, the US, and other countries. People want to write Clojure full-time. So, yeah, I don't know where you read that Clojure is dying or whatever. It is not as big as Python or JavaScript, but it's slowly, steadily growing.
I think Python became so popular because Google started using it, though they later switched to Golang. But the Python train was unstoppable at that point; even MIT swapped Scheme for it in their CS 101 course.
Modern security exploits and application crashes due to state corruption have proven that while threads were an enticing idea, they aren't something that we should keep around if we care about security and application stability.
This applies to all programming languages, not only OCaml, which is why everyone is moving to other concurrency models anyway.
You have two matrices, with allocated memory A and B, that you want to multiply and store to allocated memory C, as efficiently as possible using 8 cores. How do you do this without threads exactly?
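The implicit point here is that this workload wants shared memory. As a sketch of what that shared-memory version looks like, here is a parallel matrix multiply using the `Domain` module from OCaml's later multicore runtime (OCaml 5, not available in 4.08; included only to illustrate the approach). Rows of `c` are partitioned across workers, each worker writes a disjoint slice, and none of the matrices are ever copied:

```ocaml
(* Parallel n x n matrix multiply: c = a * b, rows of c split
   across `domains` workers sharing one heap. Each worker writes
   a disjoint row range, so there are no data races. *)
let matmul ~domains a b n =
  let c = Array.make_matrix n n 0.0 in
  let worker lo hi () =
    for i = lo to hi - 1 do
      for j = 0 to n - 1 do
        let s = ref 0.0 in
        for k = 0 to n - 1 do
          s := !s +. a.(i).(k) *. b.(k).(j)
        done;
        c.(i).(j) <- !s
      done
    done
  in
  let chunk = (n + domains - 1) / domains in
  let workers =
    List.init domains (fun d ->
        let lo = d * chunk in
        let hi = min n (lo + chunk) in
        Domain.spawn (worker lo hi))
  in
  List.iter Domain.join workers;
  c
```

A multi-process version of the same thing would have to either copy `a` and `b` into each process or serialize them through a pipe, which is exactly the overhead being argued about here.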
I wanted an efficient no-threads solution to a very common problem: matrix multiplication on multiprocessor CPUs. You gave me nonsense answers like vector operations (which use only one CPU at a time) and GPUs (memory-transfer bottleneck, the need to copy arrays, significantly less RAM than CPU RAM, and GPUs make it a completely different game). So no, I'm not moving goalposts.
And will spend more time shuffling data through their horribly bottle-necked memory interfaces than the CPU would take handling the matrix operation, especially if the GPU is already used for graphics.
Multi-core systems and tools that can use them are valuable and very much reality right now.
For example: embedded applications, or when I'm the only one running trusted code on my computer. Think also of scientific computing.
If you insist on networked solutions: database servers.
Also: "multicore + purely functional style" is just as clean as "multiprocess + purely functional style", but ... the latter is less efficient because of communication overhead.
Unless you manage to do any kind of work without IO, there is always a security risk.
And if that is irrelevant to you, there is still application stability: ensuring data is handled in a memory-consistent state.
Fortran is a good example on how to do scientific computing without low level threads.
Yes threads should be used with extreme caution, that's why I advocate using a functional programming style. But you still want the performance of multicore. It's not one or the other (but with OCaml it unfortunately is).
Correctly implemented concurrency/parallelism primitives are not a bad idea. I am thinking along the lines of Clojure/Erlang/Go. They make life more enjoyable.
One of the strengths of functional programming is using structurally shared immutable data structures. You can't use this approach efficiently if you don't share a heap with other processes. Converting everything to a shared byte array (or similar) mostly ruins the usefulness of the programming language, and also makes GC impossible.
Shared memory is still an issue though. To my knowledge there is no convenient way to share memory between processes in OCaml. That said, you are right, threads are horrible.
Concurrent programming in OCaml is fine though. Multicore is more about parallelism and the ability to use multiple cores without the need to rely on threads and multiple processes
* The printing of error messages has been improved to display the source code responsible for the error outside of the REPL.
* Some probable beginner errors (like discarding a non-applied function with `ignore`) now raise a warning.
* Type name captures should not happen anymore in error messages: no more "`val x: int` is not included in `val x: int`" where the two `int`s silently refer to different types.
* A handful of typing error messages have been reworded to speak to users rather than compiler developers (no more "unexpected existentials", for instance).
* Some compiler-internal changes make it much easier to use the typing context when explaining a type error. This is not used much yet, but I hope to improve scope-related type errors in the next versions.
While not a traditional IDE, Merlin [1] combined with vim/emacs/vscode is really good (autocompletion, jump to definition, type lookup and more). Merlin also gained support for language server protocol recently so that could be an option for any editor/IDE that has a language server client.
EDIT: I'd also like to point out OCamlformat (https://github.com/ocaml-ppx/ocamlformat). It works really well in my experience and makes it really easy to perform automatic code formatting, and it can be used in CI to check that changes conform to the formatting style a project prefers.