The caveat that comes with this description is that sometimes people shouldn't create their own reality. Whether it's making decisions that ultimately lead to really screwy universes or just not being prepared to deal with the power that comes with changing the very fabric of reality, some people will not mesh well with LOP. And there's nothing wrong with that.
> Programming languages are distinguished not by what they make possible, but by what they make easy.
Practically all programming languages are computationally equivalent. The design of a new language simply seeks to answer the question: what should be easy for the programmer? Various languages are just different answers to this question.
There are maybe three major islands of languages today. Within each island, the languages make essentially the same things easy.
The answer to a question like "Python or Ruby?", "C# or Java?", or "Rust or Go or C++?" mostly boils down to historical accident: what system did the original designer pick, which usually means which one just happened to have a good library for the primary goal of the first prototype.
>"Rust or Go or C++?"
Does one of these make memory problems easier to avoid?
>"Python or Ruby?"
One of these prioritises having a single way to do most tasks, while the other prioritises programmer expressiveness (simple example: "unless" is absent in Python). Thus one is easier to onboard newbies with and have them relatively quickly be able to read the code in their ecosystem, while the other allows skilled programmers to convey semantics more efficiently.
I'll admit being mostly ignorant about the differences between Java and C#, but at least in general the languages you contrasted do make different things easier.
And yet, whatever they claim to prioritize, they aren't particularly different on those fronts.
OTOH, they have radically different features, even though both are dynamic multiparadigm languages. It's true that choices between languages often turn on ecosystem rather than language features, however different those features are, and that's as true of Ruby vs. Python as anywhere.
Haskell is an interesting case because it is a fairly simple language by default (an HM type system) with laziness as its big innovation. But it also lets you play with experimental features via language extensions. Some work out great; others, like implicit nested data parallelism, get axed eventually.
With different predecessors, that's also true of at least C and Java on that list.
OTOH, the particular *combination* of features in the grab bag can be quite significant in practice, even if none of the features is new individually.
I bet a true random sample of programming languages would be a lot weirder.
50 programming languages of the last 50 years, illustrated in 50 minutes. It includes all the popular languages, but also covers some truly weird ones.
Remember the first couple of programs you wrote? I bet they were all quite similar.
Structural typing, no generics, interface oriented.
Sure, Java has interfaces, C doesn't have generics and TypeScript is also structurally typed, but I guess nominal typing, generics, and OOP is what most people who like static typing in the first place prefer.
I almost always start Lisp development projects in a repl, getting low level things I need working. Initially I don’t worry at all about making things tidy. I keep this process going until I have something that works. Then I start cleaning up code, perhaps factoring out common code, etc. If I see enough value in the new code and if it makes sense, then I move parts out to a separate library.
It is at this point, after experimenting with a problem and having good code, that I think in some cases it would be good to step back and decide whether to design a new language for the type of problem you just solved.
Personally, most of the DSLs I write in Clojure are "private" (i.e. I write them for myself to help me develop a library or an app), and thus they tend to be small. This is why I favor functions over data: it allows me to reuse common function combinators (e.g. (fn or-fn [f g] (fn [& args] (or (apply f args) (apply g args))))), so that I do not have to reimplement them, which is something you have to do if you approach the problem with a data-first mindset. If I really want to prettify such function-based DSLs by using data to express the problem at hand, I write a DSL-as-data parser and implement it as one or more DSL-fns.
More @ https://gist.github.com/TristeFigure/20dd01b0d3415f34075cfc0...
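For readers more at home in Racket than Clojure, the `or-fn` combinator above translates to something like this (a sketch; the name is just taken from the comment):

```racket
#lang racket

;; Racket analogue of the Clojure or-fn combinator: combine two
;; predicates into one that succeeds if either does.
(define ((or-fn f g) . args)
  (or (apply f args) (apply g args)))

;; Illustrative usage:
((or-fn string? symbol?) 'hello)  ; => #t
```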
How does this compare to Racket DSLs ?
I think when people think about "programmers/developers" vs "computer scientists", this is where the difference could really show up in the future. A developer who knows how to code and hack may not have the skill/design knowledge to properly create a language/DSL, perhaps making a monster in the process. A well trained computer scientist (academic, self taught, doesn't matter) should have those tools and more importantly the design know how. A lot of teaching (again in both academia and not) really lacks on teaching good design for programming generally.
I think we're already starting to see this difference show up with libraries and frameworks (somewhat already DSL's themselves), and if we keep moving towards the direction of DSL's I think you will find many of the bright programmers of tomorrow going away from writing code to make things but instead code to make languages to make things. The final step of code to thing (from a technical perspective) seems to become increasingly boilerplate more than ever before as tooling keeps expanding. You still need to know the deep internals for large scale things, but it's never been easier to spin up a quick project that "just works" and does quite a lot. If library/framework/language creators do their job, this should only get easier.
At a certain point I wonder if the actual programming ever becomes a "lower tier" of the software world.
The author of Beautiful Racket, Matthew Butterick, is a lawyer and a typographer. He is not a computer scientist. And yet, he designed a DSL called Pollen for creating web-based books. Pollen has been quite successful within the Racket community.
My understanding is that Racket is predicated on the idea that people like Matthew, as much as people like your computer scientists, are the best authors of DSLs. They are the ones who understand the domains they're working in. If the division you predict ever does exist, it is because the tools have failed to make people who are not computer scientists capable of creating the DSLs they need to solve their problems. It's not because languages are inherently better designed by experts in the domain of programming languages.
That said, there's many ways to write code and learn to code, and I think of the web programming bootcamp style or the cookie cutter college grads who go through four years learning how to program in X language to work at fancy company Y. My point is that many routes are just not focusing on the higher level design skills that I think are needed to make good libraries/frameworks/DSL's.
To clarify, I'm not saying you have to be a PL expert, simply good at program and language design, which I think is what lacks in many places and could create such a division. That skill is/should be accessible to everyone.
>many routes are just not focusing on the higher level design skills that I think are needed to make good libraries/frameworks/DSL's.
I have observed that too. But I don't think this is about who is and isn't a computer scientist, whether self-taught or formally trained. I think it's more a change in the way people relate to programming languages. Perhaps programming languages were commonly assumed to be principally an academic topic. Perhaps it's not that more people are becoming computer scientists but that more people are finding non-academic ways to relate to programming language design. I think what Butterick did was to build the tool he needed to do a job (write a book). So language design becomes just like any other form of hacking.
>Why wouldn't the author qualify as a self taught computer scientist though?
Butterick himself is adamant that he is a lay person. He compares himself to a "squirrel in a ferrari": https://www.youtube.com/watch?v=IMz09jYOgoc And that's his point in that talk - Racket makes it possible for even the lay person to build the language they need.
I think there are some very interesting and elegant ideas in some languages that come from academia but it seems that something is missing when it comes to designing languages that are actually widely adopted. Despite their elegance Haskell and Racket are much less used than Go and Java.
It also has an incredibly powerful inheritance system. In 3-5 lines I can inherit all the functionality from Racket and then add, modify, or delete any part I like. Reminder, I can do all this from the compile time level, not the runtime level. I can change something as basic as how you define functions with just a few lines of code while still getting everything else.
Basically, I can easily write Racket+X, or Racket with modified Y, or Racket without Z, and make it its own entire language with very minimal effort.
A lot of this occurs because of Racket's insanely good macro system. You can also use these tools in libraries/packages and still get the compile time control while the language is still officially Racket.
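As a sketch of what that inheritance can look like (file name and choice of removed form are purely illustrative), a "Racket without `set!`" language is little more than a module that re-exports everything else:

```racket
;; mylang.rkt -- illustrative sketch: "Racket minus set!"
#lang racket

;; Re-export all of racket's bindings (including #%module-begin, #%app,
;; etc., so the module can serve as a language) except set!.
(provide (except-out (all-from-out racket) set!))
```

A client module could then start with `#lang s-exp "mylang.rkt"` and get all of Racket except mutation via `set!`.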
No more work than any other language: Racket has recursive-descent, lex/yacc, and PEG parser generators. There is a whole section on parsers at http://docs.racket-lang.org/ including ‘Megaparsack’, based on higher-order parser combinators (like Parsec for Haskell).
Racket can also read ‘C’ yacc/bison grammars.
It’s worth noting that a variety of non-Lisp-parenthesis-languages have been done in Racket including Java and Python.
But there's also stuff like:
And after a quick Google I found:
And there are of course things like:
But I suppose that qualifies as "lisp like"...
[ed: and the article mentions "brag", which I think would be odd to call "lisp like"]
1. Lispers often find s-expression syntax easier to edit for many applications. So it's not surprising that a lot of the folks who design languages in Racket also might choose an s-expression syntax for their language design.
2. By using the same (or similar, in the case of Pollen/Scribble/...) syntax as the main racket language, it's easy to give access to the full power of racket in a DSL.
Some of the more specialised languages are different, though.
Let's be fair, many lisp programmers would be perfectly fine with that.
Unless your goal is to be source compatible with another lisp variant, of course.
I don't know. Most examples I've seen from people extolling the virtues of lisp macros and metaprogramming wind up just generating more lisp code in the same language with the same syntax and semantics, so I don't know why you couldn't just use functions in that case.
To be fair, I only have a surface understanding of one lisp variant (Arc) so it's entirely likely I just don't get it.
(define (show x)
  (match x
    [(vector a b c)
     (display "It's a vector, total = ")
     (displayln (+ a b c))]
    [(list p q)
     (display "It's a list, average = ")
     (displayln (/ (+ p q) 2))]))
(show (list 8 2))
(show (vector 1 2 3))
The idea is that `match` can analyze the value of the first argument (`x` in this case) and then it binds the variables (a,b,c or p,q in this case).
In this case I'm matching only a vector with 3 values or a list with 2 values, but you can match more complicated and nested patterns.
The interesting part is that inside match you have the full power of the language; you are not restricted to a few special hardcoded cases (like `display`). You can send an HTTP request, play a sound, save a value to disk, calculate a prime factorization, transform text to uppercase: anything that is available in the language is available inside match.
But `match` is just a normal macro. Racket is implemented in layers. The bottom layer is currently implemented in C, but it doesn't define `match`. That bottom layer is almost hidden; it is used to define some libraries and an intermediate language, which in turn are used to define more libraries and another intermediate language, and so on, until after a few steps you get the official Racket language.
But if you like you can reimplement a macro like `match`, or a variant, or your own macro, with all the magic to define local variables and use all the power of the language inside it.
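To make that concrete, here is a toy matcher written with `syntax-rules`. It is nothing like the real `match` (which is far more general); it is only a sketch of how a user could build one, supporting a single pattern shape:

```racket
#lang racket

;; Toy matcher, illustrative only: supports exactly one pattern shape,
;; (list p q), for two-element lists.
(define-syntax my-match
  (syntax-rules (list)
    [(_ expr [(list p q) body ...] clause ...)
     (let ([v expr])
       (if (and (list? v) (= (length v) 2))
           (let ([p (car v)] [q (cadr v)])
             body ...)
           (my-match v clause ...)))]
    [(_ expr) (error 'my-match "no matching clause")]))

;; (my-match (list 8 2) [(list p q) (/ (+ p q) 2)])  ; => 5
```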
Despite having the same syntax, `#lang lazy` and `#lang racket` behave differently:
;; lazy-example.rkt (body reconstructed for illustration)
#lang lazy
(define (say-hi _)
  (displayln "hi"))
(say-hi (exit 1))

$ racket lazy-example.rkt
hi
$ echo $?
0
;; racket-example.rkt (same code, different #lang)
#lang racket
(define (say-hi _)
  (displayln "hi"))
(say-hi (exit 1))

$ racket racket-example.rkt
$ echo $?
1
So given an and() function, you can't safely do "and(False, fire_missiles())", because the language will evaluate both arguments.
But an and!() macro could: macro expansion is essentially "lazy", in that it happens prior to the evaluation phase, so it can avoid evaluating any pieces of code it wants to, such that "and!(False, fire_missiles());" is perfectly safe, because our macro can stop expanding after "False" and thus fire_missiles() is never evaluated.
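In Racket terms, such a macro is a one-liner (a sketch; `and!` is just the hypothetical name from the comment):

```racket
#lang racket

;; Short-circuiting and! as a macro: `b` is only evaluated when `a` is
;; true, because expansion happens before evaluation.
(define-syntax-rule (and! a b)
  (if a b #f))

;; (and! #f (error "missiles fired!"))  ; => #f, the error is never raised
```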
Incidentally, this is why at least some macro patterns are unnecessary in lazy languages like Haskell: you can write short-circuiting and() as a function there, because a function in Haskell only evaluates as much as is necessary (provided it's been written properly). Yet even there, Haskell still has support for macros and things like TemplateHaskell and so forth, because there's just some things you can't do solely with functions, like arbitrary syntax, language extensions, etc.
Modifying control flow, for one. A classic example is anaphoric if. If you want to implement that, you can't do it as a straightforward function; it needs to be a function that modifies syntax, i.e. a macro.
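Anaphoric `if` binds the result of the test to a name (conventionally `it`) that is visible inside the branches. A Racket sketch, deliberately breaking hygiene with `datum->syntax`:

```racket
#lang racket

;; Sketch of anaphoric if: binds the value of the test to `it`.
;; datum->syntax gives `it` the caller's lexical context, so the
;; caller's code can see the binding (an unhygienic macro).
(define-syntax (aif stx)
  (syntax-case stx ()
    [(_ test then-e else-e)
     (with-syntax ([it (datum->syntax stx 'it)])
       #'(let ([it test])
           (if it then-e else-e)))]))

;; (aif (assoc 'b '((a 1) (b 2))) (cadr it) 'not-found)  ; => 2
```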
// a line comment
@$ 'null $@,
@$ (* 6 7) $@,
@$ (= 2 (+ 1 1)) $@,
@$ (list "array" "of" "strings") $@,
@$ (hash 'key-1 'null
         'key-2 (even? 3)
         'key-3 (hash 'subkey 21)) $@
see html encoding through functions, or kotlin builders, or json dsl ..
Neither syntax objects nor hygiene are a concern when you build external DSLs, which the author's examples seem to be.
I have not yet seen a compelling argument for Racket there vs. for instance a parser generator framework.
I might just be missing the point. Either way, the author's argument didn't convince me.
Racket allows for easy interop between the various languages, so for instance, you can use typed racket from untyped modules. You can import documentation modules in base racket and so on.
With other languages, you have to glue the infrastructure yourself, racket does it for you.
I think PG has some writings on this topic, but I don’t remember if it is online or in his LISP textbooks. His company Viaweb did this, IIRC, using a DSL written in LISP to generate HTML.
This is nothing special. In all languages, code is a list: a list of characters, that is.
Beyond the code being a list, Racket's macro system and syntax parsing/creation tools are incredibly powerful and come built in; in another language you could spend years just getting the tooling to write a language, let alone writing the language itself.
Not sure if that makes anything clearer but I can't think of a better phrasing.
To address the examples you gave, RSpec, Chef, etc., those literally ARE just using Ruby, no new syntax, no new semantics.
Although using Racket for this wholesale integration appears impractical, it does suggest a killer app: extending general programming languages. In other words, using Racket as a transpiler, i.e. a better Babel.js, but for any language.
I routinely use Python for machine learning tasks, but frequently am frustrated by the inflexibility of the language. Instead of solving the problem itself I have to first solve how to make the language do what I want (usually resulting in code that lacks expressiveness and brevity, and thus extensibility).
I’d love to use lisp to leverage ML work that can only be done effectively in Python (Tensorflow, PyTorch, etc).
Can Racket be an effective transpiler?
Racket was developed by PLT Inc.
There's not a lot of information on PLT Inc. these days, so: Racket was developed under direction of Matthias Felleisen, with his students Matthew Flatt, Shriram Krishnamurthi, and Robby Findler serving as the core development team. They also published the book "How to Design Programs" , and Krishnamurthi published "Programming Languages: Application and Interpretation" , which are both books that rely on Racket. (PLAI actually uses its own language which is developed in Racket, called plai-typed, but which relies on the DrRacket environment.)
(You knew there was a "but").
While I don't quite agree with the assertion that with the right tooling (so: this tooling), creating a language is as easy as creating a library, I don't think it would solve our problems even if it were true.
Language design is much harder than library design, even if the tooling is perfect and transparent. In the extreme we're going to end up with what Alan Kay calls inverse vandalism aka the Mount Everest Effect: making things because we can, not because we should or they're good.
In essence, designing a (domain specific) language should be your last resort, not your first resort. At least one that requires a parser or macro system to implement. After all, when we create a library, we are in fact creating a domain specific language, or at least a domain specific vocabulary. A framework goes a little further towards the idea of language. In fact, the first version of Simula was a simulation DSL. It was the second version that made the language into an OO language and the simulation "DSL" into a framework. The advantage being that you can use the same language with different frameworks for other domains.
If we look at natural language, it is very much the same: we use the same language to talk about any number of domains, the most we tend to add is some domain-specific vocabulary. Imagine the tower of babel we would suffer under if every domain had a completely new and different language!
That effect was very much encountered, and later described, by the LRG at PARC when they started using Smalltalk-72, which allowed every object to define its own language. It was chaos! Turns out we are not very good language designers in the heat of the programming battle. So they very consciously made the language more restrictive later. However, the Smalltalk message syntax is specifically designed to give you something akin to a DSL, but without most of the problems.
A similar effect was documented by Bonnie Nardi for end users. Here as well, the expectation was that users would prefer domain-specific tools that were tailored exactly to their needs. This, surprisingly, turned out not to be the case. End users apparently vastly prefer general tools, even if the effort to adapt them to specific tasks is greater (see also: people writing documents in Excel...)
That said, current general purpose programming languages are probably not quite flexible enough. I think there is a sweet spot to be found somewhere between Smalltalk-80 and something like Racket.
It’s incomprehensible, and for each system you are operating you have to learn some arcane YAML incantations that have a dramatic impact on your production systems. It’s a world crying out for a set of well-factored DSLs.
On the other hand, I really would like to see a world with tens or even hundreds of little (preferably non-Turing-complete) languages with extremely good integration (with each other) and very good tooling and visualizations. Because right now the languages we use tend to be way too powerful, which prevents good tooling and optimizations.
Of course, anyone would expect those proficient in Racket and LOP to have a better understanding of programming and software design in general, and to be generally more productive than many, but that's not necessarily because of LOP.
But regarding your comment on users preferring general tools - I notice this happening in my domain, where the trend is now to build absolutely everything with Python, regardless of whether this is a good choice or not. And it's impressive how libraries are built with embedded languages that feel completely non-pythonic, just for the purpose of using Python. Library writers could at least strive to make better use of the syntax of the general language (see Tensorflow vs. Pytorch).
It'd be interesting to see more studies on general languages vs DSLs, and reasonable heuristics to know when to prefer one over the other.
Now it's not only end users who prefer more general tools (see also: people writing a NES emulator in Emacs Lisp).
"Macros are catnip to a certain type of developer"
This is the trap of macros and LOP approaches. This development approach often leads to code that makes sense to nobody in the world except the original developer. There are a lot of genius pieces of code out there using macros and DSLs, but there is also a lot of spaghetti code with a side of Macro meatballs.
Has anyone used Racket with success in another IDE or programmer's editor?
Sometimes maximum precision means having a first-class typechecker & type inference. (You could build some of that too in Racket if you really need to. Of course, that code itself will not be verified or typechecked.)
> Why aren’t these languages DSLs?
They aren't DSL's because they need statically-linked binaries, and good error messages, and then your task is not as easy as desugaring to a LISP/dynlang.
Easy, just used typed Racket :)
Either way, that seems like a silly approach, IMO.
Well, wouldn't we expect the author of a particular programming language to be mad productive in it because, you know, they designed all the mental models and assumptions that are baked into it?
You would never once stumble on syntactic ambiguity because, to you, there is no ambiguity... or perhaps there is, but you slyly added that bit just to fuck with programmers' heads... or who knows the reason.
I'm just postulating that for an author to boast about their productivity with the language they designed is somewhat self-serving.
Racket is built by a team and has been around a long time.