Why Language-Oriented Programming? Why Racket? (beautifulracket.com)
281 points by jessealama on Feb 23, 2019 | 105 comments

I was thinking recently about why I find programming languages so interesting. The answer I came up with was that programming languages allow you to create your own reality. You get to define how things work in this reality. Want functions to be values that can be arguments and return values? Sure! Want a lot of crazy symbols to do complicated math? Go for it! Want everything to be dynamic? Why not.

The caveat that comes with this description is that sometimes people shouldn't create their own reality. Whether it's making decisions that ultimately lead to really screwy universes or just not being prepared to deal with the power that comes with changing the very fabric of reality, some people will not mesh well with LOP. And there's nothing wrong with that.

The problem is that people often create languages that are basically the same as existing languages; no new concepts. New languages are usually a grab-bag of features found in C, Java, Lisp and/or Haskell.

There's a quote on the whiteboard in my lab that is (perhaps falsely) attributed to Larry Wall that says something like:

> Programming languages are distinguished not by what they make possible, but by what they make easy.

Practically all programming languages are computationally equivalent. The design of a new language simply seeks to answer the question: what should be easy for the programmer? Various languages are just different answers to this question.

This seems unsatisfying. Most of the modern languages are choosing pretty much the same sets of features. I've been using them for years (decades?), and I couldn't tell you significant differences between most of these languages.

There are maybe three major islands of languages today. Within each island, they make essentially the same things easy.

The answer to a question like "Python or Ruby?", "C# or Java?", or "Rust or Go or C++?" mostly boils down to historical accident: what system did the original designer pick, which usually means which one just happened to have a good library for the primary goal of the first prototype.

I'd add some caveats to GP's comment, but I think it's mostly spot on.

>"Rust or Go or C++?"

Does one of these make memory problems easier to avoid?

>"Python or Ruby?"

One of these prioritises having a single way to do most tasks, while the other prioritises programmer expressiveness (simple example: "unless" is absent in Python). Thus one is easier to onboard newbies with and have them relatively quickly be able to read the code in their ecosystem, while the other allows skilled programmers to convey semantics more efficiently.

I'll admit being mostly ignorant about the differences between Java and C#, but at least in general the languages you contrasted do make different things easier.

Edit: grammar.

> One of these prioritises having a single way to do most tasks, while the other prioritises programmer expressiveness

And yet, whatever they claim to prioritize, they aren't particularly different on those fronts.

OTOH, they have radically different features, even if both are dynamic multiparadigm languages. It's true, though, that decisions between languages often turn on ecosystem rather than language features, independent of how different those features are, and that's also true of Ruby vs. Python.

>the differences between Java and C#

Not much.

Are you primarily targeting Microsoft platforms? If so, then C# might have the edge. Otherwise, you might prefer Java.

I feel like programming languages that want to be used in production have a novelty budget. Rust does a great job of managing this, for instance.

Haskell is an interesting case because it is a fairly simple language by default (Hindley-Milner type system) with laziness as its big innovation. But it also lets you play with experimental features via language extensions. Some work out great; others, like implicit nested data parallelism, get axed eventually.

At the same time, even a language that is just a combination of features from those languages can strike a different balance between the concepts, and it's the balance that defines the language.

> New languages are usually a grab-bag of features found in C, Java, Lisp and/or Haskell.

With different predecessors, that's also true of at least C and Java on that list.

OTOH, the particular *combination* of features in the grab bag can be quite significant in practice, even if none of the features is new individually.

My guess is that you’re seeing popularity based selection bias at work: languages with more common features are easier to understand and gain popularity faster than truly bizarre and unique languages which are confusing to newcomers. These popular languages end up being the ones you see every day.

I bet a true random sample of programming languages would be a lot weirder.

I'd love to see some examples.

Find the video '50 in 50' by Steele and Gabriel.

50 programming languages of the last 50 years, illustrated in 50 minutes. It includes all the popular languages, but also covers some truly weird ones.


I think the ability to conjure up things out of nothing, things that have no constraints derived from the physical world, is a lot of the appeal of computer-based activities such as programming or 3D graphics in the first place.

This is exactly why I got into programming: to make my own realities in the then-current 8-bit games. I've since switched to this more abstract endeavour, which gives the same satisfaction, without having realized this was the reason. I'm also holding out for a VR title that showcases this aspect rather than horror against a sci-fi backdrop.

Some demos/press copies of Dreams VR recently showed off some compelling creative storytelling tools

- https://www.giantbomb.com/shows/dreams-01302019/2970-18741

- http://dreams.mediamolecule.com/

It looks like one will have to write a whole lot of languages to be reasonably good at language oriented programming.

Remember the first couple of programs you wrote? I bet they were all quite similar.

That's a pretty great example of the "microworlds" from Seymour Papert's "Mindstorms," which you would enjoy!

Yeah, that's what I like about languages like Lisp and to a lesser extent Python and, conversely, why I dislike languages like Go.

The worst is when people do use Go to make their own reality. It's not meant for that and it's never pretty.

Uh... what is? (What is it you like about them?)

Go has quite its own reality.

Structural typing, no generics, interface oriented.

Sure, Java has interfaces, C doesn't have generics, and TypeScript is also structurally typed, but I guess nominal typing, generics, and OOP are what most people who like static typing in the first place prefer.

I bought two of the author’s books and also enjoy his newsletter. That said, I still have not drunk the LOP Kool-Aid yet.

I almost always start Lisp development projects in a repl, getting low level things I need working. Initially I don’t worry at all about making things tidy. I keep this process going until I have something that works. Then I start cleaning up code, perhaps factoring out common code, etc. If I see enough value in the new code and if it makes sense, then I move parts out to a separate library.

It is at this point, after experimenting with a problem and having good code, that I think in some cases it would be good to step back and decide whether to design a new language for the type of problem you just solved.

As a Clojure programmer I find Racket's "#lang" feature fascinating, all the more so as it seems to be doing exactly the opposite of what's recommended in Clojure: favor data over functions over macros.

Personally, most of the DSLs I write in Clojure are "private" (i.e. I write them for myself to help me develop a library or an app) and thus they tend to be small. This is why I favor functions over data: it allows me to reuse common function combinators (e.g. (fn or-fn [f g] (fn [& args] (or (apply f args) (apply g args))))) so that I do not have to reimplement them, which is something you have to do if you approach the problem with a data-first mindset. Instead, if I really want to prettify such function-based DSLs by bringing forth the use of data to express the problem at hand, I write a DSL-as-data parser and implement it as one or more DSL-fns.

More @ https://gist.github.com/TristeFigure/20dd01b0d3415f34075cfc0...

How does this compare to Racket DSLs ?

Some notes on Data > Functions > Macros in Clojure: https://lispcast.com/data-functions-macros-why/

An interesting tangent I think about sometimes:

I think when people think about "programmers/developers" vs "computer scientists", this is where the difference could really show up in the future. A developer who knows how to code and hack may not have the skill/design knowledge to properly create a language/DSL, perhaps making a monster in the process. A well trained computer scientist (academic, self taught, doesn't matter) should have those tools and more importantly the design know how. A lot of teaching (again, in academia and outside it) really falls short on teaching good design for programming generally.

I think we're already starting to see this difference show up with libraries and frameworks (somewhat DSLs already themselves), and if we keep moving in the direction of DSLs I think you will find many of the bright programmers of tomorrow moving away from writing code to make things and toward writing code to make languages to make things. The final step from code to thing (from a technical perspective) seems to be becoming more boilerplate than ever as tooling keeps expanding. You still need to know the deep internals for large-scale things, but it's never been easier to spin up a quick project that "just works" and does quite a lot. If library/framework/language creators do their job, this should only get easier.

At a certain point I wonder if the actual programming ever becomes a "lower tier" of the software world.

> A developer who knows how to code and hack may not have the skill/design knowledge to properly create a language/DSL, perhaps making a monster in the process. A well trained computer scientist (academic, self taught, doesn't matter) should have those tools and more importantly the design know how.

The author of Beautiful Racket, Matthew Butterick, is a lawyer and a typographer. He is not a computer scientist. And yet, he designed a DSL called Pollen for creating web-based books. Pollen has been quite successful within the Racket community.

My understanding is that Racket is predicated on the idea that people like Matthew, as much as people like your computer scientists, are the best authors of DSLs. They are the ones who understand the domains they're working in. If the division you predict ever does exist, it is because the tools have failed to make people who are not computer scientists capable of creating the DSLs they need to solve their problems. It's not because languages are inherently better designed by experts in the domain of programming languages.

Why wouldn't the author qualify as a self-taught computer scientist, though? To me your point is more a statement on the accessibility of computer science, and I do completely agree that the accessibility is important, which the Racket/HtDP ecosystem does pretty well with.

That said, there's many ways to write code and learn to code, and I think of the web programming bootcamp style or the cookie cutter college grads who go through four years learning how to program in X language to work at fancy company Y. My point is that many routes are just not focusing on the higher level design skills that I think are needed to make good libraries/frameworks/DSL's.

To clarify, I'm not saying you have to be a PL expert, simply good at program and language design, which I think is what lacks in many places and could create such a division. That skill is/should be accessible to everyone.

Ok, I think I read more into your first comment than you actually said.

>many routes are just not focusing on the higher level design skills that I think are needed to make good libraries/frameworks/DSL's.

I have observed that too. But I don't think this is about who is and isn't a computer scientist, whether self-taught or formally trained. I think it's more a change in the way people relate to programming languages. Perhaps programming languages were commonly assumed to be principally an academic topic. Perhaps it's not that more people are becoming computer scientists but that more people are finding non-academic ways to relate to programming language design. I think what Butterick did was to build the tool he needed to do a job (write a book). So language design becomes just like any other form of hacking.

>Why wouldn't the author qualify as a self taught computer scientist though?

Butterick himself is adamant that he is a layperson. He compares himself to a "squirrel in a Ferrari": https://www.youtube.com/watch?v=IMz09jYOgoc And that's his point in that talk: Racket makes it possible for even the layperson to build the language they need.

He can view himself however he wants, but the man just wrote an article that competently covers Turing completeness, regular expressions, and Lindenmayer systems, among other things! He's definitely earned his comp sci merit badge, so to speak.

Computer science is a specific treatment of these topics that is based in formalism. Compare John McCarthy's "Recursive Functions of Symbolic Expressions and Their Computation by Machine, Part I" and Paul Graham's "The Roots of Lisp". These papers cover exactly the same material. But only the first is computer science because it uses a formal language to express the ideas.

If that's the definition of CS we're using then my original post is a very egregious misnomer. I feel like that definition is way too restrictive though. CS can be formalized and informal, but both are still CS IMO.

I never understood why CS people are so afraid of non-CS people touching language development. They hide the keys as much as possible so that nobody but the predestined can touch it. Racket gives out that key. Even so, few people will use it (probably only the most prepared), and if these language hacks are complete nonsense, they'll fail. The world will not suffer much from this process.

If anything "CS people" are less afraid than they should be, given the demonstrated popularity of horrible retrograde languages designed by people who didn't know what they were doing.

Languages designed by academic computer scientists are less used in the real world than languages designed outside of academia. It seems that academic computer science does not often lead to the design of languages that are actually chosen by working programmers to solve their problems.

I think there are some very interesting and elegant ideas in some languages that come from academia but it seems that something is missing when it comes to designing languages that are actually widely adopted. Despite their elegance Haskell and Racket are much less used than Go and Java.

I'm currently browsing the list of languages made with Racket (here: http://docs.racket-lang.org/search/index.html?q=H%3A) and it seems like most of them are just Lisp variants. I suppose they were made for educational or fun purposes, but I wonder if I haven't missed something deeper as to why someone would want to reimplement a Lisp in a Lisp language.

Racket tends to be geared toward writing lispy things because its syntax parsing and macro transformations rely heavily on s-expressions, so most/all being variants of Lisp makes sense. Writing non-sexp languages in Racket tends to take a lot more work; it's not batteries-included.

It also has an incredibly powerful inheritance system. In 3-5 lines I can inherit all the functionality of Racket and then add, modify, or delete any part I like. Remember: I can do all this at the compile-time level, not the runtime level. I can change something as basic as how you define functions with just a few lines of code while still getting everything else.

Basically, I can easily write Racket+X, or Racket with modified Y, or Racket without Z, and make it its own entire language with very minimal effort.

A lot of this occurs because of Racket's insanely good macro system. You can also use these tools in libraries/packages and still get the compile time control while the language is still officially Racket.
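As a sketch of what that looks like (the file name and the `fn` binding are my own invention, not from the thread): a module that re-exports all of Racket but swaps out one binding can itself serve as the language of other modules.

```racket
;; mylang.rkt -- a hypothetical "Racket with a modified lambda".
;; Re-export everything Racket provides, except expose `lambda`
;; under the name `fn` instead of its usual name.
#lang racket

(provide (except-out (all-from-out racket) lambda)
         (rename-out [lambda fn]))

;; A client module can now use this file as its language:
;;
;;   #lang s-exp "mylang.rkt"
;;   (define inc (fn (x) (+ x 1)))
;;   (inc 41)
```

Because the client still gets Racket's `#%module-begin`, `define`, `+`, and so on through the re-export, everything else keeps working unchanged.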

> Writing non-sexp languages in Racket tends to take a lot more work that's not batteries included.

No more work than in any other language - Racket has recursive descent, lex/yacc, and PEG parser generators. There is a whole section on parsers at http://docs.racket-lang.org/ including ‘Megaparsack’, based on higher-order parser combinators (like Parsec for Haskell).

Racket can also read ‘C’ yacc/bison grammars.

It’s worth noting that a variety of non-Lisp-parenthesis-languages have been done in Racket including Java and Python.

This looks like a nice topic for a blog post / single-page tutorial. Perhaps show the rackety way and compare it with a version in Racket that uses the usual parsers found in other languages.

It's not that those languages are "lisps", they just mostly use s-expressions. That's because the syntax isn't what they're trying to innovate on, or what they're trying to stand out with. And also because once you understand s-expressions, it becomes kind of hard to justify not using s-expressions. Honestly, why bother with novel syntax when your PL semantics can be expressed in terms of s-expressions?

I don't see it on that list, but there's apparently an implementation of Algol 60 baked into DrRacket, so you need not limit yourself to Lisps if you want to put in the work to do something more far-reaching.



I'd say most are racket variants - and that makes sense - the scribble lang (for documentation/documents) is a prime example of that kind of "tool" DSL, and "lazy" (as mentioned by another commenter) is another.


But there's also stuff like:


And after a quick Google I found:


And there are of course things like:


But I suppose that qualifies as "lisp like"...

[ed: and the article mentions "brag" which I think would be odd to call "lisp like" : https://docs.racket-lang.org/datalog/datalog.html ]

2 reasons:

1. Lispers often find s-expression syntax easier to edit for many applications. So it's not surprising that a lot of the folks who design languages in Racket also might choose an s-expression syntax for their language design.

2. By using the same (or similar, in the case of Pollen/Scribble/...) syntax as the main racket language, it's easy to give access to the full power of racket in a DSL.

That was my impression as well: you can have any language, as long as it's LISP.

Some of the more specialised languages are different, though.

If the language uses S-exps, is it a Lisp? It’s convenient in racket to design a language with an S-exp based syntax, but the semantics of the language can be anything you want. Like prolog.

> you can have any language, as long as it's LISP.

Let's be fair, many lisp programmers would be perfectly fine with that.

OK, but then, since Lisp already has an extremely small syntax, why not simply define functions in the language rather than using the macro system?

Unless your goal is to be source compatible with another lisp variant, of course.

>why not simply define functions in the language rather than using the macro system?

I don't know. Most examples I've seen from people extolling the virtues of lisp macros and metaprogramming wind up just generating more lisp code in the same language with the same syntax and semantics, so I don't know why you couldn't just use functions in that case.

To be fair, I only have a surface understanding of one lisp variant (Arc) so it's entirely likely I just don't get it.

A nice example is `match` that is one of the macros that comes by default in Racket.

  #lang racket
  (define (show x)
    (match x
      [(vector a b c)
       (display "It's a vector, total = ")
       (displayln (+ a b c))]
      [(list p q)
       (display "It's a list, average = ")
       (displayln (/ (+ p q) 2))]))
  (show (list 8 2))
  (show (vector 1 2 3))
(Playable version http://pasterack.org/pastes/75154 )

The idea is that `match` can analyze the value of the first argument (`x` in this case) and then it binds the variables (a,b,c or p,q in this case).

In this case I'm matching only a vector with 3 values or a list with 2 values, but you can match more complicated and nested patterns.

The interesting part is that inside match you have the full power of the language; you are not restricted to a few special hardcoded cases (like `display`). You can send an HTTP request, play a sound, save a value to disk, calculate a prime factorization, transform text to uppercase - anything that is available in the language is available inside match.

But `match` is just a normal macro. Racket is implemented in layers. The bottom layer is currently implemented in C, but it doesn't define `match`. The bottom layer is almost hidden; it is used to define some libraries and an intermediate language, which in turn is used to define more libraries and another intermediate language, and so on - after a few steps you get the official Racket language.

But if you like you can reimplement a macro like `match`, or a variant, or your own macro, with all the magic to define local variables and use all the power of the language inside it.

Syntax is different from semantics. To give a concrete example, say you want a part of a program to use lazy constructs. In racket, you could use `#lang lazy` for that one module and then import and use that code within `#lang racket` modules.

Despite having the same syntax, `#lang lazy` and `#lang racket` behave differently:

    #lang lazy

    (define (say-hi _)
      (displayln "Hi!"))

    (say-hi (exit 1))

    $ racket lazy-example.rkt
    Hi!
    $ echo $?
    0

    #lang racket

    (define (say-hi _)
      (displayln "Hi!"))

    (say-hi (exit 1))

    $ racket racket-example.rkt
    $ echo $?
    1
Exact same syntax, different semantics.

Eager evaluation prevents many forms from being implemented at the function level. You can't for instance easily write a short-circuiting and() function in most languages, because arguments to a function are eagerly evaluated.

So given an and() function, you can't safely do "and(False, fire_missiles())", because the language will evaluate both arguments.

But an and!() macro could: macro expansion is essentially "lazy", in that it happens prior to the evaluation phase, so it can avoid evaluating any pieces of code it wants to, such that "and!(False, fire_missiles());" is perfectly safe, because our macro can stop expanding after "False" and thus fire_missiles() is never evaluated.
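In Racket terms, the macro version is a few lines (`my-and` and `fire-missiles!` are illustrative names of my own, not from the thread; Racket's real `and` works the same way):

```racket
#lang racket

;; A short-circuiting `and` as a macro: the use site is rewritten
;; into an `if` at expansion time, before any evaluation happens,
;; so the second argument only runs when the first is true.
(define-syntax-rule (my-and a b)
  (if a b #f))

(define (fire-missiles!)
  (error "boom"))

(my-and #f (fire-missiles!))  ; => #f, fire-missiles! is never called

;; A plain function can't do this: with
;;   (define (and-fn a b) (if a b #f))
;; the call (and-fn #f (fire-missiles!)) evaluates both arguments
;; eagerly, so the missiles fire before and-fn even runs.
```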

Incidentally, this is why at least some macro patterns are unnecessary in lazy languages like Haskell: you can write short-circuiting and() as a function there, because a function in Haskell only evaluates as much as is necessary (provided it's been written properly). Yet even there, Haskell still has support for macros and things like TemplateHaskell and so forth, because there's just some things you can't do solely with functions, like arbitrary syntax, language extensions, etc.

> why not simply define functions in the language rather than using the macro system ?

Modifying control flow, for one. A classic example is anaphoric if. If you want to implement that, you can't do it just as a straight forward function. It needs to be a function that modifies syntax, aka a macro.
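A minimal sketch of anaphoric if in Racket, using the standard syntax-parameter pattern so the macro can bind `it` inside its branches (`aif` is the conventional Lisp name for this, not something from the thread):

```racket
#lang racket
(require racket/stxparam)

;; Using `it` outside of `aif` is a syntax error...
(define-syntax-parameter it
  (lambda (stx) (raise-syntax-error 'it "only meaningful inside aif" stx)))

;; ...but inside `aif`'s branches, `it` refers to the test's value.
;; A plain function couldn't do this: it has no way to inject a
;; binding of `it` into its caller's expressions.
(define-syntax-rule (aif test then else)
  (let ([tmp test])
    (if tmp
        (syntax-parameterize ([it (make-rename-transformer #'tmp)])
          then)
        else)))

(aif (assoc 'b '((a 1) (b 2)))
     (cadr it)    ; `it` is the pair that assoc found, so this is 2
     'not-found)
```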

Isn't it also done in SICP? So lisp users probably have that in mind when trying it out in racket.

semantics != syntax

I like his style of writing and how elegant and powerful LOP is. The jsonic snippet is insane (in a good way). In the wrong hands or for the wrong job it could lead to a security disaster though. Still pretty cool to be able to think out of the box like this. Makes me rethink totally the way we do things and why.

  #lang jsonic
  // a line comment
    @$ 'null $@,
    @$ (* 6 7) $@,
    @$ (= 2 (+ 1 1)) $@,
    @$ (list "array" "of" "strings") $@,
    @$ (hash 'key-1 'null
             'key-2 (even? 3)
             'key-3 (hash 'subkey 21)) $@

Methinks a lot of the programming world at the interface level (semantic/syntactic/ergonomic) is... implicit tree processing.

see html encoding through functions, or kotlin builders, or json dsl ..

If you create whole new languages, i.e. external languages, not really embedded DSLs, with their own syntax, how is Racket better than other languages at implementing those?

Neither syntax objects nor hygiene are of concern when you build external DSLs, which the author's examples seem to be.

I have not yet seen a compelling argument for Racket there vs. for instance a parser generator framework.

I might just be missing the point. Either way, the author failed to convince me.

Alexis King covers some of the benefits in one of her blog posts about Hackett[0] and in her talk Languages in an Afternoon[1].

[0]: https://lexi-lambda.github.io/blog/2017/01/05/rascal-is-now-...

[1]: https://youtube.com/watch?v=TfehOLha-18

>I have not yet seen a compelling argument for Racket there vs. for instance a parser generator framework.

Racket allows for easy interop between the various languages, so for instance, you can use typed racket from untyped modules. You can import documentation modules in base racket and so on.

With other languages, you have to glue the infrastructure yourself, racket does it for you.

I think Racket makes it simple to compose your language implementation modules. So, e.g., hygiene could be imported (with minimal adaptations to your syntax/binders).

Yeah, that's true. I would love to read an article about that with practical examples, i.e. showing that language composition is really a common and needed pattern (not just a theoretic concern, ... are LOP users really composing several language modules often?), and how Racket enables it.

In LISP-like languages like Racket, the code is a list. That is, a program is just a list of tokens (a b c), the same data structure one would use for storing any other kind of data (the equivalent of Python's [a, b, c]). When programs are themselves just lists, they are easy to manipulate with code. So, in LISP-like languages, it is easy to write code that writes code. That makes them especially suitable for domain-specific languages and LOP: you can write functions that write code, making a new language.
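A minimal illustration of the point (the `eval` at the end is just for demonstration; real language-building in Racket goes through macros and modules instead):

```racket
#lang racket

;; A quoted program fragment is an ordinary list of symbols and numbers...
(define code '(+ 1 2 3))
(first code)   ; => '+
(rest code)    ; => '(1 2 3)

;; ...so "code that writes code" is plain list manipulation:
(define new-code (cons '* (rest code)))  ; '(* 1 2 3)

(eval new-code (make-base-namespace))    ; => 6
```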

I think PG has some writings on this topic, but I don’t remember if it is online or in his LISP textbooks. His company Viaweb did this, IIRC, using a DSL written in LISP to generate HTML.

If I give GP the benefit of the doubt, I think GP is well aware of this. However, their point is that in these examples, the original input was not in S-Expression form, so the ease of manipulation did not exist until it was parsed.

> In LISP-like languages like Racket, the code is a list.

This is nothing special. In all languages, code is a list: a list of characters, that is.

Just saying.

What the parent likely means is that Lisps, structurally, are akin to the AST many other languages generate as an intermediate step to compilation.

A list of characters (a string) and a list of tokens are drastically different to work with. If you haven't worked with both and want to write programming languages, I would recommend filling the gap of the one you haven't to see/feel the difference.

Beyond the code being a list, Racket's macro system and syntax parsing/creation tools are incredibly powerful and come built in, whereas in another language you could spend years just building the tooling needed to write a language, let alone writing the language itself.

LISP code is not a list as you know it traditionally; it's made up of cons cells. The code of other languages is not made of cons cells, although it is a list of characters. A very important distinction.

A parser generator is M-expressions to Racket's S-expressions - which is all about syntax, whereas S-expr-based DSLs are focused on semantics.

Not sure if that makes anything clearer but I can't think of a better phrasing.

sharing ?

I hear this a lot about Racket, as if this were the value proposition and Racket stood alone with these features. Languages like Ruby are routinely used to build DSLs (RSpec, Rails, ServerSpec, Chef, etc.). I would like to hear somebody address why Racket is superior to languages like Ruby for creating DSLs, because it for sure doesn't stand alone. Metaprogramming is a feature in a LOT of languages. I'm not arguing for those languages; I'm arguing that Lisp languages aren't the only ones capable of creating other languages, and a comparison would be more useful than repeated emphasis on a unique ability for the task, which I don't think is necessarily true.

In practice most Ruby "DSL"s are just regular libraries that you happen to be able to use without the familiar foo(bar) function call syntax. It's the difference between "Domain Specific Language" and "A Domain Specific Language", or embedded vs. hosted/standalone DSL's [0]. The value that Racket provides is facility in developing standalone languages, as opposed to the embedded DSL's that people make in Ruby.

To address the examples you gave, RSpec, Chef, etc., those literally ARE just using Ruby, no new syntax, no new semantics.

[0] https://www.quora.com/What-is-an-embedded-domain-specific-la...

The author acknowledges that other languages can be used for DSLs. He claims that Racket's hygienic macro system is the special feature that makes it superior for the task.

If I’m understanding the article correctly, it should be possible to create a Racket language that compiles to any other language. A quick Google does reveal that work has been done on C and Python, although it appears there are bugs and various workarounds that preclude a full and clean integration. The JavaScript integration appears dead (in terms of using the latest version of Racket).

Although using Racket for this wholesale integration appears impractical, it does suggest a killer app: extending general programming languages. In other words, using Racket as a transpiler - i.e., a better Babel.js, but for any language.

I routinely use Python for machine learning tasks, but frequently am frustrated by the inflexibility of the language. Instead of solving the problem itself I have to first solve how to make the language do what I want (usually resulting in code that lacks expressiveness and brevity, and thus extensibility).

I’d love to use lisp to leverage ML work that can only be done effectively in Python (Tensorflow, PyTorch, etc).

Can Racket be an effective transpiler?

IIUC Babel.js is essentially a fancy preprocessor for JS. So if, by that analogy, you mean "can I use Racket to design my own notation that compiles to Python", then yes, of course.

I find it a pretty amazing feat that the author has created a programming language (Racket), then used it to create another language called Pollen[1] for book publishing, then went ahead and used Pollen to publish three books. Oh and the fonts are designed by him too[2].

[1] https://docs.racket-lang.org/pollen/

[2] https://mbtype.com/

No, the author hasn't created Racket. The author is the lawyer https://en.wikipedia.org/wiki/Matthew_Butterick.

Racket was developed by PLT Inc.

> Racket was developed by PLT Inc.

There's not a lot of information on PLT Inc. these days, so: Racket was developed under direction of Matthias Felleisen, with his students Matthew Flatt, Shriram Krishnamurthi, and Robby Findler serving as the core development team. They also published the book "How to Design Programs" [0], and Krishnamurthi published "Programming Languages: Application and Interpretation" [1], which are both books that rely on Racket. (PLAI actually uses its own language which is developed in Racket, called plai-typed, but which relies on the DrRacket environment.)

[0] https://htdp.org

[1] http://www.plai.org

As the comment below points out, the author did not create Racket. However, there is an author (Donald Knuth, or DEK as he is known) who did just that: he created new languages (two of them: TeX and METAFONT) for typesetting and making fonts, created new fonts (Computer Modern), published a series of books about the languages and the fonts, AND wrote the programs that implement these languages. Then he used his creations to typeset his 'The Art of Computer Programming' (TAOCP) books.

And don't forget about Leslie Lamport, who created LaTeX and won the Turing Award for his work on distributed computing systems (as part of which he designed the TLA+ language).

About a decade ago, I would have agreed with this 100%, and I still am fascinated and, quite frankly, awed by Racket and LOP (and looking forward to Racket Fest 2019 here in Berlin[1]).


(You knew there was a "but").

While I don't quite agree with the assertion that, with the right tooling (i.e. this tooling), creating a language is as easy as creating a library, I don't think it would solve our problems even if it were true.

Language design is much harder than library design, even if the tooling is perfect and transparent. In the extreme we're going to end up with what Alan Kay calls inverse vandalism[2] aka the Mount Everest Effect: making things because we can, not because we should or they're good.

In essence, designing a (domain specific) language should be your last resort, not your first resort. At least one that requires a parser or macro system to implement. After all, when we create a library, we are in fact creating a domain specific language, or at least a domain specific vocabulary. A framework goes a little further towards the idea of language. In fact, the first version of Simula was a simulation DSL. It was the second version that made the language into an OO language and the simulation "DSL" into a framework. The advantage being that you can use the same language with different frameworks for other domains.

If we look at natural language, it is very much the same: we use the same language to talk about any number of domains, the most we tend to add is some domain-specific vocabulary. Imagine the tower of babel we would suffer under if every domain had a completely new and different language!

That effect was very much encountered and later described by the LRG at PARC when they started using Smalltalk 72, which allowed every object to define its own language[3]. It was chaos! Turns out we are not very good language designers in the heat of the programming battle. So they very consciously made the language more restrictive later. However, the Smalltalk message syntax is specifically designed to give you something akin to a DSL, but without most of the problems[4].

A similar effect was documented by Bonnie Nardi for end users. Here as well, the expectation was that users would prefer domain-specific tools that were tailored exactly to their needs. This, surprisingly, turned out not to be the case. End users apparently vastly prefer general tools, even if the effort to adapt them to specific tasks is greater (see also: people writing documents in Excel...)

That said, current general purpose programming languages are probably not quite flexible enough. I think there is a sweet spot to be found somewhere between Smalltalk-80 and something like Racket.

[1] https://racketfest.com

[2] http://worrydream.com/EarlyHistoryOfSmalltalk/

[3] http://stephane.ducasse.free.fr/FreeBooks/BitsOfHistory/Bits...

[4] https://youtu.be/Ao9W93OxQ7U?t=627

I largely agree, but then I enter the modern ops stack, where templated YAML is the lingua franca.

It’s incomprehensible, and for each system you operate you have to learn some arcane YAML incantations that have a dramatic impact on your production systems. It’s a world crying out for a set of well-factored DSLs.
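To illustrate why templated YAML hurts, compare string templating with generating config from a data structure. The emitter below is my own hypothetical sketch (it only handles a flat mapping; real code would use a YAML library such as PyYAML), but it shows the core difference: a structured generator can apply correct quoting, while templating pastes text blindly.

```python
import json

# String templating: breaks silently when a value needs quoting.
name = "app: v2"  # the embedded ": " makes this invalid as a bare YAML scalar
templated = "name: %s\nreplicas: 3" % name  # produces broken YAML

# Structured generation: build data, then serialize with real escaping.
# JSON strings are valid YAML scalars (YAML 1.2 is a superset of JSON),
# so json.dumps gives us safe quoting in this tiny sketch.
def to_yaml(mapping):
    lines = []
    for key, value in mapping.items():
        lines.append("%s: %s" % (key, json.dumps(value)))
    return "\n".join(lines)

config = {"name": "app: v2", "replicas": 3}
print(to_yaml(config))
# name: "app: v2"
# replicas: 3
```

A well-factored DSL (or just a real programming language emitting config) lives on the structured side: you manipulate values, not strings, and the serializer worries about syntax.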

Something like Dhall you mean? https://github.com/dhall-lang/dhall-lang

Or a general purpose language with the ability to describe architectural configurations?

I think this is what Pulumi (https://www.pulumi.com) is trying to achieve.

Better than a dsl would just be a mature general purpose programming language with tooling etc.


Making a language "in the heat of the programming battle" is indeed a very bad idea. So is making a lib.

On the other hand, I really would like to see a world with tens or even hundreds of little (preferably non-Turing-complete) languages with extremely good integration (with each other) and very good tooling and visualizations. Because right now the languages we use tend to be way too powerful, which prevents good tooling and optimizations.

Reminded me of this: http://langsec.org/occupy/

Very much spot on. I'm also fascinated by Racket and LOP, but I find it hard to believe those who claim to be more productive than everyone else without proper evidence to back up their statement.

Of course, anyone would expect those proficient in Racket and LOP to have a better understanding of programming and software design in general, and to be generally more productive than many, but that's not necessarily because of LOP.

But regarding your comment on users preferring general tools - I notice this happening in my domain, where the trend is now to build absolutely everything with Python, regardless of whether this is a good choice or not. And it's impressive how libraries are built with embedded languages that feel completely non-Pythonic, just for the sake of using Python. Library writers could at least strive to make better use of the syntax of the general language (see TensorFlow vs. PyTorch).

It'd be interesting to see more studies on general languages vs DSLs, and reasonable heuristics to know when to prefer one over the other.

Now, not only end users prefer more general tools (see also: people writing a NES emulator in Emacs Lisp).

>making things because we can, not because we should or they're good

"Macros are catnip to a certain type of developer"

This is the trap of macros and LOP approaches. This development approach often leads to code that makes sense to nobody in the world except the original developer. There are a lot of genius pieces of code out there using macros and DSLs, but there is also a lot of spaghetti code with a side of Macro meatballs.

This is tangential to the article, but. Scheme was one of my first languages and I would love to pick it/Racket up again. But the Racket IDE (like DrScheme before it) is painful to use. Especially compared to a more "native" Mac editor.

Has anyone used Racket with success in another IDE or programmer's editor?

Try `racket-mode` for Emacs. It has many of the features that DrRacket has.


I'll give it a go, thanks!

Definitely interesting, and the examples look nice, but biased in that absolutely no evidence is given to support that LOP makes anyone more productive.

> maximum precision

Sometimes maximum precision means having a first-class typechecker & type inference. (You could build some of that too in Racket if you really need to. Of course, that code itself will not be verified or typechecked.)

> Why aren’t these lan­guages DSLs?

They aren't DSLs because they need statically linked binaries and good error messages, and then your task is not as easy as desugaring to a Lisp/dynlang.

> Of course, that code itself will not be verified or typechecked.

Easy, just use Typed Racket :)

Ok, so now you have Typed Racket (that last point was a relatively minor comment on the larger argument). Does implementing the above then correspond to the strengths as described in LOP?

I'm not sure I'm fully following what you're getting at due to the terseness, and my first comment was more of a fun note that you can always build on language features. But generally speaking, you could inherit and use the types and checking of Typed Racket in your DSL and still add the syntax you want for increased precision.

So many programming languages... I wonder if there is someone who knows them all.

I’m sorry to nitpick on a good article, but this article lost me when it proved that HTML is a programming language by showing that it can be output by a Python program. This seems to be a case of proving too much [0].

[0] https://en.wikipedia.org/wiki/Proving_too_much

It may be proving too much, but it is also arguing the general point. Your objection would be that this example proves that all strings are programs, which might seem absurd. But in fact, all strings are programs if you haven't specified which interpreter they are supposed to be programs for.
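A tongue-in-cheek way to see that point: pick a trivial enough interpreter and every string becomes a valid program for it. The Python sketch below is my own illustration, not from the article.

```python
# An "interpreter" under which every string is a valid program:
# the semantics of a program is simply its own text.
def identity_interpreter(program: str) -> str:
    return program

# Both of these are well-formed "programs" for this interpreter...
print(identity_interpreter("<p>hello</p>"))   # <p>hello</p>
print(identity_interpreter("garbage $#!?"))   # garbage $#!?

# ...which is why "X can be interpreted" alone proves little: whether
# HTML is a *programming* language depends on which interpreter and
# which semantics you have in mind.
```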

Either way, that seems like a silly approach, IMO.

"Before you leave, please know: I’m hedged either way. LOP and Racket have been an incred­i­ble force mul­ti­plier on my pro­gram­ming pro­duc­tiv­ity"

Well, wouldn't we expect the author of a particular programming language to be mad productive in it? Because, you know, they designed all the mental models and assumptions that are baked into it.

You would never once stumble on syntactic ambiguity because, to you, there is no ambiguity... or perhaps there is, but you slyly added that bit just to fuck with programmers' heads... or who knows the reason.

I'm just postulating that for an author to boast about their productivity with the language they designed is somewhat self-serving.

The author of this article is not the author of Racket.

Racket is built by a team and has been around a long time.
