Hy (wikipedia.org)
850 points by tosh on Aug 4, 2019 | 141 comments



Hy was first introduced to me at PyCon 2013 in a lightning talk by @paultag, the language's creator. It was a very fun lightning talk. You can still watch it online here:

https://youtu.be/1vui-LupKJI?t=968

I remember that day pretty clearly because in the same lightning talk session, Solomon Hykes introduced the Python community to Docker, while still working on dotCloud. I think this might have been the earliest public, recorded tech talk on the subject.

https://youtu.be/1vui-LupKJI?t=1579

This was also the same PyCon that Eben Upton opened with a keynote to discuss the mission of Raspberry Pi, while also announcing that each of the 2,500 attendees would receive a free physical Raspberry Pi at the conference.

It was a pretty darn good year of PyCon; 2013 was apparently a good vintage for open source tech with Python connections.


I also remember it because my thought was "That's amazing!" but also "That's such a toy, nothing useful could come of that".

Which is also what I thought about PyPy.

Both of which seem to have proven me wrong.


He (author of Hy) is quite entertaining.


Had the good fortune to work with Paul briefly. He's not only smart, but he's a heck of a nice guy.


Aw thanks!


That was really a PyCon to remember. I was there as well. The lightning talks were amazing that year. :)


Love the lightning talk, thanks for the pointer.



If Hy could resolve the problems caused by moving let to contrib, and attract enough manpower to reach 1.0, I think it could gain significant mindshare. Hy fills in the missing piece for devs who want to go all-in on modern Lisp backed by an established ecosystem, i.e. front-end/ClojureScript/NPM, scripting/Hy/PyPI, and back-end/Clojure/JVM.


Recently, scripting has become viable in Clojure thanks to GraalVM's native image.

If it's the ecosystem that's of interest there are solutions for that too: https://nextjournal.com/chrisn/fun-with-matplotlib


I was suggesting that Hy adds PyPI to your arsenal, especially for ML and data science. With these three you have a modern Lisp solution for anything above the system level, backed by a solid ecosystem of libraries.


It seems to me that this solution for calling Python from Clojure defeats the purpose of using Clojure in the first place. You lose the interactive loop, as you can't really inspect the underlying Python objects, and it's hard to jump around definitions or refactor. Also, imports and functions being called as strings is terrible for debugging.


I agree that removing ‘let’ makes the language much less pleasing for me to use. I recently looked in contrib for ‘let’ and saw the macro definitions but could not get it working. It seems like a small thing, but it does keep me from doing more than rare experiments with Hy.


Ditto. I moved away from it partially due to the removal of let and partially due to lack of Python 3/async support (which is now landing).

I’d really like to see a 1.0 with let back in the core, but I don’t see that happening just yet.

Edit: typos, plus I’m also curious to see what https://github.com/gilch/hissp will do.


Not a small thing, really. To anyone who groks Lisp, a dysfunctional let is a non-starter.


Curious what problems you had with let. I tried it recently and it worked fine. Having an import is annoying but it's not the end of the world.


It was removed without the replacement macro being fully baked, and a number of other things were changed at the same time. Rather than rewrite my stuff at every minor release, I moved on.


That's what I'm asking. What's not fully baked about it? In my (admittedly small) testing, it does its job of assigning stuff in a temporary scope.


I haven’t tried, but are the variables properly captured in closures? LET should have “temporary” lexical scope, not “temporary” temporal/dynamic scope.


It does at least act like it has closures. Try starting the REPL with $ hy --spy and you can see how examples get compiled.
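
For instance, here's a minimal, hedged sketch of checking lexical capture, assuming a 2019-era Hy where let lives in hy.contrib.walk (run it under $ hy --spy to see the generated Python):

  (require [hy.contrib.walk [let]])

  (defn make-getter []
    (let [x 42]
      (fn [] x)))            ; does the closure still see x after let exits?

  (print ((make-getter)))    ; expect 42 if the binding is captured lexically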


At the time, it just wasn’t there :)

Nowadays it seems to work, but rewriting my code to catch up with today’s version (and cope with the other changes introduced over the last couple of years) is not something I’m interested in doing.


Joker fills the scripting niche with Clojure syntax quite well.


It's not about Clojure syntax. In fact, Hy is only a rough approximation of Clojure. It's about the whole of PyPI being available from a Lisp. If it resembles Clojure, even better, but that's not Hy's raison d'être.


Ahh, Clojure also has access to Python objects through libpython-clj (https://github.com/cnuernber/libpython-clj).


Clojurescript with planck is another option. https://planck-repl.org/



Hy made me reconsider Lisps and I'm glad it did. I'm now using CL quite a lot. It's possible to use Python libraries from CL:

- there are py4cl and async-process: https://github.com/CodyReichert/awesome-cl#python

- NumCL is a clone of NumPy: https://github.com/numcl/numcl

Common Lisp has many advantages though:

- you can build a self-contained binary of your app, including web app. It's such a breeze to deploy, and it's possible to ship a binary to users.

- the REPL is much better and more fun than ipython

- CL is compiled. We compile our code function by function, and we get many compiler warnings at compile time.

- CL is stable, the syntax is simple, you can add the syntax you want with extensions (decorators, string interpolation,…) and yet the implementations keep improving, and new ones are created (Clasp, CL on LLVM).

- CL is fast (how pgloader got rewritten from Python to CL: https://tapoueh.org/blog/2014/05/why-is-pgloader-so-much-fas...)

- there may be more libraries than you think: https://github.com/CodyReichert/awesome-cl

- the object system is very nice, and multiple dispatch gives my app a cleaner and shorter API.

- there are more editors with good support than just Emacs: Atom support is getting very good, and there's more: https://lispcookbook.github.io/cl-cookbook/editor-support.ht...

- the (default) package management system works with "distributions", à la apt/Debian, so it doesn't break suddenly because of a third-party dependency of a third-party dependency.

Of course, the web stack is less mature (but more promising: see Weblocks http://40ants.com/weblocks/) and CL might be more difficult. It's not for everyone. But it's been a joy so far.


Anyone got an ELI5 on the goal of this language, or the pros and cons?

I'm not a dev, but, a Python user with a background in data/info science - so I'm a bit unclear on what Wikipedia means by things like "a dialect language of Lisp", and using it to write "domain specific languages."

Is this just a way to use Python libraries in Lisp (which seems to be a low level programming language?)


Lisp is not typically a low-level programming language, but rather extremely high-level (high level of abstraction)[0].

The "dialect language of Lisp" quote just means that it's a variation of Lisp, but running in a Python environment.

The term "domain-specific languages" refers to programming languages created for a specific, often one-of, task. If you look at this repo, you'll see various languages used to create diagrams and graphs in this library. These are all examples of DSLs (note, they're in many places, this is just an example):

https://github.com/francoislaberge/diagrams

Regarding Hy specifically, it's basically Python with a different syntax, or skin, of sorts. Instead of whitespace and colons, you get parentheses and... more parentheses. It's more complex than that, but it's basically just a way to write Python for people who like Lisps.

0. Yes, there are some examples of low-level Lisps without features like automatic memory management, but they're pretty unusual.
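
To make the "different skin" point above concrete, here's a minimal sketch; the Python each form corresponds to is in the comments:

  (defn greet [name]                      ; def greet(name):
    (print (.format "Hello, {}" name)))   ;     print("Hello, {}".format(name))

  (greet "world")                         ; greet("world")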


So, if I were to boil it down, it's like "Hey, use Python, but with a different writing style than you're used to"?


Sort of. It's more a case of having all of the flexibility of a Lisp - the ability to program with wishful thinking and fill in the details later - and all of the "batteries included" of Python.

It would genuinely be worth your while to watch the SICP lectures with Gerry Sussman and Hal Abelson [1] to get an inkling of an idea of what "your program is just more data" can mean. Lisp is more about bringing the language up to the level of the problem than bringing the problem down to the level of the language, and it's difficult to appreciate what that means until you've seen it. By the time you get to the end of lecture 3B, it should click.

[1] https://ocw.mit.edu/courses/electrical-engineering-and-compu...
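
For a tiny, hedged taste of the "code is data" idea in Hy (my-unless is just an illustrative name, and the &rest form assumes a 2019-era Hy):

  (defmacro my-unless [test &rest body]
    `(if (not ~test) (do ~@body)))    ; the macro sees body as unevaluated code

  (my-unless (= 1 2)
    (print "1 and 2 are not equal"))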


There's also the Common Lisp package "burgled batteries"[1] which allows you to call Python from Lisp. Unfortunately, the original author no longer maintains it and it hasn't been updated to Python 3.

[1] https://github.com/pinterface/burgled-batteries


There is a fork that supports Python 3:

https://github.com/snmsts/burgled-batteries3


Thanks - appreciate that and I'll double check the link.

Because I'm still a bit confused at what Lisp is!


>Because I'm still a bit confused at what Lisp is!

You're in for a treat!

http://www.paulgraham.com/avg.html


How does Hy manage recursion, given Python's limitations there?


It doesn't. Or rather, it does it the same way Python does: with imperative loops. TCO is a key feature of Scheme, but not of Lisps in general; Clojure doesn't have it. You can always trampoline, like Clojure's loop does.
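
A minimal sketch of what that means in practice (Hy inherits CPython's lack of tail-call elimination):

  (defn count-down-rec [n]
    (if (= n 0) "done" (count-down-rec (- n 1))))

  (defn count-down-iter [n]
    (while (> n 0) (setv n (- n 1)))
    "done")

  ; (count-down-rec 100000)           ; RecursionError on CPython
  (print (count-down-iter 100000))    ; prints "done"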


10-4, thank you!


What kind of limitations are you referring to?


Python limits the recursion depth to 1000 by default, and a series of recursive calls hitting that ceiling will throw a `RecursionError`.


You can actually set this to a higher number via the sys module. Maybe not too high, if you don't like segfaults.
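
For example (the same sys call works from Hy):

  (import sys)
  (print (sys.getrecursionlimit))    ; typically 1000 on CPython
  (sys.setrecursionlimit 10000)      ; higher values risk overflowing the C stack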


> Lisp is not typically a low-level programming language,

For most practical purposes it's pretty low-level. You don't really write in a language; you write the AST directly, which, while powerful, IMO is not very high-level.


One writes in a hierarchical data structure: s-expressions. Since Lisp has built-in macros on top of s-expressions, one can represent code independently of any fixed 'AST' and make it executable.

For example the LOOP macro introduces a non-prefix syntax:

  (loop for i below 100
        for j = (random 100)
        sum j into s
        while (< s 1000)
        finally (return i))

This is actually pretty powerful and high-level (the user being able to extend or change the syntax in complex ways in portions of a program). The LOOP implementation as a macro is really an integrated sublanguage, and the user can add arbitrarily complex (and even extensible) macros like that...

Generally Lisp is low to mid level as a programming language, with a bunch of features which can be considered high-level: extensive macro system, Common Lisp Object System, extensive error handling, ...


> Generally Lisp is low to mid level as a programming language,

I believe that no programming language with automatic memory management can be considered "low-level".

That aside, why would you call Lisp "low- to mid-level"? Commonly-used implementations are at about the same level of abstraction as Ruby or Python.


> I believe that no programming language with automatic memory management can be considered "low-level".

That's a view of a whole language and its implementation. Still, it may have a range of features which are low-level.

Lisp for example has Foreign Function Interfaces (FFI) with low-level interfaces and manual memory management.

Example: http://www.sbcl.org/manual/#Foreign-Function-Interface

Basic Lisp stuff like CAR and CDR were (almost) instructions on a CPU ( https://en.wikipedia.org/wiki/CAR_and_CDR ).

Something like a cons cell (the building block for lists) is basically a two-element vector. Lists were made by chaining them together via a CONS operator, which creates such a two-element vector.

Such a linked list data structure is pretty low-level and the typical mark&sweep GC of the early days is also relatively basic.

There is not much magic to it.

Many other programming languages have much more complex basic data structures (see object-oriented programming in Java with classes and instances, inheritance, interfaces, namespaces, ...). Compared to that the basic linked list in Lisp is primitive.

> I believe that no programming language with automatic memory management can be considered "low-level".

See the standards for Scheme or Common Lisp. There is not a word about automatic memory management in the specifications. Automatic memory management is a feature of an implementation, just like foreign function interfaces. Most implementations have a kind of garbage collector. But most implementations also have manual memory management.

People even write operating systems in Lisp sometimes: https://github.com/froggey/Mezzano


Sounds like Lisp is doing the low-level stuff for you, so really it's like a mid/high-level language. Most languages have statements that correspond to a single CPU instruction. And it sounds like this cons cell is in fact an abstraction away from bytes and memory, which is more of an abstraction than, say, a byte string.


Do you love Common Lisp, Racket, Scheme, and/or Clojure, but wish you had a language like that, but with full interop support for Python libraries (in much the same way Clojure has full interop for Java libraries)? If so, you might like Hy. It's not as mature as those other projects, but its support for Python interop is impressive, and by letting you write Lisp-style code, you can make more functional-feeling code and also do things like macros and metaprogramming more easily.


Related: a story on Hy popped up on Hacker News a few days ago [0][1] but didn't get much attention at the time:

  """
  BETTER IS BETTER THAN WORSE:
  TALES OF A WEEK WITH HY AND A CALL FOR REVOLUTION
  by "Mr. Mojo Rising"
  """
I put it into Nextjournal for easier reading:

https://nextjournal.com/tosh/better-is-better-than-worse

--

[0] https://0bin.net/paste/xmJWIlIv+Ws+Fbjs#zCyc4Yc4T+YISv3MHwMT...

[1] https://hn.algolia.com/?query=hy%200bin.net&sort=byPopularit...


Huh, I love this guy's writing style, thanks for the link.


I tried it out some time ago, but my view is that it's too similar to Clojure yet different in significant ways, making it a pain to remember the differences if you also happen to be working on Clojure code. Being too similar but not the same can cause problems with programming language syntax.


TI(also)L:

Kawa [Polish for coffee] is a language framework written in the programming language Java that implements the programming language Scheme, a dialect of Lisp, and can be used to implement other languages to run on the Java virtual machine (JVM). It is a part of the GNU Project.

https://en.wikipedia.org/wiki/Kawa_(Scheme_implementation)


Interesting. Why not use Clojure which runs on the JVM?


I've used both Clojure and Scheme (Lazy Racket) and find them both interesting. The novel part about Kawa is that it can be used to implement other languages that run on the JVM.


I'm a long-time Lisper and Hy makes Python just tolerable enough to use as a data scientist. It can't or doesn't want to solve Python's worst problems, but it's a big improvement.


What are Python's biggest issues from your perspective?


Scope and type weirdness, and just a generally lackadaisical design philosophy. Hostility towards functional programming. It's also still really, really slow, particularly in this era of JIT languages like Lua.

Also, I generally find the libraries to be amateurishly designed. Big one that comes to mind is Pandas.


DNS error for the docs http://docs.hylang.org/ :/



Not sure if it's quite the same thing, but this is still up https://hy.readthedocs.io/en/stable/


It shows build status: build failing.


Sounds very interesting! Anyone here who uses (or has tried) Hy? I'm curious how easy it is to use some arbitrary thing from Python (a library, a code snippet, etc.) in particular.


I wrote https://github.com/rcarmo/sushy on it before they moved let to contrib.

(I later ported it back to pure Python 3, because I couldn’t move forward while Hy was in flux)

If let was “fixed”, I’d probably move back to Hy.


It's great, for me it's basically Python but better.

Downsides:

- On CPython, the generated Python code tends to be noticeably slower in hot loops than the equivalent "native" Python code. I haven't done any testing with PyPy.

- Can't really use it at work since nobody will know how to read it.

My conclusion is: it's fun, it's Lisp, and it's Python. For personal projects and small standalone tools it's perfect.


Hy makes it trivial to use Python stuff. Like Django [0].

Most things are just a (import x) away.

[0] https://github.com/paultag/djlisp/tree/master/djlisp
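
For example, a minimal interop sketch using only the standard library (nothing Django-specific):

  (import json)
  (import datetime)

  (print (json.dumps {"now" (str (datetime.datetime.now))}))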


I'm (perhaps irresponsibly) mixing some of it inside a medium-sized Python codebase.

The one thing that doesn't work is autodoc/Sphinx. Otherwise it's all there, and quite pleasant to work with given a rainbow-parenthesis extension and Parinfer.


I played around with Hy and found it fun but I'm not sure anyone is using it for anything serious?


Cloudflare throws a DNS origin error. Repo: https://github.com/hylang/hy/blob/master/README.md


Ditto, getting `Error 1016` from Cloudflare.

Have to resort to using the Web Archive: https://web.archive.org/web/20190714180208/http://docs.hylan...


Works for me in Austria.


One problem I had with Hy is that some codebases really depend a lot on annotations, which gets a bit messier in Lisp syntax.


It's a Lisp, so you can build your own syntax; the messiness of annotations is only limited by your creativity.

  (defna add [a :: int b :: int] (+ a b))
https://pastebin.com/5kpTBiz2

You could expand that macro to support decorators as well via extra arguments before the args list, and you'd be in great shape.


WHy? Super cool project, but what would be the purpose of it? Or is it just a fun project to develop skill?


Lisps are pretty pleasant to write in, and having metaprogramming support in your language is a game changer.


It seems like it is pretty much just Python, but written with s-expressions and supporting macros.


I don't agree with the "just Python". Hy gets around Python's crippled lambdas and frees you from indentation hell. Hy also gives you real macros, expressions instead of return statements, improved Lisp-style comprehensions, :symbols and switch/case, while removing the need for nonlocal/global.
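
For instance, a minimal sketch of an anonymous function with several statements, something Python's single-expression lambda can't express directly:

  (setv log-and-double
    (fn [x]
      (print "doubling" x)
      (* 2 x)))

  (print (list (map log-and-double [1 2 3])))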


The macros support is a pretty big deal.


The word "just" almost always adds nothing of value makes a sentence less polite.


I completely agree and try to limit my usage of it. I am using it here to emphasise that it is only a different syntax for Python and nothing more. That is not to say that it is not valuable or interesting.


This is not like what Clojure is to Java; it's only a superficial syntax transformation.

Usually Lisp and FP mean that you get immutable data structures and a good concurrency system, both of which you get in Clojure.

I looked at this a while ago and figured it's not useful for production, but it's an interesting project nonetheless.


> Usually Lisp and FP mean that you get immutable data structures and a good concurrency system, both of which you get in Clojure.

No, Lisp and FP weren't traditionally tied to "immutable data structures and a good concurrency system".

That's something that came into fashion much later (let's say < 15 years ago).

In pre-2010 discussions of FP on HN, few mentioned immutability or "good concurrency" as typical FP traits, if they were mentioned at all. The same is true for old Paul Graham posts on Lisp. Those were just another tool in the toolbox, not something inherently FP. Haskell wasn't what "FP is really about" either.

FP back then was all about first class functions, map/reduce/filter/and co, homoiconicity, code as data, macros, and DSLs.


Homoiconicity, code as data, macros and DSLs aren't FP topics, but Lisp topics. FP has always been about preserving referential transparency when possible. That philosophy is pretty evident in the difference between the more FP-oriented Scheme and Lisp.

Paul Graham is also a pretty poor example in my opinion. He has traditionally used a very Scheme-like Lisp style. In 'On Lisp' he starts off by advising the reader to structure their programs by trying to keep side effects mostly in the outer layer of the program. He's not an FP purist who considers destructive procedures a sin, but he definitely advocated for immutability (just not under that name).


> Homoiconicity, code as data, macros and DSLs aren't FP topics, but Lisp topics.

FP discussion outside of the academic context (and especially on HN) before a certain date in the past one to two decades was all about Lisp(s), and to a lesser degree Scheme(s).

It doesn't matter whether that was "really FP" (no true Scotsman); that's what was discussed, advocated, and practiced by most as "functional programming".

ML(s) played very little into the discussion, and purity/immutability was discussed, but not as the most important thing (rather as the more pure/mathematical version of FP, not necessarily practical for professional programming; that came later, when Haskell and then Clojure popularized it).


This is certainly true if you are used to Clojure.

But interestingly enough, idiomatic Common Lisp code can roughly play in the same field as Python in terms of "purity". Also, lots of Scheme code is almost imperative in nature if you look at it from the perspective of a Clojure or Haskell programmer. This is why Paul Graham once gave the advice to use Python if you cannot use Common Lisp.

I think Richard P. Gabriel wrote a "complaint" about functional programming being redefined by the research/types/ML crowd (he wasn't so much complaining about them striving for purer FP as about them taking over the conferences, etc., driving out the more traditional FP traditions of the Lisp communities).

Anyways, what I wanted to say is: FP is fairly broad as a topic, a lot can fall under it or out of it depending where you draw the line. That's not a bad thing and there is space for mutable lisps.

The other aspect is: if you have Python, there is not much to gain from s-expressions alone. Sure, you get macros, but otherwise you already have a lot of the benefits you get from Lisp over, say, C++ (dynamic typing).


Interesting; functional programming was introduced to me through DrRacket's Scheme, and the core concept there was that there is no mutable state. To me, having mutable state makes it imperative: you're not evaluating functions/expressions, you're executing statements that modify state.


Yeah, this is one of the reasons why Racket doesn't call itself PLT Scheme anymore. Scheme has procedures `set-car!` and `set-cdr!` that allow you to mutate the car and cdr of a pair.

Scheme pioneered or picked up a lot of advances for making programs more functional, for example lexical scoping, closures, etc., which were not present in every Lisp.

What's a bit funny is that I have seen some Schemers sneer at CL for "lacking the functional elegance", and Common Lispers sneer at Scheme for not being a true Lisp with namespacing, etc.

One observation I have is that in my career I have started to see more patterns in code (they were probably there years ago, but I wouldn't have recognized them). Sometimes I see monadic patterns in otherwise object-oriented code bases, or I see dependency injection realized in functional, immutable code bases. Maybe it would be time to revive the pattern movement to recompile common patterns and aim at a modern description of these. I feel like a lot could be gained [*].

[*] Also, I'll walk out of any interview where I'm asked about the visitor pattern again. So tired.


Racket originally had mutable cons cells, but they got rid of them sometime while it was still called MzScheme. One of the reasons they rebranded to Racket was that they no longer implemented Scheme as specified.


As I see it, there are two orthogonal roots to functional programming: Mathematical functions on the one hand, and lambda expressions/closure/higher-order functions/... on the other.

The former implies immutability and technically belongs to the declarative side of things: a mathematical function declares a relation between input and output, but I say technically because functional code can read rather algorithmically.

The latter can be added to imperative languages that do feature mutable state. Depending on the language, I might still classify it as functional, but impure.


Yeah, I'd say it's quite simple. There is "functional programming", and it is an umbrella term. Then there is "purely functional programming", and it's somewhat better defined than its hypernym. For purely functional programming we are certain that we want immutability and try to limit the usage of effects.

Lately, I think, there is even a de-emphasis of _functions_ as the base of (purely) functional programming. What's now being emphasized is the composability aspect. Functions are an abstraction that is composable, but so are other abstractions.


But with Hy you have Lisp + PyPI.


> Usually Lisp and FP mean that you get immutable data structures and a good concurrency system, both of which you get in Clojure.

FP, yes. Lisp? No. Lisp data structures are generally mutable, and I don't think concurrency is a key aspect of Lisp (just look at Emacs Lisp for something with much worse concurrency than Python).


Or rather, it isn't useful in production in any way Python is not, and there's little reason to choose a weird stunted Lisp that offers few advantages over its host, an established language with a huge community.


Syntax transformation via macros can be a significant advantage when the semantics of the language don't support an abstraction that's shaped like your problem.


totally agree with this.

I can see, even in larger Python codebases, some places where e.g. support for an s-exp minilanguage could be useful and implemented rather trivially via Hy, or where one could embed a processing kernel that is well suited to being written with Lispy semantics (e.g. symbolic programming, etc.). Another use could be to define core primitives in Python and glue them together using a hierarchy of configuration + Hy macros to generate specialized '__main__' programs for task-specific use cases.


That is true; Hy has macros and Python doesn't. Thus "few advantages".


FP certainly means that you get immutable data structures (and that you don't get, or at least don't use, mutable data structures, by the definition of FP); Lisp - I dunno, CL has setf (and singly linked lists made of cons cells which can share a tail, where you can modify the shared tail when working with one of the lists and the other will get modified as well). As to a "good" concurrency system, that's a bit harder to define.


> This is not like what Clojure is to Java; it's only a superficial syntax transformation.

Python doesn't implement a standard virtual machine. The best you could do is some kind of translation to CPython bytecode, which doesn't really buy you much (and loses compatibility with other Python implementations like PyPy).


The Jupyter notebook kernel seems to be broken for it right now.


Which one is broken? There were at least two, last I checked:

* https://github.com/bollwyvl/hy_kernel/

* https://github.com/Calysto/calysto_hy


Does it support syntax-rules or something similar?


It has defmacro.


DNS error for me as well.


Let's combine the things that many people hate: Lisp parentheses with Python's GIL. :)


The speed of Python with all the readability of Scheme.


Some of us really like the parentheses!


I personally like parentheses. But a surprising number of my engineering friends bring it up as a major issue as soon as we start talking about Lisp or Scheme.


The Lisp with the largest number of libraries.


Does the language support macros? If not, it's totally worthless. Python is a pretty bad language for even casual FP, and an S-exp syntax layer isn't going to help.


> If not, it's totally worthless.

Kind of a harsh assessment don't you think? Maybe some people enjoy working in Lisp-like languages while also enjoying Python's huge ecosystem of libraries. I doubt it's worthless for those folks.


No one actually likes the parens of Lisp. I mean, just look at Racket2. Even the dedicated Lispers who work on Racket think they suck.

I've written both Racket and Common Lisp professionally and consider myself very competent in both. In fact, I think they're both great languages and among the best around.

It’s not because of the parens though, it’s because what the parens give you: namely homoiconicity and easy macros.

Without these things, the increased syntax burden of parens doesn't buy you anything except visual noise and being forced to type extra characters.


Your statement doesn't represent everyone. I thought that the design of Lisp was a great idea before I learned about its macros. (This was at a time twenty years ago when macros weren't being hyped in programming forums.) I was impressed that it simplified the specification of the language and the implementation of parsing, as well as that it eliminated ambiguity. Then came the realization that formatting the code to look consistently good, no matter which way it is broken into multiple lines, is trivial, and that it enables great editor support. Oh, and being able to just quote any chunk of the program as a datum!

Also:

Symbols are tokens made up of almost any sequence of non-whitespace characters: if you type <->, that is a symbol, without having to modify the lexical analyzer of the language to recognize a new token pattern.

Basic math operations being variadic: (+ term0 term1 term2 ...).

Here is an advantage of Lisp syntax: -123 is an actual negative integer token, and not the unary operator applied to 123, requiring evaluation.
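
A small sketch of those points in Hy syntax (names are just illustrative):

  (setv running-total (+ 1 2 3 4 5))    ; hyphenated name, one variadic +
  (print running-total)                 ; => 15
  (print -123)                          ; a single negative literal, not (- 123)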


S-expressions bring a syntactic simplicity that makes code easier to read and more explicit.

It also makes it easier to edit structurally.

So yes, homoiconicity and easy macros are nice, but please refrain from including me in your use of "no one".


And me :) - I love s-expressions.


> Without these things, the increased syntax burden of parens doesn't buy you anything except visual noise and being forced to type extra characters

So instead of f(x), I have to accept the visual noise and extra characters of (f x)?


What's worse, instead of f(x, y); you have to accept the visual noise and extra characters of (f x y).


I don't really understand these kinds of arguments. Seasoned Lispers use ' or #' or a backtick or ,@ all the time. Or look at the CL loop macro. If I had to wrap every function in CL with (function ...) I would shoot myself. Same thing with quotes. Or, what about strings, or vectors? Strings have a special syntax, and no one complains about writing "foo". You know why? Because it's nice. This magical syntaxless world everyone wants to live in actually sucks big-time, even for Lispers.

If anyone really cared about parens, they wouldn’t be using (or creating) these kinds of things in the first place. I think it’s a pretty indefensible position to say that Lispers actually just like parens for the sake of them. If that was the case, why not start a form with (( or even (((?

Like, does anyone like using progn or begin? I mean seriously..

Syntax is great, DSL’s are great, they help you to express things elegantly and succinctly. This is my major beef with Lisp. If I’m not manipulating the AST, I don’t want the heavy syntax burden. If I am, then great, Lisp is awesome.

Maybe my career is different from you guys, but I'm not transforming source code even 5% of the time. I'm adding numbers, I'm multiplying stuff, I'm writing out formulas. And let me tell you, trying to write out math with S-exps majorly sucks. You would have to be insane to think that "(+ (fib (- n 1)) (fib (- n 2)))" is better than "fib(n - 1) + fib(n - 2)".

Sorry if this sounds like a rant (it kinda is), but I don’t think anyone thinks syntax is bad. It’s just that Lisp has minimal syntax and that seems good enough. You don’t notice all of the nice and convenient things you would have to give up if you actually had a completely regular syntax.


> Lisp has minimal syntax

It hasn't. It just works differently.

S-expressions define syntax for data: lists, strings, numbers, symbols, vectors, arrays, ... Plus a bunch of abbreviations (quote, function quote, ...) and some extras (templates, reader control, ...). One can program that part with reader macros.

Then Lisp usually has around three (or more) syntax classes for prefix forms:

1) function calls

2) special forms like LET, ...

3) macros like DEFUN, DEFCLASS, WITH-SLOTS, ...

Special operators and macro operators introduce syntax. See the Scheme manuals or the Common Lisp spec for the EBNF definitions of these syntax operators. This can be simple or relatively complex (LOOP macro would be an example, but also something like DEFSTRUCT, ...). The user can write macros to extend the syntax.

Now if we want infix/mixfix arithmetic, we can embed it via the reader

#i( fib(n - 1) + fib(n - 2) )

or as a macro:

(infix fib (n - 1) + fib (n - 2))

Problem: it's not built-in and not that common.

> I’m adding numbers...

Now one would have a bunch of options:

1) live with the Lisp syntax as it is, improve it where possible

2) use syntax extensions for mathematical code, may have tooling drawbacks

3) use a different surface syntax for Lisp or a specialized syntax (see Macsyma and similar)

4) use a different language with the usual mix of prefix, infix and postfix operators

Libs:

https://github.com/ruricolist/infix-math

https://www.cliki.net/infix

https://github.com/peey/ugly-tiny-infix-macro

https://github.com/rigetti/cmu-infix

http://readable.sourceforge.net


> So instead of f(x), I have to accept the visual noise and extra characters of (f x)?

(f x) and f(x) have the exact same number of characters; one man's 'visual noise' is another's 'visual clarity'.


What about (+ (/ 1 2) 5) and 1/2 + 5?

Also if you think parens and prefix notation are so great, would you also like to see math papers written that way?


Now you have to know that 1 / 2+5 is still (1/2) + 5.

The S-exp can be shortened in anything calling itself a Lisp, using the unary reciprocal: (+ (/ 2) 5).

Math papers are full of obscure notations; they should standardize on s-exps. Then just a straightforward dictionary of symbols would be needed to look up a notation.

Math notation is at least 2D: it makes good use of spatial structures to reduce ambiguity. For instance, instead of inventing some crazy ternary operator, mathematicians will just arrange three arguments spatially around a symbol. The space subdivides recursively, so elements end up being positionally determined, somewhat like what S-exps do in 1D.


I can format equations in LaTeX in fewer characters than I could express them in Lisp, so what you're saying about 2D vs 1D really doesn't make any sense.


LaTeX is (understandably) geared toward single-character variable names, and provides canned control sequences for common functions like \sin \cos. Try it with variable names that are two or more characters, and user-defined functions.


I don't think it's a given that a useful way of expressing programs must also be a useful way of expressing an academic paper.

That being said, a lot of mathematical notation is pretty close to Lisp syntax, basically anything that works like sigma.


I see what you mean: Math uses 2D notations with fixed positions. The space around an operator can be subdivided into nine sections (top, bottom, left, right, corners). That yields up to nine positions for operands. Fixed positions means, effectively, no precedence parsing, just like what S-exps do in one dimension. The operands themselves use spatial subdivision likewise. E.g:

      x
           z+w
  y   op   ---
           s+t

this would be (op x y (/ (+ z w) (+ s t))).


The fact that lisp syntax introduces no extra characters and no extra noise is my point; I'm glad we agree.


The slide that the Racket folks presented recently on allowing non-parenthesized syntax literally had "I <3 ( )" emblazoned across the top. I think you're mischaracterizing the position of the Racket developers, who are interested in catering to folks who don't want to spend a good 20 minutes learning how to use ParEdit.


I like the parens for the structured editing they allow. Also, you don't need parens to be homoiconic.


Yes it does support macros. That's one of its main selling points.


I genuinely fail to understand the advantages of any sort of Lisp. Contrary to what gigantic-brain Lisp devs might think, its syntactical annoyingness is far too much of a barrier to overcome.


> its syntactical annoyingness is far too much of a barrier to overcome.

You're likely being downvoted because statements like these are more a reflection of you than the language, but instead of phrasing it that way, you come across as making an absolute statement about the language.

In any case, I'll bite.

I'm not a Lisp programmer - the extent to which I do Lisp is the occasional Emacs Lisp. Still:

You can use virtually any character as part of your variable name. Lispers tend to use hyphens and '?' a lot. Once you get used to it, it's hard not to view all other languages as inferior. Why did they restrict variable names so much?

A lot more substantive: I saw this in John Kitchin's blog post somewhere. My details are probably a bit off, but I think I have the general gist of it. His team works on, I believe, computational chemistry. Their code base is in Python, and they've developed an API for various aspects of their calculations. A lot of their functions require a lot of arguments - not generally a good idea, but it makes sense in the domain they work in. To manage that headache, they've built a systematic way of documenting all those arguments in the docstring, with a particular format.

Unfortunately, that means anyone in his team who creates such a function needs to conform to that format, and people get lazy, make mistakes, etc.

So using Hy, he wrote a macro for another syntax for defining a function. With this new way of defining a function, there was special syntax for all the special information that needs to be in the docstring. The macro then creates the Python function with the appropriately written docstring. If the programmer makes a mistake, the macro fails. So now everyone uses the macro to create those types of functions.

I don't recall the details, but let's say it was something like:

    deff func(x, x_doc="doc for x", y, y_doc="doc for y",...)
If someone makes an argument called x, but does not provide x_doc, then it will fail.

(I think in reality it was more than that - not just x_doc but x_doc_a, x_doc_b, etc and it generated the docstring from all those pieces).

Now, are there other, perhaps more Pythonic ways to do this? Probably. However, it is nice that for a domain-specific problem, you can just invent new syntax and use it.


> You're likely being downvoted

I do not care about downvotes on here.

> You can use virtually any character as part of your variable name. Lispers

This is not a good thing.

> However, it is nice that for a domain specific problem, you can just invent new syntax and use it.

This is equally not a good thing.


Saying nothing other than:

1. I don't care
2. Not a good thing

Is not adding anything to the conversation. Have you examined your internal need to spend time commenting on this?


Readability trumps just about all else in software dev, so I am 100% with you there on variable name limits and lack of syntax extension.

`[a-zA-Z_][a-zA-Z0-9_]*` is such a universal and useful convention that I find myself using it even when the domain allows other characters (bash, *nix filenames). Obviously this is English-centric, so I would include other language constructs if I were in a country with another dominant language.

In fact even though I have the ability (languages supporting unicode and a QMK keyboard) to use symbols, e.g. Greek letters for math, I don't, because I know it's a hassle for anyone else.

> You can use virtually any character as part of your variable name.

> invent new syntax

are a hassle and an anti-pattern for the same reason.


I think you think all this stuff is bad because you’re imagining having this power in C or JavaScript, in which case these features would be awful.

However if these features are exposed through clean abstractions in a language that is syntactically bare to begin with, you might have different thoughts.


The original (implicit) question was how one language is better than another. Responding to the answer with "Things shouldn't be different" is probably not the best counter. If you value familiarity, stick to the familiar.


The world’s best optimizing compiler for a quantum programming language was built in Common Lisp [1], and it couldn’t have reasonably been built otherwise.

[1] https://github.com/rigetti/quilc


W-Hy?


Cuddles the cuttlefish <3


The disturbing logo really puts me off wanting to learn more about this language.


Am I missing something? It seems it's just a "cute" octopus.


Funny enough, Cuddles was created by Karen Rustad. She also drew the Ferris logo for Rust.


Cuddles is a cuttlefish.



