
Show HN: A multi-syntax programming language with bidirectional grammars - alehander42
https://github.com/alehander42/hivemind
======
cyrus_
There has been some interesting work published recently on "resugaring" that
you might be interested in:

[http://cs.brown.edu/~sk/Publications/Papers/Published/pk-res...](http://cs.brown.edu/~sk/Publications/Papers/Published/pk-resuarging/)

and more recently:

[http://cs.brown.edu/~sk/Publications/Papers/Published/pk-hyg...](http://cs.brown.edu/~sk/Publications/Papers/Published/pk-hyg-resugaring-comp-desugaring/)

~~~
mjn
Also related, in a somewhat different vein: there's work on unifying parsing
and pretty-printing (sometimes called "unparsing"):
[http://www.informatik.uni-marburg.de/~rendel/unparse/](http://www.informatik.uni-marburg.de/~rendel/unparse/)

~~~
j-pb
I'm always dreaming of an invertible compiler that automatically provides a
decompiler for the language it implements.

Then we could transpile every language into every other language, as long as
they compile to a common platform.

Using a logic language for the implementation might help, but a compiler is
probably not surjective in general, which makes this very hard :(

~~~
sklogic
This is nearly impossible. Just imagine decompiling the result of lowering
some deeply nested pattern matching.

~~~
j-pb
Of course you won't get out what you put in, but you might get something.

What I found most impressive was this Scheme interpreter written in miniKanren
(Scheme's Prolog) that can produce quines simply by expressing them as logic
constraints over multiple programs.

[http://www.infoq.com/presentations/miniKanren](http://www.infoq.com/presentations/miniKanren)

[https://www.cs.indiana.edu/~eholk/papers/sfp2012.pdf](https://www.cs.indiana.edu/~eholk/papers/sfp2012.pdf)

~~~
sklogic
Interesting. Yes, in some cases decompilation can return higher-level code
than the original, which is quite an interesting and yet very much unexplored
area. I only scratched that while I was working on static code analysis.

------
shpx
Somewhat related: hy (aka hylang) is a Lisp that compiles to the Python
abstract syntax tree, which technically makes Python a multi-syntax language
already. Plus you get all the Python libraries.

[https://github.com/hylang/hy](https://github.com/hylang/hy)

It's been posted a bunch of times on HN

[https://www.google.ca/search?q=site%3Anews.ycombinator.com+h...](https://www.google.ca/search?q=site%3Anews.ycombinator.com+hy)
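For flavor, targeting Python's AST directly is easy to sketch with the stdlib
`ast` module. This is a hand-built tree standing in for what a Lisp front end
might emit, not hy's actual pipeline:

```python
import ast

# Build the AST for the expression (+ 1 2) by hand, roughly what a
# Lisp front end could produce after reading its s-expression form.
tree = ast.Expression(
    body=ast.BinOp(left=ast.Constant(1), op=ast.Add(), right=ast.Constant(2))
)
ast.fix_missing_locations(tree)  # fill in line/col info required by compile()

# compile() accepts an AST in place of source text.
code = compile(tree, "<lisp>", "eval")
print(eval(code))  # 3
```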

~~~
alehander42
I love hy !

Actually I'd done a prototype of a Lisp compiling to Python bytecode, which is
why I was pretty excited when I found hy.

[https://github.com/alehander42/bach](https://github.com/alehander42/bach)

However, it doesn't make Python multi-syntax in the same way, because you don't
get 1-to-1 Python-hy inter-translation.

~~~
eric_bullington
Hey, you're the guy who did hermetic! Your project, along with Smallshire's
Hindley-Milner blog post, has been a big help to me as I use Python to build a
compiler/interpreter for an ML.

I was just planning to make a mini-ML to prepare for a larger project that
requires type inference, but it's getting a bit addictive so I might just
finish it off to match (most of) the Standard ML spec.

Anyway, well done on hermetic. It's a great start to my ideal Python, and
where I'd like to see it move for 4.0 (although it will never happen).

~~~
alehander42
Ha, I didn't expect that people remember Hermetic, I am glad it helped :]

Actually I took Hermetic in a different direction, and now I have a type-
inferred Python with a more classical type system and a language-independent
code generator, compiling to Go, C# and Ruby. It can translate parts of the
standard library to equivalents in the targets' standard libraries, so it's a
bit more practical, but I hope I can extend that somehow to Haskell support
too: [https://github.com/alehander42/pseudo-python](https://github.com/alehander42/pseudo-python)

Your work seems pretty interesting, I'd love to chat sometime about it.

~~~
eric_bullington
> Your work seems pretty interesting, I'd love to chat sometime about it.

Your work does, too. Yes, we should talk. My email's in my HN profile.

------
earleybird
Related: [https://github.com/cdglabs/ohm](https://github.com/cdglabs/ohm) and
Alessandro Warth et al.'s work on OMeta. See parsing/unparsing in
[http://tinlizzie.org/ometa/](http://tinlizzie.org/ometa/)

~~~
xKingfisher
I actually built a prototype multiparadigm programming language for a class
Alex taught using Ohm: [http://chimera.kaseyc.com](http://chimera.kaseyc.com)

It's a great tool.

------
agentgt
Not entirely analogous but you can sort of do a lot of this today with the
very programming platform that the HN implementation uses: Racket.

And yes, with Racket you can mix and match languages and create them without
parentheses ([https://docs.racket-lang.org/guide/languages.html](https://docs.racket-lang.org/guide/languages.html)).

BTW the "if" in Python is very different from the "if" in Lisp, as one is a
statement and the other is an expression. I'm curious how the OP plans to deal
with the expression vs. statement mismatch.
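For illustration, Python itself has both forms, though the expression form is
restricted to a single conditional (a toy sketch):

```python
# Statement form: 'if' directs control flow and produces no value.
def sign_stmt(x):
    if x >= 0:
        r = "non-negative"
    else:
        r = "negative"
    return r

# Expression form: the conditional itself evaluates to a value,
# closer to Lisp's (if c a b).
def sign_expr(x):
    return "non-negative" if x >= 0 else "negative"

print(sign_stmt(-3))  # negative
print(sign_expr(-3))  # negative
```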

------
duncanawoods
I like this type of thing, but I don't believe it's practical, because of the
massive potential variation within a single syntax, which is typically hidden
by language-specific culture, convention and idiom. When you transliterate
from one language to another, the gulf becomes visible and the resulting code
is alien and unacceptable despite executing correctly.

The unavoidable problem is finding a language in which everyone's solution to
the same problem is the same, but I think that goal is more likely provably
impossible than achievable. Even if such a language exists, finding solutions
in it might be like threading a tiny needle, too hard to be usable.

~~~
logicrook
Well, you mean the Zen of Python's "There should be one-- and preferably only
one --obvious way to do it."

In any case, even then, people think in different ways: you can model a
problem via category theory, geometry, algebra, and those would never
translate into the same syntactical constructions, since the abstractions are
of fundamentally different natures.

Now I think mostly in a functional way, so my Python code is full of small
functions to help write "pythonic functional code", e.g. the function
iff(b, x, y) that returns x if b else y. The resulting code could be
translated algorithmically to a functional language in a very natural way;
internal defs would be translated to where clauses, etc.
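A minimal sketch of that iff helper (the body is my assumption; note that,
unlike a real conditional expression, both branches are evaluated eagerly
because it is a plain function):

```python
def iff(b, x, y):
    # Functional-style conditional: return x when b is truthy, else y.
    # Caveat: x and y are both evaluated before the call, so this is
    # not a drop-in replacement for a lazy 'if' expression.
    return x if b else y

print(iff(3 > 2, "yes", "no"))  # yes
```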

The point being, I want to be able to think in the way that is the most
adapted to tackle a problem, even if the code seems to be written in two
different languages. This is no different from mathematics where each "X
theory" is essentially its own language and each implies its own set of
natural theorems.

~~~
duncanawoods
>> This is no different from mathematics where each "X theory" is essentially
its own language

Nice connection between models and languages.

The maths/physics discoveries achieved by using a new representation of a
problem don't change the problem, they just present the issue at hand simply
enough to be tractable. Each representation moves complexity trade-offs to a
different category of problem.

I guess that does say something in favour of automatic syntax conversions, but
it seems too low-level. Maybe conversion between the modes of abstraction
being used is where it gets interesting.

~~~
logicrook
In mathematics, there are fairly few such 'meta-results', and each one is duly
celebrated as impressive; the one that comes to mind is the Curry-Howard
correspondence... but in more 'traditional' maths, it's very hard to find
general high-level translations.

There are many cases where automatic low-level syntactic constructions can
bring huge improvements, in compiler design etc; it's more that they are less
likely to be directly useful to laymen.

------
drostie
Awesome! This has long been a dream of mine, but I never solved the problem of
creating a better syntax than BNF grammars etc.

I don't know if we have the same goals, but my end-goal was to have some
treelike serialization (JSON/XML) but a canonical editor which would display
in whatever you were most familiar with (Lisp, Python, Haskell, C-ish). The
limitation would be that you'd have to store metadata-ish stuff on the tree to
handle custom indentations that don't follow some canonical style guide.

~~~
alehander42
Yes, that was basically my goal too : )

I also imagined automated reordering of code in an editor, but I still have
only a demo
([https://github.com/alehander42/rehab](https://github.com/alehander42/rehab))

~~~
logicrook
Looks cool. This is really a great work, it has probably been the pipe-dream
of many programmers.

------
sklogic
Bidirectional parsing/pretty-printing is a nice thing, with the only
potentially tricky issue being an _idiomatic_ rendering of infix operators
with priority. Firstly, the parser must be aware that they are special (I am
using a dedicated Pratt parser for the binaries inside a Packrat for this
reason). Secondly, it is often important to print parentheses around an
expression for readability, even when they are not dictated by precedence.
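A toy sketch of that round trip (not the commenter's actual Pratt-inside-
Packrat setup): a precedence-climbing parser for + and *, and a printer that
reinserts parentheses only where the tree demands them:

```python
import re

PREC = {"+": 1, "*": 2}  # binding power of each infix operator

def tokenize(s):
    return re.findall(r"\d+|[+*()]", s)

def parse(tokens, min_prec=0):
    # Pratt-style precedence climbing over a mutable token list.
    tok = tokens.pop(0)
    if tok == "(":
        left = parse(tokens)
        tokens.pop(0)  # consume ")"
    else:
        left = int(tok)
    while tokens and tokens[0] in PREC and PREC[tokens[0]] >= min_prec:
        op = tokens.pop(0)
        right = parse(tokens, PREC[op] + 1)  # +1 makes operators left-assoc
        left = (op, left, right)
    return left

def pretty(node, parent_prec=0):
    # Parenthesize a subtree only when its operator binds more loosely
    # than the context requires.
    if isinstance(node, int):
        return str(node)
    op, l, r = node
    s = f"{pretty(l, PREC[op])} {op} {pretty(r, PREC[op] + 1)}"
    return f"({s})" if PREC[op] < parent_prec else s

print(pretty(parse(tokenize("(1 + 2) * 3"))))  # (1 + 2) * 3
print(pretty(parse(tokenize("1 + 2 * 3"))))    # 1 + 2 * 3
```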

------
winter_blue
This is very cool. People who want to create new languages usually have to
write a lot of code even though most of the heavy lifting is done by a lexer
and a parser generator.

This seems to me like a very high level way to specify a new language. _A much
easier way,_ definitely. I assume the cost of this is that only a small subset
of programming languages can be specified in _hivemind_. Nevertheless, it's
still an incredible project. Target LLVM, and this could become a compiler
generator ;-).

------
vbit
Funny, this was one of the goals of Nim, I believe.

~~~
nyan4
Nim has a powerful templating system that allows introducing new syntactic
elements and semantics, but maybe you are thinking about generating _output_
code in multiple languages (C, C++, JS...)?

~~~
vbit
No, I meant having one AST but multiple 'skin' languages. In the end, I think
they didn't pursue it all the way.

------
zeckalpha
Awesome! I've often wondered about converting between OO/Infix and FP/prefix
notations, which might be an extension of this.

~~~
nv-vn
What is the connection between OO and infix vs. FP and prefix? Most FP
languages use infix notation for operators, and Haskell even supports infix
notation for all functions. Lisp does both OOP and FP in prefix notation.

~~~
zeckalpha
[https://en.wikipedia.org/wiki/Word_order](https://en.wikipedia.org/wiki/Word_order)

SVO/OO/Infix: subject.verb(object)

VSO/FP/Prefix: (verb subject object)

SOV/RPN/StackOriented/Suffix: subject object verb
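The three orders can all be rendered as real call shapes in Python (a toy
sketch; the names Num, add and rpn are purely illustrative):

```python
# One operation, three word orders.

class Num:
    def __init__(self, v):
        self.v = v
    def add(self, other):           # SVO / OO: subject.verb(object)
        return Num(self.v + other.v)

def add(a, b):                      # VSO / prefix: verb(subject, object)
    return Num(a.v + b.v)

def rpn(stack, verb):               # SOV / stack: subject object verb --
    b, a = stack.pop(), stack.pop() # operands are pushed first, then the
    stack.append(verb(a, b))        # verb pops them off the stack
    return stack

print(Num(1).add(Num(2)).v)             # SVO: 3
print(add(Num(1), Num(2)).v)            # VSO: 3
print(rpn([Num(1), Num(2)], add)[0].v)  # SOV: 3
```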

~~~
nv-vn
I think in that way it would make sense to say that procedural is also prefix,
then, since even very procedural languages like C use VSO. Some OOP languages
use VSO too (Perl's indirect object syntax, Common Lisp with CLOS, etc.). And
someone could write all their Haskell in backtick-infix style:

        1 `elem` [1, 2, 3]

is a common way of calling elem in Haskell, and functions are often redefined
as operators (such as <$> instead of fmap, which gets used a lot more often).
I just think that any kind of syntax split based on paradigm seems very
artificial.

~~~
zeckalpha
The same is true for natural languages.

------
jiyinyiyong
Also, you can write Scheme-like syntax in [Cirru](http://cirru.org) and get a
third syntax.

------
ilaksh
Will this compile to web assembly?

