
As someone who's barely used Lua (only when forced to), I don't get it.

The error messages aren't great, the arrays are 1-indexed, and the syntax is super weird.

I understand there are good technical aspects to the language, but I am really thrown off by the syntax. I get that that could go away with time, but I have no reason to believe that "learning Lua" would actually bear any fruit.

Part of me is convinced that Lua is liked simply because it's unpopular, and that if it were used in production more, people would hate it. Kinda like how we hate JavaScript and C++. Maybe it's that we hate our jobs and blame the language, and then we go home, screw around in Lua, enjoy ourselves, and credit the language.

EDIT: Just to piggyback on my comment here:

I've been optimistically looking at the Wren scripting language [1]. From the outside, that seems like the "correct" way to make a scripting language. I should make it clear that I am not speaking from a position of authority; I am sharing my opinion of Lua as a pessimistic outsider, which I think is a very common view.

[1] https://wren.io/

Having used Lua a fair bit in the past year (in order to write custom plugins for Kong), the main draw (and in some cases, drawback) for me with Lua is its integration of coroutines. The draw: it makes writing event-driven code dead simple, with coroutines managed behind the scenes as you write code serially. The drawback: if you pull in the wrong library, one that does something in a blocking fashion (such as a C module that doesn't "get" Lua), you can get into some trouble.
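To make the "serial-looking but event-driven" point concrete, here is a minimal, framework-free sketch. The names (`handler`, the "request" strings) are made up for illustration; in a real system like OpenResty, the yield/resume plumbing is hidden behind library calls.

```lua
-- Minimal sketch: handler code reads serially, while a host scheduler
-- resumes the coroutine whenever the "I/O" it yielded on is ready.
local function handler()
  local request = coroutine.yield("want_request")  -- looks blocking, isn't
  return "handled: " .. request
end

local co = coroutine.create(handler)
local ok, what = coroutine.resume(co)              -- runs until the yield
assert(ok and what == "want_request")
local ok2, result = coroutine.resume(co, "GET /")  -- "I/O" arrives; resume
assert(ok2 and result == "handled: GET /")
```

If `handler` instead called a C function that blocks the OS thread, the scheduler never gets control back, which is exactly the failure mode described above.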

In terms of syntax it’s basically JavaScript with stupid array indexing, which is enough to turn people off I’m sure.

IMO Lua's main strength is its implementation, not the language itself. Compared to Python, it's more performant, lightweight, portable and robust [0].

Unfortunately, the language itself is "quirky" at best. It has unusual syntax, variables that are global by default, and pervasive use of `nil`. I personally also dislike the build-your-own object system and the language's conflation of lists and maps.

I too have been looking for an alternative to Lua - I want a language like Python with an implementation like Lua's, which I could use both for embedding and standalone command-line applications. I've even considered building something like this myself - either from scratch or, perhaps, as a fork of MicroPython.

[0] https://www.lua.org/bugs.html

> The error messages aren't great

No language has _great_ error messages. But comparing the Lua message for e.g. `print(nil+5)` to the Python message for `print(None+5)`, I somewhat prefer Lua's natural language approach "attempt to perform arithmetic on a nil value" over Python's jargon-heavy "TypeError: unsupported operand type(s) for +: 'NoneType' and 'int'". Of course Lua's message could be better (it could/should tell you which operation and where the nil is in the expression), but "not great" basically describes every language's error messaging, and Lua's messages are not exactly eye-bleeding like C++ STL errors either.

> the 1-indexed arrays

1-based indexing is actually really human friendly, and programmer affinity for 0-based indexing in languages that don't map directly to RAM pointer offsets is largely excess familiarity with C, plus cargo-culting and wanting to believe that everything Dijkstra ever said is universally true regardless of context. Indexing by 1 gives many nice human-oriented properties. For instance, indexing the front and back is symmetrical (first is 1 and last is -1), whereas in 0-based languages you have 0 and -1, which makes a lot less sense (people who haven't been bent to the will of the C language gods don't think of the last element as one before the first; it's all the way at the end) except as a concession to the fact that there's no -0 available. It's easier to reason about a list of five things where the first element is the first and the fifth element is the fifth, instead of constantly doing mental -1 arithmetic to satisfy a paradigm that requires visualizing hardware memory layouts.

And if you really really really want to cycle over an array using modulus operations, the classic "0 is better" example that people love to trot out, Lua also lets you start your array at 0 if you want. Lua doesn't care what you use for table keys as long as they're not nil.
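To make that concrete, here is plain Lua showing both the default 1-based convention and a 0-keyed table; note the caveat in the comments, which is my own qualification, not the parent's claim:

```lua
local t = {"a", "b", "c"}        -- array constructor: first element is t[1]
assert(t[1] == "a" and t[#t] == "c")

-- Nothing stops you from keying at 0, though stock conventions
-- (#, ipairs, table.insert) still assume 1-based sequences:
local z = {}
for i = 0, 4 do z[i] = i * i end
assert(z[0] == 0 and z[4] == 16)
```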

> the syntax is super weird

Is it? I teach Lua to people, and most novice programmers I meet find the syntax quite intuitive with its plain English keywords and operators. Do you think "function" is more weird than "def" or "sub", "or" more weird than "||", "not" more weird than "!"?
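For comparison, a small made-up function using those keywords (the function and its strings are mine, purely for illustration):

```lua
-- Plain-English keywords where C-family languages use symbols:
-- "and"/"or"/"not" instead of &&/||/!, "function" instead of def/sub.
local function describe(n)
  if n > 0 and n % 2 == 0 then
    return "positive even"
  elseif not (n > 0) then
    return "not positive"
  else
    return "positive odd"
  end
end

assert(describe(4) == "positive even")
assert(describe(-1) == "not positive")
assert(describe(3) == "positive odd")
```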

> No language has _great_ error messages.

Elm and Rust have great error messages.

I was almost going to argue with you, but Rust takes pride in actively developing and improving error messaging and it is strongly to their credit.

So instead of "no languages", let's say "very few languages, and none of the top 8 languages according to TIOBE". (Rust is now #9, up from #18 a year ago, so mad props to them for breaking into the top 10!)

And credit where credit's due, Rust got into that inspired by Elm.

I love that example of the lua error message, but aren't there non-arithmetic functions that also error? What's the message in that case?

What's nice about Python's message is how compositional or "orthogonal" it is, which means that the errors can easily be handled in a program. Does Lua have anything like that? From the manual:

> If you need to handle errors in Lua, you should use the pcall function (protected call) to encapsulate your code.

This actually helps me understand a bit the dichotomy between "configuration" and "program", and why Lua is so good for the former. As another comment put it, it's good when you need to "describe stuff". It doesn't make a lot of sense to describe something that handles errors in its own description, which motivates relegating error handling to specific contexts.
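For readers who haven't used it, here is a sketch of what that pcall pattern looks like; the anonymous function and the substring check are mine, for illustration:

```lua
-- pcall runs a function in protected mode: instead of unwinding the
-- whole script, the error becomes an ordinary return value.
local ok, err = pcall(function()
  return nil + 5          -- raises "attempt to perform arithmetic on a nil value"
end)

assert(ok == false)                              -- the call failed
assert(string.find(err, "arithmetic") ~= nil)    -- err is the message string
```

So error handling composes in the host program, but it lives at explicit call boundaries rather than being woven through the "descriptive" code itself.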

> aren't there non-arithmetic functions that also error? What's the message in that case?

Any function can error in any way imaginable, but if we're talking about built-ins...

`print(nil.."a")` -> "attempt to concatenate a nil value"

`print(({}).foo.bar)` -> "attempt to index a nil value (field 'foo')"

`table.insert(a, 1)` -> "bad argument #1 to 'insert' (table expected, got nil)"

On the Python side you get:

`None[1]` -> "TypeError: 'NoneType' object is not subscriptable"

`str.split(None, "d")` -> "TypeError: descriptor 'split' for 'str' objects doesn't apply to a 'NoneType' object"

Languages exist in a technical space, but also in a social space. The single largest reason I can think of that Lua is known is because World of Warcraft used it as the UI language, and therefore a critical mass of people had to learn it to do a popular thing. Kinda like how it's irrelevant how good or bad JavaScript is if it's the only way to get dynamic behavior in a web browser.

... and in terms of why Blizzard used it in WoW... One of the strengths of Lua is it's so small that its interpreter is extremely easy to understand and debug, while still having just enough power to make most things one wants to do not super irritating. That's the technical-space aspect that boosted its social-space aspect... It's not so bad to use that nobody wants to touch it, but it's so simple to integrate into an existing complicated system that developers aren't pulling their hair out building the integration layer to whatever they're really writing the system in.

Absolutely right.

The Lua implementation was definitely its strong point, and LuaJIT is reportedly extremely good as well. I say “was” because the circumstances have changed and there are much better alternatives around.

Circa 2000, if you wanted to stick a simple, self-contained language in your app for extensions and scripting, Lua was one of the very few reasonable choices. There are a ton of reasons why you might not want to stick TCL, Guile, or Python inside your application.

So Lua was used not only in the WoW UI but in a bunch of other apps from that era, and gained a lot of social traction that way. Among other things, if you were making a game, Lua let your level designers and UX people write a little bit of code without having to get their hands dirty with C++ (which nobody wants). The language is very off-putting otherwise.

> there are much better alternatives around.

Could you give some examples?

This is gonna be subjective, because it depends on what your priorities are.

The two alternatives at the top of my list are Gravity and Wren. They are both designed for the same general profile that Lua has—a scripting language, safe to use, embeddable, with a small VM (low code size).

- https://github.com/marcobambini/gravity

- https://wren.io/

The language design choices are nice and familiar to people who are used to other existing languages. Lua is a bit radical.

Two other options are AngelScript and Squirrel, which are both a bit older and more mature than Gravity and Wren.

- http://www.angelcode.com/angelscript/

- http://squirrel-lang.org/

Finally, it’s much more feasible these days to embed something like Mono, and Guile has gotten a lot better.

> Lua was one of the very few reasonable choices. There are a ton of reasons why you might not want to stick TCL, Guile, or Python inside your application.

That sounds quite editorial. Can you list some of the "ton of reasons"? Circa 2000, Tcl was already 12 years old, had received a major performance upgrade with 8.4 [0], and was liberally licensed.

[0] https://www.linux.com/news/tcltk-840-release-announcement/

It seems like people have forgotten how bad TCL is, as a language. Everything is a string, except variables, which can be arrays of strings. You want a number? String. You want an object? String. You want a data structure? String.

TCL is a bad choice for almost anything. It is only marginally more usable than using a shell. It seems like the only reason you would use TCL is so you could throw together a GUI in front of a command-line program, or maybe stick a very bare-bones scripting interface inside something else.

I wrote a lot of TCL code in 2015-2017, and it was better than I thought going into it. TCL has first-class lists and maps, and they're not string-based under the hood at all. It's true that TCL was "all string all the time" in the early days, but now it's more like Python - you represent those types as a string, but that's not the internal storage.

I think TCL gets a really bad rap. Once you get past the unusual syntax, it's pretty nice to work in. Almost Python level nice. It supports classes, exceptions, first-class functions, and has some concepts I've never seen in other languages (such as upvar). It's easy to embed, or easy to use standalone. Installation is small and portable. And it feels like if Lisp and Bash had a baby together, conceived with love.

TCL in 2005 might have been a beast. TCL in 2020 is way better than you might expect. And it persists also, with a stable ecosystem as an embedded language in various CAD/EDA tools.

"Luckily", someone created CMake so younger generations get to experience this problem too...

CMake is definitely worse than TCL, having used both.

So, I like Lua as a language, but I think that Janet, 5 years down the road, will be a better fit for a "Lua+Batteries", for people who can stomach parens. It's already well on its way there now.

It takes like 10 minutes to learn Lua's syntax. You can literally stumble through it with the tl;dr at [1]:

    .. is for concatenating strings
    ~= is not equal to
The way it handles closures is beautiful and extremely powerful. Lua is a Scheme in FortranScript's clothing.
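The closure point in one example (my own toy `counter`, showing state captured in an upvalue):

```lua
-- Each call to counter() creates a fresh local n that the returned
-- function closes over: mutable private state with no objects needed.
local function counter()
  local n = 0
  return function()
    n = n + 1
    return n
  end
end

local c1, c2 = counter(), counter()
assert(c1() == 1 and c1() == 2)
assert(c2() == 1)   -- c2 has its own independent n
```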

I like your theory of unpopularity; that statement is always true in some capacity for esoteric things, which are liked partly because they are unpopular. My favorite posts are the ones like yours BUT where the author continues to use it because it is the best tool for the job. That's when you know it is true.

[1] https://learnxinyminutes.com/docs/lua/

I believe there is value in learning Lua, and reading Programming in Lua in particular, early on. Its object and inheritance system is in your face. I feel learning it afforded me a more intuitive understanding of prototypal inheritance systems.
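For anyone who hasn't seen that system: it is typically a few lines of metatable plumbing. This is one common idiom, not the only one; the `Animal` example is made up.

```lua
-- Prototype lookup via the __index metamethod: when a key is missing
-- on the instance, Lua falls back to the prototype table.
local Animal = {}
Animal.__index = Animal

function Animal.new(name)
  return setmetatable({name = name}, Animal)
end

function Animal:speak()        -- ':' syntax passes self implicitly
  return self.name .. " speaks"
end

local a = Animal.new("cat")
assert(a:speak() == "cat speaks")   -- speak found on the prototype
```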

Lua’s object system is in-your-face and in my personal experience very difficult to use. I found it very difficult to reason about basic questions like, “Is my Lua wrapper for a foreign object safe?”

Prototype inheritance is kind of nifty but I would really like an escape hatch to be able to avoid using it, mostly for wrapping foreign objects.

I have mostly the same feelings, though my only experience is with WoW and Payday 2 addons. A big gripe of mine is there's really only one namespace. If something is exported, it goes to the aether and everything can see it, making it very difficult to track where variables are defined and modified.

Aren't Lua modules [1] the solution for namespacing?

[1] https://www.lua.org/manual/2.4/node37.html
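They are, by convention: the usual modern pattern is for a file to keep everything local and return a single table, rather than writing into globals. A sketch, with a hypothetical module name:

```lua
-- mymod.lua (hypothetical file): nothing leaks into the global table.
local M = {}

function M.add(a, b)
  return a + b
end

return M

-- Elsewhere:
--   local mymod = require("mymod")
--   print(mymod.add(1, 2))
```

Nothing enforces this, though, which is why code that predates the convention (like many WoW-era addons) still dumps everything into the shared global environment.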

I only use it to extend Redis. I don't love Lua, but I like what it does inside Redis.

>the 1-indexed arrays

That's like complaining about braces vs begin/end. It's a non-issue: it has been standard in other popular languages, and it's just a convention to get used to.

>the syntax is super weird.

How so? Seems extremely normal...

