I used to do a lot of game dev, and Lua was incredible for testing different interactions/movements etc. at runtime, just to get all the inputs right before implementing them in the game code. It has probably saved me thousands of man-hours in my career. Nowadays I use it in NGINX for high-velocity requests that are just backed by Redis. Lua, LuaJIT, and Redis have made me a lot of money in my career. I'm forever indebted to their incredible creators and community.
I understand that the fact of the matter is that Lua dominates, but I'm curious if there are technical reasons or "right thing/right time" kind of luck, after TCL's star dimmed.
The problem with embedding one language inside another is: how do you make code written in one language call functions and objects defined in the other? For example, when an object is transferred from the scripting language to the host language, how does the garbage collector keep track of it? This isn't easy to do, and in some languages it is quite cumbersome. For instance, in Python the C programmer has to manually increment and decrement the reference counts of the Python objects they interact with.
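To get a concrete sense of the reference counts that C extension authors have to maintain by hand, CPython exposes them from Python itself (a small illustration from the Python side, not the C API the comment refers to):

```python
import sys

x = []
before = sys.getrefcount(x)  # counts x itself plus the temporary argument reference
y = x                        # binding another name bumps the count
after = sys.getrefcount(x)
print(after - before)        # 1 extra reference, held by y
```

In C extension code the same bookkeeping is done explicitly with Py_INCREF/Py_DECREF, and getting it wrong leaks or crashes; this is exactly the burden the embedding-friendly designs of Tcl and Lua try to avoid.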
TCL solves this problem by making everything a string, taking the "everything is a string" mantra farther than any other programming language. Strings are easy to pass from one language to the other.
Lua went the opposite direction. Lua exposes a rich API that lets other languages interact with Lua. The API lets other languages create Lua objects and call Lua functions, as well as send C objects and functions to Lua. Lua is entirely designed around this API and even the core standard library is built using it, ensuring that every Lua feature is accessible to C code as well. One interesting example of this is exception handling. In most languages this is done using try-catch blocks. But that feature isn't easy to use when embedding one language inside another. How would you write try-catch blocks in C? Because of this, Lua has a different way to do exception handling. Instead of try-catch blocks, there is a "pcall" function that invokes a function and returns whether there was an exception or not. Function calls and if-then-else statements are things that are also available to other languages via the Lua API...
local ok, err = pcall(function()
  error("something went wrong")
end)
if not ok then
  -- handle the exception; err holds the error value
end
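For readers more used to try/catch, here is a rough Python analogue of the pcall protocol, returning a status flag instead of unwinding the stack (a sketch of the calling convention, not of how Lua implements it):

```python
def pcall(fn, *args):
    # mimic Lua's pcall: return (ok, result_or_error) instead of raising
    try:
        return True, fn(*args)
    except Exception as e:
        return False, e

ok, err = pcall(lambda: 1 // 0)
print(ok)  # False; err holds the ZeroDivisionError
```

Because the whole mechanism is "call a function, inspect the return values", it maps directly onto a C API, which is the point the comment above is making.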
Lua has real types in a dynamic language; in Tcl everything is a string.
They can both pass functions around, but Tcl's work by "eval"-ing strings all the time, so security is quite tricky. Lua has real higher-order functions.
Lua has lexical scope. In fact, by semantics it’s almost a Scheme dialect (say, R4RS) with extraordinary syntax.
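The lexical-scope point is what gives Lua real closures: an inner function keeps live access to the variables of its enclosing scope, just as in Scheme. Sketched here in Python (which shares the behavior) rather than Lua:

```python
def make_counter():
    # n lives in the enclosing scope; each closure keeps its own copy alive
    n = 0
    def inc():
        nonlocal n
        n += 1
        return n
    return inc

counter = make_counter()
print(counter(), counter(), counter())  # 1 2 3
```

In Tcl, by contrast, a "function value" is just a string handed to eval, with no captured environment to go with it.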
It is small, fast, simple, powerful and easy to embed.
It's also very easy to contain in that the standard library is small and most of that can be removed easily. Isolating Lua's allocation arena and controlling Lua's GC is pretty easy.
When there was no competition in the space of ready-made scripting languages that were easy to embed (except possibly Perl), Tcl had a place, but it is now a relic of the past, and for good reason.
Consider donating some money to them if you aren't already :)
Can you talk more about this? Sounds like some dark magic
The architecture has some nice properties. For example, the buffer not being in-process means (besides an obvious performance hit) that the streaming server can be seamlessly redeployed without losing the buffer.
The Openresty Lua implementation is amazing.
It’s lightning fast.
16,000+ requests per second served from the cache.
I wrote a console-based mail client that was configured and scripted entirely in Lua. That was my solution to the weird not-quite-real configuration options available in mutt. I don't use it any more, and it never became terribly popular, but it was a wonderful fit.
Do you use any library for the binding? I use sol2.
I learned Lua for a game jam - during the course of the game jam itself. Like many, I found it a bit weird at first. 1-indexing feels wonky (you get used to it, though); now that I use Python all day, I would miss syntax shortcuts like list comprehensions going back to Lua.
When I was a kid, I learned C from a little pocket reference book while building little self-motivated projects. I still think this is the best way to learn - have a project in mind, and just enough fundamental examples to get you there.
I threw together the Lua tutorial mostly in one sitting, spending some time to edit, and going back and forth on how to keep the whole thing runnable while still explaining how external modules worked. Shortly after publication, Adam Bard contacted me with his idea for learnxinyminutes.com. I said great idea and ok'd the inclusion of the tutorial on Adam's site. Adam invited others to write their own similar-style tutorials, and his site grew up quickly.
Lua itself is a truly beautiful language, though I understand this is not a popular opinion. It's beautiful because you can learn the entire language, through and through, quickly. Memory and comprehension of the language itself get out of the way, so the only hard problems left are your decisions about what to do with the language. (You ask "How will I?" vs "How can I?") Instead of being feature rich through added features, it's feature rich through good language design. I expect some will disagree with me, but as one comparison point, consider the complete syntax of Lua:
versus, say, Chapters 11-14 of the ECMAScript spec:
It's not apples-to-apples - the ECMAScript spec is spelling things out in detail, while the Lua section is an overview - but it's still a thought-provoking contrast.
Even though I don't think I would start a big project like this in such a minimal language, I can't ignore its benefits. I guess at the end of the day, you have to balance not just how good and secure the code is, but also how quickly you can make changes to it and onboard people to it.
I think most people who actually use Lua come away marveling at its elegance and simplicity. The ones that don’t get stuck at “arrays start at 1?! What a garbage language!” In fact, this is a fairly good way to tell if someone has a well-reasoned opinion about Lua ;) (I will put up Go as a counter-example: “hurr durr no generics” is a meme complaint but it remains one once you actually use the language…)
I'm amazed the language hasn't caught on more. It's perfect for education, as it makes the basics and algorithm design simple to learn. It's already used in some of the most popular games (Roblox, Minecraft mods, WoW...), so there's an established playground for children's creations. Code flows like English (which is bad for advanced users but perfect for kids). I'd say it's better suited for education than Python because it is simpler and more transparent.
The things missing are tooling and documentation. There's only a single popular IDE. The Programming in Lua book is an awesome read (I'd recommend it even to non-Lua developers), but it's not a good language reference. The reference itself is quite useful but sparse on examples. I wish there were a clojuredocs.org for Lua; community-sourced examples are an amazing idea.
PS. Love Lua, TY for spreading the learning.
I have to admit it does have its quirks (Lua lists have a base index of 1 instead of 0), but so far it's been phenomenal. As far as I know, Cloudflare is also leveraging the same stack (OpenResty) for their global CDN.
Disclaimer: I am a Kong contributor.
 - https://github.com/Kong/kong
 - https://luajit.org/
The first few releases of the "Monkey Island" series from LucasArts had a grimy pirate-themed tavern called the "S.C.U.M.M. Bar" (for Script Creation Utility for Maniac Mansion).
When they transitioned to a "3d" experience and switched engines/tooling, the SCUMM Bar was replaced with a tiki-style bar called the "Lua Bar": https://monkeyisland.fandom.com/wiki/Scumm_Bar
-- Indices start at 1 !! SO CRAZY!
I was once implementing some math algorithm whose description had various subscripted variables. Some ran from 0 to whatever, and some ran from 1 to whatever. I was implementing in C++, with each of these variables becoming an array. I didn't want to write var[i-1] all over the place for the ones that ran from 1 in the book--it was a tricky algorithm and for maximum clarity I wanted my code to read as close to the book as possible.
What I did was overload the () operator to make it a 1-based array access operator. I.e., var(i) == var[i-1].
That worked out reasonably, although it probably would have been better just to make the arrays for the 1-based variables one element longer and just ignore var[0]. In other words, if var in the text has subscripts that run from 1 through 10, make the var array in C++ 11 elements long, and just use var[1] through var[10].
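The operator() trick described above translates to other languages too; here is the same idea sketched in Python (hypothetical class name, shown for illustration):

```python
class OneBased:
    """Wraps a sequence so that calls use 1-based indices,
    mirroring the C++ operator() overload described above."""
    def __init__(self, data):
        self._data = list(data)

    def __call__(self, i):
        # shift to the underlying 0-based index
        return self._data[i - 1]

var = OneBased([10, 20, 30])
print(var(1), var(3))  # 10 30
```

The appeal is the same as in the C++ version: code that transcribes a 1-indexed algorithm reads like the book, with the off-by-one shift confined to a single place.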
I think it would be better if the end-points were “low” and “high” rather than “start” and “stop”, so the reverse of `range(0, 5)` would simply be `range(0, 5, -1)`.
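The proposed semantics can be sketched in Python (a hypothetical lohi_range, not the built-in): low and high always name the same interval, and the sign of the step only picks the direction of travel.

```python
def lohi_range(low, high, step=1):
    # hypothetical variant of range(): the endpoints always bound the
    # interval [low, high); a negative step just reverses iteration order
    if step > 0:
        return list(range(low, high, step))
    return list(range(high - 1, low - 1, step))

print(lohi_range(0, 5))      # [0, 1, 2, 3, 4]
print(lohi_range(0, 5, -1))  # [4, 3, 2, 1, 0]
```

With the built-in range, reversing direction forces you to rewrite both endpoints (`range(4, -1, -1)`), which is the asymmetry the comment is objecting to.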
Lua is a small language, and tables are the basic building blocks for arrays, dictionaries, classes, modules, ...
1-based indexing is IMHO a small price to pay for such flexible and efficient syntax, with low cognitive load.
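The "one structure for everything" idea can be roughly pictured with a Python dict holding mixed keys (an analogy only; a Lua table is a single unified type with dedicated array and hash parts):

```python
# rough analogue of a Lua table: one mapping used both as an
# array (integer keys starting at 1) and as a record (string keys)
t = {1: "a", 2: "b", 3: "c", "name": "example"}

print(t[1], t["name"])  # positional and named access on the same object
array_part = [t[i] for i in range(1, 4)]
print(array_part)       # ['a', 'b', 'c']
```

In Lua the same table would also serve as a module namespace or, with a metatable, as a class instance.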
var a : array of integer;
b : array[0..9] of integer;
c : array[1..10] of integer;
writeln(low(a)); // 0
writeln(high(a)); // -1
writeln(low(b)); // 0
writeln(high(b)); // 9
writeln(low(c)); // 1
writeln(high(c)); // 10
I started learning Lua and began working on a game within an hour without needing to reference language documentation much at all. It's a language without many secrets.
Supports debugger and most refactoring features of the IDE
A rare case where readability and consistency may not be the same. (Let's not argue about this, it's subjective. I'm aware of the obvious counter- and counter-counter-arguments, such as syntax highlighting and greppability.)
Why not just settle permanently on double quotes? This is what Go and other languages did and I think it's a good idea.
Well Lua is about 30 years old and Go is, what, 10 years old? I would expect a language that is barely a decade old to improve on past ideas and mistakes.
While having choice of quotes reduces the need for escaping, most languages that provide it also still support escaping. E.g., Ruby has double-quoted strings with escaping, single-quotes without, %Q(...) with, %q(...) without, and two different heredoc forms, with and without escaping (the forms that allow escaping also support interpolation).
For example, double-quotes for literals that interpret escape sequences and do variable interpolation, and single-quotes for “raw” string literals?
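Python draws a similar raw-versus-interpreted distinction, though with string prefixes rather than quote choice (shown for contrast; in Lua itself the two quote styles behave identically):

```python
name = "world"
escaped = "line1\nline2"   # escape sequences interpreted: real newline
raw = r"line1\nline2"      # raw: the backslash and 'n' kept literally
interp = f"hello {name}"   # interpolation is opt-in via the f prefix

print(len(escaped), len(raw))  # 11 12
print(interp)                  # hello world
```

Tying those behaviors to the quote character instead, as the comment suggests, would give the same expressive range without extra prefix syntax.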
I actually like this possibility since it gives you more meaning: e.g., I would always tag localized strings by quoting them differently. It's great!
EDIT: oh never mind, it's just borked? HN hug of death?
Also, seriously? Double is more than enough for integers? No. No. No. I'm not going to attack every single line of this article, so let's just say I was triggered.
My daily use languages are Verilog, Python and Tcl.
Edit: I'll actually backtrack to say that this post does a great job of demonstrating things that are non-obvious.
Lisp's ; is narrowly my favourite, but I suspect that's more because I like lisps than any inherent virtue...
> Also, seriously? Double is more than enough for integers?
Not OP, but for me it's not... "blocky" enough? Especially for inline commands it's maybe just too easy to glimpse over it. But otherwise it's also maybe just plain unfamiliar.
I am sorry for your suffering, btw. I, too, have used Tcl.