A Look at the Design of Lua (acm.org)
316 points by creolabs 3 months ago | 148 comments



I studied at PUC-Rio and I met Roberto Ierusalimschy (the main author of Lua) when he was teaching a course on semantics. I once asked him if I could send a PR about a feature I wanted in the language and he said something that I never forgot: "Yes, but I won't use your code. I love that people send me ideas, but I actually enjoy coding... so I will gladly take your suggestions, though I will write it myself."

He then explained that this "dictatorial" behavior is what allowed him to keep the implementation simple and concise over the years. At the time he boasted that the source code was less than 8K LOC, though it has likely grown in recent versions. (I think it was around version 3.) I'd recommend taking a look at the source code; it's truly a C masterpiece.


I'm currently a PhD candidate at PUC-Rio. For context, there's a 'meet the researchers' round of seminars during the first semesters mainly targeted towards Master's students (so they can choose an advisor).

Prof. Ierusalimschy's was one of the most interesting seminars in my first year. His passion is apparent in the way he talks about Lua's history and current development. Unfortunately, I haven't taken any courses with him.

As a side note, it's nice to see folks from around here on HN.


Lua is a good example of an open source project that is not an "open collaboration", isn't it?


I second that recommendation about the Lua source code. Back in the 00s I built a kiosk management system in Lua, C, and C++. Lua was a delightful language to work with, and I found myself reading through its source code a fair bit, always impressed.


Wow, I’ve never met any faculty with the time to write code for fun.


FWIW Lua is somewhere around 15,000 LOC these days.


I have the feeling that Lua became popular mostly because of its implementation (fast, small, self-contained, portable C code) and not for the merits of the language itself. Certain things are addressed in Lua 5.3, but in general Lua has several flaws:

- Before Lua 5.3 there was no integer type, which makes a lot of things harder than they need to be, unless you only write high-level code.

- Lua trivially departs from standard syntax like ==, !=, zero-based indexes, without good reasons, creating a general feeling of confusion.

- There was an odd idea about what to include by default and what was optional. The lack of bit operations by default leaves the language always in need of a trivial library.

- Tables as a single data structure are not ideal for many tasks. Especially the dualism between tables indexed by sequential numbers and tables acting as maps. It's a lot kinder to your users to give up the purely academic ideal of "just one data structure" and give programmers at least an ordered type and a map type, or at least come up with something better. Lisp's attempt at doing most things with lists was much more successful and clean.

- The stack-based C API is not comfortable to use.

I could continue, to be honest; for instance, by default you can't type an expression in the REPL and see the value returned. However, when the task at hand is to write small scripts for a larger system, the implementation of Lua offers big enough advantages to make it a worthy choice.


> Tables as a single data structure are not ideal for many tasks. Especially the dualism between tables indexed by sequential numbers and tables acting as maps. It's a lot kinder to your users to give up the purely academic ideal of "just one data structure" and give programmers at least an ordered type and a map type, or at least come up with something better. Lisp's attempt at doing most things with lists was much more successful and clean.

I have a different view.

Lua tables are the foundation for just about any conceivable data structure in CS. While you can do a lot with raw tables in Lua, you can also use them to build all kinds of things on top, like B-trees. Or an object system (which is a popular hobby among Lua aficionados).

There's a limit to what can and should be built into any base language. Trying to wedge in more container types can cause a bit of difficulty, in my opinion. Python is the classic example here, where you have tuples using parens, arrays using square brackets, and dictionaries using curly brackets. At some level that is clever, but at another level it is a cause of confusion.


This perceived simplicity of only having a single non-scalar datatype comes at the cost of increased complexity when you have to stuff a square peg into a round hole, i.e., when you approximate other data structures with tables.

My favorite example is arrays in Lua: they are hashtables, but not quite (in the implementation, every hashtable has a separate array region). They look like hashtables, but not quite (there's syntactic sugar to help deal with them). You can use hashtable functions on them, but they don't quite work (which is why there are special functions for when you want to treat a hashtable as an array). Hashtables are great for unordered maps and sets, but they're a crappy replacement for ordered sequences.
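A small sketch (Lua 5.x) of the dualism described above: the same table type serves as both array and map, and the seams show as soon as a "hole" appears.

```lua
-- A table used as an array: syntactic sugar hides the hashtable underneath.
local arr = {"a", "b", "c", "d"}
print(#arr)   --> 4

-- Setting an element to nil creates a "hole"; the length operator's
-- result on a table with holes is undefined (it may return any border).
arr[3] = nil

-- Map-style iteration (pairs) visits keys in no guaranteed order, so
-- arrays need their own iterator, which stops at the first hole:
for i, v in ipairs(arr) do print(i, v) end  -- prints only indices 1 and 2
```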

> Python is the classic example here, where you have tuples using parens, arrays using square brackets, and dictionaries using curly brackets. At some level that is clever, but at another level it is a cause of confusion.

If that confuses you, then the source of the confusion is you not having a clear understanding of the intended semantics of your code or datastructure. Mixing all of that into a single type won't help here.


Yeah, came here to say the same thing.

Tables + metatables mean that pretty much any structure you want is available. Single Inheritance + Interfaces, Multiple Inheritance, Component Based Design are all things I've seen in the wild on shipped products with Lua.

It's a fantastic 'orchestration' language. Coroutines were also invaluable. We used them extensively to script AI behaviors. Our designers (yes, not developers) would write them with "go to X location", yield, "search for players", yield, etc. It was a really compact way to build an FSM that they could understand.


Sorry, but tables are awful in practice. I have to iterate all the elements to get a count? This is especially egregious when I’m using one like a list/array and need to know how far to iterate. Maybe there’s a better way I don’t know?


If you need something quick-and-dirty, you can maintain an extra `["length"]` field on your table to cache that information. If you want something more polished, use a proxy table with custom methods/metamethods mediating access to your array so that the length is properly kept up to date. This isn't terribly different from the situation in C with arrays.
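The proxy-table approach suggested above can be sketched like this (the `new_array` helper and `length` field are hypothetical names, not part of Lua's standard library): metamethods intercept reads and writes so an element count stays accurate without iteration.

```lua
-- Sketch: a proxy table that keeps an O(1) length count via metamethods.
local function new_array()
  local data, n = {}, 0
  local proxy = {}
  setmetatable(proxy, {
    -- Reads: expose the cached count as a "length" field.
    __index = function(_, k)
      if k == "length" then return n end
      return data[k]
    end,
    -- Writes: the proxy itself stays empty, so every assignment comes
    -- through here and we can maintain the count.
    __newindex = function(_, k, v)
      if type(k) == "number" then
        if data[k] == nil and v ~= nil then n = n + 1
        elseif data[k] ~= nil and v == nil then n = n - 1 end
      end
      data[k] = v
    end,
  })
  return proxy
end

local a = new_array()
a[1], a[2], a[3] = "x", "y", "z"
print(a.length)  --> 3
a[3] = nil
print(a.length)  --> 2
```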

Tables are a basic mechanism for implementing any data structure. That is a qualitatively different statement from saying that tables can replace any data structure. If you need semantics different from what tables immediately provide, you'll need to implement the desired data structure yourself -- but Lua gives you good tools to do that.

> Maybe there’s a better way I don’t know?

You can use `for i, v in ipairs(arr) do ... end` to iterate over an array without worrying about "how far" to iterate. The `ipairs` iterator will determine when it hits the end of the array.


Yeah, this is fine until I want the last element of an array. Then I have to iterate it to find it. And writing a loop to get the last element of an array is crazy-town.


Have you actually used Lua? You don't need to iterate to get the last element of an array.


I have! I’m not an expert at it, though.


There's the length operator (https://www.lua.org/manual/5.3/manual.html#3.4.7): #sometable returns the length of sometable if it is a sequence, with an upper bound of O(log N).
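So getting the last element of an array needs no loop at all:

```lua
local t = {"first", "second", "third"}
print(t[#t])  --> third  (last element, no iteration needed)
print(#t)     --> 3
```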


I don't think that's a completely fair assessment. It's true that the language has some gratuitous-feeling warts (not being zero-based is a very severe flaw when embedding into C/C++ apps is a primary use case; it also messes up LuaJIT's and Terra's UX no end; global-by-default is another problem, albeit one that can be mitigated with tooling). But what other language packs so much punch for so little weight? Despite being quite simple, the syntax is still expressive enough for a lot of DSL uses. Lua has proper tail recursion and an elegant coroutine design, and its slightly idiosyncratic LPeg library is worlds better than the regexp offerings of most languages. It has serious meta-programming power based on a much simpler model than e.g. Python or Ruby.

Despite its warts lua still feels like a better (imperative, statement-based) scheme than scheme to me. What scheme offers a similar combination of speed and expressiveness for the same conceptual and file-size weight as lua(jit)?

Wrt the REPL: the torch7 REPL is quite serviceable, so it's worth giving it a try even if you're not into deep learning.


I agree that the value/codebase-size ratio is great; it's why I chose it for Redis, for instance. But most of the things I expressed in my comment were not competing factors, since addressing them would have enlarged the code base only slightly. In fact the biggest addition, the integer type, was added in 5.3.


I'm not disagreeing with your assessment that lua has serious warts (that have basically no upside, unlike say no "true" integer type, which has pros and cons). I'm just saying your portrayal is a bit lopsided, there are various aspects in which lua is a more elegant design than other dynamically typed languages like python, ruby, perl, scheme etc. I.e. small footprint and good embedding story are not the only merits of the language.


I have trouble recognizing any significant part of Lua that is better designed than Ruby, from the POV of the language itself, setting implementations aside. But this is just a personal POV, so we could argue forever. I'd also bet that a very good subset of Ruby could be implemented with a code base of similar size to Lua's (though probably not at the same speed).


> a very good subset of Ruby could be implemented with a similar-sized code base than Lua

That would have to be quite the subset. The Ruby parser alone is almost 12kLOC. I've implemented entire languages in 1/10th that.

https://github.com/ruby/ruby/blob/trunk/parse.y


mruby (http://mruby.org/, https://github.com/mruby/mruby) is the official attempt to do just that. Depending on what you count it's around 30-60kLOC. It has a standalone bytecode compiler, so the parser (which is "only" 6kLOC) doesn't have to be a part of the runtime.


> Wrt to repl: the torch7 repl is quite serviceable, so it's worth giving it a try

Sounds great, but:

    $ luarocks install trepl
    Error: No results matching query were found.
Also the torch7 repo[1] mentions:

> Torch is not in active developement

Is there a maintained torch7 repl somewhere?

[1] https://github.com/torch/torch7


What other language: AutoHotkey.


AutoHotkey looks nice but is Windows-only.

Maybe Pawn or NekoVM.

https://nekovm.org/doc/lua/

https://github.com/compuphase/pawnv

Also, everybody loves benchmarks: https://github.com/r-lyeh-archived/scriptorium


Fair points overall, except:

> The stack-based C API is not comfortable to use.

What is so uncomfortable about it? I think it's absolutely fantastic. There's nothing you can't do with it, and if you really want a heavyweight "magic" binding library you can do that on top of it. But you'll find even then you occasionally need to fall back to the "raw" API.


The problem I have with the Lua stack API is that you have to do all the mental gymnastics about the current stack layout. And consider that I'm a fan of FORTH, Joy, and other stack-based languages... But when you have N parameters on the stack, you call a function from Lua, and then you have a different layout based on what that function yields, and so forth, you basically need to comment the state of the stack in order to be able to quickly understand what's going on six months later. An example is the Redis scripting.c code where we call a Lua function to sort the Lua array result.

Another problem I have with the stack API is that it's insecure by default. If you push too many things, the result is a C stack overflow. I'm not sure how much performance this buys; the inability to grow the stack automatically looks like an optimization concern... But at least panic when we reach the max stack size, instead of smashing the C stack.


I believe the Lua creators' claim on this is not that it's necessarily a good API, but rather that it's "just" a better one than any other they know of :) (Or at least knew of at the time when they created Lua. But I think if they learnt of a better one in the meantime, there's a high probability they would try it, given that they're not afraid to break backwards compatibility when they see value in a change.)


For Hammerspoon, we ended up building a fairly large abstraction layer that takes care of all of the repetitive/fragile stack operations for us, and it's improved our productivity significantly. Wiring up new Objective-C objects to Lua is now very simple, but we can still fall back to the Lua C API when we need it.

There are still quite a few places in the code though, where some complex stack operation is happening and each line of C/ObjC is surrounded by a lot of comments that describe the expected state of the stack at each step, and there have been a lot of situations where I've ended up having to sketch the stack on paper as I'm reading code, to figure out why something isn't working.

So, I think I'd say that the C API is not comfortable to use, but it's also not very complicated. It's somehow managed to be simple, but hard, if that makes any sense.

Edit: If anyone cares, our Lua abstraction is all written in ObjC and is called LuaSkin. It lives in the Hammerspoon git repo, but doesn't actually depend on Hammerspoon at all, so could be extracted and used by others (it's MIT licensed). The API docs are: http://www.hammerspoon.org/docs/LuaSkin/Classes/LuaSkin/inde...


I used Lua's C API a couple of years ago to write some lightweight JNI code to use Torch from Java. The API documentation was so clear that it was relatively easy to write. Since the API is stack-based the project ended up being a good learning experience. The only thing that was a little annoying was needing a few if/else statements to return the desired type.


> Lua trivially departs from standard syntax like ==, !=, zero-based indexes, without good reasons

Having taught programming to many people outside of the programming field, one of the biggest sticking points I have found they face early on is calling the 1st thing the 0th thing, instead of just the 1st thing which is what they've been doing their whole lives. This leaves them with the general feeling that programming as a practice is nonsensical and arbitrary (and they're not wrong). It can turn off a lot of people as they grow frustrated by constant off-by-one errors, which tooling doesn't do much to catch.

If Lua wants to be a language for non programmers, 1 based indexing is a good choice imo.


I think the tension here is that Lua also wants to be a low-overhead-implementation scripting language to interface with the innards of binary applications, much like its role in Wireshark. And for that role, 1-based indexing was a disastrously bad choice.


Those people should not be dealing with indexing into arrays then. But frankly, I think your point is just as invalid. You call the ground floor the ground floor in English, whereas many other languages call it the first floor. So it isn't as unnatural.


> I think your point is just as invalid.

I'm just telling you what hundreds of people have told me. They expect the first thing to be #1. Their words. I don't see how you can call their own experiences invalid.

Your dismissive attitude is another reason why so many non programmers have trouble learning programming.


My dismissive attitude is due to me being professional enough to tell someone to either get some proper training or stop trying to do something they are not good at in the first place.

>> "They expect the first thing to be #1".

I expect the laws to be easily understandable by a layman. Should I become a law-making advisor, then?


I'm sorry to say, but I don't believe a dismissive attitude can ever be described as professional.

I believe everyone can be taught to program, and the choice of language, semantics, and syntax has a profound effect on how far people can get, and what frustrations they face.

There are people out there with 0 formal training who run entire businesses on Excel, the most widely used programming language bar none (it's notable that in Excel, rows start at 1 and not 0. There is a reason for this). Ask a 7 year old to use C and they're not going to get very far. Give a 7 year old Logo, and they'll be writing programs with very little instruction, with results I've seen college freshmen struggle with.

I teach a summer robotics program to middle schoolers. We used to teach it in C++ because that's what the SDK came written in. In this mode, we spent most of the time getting them to think like the compiler, teaching them about memory layout, allocation, compiling, headers, preprocessors, etc. because they constantly ran into frustrations due to the design choices of C++. They never left the session with a firm understanding and confidence around programming because they spent all their time trying to build a model from scratch in their head without any relation to their own world.

Then we switched to Matlab. With one uniform data structure, a REPL, 1-based indexing, etc. they were much more comfortable, and they were able to make the robots do amazing things for their age. The most impressive thing I've seen is making a robot choir through writing a distributed protocol to synchronize the robots' notes. They were able to do this because the language, Matlab, got out of their way, which allowed them to focus on the task and relate it back to something they knew very well: music.

All I'm saying is this attitude of "Oh, you don't understand this thing we've built and these arbitrary limitations frustrate you, therefore you shouldn't even try it in the first place" is just toxic, given the evidence I've seen that people can learn and do amazing things if we give them a fighting chance.

Required reading on this subject: https://www.amazon.com/Mindstorms-Children-Computers-Powerfu...


I can see where you are coming from. Just my $0.02: these issues don't bother me much in my day-to-day usage over many years (though this is with LuaJIT/RaptorJIT, so YMMV).

The default numeric type (double float) can precisely represent integers up to 2^53. That's ample for most of my purposes. You can even fit x86-64 pointers into them.
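The 2^53 boundary is easy to demonstrate: IEEE-754 doubles represent every integer exactly up to that point, and lose exactness beyond it.

```lua
-- Below 2^53, integer arithmetic in doubles is exact:
print(2^52 == 2^52 + 1)   --> false  (2^52 + 1 is representable)

-- At 2^53, adding 1 rounds back to the same double:
print(2^53 == 2^53 + 1)   --> true   (precision exhausted)
```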

Syntax doesn't match C but that doesn't bother me because I'm used to other languages already e.g. Erlang that uses different operators and one-based indexing.

Bit operations are always there in LuaJIT.

The C API does seem awkward to me too, but it's superseded by the lovely FFI in LuaJIT.


That is very one-sided. First, Lua was designed to be embedded, so small size was a priority, not every imaginable feature. Second, the fact is that the language has always run faster than all the "favorite" scripting languages. It was also much better suited for JIT compilation. The minimalism demonstrated obvious advantages over the competition.


> I've the feeling that Lua became popular mostly based on its implementation, fast, small and self-contained portable C code, and not for the merits of the language itself.

And this is quite fair. Those are fundamental qualities of a new programming language.


It's not very popular, but Wren seems like a better alternative if you'd like a straightforward object-oriented language with a small C implementation.


> Before Lua 5.3 there was no integer type, this makes a lot of things harder than needed in programming, unless you just write high level code.

Wasn't the idea to only use lua as a high level scripting language, and drop down into C(++) if you needed to go low level?


Exceedingly minor correction: Lua does use == in the way you'd expect it to.


Lua is a big reason why I'm interested in both systems and PL today. I started using Lua to build game mods in high school, and I was struck by how simple it was compared with Java, or even Python that I had tried learning earlier. The all-encompassing "everything is a table" paradigm really sparked my interest in the design of systems of computation, and how a broad range of programing patterns could be expressed with a few primitives.


You might find Isle interesting (https://github.com/gvx/isle). It's a small programming language I designed and implemented a couple of years ago, heavily inspired by Lua's "everything is a table". In it, a function call looks like a name concatenated with a table literal, and calling a function actually consists of creating that table and using it as the environment for the function.

I still think it contains some neat ideas and concepts, even though I would never actually use it for something practical.


Stay away from Lua and use Python, Go, Node, Rust or even C in new projects. This is going to be a lengthy rant for a comment; apologies in advance.

Having written a ton of networking code in Lua, I can tell you it has a lot of design defects that most other programming languages have avoided. In particular, the deeply embedded one-based indexing and the lack of separate integer and character types (chars are strings, ints are floats) make it wearying to write any code that has to parse binary data. Even bit manipulation requires going through the FFI. Considering that Lua is often touted as the optimal lightweight solution in small embedded systems, these features are must-haves.

The builtin string pattern matching library attempts to mimic regular expressions, but bites the user in the leg every time they attempt to use it for anything remotely regex-y. Unfortunately most usual string manipulation seems to revolve around them and the string.gsub function.
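To make the pattern/regex gap concrete, here is a sketch: Lua patterns cover simple cases well, but they use `%` classes instead of `\` and have no alternation (`|`) or bounded repetition (`{m,n}`), which is where most regex habits break down.

```lua
-- Lua patterns handle simple substitutions fine:
local s = "order #42, item #7"
local r = s:gsub("#(%d+)", "no. %1")
print(r)  --> order no. 42, item no. 7

-- But anything needing alternation, e.g. the regex /cat|dog/, cannot be
-- written as a single Lua pattern; you need multiple matches or LPeg.
```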

The inconvenient string library is also largely irrelevant. The developers call their language eight-bit transparent, which in practice means the programmer needs a separate Unicode library, rendering the whole standard string library mostly useless. One cannot even format UTF-8 text, because the string library doesn't differentiate between characters and bytes.

The official documentation mostly revolves around C bindings. The actual Lua language documentation is provided by another web site that looks like it has been written by an attention-deficit child. Instead of getting to the point, the articles often first provide a list of ways on how not to implement things, or they are actually benchmarks of fifteen different ways to call the standard library to get the same effect. Sometimes, the provided code does not even work – for example, the given XML parser code crashes when it hits one of the error paths.

The small runtime and fast VM do not really matter when the language itself is lacking and requires a metric ton of external libraries to achieve even the simplest things that are possible in standard C and Rust, which provide the same level of performance and better type safety.


These are quite unfair arguments against Lua, especially since you seem to be using it for low level networking code and on embedded systems, which are not really Lua's strong points. (Lua is designed to be embedded, i.e. as a scripting interface to C and other compiled languages.)

One based indexing is just a design choice, while it complicates interoperation, I find it justified as Lua tables are actually hash tables, so it would be weird to model it upon memory offsets. It reminds me that I am only dealing with a hashmap with integers as keys.

Bitwise and UTF-8 string operations are lacking out of the box, but these are arguably too specialized to be expected in a general scripting language. The patterns are a little underpowered compared to other languages, but they're still quite useful. I suggest looking into LPeg for more complicated operations. Lua 5.3 does have some functions to deal with UTF-8 codepoints.

Also, regarding Unicode, I'd rather the PL provide only operations on bytes and let me decide how "characters" are to be represented. Nearly all PLs providing Unicode facilities fail in different ways; just let me deal with ICU myself if I really need Unicode. And it is a royal pain to deal with different encodings in a language that has special "character" and "string" constructs.

I don't understand the complaint about the documentation. It is some of the most practical and clear technical writing I have seen, if a little terse. Perhaps you mistook the wiki for the main documentation? (The Lua language manual is a single-page HTML document.)


> These are quite unfair arguments against Lua, especially since you seem to be using it for low level networking code and on embedded systems, which are not really Lua's strong points. (Lua is designed to be embedded, i.e. as a scripting interface to C and other compiled languages.)

If this is true, and I actually agree with you, then where is Lua actually better than an alternative?

And, while I don't share the specific points of the parent poster, I experienced similar issues with Lua. I found that while the Lua core language itself is small, I had to compile in so much extra stuff to make it useful that it wasn't actually small anymore.

And, once you get large enough that you start demanding big ARM Cortex M4 microcontrollers, Python is now in your scope.


Running untrusted code to allow modding of games. Suddenly, having a standard library or a complex language with a large number of features is no longer a benefit but a liability, often to the point that you disable the standard library in Lua entirely, selectively load only parts of it, and write your own modding API to allow modification of the game.

There are entire games built around modding via Lua, such as Garry's Mod, which, similar to a web browser, has to download random code from a server. Then there are mods like ComputerCraft/OpenComputers for Minecraft, in which direct file access to the host doesn't make any sense at all because every in-game computer has its own filesystem.

I just looked up Python's sandboxing capabilities and I can't find anything. All I see are answers like [0] suggesting that it's not trivial.

What about other implementations like pypy that claim to have a sandbox? [1] I can see lots of warnings like "This is not actively maintained."

The lack of mature sandboxing for python makes it impossible to use for the usecases I've described above.

[0] https://stackoverflow.com/questions/3068139/how-can-i-sandbo... [1] http://doc.pypy.org/en/latest/sandbox.html


> I found that while the Lua core language itself is small, I had to compile in so much extra stuff to make it useful that it wasn't actually small anymore.

I think the idea is that most functionality is meant to be provided by the host program. If you want a standalone scripting language, then Lua probably isn't for you.


> One based indexing is just a design choice, while it complicates interoperation, I find it justified as Lua tables are actually hash tables, so it would be weird to model it upon memory offsets. It reminds me that I am only dealing with a hashmap with integers as keys.

One-based indexing isn't just about memory offsets or getting used to it; it makes lots of things a nightmare. Like index math that involves the modulo operator, which produces zero values, which are invalid for 1-based indexing. Or turning a 2D or 3D index into a 1D index, which now requires completely unnecessary additions or subtractions of 1. And other index magic that I frequently need for computer graphics and number crunching.
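The modulo complaint can be shown in two lines: with 1-based indexing, every wrap-around needs a shift out to 0-based, a modulo, and a shift back in.

```lua
-- Wrapping an index around a buffer of size n.
-- 0-based languages: i = (i + 1) % n.  With 1-based indexing:
local n = 5
local function wrap(i) return ((i - 1) % n) + 1 end

print(wrap(5))  --> 5
print(wrap(6))  --> 1  (wraps around; a plain i % n would yield the invalid 0)

-- Flattening a 2D index (row r, column c, width w) needs the same dance:
--   0-based: idx = r * w + c
--   1-based: idx = (r - 1) * w + (c - 1) + 1
```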


>These are quite unfair arguments against Lua, especially since that you seem to be using it for low level networking code and on embedded systems, which are not really Lua's strong points.

Why is Lua an oft-recommended programming language for embedded development if one is not allowed to form opinions on the language?

>so it would be weird to model it upon memory offsets

Why does the language provide table constructors {1, 2, 3} that do one-based indexing then? Why is there a table length operator (#) that works on one-based indexes? Why does ipairs work with one-based indexes? A sequential data type (array, list, for some quirky reason named "tables" in lua) is one of the most basic data structures of a programming language. If Lua is supposed to be a scripting language, why should a programmer care about the underlying implementation?

>Bitwise and UTF-8 string operations are lacking out of the box, but these are arguably too specialized to be expected in a general scripting language.

Is Lua a toy or a tool?

>Lua 5.3 do have some functions to deal with UTF-8 codepoints

I welcome you to the real world, where 5.1 is still the latest release available on too many platforms.

>Perhaps you mistook the wiki as the main document?

Lua-users.org turns up as the first result when you google for documentation.


> Why is Lua an oft-recommended programming language for embedded development if one is not allowed to form opinions on the language?

I have never heard anyone recommend Lua for embedded development. It's an embeddable language in the sense that you can easily embed the entire interpreter in a larger application and extend it. That it happens to be relatively small and easy on memory is a bonus, not a reason to believe that it's a great systems programming language.

I'm not at all surprised that it disappointed you as a language for low level networking code. Most Lua success stories are about its use as a high level glue/scripting/configuration language, letting the application handle the low level code via an API.


What you're saying is true, but in an embedded system where you're already using Lua as a "high level glue/scripting/configuration language, letting the application handle the low level code via an API", it's not a stretch to then consider using it for scripts and more on the device itself.

The runtime is small and, all things considered, very fast. While I never got to a place where I was writing low-level networking code with it, when I was still in embedded I did use it for a number of tasks it was perhaps not quite perfectly suited for, because of its characteristics. I'm no longer in embedded, but I can easily imagine having pushed that idea further with projects like LuaJIT showing up in more recent years.


Yes, I'm definitely not saying that it can't or shouldn't be used for embedded applications, memory and speed constraints permitting, just that I've never heard anyone recommend it as being more useful for that purpose than languages that explicitly target that niche. I also think the GP confused "embedded language" with "a language particularly suited for embedded systems development", rather than having actually heard it touted as such.

The GP obviously ran into some obstacles using Lua 5.1 for systems programming, though, and my point is that his experience probably doesn't reflect the overall utility of the language as much as it presents a case of "the wrong tool for the job".


Your complaints are mostly addressed in Lua 5.3 (64-bit integers, utf8 in the stdlib, bitwise ops), and many would suggest you use LPeg if you're itching for non-trivial string manipulation.

Also the documentation provided by lua.org is quite accessible: https://www.lua.org/manual/5.3

What I'd really like is for the stdlib to provide a Buffer type, as opposed to having to use a table of integers which then gets pushed into a string with chunked calls to string.char(table.unpack(buf, n, n+4096))

Example of what I'm complaining about: https://github.com/serprex/luwa/blob/master/rt/make.lua#L758
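For reference, the chunked pattern being complained about looks something like this (a minimal sketch; the 4096 chunk size follows the comment above):

```lua
-- Converting a table of byte values into a string with chunked
-- string.char calls (Lua 5.3). string.char takes its arguments on the
-- C stack, so large buffers have to be converted a chunk at a time.
local function bytes_to_string(buf)
  local parts = {}
  for n = 1, #buf, 4096 do
    parts[#parts + 1] = string.char(table.unpack(buf, n, math.min(n + 4095, #buf)))
  end
  return table.concat(parts)
end

assert(bytes_to_string({72, 105}) == "Hi")
```

A dedicated Buffer type in the stdlib would make both the chunking and the intermediate table unnecessary.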

So, all said, I don't disagree with you: Lua can fall short, it doesn't have the community other languages have built up, and the community it has is rather fractured. For example, LuaJIT makes 5.1 what most people call Lua; and because Lua is an extension language, it's hard to upgrade without losing compatibility, so projects like Redis can't upgrade without deprecating their current version in favor of a 5.3 version, providing some method to signal 5.3, and eventually removing the old one while packaging both in the interim.


We do serious networking in Lua, but it's on top of C socket code (implemented using IOCP or epoll) that transparently serializes, compresses and encrypts Lua tuples to bytes on one end, and decrypts, decompresses and deserializes bytes to Lua tuples on the other. This is the backend of a real time game (multiple messages from each player per second), and we handle on the order of 10k connections per core.

The "actual Lua language documentation" you refer to probably is the wiki? I agree it is crap.

By all means, stay away from Lua and use Python and Node for new projects; we like that competitive advantage.


> This is the backend of a real time game (multiple messages from each player per second), and we handle on the order of 10k connections per core.

I pretty much agree that Lua can survive such throughput, but I question whether Lua can survive the code complexity. I have written about Lua previously at [1] with the vocal conclusion that Lua cannot really handle a large code base (on the order of 10^5 lines), so I'd like to hear otherwise (hint: I haven't). That's why...

> By all means, stay away from Lua and use Python and Node for new projects; [...]

...we have ultimately dropped (a heavy duty use of) Lua from all new projects. C# seems to suit our needs (or any other statically & strongly typed languages---obviously excluding C/C++---would probably do).

[1] https://news.ycombinator.com/item?id=13902023


> I have written about Lua previously at [1] with the vocal conclusion that Lua cannot really handle a large code base (on the order of 10^5 lines), so I'd like to hear otherwise (hint: I haven't). That's why...

We have shipped and maintain a series of games built on progressive iterations of the same codebase; we have on the order of 200 KLOCs of Lua in the common "engine" code and tooling, and 200-400 KLOCs additional game-specific code per game. We tend to hit our milestones and performance targets on six platforms, with programmer quality of life way above the game industry "standards" (i.e. we do normal working hours).

Can things be better? Of course, they always can. But in our experience, large-scale use of Lua is not the disaster it is usually assumed/predicted to be.


Thank you for sharing. But I'm not sure your experience isn't an outlier; after all, some people do produce great and large software in every programming language, including annoying ones ;-) I claim that my experience is more typical because many of the annoyances pointed back to Lua's core decisions.

For context, we had about 100 and 200 KLOCs of Lua code for the server and client respectively. We had to use Lua 5.1 because there was not much tooling or library support around 5.2 or later at that time (pain point #1). We of course needed 64-bit integers and stuck with a special string representation via some metatable hack, which itself was quite usable. There were several, often critical, bugs that had been fixed in 5.2+ but were left unfixed in 5.1, even though everyone knows that Lua 5.1 won't die out anytime soon (pain point #2): the most critical one for us was the use of the thread-unsafe `localtime`, which made my jaw drop. Also there was no clear migration path from 5.1 (AFAIK this holds for 5.2 as well) to later versions (pain point #3); we had built many systems around Lua's dynamism to deal with the complexity and, as you can guess, every minor release of Lua breaks enough of that dynamism to prevent easy migration.

Besides this, every single library we tried to use had a critical issue: luasocket plus luasec spectacularly failed at correctly handling transport timeouts (pain point #4; we did use ASIO as a backend and only needed them for some external services), luasocket alone had a very annoying and easy-to-mess-up API for all but the simplest HTTP requests (pain point #5; eventually we had to shell out to curl, which was much, much better than any Lua-based solution), dkjson had no control over empty arrays vs. empty objects, which caused some interop problems; and so on and so on (I can't remember all of them). Ah, of course there are also shitty wikis, but I dare not describe them. We eventually learned to avoid any external library or even snippet. I feel that Lua actively encourages a DIY mindset, which is absurd for modern large-scale programming.

Have I mentioned that we used Lua's dynamism to deal with the complexity? Well, we could only try. We built our own unit testing system because, well, obviously Lua is built for embedding and standalone unit testing does not make sense when you have a custom environment, right? This one is acceptable, but the lack of coverage measurement is not (pain point #6). I built a custom one, but it was waaaay too slow to use all the time (and I'm pretty sure the right debugging API would immensely change the situation, but there is still none as of 5.3). Debugging was also abysmal (pain point #7); again, Lua is built for embedding and a standalone debugging experience does not make sense, right? We already knew and used MobDebug [1], which kinda worked but was brittle. We eventually built a serious debugging tool (open-sourced!) that integrates with Visual Studio Code via the Language Server Protocol. The rest of the story goes much the same: there is no, or only partial, tooling around the embedded context (supposedly the biggest use case of Lua), and we wasted too much time building our own, including a type checker (!!!!) [2].

I do agree that Lua has some upsides as well. In fact, Lua greatly helped us to quickly build a functioning product from scratch, and the ability to safely discard the whole state massively simplified the error recovery process (alluding to Erlang's approach). I really, really want to like Lua because it seems to be the only viable programming language with those characteristics. But my own experience says otherwise.

[1] https://github.com/pkulchenko/MobDebug

[2] https://github.com/devcat-studio/kailua


Thank you.

It seems the major difference is that we are fiercely NIH, or as you put it a bit more mildly, DIY. We very quickly switched to our own socket implementation, we built our own IDE based on Lua's RemDebug and Scintilla (this was way before Visual Studio Code was a thing), etc. I wouldn't attempt to use a "Lua JSON library"; we'd wrap an existing C JSON library and have it convert to/from Lua structures. At our scale and business model - lots of reuse across products, small programmer turnover - this is quite OK; I don't feel like it's wasted work when I do something like that, it's investing in the future.


> The official documentation mostly revolves around C bindings. The actual Lua language documentation is provided by another web site that looks like it has been written by an attention-deficit child.

This is such an unfair point. Lua has a single manual that covers the language, the standard library, and yes, it also covers C FFI (unsurprisingly for an extensibility / scripting language), but FFI actually comes after the language in the docs. And the manual is both very readable and very thorough where it talks about the language - it's quite easy to learn the whole thing just from it alone.


Are you criticizing LuaJIT rather than Lua? LuaJIT is a third party implementation of an older version of Lua (with cherry picked features from newer versions). A lot of this criticism is no longer valid for Lua.

> lack of a separate integer type (chars are strings, ints are floats)

Lua 5.3 introduced integer types. [1]

> Even bit manipulation requires going through the FFI.

LuaJIT includes a bit manipulation library, but its FFI acceleration is optional. Also, Lua 5.3 introduced bitwise operators. [1]

> One cannot even format UTF-8 text

Lua 5.3 introduced "basic utf-8 support" [1] although I am not sure to what extent.

> The official documentation mostly revolves around C bindings. The actual Lua language documentation is provided by another web site that looks like it has been written by an attention-deficit child

Again, I suspect you are criticizing LuaJIT. The actual Lua documentation [2] has never let me down.

[1] http://www.lua.org/manual/5.3/readme.html#changes

[2] http://www.lua.org/manual/5.3/manual.html


Well, all the criticism is valid if the parent did any Lua programming in the last 15+ years.

5.3, where all of this is solved, is the most recent stable version -- not what most people have used for the entire history of Lua (and not the default in all distros).


>Again, I suspect you are criticizing LuaJIT.

Nope, plain old Lua 5.1, which is still the newest you can get on many platforms. You're stuck with what you have, not what was released yesterday.

>The actual Lua documentation has never let me down.

I read the document you linked and found the documentation lacking on anything but basic syntax and C bindings (the chapter on the C API comes before the Lua standard library, btw). The standard library function documentation underspecifies parameter types, as well as the types and number of return values.

I find the Lua Short Reference [1] much more accessible than the official documentation. It has the very same information in a much more readable format.

>Lua 5.3 introduced "basic utf-8 support"

Looking at the documentation you linked, the Unicode support does not integrate with the string library. Walking a UTF-8 string manually character by character has never been difficult, and Lua 5.3's support doesn't seem to address any other use cases.

[1] http://lua-users.org/files/wiki_insecure/users/thomasl/luare...
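A sketch of the manual walk being alluded to, assuming valid UTF-8 input (lead-byte inspection only; Lua 5.3's utf8.len does the same plus validation):

```lua
-- Counting UTF-8 code points by decoding lead bytes only.
-- Assumes the input is valid UTF-8 (no validation, unlike utf8.len).
local function utf8_count(s)
  local n, i = 0, 1
  while i <= #s do
    local b = s:byte(i)
    if     b < 0x80 then i = i + 1   -- ASCII
    elseif b < 0xE0 then i = i + 2   -- 2-byte sequence
    elseif b < 0xF0 then i = i + 3   -- 3-byte sequence
    else                 i = i + 4   -- 4-byte sequence
    end
    n = n + 1
  end
  return n
end

assert(utf8_count("héllo") == 5)   -- 6 bytes, 5 code points
```

What this can't do, and what 5.3 doesn't help with either, is the string-library side: case mapping, normalization, or utf8-aware string.format.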


It amazes me how many tech people can't compile projects on their systems. Lua is crazy easy to compile on any system.


You compiled it, you have to support it.

Most devs aren't interested in the additional burden of maintaining and crawling through the code when an edge case pops up.


You are implying that the problems you were facing would be a problem for every potential Lua user. This is certainly not the case. The biggest use of Lua is scripting for games, and none of the shortcomings you mentioned are relevant in that context.

Other than that, as others pointed out, most of your complaints are addressed in 5.3.


> Especially the deeply embedded one-based indexing

That's already reason enough for me not to use it. One based indexing caused me so many headaches in the past in Mathematica...


One-based indexing doesn’t really make sense if you’re not using pointer+offset array indexing. So the only problem I see here is that you can’t directly port code that deals with arrays over from different languages, which is what you should be doing anyways because tables are not arrays.


One-based indexing isn't just about getting used to it; it makes lots of things a nightmare. Like index math that involves the modulo operator, which will produce zero values, which are invalid for 1-based indexing. Or turning a 2D or 3D index into a 1D index, which now requires completely unnecessary additions or subtractions of 1. And other index magic that I frequently need for computer graphics and number crunching.
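To make the modulo complaint concrete, here is the usual shift-wrap-shift dance in Lua (a small illustration, not from the comment):

```lua
local t = { "a", "b", "c" }
local n = #t

-- Naive: i % n yields 0 whenever i is a multiple of n, and t[0] is nil
-- in a 1-based table. The standard fix shifts to 0-based, wraps, then
-- shifts back:
local function wrap(i)
  return ((i - 1) % n) + 1
end

assert(wrap(1) == 1)
assert(wrap(3) == 3)
assert(wrap(4) == 1)   -- wraps around instead of hitting t[0]
```

With 0-based indexing the body would just be `i % n`.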


I wholeheartedly agree.

As a matter of fact, I had the exact same argument in a Julia thread (modulos don't work, polynomial accesses don't work).

It honked off a whole bunch of people who've never had to do any of these things for a living and therefore considered them edge cases.

Additionally, for Lua, being one-based while claiming to be the scripting language for C interoperability is quite preposterous.


>As a matter of fact, I had the exact same argument in a Julia thread (modulos don't work, polynomial accesses don't work).

In the same ways that simple for loops "don't work" because we need to e.g. check if i < length - 1 to know if we're at the last element...


Which is easy and legible if you do it. With modulo and 1 based indices, it gets messy.


Yeah why on earth did they do this? It's such a pain. I can't think of a reason why it would be a good idea, except perhaps "non-programmers don't expect to count from zero" but surely it's easier to overcome that than overcome all the little problems that stem from starting indices from 1.


When Lua was originally created its primary audience were engineers used to FORTRAN's 1-based indexing.


> but surely it's easier to overcome that than overcome all the little problems that stem from starting indices from 1.

It's not. You're asking new learners to overcome a great deal of intuition when already attempting to learn a mind-bending thing like programming. I've taught C++ and Matlab side by side to freshmen in university. They learn C++ first and encounter all kinds of off-by-one errors the compiler doesn't help them catch. When they get to Matlab, the universal sentiment I've found is "wow, this makes so much more sense"

For new learners, 0 indexing seems like such a pointless frustration with an obvious solution. It can be a needless roadblock for people otherwise excited to learn.


An underappreciated facet of matlab that makes it mistake-resistant: by default, all parameters are passed by value. Thus, all functions are pure / referentially transparent at their interface, even if mutation is used internally. This combination is the easiest for beginners to learn, IMO.


I'm no fan of 1-based indexing either, but for these use cases it's not that hard to add a function to adjust the indexing for you. For example: a mod1() function that will adjust mod to 1-based indexing, and then using mod1 instead of %.

Same goes for indexing, you could write some functions like idx2(x, y, width, height), idx3(x, y, z, width, height, depth) that would compute the 1d index for you instead of using y*width+x (or the equivalent form adjusted for 1-based indexing) like you normally would.
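A possible sketch of those helpers (the names come from the comment above; the bodies are my guess at the intent):

```lua
-- mod1: modulo adjusted so the result lands in 1..m instead of 0..m-1.
local function mod1(i, m)
  return ((i - 1) % m) + 1
end

-- idx3: flatten a 1-based (x, y, z) coordinate into a 1-based linear
-- index. (depth isn't needed for the computation, so it's omitted here.)
local function idx3(x, y, z, width, height)
  return (z - 1) * width * height + (y - 1) * width + x
end

assert(mod1(3, 3) == 3)                  -- instead of 3 % 3 == 0
assert(idx3(10, 10, 2, 10, 10) == 200)   -- last cell of the second slice
```

The helpers work, but every one of them is an extra +1/-1 pair that 0-based indexing simply wouldn't need.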


> lack of a separate integer type (chars are strings, ints are floats)

> Even bit manipulation requires going through the FFI

This, and some of the other things, have not been true for years.


Like I pointed out in another post: welcome to the real world, where 5.1 is still the latest available release on many platforms.


The latest release is available in source form. I easily built the library with

    make generic local
Provided that you have a GNU compiler toolchain for the target, you should have no problem doing this. If you don't, I'm sure there's more effort involved, but ultimately Lua can be built with any C99-compliant compiler and most C89 compilers (with the caveat that C89 can't guarantee a 64-bit integer type).


> The actual Lua language documentation is provided by another web site that looks like it has been written by an attention-deficit child.

How can you say that? The Lua documentation is on par with Kernighan and Ritchie.


The one-based indexing is indeed a total waste of time.

I've switched to using Wren for my new projects. Despite it not being as mature as Lua (you may have to add some functionality yourself):

* the syntax is much nicer.

* it's simpler/cleaner to embed.

* it's (allegedly) faster if you are running on a platform where it's forbidden to JIT (e.g. iOS).


I really don't understand why some people think that C-like syntax can be "nicer". You may be used to it, but it's certainly not nicer in any way and has lot of strange design choices.


A couple of concrete examples:

* Classes in Lua are a hack, and consequently the syntax for them is terrible.

* Using "end" instead of "}". That decreases readability.

* no "+=" or "++" operators.
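For anyone who hasn't seen it, the metatable boilerplate being referred to looks roughly like this (a minimal sketch of the common idiom, not any particular library):

```lua
-- A "class" in plain Lua: a table acting as a prototype via __index.
local Point = {}
Point.__index = Point

function Point.new(x, y)
  return setmetatable({ x = x, y = y }, Point)
end

function Point:length2()   -- the colon is sugar for an implicit self
  return self.x * self.x + self.y * self.y
end

local p = Point.new(3, 4)
assert(p:length2() == 25)
```

Every line of the setmetatable/__index dance is something languages with built-in classes give you for free.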


To be pedantic, it’s not forbidden to JIT on iOS. You can JIT your own code outside of the App Store if you self-sign your app, and you can always take advantage of the JIT that JavaScriptCore offers.


C and Rust solve different problems. They’re compiled, for one.


C and Rust solve exactly the same problems as any other programming language in the embedded context: they try to interface with real world. Whether they're compiled or not rarely matters, when software updates are done by reflashing the whole device.

Edit: If you have a device with enough computing power to run a scripting language interpreter, you can use Python in most cases.


I agree with the parent, C and Rust are not scripting languages, Lua is. The frontier between what does and does not constitute scripting is admittedly blurry these days but in the case of these languages it's fairly clear cut IMO.

Lua is very dynamic, easy to code for (syntactical quirks aside, I'm talking from a "cognitive load" standpoint) while C and especially Rust are a lot more static, require more careful coding but can result in faster and, at least in the case of Rust, more robust code.

Lua is the type of language you embed in your C app to make it configurable.


>easy to code for (syntactical quirks aside, I'm talking from a "cognitive load" standpoint)

That's not a fair comparison. A large chunk of the cognitive load comes from the awkward syntax and other design decisions that stem from it.

>Lua is the type of language you embed in your C app to make it configurable.

Situations where configuration files aren't sophisticated enough but a Python interpreter is too heavy are few and far between.


Syntax is very subjective and a matter of taste and familiarity. I don't really like Lua's syntax but from a cognitive load standpoint it's still your average dynamically typed, garbage collected scripting language. Syntax notwithstanding it's easier to code for than C's unmanaged and unchecked runtime or Rust's complex type system and borrow checker.

>Situations where configuration files aren't sophisticated enough but a Python interpreter is too heavy are few and far between.

That's not the point I was making and I'm not trying to get into a language war between Lua and Python, I have no horse in that race.


> easy to code for ([...] I'm talking from a "cognitive load" standpoint)

I strongly disagree. The unfortunate design decision of folding arrays into hashtables forces you to be very careful when you conceptually need a sequential list, because if you store a nil value in it, you cut your list in half. Oops.

An array is the most basic, simplest, most useful non-scalar datatype there is, and Lua bungles it.
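Concretely (assuming stock Lua 5.3 behavior):

```lua
-- A nil in the middle of a sequence makes its length ill-defined:
-- ipairs stops at the first hole, and # may return either border.
local list = { "a", "b", "c", "d" }
list[2] = nil   -- conceptually "delete" an element

local seen = 0
for _, v in ipairs(list) do seen = seen + 1 end
assert(seen == 1)   -- iteration stopped at the hole in slot 2
```

The safe alternatives, table.remove or keeping an explicit count field, both put the bookkeeping back on the programmer.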


C and Rust are usually used for firmware-level components, while Lua is a convenient way to provide lightweight scripting.


For those of you who are interested, I run a Lua newsletter.

The Lua community is small so every bit helps :)

Shameless Plug: https://Luadigest.immortalin.com


It's worth mentioning the debt Lua owes to Self and Newtonscript. An extraordinary amount of Lua's syntax and semantics is directly lifted from these languages. Newtonscript in particular seems to be a template for a great many of Lua's interesting features.

What drove me away from Lua is its philosophy of providing minimal tools in which it is possible, but never easy or convenient, to do work in various programming modalities. Polymorphism, namespaces and packages, arrays, unicode... these things all can be done, but require significant head-standing and in many cases a lot of boilerplate. Even proto-style OO, which you would imagine would be Lua's signature feature, involves lots of rigamarole compared to languages (like Newtonscript) where it is baked in and elegant.

I kept wanting Lua to pick a pony.

Throughout my usage of Lua, I've consistently gone back to Lisp, where I could change the language to make inconvenient things convenient when I needed to.


Maybe it's because I've mostly used Lua as a tool in my homebrew game engine side project, but what you said about their philosophy of providing minimal tools where it is possible to work in different programming modalities is one of my favorite things about Lua.


As a developer, I've always liked languages that stress maps/hashes/dictionaries/tables/associative arrays as their main data structure.

Off topic, but I've always thought that JavaScript and JSON would have been a lot better without the relatively artificial distinction between objects and arrays. Now that there's object shorthand and destructuring, this will never happen, but it was a nice thought for a while.


The javascript spec makes it clear that an 'array' is just an object with a magic length property that is always one bigger than the largest set numeric index.

The array syntax can be thought of as a shorthand where you want the indexes inferred from order instead of having to list them out. There are obviously a few useful methods that you use on arrays but almost all of these will work on any object with a length if you .call or .apply them.


Speaking of JavaScript. For a lightweight, Lua-like, embedded JS runtime there's always Duktape

https://duktape.org/


Nobody is stopping you from just using arrays for everything - remember they're also objects.

    const a = [2,3,…]
    a.prop = "hi"
Now whether that's actually something you should be doing is a different matter...


I was thinking more along the lines of using "objects" for everything like Lua: let foo = {'a', 'b', 'c'};

(Also on my JS wish list: Killing the use of const for anything besides actual constants.)


> Killing the use of const for anything besides actual constants

You should be using const for everything besides variables that need to get re-assigned.


Heh, I like how you carefully worded that, because normally 'const' would signify that a variable is immutable in just about any other programming language in the world. Except in JS, where const was awkwardly jammed into the language. I guarantee if the spec writers had known how badly const was going to be abused, they wouldn't have added it.

If you want a strongly typed language, go use C or Java and stop making JavaScript incomprehensible to all but the most pedantic developers.


> normally 'const' would signify that a variable is immutable in just about any other programming language in the world

That's not true in Java, C, Ruby, or Perl (with the 'constant' pragma), not really true in C++ (unless you're careful), and... generally just isn't true in most languages.

While a few languages do add recursive immutability properties to their equivalent of 'const', most languages use the idea of a 'constant' only as a means of prohibiting name re-binding, not data modification. And even then, many languages that only prohibit const re-binding only do so in an optional (i.e. it's a warning) way.

Many of those same languages do support immutable data structures either out of the box or via popular libraries (though in non-compiled language, such structures often come with a performance cost). They just don't call them const[ant]s.

Edit: I'm also curious about how you think 'const' is being abused in JS. I'm not an expert in the language--this is a genuine question; not a veiled disagreement.


As long as you understand how `const` works in JS, I don't see it as an impediment.

    const x = {};
    x.foo = "bar";
is not that confusing, because you aren't re-assigning x, you're mutating its data. Maybe `fixed` would have been a better name for it (like a vector that doesn't change direction but can change magnitude) if people think `const` should mean immutable.


That's not quite true. Const forces a variable to always point to the same object, but doesn't make the object immutable. Only primitives like bool, string, or number are actually guaranteed to be constant.

The following is perfectly valid code, but breaks almost everything someone would reasonably expect from const.

    const foo = {bar: 123};
    delete foo.bar;
    foo.baz = 456;


This seems to be in agreement with GP:

> You should be using const for everything besides variables that need to get re-assigned.


You wouldn't start coding python for the first time without some research. You wouldn't for Java, ruby, C, bash, or any other language you can name.

JS is the one language that very few people bother to learn. They just start writing and see what works. For these people, a const that isn't const isn't a very good idea. Even among the good JS devs I know, most haven't learned this until it actually bit them in production because most books about JS don't mention the problem.

The TDZ (temporal dead zone) is a direct result of adding `const` to the language and it complicates everything (plus has performance implications). A feature that doesn't do what it seems to do while complicating everything else about the language is a textbook example of bad design. There are several people who agree in hindsight (I believe Brendan Eich is among them).


From C99 and C++, also Java, `const` does not affect the bound value. It's a declaration keyword, so it affects only how to bind the declarator name or names that follows immediately. This is all stating the obvious, but expecting effects on the initialiser will lead only to frustration, on two counts:

1. JS is in the C/C++/Java lineage syntactically. Ok, that is a deeper design flaw, but Netscape management to keep Sun on board said "Make It Look Like Java".

2. If you want declarations to affect captured expressions, you'll soon want a type system if not immutability controls and defaults. IOW, at least Scheme-like binding rules and set! as special form to use when mutating. Again the ships sailed 23+ years ago, and in truth for "Mocha's" target audience, immutable-by-default would have been harder to use.

"They just start writing and see what works" => a primary design goal. Flawless Victory! :-)

Seriously, the Scheme in the browser baby was killed in the cradle; the recruiting bait used to get me on board was snatched away because of the Sun/Netscape Java deal. All is not lost, as JS grows better special forms (but for backward compatibility must keep old ones). Among these, `const` is least better but still useful.

What's more, those who advocate mass-market immutable-by-default are still working to make it usable in the large (e.g., in Rust). Pedagogy helps, but also has not delivered the goods yet. We are living through the full experiment to refine and confirm that i-b-d is best.


`const` "seems to do" this: prevent variables from getting re-assigned. That's all one needs to know to understand the associated pitfalls.


so, in a saner language, variables would be const by default


I just recently learned Rust's 'let' works that way, unless you add the 'mut' keyword as well. (And const, of course, is always immutable.)


You can always use PHP!

A PHP array works as a sequential or an associative array; or both at the same time


You can also use a donkey as a car.


The distinction is syntactically important for JSON, IMO. Nobody wants to write integer keys for each element in an array.


You don't have to: in Lua, { 1, 2, 3, Prop = 'Example' } is the same as { [1]=1, [2]=2, [3]=3, ['Prop'] = 'Example' }.
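Which also means a single table can mix the array part and named fields freely, for example:

```lua
-- One Lua table serving as both array and record.
local t = { 1, 2, 3, Prop = 'Example' }
assert(t[2] == 2 and t.Prop == 'Example')
assert(#t == 3)   -- # counts only the sequential part
```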



To curious web nerds: currently both Nginx and HAProxy have Lua scripting modules for extending themselves by implementing e.g. customized filtering, routing, caching, proxying, storage, etc. One can even build a full-featured web app backend in embedded Lua tied to Nginx.


Disclaimer: I am CTO of Kong.

If you want to look at a large OpenResty/Lua application from a code perspective you could take a look at Kong[1] - which is the most popular public codebase built on top of NGINX/Lua at the moment.

CloudFlare also uses the same stack but most of their codebase is proprietary.

What really makes Nginx/Lua a good combination is primarily LuaJIT[2] support, a much faster implementation of the Lua VM.

[1] - https://github.com/Kong/kong

[2] - http://luajit.org


What made you choose Lua instead of say, embedding Python into nginx? I can think of a few good reasons like embeddability, code size, simplicity but it would be interesting to hear your take on this.


Can't speak for Kong, but I'm currently working on a microservice in Lapis (using MoonScript). The speed of the underlying OpenResty server, as well as Lapis's ergonomics, make this really fun. I guess the only practical reason I can see is the async-everything-without-crutches approach, as well as the truly impressive TTFB.

For me, day 1 productivity is really high, comparable to Nancy, the framework I was using before.


LuaJIT is pretty fast


Redis and Envoy also have the same. It seems to be the scripting lingua franca for interacting with native code.


Shameless plug: I maintain an open source Mac app called Hammerspoon that opens up a ton of OS frameworks/events to a Lua environment for your scripting/automation pleasure :)


Of particular note: LuaJIT can pretty much directly call C libraries. From here:

* http://luajit.org/ext_ffi.html

    local ffi = require("ffi")
    ffi.cdef[[
    int printf(const char *fmt, ...);
    ]]
    ffi.C.printf("Hello %s!", "world")
So while, say, Python is "batteries included", LuaJIT is "all the batteries".

Using C libraries in regular Lua is fairly easy as well, but LuaJIT is exceptional.


> So while, say, Python is "batteries included", LuaJIT is "all the batteries".

Do you say this because LuaJIT can call shared libraries dynamically? If so, what about ctypes[0]? Is LuaJIT's FFI just easier to use, or does it do something that ctypes can't?

[0]: https://docs.python.org/3/library/ctypes.html


The two most appealing features of the LuaJIT FFI:

1) The syntax. To specify the types of the FFI functions you use a DSL that is similar to C declarations. Often you can copy-paste function prototypes from ".h" files with very minimal modifications. With ctypes you need to specify the types with Python function calls. See "ffi.cdef": https://luajit.org/ext_ffi.html

2) Performance. If your code is inside a JIT-compiled trace, LuaJIT FFI calls have minimal overhead. They compile down to something very similar to what a C compiler would generate. Ctypes, on the other hand, must "interpret" the FFI calls which adds more runtime overhead (although admittedly this is also true for LuaJIT if your code ends up outside a compiled trace).


Roberto Ierusalimschy, et al., on "The evolution of an extension language, a history of Lua (2001)" [1], is an interesting read.

The word "configuration" makes a repeat performance in that paper, but is oddly entirely missing from this 2018 paper (by the same authors).

[1]: https://www.lua.org/history.html


For game development Lua is wonderful. Pico-8 uses Lua, as does LÖVE. Both are wonderful tools for writing games.


While I'm not a developer, nor do I like Lua, I'm curious about one thing: what's the difference compared with, for instance, Guile Scheme?

Because they seem closely similar to me, and Guile is far clearer and more effective, at least for my programming skills...


I'm planning to use tcc as a scripting language. Not sure if it's a good idea, but as long as I don't have to rebuild my code, and as long as it's not slow like most scripting languages, it sounds good enough.

Although I agree it might be difficult to write scripts in C, in the end it may just depend on which core functions are exposed.

I'm not a fan of C, I prefer Python, but I guess I'll have fewer issues interfacing C code with C++. Python seems quite fat and slow, and too different from C++.


What Lua lacks in terms of performance and broad, low-level library integration can be made up for within the Rust ecosystem, where these exist along with a well-built Lua wrapper, rlua.


The images appear to be dead.


I always perceived Lua of being completely weird, and asked myself why people would ever want to use it. Especially nowadays, where embedded Python is everywhere.


> I always perceived Lua of being completely weird, and asked myself why people would ever want to use it.

It depends on your habits. I for one prefer Lua to Python and find it much more pleasant to code in. There are some things in Python that I like (such as blocks defined by indentation), but overall the language seems too complex for no reason when you're coming from Lua.


The thing is, my free time isn't infinite, and Python is already everywhere on the PC (I only use Linux), so I really appreciate that I have to learn only one language for both PC and embedded. MicroPython runs very well on modern microcontrollers, and you can use many of the libraries from PyPI. Also, there's a bunch of hardware drivers, which saves a lot of time.

In the end, it's all about conserving knowledge: what's the best way to do complicated things once, and then provide them to others in an easy and convenient form? I think the reusability of Python code is currently very high, and it enables a lot of people to write custom programs who otherwise wouldn't have been able to. The friction of Python is just low. It's easy to learn, runs nearly everywhere, and you don't have to use advanced features if you don't want to.


Python the language is great. What puts me off somewhat is the "ecosystem" and culture around it. Pip, virtual environments, the fanaticism about some PEPs; oh God I hate these things.


Yeah, Python is quite approachable, but it seems to me that this is now also its weak point: it attracts an ever-growing crowd of programming newbies, generating and solidifying weird/bad practices born of misunderstanding, lack of experience, or zealotry (like the string/Unicode mess, awkward handling of Unix primitives, preoccupation with questions of "the correct style", or identity politics recently).


It's so bad that even Guido noped out of the project.


Most of us who deep-dive on Lua/JIT are using it as a glue for C libraries.

Lua is a small language, it fits in the head with room left over for Python, which is the tool I favor for general-purpose data munging and small OS scripts.

If your use case resembles "scripting over a collection of C-ABI static or shared libraries", LuaJIT is head and shoulders above the competition. PUC Lua is itself outstanding.


Not to praise Lua, which has way too many shortcomings to be useful IMO, but I have done both of these things (embedding Python and embedding Lua).

Python is FAT.


If you're sensitive to performance then Lua's speed might be a major advantage over Python.


On modern embedded hardware, Python is a better alternative in practically every case. If size and computing power are an issue, either investing in better hardware or spending time writing C yields better results.


C solves different problems; there are cases where scripting languages can be extremely helpful even under heavy RAM constraints. Lua and JS can be adapted to quite low-memory situations (we were running in 128 KB), and there is no comparable, production-tested Python implementation that can do the same.


> On modern embedded hardware, Python is a better alternative in practically every case.

How so? Do you have an example where Python outperforms Lua or uses significantly fewer resources?


> or uses significantly fewer resources?

Do you count developer time as a resource?


Everyone does, of course, but hardware constraints are a reality, nonetheless.


Since the Lua authors wrote this article, it can't bring much of a new perspective on Lua.



