I get the feeling that Lua is slowly on its way out, because its academic nature meant it never had to be more than "good enough" outside its core strengths.
I think they should have kept a super-light "classic" Lua where ~= still means !=, there are only tables, and all the other quirks remain (but also the speed). Then add official, optional support for classes, lists, dicts, and sets. This could probably all be built on top of setmetatable, so it would be pure Lua, but at least it would come as one consistent standard library and not in the form of a fractured ecosystem.
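To illustrate the claim that classes can be layered on setmetatable alone, here is a minimal sketch in plain Lua (the `Point` name and its methods are made up for illustration, not from any existing library):

```lua
-- A minimal "class" built from nothing but tables and setmetatable.
local Point = {}
Point.__index = Point  -- method lookup falls back to the Point table

function Point.new(x, y)
  return setmetatable({ x = x, y = y }, Point)
end

function Point:length()
  return math.sqrt(self.x * self.x + self.y * self.y)
end

local p = Point.new(3, 4)
print(p:length())  -- 5 (printed as 5.0 on Lua 5.3+)
```

Something like this is exactly what an official, blessed standard library could ship, instead of every project reinventing its own variant.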
All the mentioned compiles-to-Lua projects tried something similar, but never gained significant traction, probably because they lacked an official blessing.
That impression is wrong. Even the original author of LuaJIT still commits regularly, there are dozens of companies maintaining and using it in industrial-scale applications, and there are a couple of forks which focus on specific use cases and are independently developed. Not to forget the hundreds of games using Lua as a scripting language.
> only supports Lua 5.1 (5.3 is current, 5.4 imminent).
Which is obviously enough for most people using it, including me. I don't see any unique selling point that would force me to update to Lua 5.3 or higher.
> and all the other quirks (but also speed).
Well, do your research. LuaJIT is among the fastest JITs available (e.g. a factor of ~1.5 faster in geometric mean than JS V8), but with a much smaller footprint.
Doesn't that show that the actual users of Lua and the authors have drifted apart, and the former are not willing to commit to a hard fork or taking over development? Or both sides could come together to form a much-beloved committee. The curse of "good enough".
> quirks / fastest
I think you misunderstood: Yes, it is fast even without LuaJIT, and super easy to embed and integrate, which is why this slow fading away into the embedded-only realm would be quite a shame.
But it is also "quirky", and adding some (optional, possibly slower) syntax sugar over it could make it more appealing. Sure, Lua fans will argue that tables are better than anything, but those who just want to transfer existing language knowledge are hard to convince, and setmetatable makes everything even weirder.
So what? It has an MIT license, and there are obviously different communities with different requirements.
> it is fast even without LuaJIT
Not really. The PUC Lua VM is even slower than Python (in geometric mean) according to https://benchmarksgame-team.pages.debian.net/benchmarksgame/.... LuaJIT in interpreted (not JIT) mode is still four times faster than PUC Lua (see http://luajit.org/performance.html).
> this slow fading away into the embedded only realm would be quite a shame.
I assume you mean "embedded use in applications" (i.e. not embedded systems). That's exactly what Lua was designed and built for by its original authors. And that's how it is mostly used.
> adding some (optional, possibly slower) syntax sugar over it could make it more appealing
Lua can do surprisingly much with such a lean and simple syntax. If instead you prefer a baroque, pretentious syntax like Python or TypeScript, and are obviously satisfied with Python's performance, you already have a well-established solution. And as I've demonstrated e.g. with https://github.com/rochus-keller/Oberon you can replace Lua with a more complex language and still profit from the performance and leanness of LuaJIT.
LuaJIT has that already.
> utf8 support
This is rudimentary. It's easy to add an even better library to previous Lua versions.
One reason using global variables is bad practice is that it makes testing harder. An unfortunately high percentage of embedded software doesn't have any sort of harness-based testing, because it's written with globals spammed everywhere, which prevents you from using any kind of principled testing strategy where you mock out all the hardware dependencies. It's especially bad if there's globally defined MMIO stuff like "#define CCR1A (*(volatile uint32_t *)0x74FEA10)". Good luck testing that!
In the domain of C, passing by reference means passing a pointer. If you chuck everything into a single struct and pass it by pointer, it has the same problems as global scope.
Not that I'm advocating for global variables. Even tiny projects tend to grow with time, and localizing scope across the code base is not fun at all. In the context of Lua, I've just trained myself to prefix variables with 'local' and I don't give it much thought.
Not true. In fact, this refactor is one of the best things you can do to improve an old shitty embedded C codebase. Among other benefits, it allows you to have multiple instances (an arbitrary and easily-adjusted number, in fact) of a system sharing the same memory, reduces the complexity of linker-related BS, and simplifies testing. It's vastly better than relying on horrendous C cross-module scoping rules for sharing.
I agree that passing in structs is vastly better than communicating over globals. On the other hand, taking an existing code base and implementing this state-passing is quite a large undertaking that affects all function declarations/implementations. It might be beneficial, but there are often better investments of your time.
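Since this thread is about Lua: the analogous refactor there is threading a state table through functions instead of reading and writing globals. A minimal sketch, with made-up names, showing why explicit state gives you independent instances for free:

```lua
-- Instead of a global counter ...
--   count = 0
--   function increment() count = count + 1 end
-- ... pass the state explicitly. Each instance is then independent
-- and trivially testable without touching global scope.
local function new_counter()
  return { count = 0 }
end

local function increment(c)
  c.count = c.count + 1
end

local a, b = new_counter(), new_counter()
increment(a)
increment(a)
increment(b)
print(a.count, b.count)  -- 2    1
```

The same shape applies to the C case: a context struct passed as the first parameter plays the role of the state table.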
1. Having only one data structure makes for a simpler, more elegant language. Once you understand tables, you understand everything you need to know about Lua.
2. You could implement any other data structure with a table.
* A table is basically a dictionary / map already
* A table can be an array if you use numbers as keys. They don't even have to start at 1.
* A table can easily be made into a set if you use the elements of the set as keys and 'true' as the values.
* A table can be made into a proper OOP class with metatables
* A table can use prototypical inheritance too
* A table can act as a basic data record object
Those using the language don't really miss having the more specific data structures.
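A few of the points above, sketched in plain Lua (all names are illustrative):

```lua
-- Dictionary / map: the default use of a table
local ages = { alice = 30, bob = 25 }

-- Array: consecutive integer keys, conventionally starting at 1,
-- so the length operator # works as expected
local primes = { 2, 3, 5, 7 }
assert(#primes == 4)

-- Set: elements as keys, true as values; membership is a lookup
local seen = {}
seen["foo"] = true
if seen["foo"] then
  print("foo is in the set")
end

-- Record: just a table with named fields
local point = { x = 1, y = 2 }
print(point.x, point.y)  -- 1    2
```

Classes and prototypical inheritance build on the same idea, with setmetatable supplying the lookup chain.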
Context is important IMO; if you're going to use a language as your primary development tool, then you'll get over that hump fairly quickly, and my objection is less relevant. But for our use-case (embedding Lua in a C application to have user-provided scripts drive our C library's callback hooks), the developers were primarily working in C and Python, and so most of us didn't use Lua often enough to really click with it.
For our use-case, something a bit closer to Python or Java would have been much easier to grok, and therefore would have made our development easier and more productive. "Easy and more productive" is all I'm really looking for in a language, always within the context of the specific usecase of course.
I'd be fine with dropping most of the sugar in Python or Java, but I'd be surprised if there were no "zero-cost abstractions" that could be added to Lua without making it too heavyweight for embedding.
It literally doesn’t have the extra weight.
Sorry for the sarcasm, but if you are thinking in these words, you are thinking wrong.
... so, now we CAN have scriptable / interpreted code in iOS?
What it can do with the OS is fairly locked down, but the version I used to have installed (incompatible with iOS 11, so I can't use it anymore) IIRC even allowed using "import os" to run some shell commands (though, of course, the shell was very locked down).
> Closures are self-contained blocks of functionality that can be passed around and used in your code. Closures can capture and store references to any constants and variables from the context in which they are defined. Closures can be nested and can be anonymous (without a name)
I think the term "closure" gains some unnecessary semantic meaning of a block of code / anonymous function. I might be wrong, but it is better to think of it as simply a technique for implementing lexical scoping.
In other words, "lexical scoping" is a property of a language, while a "closure" is only an implementation detail to make lexical scoping work. So the term closure does not have to leak into the description and semantics of the language itself. What is your opinion?
Edit: I just think that such proliferation of terminology confuses people, making them ask questions like: "What is the difference between 1) function, 2) anonymous function, 3) lambda function, 4) closure?" Instead, focusing on the idea of a function (possibly without a name) + lexical scoping clarifies everything immediately.
Well, okay, if a language does not normally have lexical scoping but has a special "closure" feature to implement it (dunno, maybe C++ lambdas may be regarded as such a special closure construct?), that could be a justification for the term closure on its own. But if the language has lexical scoping by default (e.g. Gravity says that it has lexical scoping in its overview), then, I think, there is no need for a separate notion of "closure" in the language semantics.
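Lua is an example of the "default" case: nested functions simply see the enclosing locals (as upvalues), and the closure machinery stays invisible. A sketch, with illustrative names:

```lua
-- Each call to make_counter creates a fresh local 'n';
-- the nested function captures it lexically, so every
-- returned counter carries its own private state.
local function make_counter()
  local n = 0
  return function()
    n = n + 1
    return n
  end
end

local c = make_counter()
print(c())  -- 1
print(c())  -- 2
print(make_counter()())  -- 1 (a new, independent capture)
```

Nothing in the language surface needs the word "closure" for this to work; it falls out of lexical scoping.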
Lambdas have been spreading lately, but plain old functions are still not lexically scoped in a lot of languages otherwise regarded as such, C/C++/Java/Pascal etc.
My point is that instead of reusing the term closure, one could use the term anonymous function. To rephrase my question: Do you have a specific example where a nested function (or an anonymous function) is not lexically scoped (In a language that is otherwise lexically scoped)?
Ruby is one example: methods defined with `def` do not capture the enclosing local scope, even though blocks and lambdas do.

    x = 1
    def f
      x
    end
    f
    # => undefined local variable or method `x' for main:Object
Not all closures are anonymous, either.
The reason we use different terms is because there are subtle differences that are important to capture.
The irony is that this used to be considered a negative thing back in the Pascal days.
Ease of integration, interoperability with the "host" language, the embedded size of the scripting engine or compiler, etc. all might be more important.
Please consider creating a Gravity 2.0 with this fixed as a compile error; you will save your users a lot of debugging time.
At best it could perform definite assignment analysis on variable declarations, but it can't go very far. In particular, it would do nothing for function return values.
As a result, the best it could give you is runtime errors when you e.g. access the value of a dynamic Maybe without checking for presence first. Without static analysis, this isn't hugely useful.
(For context: a language without null pretty much implies an Option type, which is mostly useless (or no better than null itself) without static typing. Once you're that far you're probably gonna want monads to make them workable, and congratulations, you're reinventing Haskell.)
But come on, it's not about FP. Check out C#'s nullable value types (Nullable&lt;T&gt;), for example. C# 8 brings the same principles to reference types (though without full safety, since you cannot break compatibility). The only good reasons to have a Java-like null are that you already have it and cannot take it away, or that you are in an ecosystem where it's fundamentally baked in.
And to be precise, it's not about null per se. Null can be just fine, but it shouldn't be a valid value for every type. You can solve that either by boxing (a traditional option type) or with a TypeScript-style flat union: Car | null.
In some sense, all languages with Java-like null have some option types, but they are missing the regular non-nullable types.
If you just mean that a program has certain properties (like performing IO) that are not expressed in its type, then I can see what you mean. But even in Haskell, some properties like “this function might fail to terminate or throw an exception” are not expressed in the type (although you can use monads for those things in languages where all functions terminate by default). There are no distinctions between a function that performs network accesses and one that does not, or functions that never return odd numbers, etc. There are infinitely many program properties that you can come up with which are not expressible in Haskell types.
Even dependently typed languages with very expressive type systems cannot capture all possible program properties due to logical incompleteness results.
I was using the term more informally. In fact it is the operational semantics of your hypothetical language that is not complete, in the sense that you will have a hard time putting it into a form that allows you to do anything meaningful with it (e.g. prove the absence of certain errors).
Interestingly, you can define such a language with side effects (think ML), by actually pulling the state monad into your meta-level (for instance by turning your reduction relation into an instance of a state monad).
As an example, imagine an "updateDB" call that returns void. Totally could exist, and be useful, but all it does is a side effect; as a function you haven't expressed its functionality via its type. From a 'pure' perspective, it would be fair for the compiler to remove that call entirely, since you do nothing with the result (since there is no result). Instead you need a monad or something to express that side effect.
Obviously, many languages don't choose to express side effects within their type systems. That's not to say such a language isn't useful, just that its type system isn't complete; you have things happening without a type attached to them (and in that way it's dynamically typed).
In the end, type systems (all of them, static as well as dynamic) are just a tradeoff between safety and practicality.
And in languages that lack static types altogether, it's a moot point.
I really don't get it because in many years of programming, in practice, null pointers are one of the least of my worries.