I wrote a trading system where the strategies are written in Lua. It has been a delight, fast and simple. The traders have the expressivity of a full programming language and I can add any function to that language and provide any data as well so that their programming (and execution) sandbox is extremely ergonomic and suited to their specific task. In other words, a trading DSL.
Other code outside the sandbox pulls in up to date price data and if certain safety rules are violated will automatically close out positions. So even if the traders code their way into an infinite loop or make other mistakes, the supervisor can step in and automatically prevent disaster.
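A supervisor like that pairs nicely with Lua's debug hooks. As a minimal sketch (the function name `runWithBudget` is mine, not the author's, assuming plain Lua 5.x): a hook that fires every N VM instructions can abort a runaway strategy before it locks up the sandbox.

```lua
-- sketch of one supervisor safety net: a debug hook that fires every
-- `maxInstructions` VM instructions and aborts the strategy
local function runWithBudget(strategy, maxInstructions)
  local co = coroutine.create(strategy)
  debug.sethook(co, function()
    error('instruction budget exceeded', 2)
  end, '', maxInstructions)
  return coroutine.resume(co)
end

local ok, err = runWithBudget(function()
  while true do end   -- a runaway strategy
end, 100000)
-- ok is false: the hook fired inside the loop and aborted it
```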
Using Lua to make a language for others has been a wonderful experience. FYI, it was approx. 11K lines of Lua.
Man, I tried at Bloomberg to get my managers to let me incorporate Lua into Tradebook's trading system. They just simply couldn't get it. I'd try to explain and they would look at me like I was from Mars or something.
Eventually they got tired of me pitching for it and fired me. How did you ever get managerial buy-in for something like this?
Certainly. The problem is, not all cost savings are welcome. What if your manager is halfway through a 2-year, $5 million project -- which you made obsolete in 2 days using SWIG + Lua + a 1-page-long script?
Is your manager going to go to his manager, and tell him he just wasted $2.5 million? When his manager has promised him a big bonus if he completes the project by the end of the year?
---///---
I've seen--very understandably--comments like "there's no way that could ever happen" or "this guy must be very hard to work with in other ways." Not so.
The problem was a culture clash between how west coast/Silicon Valley software companies work and how east coast companies work. Not that one is necessarily better or worse than the other, but they are very different, and if you are slow to pick up on that (as I unfortunately was) it's very easy to get burnt.
I’m so sorry this happened to you. I’ve been extremely fortunate so far, and if this had happened to me early on in my career I would have been absolutely crushed.
If you repeatedly contact management with the same proposal you risk coming across as dissatisfied in the organisation and sometimes there's very little tolerance for this, might be because it's just a tyrannical place, but could also reflect the type of business and customers.
If you’re the type of person who won’t just move on from an idea when you’ve been repeatedly told no that indicates you’re probably a difficult person to work with in other ways.
That kind of behavior is less welcome in enterprise environments where they want you to get in line with the top level goals.
From my experience working at Bloomberg, there was a limit to how much you could rock the boat as a dev and in many ways the technology/language choices were limited to the approved stack.
bloomberg has some great teams though. i had lunch with the bond pool team about 5 years ago, and oh my gd they were good. plus the snack floor is awesome in the main office.
I loved most of my time working at Bloomberg. Super smart people, a very engineering-centric culture. The politics, however, were very different from the west-coast companies I worked at before. You might not think that there are many politics where you work, but that just means you are so familiar with the politics that you conform to them without even thinking about it.
But every organization has its sacred cows, and no matter where you work, if you happen to get yourself into a situation where in order for you to be right, your bosses have to be wrong, it's game over.
Yeah I definitely agree that they’re doing some great engineering work in the company.
It’s a strange place because it’s fully owned by Mike Bloomberg without any board or other group of shareholders that oversees his decisions. There is a “board” equivalent but they report to him. It’s great in some ways, like how they’re able to spend so much of their profits on philanthropy, but it leads to a lot of quirks.
i have a colleague like this whose main avenue of logic seems to be wearing teammates down. i wouldn’t fire him over this, but gawd are there days where i wish he’d move on. we’d be losing out on a few skills but i’d dread meetings a little less.
> If you repeatedly contact management with the same proposal (...)
My point is that, at most, presenting a proposal is only tangentially related to a grievance that's important enough to warrant firing someone. No one gets fired for reaching out to their boss and saying "hey, I think I can improve this". It's a scenario that's unbelievable.
I understand your skepticism. Here's the thing: pretty much every time you go to your boss and say "hey, I think I can improve this" you are initially going to get blown off. Every organization has inertia, and--let's face it--most clever ideas don't really work out anyway.
If you really want to effect change in an organization, you have to champion it. Your bosses are already busy with other things, you have to get their attention and get on their agenda. If they raise objections, you have to find answers to their objections, and ask them again. And again.
How do you know whether you are the heroic technical visionary, or whether you are just "that guy" who never will just shut up and go away?
More to the point---how do you know how to raise the issue to your bosses, and how many times you can raise the issue to your bosses, before they class you as "that annoying guy."
The answer depends upon the politics and culture at your company. Some things are considered rude in some cultures but not in others. If you are politically acute enough, you can sense when you are going too far, or you are kicking some sacred cow, or interfering with some powerful person's agenda.
In my case, I had always worked at West coast companies, and I didn't pick up on these cues. Am I hard to work with? Am I "that guy"? Yeah, I guess I was, but I have worked at other companies which would have really appreciated my approach.
BTW, I'm not making value judgements here--when you are dealing with somebody else's money, and in areas where there are a lot of government regulations, etc., a more conservative culture is entirely appropriate. Alas, I'm a pretty good programmer, but slow to pick up on these sorts of things. Live and learn.
I tried a similar approach with a team I was working with. We were building on top of Redis and, after some basic benchmarks, gave up on the idea once we figured out from the docs that EVAL scripts are blocking.
Lua in Redis is not useful for serving as a runtime for running Lua application code; it's valuable for allowing you to perform a series of steps atomically. You can read and write data without worrying about something else changing: only one piece of code can write at a time. We use it for rate limiting, which requires reading and writing atomically, for instance.
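As an illustration (not our actual script), a fixed-window rate limiter is a few lines of Lua that Redis runs atomically via EVAL; the key and argument layout below are made up:

```lua
-- sketch of a fixed-window rate limiter run atomically via EVAL
-- (KEYS/ARGV layout is illustrative)
local script = [[
  local current = redis.call('INCR', KEYS[1])
  if current == 1 then
    redis.call('EXPIRE', KEYS[1], ARGV[2])   -- first hit starts the window
  end
  if current > tonumber(ARGV[1]) then
    return 0                                 -- over the limit: reject
  end
  return 1                                   -- under the limit: allow
]]
-- invoked from a client as something like:
--   EVAL <script> 1 ratelimit:user42 100 60   (100 requests per 60 s)
```

Because the whole script runs as one atomic unit, no other command can interleave between the INCR and the EXPIRE.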
When I did that on a project I solved that problem by replicating redis to the same container that the ads were being served from. Replication was very fast, and all that was blocked was the local redis. I worked to make the matching rules in the script efficient, and the blockage was truly not a problem.
Doesn’t local redis kind of miss the point of using redis in the first place? On the surface, that’s a big chunk of additional complexity for something that could be done internal to an application. Was this meant to be an incremental step in a larger refactor?
Well, Redis manages the replication without fuss. Unless the framework/language one uses supports it out of the box, there is no point in re-implementing replication.
Yeah with the added advantage of reduced latency.
The system I worked on had a CRUD application in Django; ultimately the rules were stored in Redis. The ad-serving application was in Go and connected to local Redis instances that were replicas of the central Redis.
Integration complexity could be a factor. I’m making an assumption about the architecture, but embedding Lua in a program is dead simple and you can do so without introducing external dependencies. Python IIRC requires you to ship and package the standard library or have it already installed, the Lua interpreter can be statically embedded in a program.
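For comparison, the entire host side of a minimal Lua embedding fits in a dozen lines. This is a sketch: it requires the Lua development headers and linking against the Lua library (e.g. `-llua`), and the flags depend on how Lua was installed.

```cpp
// minimal Lua host (sketch): needs the Lua dev headers, link with e.g. -llua
#include <cstdio>
#include <lua.hpp>   // C++ wrapper pulling in lua.h, lualib.h, lauxlib.h

int main() {
    lua_State *L = luaL_newstate();   // a self-contained interpreter state
    luaL_openlibs(L);                 // load the standard libraries
    if (luaL_dostring(L, "print('hello from embedded Lua')") != LUA_OK) {
        std::fprintf(stderr, "%s\n", lua_tostring(L, -1));
    }
    lua_close(L);
    return 0;
}
```

Statically linking liblua into the binary means there is nothing extra to ship alongside the executable.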
I've embedded Lua in my game engine and I'd say that
- performance is a joke
- GC can cause really bad hiccups
- loose runtime typing makes it hard to build static tooling for developer support
- lack of a predetermined, more rigid structure makes some things confusing (like being able to mess with internals such as the metatables, which can be useful but also confusing)
+ functionality-wise it's adequate for a game programmer, perfect for writing (smallish) gameplay scripts
+ it's simple to integrate and provide bindings for
+ lots of flexibility how you use it and what kind of code you use to write with it
In my experience I'd say that if you're planning to integrate and use Lua in your projects you need to
- have a fallback for performance sensitive code, either by being able to mix and match native and Lua user (game) code or provide functionality natively in your engine OOTB
- make sure you can run things properly (using real threads not joke threads) in parallel as much as possible
- stay on top of perf at all times and integrate performance tests and benchmarks into your development process
Despite the problems I've managed to build a Lua code editor that has
a) auto formatting
b) syntax highlighting
c) auto completion (limited but still)
d) simple navigation support
e) built-in help
The editor is integrated in my game engine's editor.
Good question. But essentially because things started out small and then grew from there.
Going forward being able to have seamless integration between the scripting and rest of the editor is a key differentiator and value added. For example being able to jump to the right API documentation relevant for the function call under the caret, or being able to open and jump to the game asset from the script.
I'm not sure how well a LSP type of integration would work here. As far as I can tell it'd need to evaluate the script in order to have best possible diagnostics and analysis and that won't work without the interpreter having the native game engine code available as well.
Of course my "solution" was mostly about slapping components other people built together.
- Qt already has a text editor component with components for doing stuff such as syntax highlight quite easily.
- I use tree sitter to extract symbol information from the Lua code.
- I use an open source code formatter someone else built.
I read that as "green threads". But it could be anything just short of a thread maybe even futures and promises.
Just about any thread that isn't a full-on proper thread is not ideal for general purpose game development. There are a lot of thread implementations for various user interfaces that have a goal of minimizing latency or maximizing CPU utilization. But if you are in a latency sensitive environment like a game and you're highly likely to be pushing the CPU to 100% you often need real threads not something like them.
Coroutines have their place in game development too, but they're not (in my experience) used as a thread replacement. They are used to manage complex flow control. Similarly, I have seen green threads and event loops used to manage the user interface or Futures and Promises to manage certain IO in a game but not the rest of the general game work scheduling.
I also need elaboration on this. I presumed calling lua from c can be used to implement user-defined callbacks. Launching a thread for lua + locking seems even slower?
It isn't terribly hard to write the bridge code to lua, just monotonous to translate inputs and outputs from C++.
You throw a Lua function into a table (tables are usually used as namespaces, and globals are just a table), then write the C++ glue that translates the arguments to your normal function and handles the return value correctly.
The hard thing is building a system to pass in a C++ object in general, as that requires building a table for each instance that matches its class.
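The table-as-namespace pattern for a free function might look something like this (a sketch; `engine` and `l_add` are names I made up, and a `lua_State *L` is assumed to exist):

```cpp
// hand-written bridge glue (sketch; assumes a lua_State *L already created)
#include <lua.hpp>

static int l_add(lua_State *L) {
    double a = luaL_checknumber(L, 1);  // translate inputs from the Lua stack
    double b = luaL_checknumber(L, 2);
    lua_pushnumber(L, a + b);           // translate the output back
    return 1;                           // number of Lua return values
}

void register_api(lua_State *L) {
    lua_newtable(L);                    // a table used as a namespace
    lua_pushcfunction(L, l_add);
    lua_setfield(L, -2, "add");
    lua_setglobal(L, "engine");         // Lua side can now call engine.add(1, 2)
}
```

Wrapping full C++ objects is where it gets monotonous: each instance needs a userdata or table carrying its methods, which is why people reach for generators like SWIG or sol2.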
I've had some success by tuning the GC with `collectgarbage('setpause', 100)` and then manually calling `collectgarbage('step')` every once in a while.
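Concretely, that tuning looks something like the following (the numbers are workload-specific, not universal recommendations):

```lua
-- GC tuning sketch: eager pauses plus small manual steps
collectgarbage('setpause', 100)   -- start a new cycle as soon as the previous one ends

-- then, at a convenient point such as once per frame:
collectgarbage('step', 1)           -- do a small slice of incremental GC work
local kb = collectgarbage('count')  -- current heap size in KB, useful for monitoring
```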
That being said, this problem isn't unique to Lua. There are horror stories about Unity and their integration of C#, for example.
It's especially bad in Unity because it's using the Boehm garbage collector. C# everywhere else has a performant generational GC. Unity has been working on this problem for years now, trying to get the engine to work with .NET (Core) and its better GC.
I believe Roblox's Luau would be suitable for this case? It's got pretty good performance and typing support. Still supports metatables I think (they needed backwards compatibility with old Lua 5.1 code), but it's a start ¯\_(ツ)_/¯
I'm wrapping up a multi-year personal project, a game written fully in lua using love2d.
To me, the beauty of lua is the simplicity and lack of learning curve: I can usually accomplish whatever I need to without looking anything up (as the author said, everything is a table so there isn't much to overthink). Also, the community and support around love2d is fantastic.
One thing that's bothered me is that lua silently returns nil when you reference non-existing elements. That's been a pain when debugging, since a line with a typo (`a = typo`) doesn't fail, and the code fails much farther downstream (e.g. when doing arithmetic on `a`). So almost all my errors end up being "trying to do operation on a nil value", and there is no indication of why it's nil.
I think you can add a metatable to _G with an __index function. It gets called when you access an undeclared variable (such accesses fall through to the global scope, _G), and you can throw an error there.
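A sketch of that idea (the error message wording is mine):

```lua
-- catch reads of undefined globals (i.e. typos) at the point of use
setmetatable(_G, {
  __index = function(_, name)
    error('attempt to read undefined global: ' .. tostring(name), 2)
  end,
})

local ok, err = pcall(function() return definitely_a_typo end)
-- ok is false, and err names the missing global instead of silently yielding nil
```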
You can. In our game engine ~10 years ago we would hook the global table to stop designers from creating globals (the default in Lua without using the local keyword) at certain areas in the game frame, mostly to stop this exact trap.
I toyed with the idea of inverting the semantics of global and local in Lua: remove "local" and instead default to local, with a "global" keyword. Looking at lua.c quickly dissuaded me when I was a much more junior programmer, but nowadays it might be fun to try.
Local by default would have been a much better choice. The entirety of my code looks like local this local that. In all my years of Lua I have only found 1 reason to use a global variable: in Love2D when loading a texture, a local variable will go out of scope and get garbage collected, resulting in a black texture, but a global variable will not. Maybe it's even been fixed by now, as that was years ago.
Yes, you can control a namespace (in the form of a table, of course) in which a module you require executes. The global definitions are then placed in the table. You can implement copy-on-write using this namespace and a metatable, you can change the semantics of accessing an uninitialized (global) variable, etc. Lua is incredibly flexible, after all. Unfortunately, since you have to implement (and then maintain!) those yourself, it's hard to justify (IME) using Lua instead of a more full-featured solution (that might still incorporate Lua somewhere in the stack[0]), esp. since the main selling point of Lua is simplicity.
[0] You can use something like Fennel or Haxe to compile a more structured language to Lua, or you can use an alternative implementation like Luau.
My biggest gripe with lua was that depending upon the internals of the implementation, it could "swallow" an error entirely. The program would just die in absolute silence and not give an error at all or any indication it was still running.
> One thing that's bothered me is that lua silently returns nil when you reference non-existing elements. That's been a pain when debugging, since a line with a typo (`a = typo`) doesn't fail, and the code fails much farther downstream (e.g. when doing arithmetic on `a`). So almost all my errors end up being "trying to do operation on a nil value", and there is no indication of why it's nil.
I am also making small games with love2d. I've found you can prevent many of such issues if you:
1. Create objects with private fields, using getters and setters to access values (as function calls will crash if you call them and the functions don't exist, unlike fields). I like an approach that is somewhat similar to this: https://www.lua.org/pil/16.4.html
2. Add assertions liberally, especially in constructors.
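A small sketch of point 1, in the spirit of PIL 16.4 (the `newAccount` names are illustrative):

```lua
-- closure-based "private" state: fields are reachable only through the closures
local function newAccount(balance)
  assert(type(balance) == 'number', 'newAccount expects a number')
  local self = { balance = balance }   -- hidden from callers
  local account = {}
  function account.deposit(v)
    assert(type(v) == 'number', 'deposit expects a number')
    self.balance = self.balance + v
  end
  function account.getBalance()
    return self.balance
  end
  return account
end

local acc = newAccount(100)
acc.deposit(50)
-- acc.getBalance() is now 150; a typo like acc.depositt(1) crashes immediately
-- (attempt to call a nil value) instead of silently propagating a nil field
```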
Use LuaCheck if you don't want to use the LSP. It warns about the use of globals, which is generally what you want. It's perfectly possible to design your program to use no globals.
If you use the lua lsp, you can make type annotations which basically work like jsdoc.
With those annotations, the lsp will warn you about such issues, there is a diagnostic that's called something like `needs-nil-check`.
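For example (annotation syntax as used by lua-language-server; `Vec2` is a made-up type, and the annotations are plain comments at runtime):

```lua
-- lua-language-server style annotations: checked by the LSP, ignored at runtime
---@class Vec2
---@field x number
---@field y number

---@param a Vec2
---@param b Vec2
---@return Vec2
local function add(a, b)
  return { x = a.x + b.x, y = a.y + b.y }
end

local v = add({ x = 1, y = 2 }, { x = 3, y = 4 })

---@type Vec2|nil
local maybe = nil
-- with that annotation, the LSP flags a bare `maybe.x` with a nil-check diagnostic
```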
Well, yeah, you shouldn't name your variables so similarly; that's what's causing the confusion. Idunno, maybe varaible is set to "Let's groove, baby!" ? ;-)
That has been nice (at least for editing neovim configuration files). But what if I am editing anything else?
For example, I have then tried to edit Wezterm config files and there are no types. I did find some types someone made online but no idea how to instruct my editor/lsp where these types are or what they are for.
You can define a metatable on your objects of interest (or the root table meta table if you don't mind breaking the language's conventions and thus libraries) with __index and __newindex members. Then you can throw in those by calling the `error` function when they'd otherwise normally return nil, should you desire it.
But runtime checks have a cost, and static types that transpile away are a bit better for overhead so long as you don't mind the build step, so using one of the typed lua variants is probably a bit nicer in the long term. Catching those typos early is their bread and butter.
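One possible shape for the runtime check, if you do accept the cost (a generic `strict` helper, not from any particular library):

```lua
-- a "strict" table that errors instead of silently returning nil
local function strict(t)
  return setmetatable(t, {
    __index = function(_, k)
      error('read of missing field: ' .. tostring(k), 2)
    end,
    __newindex = function(_, k)
      error('write to undeclared field: ' .. tostring(k), 2)
    end,
  })
end

local cfg = strict({ width = 800, height = 600 })
local ok = pcall(function() return cfg.widht end)   -- note the typo
-- ok is false: the typo fails loudly at the read site, not somewhere downstream
```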
The default should be an error, since that's the only way to prevent issues where the wrong value gets passed around a lot before finally blowing up somewhere else (and good luck debugging that).
In cases where you really want to fetch the value or else get some default if it doesn't exist, there should be a way to do so that is distinct from regular dereferences and element access.
I do not know how/why the language should magically know that something should error. If nil is not acceptable as a return, then you use `or` with a default (`x or "this"`); if the table should not have nils, then you give it a metatable that errors or returns whatever you want it to.
The language should not "magically know" anything. It should do the safest thing, which is not to silently return a marker value for which there is no guarantee whatsoever that the caller will remember to check it. If the caller does not want to see an error, then they should use a different method of retrieving the item that is specifically defined as returning a marker value. E.g. in Python:
d["foo"] # exception if missing
d.get("foo") # None if missing
d.get("foo", 42) # 42 if missing
This follows the principle of least surprise - [] is the standard syntax for indexing, so it raises exceptions, and if you forget to check you get an error right there and then, not an unexpected but valid value (that might be saved into a variable etc. and then break things much, much later!). If you need a marker value then you must use get(), and the very act of doing so indicates the intent both to the language and to another person reading your code.
This is your opinion on what the default behavior of nil should be. Fortunately, we can disagree, and you can set a metatable that errors on nil; I will choose to do that or not depending on the object.
assert() also exists for this very reason.
PS: there is a performance penalty between doing d["foo"] vs d.foo: one expects an expression, the other does not, so it cannot be compiled the same way. Lua also errors on use of a nil value: d.foo() will error, and d.foo + 3 will also error. So while it does "silently return a marker", it will crash at runtime.
> Maybe i should have known better than to ask about things I dont know.
It is fine to not know things. But when I look at your post, I see that you didn't ask any question. You only made statements that were based on very unfounded assumptions.
But to get back to what tables are: They are key-value pairs with both array and hashmap semantics. By convention, arrays start at 1 in Lua. They are very similar to dictionaries in Python, associative arrays in JavaScript, and other key-value data structures in other programming languages. However, Lua also uses them for storing variables in environments (some people would call environments "scopes"), and there are special callbacks for missing variables and accessor functions when working with these tables.
You're thinking of PHP. Arrays and objects are discrete things in JavaScript. You can add random properties to arrays (since they are also objects), but don't expect them to behave well when doing things like loops, getting their length, etc.
They really aren't. Tacking random properties onto an array doesn't affect the array's length, they won't show up when using `forEach`, map, reduce, filter, or any of the other built in functions.
etc. This is ipso facto what an associative array is - a mapping of keys to values. The only difference between JS and Lua in this regard is that in JS, object keys are always strings, while in Lua they can be of any type except `nil`. So obj[1] is the same as obj["1"], but table[1] is distinct from table["1"]. However, that does not change the fundamental semantics of the data structure in question - `Object` is still mapping arbitrary (if type-constrained) keys to values, so it's still an associative array by definition. Back when JS didn't have a dedicated `Map` type, it was often used as such in practice, too.
I will also note that things like length and iteration can be defined in many different ways. E.g. the way Lua defines length (operator #) for tables - which are also associative arrays, of course - excludes non-numeric keys. And for-loop does not directly work over tables, so how you iterate depends on which iterator function you use - `for index, value in ipairs(table)` will also only iterate over numeric keys, while `for key, value in pairs(table)` will iterate over all of them. That, again, does not change the nature of the data structure in question.
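Concretely, in plain Lua 5.x:

```lua
-- one table carrying both array and hashmap semantics
local t = { 'a', 'b', 'c' }
t.name = 'mixed'

assert(#t == 3)   -- the length operator sees only the array part
local via_ipairs, via_pairs = 0, 0
for _ in ipairs(t) do via_ipairs = via_ipairs + 1 end  -- stops at the array part
for _ in pairs(t) do via_pairs = via_pairs + 1 end     -- visits every key
assert(via_ipairs == 3 and via_pairs == 4)
```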
Aren't Lua metatables and JS prototypes almost the same mechanism?
I say almost, because I think metatables are more broad than JS prototypes - you can emulate JS prototypes behavior using Lua metatables[0], but I don't think the reverse can be done (maybe with some Proxy[1] hackery?).
Yes, you're right, `__index` does behave a lot like `__proto__`. I often forget that it is one of the few entries in the metatable that doesn't have to be a function.
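A small illustration of `__index`-as-prototype delegation:

```lua
-- when __index is a table, failed lookups fall through to it,
-- much like following __proto__ in JavaScript
local proto = {}
function proto.greet(self)
  return 'hi, ' .. self.name
end

local obj = setmetatable({ name = 'ada' }, { __index = proto })
-- obj has no greet of its own; the lookup is delegated to proto
local msg = obj:greet()
```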
> After that, I went back to Dmitry and asked him if my understanding of “everything is a table” was correct and, if so, why Lua was designed this way. Dmitry told me that Lua was created at the Pontifical Catholic University of Rio de Janeiro and that it was acceptable for Pontifical Catholic Universities to design programming languages this way.
There's not really anything uncommon about a programming language having its origins in academia, a university being Catholic, or a university being in Brazil (which is the world's 7th most populous country). So I also don't really get this.
> Semantically, Lua has many similarities with Scheme, even though these similarities are not immediately clear because the two languages are syntactically very different. The influence of Scheme on Lua has gradually increased during Lua’s evolution: initially, Scheme was just a language in the background, but later it became increasingly important as a source of inspiration, especially with the introduction of anonymous functions and full lexical scoping.
Of course, Haskell was also Scheme influenced although it's an ML descendant.
Speaking of this, I wonder what would've happened if they embedded Lua into Netscape instead of writing JavaScript...
I think if Lua were adopted as the web language, it would have lost its spirit / design philosophy of simplicity. I think the web would have exerted an evolutionary pressure on whatever language was the chosen as the exclusive way of writing code. Everyone forced to use a single language as the only option plus too many people clamoring for their own pet language feature to be added plus design by committee will probably always lead to a mess.
I have been having thoughts, wild and incoherent, about the evolution of a given programming language. What does it take to survive and thrive? I might describe my programming career as being kind of a refugee: on the web, I moved from plain .asp to Perl, fled Perl for Python ...
At the start, a language needs the ability to evolve. I recall reading a book on Python before I adopted it, a book which was so off-putting that I ended up delaying my attempts to use Python, and it went on about Python's philosophy when it came to division. I knew that, at some point, Python would have to change away from integer division by default. And it eventually did. It was a mistake because it was non-obvious, and the language changed to fix it. Good.
Now, though, all kinds of things have crept into Python which are decidedly non-Pythonic: the walrus operator, which looks like something that escaped from a Perl dungeon; the docopt module, which has a weird trap of it being a module you must know about, but only if someone has decided to use it; the utter shambles which is the whole environment and library packaging "let entirely too many ways to do it exist;" or the community's preference for Requests while it somehow has yet to be co-opted into the standard library. I am sure I have gored someone's ox here.
Perhaps a language needs a certain window before it is finally frozen in place. Or a more stringent, forceful mandate than the Zen of Python, which I think has been Not Enough. In a way, I am reminded of the Agile Manifesto, which was well-intentioned but had no enforcement, no orthodoxy; it lacked teeth to nip at the heels of those who fail, who go astray. Originally, I had considered the Network Effect as a way to freeze things, but it seems to only work on protocols and less so on languages.
There are people who just don't know C++, and people who also don't know that they don't know C++.
This makes the existence of JTC1/SC22/WG21 aka "The C++ Standards Committee" more akin to the Académie française (which officially defines the French language, but that's not how natural languages actually work) than many C++ proponents seem to grasp.
The analogy isn't very good, because the natural French language isn't a minefield full of undefined behavior.
In Linguistics, natural language, and native speaker intuition is the gold standard. The theory of the language is deemed to be wrong where it fails to capture it.
So right off the bat we know that a prescriptive institution like Académie française is on linguistic shaky ground; it is not aligned with science.
In computing, the specification is the gold standard, followed by documented, committed implementation behaviors.
You don’t have to follow the specification. If enough people don’t, the standard committees will eventually adjust the standards, or, make more standards.
That's true when you have one implementation of something, and you're lucky enough to have a document that is so detailed that it counts as a specification.
The specification of an ISO-standard language is the main contract, taken seriously by implementors. In areas where implementations happen to conflict with it, you will find that over time they give way to the specification. Implementors add their own bits to their local version of the specification. Beyond that are undocumented behaviors.
Neither the specification nor implementations give way to wishful thinking on the part of the programmers.
The only thing that matters is the implementation of the system you’re communicating with or using.
With any luck, their interpretation of the specification will be similar enough to yours that you can communicate.
The only wishful thinking in such cases is that the implementation will perfectly implement your interpretation of the spec. The code you’re calling isn’t going to magically conform to your interpretation just because it says it’s conformant.
This is why we have interop testing and even interop conferences. Specs almost always have subtle ambiguities and missing details which implementers need to fill in themselves.
> In computing, the specification is the gold standard, followed by documented, committed implementation behaviors
my point is that a specification is not the gold standard, because it can be interpreted differently by different readers.
We need interop conferences to help identify the places where the specification is insufficient to fully and unambiguously describe the desired behaviours.
This can only happen because of input from implementors. And since the specifications are always evolving, they are always going to trail the implementations.
It's true that Mother Nature doesn't care about your wishful thinking, but she also has no time for your paperwork.
There's a rough notion of the C++ programming language, and the ISO document is a description written by some humans which resembles it closely enough that you know they weren't describing something else. The people implementing the three major compilers for C++ don't really care about the ISO document†, but the meetings to agree the document serve to also agree changes they should make to their implementations, to a lesser or greater extent.
† ISO/IEC 14882:2020 is the current standard. Yes you read that correctly, that's C++ 20. No that's not the "current" C++ language, an ISO document for C++ 23 will be published probably later in summer 2024. Nobody cares. GCC volunteers working on their compiler don't care obviously. Microsoft employees paid a substantial amount of money specifically to work on MSVC as their day job don't care. Nobody cares, the ISO document is purely an exercise in vanity. If you try really hard to buy the "official" ISO publication as an actual paper document, most likely after months of struggle you'll get... an apology and a link to the sprawling PDF for your money.
You absolutely can buy the ISO publication (though I wouldn't recommend it). Also, given that a new version of the C++ standard is published roughly every three years (and named every three years), I'm not really sure what your point is.
Implementors will (mostly) implement the new standards in due time and it actually does matter what the standards say. So what is it that nobody cares about here?
You can buy the PDF. If you're under the impression you can buy the paper document you're most likely just looking at some Print On Demand automation and, if you gave them the money their actual POD backend would look at the PDF and say error, that's not a document I can actually make [it would be about 2000 printed pages] - hence as I said you'll get an apology and a PDF link.
Nobody cares about this document. The implementers do not, in fact, go back and change their implementation to match mistakes in the document, they might eventually fix the document instead, but honestly it just doesn't matter. Given the work expended to write this document that's very stupid, but it's not my fault, if it were up to me obviously they wouldn't do it. The implementers do care about consistency, and so there are cases where they resolve an inconsistency by comparing against the document, but these aren't common and if the general feeling is that the document is wrong, well, it's wrong. They're just words, if you can be bothered you fix the document, you don't pretend that somehow it's sacred and your compiler should be altered to match the error.
On most of the difficult questions, the document just quietly avoids answering. In a huge document you can easily just assume there's an answer somewhere else, unless you knew there isn't. Maybe the ISO document explains how provenance works somewhere right? No? No. That's tricky, there are powerful forces who want different things, much easier to just not say anything.
The authors themselves concede that they're not really very careful with the language. Did they write "can" for a normative declaration here, even though the document insists it will use "should" not "can" when making normative declarations? Well, too bad, I guess somebody wasn't careful enough when editing that part. Nobody is going to re-issue the document; if you make it your life's work for a year or two you can probably fix it, but you will be met by exasperated sighs everywhere. Nobody cares. You are trying to make them care; they will resent you.
haskell is great if you enjoy learning haskell. there's always a new build system, a complete redesign of how things are done, etc. i lost the plot around the time everything became an arrow (or was it a lens ?) myself, but I'm sure a few more generations of improvement have happened already!
it just seemed like the tutorials just pivoted from monads to lenses to arrows. but fair enough, I never did "get" haskell. I loved the functional aspect of functional problems (e.g science equations), but once it came to carrying state, everything just turned into a mess for me.
My favorite thing about Lua is that it trivially builds anywhere with a C compiler. Lots of other languages that bill themselves as "embeddable" are a real pain in the ass to build on weird platforms.
Fun example: there's a project (Lunatik) that embeds Lua inside the Linux kernel, where no userspace C APIs are available. You'd expect that would require extensive patching but they mostly just had to tweak the makefile and a configuration header file.
> However, when it comes to separating code into modules or packages, Python is more convenient.
Kind of curious about this. I find packaging in Lua quite convenient. I make a file and return either a value or a table full of values and then I require that file in the dependent file that uses the package.
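For context, the whole pattern fits in a few lines. Here's a minimal sketch (module name illustrative), with the two-file layout collapsed into one runnable chunk via `package.preload`, which `require` consults before searching the filesystem:

```lua
-- Contents of "mymath.lua" in a real layout: a module is just a
-- chunk that returns a table of values.
package.preload["mymath"] = function()
  local M = {}
  function M.double(x)
    return x * 2
  end
  return M
end

-- In the dependent file:
local mymath = require("mymath")
print(mymath.double(21))  -- 42
```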
Also, with regard to missing features like an increment operator: it is possible, in at most a few thousand lines of Lua, to implement a lua' -> lua transpiler where lua' adds some missing features, such as increment operators, in an idiomatic way. In other words, the generated code looks almost exactly the same as the input code, except in places where, say, `x++` gets translated to `x = x + 1`. As long as your file is less than a few thousand lines of code, the transpiler can run in under ~200ms (and obviously much less for shorter files, or much faster if the transpiler is implemented in a faster language). Of course the tradeoff is that the original source code may break existing tooling for Lua, though the generated code will work fine.
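A minimal sketch of one such rewrite rule -- a naive single gsub pass, not a real tokenizer; a production transpiler would need to parse properly so it doesn't rewrite inside strings or comments:

```lua
-- Rewrite "x++" (and "tbl.field++") into idiomatic Lua assignment.
local function transpile(src)
  -- capture an identifier (possibly dotted), then consume the "++"
  return (src:gsub("([%a_][%w_%.]*)%s*%+%+", "%1 = %1 + 1"))
end

print(transpile("counter++"))  -- counter = counter + 1
```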
makes me think that they haven't used python in a production environment. It's a mess: pip, poetry, stupid eggs... and that's just deployments; then there's personal dev environments.
Guess you never had to fight luarocks then. Everything’s easy when you don’t need to pull in any dependency (which is the case for the most common Lua use cases: host application provides helpers the best they can, you write your own code for everything else). Hell, C’s package management is great if you have nothing to manage.
The first and foremost reason for the complexity of Python’s dependency story is that it’s powerful and people use it a lot.
> Kind of curious about this. I find packaging in Lua quite convenient. I make a file and return either a value or a table full of values and then I require that file in the dependent file that uses the package.
While that aspect of it is nice, I think the way paths are handled by default can get annoying.
Since everything has to be resolved relative to the current directory, it's not very convenient to move files around or to make a self-contained module that depends on its own submodules.
If you're in charge of your own environment, sure, you can roll your own thing, but then you deviate from the norm and your code becomes less portable.
There is also the LuaRocks package manager, which I believe is decent, but it's largely ignored by a big portion of the Lua community.
> Since everything has to be relative from the current directory, it's not very convenient to move files around or make a contained module that depend on its own module.
What? Not in my experience. For instance, I have lpeg installed in ~/.luarocks/lib/lua/5.4 and LuaXML in /usr/local/share/lua/5.4 (just to name two modules I use). To use them, it's just
local lpeg = require "lpeg"
local xml = require "LuaXml"
Yeah so this is both the "in charge of your own environment" and using luarocks, which is ignored by a big portion of the lua community.
A common situation is wanting to distribute a script that can be loaded and run by a lua-using environment, say a game client or something along those lines. What can you depend on? Can you expect or demand that users of your script have luarocks installed? If not then what is the path? On windows too? Do you maybe need to vendor that? Can you even? If the module is pure lua then fine, but lpeg isn't. Is the OS going to allow the host program to dlopen an unsigned C bin?
Lua is used for so many different things you could possibly never run into any of this stuff. But in my experience it's a major headache for a lot of uses. Partly this is a consequence of lua succeeding on its own terms, and being embedded in a lot of highly variable situations. But it still sucks in practice.
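When you do control the entrypoint, the usual workaround is to prepend the script's own directory to `package.path` so vendored pure-Lua modules resolve regardless of the host's defaults. A sketch, assuming a standalone script (`arg[0]` is typically not set inside embedded hosts, which is part of the problem described above):

```lua
-- Derive the directory of the running script from arg[0]; fall back to
-- the current directory when there's no path separator in it.
local dir = (arg and arg[0] and arg[0]:match("(.*[/\\])")) or "./"

-- Search the script's own directory before the host's default locations.
package.path = dir .. "?.lua;" .. package.path
```

This only helps for pure-Lua dependencies; C modules like lpeg still need `package.cpath` entries and a binary built for the host platform.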
I’ve used teal before, the statically typed Lua that is mentioned at the end. It’s actually really good! There are some limitations (especially around generics) that can be annoying, however. Overall, I really enjoyed using it and would use it again.
I haven't used the static version, but almost everyone I know who has used Lua loves it. There is a beauty to its simple, bare-bones approach to programming.
While I do think Teal is pretty cool, I think adding gradual typing by having a static language that compiles down to a dynamic language, TypeScript-style, is the worst way to add gradual typing support.
The extra compile step and the added complexity kind of kill the advantages of using a simple dynamic language like Lua in the first place. You can only compile a file if the types are correct. You might think this is good but no, not at all. This is horrible for prototyping, because sometimes you want to test a specific path even when you haven't yet refactored the rest of the program.
It is much less hassle to just add type annotations as comments and have the linter figure it out. So you can have your cake and eat it too, having both type safety when you need and want it and the power and flexibility of dynamic typing in one.
Now, an even better way is probably what Luau is doing because you also have type information available at runtime which then can be used for reflection and performance. Especially as they have an option to disable the type checking if needed.
It’s kind of common to use the TypeScript compiler purely as a linter these days even in a TypeScript-only codebase. You actually want to use a faster transpiler anyway, and when you do, there’s no technical reason not to run the code or the test suite while you still figure out the types. In fact, I’m becoming convinced that this kind of setup (and way of working) is actually the best of both worlds.
Eh, I think I disagree with this. The point is that I can make very complicated libraries in Teal, and those compile down and are usable in lua scripts. This is where I actually used it - very simple scripts, very large and complicated library imported into those scripts. If you really need to prototype, just use lua, not teal. I've never found prototyping in teal to be a problem though.
On the other hand, I absolutely despise type annotations as comments. There is something so unclean about that - if I want a readable library, I want there to be type proofing all the way down. No switching between static and dynamic.
I don't think Teal can be total. Lua is super expressive, and you would need to get deep into dependent types and whatnot to capture a fraction of the power of its types.
> Some Lua libraries use complex dynamic types that can't be easily represented in Teal. In those cases, using any and making explicit casts is our last resort.
Personally if I paid the cost of having an extra compile step then I would rather wish for some serious type safety like what Elm or Reason provide. But then you end up with a different language.
I agree though that for your use case of making complicated libraries Teal might be actually good fit.
I was going to argue that Luau is superior anyway if you really can't cope with annotations but obviously if you want to make libraries for lua users that doesn't work. (Though I guess you could technically compile down from Luau to Lua, just not sure if anyone has ever bothered.)
Nope, you can embed it. Teal is written in teal, and compiles down to lua - so you can simply put teal as a lua dependency. I used it with gopher-lua (Golang lua environment). https://github.com/teal-language/tl/pull/523
I have a lot of technical respect for the project, especially in how carefully they have selected a goal and always develop towards it. And gaining expertise in lua is almost like taking a master class in language design: you are nearly forced to learn about the underlying constraints and how they have been addressed. It definitely improved my understanding of coding more than most languages have.
All that said I simply don't like it very much and don't think it's a good choice for most things I see it used for. When I see people talking about how much they enjoy it it's usually on small projects, or solo projects, or both. The simplicity of the language and variety of runtimes and distribution methods means every significant lua project is a totally unique framework with homegrown standard library.
It also really badly needs better string handling utils. I appreciate the constraints they're working under and admire the decisions they've made in respect to them. But one of its most common practical uses is as a DSL or scripting extension to another environment. I agree with the decision not to include regex, and patterns are incredible for what they are but they aren't sophisticated enough for many cases. Some parser rudiments or even a peg engine built in would completely solve the problem. But if that's too big just a dozen or so common string tools would go pretty far.
The problem with "lack of classes" is that every non-trivial program makes its own implementation of something-like-classes, but built on top of metatables.
So before you can start hacking on anything in Lua, you have to first understand the class system, which may or may not be documented.
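The pattern itself is short -- here's a minimal sketch of the "something-like-classes" that metatables enable (names illustrative), which every codebase then wraps in its own conventions for inheritance, constructors, and so on:

```lua
-- A minimal "class": methods live in a table that also serves as the
-- metatable, and __index routes missing lookups on instances to it.
local Point = {}
Point.__index = Point

function Point.new(x, y)
  return setmetatable({ x = x, y = y }, Point)
end

-- The colon syntax is sugar for an explicit self parameter.
function Point:length()
  return math.sqrt(self.x * self.x + self.y * self.y)
end

local p = Point.new(3, 4)
assert(p:length() == 5)
```

Two codebases might implement this identically, or one might add inheritance chains, `__tostring`, memoized method caches, etc. -- which is exactly the documentation problem described above.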
I have used it in production (embedded/mobile), and I liked it a lot. Its associative tables are just awesome.
The problem with Lua is that object-oriented programming is not necessarily a first-class citizen (you have to create it from scratch with object prototypes) and that Python occupies the same space and has sucked the oxygen out of it.
Think of Lua as a much leaner/simpler Python, better suited for embedded situations.
Yes I love fennel for this. I've done a lot of professional lua work and I don't enjoy it as much as people here tend to.
Fennel is an astounding technical accomplishment, laser-focused on the worst practical issues with lua and completely avoiding lisp nerd quagmires. It's extremely practical too: you can hook the compiler into the lua module loader, and since it's just a syntax layer over lua semantics you can drop it into any lua project. Being able to freely share functions and tables between the two makes working on legacy lua code so much more comfortable.
I've used it extensively and have almost nothing bad to say about it, extremely rare for me with any programming language.
After 10 years and many projects, the greatest with 40k lines of code including a time-of-intersection-solving single body physics system, I have come to know how to use Lua for my purposes. Even so, from the moment I started watching Rust hype videos on YouTube, I knew I would eventually be converted into a Rust fanatic. At the end of a 2 year project to rewrite my Lua game with a new physics system, I felt it was a good time to learn Rust (and rewrite the game again in Rust). So far I have spent 6 months in Rust and my old tool looks like a toy. How did I ever live without iterators, algebraic data types, the safety of a strong type system, and especially separate types for arrays and hashmaps for god sakes?
Lua makes the scope of learning programming smaller compared to other languages, so it is probably fair to say that it is a good language to learn programming with. However, knowing the details of heap vs stack, array vs hashmap, and explaining that to someone learning programming for the first time within a language that attempts to hide those details is frustrating. I can't see the smaller picture and view a table simply as a thing that you get and set values from, I can't see the weak types as anything more than an annoying source of bugs at runtime, and I crave the borrow checker which saves me from myself.
My 10 years of Lua set me up to appreciate Rust's fantastic design choices, and I'm having a great time in Rust land. I wish to remain in Rust land but my finances demand me to use my Lua skills at least a little while longer. End of ramble
The fanaticism of Rust programmers is well known and off-putting. Why, one of them just posted about it in a Lua thread. It couldn't be less relevant. Much as I don't face a hard choice between assembly and PHP for a task, Rust and Lua are just not in the same problem space.
I read his post less as Rust fanaticism and more someone who has learned their first programming language isn't perfect - strong types? iterators? These are all things that are present in many languages, it just sounds like Lua is lacking them and he is discovering some of the cool features of other languages.
It does not. The most viable solution is transpilation; Teal is the most recent candidate.
You can have a hack of sorts using a schema and tables as in
local schema = { a = "number", b = "string" }   -- key = expected type
local data = { a = 4, b = "default value" }
local tbl = setmetatable({}, {                  -- the proxy table itself stays empty
  __index = data,
  __newindex = function(t, k, v)
    if schema[k] and type(v) == schema[k] then
      data[k] = v
      return
    end
    error("type does not match schema")
  end,
})
tbl.b = 5  -- errors: b must be a string per the schema
There are certain checks you can do with functions; the LSP is also really good at telling you when you've gone wrong.
People complaining about Rust evangelism is about as much of a trope as the Rust evangelism.
Though I would rather spend my days listening to people talk about what they love rather than what they hate, so your response is more tiresome than the person you're replying to.
I think they are. Rust is a general purpose programming language, and Lua is a language specifically designed to be embeddable. But that means if you're making a project that embeds Lua, like Factorio for example, you have the choice of what language you write the code in. The choice is whether to write a function as native or to write it in Lua, and that choice might not be super clear unless someone else already made that choice for you.
This is completely different than assembly vs PHP. Assembly is a 'don't use it unless you have to' programming language, while PHP is a programming language designed for website building. You wouldn't use assembly to make a website because there are no frameworks for it and you'll scream and cry. You wouldn't use PHP to write the entrypoint to your operating system because PHP can't do that.
Rust and Lua are a lot closer together than PHP and assembly and it often would be reasonable to compare these languages for the same problems. It's not relevant at all to the blog post though because the engine they're using is built for Lua, and it's not really up to them unless they want to work against the engine and try to use a different language.
I think he was just offering the perspective of someone else that has also written a lot of Lua.
> The fanaticism of Rust programmers...
Honestly the endless criticism of Rust "fanatics" is far more tedious than anything Rust developers say. Rust is a fantastic language. Do you expect people not to talk about it?
> Honestly the endless criticism of Rust "fanatics" is far more tedious than anything Rust developers say.
We can agree to disagree. I am sick and tired of Rust people screaming at the top of their lungs in every possible place about how their language is great and wanting to rewrite everything in it.
So yes I expect people not to praise Rust in a Lua thread, it's completely off topic.
Rust's ecosystem is also very sporadic. It seems everyone jumped on board in the gold rush (and still do), reinvent the wheel in some package to lay claim, and then abandon it when it's 70% there, once they get bored and/or realize Rust doesn't magically solve programming.
That isn't really about people abandoning GUI development. Most of the big Rust GUI efforts are still going. The problem with Rust GUI libraries is that Rust isn't really old enough to have mature ones yet.
> The problem with Rust GUI libraries is that Rust isn't really old enough to have mature ones yet.
Rust is between 14 and 18 years old now, depending on who you ask. [0]
If anything that's a testimony to what bmitc wrote.
> Rust's ecosystem is also very sporadic. It seems everyone jumped on board in the gold rush (and still do), reinvent the wheel in some package to lay claim, and then abandon it when it's 70% there once they get bored and/or realize Rust doesn't magically solve programming.
Rust 1.0 was released 9 years ago (the language wasn't stable and hardly anyone used it before that). 9 years is a tiny amount of time to develop a Qt-level GUI toolkit.
Go 1.0 was released 12 years ago and it still doesn't have one.
I think you're vastly underestimating the enormity of the task.
I don't think folks wait for v1 to start GUI projects.
And regardless if we're talking specifically about GUI or not (original parent wasn't):
> ...reinvent the wheel in some package to lay claim, and then abandon it when it's 70% there once they get bored...
I think Rust is too niche to appeal to the masses and the community suffers from a RiR syndrome that tends to produce less than high quality projects on average. Because delivering high quality v1s takes a lot of effort and sweat. It is not fun.
> knowing the details of heap vs stack, array vs hashmap, and explaining that to someone learning programming for the first time within a language that attempts to hide those details is frustrating
These are implementation artifacts. Knowing how to program does not require any knowledge of them.
> and my old tool looks like a toy.
The great thing about toys is they are very easy to put down for a while and then pick back up again later with very little effort.
> The great thing about toys is they are very easy to put down for a while and then pick back up again later with very little effort.
That matches my experience.
I've written programs in Haskell, and it was very satisfying. But going back to the project after 6 months of work in other languages was very hard. It took some time to remember the concepts, the abstractions, and what the cryptic operators meant in their contexts. I've kept away from Rust for this reason: maintenance would be hard if I only dabble in Rust episodically.
On the other side, my own experience with Lua was not very nice. I contributed to (and patched for my needs) KOReader, an ebook reader with 180k lines of Lua. The lack of explicit types and data structures was a strong impediment. And the language has subtle traps... Everything is a table, but not all tables are equal: some tables are sequences (indexed by 1..n) with special operators like #, whose result is undetermined for other tables (which hurts like the famous Undefined Behavior of C++). With Lua, simple questions like "Is this table empty?" or "What is the size of a table?" are too hard for beginners.
So, complex artefacts are hard to go back after a long break, but many toys break easily when you come back after a pause, having forgotten how fragile they were.
I find Rust much better on this front than Haskell. Haskell has a much stronger culture of using library-specific operators and abbreviations for function, variable, and type names, than Rust does. In Rust, you can’t even define a custom operator.
If you understand borrowing and some of the basic traits, you’re a long way there. There aren’t new DSLs to learn or relearn, and the documentation is good at closing any gaps that open up while you’re away from Rust.
> Everything is a table, but not all tables are equal: some tables are sequences (indexed by 1..n) with special operators like #, whose result is undetermined for other tables (which hurts like the famous Undefined Behavior of C++).
IIRC, the right way to check if a table is empty is
next(mytable) == nil
As the sibling comment points out, the documentation of the # operator is complex. It returns a border, which is defined as:

> a border is any positive integer index present in the table that is followed by an absent index, plus two limit cases: zero, when index 1 is absent; and the maximum value for an integer, when that index is present. Note that keys that are not positive integers do not interfere with borders. When the table has multiple borders, the behavior is undefined.
{} is explicitly defined as a sequence of length 0 by the language spec.
But for any table which has a nil in it between two non-nil values - i.e. non-sequences - there are basically no guarantees on which of the several possible values # will return.
Yes, that's true. What isn't true is that "some tables are sequences" which have "special operators" whose results are "undetermined for other tables".
A table has an array portion, it has contents or it doesn't, you can check this as I illustrated (I didn't intend it to check if the table is entirely empty, the sibling comment to yours shows how to do that). `#` works on any table.
It is in fact the case that if you start sticking `nil` into the contiguous array portion of the table, you'll have problems: `#` and `ipairs` won't work correctly. That's part of why `false` (and therefore `true`) were added to the language.
It's a quirk you have to know to use Lua effectively, that's all. In years of programming Lua, I've never had a bug which originated from adding `nil` within the array portion.
As the only composite data structure in the entire language, it's reasonable to expect people to learn how they work if they're using Lua. The one correct observation in the post I was replying is that there's no obvious or cheap way to find the number of entries in a table (other than the array portion). Luajit has `table.nkeys`, and that should have been imported into stock Lua.
There are a lot of anecdotes in this thread which amount to "my experience with Lua was as a means to an end: I was using it as a scripting language for something else, and it subverted my expectations in some way or another (mostly by not being Python)". Understandable in a way, but it should take half a day at most to read the entire manual and the online edition of Programming in Lua, and there's no sense in blaming the language because a user wanted to get something done and figured they would skip that part.
The point is that # is basically unusable for non-sequence tables because there are no guarantees on which border it will return. But at the same time, when that happens, you do not get any clear error, either. That is the nasty part here, and it is exacerbated by the lack of any kind of static or even dynamic typing to distinguish sequences from non-sequences - e.g. Python is also a very dynamic language, and has a rich set of built-in data structures, but there's no point at which len() will silently do the wrong thing for any of them. Either a given collection type supports it, in which case you get a meaningful value, or it doesn't, in which case you get an error. In Lua, if you ever get an invalid table as input, you will silently get the wrong value which is nevertheless indexable, and so when you use it you get another wrong value etc, and eventually your computation just produces garbage output with no obvious clues as to where the error was introduced. That is a clear design defect in the language.
As far as personal experiences, I can't speak for OP, but I've learned Lua in early 00s, long before Python. As languages go, I don't think it's the worst dynamically typed language by far - it sure beats JavaScript, and Python could learn some things from it as well. But when it comes to long-term maintainability of large amounts of code, it is subpar.
> The point is that # is basically unusable for non-sequence tables because there are no guarantees on which border it will return.
This isn't true, though! The concept of a "non-sequence" table is not something Lua actually has, and if it means a table without an array portion, then `#mapTable` returns 0, no exceptions.
The ambiguity is if there is a sequence, and you stuck `nil`s into it. So definitely don't do that. Want to call that a trap for the unwary? I don't disagree. If you want to make entirely sure that you iterate all the keys, you use `pairs` or `next`. Not much else to it.
Laser-focusing on one case of malformed input is a strange move. If a Python dict doesn't have a key you expect, looking for it will throw an error, in Lua you just get `nil`. Which is better? No idea, but I know which I prefer, and it's the one that doesn't drag my entire program to a screaming halt.
Trying to create an artificial difference between "sequence tables" and "non-sequence tables" is exactly what I meant by wanting Lua to be Python and being surprised or offended when it isn't. There's just... tables. It's one of my favorite things about the language, in fact, because it makes for a very clean expression of ASTs. Metadata goes in keys, child nodes go in the array portion, everyone's happy. Works a treat for XML and HTML too: attributes are keys, child elements are in the array. In a language like Python, you need an `.attr` dict and a `.child` array, because if you just use a dict, you could have an attribute collision if there's a `child` attribute. It's an entire level of indirection which I don't have to deal with in Lua.
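Concretely, the attributes-in-keys, children-in-array pattern looks like this (tag names here are illustrative):

```lua
-- One table plays both roles: hash keys hold attributes/metadata,
-- the array portion holds the ordered child nodes.
local node = {
  tag = "div", class = "note",       -- metadata as keys
  { tag = "p", "hello" },            -- children in the array part
  { tag = "p", "world" },
}

print(#node)       -- 2 (child count)
print(node.class)  -- note
```

No `class` attribute can ever collide with a child, because children live under integer keys.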
Just don't stick `nil`s in the array portion. It's a mistake. You won't be happy. If you need a conditional branch while iterating, use a `false`. It's a cost-cutting measure that was taken to get a language runtime that fits in 70KiB and has fast arrays which are also dicts. I have several reasons why that's a good idea, you've got one reason why it's bad. I say don't do the bad thing. Simple.
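The "use `false`, not `nil`" rule in practice -- `false` is a real value, so the array part stays hole-free and `#`/`ipairs` stay reliable:

```lua
-- Build an array where "missing" slots are false, never nil.
local t = {}
for i = 1, 5 do
  t[i] = (i % 2 == 0) and i or false
end

print(#t)  -- 5: a proper sequence, despite the "missing" entries
```

Had the odd slots been `nil` instead, `t` would have multiple borders and `#t` would be undefined.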
Again, I've never had a single bug from bad `nil` insertion in a table. Used the language for years. I've had nil-related bugs, dynamic-type related bugs, and plenty of logic errors. Just never the one thing that people focus on so diligently anytime Lua comes up. YMMV I suppose.
Maintainability wise, I've seen no difference between Lua and Python, having written plenty of both, up until Python added annotations. A gradual type system for Lua would be an excellent addition, Luau-style but compiling to plain Lua. Teal exists, but it ain't quite it.
I linked directly to the Lua language specification earlier which precisely defines which table is a sequence and which isn't:
"A table with exactly one border is called a sequence. For instance, the table {10, 20, 30, 40, 50} is a sequence, as it has only one border (5). The table {10, 20, 30, nil, 50} has two borders (3 and 5), and therefore it is not a sequence. (The nil at index 4 is called a hole.) The table {nil, 20, 30, nil, nil, 60, nil} has three borders (0, 3, and 6), so it is not a sequence, too. The table {} is a sequence with border 0."
> If a Python dict doesn't have a key you expect, looking for it will throw an error, in Lua you just get `nil`. Which is better? No idea
The one that lets you easily diagnose an error at the point where it happens instead of silently producing incorrect output is better, naturally.
> Metadata goes in keys, child nodes go in the array portion
There is no "array portion", they are all keys, just some are numbers and some are not. It's not at all like XML & XDM, where attributes and child elements are completely different namespaces, so count(foo/bar) and count(foo/@bar) are two different things.
> In a language like Python, you need an `.attr` dict and a `.child` array, because if you just use a dict, you could have an attribute collision if there's a `child` attribute.
Python dicts map exactly to Lua tables. If you want to store data in this manner, you absolutely can:
foo = {"a": x, "b": y, 1: z, 2: ...}
But in Python usually you would instead do:
class Foo(list):
    def __init__(self, items, **attrs):
        super().__init__(items)
        self.__dict__.update(attrs)

foo = Foo([1, 2, ...], a=x, b=y)
and then:
foo.a, foo.b
len(foo) # only counts items, not attributes; None is okay!
OTOH if Foo semantically does not have child items, then you wouldn't derive from list, and then len(foo) would straight up throw an exception. And if your index is out of bounds, you again get an exception rather than None.
> Just don't stick `nil`s in the array portion. It's a mistake. You won't be happy.
It's not like anybody is deliberately writing something like {1, nil, 2}. But tables get filled with dynamically computed data, and sometimes that data happens to be nil (often because you e.g. computed an element by indexing another table, and the key was missing so that operation returned nil).
So now you have to always remember that and guard against it, because it is not an error to construct such a table, either. Which, again, is weird if it is "a mistake".
I agree with this, but my point was that I cannot hide the details from myself, and will bring them up when they are relevant and confuse my poor friend just trying to learn Lua.
Definitely very little effort picking up my toy again after a 6 month break
I haven't used Rust, but lots of C++ — do you ever find the strongly-typed and compiler-driven approach constraining, compared to Lua?
Like, in C++ if I want to change the API for something, I need to update a few different places (header, implementation, maybe some downstream typedefs, etc.), maybe recompile a library or two, then re-run the application, whereas in a more loosely typed and "interpreted" language like Lua, I can rely on duck typing and such, make my edits and save, and the code is immediately live.
The iteration cycle can be very short, if the system is built that way.
Anyway, maybe I just come from the opposite end of the spectrum, so Lua feels like a breath of fresh air sometimes vs. writing everything in a stricter language. My project is also very amenable to that, though.
As a long time C++ developer with only a few years in Rust, C++ is downright painful in comparison. However, I still cannot iterate in Rust as quickly as I can in Lua, unless the project is large and complex, then I'd argue that I can work faster in Rust. Languages like Python, Lua, Lisp really struggle in large projects because it becomes way too easy to break things unknowingly.
I question this belief for Lisp, because for instance SBCL says quite a lot of things at compile-time. See also the new Coalton (Haskell types), handy language features and the fast debug loop.
I know that I would have found strong types constraining when I first started learning Lua in 2009, but do you ever really need to change the type of a variable after you create it? I did a lot of hacks: ternary true/false/nil values, drive-by appending random state to tables it didn't belong in, to be recalled later in some other random place in the code. The lack of rules makes even your own ideas of structure easy to cheat and subvert.
I think your comment about opposite end of the spectrum has merit, because I find the strict rules to be refreshingly binding to my coding ideals. Rust is ergonomic to think in, and I have even used it to prototype things before implementing them in Lua.
To me, Rust vs Lua is changing a struct and then simply following the chain of compiler errors instead of trying to remember every last place a mushy table gets manipulated.
The iteration cycle of strongly typed languages can be very short too, if you pick the right technologies. Rust (and C++), unfortunately, have pretty long compile times compared to, say, Go.
The problem with loosely typed languages is that you don't find out until runtime that you have a problem. And if the problem is inside an if statement that users only hit 1% of the time, you might not find it at all except in the form of users very occasionally complaining that it sometimes crashes unexpectedly.
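A tiny Lua illustration of that failure mode (the field names are invented for the example): the typo in the rarely taken branch loads fine and only blows up at runtime, when that branch actually executes.

```lua
local function handle(order)
  if order.kind == "limit" then   -- the 99% path: works fine, tests pass
    return order.price
  else                            -- the 1% path hides a typo
    return order.pricee * 2      -- `pricee` is nil: arithmetic error at runtime
  end
end

-- The common path passes every test:
assert(handle({ kind = "limit", price = 10 }) == 10)

-- The rare path only fails when a user finally hits it:
local ok = pcall(handle, { kind = "market", price = 10 })
print(ok)   --> false
```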
IDEs are getting pretty good at refactoring these days. I make changes like that in Visual Studio and it has gotten remarkably good in the last few years.
> So far I have spent 6 months in Rust and my old tool looks like a toy. How did I ever live without iterators, algebraic data types, the safety of a strong type system, and especially separate types for arrays and hashmaps for god sakes?
I programmed in many languages with similar features (C#, Java, Swift, Objective-C) and programming in Lua is refreshing to me.
In daytime I can deal with doing C# at office, but at home, Lua is more fun to me. Great for hobby projects. Would probably love programming full-time with Lua as well (if I could make money with game dev).
Balatro is another game written in Lua, using the Love2d engine. About 30,000 lines of code (including comments) you can read if you unpack the Steam distribution. Great game and runs very well on many platforms.
I don’t know whether or not it’s still the case that at Planimeter we ship the largest pure Lua game engine published on GitHub, but I remember our biggest issue being rather unrelated to the language itself in that game development has some real performance concerns that are hard to measure even with standard tools.
I've been wanting to pick up Lua for ages. My only issue, as always, is where to start. Is there a runtime I can install where I can start creating with Lua? Cross-platform, Windows and Linux? That will allow someone with no C/C++ experience to start doing stuff with it?
I'd suggest working with LuaJIT as well because of the C API (you can easily use something like SDL and build metatypes around them) with the caveat that it's stuck at an older version of Lua[0,1].
I have used Lua a long time ago and I don't remember ever touching C/C++. You can even make simple games without ever going low level thanks to https://love2d.org/! If I recall, most https://stabyourself.net/ games are based on Love2d.
If you're using Linux/macOS, you can copy-paste these commands into the terminal and you should be all set:
curl -L -R -O https://www.lua.org/ftp/lua-5.4.6.tar.gz
tar zxf lua-5.4.6.tar.gz
cd lua-5.4.6
make all test
You should find the lua executable inside the lua-5.4.6/src folder.
Lua is just perfect for application scripting. It fills a real market gap. It's astoundingly simple compared to any other language. In my mental map of programming languages, it's C but garbage collected, it's Python but simple, built for the express purpose of application scripting.
You can have and mutate global variables in Lua. I'd never voluntarily write in any language with global mutable shared state. But after using Lua in my job for several months I've realized that mutable global shared state really is the right tool for job scripting.
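A sketch of that shape, with hypothetical names (`price`, `position`): in an embedded job script, the host injects globals before each run, the script mutates them freely, and the host reads them back afterwards.

```lua
-- Hypothetical embedded-script sketch: globals ARE the interface.
price = 101.5              -- injected by the host before each tick (assumed name)
position = position or 0   -- persists across runs if the host keeps the state

if price > 100 and position == 0 then
  position = 1             -- mutate shared state; the host reads it back after
end

print(position)   --> 1
```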
My experience with Lua is only writing Redis scripts.
Maybe I'm pampered, but I found its lack of editor support off-putting. I juggle between different languages (I'm writing a "full stack" book about building mobile apps, https://opinionatedlaunch.com), so I always need auto-complete when coding.
Edit: FYI I use IntelliJ with Lua support, and I'm comparing it relative to other languages that I also use (Java/Scala, Dart, Python, JS)
In other words, you don't like Lua, because your favorite IDE doesn't properly support it? (Or maybe it does, when you install the aforementioned language server)
> In other words, you don't like Lua, because your favorite IDE doesn't properly support it?
That's normal. Many - perhaps most - programmers lack appropriate mental tools to compare languages meaningfully. They instead use any kind of heuristic they find useful at the time. The never-ending discussions that mistake familiarity with readability are another example of this mechanism.
I learned Lua modding the game Homeworld and writing access control filters for NGINX. So easy to understand, mostly because Lua is pretty bare bones. I've always wanted to fit it into a project, but Javascript and Python are tough competition for a language like Lua.
Every config file becomes something close to a scripting language at some point. Build files, video game maps, etc.
E.g. you have a game with enemies spawning in some circumstance. The file that defines the level of the game now needs a bit of logic to specify that the enemy spawns given conditions. If you don't use a scripting language you'll have a field in that file that takes parameters for the conditions. Now do this 1000 times over for all the other little things you may want in the level. Pretty soon your file that defines the level is slowly becoming a shitty ad hoc scripting language.
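That is exactly the step where a real scripting language pays off: the level file just carries a small function instead of a pile of "condition parameter" fields. A sketch, with all names (`shouldSpawn`, `playerHealth`, etc.) invented for illustration:

```lua
-- A level definition that is plain data *plus* a little logic:
local level = {
  name = "crypt",
  enemies = {
    { type = "skeleton", x = 10, y = 4,
      -- arbitrary condition, no ad hoc parameter scheme needed:
      shouldSpawn = function(world)
        return world.playerHealth < 50 and world.torchesLit >= 2
      end },
  },
}

-- The engine just evaluates the predicate against the game state:
local world = { playerHealth = 30, torchesLit = 2 }
print(level.enemies[1].shouldSpawn(world))   --> true
```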
Jonathan Blow has shipped two games, Braid is very well liked and The Witness got more mixed opinions but certainly there aren't many people who think it's just terrible - more that it wasn't appealing for them. But two is a very small sample size, and the fact that The Witness annoyed more people isn't necessarily a great sign - given more time and money, Blow made a game he presumably liked better but others did not.
Maybe Jon will like Jai very much when it's done. Is that enough? In some sense maybe. But why should anybody else care?
In 2014 there weren't a lot of people doing this, maybe you'd look around and say "At least somebody is trying something". In 2024 there are plenty of programming languages to choose instead of C++ even if you specifically want to write video games. If you like roughly where Jai wants to be, no interesting ideas just do stuff we know works, there are even other languages in that space, including Odin and Zig.
Because Jonathan Blow programs in Jai, and his experience speaks to the tradeoff set of <technique X> when using Jai.
Since we plebians can't use Jai, that tradeoff is irrelevant.
For example, Jai has hot loading, IIRC (somebody please correct me if I'm wrong). The utility of an embedded scripting language is completely different if your tooling has really good support for hot loading of your engine implementation language.
Furthermore, because we plebians can't use Jai, we can't verify anything said about it in context, either.
Jonathan Blow criticizes the use of scripting languages in modern game development as outdated and inefficient, pointing out that they slow down performance, are error-prone, and complicate the development process with extensive engine integration needs. He notes that while initially appealing for allowing quick iterations and non-programmer accessibility, scripting languages often result in poor quality, bug-ridden code that requires significant oversight from professional programmers. Advocating for robust, compiled languages like C++, Blow dismisses scripting languages as a failed experiment and praises the shift towards more constrained and user-friendly visual scripting systems in platforms like Unreal and Unity.
(i don't mean to suggest that i agree with him, I just was curious)
Interesting. I feel like his opinion makes sense for first party code, but what if you want to support mods? I don't really have time to develop them anymore but I did a couple in QuakeC back in the day (I'm old) and really enjoyed it. IIRC it had lots of domain specific niceties like native vector types. One slightly hilarious thing is any loop that looped more than 100k times was killed with an infinite loop error. Actually saved me a couple of times.
I feel like if there was a nice modern statically typed language that was easy to embed I would go with that. Something like Dart maybe. But I don't think that's easy to integrate.
Was a lot of the functionality of Quake done in QuakeC, or was that scripting language added for the sole purpose of allowing mods to the game?
I think scripting languages are a very bad idea for the core functionality of a game and I see no benefit to them there. If it's done after you have nailed down all your core mechanics and just want to add mod support for things such as player-created maps, then I think that's fine.
That means that a scripting language should only be added towards the very end of a project. Adding it towards the beginning when you don't even know your core mechanics yet seems crazy to me since I see no benefit to it.
The engine was all C. QuakeC was used to script the game logic - how weapons behave, doors open, items move around, etc. It was reasonably powerful. You could make grappling hooks, proximity mines, etc.
I guess the core mechanics of Quake were very obvious. It wasn't a very innovative game in that sense; basically identical to Doom but with way fancier graphics, controls and so on.
Play any Bethesda game and you understand his point: designing your game to be largely scripted by less experienced programmers is error-prone.
At the same time, the success of Bethesda games despite these bugs is an argument against it mattering all that much. Not to mention mods, which don't strictly require a scripting engine but 100% lower the barrier to entry for community creations.
"accused" implies that it's a bad thing. I see nothing wrong with using AI for summarization.
But I also see nothing wrong with someone wanting to know, as a general disclaimer, whether a given text was human- or AI-generated. The passage does read to me very much like GPT's college essay-like speech patterns.
I got exposure to Lua with Cyberpunk 2077 mods and neovim plugins. Some stuff is indeed strange (like array indexes starting from 1 by default), but overall it's easy to use.
Not sure I'd want to use it for something big though.
Congrats to the author, but I really can't stand Lua. It takes its commitment to simplicity too far IMO, especially with tables not really distinguishing (in Python terms) between "dict-like" and "list-like" tables. Recursively searching through a table of tables can for this reason be annoyingly difficult.
Every time I write it, I find myself wanting a language with just a few more batteries included.
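A concrete instance of that annoyance (toy example): a single table can be array-like and dict-like at once, so a recursive walk has to fall back to `pairs()` and cannot assume ordered integer keys.

```lua
-- Mixed table: array part and hash part coexist in the same value.
local t = { "a", "b", speed = 10, nested = { "c", mode = "fast" } }

local count = 0
local function walk(tbl)
  -- pairs() visits everything but in no guaranteed order;
  -- ipairs() would silently skip speed, mode, and nested.
  for _, v in pairs(tbl) do
    if type(v) == "table" then walk(v) else count = count + 1 end
  end
end

walk(t)
print(count)   --> 5  ("a", "b", 10, "c", "fast")
```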
Lua is both cool and not, depending on how you look at it. I used it for several years before going elsewhere. I wrote a graphical toolkit with animations in it (based on pango/cairo), embedded it in a business-objects platform, and used it in the Quik trading platform for trading bots.
It’s very nice, but it has rough edges that never get addressed, despite being proposed in the mailing list in every proposal phase.
__newindex only works once, then it’s not “new” anymore. You have to proxy your table with an empty table and a metatable, also make sure you now implement all metamethods for your proxy table to be indexable, iterable, #-able, tostringable and so on. No __setindex for you. Every version of Lua has little changes in there (it uses no semver), so your 5.x iteration protocol/etc are bound to that x. Why would you upgrade then? Cause every 5.x lacks something you really want and it’s in 5.<x+1>. There’s also something you don’t want.
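The proxy workaround the comment describes looks roughly like this (a sketch with invented names; `__newindex` only fires for keys *absent* from the table, so to see every write you keep the proxy empty and park the real data in a backing table):

```lua
-- Track every write to a table via the classic empty-proxy pattern.
local function tracked(backing)
  local writes = 0
  local proxy = setmetatable({}, {
    __index = backing,               -- reads fall through to the real data
    __newindex = function(_, k, v)   -- fires on EVERY write: proxy stays empty
      writes = writes + 1
      backing[k] = v
    end,
  })
  return proxy, function() return writes end
end

local t, count = tracked({})
t.x = 1
t.x = 2   -- still intercepted, because the proxy itself never holds x
print(t.x, count())   --> 2  2
```

And as the comment notes, this proxy now needs its own `__len`, `__pairs`, `__tostring`, etc. if you want it to behave like the original table.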
“a and b or c” and “b or c” paradigm doesn’t work with boolean “false”, because it’s unsurprisingly as false as nil. It’s not haskell vibes, it’s a lack of ?: and ?? operators right there. Nil/false are basically second class citizens in Lua cause only these two share the same room, also nil is prohibited from any array.
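The `false` pitfall in two lines (standard Lua, nothing assumed):

```lua
-- Intended as a ternary: flag should be false when enabled is true.
local enabled = true
local flag = enabled and false or true

print(flag)   --> true, the wrong branch: `and` discards a false middle term,
              --  which a real ?: operator would not do
```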
Lua ignores most syntactic ergonomics. You can’t destructure a table, everything has to be “local a, b, c = t.a, t.b, t.c”. And vice versa: “{a=a, b=b, c=c}”, which is syntactically non-fixable due to table duality. Its “…” literal has inconvenient semantics of collapsing before another argument: f(…, x) is equivalent to f({…}[1], x). So you have to “local t = {…}; table.insert(t, x); f(unpack(t))”. No way you can get something like t.push(x) either, for reasons too long for a medium sized comment. Lua is also highly function-esque, but there’s no way to shorten “function (x) return <mylambda> end”.
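The `...` collapse is easy to demonstrate (names invented): in any non-final argument position, a vararg expression is adjusted to just its first value.

```lua
local function argc(...) return select("#", ...) end

local function demo(...)
  -- `...` is not in the last position, so it collapses to its first value:
  return argc(..., "x")
end

print(demo(1, 2, 3))   --> 2, not 4: only `1` and "x" get through
```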
Also, its GC is pretty much stop-the-world. They experimented with it in recent versions, but in 5.1 times you couldn't expect your complex scripts not to freeze for a substantial number of frames if you "wow so simple"d them and created {} here and there. It takes time to collect the garbage.
LuaRocks, Lua's package manager, tends to build everything locally, but your Windows box is out of luck ecosystem-wise. If you can't find a pre-built DLL, you're in the mud. That's Windows' fault, but your problem at the end of the day. Luckily you can make it yourself, because Lua has a great C API and LuaJIT has ffi. No /s here, I made all my wrappers and connectors myself.
Overall it’s a nice language idea, fully functional prototype, you name it. It works in the falling in love phase and has unique features for a non-esoteric lang. But don’t expect too much from it in a “marriage” if you’re one of those who want things to get out your way and you don’t get why an obvious feature cannot be done for ideological reasons while compatibility gets broken regularly anyway.
I understand that this post looks negative, but that’s because it enumerates only downsides, which are real, although sometimes subjective. I have pretty fond memories of it, but still had to move on.
I've used Lua twice professionally. The first time was to write "stored procedures" in Redis (I forget the correct terminology). It allowed us to improve the performance of Django endpoints by an order of magnitude.
In the second, I wrote a Wireshark plugin (also not sure of the proper term) to dissect a proprietary protocol.
I don't remember the details, but this is not a programming experience I look forward to renewing... (Same feeling about Perl, btw.)
Possibly indices starting at 1 were the most disturbing.
"it is customary in Lua to start arrays with index 1" [1]
Not being a native English speaker, I may have phrased something wrong. I find "one-based indexing" in arrays not particularly intuitive, and error-prone.
No, what you phrased is a common argument, but IMO it doesn't follow from natural language. The fact that C uses pointer offsets as its "indexes" is IMO the truly counterintuitive thing.
Obviously I use 0-based indexes for most of my programming. I have not found 1-based indexing to be a serious problem, since Lua is a high-level enough language that I'm rarely doing complex index arithmetic.