In 2010 I worked on building a "dynamic" web app that would run on a very tiny embedded device -- no MMU, about 130 bogoMIPS. Think of an old wireless router you have at your parents' place.
There aren't that many options for building a "modern" web backend on something like this -- the most popular being C compiled into executable(s) served via thttpd and CGI.
Things like templating, or JSON parsing for that matter, would be much easier in something higher level than C, but python, PHP, node, etc are simply too huge and slow.
I discovered Lua, and it fit the niche exceptionally well. The syntax and the data structures take a while to get used to, but once you do, they feel rather powerful, especially for such a lean interpreter. It compiles into something like a 120k statically linked executable and is (relatively) phenomenally fast -- not only for the tiny computer on which I had to run it, but also for me, who had to write a complicated web app using it. I almost felt like a demoscene developer -- I was able to run something that would otherwise require a full-featured machine on a tiny embedded board. Even to this day, LuaJIT, OpenResty, and Lapis remain up there on the benchmark tables when it comes to web apps.
> Things like templating, or JSON parsing for that matter, would be much easier in something higher level than C, but python, PHP, node, etc are simply too huge and slow.
FYI, there are some nice (small, performant) JavaScript runtimes now. A typical Moddable[1] xs engine app needs less than 200KB of ROM, runtime included, and a few dozen KB of RAM to run.
One thing that I like about Lua over Javascript is how self-documenting, fun to write, and easy to read it is. It has a very simple syntax, but the extensibility of its core structures (like tables) let you accomplish a lot with that simplicity.
Its 3rd-party libraries also tend to be closer to C/C++'s "drop-in" solutions like STB[1] than the enormous JS ecosystem which often requires extra tools like Node and/or extra packages like JQuery.
Admittedly, Lua's ecosystem is much smaller than Javascript's, but in my experience Lua is vastly easier to maintain. You can certainly write bad Lua, but it's one of the only languages where I don't implicitly dread reading other people's code.
It is sort of annoying that tables which are treated like arrays are 1-indexed by convention, though.
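To make the 1-indexing convention concrete, a minimal sketch in plain standard Lua (no libraries assumed):

```lua
-- Array-style tables start at 1 by convention; both the length
-- operator (#) and ipairs assume it.
local t = { "a", "b", "c" }
print(t[1])    -- "a", not t[0]
print(#t)      -- 3
for i, v in ipairs(t) do
  print(i, v)  -- 1 a / 2 b / 3 c
end

-- t[0] is perfectly legal, but invisible to # and ipairs:
t[0] = "zero"
print(#t)      -- still 3
```

Nothing stops you from using 0 as a key; the standard library just won't follow you there.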
Does C/C++'s STB or Lua's 3rd-party libs cover cross-browser quirks and HTTP methods like JQuery does? It sounds like Lua/C and JavaScript are used for totally different things. And Node isn't an "extra tool"; it's a server. No doubt JS projects sometimes come loaded with excessive dependencies, but Node/JQuery are bad examples and not equivalent by any stretch to STB.
Lua doesn't really run in browsers (except via Emscripten or WASM, and also Fengari, a Lua interpreter written in JS with interop and GC sharing), but for HTTP, luasocket / cqueues / the OpenResty libs are good, and for serving, OpenResty also seems good.
The equivalent for the stb libs would be cross-platform support, and in my experience they've worked for their advertised purpose on Mac, Windows, Linux, Android, and iOS.
Nope; Lua is not a good choice for a website's frontend.
But people use Javascript for a lot more than just websites, and I've lost count of how many times I've been frustrated that a JS library is distributed exclusively through package managers like NPM even when the library is not specific to a browser.
NPM is definitely usable as a package manager for frontend Javascript applications, just needs a little more complicated webpack or similar setup than a node.js app.
Lua's feature set is tiny, and that's precisely what I love about it. Ruby feels a lot like Lua, but I vastly prefer Lua to even Ruby, because of how minimal it is. As someone who writes ES6 on a daily basis, I feel that Lua's syntax explains itself -- you can start writing Lua code in 30 minutes. People have always said that Python is fun to write, but I've never felt that about Python. On the other hand, I can definitely say that Lua is pleasing to write in. Python, to me, has always felt like a chore compared to Lua.
Funny, it's Lua that feels like a chore to me, because most things take around twice as many lines of code as in Python. (Though some of those lines are just 'end'.) I want to like Lua more because it's so much simpler and faster, but it could've come closer in expressiveness and catching errors.
I'm on the Python side of the fence; even though it's not my favorite thing by far, I rapidly clicked with its idioms. I like Lua for all the reasons people say Lua is great, but I always forget it :)
Started coding in Lua 2 months ago and instantly fell in love.
Higher order functions, coroutines, the ability to return multiple values from a function, no coercion on comparison operators make coding in Lua a breeze.
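For anyone who hasn't tried it, a small taste of two of those features in plain standard Lua:

```lua
-- Multiple return values, no tuple type needed:
local function minmax(a, b)
  if a < b then return a, b else return b, a end
end
local lo, hi = minmax(9, 4)   -- lo = 4, hi = 9

-- Coroutines are first-class values; wrap gives you a resumable function:
local squares = coroutine.wrap(function()
  for i = 1, 3 do coroutine.yield(i * i) end
end)
local s1 = squares()          -- 1
local s2 = squares()          -- 4
local s3 = squares()          -- 9
```

Functions themselves are plain values, so higher-order style falls out for free.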
You can learn most of the language within a day.
Ofc not everything is perfect: the package ecosystem is growing but still small compared to other languages.
Anonymous function definitions are pretty noisy.
Still I think it's one of the best designed languages you can find.
That is a nice euphemism. The lack of something even remotely resembling a standard library is what keeps me from using lua more than I absolutely have to.
Because more complex constructs can be created so easily in Lua, everybody happily copy/pastes half-baked stuff from the web. That means there is no common style that might keep things consistent across code bases, so Lua remains a "Bring Your Own" language: no proper arrays, no easy exceptions, no easily recognizable classes ("but it is so easy with setmetatable etc."), iterators, prettyprint...
Plus the various annoyances, such as nil access not being an error, no += (I read all the reasons against it), the weird "false" concept, the fact that tables are -- if you look closely -- neither arrays nor dictionaries, and "require" imports that are not modules but more akin to #include.
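For readers who haven't hit the nil-access complaint, this is what it looks like in practice (plain Lua):

```lua
local config = { retries = 3 }

-- A typo silently reads as nil instead of raising an error:
local r = config.retires        -- nil, no complaint from the runtime
print(r)

-- The error only surfaces one level deeper, far from the typo:
local ok, err = pcall(function() return config.retires.max end)
print(ok, err)                  -- false  "attempt to index a nil value ..."
```

The failure shows up wherever the nil is eventually indexed, which may be a long way from the misspelling.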
In my opinion, Lua 6 should drop some historic weirdness (~=!), bring the core language more in line with JS or Python, and provide a standard library -- even if components there sacrifice some of the holy performance.
Luvit is a Node-copied environment, which nullifies the "no library" downside. Lua doesn't push it, but it is there and it even works. As for the missing classes or "weird" parts: JS has a richer history of bullshit included, not included, or misbehaving than any language ever. I'm not going to enumerate it here. You just get used to it and then can't stand it in others. JS prototyping is much dumber, though, so it can be hacked by 5-minute newbies easily, unlike metatable/__index magic. But there is no such thing as "being in line" with JavaScript, since it draws no line.
You could write a parser that makes Lua JS-like (or even better) in a couple of hours, but that wouldn't make it popular. Including it in a browser would, though. But browsers still resist including anything viable -- RxRS Scheme-likes at least -- that could be a transpilation target, repeating that unsmart "maintaining complexity" argument. You're stuck with JS, and that's why the rants go on. If it had enough concepts, there would be fewer rants and more transpilers, but it is dumb af and oriented toward end developers rather than library, system, or meta-developers. At least WASM is a hope, but it hasn't been taken seriously in this regard yet.
Ps. require() is a regular one-time module-load routine, nothing to do with #include.
I don't understand what you mean by comparing require with #include. Lua's require only loads each module once, and it doesn't introduce any new variables into the local scope.
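A self-contained sketch of both properties, simulating a module through `package.preload` (a standard loader hook) instead of a real file on disk; the module name "mymod" is made up for the example:

```lua
-- Register a loader so require() can find "mymod" without touching the filesystem:
local loads = 0
package.preload["mymod"] = function()
  loads = loads + 1
  return { answer = 42 }
end

local m1 = require("mymod")
local m2 = require("mymod")
print(m1 == m2, loads)   -- true  1: cached after the first load

-- require returned a value; it injected nothing into our scope.
print(m1.answer)         -- 42
```

The cache lives in `package.loaded`, so a second `require` of the same name never re-runs the module body.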
Lua is amazing in that it is very small, portable, and easily embeddable. But aside from those three things, Lua is semantically almost identical to modern JavaScript.
I've embedded both and used them heavily in my various window managers, and these days I personally recommend embedding JavaScript over Lua if possible.
But when a small, portable and lightweight language is needed, there's also Sparkling[1], which combines the best ideas of JavaScript with the minimal footprint and portability of Lua.
> Lua is semantically almost identical to modern JavaScript.
Having spent a lot of time looking pretty hard at Lua, LuaJIT, and embeddable variants of JS (Duktape and the like), I think that's a bit of a stretch.
It's pretty trivial to induce JS memory leaks[1] that can be brutal on embedded platforms. We had a prototype project written in JS and Lua, same functionality, and hit issues like this pretty often. Like most memory leaks, they weren't straightforward to track down, either.
For my money LuaJIT still reigns supreme despite the fact that it's not under active development any more[2].
All these "Let's do Lua, but right" languages fill me with both hope AND dread. This also shows what Lua could have become if a few more bold decisions had been made. But I guess without a large commercial entity behind it, that was inevitable.
Is there a list of Lua-ish languages (especially wrt. speed and easy C integration) somewhere? I did not know about this one, for example.
You might want to check out Squirrel[1]. It has some major differences when compared to Lua[2], such as a C-style syntax and true objects, but like Lua is known for easy embedding in C and C++. The Source Engine includes Squirrel as one of its supported scripting languages[3], and it's used in non-gaming applications such as the Code::Blocks IDE.
There is also a Squirrel fork, SquiLu[1], that adds some Lua features (like pattern matching and some of the stdlib), extends it with string I/O, has several extensions (sqlite3, http server, mpdecimal, axtls, curl, ...), accepts a subset of JavaScript/Java/C/C++, ...
What do you mean by "almost"? You cannot have fair separate threads of execution or substitute a function's global environment in JavaScript, so the "almost" cuts the other way. What features does JS have beyond syntactic sugar and second-hand techniques?
There's Duktape (https://duktape.org/), which, from skimming the documentation, seems very similar to Lua when it comes to embedding: it also uses the stack-based approach. Looks like there's some ES6/7 support.
Maybe, but my use-cases (window manager and other automation stuff) don't really have a strong need for speed or high performance. Mostly it just sits there idle 99% of the time and runs callbacks quicker than the UI that it manipulates is visibly updated.
Yeah, I think it does end up boiling down a lot to use cases, because the big differences between Lua and JS are in availability of libs/integration and perf. In my case I do a lot of 2D graphics, where love2d.org is available and perf is important. :) C interop is also really important, and the LuaJIT C FFI is waaaaay nicer to use than the JSC C API.
I’ve ended up doing a lot of JS with React Native for “app-y” mobile apps though.
If perf is important, I'm surprised you're using Love2d and not just using C++ with SDL and Box2d directly. Afaik Love2d just uses vanilla Lua (and not LuaJIT) which, while faster than similar languages, is still pretty wasteful of cycles when high perf is needed.
Perf is important, but I also want user scriptability (as part of the actual app features). Love2d uses LuaJIT (it does use the C API to bind its native calls rather than the FFI -- but that actually has better perf when not JIT'ing).
You seem like you might know the answer to this question that I've had about Lua:
I can find games, window managers, embedded devices, and a bunch of other stuff that can be scripted with Lua, but there doesn't seem to be anything like a Visual Basic equivalent that lets you lay out a GUI graphically and tie the controls and events together with Lua, at least not for native applications. Why is that? It seems like it'd be a good fit.
I have some familiarity with GUI builders. In general, those tend to be targeted at end-user consumer software like desktop applications, web services, and mobile apps. For game development and embedded systems, there is a strong preference for custom, bespoke stuff.

It's especially bad in embedded systems, which never had an open-source "standard library" until Arduino. The ecosystem is highly fragmented compared to, e.g., your average smartphone or PC. An industrial control panel UI using Material Design on a high-end ARM core won't necessarily run on a health tracker powered by an STM32 or Atmel chip.

For gaming, the story is a bit different. Lua is heavily used there, but because every game is different, the GUI parts tend to be done with custom designs. The type of UI builder used in game development is rather different from the sort used in desktop software, but there are plenty.

If you are looking for a minimal-resource-consumption language with good cross-platform rapid UI development as a first-class priority, I recommend you check out the Red programming language.
For Lua, its use as a mainstream application-development language is still in its early stages. The language follows the mantra of "move fast and break things," so backwards compatibility isn't a top priority, and this can be an issue in evergreen codebases like web apps (unless you are willing to lock to a specific version). Most web servers (most famously OpenResty, a Lua/NGINX hybrid) are still on 5.1 due to LuaJIT.
Given Lua's age and use in so many projects, I was under the impression that it was somewhat stable.
Basically, my interest in Lua is this: I'd like to build a personal computing environment that is oriented towards end-user empowerment the way I remember them being in the pre-web era. To that end, employing a relatively simple interpreted language to build as much of the environment as possible is desired such that the end user is able to read and modify large portions of it, as well as create their own tools. Something on the level of Visual Basic or HyperCard would be useful for the GUI portions, and Lua is popular enough to have a lot of learning resources available, so it was an option I was considering.
It is fairly stable in terms of syntax; it's just that features like Unicode support tend to be difficult to implement without deep changes. For OS dev, look into Node9 and
https://www.lua.org/wshop13/Cormack.pdf
There have been quite a few attempts at building OSes with Lua as the main system/userland language. There are very few Lua implementations that can bootstrap, so you still need a bit of non-Lua code as glue for the interpreter bits, if not for juggling pointers.
Thank you, I've joined your newsletter. This is exactly what Lua is currently lacking. When I have more time, I'll probably do some writeups on Lua, especially CI/CD and developing libraries.
> Higher order functions, coroutines, the ability to return multiple values from a function
These features are in almost any language nowadays, including most languages that AOT-compile to native code. Well, maybe without multiple return values -- with tuples instead -- but that's the same thing, especially if there's some primitive destructuring support.
Type coercion in comparison operators is quite rare too; even JS has `===`.
But there's no standard library at all, so everyone brings their own incompatible hacks (by copy-pasting snippets) and processes lists (hashes emulating lists, because there are no real lists) with a `for` loop. Even with a `for` loop, it's not that obvious how to loop over a list.
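To make the looping point concrete, these are the three common idioms, each with a different behavior around nil "holes" (plain Lua):

```lua
local fruits = { "apple", "banana", "cherry" }

-- 1. Numeric for over the sequence length:
for i = 1, #fruits do print(fruits[i]) end

-- 2. ipairs: walks the array part in order, stops at the first nil:
local count = 0
for i, v in ipairs(fruits) do count = count + 1 end

-- 3. pairs: visits every key, but in no guaranteed order:
for k, v in pairs(fruits) do print(k, v) end

-- The gotcha: punch a nil "hole" and ipairs stops early,
-- while the result of # on such a table is undefined.
fruits[2] = nil
local seen = 0
for i, v in ipairs(fruits) do seen = seen + 1 end
print(count, seen)   -- 3  1
```

None of the three is wrong; the problem is that a newcomer has to learn which one matches their data.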
I tried writing in Lua and can't stand it at all; even JS feels much nicer. Lua is minimalism for the sake of minimalism, and it heavily hampers practicality. Easily embeddable? Who cares if embedded runtime is 100 Kb or 2000 Kb?
Well, it might be a good language if compared to the BASICs of 80s home computers, which is why it's used in PICO-8.
> Who cares if embedded runtime is 100 Kb or 2000 Kb?
We ran Lua on the PSP, with 8MB of system memory, inside a pre-allocated block of 400kb. Did the whole game logic and state management with no hiccups (and this was before LuaJIT).
Not everything is desktop-scale; there are lots of applications in spaces like routers, wearables, and other constrained devices.
I've also seen few languages handle coroutines as cleanly and simply as Lua does.
pip really sucks; in fact, any packaging system discrete to its programming language sucks, because it bypasses the operating system's software-management subsystem (pkgadd, pkgsrc, rpm, et cetera). Then one has to have an imaging system like Docker to work around that, so developer convenience leads to one bad decision after another in the operational-repeatability space.
I used to install Ruby gems with apt circa 10 years ago but quickly gave up because they were outdated and I couldn't have different versions in different projects. Same with Python later on.
Things like virtualenv solve the problem of having multiple versions of the same package or interpreter in different projects. Rpm and apt can't do that; that's why developers use package managers for virtually every language. Furthermore, creating native packages for even the major distros and LTS releases is too difficult. There are also tools like asdf that handle multiple languages (and databases!)
Docker solves the problem of distributing the application. Virtualenv is quite weak about that. Docker can also be used to create separate environments on the development machine and install packages without virtualenv. We still need pip because nobody creates native packages. The same reasoning applies to every language I'm working with (Ruby, Node, Elixir, PHP.)
"Furthermore creating native packages for even the major distros and LTS releases is too difficult."
This is simply not correct: the truth is that people are lazy and don't want to spend the required time reading the documentation, the first thing a real engineer does. Instead, they just want instant gratification. They just want to hack. OS packaging is actually quite easy.
"We still need pip because nobody creates native packages."
I created over 170 CRAN R packages as RPM's. Was it time-consuming? You bet! But now it's done correctly. Now it's done cleanly. Now I don't need nonsense like Docker, and I could even plug them all directly into Kickstart and have Kickstart spit out ready-to-serve systems with a rich R environment on them. With no scripting or any code. Just because I made RPM's, the client now has an architecturally simple application ecosystem. Simple is robust.
I did not create .deb's because my client uses a RHEL-based operating system. The point is that I had to do it because CRAN simply decided to re-invent the wheel. I did not contact every single one of the 170 authors to create RPM's: if they really wanted to, they would have done that by now, not to mention it would be impractical for me to contact 170 people and attempt to explain to them what it means to design for operational maintainability.
Yes, I maintain the RPM's. It's easy because I always build .src.rpm's (SRPM's) by default, and a binary .rpm (RPM) ends up being a product of one of the steps in that process (`rpmbuild --clean -ba ...`).
But all of that is almost irrelevant. The relevant bit is that if done correctly, if done cleanly, as operating system packages, organizations need neither Docker nor "infrastructure as code" (copious amounts of amateurishly written glue in a scripting language) and the software is easy and fast to install and maintain -- for both system administrators and end users. I even do very large scale configuration management with operating system packaging, rather than with applications like CFengine, Chef, or Puppet.
Lua is a pleasing language to work with, but only that, unfortunately. It never caught on in desktop apps due to a lack of suitable GUI library bindings. It never caught on in web apps due to a lack of suitable web frameworks.
There are at least a dozen half-implemented or abandoned Lua GUI projects, including at least three Lua-to-Qt binding toolkits and one briefly maintained by PUC-Rio (where Lua originates). But the other day I wanted to write a simple GUI to support formatting documents with a certain LaTeX template, and I struggled to find one that could guarantee that my users would be able to run the software on their various OS platforms. There are even more half-working Lua web frameworks, including Lapis, Ophal, Orbit, Sailor, and Tir, three of which have had brief moments in the limelight as the "preferred" Lua framework. There are at least three projects that attempted to add types to Lua, and no less than six parallelism libraries.
If you view it as a competitor to Python, Lua is a case study in open-source community mismanagement. For every outstanding problem there are five projects that attempt to solve it, usually none of which has more than three regular contributors. There are many competing "Lua standard libraries", and after the falling out between Mike Pall and PUC and the controversial introduction of integers, there are four similar but distinct versions of the Lua language in common use: Lua 5.1, Lua 5.2, Lua 5.3, and LuaJIT 2.
But if you view it as a research project, Lua has been incredibly fruitful and continues to be. Any organization with the critical mass required to maintain their own internal ecosystem can use Lua without ever noticing the disarray in the wider community, and many do. It's just... annoying... from the perspective of the US/European open-source crowd.
Perhaps I'm uninformed, but I was always under the impression there were two camps:
Lua 5.1.5/LuaJIT users, and Lua 5.2+ users. Game developers, performance-sensitive developers, and FFI users all tend to use the former; when that's not necessary, I've seen people use the newer versions.
Is there more segmentation between 5.2 and 5.3 than I'm aware of?
I love that Lua uses one-based numbering, if only to point out undesirable developers who don't understand the difference between offset and count.
You don't work with pointer arithmetic directly in Lua syntax, so why would you need offsets?
Complaints about ~= as the negation of equality are just as petty. The syntax in question isn't just used by Lua, either, and it usually tells me that a developer can't respect differences between languages.
> The syntax in question isn't just used by Lua, either, and it usually tells me that a developer can't respect differences between languages.
Actually, the two examples you bring up are paragons of asinine design. I can't imagine it being easy to justify such design decisions; your attempt to do so was wholly unconvincing.
That "asinine design" comes from decades-old syntax practices in languages like ALGOL, Ada, and MATLAB.
Perhaps exposing yourself to other languages might inform your responses on arguably the largest technical forum in the industry, one with many developers from widely varying backgrounds.
Heh, you don't even try to justify it, just point out prior art in some irrelevant "languages". Yeah, definitely not easy to defend such choices. Even the fact that this turns off potential Lua programmers such as myself makes it do more harm than any potential good (and I am hard-pressed to find a single good thing about these asinine choices).
What is there to defend? Can you explain why it's "asinine design?" You don't actually elaborate as to why, and I've provided you both academic background in mathematics and prior art. I apologize if I've missed your point.
The fact that this discussion ALWAYS comes up when talking about Lua, with people saying they won't even try the language, should clue you in how terrible these design decisions are.
Lua uses 1-based indexing because its initial userbase were engineers used to FORTRAN.
As for ~=, you need to keep in mind that Lua goes all the way back to 1993, and back then there wasn't big pressure to make your syntax look like C (or Java or JavaScript) like there is today. Syntax used to be much more varied.
~= is a poor choice in and of itself, because it doesn't have any clear justification. It would make some marginal sense in C, where it's binary negation, but that's not the case in Lua - it only ever uses ~ to mean "not" in that one context of inequality.
And, on the other hand, ~= looks too much like an attempt to spell out ≅ in ASCII, but the meaning is completely different.
If anything, /= (as used in Algol-68, Ada, Eiffel etc) makes a great deal more sense as an obvious rendering of ≠. It's unfortunate that C confused matters by reusing it as assignment.
Lua tables are semantically almost identical to JavaScript objects. The one key difference is that any object can be a key in a Lua table, whereas all JS object keys are coerced into strings. The other, more minor difference is that you use getmetatable() and setmetatable() instead of modifying or setting object.__proto__.
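A quick illustration of that key difference in plain Lua (note that table keys compare by identity, not by contents):

```lua
local t = {}
local key = { x = 1 }
t[key]  = "hello"               -- a table as a key, no string coercion
t[true] = "a boolean key"
t[1.5]  = "a non-integer number key"

print(t[key])                   -- "hello"
print(t[{ x = 1 }])             -- nil: an equal-looking table is a different key
```

In JS, `obj[{}]` would coerce the key to the string "[object Object]"; Lua keeps the reference itself.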
This is one of the things I enjoyed about Lua relative to JavaScript. Subjectively, Lua tables felt like a cleaner abstraction than JS objects.
In general, I have both of the languages filed in my head as "decent scripting languages with the risks associated with any-typed variables and soft coercion", but given those risks, I prefer Lua because it's simpler; there are just fewer abstractions and corner cases to trip over. Contrast that with JavaScript, where the sheer scope and complexity of the language means there are an uncounted number of ways to shoot yourself in the foot (https://www.destroyallsoftware.com/talks/wat). That scope has only grown as features have been added to try to avoid the sharp edges, which unfortunately just turns them into legacy sharp edges rather than eliminating them.
I've been a fan of that talk since 2012, and I agree that JS is full of bad decisions (but hindsight's 2020). But I use JS every day for work and probably the only gotcha I actually run into is due to hoisting, which my linter (tslint) catches for me within my IDE (VS Code). Meanwhile all the new features are actually letting me code more quickly and easily, so that writing Lua now feels archaic and frustrating. Destructuring for example is a great way to emulate multiple return values, and it's more generally useful than just multiple returns too (used in parameter lists, etc). I do hope for the day backwards compatibility in JS is broken and we get a cleaner, lighter, more portable, more embeddable spec and implementation. I don't think we'll see that for another 15 years at this rate. But it'll probably arrive one day.
If your linter is TSLint, aren't you using TypeScript?
That's actually my solution. I don't code directly in JS anymore; I always approach it from TypeScript. I do not have time to track down bugs resulting from typos (or brain burps) causing me to mis-assign values to the wrong variables, and static typechecking coupled with variables with immutable type specifications eliminates about 80% of those instances for me.
(Going that road, backwards compatibility in JS isn't a concern for me as a web app developer, because it's pushed into the realm of "Things compiler and browser writers care about." ;) )
Yep, I use TypeScript for the exact same reasons. Although I've been bitten a couple times by its unsoundness (specifically with mismatched function arity in an interface) and looked briefly at ReasonML as a possible alternative. But that's more of a long term hope. TypeScript is good enough for 99% of the things I need, and great for web dev. That said, semantically it's still JS, and it still has the same flaws, like hoisting (or something having to do with const and let not being what I think they are), which still bites me every once in a while.
It's 20/20, and it's used in ophthalmology to express diopters: 10/10 is perfect vision and 20/20 is elven or hawk-eye sharp, beyond human eyes' capability.
Yes, but it would need to be highly coordinated among a bunch of competing parties with opposing interests, who are probably only going to agree on anything when the status quo hits a serious breaking point and not a second before. Look at the ES4 disaster and the original Dart situation; any hypothetical breaking evolution of JS would have to overcome similar challenges.
> The one key difference is that any object can be a key in a Lua table
How does this work? Internally it is implemented using a hashmap, so somehow they must generate a hash value for the object? What if the object has member variables that are themselves objects? What if those things point to each other in a cycle?
Statically typed languages can more easily constrain keys[0], but even then the lure of all objects being hashable and equatable is strong; e.g., I'm pretty sure all Java and C# objects are hashable and equatable (and can be used as hashmap keys out of the box).
[0] Rust's hashmap requires that its keys implement Hash and Eq, neither of which are implemented by default; Haskell's Data.Map that its key be Ord; ...
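As I understand the Lua side of this, the cycle question doesn't arise: a table is hashed by its identity (the reference), so its contents, cyclic or not, are never inspected. A sketch:

```lua
-- Two tables that reference each other in a cycle:
local a, b = {}, {}
a.other, b.other = b, a

-- Using them as keys never traverses their contents:
local t = {}
t[a] = "first"
t[b] = "second"
print(t[a], t[b])   -- "first"  "second": no recursion, no problem
```

The trade-off is the one shown elsewhere in the thread: two structurally identical tables are distinct keys, because only the reference matters.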
They're identical to dictionaries/hash tables in pretty much any language, unless I'm missing something. Besides C, every language I've worked in comes with one of these things built-in. It's surprising to me that so many people consider this noteworthy today. Would someone please enlighten me as to why it is?
Because `.foo` means `['foo']`, and because you can easily make sequences out of them (if you write `{ ... i1 = v1, v2, ... }` then `v2` automatically gets the "next" natural number as a key), the ergonomics make them usable as structures and arrays easily. Also, the `:foo()` syntax binds the LHS of the operator as the first parameter for a method call, and metamethods allow you to easily implement inheritance / dispatch / etc.
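Spelled out as code, since the syntax is hard to convey inline (plain standard Lua):

```lua
-- One table acting as struct and array at once:
local point = { x = 1, y = 2, "first", "second" }
print(point.x == point["x"])   -- true: .foo is sugar for ["foo"]
print(point[1], point[2])      -- "first"  "second": list part auto-numbered

-- obj:method(...) passes obj as the implicit first argument "self":
local counter = { n = 0 }
function counter:bump() self.n = self.n + 1 end
counter:bump()                  -- same as counter.bump(counter)
print(counter.n)                -- 1
```

Those two pieces of sugar are most of what "objects" in Lua are built from.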
>Besides C, every language I've worked in comes with one of these things built-in.
JavaScript? When talking about a language most refer to as "JavaScript-like", it's definitely a noteworthy feature. Otherwise, you're right, it's pretty much expected to be available. Although, I'd say beginner developers might forget about it since the typical use case has string or number keys.
> Would someone please enlighten me as to why it is?
Both Lua and Javascript attach protocols to their hashmaps which most languages attach to other bits (classes, typeclasses, traits, …) — even if those other bits are underlaid by hashmaps at the end of the day.
They're more like general-purpose objects which can act as hashmaps (badly in the case of javascript).
One can "overload" operations on table instances, most importantly accessors, by forwarding them to another table, which is somewhat similar to prototype-based inheritance.
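The usual shape of that pattern, for reference (the names here are my own):

```lua
-- A "class" is just a table that instances forward missed lookups to,
-- via the __index metamethod:
local Animal = {}
Animal.__index = Animal

function Animal.new(name)
  return setmetatable({ name = name }, Animal)
end

function Animal:speak()
  return self.name .. " makes a sound"
end

local dog = Animal.new("Rex")
print(dog:speak())   -- "Rex makes a sound": speak lives on Animal, not dog
```

`dog` only holds its own data; every method lookup that misses falls through to `Animal`, which is exactly the prototype-chain behavior JS gets from `__proto__`.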
Tables and metatables are incredibly flexible constructs in Lua. I was able to support nearly all the Haxe language constructs for Lua with little more than tables, floats, and strings.
Gradual typing can be very useful. Coercions, like any implicit casting/conversion can have problems. 1 based arrays are a pain. On this much, I'd agree.
Why are arrays as tables a problem? Is it a matter of efficiency, or is it a type safety issue?
Note the use of { and } instead of ( and ): a call with a single table (or string literal) argument needs no parentheses. This almost makes it possible to build a DSL. Oh, and the __mode option in metatables creates weak tables by key/value.
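For readers unfamiliar with __mode, a small sketch of a weak-keyed table (useful for caches that shouldn't keep their keys alive):

```lua
-- __mode = "k" marks keys weak: an entry disappears once its key
-- becomes otherwise unreachable and the GC runs.
local cache = setmetatable({}, { __mode = "k" })

local obj = {}
cache[obj] = "expensive result"
local got = cache[obj]      -- "expensive result" while obj is alive

obj = nil
collectgarbage("collect")   -- the entry is collected along with its key
```

`__mode = "v"` does the same for values, and `"kv"` for both; without it, the cache alone would pin `obj` in memory forever.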
This is a major antipattern IMO, since it makes it very difficult to programmatically change configuration files. Unfortunately it seems to be all the rage these days to make a DSL for your config files.
I don't understand. Lua config is just Lua data structures. Load Lua data structures, modify them directly because they are the language's native data types, serialize them back to text. How much easier do you want it?
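A minimal sketch of that round trip, assuming a flat config of string values (the `load_config`/`save_config` names are my own, and a real serializer would handle nesting and more types; Lua 5.1 would use `loadstring` in place of `load`):

```lua
-- Parse a Lua table literal from text:
local function load_config(text)
  return load("return " .. text)()
end

-- Serialize a flat string-valued table back to Lua source:
local function save_config(tbl)
  local parts = {}
  for k, v in pairs(tbl) do
    parts[#parts + 1] = string.format("%s = %q", k, v)
  end
  return "{ " .. table.concat(parts, ", ") .. " }"
end

local cfg = load_config('{ host = "localhost", port = "8080" }')
cfg.port = "9090"                      -- modify as a native table
local text = save_config(cfg)          -- back to Lua source text
local reloaded = load_config(text)
print(reloaded.host, reloaded.port)    -- localhost  9090
```

The caveat from the sibling comments still applies: this only works if the config sticks to data and doesn't smuggle in functions or expressions.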
Then yer shooting yer own foot. Don’t put functions or expressions in your “Lua-SON” any more than you would in your JSON.
I guess if you lack confidence in your coworkers’ ability to resist temptation, you could write a checker using Metalua. That would make a good git submit hook.
Doing it "wrong" was one of my fav dev tales. I wrote a custom UI system in OpenGL for an iPhone 1 game (memory budget: 32MB). I didn't have the time budget to make a visual editor, so I made up a Py-SON notation that simply loaded as Python. From there I used ctypes to convert the Python data tree into binary files full of arrays of C structs. Loading that in C was just fread() and casting a pointer.
The big win came when we realized we had way too much UI to create and not enough artist time to create it. So, another programmer and I sat down and wrote a suite of Python functions that made generating UI components much easier. It required a programmer-artist pair to use. But, otherwise it would have simply been impossible to complete on time.
A plain ASCII text file which doesn't require the user to be familiar with any programming language. The fathers of UNIX postulated that for a reason. It came out of experience.
One obvious thing I've missed in the past, in some programs' configuration files, is the ability to have per-host config checked into git.
Sure, you can create symlinks so "~/.foo" is a pointer to "~/.foo-www.example.com" or "~/.foo-www.example.net". But having the ability to load a per-host file natively is a good thing.
I started to get annoyed that my mail client of choice (mutt) didn't have a real scripting language, just something that was kinda-sorta like one, with big holes.
That led to me writing a console-based email client which has a core written in C++, with the UI and most handling written in Lua.
My configuration file is 100% lua, and the whole system is pretty flexible.
I've stalled over the past year, but I guess that means it is feature-complete with no glaring omissions!
I have been using tables in Icon since 1986 (they were in Icon before that). Tables allow keys and values to be anything. You can set up a specific value to be returned if the key presented to the table doesn't exist. Lists, sets, tables, and records are the mutable values; strings, csets, integers, and reals are the immutable ones.
Failure is an option, so all expressions can succeed and return a value or fail with no result. So there is not an issue with the truthiness of values and the semantics of true/false.
Simple tests like
if a < b < c < d then { do something } else { do something }
are standard in the Unicon/Icon languages.
Icon was my goto language until about 2000 or so and thereafter I have been using Unicon (the Unified Extended Dialect of Icon).
1-based indexing is very useful when you have the dual of indexes < 1 starting at the right-hand end of any string or list. Hence, you can work from either end if you need to, and there are good use cases for starting from the RHS of strings and lists.
I have looked at Lua in the past and nothing in it has given me any incentive to move away from Unicon/Icon. Lisp/Scheme/Kernel and FORTH/Factor have more notable (as far as I am concerned) facilities than Lua.
Though Unicon/Icon has flaws and certain kinds of missing facilities, like lambdas, I find that I am more productive in Unicon/Icon than I have been in other languages. YMMV.
Love Lua. LuaJIT is stupid fast, and the FFI stuff makes it a snap to integrate with other libraries. You can find Lua in neat places like nginx, redis, tokyotyrant...
I dislike the direction Python has been going (Twisted... I mean asyncio, type annotations, f-strings). Lua is like a breath of fresh air.
Downsides are small standard library and 1-based indexing shudder.
0 is truthy # I just love the simple logical mathematical python way (0=ø=()=[]={}=false)
if undefined … -- treated as nil
function pseudoclass:new …
But by far the biggest complaint is the packaging. luarocks SUCKS more than almost anything, at least for me.
luarocks install torch
Error: Your user does not have write permissions
sudo luarocks install torch
Error: No results matching query were found.
and on and on and on
(non-representative excerpt of my pains)
Nope. 0 being falsy only makes sense in pointer-oriented languages where NULL is a null pointer at address 0. In Lua, 0 is a number. It's not a pointer address.
Which leads to:
> a={1} a[0]==nil a[1]==1 indexing starts at 1 but a[0]=1 does SOMETHING
So does -1: should -1 be falsy, too?
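To make the a[0] point concrete, a quick sketch:

```lua
-- Any key is legal in a table; 0 (or -1) simply lands in the hash
-- part. It just isn't part of the sequence that # and ipairs see.
local a = {1}
print(a[0])   -- nil: nothing stored there yet
print(a[1])   -- 1

a[0] = "zero" -- perfectly legal...
print(#a)     -- ...but the length operator still reports 1
```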
> if undefined … -- treated as nil
Pardon? It is. Are you asking for an undefined keyword? Why? In the Lua C API, you already get `lua_isnoneornil` to begin with.
> function pseudoclass:new …
Lua is a prototypal language.
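For context, the colon in `function pseudoclass:new` is just sugar over plain tables; a minimal sketch with a made-up `Point` prototype:

```lua
local Point = {}
Point.__index = Point

-- "function Point:new(x, y)" is sugar for "function Point.new(self, x, y)".
function Point:new(x, y)
  return setmetatable({ x = x, y = y }, self)
end

function Point:len2()
  return self.x * self.x + self.y * self.y
end

local p = Point:new(3, 4)
print(p:len2())  -- 25
```

There is no class construct here at all: `Point` is an ordinary table acting as a prototype via `__index`.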
> But by far the biggest complaint is the packaging.
You know, thank God. Because JavaScript is probably the closest thing to Lua and look how that turned out.
The only thing holding it back is that the community doesn't need or want it. The community is also small enough that perhaps no one has made a good package manager for Lua yet. But the `package` module in Lua already provides search paths, so it's fairly low effort. Frankly, I've never had a need for it. I don't want npm for Lua.
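The search-path point can be seen directly; a small sketch (the vendor directory and module name are hypothetical):

```lua
-- require() consults package.path (and package.cpath for C modules),
-- so "vendoring" a dependency is often just a path tweak.
package.path = "./vendor/?.lua;" .. package.path

-- local mylib = require("mylib")  -- would now find ./vendor/mylib.lua first
print(package.path)
```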
Using -- for comments is a long-established convention: Ada, Eiffel, SQL...
0 (and everything but nil and false) is truthy is also pretty common in the Lisp world.
Side note: I actually find it ironic that Python of all languages does this implicit falseness thing on so many types, given that it puts "explicit is better than implicit" on its banner. I don't see the difference between treating an empty list as false when the context requires a Boolean value, and treating string '0' as numeric 0 when the context requires a numeric value.
And it can be a source of bugs - while there are many cases where an empty list and None are semantically equivalent, that's not always true. And it can be something that's very easy to forget when reading code that is broken that way (e.g. during code reviews). I'd much rather have functions like `is_none_or_empty`, `is_none_or_zero` etc that make it clear what the check is actually doing.
Lua predates PHP by 2 years, and I'm sure this combination of arrays and hash-maps predates both languages.
Either way, I'm not sold on it. Arrays and hash-maps are fundamentally different, not only for optimization's sake[1] but even in how people use them.
[1]: Recent versions of Lua now try to detect whether a table is an array, and apply optimizations when all its keys are ordered numbers without holes.
Lua doesn't try to detect whether a table is an array. The way it works is that a table is internally composed of two data structures, a hash part and an array part. Normally, integer keys go to the array part and everything else to the hash part. However, integer keys of a certain size will overflow into the hash part so that, e.g., storing a single integer key of 2^32 doesn't allocate an enormous empty array part.
In this way Lua already optimizes the array use case naturally. It never tries to infer whether a table is supposed to have array semantics or hash semantics. You can use a table as a hash, as an array, or (commonly) as both.
The cost of this simplicity and concision is borne by the semantics of the length operator (#). The default __len metamethod on a table does a binary search looking for the first missing positive integer key, the boundary that marks the end of a logical array. The binary search will work even if your integer keys have spilled over into the hash part, though it works much faster if it doesn't have to inspect the hash part.
This is why in Lua your arrays can't have holes (non-sequential positive integer keys), at least not if you want #t to behave as expected. Lua has no way of knowing the size of your array otherwise, at least not for plain tables lacking user-defined metamethods.
That said, there's a convention that uses the string key "n" to record the intended length. For example, table.pack() assigns the argument list to a table and sets "n" to the number of arguments. It does this because a nil argument value would create a hole. Also, since Lua 5.3 you can overload the __len metamethod, which could simply return t.n. Similarly, you can overload the __index and __newindex metamethods so that insertions update t.n or some other marker.
FWIW, I've tried hard not to express any value judgments in the above description. I've also deliberately abstained from discussing array-related language proposals.
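The behaviors described above can be checked at the prompt; a small sketch (Lua 5.2+ for table.pack):

```lua
-- A dense sequence has an unambiguous boundary for #.
local t = {1, 2, 3}
print(#t)        -- 3

-- table.pack records "n" so a nil argument (a hole) doesn't lose the count.
local packed = table.pack("a", nil, "c")
print(packed.n)  -- 3, even though packed[2] is nil
```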
I would argue that the length semantics are mostly unrelated to the hybrid array/hash nature. You could have the same problems on an array-only data structure, and you can invent semantics that avoid them without significantly changing the data structure.
It's amazing how many of the languages that feel recent today actually came out in the 90s: Python, JavaScript, Lua, Haskell, Java off the top of my head. Python is only five years younger (1990) than C++ (1985).
I guess it shows how much time you need to invest until a language is mature enough to enter common use, unless it's backed by a large entity like Microsoft (C#), Google (Go) or Mozilla (Rust).