So excited for this addition! I like to program Nim in a functional-lite style, and this is some nice sugar for writing functional procedures. It's also nice that the language reserved the `func` keyword for procedures that are _actually_ functions.
I've been using it for a while in the devel version (Nim v0.18.1), and it is great — I like it very much and I completely understand your excitement!
Before, I didn't bother with the {.noSideEffect.} pragma; now with `func` here, my code is much clearer (at least to me ;)) and I can immediately tell which of my procs are 'pure'.
I want to use the `func` keyword more. I think many of my procs can be funcs instead. In my programs I like having a simple imperative middle, but functional, no-side-effects edges.
Can someone expound on what this does exactly and how it works? Will the compiler try to statically verify that it is actually pure, and error if it can't? Does it unlock optimizations or other features?
> If none of its parameters have the type var T or ref T or ptr T this means no locations are modified. It is a static error to mark a proc/iterator to have no side effect if the compiler cannot verify this.
I'm not sure how many compiler optimizations are possible, but I do know that procs marked with no side effect are automatically considered thread safe.
I'm not really a Nim expert, but I did a quick test on the scenario you laid out with these results: iamsafe could not call imodifyglobals, but could call a proc with no side effects. I'd guess that a func can call out to any proc that the compiler knows is side effect free, with the compiler erring on the side of caution.
Nope, it can only call other side-effect-free things. Note, however, that if you call a procedure from a func, the compiler will check whether the proc is actually side effect free, even if it isn't marked as such. So you can still call a proc, but only if it could have been a func.
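A quick sketch of those rules (identifiers here are made up; behavior as I understand it in Nim 0.19):

```nim
var counter = 0

proc bumpCounter() =
  # Modifies a global, so it has a side effect.
  inc counter

proc pureHelper(x: int): int =
  # Not marked {.noSideEffect.}, but the compiler can
  # infer that it is side effect free.
  x * 2

func double(x: int): int =
  # Fine: pureHelper is effect-free even though unmarked.
  result = pureHelper(x)
  # Uncommenting the next line is a compile-time error,
  # because bumpCounter mutates a global:
  # bumpCounter()

echo double(21)  # 42
```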
I started playing with Nim last weekend, after the article on here about Nim.
I absolutely love it! I'm looking for an opportunity to use it at work for something minor, and I have been porting a few of my (small) Rust applications to Nim in my free time.
> I'm looking for an opportunity to use it at work for something minor
Recently I've been using Nim at my work as a Python+Numpy replacement for a numerical simulation of multi-body contacts. (Just two hours ago I gave a short talk at a conference where I showed the results.)
I'm sure there are lots of opportunities to 'throw in some Nim' wherever you work ;)
May I ask what the nature of the Rust apps you're porting is? I'm currently learning Rust and would be interested in knowing why you're moving away from it.
Not parent post, but to me, Nim is very easy to write coming from Python. For virtually zero effort, I can get near C speeds in a tiny self-contained executable.
Rust is probably more mature (more users) but requires me learning a lot of low level concepts. If you already know Python and just want fast code and easy deployment, Nim might be a better choice. Crystal is similar for Ruby users.
I'm feeling old. This is really at the top of a HN thread?
Since you self-describe as "coming from Python," you maybe haven't yet really gotten that deep into the internals of computers and what kind of stuff you might wanna do that isn't well supported by higher-level languages. This is especially relevant when we get to talking about performance guarantees, wherein very little can get "guaranteed" about a garbage-collected system, by definition. So while it is a pain to deal with "low-level concepts" like in Rust, it is also well worth it depending on what kind of tech you need to make.
I'm an EE, so I did take some microprocessor and FPGA classes and can write some assembly.
With that being said, my day job isn't programming, but engineering which involves a lot of managing data so scripting languages are very vital: Python, Perl, R, Bash, Powershell...etc. Also, other technology like SQL and Excel are daily drivers.
I've used Java & C# a bit in the past which are fast languages, but they're missing the majority of linear algebra and optimization libraries that are widely available in Python/Matlab/Julia.
So what if one of my simulations is too slow for comfort? Re-writing Python in Julia or Nim isn't too hard (the syntax is very similar) and I get a huge speed boost. That's all I'm saying. I'm being pragmatic about the situation. Of course I can rewrite performance-critical code in C/C++/Fortran/Ada/Rust, but that is a huge last resort, as those languages (well, Fortran isn't as bad as they say) take me much longer to get something going, and you're absolutely correct that my hardware understanding has greatly atrophied since college. For an engineer (in this case a traditional engineer, not a software engineer), we generally need code to A) automate processes, B) design and run simulations, and C) do data analysis.
We rarely write low-level software to twiddle bits unless we really have to. There are simply better choices to make the things I care about run faster even if I was good at C.
I have noticed a lot of similarities between Julia and Nim as well. It would be an interesting compiled companion language to use alongside Julia. Both have really nice generics but I assume interop + generics isn't something that works out too well.
I agree with the sibling comment. Nim feels much more like Python, which was the first language I used before I picked up C/C++. For better or worse I was taught Python at a young age!
I have a group of related cli apps that need to do a lot of work while also polling swaths of the filesystem. In Python it would have been much, much too slow without C extensions, so I wrote them in Rust instead.
There's nothing wrong with Rust, it's a fine language! But ultimately, while ownership is a clever way to do memory, I personally prefer the sheer convenience of garbage collection when I'm trying to write something that's good enough, quickly. Nim is perfect for that!
I see Nim lately made some cleanup in the macros module. That's great. Macros are now much easier to wrap my head around, and I have already produced my first usable macro: https://github.com/b3liever/protocoled. Others are also trying out the new caseStmt macros: https://github.com/alehander42/gara. Great stuff!
I'm considering using Nim for an embedded project, because Nim can generate C or C++ code, and a C and C++ compiler is all there is for my target platform (a DSP).
I would like to use C++11, but the manufacturer only supports C++03, and will never add support for C++11 because it's a legacy DSP.
I bought your book hoping it would include a chapter on embedded programming, but was disappointed to find it barely got a mention. I remember an earlier discussion here where you were considering it, and it seemed like a real possibility.
Unfortunately, I haven't been able to find recent (ideally official) documentation/discussion on embedded programming. Contrast this with Rust where there's a significant focus on embedded bare metal, particularly for Cortex-M MCUs. With the recently announced support from Status, are there any plans to provide more support for embedded bare metal?
Support for embedded is indeed coming into focus. I hope to release an article soon on how the '--gc:regions' switch works (which is misnamed; it's a way to do memory management, not a GC...), and in the longer run the destructor-based runtime will ensure that memory is freed sooner than a GC would, for memory-constrained devices.
Having said that, "embedded" is a wide field and sometimes one cannot even afford a heap, no matter if garbage collected or managed manually. But even then Nim can be used, you still get array bounds checking, Ada-inspired strong type checking plus the meta programming goodies.
Great! My interest is in bare metal environments where you typically don't use a heap. I'm not familiar with the `--gc:regions` switch, but I think a good start could be a guide on how to implement simple non-garbage collected programs.
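From what I've gathered so far, the closest starting point is building with the GC switched off entirely, so any accidental use of GC'd features (strings, seqs, ref) gets flagged (flag spelling as of Nim 0.19; check your version):

```shell
# Compile with no garbage collector at all; the compiler
# warns wherever the program would touch GC'd memory,
# which is handy for auditing heap usage.
nim c --gc:none -d:release baremetal.nim
```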
Looks like they concluded that having the compiler automatically choose an appropriate region but silently leak the memory when that fails doesn't work well? Kind of like GC by reference-counting without a cycle collector.
Couldn't the compiler just reject code that would need to allocate in the global region, unless the programmer explicitly annotates it to make it global? It says the SML type system is too weak for this, but perhaps Nim's type system is sufficient?
Yeah, sorry about that. The initial book plan had a chapter on embedded programming but it had to be cut because I exceeded the page count.
I made a promise that I would write an article about it as a replacement and I still intend to do that. It takes time of course and I personally don't have as much embedded programming experience as I would like right now, so it'll take me even more time. But it will happen :)
As far as plans for more support from Nim itself I don't think there are any specific plans right now. If you've got time and this is your passion then please consider starting contributing to Nim :)
That would be great if you do write an article. I would be happy to contribute, but I would really like to see some official discussion or documentation from the team first before moving forward.
Very intriguing! I'll have to look into how this works; my understanding is that you can't do this without signing NDAs, and only distributing to people who have also signed NDAs...
That's messed up. Sony and Nintendo have the opportunity to build something great and useful beyond entertainment, but they always find a way to take a step backwards in time.
I'm thinking about rewriting a very popular open source project of mine in a language that can spit out a simple binary people can double click or have it run in a terminal window to serve API requests.
Does Nim have true single-binary deploys like Go has? My original project is written in Elixir and even with Distillery it still seems a little convoluted for what I'm trying to deliver to end users.
I want a real, download this and run it boom, go to localhost:1234 for some cool stuff.
Yes, Nim has this. There is a caveat though, but AFAIK Go has the same problem.
Certain Nim modules/packages might wrap a C library. If you use those then your binary will typically depend on a DLL for that C library. With some effort you can statically link against those libraries, but I wouldn't call that easy. This is in particular a problem when you're using regex or SSL: the former depends on PCRE and the latter on OpenSSL/LibreSSL.
Go benefits from having native implementations of those written in Go. We also now have a regex engine[1] implemented in Nim but it hasn't made its way into the stdlib yet (that is planned though). But you can use it pretty easily as a package if you want to avoid the dependency on PCRE.
It's certainly possible! You just need to know which compiler to use. For example, I used musl-gcc to compile a static binary that can run on any 64-bit GNU/Linux type OS.
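For concreteness, the invocation looked roughly like this (the file name is made up; this assumes musl-gcc is installed, and flag spellings may vary by Nim version):

```shell
# Point Nim at musl-gcc for both compiling and linking,
# and pass -static so the result has no runtime .so dependencies.
nim c -d:release \
  --gcc.exe:musl-gcc --gcc.linkerexe:musl-gcc \
  --passL:-static app.nim

# 'file app' should then report a statically linked executable.
```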
I've really been enjoying using Nim for one-off data processing projects here at work. I'm also excited to see where nimx [1] goes (a cross-platform UI library).
A library which allows you to connect Python and Nim so you could, for example, use Nim (instead of Cython) for speeding up some parts of your code, compile it, and call it from your Python program.
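The comment doesn't name the library, but nimpy works along these lines; a minimal hypothetical sketch (module and proc names are mine):

```nim
# mymath.nim
# Build as a Python extension module (assuming nimpy is installed):
#   nim c --app:lib --out:mymath.so mymath.nim
# Then from Python:
#   import mymath
#   mymath.fib(20)   # 6765
import nimpy

proc fib(n: int): int {.exportpy.} =
  ## Naive Fibonacci, exported so Python can call it directly.
  if n < 2: n
  else: fib(n - 1) + fib(n - 2)
```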
Nim is fantastic, I especially like that it takes a pragmatic approach to all its features, so that it doesn't get bogged down in theory but lets you get productive with it right away. One example is that even though it says it's a systems (and application) language, it has GC (ref counting + cycle detection), which means you don't have to worry about memory management and can focus on the core of your application. Honestly I think Nim is comparable to Go, so if you're thinking of adopting Go, look into Nim, you might like what you find.
I too was enthusiastic about using Nim, as it is clearly a much superior language to Go. However, the ecosystems are not at all comparable. Even without considering the huge number of 3rd party packages for Go, the Go stdlib itself is too good and has almost all batteries included. For Nim, if you are going to develop any non-trivial program, it is going to be really difficult to find the relevant libraries.
Does calling C++ libs work in the latest version of Nim though? I mean real C++ interfaces, not C ones that just happen to use C++ in their implementation.
I disagree. Nim is very much a batteries-included language and has a very healthy range of libraries available in the set of standard libraries. The lack of 3rd party modules is a little problematic, but the large number of standard libraries and good C FFI can help mitigate this. Go is definitely the more supported and mature language, but Nim still has a lot going for it.
i don't understand why people make these kinds of criticisms. everyone already understands that the assumption for a new language is that the ecosystem won't be as rich. you're basically saying something akin to "x high school football player is much worse than y professional football player in a professional game". of course. judge them in context.
Go has a wealthy patron and a "rockstar" principal dev so it wins the popularity contest. Vue has also been playing third fiddle to the big boys for some time but people are coming around. There is no reason why Nim can't have its moment in the sun too.
>People have to actually use the language, and this is a practical concern.
do they? is it in the rules of building languages and ecosystems that every new language has to be production ready from day 1? it is also possible that you can use it for hobby projects and that that's sufficient merit to talk about it and have posts about it too.
What on earth are we comparing when discussing Nim vs. other languages, if not the language itself, in practice? Are we just discussing how good you feel about it?
My point is, this is an actual deficiency that you need to consider when evaluating Nim, that isn't just a function of the language being new. For example, Zig is another option which puts any C libraries at your fingertips, without requiring bindings. If Nim had better interop (not just relatively good, but zero effort) this would no longer be a concern and you wouldn't need all of your libraries to be in Nim.
Not sure what you're getting at. People in this thread are discussing what it's like to use Nim, including its ecosystem, vs. other languages and ecosystems. You don't seem to have anything substantive to add to that other than that Nim is "new" which as I mentioned isn't relevant to the experience of actually using it.
Objectively, it's very hard to say. But practically, yes. Smart people building critical systems do not run to relatively untested languages, no matter how good they may look.
But... there's a loooooot of "high school level" applications out there in the world. And the way a language graduates up to the higher classes is getting tried out in the "lower criticality" systems, and getting experience.
I wouldn't try to build an S3 competitor in Nim right now, no matter how good it may look. But even a company building an S3 competitor in some other language could still have tons of other places Nim would be a fit.
> Smart people building critical systems do not run to relatively untested languages, no matter how good they may look.
Except they do, if the new untested language is qualitatively better than the old one. For example, a critical crypto algorithm in Firefox is written in F* and compiled to C: https://blog.mozilla.org/security/2017/09/13/verified-crypto... . The new implementation is formally verified free of certain classes of bugs (like buffer overflows), and significantly faster than the hand-written C implementation.
Also, for people actually targeting production, there is a huge difference between, say, a DB driver that is used in literally millions of projects and has a large number of active maintainers, and a DB driver written by one dude who hasn't committed any fixes in 12 months.
Looks impressive. However, since Nim has such an easy C/C++ FFI you have to consider also all C and C++ APIs. That looks superior even to Go, unless Go's C FFI is as easy as Nim's.
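To illustrate how light that FFI is, binding a libc function is a single declaration (the Nim-side name `c_sqrt` is my own choice):

```nim
# One pragma imports the C function directly; no binding
# generator or glue code is involved.
proc c_sqrt(x: cdouble): cdouble
  {.importc: "sqrt", header: "<math.h>".}

echo c_sqrt(2.0)  # ~1.41421
```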
For a while (before I found zig) I was looking for a language to do my hobby development in. Most newer languages lack a comprehensive library ecosystem, but I'm not a fan of having a giant web of dependencies anyway so one of my criteria was "how much would I dread writing my own X?". You always take a productivity hit when starting out in a language anyway, and making your own libs that interact with real code and data gives you a much better feel for how a language is going to work in the real world than more common "hello world"-like exercises do in my opinion.
How would you compare Go and Nim? What's better and worse in each (setting aside ecosystem size)? Are there any core architectural decisions made in Nim that you would see as mistakes that will eventually drag the language down? Go and Nim kinda sorta seem to sit in a similar solution space, from the little I've had time to read about Nim so far. Thanks for your thoughts!
In my experience, Nim was the more enjoyable language to write.
I had initially written an application in Golang that effectively keeps our DNS records in sync with CloudFlare across ~100+ machines (A, AAAA, CNAME, etc). The application was simple enough to write in Golang but I still wasn't crazy about the error handling (if err != nil...) after every few lines, as well as the large binary size once compiled (although I understand the reason for its size).
To see if I could do better, I re-wrote the application in Nim, a language I've always had a lot of interest in. With the logic already figured out, the code writing was quick and easy. I found the finished code to be cleaner and more concise, and when it came time to compile I was excited to find a binary that was only 83 KB, in comparison to the 6 MB binary of the Golang version.
Of course your miles may vary and there are many advantages/disadvantages to every language but I personally look forward to writing more Nim in the future :)
The realtime support for the GC is really fucking cool. In many game-dev situations, you would happily give up a millisecond per frame for the GC to run deterministically, instead of the GC occasionally taking 20-40 milliseconds.
In my opinion Rust suffers from this type of problem.
I actually suspect, but haven't proven, that I can make very memory- or type-'safe' programs with Nim. I guess it goes back to practical necessity or reality for starters, though. Do we need to, or can we really, prove that our programs are safe somehow? Is that something we generally need, at the expense of everything else?
But Rust ergonomics have been getting better and in my opinion some of the Rust guarantees may put it in an elite category of languages along with Nim but for totally different reasons.
Yeah, I tried reading some code to see if I could get a grasp on Rust, but found that the language puts too much friction on the developer when writing or reading code. The syntax ends up too dense. For gamedev particularly, I think Rust is just too much, though I can understand its value if you're writing highly concurrent code with huge memory safety needs.
In my experience theoretical purity is largely independent of usability, which I find is more a function of the language's expressibility and the available abstractions (and how well those abstractions play together.)
Take Elm, Haskell, and Agda/Idris as examples of four programming languages with strong theoretical foundations that also vary widely in expressibility and usability.
Elm is (imho) very usable even when teaching beginners unfamiliar with the ML-style syntax. The main abstraction is parametric polymorphism (generics in oop-terms), no subtyping with inheritance and no typeclass/interface mechanism. The language has a carefully crafted culture to maintain the beginner-friendliness of the language, explicitly at the expense of expressibility (but not usability.) I personally find elm to be an incredibly usable language, specifically because it's both simple and theoretically pure. Less to understand, for more gained reliability. More practically, I often prototype completely without type signatures, relying on type inference to make sure that I'm not doing anything nonsensical. The end result (for me at least) is the speed and ease of a dynamic language, with the reliability of a statically typed language.
Haskell (without GHC extensions) offers more abstractions and is consequently "less usable" in the sense that you need to understand more theory to use things like Higher Kinded Types and whatnot. By adding GHC extensions you can gradually ramp up the expressibility (by adding abstractions,) while at the same time making the language (slightly) more difficult to use. You can get almost all the way to dependent types which brings us to..
Agda and Idris offer an incredibly powerful mechanism for abstraction called dependent types, where you essentially program your type checker as a logic. But they are (somewhat infamously) difficult to use, due to having to understand the consequences of having computations at the type level (and beyond), and consequently at compile time.
If any of those are any less pure than the other it's probably Haskell, and that's more due to the ad hoc nature of the GHC extension system, which allows you to combine extensions in problematic ways that can hide impurity.
They all compile based on some type theory as a model of computation, but those type theories can vary widely in how complex they are to understand; the level of complexity doesn't make anything more or less pure. All of them also offer some degree of escape hatch from their "theoretical purity" prison. Elm has its ports for js interop, Haskell has unsafePerformIO and friends, Agda has FFI to both js and haskell depending on backend, and I'm guessing Idris has some way of doing this as well. In this sense Elm is probably the most pure.
It's important to use a language built on strong theory, but the language shouldn't force developers to use exactly the same theory in building their own software. Developers should use the most appropriate theory for their program's problem domain; only minimal adaptations should be required for the language. Otherwise developers end up trying to work out a general theory that fits both theoretical models, which is rarely a good use of time.
Has anybody thought to submit some benchmarks to the Benchmarks Game? I know that benchmarks aren't perfect, but would be nice to at least have an idea of how it stacks up against some of the other languages.
FWIW, a couple of those task names are for older tasks which have been replaced by fannkuch-redux and regex-redux; and there are other changes which don't seem to have been mirrored.
Indeed. They even specifically named Nim in their FAQ[1]:
> Why don't you include language X?
> Because I know it will take more time than I choose. Been there; done that.
> Measurements of "proggit popular" programming language implementations like Crystal and Nim and Julia will quickly attract attention and be the basis of yet another successful website (unlike more Fortran or Ada or Pascal or Lisp).
With version 0.19.0 Nim also moved away from nilable seqs and strings, so they are much safer to use. An empty string and a nil string is now the same, and a nil seq and a seq without elements is the same. Saves you from that one missing nil check that crashes your entire program.
If you want the behaviour of nilable seqs and strings you can use the options module.
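For example, where a nil string might previously have meant "no result", the options module makes the absence explicit (the proc and values below are invented for illustration):

```nim
import options

proc findUser(id: int): Option[string] =
  # some(...) carries a value; none(...) replaces the old nil.
  if id == 1: some("araq")
  else: none(string)

let u = findUser(2)
if u.isSome:
  echo u.get
else:
  echo "no such user"   # this branch runs
```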
I found it was easy to write Nim coming from Python, doing a few tutorials I found around the web and some programming puzzles. The advanced features like macros, templates, generics, etc. are harder to learn, but I haven't needed them so far. I did have to dig into the actual library source code several times because the documentation is quite sparse, and for things like async and parallel programming there aren't many examples to pull from at all. However, I find it's way easier to program in Nim with static typing, and the applications I write are much faster than Python, even without the crutches of numpy operations, pandas dataframes, and the huge Python ecosystem.
Great news! I have been using Nim almost exclusively for my open source projects for years now, since when it was called Nimrod...
It is simply amazing to see how it has evolved and improved over time. I love the fact that it runs seamlessly on several different platforms, and that it is so fast and so syntactically expressive and elegant.
Way to go guys, Araq, dom, and everyone who made Nim possible! Really glad you are finally getting the momentum and the attention that you deserve. Keep it up!
Does anyone know any good learning resources for this language aside from the tutorial? Just tried jumping in and got weird compile errors when trying to work with lists.
If you would like to go from the very beginning (no or minimal programming experience): https://narimiran.github.io/nim-basics/ (disclaimer: I'm the author of that tutorial)
If you are more experienced, and you want to explore all the possibilities that Nim provides, check the language manual: https://nim-lang.org/docs/manual.html
If you need any help getting started, join the Nim channel on IRC/Gitter and ask questions. The community is very friendly and helpful! (It is one of the main reasons I kept using Nim in my early days with it.)
I understand that it might suck to be buying a book to learn a language, so just in case you don't want to commit to that there are two chapters of the book which are free. You can find them listed in the link above or here: https://book.picheta.me/
Thanks so much! Want to get started with it before I decide to buy a book on it, but seems like a great language to program functionally in without all the hurdles :)
Adding one more "learning resource": they are basically my notes. You can see them get more advanced as I learn more Nim (they are still constantly updated; I last verified that all the snippets in there work on Nim 0.19.0).
I use Org Babel in Emacs as a "REPL", and an Org Babel package for Nim exists: ob-nim[0], and it's awesome.
For folks uninitiated about Org Babel, it allows one to write small code snippets and quickly evaluate the results and have them inserted below. I keep a nim-scratch.org handy to quickly try out new snippets. Unlike traditional REPL's, this approach allows me to keep on saving new snippets and their results too!
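A scratch entry in such an org file looks roughly like this (evaluating the block with C-c C-c inserts the results underneath):

```org
#+begin_src nim
echo "hello from ob-nim"
#+end_src

#+RESULTS:
: hello from ob-nim
```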
Here's another example: https://i.imgtc.com/xLfiR6p.png. The beauty of this approach is that I can then simply export that to my blog[1] (you can correlate 1-1 how that screenshot matches the content on that blog) :).
Nim appears to be quite similar to Haxe, though Haxe compiles to numerous additional targets. Aside from that, and the syntax, what are the main differences between Nim and Haxe?
I think the one downside of Haxe (which is also its strength) is that it doesn't only support compiling to C/C++, but to every major language possible (Java, C#, PHP, Javascript), so the language ends up being too complex. You constantly have to think about how the language's features will map to your target language if you care about performance, which can be pretty unintuitive. And the C++ target (hxcpp) is quite heavyweight and takes a lot of time to compile (at least the last time I checked it out). Also, C interop doesn't seem that convenient.
On the other hand, Nim only transpiles to C, so the language is simpler and it's easier to reason about performance. (And it probably has faster compile times.) It is also explicitly designed to interop with C, so you can easily use all those C libraries in the wild...
But I still have some hope on Haxe, because the main developer is working on Hashlink, which can run Haxe code on a virtual machine but can also transpile it to C for production builds... It seems a more simple and “focused” Haxe target that doesn’t have all the warts of hxcpp...
> I think the one downside of Haxe (which is also is strength) is that it doesn’t only support compiling to C/C++, but to every major language possible (Java, C#, PHP, Javascript), so the language ends up being too complex.
I don't understand; I would think it's quite the opposite. Implementing too many unique target-language-specific features in Haxe would mean lots more compilation acrobatics to make those features work across every target language, correct?
> the main developer is working on Hashlink, which can run Haxe code on a virtual machine but can also transpile it to C for production builds
Status has recently become a large sponsor, but over the past years it was all donations from our users (who were and continue to be incredibly generous!)