Ray Tracing in Nim (nim-lang.org)
147 points by beagle3 on July 1, 2020 | 66 comments



What I like about Nim is that you can always get C performance if you spend a little time. You can use whatever profiler you like (I use VTune) to find the hot loops and fix them.

See my post here: https://forum.nim-lang.org/t/5363#33576


Is that not true of almost any other systems language (depending on whether you count Go as a systems language)?


Nim still mostly retains its Pythonic nature (fun and readable) even when you do whatever's needed for C/C++-level performance.

I think it’s exceptional in that it feels lightweight while delivering great performance, and it has no “glass ceiling” of any sort.

Rust and D have no ceiling either; but D feels like a better C++ and Rust feels like a court filing.

D is definitely under-appreciated if you prefer a C feel to a Python feel.


I've been paid to work on D so no need to preach to the converted. I'll have a look at Nim too.


> Rust feels like a court filing.

But Hacker News and Reddit told me that Rust is very ergonomic after two weeks, and that if you do something in other languages that you can't do in Rust, you are doing it wrong and it needs to be fixed.


It depends. On current CPUs the bottlenecks are, in order:

- IO processing

- Memory

- CPU

In particular, any language without an escape hatch for manual memory management will have trouble optimizing memory-bound workloads.

In scientific computing, machine learning, deep learning and image processing (stencil code in particular), 90% of workloads are memory-bound. The CPU-bound ones are those involving convolutions and matrix multiplications (once you get the data in the right place) and those involving exponentiation or trigonometric functions.
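
As a minimal illustration of what memory-bound means here (a hypothetical sketch, not from the article): a 3-point moving average does three loads and one store per element but almost no arithmetic, so its throughput is set by memory bandwidth rather than by the ALUs.

    proc smooth(dst: var seq[float32], src: seq[float32]) =
      # 3 loads + 1 store per element, but only 2 adds and 1 multiply:
      # this loop saturates memory bandwidth long before the CPU
      for i in 1 ..< src.len - 1:
        dst[i] = (src[i-1] + src[i] + src[i+1]) * (1.0'f32 / 3.0'f32)

    var a = newSeq[float32](10_000_000)
    var b = newSeq[float32](10_000_000)
    smooth(b, a)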


Does nim support SIMD intrinsics? Without those you aren't getting "C performance."


Absolutely yes; you just need to let the Nim compiler read the relevant C headers, e.g. <immintrin.h>. Here's a nice SIMD wrapper for Nim: https://github.com/numforge/laser/blob/master/laser/simd.nim
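
For a flavor of the mechanism, a hedged sketch modeled on that wrapper (the real file covers far more intrinsics):

    # declare, don't reimplement: the C compiler sees the real <xmmintrin.h>
    type m128 {.importc: "__m128", header: "<xmmintrin.h>".} = object

    func mm_setzero_ps(): m128 {.importc: "_mm_setzero_ps", header: "<xmmintrin.h>".}
    func mm_add_ps(a, b: m128): m128 {.importc: "_mm_add_ps", header: "<xmmintrin.h>".}

    discard mm_add_ps(mm_setzero_ps(), mm_setzero_ps())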


With all due respect, Nim is a great language performance-wise, but I fail to see how it is different from D. Many advertised features have been present in D for a long time.

Syntax-wise, Nim is a step back. It's hard to read and understand, while any C/C++ dev will have next to no trouble reading through D code. And of course, while D is C-ABI compliant, it interoperates well with C++ too.

I seriously doubt that Nim or any other language in this regard has better metaprogramming than D.

Nim might have a speed overhead in certain tasks, but that depends on the benchmark. Besides, does Nim have anything similar to NumPy which is actually faster? D does.


* For a full comparison of D vs Nim features, see [0], or a more succinct selection of items by Nim's creator [1]; these also contain some remarks about metaprogramming and C/C++ interoperability.

* Syntax-wise, you could also argue that it is a step forward: it embraces Python syntax and is easy to read and understand for all Python devs.

* Nim does have a "faster NumPy", by the same author as the article: [2]

[0]: https://github.com/timotheecour/D_vs_nim [1]: https://forum.nim-lang.org/t/1779#11098 [2]: https://mratsim.github.io/Arraymancer/index.html


Thanks. So here are the points for D according to the table. Very few breaking changes, because the language is pretty mature. Not sure if Nim is as stable, but just this one fact is enough for me personally.

D has slices, ranges and lazy evaluation, which are a joy when you do number crunching. Nim does not have them.

Also, the lack of any mention of dpp in the table is surprising, to be honest.

D PRs take time mostly because they go through a rigorous community review. And by that I mean your PR does need to hit the quality bar, which is only good and keeps the language from becoming a sandbox of community features like C++. See the PR for macros in D and why it didn't happen. Another reason is the lack of people, of course.


The table is out of date. Nim is post 1.0[1] now and stable, and has few breaking changes.

[1] https://nim-lang.org/blog/2019/09/23/version-100-released.ht...


> it embraces Python syntax

Nim looks most like Pascal syntax to me, with a bit of Python


I read through the Arraymancer readme and the rationale and got very mixed feelings about the project and its purpose. It's everything and nothing at the same time: NumPy-like syntax and functionality, sklearn algorithms, and look, deep learning too, here is an example and some screenshots borrowed from SciPy. Seriously? Sorry if that might be too judgemental, but it sounds and looks amateurish. And I had thought of benchmarking it against D's mir... Also, the name... And this is what I feel about Nim in general: rushed and undercooked.


[0] is a great comparison, are there other comparisons of this kind?


>I seriously doubt that Nim or any other language in this regard has better metaprogramming than D.

D's metaprogramming works through string mixins, i.e. splicing code as strings. There are certainly people who find the Nim approach of generating a typed AST preferable.
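
For a flavor of the difference, here is a toy Nim macro (hypothetical, not from any library) that builds AST nodes rather than pasting text:

    import macros

    macro double(x: typed): untyped =
      # `quote do` produces an AST fragment; `x` is spliced in as an
      # already type-checked node, not as a string
      result = quote do:
        `x` * 2

    echo double(21)   # prints 42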


> Syntax wise Nim it is a step back. It's hard to read and understand

This reads like satire to me. You shouldn't claim your feelings as objective truth.


I agree that it is strange: in all of the discussion surrounding relatively modern languages and new technologies, Nim and Rust and many others frequently get mentioned, yet it seems so rare that anyone talks about D. I wonder why that is? I'm genuinely curious.


Because D is effectively dead/has lost its momentum completely?

If you look at the development surrounding D: the stdlib got its allocator module around 5 years back, IIRC, in experimental. It is still not stable. Same for many other modules. Due to lack of resources, not much work is being poured into the stdlib. Same for developer tooling: some people have created IDE tools and such, but they are not AST-based, so most of the things that should work don't. Yet instead of pouring resources into these issues, the resources are poured into developing three compilers: dmd, the GCC-based GDC, and the LLVM-based LDC.

Also, the readability issues raised by the OP are subjective. To me, and many others, Nim is much, much more readable and elegant than C/D/C++/Rust.

Also, regarding the OP's "Many advertised features have been present in D for a long time":

Nim has macros, and many features are based on or built around macros, whereas D will never get macros. IIRC D uses reflection-based facilities instead?

> I seriously doubt that Nim or any other language in this regard has better metaprogramming than D.

Well, you are wrong (unless you like reflection more than macros). Nim, Haxe, and other such languages have very powerful AST-based macros.


Are you the same person who attacked people on the D forum? The nickname is familiar, and so is the style ;)

Chill. All languages have dusty corners, and given the number of people D has, it's obvious that not everything gets fixed promptly; but things have been fixed and will be, history has already proved it. Besides, complaining is easier, right?


> Are you the same person who attacked people on the D forum?

Nope. I am the person who asked about the dlang VS plugin and the new ds module recently. I don't remember attacking anybody.

> All languages have dusty corners, and given the number of people D has, it's obvious that not everything gets fixed promptly; but things have been fixed and will be, history has already proved it.

Well, it was you guys who asked why D is not discussed/used more. I just gave my honest opinion. D's IDE tooling is very bad compared to newer languages', even Zig's. Its stdlib is not being worked on. Now, what do you expect? Why would people use D? How is creating three compilers not a waste of resources when there are two important issues that nobody is working on? Do you think more people/orgs will flock to D because it has three compilers, even though its tooling is very bad?

> Besides, complaining is easier right?

I don't like how this gets posted whenever somebody points out flaws in something. I am not a compiler developer. I don't have much knowledge about low-level stuff. I want to use a programming language for my field of expertise. If the language/ecosystem is not good for that, what do you expect me to do? Do you want me to leave everything aside, set apart two years to learn compilers and how tooling is made, then leave everything aside for two more years and start contributing to D? Is that what you want me to do?

When you release a project, if you want it to be used by people, it is your responsibility to make sure it is usable. It is not the customer's responsibility to fix the product.

Look man, I intend no disrespect to anybody. I respect Walter and the other people involved. However, what I said is a fact: D/Nim/Crystal etc. have few to no resources for development. It is up to their leads to prioritise things for sustainability. I understand how difficult that is for a project this size. But you people can't put the blame on people like me for not contributing to the project, or for saying that the reason people don't use D is that its tooling is nonexistent and its stdlib is dead. AFAIK, many core people also agree that dmd should be deprecated and the focus/resources put elsewhere.


Ok. But going back to the issue: I live in Munich and know at least two big companies that use D in production and organize meetups. I have yet to see a single company that uses Nim.


The biggest sponsor of Nim is Status:

- https://github.com/status-im?q=&type=&language=nim

It's used by Beamdog for tooling for Neverwinter Nights: Enhanced Edition:

- https://github.com/niv/neverwinter.nim

It's used in academia, at the University of Utah and in Germany:

- https://github.com/brentp

- https://github.com/Vindaar

And a couple more, in trading and pharma:

- https://github.com/nim-lang/Nim/wiki/Companies-using-Nim


Well, mainly Status.im uses Nim, and there is one company based in China, I think (some of their employees are regulars in the Nim gitter/irc/discord). Also, there is https://github.com/nim-lang/Nim/wiki/Companies-using-Nim , though I don't know how up to date it is. Anyhow, there are not many jobs for either Nim/D/Haxe/... so they are mostly used for personal projects. I wanted to use D, but it didn't work out, so onto other languages :(


Here's a good overview: https://news.ycombinator.com/item?id=23494490

Basically, D was too little, too late, with a lack of build tools, while Rust, Nim etc. were designed from the beginning to have certain features and easy-to-use build tooling like Cargo.


There is a perception that D has shot its bolt, having been around for quite some time without making a noticeable impact, either by itself or by influencing the mainstream.


It was developed by too few people from the start, and by the time it became usable it was way too late.


Or, alternatively: Once something _works_ it is no longer interesting . . . ?


Depends on where you read ;) But to be honest, the D community concentrates around the D forum and consists mostly of former C/C++ veterans. Being a younger language, Nim lures younger, more active members who have probably never heard of D anyway.

Historically, D never advertised itself enough, IMHO, and that is a shame. I blame the lack of proper leadership and management at the start. Its early development was a rocky ride that many languages would not have survived at all, but D did, and for me personally that is a testament to a maturity and resilience newer languages have yet to prove.


D has a native IDE and working REPL OOTB. They are all jealous!


> Many advertised features have been present in D for a long time.

They have also been in Nim for a very long time. The initial Nim version was released in 2008, and is comparable to D2 (started 2007) rather than D1 (started 2001). So they are effectively contemporary, and gained a lot of the features at comparable times (and with some cross pollination, I'm sure, although I wasn't there to witness it).

> Syntax wise Nim it is a step back. It's hard to read and understand while any C/C++ dev will have next to no effort reading through D code.

There is no accounting for taste, but Nim syntax is Pythonic, and Python is generally regarded as one of the easiest (if not THE easiest) real programming languages to pick up. Both D and Nim get hairy when you use advanced metaprogramming features, but at the surface level Nim is likely to be easier to pick up, unless your audience is exclusively C/C++ programmers (and maybe even then).
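
For reference, surface-level Nim looks like this (a small hypothetical sample):

    # indentation-based blocks, no braces or semicolons, type inference
    proc fib(n: int): int =
      if n < 2: n
      else: fib(n - 1) + fib(n - 2)

    for i in 0 ..< 10:
      echo fib(i)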

> I seriously doubt that Nim or any other language in this regard has better metaprogramming than D.

You should learn some Lisp then, the mother of all metaprogramming systems :) And also, you should look at Nim; it's not quite Lisp, but it goes as far as (and perhaps farther than) Lisp-without-reader-macros. This example[0] embeds a compile-time type-checked, syntax-checked SQL dialect into Nim.

> Nim might have a speed overhead in certain tasks but that depends on a benchmark. Besides, does Nim have anything similar to NumPy which is actually faster? D does.

Yes, Nim has Arraymancer (by mratsim, who wrote this raytracer as well), which is considerably faster than NumPy and also natively supports CUDA and OpenCL, IIRC, even though it's still younger and so not as complete as NumPy. But Nim also has Nimpy, which lets you mix Python and Nim with the least friction (and generates one executable that works, and works equally well and quickly, with whatever Python you happen to use at runtime - Py27, Py36, Py37, Py38 - I'm not familiar with anything else that does that). There was also NimBorg, which provided similar mixing with Lua, but it seems to be abandoned now for lack of interest.
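
For reference, the Nimpy side of that mixing is tiny (a sketch; it assumes numpy is installed in whatever Python is found at runtime):

    import nimpy

    let np = pyImport("numpy")               # binds to the runtime Python
    echo np.mean(np.arange(10)).to(float)    # 4.5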

> And of course while D is C ABI compliant it interoperates with C++ well too.

Nim is source-level as well as ABI-level compatible with C, C++ and Objective-C (you can use C++ exceptions and objects natively, no need for an "extern C" wrapper; same with Objective-C). It is also natively compatible with JavaScript (though obviously not in the same compilation unit...). D is under-appreciated, for sure, but Nim is definitely not lesser, and it's about as old.

[0] https://juancarlospaco.github.io/nim-gatabase
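
To make the C++ interop concrete, a minimal sketch compiled with `nim cpp` (illustrative only, not a complete std::vector binding):

    type IntVec {.importcpp: "std::vector<int>", header: "<vector>".} = object

    proc initIntVec(): IntVec {.importcpp: "std::vector<int>()", constructor.}
    proc pushBack(v: var IntVec, x: cint) {.importcpp: "#.push_back(@)".}
    proc size(v: IntVec): csize_t {.importcpp: "#.size()".}

    var v = initIntVec()
    v.pushBack(1)
    v.pushBack(2)
    echo v.size()   # 2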


Thank you for the detailed answer. It clears things up.


> Syntax wise Nim it is a step back. It's hard to read and understand while any C/C++ dev will have next to no effort reading through D code. And of course while D is C ABI compliant it interoperates with C++ well too.

On the other hand, people coming from Python, Ruby, Pascal, Ada will appreciate the minimal amount of sigils.

> I seriously doubt that Nim or any other language in this regard has better metaprogramming than D.

Can you write an embedded DSL + compiler running at compile time in D?

- https://github.com/numforge/laser/tree/master/laser/lux_comp...

Can you write a cryptographic library with hex conversion to big integer and modular arithmetic running at compile-time in D?

- https://github.com/mratsim/constantine/blob/master/constanti...

Can you generate a state machine that lowers down to optimized computed gotos with no dynamic allocation suitable for multithreading runtimes and embedded devices and able to display the actual graph at compiletime?

- https://github.com/mratsim/Synthesis

Can you describe x86 opcodes in a declarative way for a JIT assembler and do all the rex/modrm/sib computation at compile-time?

- https://github.com/mratsim/photon-jit/blob/master/photon_jit...

And the same thing for an emulator?

- https://github.com/mratsim/glyph/blob/master/glyph/snes/opco...

Can you implement async as a library?

Can you emulate classes with ADTs to solve the expression problem, avoiding the cache misses and multithreading problems of OOP and the double indirection of the visitor pattern?

- https://github.com/mratsim/trace-of-radiance/blob/master/tra...

> Nim might have a speed overhead in certain tasks but that depends on a benchmark. Besides, does Nim have anything similar to NumPy which is actually faster? D does.

It does:

- https://github.com/mratsim/Arraymancer

With CPU, OpenCL and CUDA backends (in differing states of maturity),

and it's being used for data analysis, for example ancestry prediction:

- http://home.chpc.utah.edu/~u6000771/somalier-ancestry.html

You can even run a neural network and see its training live:

- https://github.com/Vindaar/NeuralNetworkLiveDemo

AFAIK D doesn't have any plotting library, while Nim has plot.ly integration and a ggplot2 port in pure Nim:

- https://github.com/brentp/nim-plotly

- https://github.com/Vindaar/ggplotnim

And lastly nim can easily call Python:

- https://github.com/yglukhov/nimpy

or be called from Python:

- https://github.com/Pebaz/nimporter
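
For reference, the Nim side of being called from Python is a single pragma (a hedged sketch; `mymath` is a hypothetical module name, and nimporter automates the build step):

    # mymath.nim -- build roughly as: nim c --app:lib --out:mymath.so mymath.nim
    import nimpy

    proc add(a, b: int): int {.exportpy.} =
      a + b

After which `import mymath; mymath.add(2, 3)` works from Python.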


Also, all good points. Appreciate that.


I suspect D's metaprogramming is not as good as Nim's (I haven't used the latter), though it's better than the metaprogramming of every macro-less language I've used.

That said, it can do all of the metaprogramming-related tasks you mention.


Are you aware of D libraries in that vein? I'm especially interested in the first 2, since those are something I'm actively working on at the moment.


I don't know of any DSLs, but the stdlib arbitrary precision integer library works just fine at compile time:

  import std.bigint, std.stdio;
  enum x = BigInt("0xffffffffffffffffffffffffffffffff") % 2;
  void main() { writeln(x); }
('enum' means 'make this value a compile-time constant; error if you can't')


I gave quite a few statically-compiled languages a try a few years back before settling on Nim as my "fat binary" language of choice.

D was a contender, but ultimately the reason I dropped it was that I was unable to compile a hello world on my laptop. Now, that machine is not fast, but it is actually pretty new; it's a very low-end 2017 Dell machine, and it turned out that bootstrapping a D environment requires a mid-spec machine or it literally cannot complete.

I dunno man, D seems to have a "last 5%" problem. It looks good on the surface, but as you start looking into it you discover that the bootstrap tools are fat as hell, the core library has a weird split in GC styles, and the docs are inconsistent. Everything you do in D is 5% harder than it needs to be; nothing is buttery smooth. All those 5%s multiply together to make it a 20-30% worse experience overall, although I couldn't point at any one thing and say "that is what killed D".


This is pretty good. I was wondering about CUDA and it looks like Nim has support for writing CUDA kernels. Is there a repository of kernel implementations I can take a look at?


https://github.com/mratsim/Arraymancer probably has a lot of examples in its implementation


Yup. That's what popped up when I searched. Thanks for the reference though.


The CUDA kernels are stored here:

- https://github.com/mratsim/Arraymancer/blob/master/src/tenso...

With the higher order function:

- https://github.com/mratsim/Arraymancer/blob/master/src/tenso...

And their code generation:

- https://github.com/mratsim/Arraymancer/blob/master/src/tenso...

It's possible to avoid inlining C++ like they do here:

- https://github.com/jcosborn/cudanim/blob/338be782/cuda.nim#L...

but I will explore that later.


Thanks.


I keep seeing Nim and it looks really productive and very interesting! Does anyone know how it compares to Zig?


Zig is mainly focused on low-level programming: every function that allocates takes an allocator argument, there are lots of builtin functions for low-level bit twiddling, it doesn't hide any control flow, etc.

Nim is mainly a productivity language. You have GC by default, an extensive stdlib, and lots of syntactic sugar and metaprogramming features. You can also create untraced pointers and do manual memory management.

Thus, among the new crop of compiled programming languages:

Productivity languages with GC: Crystal, Nim, Go, Swift (automatic refcounting, not GC)

Manually memory-managed, low-level: Zig, Rust, Odin. Nim can do manual memory management as well (see the sketch below), but it doesn't seem to be the primary use case.
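
For reference, Nim's escape hatch is a pair of system procs (a minimal sketch; `ptr` is untraced, so the GC never sees it):

    var p: ptr int = create(int)   # raw heap allocation, invisible to the GC
    p[] = 42
    echo p[]
    dealloc(p)                     # freeing it is your responsibility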


Even though Rust has manual memory management, I'd say it can be pretty high-level, due to functional programming features that come close to those of Haskell or OCaml.


True. But Rust's aim (at least as its original author envisioned it) is low-level systems programming, where automatic memory management is not desired. The aim of safety in low-level programming is nice, but it comes with tradeoffs, e.g. not allowing certain patterns, and having to think about memory/ownership even when not required.

A vocal minority of Rust fanboys herald it as if these were not problems and Rust were the right tradeoff for every programming task; as if, when the crab god doesn't like the way you structured your program, there must be fundamental mistakes in your programming that you should have fixed anyway. That kind of dishonest blanket statement is what leads to the Rust Evangelism Strike Force memes.


Zig supports zero-effort C binding by reusing existing C headers instead of wrapping them^. The only other language known to me with such a feature, while not actually being a superset of C, is Terra†. I feel like we are still several light years away from the Holy Grail of programming languages, one that would retain full compatibility while also being extremely productive. Walter Bright (the creator of D) once stated that the minimum age at which to develop a feasible programming language is 40-ish, so hopefully a generation raised in the world of modern computing will manage to find the ultimate syntactic solution.

^) https://ziglang.org/#Integration-with-C-libraries-without-FF...

†) http://terralang.org/

EDIT: replaced nasty star parsed as markup with a caret. Sorry.


Not sure the age of the original author is very relevant: GvR was 35 when he started work on Python, Chris Lattner was 31 with Swift, and Brendan Eich was 33 with JS. These are some of the most used languages today. I'd say it's more a matter of being in the right place at the right time to serve a particular use case.


Has anyone had any luck reproducing the benchmark figures given for the smallpt program? When I try, gcc handily beats nim in the multi-threaded scenario and they are neck and neck in the single-threaded one. For 10 samples per pixel:

    g++ multi-threaded : 33.52
    nim multi-threaded : 36.90
    g++ single-threaded: 75.78
    nim single-threaded: 75.95
nim 1.2.0, g++ 10.1.0. I used the flags -O3 -march=native -mtune=native -Wall -Werror for gcc and the default flags for nim.
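
(For anyone reproducing this: a typical high-optimization Nim build would be something along the lines of the command below; `smallpt.nim` is a placeholder name, and whether the article used these exact flags is an assumption.)

    nim c -d:danger --passC:"-march=native -mtune=native" smallpt.nim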


Here is the code if anyone is interested: https://github.com/bjourne/c-examples/blob/master/programs/s... With a few tweaks, g++ beats nim both in the multi-threaded and single-threaded setups.


Nim compiles down to C code, which is then compiled to assembly and machine code. How, then, does Nim avoid the occasional security issues, buffer overflows, and other undefined-behavior problems that always arise in C programs?

Is this not a concern? Or is Nim vetted well enough that it doesn't rely on defective C libraries?


Anyone have thoughts on Nim vs Rust for data-centric (Scala-like) applications?


I can't speak to data centric but I was looking at both Rust and Nim for a project recently. I love the syntax of Nim and it feels incredibly easy to write in. The power of the macro system is amazing.

But the Rust community is so much bigger, pretty much every question I had was answered after a quick Google search.


I agree. I really struggled to get help with certain problems with Nim, but with Rust it has been fairly trivial to find help. The community is very active and helpful.

Nim does have a syntax I prefer, but overall I find I'm quite a bit more productive with Rust.


The most effective venue for quick Nim answers is the Nim forum; almost everything gets a serious, well-thought-out answer within a few hours to a day.

That said, nothing beats “already been answered” and the Rust community is much bigger.


That depends on why you use Scala.

Do you use Scala as a better Java? If so, Nim may be what you want.

Do you use Scala for its incredible type system? Rust may be what you want.

Do you use Scala for its functional tendencies? Rust may be what you want.

As an aside, I've found that once you get over the Rust learning curve, writing 'correct' programs is easier in Rust than any other language I've used. And for data centric applications, being 'correct' is extra important.


What about scalable data, like Spark applications? I am familiar with Weld, but it seems like more mature distributed computing is still on the horizon.


It's been a couple of years, but mratsim (the same person who authored this ray tracing post) wrote this, which is at least tangential to your question: https://medium.com/@MARatsimbazafy/why-rust-fails-hard-at-sc....

Some of the rust problems mentioned (like clone for arrays) have been fixed. Others, like static generics, have not been (AFAIK).


Apart from Python, you could look at Julia.


If they're data-centric, would you not be better off with vanilla Python? If you have numpy do the heavy lifting, it'll be as fast, plus you get all the syntactic/ecosystem benefits.


The main reasons I see for nim in scientific computing over python are:

(a) you need performance for applications outside of numpy/pandas and you want to avoid writing a C extension to get it (nim is also actually a good choice for python extensions via nimpy, if you are still using python as the main language of your project/system)

(a2) you want good concurrency (some would say python's async/await is good, but it feels too little, too late to me) and you at least want the option of parallelism (which this article takes advantage of)

(b) you want/need the powerful macro system for DSLs and custom (compile-time-checkable) language extensions.

There is a good future for nim in scientific computing (frankly, there is a good present), but for it to get better, people need to be willing to forgo some of the ecosystem (when possible) and to help build that future.

https://github.com/nim-lang/needed-libraries/issues/77


The OP said Scala-like. A big advantage of Scala/Spark over Python is more compile-time checking: the code is much less likely to fail at runtime. That is particularly helpful for production code (e.g. pipelines). For people coming from an engineering background (as opposed to data science), Python's lack of any static type checking for data-processing code can be a big productivity killer. In that sense, Nim might be able to offer similar advantages to Scala.


Numpy is nice but pure-Nim applications appear to outperform it easily:

https://narimiran.github.io/2018/05/10/python-numpy-nim.html

https://github.com/mratsim/Arraymancer


Interesting, but given the low total execution time, the majority of the time in the numpy implementation is going to be spent initializing CPython.

I think if this were on the order of minutes/hours you would see numpy outperform.


Yeah, I like nim, but it isn't going to offer an advantage over python in this regard.



