Roc couldn't be optimized for writing the Roc compiler without sacrificing some of its own goals. For example, Roc is completely memory-safe, but the compiler needs to do memory-unsafe things. Introducing memory-unsafety into Roc would just make it worse. Roc has excellent performance, but it will never be as fast as a systems language that allows you to do manual memory management. This is by design and is what you want for the vast majority of applications.
There are a number of new imperative features that have been (or will be) added to the language that capture a lot of the convenience of imperative languages without losing functional guarantees. Richard gave a talk about it here: https://youtu.be/42TUAKhzlRI?feature=shared.
It still feels kinda weird. Parsers, compilers, etc. are traditionally considered one of the "natural" applications for functional programming languages.
A compiler is a function from source code strings to binary bytes. Writing out instructions to do memory-unsafe things is not in itself a memory-unsafe activity.
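To make that concrete, here's a minimal sketch in Python of a code generator as a pure function from a value to machine-code bytes (the opcodes are real x86-64 encodings for `mov eax, imm32` and `ret`, but the function names are invented for illustration):

```python
import struct

def emit_mov_eax(value: int) -> bytes:
    # x86-64: B8 imm32  --  mov eax, imm32
    return b"\xb8" + struct.pack("<I", value)

def emit_ret() -> bytes:
    # x86-64: C3  --  ret
    return b"\xc3"

def compile_constant(value: int) -> bytes:
    # A pure function from an integer "program" to binary bytes.
    # Nothing here touches raw memory; we only construct a value,
    # even though the bytes themselves encode low-level operations.
    return emit_mov_eax(value) + emit_ret()
```

Whether those bytes are later executed unsafely is the runtime's problem; producing them is an ordinary pure computation.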
FP is bad for computing the Fibonacci series unless the compiler can optimize the tail-recursive form into a loop (which is how you'd write it in an imperative language anyway).
To be fair, most practical FP languages have that optimization, but I never saw the appeal of a strictly functional general-purpose language. The situations where I've wished imperative constructs were off-limits are very domain-specific.
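For what it's worth, the loop transformation being discussed is just tail recursion with accumulators; a sketch in Python (which notably does not do tail-call optimization, so the recursive forms blow the stack for large n, which is the commenter's point):

```python
def fib_naive(n: int) -> int:
    # Direct functional definition: exponential time, deep recursion.
    return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

def fib_tail(n: int, a: int = 0, b: int = 1) -> int:
    # Accumulator-passing (tail-recursive) form. A compiler with
    # tail-call optimization turns this into the loop below.
    return a if n == 0 else fib_tail(n - 1, b, a + b)

def fib_loop(n: int) -> int:
    # The same computation written the imperative way.
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a
```

All three agree on small inputs; only the loop is safe for large n without TCO.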
That's an interesting point, and something I thought of when reading the parser combinator vs. recursive descent comparison.
Around 2014, I did some experiments with OCaml, and liked it very much
Then I went to do lexing and parsing in OCaml, and my experience was that Python/C++ are actually better for that.
Lexing and parsing are inherently stateful; it's natural to express those algorithms imperatively. I never found parser combinators compelling, and I don't think there are many big / "real" language implementations that use them, if any. They are probably OK for small languages and DSLs.
I use regular expressions as much as possible, so it's more declarative/functional. But you still need imperative logic around them IME [1], even in the lexer, and also in the parser.
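A tiny example of that mixed style: the token shapes are declarative regexes, but the surrounding logic (advancing the position, skipping whitespace, reporting errors) is an imperative loop. This is a sketch in Python with an invented toy token set:

```python
import re

# Declarative part: what each token looks like.
TOKEN_RE = re.compile(
    r"(?P<num>\d+)|(?P<id>[A-Za-z_]\w*)|(?P<op>[+*()=])|(?P<ws>\s+)"
)

def lex(src: str) -> list[tuple[str, str]]:
    # Imperative part: a loop carrying the current position as state.
    tokens, pos = [], 0
    while pos < len(src):
        m = TOKEN_RE.match(src, pos)
        if m is None:
            raise SyntaxError(f"unexpected character at {pos}: {src[pos]!r}")
        if m.lastgroup != "ws":  # drop whitespace tokens
            tokens.append((m.lastgroup, m.group()))
        pos = m.end()
    return tokens
```

Even with the regex doing the heavy lifting, the position tracking and error handling are plain mutable-state code.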
---
So yeah, I think that functional languages ARE good for writing or at least prototyping compilers -- there are lots of examples I've seen, and sometimes I'm jealous of the expressiveness
But as far as writing lexers and parsers, they don't seem like an improvement, and are probably a little worse
When I debug a parser, I just printf the state too, and that is a little more awkward in OCaml as well. You can certainly argue it's not worse, but I have never seen anyone argue it's better.
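The printf-the-state style looks something like this; a minimal recursive-descent sketch in Python with the state dump inlined (the grammar and names are invented for illustration):

```python
def parse_expr(tokens: list[str], pos: int, depth: int = 0) -> tuple[int, int]:
    # Toy grammar: expr := NUM ('+' expr)?  -- just enough to show the idea.
    # The print exposes the parser state (position, lookahead) at each call.
    print(f"{'  ' * depth}parse_expr pos={pos} next={tokens[pos:pos+1]}")
    value = int(tokens[pos])
    pos += 1
    if pos < len(tokens) and tokens[pos] == "+":
        rhs, pos = parse_expr(tokens, pos + 1, depth + 1)
        value += rhs
    return value, pos
```

Dropping a one-line state dump into the middle of a mutable-position parser like this is exactly the workflow being described.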
---
Culturally, I see a lot of discussions like this, which don't really seem focused on helping people finish their parsers:
Roc is pitched as having great performance for a GC'd language, that is, on par with Java, Go, and C# rather than Ruby, Python, and JS. The Roc compiler team is looking for C, C++, Rust, Zig kind of performance. Roc will, by design, never reach that kind of speed.
And now they are doubling down on that by moving from "OCaml meets C++" to "C, the good parts"!
If FP isn't good for writing a compiler, what is it good for?