Rust's Poor Composability (thedav.is)
115 points by sinistersnare on April 6, 2023 | 159 comments



>I love Rust. I wish they would spend more time making it actually work for non hello-world use-cases.

Considering Rust has been used for everything from basic userland utilities and databases, to high-performance network stacks serving billions, compilers, and even Linux kernel drivers, I'd call this statement "not even wrong".

>What's the point of having the 'pretty' syntax if it only works in the simplest of cases?

That the "simplest of cases" are 90% of what you use 'for' for? The syntax should "make simple things easy, and hard things possible", and this is an example of exactly that!

>I hate syntax that only works in hello world examples

In which universe is basic iteration a "hello world" use case? It's used in basically EVERY program, all the time. That's "bread and butter", not "hello world" level.

And the rest also has a reason to be there, not that someone "infuriated" would try to go deeper.

>There should be one-- and preferably only one --obvious way to do it.

That's a design goal for another language. Which failed much worse than Rust at that, and for no good reason.


https://without.boats/blog/the-registers-of-rust/ is pretty transparent about Rust not having or aiming for "only one obvious way to do it". Rust has more of a mindset that if you're willing to mess with the fiddly bits, then you get control of all of the effects.


> That the "simplest of cases" are 90% of what you use 'for' for?

Anything that can't be done gets exactly 0% of the usage. I don't know where you got 90% from.

The author clearly comes from a mindset of trying to write Haskell code in Rust. But the criticism is fair, those are 3 very ugly aspects of the language, and it may be possible to improve them without breaking the general behavior... And even if it isn't, it is well worth it to be aware of them.


>Anything that can't be done gets exactly 0% of the usage. I don't know where you got 90% from.

I'm not sure what the first sentence means.

Plain iteration across all elements is the most common case for "for".

That's where I get the 90% from (meaning, most of the time you'll be doing that).

for/in without filters etc. is not something that's just for "hello world" style programs (only useful as a very crude example that you won't find in any real program), as the author describes it.

It's the very opposite: it's what you will most commonly use, multiple times, in every program you write.


I'm not familiar with Rust.

  for x in &mut something {
      *x = (*x) * 2;
  }
Here (above) it looks like the `&mut` is allowing the iterator to be mutable (https://doc.rust-lang.org/std/keyword.mut.html).

  for (i, x) in something.iter_mut().filter(|| {...}).enumerate() {
      *x = (*x) * i
  }
The `mut` keyword isn't needed here because, assuming the above is correct, "iter_mut()" is returning a mutable iterator. Is there some reason the result of "something.iter_mut().filter(|| {...}).enumerate()" can't be saved to a variable and then used as the "something" in the first example?

Anyway, this post reminds me of an article, "Why Programming Language X is Unambiguously Better Than Programming Language Y"[0]. Mostly talking about how [poorly written Y code] is why Y is a bad language and [beautiful and elegant X code] is why X is a good language.

[0] https://news.ycombinator.com/item?id=6960398; unfortunately the link is dead and I can't find it on the domain


> Is there some reason the result of "something.iter_mut().filter(|| {...}).enumerate()" can't be saved to a variable and then used as the "something" in the first example?

You absolutely can do something like `let iter = something.iter_mut().filter(|| {...}).enumerate();` In Rust, iterators just need to implement a trait (kind of like a Go or Java interface), so they can be put in normal variables just fine.
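For instance, a rough sketch (vector contents and filter invented) of building the chained iterator first and then feeding it to the plain for loop:

    let mut something = vec![1, 2, 3, 4];

    // build the iterator up front…
    let iter = something.iter_mut().filter(|x| **x % 2 == 0).enumerate();

    // …then hand it to the "pretty" for loop from the first example
    for (i, x) in iter {
        *x = (*x) * i as i32;
    }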

> The `mut` keyword isn't needed here because, I assume if the above assumption is correct, "iter_mut()" is returning a mutable iterator.

Every for loop implicitly calls `.into_iter()` on the thing to loop over; the docs are decent at explaining what IntoIterator is [0]. By doing `&mut something` instead of just `something` you get a different IntoIterator implementation which has very different semantics. Typically `(&mut something).into_iter()` would have the same behavior and might even result in the same type as `something.iter_mut()`.
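Roughly (glossing over details of the real desugaring), the loop in the first example expands to something like this, with `v` standing in for `something`:

    let mut v = vec![1, 2, 3];

    // `for x in &mut v { *x = (*x) * 2; }` behaves approximately like:
    let mut iter = (&mut v).into_iter(); // picks <&mut Vec<i32> as IntoIterator>
    while let Some(x) = iter.next() {
        *x = (*x) * 2; // x: &mut i32
    }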

[0]: https://doc.rust-lang.org/std/iter/trait.IntoIterator.html


Thanks for taking the time to explain!


Like mentioned below, for loops desugar to something that ends up calling the "into_iter" method on the type. In rust &mut T, &T and T are different types, so the "into_iter" method is actually different for each one!

Here is an example to illustrate what I mean (I did not test to see if it compiles so there may be some mistakes):

    struct Foo<T>(Vec<T>);

    impl<T> IntoIterator for Foo<T> {
        type IntoIter = std::vec::IntoIter<T>;
        type Item = T;
        fn into_iter(self) -> Self::IntoIter {
            self.0.into_iter()
        }
    }

    impl<'a, T> IntoIterator for &'a Foo<T> {
        type IntoIter = std::slice::Iter<'a, T>;
        type Item = &'a T;
        fn into_iter(self) -> Self::IntoIter {
            self.0.as_slice().iter()
        }
    }

    impl<'a, T> IntoIterator for &'a mut Foo<T> {
        type IntoIter = std::slice::IterMut<'a, T>;
        type Item = &'a mut T;
        fn into_iter(self) -> Self::IntoIter {
            self.0.as_mut_slice().iter_mut()
        }
    }
Doing "for x in T" calls the first method, "for x in &T" calls the second, and "for x in &mut T" calls the last one.

As you see here, each IntoIterator returns a completely different type, and the first IntoIterator actually consumes the struct Foo as well!

There are also the methods "iter" and "iter_mut"; these exist by convention and technically aren't required by the iterator-related traits. They're useful for iterator pipelines that don't involve a for loop at all.
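As a small, made-up illustration of the difference in a pipeline context:

    let v = vec![1, 2, 3];

    // `iter()` borrows, so `v` is still usable afterwards
    let doubled: Vec<i32> = v.iter().map(|x| x * 2).collect();

    // `into_iter()` consumes `v`; it can't be used after this line
    let tripled: Vec<i32> = v.into_iter().map(|x| x * 3).collect();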


Taking everything someone says strictly literally does not make other people "not even wrong." It's a reading comprehension error and is uncharitable.

The author really likes Rust. He's not some naysayer. He'd just like Rust to be even better. He provides constructive feedback.


>Taking everything someone says strictly literally

The author intends it to be taken literally. He even goes on to expand on the "literalness" of his put-downs, with several paragraphs of invalid arguments furthering the exact same non-point.

>The author really likes Rust.

Liking and understanding are two different things.

>He provides constructive feedback.

See, this is where I beg to differ.


> BUT when you try to use it with iterators -- which are also amazing, I love using iterators -- IT DOESNT WORK.

Yes it does

  let res: Result<Vec<_>, _> = iterator.iter().map(|x| x.foo()).collect();
  let res = res?;
> But this also happens elsewhere, because of this ugly inflexability, we cant do:

Use and_then:

  let z = x.foo().and_then(|y| y.bar())?;
I get that there's an upfront cost to learning this stuff, but I think he's blaming the language a little too aggressively.


Yea, it's a game I often play. "Boy it would be nice if I could do X" where X is something like `and_then`, and then I go look it up in the stdlib and.. it's there.

For a good while my X's were a nightly-only experimental API, but slowly and surely they make it into stable. (Hash|BTree)Map's `.entry` methods (and friends) were another example.

Generally speaking these methods are super helpful and someone else already thought of it. From my experience at least.

.. also this article was infuriating hah.


Also,

  let res = iterator.iter().map(|x| x.foo()).collect::<Result<Vec<_>, _>>()?;


And people complain about ))) in Lisp...


I love rust because it satisfies my inner perl


It is a non-obvious Rust-ism that if you’re doing something with iterators and you don’t know how, it’s probably .collect(). ;-)

Kind of like how the answer is always ‘traverse’ in Scala.

.collect() gets overloaded a lot to deal with the intricacies of the return type, which usually says a lot about the computation, e.g. Result or HashSet.
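A throwaway sketch (the data is made up) of what that overloading looks like in practice; the annotation alone decides what comes out:

    let words = ["a", "b", "a"];

    let as_vec: Vec<&str> = words.iter().copied().collect();
    let as_set: std::collections::HashSet<&str> = words.iter().copied().collect();
    let as_string: String = words.iter().copied().collect();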


It just reads better in terms of code.

`let res: Result<Vec<_>, _> = iterator.iter().map(|x| x.foo()).collect();`

Left to right reading this statement is "iterate over the values and map it to the output of foo for each value. Then collect that map into a vector."

His other example

    for (i, x) in something.iter_mut().filter(|| {...}).enumerate() {
        *x = (*x) * i
    }

can also be read from left to right as "iterate over something, then filter and enumerate the values"

It's much more explanatory than top down structures with 10 lines to explain the same thing.


It's not the same.

Using the question mark in a closure in general is confusing.

Doing ? on the return value of a statement doesn't make sense. Normally ? is used for: here is this Result (I forgot the trait's name), and give me the Ok(), and if it is Err() then RETURN to the caller.

So the function itself must be of type Result<_, _>.

With the lambda in the example it is unclear by reading the code what the ? should do. Stop at the first error and return Err? Or should the return value be Result<Vec<_>, _>?

One can use the try_ functions on an iterator.


Considering the author has been using Rust for 10 years (if we believe them), I think that's saying something.


The problem with .map().collect()? is that it does not short-circuit.

Consider:

  for item in iterable.iter() {
    item.foo()?;
  }
You will exit the loop as soon as there is an error. But with .map().collect(), you iterate over the whole list, and then exit if one of the items yields an error.

EDIT: Thank you for your answers, it seems I was wrong and it does short-circuit!


Incorrect: collecting to a Result stops on the first error, as documented explicitly in https://doc.rust-lang.org/std/result/enum.Result.html#method....
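For example, this made-up program (values and error type invented) prints "visiting 0" through "visiting 3" and nothing after:

    fn main() {
        let res: Result<Vec<i32>, String> = (0..5)
            .inspect(|i| println!("visiting {i}"))
            .map(|i| if i == 3 { Err("boom".to_string()) } else { Ok(i) })
            .collect();

        assert!(res.is_err()); // 4 was never visited
    }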


Realistically, the 'problem' is that many programmers today get scared away when you start mentioning monads or effect systems. Realistically though, the solution is to have first class monads or effect systems in rust and other languages. A lot of language design today is 'wasted' in the sense that we spend time implementing special cases (Promises in javascript, ? syntax in rust, async syntax in rust) without any thought given to the meta problem. In my opinion, the lack of attention to the meta problem is due to programmers being scared of such topics.


Map and friends are lazy, so the whole chain is run once and in full for each item. Collect will short circuit at the first Err encountered. You can even try it out by using `.inspect()` in the chain to see that the iterations don't go past the first error.


That's not true: https://play.rust-lang.org/?version=stable&mode=debug&editio...

    let v: Option<Vec<i32>> = (0..5).map(|i| {
        print!("{i} ");
        if i == 3 { None } else { Some(i) }
    }).collect();
    // prints 0 1 2 3


Short-circuit refers to stopping not skipping.


I don’t understand. The example you replied to stopped, not skipped.


it should also have printed 4 if it was skipping?


Yeah I read too fast.

One of the issues with type inference, this exact same code does different things depending on whether the result is assigned to `Option<Vec<i32>>` or `Vec<Option<i32>>`


Collecting into a result will short circuit.



Does this early exit on error?


Yes. You can prove it to yourself with this program: https://play.rust-lang.org/?version=stable&mode=debug&editio...


This is a pretty low quality rant. Going example by example:

    for (i, x) in something.iter_mut().filter(|| {...}).enumerate() {
        *x = (*x) * i
    }
you can do this in loop style if you want:

    for i in 0..something.len() {
        let x = &mut something[i];
        if ... {
            *x = (*x) * i;
        }
    }
or if you prefer, in iter style:

    something
      .iter_mut()
      .filter(|| {...})
      .enumerate()
      .for_each(|(i, x)| *x = (*x) * i);
The next two examples can be written:

    let res: Vec<_> = iterator.iter().filter_map(|x| x.foo().ok()).collect();
    let z = x.foo().and_then(|y| y.bar());
There is an ergonomics issue around the fact that if you stick `?` or `.await` in a closure, it returns from that closure instead of from the function containing it. But the way this issue manifests itself is the standard library having a pile of different variations on methods, like `.map()` vs. `.filter_map()`. That's the thing to complain about, not the fact that closures work the same way they do in every other language.
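A small, made-up illustration of that proliferation:

    let raw = ["1", "x", "3"];

    // .map() keeps every Result…
    let all: Vec<Result<i32, _>> = raw.iter().map(|s| s.parse::<i32>()).collect();

    // …while .filter_map() is one of the extra variations: here it just drops the failures
    let ok_only: Vec<i32> = raw.iter().filter_map(|s| s.parse::<i32>().ok()).collect();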


I've never used Rust - in a conventional language you'd just have a mutable variable defined outside of the loop and used and incremented inside of the loop. I assume Rust's for-loops can't access the outside variable scope that way?

Like (pseudocode)

    var i = 0
    for(item in items) {
      item.DoThing(i)
      i+=1
    }
Does rust not allow that kind of mutation and access of a variable from outside the loop within the loop?


Sure, you can do something like that, but it is unnecessary. Like Python, you can use "enumerate" to give you a tuple that will give you `i` automatically.

https://play.rust-lang.org/?version=stable&mode=debug&editio...

    struct Item(u32);

    impl Item {
        fn do_thing(&self, i: usize) {
            println!("The value of `i` is: {i}");
        }
    }

    fn main() {
        let items = vec![Item(1), Item(2), Item(3)];

        for (i, item) in items.iter().enumerate() {
            item.do_thing(i);
        }
    }


No, you definitely can. You generally can't mutate things that would cause the iterator itself to be invalidated (like, you can't delete items from a vector while you're iterating over it), but you can get an iterator over a vector that gives you a mutable reference to each of its members, which would be sufficient to do what you're suggesting here.

I think what they were pointing out (and I agree with) is that the mixing of combinators and imperative-style for loops that's going on in the post doesn't feel very natural. Typically if you would _either_ do

  blah
      .into_iter()
      .filter(|x| x.some_condition())
      .for_each(|x| x.do_something());
or

  for x in blah {
      if !x.some_condition() {
          continue;
      }
      x.do_something();
  }
But `for x in blah.into_iter().filter(...` is a bit unusual.


you can do it that way if you wanted to

    let mut i = 0;
    for item in items {
      item.method(i);
      i += 1;
    }
I don't know why you'd do it that way, but Rust definitely isn't stopping you from doing that.

e: and as running example: https://play.rust-lang.org/?version=stable&mode=debug&editio...


Well, the complaint in the article is that the idiomatic way is kind of inscrutable. I'm more than willing to let pragmatism win in those cases, since so many languages make this pattern a nuisance instead of providing a simple "enumerate this array with index" out-of-the-box as part of the standard library. IIRC you have to roll your own in Java, C#, and Powershell.


In C# there's a .Select overload, that gives your lambda both item and its index.


This works fine in rust too. Iterators are a place where the power of the type system (etc) feel like magic and so you'll find a tendency for solving things with the magic (especially in newer programmers, but everyone does it at least a little).

Compare all the crazy things people build out of channels in go, or a similar tendency in python to over-do the generator expressions.


You could definitely do that with Rust, but using `enumerate` rather than mutable variables is considered idiomatic. Neither is more or less "conventional".

lol 3 people said the same thing at the same time. rust strike force assemble


haha yes the discourse level thread is like perfect for all us "we've read the Book" people to comment on.


> the for loop syntax should be simple syntactic sugar for iterator.iter().for_each(BODY), not bespoke syntax. The fact this breaks wrecks my mental model.

I'm sorry to be blunt, but this mental model is wrong. .for_each runs a function, while for does not.

If what you want is to return early on an error, you can .collect() an Iterator<Item=Result<T, E>> directly into a Result<Vec<T>, E> and then ? that. That's the equivalent of for and ?. Or, you can collect into a Vec<Result<T, E>>.

> You must use the for loop syntax if you want to use .await, because iterators are not powerful enough to support real world use-cases.

Given that what this requires is async closures, and async closures are an active area of work, this seems like a somewhat unhelpful comment.

I definitely agree about having better syntax for iter_mut, though.


Look, it's easy to beat up on an article like this and feel good and smug about it.

...but, let's not beat around the bush. Rust has some sharp edges (1).

Why can't you clone a boxed function? I get it, Box<dyn Foo> isn't sized, so you can't clone it. Ok! ...but you can clone a closure because a raw closure is clone if the contents is clone. Not when it's boxed though.

Why can't you express that a struct can contain an object and a reference to the object and have a 'static lifetime?

Yes, you can collect::<Result<...>>, but why can't you just use the `?` sugar for it? There's only one possible reason to end a collect() with ? surely?

Yup. I get it.

Lot of reasons, writing languages is hard.

...but, I dunno. I read this:

> the teams working on language design need to SLOW DOWN and focus on ergonomics and composability.

...and I think about `trait ?async` and I read the roadmap (2), and I have to say, I'm sympathetic. More sympathetic than I should be, perhaps, given how ragey the article is... but still.

I feel that pain too.

I use rust because it's nicer than C++. If it turns into a trash can of features (like C++), why bother?

[1] - https://stackoverflow.com/questions/tagged/rust?tab=Frequent [2] - https://rust-lang.github.io/async-fundamentals-initiative/ro...


We have plenty of languages which have slowed down. I recommend the author use one of them. I will never understand the idea that 'thing X doesn't work for me, thus those behind X need to stop their work to accommodate me, despite the desires of thousands of others'.

If you don't like Rust, don't use it. No one is forcing anyone. In particular, languages such as C are incredibly slow moving and well understood.


> If you don't like Rust, don't use it. No one is forcing anyone. In particular, languages such as C are incredibly slow moving and well understood.

Rust's goal seems to be replacing C++, which kind of means people will have to deal with it. If it's going to be in all these domains, people have to be willing to take criticism constructively.

Saying "you don't have to use all these features" doesn't work either - just look at C++. It's the mess that it is precisely because the language designers never learned to say no to the next cool thing someone wanted to add in.

At the moment, Rust is opinionated and polarizing. Some people who really like it try to push it everywhere. If and as that happens, people who disagree with a lot of the design choices Rust is making are going to have to have a voice in the community.


Wasn't the point of C++ to allow for tacking on everything that couldn't get out of committee in C?


>If you don't like Rust, don't use it. No one is forcing anyone.

To be immediately followed by

>Marginal systems language (D/Nim/V/Go/Odin/Vale) is useless bikeshedding and has no corporate backer, why would anyone waste their time with this, they should use an established well-known language.

PL debates are whiplash.


I think the point being made is that rust has lots of users now, and, perhaps, the “thousands of users” who want new features now now now are not the majority of users any more.

Those are the demands of passionate early adopters.

There are *a lot* of people who don’t like things that move that quickly.

Are the rust teams sure they’re serving their user base? Or are they actually pandering to a vocal minority?

Are they actually addressing real pain points? Or just working on cool stuff?

Have they even asked?


> Are the rust teams sure they're serving their user base? Or are they actually pandering to a vocal minority?

> Are they actually addressing real pain points? Or just working on cool stuff?

> Have they even asked?

Open source maintainers and authors owe you nothing. If you are not paying them, they owe you zero consideration in their work.

The entitlement from 'users' which open source maintainers constantly face is over the top.

The rust team ought to build the language they want and people can use it or not.


Did you know you don't have to use every feature? You can write perfectly valid and useful rust code without async or const fn or GATs or ? or whatnot. Rust is really good at additive features - the new features tend to be a strict superset of what was already there, rather than a "you must do it this different way now" set of changes.

You might not get to use the latest and greatest versions of libraries that way, but that's the explicit goal of not changing things anyway.


>Did you know you don't have to use every feature? You can write perfectly valid and useful rust code without async or const fn or GATs or ?

Do you use libraries from other people? Do you work on projects with others that like those features? Then I bet you'll have to use those features, whether you want to or not.

This isn't Rust specific, mind you, but any feature added that is used and has an effect on the API will be forced onto you at some point.

Since async popped up it's become really hard to avoid. And most packages have converged on Tokio as an executor.

If you use these libraries they will not only force you to use async, they will even force you to pick their executor.

The introduction of GATs will impact interfaces. They will be forced onto you.

I'm not saying this shouldn't happen, in some way it's inevitable for it to happen. But it can be quite harrowing if you're not a sole developer doing all their own libraries.

This is kind of like the "if you use C++ properly you can create safe code in it" mantra. Which again, yes, but only if you have total control of the project, don't have any other developers on it that might impact your design, and don't use libraries.


I’m not talking about stopping work, don’t be ridiculous.

I’m saying is the work currently being done valuable, compared to other quality of life issues in the language.

I posit to you that, fundamentally, if you have a finite allocation of effort, and that effort is always devoted to new features…

…the MVP of the features that exist, with their rough edges will remain, and the MVP of the new features will add to the pool of rough edges.

Go on, make a compelling argument that adding new features without pausing to tidy up doesn’t result in that.

I flat out don’t believe you can.

It’s not a point of contention; it’s a fundamental aspect of software development.

The point in contention here is: are the rough edges we have currently reaching a point where addressing them, rather than adding new ones should be a priority?

That’s a matter of opinion.

You clearly have yours; I disagree. That’s fine.

What I want to raise is that what you and I think individually is not important; this is something for the entire community to talk about and decide on.

Maybe the next rust survey should ask about it?

Then, rather than speculating, we would be able to know, and make decisions based upon that.

Is that… totally unreasonable?


> What I want to raise is that what you and I think individually is not important; this is something for the entire community to talk about and decide on.

Communities discuss things in a whole bunch of one on one conversations as well as larger forums. There's an entire RFC process and a ton of outreach, and committee meetings that can be participated in as well, for more formal process.

So, lets discuss your specifics.

> There's only one possible reason to end a collect() with ? surely?

Well there's Result as you point out. There's also Option. And there's a new feature, the Try trait that allows `?` to be used for other types too. It's still a nightly only feature, because these new changes that come way too fast for you still spend a lot of time in nightly to work out bugs (etc).

So I'm curious of your take here - is it that the features that exist should be worked on to fix them instead of adding new stuff? Is it that you suddenly think that the type requirement isn't a wart after all and they shouldn't be messing with `?`? Is it that your example wasn't one of the most pressing issues, just the first one that came to mind and obviously shouldn't be prioritized despite you claiming it should?

Or maybe you are just wrapping your opinion (that is, dislike of ? and Rust in general) in some nonsense about how the community is being ignored.

> Why can't you clone a boxed function?

What's your plan for fixing this? Are there already RFCs about it? What are the reasons it's not currently clone (e.g. Is it easy but would break old code? Is it actually really hard? Does it break some other invariant?)?

Saying "i want this" is great, but as you say PL ishard. The finite resource thing you mention is true: are you going to add resources and help out or are you going to sit back and complain about how other people choose to spend their time because they don't share your prioritization? No matter how highly you value yourself or your opinion, the people doing this work aren't beholden to you - do the work or hire someone to do the work. Maybe the community will accept your proposal or patch or whatever.

> Why can't you express that a struct can contain an object and a reference to the object and have a 'static lifetime?

There have been proposals about doing this. Go read the discussions on them - usually they have all sorts of weird edge cases that are hard for borrow checking to prove. Perhaps you are the genius that can solve them, maybe you should participate as a community member and write some code to make this happen in a way that doesn't break old code or weaken the borrow checker.

Rust is done out in the open, and the community is who does it. Go participate instead of complaining that volunteers aren't catering to your whims (which no matter how you want to phrase it is what you're demanding).

Further, about the weird false dichotomies:

> …the MVP of the features that exist, with their rough edges will remain, and the MVP of the new features will add to the pool of rough edges.

> Go on, make a compelling argument that adding new features without pausing to tidy up doesn't result in that.

The Linux kernel has spent 25+ years doing new features and cleanup in conjunction.

> Is that… totally unreasonable?

Well no, that's why there are RFCs, very welcoming working groups, great community projects, forums, chat, and... the survey. Maybe you could get your question on the survey if you go talk to the working groups? Perhaps you could go find the answers to your questions in one of the community spaces?

What is unreasonable is whining how a community driven project doesn't listen to people who choose not to participate in the community. What is unreasonable is claiming some imaginary mass of people who all just happen to agree with your exact opinion (coincidentally of course) is being ignored by design. What's unreasonable is that you are presenting a case for a massive conspiracy against users when it might simply be a case of "you didn't even try the existing channels" as it appears to be.


But that's not true, extra features in the language have a complexity cost. They slow down the compiler, they make certain things ambiguous, add bugs, etc. It's not the case that if I don't use GATs they have zero negative impact on me.


What things do GATs make ambiguous? What bugs did they expose to you that affected non-GAT use?

The compiler hasn't slowed down for them, in fact it was sped up recently.

So I'm curious if you can point to instances of your hypotheticals, sure that's all possible, but did it actually happen or are you arguing in the sense of "if you just write perfect code..." and similar grasping at strawmen styles.


If you're claiming that adding complexity to any software project comes without trade-offs then I don't know what to tell you.

I will not spend time researching and writing up specific instances where existing complexity made something harder, slower or more bug-prone. If you think that's not a fact, then you're either not arguing in good faith or don't have much experience with real software.


I'm not claiming that adding complexity doesn't come with tradeoffs. I never said anything of the sort.

> I will not spend time researching and writing up specific instances where existing complexity made something harder, slower or more bug-prone.

I'm asking very specifically if this is true of rustc for GATs, which you called out. I'm literally just asking you to back your claims. I apologize if I made a bad assumption that you were talking about a topic you were actually familiar with.

> If you think that's not a fact, then you're either not arguing in good faith

You listed some types of negative consequences that may arise from those complexity trade offs then you said: "It's not the case that if I don't use GATs they have zero negative impact on me."

I asked if you using GATs caused any of the listed issues for you (or if you knew of cases where adding GATs to rust caused those issues), particularly when not using GATs.

Yes there are trade offs, and there are many ways those trade offs might be made. It doesn't mean the specific trade offs you call out were made. The trade offs could be "rustc is harder to develop in the future", "the binary of rustc is bigger", "if and only if developers use GATs, things get slower and they may have bugs", and a myriad of others. None of those impact you the end user of the rust compiler (if you aren't choosing to use GATs anyway).

So again: do you have any actual knowledge that the trade offs you listed actually apply in the case you highlighted or are you hypothesizing and strawmanning due to ignorance of the specific case?


> If you don't like Rust, don't use it. No one is forcing anyone.

Actually, there is a lot of "force" and "forcing" behind Rust.


Last I checked, clang, gcc, g++, gccgo, go, etc are all freely available for download. Do you disagree?


> Why can't you clone a boxed function? I get it, Box<dyn Foo> isn't sized, so you can't clone it. Ok! ...but you can clone a closure because a raw closure is clone if the contents is clone. Not when it's boxed though.

And closures in Rust can do things that "fully named functions" cannot. Let that one sink in.


The only thing I know of is that they can enclose variables from the outer scope where named functions can't. What else?


> But the teams working on language design need to SLOW DOWN and focus on ergonomics and composability. Not adding new syntax because some other language has it.

Genuinely, when was the last time Rust actually added new syntax to the language? The await syntax was added in 1.39 back in 2019, and the try (`?`) operator goes back to 1.13 in 2016. Const generics were added in March 2021.

Generic associated types were added recently in November I think, but that's not really added syntax so much as removing a restriction on generics.


Ironically, given Rust’s design constraints, providing the kind of composability features the author wants would require adding new syntax. Because control flow out of map cannot work as long as map is just a regular function taking a closure.

It’s not much different from higher-order functions in JavaScript, really. (Other than the fact that JavaScript has exceptions.)


> Genuinely when was the last time Rust actually added new syntax to the language?

let-else and GAT were introduced in 1.65 (November 2022).


let-else is simple and trivial. GATs have been baking for a very long time which is the definition of moving slowly. The author wants them to slow down everything else to speed up on the area of the language that inconveniences him personally, but that’s not how engineering works and would just stall the language.


From the first few paragraphs:

> Rust has a nice pretty syntax for iterating:

    for x in &mut something {
        *x = (*x) * 2;
    }
> EXCEPT when you need to do anything else to the iterator, then its ugly:

    for (i, x) in something.iter_mut().filter(|| {...}).enumerate() {
        *x = (*x) * i
    }
This is just so goofy. Who writes Rust like this? Wouldn't everyone write:

    something
        .iter_mut()
        .filter(|| {...})
        .enumerate()
        .for_each(|(i, x)| {
            *x = (*x) * i
        });
> There should be one -- and preferably only one --obvious way to do it.

Overhead-wise, you can't expect everyone to take to the iterator model right away. Sometimes you want a for loop, or you want a for loop for right now.

I think Rust being multi-paradigm is a strength, with the understanding that the preferred approach/model is the more functional, more immutable iterator model for 95% of your use cases.

> Again, the absolute lack of composability is astounding. What's the point of even having iterator methods if you can't use them for real world use cases, where code is regularly fallible, so you need to return a Result.

You can, you just haven't figured out how yet. This is frustrating, but, gosh, you'll learn how sooner or later.

    let y: Result<Vec<_>, _> = iterator.map(|x| {
        x.some_fallible_method()
    }).collect();


> This is just so goofy. Who writes Rust like this? Wouldn't everyone write: something.iter_mut().…().for_each(|(i, x)| { … })

No. As a user for just short of a decade, I have probably used .for_each() less than a dozen times in total (grep over the current state of most of the code I’ve ever written: three matches). I know it’s there, but it doesn’t often feel right, and I have certainly regularly used for loops where closure capture rules mean that I couldn’t have used .for_each(), which is strictly less expressive. Perhaps if I had come from a language like Ruby I might use it more, but I came more from Python and JavaScript.
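A contrived, invented sketch of a loop body that can't just be moved into a .for_each() closure, because `break` has no equivalent there:

    fn sum_until_negative(nums: &[i32]) -> i32 {
        let mut total = 0;
        for &n in nums {
            if n < 0 {
                break; // leaves the loop; a closure can't do this
            }
            total += n;
        }
        total
    }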

I don’t think the article showed a good formatting, however; I’d write it more like this:

  for (i, x) in something
      .iter_mut()
      .filter(|| {...})
      .enumerate()
  {
      *x = (*x) * i;
  }


> I know it’s there, but it doesn’t often feel right, and I have certainly regularly used for loops where closure capture rules mean that I couldn’t have used .for_each(), which is strictly less expressive.

I think this is fine. I think Rust's strength is as a multi-paradigm language. I would say for this specific code, a for_each() makes more sense than the alternatives (on the margin), but for more complex use cases (say, where something like a try_for_each() might be used, where you may need to return three different error types, or where one type can't easily be coerced out of the closure for some reason), a for loop makes tons of sense. But I think that's a 5% use case, and in the other 95% of cases the iterator approach is to be preferred. For many sound technical reasons, as well as for more squishy code readability reasons (maps and folds and filters are simply more readable to my eye).

Again I strictly don't have a problem with anything in your comment, or anyone using for loops, but if I was writing your for loop and my concern was the author's re: pretty syntax, I'd write it as:

    let y_iter = some.iter().built().up().like().this();

    for x in y_iter {
        ...
    }


I write `for x in some.iter().built().up().like().this()` a lot.

I think of the iterator building as mapping values from an iterator to values to consume - the for loop consumes them. for_each with side-effects just confuses me later in a lot of those cases - it has its place but so do for loops. (Also, closure issues around variable capture, moves, etc. come into play with for_each.)


Mostly agree with the points made in your comment, but I think the complaint of the article is "What's the point of having the 'pretty' syntax if it only works in the simplest of cases?"

If this is your concern, and you really must write it like that, I'd break it up, into your iter and your loop.

    let y_iter = some.iter().built().up().like().this();

    for x in y_iter {
        ...
    }


yeah you can often come up with a good descriptive name for that y_iter too, like `selected_widgets`


Edit: this is totally wrong

No, because that doesn't evaluate the iterator.

    let it = something
        .iter_mut()
        .filter(|item| ...)
        .enumerate();

    for (i, x) in it {
        ...
    }


Both for_each() and collect() into a Vec exhaust the iterator.


Reads like someone used to working in an abstract high-level language, who doesn't like the verbosity mandated by a language aimed squarely at embedded/systems programming.

Maybe one day there will be a language that has the expressive type system of ML, the garbage collection of ML, and the predictable performance of ML. Then people who don't need to care about the stack layout differences of an async function can use that language instead of Rust.


I don't think it is surprising. Nested functions do not capture the return (and yield) continuation of their containing function, so they can't invoke it. Rust is in good company here and I think it is the right default for lambdas.

In principle you could have a variant scoped function syntax that does that, but that will complicate both the language implementation and syntax


The reason is that `map`, `for_each` and the rest are not syntax and take closures. Closures in any language do not affect the control flow of their containing function.

These two annoyances could be resolved with:

- async iterators https://github.com/rust-lang/rust/issues/79024

- an extension trait for iterators over results: https://crates.io/crates/iterr


> Closures in any language do not affect the control flow of their containing function.

This isn't true, and it's a damn shame that it is so close to being true. The biggest feature I miss from Common Lisp is that closures capture blocks and tags, and this is but the simplest form of cross-function control flow available in PL design. Algebraic effects are the next big thing.


> Closures in any language do not affect the control flow of their containing function.

In Ruby, returning from a proc will return from where the proc was defined, usually resulting in a bug

https://stackoverflow.com/a/2325630


They can! Kotlin's inline functions let closures return from their parent scope (unless marked crossinline). While letting Iterable<T>.map do non-local returns is not the wisest idea, it has value in cases like Option<T>, allowing you to write something along the lines of

    optionList.map { option ->
        val result = option.getOrElse {
            Log.d("Well that's fucked up")
            return@map null
        }
        ...more logic
    }
and further in, result will not only have early returned, but will also have the proper, unwrapped type.


it could also be resolved by a proper effect system.


They do in Kotlin. Sometimes, anyway.


I don't get it.

Why most "modern" languages are using such weird/contrived and hard to read syntax?

I can understand that C++ evolved over a long time, starting from C backward compatibility; I think mistakes were made, but I can imagine the constraints.

But for Rust and Zig, they started from scratch, why do they have to use so many sigils (magic symbols)?

System programming should not look like sed or Perl.


There's no such thing as hard to read syntax[1]. There's only syntax you're unfamiliar with, or syntax that isn't supported by your editor yet. That's it.

Don't project your reaction to unfamiliarity as a general law. Instead, either ignore the languages with syntaxes you don't like, or just go and learn them before forming an opinion.

And ffs don't talk about syntax being "pretty" or "ugly". That's the epitome of subjectivity, I have a hard time trying to understand why would anyone out of their teens want to argue about that. Ancient Romans knew that de gustibus non disputandum est, yet here we are, "discussing" what is beautiful and what's not.

[1] There are some constraints that stem from the optics and mechanics of an eyeball. Other than that, it's purely subjective.


I strongly disagree.

I think that style is a matter of taste, and some part of syntax can be regarded as styling decisions, like with placement of tabs vs spaces, braces alignment, snake_case vs camelCase etc.

But there is a continuum of syntax readability: Brainfuck is less easy to read than Perl, which is still easier to read than sed.

And I think exactly the opposite about familiarity: the less familiar you are with a syntax, the better your judgment is about its readability.

Of course that can be learned, but that is not the point.

Programming languages should really strive to be as easy to read as possible, even at the cost of being more difficult to write.

Code should never look arcane, especially to someone unfamiliar with the language / codebase.


Brainfuck is harder to read not because of the syntax, but because it describes very low level semantics. Actually, it's not even harder to read - it's harder to comprehend.

You're conflating comprehensibility with readability - they're not the same. There's a reason why people get tested for "reading comprehension" and not just "reading".

In other news, I now know that you have your own idea about how programming languages should be. Good for you. That idea is not grounded in any research, because there is no research available, but you're welcome to have it, and when you write your own language you're welcome to stick to it.

Logical conclusion of your last statement is this: https://en.wikipedia.org/wiki/Inform#Example_game_2 I'll leave you to think about why it's not the most popular programming language on the planet.

I'd say more, but I don't want to inconvenience Daniel any more than I already have the previous time, so let's just strongly agree to strongly disagree.


> You're conflating comprehensibility with readability - they're not the same. There's a reason why people get tested for "reading comprehension" and not just "reading".

"Reading comprehension" is not the same as "reading" only for natural languages, because we can "read" words by transforming them into sounds.

In the context of programming, reading and comprehending are synonyms. Nobody cares if you can "vocalize" the code without comprehending it.


Maybe nobody cares, but visually identifying a token and comprehending its meaning are still distinct steps. What I'm trying to say is that visual identification - the thing that syntax is responsible for - is the easy part, and the latter is so much harder that the former almost doesn't count.

It's not that I think syntax doesn't matter at all. It does, in some context it matters a lot - just not because of readability. And most definitely not because it looks pretty or ugly.


Easy to comprehend, easy to read, alright, in this context I think they are close enough to avoid wasting time on semantics.

I have some ideas about programming languages but I think that in this field less is more, in an ideal world we should have only a handful of programming languages, to avoid the huge waste of resources.


> brainfuck is less easy to read than Perl

You are using a very strange definition of "easy to read". Brainfuck is obviously easier to read than Perl. There are only a tiny handful of symbols and there is a 1-1 mapping between characters and behavior in the program.


Sure, then I guess that binary code is the easiest to read, even easier than hexadecimal.


Again, yes, it is easier to read. In binary, there are only two tokens you need to recognize. With hex it's 16. With binary, when you see 1101, you only need to recognize two distinct tokens. You can teach a toddler (or maybe even a parrot) to read it in a day, and you'll get to "one-one-zero-one" in no time at all.

Now, understanding what this particular combination of tokens means in the context it appears in, is obviously much harder than if it was given symbolic name, ie. when using assembler. But reading assembly is obviously much harder than machine code (dumped in binary), because you either need to learn the whole alphabet ([a-z0-9] and a few more) the identifiers are formed from, or you'd need to memorize and learn to recognize hundreds of labels. Not something a toddler could do in a day.

Reading and understanding are not the same. You, and some other commenters, want to overload the term "read" to mean everything from first seeing a clump of pixels or stains of ink on paper up to forming the mental model of the semantic meaning that's solid enough to be transformed in your mind (ie. having an understanding). I'm telling you: it's not that simple.

I'm also trying to tell you that it doesn't matter, because "forming a mental model" is so much harder than just recognizing visual patterns (ie. reading), that the difficulty of the reading part is inconsequential.

There's a pop-sci book titled "Programmer's Brain" - it could be like 3 times shorter if it dropped the irritating long-winded style, but it does explain how interacting with code works from the cognitive science perspective. It's a few bucks well invested if you're actually interested in knowing facts instead of going with your gut feeling.


This suddenly reminded me... I wonder how the "we need to translate programming language keywords into everyone's natural language" folks are doing these days?


Systems programming is about very precisely specifying all aspects of the program down to the lifetimes and sizes of all variables, so that the compiler can make advanced optimization decisions that result in fast and compact binaries. All that specification comes with a lot of syntax, and some of it can be inscrutable considering the limited character set we're working with. That's how you get things like the turbofish, but it's there for a reason.
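(For anyone unfamiliar: the turbofish is just an explicit way to pin down a generic call's output type. A trivial, made-up sketch:)

    let nums = ["1", "2", "3"];

    // the ::<…> "turbofish" names the type collect() should produce
    let parsed = nums.iter().map(|s| s.parse::<i32>().unwrap()).collect::<Vec<i32>>();

    // equivalent, with the type moved onto the binding instead
    let parsed2: Vec<i32> = nums.iter().map(|s| s.parse::<i32>().unwrap()).collect();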


>All that specification comes with a lot of syntax

CommonLisp can do this without that syntax


CommonLisp is not really suitable for systems programming (hence the lack of systems software written in it).


To give an example, the functionality of `?` in Rust used to be provided by the `try` macro (https://doc.rust-lang.org/core/macro.try.html). Dedicated syntax was later added because it made a lot of code cleaner and easier to read, at the expense of a minimal upfront learning cost.
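A rough before-and-after sketch (File::open is used only as a familiar fallible call):

    use std::fs::File;
    use std::io;

    fn open_config(path: &str) -> io::Result<File> {
        // Before 1.13 this line would have been written with the macro:
        //     let f = try!(File::open(path));
        // The ? operator is the same early-return-on-Err, as syntax:
        let f = File::open(path)?;
        Ok(f)
    }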


Zig is very readable, and doesn't use many sigils at all.


Zig uses ? and ! in ways that are difficult to read IMHO, for example.


.{ zig has a lot of .{ } }


This part is ugly and annoying but not too difficult to read.


Many people complain about language's use of symbols, and then fail to think of what it would look like without the symbols. Realistically, humans are incredibly good at manipulating symbols to reduce verbosity, it's why I just used an apostrophe.


I don't mind a bit of verbosity. With words, on multiple lines, not with an esoteric combination of sigils.


> I don't mind a bit of verbosity. With words, on multiple lines, not with an esoteric combination of sigils.

Realistically, if you were expressing the amount of stuff going on in these lines, then you would want symbols.


Swift is modern and beautiful, but crippled by its ecosystem.


While I do like Swift and wish Rust would look as nice, it doesn't have references and lifetimes. Those two alone make code way nicer.

Anonymous closure arguments, aka $0, are probably the ugliest part of its syntax.

    numbers.contains(where: { $0 > 0 })


What is hard to read about Rust and Zig?


It would help to have more complete examples here. Also, at least one of the examples that are supposed to be correct clearly does not compile (the loop calling push will fail because the Vec is not declared as mut), which makes me doubt the assertions about the other examples, especially because of the lack of context.


> for x in &mut something { … }

The thing to realise is that this largely isn’t how you’ll use the <&mut Something as IntoIterator> implementation. It’s not “‘pretty’ syntax” so much as something that just incidentally worked in that case because it was powerfully useful for something else. Rather, `something` will come from an argument or such, and will not be of type `Something`, but rather of type `&Something` or `&mut Something`, and so `for x in something { … }` will automatically make x be of type `&Item` or `&mut Item`, as appropriate.

That is:

  for x in something {
      // What type is x? Depends on what type something is.
      //   • something: Something      ⇒ x: Item
      //   • something: &Something     ⇒ x: &Item
      //   • something: &mut Something ⇒ x: &mut Item
  }
(Mind you, I’m not entirely disagreeing with the article here. The limits of IntoIterator’s sugarness are annoying, and I’m not convinced it was worth having in the language. Much of the rest of the article hinges upon wanting closures to be Ruby-style blocks (or procs or whatever they are, I can’t remember) instead of closures; quite apart from introducing its own conceptual problems to balance those it solves, I don’t believe it could coherently be implemented in a language with Rust’s constraints, especially with async.)


The examples presented are not significant issues (in my opinion).

If someone from the Rust language development community is reading this, what I would really like to see is Rust supporting default arguments to functions. Coming from C++/Java/Python world, that is one language feature I sorely miss.


Is this person actually asking for Monads and Higher Kinded Types?

It would solve the problem they're raising, but somehow I suspect that's not the solution they want to hear. And also the Rust version would be quite hard to use due to the memory management model.


I think the author wants something like Ruby's blocks, although you're correct that monads would also be a good fit.

In Ruby, iteration can be performed by passing a sort-of-closure called a block. Unlike passing a closure (function), returning early from a block acts like returning from a for loop.

If a language has a garbage collector and no user-visible separation between stack and heap allocation, then there's lots of interesting things that language can do in terms of building "fluent" APIs.


In my experience, when it comes to language design, most people end up wanting monads and HKTs but refuse to admit it / are scared of admitting it.


Both iterator examples have alternatives where you don't need a for loop. Try using for_each for the first; for the second you can collect an iterator of results into a single Result<_> that you can then early-return with.

Both of those are made cleaner by not using for loops.


Not a rust person, but both of the for loop examples seem extremely contrived.

If you're going to do .filter, presumably you can do a .map or something and just supply a small lambda which does what was in the body of the for loop in the example?

Alternatively you could just do the filter part using an if statement in the for loop. Either of those syntaxes would be much more straightforward than doing half of one and half of the other. It just doesn't make sense to do that and point at the result as being unergonomic.

Secondly, the fact that for and for_each have different characteristics is, as someone else has pointed out, fundamentally just that blocks and functions are different things; i.e. it breaks the author's mental model because their mental model is just wrong.


For clarity it is called try_for_each: https://doc.rust-lang.org/stable/std/iter/trait.Iterator.htm...

The ? operator can be used both inside and after it like so:

    iter.try_for_each(|_| { … xxx?; … })?;
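A slightly fuller sketch, with an invented item type and failure condition:

    fn double_all(nums: &mut [i32]) -> Result<(), String> {
        nums.iter_mut().try_for_each(|n| {
            if *n > 1_000 {
                return Err("too big".to_string()); // stops the iteration right here
            }
            *n *= 2;
            Ok(())
        })
    }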


The author is right that this doesn't compose though. What happens when the closure is async? Do we need an iter.async_for_each? What if we want both? Does the library author need to provide a try_async_for_each? This is a combinatorial explosion for each additional effect added, which means it's not composable.

https://blog.rust-lang.org/inside-rust/2022/07/27/keyword-ge...


Oh neat, I didn't know that existed. That's a combination of what I was trying to say above.

If your for loop body is infallible then for_each is good enough.

If you're trying to 'collect' some iterator into some data structure T, you can collect into Result<T, _> and question mark on that.


The first example code doesn’t look like anything you should ever do - use immutability and don’t write rust like C.

The rest of the article is based on this use case, so I don’t see the point.

Composability can be done in a functional style without mutating state. The outcome would likely be very different and easier with this approach if the time were taken to grok it.


Funny how the author both demands vague new ergonomics features (with no explanation of how they might actually work), and at the same time demands that the language team stop adding new ergonomics features.


> But the teams working on language design need to SLOW DOWN and focus on ergonomics and composability.

Except his concerns have already been explored.

> Rust experimented with all of these concepts at some point in its history, it wasn’t out of ignorance that they were excluded.

https://without.boats/blog/the-problem-of-effects/

I think there are people still broadly exploring how to make it easier to write code that is generic over sync+async so that, for example, the standard library only needs one implementation of closures that can be either async or sync and work correctly. I can’t recall if they’re focusing on just async effects or also making it generic to Result. I can’t find the docs page describing this effort for some reason but I swear I read something about it recently. The only other languages that have it are

* Go via Goroutines: doesn’t fit Rust’s execution model of not having GC / controlling threading / no-std environments

* Zig: not explicit enough which gives up optimality for the sake of user convenience.

The Rust team is moving slowly and carefully here to make sure the solution they find balances ergonomics, complexity, and speed that people would expect of a “Rusty” solution.

The author is basically complaining “drop all other work and focus on MY inconvenience” which is short sighted. Different people have different interests and capabilities. It’s likely not helpful to throw more people at the problem given that it’s well known, studied, and people are indeed trying to tackle it.

Edit: found it by way of remembering that I came to it via the effing-mad crate which kind of gives you the composability albeit only working on nightly. Rust is calling this keyword generics https://blog.rust-lang.org/inside-rust/2022/07/27/keyword-ge.... If they pull it off, this will let you write generics over multiple effect systems transparently (fallible, const, async, etc) for composability. I definitely want to see them try at the broader vision and deliver a high quality thing like they did with GATs but recognize this is a huge language feature that will take time to get all the design and the fiddly implementation bits correct. Hopefully there will be short term features that can be delivered along the way to the full thing.


Rust's composability inside a method / function doesn't really matter because it's easy to fix.

For me, composability problems with Rust come up with lifetimes; that's why I would like to give names to the lifetimes of variables inside functions, as a step before extracting functions from them, which isn't always possible.


Writing Rust recently got a bit easier because a certain large language model is able to work around all of these issues while also explaining why Rust designers made those choices.

I certainly hope that the language designers keep making the language easier to write for us totally fleshy humans that are not robots at all as well.


It sounds like the author is not aware of the `FromIterator` trait which is implemented for `Result` which allows short circuiting to a `Result` type.

Instead of collecting to:

    Vec<Result<T,Error>>
You collect to:

    Result<Vec<T>,Error>
as your explicit type and your iterator will short circuit on the first error. I admit this was not obvious to me either and should be talked about more as it is very handy.
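
For example (my own sketch, not code from the article), parsing a list of strings stops at the first bad one:

    fn main() {
        let inputs = ["1", "2", "x", "4"];

        // Collecting into Result<Vec<_>, _> short-circuits on the first Err.
        let all: Result<Vec<i32>, std::num::ParseIntError> =
            inputs.iter().map(|s| s.parse()).collect();

        // "x" fails to parse, so the whole collect is an Err.
        assert!(all.is_err());
    }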


The Kotlin compiler can handle chains of potentially().nullable()?.invocations(). Is the Rust early-error returning not a similar case, or am I missing something?


Yeah, Rust does that as well. The issue the author is talking about here is that when you write a closure:

    fn foo(inputs: &[Input]) -> Result<Vec<Output>, Error> {
        inputs
        .iter()
        .map(|input| { input.something_fallible()? })
        .collect()
    }
that ? is an early return from the closure, not from foo. The correct way to do this is:

    fn foo(inputs: &[Input]) -> Result<Vec<Output>, Error> {
        inputs
        .iter()
        .map(|input| { input.something_fallible() })
        .collect::<Result<Vec<Output>, Error>>()
    }
ref: https://play.rust-lang.org/?version=stable&mode=debug&editio...


I wonder how come the type must be specified explicitly as "collect::<Result<Vec<Output>, Error>>()" and can't be inferred, given that the function return type is already spelled out as "Result<Vec<Output>, Error>"? Is there a specialisation for collect() for Result<> types that has to be explicitly triggered?


It doesn't have to be specified explicitly in this case, it can be inferred.

https://play.rust-lang.org/?version=stable&mode=debug&editio...

Probably just habit from GP to write out the return type, since collect so often can't be inferred. Or maybe to make it easier to change it to an intermediate value which you use ? with (at which point the type can no longer be inferred).

https://play.rust-lang.org/?version=stable&mode=debug&editio...
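
To spell out both situations without the playground links (my own example):

    // Inferred: the fn's return type tells collect() what to build.
    fn parse_all(inputs: &[&str]) -> Result<Vec<i32>, std::num::ParseIntError> {
        inputs.iter().map(|s| s.parse()).collect()
    }

    // As an intermediate value that you apply ? to, collect() needs an
    // annotation (turbofish or a type on the let) before the error can
    // be propagated.
    fn parse_all_intermediate(inputs: &[&str]) -> Result<Vec<i32>, std::num::ParseIntError> {
        let parsed = inputs
            .iter()
            .map(|s| s.parse())
            .collect::<Result<Vec<i32>, std::num::ParseIntError>>()?;
        Ok(parsed)
    }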


Yeah, apologies, that's just habit on my part.


Kotlin's and C#'s ? operators only “return” from the expression being evaluated. Most documentation says “return” a bit ambiguously, but that’s what they mean. AFAIK Rust is the first language to use ? in such a way as to return from a function, and that’s quite different. Kotlin's ?. can technically be emulated in Rust with an IIFE (immediately invoked function expression) like so:

    // never do this, it’s rude
    let x = (|| Some(potentially().nullable()?.invocations()))();
Or with Option::map, and_then and friends:

    // this is the way
    let x = potentially().nullable().map(|y| y.invocations());
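
And if the chained call is itself fallible, and_then keeps the result flat instead of nesting Options. A tiny self-contained sketch:

    fn main() {
        let maybe_name: Option<&str> = Some("ferris");

        // map wraps the non-fallible step: Option<usize>
        let len = maybe_name.map(|n| n.len());

        // and_then is for when the step itself returns an Option,
        // avoiding Option<Option<char>>.
        let first = maybe_name.and_then(|n| n.chars().next());

        println!("{:?} {:?}", len, first); // Some(6) Some('f')
    }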


That made it click, thanks!


Rust does the same.


> Rust has a nice pretty syntax for iterating:

> `for x in &mut something {`

and a few paragraphs later:

> So instead we need to use the ugly for-loop manual collection

> `for x in &iterator {`

So, is that syntax pretty or ugly?


> let res: Vec<_> = iterator.iter().map(|x| x.foo()?).collect();

This makes sense to me. Why shouldn't it return early from the lambda in `map`? Had it returned from the "highest level function" in scope, that would be bug-prone imo. It is already possible to do fallible iteration:

> let res: Vec<_> = iterator.iter().map(|x| x.foo()).collect::<Result<Vec<_>, _>>()?;

However, I would like to mention that you should not try to force one syntax to cover every case. This mantra is very misleading imo:

> There should be one -- and preferably only one -- obvious way to do it.

Had Rust gone with your suggestion, imagine writing the equivalent of this?

    fn foo() -> Result<String, String> {
        Ok("hello".to_string())
    }

    fn main() {
        println!(
            "{:?}",
            (0..2)
                .map(|_| {
                    foo()?;
                    foo()?;
                    foo()?;
                    foo()?;
                    foo()?;
                    Ok::<String, String>("world".to_string())
                })
                .collect::<Vec<Result<_, _>>>()
        );
    }

How a "hacker news" website not supporting code snippet is beyond me.


> How a "hacker news" website not supporting code snippet is beyond me.

  Indent your code by two spaces.


""" for (i, x) in something.iter_mut().filter(|| {...}).enumerate() { x = (x) * i } """

uh, what's the alternative in other languages that's superior for what you're trying to do here? Also I do not think this is ugly in any way.


The Rust apologists in the comments just don't get it. The fallow year is a great idea, particularly if you can somehow get Rust contributors to learn a Lisp and use it for that entire year before getting together and improving syntax matters.


Speaking of syntax:

  for x in &mut something {
    *x = (*x) * 2;
  };
Not a Rust user, but wouldn't it make sense for the compiler to understand x instead of *x in this situation? I thought it's considered a higher-level language than C.


The syntax discriminates between "changing the value the reference points to" and "changing what x itself refers to". The code is doing the former. The second use of * could be made unnecessary if the standard library had an impl Mul<u32> for &mut u32 {}.
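
For what it's worth, the compound-assignment form already avoids the second deref, since u32 implements MulAssign (a small sketch, not from the article):

    fn main() {
        let mut something = vec![1u32, 2, 3];
        for x in &mut something {
            *x *= 2; // no (*x) needed on the right-hand side
        }
        assert_eq!(something, vec![2, 4, 6]);
    }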


> So instead we need to use the ugly for-loop manual collection

> let res = vec![];
> for x in &iterator {
>     res.push(x.foo()?);
> }

I'm obviously not a Rust programmer because I think that's much less ugly :-)


The main criticism expressed in this blog post is that in Rust, transforming a language construct into a seemingly equivalent one (typically using .map()) works in some cases (described as the "hello world" cases, which might be a bit exaggerated) but sometimes not (due to lifetimes, fallibility, async…).

For example, this code (real world example from two days ago):

    fn main() {
        let value = None;
        let _processed_value = match value {
            Some(value) => Some(process(value)),
            None => None,
        };
    }
    fn process(_value: u32) {}
can be transformed into this equivalent form:

    fn main() {
        let value = None;
        let _processed_value = value.map(|value| process(value));
    }
    fn process(_value: u32) {}
But in async Rust, this code works:

    #[tokio::main]
    async fn main() {
        let value = None;
        let _processed_value = match value {
            Some(value) => Some(process(value).await),
            None => None,
        };
    }
    async fn process(_value: u32) {}
But this one does not work:

    #[tokio::main]
    async fn main() {
        let value = None;
        let _processed_value = value.map(|value| process(value).await); 
    }
    async fn process(_value: u32) {}
    
It fails with the following error:

    error[E0728]: `await` is only allowed inside `async` functions and blocks
     --> src/main.rs:4:64
      |
    4 |         let _processed_value = value.map(|value| process(value).await); 
      |                                          -------               ^^^^^^ only allowed inside `async` functions and blocks
      |                                          |
      |                                          this is not `async`
This is basically what is described as the sandwich problem: https://blog.rust-lang.org/inside-rust/2022/07/27/keyword-ge...

I have to admit that I often need to refactor parts of code while I'm writing Rust code due to such issues, so I understand the author's point of view.
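
For completeness, one workaround that exists today (assuming the futures crate's OptionFuture; this is my sketch, not something from the article) is to map to a future and await the whole Option:

    use futures::future::OptionFuture;

    async fn process(value: u32) -> u32 {
        value * 2
    }

    #[tokio::main]
    async fn main() {
        let value = Some(21);

        // value.map(process) is Option<impl Future<Output = u32>>;
        // OptionFuture lets us await it as a whole, yielding Option<u32>.
        let processed: Option<u32> = OptionFuture::from(value.map(process)).await;

        assert_eq!(processed, Some(42));
    }

It works, but it's exactly the kind of per-type patch that the keyword generics effort is trying to make unnecessary.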


Rust feels like a very patchwork language. There are all sorts of issues (for loops, error handling, referencing/dereferencing, type inference, etc.) that they've worked around by adding yet another very specific compiler behavior that completely falls apart in any other usage.

Like having to do `&mut **` to pass a reference to a function once you get out of the compiler's comfort zone.
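
A reduced sketch of the kind of situation where that shows up (my own reconstruction, not the deadpool case discussed below): deref coercion doesn't kick in when the callee is generic, so you have to reborrow through both layers by hand.

    use std::ops::{Deref, DerefMut};

    struct Conn;
    trait ConnLike {}
    impl ConnLike for Conn {}

    // Something pool-like that hands out a wrapper which derefs to the
    // real connection.
    struct PoolObject(Conn);
    impl Deref for PoolObject {
        type Target = Conn;
        fn deref(&self) -> &Conn { &self.0 }
    }
    impl DerefMut for PoolObject {
        fn deref_mut(&mut self) -> &mut Conn { &mut self.0 }
    }

    fn query<C: ConnLike>(_conn: &mut C) {}

    fn run(obj: &mut PoolObject) {
        // query(obj);      // error: PoolObject doesn't implement ConnLike
        query(&mut **obj);  // ok: &mut PoolObject -> PoolObject -> Conn
    }

    fn main() {
        run(&mut PoolObject(Conn));
    }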


Rust is a very industrial language. It tries to identify specific useful patterns and then expose them to the programmer in misuse-resistant forms.

In contrast, academic languages try to identify broadly-applicable abstract primitives and expose those, allowing users to implement the behaviors they need in libraries. The result feels more consistent at a language level because you can tie language features very closely to an underlying theory of computation.

I've written a lot of C++ (industrial) and Haskell (academic). Although I love how Haskell allows users to (for example) define their own control-flow statements, that "bag of primitives" nature leads to Haskell projects easily forming their own dialect of the language[0] without even having to resort to macros.

[0] This is also a common criticism of Lisp and Forth.


Honestly I'd put Rust somewhere between Lisp and Haskell, solidly in the academic-language camp. It's full of all sorts of neat tricks, with a great variety of ways to accomplish any one thing. You also need to dip into third-party libraries for fairly basic things like decent error handling.

In my mind an industrial language needs to be boring and obvious. For instance, Go is very boring, and by its design it actively discourages "clever" programming. Java, Go, and C are industrial languages. C++ is far too clever these days.

All that said, Rust is far better than Java and Go for interfacing with low-level systems, and of course the safety improvements from the borrow checker are very valuable, despite the frustrations it often causes.

I don't dislike Rust, but it sure could use a lot of polish in my opinion: making obvious things easy and discouraging clever things.


What's the idea here? Industrial languages are obvious and so more people can jump on a project without being lost? Is C really industrial in that case?

Are there any merits to languages like Haskell/Clojure when the goal is still shipping software?


  > Are there any merits to languages like Haskell/Clojure when the goal
  > is still shipping software?
Depends on what kind of software you're shipping.

Haskell is easy to use for building parsers, so if part of your value add is (for example) compatibility with a competitor's proprietary dialect of SQL then you'll be better off writing the parser in Haskell than in C or Rust. It also works well for software that is highly mathematical (in the "described as equations" sense, not the "HPC numeric kernel" sense), because a developer can apply formal methods without having to go all the way to Agda or Idris.

On the other hand, Haskell is not great when you need predictable optimization or precise memory management. A lot of modern commercial software is basically business logic scaffolded over an HTTP or RPC server, which is maybe a worst-case scenario for Haskell. It also doesn't work for Rust's target audience of systems programming because idiomatic Haskell requires a heavy runtime.

  > Industrial languages are obvious and so more people can jump on a
  > project without being lost?
Definitions will vary, but IMO the defining feature of an industrial language is that the language itself is not extensible. Every C or Rust project has the same keywords, control statements, operators, and type system -- as long as you don't go absolutely mad with macros. It's possible for a C expert to read C code from pretty much any codebase and understand what's going on at a tactical level. This is not true of research-y Haskell and it's very not true of idiomatic Lisp.


    >> On the other hand, Haskell is not great when you need predictable optimization or precise memory management. A lot of modern commercial software is basically business logic scaffolded over an HTTP or RPC server, which is maybe a worst-case scenario for Haskell
Why is this a worst case for Haskell? Aren't databases and HTTP the real blocker to performant web apps?

    >> It's possible for a C expert to read C code from pretty much any codebase and understand what's going on at a tactical level. This is not true of research-y Haskell and it's very not true of idiomatic Lisp.
I don't know enough about CL, but if you stay away from macros, is the average Lisp developer any more likely to make a non-industrial code base than a Rust developer? It seems like the real cause of difficult-to-grok codebases is not just macros, but extreme abstraction, which can be done just by passing functions around.


  > Why is this a worst case for Haskell?
You generally don't want a garbage collector in the core request-dispatch loop, and Haskell's lazy evaluation makes it really easy to accidentally write code that can't be compiled to efficient output. It's not impossible to write a high-performance HTTP server in Haskell[0], but the skill requirement is much higher than in C++, Rust, or Go[1].

  > is the average LISP developer anymore likely to make a non industrial
  > code base than a Rust developer?
Yes. Not because of the developer, but because of how extremely flexible and dynamic the Lisp-family languages are. The power and joy of Lisp is in how it's almost a meta-language, so every project can become its own EDSL. The most famous (infamous?) example of this is Vacietis[2], which is a Common Lisp library that allows C code to be imported directly(!!).

[0] IIRC the Yesod framework's Warp does well on benchmarks, and when you look at code like https://github.com/yesodweb/wai/blob/master/warp/Network/Wai... you can see the lengths they had to go through to work around the choice of implementation language.

[1] Go has a garbage collector, but exposes the stack/heap distinction more directly than Haskell, so it's easier to write allocation-free code in hot paths.

[2] https://github.com/vsedach/Vacietis


Are you referring to writing an HTTP server from scratch? Or simply making a REST API with existing libraries? I don't see how a garbage collector would be a problem for the latter. When is the speed problem for your REST server gonna be the garbage collector vs the database calls or the network requests themselves?


> Like having to do `&mut **` to pass a reference to a function

I'm curious about your use case for this. `&mut *` I've seen, for instance to dereference a RwLockWriteGuard, but why the double dereference?


It was with deadpool, I don't recall the exact underlying reasons though: https://github.com/bikeshedder/deadpool/issues/104#issuecomm...


Pythonistas when a language doesn't follow PEP8 and doesn't have a GIL.

>EXCEPT when you need to do anything else to the iterator, then its ugly

God forbid you use one (1) whole variable to write

   let enumerated_something = something.iter_mut().filter(|| {...}).enumerate()
   for (i, x) in enumerated_something { ... }
>BUT when you try to use it with iterators -- which are also amazing, I love using iterators -- IT DOESNT WORK.

Pythonistas when you can't throw in the middle of a closure to end your loop. What is .map { } supposed to do? Abort early but still stay alive? Abort early and just return the first two elements that didn't fail?

Or, you could define an Iterable<T>.map_or_none() that returns a bunch of Results/Options as an output, and be done with it.
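
A sketch of what that hypothetical map_or_none could look like as an extension trait (made-up name, not any real library's API):

    trait MapOrNone: Iterator + Sized {
        fn map_or_none<B, F>(self, f: F) -> std::iter::Map<Self, F>
        where
            F: FnMut(Self::Item) -> Option<B>,
        {
            self.map(f)
        }
    }

    impl<I: Iterator> MapOrNone for I {}

    fn main() {
        let maybe_doubled: Vec<Option<u32>> = [1u32, 2, u32::MAX]
            .into_iter()
            .map_or_none(|x| x.checked_mul(2))
            .collect();
        println!("{:?}", maybe_doubled); // [Some(2), Some(4), None]
    }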

>Again, the absolute lack of composability is astounding.

Coming from someone using Python, where extension methods are a pipe dream and function(compositions(are(written(in(a(style(that(makes(me(want(to(go(back(to(lisp)))))))))))))), that's rich.


> function(compositions(are(written(in(a(style(that(makes(me(want(to(go(back(to(lisp))))))))))))))

I think you got that backwards. I'm pretty sure it's supposed to be lisp(to(back(go(to(want(me(makes(that(style(a(in(written(are(compositions(function))))))))))))))


Or even...

  (-> function compositions are written in a style that makes me want to go back to lisp)


f-rust rated


One day Rust will be as hated as C++


Love this kind of article, keep it up!


Small nitpicks presented as huge flaws by angry man


Can you please not cross into personal attack, regardless of how wrong someone is or you feel they are? You may not owe Rust critics better, but you owe this community better if you're participating in it.

https://news.ycombinator.com/newsguidelines.html



