
> Also, sequential workflows require more code to be written. In Rust, async functions compile down to state machines, with each .await point representing a transition to a different state. This makes it easy for developers to write sequential code together with non-blocking IO. Without async, we need to write our own state machines for expressing the various steps.

Has anyone tried to combine async and sans-io? At least morally, I ought to be able to write an async function that awaits sans-io-aware helpers, and the whole thing should be able to be compiled down to a state machine inside a struct with a nice sans-io interface that is easily callable by non-async code.

I've never tried this, but the main issues I would foresee would be getting decent ergonomics and dealing with Pin.
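One way to get partway there on stable: the async fn's state machine can be hand-polled by non-async code with a no-op waker. A minimal sketch of that half (the `Driver` type here is made up, and the sans-io helpers would still need some side channel, e.g. a shared slot, to report what IO they want):

    // Poll an async fn's state machine from non-async code using a no-op
    // waker. Waker::noop() needs a recent Rust; on older toolchains a
    // hand-written RawWaker does the same thing.
    use std::future::Future;
    use std::pin::Pin;
    use std::task::{Context, Poll, Waker};

    struct Driver<F: Future> {
        fut: Pin<Box<F>>,
    }

    impl<F: Future> Driver<F> {
        fn new(fut: F) -> Self {
            Self { fut: Box::pin(fut) }
        }

        // One sans-io "step": poll the state machine once. The caller decides
        // when to call again, e.g. after performing whatever IO the helpers
        // asked for.
        fn step(&mut self) -> Poll<F::Output> {
            let mut cx = Context::from_waker(Waker::noop());
            self.fut.as_mut().poll(&mut cx)
        }
    }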




Rust has generators/coroutines that can somewhat address the use case you're describing, but they're an extra-unstable feature at the moment. Unfortunately, in its current incarnation, coroutines have the annoying limitation of only being exposed via the std::ops::Coroutine trait, so the underlying state machine generated by the compiler can't be manually allocated, even though the size of the state machine is ostensibly a compile-time constant.

It's not an issue for a single coroutine whose lifetime is contained within the function that defines it, since the compiler can figure that out and stack-allocate the state machine. But arguably the most useful application of coroutines is as elements in a queue for event-loop machinery, and implementing that is impossible unless you box the coroutines. Vec<Box<dyn Coroutine>> is not a cache-friendly data structure, and you'll feel the pain if you're doing extremely high-concurrency I/O and need a million elements in your Vec.
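For concreteness, a rough nightly-only sketch of that boxed-queue shape (the exact feature gates and the #[coroutine] attribute have moved around between nightlies, so treat it as illustrative):

    #![feature(coroutines, coroutine_trait, stmt_expr_attributes)]

    use std::ops::{Coroutine, CoroutineState};
    use std::pin::Pin;

    fn main() {
        // The concrete coroutine types are unnameable, so an event-loop queue
        // has to type-erase them: Pin<Box<dyn Coroutine<...>>>.
        let mut queue: Vec<Pin<Box<dyn Coroutine<Yield = u32, Return = ()>>>> = Vec::new();

        for i in 0..3u32 {
            queue.push(Box::pin(
                #[coroutine]
                move || {
                    yield i;
                    yield i * 10;
                },
            ));
        }

        for co in &mut queue {
            while let CoroutineState::Yielded(v) = co.as_mut().resume(()) {
                println!("yielded {v}");
            }
        }
    }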


If Rust ever gets a native generator syntax, this might become achievable, because one would be able to say `yield transmit` to "write" data whilst staying within the context of your async operation. In other words, every `socket.write` would turn into a `yield transmit`.

To read data, the generator would suspend (.await) and wait to be resumed with incoming data. I am not sure if there is nightly syntax for this but it would have to look something like:

    // Made up `gen` syntax: gen(yield_type, resume_type)
    gen(Transmit, &[u8]) fn stun_binding(server: SocketAddr) -> SocketAddr {
        let req = make_stun_request();

        yield Transmit {
            server,
            payload: req
        };

        let res = .await; // Made up "suspend and resume with argument"-syntax.

        let addr = parse_stun_response(res);

        addr
    }


Rust has had native generator syntax for a few years FYI. It's what async-await is built on. It's just gated behind a nightly feature.

https://doc.rust-lang.org/stable/std/ops/trait.Coroutine.htm... The syntax you're looking for, for resuming with a value, is `let res = yield ...`

Alternatively, there is a proc macro crate that transforms generator blocks into async blocks so that they work on stable, which is of course a roundabout way of doing it, but it certainly works.
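Roughly, the parent's made-up example written against the nightly Coroutine trait might look like this (feature gates are approximate, and Transmit / make_stun_request / parse_stun_response are stand-in helpers):

    #![feature(coroutines, coroutine_trait, stmt_expr_attributes)]

    use std::net::SocketAddr;
    use std::ops::{Coroutine, CoroutineState};
    use std::pin::{pin, Pin};

    // Stand-ins for the helpers in the parent comment.
    struct Transmit {
        server: SocketAddr,
        payload: Vec<u8>,
    }
    fn make_stun_request() -> Vec<u8> {
        vec![0x00, 0x01] // placeholder bytes
    }
    fn parse_stun_response(_res: &[u8]) -> SocketAddr {
        "192.0.2.1:3478".parse().unwrap() // placeholder
    }

    // Resume type is Vec<u8>: the caller does the socket IO and resumes the
    // coroutine with the received datagram. The first resume value arrives as
    // the closure argument; later ones are the value of `yield`.
    fn stun_binding(
        server: SocketAddr,
    ) -> impl Coroutine<Vec<u8>, Yield = Transmit, Return = SocketAddr> {
        #[coroutine]
        move |_initial: Vec<u8>| {
            let req = make_stun_request();

            // Suspend with the outgoing datagram, wake up with the incoming one.
            let res: Vec<u8> = yield Transmit { server, payload: req };

            parse_stun_response(&res)
        }
    }

    fn main() {
        let mut co = pin!(stun_binding("198.51.100.7:3478".parse().unwrap()));

        // First resume starts the coroutine; it yields the datagram to send.
        let CoroutineState::Yielded(transmit) = co.as_mut().resume(Vec::new()) else {
            unreachable!()
        };
        println!("send {} bytes to {}", transmit.payload.len(), transmit.server);

        // Pretend we performed the IO and got a response back.
        let CoroutineState::Complete(addr) = co.as_mut().resume(b"response".to_vec()) else {
            unreachable!()
        };
        println!("our address: {addr}");
    }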


They actually play together fairly well higher up the stack. Non-blocking IO (i.e. async) makes it easy to concurrently wait for socket IO and time. You can do it with blocking IO too by setting a read-timeout on the socket, but using async primitives makes it a bit easier.
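For example, "concurrently wait for socket IO and time" is a few lines with async primitives (the sketch below assumes tokio; the blocking equivalent would be UdpSocket::set_read_timeout plus checking the clock yourself):

    use std::time::Duration;
    use tokio::net::UdpSocket;
    use tokio::time::sleep;

    // Wait for a datagram, but give up after five seconds.
    async fn recv_or_timeout(socket: &UdpSocket, buf: &mut [u8]) -> Option<usize> {
        tokio::select! {
            res = socket.recv(buf) => res.ok(),
            _ = sleep(Duration::from_secs(5)) => None, // the timer fired first
        }
    }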

But I've also been mulling over the idea of how they could be combined! One thing I've arrived at is the issue that async functions compile into opaque types. That makes it hard or impossible to use the compiler's ability to code-generate the state machine, because you can't interact with it once it has been created. It also causes problems with the borrow checker.

For example, suppose I have an async operation with multiple steps (i.e. `await` points) but only one of those steps needs a mutable reference to some shared data structure. As soon as I express this as an `async` function, the mutable reference is captured in the generated `Future` type, which spans all the steps. As a result, Rust doesn't allow me to run more than one of those operations concurrently.

Normally, the advice for these situations is "hold the mutable reference for as short a time as possible", but in the case of async, I can't do that. And splitting the async function into multiple functions also gets messy and kind of defeats the point of wanting to express everything in a single function in the first place.
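A minimal sketch of the borrow problem (names are made up; only the middle step needs the &mut, but the generated Future holds it across every await):

    use std::collections::HashMap;

    async fn step_one() {}
    async fn step_three() {}

    // `shared` is only touched in the middle step, but the generated Future
    // captures the &mut for its whole body.
    async fn operation(shared: &mut HashMap<u32, u32>, key: u32) {
        step_one().await;
        shared.insert(key, 1);
        step_three().await;
    }

    // This fails to compile: both futures hold `&mut map` for their entire
    // lifetime, so the second mutable borrow is rejected.
    //
    // async fn demo(map: &mut HashMap<u32, u32>) {
    //     futures::future::join(operation(map, 1), operation(map, 2)).await;
    // }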


One thing I toyed with, but didn't get very far, was to encode the HTTP/1.1 protocol as a Sans-IO state machine with .await points for the IO, but rather than the IO registering Wakers with an async runtime, it relinquished control back to the user to perform the IO manually. One can think of it as .await releasing "up" instead of "down".

In the context of HTTP/1.1 the async code became a kind of "blueprint" for how the user wants the call to behave. At the time I was dead set on making it work in a no_std (no allocator) environment, and I gave up because I couldn't find a way around needing dynamic dispatch via Box<dyn X> (which requires an allocator).


https://news.ycombinator.com/item?id=40879547

    async fn stun(
        server: SocketAddr,
        mut socket: impl Sink<(BindingRequest, SocketAddr), Error = color_eyre::Report>
            + Stream<Item = Result<(BindingResponse, SocketAddr), color_eyre::Report>>
            + Unpin
            + Send
            + 'static,
    ) -> Result<SocketAddr> {
        socket.send((BindingRequest, server)).await?;
        let (message, _server) = socket.next().await.ok_or_eyre("No response")??;
        Ok(message.address)
    }
Fully working code at https://gist.github.com/joshka/af299be87dbd1f64060e47227b577...


I wrote a library which I haven't released yet, where the natural async approach seems impossible to compile in Rust if async I/O is tied too tightly to the main logic. (In b4 skill issue)

Sans I/O mostly means: write pure functions and move I/O out of your main functionality as much as possible. Then you can deal with each part independently, which makes the compiler happy.

80-96% of your work on a sans-IO Rust project is still going to be I/O, but it's not complected with your main logic, so you can unit test the main logic more easily.
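A toy illustration of that split (made-up names; the protocol logic is a pure function over bytes, and the socket only appears in the outer shell):

    use std::net::UdpSocket;

    // Pure "main logic": trivial to unit test, no socket anywhere.
    fn handle_datagram(input: &[u8]) -> Option<Vec<u8>> {
        if input.starts_with(b"ping") {
            Some(b"pong".to_vec())
        } else {
            None
        }
    }

    // IO shell: all the socket handling lives out here.
    fn main() -> std::io::Result<()> {
        let socket = UdpSocket::bind("127.0.0.1:0")?;
        let mut buf = [0u8; 1500];
        loop {
            let (n, peer) = socket.recv_from(&mut buf)?;
            if let Some(reply) = handle_datagram(&buf[..n]) {
                socket.send_to(&reply, peer)?;
            }
        }
    }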


This is another take on defunctionalization. You create a model of execution but do not execute it. I.e., return or queue a value of type Send, and do not execute "send". The execution is separate and actually deals with "real-world" side effects. The execution can be done sync, async, or transformed into monads; it doesn't matter.
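A small sketch of that idea in Rust (names made up): the protocol step returns a description of the effect instead of performing it, and the interpreter that executes the effects can be sync or async.

    use std::net::SocketAddr;

    // The model of execution: values that describe effects.
    enum Effect {
        Send { to: SocketAddr, payload: Vec<u8> },
        SetTimer { after_ms: u64 },
    }

    // Pure logic: decides what should happen, but performs nothing.
    fn on_tick(server: SocketAddr) -> Vec<Effect> {
        vec![
            Effect::Send { to: server, payload: b"keepalive".to_vec() },
            Effect::SetTimer { after_ms: 30_000 },
        ]
    }

    // The interpreter deals with the real-world side effects; it could just
    // as well be async.
    fn run(effects: Vec<Effect>) {
        for e in effects {
            match e {
                Effect::Send { to, payload } => {
                    println!("would send {} bytes to {to}", payload.len())
                }
                Effect::SetTimer { after_ms } => {
                    println!("would arm a timer for {after_ms} ms")
                }
            }
        }
    }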


A long time ago I had "fun" implementing all sorts of network protocols with such an event-based library in C: https://github.com/cesanta/mongoose



