
Could you clarify? I don't get where you are heading with that.

At least the concept of a "stream" isn't really linked with time. It's just a sequence of bytes.




The issue is that abstractions "leak".

Let's take your "stream", for instance.

It's a sequence of bytes.

But is it a "maximum bandwidth" stream of bytes (gigantic file transfer) or is it a "minimum latency" (audio packet) stream of bytes?

If I know, or can control, what "time" the stack is working toward, the stack doesn't have to know the difference between those two.

In addition, on embedded I quite often want a stack API of the form "foo_init(), foo_deinit(), foo_queue_send(), foo_queue_action(), etc.", all of which only update internal state.

And then "foo_make_incremental_progress(foo_state, foo_now, ...)", where ONLY that incremental-progress function actually carries out actions like reading hardware, writing hardware, timing out, etc. Now I can use whatever concurrency construct I like without the stack getting in the way.

Now, that isn't necessarily the fastest approach, since the stack has to be structured so that it carries out only one "action" per incremental call. However, it's very flexible. It also has the wonderful property of being repeatable and testable, something which TCP stacks are notoriously resistant to.
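To make the testability point concrete, here's a hypothetical harness built on the sketch above: because the test fabricates "now" and the hardware layer is a stub, there are no threads and no real clock, and every run takes the identical path:

    /* Hypothetical test: fully deterministic, run after run. */
    void test_send_is_repeatable(void)
    {
        foo_state_t s;
        foo_init(&s);
        foo_queue_send(&s, (const uint8_t *)"hi", 2);

        /* Step fake time forward in fixed increments. */
        for (uint32_t now_ms = 0; now_ms < 1000; now_ms += 10)
            foo_make_incremental_progress(&s, now_ms);

        foo_deinit(&s);
    }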



