"You can have concurrent tasks but with dependencies between them which means there is no parallelism available." - that isn't really concurrent then, is it? Concurrency implies running two things concurrently; you're just creating things to wait. No useful work is being done. You broke a sequential task across processes. Just because those processes are stuck on a receive loop doesn't mean they're doing anything. Might as well say your Java program is concurrent because you spun up a second thread that just waits.

That was what I was getting at with my question: a set of arbitrary processes that do their work in serial fashion isn't a concurrent program. It's a badly written serial one.




> that isn't really concurrent then

It is, using standard industry terminology.

Concurrency does not require that concurrent tasks are able to make independent progress at the same time, just that they are each at some point of progress at the same time.
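To make that concrete, here's a minimal Go sketch (my own illustration, not from the links below): three tasks are all live at the same time, but a dependency chain means at most one of them can make progress at any instant. Concurrent, with no parallelism available.

    package main

    import "fmt"

    func double(n int) int { return n * 2 }

    func main() {
        a := make(chan int)
        b := make(chan int)

        go func() { a <- double(1) }()   // task A: needs no input
        go func() { b <- double(<-a) }() // task B: blocked until A finishes
        fmt.Println(double(<-b))         // task C: blocked until B finishes
    }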

If you want to do a deep dive into this problem: back in 2012, while working on my PhD, I wrote a paper about an example algorithm with very little inherent parallelism, where even with multiple concurrent tasks all making forward progress you may find it extremely hard to actually produce results in parallel.

https://chrisseaton.com/multiprog2012/seaton-multiprog2012.p...

I wrote a poster about it as well for an easier introduction.

https://chrisseaton.com/research-symposium-2012/seaton-irreg...


> It is, using standard industry terminology.

Just to add some support here, concurrency definitely has very little to do with actual independence of progress. It has far more to do with encapsulation of knowledge (for an active agent, not a passive entity like an object [^]), and how different agents coordinate to exchange that knowledge.

An echo server + client is a concurrent system, despite having only one meaningful scheduling of actions over time (i.e. no interleaving or parallelism). Serialization and parallelism are both global properties, as they require a perspective "from above" to observe, while concurrency is a property local to each task.
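For instance, here's a sketch of that echo pair in Go (channels standing in for the socket; my example, nothing from upthread): both tasks are concurrent, yet the request/reply dependency admits exactly one schedule of their actions.

    package main

    import "fmt"

    func main() {
        req := make(chan string)
        rep := make(chan string)

        // server: wait for a request, echo it back
        go func() {
            rep <- <-req
        }()

        // client: send, then block on the reply
        req <- "hello"
        fmt.Println(<-rep) // prints "hello"
    }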

I can appreciate what everyone is saying upthread, but I've definitely found it valuable to think about concurrency in these terms.

[^] Even this is overselling it. You can easily have two logical tasks and one physical agent responsible for executing both of them, and this is still a concurrent system. Dataflow programming is a lot like this -- you're not really concerned with the global parallelism of the whole system, you're just describing a series of tasks over local knowledge that propagate results downstream.
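A rough sketch of that last point (Go again, single-threaded on purpose): one loop is the sole physical agent, draining a queue of two logical tasks, where the first schedules the second with its result, dataflow-style.

    package main

    import "fmt"

    func main() {
        type task func()
        var queue []task

        // upstream task: produce a value, then schedule the downstream task
        var result int
        queue = append(queue, func() {
            result = 21
            queue = append(queue, func() { fmt.Println(result * 2) }) // prints 42
        })

        // one physical agent executes both logical tasks, one after the other
        for len(queue) > 0 {
            t := queue[0]
            queue = queue[1:]
            t()
        }
    }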


Thank you; the example helped clarify things for me.

Yes, any time there are shared resources, concurrent tasks can hit synchronization points and end up bottlenecking on them. Of course, I'd contend that a point of synchronization is a form of serialization, but I agree that the tasks being synchronized would still be described as concurrent (since they don't causally affect their own ordering). But such a synchronization is either a necessary part of the algorithm you're choosing to use (even prior to SMP support), or it would be completely unnatural to include it. I don't know that it really invalidates the OP's point that the language didn't remove the bottlenecks you introduced.
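A small Go sketch of that kind of bottleneck (hypothetical, just to illustrate): eight goroutines are all concurrent, but because the critical section is the entire task, the shared lock serializes the actual work.

    package main

    import (
        "fmt"
        "sync"
    )

    func main() {
        var mu sync.Mutex
        var total int
        var wg sync.WaitGroup

        for i := 0; i < 8; i++ {
            wg.Add(1)
            go func(n int) {
                defer wg.Done()
                mu.Lock()   // the synchronization point: all useful work
                total += n  // happens inside it, so the concurrent tasks
                mu.Unlock() // effectively run one at a time
            }(i)
        }
        wg.Wait()
        fmt.Println(total) // 0+1+...+7 = 28
    }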



