
Thinking in Parallel - jondubois
https://blog.baasil.io/thinking-in-parallel-d854e904d821
======
toolslive
Is it just me, or is the title wrong? I think networking is about
concurrency rather than parallelism, and parallelism is merely a resource that
can be exploited when your code is written with concurrency in mind.

~~~
jondubois
I don't think so, because the article discusses sharding across multiple
processes (and therefore CPU cores), i.e. running in parallel.

You can handle multiple concurrent connections/requests on a single CPU core,
but that would not qualify as parallel processing. So the term 'parallel' is
more specific than 'concurrent'.

The paragraph under 'Introduction' here
[https://en.wikipedia.org/wiki/Concurrent_computing](https://en.wikipedia.org/wiki/Concurrent_computing)
seems to give a pretty good definition.

~~~
toolslive
"Understanding and expressing scalable concurrency"
([https://people.mpi-sws.org/~turon/turon-thesis.pdf](https://people.mpi-sws.org/~turon/turon-thesis.pdf))
addresses the difference between parallelism and concurrency in
detail (pages 11-12). Rereading that, I'm pretty sure the title is wrong.

~~~
jondubois
I just read those few pages of the thesis. It seems to imply that concurrency
has some sort of 'locking' or 'coordination' component. The 'Thinking in
Parallel' article makes no mention of either concept; quite the opposite.

In French, the word 'concurrent' literally means 'competitor', which in our
case implies competition for computational resources (e.g. specific memory
locations). The 'Thinking in Parallel' article, on the other hand, talks about
how to partition resources in a way that removes the element of competition
between processes, so the word 'parallel' seems fitting.
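The partitioning idea can be sketched concretely. Below, `partition_by_key` is a hypothetical helper (not from the article) that assigns each item to exactly one worker by hashing its key; because the resulting shards are disjoint, workers never touch the same item and need no locks.

```python
def partition_by_key(items, key, n_workers):
    """Assign each item to exactly one worker by hashing its key.

    The shards are disjoint, so each worker owns its subset outright:
    no shared state, no locking, no competition between processes.
    """
    shards = [[] for _ in range(n_workers)]
    for item in items:
        shards[key(item) % n_workers].append(item)
    return shards

# Each worker can now process its shard fully independently.
shards = partition_by_key(range(10), key=lambda x: x, n_workers=2)
assert shards == [[0, 2, 4, 6, 8], [1, 3, 5, 7, 9]]
```

The design choice is that coordination happens once, up front, in the partitioning step, rather than continuously at run time via locks.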

~~~
stonemetal
Unfortunately, in the last few years 'concurrent' and 'parallel' have taken on
technical meanings, so any understanding gained from what they mean in the
common tongue doesn't really help.

Concurrent is for separate entities working together in a coordinated manner;
a prime example is threads working together on a work queue. Parallel is for a
singular entity doing multiple things at the same time; for example, a vector
instruction that performs multiple additions at once.
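The two shapes described above can be contrasted in a short sketch (Python chosen for brevity; a real SIMD add would live in hardware, so the list comprehension only stands in for its data-parallel shape):

```python
import queue
import threading

# Concurrency: separate workers coordinate over a shared work queue.
# The queue (and the lock around the results list) is the coordination point.
def drain(q, results, lock):
    while True:
        try:
            item = q.get_nowait()
        except queue.Empty:
            return
        with lock:
            results.append(item * 2)

q = queue.Queue()
for i in range(100):
    q.put(i)

results, lock = [], threading.Lock()
threads = [threading.Thread(target=drain, args=(q, results, lock))
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Parallelism (in spirit): one operation applied to many elements at once,
# as a vector instruction would. No element coordinates with any other.
doubled = [i * 2 for i in range(100)]

assert sorted(results) == doubled
```

The concurrent version needs explicit coordination machinery; the data-parallel version needs none, because no two elements interact.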

------
grigjd3
This post opens with a big, abstract, and not entirely useful introduction in
order to describe a useful case of pub/sub (in fact, the exact or nearly exact
case pub/sub was designed for).

------
z1mm32m4n
> To summarize; the key to thinking in parallel is to think about your data as
> being made up of small, fixed-size subsets of data that can be dealt with
> independently.

This isn't quite the idea behind thinking in parallel. In fact, parallelism is
more about understanding the dependencies between the pieces of your
program. Each piece need not be independent of every other; the point is
understanding _where the dependencies are_.
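A minimal sketch of this point, assuming three illustrative tasks `a`, `b`, and `c` (names invented here): `a` and `b` have no dependency on each other, so they can run in parallel, while `c` depends on both and must wait. The dependency structure, not total independence, is what bounds the parallelism.

```python
from concurrent.futures import ThreadPoolExecutor

def a():
    return 2

def b():
    return 3

def c(x, y):
    # c depends on the results of both a and b.
    return x + y

with ThreadPoolExecutor() as pool:
    fa = pool.submit(a)   # independent: may run in parallel with b
    fb = pool.submit(b)   # independent: may run in parallel with a
    result = c(fa.result(), fb.result())  # dependency: waits on both

assert result == 5
```

Only the edges of the dependency graph force sequencing; everything off those edges is fair game for parallel execution.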

