
Parallelism is not concurrency - pfleidi
https://existentialtype.wordpress.com/2011/03/17/parallelism-is-not-concurrency/
======
nivertech
_Concurrency_ \- property of systems in which several computational processes
are executing at the same time, and potentially interacting with each other.

 _Parallelism_ \- computation in which many calculations are carried out
simultaneously, operating on the principle that large problems can often be
divided into smaller ones, which are then solved concurrently ("in parallel").

Go to slide #13 in my presentation:

<http://www.slideshare.net/nivertech/migrationtomulticore>

------
kevinburke
If I'm reading the article correctly, the author is saying that with
parallelism, you can perform independent operations in parallel (like sorting
both halves of a quicksort), whereas concurrency as people think about it
deals more with parallel operations on a shared state?

~~~
chrisaycock
_Parallelism_ often implies that the tasks are working towards the same goal
together, like dividing up the workload among sibling processors. This is
purely for performance.

 _Concurrency_ is simply the introduction of non-determinsm, like having a
thread run a task in the background. Concurrency can be used to make a
networked GUI application more responsive to the end user, even if there is no
performance benefit.
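
To make the "no performance benefit" case concrete, here's a minimal Python
sketch (mine, not from the article): a background thread keeps the main loop
responsive even though nothing runs any faster.

```python
import threading
import time

def slow_task(results):
    # Simulate a long-running background job (e.g. a network fetch).
    time.sleep(0.2)
    results.append("done")

results = []
worker = threading.Thread(target=slow_task, args=(results,))
worker.start()

# The "UI" loop keeps turning while the job runs; there is no speedup --
# the point is responsiveness, not performance.
ticks = 0
while worker.is_alive():
    ticks += 1          # stand-in for handling input, repainting, etc.
    time.sleep(0.01)

worker.join()
```

The main loop makes progress the whole time the task is running, which is the
benefit concurrency buys you here even on a single core.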

------
barrkel
I don't agree with the author's conflation of concurrency with non-
determinism. To me, concurrency means there may be more than one operation
logically "in flight" at any given time; parallelism means there may be more
than one operation physically happening at any given time. Determinism vs non-
determinism is an orthogonal issue.

You could have a single-threaded cycle-counting CPU simulator which is
entirely deterministic, yet if it is simulating a multi-threaded program, that
program would exhibit concurrency (for example, if it were a web server, it
could be serving multiple requests over separate TCP connections concurrently,
working a little on each request round-robin).
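
That round-robin scheme can be sketched as a single-threaded, fully
deterministic scheduler that still exhibits concurrency (several requests
logically in flight at once). The coroutine-based scheduler below is my
illustration, not anything from the simulator described above.

```python
def handle_request(name, steps):
    # Each "request" is a coroutine that yields after each unit of work.
    for i in range(steps):
        yield f"{name}: step {i}"

def round_robin(tasks):
    # Single-threaded and fully deterministic: no parallelism at all,
    # yet multiple requests are logically in flight at the same time.
    log = []
    while tasks:
        task = tasks.pop(0)
        try:
            log.append(next(task))
            tasks.append(task)      # rotate to the back of the queue
        except StopIteration:
            pass                    # request finished; drop it
    return log

log = round_robin([handle_request("A", 2), handle_request("B", 2)])
print(log)  # -> ['A: step 0', 'B: step 0', 'A: step 1', 'B: step 1']
```

The output interleaving is identical on every run, so this is concurrency
without non-determinism.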

~~~
jlouis
To appreciate his point, you must be aware that Harper is thinking about a
formalization of concurrency. In such a formalization, you could have a
deterministic execution of the concurrent processes, as you hint, but the
formalization will also include a trace of incoming events that is not under
the control of the CPU.

For the system as a whole to be deterministic, you would have it be
deterministic for arbitrary event traces. This is rarely the case in practice
though. Harper does not tend to just sling out a postulate unless he has good
reason to think it is so, backed up by a formal system in which he identified
the association.

It could be that he has identified that concurrency goes hand in hand with
non-determinism. To me, it does sound rather plausible.

~~~
barrkel
It sounds poorly defined to me. Concurrency for me means a specific thing, and
that specific thing does not have a necessary implication of non-determinism.
My position is similar to dmbarbour's
([https://existentialtype.wordpress.com/2011/03/17/parallelism...](https://existentialtype.wordpress.com/2011/03/17/parallelism-is-not-concurrency/#comment-20)).

------
cousin_it
The post says that the depth of quicksort is O(log^2 n). Blelloch's list of
NESL algorithms gives a depth of O(log n) instead:

[http://www.cs.cmu.edu/~scandal/nesl/alg-sequence.html#quicks...](http://www.cs.cmu.edu/~scandal/nesl/alg-sequence.html#quicksort)

Or do I misunderstand something?

~~~
shasta
There may be some difference in accounting. The recursion depth of Blelloch's
recursive function you linked to is O(log n), but it uses an operation that
cannot be implemented in constant time: partitioning the elements into those
greater and less than the pivot. I'd guess that operation is itself O(log n),
which would account for the O(log^2 n).
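
That accounting can be checked with a small sketch of the depth recurrence
D(n) = partition_depth(n) + D(n/2). The even split is an assumption for
illustration; the point is how the partition's depth drives the total.

```python
import math

def depth(n, partition_depth):
    # Depth recurrence for parallel quicksort with a perfectly even
    # split: D(n) = partition_depth(n) + D(n // 2), with D(1) = 0.
    d = 0
    while n > 1:
        d += partition_depth(n)
        n //= 2
    return d

# With a (hypothetical) O(1)-depth partition, total depth is Theta(log n):
print(depth(2**20, lambda m: 1))                        # -> 20

# With an O(log n)-depth partition (e.g. a parallel prefix-sum
# compaction), total depth grows like log^2 n:
print(depth(2**20, lambda m: math.ceil(math.log2(m))))  # -> 210
```

Summing log n + log(n/2) + ... + 1 is what produces the O(log^2 n) bound in
the post.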

~~~
kragen
"Depth" and "recursion depth" are not the same thing.

Quicksort partitioning normally takes O(N) time, but it can be in some sense
perfectly parallelized once you have the pivot.
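
A sequential sketch of what "in some sense" means here, assuming a PRAM-style
cost model where each whole-array operation counts as one parallel step (this
is illustrative, not a real parallel implementation):

```python
def parallel_partition(xs, pivot):
    # Step 1: O(1) depth -- every element is compared to the pivot
    # independently, which is perfectly parallel.
    flags = [x < pivot for x in xs]
    # Step 2: compacting each side requires every element's output
    # position, i.e. a prefix sum over `flags`; a parallel scan has
    # O(log n) depth, which is where the extra log factor comes from.
    less    = [x for x, f in zip(xs, flags) if f]
    greater = [x for x, f in zip(xs, flags) if not f]
    return less, greater

print(parallel_partition([5, 1, 9, 3, 7], 5))  # -> ([1, 3], [5, 9, 7])
```

So the comparisons parallelize perfectly; it's building the output arrays
that costs logarithmic depth.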

~~~
cousin_it
By "in some sense", do you mean it can be parallelized to O(1) time as long as
you don't have to actually build up the data structures that result from
partitioning (e.g. arrays)? Then it's not clear to me why this assumption is
applicable to quicksorting an array. Doesn't it call for some non-obvious
representation of intermediate results?

~~~
kragen
I'm out of my depth (heh) commenting here, but I think it depends on what kind
of communications infrastructure you're assuming.

------
chrisaycock
> Isn’t concurrency required to implement parallelism? Well, yes, it is...

Not exactly. Non-determinism is not required for parallelism; just look at
VLIW:

<http://en.wikipedia.org/wiki/Very_long_instruction_word>

~~~
barrkel
Non-determinism is not concurrency either (I think the author is wrong to
conflate the two).

Parallelism is a physical, hardware implementation of concurrency. VLIW,
vector operations, GPUs etc. execute very fine-grained operations in parallel,
but whether this is regarded as concurrency in any given context depends on
how "lumpily" you define a concurrent task for that particular context. For
example, fancy parallel instructions might be used to resize images in a
thumbnail generator in a web server, but if the task being considered is the
processing of a web request, then there is no concurrency. If the task being
considered is processing a line, stripe or block of pixels, then there is
concurrency.

------
gtani
_long_ (and some worthwhile) threads

[http://www.reddit.com/r/programming/comments/g6k0p/paralleli...](http://www.reddit.com/r/programming/comments/g6k0p/parallelism_is_not_concurrency/)

------
jderick
_Concurrency is concerned with nondeterministic composition of programs (or
their components). Parallelism is concerned with asymptotic efficiency of
programs with deterministic behavior._

Interesting distinction. My understanding is that these terms are typically
used interchangeably in a more general sense. Perhaps it would be better to
come up with some different terms for these concepts.

~~~
chrisaycock
> these terms are typically used interchangeably in a more general sense

No, these terms are used interchangeably by users who don't know what they
mean.

> Perhaps it would be better to come up with some different terms for these
> concepts.

These concepts already have different terms. We don't need to come up with new
ones; we just need to start using the existing terms correctly.

------
krosaen
related: scalability is not efficiency. discuss.

~~~
jlouis
Yes, and efficiency is not latency.

------
Aetius
In other words, more confusion. Don't read if you don't understand words like
"asymptotic efficiency" and "nondeterministic composition". Ugh.

Can someone put this in lay terms?

~~~
te_platt
Parallelism is recognizing that 5 guys should be able to dig a ditch close to
5 times as fast as 1. Concurrency is making sure everyone has a shovel and is
digging in the right place.

~~~
barrkel
Determinism is making sure the digging is in the right place. Non-
deterministic digging may be possible, but it's harder to get right.

Concurrency could also be having one guy with one shovel, digging iteratively
in 5 separate places, a shovelful or two at a time in each.

Or to put it another way, parallelism is digging two ditches with 4 people 4
times as fast as 1 person digging two ditches; concurrency is 1 person digging
two ditches at the same time, alternating; determinism is making sure that the
two ditches are being dug correctly, at an even pace, etc.

