
Parallelism /= Concurrency - DanielRibeiro
http://ghcmutterings.wordpress.com/2009/10/06/parallelism-concurrency/
======
javert
As a computer scientist, I find this article very frustrating. I don't
understand why parallelism is not equal to concurrency.

Let's look at the definitions given:

 _A concurrent program is one with multiple threads of control._

 _A parallel program, on the other hand, is one that merely runs on multiple
processors._

Well, you have to have multiple threads of control to run on multiple
processors simultaneously. And if you're running on multiple processors
simultaneously, you have multiple threads of control.

So what's the difference?

If anybody can enlighten me on why parallelism != concurrency (a claim I've
seen often), I'd appreciate it.

By the way, to me, _all_ programming languages can be concurrent because
ultimately it's all just opcodes; and the word "non-determinism" is
problematic anyway. So there's some context for you.

~~~
keveman
I find it easier if I dissociate the English meanings of the words from how
the words are used to mean different things in computer science. To me,
concurrency exists only in the programmer's mind. That is, a programmer writes
concurrent programs with different threads of control, say, one to process the
user input, one to process the network packets, etc. In reality, the multiple
threads of control could really be running SEQUENTIALLY. That was the case
when there was only one core. Remember, pthreads predates multi-core
processors.

Parallelism is something more concrete; it REALLY exists in the real world. A
programmer might write a plain sequential loop, but if the compiler deemed it
to be parallelizable, it could really run parts of the loop in parallel. Of
course, a programmer could write a concurrent program, as in, multiple threads
of control, and the system could really run it in parallel, say on different
cores of a multi-core processor. So remember: concurrency is abstract,
parallelism is concrete.
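This distinction can be sketched in code (a Python illustration, not from the
article; the article itself is about Haskell/GHC). CPython's global
interpreter lock makes the point vividly: the threads below are genuinely
separate threads of control, yet the runtime interleaves them on one core at
a time, so the program is concurrent without being parallel.

```python
# Concurrency: multiple threads of control in the programmer's mind.
# Under CPython's GIL these may well execute one at a time, interleaved.
import threading

results = []
lock = threading.Lock()

def worker(name):
    # Each thread is an independent thread of control, even if the
    # runtime ends up serializing them on a single core.
    with lock:
        results.append(name)

threads = [threading.Thread(target=worker, args=(i,)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Parallelism: these loop iterations are independent of one another, so a
# compiler or runtime could execute them simultaneously (e.g. via a process
# pool) without changing the program's meaning.
squares = [n * n for n in range(8)]

print(sorted(results), squares)
```

The second loop is written sequentially, which is exactly keveman's point:
whether it actually runs in parallel is a property of the machine and the
runtime, not of the source program.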

~~~
javert
OK, I'm seeing this explanation being brought up, and it makes sense. Thank
you. I wish the original poster had actually stated it this way.

~~~
srgseg
This is one of those times where not being able to see comment points really
hurts my ability to read this thread.

Lots of people saying slightly different things, and no way for me to know
where consensus lies.

------
scott_s
My personal definitions, based on my experience in systems and high
performance computing research:

Parallelism: simultaneous calculations executing in the service of a single
problem, usually with the goal of improved performance.

Concurrency: executions in the same time granularity, but not necessarily
simultaneous. Also not necessarily in the service of the same problem, but
some form of synchronization is required.
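These two definitions can be illustrated with a small sketch (Python, my own
hypothetical example, not from the thread): the parallel part splits one
problem into independent subproblems that could run simultaneously, while the
concurrent part shows two activities in the same time window that need
synchronization even if they never execute at the same instant.

```python
# Parallelism: simultaneous calculations in the service of a single problem.
# Summing the two halves of a list is one problem split into independent
# subproblems that could run at the same time on two cores.
data = list(range(100))
half = len(data) // 2
partials = [sum(data[:half]), sum(data[half:])]  # independent pieces
total = sum(partials)

# Concurrency: activities in the same time granularity that require
# synchronization -- here a queue coordinates a producer and a consumer,
# whether or not they ever run simultaneously.
import queue
import threading

q = queue.Queue()
consumed = []

def producer():
    for i in range(5):
        q.put(i)
    q.put(None)  # sentinel: tells the consumer to stop

def consumer():
    while (item := q.get()) is not None:
        consumed.append(item)

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()

print(total, consumed)
```

The queue is the "form of synchronization" in the definition: remove it and
the two threads have no correct way to hand data to each other.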

(Admittedly repeating myself: <http://news.ycombinator.com/item?id=2387067>)

------
jchonphoenix
Isn't != the canonical programming form of "not equals"?

~~~
thenduks
This article is in the context of Haskell/GHC, where `/=` is the not-equals
operator:
[http://www.haskell.org/onlinereport/standard-prelude.html#$tEq](http://www.haskell.org/onlinereport/standard-prelude.html#$tEq)

