
"Sequential programming is dead. So stop teaching it" - jaydub
http://software.intel.com/en-us/blogs/2008/10/22/sequential-programming-is-dead-so-stop-teaching-it/
======
festivusr
Aside from the inflammatory headline, what the article seems to be saying is
"teach more about concurrent programming during early software programming
courses," which seems reasonable enough. Like pointers, concurrent programming
concepts seem to be a tough pill for some CS students.

~~~
apgwoz
I'm not sure how you can talk about concurrency without talking about locks
and other such topics, which I think is very ambitious for new programmers.

However, if universities start teaching programming with pure functional
languages (which is highly unlikely), concurrency becomes a much easier topic
for discussion.

Perhaps this is the sort of thinking that will lead to more universities
adopting something like PLT Scheme, which, in recent versions, has moved the
notion of mutable pairs into a library. Doing this might bring functional
languages out of academia and more into the mainstream, which would be
fantastic.

I remember fellow students having trouble grasping pointers, even after a 2nd
year architecture course, which I thought was completely absurd since they
were writing assembly code without tremendous strain.

~~~
stcredzero
Often when competent programmers are having trouble grasping pointers, they
are actually doing fine with thinking about pointers in the abstract, but having
trouble with the notation describing a particular instance of them.
(Especially with C programs.)

One of the things that doing good OO and following the Law of Demeter does for
you is to reduce the levels of indirection you have to deal with to one or
two.

~~~
orib
From what I've seen helping people I know handle pointers, they are
actually having trouble with the concept of pointers. Getting them to draw
the box diagrams with pointers correctly is challenging. The notation is an
extra challenge, but it's not the main difficulty.

It's not the notation, it is the concept.

~~~
stcredzero
The only time I've seen difficulty with pointers in the abstract is when
helping people in intro programming classes in C. Once you're past that level,
programmers get it, but they get confused by what the code is actually saying.

------
swombat
That's ridiculous. Programming is not all about multi-core performance. Also,
programs do not all need to directly implement parallelism to be performant.
The architecture can support parallel processing with sequential languages. A
great example of that is Rails, when set up with mongrel processes, each of
which can run on its own core if necessary.

Hyperbole, hyperbole, hyperbole <-- summary of this article.

~~~
crabapple
_That's ridiculous. Programming is not all about multi-core performance._

it will be. you're getting beaten over the head by chip designers telling you
that your future cpu is going to consist of a (possibly large) array of
processing cores with a high-capacity bus connecting them. they are telling
you this is the only way they can give you higher performance. you had better
start believing them because these systems are starting to get delivered now.

 _A great example of that is Rails, when set up with mongrel processes, each
of which can run on its own core if necessary_

?????? so mongrel comes with its own OS kernel that has better support for
multicore than linux and freebsd? wow!! coolzzz!

~~~
axod
"you're getting beaten over the head by chip designers telling you that your
future cpu is going to consist of a (possibly large) array of processing cores
with a high-capacity bus connecting them."

Sorry, but I don't buy that. We're also moving to a thin client world where we
don't actually need that much power on our thin clients.

Of course the chip makers are saying that - they want to sell more chips. They
have to come up with some other number they can increase.

~~~
blackguardx
Can you envision a world where your "thin client" is powered by 25 cores
clocked at a low speed like 100 MHz? Why not?

Do you really think performance doesn't need to be increased from the current
status quo? The number of cores will only go up from here.

~~~
axod
No I can't. That would be simply ridiculous. The thirst for power is not
infinite. At some point, most people will have enough power to do everything
they need (Unless they're using say windows, which will always require 10
times more computing power than the previous version).

~~~
blackguardx
And 640 kilobytes of RAM should be enough for anybody.

There will always be an increase in power demand. To think otherwise is short-
sighted. If you could keep what we have now or have a sentient computer
sitting on your desk, which would you choose?

~~~
axod
Personally? I'd keep what I have. There is nothing better or more rewarding
than squeezing more performance out of fixed limited hardware. Where is the
fun if you can just buy twice as many servers? The day hardware is free, is a
very very sad day for programmers.

Of course there will be massive demands in the world of servers, research,
gaming etc, but that's not _everything_.

~~~
randallsquared
"The day hardware is free, is a very very sad day for programmers."

It may be a sad day for those of you who program primarily for the challenge,
but for those of us who want to get stuff done, it'll be a joyous day. :)

------
PaulSteinberg
As the instigator of the offending post, I am delighted to see the discussion.
I am down at Supercomputing 08 with Tom and others and we will be discussing
this topic on a panel this evening. The discussants include NVIDIA and Intel
(and SUN and IBM and AMD) so clearly this is an issue of concern to ALL
computer manufacturers. If you are in Austin - come to room 10B this evening
(Monday, Nov 17). If you are not, we are doing a live webcast on the subject
<http://is.gd/7Rvz> on Thursday Nov 20. I would love to be able to carry
discussion on this topic further. One idea would be an open forum using some
kind of internet voice/text app. Not sure which yet. I know I could drag some
Intel folks and I'm pretty sure I could get a few folks from elsewhere in the
industry and academia to join in. Maybe we could even make this a semi-
regular and continuing discussion on broader topics; you guys clearly have
both the opinions and the savvy. So let me know: post something on my blog if
you like the idea <http://is.gd/7RyX> and we'll figure out how to make it
happen.

------
ntoshev
How can you teach concurrent programming without teaching sequential
programming first?

~~~
tjr
Teach them concurrently?

------
wolfmurphy
What a delightful thread. I am indirectly part of the creation of this
inflammatory headline. It comes from the Panel discussion we are holding next
Monday evening at SC08 in Austin. The panel's title is "There Is No More
Sequential Programming! Why Are We Still Teaching It?"

It is a complex issue. I taught myself to program at 16 to play blackjack. 41
years later, I am still creating and playing games (not video games). For over
thirty years I worked for supercomputer companies. Along the way, I went to
grad school and formalized my education.

I believe my experience is not atypical. My students who succeed in CS are
the ones who at some point have a passion to solve a problem and are intent on
gaining the skills to do it. Some start from flow charts and pseudo code; some
start by debugging an empty file.

What does this mean for this discussion? I don't care whether we view
sequential as a special case of parallel, or parallel as a special case of
sequential. Ideally, I'm going to help my students have the thinking skills
and the experience to solve interesting problems by cutting code with threads,
with MPI calls, via Cuda, or just with other code. But there is no question
that many of the rarefied programming skills of my supercomputer days are fast
becoming everyday programming skills.

------
wheels
In other news, Microsoft announces that UNIX is dead, so it should not be
taught anymore.

------
13ren
Many-core overshoots the performance needed _for most uses_. Hence the rise of
sub-notebooks and iPhones, and the best-selling games console being by far the
least powerful one (the Wii).

+

Concurrency is an unsolved problem. There are locks etc.; there's
Smalltalk/Erlang pure message passing and Web Services/SOA (it's concurrency).
Concurrency is of academic interest, and of interest to niche apps (game
engines, simulations, etc.).

=

unsolved problem that is not needed... _so far, anyway_

------
bpyne
The idea of solving problems by breaking down solution steps and running them
in parallel is exciting. However, identifying areas where the solution to a
problem can be parallelized is the largest problem. As critical is analyzing
whether or not the administrative overhead that goes along with
parallelization is going to negate the benefit. Put more succinctly, computer
science curricula should emphasize this kind of analysis well before
introducing the techniques for achieving parallelization.

Donald Knuth's skepticism over the benefits of concurrency makes me want to
rethink my own assumptions.

I haven't really seen anyone describe the changes that should be made to
curricula. Do any educators on this thread have specific changes in mind?

------
Tichy
The purpose of programming is not to maximize performance of Intel CPUs. If I
want to calculate 1+1 or sqrt(2), why should I bother with parallel
algorithms? Also, a lot of the time there will just be parallel threads
exploiting the cores (like multiple applications running on an OS).

What would be interesting would be a kind of "complex systems" programming,
stuff like cellular automata, but maybe it is impossible to make them
tractable enough.

Also, aren't the most performant "parallel computations" simply specialized
matrix operations? I am not sure that learning specialized and hard-to-
understand programming languages for parallel computation is the best way
forward.

------
snorkel
Parallelism still costs overhead. Concurrency is not free so you have to weigh
the cost/benefit before deciding to use sequential or parallel for the task at
hand.

------
sherl0ck
I recently read the other thread saying that IE 6 won't go anywhere; it's the
same with sequential programming. I don't think it will be dead anytime soon.

------
rgrieselhuber
This ignores the millions of single-core/single-processor embedded systems out
there (which surely also represent a growing industry).

------
mattmaroon
In a nutshell "we moved our product in a direction that is largely useless.
Please help us."

------
axod
dead??? Not sure about that one. There is more to programming than just
multicore server backend stuff. Regardless, you don't "stop teaching it", you
just teach more related to multi-core architectures if those do start to
become more prevalent.

------
cabalamat
Rumours of its death have been exaggerated. 99% of programming is sequential.

~~~
13ren
Well, he did say it was a diktat, which I had to lookup:
<http://www.tfd.com/diktat>

------
projectileboy
"People aren't using the hardware you're building. So stop building it."

------
drewr
Perfect timing. My copy of JCIP came in the mail yesterday.

------
morbidkk
so java programmers get deep into java.util.concurrent

~~~
crabapple
no, more likely java programmers pick up haskell.

~~~
jimbokun
no, clojure

~~~
crabapple
no, haskell. clojure in the end runs on the jvm. the ghc compiler has
parallelism baked into it in a fundamental way.

don't get me wrong, i like clojure, but it won't be better for multicore until
the jvm is better for multicore.

