
Programming as a Way of Thinking - pmcpinto
https://blogs.scientificamerican.com/guest-blog/programming-as-a-way-of-thinking
======
shubhamjain
> With a computational approach, we can go “top down”, starting with libraries
> that implement the most important algorithms, like Fast Fourier Transform.
> Students can use the algorithms first and learn how they work later.

This is exactly how I understood it. Three years back, the WebAudio API was a
new addition to browsers and I decided I should make something with it. I
settled on building a Whistle Detector, using a cited research paper as my
basis. I barely understood anything in that paper and had to dig into DSP, FFT
and other basics to get by. As I had no external help, I struggled plenty but
managed to complete it after two weeks [1].

Funny enough, we had a DSP course which never made sense to me. Two weeks into
my puzzle I walked away with more useful knowledge than the course ever did.
What has always motivated me to learn is the application. I can work
tirelessly if there is an interesting thing to build, even if I have to go
through mundane theory. But, I find it utterly tedious to learn theory with no
immediate goal in mind.

[1]:
[https://stuff.shubhamjain.co/whistlerr/](https://stuff.shubhamjain.co/whistlerr/)
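For readers curious what such a detector involves, here is a minimal sketch (not the linked project's actual code, and far cruder than the cited paper's method): take the FFT of an audio window and check whether the dominant frequency falls in a typical whistle band.

```python
import numpy as np

def dominant_frequency(samples, sample_rate):
    """Return the frequency (Hz) with the largest magnitude in the signal."""
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum)]

def looks_like_whistle(samples, sample_rate, band=(1000.0, 3000.0)):
    """Crude detector: is the dominant frequency inside the whistle band?"""
    f = dominant_frequency(samples, sample_rate)
    return band[0] <= f <= band[1]

# Synthesize one second of a 2 kHz "whistle" to exercise the detector.
rate = 44100
t = np.arange(rate) / rate
tone = np.sin(2 * np.pi * 2000.0 * t)
print(looks_like_whistle(tone, rate))
```

A real detector would work on short windows of microphone input and also check that the peak is narrow and sustained, but the FFT call is the whole "magic" part.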

~~~
mtempm
That's a very interesting comment to me, because I'm perhaps the opposite.
I got a slow and frustrating start learning application development--I wasn't
getting it. I felt I was following instructions on how to write esoteric
instructions and not understanding how they worked. I found the book Code by
Petzold and read it. Then I discussed some of this with other developers and
realized they were even more clueless than I was.

It seems some people love theory/understanding while some love building
something practical/useful.

~~~
khedoros1
I learn best when I build something that illustrates the theory. Reading gives
me a dim idea, but building something (and freely experimenting on the way)
teaches me how it works.

~~~
rwnspace
I'm the opposite, and had to be persistent in searching for the right
learning tools for programming, with the fewest terms and behaviours left
undefined. If there are such things as 'learning styles', theoretical learners
vs applied learners is the most intuitively clear distinction. However,
choosing a teaching and measurement paradigm that fits both without a lot of
wiggling is not so intuitive...

------
misingnoglic
I had a math teacher at my high school who used python and SAGE to teach us
pre-calculus as well as programming, computational thinking and some
mathematical logic. He was an absolute genius and I wouldn't be where I am if
it wasn't for him - unfortunately the school saw him as a threat to the
standard model of teaching and drove him insane before he just quit. I hope
he's doing ok.

~~~
mathgladiator
I've had thoughts like this, using SciPy to teach kids how to hack math. Our
education system is kind of sad face.

~~~
misingnoglic
It was definitely cool - he had us writing python functions to estimate the
area under a curve (and then visualize it), and other things I don't quite
remember at this moment. Once he made us define math functions just based on
there being a 0, and a successor function, which happens to be what I'm doing
in a graduate Math Logic course at my university.
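The two exercises described above might look roughly like this in Python (a hypothetical reconstruction, not the teacher's actual assignments): a Riemann-sum estimate of the area under a curve, and arithmetic built only from zero and a successor function.

```python
def riemann_area(f, a, b, n=10000):
    """Estimate the area under f on [a, b] with n left-endpoint rectangles."""
    width = (b - a) / n
    return sum(f(a + i * width) for i in range(n)) * width

# Area under y = x**2 from 0 to 1; the exact answer is 1/3.
print(riemann_area(lambda x: x * x, 0.0, 1.0))

# The "0 and a successor function" exercise: Peano-style addition,
# defined using nothing but zero and succ.
def succ(n):
    return n + 1

def add(m, n):
    return m if n == 0 else add(succ(m), n - 1)

print(add(2, 3))  # 5
```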

~~~
mtempm
That's really incredible.

It's sad to see educators like this get pushed out of their field. I had some
incredible teachers as well, but also some awful ones. There needs to be
greater competitive components added to the education system, such as
statistically significant pupil success in education and career outcomes.

------
richard_shelton
In the early 80s there was a concept called "Programming is the second
literacy" [1], introduced by Soviet computer scientist A.P. Ershov. Nowadays
it's obvious that there is a huge demand for specialized programming tools not
only for professional programmers (C++, Java..), but for domain experts who use
algorithms too (Python, DSLs..).

There is a popular objection that "not everyone should be able to code". But
it depends on the "to code" definition. A good example here is with game
designers. They may not know how to do low-level coding, work with 3d math or
use C++ templates. But still for a really good game designer it's very
important to have algorithmic thinking and they need to have a tool for
testing their algorithmic ideas on the computer.

As for Python, there is an issue: it's hard to find examples of good style of
Python programming (I think only a few of us actually learnt Python with the
help of a textbook). Some time ago I was happy to find notes by Peter Norvig
about Python and about comparing Python and Lisp. His code is very elegant;
see, for example: [http://norvig.com/python-lisp.html](http://norvig.com/python-lisp.html)

[1]
[http://www.sciencedirect.com/science/article/pii/01656074819...](http://www.sciencedirect.com/science/article/pii/0165607481900028)

~~~
jimmaswell
>it's hard to find examples of good style of Python programming

I've never really bought that this matters so much as long as you just write
sensibly in general. How many python users really make sure to do everything a
"pythonic" way?

~~~
richard_shelton
I think, it depends on how you use Python.

1\. "Executable pseudocode", almost a toy language to explain few algorithms
in your article.

2\. Sysadmin tool. Replacement for shell, a way to write quick throwaway
scripts.

3\. "A modern Lisp". Universal language for implementing AI algorithms, DSLs
etc.

I, personally, use 3rd variant. I write quite big programs in Python, for
example I made few compilers in it. That's why code quality is important for
me here.
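The "executable pseudocode" use is easy to illustrate: here is binary search, written more or less the way a textbook would present it, yet directly runnable.

```python
def binary_search(xs, target):
    """Return the index of target in sorted list xs, or -1 if absent."""
    lo, hi = 0, len(xs) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if xs[mid] == target:
            return mid
        elif xs[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

print(binary_search([1, 3, 5, 7, 9], 7))  # 3
```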

~~~
noobiemcfoob
I would posit there are far more ways to use it than these. Python is a
powerful language that is flexible enough to be used for all of these and more.
You or anyone will inevitably write code you believe to be best suited to
your problem, pythonic or not, weighted by your ability and personal
definition of "pythonic".

------
epalmer
I saw Allen Downey speak about this concept a year ago in a small forum at the
University of Richmond. Several professors in the audience challenged his
thinking. I approached one after that and asked what he thought and the answer
was basically that Dr. Downey's approach is spot on. This coming from a CS
professor.

I have just started reading his Think Stats book
[http://www.allendowney.com/wp/books/](http://www.allendowney.com/wp/books/)
which is starting to help me better understand statistics. Not far enough into
the book yet to make a complete judgement however.

Got to say I like seeing different perspectives on traditional subjects. My
youngest is going to Olin next year in Mechanical Engineering and seeing these
sorts of articles keeps me excited about her unconventional engineering
education at Olin.

~~~
mediocrejoker
I don't quite understand your comment. If the professor you spoke with said
"Dr. Downey's approach is spot on", then why would said professor "challenge
[Dr. Downey's] thinking"?

~~~
epalmer
They were testing his thinking. They wanted to find holes in it, and in this
one professor's view they did not.

------
nikki93
Nice. I've actually found ideas in programming about modularity and complexity
to be ways to approach such complexity in economics, physics, social
relationships etc. "Notes on the Synthesis of Form" for example is about
design in the abstract and often through examples in architecture but the
ideas re: the ontology developed help and are helped by similar ideas in
structuring software. This is especially the case when thinking about modeling
live simulations like video games. In the vein of "Notes," our normal
semantics about the "categories" of thought we have like CS, philosophy etc.
might be more "categorical" than needed.

Something I want to play with more is OS-y scheduling and AI-y thought as
ways to think about time management or problem-solving for "humans."

------
bigger_cheese
"With a computational approach, we can go “top down”, starting with libraries
that implement the most important algorithms, like Fast Fourier Transform.
Students can use the algorithms first and learn how they work later."

The "learn how they work later" part sets off alarm bells for me. I don't have
a lot of confidence that students will be particularly motivated to dig into
how an algorithm works after their problem has been solved. I fear this will
lead to a generation of "just use $x package" and people blindly plugging in
"magic algorithms" without understanding their choice. "Quicksort for
everything" or "include leftpad", if you will...

I see it a little in my industry (engineering): "oh, your data is noisy, just
apply a Kalman filter", never mind whether it is appropriate or not.

~~~
j2kun
Another aspect, one that I've encountered many times, is that even when
someone has to implement the idea from scratch, they don't work to understand
it and instead just translate pseudocode. I once had an interview where I was
asked to explain why a piece of code (that the interviewer wrote) worked,
because the interviewer did not know.

A benefit of the bottom up approach, of starting with the math and no
reference implementation, is that you are less likely to implement it unless
you understand it. And I should stress, these are tradeoffs.

------
OrangeTux
This post reminds me of Amir Rachum's post about Knowledge Debt[1].

"This is how programming should be taught. You should do stuff way before you
can figure out how it works. For a while, you should intentionally be ignorant
about distracting details."

"You should, intentionally and tactically, decide which piece of information
you can do without, for now. But you should also, intentionally and
strategically, decide when to pay back that debt."

[1]: [http://amir.rachum.com/blog/2016/09/15/knowledge-debt/](http://amir.rachum.com/blog/2016/09/15/knowledge-debt/)

------
Koshkin
> _Programming has changed. In first generation languages like FORTRAN and C,
> the burden was on programmers to translate high-level concepts into code.
> With modern programming languages—I’ll use Python as an example—we use
> functions, objects, modules, and libraries to extend the language..._

This is the first paragraph. I understand that it's just setting the stage,
but it makes so little sense by itself that I had to make an effort to
continue reading.

~~~
rskar
Yeah, that was an odd start, given that Dr. Downey is a Professor of Computer
Science. Even a charitable suggestion that he was looking to keep things as
understandable as possible for a wide audience doesn't seem to give much
defense.

I think he was trying to articulate how much more expressive/extensive
programming languages and libraries (and tools?) have gotten over the years,
so that a student can get to do something interesting with much less down-and-
dirty arithmetic and arcana.

~~~
kpil
What has changed is the modules and libraries, which have exploded in
availability.

The actual programming method hasn't changed much since the dawn of Unix, or
perhaps Lisp - although it took a while to spread.

------
beagle3
The canonical, award-winning version of these ideas is described in Ken
Iverson's seminal 1979 paper 'Notation as a tool of thought' [0] - highly
recommended to anyone who finds the ideas discussed here interesting (even if
they find the Scientific American article lacking ... because it is).

[0]
[http://www.jsoftware.com/papers/tot.htm](http://www.jsoftware.com/papers/tot.htm)

~~~
wmacaluso
Peter Naur's "Programming as Theory Building"
([http://pages.cs.wisc.edu/~remzi/Naur.pdf](http://pages.cs.wisc.edu/~remzi/Naur.pdf))
is also a closely related read.

------
ethn
"The computer revolution is a revolution in the way we think and in the way we
express what we think.", SICP, 2nd Edition.

The article misses what I believe is the most important point, which is the
concept of the abstraction of a function. A function is more than a computer
concept: a function is a hammer, a violin, and a microscope; you put in an
input and you get an output.

~~~
yawaramin
Not only that, but you can then realise that functions are manipulable objects
too, and have operations of their own--eta abstraction, reduction,
composition, etc., and can be used to build all of modern mathematics pretty
much from scratch.
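Composition itself is a one-liner in Python, which makes the point that functions are ordinary values with their own operations (a quick sketch of the idea):

```python
from functools import reduce

def compose(*fns):
    """Right-to-left function composition: compose(f, g)(x) == f(g(x))."""
    return reduce(lambda f, g: lambda x: f(g(x)), fns)

double = lambda x: 2 * x
inc = lambda x: x + 1

print(compose(double, inc)(10))  # double(inc(10)) == 22
```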

------
asavinov
The author focuses on the algorithmic and computational aspects of
programming. Yet nowadays programming is much more than this beautiful
classical view: it is about describing a _complex system_, taking into account
such aspects as concurrency, cross-cutting concerns, asynchronous events,
transactionality, distributed computations etc.

~~~
mannykannot
That is not surprising, as this is not an article about programming in
general, but about using programming for teaching in other areas.

I would, however, welcome the companion article, "Thinking as a Way of
Programming", to balance the prevalent notion that testing is the only way
that matters.

------
neves
BTW, the article author, Allen B. Downey, has a marvelous collection of free
technical books: [http://greenteapress.com/wp/](http://greenteapress.com/wp/)

~~~
neves
And in his books, he practices what he preaches: teaching you complex topics
using computer programs. I don't know if it is a good approach for the general
population, but for us programmers, it certainly is.

------
tigershark
Not sure what I was expecting from the title, but for sure not what I read.
Maybe I'm quite old now, but working with latches and switches was always a
way of thinking. Writing awful BASIC with GOTO and subroutines was not that
different. Writing OO code is quite different at first, but then you discover
that it is basically the same way of thinking. The real way of thinking of
programmers doesn't have anything to do with the languages; it's more about
finding solutions to some problem. If you can't find solutions by yourself,
banging your head, it doesn't mean that you have to find a better language for
your software while doing nothing in the meantime; it means that you have to
start from scratch, understanding how to fix things and how to solve problems.
If you just wait for the next big thing or the next shiny solution that fixes
everything for you, then imho programming is not for you.

~~~
yawaramin
Good point. I think it's about finding the 'correct' ways to fit small parts
together into a solution for your problem. We certainly have some tools to
help us along the way--OOP, FP, etc.--but there's definitely an element of
craftsmanship to it.

------
dreamcompiler
Nice work. Of course teaching engineering [0] and physics [1] from a
computational perspective is not a new idea.

[0]
[https://en.m.wikipedia.org/wiki/Structure_and_Interpretation...](https://en.m.wikipedia.org/wiki/Structure_and_Interpretation_of_Computer_Programs)

[1]
[https://en.m.wikipedia.org/wiki/Structure_and_Interpretation...](https://en.m.wikipedia.org/wiki/Structure_and_Interpretation_of_Classical_Mechanics)

------
kccqzy
I agree partially with this article. I think programming in a highly
expressive language and runtime is a great way to get started exploring and to
develop intuition. I solved differential equations using Mathematica in middle
school, long before I even learned how to integrate. I solved instances of the
traveling salesman problem long before I understood how complex it is. I
played around with images and video files long before I learned anything about
signal processing. I find that it is very helpful as a student to be able to
learn from the bottom up and top down simultaneously. Mathematica is much more
forgiving than Python if a student has no prior programming experience.
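The same kind of exploration is possible in Python; for instance, a brute-force traveling-salesman solver is a few lines (a sketch that only works for tiny instances, which is itself a lesson in why the problem is hard):

```python
from itertools import permutations
from math import dist

def tsp_brute_force(points):
    """Try every tour starting at points[0]; return (length, tour)."""
    start, rest = points[0], points[1:]
    best_len, best_tour = float("inf"), None
    for perm in permutations(rest):
        tour = (start,) + perm + (start,)  # close the loop back to the start
        length = sum(dist(a, b) for a, b in zip(tour, tour[1:]))
        if length < best_len:
            best_len, best_tour = length, tour
    return best_len, best_tour

# Four corners of a unit square: the optimal tour has length 4.
length, tour = tsp_brute_force([(0, 0), (0, 1), (1, 1), (1, 0)])
print(length)
```

The factorial blow-up in `permutations` is exactly the complexity the commenter only appreciated later.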

------
blt
> _The languages I am calling modern are not particularly new; in fact, Python
> is more than 25 years old. But they are not yet widely taught in high
> schools and colleges._

Seems like a strange claim, because

> _eight of the top 10 CS departments, and 27 of the top 39, teach Python in
> introductory CS0 or CS1 courses._ [1]

[1] [https://cacm.acm.org/blogs/blog-cacm/176450-python-is-now-th...](https://cacm.acm.org/blogs/blog-cacm/176450-python-is-now-the-most-popular-introductory-teaching-language-at-top-u-s-universities/fulltext)

~~~
lazyasciiart
If you posit that python is a better way to teach programming, and that top
universities are more likely than average to use better ways of teaching, then
that data point may be skewed.

------
sriku
Peter Naur's "Programming as theory building" is an excellent read on the
topic. And not only on the executable-code front: I've also found the type
notation used in Haskell to be a great modeling tool with which I can check
the consistency of what I'm designing. I'm not talking Coq-level proofs here -
just an elegant notation that you can use on paper as effectively as in code.

------
kensai
Also check out his companion blog post: "Python as a way of thinking". Has
some nice slides there.

([http://allendowney.blogspot.com/2017/04/python-as-way-of-thinking.html](http://allendowney.blogspot.com/2017/04/python-as-way-of-thinking.html))

------
kkylin
If you liked this, you may also like
[https://www.youtube.com/watch?v=6J1vRrozgBg](https://www.youtube.com/watch?v=6J1vRrozgBg)
.

PS There is a written version of this, but I could not find a copy that wasn't
behind a paywall. Perhaps someone else can.

------
jackmott
Some discussion of the efficiency cost of our current favorite higher level
languages would be nice. There are things about Python that people love, and I
think we should strive to provide that with something more like a 2%
performance penalty instead of ~200%+

Languages like Nim are very promising on that front.

------
mybrid
Opinion is not fact. Given this is Scientific American should we expect
something, say, more scientific?

Let's put this under the heading, "One Size Does Not Fit All".

The underlying assumption of this opinion is that there is only one model to
teach to.

In the modern era we really need to take into account our differences.

In this case, functional versus procedural programming.

1. Write this in a Lisp, as functional.

2. Present the functional form as human-readable as the given procedural.

3. Conduct a study and gather the data.

4. Present the findings.

My experience has taught me that there is a subset of the human population who
think more naturally in functional programming.

But if one model is insufficient, why would two be?

What languages are yet to be invented to meet the needs of the millions of
daily programmers?

It is time for experimental science to be conducted given the millions of
people involved.

The path has already been laid out with Usability research that is entirely
experimental.

It is time to do the hard science on usability of claims of functional and
procedural programming languages.

It is time to put to rest opinion and one size fits all.

------
GrumpyNl
The reason we went for Turbo Pascal in the old days was the readability (it
was C or Pascal).

------
ccleve
This article starts to get at an idea that I've had for a while now -- that
mathematicians are doing it wrong.

In the software business we learned a long time ago to name our variables
properly, name our functions logically, and to control complexity by breaking
ideas into modules and then hiding the details inside.

If you can't name something, then you don't know what it is, and that tells
you that you should rethink your design.

Mathematicians don't do that. They name variables "a" or "x", or worse, they
use some Greek letter I can't type on my keyboard. They are entirely
inconsistent in their use of variables: "phi" or "theta" can mean a zillion
different things. I can't tell you the number of times I've read a computer
science paper, a paper that uses entirely unnecessary equations, and doesn't
bother to define symbols. This practice wastes everyone's time.

It's laziness, pure and simple.

Mathematics needs a general overhaul. The language of math needs a complete
redesign, with a focus on understandability. And the key to it is to force
mathematicians to name their variables.

Yes, I realize that mathematics deals in abstractions that have little
relationship to the outside world, and it makes little sense to call a
variable a "dog" or a "car". So what? It just means that we need a new
vocabulary, a vocabulary that includes terms like "Fourier transform" or
"hypotenuse". Pretty much every industry has its own vocabulary. Chemistry
has thousands of terms. Biology even more. Computer science is full of them.
Math is full of symbols that have no inherent or well-understood meaning. That
should change.

~~~
eldavido
It's more standard than people realize. Offhand:

x: "some real-valued variable"

n: "a countable quantity, usually a total"

i: "an index"

k: "some kind of constant", often an integer, whose value doesn't change; "c"
is also used for this

e: almost never used as a variable, it's Euler's number

p: some kind of probability, or a prime number (along with q)

t: some kind of parameter, often goes from [0,1] or (0,1)

z: complex numbers

I have a Master's in EE so I've studied this a bit.

~~~
osoba
U, V: vector spaces

u, v: elements of vector spaces

G, H: groups

g, h: elements of groups or

g, h: homomorphisms, isomorphisms etc

e: group identity

K, F: fields

I: ideals

f: functions

(x): sequences

x_i: i-th element of a sequence

A, B: matrices

It all depends on the context it's used in, of course.

~~~
danielpatrick
This notation developed over tens to hundreds of years, before we had the
capabilities of autocomplete and formal typing, where a computer can help us
write longer names more quickly. This is why single letters became prominent:
they were simply easier and faster to write.

But anyone who is serious about writing maintainable code today should be
using an IDE, where the benefits of succinctness are entirely obviated by
intellisense-like tools.

Trading readability for conciseness is near the top of my list of "crimes
against future maintainers."

So I had never thought about this in the context of mathematical symbols, but
this makes total sense, and I'm strongly in favor of sacrificing mathematical
conciseness in favor of readability and specificity.

~~~
osoba
It's actually the other way around. Mathematical notation used to be very much
language-like and tedious to read. As time went by (and math became more
complicated) notation was developed to make it more succinct and easier to
understand. (And sometimes the more succinct notation helps to develop new
insights: the change from Roman numerals to the Indian/Arabic number system
made calculations easier for everybody.)
[https://en.wikipedia.org/wiki/History_of_mathematical_notati...](https://en.wikipedia.org/wiki/History_of_mathematical_notation)

Compare the two following statements:

One from Euclid's Elements (written 2.5k years ago):

"Given two straight lines constructed from the ends of a straight line and
meeting in a point, there cannot be constructed from the ends of the same
straight line, and on the same side of it, two other straight lines meeting in
another point and equal to the former two respectively, namely each equal to
that from the same end."

And my attempt at translating the above into what should effectively be
Hilbert's notation (19th-20th century):
If there are two triangles ABC and ABD where AC=AD and BC=BD and C and D are
on the same side of AB then C and D are the same point.

Which one was easier to parse in your mind?

As a bonus, try rewriting this formula using longer variable names and tell me
how legible it would look:
[http://i.imgur.com/wCWkyNL.png](http://i.imgur.com/wCWkyNL.png) (it's from a
proof of one of Sylow's theorems:
[https://en.wikipedia.org/wiki/Sylow_theorems](https://en.wikipedia.org/wiki/Sylow_theorems)
)

~~~
douche
Conversely, you just draw a picture and leave all of this tedium behind,
making the import of what you are talking about obvious at a glance.

~~~
osoba
Proofs by picture aren't proofs though. And how would you even convey by
picture that two line segments are of the same length? Or that if you drew C
and D as separate points, they turn out to be the same point?
~~~
qznc
Oliver Byrne's edition of Euclid is a nice proof-by-picture example:
[https://www.math.ubc.ca/~cass/euclid/byrne.html](https://www.math.ubc.ca/~cass/euclid/byrne.html)

~~~
osoba
Those look more like proofs with pictures than proofs by picture, but I'm too
lazy to get into the obscure notation of that book and check whether a random
proof from the book would be equivalent to a modern proof from a standard
textbook.

Although, judging by the old-timey language of the book, it's possible the
book predates Hilbert's axiomatization of Euclidean geometry, and the proofs
in it were good enough for the standards of its time.

In modern mathematics, a proof by picture generally means you've drawn /
pointed out a single example, possibly wrongly or in a way that doesn't
generalize, and because you've shown that one example holds, you assume all
possible examples hold. That, obviously, need not be the case.

