
Reading, Writing, Arithmetic, and Lately, Coding
http://www.nytimes.com/2014/05/11/us/reading-writing-arithmetic-and-lately-coding.html
======
dasmithii
As essential as computational literacy has become, I feel a bit uneasy here.

Take mathematics, for instance. What students learn in school is important,
yes, but the way it is taught breeds a hatred of math in general. In a typical
high school, the great majority of kids dread Algebra, Geometry, and Precalc.
These classes are miserable to them, and consequently they come to associate
mathematics with their aversion to imposed education. Because of this, the
wonderful world of numbers is reduced to mere repetition.

For those who appreciate the beauty in mathematics, this must be depressing.

I'm afraid that CS may have a similar downfall if we mandate programming
curricula. Maybe a more subtle approach is necessary.

~~~
maxerickson
There should be a nonacademic track in math education, starting around 8th,
9th, or 10th grade.

It would focus on things like demystifying probability, balancing checkbooks,
and other things that sort of fall under numeracy.

I realize lots of people will want to complain that those aren't topics for
school, but the current program in the U.S. is watering down the academic
track anyway. People who have not yet seen any beauty in math would get much
more value from practical lessons designed mostly to be engaging than they
get from suffering through abstract math that is mostly designed to be
foundational.

~~~
jaredsohn
My high school had such classes; is this not standard?

~~~
derwiki
I can't tell if you're seriously asking. But yes, tons of schools (in the US)
are subpar, especially once you're an hour removed from a major city. From
what I hear from cousins in West Virginia, it's actively getting worse.

~~~
jaredsohn
Whether schools are subpar is a separate question.

I am saying that if you were in my high school and weren't expecting to go to
college, there were courses designed for you across all disciplines to teach
you practical life skills. However, I do think the school still required
students to struggle through at least some of the standard curriculum.

------
lifeisstillgood
We've said it before. We will say it again:

 _Programming is the new literacy for the 21st Century._

Charlemagne was the first and only illiterate Holy Roman Emperor. The current
generation of business and political leaders is the last that can remain
software-illiterate.

No, we won't expect Fortune 500 leaders to do production coding, but then we
don't expect the managing editor of a newspaper to write the articles either.
They must be literate, though, on top of everything else expected of them.

~~~
dave_sullivan
Agreed. People think of the application of software and automation to every
facet of our lives as an industry (tech) when really it is a new industrial
revolution. We're undergoing a major shift in how people will work in the
future.

"I'm not good with computers" will be the 21st century equivalent of "I can't
read good"

~~~
lostcolony
Or, "I'm not good with computers" will be the 21st equivalent of "I'm not good
at math". Meaning LOADS of employed people will be saying it.

The real benefit of programming for non-programmers is being able to break
large problems into small, solvable ones, and logically assemble those small
solutions into various larger ones (even outside of the original problem; code
reuse, as it were). This is a skill you can learn a myriad of ways, but
computing is perhaps the first time we've had a mechanism by which you can
directly practice it in a non-contrived, endlessly extensible manner.
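
To make that concrete, here's a toy Python sketch of the habit being
described, with the file name and the choice of problem invented for
illustration: a "big" problem (word frequencies in a file) becomes three
small functions, each reusable outside the original problem.

    # Decomposition in miniature: three small, independently reusable
    # pieces assembled into a larger solution.
    import re
    from collections import Counter

    def read_text(path):           # small problem 1: get the text
        with open(path) as f:
            return f.read()

    def words(text):               # small problem 2: split into words
        return re.findall(r"[a-z']+", text.lower())

    def most_common(items, n=10):  # small problem 3: rank any items
        return Counter(items).most_common(n)

    # The large solution is just the small ones composed:
    # print(most_common(words(read_text("some_file.txt"))))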

------
bsaul
As a side note, France happened to have exactly this kind of teaching program
in elementary schools in the 80s. We had a language called "LOGO" running on
Thomson MO5 machines. Instructions were "move the tortoise 20 front", then
"rotate tortoise 90°", etc., and we would draw pretty graphics on the screen
and be happy.
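
A rough modern equivalent, using Python's standard turtle module in place of
the MO5 (a sketch from memory, not the original programs):

    # Approximately what those LOGO turtle programs looked like, redone
    # with Python's built-in turtle module: walk and turn, get a square.
    import turtle

    t = turtle.Turtle()
    for _ in range(4):
        t.forward(20)   # "move the tortoise 20 front"
        t.right(90)     # "rotate tortoise 90 degrees"

    turtle.done()       # keep the drawing window open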

Or at least that's how I remember it. A friend of mine who's now a history
researcher told me recently how that experience put him off computers for 20
years.

~~~
wallflower
> A friend of mine who's now a history researcher told me how that experience
> put him off computers for 20 years.

Can you please elaborate? Was it the infantilism? Or something else? Does your
friend pursue coding now?

Logo and other "beginner" languages are more about introductions to using
computers to create, not about teaching programming.

~~~
xmonkee
It could be a couple of things:

1. It intimidated him. That happened to some friends of mine.

2. He realized history is better than writing procedural code. Should have
learnt Haskell.

~~~
quanticle
You do realize that LOGO is a Lisp variant, right? It's not exactly
procedural.

~~~
xmonkee
I had no idea, thanks. It never looked anything like a Lisp, though.

------
graycat
Easy enough, but the bottlenecks remain:

(1) What real world problem to solve.

(2) For a challenging problem, how the heck to solve it.

(3) How to cut through what is consistently, by far, the worst writing
anywhere in civilization -- documentation in computing.

E.g., I'm still mud wrestling trying to figure out why:

(A) In program A, serialize an instance of a class. The result of the
serialization is a byte array.

(B) To check, in program A, deserialize the byte array and observe that we do
get the original instance back.

(C) In program A, print out the byte array as hex. Transmit the byte array to
program B. In program B, print out the byte array in hex and observe that it
is just the same as it was in program A.

(D) Deserialize the byte array in program B and observe that the operation
ends with an exceptional condition.

Why? The documentation is clear as mud.
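
For what it's worth, one classic cause of exactly that pattern (round-trips
fine locally, dies on the receiving end) is that the serialized bytes name a
type the receiving program cannot resolve. A minimal sketch in Python, with
pickle standing in for whatever serializer programs A and B actually use:

    # prog_a.py -- serialize, round-trip locally, print hex. All fine.
    import pickle

    class Record:                        # defined only in program A
        def __init__(self, name):
            self.name = name

    data = pickle.dumps(Record("test"))  # (A) serialize to a byte array
    copy = pickle.loads(data)            # (B) round-trips fine here
    print(data.hex())                    # (C) identical hex in A and B

    # prog_b.py -- receives the very same bytes, yet (D):
    #     pickle.loads(data)
    # raises AttributeError: Can't get attribute 'Record' on
    # <module '__main__'>, because program B never defined or imported
    # the class. The bytes name the type; they don't carry its code.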

Just for my basic work, I have about two cubic feet of books, nearly all badly
written, and 5000+ Web pages of documentation illustrating many of the worst
mistakes in technical writing.

The bad documentation is by far the worst bottleneck.

The whole thing, the industry and the science, needs to take Technical Writing
101. First lesson: a word used with a meaning not in a standard dictionary is
a 'term', and never, but never ever, not even once, use a term without a prior
clear definition, with motivation and, ideally, examples. And for such
definitions, sure, use hyperlinks.

(4) Organization and management of computing projects involving teams larger
than, say, half a dozen people.

(5) Computer system construction, installation, configuration, backup and
recovery, monitoring, management, and administration.

(6) Security.

------
imjustsaying
About 10 years ago, I took the AP Computer Science AB class my sophomore year
of high school. It was all in Java.

Man, that was head-slammingly difficult for an intro course. The prereqs said
only a knowledge of algebra was required, but it took a lot of time to get
used to. The only kids who did well were the ones who had already been
programming for a couple of years.

I didn't even know how to use the command prompt in Windows. I remembered a
few things from back when I had to use MS-DOS to fire up games, but that was
about it.

That all being said, I sincerely believe everyone should learn how to program.
And I furthermore sincerely believe the public school system should stay as
far away as possible from trying to impose a one-size-fits-none model for
doing so.

------
johnnygleeson
I agree with the premise, but the article lost me at "that might someday lead
to.... instant riches."

------
mantrax5
Ultimately it all comes down to cybernetics: understanding how systems work,
how to change systems, and how to design new systems.

It's not about programming itself.

We were designing systems for millennia before computers existed. All the
basic algorithms we use in programming today were invented before computers.

Computers just happen to make systems a lot more reliable, performant and
scalable (if designed right), since computers have evolved to complement us
(i.e. we're bad at what computers are good at and vice versa).

I don't think programming is the new literacy; I think understanding the
common principles that tie system behavior together has always been the
highest form of skill a human can possess.

As for turning a system design into code, if you know how to design the system
on a higher level, you can always delegate the "boring" act of coding itself.

I'm a programmer, but I understand that the act of programming by itself, even
though I find it enjoyable, isn't that interesting on its own. It's only
interesting in terms of how the program I'm writing connects to a system where
the real world and real people are involved. All programs, without exception,
either interface directly with people or interface with other programs that
do.

Currently we conflate programming with the act of designing systems
(algorithms, design patterns, etc.), but I expect sooner or later system
design and programming will split into separate "trades". Whether it will be a
resurgence of cybernetics or a brand-new branch of science, I don't know.

But it'll happen. And it'll be good, because many programmers use computers as
their golden hammer of system design, and it's hardly the only component of a
good system.

~~~
graycat
You have unusually good insight.

> Not about programming itself.

Basically correct. So the 'computer science' work in grades 1-8 or so will be
of questionable value.

E.g., my wife had no background in computing at all. I taught her programming
at the level of if-then-else, do-while, allocate-free, call-return, try-catch,
plus quite a lot more, in about a week. Then I gave her a lecture on
'rule-based', 'expert system', 'artificial intelligence' (AI) programming, and
right away she wrote a nice first program. I gave her a second lecture, and
right away she wrote the best early AI program I and my research group ever
saw. No biggie. So 'computer science' in grades 1-8 looks like mostly a waste
of time.
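
For readers who haven't seen the style: here is a toy Python sketch of what
'rule-based' programming looks like -- a tiny forward-chaining engine with
made-up facts and rules, illustrative only and not the actual program
described:

    # A toy forward-chaining rule engine in the 'rule-based' /
    # 'expert system' style: a rule fires when all its preconditions
    # are known facts, adding its conclusion as a new fact.
    facts = {"has_fever", "has_rash"}
    rules = [
        ({"has_fever", "has_rash"}, "suspect_measles"),
        ({"suspect_measles"}, "recommend_doctor_visit"),
    ]

    changed = True
    while changed:              # keep firing until nothing new is derived
        changed = False
        for preconditions, conclusion in rules:
            if preconditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True

    print(facts)                # now includes both derived conclusions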

> All the basic algorithms we use in programming today were invented before
> computers.

Not really. E.g., there are books with collections of algorithms by, say,
Knuth, Sedgewick, and others, and more on, say, error-correcting codes and how
transactions work in relational databases, that really are "basic" but were
invented just for the world of computing and digital communications.

~~~
mantrax5
Error-correction codes and transaction models existed before computers did.
Of course, the schemes people settled on before computers were designed to fit
within the limits of what people could work with back then, with no computers
to do the quick math for them.

People used a simple form of error-detecting checksum when copying books (like
the Bible) by hand thousands of years ago.

Retransmission requests based on error detection like the above have been used
over various pre-electronic communication channels, like telegraphs; that is a
form of error-correction scheme.

Of course anything that requires a modern CPU to compute wouldn't have been
feasible, but the basics of it, the seed, were there.

Transaction-like protocols can be observed before computers, especially in
military procedures where "distributed coordination" and consistency of
command are crucial for carrying out a mission. Their channels of
communication were slow and unreliable.

I suppose I don't have to point out that the same military protocols often
required a log in which every formal exchange was written down in order, for
later review if needed. There are your modern-day database transaction logs.

I can also cite famous examples of encryption schemes going back to the Roman
Empire, before computers existed, but we all know those.

We assign names to various inventions and tend to think no one before had any
idea like that, but the truth is that good ideas keep getting "reinvented"
over and over; the only difference is the level of sophistication that
computers afford us in combining such concepts and building on them into more
complex schemes.

By the way, in ancient Egypt every person would be registered and written down
in a set of books, and the books would be split and sorted by a hash of the
name for easy lookup.

Yup. Ancient Egypt had a hash-based index for their database of people (and
the books are the hashmap buckets).
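
In modern terms the scheme looks something like this; the names, hash
function, and book count here are all invented for illustration:

    # A toy hash index in the spirit of "books as hashmap buckets".
    NUM_BOOKS = 4                      # each "book" is one bucket

    def book_for(name):
        """Crude hash: choose a book from the first letter of the name."""
        return ord(name[0].lower()) % NUM_BOOKS

    books = [[] for _ in range(NUM_BOOKS)]
    for person in ["Amenhotep", "Nefertari", "Ramses", "Tiye"]:
        books[book_for(person)].append(person)

    # A lookup only has to search one book, not all of them.
    target = "Ramses"
    print(target in books[book_for(target)])   # True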

~~~
graycat
> Of course anything that requires a modern CPU to compute wouldn't be
> feasible, but the basics of it, the seed, was.

I still believe that what Hamming did in coding theory and the Reed-Solomon
codes, etc., are 'basic' and new since computers.

But you are correct that transmission error detection and correction were old
needs with old solutions. E.g., there was parity with teletypewriters and old
paper tape for error detection, and then retransmission for error correction.
Yes, the improvements Hamming, etc., made for computers were not feasible
before computers, but I still believe that, while the problem was old and had
old solutions, the work of Hamming, etc., was new and 'basic'.
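
For concreteness, here is a toy Python sketch of the kind of thing Hamming's
codes do -- a Hamming(7,4) encoder where the receiver can locate and repair
any single flipped bit from the parity checks alone, with no retransmission:

    # Hamming(7,4): 4 data bits become a 7-bit codeword that can locate
    # and fix any single bit error. Toy version, for illustration only.

    def encode(d):                        # d = [d1, d2, d3, d4]
        p1 = d[0] ^ d[1] ^ d[3]
        p2 = d[0] ^ d[2] ^ d[3]
        p3 = d[1] ^ d[2] ^ d[3]
        return [p1, p2, d[0], p3, d[1], d[2], d[3]]   # positions 1..7

    def correct(c):                       # c: 7 bits, possibly corrupted
        s1 = c[0] ^ c[2] ^ c[4] ^ c[6]    # re-check positions 1,3,5,7
        s2 = c[1] ^ c[2] ^ c[5] ^ c[6]    # re-check positions 2,3,6,7
        s3 = c[3] ^ c[4] ^ c[5] ^ c[6]    # re-check positions 4,5,6,7
        pos = s1 + 2 * s2 + 4 * s3        # syndrome = error position
        if pos:
            c[pos - 1] ^= 1               # flip the bad bit back
        return c

    word = encode([1, 0, 1, 1])
    word[4] ^= 1                          # corrupt one bit "in transit"
    print(correct(word) == encode([1, 0, 1, 1]))   # True: repaired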

Yes, heap sort doesn't really work as a sorting technique before computers,
but trying to dream up heap sort is not easy -- just for the heck of it, I
once set aside reading about how heap sort worked and tried to dream up such a
thing myself for about two weeks, and couldn't do it. Heap sort's darned
clever.
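
(Once the heap idea exists, the sort itself is short -- a few lines of Python
using the standard library's binary heap:)

    # Heap sort via Python's built-in heap. The clever part is the heap
    # invariant itself: heapify/heappop keep the smallest remaining item
    # at the root cheaply (O(n) build, O(log n) per pop).
    import heapq

    def heap_sort(items):
        heap = list(items)
        heapq.heapify(heap)
        return [heapq.heappop(heap) for _ in range(len(heap))]

    print(heap_sort([5, 3, 8, 1, 9, 2]))   # [1, 2, 3, 5, 8, 9]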

~~~
mantrax5
I don't mean to dismiss the work of Hamming & co. as trivial.

Each one of us struggles to add something to the stream of human thought; it's
hard every time, and it's worth praise every time. But it's interesting to see
it all as part of a bigger picture, in which computers are just systems like
any other. They have certain properties emphasized and certain other
properties de-emphasized right now, but that's it.

I actually didn't go far enough in my error detection and correction examples;
I only went a few thousand years back. There's an even more ancient example,
going back billions of years: the process of DNA replication itself:

http://en.wikipedia.org/wiki/Proofreading_(Biology)

One shouldn't be surprised to discover that some cellular process might have
stumbled onto heap sorting, either.

System design is bigger and older than any of us. To think otherwise is a sign
that our culture is still too young to appreciate where we fit in the world.
In some ways, it seems we still think we're at the center of the Universe.

We're not inventing algorithms, just rediscovering them. This doesn't make any
of the achievements of our inventors any lesser, but it's good to keep the big
picture in mind. It helps us, among other things, to look in more places for
inspiration and knowledge, and in turn, create better systems.

