[dupe] Coding is not the new literacy (chris-granger.com)
229 points by wglass on March 2, 2015 | 147 comments



Previous discussion from a month ago: https://news.ycombinator.com/item?id=8947603


I think it's something a bit more basic - logic. Learning programming teaches logic, because the computer will find every logical shortcoming in what you tried to express.

Natural language is the bread and butter of our everyday existence, and it is a veritable Swiss cheese of logic. Most people won't hear your argument and notice the unfounded assumption that destroys the whole thing, but a compiler will. Most people won't notice if you leave some of what you say implicit, but your compiler will.

Something I've noticed over the years is that people, in general, are just not good at logic. Very few things teach it, and the few that explicitly cover it, like symbolic and formal logic classes, are arcane, scary corners of the educational system. Logic doesn't win arguments, it doesn't get you laid, it generally doesn't help you interact with other people. It does teach you to think, but like deodorant, nobody knows they need it until they already have it.

So, start small. Teach logic, teach people to think, teach them the very fundamental underpinnings of programming. Programming is then just another way for people to express what they already know, like learning a foreign language rather than learning language in the first place.

I think most people saying "teach them to program!", at least the ones I've seen calling out the "it teaches you a new way to think" part, actually mean logic in general.


People often confuse the mechanical learning of reading and writing with the more advanced learning of how to write properly and how to analyze texts. Learning to write properly, to lay out one's argument, or story, in a solid, comprehensible and fun-to-read way, is difficult. The learning process actually changes the student's thought process, making it more structured and forcing a third-person view of the text.

The same actually applies to programming. The mechanical task of programming isn't very difficult, and can be taught in a few dozen hours. The thought process that leads to solid programs, though, is akin to proper writing. It is hard, requires an even more structured approach to defining solutions, and actually changes one's approach to modelling reality in general.

If we believe that a formal view of the world is a necessary skill (which I do), then teaching programming is as useful as teaching good writing. Note that I don't mean the basic skill of writing code, but the skill that is similar to good writing. Distinguishing the two is necessary for discussion. We can call it programming and software engineering, or programming and Programming (with a capital P), or something else. The rote task and the modelling task lead to different discussion paths.


I actually disagree. People have a pretty good grasp of basic logic. The issue is that when we say "computers are logical", what we really mean is that you need to treat computers like someone who is missing important parts of their brain.

People are used to interacting with other entities that can perform advanced logical inferences, deductions, etc. in an intuitive way. They are at a loss when they're forced to dumb it down to "talk" to someone who needs to be told not to crash into the wall when being told to walk in a certain direction.

Worse still, when people do understand that computers are really dumb, but actually good at following orders when you treat them like a mentally challenged toddler, they get stuck again when even that fails; when they encounter application after application that does not operate in any logical sense, because they aren't programmed by other computers, but by lazy/dumb/stressed/ignorant programmers like me.

The best quote I read on this comes from this essay: http://stilldrinking.org/programming-sucks

"The only reason coders' computers work better than non-coders' computers is coders know computers are schizophrenic little children with auto-immune diseases and we don't beat them when they're bad."


This. The whole essay you linked is golden.

People have a good grasp of logic, but they totally underestimate the amount of computation our brains do in the background when communicating. A lot of information is contained in our shared cultural upbringing, education, social norms and internal brain mechanisms. Computers don't have any of that. Even as I write this comment, 99% of the information required to comprehend it is not in the text - it's in the shared context everyone has in their heads already.


This essay is priceless:

> Would you drive across this bridge? No. If it somehow got built, everybody involved would be executed. Yet some version of this dynamic wrote every single program you have ever used, banking software, websites, and a ubiquitously used program that was supposed to protect information on the internet but didn't.


If you're using logic that assumes the entity you're interacting with "can perform advanced logical inferences, deductions, etc." but don't understand how to dumb it down for a computer, does that not mean that somewhere you don't have a grasp of something? What is that something, is it logic? I'm playing devil's advocate more than anything but I think being able to tell a computer step by step what to do requires logic, albeit perhaps a specific type of logic.


It's logical sentiments that have faded into the unconscious. If the society you are within has the same elements in its unconscious, the information bandwidth needed to convey a message is reduced for all parties, with an unknown amount of subtle complexities (within messages) still successfully conveyed.

This idea (amongst many other gems) comes from a great short book I just finished on cybernetics (an old, old book): The Human Use of Human Beings (1950) by Norbert Wiener.

But I'm more on your side, only because I think we should be able to elaborate on the elements that have faded into the unconscious, since if we cannot, we will not be able to apply the subtle rules in novel ways.


> People have a pretty good grasp of basic logic.

I disagree. Consider that most people fail the Wason test, unless it is explained using a scenario involving cheating, in which case they pass.
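
For anyone unfamiliar with it, here's a minimal sketch in Python of the standard version (hypothetical card faces; the rule is "if a card shows a vowel on one side, it has an even number on the other"):

  def is_vowel(face):
      return face in 'AEIOU'

  def must_flip(face):
      # A card matters only if its hidden side could falsify the rule.
      if is_vowel(face):
          return True   # hidden side might be an odd number
      if face.isdigit() and int(face) % 2 == 1:
          return True   # hidden side might be a vowel
      return False      # 'K' or '4' can never falsify the rule

  print([f for f in ['A', 'K', '4', '7'] if must_flip(f)])  # ['A', '7']

Most people pick 'A' and '4'; the correct answer is 'A' and '7'.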


I had to look up what the "Wason-test" is[1] -- and I'm not convinced it proves people do not have a good grasp of basic logic. I think it shows that people don't see logic as a useful tool in determining the classification of a grand total of four real-world objects.

I think people are just better at optimizing their brain power than that: with just four cards, why not look at both sides of all of them? You examine twice as many cards, but you'll probably complete the task quicker than if you think carefully about the wording of the question.

[1] http://en.wikipedia.org/wiki/Wason_selection_task


Why teach these things with programming? Historically we, the West, have taught these things with philosophy. That's how we went from tribalism to democracy. Why not teach with philosophy?

I know that we live in a technical world, but no matter what, we live in a philosophical world. Everyone deals with philosophy (either reflectively or as it's forced upon them) daily. How we work with each other might be via a keyboard and mouse, but why we work with each other is philosophical. It would be better to teach various systems of thought, allow a dialectic to occur, and get a better, more informed citizen.


>Why not teach with philosophy?

That sounds great, but almost nobody teaches philosophy to children right now, and that does not seem likely to change in the near future. Our utilitarian ideas about education (you know, workforce training) make it easier to support the teaching of subjects that are more directly relatable to work.


Honest question: they don't teach philosophy in the U.S. in high school? Is that in the entire U.S. or does it vary by state or something? (Edit: I found that Wikipedia says "In the United States of America philosophy is not generally taught at pre-college level")

http://en.wikipedia.org/wiki/Philosophy_education

Philosophy IS taught in high school here in Uruguay (it's very watered down, but it does include logic in the curriculum).

Apparently it's also taught in Europe, Asia, most of Arabia, South America and even Africa, so I guess the U.S. is the exception.


>Honest question: they don't teach philosophy in the U.S. in high school? Is that in the entire U.S. or does it vary by state or something?

Honest answer: As far as I am aware philosophy is not part of the standard HS curriculum in the US. There are probably exceptions, but I am not personally aware of any.

I think it is an embarrassment, but they didn't ask me.


Because philosophy is too imprecise. When you dig down to the most fundamental concepts of CS you end up in what is usually considered the territory of philosophers, only with a better mental toolkit to explore it.

There is an interesting paper by Scott Aaronson about the intersection of those two disciplines and how the more CS-y approach can help solve philosophical problems:

http://www.scottaaronson.com/papers/philos.pdf


I guess there are a lot of STEM people that have never taken a course in first/second-order predicate calculus, which is usually lumped in under philosophy. The field of computer science builds on and is rather indebted to philosophers such as Wittgenstein and Gödel; their findings provide the theoretical background for all modern theoretical CS.

While what they teach at a university today for a philosophy degree may let you focus on things that look imprecise, I find your statement rather ironic. The difference between someone, especially coming out of school, who is well versed as a logician (hint: they may have a math degree, but it's still philosophy) and someone who has been exposed only to computer science often parallels what I have seen interviewing people with a physics background versus an engineering background: almost all of the time the engineer might be able to give you enough to solve many of the problems, and have a little background, but the physicist will typically have a far stronger "toolkit" to analyze the problem, especially when the problem is novel. So it goes with a CS grad versus a math grad, typically, around here at least, though YMMV.


Is math classified as philosophy due to some historical accident? Is that really relevant to the discussion at hand?

And wasn't Wittgenstein essentially out to destroy traditional philosophy?


Not an accident. Maths is philosophy in the same sense that physics is philosophy (in fact it was called "natural philosophy"): both are subsets of the general thing.

Philosophy is the "science of everything", and as such, all fields of (western, academic) knowledge are considered parts of it. Math in particular is formally equivalent to logic, which is the universal language in which philosophy is written.


We did logic in high school. ∃ and ∀ and predicate calculus were introduced when explaining differentiation and limits.
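
For reference, that's the epsilon-delta definition of a limit, which is exactly where those quantifiers earn their keep:

  \lim_{x \to a} f(x) = L \iff
    \forall \varepsilon > 0 \; \exists \delta > 0 \; \forall x \;
    (0 < |x - a| < \delta \implies |f(x) - L| < \varepsilon)

Unwinding the nesting of ∀ and ∃ here is already a non-trivial exercise in predicate logic.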


Have you taken a philosophy department logic course? I learned way more useful logical thinking in philosophy classes. My CS classes didn't even come close. Sure, I learned to make a mindless machine calculate the proper result, but that's far less important than reading the writing of another person and being able to understand their ideas.


I think it is interesting to note here that there are a number of people who are well regarded as computer programmers who have degrees in philosophy. I wish I could remember who they were at the moment.


I'm not well regarded, but my undergrad degree is in Philosophy :D


Same here, I think people have a warped idea of what philosophy is.


Then you have never taken a philosophy course in a western analytic department. The "Logic" that's used in computer science is a very small part of all possible logics. The discipline of logic is studied and expanded almost exclusively by philosophers. Computer Science Logic is to Philosophy what Physics is to Mathematics.


That's interesting. Is it because these other forms of logic cannot be implemented on a computer in a straightforward way, or because computer scientists are just ignorant of these other possible logics?


Well one feeds into the other. Computer scientists just have no real use for them. They can't be straightforwardly modelled on a computer nor are they straightforward models of computers.

That isn't to say there are no uses for them in CS or that CS ignores them all. But many systems were invented to deal with explicitly philosophical problems (e.g. modal logic for metaphysical necessity, paraconsistency for paradoxes, etc.).

In some very specialized areas non-standard logics do get used (theorem proving uses intuitionistic logic, AI/machine learning might use fuzzy logic, etc.) but it's still a limited subset overall.

No doubt, in heavily mathematical CS departments there might be courses on logic in a more complete sense.

CS researchers do develop logical systems of a kind: in particular, they have developed ways of writing logic systems programmatically (and usually end up having to ditch a few things and thus invent a new system).

However, if you bought any major books on logic (as a subject) or looked into any contemporary research in logic (as a subject, not "doing logic on an ARM", etc.) you'd find the authors were philosophers. Occasionally mathematicians, and especially mathematicians who then became philosophers.


There are a lot of logic systems out there, and we tend to use ones that happened to work out, and momentum builds. Of course, sometimes things are missed because our chosen tools are ill-suited to some problem. Those gaps are usually only closed by paradigm shifts.


You can also study there via a math department and have a more obviously rigorous path.

Edit: to be clear, I didn't mean to imply the philosophy dept isn't rigorous, but merely that nearly the same classes would be offered in each dept, and math, unlike philosophy, suffers from the perception of being too formal if anything.


I think it's a common misconception that philosophy is an imprecise discipline. I think what makes people think that is the fact that often philosophy deals with topics that have not yet been (or may never be) formalized.


There is philosophy and there is philosophy. What a lot of laypeople have experience with is mostly the type of philosophy that is just meandering with thoughts and overanalyzing the meaning of words to the point of breakage. I think this is the part that makes many people treat the word "philosophy" as synonymous with "nonsense".

I agree that philosophy often deals with topics that have not yet been properly formalized - after all, this is the discipline that's on the front lines of important problems, asking questions that later get answered with a more formal approach.


Logicians like Turing, Church, and Gödel gave us computers.

So I don't think there is any sharp distinction between programming and what logicians do. And I think it's clear that Turing (for example) was very excited about the possibilities of furthering his studies with actual physical computers he could program, given his work on Enigma.

So it may be programming is one of the best tools we have for teaching logic, and one could make a good argument many of the great logicians in history would agree.

(Note my emphasis on "logician", as there are many other kinds of philosophy for which computers aren't as obviously helpful, as they are for learning logic and logical thinking.)


Don't forget John von Neumann


I agree, this is actually more general than programming. Philosophy opens new doors for people, there's no reason to limit that to something that's still quite niche among the general population.


> Why teach these things with programming?

Because you can't bullshit a computer. It therefore enforces rigour.


Piggy-backing on that: "Troubleshooting" might as well just be called deductive reasoning IME.

If you approach an issue by verifying your assumptions and disregarding what you can prove isn't broken, the actual culprit tends to surface very quickly.


That's really an argument for teaching critical thinking, verbal reasoning, social presentation, and rhetoric, all of which are more useful in the real world than $programming_language, formal logic, or analytical modelling.

The humanities have been hacking human minds for millennia. They've become so good at it that it's possible to hack the beliefs of entire populations so subtly those populations don't notice they're being... modified.

This might be more of an issue than knowing what xor does, or how to search a binary tree.

>people, in general, are just not good at logic.

No, people aren't. That's because people's brains are story machines, selected by evolution to work on quick-and-plausible heuristic arguments based on narrative relationships, and not on deep research and formal abstraction.

I wouldn't be surprised if across the population as a whole, only some percentage - less than half, more than a third, maybe - is capable of non-trivial formal logic at all.

Most people seem far more swayed by, and interested in, simple stories with an emotional punchline. The idea that you even can do formal modelling of the future is simply unfamiliar to them.


> No, people aren't. That's because people's brains are story machines, selected by evolution to work on quick-and-plausible heuristic arguments based on narrative relationships, and not on deep research and formal abstraction.

That, and the fact that logic itself is not the right tool to work with messy reality - probability theory is. And heuristics are part of this, arising when probabilistic models meet limited computational resources and time constraints (you can't think too long about what that yellow striped thing in the bushes is, because you'll get eaten by it).


> The humanities have been hacking human minds for millennia. They've become so good at it that it's possible to hack the beliefs of entire populations so subtly those populations don't notice they're being... modified.

I had to chime in here and congratulate you on your fine explanation of the concept Plato deemed the Noble Lie. I don't want to assume you know of it, but I'll go ahead and do so anyway, since you seem the type.

For others, it's truly fascinating: https://en.wikipedia.org/wiki/Noble_lie

Humans so readily eat them for breakfast, lunch, dinner and in our dreams.

Not to say I am such a grand detective at the age of 23, but I feel as if building software has taught me (and will continue to teach me) many valuable logical lessons, which help me to methodically deconstruct things beyond programming (stories, my beliefs, other people's beliefs). This would make sense, since usually we build software by translating those things (stories and beliefs) into the fractal complexity necessary to achieve a real solution.

I can't think of any better substitute than teaching logic through programming, since I happen to agree with one of the parents that pure logic in philosophy is slightly harder to pin down concretely for younger minds. They are also less apt to pick up on and find existential value in philosophy.

In other words, I'm completely for programming literacy in middle and high school. Especially for its effectiveness with logic, but also because students at that age are just beginning to realize the complex programs in their very own minds (built by themselves... or by others). I'd want them (and everyone) to know how to deconstruct and find roots, reasons, and rationale.


A compiler can only catch syntax errors (i.e. grammar). But just as a well-written essay can be based on false premises, so a syntactically correct program can be full of bugs, particularly if the programmer had a faulty idea of what the problem was to begin with.
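
A minimal illustration in Python (a hypothetical average function): the interpreter parses it without complaint, yet it embodies a false premise no syntax check can catch:

  def average(numbers):
      total = 0
      for n in numbers:
          total += n
      return total / (len(numbers) - 1)  # bug: the premise about the divisor is wrong

  print(average([2, 4, 6]))  # prints 6.0; the correct mean is 4.0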

Pure logic can only work correctly with reasonable premises. But it can't provide us with those premises. We can only construct, refine, and share those through natural language.

To get back to the point of the OP, I think the underlying assumption that systematic modelling is the new literacy only makes sense if we assume the "old" literacy. In other words, the ability to model the real world only works if we can actually understand and communicate about that real world outside the models. Without the underlying meaning, the model is just a toy.


> Pure logic can only work correctly with reasonable premises. But it can't provide us with those premises. We can only construct, refine, and share those through natural language.

Have you ever looked into formal verification or proof assistants? There's a branch of math called "constructive mathematics" that utilizes them where formerly natural language would have been used, in an attempt to codify them and get productivity gains.


I personally don't mean just logic when advocating that programming "teaches you a new way to think". Sure, there's logic. There's a skill of very precise and clear thinking on multiple levels of abstraction simultaneously that you need to write code that works. But there is more!

There's that feeling and understanding of basic units of computation. And I mean computation in the general, mathematical sense. Then there's the ability to express your will in terms of those computation units. And understanding how computation transforms information on an elemental level.

This gives you new abilities - like, being able to automate a repetitive process. Or understanding how one thing can influence another via information. Or understanding what is feasible and what not based on computational complexity.
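
To make the first of those concrete, a sketch (hypothetical folder and naming scheme) of the kind of repetitive process a few lines can automate:

  import os

  # Rename every file in 'scans/' to scan_001.jpg, scan_002.jpg, ...
  # - a chore by hand, trivial once you can express the loop.
  for i, name in enumerate(sorted(os.listdir('scans')), start=1):
      os.rename(os.path.join('scans', name),
                os.path.join('scans', 'scan_%03d.jpg' % i))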

For instance, it's hard to explain to non-programmers (and even some programmers!) just how much you can infer from the information they leave on Facebook - because most people lack the necessary mental models to even talk about it. And having them is increasingly important in comprehending the technological world around us.


I think some people learn to think in a deeper, more abstract way, but many don't.

My team leader always describes database results like an imperative program: "if it has this, do this". I have to translate that into "return the set where this happens". We will debate database schemas, and I can see that one version is essentially a normalized version of the other option and that a join will transform it into the same table.
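
The gap between the two styles is easy to show in a sketch (hypothetical rows):

  rows = [{'name': 'a', 'active': True}, {'name': 'b', 'active': False}]

  # Imperative: "if it has this, do this"
  result = []
  for row in rows:
      if row['active']:
          result.append(row)

  # Set-based: "return the set where this happens"
  result = [row for row in rows if row['active']]
  # SQL analogue: SELECT * FROM rows WHERE active;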

I think there are plenty of programmers who reach a basic level of understanding and no more. The "one year's experience, ten times over" type of guy.


Indeed. A lot of programming jobs don't really require you to grasp deeper and more abstract ways of thinking - you can just glue together some StackOverflow answers with the latest JS framework and collect your paycheck.

Not that there's anything wrong with that (I have some objections, but that's a topic for another discussion) - but it shows that just because someone knows how to code doesn't mean he learns the important abstractions. As Bret Victor says, for "thinking the unthinkable", coding is a tool, not a goal.


This is very true. Someone who builds websites/CRUD screens and someone who programs for NASA might both call themselves programmers but they are very different jobs. I'm fairly good at the first one, but I probably wouldn't know where to begin for the second.


And even if you're working for NASA your job may not be inherently more difficult than CRUD, it's just different. Where I care about server latency, as a NASA engineer you'll care about real-time guarantees. Where I care about UX, you'll care about code correctness, etc.

Ability to comprehend abstractions is a bit orthogonal to all of this. It starts to shine when you need to architect a complex system or model some real-life phenomenon.


I always wonder why CRUD is so looked down upon. I do the database for a sequencing center (so essentially CRUD - though it's a moderately complex schema), and I work with a lot of bioinformaticians. Their code is often just counting stuff and/or applying some basic statistics. Usually the quality of the code is really bad (often the script will only be run on one or two data sets, so that's not such a problem).

Now compare that to a decent web app written to a good standard. There can be orders of magnitude more complexity in the second, even if it is "only CRUD". Getting HTML, CSS and JS working together, maybe contacting the database over REST, maybe just generating standard HTML. Maybe there is some SQL that needs to be optimized. URLs need to be set up. It needs to be deployed on a server somewhere.

I think I could do the bioinformatics job if I were given a decent description of the problem, but I would be surprised if any of them could do mine without a serious amount of learning.


I've seen enough programmers with poor logic to think it's not an ideal way to teach logic.


The piece argues that "[c]oding, like writing, is a mechanical act". It's certainly not the intent of the learn to code movement to teach typing skills! This statement seems to grossly misrepresent what people mean when they say "learn to code".

To learn to code is to learn to model systems. Programs are built from components - abstractions - and good programmers know a wide variety of abstractions and understand how to combine them to achieve the desired effect. A good program is one that accurately models the phenomenon of interest, whether that phenomenon is something as simple as "display the user's profile" or a complex simulation like a game.

Now I would agree that many of the current curriculums are terrible, and teach no great insight into modelling. But that is the fault of the curriculum designers - often people who have great experience in neither teaching nor programming. That doesn't mean there isn't good work being done, such as by the Program by Design team (http://www.programbydesign.org/).

In short, I think the blog post builds a straw man argument. I agree it absolutely belts the crap out of that straw man, but that is beside the point.


English major here. I'm currently learning how to code as part of a bootcamp called (surprise surprise) "Coding is the New Literacy".

One of the things this program hammers in early is fundamental programming principles like OOP. A lot of MOOCs skirt around these topics, forcing you to 'build' non-functional apps first. I've taken those courses and learned nothing of real value.

This program, where I've been forced to learn algorithms, OOP, etc. has been of much higher value. It's been harder, but I feel like I'm actually learning the fundamental principles of programming.

I'll echo your sentiments: to learn how to program is to learn how to model systems.


I am trying to read between the lines of your comment and I _think_ you get this but I'm going to spell it out anyway:

"...fundamental programming principles like OOP."

The key word here is "like". OOP is actually falling out of favor as a successful way to express models. Make sure you investigate alternatives to OOP (functional, procedural etc). As a graduate of OOP myself, it's interesting and exciting to approach different paradigms. Experience tells me that the best practices obsess over what things "do" rather than what things "are" (which is a fluffy concept at best).

Anyway, tit for tat, YMMV. Glad you're doing something though.


How do you figure OOP is falling out of favor? I've seen a few arguments on HN (the old OOP vs FP discussion), but do you actually expect the trend to continue to the point that no one really uses OOP anymore? Highly doubtful. And in which specific part of the industry is OOP slipping into obscurity?


It's just the HN hipster logic that confuses "successful" with "fashionable within a very tiny echo chamber".


It's not so much that OOP is falling out of favour, but that functional programming in particular is being rediscovered, probably because we have to multithread now, and if bugs are hard to keep out of single-threaded code, they're nigh on impossible to keep out of multi-threaded code.

In fairness, a lot of us never truly embraced OOP anyway. I've always avoided objects unless I'm managing state. Even in OOP languages, much of my code essentially repurposes objects as modules for storing functions. The fact that they are implemented as methods is incidental. This is not an uncommon pattern.


I'm stronger with OOP than I am with functional programming because I've been doing OOP for longer. I started out with procedural programming when I was a kid (BASIC and QBASIC), and have since moved on to PHP, Javascript, Python and a bunch of other stuff.

Almost everything I've worked on in the last 8 years has involved OOP in either PHP, Python or Perl. Perhaps I stumbled into a niche and have yet to find my way out, but OOP has been a common theme in my career.


No, you stumbled into the mainstream. The wholesale migration to OOP was complete by the end of the 90s.

But you've probably been doing functional programming within OO languages and not realising it. For example, most of the PHP standard lib is functional. In fact, PHP is really a multi-paradigm language. Python is much closer to a pure OO language.


No, I do realize that PHP is functional. I suppose I misstated my history a bit - I went from procedural to scripting languages, to functional, to mostly OOP in languages that support both functional and OOP styles.

I would say that Python isn't closer to a pure OO language than PHP, rather it allows for more pure OO than PHP. I only say this because you can go pure functional in Python as well.


It was popular in the 90s and early 00s to apply OOP techniques to everything, whether or not it made any sense. Today we use value types and higher-order functions all over the place, we customize objects with composition instead of inheritance, and we use module boundaries for encapsulation.

It doesn't help that it's easy to overdo your taxonomy, or try to model objects after the "real world" (rather than your domain model). Examples in some textbooks are pretty bad... if "dog" inherits from "animal" then you should throw the book out.
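
For anyone following along, a sketch of the composition alternative (hypothetical classes): the behaviour is supplied as a part, not inherited from a taxonomy:

  class Walks:
      def move(self):
          return 'walks'

  class Swims:
      def move(self):
          return 'swims'

  class Animal:
      def __init__(self, locomotion):
          self.locomotion = locomotion  # composed in, not inherited

      def move(self):
          return self.locomotion.move()

  print(Animal(Walks()).move())  # walks
  print(Animal(Swims()).move())  # swims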


I would rather throw the book out if they had "dog" inheriting from "vehicle." Really, metaphors and analogies are nice, one of the most powerful parts of our capacity for human language. Just because math lacks those concepts, and so the same isn't possible in a functional mindset, doesn't mean it is bad.

Some FP textbooks are pretty bad... if Fibonacci is implemented with just recursion, then you should throw the book out. </s>


My problem with "dog" and "animal" is not that I hate metaphors, but because the example tends to teach students that "X is a Y" is justification for "class X inherits from class Y". It's also usually found in books that focus more on the hammer than on the nails, if you know what I mean.

Recursive fib is usually taught to show how a naive translation of a recursive formula into code can cause poor performance. I can't think of a better example.
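
The contrast, as a quick sketch:

  from functools import lru_cache

  def fib_naive(n):
      # Direct translation of the recurrence: exponential time.
      return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

  @lru_cache(maxsize=None)
  def fib_memo(n):
      # Same recurrence plus a cache: linear time.
      return n if n < 2 else fib_memo(n - 1) + fib_memo(n - 2)

  print(fib_memo(90))  # instant; fib_naive(90) would effectively never finish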


Recursive fib is usually the first example in an FP textbook, and is used to teach how elegant FP is from the beginning, in spite of the fact that you should never write it that way (and who needs to write fib anyway?). Inheritance is subtyping at the basic level: we all know a dog is an animal, and that all animals share a common ancestor. It is a crude instrument, especially in Java, and often can't be used, but not because the concept of extension is flawed in some way. If you had traits/mixins, extension could be used much more widely.

I'm just as annoyed at the ideological "composition, not inheritance" crowd. The truth is, one has to learn a bunch of generalizations and then practice a lot to get it; good design is just hard to teach in a book.


FWIW, in my corner of the world (Scala, big data, web services) functional is very much the way things are going. It's a much better fit for the problem domain, since all these systems very much look like pipelines. E.g. a web service is a function from request to response. Big data applications are just applying some function over (possibly streaming) data.
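
That framing fits in a few lines (a sketch with made-up types, not any particular framework):

  from dataclasses import dataclass

  @dataclass
  class Request:
      path: str

  @dataclass
  class Response:
      status: int
      body: str

  def service(request: Request) -> Response:
      # The whole "server" is one function from request to response.
      if request.path == '/hello':
          return Response(200, 'hi')
      return Response(404, 'not found')

  print(service(Request('/hello')))  # Response(status=200, body='hi')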

It's happening on the front-end as well, with things like React.


Might be worth remembering that OOP was born out of a need to model the real world (SIMUlation LAnguage). I don't think it's a surprise that functions are a good way to work with mathematical models (that is, after all, how we do math).

For the front end, it's my understanding that event-passing systems are still used quite heavily -- which can also be considered OOP (message passing).

Modelling a car as a sub-class of vehicle etc -- makes most sense when you're modelling real-world objects.


You may be interested in the Actor Model [1] computation style. When used for programming, it looks like a programming paradigm where functional and object-oriented styles are neatly supported and blended without any impedance mismatches. Most code is written in a functional style, but there are abstractions that encapsulate state changes and make it work together with the mostly declarative functions.

There are at least a couple of Coursera lectures teaching this style (one about programming paradigms, the other about reactive programming), it's worth giving it a look.

[1] https://en.wikipedia.org/wiki/Actor_model
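
As a rough sketch of the idea (a toy actor built from Python threads and a queue; real actor systems like Erlang's or Akka do far more):

  import threading, queue

  class CounterActor:
      def __init__(self):
          self.mailbox = queue.Queue()
          self.count = 0  # private state; touched only by the actor's own thread
          threading.Thread(target=self._run, daemon=True).start()

      def _run(self):
          while True:
              msg, reply = self.mailbox.get()
              if msg == 'inc':
                  self.count += 1
              elif msg == 'get':
                  reply.put(self.count)

      def send(self, msg):
          reply = queue.Queue()
          self.mailbox.put((msg, reply))
          return reply

  c = CounterActor()
  c.send('inc'); c.send('inc')
  print(c.send('get').get())  # 2 - no locks in user code, just messages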


The actor model isn't a computation style itself so much as a concurrency discipline loosely resembling the style of message-passing OO (a la Smalltalk).

It appears that some people have become so zealously opposed to object orientation that they have been unable to realize that not everything is Java.


> OOP is actually falling out of favor as a successful way to express models. Make sure you investigate alternatives to OOP (functional, procedural etc). As a graduate of OOP myself, it's interesting and exciting to approach different paradigms. Experience tells me that the best practices obsess over what things "do" rather than what things "are" (which is a fluffy concept at best).

This is like saying:

"Noun-based writing is falling out of favor as a successful way to express stories. Experience tells me that the best practices obsess over what things "do" rather than what things "are" (which is a fluffy concept at best)."

If a writer didn't use verbs and nouns when appropriate, nobody would read their writing. You would consider such a writer to be almost illiterate.

Of course it's useful for a writer to experiment with focusing on verbs or nouns, and there are situations where one may be so much more useful that the writer should eschew the other entirely. But a dogmatic approach that tries to eliminate all of one or the other universally is bound to lead to bad writing.

And the same goes for coding. Focusing dogmatically on one methodology is only going to limit what you can do.


I don't know whether OOP is falling out of favor. For that, I would have to know the rules.

I am, as things stand, still in the process of learning the rules.

Once I've mastered them, perhaps I can think of breaking them.

Which, in the meantime, might mean learning things that have gone out of favor.


> One of the things this program hammers in early is fundamental programming principles like OOP

OOP is not a fundamental programming principle. It's a recently popular programming paradigm. The idea that OOP is a fundamental programming principle is a harmful concept to internalize (though lots of programmers trained in the last ~20-25 years have internalized it.)


OOP is a fine way to model many kinds of systems.

It's not the only way, and it's not always the best way. But I think you do a disservice to the English major you are replying to by suggesting he or she is learning something "harmful" to their future development.

I'm not saying your point is invalid or incorrect. I just think someone new to programming will be best served by learning how to write useful programs in one paradigm, before exploring multiple paradigms, or worrying about whether they are using the "right" one.

I would say the same thing to someone being taught the functional paradigm as a beginner. It will make more sense to compare with other paradigms in the future, once you have some mastery of one paradigm.


I didn't say (and don't believe) that learning OOP is harmful, I said internalizing the idea that OOP is a fundamental principle of programming is harmful.


What would you say are fundamental programming principles?

Genuine question.

Right now, for instance, I'm learning about different data structures. Recently, I learned about different sorting algorithms.

Would you say these are fundamental to understanding programming?


Principles would be stuff in the form of "you should do things this way to end up with a better product, regardless of paradigm". Off the top of my head, that would be things like:

- Modularity/Separation of concerns: The different functions a program needs to accomplish should be uncoupled as much as logically possible, to the point where you could easily swap out one with another way of accomplishing the same objective without affecting anything else.

- Don't repeat yourself: Any time you're writing nearly the same thing over and over, that whole pattern should be abstracted away (see the sketch after this list).

- "I've only proven it correct, not run it.": You should be able to test the software in an environment that shows you what it would do, without affecting the thing you really want to operate on.

- Write in a way that other humans can easily understand what you're doing (e.g. name variables after what they're being used for, write comments that help the code make sense and convey any tradeoffs).
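
To make the "don't repeat yourself" point concrete, a small sketch (hypothetical report fields):

  record = {'name': 'Ada', 'role': 'engineer', 'city': 'London'}

  # Repetitive:
  print('Name: ' + record['name'])
  print('Role: ' + record['role'])
  print('City: ' + record['city'])

  # The pattern abstracted away:
  for label, key in [('Name', 'name'), ('Role', 'role'), ('City', 'city')]:
      print('%s: %s' % (label, record[key]))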


> fundamental programming principles like OOP

Learn you a Haskell for great good instead.

Functions and types are fundamental. OOP not so much. Computation is fundamental, state not so much. Higher-order functions are not quite fundamental (depending on your point of view) but pretty close.


  This means we needn't fully specify a system to get
  started, we can simply craft new pieces as we go. We end 
  up exploring the system as we create it and don't have to 
  get a "complete" model to gain value.
This is an essential part of the article, and why the author has a valid point.

The problem with most current approaches to programming is that they don't provide good support for this partial, incremental way to build models - either your abstractions are fully developed, or they don't work at all and the program crashes during execution; there is no middle ground.

Professional developers eventually learn how to build systems incrementally through testing (by building placeholders for yet-to-be-built components) and debugging (by exploring how the program behaves under different inputs), but these tools are complex and not really part of the programming language, and thus not adequate for basic literacy.

Excel and Hypercard escape this requirement for perfect modelling by allowing partial execution of parts of the program and intermixing data with the code that works on it, lowering the abstraction barrier; thus these tools work better for exploratory modelling than conventional programming languages, which keep data and commands wholly separate and require an exact specification to work.

They have shortcomings, but the article is right that a tool for the wide public needs to reduce the level of formalism required to build models, and that "the computer revolution" based on that "lightweight modelling" approach has yet to come. There is an emerging movement for "live programming" that has created environments closer to that approach. This is still in its early stages, though we think we're finally close to achieving that vision.

*Edit: format


- Coding is not the new literacy.

- Neither is modeling.

- Neither is externalizing mental models or whatever.

It's not humanity's obligation to become more and more creative and resourceful when less creative activities become less and less valuable and fail to help earn a living.

The author of this article claims that people are missing the point. He's right. But he's missing the point as well, or at least arriving at a very wrong conclusion.

We're at the cusp of a very, very strange era. This is the era in which humans, and the capacity of human intellect, are approaching obsolescence. And the solution is not to train everyone to become like Einstein or, in this case, Knuth, or Steele, or Torvalds, or whatever.

"There can only be so many people in the entertainment industry" as Aubrey de Grey puts it.

Similarly, there can only be so many people in the "coding industry".

These people are missing a key property of coding: coding eats itself! Or as Andreessen puts it, "software is eating the world". Meaning, coding, by virtue of happening, ends up automating aspects of itself, to the point where a small kernel of information processing is able to carry out vast amounts of informational activity, as opposed to requiring a vast number of coders, as those not familiar with coding tend to believe.

And this trend of automation will keep on increasing, giving rise to more and more serious technological unemployment, all the way to the logical conclusion of technological singularity. The sooner the "people in charge" realize this bitter truth, the better it would be for everyone.


You admit that computers and software are eating the world, but come to the conclusion that it's useless to teach people to speak to them at a basic level.

I'm not sure I follow your argument.

We don't teach literacy because we expect people to be professional writers, but rather because the written word is a basic means of communicating with each other and with the world around us - we're having this very discussion through the power of literacy, even.

In what way do you think people advocating teaching programming mean it in any other sense?

When I advocate teaching programming to people, I don't envision a world full of software developers, I imagine a world full of slightly more informed software users. People who can interact with the Googles of the world by writing a sentence (or even a Boolean expression!) using correct logic, or who can write a small Javascript snippet to automate a simple task like highlighting certain things on a webpage, or who can write a simple Python script to get a rough approximation of how some numbers work.
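
For instance, a sketch of the kind of "rough approximation" script I mean (made-up question and numbers):

  # How many years until savings double at 3% interest?
  balance, years = 1000.0, 0
  while balance < 2000.0:
      balance *= 1.03
      years += 1
  print(years)  # 24 - a rough answer, no spreadsheet or formula needed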

Further, having even a tiny basis in computers - like the level of mathematics we teach in high school (on average) - gives me a basis as a professional to explain how things like Facebook work in a way that I can't now. It's much the same way that, once people know what a variable and a function are, I can explain more advanced math concepts using those ideas - in a way that I couldn't if I had to start at the very beginning with "this is a number".

So again, I'm not sure I follow your argument: if software is eating the world, why wouldn't we teach people to speak basic software in addition to basic spoken language?


You seem to be coming from the opposing view of that of the article, that coding 'is' the new literacy. I was more challenging the article's claim that modeling is the new literacy, which I think isn't.

As to whether coding is the new literacy or not, I'd say we really need to ask what is the purpose of education. If the purpose of education is enlightenment, then yes, a child should learn how to code, but also how to paint, how to play a musical instrument, and so on and so forth. But then some children have the aptitude for one thing, some for another thing.

But I think the purpose of education as a means of making the individual economically viable is becoming more and more futile with increased automation. (In this regard I strongly adhere to the views expressed in the video 'Humans Need Not Apply' by CGPGrey[1], and TED talks by Jeremy Howard[2], Erik Brynjolfsson[3], Andrew McAfee[4]).

As to the need for coding so that humans could interact with machines, I think in the long run, this is also going to progress to the point where the machine would be aware of the abilities of the human it's dealing with and make it easy for the human to interact with itself. (EDIT 1: I should also add that interacting with machines is already happening at a massive scale, now that everyone has a smartphone in their back pocket. And yet when the software goes wrong, the inability of an average user to fix it is not the result of that user not being coding-literate, but rather of the insurmountable complexity of the code behind the easy-to-use interface, even if we ignore the hidden nature of proprietary code).

[1] https://www.youtube.com/watch?v=7Pq-S557XQU

[2] https://www.youtube.com/watch?v=t4kyRyKyOpo

[3] https://www.youtube.com/watch?v=sod-eJBf9Y0

[4] https://www.youtube.com/watch?v=cXQrbxD9_Ng


> As to the need for coding so that humans could interact with machines, I think in the long run, this is also going to progress to the point where the machine would be aware of the abilities of the human it's dealing with and make it easy for the human to interact with itself.

And when is that long run going to play out, in the next 50, 100, 500 years? A lot of the learn to code movement is about economic empowerment, providing an avenue for economic advancement for children and adults, especially children and adults that wouldn't have thought coding as a career was possible for them or have had the access to resources to really pursue it.

Yes, automation is eliminating jobs, but it's still creating new and well-paying technical ones. Whether and when we'll reach some technological singularity is all based on a lot of conjecture, and how that will ultimately pan out with how we structure our economy is impossible to predict. In the meantime we have the current reality, one that most likely will result in tech providing plenty of well-paying jobs for at least a few more generations. The idea of coding as literacy, as has been stated before, isn't that everyone needs to become a software engineer. It's that as a person coming from any background you at least get the opportunity to see if it's a career you would want to pursue, and if not, it at least gives you more insight into how technology works and makes you a more well-rounded citizen.


"modeling" is different.) If we think about oversimplified, or too abstract, disconnected form reality models and especially those probabilistic models, when they are trying to infer the next immediate outcome based on the probability distributions of from the past (which is completely wrong - all you could infer form probability distributions is what could happen on average), then, it is a total disaster. Look at these loses in finance and profound failures in humanities.

Models have to be "just right", like every molecular arrangement in Nature.

But the skill of "extracting correct models from some aspects of reality" is really important.


When you get to the end of the article, it turns out that they're building the next generation of Visual Basic.

That's not a fundamentally bad idea. It's also interesting that they focus on modeling and simulation. That's what Alan Kay thought personal computers were going to be for. But all we got is the VisiCalc/Lotus/Excel world. That's useful, but limited.

The problem with visual programming languages has usually been that they don't scale. You end up with a huge collection of lines and boxes. (For a good example, find a game implemented with Blender's game engine and open up the wiring diagram.) They have a chance of solving that problem, because program interconnection has made some progress since the days of Hypercard and UNIX pipes.

There are a lot of line-and-box programming systems in domain-specific areas. LabView, for industrial and research control systems, is one of the most successful. But no one has yet solved the scaling problem. On larger problems, things just get too tangled.


>>That's what Alan Kay thought personal computers were going to be for.

This prediction did work out in one area in which computing has always been significant: music-making. If you consider that your average synthesizer is an analog computer with a process pipeline, modern software has definitely fulfilled the modelling/simulation goals set by the pioneers in the early years. The average electronic musician using a computer to compute sound is well and truly driving a computing process with modelling techniques - there is a very wide variety in the interfaces used to produce this result, it is true - nevertheless, in the synthesizer world these concepts have lived on and matured. Play with puredata (http://puredata.info/) or any one of the thousands of other sound modelling/synthesis applications out there and you'll see this reality.

The challenge then becomes to figure out how to turn your average business process into a synthesis, of sorts, which can be modelled in an analogous fashion...


The problem is, the hard part is not coding; the hard part is transforming requirements and workflows into a logical structure a computer can understand.

That's the bottleneck, whatever the language.

The new literacy would be not just seeing a Lego construction and understanding the pieces involved, but seeing the final construction and being able to recover the build instructions by sheer mental gymnastics.


> the hard part is transforming requirements and workflows into a logical structure

That's like saying that for painting a picture, the hard part is knowing where to put the paint; the act of slapping paint on canvas is itself trivial.

I think the real problem is that there haven't been any proper primitives created - primitives that can be linked together to produce what looks like another primitive. I guess the non-physicalness isn't helping, because Lego works really well as a building primitive.


> primitives that can be linked together to produce what looks like another primitive

This was the design intent behind UNIX pipes; text processing through composable programs.

I'd go in the other direction and say the problem is too many primitives; it's easy to spend all your time just evaluating possible solution components.


Sometimes when I've had a module of about 500-2000 lines, I have thought that much of it could be represented pictorially in a way that could also be understood by a machine.


I hope they get the next generation of Smalltalk/Lisp Machines.

We need to have coding systems like Wolfram|Alpha being mainstream, not Visual Basic.


It's just his business: he wants to keep making IDEs with a visual aspect to them, so he has to produce posts like this to justify it to the investors. It would be great if he finished the previous one before starting the new one, though. Finishing is a skill mankind would benefit from even more: http://makegames.tumblr.com/post/1136623767/finishing-a-game


His last project was LightTable, and it provided huge value to people. Can I ask if you ever even used it?


Oh, this thread again.)

There is almost no relation between knowing a language (vocabulary and grammar) and the ability to write, say, 1984 or Zen and the Art of Motorcycle Maintenance. It requires a different set of skills (explained, btw, in Pirsig's book).

The "new literacy" is, basically, a "model extraction" - ability to understand the principles form the emerged phenomena on the go. It is about learning new skills on the go - you need to perform some data analysis - well, just pick up the required skills (it is not that hard, by the way).

Again, coding is not programming. These two are different skills. Programming is an engineering discipline. Coding is a translation skill. To be a translator one has to know the languages, but that is not nearly enough to be a writer (an inventor). A typist who is literate enough to make a carbon copy of someone else's fine poetry is usually incapable of writing her own (and neither, usually, are the people who study literature).

To put it another way - "to know C" is not enough to write nginx or redis. It is not about languages (well, almost - a language that fits the task best is a requirement), the way a "good idea" could be expressed in any language.


+1 for reading Zen and the Art of Motorcycle Maintenance on this topic. He delves into great depth on the intersection of technology and humanity and the way people look at both. It is incredibly thought provoking.


I agree wholeheartedly.

We have some QA folks at work. Our best is a former database programmer from way back in the day, and she has a REALLY good "systems way of thinking" and a mental model of how things work and are united conceptually. This ability, her "systems way of thinking", sets her far and away above every other QA person I've ever worked with.

Being able to come up with a mental model of a system and encode it, either in code or as a document or description is truly valuable in today's increasingly complex and nuanced world.


I have to partially disagree with Chris here on two aspects.

First of all, programming indeed seems to be a fundamental thing, on the level of reading, writing and arithmetic. It gives you new classes of mental tools that allow you to think thoughts previously unthinkable.

People sometimes say it's very arrogant of some programmers to think they have something important to say in the fields of biology, physics, philosophy, etc. But it's not arrogance. There are important insights in CS shining light on every other branch of science, and they are often noticed by people from programming and CS fields because no one else is trained to think in this fundamentally new way.

But I strongly agree with Chris on one thing here - this is not what you get in those "Teach Children to Code" programs. Those will teach you how to become a frontend developer in a year and earn good money, which honestly doesn't require you to think at all. It's a trained-monkey level thing. They took what was supposed to be an idea about education and turned it into something that's about money and career.

I think Bret Victor did the best summary of why current "teach kids to code" movement is just total bullshit.

http://worrydream.com/MeanwhileAtCodeOrg/

Second - modeling. I agree with the sentiment that modeling is (one of) the fundamental thing(s). But it can't just be any modeling. Because typical "modeling" of today is that diagram your chief marketer drew on a whiteboard that conveys exactly zero knowledge (or even negative knowledge, if you count the confusion it caused). Typical "modeling" gets you UML diagrams, which we all know are most useful as padding at the bottom of the trash can, so there's something to absorb coffee from the cup that you have just thrown away.

The modeling that programming is about is careful and very precise expression of concepts. It's like that saying goes - "if you want to really understand something, teach it to someone; but if you want to really, really comprehend it, teach a computer to do it". Computers are unforgiving - either you exactly understand what you're doing and can express it precisely, or it doesn't work. There's no room for error or hand-waving around difficult concepts that people do when explaining things to one another.

So modeling may be the new literacy, but this must be careful and precise modeling.


> The modeling that programming is about is careful and very precise expression of concepts. It's like that saying goes - "if you want to really understand something, teach it to someone; but if you want to really, really comprehend it, teach a computer to do it".

This is all well and true for scientific modelling, where you want to achieve an exact depiction of how nature works. But for more social undertakings I believe that level of exactness is overkill and brings unnecessary complexity.

After all, society members have been able to collaborate on numerous projects that don't need that level of accuracy, well before (and after) the widespread adoption of scientific reasoning. Most social activities simply don't need such precision and can advance with an amount of ambiguity and implicit knowledge on which a computer would choke.

In such environments, people could still benefit from modelling through computers in a way focused not on precision, but on the semi-automation of routine tasks.

Where computer programs require that every possible case be taken into account in the model and handled with precision, a system that handled the common cases well and delegated outliers and exceptions to human supervision could greatly improve those social enterprises, without requiring that those involved learn to express their thoughts with scientific meticulousness.

Alas, modern programming languages are not adequate to support that semi-structured way of modelling, but I think we are closer than ever to building a tool that allows end users to build domain-specific abstractions without learning to create imperative programs or how computers work internally - the ideas to make this happen have been floating around, with growing strength, for some time.


> This is all well and good for scientific modelling, where you want to achieve an exact depiction of how nature works. But for more social undertakings, I believe that level of exactness is overkill and brings unnecessary complexity. (...) Most social activities simply don't need such precision and can advance with an amount of ambiguity and implicit knowledge on which a computer would choke.

This reminds me of a common misconception: that computers are about precision as opposed to ambiguity. It's not true - computers handle ambiguity well, but force you to be precise about what is ambiguous and in what way ;). We've invented probability theory to deal with uncertainty in a proper manner.

Of course, most social activities work fine with handwaving away anything we're not sure of or don't understand - but humans do have this skill and it's not going away. What programming does is complement that skill with an ability to be precise about uncertainty when you need to.
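(Not from the parent, just an illustration: a tiny Python sketch of what "being precise about uncertainty" can look like - instead of hand-waving "the meeting runs about an hour", you state the ambiguity as an explicit distribution and compute with it. The numbers are made up.)

  # Toy model: the ambiguity is stated precisely as a distribution.
  import random

  duration_dist = {45: 0.2, 60: 0.5, 75: 0.2, 90: 0.1}  # minutes -> probability

  def expected_duration(dist):
      # One precise summary of an imprecise quantity.
      return sum(minutes * p for minutes, p in dist.items())

  def sample_duration(dist):
      # Or simulate the ambiguity instead of ignoring it.
      return random.choices(list(dist), weights=list(dist.values()))[0]

  print(expected_duration(duration_dist))  # 63.0
  print(sample_duration(duration_dist))    # e.g. 60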

> Alas, modern programming languages are not adequate to support that semi-structured way of modelling, but I think we are closer than ever to building a tool that allows end users to build domain-specific abstractions without learning to create imperative programs or how computers work internally - the ideas to make this happen have been floating around, with growing strength, for some time.

I agree, though I don't think we can get there without teaching people the essence of programming. Syntax peculiarities don't really matter, but the basic ideas are so fundamental, they're probably more fundamental to reality than physics. I believe you need to have an intuitive grasp on computation to be effective even with that semi-structured modelling - because not everything we know from today's programming is working around machine-imposed constraints; some of it is actually touching the very fabric of reality.


  I agree, though I don't think we can get there without 
  teaching people the essence of programming. 
That's probably true. Though what "the essence of programming" is, IMHO, is quite different from what most developers think of when they talk about "teaching programming" to the general public; it should have little to do with how to represent computations on available computer hardware, and everything to do with building domain-specific languages that represent interesting concepts - for people who neither like computers nor want to understand how they're built.

As others have pointed out in this thread, what is truly essential is the nature of modelling, i.e. building abstractions from constituent parts, either in bottom-up or top-down fashion. And as the original article says, an essential property is being able to build working parts of the system without needing to complete the whole of it before execution.

  I believe you need to have an intuitive grasp on
  computation to be effective even with that 
  semi-structured modelling
True, but again, what developers usually think of as computation (building a tree-like program that is executed at runtime with a specific dataset) is not what the general public needs.

Research in End-User Development - watching real people use experimental IDEs for non-programmers - has found that systems based on rewrite rules are easy to grasp.
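(An illustrative sketch of my own, not from any particular EUD study: a rewrite-rule system can be just a list of pattern/replacement pairs applied until nothing changes - here in Python, making change with coins.)

  # A rule is (pattern, replacement); apply rules until the text is stable.
  # "p" = penny, "n" = nickel, "d" = dime.
  RULES = [
      ("ppppp", "n"),  # five pennies become a nickel
      ("nn", "d"),     # two nickels become a dime
  ]

  def rewrite(text, rules):
      changed = True
      while changed:
          changed = False
          for pattern, replacement in rules:
              if pattern in text:
                  text = text.replace(pattern, replacement)
                  changed = True
      return text

  print(rewrite("p" * 12, RULES))  # -> "dpp", i.e. 10 + 1 + 1 = 12 cents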

Also, image-based environments like Smalltalk, and "live" programming - where the whole system is seen as a single collection of data and always-running code that reacts to changes instantly - are simpler to understand than the standard write-compile-execute cycle, and better adapted to exploratory programming.


I agree with everything you wrote. Those essential skills have little to do with what we as programmers do every day at work. People also seem to rediscover them in many different ways - look at the things that are made in Minecraft or Dwarf Fortress, for example. There's a lot of application of those concepts in things that don't involve writing code directly.

> Also, image-based environments like Smalltalk, and "live" programming - where the whole system is seen as a single collection of data and always-running code that reacts to changes instantly - are simpler to understand than the standard write-compile-execute cycle, and better adapted to exploratory programming.

As someone in love with Lisp, I endorse this message.


> simpler to understand than the standard write-compile-execute cycle

Well, you'll hear different things about that. Some say the exact opposite, since understanding the program then depends on the large amount of state of a running system. In contrast, the write-compile-execute cycle is more controlled, since the state building is controlled as part of starting the program.


> understanding the program then depends on the large amount of state of a running system

Fair enough. But a system based on a declarative functional language should largely avoid that problem, as all state has to be made explicit, and all interactions with state are represented locally.
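(A minimal sketch of that idea - mine, not the parent's, and in Python rather than a functional language: every function takes the old state and returns a new one, so there is no hidden mutable state to reconstruct.)

  # Explicit state: no hidden globals; each step maps old state to new state.
  def deposit(state, amount):
      return {**state, "balance": state["balance"] + amount}

  def withdraw(state, amount):
      if amount > state["balance"]:
          raise ValueError("insufficient funds")
      return {**state, "balance": state["balance"] - amount}

  s0 = {"owner": "alice", "balance": 100}
  s1 = deposit(s0, 50)
  s2 = withdraw(s1, 30)
  print(s0["balance"], s2["balance"])  # 100 120 - the old state stays inspectable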


But that's why we have Agile and NoSQL. Modelling was just slowing us down. :)


> But that's why we have Agile and NoSQL. Thinking was just slowing us down. :)

FTFY ;).


Non-trivial front-end stuff does involve systems modeling in the sense mentioned in the article. Just saying ;)


Of course it does, as does non-trivial development of anything. It's just not what people are being taught, and more importantly - they're being taught things for the wrong reasons. "Teach kids to code" was supposed to be about making them smarter, allowing them to think in ways they previously couldn't. Not about preparing them for a lucrative career.


You know, basic literacy instruction also does not teach kids to think in any way. Nor does it prepare them for a lucrative career.

If your criticism is that those programs do not go far enough, I can agree with that. But they have to start somewhere, and that somewhere must be easy and basic.


Fair enough. My criticism is that they focus on the wrong thing. They turn into vocational training. It's a different goal, even if the beginnings look the same. We don't need everyone to be a developer - but we could use everyone understanding concepts related to computation.


I think coding is the last chapter of a computer literacy education.

Kids should learn to:

- type with a keyboard
- understand how a computer works: CPU, RAM, disks, etc.
- understand software, OSes, security
- understand the internet, HTML, SSL certificates
- use office/office-style software
- pick up basic programming skills in an easy language

Computers are here to stay and will be an everyday technology like cars and electricity. Kids learn thermodynamics and electricity at school not because they will all become physicists or engineers, but because they need a basic understanding of what they will be exposed to their whole lives.


"No... but it might be the new numeracy."

Especially with all the tools people have to do raw calculations for them (hardware or software calculators, spreadsheets, equation solvers), the benefit of coding for people who won't continue it professionally is similar to that of math education.

In both cases, the takeaway involves establishing certain critical-thinking skills: creating mental models of a real or desired thing, breaking that model down into solvable chunks, recognizing chunks you already know how to solve, and building upwards again.


Indeed. You don't learn arithmetic to do the work calculators are better suited for; you learn it to understand the concept of quantities and their behaviour. You don't learn calculus to solve integrals by hand; you do it to get a feel for how dynamic and interdependent systems behave. And similarly, the takeaway from general programming education should be additional mental tools that enable you to think new thoughts and understand all kinds of systems better.


I think computer science has more to do with engineering than science. Like learning Latin, learning computer science has some side benefits in terms of learning rigor, logic, etc. But that should not be the primary motivation.


> There are a number of tools that already help us do this - from Matlab to Quartz Composer - but Excel is unquestionably the king. Through Excel we can model any system that we can represent as numbers on a grid, which it turns out, is a lot of them. We have modeled everything from entire businesses to markets to family vacations. Millions of people are able to use spreadsheets to model aspects of their lives and it could be argued that, outside of the Internet, it's the single most important tool available to us on a computer. It gains this power by providing a simple and intuitive set of tools for shaping just one material: a grid of numbers. If we want to work with more than that, however, we have to code.

The idea that the entire average-joe population needs more than Excel has never sounded quite right to me. Anything closer to "real" programming and I think we're asking people to devote too much of their time to something that looks like software development. I just honestly think not everyone has the demeanor to enjoy programming. It's a very introverted, abstract, high-focus, high-patience endeavour.


Indeed, Excel is close to perfect for the purpose of building data abstractions in an intuitive way. The only part I find lacking is building behavior abstractions, which would allow users to build semi-automated workflows by changing views of data on the fly; to change presentation, Excel still requires you to change languages and resort to imperative programming in Visual Basic, much like HyperCard.

A tool that combined Excel's functional data-flow programming with HyperCard's representation of state (as a stack of panels that change depending on the execution of small attached functions) would IMHO provide a general programming tool for end users.


Excel makes it easy to model in an intuitive way, but the models that come out of it end up being very poor, as it is basically a two-dimensional grid. The first two dimensions you are modelling will come out well, but once you need to start adding extra information, you end up with color coding, new fonts, and all sorts of hacks to overcome the limitations of Excel, when in reality you ought to be using a proper database by that stage.


That's right, and this is why a better tool is needed. In the end Excel can't support arbitrary levels of abstraction.

IMO this is not because the base model is a grid (after all, "proper databases" also follow that table model) but because you can't create new tables on the fly and treat them as new objects independent of their structure. Once you've built a data type as fields laid out as columns, all the operations on that data (mainly filters and search) need to be performed by hand.

You can't properly express in Excel something like "there are three collections of employees, and I can create a '' function that works on any of the three". If you have separate tables with the same structure, you need to replicate the same operations on each independent table; the only way to build reusable code is with Visual Basic macros, which fall outside the spreadsheet's declarative model.

But if you could create new functions with Excel formulas (for things like style handling or business processes) and apply them to any arbitrary table with the adequate structure, I think you could have robust abstractions within the spreadsheet context itself - at a theoretical level, the storage model of spreadsheets is no different from that of relational databases.
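(To make the wished-for feature concrete - my own sketch with made-up tables, in Python since spreadsheet formulas can't express it: one function, defined once, applied to any table with the right structure.)

  # Three separate "tables" with the same column structure...
  sales   = [{"name": "Ann", "salary": 50000}, {"name": "Bob", "salary": 45000}]
  support = [{"name": "Eve", "salary": 40000}]
  rnd     = [{"name": "Dan", "salary": 60000}, {"name": "Fay", "salary": 55000}]

  # ...and one function that works on any of them.
  def total_payroll(table):
      return sum(row["salary"] for row in table)

  for table in (sales, support, rnd):
      print(total_payroll(table))  # 95000, 40000, 115000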


I cannot agree more.

Apple's stab at Excel with Numbers is the way Excel should always have been. A blank sheet to which you can add tables or charts, not a single grid. You only need to add an index by which you can query these tables (instead of VLOOKUPs), and you have something as simple and flexible as a spreadsheet and as powerful as a database.

Microsoft seems keen to observe and take inspiration from what is done outside, but somehow the Office team seems to be an exception. Apart from reshuffling the buttons, I can hardly tell the difference between Excel 2000 and Excel 2013.


To be fair, Excel's internals must be too brittle to allow for more extreme redesigns; they've been piling up layer upon layer of technological changes for three decades, and they need to keep it all working without crashing too much.

I don't know Apple's Numbers, but Microsoft has built the PowerPivot extension, which largely works as a modern, parallel version of Excel embedded within the same window. If you need to work in Excel with large datasets and/or complex relations, it may pay off to learn this extension.


The Visual Studio team overcame a similar problem with a complete rewrite. I think the problem is more about breaking compatibility; many add-ins rely on a very old COM interface. But I think people wouldn't mind a breaking change if it came with really new and powerful capabilities. In fact, it would provide an incentive to upgrade. And it could run side by side with old-style spreadsheets that preserve compatibility.

The problem is not so much large datasets, for which there are better tools than Excel. It has more to do with doing conceptually more advanced things.

For instance, what about adding a table to a sheet (Apple Numbers style), specifying a name, some input cells, and one output cell or range, and then being able to use that table as a function anywhere else in the workbook? I can already solve these types of problems with a UDF, but think about users who do not know how to code. Also, having the logic of a function in Excel formula format makes it easily auditable by a non-programmer.
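(Roughly what that would mean, as a hypothetical sketch in Python - a named block with declared input cells and one output cell, callable anywhere. The loan-payment formula is just an example of mine, not anything from Excel itself.)

  # "Table as a function": declared input cells, one output cell.
  def make_table_function(formula, input_names):
      def apply(*values):
          cells = dict(zip(input_names, values))  # fill the input cells
          return formula(cells)                   # read the output cell
      return apply

  # Hypothetical loan-payment table with inputs (principal, rate, years).
  payment = make_table_function(
      lambda c: c["principal"] * (c["rate"] / 12)
                / (1 - (1 + c["rate"] / 12) ** (-12 * c["years"])),
      ["principal", "rate", "years"],
  )

  print(round(payment(200000, 0.04, 30), 2))  # ~954.83 per month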

And what about manipulating data in memory instead of having everything in big sheets and linking every cell to every other, or having nested tables in a cell (or a reference to a table in memory or in another sheet)?

And this is Excel only. Word and PowerPoint should really have a markup language side by side with the visual representation. That would make it so much easier to manage metadata (object names, linking a piece of text to an external source, working with templates, etc.).


If you have time, could you take a look at this project I'm working on? https://vimeo.com/107069470

I would relish feedback.


I'll have to take a closer look at your video, but from a first glance I'd say you're firmly in the right direction.

Spreadsheets "declarative-reactive" execution model is a solid base for end-user development environments, and some features (like "attribute chaining", if it is what I think it is, and allowing users to build data structures as attribute collections on tables) match what I have in mind for my ideal "live modeling" programming environment.

Do you have a place where you document your design ideas (in addition to this video) and share the code of Kaya? In which language is it coded? I'd love to study how you've built the running environment.


Contact me: david 927 at gmail


The author laid out a very useful analogy (reading : comprehension = writing : composition = coding : modelling) but didn't push it as far as he should have. The important thing about coding as modelling is that the results are not just external to the author's head; they are formal models that automatically infer or do things. The other way to do that - in one's head and via one's actions - simply doesn't scale beyond a point. Historically, the way people scale such models is human organisations. That's why IT is so exciting today: not only does it help an individual think, it helps groups organize at a scale that wasn't practical before because of the friction inherent in implementing formal models in people's heads.


Personally, I wish there were an "everybody should know how to design" movement. Then instead of coding tutorials I don't need, there would be design tutorials available. There are some articles about typography, but it seems very hard to learn graphic design in general (on your own).

I feel much more held back by my lack of design skills, and I think they're much more important to being successful in life. It starts with presenting yourself (designing your CV, website, letters, mails, ideas, PowerPoint presentations).

But even if you want to create an internet startup: the backbone technology is pretty much interchangeable - Java/Ruby/badly strung-together bash scripts/whatever. The visuals and UI matter much more.


That's not what the "learn to code" movement was supposed to be about (though that's what it is about now). What you're suggesting is again "how to get rich", but the marketing version instead of the webdev one ;). I get why learning how to sell yourself well is important for being (financially) successful, but it does not make you smarter, which was the original point.


Ok, I missed that - they seriously want everyone to learn to code because it would make them smarter? No offense, but I think that sounds so arrogant and misguided I am speechless (not meaning you, I mean the movement).

There are plenty of other human activities that also require and train intelligence. I would like to see some actual research on why programming supposedly is more effective in making people smart than other things.

And then personally I would say people should learn some basic maths before learning programming. That would help even more, for example to avoid financial breakdown.

I really thought the movement started because people saw so many "I became a billionaire by programming" stories in the media, and felt that without programming people would end up poor. But with design, for example, it is much easier to make money on the internet. Just throw together something nice about a current meme and sell a T-shirt. I don't know how much work designing a T-shirt is, but I am pretty sure it is less than coding a startup that makes money. Especially since a website or app that makes money also needs good design, so "sell a T-shirt" is a strict subset of that work.


> Ok I missed that - they seriously want everyone to learn to code because it would make them smarter?

No, of course not. That was the idea of the initial proponents, but it ended up being "coding is important because it will make you rich" instead.

> No offense, but I think that sounds so arrogant and misguided I am speechless (not meaning you, I mean the movement).

> There are plenty of other human activities that also require and train intelligence. I would like to see some actual research on why programming supposedly is more effective in making people smart than other things.

I'm arguing for that in the top-level comment. Understanding programming gives you another mental tool you can use for thinking, just like arithmetic or calculus. It's a way of thinking that is both relatively new and important, given the problems we're facing in the modern world.


I'm a big fan of programming. Just saying that there are lots of other candidates for "everybody should learn x" that would also make sense.

I think programming should be taught in school anyway.


I agree. There are a lot of things we should be learning (an understanding of finance would be one, optimization maths another IMO, and maybe if people were taught to understand the behaviour of exponentially growing systems (like self-replicating ones), we wouldn't have so strong an anti-vaxxer movement now).

My point is, I believe programming offers very fundamental insights and new models of thinking that are not really tied to the syntax of a particular language. Insights that let you think of things in terms of computational processes and use that thinking to make stuff do itself. I definitely don't think everyone should be a developer. Just that everyone could use those insights in the complex world we live in today.


I couldn't agree with you more. I have been working on trying to develop some skills in this area. I'll paste the notes from my personal expertise board.

Graphic Design:

- Relationship
- Transition
- Layering

Principles of Design:

- Repetition
- Opposition: excitement and tension. Too much is bad, particularly thick and thin lines in logos.
- Priority: the order of the items.
- Position: use a grid. Divide in twos and connect corners diagonally.

Attributes of Design:

- Balance: arrangements so that elements equal each other rather than overpowering each other. You can balance objects with each other, or whitespace with objects.
- Emphasis and contrast: too little is boring, too much is irritating. Plain text with bold or underline in small quantities works.
- Rhythm: regular movement; reduce line spacing or tighten words to increase speed. You can increase the line height, but with too much the eye has trouble finding the next point. The rhythm should fade out, not end abruptly.

Position:

- Grid choices: http://www.troytempleman.com/2010/04/30/grids-in-graphic-des...
- Rule of thirds
- Golden ratio
- Columns
- Baseline grid: http://www.markboulton.co.uk/journal/five-simple-steps-to-de... (start with a symmetric plus, then use 1.414 ratios to create smaller sections with margins)


There are 500 pages, of 15 tutorials each, about "design" on tutsplus:

http://tutsplus.com/tutorials/search?utf8=%E2%9C%93&search%5...


Thanks, I'll check them out.


I can recommend these two books: The Design of Everyday Things and The Non-Designer's Design Book.

TDoET will teach you about usability and designing for that. TNDDB will give you a vocabulary for design (much like design patterns give a common vocabulary for code problems, this does for design IMO).


Thanks. I already have the second one and it's great. But it doesn't make me a designer.

I'd like a course like the Coursera MOOCs that walks me through the steps (most importantly, makes me practice with homework on a regular schedule). Also the mundane details - what tools to use, for example (software, hardware).

To be fair, I think there was actually a MOOC by the author of The Design of Everyday Things. I was signed up, but ended up not having the time :-(


Not everyone needs to have been coding since they were 10 and to end up as a top software engineer in some hot tech startup.

But I definitely agree that basic programming and database skills would be beneficial to almost everyone following any professional career.

Take my mom as an example: she's a very skilled doctor working with kidney transplants in Brazil. There are lots of software programs written to help them, but she's always ahead of her time, trying to do some magic with the massive data she has access to, playing with graphs and spreadsheets... and I sometimes (when I have a little time) help her put some little tool together.

I can only imagine where her ideas would be if she had basic programming skills...


In front of you is an apparatus that can do a trillion operations for you in the time it takes your microwave to bring a cup of water to a boil.[1]

But whereas entering how long you want your microwave to heat its contents is trivial, programming your CPU and GPU to do anything at all puts you on par with someone who holds an undergraduate degree in computing.[2] Like a priest at a time when hardly anyone could read and write, and being lettered was itself a distinction, anyone who can get their computer to do anything at all (i.e. to program it) has a rarefied ability.

Why is the analogy with literacy a good one? Because you must communicate your intentions to the computer in writing. That is how you program. Yes, we have frameworks and models and APIs and all sorts of helpful protocols and formulae to help you communicate with your PC and give you a mental framework.

But at the end of the day, you are asking a machine to do something. And you're doing so in writing. The rest is literally formalities.

[1] Assuming a dual-core desktop CPU stepping up to 2.7 GHz and a mobile-class GPU or better; iPad 4 or iPhone 5S GPU capable of 76 GFLOPS or better; 700-watt microwave capable of boiling 1 cup of water in 1 minute and 30 seconds.

[2] A CS undergraduate degree does not guarantee that a candidate can program their PC.
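(Spelling out the arithmetic behind [1], using the footnote's own numbers - a rough check of mine, not the poster's:)

  # 90 seconds of microwave time vs. the machine's throughput:
  cpu_ops = 2 * 2.7e9 * 90  # dual-core at 2.7 GHz for 90 s ~ 4.9e11 ops
  gpu_ops = 76e9 * 90       # 76 GFLOPS for 90 s            ~ 6.8e12 ops
  print(cpu_ops + gpu_ops)  # ~7.3e12 - comfortably over a trillion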


All this discussion reminds me of Brian Kernighan [1] mentioning that Dennis Ritchie was also a good writer (besides a good programmer). Also, Jamie Zawinski (jwz), somewhere in the book Coders at Work [2], when questioned by the author, points out a feeling of mental connection between programming and writing.

I am with those guys in the sense that programming is just like writing anything else. Even in the most trivial cases, you actually think (or should have) before putting down a word or two. And you often come back to erase and make it better. So, teaching how to program is also teaching how to "build models".

[1] http://youtu.be/uxtKwJZbYr0 [2] http://www.codersatwork.com/


Writing is much more difficult than programming. With programming, I just have to get the program to work; the computer checks my BS by not doing the right thing, and I can't lie to it. With writing, I can easily delude myself, and it takes extra mental effort or external review to check prose.


You can use the computer to check whether you are doing the right thing by trying it out, but the computer will not tell you whether your program is organized in the best possible way, whether it is to be considered beautiful, or even, sometimes, whether it is efficient enough. To achieve all these things you really have to think, and it is just like building anything else, from texts to mechanical devices.

Of course, it's not abstract modeling; there is a necessary amount of technical knowledge to it, but so there is with anything else.


You are in a dance with the computer; there is a reliable feedback loop if you want to test or measure performance - there is a truth to it. But with writing, you might not get any feedback at all, or very little. Building things is quite different from communicating.


The world runs on software now.

Getting a basic insight into what's going on, at a relatively early age, is important if we want people to understand the world around them. Not everyone has to become an expert, but a modern education system at least ought to introduce people to it.


Teaching systems thinking is hard. Even higher institutions of education haven't figured out how to teach it properly, and they are in the business of teaching stuff. Sprinkling computational magic dust over it will not help.


Computational magic dust has this magic(!) feature that you can't pretend to understand anymore - if you're sloppy about your model, it just won't work, period. Machines are unforgiving, and they give you absolutely honest feedback.


At the syntax/grammar level, the compiler is very strict. But for plenty of interesting problems you can often get away with stupid/buggy models. Not that it's usually a good idea...


Well, it usually quickly comes around to bite you in the ass. That's the cool thing about computers - quick and honest feedback.


There are a lot of people who don't code. And there are a lot of people who code but don't exactly know what they are coding. If you remove all these combinations, coding literacy is very low.


Yes. And you're talking about design, in the sense that modeling is a form of design - and also about the design of the computers and supporting OSes we have today.

To put the above into perspective, imagine how you would design the entire computing environment for the common man if you could start over and design it from scratch.

Much of the literacy barrier is due to the fact that our systems are hangovers from 25 or 50 years ago.


Good article. I am not sure the difference between modeling and coding/programming needs to be made, as there is not really any significant coding without modeling.

It's a bit like calling pilots "airplane takeoff/landing operators" because en-route flying is the trivial part.


Improving the nation's overall STEM skills is more important than just focusing on programming. Programming is not the hard part - the hard part is the planning, deducing, structuring, hypothesizing, analyzing and so on.

Programming is just one way to apply those skills.


Sure, but any STEM major is going to be interacting with computers to a significant degree, and it helps significantly to understand them. Maybe not to the same level as a developer on HN, but at least to know how software works and the fundamentals of programming/controlling the machine.


It's the new middle management.


Interesting. These sorts of headlines bug me, in that they reject computation itself as a novel medium of expression or externalization of mental models.

I think modeling is very important, but Chris seems confused about how to reason about education. The core message - that we want children with higher levels of cognition than mere factual knowledge - is good, but rather obvious. His core conclusion, that we don't want a generation of people caring about code, has nothing to do with this, and is very misguided. It basically rejects computation as an expressive medium outright in favor of the usual writing and mathematics ("writers and accountants"), rather than recognizing computation's new role in society and the new jobs that leverage it (the business analyst with his Excel macros, the quantitative analyst or data scientist with his combination of statistics and scripting, software engineers, test engineers, architects).

It's almost as if Chris took Bloom's taxonomy (not actually a taxonomy, but useful) http://en.m.wikipedia.org/wiki/Bloom's_taxonomy of cognition and decided that the foundational element - knowledge: basic reading, writing, coding, math, and facts - is irrelevant. It is not irrelevant; it is the foundation on which we build comprehension, then application, then analysis, synthesis, and evaluation. These he calls specification, exploration, validation, and debugging. Good luck teaching someone how to apply a model, let alone synthesize one, when they can't do the basics of expression.

This gets into the philosophy of education, and perhaps this article is just a reflection of the current trend away from rote learning, but this is an active debate, not settled science. The trend may lead to some bizarre outcomes: I see 15-year-olds who can't read a wall clock or handle basic techniques around fractions, though they can describe and do basic reasoning about them.

Secondly, I believe coding itself is a fundamentally new area of knowledge, and every debate about "coding is the new literacy" really comes down to whether you believe this or reject it. Rather than try to convince you, all I can do is quote from the preface of SICP http://mitpress.mit.edu/sicp/front/node3.html, one of the great examples of computing pedagogy we have:

'Underlying our approach to this subject is our conviction that "computer science" is not a science and that its significance has little to do with computers. The computer revolution is a revolution in the way we think and in the way we express what we think. The essence of this change is the emergence of what might best be called procedural epistemology - the study of the structure of knowledge from an imperative point of view, as opposed to the more declarative point of view taken by classical mathematical subjects. Mathematics provides a framework for dealing precisely with notions of "what is." Computation provides a framework for dealing precisely with notions of "how to."'

Yes, modeling is important - it's a higher level of cognitive reasoning. We should still teach kids to code.



