
> I'm a little disappointed to see him attack this initiative to get more people into coding.

That's because I'd rather people have the freedom to be mathematicians, nurses, lawyers, physicists, writers, and accountants while still being able to leverage computers. Instead of forcing people to become professional programmers to get the computer to model things for them, we should focus on teaching people how to model in general, and build more tools like Excel that let people do that.

Hell, I'd stand behind a movement to just teach people Excel. With Excel, we don't have to spend the next several years worrying about what flavor of MVC we have to use. Instead, we can focus on doing actually important things like curing cancer.

> Besides programmers and mathematicians, the only people I've met with the faculties to properly break things down are lawyers.

My point is we should fix that, instead of focusing on teaching people Python.




So much this...

The goal should be for programmers to provide tools with interfaces that one can use without actual knowledge of "the machine".

Imagine if driving a car required intimate knowledge of combustion engines, mechanics, and the electronic systems therein, versus just pedals plus a wheel.


> Imagine if driving a car required intimate knowledge of combustion engines, mechanics, and the electronic systems

So, for this analogy to be fair, we'd need computers which you can't operate unless you know how to use transistors to build logic gates, how to build your CPU out of those, how this CPU executes instructions, and so on. Of course, that's "vs just keyboard and mouse".

Of course there's room for improvement, but current computers are not orders of magnitude harder to use than cars. I'd say that they never were, but right now they're certainly not: a 3-year-old kid won't drive a car, but they can have some meaningful and fun time with a tablet.

Learning programming is a secondary issue in my opinion. A "new literacy" is something else: it's the ability to stop and think about how the pieces fit together, how they work, and how you can make them do what you want. Programming is certainly a way of learning this ability, but it's also full of pointless ritual and irrelevant detail, and it operates at a level of abstraction that is completely wrong for the normal user. I don't know what the most efficient way of teaching this to people is, but I strongly believe that we need to teach it to them. If we have no better way, then let it even be via programming; it's still better than nothing.


Ofc there is some amount of hyperbole in the analogy.

Computers ARE orders of magnitude harder to use than cars. A car has (at a simplified level) a single wheel that goes either left or right, and 3 pedals. The hardest hardware-related concept is what a gear is and why you need to shift it (among a few others).

It is correct, though, that software CAN be simpler than that to use. Those pieces of software tend not to solve complex problems, or any real-world problems at all, though.

When it comes to 'coding as literacy', this doesn't have much to do with using a highly simplified UI. It has to do with the fact that if you want to use most of the features of the machine we call a PC, you HAVE to be able to program.

Sticking with the car analogy, my kid can have great fun if I show her how to use the horn in the car. It will not allow her to do anything meaningful with the car (getting from A to B). All she can do is use the car as a toy.

I agree with you on the core problem though. Things are abstracted away from everyone in our daily lives in such a way that people are sometimes unable to be 'precise'. By that I mean the ability to fully describe a problem and formulate an executable solution. I personally don't think programming is the solution to that; I learned that concept in school, in philosophy class.


Every computer program ever written constitutes such a tool.


To add to this: I'm a developer, and I have relatively little knowledge of what is really happening inside the machine. Where a layperson knows something like 0.1%, I know something like 1%.

The level of detail exhibited by some in the recent "what happens when you type google.com and hit enter" thread made me further realize just how little I understand.


That's most likely because you're 'late to the party'. I was lucky enough to be born just when all this stuff got underway, so I saw it progress bit by bit, in slow motion, and over 3.5 decades I could absorb the changes. If you're dumped in at the deep end in 2015 or so, it can be a bit overwhelming, but keep the faith: it will all clear up in the longer term, and you'll be so much better positioned to actually achieve something. Higher-level stuff really is higher level; you get more done with less writing (even if that comes at a price in the form of a loss of contact with the lower layers).


This is a bit scary. I would not be able to write functional programs without knowing at least some of the infrastructure the code runs on (albeit not down to the NAND gate). If I were being a dick, I would guess web programming :P.


If I were being a dick, I'd guess you haven't been in the field long enough to know everything that you don't know?

Even if you're, say, a competent assembly programmer, I'm sure there are plenty of areas you're unfamiliar with. Perhaps cryptography, the details of JIT compilers, or BGP? There's just a lot to know. Or perhaps you're just unusually talented.


Don't be dicks, either of you.


Word. Tried snarky sarcasm, forgot how differently people react, failed. Apologies!


> we should focus on teaching people how to model in general

Classes on logic and/or learning theory can accomplish that goal without any tools whatsoever. In fact, I'd argue we should teach people Visio instead of Excel if the only thing we're shooting for is mental modeling. I've always thought the world would be a much better place if people leveraged concept maps more frequently.

> My point is we should fix that, instead of focusing on teaching people Python.

Again, I really get your point about not wanting a generation full of programmers, but having gone through programming classes in HS I think it's a non-issue. I think I'm the only person from my HS programming class who went into programming as a trade. If anything, the thing worth attacking is the typical emphasis on the programming language as opposed to the process of programming. My HS 101 programming class was done with an overhead projector, mapping out what a loop was actually doing and visualizing the memory being modified step by step. That's the sort of thing we need more of. Just shoving syntax down a kid's throat will do no good; on that we are absolutely on the same page.
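
For instance, the whole exercise fits in a few lines (a minimal sketch in Python; what matters is the printed state at every step, not the syntax):

  total = 0
  for i in range(1, 4):
      total = total + i
      print("i =", i, "total =", total)
  # i = 1 total = 1
  # i = 2 total = 3
  # i = 3 total = 6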

P.S. I think you're doing amazing work. Thank you for all you're trying to contribute to the field and the world.


> That's because I'd rather people have the freedom to be mathematicians, nurses, lawyers, physicists, writers, and accountants while still being able to leverage computers

All of the professions mentioned make heavy use of computers. I do agree that teaching basic skills with tools like spreadsheet programs is fantastic. But at some point in time, it will become a roadblock to not know how to write a basic script to automate some repetitive/remedial task.

That's not to say everyone should learn how to write massive enterprise software, or have a deep understanding of how a kernel works. Rather, it's to say people need to learn how to make the computer work for them in the way that suits them best, not just how some vendor thinks they should work. Writing a quick script mostly requires problem-solving skills.
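
To give a concrete sense of scale, a "quick script" can be this small (a hypothetical sketch in Python; the folder layout and the "amount" column are invented for the example):

  # Sum a month of CSV exports instead of opening each file by hand.
  import csv
  import glob

  grand_total = 0.0
  for path in glob.glob("exports/*.csv"):
      with open(path, newline="") as f:
          for row in csv.DictReader(f):
              grand_total += float(row["amount"])  # assumes an "amount" column
  print(grand_total)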

There is a significant advantage for the journalist who can write their own SQL and query data out of a massive database without having to depend on someone else. There's a massive advantage for the lobbyist who can better understand statistical trends visually because they learned how to represent, consume, and model the data with R.
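
For example, the journalist's entire query could be a handful of lines (a sketch using Python's built-in sqlite3 module; the database, table, and column names are invented):

  # Pull the ten biggest contracts out of a public-spending database.
  import sqlite3

  conn = sqlite3.connect("spending.db")
  rows = conn.execute(
      "SELECT vendor, amount FROM contracts ORDER BY amount DESC LIMIT 10"
  ).fetchall()
  for vendor, amount in rows:
      print(vendor, amount)
  conn.close()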

There are fields where non-programmers are the only ones who fundamentally understand their domain well enough to write software for it. Take the various sciences. How can a non-chemist accurately write chemical modelling and simulation software without first having to learn how the chemistry works? Not very well... even if they designed the software from some spec, it's unlikely that a professional chemist's domain knowledge from 30+ years of in-field research would accurately translate into a hired gun's handiwork.

The main issue I see in practice is that people are either afraid to begin, or get some glassy-eyed look when the word "programming" is brought up. To me, that is incredibly frustrating, since programming is nothing more than writing down some words in a grammatically correct structure... "you can form complete sentences, right? Good, so you can now write a complete sentence, just following a slightly different rule."


> The main issue I see in practice is that people are either afraid to begin, or get some glassy-eyed look when the word "programming" is brought up.

In non-technical companies I see turf protection by IT departments as part of the root cause of this. They set themselves up as wizards who are the only ones able to make the computers do their magic and make the task seem much more difficult than it is. After all, if journalists, lobbyists, customer representatives, and such were able to write their own SQL queries, then Bob the SQL Guru™ would be out of a job. Plus, you would have to give these unwashed masses access to the systems.

I'm being a little harsh to IT departments in my caricature, but my experience has been that 90% of them make the other 10% look bad. IT at my wife's company is particularly bad: if she wants data out of their database she has to request a database report from a group in Kuala Lumpur (she is in Dallas). She basically has to write the SQL query in plain English in the request in order to get exactly what she wants, because from what I can tell this group is essentially an English-to-SQL translator. If the data is not what she wanted, she has to do it all over again. Thus it takes her days to get what she could get in minutes if she had her own MySQL client. On other fronts I have offered to show her how to use Python to automate tasks, but those conversations always end with "IT won't let us install anything". In that kind of environment I'm not surprised that people who have a natural reluctance toward programming would have said reluctance amplified.


Wherever I've worked, the IT people protect their turf because they deal with end users who get MS Access or Excel, and who then get data from the company through force of management. Those end users create their own computer systems on their desktops, and then the department becomes dependent on them.

Then when that person leaves, or confidential data gets out, or an OS upgrade screws up the ad-hoc system they created, who's responsible? IT. IT now has to learn about, repair, and support a system they didn't know about or budget for. It's even worse when a non-IT area hires their own programmer who thinks that IT is "protecting their turf", and that dev does some skunkworks thing without any consultation.

If you've ever managed corporate IT, you know how these little systems come up. And you learn why IT wants to control it. Because the average person has their job to do, and they're learning computers on the side, and only enough to make something that barely works.

So a little bit of training early on could indeed be a good thing. It would solve many issues. :)


My experience has been that these little systems come up because we can't get IT to do what we want and have to work around them to get our jobs done. That's not to say all requests or IT customers are reasonable; some ideas are stupid or infeasible. Labelling those ideas as such without addressing the underlying business need that birthed them does no one any good. I rarely see IT organizations try to understand that; they judge requests primarily on technical merits.

> So a little bit of training early on could indeed be a good thing. It would solve many issues. :)

What we are talking about in the context of the article is a lot of training. It would give non-technical people the ability to go beyond something that barely works. It would increase the number of independent micro-systems, unless IT departments are willing to start really listening to their customer base.


> Plus, you would have to give these unwashed masses access to the systems.

This is unfortunately the kind of thinking I see with people I know who are tasked with system administration. They take pride in their work, setting up infrastructure to run unnoticed in the background, but I've noticed a tendency to simplify the job to the extreme by making sure said infrastructure can't really be used. After all, if no one uses your system, they won't break it and you won't have to fix it. Users, instead of being customers, become adversaries.

I don't know anyone who does that on purpose, but I see the tendency to drift there unconsciously. Restrict this, limit that, block everything. Principle of least privilege. It all makes sense from a security POV, but when applied internally, users have to fight with support just to get their job done. I especially dislike this when it happens in an educational context (schools, universities, hackerspaces): computer systems there should serve as opportunities to tinker, learn, and explore. Strict limitations significantly hinder the usefulness of the infrastructure while saving only a little work for the admins.


Exactly. I have yet to deal with an IT organization[1] that does not have at least some level of adversarial relationship with its userbase. The level of adversarial behavior seems to be directly proportional to the computer knowledge of the user, as well. It is much better to be completely ignorant, or feign such, than to think you know what you are doing.

[1] IT groups that aren't large enough to be organizations, say 0.5 - 3 people, don't seem to have this issue as often or as badly.


>> "But at some point in time, it will become a roadblock to not know how to write a basic script to automate some repetitive/remedial task."

Isn't that the opposite of what's happening? 20 years ago it was necessary to know stuff like that to make full use of a computer. Now we have tools like IFTTT.com, or the more powerful Automator on OS X, which let us do those things through a GUI without having to understand how the scripts work or are written. Why can't we continue building tools so that people don't have to waste time learning the nitty-gritty? Computing has been getting easier and easier, so much so that 1-year-olds and 90-year-olds can use computing devices. Why does everyone seem to think that in the future we will all be coding, when it has been becoming less necessary as time has passed?


Interesting discussion by Seymour Papert & Bret Victor: http://worrydream.com/#!/MeanwhileAtCodeOrg

The point is using computers to help people reason (because experimenting and seeing results in real time is what enables people to create new and better ideas), not teaching coding for coding's sake (to create more programmers).


I agree that people should learn the value of modeling and know how to do it.

But in order to model, people need tools. Natural language, even when very carefully phrased, is insufficiently clear and precise. And predicate logic isn't sufficiently numerical to support quantification, which should be central to most models.

Some kind of formal notation is required, both to specify the model and to exercise it -- vary its parameters and evaluate the results.

I see no alternative to using some sort of programming language here. Personally I'd prefer it to be a lot simpler and more purposeful than Python. (Perhaps Z or UML?) But there's a lot of value in being able to manipulate everyday data as programmers do, now that data is omnipresent in everyone's personal and professional lives.
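
To make "vary its parameters and evaluate the results" concrete, even a toy model illustrates the idea (a sketch in Python only because it's the common reference point in this thread; the model and numbers are invented):

  # Toy model: compound growth, with the rate exposed as a parameter
  # so it can be varied and the outcomes compared.
  def balance(principal, rate, years):
      return principal * (1 + rate) ** years

  for rate in (0.02, 0.05, 0.08):
      print(rate, round(balance(1000, rate, 10), 2))
  # 0.02 -> 1218.99, 0.05 -> 1628.89, 0.08 -> 2158.92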

Possessing basic skills to fix your car or repair a shirt has obvious value; by the same token, I can't see why anyone would prefer not to know how to program. It's just a matter of matching the right cognitive impedance when choosing the tool.


If literacy is solidifying our thoughts such that they can be written, then the first way we teach it to children is finger painting. Here the seed is planted, but literacy is not just putting words on the page. Literacy demands more than just expressing thoughts; it requires that we do so with common tools.

Teaching people Excel can be a first step towards computer literacy (modeling), but the first way we commonly teach children today is through gaming. World building, puzzles, RPGs, and many more are genres that create a representation of a system that can be explored or used. Some games, like Minecraft, can do both literacy and computer literacy, where thoughts and models are both present.

This is the point where I disagree with the article's concept, since you then need to teach children the common tools with which composition, comprehension, and modeling are practiced.


Seriously! Below, some people are worried that Excel, "while effective", will make people write unmaintainable VBA scripts. Who cares?

We aren't training people to one day devise algorithms for NP-complete problems. We want their lives to be better with a computational understanding.



