Hacker News
Taking a Second Look at the Learn-To-Code Craze (theconversation.com)
48 points by rbanffy 4 months ago | 48 comments



The Obama administration's "Computer Science For All" initiative and the Trump administration's new effort are both based on the idea that computer programming is not only a fun and exciting activity, but a necessary skill for the jobs of the future.

Why should I read beyond this? If this were truly the justification for the program, it would be patently ridiculous — as ridiculous as claiming that solving algebra problems was a necessary skill for the jobs of the 20th century — but I doubt anyone ever seriously described it that way. The Obama announcement is still on the web[1] (I assume Trump's initiative is only described in a semiliterate 3am tweet) and it describes tech careers as fun, not programming, and it touts the value of "computational thinking skills" for many jobs, not programming. Teaching programming in schools is akin to teaching algebra or history. There's a spectrum of effects: some kids will develop the skill further and use it directly in their work, some will get a basic understanding that will enable them to work effectively with the first group, and the rest will get some exposure that will make them more effective people (for their own purposes and everyone else's) in a society where it's a powerful force shaping everything around us.

If the author can't make their point without basing it on this fundamental misrepresentation, I won't bother clicking through.

[1] https://obamawhitehouse.archives.gov/blog/2016/01/30/compute...


I had to read your post 3 times and I still don't understand the source of your disagreement. You wrote that exposure will "make them more effective people" and the author's wording was "a necessary skill for jobs of the future." Are those two things materially different, or in conflict AT ALL?

You called the latter "patently ridiculous" but then made a very similar argument. I don't get it.

Splitting hairs?


Those are vastly different things. Computer programming is a necessary skill for a small number of technical positions in software development, data analysis, and research, a tiny percentage of the jobs in the economy. Even at a midsize SaaS company, 30% or more of the positions don't require programming. Teaching computer science to every single kid because a tiny percentage will need programming for their job would be ridiculous, just like it would be ridiculous to teach every kid history because writing research papers is a "necessary skill for the jobs of the future." You can, however, make a reasonable argument that kids should be exposed to powerful ways of thinking that shape their world, and that learning concrete skills (writing a program, writing a research paper, solving equations) is a good way to familiarize kids with new ways of thinking. The arguments are only similar in that they both claim that studying a topic in school will have some benefit later. That isn't a high degree of similarity in the context of asking what students should study in school.


My wife came home today and said her work load had just doubled. She's been told to copy reports from Word to a web based application. I told her that this could be scripted really easily. She shrugged. Imagine the efficiencies possible in normal jobs if everyone knew of the possibility of automation, even if they weren't able to do it themselves.


Well, if it's a one-off task, it would probably be more efficient for the non-coding human to just do it, rather than realise this is something someone else could automate this one time. Bear in mind that we don't live in a perfect world, and the data would more than likely need a "human touch" in terms of applying intelligent actions to malformed/missing data (even more so if the data isn't in a structured format).

In all reality, it would just be yet another thing that could be automated but probably isn't financially worth it. The 'x' human effort in this one-time task is probably not worth the '3x' human effort to get someone who can code familiar with the data and then be on hand to answer any non-obvious questions.

Course, mileage may vary depending on the exact data (size, quality, etc.)


I'm not sure I agree. Knowing programming specifically -- how computers work, and how to write scripts to control them and automate your workflow -- almost always has a positive impact on your productivity at work and your aptitude in understanding the world around you. Computational thinking is important as well, but you can't discount the practical angle here.


Oh, sure, I agree with you on that. I'm just contrasting the actual stated purposes of such programs with the author's strawman version of them.


A century ago, the same argument was made for teaching Latin in elementary school.


I teach my elementary school age children about the Latin and Greek roots of the words we use to give them a better understanding of things.

I won't know how effective my instruction is until they're much older but it's my belief that having a deeper understanding of the words we use has a positive effect on one's thinking.


Well said.

It's not important everyone becomes a proficient programmer - it's important kids are exposed to the ideas and given the knowledge and opportunity to pursue it further if they want.

Like with most other school subjects.


>It's not important everyone becomes a proficient programmer

Programming is a tiny subsection of a far larger and more useful skill that should be taught more: analytical thinking. Teaching kids how to break down problems into smaller ones, how to think logically, etc. is far, far more useful than teaching them programming.


Both are useful, and in a world where code resides in dozens of devices in your daily life, exposure to it will be useful and help your picture of the world.


Well... if you read up to that point you would also have read "is the key to the future for both children and adults alike" - and you've only addressed the kids bit. I think the silly craze has died down a bit in the UK, but a few years ago we were subjected to such excitement as the news highlighting activities like C-level execs doing coding classes in their lunch breaks - on the 10 o'clock news?! This may have been about the same time as the BBC micro:bit was produced - I've heard nothing of that recently either. The easiest way to confirm it was a silly craze is to watch the peak daft hysteria disappear into the distance.


The idea of making some degree of programming universal in the school curriculum both predates and will outlast the craze you're talking about. I don't know about the Trump plan, but Obama proposed spending hundreds of millions of dollars changing the way tens of millions of elementary- to high-school-aged kids are taught. If such a program is carried out, the effects and the costs will be felt over decades.

> c level execs doing coding classes in their lunchbreaks

This just doesn't sound weird to me at all. CTOs read about marketing and finance; pharma CFOs read about biochemistry. Execs spend an afternoon working in the call center or learning to operate a jackhammer to prove they're regular guys. It's what you get when you mix ambition, personal curiosity, image awareness, and having the authority to make stuff like that happen. In that context it's not as significant as the 10 o'clock news might make it sound.


I learned about topics loosely related to my job, or topics in the news, over lunch breaks and evenings. I am a programmer, but I read up on management, accounting, physics, art, biology, finance, children, history, etc. I don't think any of that was silly.

For many people, the idea that they could learn some programming was new, and they were curious about it just as I was about the above. Pretty much any temporary hobby people take up is like that: something wakes up curiosity and ambition. The network effect is usually smaller, since it is just one company or office doing the thing, but when it is in the news a lot of them are going to do it at the same time.


Solving algebra problems IS a core requirement for living in this day and age. It's a key component of financial literacy and those inadequately familiarized with the material suffer greatly from it.


I'm very skeptical that we should be pushing masses of kids to learn to program, to the detriment of other subjects already being squeezed for classroom time. I love programming but I just don't see why everyone should be required to learn what is IMO a very fun but pretty niche skill.

However, I do think there are a lot of adjacent skills that can be learned through the practice of programming which all kids should be taught (whether through learning to write code or repairing cars or through other pedagogic methods):

- How to break down a complex process into a bunch of simpler steps

- How to recognize abstract patterns and what variables might differ in various situations

- How to use tangible evidence and your understanding of how a system works to 'debug' some phenomenon

- How to "evaluate code in your head" - think through the implications of how a change to a system might work

Programming is a specific skill useful in a bunch of situations, but not universally practical.

Critical thinking is a universally useful skill adaptable to a wide number of situations. A citizen body better versed in critical thinking would (I propose) make our society function better.


I started volunteering this year to teach students to code. We're halfway through the year, and this couldn't be more apparent. Without a true interest in learning to program, these students won't remember 10% of what they've been taught at this time next year. Problem solving, troubleshooting (as you've listed), or basic IT/computer literacy skills (why isn't my device connected to the Internet?) would take students much further.

Don't get me wrong - a lot of people can benefit from learning to write an automation script or small program of some sort to take care of an annoyance. Taking a programming course for one year in HS, and then forgetting about it for 3 years, will put you marginally ahead of someone just starting. By the same token, I don't expect someone who took Calc II at university to derive/integrate a bunch of equations 3 years later.

Programming isn't a muscle-memory skill like riding a bicycle. It takes practice.


I have three work colleagues who started teaching a "programming academy".

They selected 18 people out of 80 applications. The last time I checked with them, just like I predicted, they expected 4-5 to actually finish the course.

The selected people are not high school students pushed by their parents; most have a technical background and came knowing full well that getting a programming job in the future will increase their salary dramatically.


Other critical things missing from current curricula around the world:

- extracting information from data

- evaluating information quality

- vetting information for bias

- generalizing patterns into paradigms

- building value systems out of paradigms

- creating a hierarchy of values

- navigating oneself through the hierarchy toward the top


As a lifelong programmer who pursued programming even before I had a computer, I am very biased here, but... teaching people how to code at school is very empowering, because the devices they carry around with them and the servers they interact with are not magical in any way, yet they seem that way if you don't understand the basics of how computers work and what a program looks like.

For those who have not considered programming, being forced into it at school may not be pleasant, but many may find it as delightful and fun as I do, and we will bring more people into the programming community.

The article tries hard to make it a bad thing that big computer companies want to be able to hire qualified staff in the future, but clearly it is not. Any country without programs that nurture young programmers and encourage them to pursue it as a career will be behind the rest of the world in software, which is not a place you want to be.


One reason I support the "learn to code" initiative is that when K-12 schools started introducing computers in the 80s, your Apple ][, C64, BBC Micro, etc. used a BASIC interpreter as the core operating software, so learning to use a computer and learning to code were one and the same. It's extremely important for young people to understand that despite the shiny user experiences of modern devices, phones and computers today don't operate fundamentally differently than a 1980s IBM PC or a 1960s IBM mainframe. This will help prepare them for when technology inevitably changes as they get older.


I fully support teaching people to learn to code, IF they want to learn it. I studied programming since high school in an "intensive programming" classroom -- we had 5-7 hours of programming per week for 4 years. Out of ~25 students, only 2 of us are coding for a living. We had a guy winning international programming competitions in high school and he's not coding for a living.


I benefited from the Apple IIe program in schools back in the early 80s. (Mentioned in the full article.) Still programming today, and have no idea what I'd be doing now if not for that pivotal situation. It gave me the idea, the motivation and the confidence to pursue an engineering degree. Before that, I had no idea what I wanted to do after high school.


In the early 90s, we were still using some variety of Apple II. I was too young for programming, at the time. It was basically 45 minutes of Oregon Trail or Math Blasters per week.

I found the QBASIC interpreter on the family computer around 1994 and begged for a 70s-era BASIC book of code listings. I didn't get far without instruction, but when I had a chance at programming classes in my sophomore year of high school, I jumped on it.

I knew that I wanted to do something with computers when I was 8 or 10 years old. The programming classes certainly helped to start demystifying computers for me, and gave me a boost of confidence, though.


I went to public school in the early 2000s and 'computer class' meant 'learn to use MS Office' - when we weren't playing Oregon Trail, of course. I was lucky enough to have parents who could afford to support my interest in programming at home from a young age, because it certainly was not happening at school.


Another long-term goal is making the software job market more competitive so they can bring down expensive salaries.


You know, during medieval times and the Victorian age there were guilds, and they constrained the number of new apprentices yearly in order to bolster their own salaries (and egos). If my salary goes down by 50% but twice as many people get access to a middle-class lifestyle, then so be it.


> you know during medieval times and the victorian age there were guilds and they constrained the number of new apprentices yearly in order to bolster their own salaries

You mean like doctors in 2017?


Specialty medical boards in the US are essentially self-declared and self-enforced cartels.


Except having an unqualified cooper in the middle ages meant wine might leak out of its cask. An unqualified doctor today means people will die.


You are assuming that every doctor not accepted into the very small residency class each year is unqualified. You'd be surprised to see how tiny most residency classes are in the US. Also, they are usually closed to foreign medical graduates, even the absolute highest scorers with immaculate credentials.

The more obvious explanation is that the supply is very tightly controlled by the specialty boards (essentially cartels) to ensure low supply and consistently high pay [for themselves] and higher bills for the rest of the population.


yes indeed


I'm really grateful for the "Learn to Code Craze," as I never would have gotten into programming were it not for the media campaigns daring me to do it. It's taken a lot longer than 3 months as the boot-camps would lead people to believe, but it's definitely been worth it.


> the fact remains that only half of college students who majored in science, technology, engineering or math-related subjects get jobs in their field after graduation

Whoops! This is a huge piece missing from the conversation. I'm in EdTech and I had no idea. Can we get a source?


https://www.census.gov/newsroom/press-releases/2014/cb14-130...

Note that some of the assumptions are a bit comical, like counting an economics major who works in finance, or a biology major who becomes a physician, as STEM majors working in non-STEM fields.

Still, the overall picture isn't great even if you look past those weird classifications.


To go on a bit of a tangent, one of the things that gives me pause about my otherwise enthusiastic support of capitalism is the horrible waste of talent.

How many mathematicians and physicists are going into finance to produce marginal improvements in liquidity, instead of contributing to the advancement of human knowledge?

How many brilliant software engineers are working on mindless CRUD and API-piping tasks at Google, Microsoft, Amazon, et al.?

How many great doctors go into high-paying, no-research, largely menial positions instead of working on medical breakthroughs?

Capitalism probably does a better job at allocating talent than any other economic system we know of, but the results are still depressing.


How much talent is wasted coming up with medical breakthroughs that never get used to save lives, because all the doctors interested enough to know the medical state of the art go into research, leaving people to die?


I notice that it counts "Education" as non-STEM.

Weird things always happen when you try to classify occupations, but I suspect there are a lot of high school teachers who needed their STEM degree for their jobs.


I'm training to be a teacher next year - the programme is in the school of education and social work.


Great to see people find the most important quote in the article.

It is certainly true in Australia:

https://grattan.edu.au/the-number-of-science-graduates-are-g...

Australia includes Psychology in science. Which is fair enough really.

Note also that about 1/4 of university graduates say they didn't need their degree or are not really using it for their job:

The fact is that universities worldwide try hard to get as many people enrolled as possible, and are often supported by governments in this, but we already over-educate the population.

Getting people good skills that lead to good jobs is much harder than raising the number of people who get not-great degrees in subjects where there is no employment.

https://www.washingtonpost.com/news/wonk/wp/2013/05/20/only-...

(Heaps of other sources; search for "one quarter of graduates don't need their degrees" or whatever)


Another way to frame this, which might invalidate/color some of the linked article's points, is to consider "coding literacy" within the context of how the world is moving, and how coding and programming-related concepts are becoming more commonplace and necessary in non-technical careers, and, indeed, in our everyday lives.

In the world these kids will grow up into, having this skillset will be as obvious as having the ability to write one's own documents (as opposed to creating documents via dictation and secretaries/typists), or create one's own presentations. And when being able to program (& I'm using that word loosely; i.e., not everyone is going to be a Haskell dev, but everyone _should_ be able to build "software") is as commonplace and widespread as those previously-technical skills, all kinds of new and amazing use cases will emerge (consider all the things people do with spreadsheets--not everyone is creating financial models...far from it--and the people using spreadsheets to make shopping lists, or organize a Little League roster, or track social media post engagement, are not "Excel Engineers").


None of this is surprising. Corporations, like people, tend to think of themselves in the noblest of terms, even when they are clearly pushing their own agenda. And in this instance, they are pushing all the right buttons - "Children", "Education", and "Jobs". Who can resist?

But having said that, who should resist? What is the actual downside of this push? Programming exposes children to math and science, improves problem-solving skills and allows kids to create something on their own. Not all of them will use these skills as adults or get any additional education in the area, and that's totally fine.


> the fact remains that only half of college students who majored in science, technology, engineering or math-related subjects get jobs in their field after graduation.

Most (~70%?) of my colleagues from engineering university indeed went to banks or management consulting firms. The thing is that none of them can code or ever wanted to. I see no sense in using this fact to argue that there is no skills gap.

More broadly, I am part of this "code craze" and decided to learn to code. I think it is a great thing, and I advise all my friends with children to include some basic programming skills in their education. I am not even in the US, so I am not directly affected by this huge corporate influence (I think). I used freeCodeCamp.org to learn. And actually, the transition to becoming a software developer made me move from Windows to Linux on both my professional and personal computers.

I have no doubt that companies are using this trend as an opportunity to grow. But companies do that with Christmas, seasons of the year, wind, sunlight, demographic transition...

Personally, I have two certainties: learning basic coding skills is a good thing for students and companies are pursuing their own interests. One thing doesn't exclude the other.


> I advise all my friends with children to include some basic programming skills in their education.

The nice thing about programming is that it's one of the fields where "edutainment" isn't an eye-rolling euphemism, since a bunch of the problem-solving can be restated into flexible puzzles.

For example, Infinifactory isn't an explicit coding-game, but a lot of the same problem-solving (and debugging) skills are in-play.


I help out teaching kids programming at a code club in my local library. I don't do it because I want ever larger numbers of kids to code, but because I believe in equality of opportunity (the library is free to all), and it's fun and enlivening to engage with children.

Some of the kids are focused and enjoy learning; others really want to play games. Some are just bored (stashed there by parents grateful to be able to browse books in peace), or are just accompanying more engaged siblings.

It's perfectly obvious that, like every other activity, engagement in programming comes from a kernel of intrinsic interest (wherever this originates).

What's even more obvious is the cruelty and blind unreality of trying to mash people into a mould dictated by a so-called 'economy'.


http://theconversation.com/taking-a-second-look-at-the-learn...

The ACM link is an abstract which links to this article.




