Why should I read beyond this? If this were truly the justification for the program, it would be patently ridiculous — as ridiculous as claiming that solving algebra problems was a necessary skill for the jobs of the 20th century — but I doubt anyone ever seriously described it that way. The Obama announcement is still on the web (I assume Trump's initiative is only described in a semiliterate 3am tweet) and it describes tech careers as fun, not programming, and it touts the value of "computational thinking skills" for many jobs, not programming. Teaching programming in schools is akin to teaching algebra or history. There's a spectrum of effects: some kids will develop the skill further and use it directly in their work, some will get a basic understanding that will enable them to work effectively with the first group, and the rest will get some exposure that will make them more effective people (for their own purposes and everyone else's) in a society where it's a powerful force shaping everything around us.
If the author can't make their point without basing it on this fundamental misrepresentation, I won't bother clicking through.
You called the latter "patently ridiculous" but then made a very similar argument. I don't get it.
In reality, it would just be yet another thing that could be automated but probably isn't financially worth automating.
The 'x' of human effort in this one-time task is probably not worth the '3x' of human effort needed to get someone who can code familiar with the data and then have them on hand to answer any non-obvious questions.
Of course, mileage may vary depending on the exact data (size, quality, etc.).
I won't know how effective my instruction is until they're much older but it's my belief that having a deeper understanding of the words we use has a positive effect on one's thinking.
It's not important that everyone becomes a proficient programmer - it's important that kids are exposed to the ideas and given the knowledge and opportunity to pursue it further if they want.
Like with most other school subjects.
Programming is a tiny subsection of a far larger and more useful skill that should be taught more: analytical thinking. Teaching kids how to break down problems into smaller ones, how to think logically, etc. is far, far more useful than teaching them programming.
C-level execs doing coding classes in their lunch breaks
This just doesn't sound weird to me at all. CTOs read about marketing and finance; pharma CFOs read about biochemistry. Execs spend an afternoon working in the call center or learning to operate a jackhammer to prove they're regular guys. It's what you get when you mix ambition, personal curiosity, image awareness, and having the authority to make stuff like that happen. In that context it's not as significant as the 10 o'clock news might make it sound.
For many people, the idea that they could learn some programming was new, and they were curious about it just as I was about the above. Pretty much any temporary hobby people pick up is like that - something wakes up curiosity and ambition. The network effect is usually smaller when it's just one company or office doing the thing, but when it's in the news, a lot of them are going to do it at the same time.
However, I do think there are a lot of adjacent skills that can be learned through the practice of programming which all kids should be taught (whether through learning to write code or repairing cars or through other pedagogic methods):
- How to break down a complex process into a bunch of simpler steps
- How to recognize abstract patterns and what variables might differ in various situations
- How to use tangible evidence and your understanding of how a system works to 'debug' some phenomenon
- How to "evaluate code in your head" - think through the implications of how a change to a system might work
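The four skills above can be made concrete with a toy sketch (a hypothetical example; the scenario and all function names are mine, not from the thread): a process decomposed into small steps, where "evaluating code in your head" means predicting what happens when one step is skipped, and "debugging" means working back from the evidence to the missing step.

```python
# Hypothetical illustration: making tea, decomposed into small, inspectable steps.
def fill_kettle(volume_ml: int) -> dict:
    """Step 1: start with room-temperature water."""
    return {"water_ml": volume_ml, "temp_c": 20}

def boil(kettle: dict) -> dict:
    """Step 2: bring the water to a boil."""
    kettle["temp_c"] = 100
    return kettle

def steep(kettle: dict, minutes: int) -> str:
    """Step 3: the outcome depends on the state produced by earlier steps."""
    if kettle["temp_c"] < 90:
        return "weak tea"  # tangible evidence that the boil step was skipped
    return "tea" if minutes >= 3 else "weak tea"

# "Evaluating code in your head": predict both outputs before running this.
print(steep(fill_kettle(250), 5))        # boil() skipped -> weak tea
print(steep(boil(fill_kettle(250)), 5))  # full process   -> tea
```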
Programming is a specific skill useful in a bunch of situations, but not universally practical.
Critical thinking is a universally useful skill adaptable to a wide number of situations. A citizen body better versed in critical thinking would (I propose) make our society function better.
Don't get me wrong - a lot of people can benefit from learning to write an automation script or small program of some sort to take care of an annoyance. But taking a programming course for one year in high school and then forgetting about it for 3 years will put you only marginally ahead of someone just starting. By the same token, I don't expect someone who took Calc II at university to derive or integrate a bunch of equations 3 years later.
Programming isn't a muscle-memory skill like riding a bicycle. It takes practice.
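For concreteness, the kind of small annoyance-fixing automation script the comment above has in mind might be only a few lines - say, renaming a folder of photos to a consistent date-prefixed scheme (a minimal sketch; the naming scheme and function names are my assumptions, not anything from the thread):

```python
# Minimal sketch: prefix every photo in a folder with a date, e.g.
# 'beach.jpg' -> '2017-06-01_beach.jpg'.
import os

def dated_name(fname: str, date: str) -> str:
    """Build the date-prefixed filename."""
    return f"{date}_{fname}"

def rename_all(folder: str, date: str) -> list:
    """Rename every .jpg/.png in `folder` in place; return the new names."""
    renamed = []
    for fname in sorted(os.listdir(folder)):
        if fname.lower().endswith((".jpg", ".png")):
            new = dated_name(fname, date)
            os.rename(os.path.join(folder, fname), os.path.join(folder, new))
            renamed.append(new)
    return renamed
```

Nothing here requires deep computer-science knowledge - which is exactly the level of fluency the "everyone should learn a little coding" argument is about.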
They selected 18 people out of 80 applications.
The last time I checked with them, just like I predicted, they expected 4-5 to actually finish the course.
The selected people are not high school students pushed by their parents; most have a technical background and came knowing full well that getting a programming job in the future will increase their salary dramatically.
For those who have not considered programming, being forced into it at school may not be pleasant, but many may find it as delightful and fun as I do, and we will bring more people into the programming community.
The article tries hard to make it a bad thing that big computer companies want to be able to hire qualified staff in the future, but clearly it is not. Any country without programs that nurture young programmers and encourage them to pursue it as a career will be behind the rest of the world in software, which is not a place you want to be.
I found the QBASIC interpreter on the family computer around 1994 and begged for a 70s-era BASIC book of code listings. I didn't get far without instruction, but when I had a chance at programming classes in my sophomore year of high school, I jumped on it.
I knew that I wanted to do something with computers when I was 8 or 10 years old. The programming classes certainly helped to start demystifying computers for me, and gave me a boost of confidence, though.
You mean like doctors in 2017?
The more obvious explanation is that the supply is very tightly controlled by the specialty boards (essentially cartels) to ensure low supply and consistently high pay [for themselves] and higher bills for the rest of the population.
Whoops! This is a huge piece missing from the conversation. I'm in EdTech and I had no idea. Can we get a source?
Note that some of the assumptions are a bit comical - for instance, an economics major who works in finance, or a biology major who becomes a physician, is counted as a STEM major working in a non-STEM field.
Still, the overall picture isn't great even if you look past those weird classifications.
How many mathematicians and physicists are going into finance to produce marginal improvements in liquidity, instead of contributing to the advancement of human knowledge?
How many brilliant software engineers are working on mindless CRUD and API-piping tasks at Google, Microsoft, Amazon, et al.?
How many great doctors go into high-paying, no-research, largely menial positions instead of working on medical breakthroughs?
Capitalism probably does a better job at allocating talent than any other economic system we know of, but the results are still depressing.
Weird things always happen when you try to classify occupations, but I suspect there are a lot of high school teachers who needed their STEM degree for their jobs.
It is certainly true in Australia:
Australia includes Psychology in science. Which is fair enough really.
Note also, about 1/4 of University Graduates say they didn't need their degree or are not really using it for their job:
The fact is that universities worldwide try hard to enroll as many people as possible, often with government support, but we already over-educate the population.
Getting people good skills that lead to good jobs is much harder than raising the number of people who get mediocre degrees in subjects where there is no employment.
(Heaps of other sources; search for "one quarter of graduates don't need their degrees" or similar.)
In the world these kids will grow up into, having this skillset will be as obvious as having the ability to write one's own documents (as opposed to creating documents via dictation and secretaries/typists), or create one's own presentations. And when being able to program (& I'm using that word loosely; i.e., not everyone is going to be a Haskell dev, but everyone _should_ be able to build "software") is as commonplace and widespread as those previously-technical skills, all kinds of new and amazing use cases will emerge (consider all the things people do with spreadsheets--not everyone is creating financial models...far from it--and the people using spreadsheets to make shopping lists, or organize a Little League roster, or track social media post engagement, are not "Excel Engineers").
But having said that, who should resist? What is the actual downside of this push? Programming exposes children to math and science, improves problem-solving skills, and lets kids create something on their own. Not all of them will use these skills as adults or get any additional education in the area, and that's totally fine.
Most (~70%?) of my colleagues from engineering university indeed went to banks or management consulting firms. The thing is that none of them can code, or ever wanted to. I see no sense in using this fact to argue that there is no skills gap.
More broadly, I am part of this "code craze" and decided to learn to code. I think it is a great thing, and I advise all my friends with children to include some basic programming skills in their education. I am not even in the US, so I am not directly affected by this huge corporate influence (I think). I used freeCodeCamp.org to learn, and the transition to becoming a software developer actually made me move from Windows to Linux on both my professional and personal computers.
I have no doubt that companies are using this trend as an opportunity to grow. But companies do that with Christmas, seasons of the year, wind, sunlight, demographic transition...
Personally, I have two certainties: learning basic coding skills is a good thing for students and companies are pursuing their own interests. One thing doesn't exclude the other.
The nice thing about programming is that it's one of the fields where "edutainment" isn't an eye-rolling euphemism, since a bunch of the problem-solving can be restated into flexible puzzles.
For example, Infinifactory isn't an explicit coding-game, but a lot of the same problem-solving (and debugging) skills are in-play.
Some of the kids are focused and enjoy learning; others really want to play games. Some are just bored (stashed there by parents grateful to be able to browse books in peace), or are just accompanying more engaged siblings.
It's perfectly obvious that, like every other activity, engagement in programming grows from a kernel of intrinsic interest (wherever that originates).
What's even more obvious is the cruelty and blind unreality of trying to mash people into a mould dictated by a so-called 'economy'.
The ACM link is only an abstract, which in turn links to this link.