Zuck, Bill Gates, Jack Dorsey In Short Film To Inspire Kids To Learn How To Code (techcrunch.com)
441 points by aashaykumar92 1551 days ago | 223 comments

Personally I think the world is drunk with "code fever". We are creatures of overreaction.

Not too long ago it was "don't learn to code! it's all going to be outsourced to India!"

Now, "I'm starting a non-profit to spread the gift of coding to children. Coding is our future".

Give me a f-ing break.

I'm still all for it. Coding is a lot more than just filling jobs. It opens the door to critical thinking: breaking down complex problems into simple ones, separating correlation from causation, and differentiating between data and anecdotes. It's essentially applied STEM education, but with very practical, tangible outcomes on equipment that nearly everyone will have access to.

I can't think of a faster, more accessible, more entertaining way to grow these skills.

It is also a very affordable and quick way to "build stuff", which is one of the strongest ways for children and adolescents to get excited about their "work".

I do have to wonder what learning to code offers over the math that is already taught. It seems akin to teaching baking as applied chemistry. Should baking be a mandatory subject?

In high school, I taught myself how to program and formally took the math classes offered. I remember many days in class thinking, "oh, this is just like programming with another syntax." There is a lot of overlap there. In terms of critical thinking and breaking down problems, a coding course would just be a reiteration of the existing math classes in an applied setting.

What's "the world" for you? The Bay Area? 'Cause I wish people wanted to learn how to code where I live (Miami, FL).

I moved from Miami to the Bay Area for that exact reason. Miami is more or less dead as far as the coding/startup community goes. Too bad, since it's about 1/3rd the cost to live there.

Yep. I think before everyone learns to code, they need to learn to think. Just dumping a bunch more people into the software field is not going to help the average quality of software at all, which is already pretty miserable.

Of course, coding is a good way to stimulate thinking, or at least it is for me. But they must go hand in hand...

As someone who teaches people to code as a non-profit in an economically challenged environment, I'd invite you to stop in to one of your local non-profit-learn-to-code events and see the variety and interest of the people it attracts and the results. Not all of them will go on to become professional programmers but most of them will use code in their careers at some point in some capacity. More importantly, coding teaches self-learning, problem-solving and independence which can be applied to absolutely every way of life.

I think this is a very sane response. It is cynical and perhaps a little harsh but it's just reality.

Is it really that surprising to you that this would cause such a reaction? Over the past few years, coding has become so accessible that non-technical people are able to easily set up projects with a few lines of code. It has also become a lot more social, with programmers posting on hundreds of blogs, on Twitter nonstop, and collaborating on sites like GitHub.

This is all beside the point, though. What is wrong with "code fever," as you put it? With that kind of connotation, you seem to be implying that there is some kind of net loss if everyone learns something about programming. Beyond being mad about a fad (which is itself an overreaction, I might add), it doesn't make much sense to me. Care to explain?

Thank you.

If we're teaching coding to have more code monkeys or because it's cool, you're right. On the other hand, if we're teaching people to solve problems (coding or not), that's always good.

I think people are shifting away from the point of the video and that's fine, and I don't mean to be "that" guy, but it is important to realize what these accomplished programmers are trying to say through the video.

If you listen carefully, most of the interviewees are simply attesting to Steve Jobs' quote at the beginning: "Everybody in this country should learn how to program a computer...because it teaches you how to think." The second part, the ability to think, is a skill that people must have AND use in their everyday life to be successful in whatever they do. Zuck, Gates, Dorsey, Houston, etc. are all encouraging people to learn to code for this reason, and because programming is an extremely empowering skill--not just in one field but in all...if you want it to be. The fact that programming can be used in all fields is why there is such a huge push for people to learn. It is a self and world-empowering tool that is simply being encouraged to be taught/learned by the incoming generation.

The current generation needs to realize that programming exists everywhere, just as we are taught that subjects like math, biology, chemistry, physics, and history do. There is a HUGE distinction between learning to code just for the sake of getting a job or improving the economy and learning to code because it can empower us to THINK and then DO. Not to be philosophical, but part of society's problem has always been knowing what problems exist, but not actually doing anything to solve them--until now. Programming, more than anything, accelerates our ability to solve problems, little or big.

In a world where so many problems exist, programming is often the means to achieving the solution. And that is why these successful programmers are pushing everyone to learn.

I agree with you completely. Here's a repost of what I posted elsewhere:

"I think the video was great and people are over analyzing it. The following is what I got out of the video because I didn't over-analyze it like most people are doing:

(1) Everyone should be introduced to computer programming. Keyword here is introduced. Just like kids are introduced to Art, English, Biology, Sports, etc. Many middle and high schools are simply ignoring technology for the most part despite its growing relevance in our lives, and that is what this video is trying to point out.

(2) A career in software engineering isn't necessarily the cubicle-dwelling, boring loner stereotype that most uninformed people associate with it. They're showing that it can be a very fun and impactful environment, like at Valve, Facebook, and Dropbox (noting the free food, laundry, etc.).

(3) You potentially have the opportunity to affect many people in very positive ways through the software that you develop.

(4) EVEN if you don't pursue a career in programming, it can help you understand its implications in many OTHER seemingly unrelated fields. And it will help you develop critical thinking skills.

(5) It can be a learned skill like any other and there's no need to be intimidated by it.

All of those are very noble intentions in my opinion and valid concerns to address to the general perception of programming as well as computer education at early stages to develop interest in people that otherwise wouldn't have known about it."

"Zuck, Gates, Dorsey, Houston, etc. are all encouraging people to learn to code for this reason"

Certainly the self-serving reason for the push can't be overlooked. The more people that are pushed into coding, the more of a chance that Zuck/Dorsey/Houston can find a superstar to hire.

I'm sure in any industry (medical research as only one example) the people in that industry would love it if more young people took a shine to what they do. Then they would have more to pick from. Sports works the same way. If you get a bunch of people interested in football, out of the funnel come some superstars and you have an entire industry.

I'm not so sure I agree with the self-serving push. I mean, sure, these guys want great engineers--in fact, Zuckerberg explicitly states that.

But also remember that Zuckerberg is now investing in young medical research and Gates has been a philanthropic investor for a long time--these guys are looking beyond their companies and show that they truly care about the world and solving the BIG problems that exist. They are literally world-serving through these investments.

If there's no self interest involved at all how come Objective-C is not on the list of languages that they're pushing?

Because it isn't that great of a language and only a handful of people can actually make money with it?

291K "app economy" jobs is nothing to sneer at: http://www.apple.com/about/job-creation/. One of 'em is mine. I support a family of five with it. Why isn't that on the list of things we'd want to teach kids to be able to do?

The code.org video opens with a Steve Jobs quote, and the language that powers much of his legacy isn't good enough to teach? Really?

Did you care to calculate how much $8B is when it is spread out over 270,000 people and 5 years? Apple is a big company, no doubt, but it is not an economy...no matter how hard their PR department tries to spin their "study".
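For reference, the back-of-the-envelope division the commenter is asking for is easy to run (figures taken from the comments above, not independently verified):

```python
# Rough division of the "app economy" figure cited above:
# $8B spread across 270,000 people over 5 years.
total_dollars = 8_000_000_000
people = 270_000
years = 5

per_person_per_year = total_dollars / people / years
print(f"${per_person_per_year:,.0f} per person per year")  # prints "$5,926 per person per year"
```

On those numbers, the average works out to well under a typical salary, which is the commenter's point.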

Let me know if I'm missing something, but where do they mention specific programming languages? I don't see any reference to them. Furthermore, Objective-C is taught like the others--for example, CodeSchool has an entire section on iOS app development and an Objective-C tutorial to go along with it...

When you volunteer to teach (http://www.code.org/help), they list languages that you can teach, and the list they push does not include Objective-C. Nor is there an "other" option.

EDIT: They added an "other" option to that list since yesterday when I first tried to sign up. That's a good start.

Loved seeing them bring in celebrities outside of the tech world to share their experiences/passions for code.

Chris Bosh was a complete surprise to me but made me like him even more.

I agree. Not many young people know who Gabe Newell or Drew Houston is. But seeing big names like that that younger people are familiar with could lead to more interest in coding. Great video.

Or compare the public reaction to the death of Steve Jobs vs. the death of David Ritchie

You mean Dennis Ritchie...

BTW, I posted news about his death (I heard of it a week after he died, go figure) on Facebook. It's sad that only one person replied, with "RIP Mr. Ritchie" and a "like". I was pleased, though.

That's the way the world turns.

Yep, same with John McCarthy, who died around the same time.

The irony of you getting Dennis Ritchie's name wrong is strong...

If he became a more vocal advocate for coding education, he could have an incredibly huge impact since basketball has such a big international fan base. Not to mention he is just a cool dude!

Make sure to check out CodeHS. We are the little demo shown in the video using "Karel the dog." We work on making it fun and accessible to beginners, mostly in high schools and middle schools, but also have all ages of students. More than other "learn to code" sites, we focus on problem solving rather than syntax, and also provide help from real people to students along the way. http://codehs.com

It's exciting to see a video like this highlight the real need and current lack of computer science education in high schools. That is exactly what we are working on, and we have high schools all around the country! If this is an issue you are interested in, please contact us at team@codehs.com

I'm a manager. I can read code (PHP and C#) to some extent, but I can't code anything. I tried to learn several times, but the syntax and all its curly braces were simply too complex for me. A bit frustrated, I decided to move one step up in the stack and tried to understand software at a "macro level": architecture, paradigms, etc. I wanted to be able to make IT decisions based on concepts I found easier to grasp at that higher level. It was equally hard. But there is one thing I got from the learning process that I really liked and kept alive in my head for future use: the notions of abstraction and generalization. I'm thankful I learned this. I see value in applying it in my life. If the idea behind the widespread push to teach and learn how to code involves concepts like those I mentioned, then I think it is super valid.

If you still want to learn how to code, try with different languages. Python is a favourite for people starting out with programming, for example.

I get the message, and it certainly makes sense for a lot of people to be able to code, but on the other hand I know so many (also smart) people who hate any job that would involve using a computer for more than communication.

Not everyone wants to sit in front of a screen all day, and IMO that's a good thing.

Not everyone wants to sit and read or do calculations, but it's still important to have a basic grasp of literacy and numbers.

Currently, there is not a basic grasp of digital literacy. I don't think everyone should know, off the top of their head, how to compile a script. But if people understood the idea of patterns in text and repeatable tasks (i.e. for loops), that could be very helpful, especially to professional developers who can make use of structured, well planned information.

But most people do anyway. The majority of office jobs involve sitting in front of a screen the whole day. Having some basic programming skills and general knowledge of "what is happening under the hood" would make many people's work much easier, but we can't expect them to suddenly become developers just because it's cool and useful. However, most of them don't even realize how useful knowing at least one scripting language can be. Why sit in front of the screen the whole day if, with little effort, you can automate some areas of your work?
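As one illustration of what "automating some areas of your work" can mean, here is a minimal sketch in Python (the task and data below are invented examples, not anything from the thread):

```python
# A small script replacing a repetitive office chore: totaling one
# "name,amount" column that would otherwise be retyped into a calculator.
def total_amounts(lines):
    total = 0.0
    for line in lines:                 # the repeatable task is a single loop
        name, amount = line.split(",")
        total += float(amount)
    return total

rows = ["alice,10.50", "bob,4.25", "carol,7.25"]
print(total_amounts(rows))  # prints 22.0
```

A few lines like this, run over hundreds of rows, is exactly the "patterns in text and repeatable tasks" idea from the comment above.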

Well, you have to consider that this is due to their exposure to computers in the past. Common users often see the computer as this buggy device that sometimes feeds them their email or lets them catch up with friends on Facebook. Others associate it with a tool that makes people antisocial or destroys interpersonal communication because of all of the possible distractions. And then others think that computers can only do what they make them do on the job, and they hate their job.

I'd imagine that if they stopped looking at their computing device through those eyes and were able to truly express their creativity, their viewpoint would change. Programming is not about "[sitting] in front of a screen all day." That's a misunderstanding. Programming is about taking control of your computer for fun, fame, and fortune. It just so happens that our only way of doing that, currently, is by sitting down in front of the damn thing.

Software is terrible. I'm not sure the solution to the vast stinking wasteland of software slop is more people -- if pressed, I'd say it's less software.

For me the reasoning is simple:

* reading and writing code is basic literacy for the information age

* One generation hence, any company that is literate (almost all employees write code; the use of code runs through all its processes) will have enormous competitive advantages. Or rather, any company that is illiterate will have enormous competitive disadvantages.

* So software will eat the world.

* But in the process from here to there, developers will take over more and more, moving from the "typing pool" into all of the company.

* This is unlikely to be a fixed supply of coders getting a larger share of revenue; more likely everyone learns to code and then the usual political fighting resumes.

  > reading and writing code is basic literacy for the
  > information age
No. Just like everyone being a car mechanic is not the basic literacy of the automotive age.

"Everyone learns to code" is a pipe dream.

But everyone learning to drive is / was. Being a car driver is like Python coding. Being a car mechanic is probably on the order of writing a compiler.

The analogies will always break down, but really, in this day and age, an adult in the Western world who cannot drive is socially and economically disadvantaged. (Choosing not to is likely a different thing - for half of 20 years in London I had no car and biked / tubed everywhere. Now, my day would be impossible without a car. Sad but true.)

I mean how many 17 year-olds do you know who say, "Learning to drive - who needs it?"

>> "But everyone learning to drive is / was. Being a car driver is python coding. Being a car mechanic is probably on the order of writing a compiler."

I think being a car driver is more akin to being a competent computer user than a Python programmer.

Really? Driving a car is hard... lives depend on it, and it requires judgement and constant attention to detail. That we have mostly pushed it down to our subconscious does not change that.

"Competent" computer users who do not code basically send email and write Word docs. That's the level of putting on a seat belt, or working out how to turn on the air-con.

We can argue about the analogy for a long time, but pretty much anyone can be taught to code a simple dynamic web site. Children learn Logo - that's basically Lisp.

Anyone can code - just teach them young enough.

You're straining this analogy though. Using fine motor skills and hand-eye coordination is different from understanding recursion and using it to solve problems. Let's get serious everyone. :)

This is not to knock anyone but let's say all of us participating in this discussion are at or better than the level of coding we think the general public should be at. This coding ability still doesn't seem to make us have great, sound, logical arguments and all these other attributes that are being espoused.

Your analogy sort of fails a little bit though. Being able to use a computer and browse the web is much more like being a driver in the automotive age; the compiler writer seems to be more like the automotive engineer than the mechanic...

I wish I could up-vote you twice.

Programming is not the only thing that helps you think; there are several other disciplines that help you think. Philosophy, logic, and sociology IMHO do it better. If industry needs more programmers for economic growth, this does not mean kids should be brainwashed into learning to code. And even after learning to code, what do you do? An average programmer's life is just copy-pasting stuff.

Google/SO customize copy and paste. That's what I have been doing. I wonder if I actually think to solve problems while coding, or just pile up patterns of text and put in the proper settings and API keys.

Having said all of the above, deep within I believe hackers are like painters (read pg's great essay: http://www.paulgraham.com/hp.html). If hacking is like painting, would you want everyone to be a painter? If not, why should everyone be a programmer? Why learn to code? Programming or painting is not mathematics, nor is it a reading and writing skill; I can live happily without learning how to paint and how to write code. Coding is NOT an essential life skill; there are many who have been living without it and will continue to.

Such brainwashing makes me sad. And starting with Steve Jobs' quote about programming is like doing a Ferrari promo with Michael Jordan. I would have been happier had they gotten something from Woz.

Michael Moore, come up with a documentary, please.

Everyone might not end up painting, but to extend the analogy, most schools have art classes.

At first, I disliked the idea of teaching programming in schools, because the space might then become overpopulated and finding jobs would be harder. Then I realized math was taught in school, and science was taught in school, but most of us students didn't end up as mathematicians or scientists. Programming is much like math and science: it is a technical topic, and not all of us will like it. It can be taught in school as an essential subject, but not everyone who takes it will go on to become a programmer.

It would make software developers' lives a lot easier if so many people weren't scared out of their wits by computers. I think even basic programming knowledge can really help with that, because it makes it abundantly clear that a computer is, itself, a tool rather than a magic box (that happens to contain some tools in between all the scary and confusing stuff).

It isn't so much that I think everyone should be writing their own databases, but some understanding of how good software _should_ work, (for example, standard data formats), can guide informed decisions about software. Some basic technical knowledge is enough to understand why you should never build a large website on an ancient CMS where you end up with thousands of HTML pages that will need to be completely rewritten, one by one through some clunky web interface, when the site's design changes. Anyone should be able to look at that kind of setup and say "hey, that's complete garbage — let's hire someone who knows what they're doing!" and there will be a few less website janitors in the world. And hopefully a few more people making cool new stuff in their place.

Sorry, that kind of veered off course :/

But take any subject we learn in high school, whether it's math, science, history, etc. The general public is vastly uninformed about all of those subjects regardless of the fact that they are taught in high school. Why would programming make a bit of difference in that sense? It's not like programming is not taught at all in high school, anyway.

Yes, absolutely, let's make it harder for kids to compete with us and put us out of work when we are in our 50's. I don't want them innovating and creating new open source libraries that make us more productive or open up new opportunities we can't even imagine today. Life is a zero sum game after all, right? If they win I lose.

I am being sarcastic.

I think the video was great and people are over analyzing it. The following is what I got out of the video because I didn't over-analyze it like most people are doing:

(1) Everyone should be introduced to computer programming. Keyword here is introduced. Just like kids are introduced to Art, English, Biology, Sports, etc. Many middle and high schools are simply ignoring technology for the most part despite its growing relevance in our lives, and that is what this video is trying to point out.

(2) A career in software engineering isn't necessarily the cubicle-dwelling, boring loner stereotype that most uninformed people associate with it. They're showing that it can be a very fun and impactful environment, like at Valve, Facebook, and Dropbox (noting the free food, laundry, etc.).

(3) You potentially have the opportunity to affect many people in very positive ways through the software that you develop.

(4) EVEN if you don't pursue a career in programming, it can help you understand its implications in many OTHER seemingly unrelated fields. And it will help you develop critical thinking skills.

(5) It can be a learned skill like any other and there's no need to be intimidated by it.

All of those are very noble intentions in my opinion and valid concerns to address to the general perception of programming as well as computer education at early stages to develop interest in people that otherwise wouldn't have known about it.

It seems silly to me that a basic computer science curriculum isn't being added to high school or before. In the same way you have to take basic biology, chemistry, and physics, a modern student needs at least a fundamental exposure to the basics of computing.

I wholly agree with you, though I will expand it by saying that I hope they bring back some of the other vocational classes as well (woodshop, metalshop, etc). While I can't wait to see what new generations of programmers will bring us, we need craftsmen, mechanics, welders and other vocational professions just as much to help us with our physical infrastructure.

Basically, everything Mike Rowe says here - http://www.youtube.com/watch?v=0NwEFVUb-u0

I mostly agree with the gist of the video in that learning to program is empowering and can sometimes result in a job. However, the type of perk-laden working environments shown are unrealistic and limited to large companies such as Google or Facebook.

When many of us started using computers, all you could do with them was learn to program them. There were relatively few distractions, like video games or the Internet, to draw your focus away.

Today's computers are so engaging as a communications and entertainment device, many young people are not exposed to the possibility of using the computer as a creative tool.

Learning to program is just one such activity to get the creative juices flowing in the next generation. Learning to use a computer to write a paper, make a presentation or build a spreadsheet are fine; but the power a computer can unleash has much more potential for creativity and originality when students are taught to instruct the computer directly.

The early learn-to-program sites are focused on learning programming languages. This should evolve into more sophisticated environments where higher level constructs are made readily available (e.g., graphical environments, data storage and communication features).

It's exciting because we can also bring the social element into the equation to create for and with your friends and family.

The pages at http://www.code.org/learn could also include a reference to the Raspberry Pi and the foundation behind it: http://www.raspberrypi.org/about.

> We want to see it being used by kids all over the world to learn programming.

It used to be a lot more expensive than $40 to give a child access to a computer that (s)he could program and tinker with...

I think the point is there is a huge gap between the supply and demand for programming education, particularly for ages from 8 to 18. It is so easy and rewarding to feed the spark of desire to learn and create among kids of this age range, especially with all of the great free or almost free resources like MIT's Scratch purpose-built to address this need and reduce the barriers to entry.

Schools, teachers, and most parents are not equipped to deliver these experiences today. The superficial "technology programs" at most schools are appallingly shallow, not going much beyond GarageBand and book reports in PowerPoint.

Videos like this help motivate those who have the skills to engage and make a difference. What's not to like? Go out and get involved in volunteering to teach programming to kids. You will love it, and you will make a difference for kids who will go on to be the next generation of software developers.

My 7 year old and I have been working together on the Super Scratch book from No Starch and he's having a blast. When he asks "Can we program now?" instead of asking to watch a TV show, it's an awesome feeling. With things like Scratch and now Minecraft (which he also loves) on the Raspberry Pi, it's great to see so many resources for kids that go beyond the BASIC programming I didn't start learning until I was a few years older than him.

I've heard a lot of people say that the best way to learn to code is to set out to solve a problem you're having.

I wonder if the global trend towards open solutions for problems will discourage people learning to code. I've been using computers for perhaps 15 years and I've only ever skirted around the edges of coding. I've learned a little regex and a tiny bit of bash scripting and I know how to edit a little php to get Wordpress to do what I want it to, but that's as much as I've needed to solve every computing problem I've encountered.

Wordpress is a pretty good example: there's now such a rich ecosystem of themes, plugins and hacks that you can get it to do some pretty amazing things without needing to actually write any code yourself.

Then you're thinking too small. Solve non-computing problems you're having! :)

I really wish people didn't spread this kind of message. Think job security guys! Come on!

I am all for job security, but I am also for good code and programming fundamentals. I absolutely hate it when I have to edit someone else's code that is filled with bugs and improper use of basic functions just because said person cut and pasted bad code from the internet.

Also, a key aspect of a good programmer is experience. If kids start early, say at age 12, by the time they are 18 they have six years of experience already. Think about the things you can do with that much experience. It could lead to innovations in software far beyond what we see right now.

I say "bah-humbug" to job security. That means you just don't know or care about your craft. If you were really "1337", job security would be the last thing on your mind. You would actually be running the show and working on amazing products/solutions.

I'm a senior developer and I teach my methods to my colleagues all the time. It doesn't hurt me, but helps the team and the cause immensely. It makes going to work much more enjoyable when everyone is on the same page and pushing each other.

> I say "bah-humbug" to job security. That means that you just don't know or care about your craft.

I'm twenty-five years old. My wife and I have four cats, a home, two cars, and more than six figures of combined student loan debt.

I earn 75% of our total income as a Software Engineer. We can't survive on my wife's income alone and we don't have any family members to fall back on if times get tough.

So you, a senior web developer at JPC, might say "bah-humbug" to job security, but I, for the reasons listed above, do not.

I think the parent comment is absolutely right, more competition in the industry can only be a good thing from a societal perspective, as the highest quality workers take the jobs from the poor performers. Obviously from your perspective it might be worrisome, especially (and I apologize for getting personal) with all that stuff coupled with student loans, but surely it's a net societal good.

I was responding to the specific portion of the parent comment I quoted, nothing more.

Did you start out with the two cars before getting into student loan debt? Following that on with auto loans, maintenance and a mortgage isn't helping your financial independence.

Our mortgage is less than what our rent was at our apartment.

We owned one car prior to university and took out an auto loan to get my wife her own car.

We work in opposite directions and we each have a commute time of ~20-25 minutes, so moving closer to her job would increase my commute time and moving closer to my job would increase hers.

Our schedules are too different to make carpooling a possibility anyway. Also, there's no way to get to either of our jobs using public transportation.

Most of these people would not become developers. But think of the benefits if everyone had a brief exposure to how a computer works, or a realistic view of what it should/can do. It may even cause everyone to regularly update their browsers as they now see the importance of it - we may even see the death of IE ;)

Do you really think the people who are leading this movement care about your job security?

That is the wrong attitude.

Honest question - why? How will getting more people into the automation business benefit the people who are in it now (us), or the economy (arguably, automation makes jobs and whole industries redundant)? I am all for encouraging people to study science and mathematics, but I don't understand the value of getting more people to just "code".

Large, skilled populations bring great advances. In a world with hundreds of programmers, you get FORTRAN and specialized computers solving defense problems. In a world with hundreds of thousands, you get everything from smartphones and video games to industrial robots.

I'm not worried about job security. But I am excited about what armies of coders can invent.

People should be educated enough to know what they can expect from a computer - by learning coding they would get a deeper view into those things - not everyone would become a professional coder. And even then, every good professional creates even more jobs - you can't do everything by yourself even if you know how to. Also, it is for the greater good - I think it is a step toward a happier society with a more decent life.


This non-profit IMHO should focus on creating a free, cross-platform, awesome system similar to the Codea iPad app [1] - that is, the BASIC of the 80s. Anything not along those lines is too boring.

I've had a very positive experience with load81 [2], something I wrote almost solely for my son. In recent months, thanks to this program, my son has learned to write Lua code and we are having a lot of fun.

I don't think it would be possible without a system that makes drawing a circle or checking where the mouse pointer is trivial.

[1] http://twolivesleft.com/Codea/

[2] https://github.com/antirez/load81

I was fortunate enough that the public elementary school I went to had a full computer lab (in 1985); the public middle school and high school also did. And this was in a very poor part of the country. There's little question every school in America should have that; if my poor public schools were able to make it happen, anybody can.

I learned basic programming in middle school (turbo pascal!) and high school courtesy of those opportunities. I plan to give back throughout the rest of my life, to the education system I came from. If even one or two people per school did that, it would make a huge difference.

Just want to share something I've been involved with: http://www.kidsruby and http://www.kidscodecamp.com

Sorry for typo, can't edit. http://www.kidsruby.com

I think programming really needs to be added to public school curriculums. Not just because there are a lot of job opportunities for people with the skill set, but also because I personally believe it is one of the best vehicles for learning critical thinking skills, problem solving and computer literacy. All skills that are very useful in today's day and age, regardless of whether you end up ever writing a line of code outside of school.

I think exposing kids to programming is a good thing. But when so many applicants fail the Fizzbuzz test, do you think there is something innate about coding? For other subjects, like math, you can get most people to do the basics, like add 2+2.
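For reference, the FizzBuzz screening test asks for something on this order - a minimal sketch in Python:

```python
def fizzbuzz(n):
    # The classic screening question: for 1..n, emit "Fizz" for
    # multiples of 3, "Buzz" for multiples of 5, "FizzBuzz" for both,
    # and the number itself otherwise.
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out

print(" ".join(fizzbuzz(15)))
# -> 1 2 Fizz 4 Buzz Fizz 7 8 Fizz Buzz 11 Fizz 13 14 FizzBuzz
```

Trivial to a working programmer, which is exactly why failing it is taken as a signal.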


But 2+2 is something you learn by rote, really.

School courses will do nothing if people don't have access to programmable computers. And they don't: Windows doesn't ship with Visual Studio, OSX and Linux are irrelevant, iOS and Android don't come with SDK CDs and demos.

Now, of course, you know that these tools exist, and I know too. But most people don't and to them programming simply doesn't exist. Teaching these people quicksort in school won't be of much help.

  > School courses will do nothing if people don't have
  > access to programmable computers
I was taught to program exactly in this way: in the classroom, without access to computers—our school did not have any.

When I finally had access to a computer and a real programming language (not some pseudo code) I just needed the syntax to write my programs in.

The tricky part is not to teach kids the syntax of language X, it's to teach them to operate abstract structures, and that is much much harder.

But most have access to Excel. VBA has its warts, but the step from Excel is small and creating something valuable is actually possible for beginners.

VBA ... gosh ... Why don't you just use a Web browser and JavaScript?

I hate VBA as much as the next person here, quite certainly even more. However, comparing VBA and JavaScript:

- To be really productive, JavaScript requires knowledge of HTML/CSS

- Both have their quirks

- For the things people do in their day jobs, Excel/VBA is a better match

- You can do some mathematical stuff more easily; it has matrix multiplication and a generally usable solver

Windows ships with Windows Script Host, I believe, which gives you access to VBScript and something like JavaScript out of the box. There's a free editor too, but I don't know if it installs by default; anyway, you can just use Notepad if you want a lowest-common-denominator programming environment for kids on pretty much any Windows box since 98.

VBScript is rather obscure and AFAIK useful mostly for automation of admin tasks.

It doesn't help you with everyday stuff. It doesn't exist in the Windows GUI, unlike (most) other Windows software. You don't use it to build GUI apps, games, web services or whatever may be considered "interesting" nowadays.

Contrast this with DOS, which shipped with a BASIC interpreter and demo games.

> School courses will do nothing if people don't have access to programmable computers. And they don't have [...]

Well, that's almost solved with the Raspberry Pi: a fully programmable computer for just $40. It used to be a lot more expensive to give a child access to a computer that (s)he could program and tinker with...

Hm... What would be easier to convince schools to do:

A.) Download this software package (say, Python) that allows kids to learn how to program.

B.) Buy a dozen $40 mini-computers kids can use to learn how to program.

Most schools I know of would pick option A.

I don't see how this is an issue.

Day 1: Learn how to set up the language on your computer.

A lot of books, like Learn Python the Hard Way, make this the first step.

I think I slightly misstated my point. The problem isn't really about lack of programming tools, but lack of programming as a concept.

You are given a black box which does some stuff, but you have no idea how it does this stuff or how to make it do something else. You don't even know that it would be relatively easy with a decent IDE.

I am in full support of this movement. Its benefits in my opinion are twofold: 1) younger kids learn to code, which will result in them becoming more skillful programmers when they are older; 2) people who currently program and write shitty code get challenged, and can actually seek to improve their own skills and learn about more advanced concepts in programming.

I think these days it is relatively easy to get the basics down with sites like Codecademy, Code School, Treehouse and such - at least in web development. You can get an idea of how it all works, how to do simple stuff.

To me, the biggest challenge is to get the bigger picture of the technology I am using; once I understand that, learning is much faster.

I really hope that barely-disguised copy of an LCD Soundsystem song in there involved royalties being paid to the band.

It's turtles all the way down. I love LCD, but here are examples of things James Murphy lifted:

The Fall telephone thing: http://www.youtube.com/watch?v=vEOtckaZtLg#t=0m36s. A pretty distinctive way of saying "I'm tapped," used in Movement: http://www.youtube.com/watch?v=o6MIChyCBAU#t=0m43s

In general, his vocal style and delivery alternate between Mark E. Smith of The Fall and Brian Eno. Really not a bad thing in my book, but still not entirely original.

Homosapien by Pete Shelley: http://www.youtube.com/watch?v=2HwmO_GZfzI sounds like the basis of North American Scum: http://www.youtube.com/watch?v=Jmm14g4cAFc

There's also the more-than-passing similarity between Somebody's Calling Me: http://www.youtube.com/watch?v=wRCAuao67F4 and Iggy Pop's Nightclubbing: http://www.youtube.com/watch?v=G3OaMZojJRg

All I Want (http://www.youtube.com/watch?v=A6BYCcy5aeo) is pretty much Bowie's Heroes.

I understand it's just culture; appropriation is how culture develops. That being said, the piece of music used in this is obviously just a rip-off of Dance Yrself Clean, with a small modality change.

Based on past occurrences, it would seem that DFA Records would not mind it too much:

https://twitter.com/dfarecords/status/205708411344666625 http://boilerroom.tv/james-murphys-red-bull-music-academy-le...

This is all well and great, and my comment is meant in no way to belittle these efforts.

What good programmers have a lot of, though, is self-discipline, initiative, and patience. If we can teach kids those core skills, they can use them to learn programming later or excel in any other profession they choose.

Coding is an incredibly frustrating and hard path, and I doubt that coding will become more popular on a professional level. There can probably be more hobby programmers who code some simple stuff, but there is NO WAY coding can become some kind of popular mainstream hobby.

I showed this to my high school programming class not two hours ago. They weren't impressed until Gaben showed up. What does this say about inspiring students to code? It says that Minecraft has probably done more for Java than Sun Microsystems ever did.

Well, shocker, kids are more into games than licensing collaboration software to other businesses ...

Wrongly done. Do kids really care about Bill Gates and the guy who created Valve?

Show them some Pixar movie and tell them how those movies were made with technology. Tell them how technology enables sports to bring them people they can relate to. Not Zuckerberg or Gates.

Bubble, bubble, toil and trouble.

We're all going to look supremely stupid for things like this in ~2 years.

I think it's interesting that initiatives to teach programming mostly reach out to kids.

A big chunk of my peer group is 30 year-old underemployed humanities students. If they were interested, I'm sure that most of them could be not-underemployed.

Though TV takes its toll.

I don't understand what you mean by your last sentence. Are you saying that your friends are too distracted by TV to learn coding?

I feel that if my peers would stop with the incessant stream of TV, movies, and video games, they would have no problem fitting in the time to learn all manner of useful tricks, including picking up skills that would free them from the jobs they hate.

I don't believe this is a collective personal shortcoming on their part-- it's probably just the way of the world-- but the idea that so many of them hate their jobs but have ten hours a week to play games/watch films frustrates me. Rationally I understand why you might want to spend the day between your swing shifts waiting tables doing something other than working, but from where I am I think that all I can do is encourage them to do other things and point out resources.

I think people often use media to self-medicate and distract themselves from their problems. Many of your peers probably don't even enjoy a lot of the media they consume, but they keep consuming it because it is easier to do that than think deeply about their situation and take the risks necessary to change it. If media is your coping mechanism for a bad situation, it's tough to put that aside even though it's rationally the obvious thing to do.

10 hours of leisure time a week is too much?

Also why exactly do you think it is so easy to get a programming job? Most employers want either work history or formal education. Even if you teach yourself programming, finding a better paying job might not be any easier.

That's a cool initiative, but we have to consider that the companies supporting it aren't there for free.

There's a commercial interest in encouraging kids to code: cheaper manpower in the future.

Wow, we should have a lot more of these kickass videos so people understand what we do without having to tell them :)

Seriously, do you think inspiring students with the amount of money 'they' earned with code is a good idea?

The gap between CS majors and software jobs is misleading. You don't need a CS degree to become a decent programmer.

Still, if there's such a shortage of good programmers (and I agree that there probably is) then why aren't we playing this to our advantage? A senior engineer can't afford a house in the Bay Area or New York, and our status is low-- we still work for managers; we're a defeated tribe in this way. Most of us don't have the autonomy to choose our tools or decide whether to use or replace old legacy code. Shouldn't our top priority, as a tribe, be to change this?

We should make programmer autonomy our major issue. http://michaelochurch.wordpress.com/2012/11/25/programmer-au... This will benefit us and the economy as a whole.

Then, after we've had this victory, we can work on increasing the total number of programmers.

Fred George is interesting on this - see Programmer Anarchy. My view (and yes, I am working up a blog post) is:

1. Coding is the equivalent of literacy after 1451 (Gutenberg).

2. In, say, one generation, any company not having literate people at all levels of the organisation, and not having then changed how they work at a fundamental level to take advantage of these programmers/business people, will be at as severe a competitive disadvantage as a trading company in 1600 where no-one at the top could read or write.

3. If you think that's a bit of a stretch, or say "yeah, but there were loads of illiterate people on trading ships" - the people at the top, the captains, the investors, could all read and write. So they could exchange information globally and accurately. Share a model of the market effectively.

4. So, the successful companies in a generation will have coding in their DNA. And to get there from today, companies will benefit most from hiring great coders and teaching them how to be CEOs, how to be salespeople, how to be accountants. And letting them run the company as CEOs and accountants do now. Let me repeat - hire people who can code and teach them to be CEOs. They will find ways to run the company with coding at its core.

5. In short, programmer anarchy / autonomy is the right way forward. The advantage of running a company through code is so great (comparable to running a company through writing) that all other skills come second. They are necessary, but software eats them. For a generation only.

Takebacks: I am guessing that the economic advantage of a code-literate population is a diminishing return compared to a text-literate population. Say 50%? That's still phenomenal.

I humbly disagree.

Let's focus on one point.

Literacy facilitated efficient information exchange.

Likewise, programming facilitated efficient information exchange.

But, not everybody needed to program the Internet for this advantage.

I think you are confusing a personal advantage with a societal advantage. Certainly, we need programmers, but I am uncertain whether everybody needs to be a programmer.

I work a full-time software job, and I can say that the software is the easiest part. I do big-data HPC, C, C++, learning models and physics simulation. I need people to come up with models, ontology and even vision. I think at the Hacker News ivory tower of software-for-software or software-for-programmers, we often underestimate the value of domain-specific knowledge.

Teaching kids is certainly rewarding and a CS degree is a bright future, but I am more worried about the lack of US PhDs than the lack of user-facing JavaScript.

He's not saying everyone needs to be a programmer, but everyone needs to be able to program.

The number of times across all my jobs that I've seen people working on problems that could be solved orders of magnitude faster with a bit of coding knowledge is beyond count.

Reformat that document? How about a Regex? Analyse this data? Automatic graph generation and statistical methods. Keep records updated? Database with automated queries. Always building weekly reports? Template substitution and HTML generation.

There are so many jobs, that would not be considered anything to do with programming, that can be made hugely more efficient with a bit of coding knowledge.

As this video says, for most of this you don't even need to know algorithms or data structures or the 'important' stuff for being a programmer.

You just need to understand 'SELECT date FROM records WHERE quantity > 10'.

Or 'println("Report Week Ending " + date.GetToday())'

Or 's/\[[A-Z]*\]/(&)/'
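To make the one-liners above concrete, here is how two of them might look in a small Python script. The table, column and tag names are made up for illustration, not from any real system:

```python
import re
import sqlite3

# 's/\[[A-Z]*\]/(&)/' as Python: wrap every [TAG] of capital
# letters in parentheses (\g<0> is the whole match, like sed's &).
def parenthesize_tags(text):
    return re.sub(r"\[[A-Z]*\]", r"(\g<0>)", text)

# 'SELECT date FROM records WHERE quantity > 10' against a
# throwaway in-memory table.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE records (date TEXT, quantity INTEGER)")
con.executemany("INSERT INTO records VALUES (?, ?)",
                [("2013-02-01", 5), ("2013-02-08", 12)])
rows = con.execute("SELECT date FROM records WHERE quantity > 10").fetchall()

print(parenthesize_tags("[TODO] fix the [BUG] report"))
# -> ([TODO]) fix the ([BUG]) report
print(rows)
# -> [('2013-02-08',)]
```

Nothing here needs algorithms or data structures; it is the kind of five-minute automation the comment above is describing.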

In many cases the people in these positions don't even need to be able to program, they need to be able to see the solutions that programming can give them. I think this is the number one issue right now. Not everybody needs to be a programmer, but everyone needs to be educated about some of the ways programming can help you.

With that said, we mostly come up with these ideas because we are able to program, so yeah.

Uh no. He's saying everyone needs to be a programmer.

> companies will benefit most from hiring great coders and teaching them how to be CEOs

The problem is that today, 'being able to read/write code' and 'being a programmer' are as good as synonymous. It shouldn't be that way; that's the issue. We won't call CEOs with programming skills 'programmers', just as we don't call them 'readers' today.

Domain-specific languages have a role to play there. You already mentioned Regex and SQL, but things like R, MATLAB or even Excel can be invaluable tools in some professions.

The problem is people don't even realise what types of problems can easily be solved by computers. It doesn't even cross their mind that something they do every day could be automated or made better.

That's why everyone needs to learn how to code (a bit). So they can go "Ah, I think software might improve this thingy here; let's call in an expert and ask"

Programming enables automation of your domain-specific knowledge. Animators who could program gave us CG movies. Musicians who could program gave us software music composition tools. Would garbage collectors who could program give us robots to do parts of their jobs? Or apps to communicate real-time problems? Or public education websites? I think it very likely.

Any field -- any field at all! -- that can benefit from information organization and retrieval, automation, scripting, robotics, internal wikis, private smartphone apps, information dissemination, etc., will benefit from people who have deep domain knowledge ALSO being programmers.

And I think that's just about all of them.

This recent article is basically the same concept: http://news.ycombinator.com/item?id=5275313

> Despite these challenges, we republished the updated graphic without too much delay. But I was left thinking how much easier it could have been had I simply recorded the process the first time as a makefile. I could have simply typed make in the terminal and be done!

In other words, the key "transferable skill" of programming is being able to describe a process in enough exacting detail that nobody ever needs to "reconstruct" it again. The computers-being-able-to-execute-the-process thing is just a nice side-effect.
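The same idea can be sketched in plain Python (step names and logic invented for illustration): each step of a once-manual process is written down once, then replayed on demand:

```python
# Hypothetical pipeline: each formerly-manual step is recorded as a
# function, so rebuilding the result is a single call instead of a
# reconstruction from memory.
def clean(data):
    # Drop missing values.
    return [x for x in data if x is not None]

def summarize(data):
    # Reduce to a single figure.
    return sum(data) / len(data)

STEPS = [clean, summarize]

def rebuild(data):
    # The "make" step: run every recorded step in order.
    for step in STEPS:
        data = step(data)
    return data

print(rebuild([1, 2, None, 3]))  # -> 2.0
```

The value is not the output; it is that the process itself is now a durable, repeatable artifact.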

Absolutely!

It's not about user facing JavaScript apps. It's about adding another tool to the mental toolbox.

Calculus, statistics, physics simulations - all of these will be of use in a person's career, yes, but like you I think that is not important. What is utterly important is that the most expressive and flexible tool to enter the mental toolbox in two hundred years is put in the hands of as many human beings as possible.

I want the next Einstein to be literate, able to use Lisp on her OLPC device and change the understanding of the universe from her school hut in Rwanda, circa 2039.

We live in a world of wonders, invented by the brightest few thousand humans who ever lived. Let's give ourselves the best chance of the next thousand adding their full worth too.

It sounds like you think all scientists should learn how to code.

I am certain that there are many people who rarely use calculus and physics simulations in their work, and who would never need to program even if they knew how.

I don't think I have ever used my Secondary School chemistry, or my knowledge of the biology of frogs.

But that does not mean learning those was useless - either to make me a more rounded person, or to pass me through a filter so that those who do want to contribute to society through chemistry actually discover their interests

School is a process for finding and educating those who will invent the next 100 years, and persuading the rest of us that we should support them.

I think that what humanity gained in literacy when Gutenberg invented the printing press is akin to what humanity can gain in problem solving skills upon the invention of the microprocessor. An everyday problem solver is a more peaceful, more level-headed, and more thoughtful person than someone who is not. Just like an everyday reader is someone who is more informed than someone who is illiterate.

The best way to teach problem solving skills is through computer science without a doubt. A smarter population will lead to more peace and prosperity for the human race.

Do I think that computer science can lead to a more peaceful and thoughtful society? Absolutely; therefore I think that it's crucial to teach it.

I don't think the Gutenberg thing applies at all. I think we're getting carried away. You could make an argument about digital information and the Web as a way to disseminate information as a Gutenberg-style invention, but I don't think programming is one.

I don't think programming and learning to code make the "everyday problem solver," the "more peaceful" person. I think what you're talking about is the phenomenon that intellectuals tend to be more circumspect, perhaps less violent, whatever. We don't need programming for that. We have a whole history of knowledge and art in philosophy, literature, other sciences, logic, mathematics, etc. I fail to see how programming is special in that sense.

The average person, say in the US, may be more likely to be able to read than in the past, but they do not tend to be less "ignorant" of many societal issues, cultures, etc. (since you bring up peace and harmony), even though we have long-standing educational tools to solve that problem, much better than programming.

Honestly, I dispute this outright. I've been programming a long time, met many programmers and computer scientists; it does not seem like the ability to produce CRUD correlates in any real sense to a person being less likely to join the Tea Party.

You're viewing programming as this great society-benefiting solution to social ills, or something that makes you a better person; large swaths of people would just view it as engineering and say "fine, sure, programming skills make it easier for you to get a job, great," but not anything that substantially creates peace in the world - that's too much, I think.

I think this is a good point and I am surprised by the downvotes your post received.

I wanted to add that Internet culture has enabled many people to realize their most twisted fantasies, and this is perhaps not good. Consider the desensitizing effects of r/spacedicks.

> A smarter population will lead to more peace and prosperity for the human race.

Or language wars fought with real weapons.

Lack of US PhDs? We already produce way more PhDs than the economy can absorb; if anything, we should either produce fewer PhDs or create a better job market for them.

The job market certainly sucks, but it may be because a large number have uncompromising dreams of working in academia. In EE, a PhD will land you a job at a company of your choice.

As does the U.S. in general, the NavLabs suffer from an increasingly severe shortage of trained replacement civilian personnel in science, technology, engineering, and mathematics (= STEM topics). The Navy’s problem is more severe than the nation’s because only U.S. citizens can work at NavLabs, and a great many U.S. STEM graduates are foreign nationals.

[1] The above is quoted from this report: http://www.dtic.mil/dtic/tr/fulltext/u2/a503549.pdf

This is true for EE, but that's an engineering degree. Sciences PhDs have a much harder time, even in industry (e.g. biotech is languishing).

> We already produce way more PhDs than the economy can absorb; if anything we should either produce less PhDs or create a better job market for them.

When I was a kid and first learned about the Roman Empire and its fall, and of all the other historical empires that collapsed, one of the things that I wondered was how these people could be so "stupid". Of course, I didn't understand the day-to-day nuances involved. I saw these as personalized historic entities rather than complex patterns of behavior. I just couldn't grasp why such things would "allow" themselves to fall apart.

When I look at what has happened to the U.S. job market for scientists, researchers and academics and place that in the context of what we were as a society and what we've become... I get it. I now know what it is and what it looks like when a complex society "chooses" decline.

Historically speaking, wasn't the temporary boost in demand for PhDs after WW2 the abnormality?

Personally, I dislike the argument that recent progress is an "abnormality". A large middle class (post-1925) is an "anomaly". Slavery being illegal is, historically speaking, extremely unusual.

Right on. I was at the Smithsonian Natural History museum in DC a few days ago. They had an exhibit on pollution in the world's oceans. I was shocked at the reasons they gave to reduce pollution. To paraphrase: if we don't stop polluting, we will hurt the local economy, etc. My jaw dropped. That's the reason why we aren't supposed to destroy the oceans???

As a scientist, it is totally sad to see the state of research, and the standing scientists have in society. I don't understand why sane people would choose to get themselves into such career paths.

Communication has evolved over time from hand/body gestures and noises to spoken language to reading/writing. The latter (literacy) is now a major form of communication and required in almost any job.

Coding is not a form of communication.

let me know when that blog post is finished!

I'll put up a mailchimp account :-)

Can you explain to me what actual evidence there is that there's a shortage of good programmers? Good programmers still make less money than a good doctor or lawyer, indicating that those are in even shorter supply. Surely addressing that shortage would be a higher priority? Lots of people could use more legal services if the price were lower, and yet if someone wanted to hire a Harvard lawyer for $60k a year, we wouldn't even seriously consider it; but when someone wants an MIT programmer for $60k a year, we consider it a job opening and thus a 'shortage.' Thus I find 'job openings' a ridiculous metric of supply and demand, and I would use the metric people have always used - the market price. Using this more reasonable metric, there are careers with much more severe shortages of talent. As usual, nobody is surprised that those heavily invested in hiring programmers want a larger supply to drive down market wages.

My identical-twin brother is a doctor, and I make far more than he does. The reason is that I don't call myself a programmer [1]. Instead, I've built up enormous amounts of domain knowledge [2] in addition to my software engineering skills.

As for the career prospects of anyone fresh out of law school, the legal profession suffered a massive meltdown in 2008 and hasn't recovered.

[1] http://www.kalzumeus.com/2011/10/28/dont-call-yourself-a-pro...

[2] http://jacquesmattheij.com/domain-knowledge-or-a-lack-thereo...

Your two data points don't mean much because we're talking about macro trends and whether a 'shortage' actually exists. Specifically, I'm asking someone to justify the assertion that there is a shortage of good programmers with some numbers, and I pointed out that job openings are essentially just market 'bids' and therefore not a very good metric, since all markets have lots of 'bids' well below the market price. For some reason, in the world of programming we take those low bids more seriously than in other fields, which is inconsistent. The best metric that I know of, market price, does not indicate a shortage relative to legal and medical work, since doctors and lawyers both have higher averages, and yet you rarely hear about shortages in those fields and how we need to more aggressively train more. So your point about how you individually make a lot of money is a bit of a non-sequitur.

As for legal work, the median starting salary for the top 14 schools (remember, there is a shortage of 'good' programmers) is still $160K+. How many good programmers start there? Sure, lower-grade lawyers don't make much, but neither do lower-grade programmers. Facebook and Microsoft both have reasonably high hiring standards and are complaining about a shortage of people over their threshold. As for the rest of us, the average lawyer salary is higher than the average software engineer salary. There are a lot of low-paid software engineers too; it's absurd to compare only the highest-paid software engineers to all lawyers, which it seems you tried to do.

The fact that you can see the average lawyer salary being higher than the average software engineer salary, and still claim there's a meltdown in the former industry while the latter is struggling for people, just shows how deeply ingrained our cultural perspective is of what job markets and compensation for a given career 'should be', and how it skews our perspective away from the simple Econ 101 supply and demand curves that tell the real story. Markets are a lot more honest than PR releases from billionaire tech executives.

But on a side note I did like your blog post, I just think it's irrelevant to this conversation.

It's important to keep in mind the role of the professional guilds in controlling supply for lawyers and particularly for doctors.

Speaking as one, MIT programmers go for a lot more than 60k.

That's why it's a job 'opening'. My point was that it doesn't get filled but people classify it as an opening, but if someone wanted a Harvard lawyer for $60k it wouldn't even get listed.

Do you have an example of a job opening where they're offering 60k for an MIT programmer? I can give you a bunch where they offer substantially more (all $100k+ with just an undergrad degree):

You are totally missing my point. Yes, I realize MIT programmers make more and you need to pay them well to hire them. However, if you look at average software engineering salaries, you will realize most people do not pay as well as prestigious tech companies. Yet their job openings are used to indicate a shortage. Shortages are just low bids in a market, the fact that lots of people want to buy gold at $1 an ounce does not mean there is a shortage of gold. It's a bad metric if you understand anything about economics, and yet I've yet to see another metric used to argue for a 'shortage'.

I'm just asking for a few examples. Do you have any where they advertise an open job but the pay is low?

With exceptions, salaries aren't advertised in job openings. However, you can look at average software engineering salaries to learn that most are south of 100k, and it seems unlikely that, on average, higher salaries are getting turned down for much lower ones. Furthermore, specific examples are unnecessary. I originally asked for a metric that proved there was a developer shortage, and explained why job openings are a bad indication of one. No others have been proposed to my knowledge, so you're really just nitpicking details irrelevant to my original point.

> A senior engineer can't afford a house in the Bay Area or New York

This is an important and under-emphasized point, which is elaborated here: http://www.slate.com/articles/business/moneybox/2012/05/face... . One problem with programming salaries is that so many programming jobs and opportunities are concentrated in areas with very expensive housing, such that salaries themselves have to go up to compensate.

Before 1976, this wasn't as big a problem as it is now: most places that experienced rapid growth could build housing stock in response to that growth. In 1976, however, the Supreme Court upheld a Ninth Circuit decision allowing the City of Petaluma to impose severe restrictions on construction (see 522 F.2d 897 if you're the sort of person who likes reading court decisions).

After that, more and more cities made construction harder and harder; now areas that have a geographic advantage in some industry often can't build sufficient housing to keep up with demand because of legal barriers.

But relatively few people connect these issues.

The legal framework is the hidden-in-plain-sight gearing of civilization. Nobody pays attention because it is boring and slow.

> "our status is low-- we still work for managers; we're a defeated tribe in this way"

As a fairly senior developer, I just can't fathom this attitude. Working for a manager means you're low status or defeated? I want to work for a manager. Reward me and give me autonomy and respect due a valuable developer, sure, but please do all that management shit for me. In my mind (and at every single company I've ever worked at), the guys who actually build stuff are high status.

Maybe we should be talking less about forming "tribes" and more about how to make programming accessible to anyone who needs to solve real problems using computers.

Bingo. Reading and writing used to be a separate profession as well.

It is accessible to anyone, for free. I learned everything I use in my career for free on the Internet.

Most people have better things to do than programming, and don't want to invest the time to learn just because a news article said they should.

> Most people have better things to do than programming,

Most people have no idea what programming is and can't make that judgement for themselves—coding should be taught in high school, not because everyone will code but because people need to know what kind of opportunities exist. Let's be honest, a lot of high school classes were nearly useless in terms of how much you used what you learned, but they were useful because you learned you didn't want to become a chemist (or whatever).

Coding IS taught in high school, but is not mandatory. You know this of course, but it's important to note that there is no shortage of opportunities for students to take computer science courses. Some districts will even send students to a nearby school if enough students want a course their own school doesn't offer. High school computer science classes were useful, but still moved at a slow pace for anyone interested enough to learn outside of class at all. Anybody that was obsessed with programming outside of class probably wouldn't take the class or would just snooze through it.

Yes, programming is taught inside of _some_ high schools. A shockingly small number of high schools actually teach it, though. Neither my high school nor other nearby high schools have any sort of computer science curriculum.

These schools would offer the course if enough of the students were interested. There is no lack of resources around us (the internet, libraries, sitting down at a book store) for learning programming. The issue is the perception of programming in high school - the stigma around it. This might be perpetuated by some of the more eccentric programmers, who are often very good and tend to grow into strong programmers! But they can also repel people who might have been interested, because it conflicts with the social status that is so important to the majority of people in high school.

No, this is just false. My high school was run by incompetent people, the result being Photoshop classes being offered to people wanting to program. This is further complicated by people who can't afford their own computers or internet and are explicitly disallowed from programming at the library.

You have a very optimistic view of high schools.

That seems like a really bad scenario, but there are computer science classes and AP/AB CS classes that need to follow a certain curriculum for the Computer Science AP Test by the College Board. I graduated from high school only a year and a half ago in Washington, so I would agree that my perception of schools is skewed towards wealthier districts.

You don't agree that any student should be allowed to program on computers in the library? It seems like a security issue, whereas in a classroom students would be supervised by a computer science instructor. I'm not saying students need to be hand held, but schools really just don't know enough about programming to open it up. Web development, Flash, and Photoshop could be installed on library computers - they were on a few where I went. Otherwise I totally understand why school district IT would not allow it or fight against it.

> Still, if there's such a shortage of good programmers (and I agree that there probably is) then why aren't we playing this to our advantage? A senior engineer can't afford a house in the Bay Area or New York, and our status is low-- we still work for managers; we're a defeated tribe in this way. Most of us don't have the autonomy to choose our tools or decide whether to use or replace old legacy code. Shouldn't our top priority, as a tribe, be to change this?

Working for someone else will almost always end up like that. Programming is interesting in that you don't have to work for someone else, you can scratch your own itch. You just need to make something that is useful.

I think you're off the mark, because you're always "working for someone else". Details change, but ultimately there's no difference between "mak[ing] something that is useful" and working for others. Sometimes the terms are great, and sometimes they're awful.

The implicit assumption is that "working for an established organization" must imply "working with miserably low levels of autonomy". I'm not sure that I agree.

If we negotiate better terms, we get more autonomy and more compensation. This will also give more people the financial means to start their own companies and get that level of autonomy.

The problem is that the exchange rate between capital/establishment and talent/effort is very unfavorable in comparison to what it could be.

> The implicit assumption is that "working for an established organization" must imply "working with miserably low levels of autonomy". I'm not sure that I agree.

And it is not only about autonomy. It sucks to work for somebody stupid, in general somebody that doesn't understand the nature of software development, probability, statistics, etc.

Do you mean we should unionize?

Lawyers don't unionise - they have professional ethics bodies.

The difference is that when the law is code, and code is law, we shall have a professional ethics body too. When programmers can do real damage and affect real lives, they naturally gravitate to professionally considered advice.

Look at JPL or NASA, or a medical devices company.

The level of testing and reliability there is astounding - but so is the cost. And one day the engineers will find a way of capturing all that cost.

The problem with unionization is that it works well when the work is a commodity, and software engineering isn't. The 10x (and 100x, and -25x) programmer phenomenon is real, but individual performance is nearly impossible to measure on a quarterly basis.

We need some traits of old-style professionalism: autonomy and the right of self-investment. A profession is a set of ethical rights and obligations that supersede managerial authority. That's what we need. http://michaelochurch.wordpress.com/2012/11/18/programmers-d...

Unionization works reasonably well for the entertainment industry.

One of the reasons for the lack of pushback from the unions over the modernization of the UK's telco infrastructure was that the people building System X etc. were union members.

And the BMA and AMA do quite well for their members.

> The gap between CS majors and software jobs is misleading. You don't need a CS degree to become a decent programmer.

Conversely, you don't need to be a decent programmer to get a CS degree. It's a shame that in the age of the internet and self-directed learning, degrees are so heavily weighted in the job market.

And yet all of the glorious self learners can't describe when to use a hashmap, what a hash is, or even the difference between a list and set...

I'm not talking even knowing the algorithms, I mean a guy that writes list.contains(condition) in a loop and it brings the server to a halt because every call to contains runs through the entire list, when a hashset.contains(condition) would run in milliseconds.

* No, I did not go to a top-tier school and I only have a CS minor, but I did read the books to make sure I didn't have gaps in my knowledge. It's just rare, in my experience, to find people who know the basics who also claim they were self-taught. People get all pissed off if you ask them what a HashMap is, which should be the most basic thing in programming.
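To make the mistake described above concrete, here's a minimal Java sketch (the `slowIntersect`/`fastIntersect` helper names are made up for illustration): calling `List.contains` inside a loop is quadratic, while building a `HashSet` first makes the same loop roughly linear.

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class MembershipDemo {
    // Quadratic: every call to list.contains() scans the whole list.
    static List<Integer> slowIntersect(List<Integer> a, List<Integer> b) {
        List<Integer> out = new ArrayList<>();
        for (int x : a) {
            if (b.contains(x)) out.add(x); // O(n) scan per element of a
        }
        return out;
    }

    // Roughly linear: build a HashSet once, then each contains() hashes
    // the key and checks a single bucket.
    static List<Integer> fastIntersect(List<Integer> a, List<Integer> b) {
        Set<Integer> lookup = new HashSet<>(b);
        List<Integer> out = new ArrayList<>();
        for (int x : a) {
            if (lookup.contains(x)) out.add(x); // O(1) average per element
        }
        return out;
    }

    public static void main(String[] args) {
        List<Integer> a = new ArrayList<>();
        List<Integer> b = new ArrayList<>();
        for (int i = 0; i < 5_000; i++) {
            a.add(i);          // 0..4999
            b.add(i + 2_500);  // 2500..7499
        }
        // Both return the overlap 2500..4999 (2500 elements), but the
        // HashSet version does it in one pass instead of a nested scan.
        System.out.println(slowIntersect(a, b).size()); // 2500
        System.out.println(fastIntersect(a, b).size()); // 2500
    }
}
```

At a few thousand elements the difference is already measurable; at the scale of a production server the slow version is the kind of thing that "brings the server to a halt".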

Yawn. I would say the same about the glorious PhD professors that attempted to teach me. Even the best professors, who were actually pretty knowledgeable, would never really code at all, pushing that off to the grad students.

Some people just don't fit into the way academia works, and making silly judgments like yours about them does nothing but entrench the status quo.

I agree, but you still have to read the book. Coding can not be learned by copy and pasting other people's code. All I was trying to say is for every 20 people that say they are self-taught only one of them can write a feature from scratch from a blank slate and explain the trade offs of why they used a hashset vs a list. The other 19 may be able to get something working by copy and pasting another section and spending a week shoehorning it, but the competent person would bang out in a day without looking anywhere else.

> Coding can not be learned by copy and pasting other people's code.

Absolutely it can. I would even argue that it is the best way into programming, and into learning in general. You, of course, cannot stop after you have pasted it in. You have to think about why the author made the choices he made, but in doing that you will quickly build an intuition that lets you apply similar choices to entirely new problems. With a little experience, you won't even need to draw on others' code anymore.

The distinction you really seem to be making here is that self-taught programmers are able to enter the job market on day one, without that experience. While graduates have at least four years of experience under their belt before they are reasonably able to enter the job market. Of course the people with more experience are going to perform better. As more time passes, the skills between the groups will start to average out (with amazing and not-so-amazing people found in both groups).

Well, consider me in the 5% then. And though you're probably right about the 1-in-20, I don't think I'd put the proportion much higher for CS grads.

There are two diverging definitions of what it means to be a programmer.

There's one set of them who can write new programs, build systems, and learn technologies they aren't familiar with, but put them on a team of 30 people for a typical enterprise project, chugging along at 10 lines per day of code, and they'll be bored out of their minds.

There's another set that can't function outside of an IDE or design new systems, have no idea how anything actually works, but they can slog through corporate stuff and follow orders well enough to stay employed.

Both are called "programmers" but they're practically different jobs. It's easy for us in the first set to look down on the second, but maybe we should just admit that they're different task categories altogether.

Do you really need a CS degree to make that kind of judgement? Even before I was college-aged I knew (and so did several of my programming friends) about hash tables and how much faster they were than just iterating through something and checking. I didn't know Big-O but I did have a loose understanding that told me iterating through EVERYTHING is slow, and that there are smarter ways to look up values.

This sounds like a case of a programmer using the wrong datatype, realizing it, and then being too lazy to fix it.

I've interviewed plenty of people with a degree or even a master's in CS that, while they can recite the definition of a hashmap, can't actually use one in a program.

Sounds like someone from Stanford.

One thing I don't understand about people who make this argument: do you all know about libraries? You know they've been around for a long time. I'll grant you search engines make some parts marginally easier, but you don't need them to get the fundamental knowledge. So why is the internet the great panacea for this? Or is it that autodidacticism is not really that common, and it's just much easier to casually call watching a YouTube video about building a blog engine "learning"?

I guess I'm mostly going from my own experience. I have actually learned some cool stuff from library books (not in a while though), but it's mostly from the internet. The internet facilitates learning a wider variety of information without even leaving your house, allows you to download software/code, and also allows you to find communities where you can ask questions and get feedback on your progress.

I basically came here to say this. HN is often frequented by those who go to top CS schools, but I do not think they are the majority at most CS schools. It is really possible to escape a university with a BS and MS without knowing how to develop and learn beyond your scope of knowledge. At our university, for example, we are supposedly taught C++, but I still find myself having to explain basics like STL containers such as sets and maps to 3rd- and 4th-year students. This is after they've run into bugs with something that would essentially take two or three lines using one of the above structures.

Another problem is that CS is often very, ahem, academic.

In the UK, we have the British Computer Society (BCS), who give accreditation. If you're doing a CS course without BCS accreditation, everyone knows it will be a bit suspect.

However, you've probably never heard of the BCS; they aren't exactly at the forefront of anything, and they are quite stuck in the past.

In a world where dynamic or at the very least JIT'd languages are the norm, I really doubt teaching Pascal or Delphi is in the best interests of the students. Yet in 2003 that is what I was taught for first year.

So the idea that a CS graduate will be able to go into software development immediately is a bit misleading. It's more a case of someone with a CS degree having demonstrated the aptitude to become a software dev.

tl;dr: it's education, not training.

I really do agree with your point, but it sounds like an excuse from academia. I can assure you that many students, starting their freshman year in college, expect to be able to build the cool things they want to build or be competitive in the job market after their 4 or 5 year stay is over. However, as you state, that is not necessarily the case, and the problem is the archaic nature of academia. You don't learn the things that many would define as essential knowledge in current technology.

In my university, for example, I have not once heard about design patterns or seen coding paradigms other than OOP. What I've learned is how to solve trivial or nuanced problems in poor ways, in languages not really suited for them. In job interviews, I've only gotten through because of things I've taught myself through projects I've worked on. So what's the point of an education if you have to teach yourself most things?

Autonomy within an organization sounds novel for morale, but it's a major shift from today's business culture. Strategy would also be incredibly difficult to execute. Organizational leaders need to be able to align their people down a common path.

If autonomy is the end-goal, start a company or go freelance.

It's a major shift I agree... but it's a great decision if executed correctly.

Every company has a set of main goals that imply that development can't be completely autonomous, but hiring good developers is not just about code it's also about people with a vision and a sense of purpose for your product. Take the way Github works for example: at the end of the day their main goal is an accessible hosted interface to git, to make git version control easier and ubiquitous; nevertheless having that main goal does not stop their developers from creating new features (say for example a desktop client) that make the product better though it may or may not directly translate to their main goal.

If you have a company that develops software, and your developers have the autonomy to work on things that they actually believe will increase the company's value (be it an interface change, bug fixes, or the refactoring of bad code), you will most definitely have a better result than if you make people work on things that might not be their forte or that are uninteresting to them.

Now I don't mean that people should have completely free rein over the product, as some things are too important to forgo because no one wants to do them, but giving developers the chance to work on things they actually believe in, as opposed to things that might not translate to creating value in their minds, can go a long way towards good morale and retention, and undoubtedly a better product because of it.

I agree that people who want outright autonomy will need to go off and start their own companies, but right now, typical employment terms for software engineers are outright bad in comparison to what they could be:

* mediocre compensation, especially considering the horrific costs of living in the places with the strongest economies.

* low autonomy, respect, and opportunity for advancement due to a scope-of-work defined to commoditize rather than grow engineering talent.

* low job security or transfer opportunity, which is the main reason for working for a large organization.

I believe that this arrangement exists because most software engineers lack the skills to negotiate for better terms.

Realistically, we need to improve organizational and independent employment at the same time. Market dynamics will corrupt one if the other is bad. Fixing one while giving up on the other is not a workable strategy.

How in any way would this benefit the economy as a whole?

Expensive labour provides incentives to automate it.

So? It can't be done, so forget about that. Just like you can't replace the executives.

First, the shortage of good programmers has more to do with the scarcity of high-quality work than a lack of access to knowledge. More autonomy for programmers means more high-quality work is done, which generates higher returns already but also means that programmers improve.

Second, if programmers take the decision-making roles back from smooth-talking executives, that means that smarter and more competent people will be calling the shots and society will run more efficiently.

> More autonomy for programmers means more high-quality work is done, which generates higher returns already but also means that programmers improve.

I disagree. The self-motivated path may be "more productive" for that one person, but it does nothing to guarantee returns or value for the larger organization.

Any employee should leave a company if they believe that their executive team is consistently shown to be inept.

I disagree. The self-motivated path may be "more productive" for that one person, but it does nothing to guarantee returns or value for the larger organization.

I don't think you've been in software for very long. Most software is terrible. Code and software quality are serious issues. Unmanaged complexity costs companies millions of dollars. This is a direct result of the low average competence of software engineers, which follows from their inability to get good projects on which they would improve.

If engineers had better working terms, there would be a much larger number of decent software engineers out there, and the productivity of the software industry would improve massively.

Any employee should leave a company if they believe that their executive team is consistently shown to be inept.

Bad idea. 95 percent of companies have executive teams that are either incompetent or malignant. If you have a bad immediate manager, leaving is a good idea. Leaving a company because it has worthless executives is not, because that's the norm. You'll just end up job hopping in the search for something incredibly rare. Better is to assume that most executives are harmful and minimize your career exposure to their incompetence and malice.

> Code and software quality are serious issues. Unmanaged complexity costs companies millions of dollars. This is a direct result of the low average competence of software engineers...

If the executives aren't aware of these facts, then it's a communication error. If they are ignoring the facts then there's little chance they would entertain a push for autonomy.

> ... You'll just end up job hopping in the search for something incredibly rare. Better is to assume that most executives are harmful and minimize your career exposure to their incompetence and malice.

Blame flows downhill, my friend. By accepting autonomy you're opening yourself to the risk of those actions. It's quite the opposite to minimization.

Conceptually, I like your ideas. Empowering smart people is rarely a bad decision. But attempting to change from within an existing behemoth, with their entrenched positions, is a far-fetched dream.

their inability to get good projects on which they would improve.

As someone who learned to code informally and is now looking for their first real software development job, I worry about this most.

> You don't need a CS degree to become a decent programmer.

You need one to get a job, though. If not now, in twenty years when you're old in an ageist industry.

If you can learn to program on your own and become proficient in it, build great projects, and keep up to date with the languages, why does it matter what major you are?

Granted I'm 20 and still in school, but just being a CS major doesn't make someone a great programmer. I like to think that employers care about projects you've worked on as that is the best way to see a true programmers talents.

Just realize that if you don't get a degree, you will always battle people who feel threatened by you. School becomes a core part of most people's identity, and they can be very protective of it in passive aggressive ways. People are told from a young age that people who didn't attend school are uneducated. At least that's been the case with me. I have a great career and no degree, and it's something I deal with constantly.

Yes, I'm in a similar situation and see the same reactions. People have a very irrational concept of post-secondary schooling. In many cases it's a very personal thing for them. Many have grown up thinking they'd be the first-in-family college graduate, holding that as a life goal, and listening to their parents and grandparents complain about how they never had the same opportunities as the college grads. Others have grown up with the expectation that they'll keep to the family tradition of graduates and not bring shame by failing to graduate. They've paid tens of thousands of dollars for a degree. They've invested four years or more of their life, and all of that is epitomized in the "degree", when they're patted on the back and told they're a real grown-up now. And many seriously take this whole process to heart, and the less real career value a degree has, the worse this degree snobbery gets. They feel their degree has given them everything (and honestly it probably has, since their personal commercial value is often negligible). It almost becomes an idol to them. It is, therefore, considered a heresy if you speak evil of that system, and heretics are not well received.

Because of all this social programming, there are few who will admit that modern undergraduate programs, and often their graduate counterparts, are a horrendously inefficient, slow bureaucracy that provides a form of pseudo-independence for developing adults, cater classes to the lowest common denominator in order to pass more students, and occasionally have desirable network-building properties. But that's the reality.

We tie up adult identity in degrees very, very closely in our culture. Someone who never graduated college (or worse, high school) is automatically considered lower class by many. It is considered an essential of both personal and professional development in the white-collar world. Their whole lives people are told, "Go to college so you can get a good job." Is it any wonder those who've "paid their dues" and gone through these motions feel entitled to employment, even if they have no commercially viable skillset (and, this is critical to understand: most don't)? It's becoming a large social issue, but no one is willing to admit the real causes.

Just keep quiet about it.

I don't include 'education' on my resume, and I don't mention it to people. I can't recall the last time I was asked. Everyone seems to assume I have a degree because everyone else in the field has one.

I am not saying that a degree is not important! I'm just saying that just being a CS major does not make one a great programmer and likewise, a great programmer need not be a CS major. I believe a history major can become a great programmer...as said in the video, learning how to program just takes determination.

I was an econ major and learned html/css in a few short weeks pretty darn well. At least enough to land me a web developer position with a rising startup. I have decided to change my major to a more technical one purely because I enjoy learning both HOW to program and what I can do with it--which I'm learning is just about anything :)

> I'm just saying that just being a CS major does not make one a great programmer and likewise, a great programmer need not be a CS major.

I'm emphatically not saying this, I'm saying that a degree is good for employment. Many places will outright refuse to hire people without at least a BA in computer science (or computer engineering, I guess? I am not aware of the related fields).

Thankfully, investors care more about the end code than degrees.

Your first statement was that you need a degree to get a job. Having a degree gives you access to more jobs, but you can get a job just fine without a degree.

> but you can get a job just fine without a degree.

I have a few relatives who are well versed in computers but cannot find employment. I'm sure they would be reassured by your comment.

Being "well-versed in computers" is not the same thing as being a valuable programmer. If you are a valuable programmer and you still can't find a job without a degree, you're doing something wrong (what you're doing wrong varies individually, of course).

I have no degree, and have never gone more than a month without a job offer while actively looking for a position. One time it took two months to find a position with acceptable compensation, but I'd received a low-ball offer in the first month.

What exactly is a "valuable programmer"? It's about as vague a notion as being "well versed in computers".

Edit: essentially, you just re-defined the bar without providing any real context.

Taken a step further (and perhaps in a different job market), someone might say "oh, well being a valuable programmer isn't the same as being a Rockstar programmer..."

The key lies in the word "valuable", which is completely situational and vague, whereas "programmer" is not.

I don't know exactly what a valuable programmer is, but I'll give you numbers regarding my situation, so that we talk about concrete things.

I do Python/Django work and I have 3 years of experience. I have a GitHub profile which I show when applying for jobs; it shows that I've made 359 contributions in the last year. I have a LinkedIn profile which details the technologies I use and the experience I have.

Since the start of the year, I think I've been contacted by 10 recruiters with job proposals. I chose to work remotely though, with clients outside of my country.

A valuable programmer, in this context, is someone who seeks to get paid by a company to write code and would have a significant (as far as individual employees go) net-positive influence on that company's financials.

So returning to the statement "...you can get a job just fine without a degree..." comes with an important caveat.

Namely, that you can get a job "just fine" if you're a programmer of sufficient talent and knowledge to be able to be a net-positive influence on a company's financials.

YMMV, but that's a relatively high bar. Higher, at least, than the ease that "just fine" (IMHO) implies.

It's certainly possible, but non-trivial.

I'm pretty sure the concept of employment in general is that any employee is going to supply ample net financial upside to the employer. If you have the skills to be employable as a programmer, you should not need a college degree to find work.

This is changing though. Even in just the last decade, people's attitudes towards the job/degree relationship has changed dramatically. If the trend continues, I expect in another decade or two more it probably won't even be a consideration, except perhaps where the law still requires.

But degrees are a very strong indicator to most companies that you will do what you're told, have good "work ethic", etc.

I dropped out of college because I was learning very little, at a slow pace, and mostly in areas not related to my major. I love to learn and work on interesting projects, but I've never been the type to do things just because an employer/school/society expects me to. And I've definitely had to pay for it in many ways...

Don't take this the wrong way, but if you are a CS major you have an opportunity to be grounded in the basics of the craft. If you are not covering SICP and algorithms, and implementing them in C, at your school, demand it or just stay home and work on those. You have an opportunity - work it - seriously, don't think flashy projects will help you here - take the education time to get educated.

It may be helpful to get you past the initial Catch-22, but if you can manage to get a few years of experience (let alone 20), I doubt the degree would matter much. Lots of job postings have "Or equivalent experience" after their degree requirements.

Thankfully you won't need one to start a company, and hire all of your CS-degree-less old decent programming friends.

Or when you find yourself in a new city, outside of your network of contacts.

There are more programming/CS jobs every year than graduating CS majors.


If pilots had hangars and hangars full of free airplanes to choose from, I'm sure they would.

What is stopping you from getting your own autonomy?

p.s - I did not read your post.
