Ask HN: How to reconcile “teach everyone to code” and “only hire the top 1%”?
75 points by ern on June 28, 2016 | 55 comments
The Brexit crisis has got me thinking about inequality more broadly. Technologists have politicians in their thrall, and are not afraid to throw their weight around promoting curriculum changes in areas like math, and to push the concept of "every child must code". Yet these same technologists claim that they hire only a small percentage of job applicants for their companies.

It's clear that cognitive ability correlates with coding ability, and the idea of the 10X developer is widely accepted in our world. As a broader industry, we practice shameless elitism and seem to be making real-world software development more inaccessible to non-experts (apart from child-level development environments). As professionals, we deride RAD tools and "drag and drop" development, and show geek love for ever-more abstract modes of thinking like Functional Programming that a small percentage of working developers, let alone the general public, will grasp.

What's the end-game? Are we trying to create a new cognitive elite? Is it a labor lottery, so that the small percentage of kids who turn out to be good at coding will become professional programmers? Are we willfully blind to the fact that human talent is not evenly distributed? Or is the lack of accessibility ("easiness") just a blind spot that we have yet to address?

I don't think those things need to be reconciled.

Not everyone hires the top 1%, only those that can afford it. And how many people consider themselves in the top 1%, anyway? Do you? I don't. The hubris of the idea of the 1%. That's what the hedge funds tell themselves when they put you through an eight-hour interview where you're asked to code sorting algorithms on a whiteboard: that they're seeking out that 1%. Larry Wall and Bjarne Stroustrup? Okay, 1%. A 25-year-old top-of-class CS or engineering student? I don't know, possibly? Okay, putting aside the snide remarks: 1% selection is not scientific, as we all know from hiring interview experiences. And moreover, are the people making the most money doing it writing software, or are they somewhere else in the IT company?

But to answer you: why shouldn't kids learn coding? It's one more notch on the belt. Takeaway: people have different interests, inclinations, talents and thoughts. For a young kid, you teach them the basics, you see what sticks. No reason why coding can't be taught young, even as part of a math curriculum. To be honest with you, I sucked at math, but perhaps if I had learned math in the context of a computer program, it wouldn't have been so bad.

Teach people coding. Also teach them to read the classics, do pullups, cook, what happened in the past, how old the rocks in the Earth are, what's in the oceans, what's outside of Earth's atmosphere, and everything else that we feel is important for a human being - not a future worker - to learn.

It's not about the companies that actually hire the 1%, it's a matter of how many companies claim to hire the 1%.

And more importantly, how come those who claim to hire the 1% never pay anywhere near what might be considered 1% salaries?

I've had small startups talk highly of how they only hire the best, the 1%, the 10xers, etc., then put me through five or six interviews, including the ever-so-common 8-hour whiteboard algorithm test, only to make an offer marginally higher than I could earn as a highly skilled blue-collar tradesman.

It's fine if a very select few companies with lots of money decide to have insanely high standards. The problem is when the rest of the industry thinks it deserves to emulate those companies' hiring processes.

There are certainly places that will pay top dollar to hire the absolute top talent, think Putnam or ACM winners. It just won't be a startup. It should be obvious why: startups don't have the capital, or the need, to hire the absolute best. It will more likely be a hedge fund.

As a whole these startups with ridiculous expectations will have to face reality or suffer the consequences of slow hiring. They will lose in the long run.

Most likely they will. Maybe they won't, because part of the process is a certain indoctrination that their employees are more than just talented or clever, but are actually special. And you can convince an employee of a lot of things if you start by convincing them that they're special.

Mostly, though, I worry about the ripple effect this has had on the industry. I have friends who are in the medical field, dealing directly with surgeries, childbirth, life-saving (or potentially ending) drugs, and they're taken aback by the kind of interview requirements that so many tech companies have. A Nurse Practitioner can get a job in an ER in a heartbeat, but a front-end developer building React components needs a collective two full days of interviews? Something about that just doesn't make sense to me.

A big difference is the amount of formal certification.

A diploma from a med school and associated residency is sufficient to expect that the person knows their speciality and is able to treat people from day one.

A CS diploma and related working experience is some indication, but (sadly) often doesn't ensure that the candidate is able to code their way out of a wet paper bag and will be able to perform reasonably well (not 10x well but 1x or even 80%) after a month or two of on-site adaptation.

Our industry, unlike many others, doesn't have a good (centralized?) way to approve qualifications, so each company has to verify on their own, which takes quite a lot of time and expense.

I agree, that's a big difference.

That said, comparing the dangers of a medical professional who screws up with those of an app/front-end/back-end developer whose code is sub-optimal, our industry should hire far more easily than it does, and simply let go the employees who underperform.

I think the problem actually doesn't start with hiring, it starts with the kind of "workplace-as-family-and-friends" culture that makes firing so taboo.

Firing is a very important skill for any line manager or business owner. It's not pleasant, and it's complicated in various ways, but not doing it properly, not doing it as soon as appropriate, or not doing it at all is a very expensive failure that hurts everyone involved. You either learn to do it properly and in a timely manner, or you're not qualified for the job yourself.

Your comment reminded me of cperciva's thread where he introduced Tarsnap - at one point someone challenged him about his braggadocio - https://news.ycombinator.com/item?id=35079

Also, it's neat to see Drew Houston comment in that thread. As far as I know he didn't win the Putnam, yet he has a net worth of around a billion, while cperciva is still working away in relative obscurity.

"Relative obscurity" is a matter of perspective. Within my fields, I'm far better known than Drew.

And that's the key difference here -- we have both been very successful in doing what we want to do. We just want to do different things, that's all.

I'm not trying to throw any shade on you, it's just one of the most memorable threads in HN history in my opinion and interesting to look back at. Cheers!

I doubt I'll ever get an opportunity for such a comeback again. But I do regret that I let my irritation at ivankirigin suggesting that I had a 90% probability of failure carry me away in that thread. I hope I've kept my patience better on HN over the past 8 years.

Why does teaching programming have to result in a programming-only job? That's like saying the end goal of teaching literacy is for students to become professional writers. Or that children should be taught to touch-type so that they are eligible to work as transcribers.

> Why does teaching programming have to result in a programming-only job?

At the last local Python meet up I went to I sat next to a sociologist who writes code for data analysis and visualization and a 3D artist who wanted to script up Blender. If social scientists and artists are programming, I'd say it is a useful skill across a broad range of disciplines.

I think the future of programming is multi-disciplinary. In a way it already is. If you know how to code well and you're experienced in another field that could benefit from it, there are an extraordinary number of opportunities out there.

Because it's marketed to parents and governments as "skills for the future" or "necessary for a modern workforce". Yet the "real-life" programming world in its current form is both elitist in attitude and genuinely difficult, and a minuscule percentage of kids will actually write code for work (barring a genuine effort to make it easier, and a change in attitude toward "amateur code" among businesses, developers and consultants).

Literacy is not vocational: it can be applied in many contexts; everyone uses reading all the time. Arithmetic is also plenty useful.

> Or that children should be taught to touch-type so that they are eligible to work as transcribers.

Touch-typing isn't widely taught anymore, is it?

> Touch-typing isn't widely taught anymore, is it?

I dunno? In my elementary school, it was part of the computer lab that we did on a weekly basis. Can't remember if it was part of an English class or a separate computing class, but we played games (e.g. Mavis Beacon type games) and kept score.

> Literacy is not vocational

Why should "Learn to code" be seen as vocational? I understand that's the way it is seen by the majority, but that's not the way it's seen by all advocates of "Learn to code". How did "Everyone learn arithmetic" become ingrained in educational systems?

> Because it's marketed to parents and governments as "skills for the future" or "necessary for a modern workforce".

It is, and you can get both of those things, because programming teaches you how to really think, and that skill is transferable to anything you do. It's akin to reading: a useful skill to learn no matter your eventual chosen profession. It doesn't have to lead to a programming job.

> Literacy is not vocational: it can be applied in many contexts; everyone uses reading all the time. Arithmetic is also plenty useful.

Programming is the same and belongs in that list with literacy and math.

Pretty much the only reason I can touch-type is from smack-talking people during Quake sessions.

> Touch-typing isn't widely taught anymore, is it?

If true, that is deeply bizarre. Touch-typing, much more so than programming, is actually useful in basically any non-menial job in the modern world. It ought to be as basic to the curriculum as math and reading.

I think they just call it "typing" now. I had to look up what "touch typing" means. There isn't really an alternative; a person has to know QWERTY to type, even when swiping on a smartphone.

The alternative is hunt-and-peck, aka "look down at the keyboard, find the key you want, and poke it with your index finger." I started programming at age 7 and wasn't taught to touch-type until 11 or so; you definitely don't need formal QWERTY training to input text at a pretty rapid rate. It's still not as fast as touch-typing, though, and being able to type without looking at the keyboard is a huge advantage.

Indeed, at the school I attend, over half the students taking classes in the Computer Science program are non-CS majors. This includes a lot of math, science, and engineering majors, but there are a fair number of humanities and arts students as well.

Studying programming and computer science is valuable for more than just programming; it teaches a useful way to problem solve and structure your thoughts.

> As a broader industry, we practice shameless elitism and seem to be making real-world software development more inaccessible to non-experts. As professionals, we deride RAD tools and "drag and drop" development, and show geek love for ever-more abstract modes of thinking like Functional Programming that a small percentage of working developers, let alone the general public, will grasp.

Yes. Look at what happened to HTML, which was supposed to be simple. Up to HTML 3.2, there were good WYSIWYG programs such as Dreamweaver which could do a good job of page layout. Then came CSS, div/clear/float layout, vast amounts of Javascript, and a mess so complex that HTML became only an output language for content management systems. Yet most of the pages look about the same.

The complexity of simple business applications has increased substantially since Visual Basic, yet most of them aren't doing anything that profound. (I occasionally point a finger at Soylent, which built an elaborate IT infrastructure for a site that averages about two sales a minute.)

The annual web infrastructure fads aren't helping. Some of the biggest sites run on rather vanilla infrastructure. Instagram runs on Postgres. Wikipedia runs on MySQL with Nginx caches. Do you really need NoSQL?

> became only an output language for content management systems

Sure, if you want a complex website and CRM. HTML is just as easy as it ever was, if not easier with some of the cleaner HTML5 tags.

> I occasionally point a finger at Soylent, which built an elaborate IT infrastructure for a site that averages about two sales a minute

Average isn't necessarily what matters, but peak. Do you think Amazon.com needs to design their systems around average load, or around potential load a few weeks before Christmas?

Your complaints don't really hold water.

His complaints relate to non-technical folks learning to code. In this respect, the HTML note is valid. You, a technical person, understand the nuance, but "easier than it ever was with cleaner HTML5 tags" won't resonate with folks trying to learn.

I think the end goal is to get more programmers out there so they can bring hiring costs down. That's what happened to other well-paid professions (e.g. law and pharmacy). The other strategy is to bring in more immigrants.

It's easy to get cynical about business and politics. But, sometimes people push for ideas not only because they can exploit the results, but also because they might be good ideas for lots of people in general.

1) It is generally expected that the skills involved in software development will be increasingly important for high-paying employment and the general advancement of the state of the world. That's not a claim that being specifically a "Software Engineer" is important. But already, "Nearly half (49%) of all jobs that pay more than $58,000 require some coding skills" [1]. The kid might grow up to have a job that involves analyzing a lot of data, or running a lot of simulations, or designing something that requires more aid than a pen and paper can provide (ex: synthetic biology). Either way, that kid is going to end up doing a lot of work that looks a lot like software engineering in practice -- even if it doesn't say "That Kid - Software Engineer" on that kid's business card.

Even beyond "some coding skills", it is recognized that the systems thinking, systems design and systems architecture inherent in software engineering are increasingly important for highly-paid technical work moving forward.

2) Very few actually "hire only the top 1%". If that actually was the case, 99% of software engineers would already be unemployed. The "hire only the top!" meme is more of a reminder that it's not often a good idea to go cheap and hire lots of low-cost warm bodies to fill out your project. Hiring cheap is the natural tendency of all management. So, they need a catch phrase to knock themselves out of that norm.

[1] http://www.content-loop.com/why-coding-is-still-the-most-imp...

Huh, that's an interesting way to look at it. You've convinced me not to deride people saying that.

The larger the number of programmers in the world, the larger the number of programmers in the top 1%.

I'm just a student, but as I see it, two things: enough companies don't hire only the top of the top; and most people fit in the top 10% for one company or another. One might not be a good fit in one place, but in another fit really well, and coincidentally have the right technology stack background.

And besides, someone who can code is (in my programmer's opinion) more self-reliant than someone who can't. You can start your own company doing something small that still helps a bunch of people.

Do you reconcile “Teach everyone to read and write English” and “Only hire the best writers”?

No, because as you write:

> Literacy is not vocational: it can be applied in many contexts; everyone uses reading all the time. Arithmetic is also plenty useful.

So the question is more: "is coding vocational?"

Rephrase that juxtaposition as "teach everyone to read" and "only admit the top 1% [to elite universities]". Now you have gone from potentially contradictory normative statements to an uncontroversial description of the educational systems in most countries today.

Public education was a bastion of American liberal democracy [0]. But adult literacy is not merely an egalitarian project. It was and is an important source of average labor productivity gains. At the same time businesses practice elitism where it, too, is consistent with the profit motive.

I'm not sure what's to be gained from hiring C+ English graduates to staff the New York Times, nor from hiring 55th percentile CS grads to bootstrap your next startup. Let a business hire the best employees for the job to maximize marginal productivity, and by all means keep teaching people to code where it can raise average productivity.


There is no objective top 1%. 99th percentile for one job might be 9th percentile for another.

Also teaching programming is more about technical skills and literacy than making literally everyone employed as a programmer.

I can teach someone literate in math and one language how to code, but I cannot do the opposite.

Outside the industry, coding is less useful a skill for an individual than knowing how to bake bread, brew alcohol, build things, do mechanical repairs, file your taxes, or read the law.

Sacrificing generic skills that make citizens more independent in favor of "ready to use" skills for industry is a trade-off nations are making that I don't grok.

What I see discussed much less than inequality is social mobility. Why precisely is inequality bad? I'd argue that it is bad in large part because it implies lack of social mobility: when the top 1% consists of the children of the previous generation's top 1% while everybody else is stuck trying to make rent, something is wrong with the society.

In this light more equal access to knowledge and education is a step in the right direction. And elitism, impenetrable jargon, arbitrary barriers to entry etc. are not. But, you imply that who gets past the barriers is determined only by "cognitive ability". What if it is not the case and the structure of society is responsible too? When your "cognitive elite" consists mostly of white males with specific backgrounds who went to specific colleges that's suspicious.

Now, when someone with a lot of money is out of investment opportunities with acceptable ROI and seeks someone to create those opportunities for them, that kind of thing promotes social mobility, but only up to a certain level.

Others have commented that basic coding education is akin to literacy or mathematical literacy. I agree, but also observe:

Coding teaches structured thinking. Basic coding (like lightbot-level or the Frozen/Minecraft Hour of Code exercises, which is where my 5 & 7 year old are) teaches skills every bit as valuable as thinking board games. We don't play Rat-a-tat-cat, Monopoly, checkers, chess, and go with our kids in hopes that they'll become grandmasters or even employable in the field. We do it because the elements of strategy, planning, adaptation, hard work, and overcoming initial obstacles are valuable thinking skills. As my kids get older, they'll progress into "harder" coding exercises, still not with a vocational angle. Even if they end up in non-programmer jobs, "programming" is going to become an ever more important part of most white-collar jobs in the future. VBA is programming. Excel macros are programming. Writing rules to filter your email is programming. I'd rather that not be mysterious to future generations, and I'll see to it that it's not to my kids.

"Only Hire the Top 1%" is something that I'd love to do, having seen the results possible when you get a dozen or so of the actual top 1% together. As it's practiced, most companies that think they are hiring the top 1% are probably hiring the top 3-5% and are hiring that not from the overall pool, but from the pool of people that walk through their door. That can quickly get you to "we're hiring from the top half of the pool, but we say we hire from the top 1%".

Imagine a pool of 1000 candidates and 26 companies with a divinely perfect interview system and rigorous "top 1%" standards. 10 will be hired by company A. 990 will be joined by 10 more applicants who left their job for whatever reason and apply to company B. 10 will be hired by company B. Lather, rinse, and repeat, and the top 250 (or ~25%) will have been culled from the pool before company Z even starts their process, yet company Z will still hire "the top 1 percent [of people they see]" The example is simplistic, but company A will get a much higher caliber workforce than company Z.
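The culling dynamic described above is easy to see in a toy simulation. This is just a sketch of the comment's hypothetical (the pool size, hire counts, and uniform skill distribution are the example's assumptions, and the "divinely perfect interview" is modeled as simply sorting by skill):

```python
import random

def simulate(pool_size=1000, companies=26, hires=10, seed=42):
    """Each company in turn hires the `hires` best remaining candidates."""
    rng = random.Random(seed)
    # Each candidate is a skill percentile in [0, 100); higher is better.
    pool = [rng.uniform(0, 100) for _ in range(pool_size)]
    avg_hire_quality = []
    for _ in range(companies):
        pool.sort(reverse=True)
        hired, pool = pool[:hires], pool[hires:]
        avg_hire_quality.append(sum(hired) / len(hired))
        # Ten job-changers of typical (uniformly distributed) skill
        # rejoin the market before the next company hires.
        pool += [rng.uniform(0, 100) for _ in range(hires)]
    return avg_hire_quality

quality = simulate()
print(f"Company A's average hire percentile: {quality[0]:.1f}")
print(f"Company Z's average hire percentile: {quality[-1]:.1f}")
```

Company A's hires land around the 99th percentile of the original pool, while company Z, hiring "the top 1% of people they see", ends up well below that, exactly the gap the example describes.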

That's why retention of strong employees (with financial and work/colleague means) is so critical, IMO. If you can keep your top employees from leaving and have a decent interview process, you'll end up with a good team. If you have constant churn, you'll have an awful team almost regardless of anything else you do.

How to reconcile "teach everyone to write" and "only give the top 1% a book deal"?

As with everything in the school curriculum, everyone should learn it because it is a useful way of thinking, a useful tool to know, and an important perspective for understanding the modern world; and to give everyone a chance to discover a talent for or particular interest in it, in which case it might lead some to choose it as a career.

Just because your todo list is not world literature, doesn't mean that your ability to write is useless, and just because your little VBA script (or whatever) is not a distributed fault-tolerant system that can process billions of transactions a second, doesn't mean that your ability to write programs is useless.

One aspect of teaching everyone to code is what Seymour Papert talked about in Mindstorms: it can be used as a medium for carrying ideas. For example, I think Mindstorms had this anecdote about a child who previously had trouble grasping the idea of classifying parts of speech (nouns, verbs, etc.). Later, she was writing a program for creating random nonsensical sentences, when suddenly it clicked, and it all made sense to her.

The point of this anecdote is not that every child should write such a program to understand parts of speech. It's about something much more general, but I have no idea how to express it briefly. I'm not even sure if I fully understand it. So, unfortunately, this idea might just be too subtle to be properly executed on a national scale.
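To make the anecdote concrete, a toy version of such a sentence generator might look like the sketch below (the word lists and grammar are invented for illustration, not taken from Mindstorms). The point is that the part-of-speech categories stop being abstract: the program only produces grammatical sentences if the words are classified correctly.

```python
import random

# Classifying words by part of speech becomes concrete when the
# program misbehaves unless the categories are right.
nouns = ["dog", "teacher", "robot"]
verbs = ["chases", "teaches", "builds"]
adjectives = ["sleepy", "purple", "enormous"]

def random_sentence(rng=random):
    # The grammar is encoded in the program's structure:
    # article, adjective, noun, verb, article, noun.
    return (f"the {rng.choice(adjectives)} {rng.choice(nouns)} "
            f"{rng.choice(verbs)} the {rng.choice(nouns)}")

print(random_sentence())
```

Misfile "chases" under nouns and the nonsense output makes the error, and the category it violates, immediately visible.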

Personally, I don't think those two statements are linked. The first is about education in general.

Yes, it's nice to have one more tool in one's cognitive toolbox. Like, it's nice to know some maths despite not using it professionally.

The second is just about hiring publicity. Given free choice, the top professionals often seek the hardest challenge in town, so it's a recruitment honeypot. There is no hiring methodology that actually figures out the productivity of a hire, but you can affect the population of hires. Eight hours at a whiteboard isn't necessary in itself, except to maintain the image of a worthy challenge, thus calling in a population with its siren song. It's not personal, it's just statistics.

We are not remotely close to teaching everyone to code. Only about 5% of high schools teach AP computer science.


20 times as many students take the US History exam. And you probably have to be even better than top-1-percent to get a job as a historian.

I'm not saying that everyone should turn into a software engineer, I just think programming is a very relevant subject to the modern world, and a curriculum in which everyone took computer science classes would be superior to the status quo curriculum.

I don't know that it can be reconciled, at least not without roughly defining a boundary between what "everyone coders" are supposed to do, versus what "real programmers" do.

Learning to code is all about picking up a shovel and helping your tribe accomplish something. What else are you going to do? Your tribe needs you to be 21st-century literate. Learning the basics of shoveling is not hard. But if you refuse to even pick up a shovel, there will be a problem. Not everyone who picks up a shovel will go on to become a master coder. But the ones who don't even try? There are no jobs left for non-technical people. It's the shovel or... what?

It expands their horizon.

Think of it like children learning a musical instrument (or taking up ballet lessons). Not every one of them would turn out to be musicians.

Hopefully, they grow up to have an appreciation for programming and things/fields related to it, as well as develop an additional perspective. They don't need to become programmers, or even continue to program as a hobby.

This is why I believe programming should be taught in school, early, much like art and music.

Perhaps some of the 1% might not realize they have an aptitude and passion for programming if they're not encouraged to try it properly?

Software engineering is genuinely hard. There's no need to add elitism to it. The field is already highly selective toward everyone who tries to enter it.

I'll share some numbers from my academic days pursuing a CS course. Out of an average group of, say, 100 students graduating with a CS degree, only about 3-4 actually had the aptitude to become serious, wise professionals. And even those would not necessarily take a programming job in the end; some didn't see good career prospects in it. The rest of the group would conclude it's just too hard and spread out to whatever alternative jobs they could find, IT-related or not.

The amount of computer-engineering talent is pretty constant and very low. You can encourage the general population to try it but that won't accomplish anything. Anyone with genuine abilities and an inclination towards this line of work already goes into that industry.

What is possible is that through indiscriminate encouragement of large masses to enter IT, you eventually get crowds willing to perform unqualified tasks for pennies, like building trivial smartphone apps, performing WordPress installations, etc. In fact, this is already a reality. I see people around me abandoning their careers and moving into IT. This is in part caused by the economic crisis in my country (Russia), which cut the value of salaries almost threefold, and lately everyone's been thinking about getting into IT to improve their earnings. I also see that people just stumble at the basics. As soon as they realize programming is not about dragging and dropping objects in a visual editor, and that you have to actually perform some intensive thinking, people give up on the idea and quit.

Ultimately, it will just make it more difficult to distinguish a professional in a larger crowd, but it will not make his/her services cheaper. In fact, it will probably cause more people to be burned by amateurs posing as professionals, and eventually they'll be more ready to pay a premium in order not to have to deal with amateurs and risk having their projects ruined. Software engineering services will then become more expensive in general.

Software engineering is hard and will remain hard. The amount of talent will also remain pretty small, and launching a thousand new programming schools is not going to change that. Smart people don't need any schools at all. For those who aren't smart, schools won't help them much.

As to what the end game might be, it's just too early to say. Society is transforming into a different kind of community. Right now it looks like people with average abilities will have a tough time landing a job as those jobs become automated and replaced by computer systems of some kind, whereas it will be easier for unqualified folks and for highly qualified ones. The middle might just see some very rough times. We'll just have to wait and see which way it goes.

I'm sorry, but it is simply not correct that "anyone with genuine abilities .. already goes into that industry."

Last year, I had the delightful experience of teaching a primarily non-majors intro to CS course. The top 10-15% of that class (of FOUR HUNDRED STUDENTS) could easily compete with our majors. Here's a bit of what they did, after one semester, with no prior programming experience:


There were a lot of students in that class who could become first-class computer scientists if they chose to. And they probably won't - they have other things they want to do.

But what a lot of them will do, now, is integrate programming into their careers as engineers, artists, scientists, and everything else, when it makes sense to do so. And they'll get a huge leg up on their competitors in doing so as computing continues to expand more widely into our digital and physical lives.

Add to this the huge number of people who don't have the opportunity to learn to program in the first place. This is a very US-centric view, but our high schools, by and large, are absolutely garbage at teaching CS. There are some stand-out exceptions, but most of our high schools do a great job of convincing most students, particularly those not of the "geeky young white male" persuasion, to stay the hell away from CS. (I say this as a formerly young geeky white male who was actually convinced by his HS experience not to go into CS, until switching back into it late in college. I finished a biology degree in the meantime until I realized that CS was far more awesome than my abysmal high school experience had suggested.)

The combination of deep domain-specific expertise with reasonable programming abilities is very powerful, because for the right problems, it acts as a huge force multiplier and/or eliminator-of-tedious-crap.

Together with the fact that it's a useful mode of thinking, this is why we should be teaching more people to program.

You are quite correct on all your points and I agree with you.

I suppose I could have expressed my idea better. What I meant is that for people with a programming aptitude there are no barriers these days. Whether they want to jump directly into software engineering, or pursue another field and use their knowledge of programming as an asset, for both choices the road is open. There is a ton of information freely available for those who wish to learn. If one wanted to get an official CS degree, that's also easily arranged; university departments in that specialization are under-filled. At least that is so in Europe. For some reason the locals prefer economic/legal/business degrees, and engineering isn't terribly popular. Can't say anything about the situation in the US, though.

You don't even need to be a good programmer to make 70-80k.

You can be a really bad one; that's still upper middle class in much of America, and not too shabby even in high-priced cities like LA.

> How to reconcile “teach everyone to code” and “only hire the top 1%”?

The logic is simple: Everyone needs to be a coder if you want to fill all coding jobs with top 1% coders.

Teaching everyone to code is meant to help the people who use a spreadsheet for everything.

Letting them know a bit about databases would avoid some scary uses of spreadsheets.
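To sketch what "a bit of databasing" buys you over a spreadsheet, here's a minimal example using Python's built-in sqlite3 module (the `sales` table and its figures are invented for illustration):

```python
import sqlite3

# In-memory database standing in for a sheet with one row per sale.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("north", 120.0), ("south", 75.5), ("north", 60.0)],
)

# One declarative query replaces a column of hand-copied SUMIF formulas.
totals = dict(
    conn.execute("SELECT region, SUM(amount) FROM sales GROUP BY region")
)
print(totals)  # {'north': 180.0, 'south': 75.5}
```

Unlike a spreadsheet, the query keeps working unchanged when rows are added, reordered, or deleted.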

Coding is a great hobby. Not everyone has to do it as a job.

Many good comments here already. One thing about the "computer literacy" idea -- I think that's elevating the standard a bit. I prefer "computer use", as in "we need to enable people to use computers". Sure, "computer literacy" covers that, but I think it obscures how bad people are at using computers -- even young people who have grown up in a world surrounded by them.

I routinely work with people that can't leverage a spreadsheet as a better calculator - and prefer the "calculator app" for help with arithmetic. Now, I know that it's easy to fall back to using the first thing/tool we know - a hammer for every screw, so to speak.

But I think part of teaching how to actually use computers should also be teaching people to look for the better tool, or to be able to make one. Much like how Bram Moolenaar, creator of the vim editor, talks about constantly looking for things we do while editing that are repetitive and could probably be improved (either by discovering/looking up functions that are there but we don't use day-to-day, or by scripting) -- so it is with all things we do on a computer. It's in many ways the ultimate tool for data and information: everything can be (more) automated, if we have the right mindset and mental tools.

In my mind, that's what "computer literacy" is about, and basic programming is absolutely a part of it. While I'd prefer people to know how to write a Python script to find their way out of a wet paper bag, even if we just enable 80% to actually automate spreadsheets, that could have a huge impact on overall productivity. Because despite the horrors many HN'ers have probably encountered in terms of Visual Basic, Access, and Excel -- still far too few people are able to make such horrors in order to do everyday tasks, like sane shift scheduling.
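A minimal sketch of that kind of everyday automation, using only Python's standard library (the shift data below is invented):

```python
import csv
import io
from collections import defaultdict

# Stand-in for an exported shifts.csv with columns: name, hours.
sheet = io.StringIO("name,hours\nana,8\nbob,6\nana,4\n")

# Total the scheduled hours per person -- the kind of tally that
# otherwise lives in a fragile column of spreadsheet formulas.
hours = defaultdict(float)
for row in csv.DictReader(sheet):
    hours[row["name"]] += float(row["hours"])

print(dict(hours))  # {'ana': 12.0, 'bob': 6.0}
```

Swap the StringIO for `open("shifts.csv")` and the same dozen lines handle the real export.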

I think we need a bit of a revival of the ugly, everyday scripting and maybe even so-called "4GL" languages -- because while they are often the wrong tool when you want to make a shrink-wrapped product for others to use (maybe even something you could sell), I think they are often very much the right tool for thinking, experimenting, and making certain tasks easier and less error-prone.

As a side note, a sane spreadsheet-like interface with a sane programming language would probably be a good idea. R does this to a certain extent, but I'm not sure I'd want to try to use R to plan shifts, or to do a number of arithmetic things like manage my budget or calculate material use for a circular staircase and so on. I'd probably prefer something like Python or Ruby coupled with something a little snazzier than a CSV file (ok, I would totally prefer grep, awk, and a text file -- but I think we should be able to do better).

>The Brexit crisis

It's not a crisis. They didn't want to be a part of bureaucratic corrupt system which never gets anything done for them, takes from the poor of the rich countries and gives it to the rich of the poor countries. They became independent of it now. It's best for everyone, except for the very few who are leeching off of it.

>every child must code

Yes, it's in their interests to have more coders in the market so that they can pay them less... and also in the interest of the government if they don't have to depend on foreign manpower. You don't need to fall for it.

>they hire a small percentage of job applicants for their companies

Why should they hire anyone who is not the best? Especially when hiring the best is not much more expensive than hiring an average developer? Let's say you set out to make a new product. Why would you hire anyone but the best among the people who are willing to work for you? Also, the world doesn't need everyone to be a developer. At some point the market will be saturated enough that it can't pay even the good developers... which is exactly what the tech industry wants, and for good reason.

>seem to be making real-world software development more inaccessible to non-experts

No, we don't do that. It is more accessible than ever to non-experts -- be it through PaaS or the plethora of tools, tutorials, books, and other resources available.

>we deride RAD tools and "drag and drop" development

We definitely don't do that without reason. They have been tried and tested, and have failed to deliver in the long term: changes are difficult to track, and building on top of someone else's work is painful.

>ever-more abstract modes of thinking like Functional Programming that a small percentage of working developers, let alone the general public, will grasp

To a non-programmer, Functional Programming makes WAY MORE sense than assignments and looping. Also, we didn't adopt Functional Programming to exclude people who have been trained otherwise. We did it because it makes code easier to reason about and to scale.
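A toy illustration of that contrast, in Python (my own example, not from the thread):

```python
prices = [3, 10, 7]

# Imperative style: mutate an accumulator step by step;
# to understand it you re-trace every assignment.
total = 0
for price in prices:
    if price < 8:
        total += price

# Functional style: describe *what* you want -- the sum of the
# prices under 8 -- with no mutable state to follow.
total_fp = sum(filter(lambda price: price < 8, prices))

print(total, total_fp)  # 10 10
```

Both compute the same thing; the functional version reads closer to the plain-English description of the task.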

>Is it a labor lottery so that the small percentage of kids who are turn out to be good at coding will become professional programmers?

Yes. A small percentage of children who turn out to be good at math class will become professional mathematicians. A small percentage of children who turn out to be good at sports will become professional athletes. A small percentage of children who are good at cooking will become professional chefs. Same with everything. Why should it be different when it comes to programming?

The ONLY thing you need to do is make yourself valuable to someone else, so that they are willing to share a part of their income with you. If everyone did the same for themselves, everyone would create value for each other. Unfortunately, people are so focused on fixing others that they end up harming everyone's lives.


Personal jabs are not allowed on Hacker News. Please post civilly and substantively, or not at all.

We detached this comment from https://news.ycombinator.com/item?id=11991618 and marked it off-topic.
