For my social circle it's not remotely in the obscure category. I suspect that none of my friends or coworkers would have to look it up on Google...
This is what a liberal arts education is for.
Of course I realize that not everybody is lucky enough to get (or even wants) a liberal arts education, but the social/educational structure of America is not something I'm looking to attack in this comment.
I also have a liberal arts education, but those fruitful years were reserved for prevaricating on and dissecting various obscure and convoluted logical arguments related to phenomenology.
Thus, what passes for a "standard high school education" in the United States varies wildly -- in some schools, you can graduate without even being able to read and comprehend a magazine article. In others, the top graduates often find Harvard less intellectually rigorous than their high school (because Harvard has to help its students from worse high schools catch up).
If you assume everybody on HN is an American, maybe.
I also attended a French school, where it was covered, and I've had at least one conversation with my Norwegian family in which it came up.
(That is, of course, based on the overwhelming likelihood that these aren't actually Bret Victor's words.)
The post conveys the false idea that mathematical purity/simplicity and ease of use sit at opposite ends of an imaginary spectrum. If anything, the optimum lies at their intersection.
That is, the ideal tool should be mathematically simple and sound, while still being intuitive.
We should also admit that we do not know of a good, reliable method for writing good programs, and this fact is completely independent of the programming language used. Languages have different properties: some are beginner-friendly, others scale well to large systems, some do both; but in all cases, what's valuable is the thinking that leads to the program. In another commenter's words: formalizing murky ideas.
Programming, as a fundamental activity, simply doesn't scale. Becoming competent takes many years of training, while software can deliver a "braindump" of best practices ready to use. For instance, Excel is used for a lot of business analysis. Business analysis, you say? How are 65,536 rows sufficient? Excel now has an engine that can process large amounts of externally hosted data using columnar compression. A competent programmer with years of training may be able to build this themselves, but they will not be able to compete on price with a person who, armed with half an hour's tutorial, can perform the same task in Excel.
Yes, there are monstrosities and errors with Excel (http://www.businessweek.com/articles/2013-04-18/faq-reinhart...). The question is whether these errors are any easier to discover in a computer program.
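The Reinhart-Rogoff bug was, at heart, a formula whose hard-coded range silently excluded rows. A minimal sketch (hypothetical data and names) of why the same class of mistake is at least testable in code:

```python
# Hypothetical per-country growth figures; the real data set is much larger.
rows = [
    {"country": "A", "growth": 2.0},
    {"country": "B", "growth": 4.0},
    {"country": "C", "growth": 6.0},
]

def average_growth(data):
    """Average the growth column over *every* row, whatever the row count."""
    return sum(r["growth"] for r in data) / len(data)

# A spreadsheet's =AVERAGE(L30:L44) hard-codes a range; add rows and the
# formula silently ignores them. Here the aggregation is over len(data),
# and a one-line test pins the result down:
assert average_growth(rows) == 4.0
print("rows included:", len(rows), "mean:", average_growth(rows))
```

This doesn't make the error impossible, only visible: a buggy slice like `data[:8]` would fail the assertion instead of quietly shipping.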
The reason we aren't seeing more software like this is that programmer-types don't think very hard about non-programmers' programming problems. I hope that, with Jon's and Bret's work, we push the boundaries further and make computer programs easier to modify and easier to reason about (note: not just easier to write). In both Bret's and Jon's work, you don't avoid writing code. You just find it easier to figure out what effect a code change has.
I saw the comment about using a traffic cone as a mutex. Again, this is a symptom of the dismal state of the tools rather than of the idea behind them. Visual SourceSafe still performs pessimistic checkouts. Imagine if the entire source code of a project were in one single file: better tools would evolve to find code and to let many people modify it at the same time. We already know how to lock a database and provide MVCC. One day, programming will look very similar.
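The cone is, in effect, an advisory lock enforced by humans. The same discipline takes only a few lines of stdlib Python to enforce in software (the file name and timeout here are made up for the sketch):

```python
import os
import time

LOCK = "shared.accdb.lock"  # hypothetical lock file beside the shared database

def acquire_lock(path, timeout=10.0):
    """Poll for an exclusive lock file; O_CREAT|O_EXCL makes creation atomic."""
    deadline = time.monotonic() + timeout
    while True:
        try:
            fd = os.open(path, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
            os.close(fd)
            return
        except FileExistsError:
            if time.monotonic() >= deadline:
                raise TimeoutError("could not acquire lock: " + path)
            time.sleep(0.1)

def release_lock(path):
    os.remove(path)

acquire_lock(LOCK)
try:
    pass  # edit the shared file here, cone-free
finally:
    release_lock(LOCK)
print("lock released:", not os.path.exists(LOCK))
```

Unlike the cone, this fails closed: a second writer blocks or times out instead of clobbering the file.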
I'm actually a lot more frightened to be called for the cleanup...
No; I have to object to this specifically. Normal people can't "get shit done." Not really, not successfully. People, when you let them build their own tools in Excel and Access and what-have-you, end up with rosters full of incorrect and incomplete and invalid information, things split into so many different places that nobody can find anything any more, and interfaces that require inexplicable cargo-cult rituals and avoiding otherwise valid input states to use. Their stuff works 95% of the time; they just aren't used to a world where "5% failure rate" means "silently and consistently eats any customer's file if their name has a ç in it" instead of just "has to be restarted every hour."
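The "ç in the name" failure is almost always an encoding mismatch between two components. A hedged two-line sketch (names invented) of how it silently eats a record:

```python
# One tool writes the customer's name as UTF-8 bytes; a second tool
# reads those bytes back assuming Latin-1. The key no longer matches.
name = "François"
stored = name.encode("utf-8")     # what component A writes
seen = stored.decode("latin-1")   # what component B reads back: 'FranÃ§ois'

records = {name: "invoice.pdf"}
print(seen)                # mojibake, not the original name
print(records.get(seen))   # None: the customer's file has silently "vanished"
```

No exception is raised anywhere, which is exactly why the 5% failure is silent rather than a restart-and-retry crash.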
The thing programmers do--it isn't using arcane languages, recognizing mysterious error codes, memorizing APIs or libraries. We aren't here just because the difficulty of typing "/[A-Z]+?/" grants us job security. The thing programmers do--or more generally, the thing Engineers do--that other humans need us for, that machines can't do for us (yet), is to formalize murky ideas.
People who don't have training in Engineering have no fucking idea how to go about doing this. You know the old joke of there being a button labelled "Do What I Mean"? It's a joke because 99% of the people who would press that button don't know what they mean; there would be no coherent thing for the button to do--even if it could read their thoughts--but to interrogate them for hours to get them to decide what they really want.
The few of us--the Engineers--we can sit down, and without any further prodding, think out what we want to have happen. Then it's a simple matter of just writing it down. The compiler, the language, none of those are really problems, compared to knowing what you want to happen.
Note that Bret's talks and essays are focused, by-and-large, on the iterative rapid-prototyping model: using computers as tools to help us explore options, so that we can more quickly figure out "what we want." But even if you know that you want a Sudoku solver, you can't iterate out a Sudoku solver. You have to know that what you want to do is to put these constraints, in this order, on these numbers--and that's an algorithm. People--not us Engineers, but people--they don't understand algorithms. You need an Engineer to take the statement "I want a Sudoku solver" and formalize that into "I want to use this algorithm." Just like you need an Engineer to take the statement "I want this bridge-support anchored to the riverbed here" and translate that into materials and rigging that will take the tensions and stresses without shearing.
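To make the "formalize 'I want a Sudoku solver' into an algorithm" point concrete, here is one standard formalization (backtracking over the row/column/box constraints; this is a generic sketch, not anything from Bret's talks):

```python
def valid(grid, r, c, v):
    """The three Sudoku constraints: row, column, and 3x3 box uniqueness."""
    if any(grid[r][i] == v or grid[i][c] == v for i in range(9)):
        return False
    br, bc = 3 * (r // 3), 3 * (c // 3)
    return all(grid[br + i][bc + j] != v for i in range(3) for j in range(3))

def solve(grid):
    """Backtracking search: try each value in each empty cell, in order;
    undo and retry on dead ends. Solves the 9x9 grid in place."""
    for r in range(9):
        for c in range(9):
            if grid[r][c] == 0:
                for v in range(1, 10):
                    if valid(grid, r, c, v):
                        grid[r][c] = v
                        if solve(grid):
                            return True
                        grid[r][c] = 0
                return False
    return True

puzzle = [  # 0 marks an empty cell
    [5, 3, 0, 0, 7, 0, 0, 0, 0],
    [6, 0, 0, 1, 9, 5, 0, 0, 0],
    [0, 9, 8, 0, 0, 0, 0, 6, 0],
    [8, 0, 0, 0, 6, 0, 0, 0, 3],
    [4, 0, 0, 8, 0, 3, 0, 0, 1],
    [7, 0, 0, 0, 2, 0, 0, 0, 6],
    [0, 6, 0, 0, 0, 0, 2, 8, 0],
    [0, 0, 0, 4, 1, 9, 0, 0, 5],
    [0, 0, 0, 0, 8, 0, 0, 7, 9],
]
assert solve(puzzle)
print(puzzle[0])
```

The interesting part is precisely what the comment claims: the choice of constraints and the order they're applied in. Nothing in "I want a Sudoku solver" dictates any of the above.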
Neither COBOL, nor SQL, nor HyperCard, nor AppleScript, nor Inform, nor any other "human-friendly" language, ever served to allow anyone to express a clearly-defined thought they wouldn't have been able to express in a language with more "punctuation-y" syntax. Learning what "<=" or "&&" represent is a small, fixed cost at the beginning of attempting to solve problems. Learning what they mean--what the formalism is there, and what its consequences are--takes years, and requires that you think like an Engineer. That step is required whether you are using C, COBOL, LabView or Prolog.
To get shit done, you must first know what done shit looks like; perfect, robust, unbreakably done shit. And only Engineers really do. Just like how people who aren't artists, if asked to draw someone's face, will end up drawing the abstraction of a face; people who aren't Engineers, if asked to formalize a system, will end up describing the vague, hard-AI-complete, "and then it just does what you'd expect okay!?" system that, in practice, isn't a system at all.
The thing that programmers do is the exact same thing that your post says "normal people" do. Programmers build buggy systems that lose data because someone's name has a ç in it, and that fail 5% of the time (or more). Rarely do they understand algorithms (and especially not their complexity characteristics). Very often they build the wrong thing, even if they get it done. Few have a formal education in anything resembling engineering. And very often programmers are indeed getting shit done; it just turns out that what they got done was shit.
We have to stop putting programmers on a pedestal. There are great programmers and there are bad programmers. Just because someone slings some code doesn't make them a bastion of clear engineering practice. Many programmers that I've met would benefit greatly from the kinds of systems that Victor describes.
However, let me put my argument another way: the ability to work with you to formalize your idea is what you're paying for when you hire a programmer. You're paying for a human compiler, a fully-intelligent REPL where you can give vague commands like "make me the next Facebook" and, through the programmer's Engineering knowledge, they will ask you questions and force you to make choices, until they've turned that informal idea into an actual capital-S System. Inasmuch as they have that Engineering knowledge, the resulting System will be a formalization of your own desires. And then, having the formal System, the programmer can go and implement it. That part is comparatively trivial, and increasingly done "by" the software compiler. Victor is noble for driving us further toward trivializing that part of the process, but it is only part of the process.
But anyway, back to your points:
Most "programmers" aren't programmers. If you're only as good at programming as a member of the general population, then you don't get a special job title, right? Otherwise, I would be right now a writer, actor, landscaper, game-designer, philosopher and life-coach, as well as a programmer. But in only one of those fields could I actually make more money than some schmoe who just decided to jump into the field last week, and the reason I'm "more of a programmer" than Bob-the-Accountant is a programmer, is that I can do the Engineering part better. If Bob-the-Accountant started calling himself a programmer, he'd be a white belt programmer--someone who just joined the Art, and must still unlearn their preconceived knowledge before they may begin--while my belt, at least, would have some color on it. Plenty of people calling themselves programmers are white-belts. That doesn't mean we should consider them when we speak of "the thing that programmers do." It would be like including people who write their own legal contracts when speaking about "the thing that lawyers do."
So yes, if we're going to keep using the word "programmers" to refer to both the Engineer journeymen and the white-belts who produce misfeasance with their every step, then we should stop putting "programmers" on a pedestal. After all, what use is a pedestal where the people on it are exactly as high up as the general population around them?
So if you're paid to write programs you probably aren't a professional programmer? What's wrong with an operational definition?
There's a field with a similar problem to our own: public-school teaching. Teacher performance isn't quantified, and so teachers can pretty much get away with only being as good at imparting knowledge as any random member of the population--even though they took years of education-in-Education. Most "teachers" aren't teachers, any more than I'm a teacher; they're simply people paid to repeatedly attempt (and fail) to be teachers.†
We pay a lot of people to repeatedly attempt (and fail) to be programmers.
† Teachers' unions are right now fighting the introduction of actual metrics on how the year-over-year velocity of a student's achievement is affected by a teacher, relative to the average of their peer teachers in the same school. They are fighting this because the data clearly shows a bimodal distribution, where a lot of people are just extremely unfit to be teachers--their students actually reach negative learning velocities from their presence in the class--and until now, the unions protected these people, since there was nothing to prove they sucked so very much. It's hard to go back on your decision to defend someone when you later find out that they're indefensible.
(Apologies for the totally off-topic response to your only mostly off-topic comment.)
We would unionize and be fighting metrics too, if they tried to measure how good a programmer you were with the number of goto's you use.
I like where you were going on programming though.
However, if you believe that there is any empirically-detectable property that makes someone a better programmer when they have more of it, then "programmer" is a natural category, not an artificial one. It's something where, if we washed away the word for it, we'd end up re-creating the word, as a handle to describe that obvious cluster of things which are unlike other things but like one another. Being a programmer has phenomenological consequences--you can determine who is or isn't a programmer using games or tests which don't have anything to do with programming trivia, and without mentioning that "potential for ability to program" is what you're testing for.
In any natural category, you'll have false negatives and false positives: things that are identified as X but don't have the property that puts them in the X natural-category, and things that aren't identified as X, but which do have the property.
There are many False Programmers. There are also False Not-Programmers: people who don't think, or know, that they're programmers, but who are nevertheless. This is true of every natural category. There are people who think they can sing but can't, and people who think they can't sing but can.
There are professional singers who can't sing, even though they "are" singers. When we say "can sing", we imply the edifice of a market, and competition; we really mean "can sing to a level where we'd pay them more for their singing than a randomly-selected member of the population." In other words, they "can sing objectively-well."
There are people who "can program objectively-well." They are, in the terms of the natural category, both the True Programmers, and the False Not-Programmers. There is no fallacy at work.
From the linked homepage:
>> We (Saeed Dehnadi, Richard Bornat) have discovered a test
>> which divides programming sheep from non-programming goats.
>> This test predicts ability to program with very high accuracy
>> before the subjects have ever seen a program or a programming language.
You just made his point.
All the downsides you mentioned don't matter in the real world and for those people. They are only appreciated by programmers like you and me (and mostly the kind with a slight OCD).
But the upsides TFA mentions are very real: stuff that took them days now takes minutes or hours. They couldn't give a rat's ass if it's not DRY, if it doesn't handle corner cases, if it expands into 20 ifs when a range check would suffice, etc.
I'm not talking about any of that.
I'm talking about things like actively losing track of customers because they were saved to a separate file that got saved over on a network share. I'm talking about billing people two or three times because there isn't a single place to check to see if they've already been billed. I'm talking about being on the phone with a customer service representative who can't authorize anything because your account is in an indeterminate state, and they have to check with management--and your account will never get out of that indeterminate state, so every five-minute conversation you ever have with them will become two days long, as you wait for them to call back the next day. I'm talking about going back to paper because half the time the information just isn't in the database, or is too wrong to rely on. I'm talking about having to hire clerks just to manually print data out of one system and type it into another, because Bob from Accounting "got shit done" without having ever heard of this thing called "networking."
People trying to automate things, without first having a formalized understanding of what the process is that they want automated, will cause business-impacting failures. There's never been a single time where it hasn't, in my years of dealing with this as an employee, a contractor, or a consultant.
Ah, those things weren't done by the kind of Excel-wielding people the article talks about.
Those mistakes were made by programmers proper. With CS degrees and everything.
It's not the small mom & pop or mid-sized company that usually bills people two or three times -- those knew how to bill even before Excel.
More often than not, it's the multi-million-dollar enterprise crap large corporations use, with 400 options and convoluted procedures. I mean, I've been double-billed by the utility (electricity) company, and that's surely not due to the bill being in an Excel file.
I've worked at a few unnamed places, medium to huge, that used Excel and Access in horrifying ways. One place, with 60000 employees, had an 'editing cone' which was an actual traffic cone that you had to have in your cube if you were writing to the Access file on the SMB drive. During my time there, one person ran their Excel script sans cone and a bunch of people didn't have to pay their bill that month.
I LOVE this image. Acquiring a physical lock on the file. Think of the manager who thought of this beautiful idea, probably not a programmer by training. Awesome.
And yes, since it relies on human conformance, it's bound to fail on occasion. You can say the same thing about any piece of software you ever came across or wrote.
(this was a pretty great idea from the manager though)
Just... absorb that for a few seconds.
The chap on the next row of desks runs 300+ full time students (course success tracking, attendance, nett funding per student &c) on a fairly large Excel spreadsheet. He owns it. He knows it. His colleagues feel that they can edit the data for their students over the shared drive. Works for them.
In the UK the funding methodology for Skills for Life funding draw down is, shall we say, complex. Median funding is £3k per student, so 300+ of those is not far short of a million. The students in question are also subject to monitoring by three other agencies, with overlapping data requirements. The cost to produce an application that embodied the business logic for this edge case provision would, I imagine, be quite high.
It works for us. If it breaks, it can be fixed.
In a previous paper, Chlond (2005) presented the formulation of Sudoku as an integer program. Chlond claims that a spreadsheet formulation is not straightforward, but we present a simple Excel formulation. In light of the current use of Excel in the classroom, it can be highly instructive to formulate the problem using Excel. In addition, the formulation of this relatively simple model enables instructors to introduce students to the capability of programming Solver using VBA. In a follow-on paper (Rasmussen and Weiss, 2007), we demonstrate advanced lessons that can be learned from using Premium Solver's powerful features to model Sudoku.
Which is funny, because that's almost exactly what Bret suggested as a better way for less-technical people to create information-display software in an essay from '06:
In short, to take examples given by the user and extrapolate them, letting the computer do the formalizing, and having it ask for clarification on unclear points.
The rest of the essay is absolutely worth a read, by the way.
But the full solution requires hard AI, really. "Build me the next Facebook" can't be extrapolated into anything useful unless the software itself can dream of what the "next" Facebook would be like. A human programmer probably already has those dreams on offer.
Also, I never said one needs training to be a programmer. As far as I've seen, it's a perfectly natural (or nurtural, whatever) talent, that one then hones over time. The "programmer", selling their work as a programmer, is a false positive: someone who is in the term, but not in the natural category. The admin assistant, serving as their own client and creating a System to suit themselves, is a false negative: someone who isn't, nominally, a programmer, but is in the category.
If you can formalize an idea into something that Works, you are an Engineer. Nobody needs to hand you a certificate; you don't need to call yourself one, or even know you are one; you just are. It's a detectable, testable property of your mental architecture.
The problem is that nobody ever told this to some of the people trying to make their livings as programmers. They're like portrait artists with dysgraphia, except that, unlike people with that condition, they are the majority of humanity. Actually, let's take that analogy further; it seems sound:
1. Let's say 90% of the population is dysgraphic;
2. but "portrait artist" is a highly-compensated, "in-vogue" field;
3. additionally, the client has no idea how to judge the portrait (maybe an independent 90% of the population is also blind), so any flaws in it won't show up until it gets exhibited several months later;
4. and (okay, this is getting a bit ridiculous, but I'll keep on with it) most portraits are the works of several portrait artists, so it's hard to say who caused a given flaw.
If all these things were true, the average portrait artist's ability would be entirely illegible--you couldn't judge them on results, nor on past performance. This would encourage a market for lemons. Additionally, the set of (people with dysgraphia & people willing to lie and say they can paint) would, just by numerical advantage, outweigh "people who can actually do their jobs" in the portraiture market. It would do to be extremely skeptical.
But still, there would be false negatives; people who never even considered portraiture, but aren't dysgraphic. Maybe at one point a friend of your admin-assistant asks them to doodle them for a newsletter, and they produce something that Actually Looks Good. Surprise!
Usually, though, the nose will be on the forehead.
Just knowing what you want isn't enough. Language matters because you don't want to solve the same problems over and over again. Have a look at Dan Amelang's work on the Nile renderer, and then see if you can still say that language doesn't matter.
What is the smallest set of orthogonal abstractions that, when combined, yields the explicitly desired behaviors, along with the implicitly necessary ones?
Do you really believe that it _should_ be impossible for an average person who desires a sudoku solver to get one without any engineering knowledge?
The spirit of Bret's talk, and even this response, is of thinking broadly. The question is not "is this possible", but "should this be possible".
1. have a database of known algorithms, and map-reduce out the ones that produce the most signal for your data-set (this isn't Hard, but it requires a globally-networked language-neutral ABI-neutral algorithm repository and a free-use cloud compute cluster to run the heterogeneous algorithm-tests on), or
2. expect the computer to invent a novel, efficient (or at least polynomial) algorithm in response to your data-set on the fly. This is a Hard problem--since solving it basically means that computers can now take the jobs of Mathematicians in proving novel theorems. I don't think that's "impossible" either--obviously, Mathematicians are performing some describable algorithm in their heads to come up with novel proofs--but it's likely a Big Data problem in the same way most AI problems have turned out to be; not something you can ask your workstation to do.
People don't know what they don't know. And making things look easy seduces them into thinking that "looking easy" is the same as "is easy." Jon Livesey, an engineer at Sun and later SGI, used to quip, "It is easy to say, like 'largest integer' is easy to say, but actually pointing out what that is, now that is a different story entirely."
Engineers recognize when a detail is missing and ask about it. Annoying as hell to people who "just want it done" but essential to the task at hand.
Get out of people's way and stop telling them they're stupid, and they will fucking amaze you.
But if nobody tells someone they're stupid after they've proved repeatedly to be stupid--if we, as a culture, are too nice to leave bad reviews of bad work; if we overlook that time that Bob's excel sheet cost the company five days of downtime, and how we had to hire three extra interns to do redundant data-entry for it--then Bob might think people should be paying him a professional programmer's salary for his time. Bob might put out his shingle as a programmer. And now, the market has one more lemon.
There are some things which require real, natural (or nurtured-in, at least) talents. Singing, for example. Everything I've experienced in dealing with other programmers tells me that programming skill, in the end, is just an outgrowth of the ability to think logically, systematically, and formally--and that these abilities are part of your mental architecture, and, if they're not determined genetically, at least can only be developed when you're young. By the time someone comes into their first high-school programming class, they already will or won't be an Engineer by mindset, and there's no switch you can flick on them, no number of facts and rules you can teach, to turn that around.
To use a slightly-sour analogy, it's like the cases of children found surviving in the wild, and taken into society. They can learn words, in the same way a chimpanzee does, but they never become able to grasp syntax--their mental architecture has already set, and that component wasn't included. Logical/formal/systems thinking seems to be like this. If anyone needs to teach it, it's parents, and probably around the same age as reading. But we literally don't know what it is that's needed, specifically; what exercise you can do with a kid to "induce" logical thinking. What did we do when we were young? Play with lego? Play pretend?
The point of possible disagreement is that I'm still inclined to think that any person can learn how to program at any point in time, but that it is just exceedingly uncommon for them to actually do so. Typical CS classes are certainly not going to accomplish it. Like you said, it's about thinking logically, systematically, and formally. I think very few people actually try to learn how to do that late in life. If you don't already have it, you probably don't value it enough to try to get it. It's almost tautological: how would someone who doesn't think rigorously be convinced of the value of thinking rigorously?
(If I'm allowed the hacker cult of pedantry).
Being ignorant is less good than being informed, but informing yourself is expensive. Becoming complacent due to your perceptions of your intelligence is foolish. Not exploring something because it's not in your skillful sweet spot can lead to failure to innovate.
So, there's nothing surprising here. If you educate yourself more you'll tap into a wealth of knowledge about how to get things done more powerfully. If you use that as an excuse to step away from solving problems then you'll likely solve fewer problems, so hopefully you'll investigate things with greater theoretical validity (or else why bother?).
More people using tools means that more people will become more informed and get more stuff done. Tools should not be made as to be inaccessible. But a technique or idea often begins as inaccessible and there may even be no way to make it more accessible. This does not mean that it is invalid and it may be that it gives people who master it greater power.
Whenever learning, don't learn for learning's sake and don't learn to meet some end point defined by your teacher---seek what the master's sought. They often are solving real, practical problems and by training yourself to look for that perspective you can understand how to arm yourself with their knowledge in the way that they did.
A person who generates knowledge in effort to solve a hard problem is usually one who is more than willing to throw that knowledge away if it does not serve them.
Many problems are no longer as hard as they once were. This is directly due to the fact that advanced technologies have become increasingly approachable and because larger bases of people have improved their mental technologies so as to use them. This is a great thing, but it's not an end state---harder problems still exist and gains can be made by tackling them. People can learn more things, and more powerful ideas can be made the foundations of new mental technology to improve anyone's ability to solve problems.
None of this is controversial.
We naturally think of things slightly differently than non-programmers do. Part of the reason the tools he mentions were despised by "all true hackers" is that they were limiting frameworks. We, almost by definition, tend to dislike limitations (especially when we can wield all the power of lower-level languages).
I was just pondering the other day with a co-worker, what it would be like to write a web server in xl (not sure you could, or would, but what would the mental model be like). It was an interesting experiment.
It's for a similar reason that we tend to prefer libraries to frameworks. I like code that helps me get things done quicker, but not if it enforces a mental model or comes with implementation limitations. Just yesterday I was working around a problem with a certain framework that wasn't making the system calls the way I needed. I knew what I wanted from it but couldn't twist the right knobs to make it happen.
Programming means writing obscure ASCII codes.
Programmers are afraid to use any tool or language that doesn't look like source code or is easy to use. Because if you are doing that, then you aren't programming. And therefore, you're not a programmer.
If it's not complex or difficult for normal people to do, then you are not really programming, and therefore not a real programmer.
All programmers are therefore afraid of tools that make programming easier, because they are afraid they will be judged by others as not being programmers.
This is the main thing holding back programming.
Having to understand electricity, trigonometry, and basic signal analysis to use an oscilloscope does not mean that oscilloscopes are not useful tools. Those are simply prerequisite knowledge to work as a professional in the field, and the tool is made for professionals.
Nobody is complaining that imperative programmers have to learn how memory works, how conditional control structures work, etc. And yet when functional programmers can benefit from learning category theory, suddenly it's a huge problem. This is just anti-intellectualism, nothing else. The term category theory scares people off, and so rather than acknowledge that failing within themselves, they throw it back on the functional programmers and say that programmers shouldn't have to learn category theory.
NB: You really don't need to learn category theory to program effectively in Haskell. You can benefit from it, but it is absolutely unnecessary.
How long did it take for calculus to make things easier for most people?