
> These were all enormous successes, allowing normal people to get shit done. They are despised by all true hackers. Because normal people can use them to get shit done. That is our greatest fear.

No; I have to object to this specifically. Normal people can't "get shit done." Not really, not successfully. People, when you let them build their own tools in Excel and Access and what-have-you, end up with rosters full of incorrect and incomplete and invalid information, things split into so many different places that nobody can find anything any more, and interfaces that can only be used via inexplicable cargo-cult rituals and by avoiding otherwise-valid input states. Their stuff works 95% of the time; they just aren't used to a world where "5% failure rate" means "silently and consistently eats any customer's file if their name has a ç in it" instead of just "has to be restarted every hour."
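To make that failure mode concrete, here's a minimal Python sketch -- entirely hypothetical code, not from any tool mentioned above -- of the shape this bug usually takes: an ASCII assumption plus a swallowed exception.

```python
def save_customer(name, data, store):
    """A homegrown 'save' that quietly assumes all names are ASCII."""
    try:
        key = name.encode("ascii").decode("ascii")
    except UnicodeEncodeError:
        # The classic silent failure: swallow the error and move on.
        return
    store[key] = data

store = {}
save_customer("Alice", {"balance": 10}, store)
save_customer("François", {"balance": 99}, store)
# "Alice" was saved; "François" was silently, consistently dropped.
```

Nothing crashes, nothing logs; the tool "works" on every test the author ever ran, and eats exactly the customers whose names it was never tested against.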

The thing programmers do--it isn't using arcane languages, recognizing mysterious error codes, memorizing APIs or libraries. We aren't here just because the difficulty of typing "/[A-Z]+?/" grants us job security. The thing programmers do--or more generally, the thing Engineers do--that other humans need us for, that machines can't do for us (yet), is to formalize murky ideas.

People who don't have training in Engineering have no fucking idea how to go about doing this. You know the old joke of there being a button labelled "Do What I Mean"? It's a joke because 99% of the people who would press that button don't know what they mean; there would be no coherent thing for the button to do--even if it could read their thoughts--but to interrogate them for hours to get them to decide what they really want.

The few of us--the Engineers--we can sit down, and without any further prodding, think out what we want to have happen. Then it's a simple matter of just writing it down. The compiler, the language, none of those are really problems, compared to knowing what you want to happen.

Note that Bret's talks and essays are focused, by-and-large, on the iterative rapid-prototyping model: using computers as tools to help us explore options, so that we can more quickly figure out "what we want." But even if you know that you want a Sudoku solver, you can't iterate out a Sudoku solver. You have to know that what you want to do is to put these constraints, in this order, on these numbers--and that's an algorithm. People--not us Engineers, but people--they don't understand algorithms. You need an Engineer to take the statement "I want a Sudoku solver" and formalize that into "I want to use this algorithm." Just like you need an Engineer to take the statement "I want this bridge-support anchored to the riverbed here" and translate that into materials and rigging that will take the tensions and stresses without shearing.
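To make "these constraints, in this order, on these numbers" concrete: even the shortest Sudoku solver is a chosen formalization. Here's a minimal backtracking sketch in Python -- one of many possible algorithms, picked purely for illustration:

```python
def valid(grid, r, c, v):
    """Check the row, column, and 3x3 box constraints for placing v."""
    if any(grid[r][j] == v for j in range(9)):
        return False
    if any(grid[i][c] == v for i in range(9)):
        return False
    br, bc = 3 * (r // 3), 3 * (c // 3)
    return all(grid[br + i][bc + j] != v for i in range(3) for j in range(3))

def solve(grid):
    """Fill zeros in a 9x9 grid in place by depth-first search.

    Returns True if a complete valid assignment was found.
    """
    for r in range(9):
        for c in range(9):
            if grid[r][c] == 0:
                for v in range(1, 10):
                    if valid(grid, r, c, v):
                        grid[r][c] = v
                        if solve(grid):
                            return True
                        grid[r][c] = 0  # undo and try the next digit
                return False  # no digit fits here: backtrack
    return True  # no empty cells left
```

The person who "wants a Sudoku solver" never asked for depth-first search with those three constraint checks in that order; someone had to decide that this is what "solver" means.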

Neither COBOL, nor SQL, nor HyperCard, nor AppleScript, nor Inform, nor any other "human-friendly" language, ever served to allow anyone to express a clearly-defined thought they wouldn't have been able to express in a language with more "punctuation-y" syntax. Learning what "<=" or "&&" represent is a small, fixed cost at the beginning of attempting to solve problems. Learning what they mean--what the formalism is there, and what its consequences are--takes years, and requires that you think like an Engineer. That step is required whether you are using C, COBOL, LabView or Prolog.

To get shit done, you must first know what done shit looks like; perfect, robust, unbreakably done shit. And only Engineers really do. Just like how people who aren't artists, if asked to draw someone's face, will end up drawing the abstraction of a face; people who aren't Engineers, if asked to formalize a system, will end up describing the vague, hard-AI-complete, "and then it just does what you'd expect okay!?" system that, in practice, isn't a system at all.



> The thing programmers do...

The thing that programmers do is the exact same thing that your post says "normal people" do. Programmers build buggy systems that lose data because someone's name has a ç in it and fail 5% of the time (or more) and rarely do they understand algorithms (and especially not the complexity characteristics) and very often they build the wrong thing, even if they get it done and few have a formal education in anything resembling engineering and very often programmers are indeed getting shit done, but it just turns out that what they got done was just shit.

We have to stop putting programmers on a pedestal. There are great programmers and there are bad programmers. Just because someone slings some code doesn't make them a bastion of clear engineering practice. Many programmers that I've met would benefit greatly from the kinds of systems that Victor describes.


Let me first say, I agree with every individual statement you made, here.

However, let me put my argument another way: the ability to work with you to formalize your idea is what you're paying for when you hire a programmer. You're paying for a human compiler[1], a fully-intelligent REPL where you can give vague commands like "make me the next Facebook" and, through the programmer's Engineering knowledge, they will ask you questions and force you to make choices, until they've turned that informal idea into an actual capital-S System. Inasmuch as they have that Engineering knowledge, the resulting System will be a formalization of your own desires. And then, having the formal System, the programmer can go and implement it. That part is comparatively trivial, and increasingly done "by" the software compiler. Victor is noble for driving us further toward trivializing that part of the process, but it is only part of the process.

But anyway, back to your points:

Most "programmers" aren't programmers. If you're only as good at programming as a member of the general population, then you don't get a special job title, right? Otherwise, I would be right now a writer, actor, landscaper, game-designer, philosopher and life-coach, as well as a programmer. But in only one of those fields could I actually make more money than some schmoe who just decided to jump into the field last week, and the reason I'm "more of a programmer" than Bob-the-Accountant is a programmer, is that I can do the Engineering part better. If Bob-the-Accountant started calling himself a programmer, he'd be a white belt programmer--someone who just joined the Art, and must still unlearn their preconceived knowledge before they may begin--while my belt, at least, would have some color on it. Plenty of people calling themselves programmers are white-belts. That doesn't mean we should consider them when we speak of "the thing that programmers do." It would be like including people who write their own legal contracts when speaking about "the thing that lawyers do."

So yes, if we're going to keep using the word "programmers" to refer to both the Engineer journeymen and the white-belts who produce misfeasance with their every step, then we should stop putting "programmers" on a pedestal. After all, what use is a pedestal where the people on it are exactly as high up as the general population around them?

---

[1] https://news.ycombinator.com/item?id=6131742


> Most "programmers" aren't programmers.

So if you're paid to write programs you probably aren't a professional programmer? What's wrong with an operational definition?


Another way to say it: if we had any way to quantitatively measure the performance of programmers over time, most "programmers" wouldn't be programmers any more.

There's a field with a similar problem to our own: public-school teaching. Teacher performance isn't quantified, and so teachers can pretty much get away with only being as good at imparting knowledge as any random member of the population--even though they took years of education-in-Education. Most "teachers" aren't teachers, any more than I'm a teacher; they're simply people paid to repeatedly attempt (and fail) to be teachers.†

We pay a lot of people to repeatedly attempt (and fail) to be programmers.

---

† Teachers' unions are right now fighting the introduction of actual metrics on how the year-over-year velocity of a student's achievement level is affected by a teacher, relative to the average of their peer-teachers in the same school. They are fighting this because the data clearly shows a bimodal distribution, where a lot of people are just extremely unfit to be teachers--their students actually reach negative learning velocities due to their presence in the class--and until now, the unions protected these people, since there was nothing to prove they sucked so very much. It's hard to go back on your decision to defend someone when you later find out that they're indefensible.


Your opinions on teachers are really easy to nod your head along with, until you actually talk to some teachers about the effects that they are able to observe, directly, of the in-vogue obsession with "actual metrics" (standardized tests) on the actual, ya know, education of their students. I'm sure many teachers disfavor more testing because they fear their own unfitness, but I think most of them disfavor more testing because they feel we are already testing too damn much, thus trading off a tangible negative short-term effect for a speculative positive long-term effect. Every hour a student spends taking a test is an hour not spent learning.

(Apologies for the totally off-topic response to your only mostly off-topic comment.)


That's honestly a very good point; so much of our public school systems are already broken due to testing. Students are mostly taught the requirements of proficiency tests, solely for funding and other occupation-related reasons. I've heard many teachers complain that they aren't truly imparting knowledge but merely repeating from a book due to "the state."


> Teachers' unions are right now fighting the introduction of actual metrics

We would unionize and be fighting metrics too, if they tried to measure how good a programmer you were with the number of goto's you use.


I thought you were doing well until you started in on teachers, you should have left that out because it's actually a more complex problem than you make it out to be.

I like where you were going on programming though.


Here I thought the analogy worked well for that exact reason. They're both very complex problems. That's why both fields are still in the state that they're in.


Yes, exactly--if assessing teacher performance was simple, then we'd be doing it, wouldn't we? :)


Could you share a link to the student achievement metrics pertaining to the average of peer teachers in the same school?


I believe we have an open and shut case of the No True Programmer fallacy.


The No True Scotsman fallacy is a fallacy because "Scotsman" is an artificial category. There is no phenomenological consequence of being a "Scotsman"; therefore, its properties can be assigned arbitrarily, based on first assigning someone to the set (by marriage, say), and then remarking on the thing all the members of the new, expanded set have in common. (A clear parallel would be a "No True HN Member" fallacy.)

However, if you believe that there is any empirically-detectable property that makes someone a better programmer when they have more of it, then "programmer" is a natural category, not an artificial one. It's something where, if we washed away the word for it, we'd end up re-creating the word, as a handle to describe that obvious cluster of things which are unlike other things but like one-another. Being a programmer has phenomenological consequences--you can determine who is or isn't a programmer using games or tests which don't have anything to do with programming trivia, and without mentioning that "potential for ability to program" is what you're testing for.

In any natural category, you'll have false negatives and false positives: things that are identified as X but don't have the property that puts them in the X natural-category, and things that aren't identified as X, but which do have the property.

There are many False Programmers. There are also False Not-Programmers: people who don't think, or know, that they're programmers, but who are nevertheless. This is true of every natural category. There are people who think they can sing but can't, and people who think they can't sing but can.

There are professional singers who can't sing, even though they "are" singers. When we say "can sing", we imply the edifice of a market, and competition; we really mean "can sing to a level where we'd pay them more for their singing than a randomly-selected member of the population." In other words, they "can sing objectively-well."

There are people who "can program objectively-well." They are, in the terms of the natural category, both the True Programmers, and the False Not-Programmers. There is no fallacy at work.


Sorry, what is your test for being a programmer? I'm quite curious to hear what more is required beyond writing software, or even how somebody can be a programmer without having written a line of software in their lives.


Writing software well (or even reading existing software well ... which is a must, when someone needs to learn, or just maintain old code) seems to require a combination of cognitive skills. That combination can be detected, even when a person has no previous programming experience: http://www.eis.mdx.ac.uk/research/PhDArea/saeed/

From the linked homepage:

> We (Saeed Dehnadi, Richard Bornat) have discovered a test which divides programming sheep from non-programming goats. This test predicts ability to program with very high accuracy before the subjects have ever seen a program or a programming language.

Edit: added the note about reading software.


> No; I have to object to this specifically. Normal people can't "get shit done." Not really, not successfully. People, when you let them build their own tools in Excel and Access and what-have-you, end up with rosters full of incorrect and incomplete and invalid information, things split into so many different places that nobody can find anything any more, and interfaces that can only be used via inexplicable cargo-cult rituals and by avoiding otherwise-valid input states. Their stuff works 95% of the time--they just aren't used to a world where "5% failure rate" means "an error every ten milliseconds."

You just made his point.

All the downsides you mentioned don't matter in the real world and for those people. They are only appreciated by programmers like you and me (and mostly the kind with a slight OCD).

But the upsides TFA mentions are very real: stuff that took them days, now takes minutes or hours. They could not give a rat's ass if it's not DRY, if it doesn't handle corner cases, if it expands in 20 ifs, when a range check would suffice, etc.


> They could not give a rat's ass if it's not DRY, if it doesn't handle corner cases, if it expands in 20 ifs, when a range check would suffice, etc.

I'm not talking about any of that.

I'm talking about things like actively losing track of customers because they were saved to a separate file that got saved over on a network share. I'm talking about billing people two or three times because there isn't a single place to check to see if they've already been billed. I'm talking about being on the phone with a customer service representative who can't authorize anything because your account is in an indeterminate state, and they have to check with management--and your account will never get out of that indeterminate state, so every five-minute conversation you ever have with them will become two days long, as you wait for them to call back the next day. I'm talking about going back to paper because half the time the information just isn't in the database, or is too wrong to rely on. I'm talking about having to hire clerks just to manually print data out of one system and type it into another, because Bob from Accounting "got shit done" without having ever heard of this thing called "networking."

People trying to automate things, without first having a formalized understanding of what the process is that they want automated, will cause business-impacting failures. There's never been a single time where it hasn't, in my years of dealing with this as an employee, a contractor, or a consultant.


>I'm talking about things like actively losing track of customers because they were saved to a separate file that got saved over on a network share. I'm talking about billing people two or three times because there isn't a single place to check to see if they've already been billed. I'm talking about being on the phone with a customer service representative who can't authorize anything because your account is in an indeterminate state, and they have to check with management

Ah, those things weren't done by the kind of Excel-wielding people the article talks about.

Those mistakes were made by programmers proper. With CS degrees and everything.

It's not the small mom & pop or mid-sized company that usually bills people two or three times -- those knew how to bill even before Excel.

More often than not, it's the multi-million enterprise crap large corporations use, with 400 options and convoluted procedures. I mean I've been double billed by the utility (electricity) company, and that's surely not due to the bill being in any Excel file.


I think you'll find that 'large corporations' are awash in Excel monstrosities.

I've worked at a few unnamed places, medium to huge, that used Excel and Access in horrifying ways. One place, with 60000 employees, had an 'editing cone' which was an actual traffic cone that you had to have in your cube if you were writing to the Access file on the SMB drive. During my time there, one person ran their Excel script sans cone and a bunch of people didn't have to pay their bill that month.


> One place, with 60000 employees, had an 'editing cone' which was an actual traffic cone that you had to have in your cube if you were writing to the Access file on the SMB drive

I LOVE this image. Acquiring a physical lock on the file. Think of the manager who thought of this beautiful idea, probably not a programmer by training. Awesome.

And yes, since it relies on human conformance, it's bound to fail on occasion. You can say the same thing about any piece of software you ever came across or wrote.
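For what it's worth, the software version of the cone exists, with exactly the same trust model: advisory file locking only works if every writer agrees to ask for the cone first. A POSIX-only sketch in Python (the lock-file name is made up):

```python
import fcntl
from contextlib import contextmanager

@contextmanager
def editing_cone(lock_path):
    """Hold an exclusive advisory lock for the duration of the block.

    Like the traffic cone, this is cooperative: a writer who never
    calls flock() can still clobber the shared file.
    """
    with open(lock_path, "a") as f:
        fcntl.flock(f, fcntl.LOCK_EX)  # blocks until the cone is free
        try:
            yield
        finally:
            fcntl.flock(f, fcntl.LOCK_UN)

# Usage: every cooperating script wraps its edits the same way.
# with editing_cone("billing.accdb.lock"):
#     ...  # safe to write to the shared Access file here
```

The manager's insight was sound: an exclusive lock is an exclusive lock, whether it's enforced by the kernel or by whoever notices you don't have the cone.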


I can't work right now, Tim has the cone.

(this was a pretty great idea from the manager though)


Not in my experience; those were all real cases I mentioned, with real people using combinations of Excel, Access, and SMB shares to "simulate" having actual software talking to an actual database. Usually, in fact, right beside the actual software that talks to an actual database--because someone figured they'd throw something together to hold "an item or two of ancillary data" instead of getting that software modified to include that data--and then their ancillary data-store grew and grew...


At least one regional electric utility uses an Excel macro to scrape its own public web site to gather aggregate usage and price data. This method allows them to save the data, and maybe analyze it later.

Just... absorb that for a few seconds.


"Neither COBOL, nor SQL, nor HyperCard, nor AppleScript, nor Inform, nor any other "human-friendly" language, ever served to allow anyone to express a clearly-defined thought they wouldn't have been able to express in a language with more "punctuation-y" syntax."

The chap on the next row of desks runs 300+ full time students (course success tracking, attendance, nett funding per student &c) on a fairly large Excel spreadsheet. He owns it. He knows it. His colleagues feel that they can edit the data for their students over the shared drive. Works for them.

In the UK the funding methodology for Skills for Life funding draw down is, shall we say, complex. Median funding is £3k per student, so 300+ of those is not far short of a million. The students in question are also subject to monitoring by three other agencies, with overlapping data requirements. The cost to produce an application that embodied the business logic for this edge case provision would, I imagine, be quite high.

It works for us. If it breaks, it can be fixed.


> In a previous paper, Chlond (2005) presented the formulation of Sudoku as an integer program. Chlond claims that a spreadsheet formulation is not straightforward but we present a simple Excel formulation. In light of the current use of Excel in the classroom it can be highly instructive to formulate the problem using Excel. In addition, the formulation of this relatively simple model enables instructors to introduce students to the capability of programming Solver using VBA. In a follow-on paper (Rasmussen and Weiss, 2007), we demonstrate advanced lessons that can be learned from using Premium Solver's powerful features to model Sudoku.
http://archive.ite.journal.informs.org/Vol7No2/WeissRasmusse...


> there would be no coherent thing for the button to do--even if it could read their thoughts--but to interrogate them for hours to get them to decide what they really want.

Which is funny, because that's almost exactly what Bret suggested as a better way for less-technical people to create information-display software in an essay from '06:

http://worrydream.com/MagicInk/#designing_a_design_tool

In short, to take examples given by the user and extrapolate them, letting the computer do the formalizing, and having it ask for clarification on unclear points.

The rest of the essay is absolutely worth a read, by the way.


That is, in fact, in a rather primitive way, what compilers do already; it's just very blunt and baroque, so instead of "please be more specific" you get an error message, and then you have to run it over again to get the next iteration of the feedback loop. The near-term goal in compiler/interpreter design (given still-textual languages) should be more of exactly this kind of interactive communication, where the compiler is "pairing" with you to edit your code.

But the full solution requires hard AI, really. "Build me the next Facebook" can't be extrapolated into anything useful unless the software itself can dream of what the "next" Facebook would be like. A human programmer probably already has those dreams on offer.


(Also, I should note, the usual thrust of the "Do What I Mean" button joke is that the person wants to press it because they're tired of having to make choices. What they really are asking for is a Coherent Extrapolated Volition (http://intelligence.org/files/CEV.pdf) button: to cause the machine to, perhaps, simulate a bunch of copies of you, showing you a result for each combinatoric set of responses in parallel, and then pick the result which simulated-you likes most, without having to bother real you with any more questions.)


I counter with the experience of repeatedly encountering plain average admin assistants who have managed to hack together Excel spreadsheets that do exactly what they need to. And all the crappy little bridges that villages whip together from random garbage that they walk on every day for five years. From an engineering viewpoint, these creations are delicate monstrosities, but for the user they are absolutely beloved and insanely efficient. And they are the epitome of highly iterative creations starting with a barely half-baked idea and zero engineering training. Also, have you really convinced yourself that "perfect, robust, unbreakably done shit" actually exists anywhere in the world for someone to view it? Even the space program, in my opinion the pinnacle of human engineering achievement, had several exploded rockets and dead astronauts, and was quite iterative.


I never said that '"perfect, robust, unbreakably done shit" actually exists anywhere in the world'--merely that it can be conceived of, and then strived for. Knowing what it would look like if you saw it, and how it would be different from a kludge, is exactly what makes you an Engineer.

Also, I never said one needs training to be a programmer. As far as I've seen, it's a perfectly natural (or nurtural, whatever) talent, that one then hones over time. The "programmer", selling their work as a programmer, is a false positive: someone who is in the term, but not in the natural category. The admin assistant, serving as their own client and creating a System to suit themselves, is a false negative: someone who isn't, nominally, a programmer, but is in the category.

If you can formalize an idea into something that Works, you are an Engineer. Nobody needs to hand you a certificate; you don't need to call yourself one, or even know you are one; you just are. It's a detectable, testable property of your mental architecture.

The problem is that nobody ever told this to some of the people trying to make their livings as programmers. They're like portrait-artists with dysgraphia, but unlike with that condition, they are the majority of humanity. Actually, let's take that analogy further; it seems sound:

1. Let's say 90% of the population is dysgraphic;

2. but "portrait artist" is a highly-compensated, "in-vogue" field;

3. additionally, the client has no idea how to judge the portrait (maybe an independent 90% of the population is also blind), so any flaws in it won't show up until it gets exhibited several months later;

4. and (okay this is getting a bit ridiculous, but I'll keep on with it) most portraits are the works of several portrait artists, so it's hard to say who caused a given flaw.

If all these things were true, the average portrait-artist's ability would be entirely illegible--you couldn't judge them on results, nor on past performance. This would encourage a market for lemons. Additionally, the set of (people with dysgraphia & people willing to lie and say they can paint) would, just by numerical advantage, outweigh "people who can actually do their jobs" in the portraiture market. It would pay to be extremely skeptical.

But still, there would be false negatives; people who never even considered portraiture, but aren't dysgraphic. Maybe at one point a friend of your admin-assistant asks them to doodle them for a newsletter, and they produce something that Actually Looks Good. Surprise!

Usually, though, the nose will be on the forehead.


Hmm. No.

Just knowing what you want isn't enough. Language matters because you don't want to solve the same problems over and over again. Have a look at Dan Amelang's work on the NILE renderer, and then see if you can still say that language doesn't matter.

What is the smallest set of orthogonal abstractions that, when combined, yields the explicitly desired behaviors, along with the implicitly necessary ones?


I think you're falling into the trap of supporting the OP's point by overreacting.

Do you really believe that it _should_ be impossible for an average person who desires a sudoku solver to get one without any engineering knowledge?

The spirit of Bret's talk, and even this response, is of thinking broadly. The question is not "is this possible", but "should this be possible".


I don't believe it's impossible. In fact, there are two clear ways to do it: either

1. have a database of known algorithms, and map-reduce out the ones that produce the most signal for your data-set (this isn't Hard, but it requires a globally-networked language-neutral ABI-neutral algorithm repository and a free-use cloud compute cluster to run the heterogeneous algorithm-tests on), or

2. expect the computer to invent a novel, efficient (or at least polynomial) algorithm in response to your data-set on the fly. This is a Hard problem--since solving it basically means that computers can now take the jobs of Mathematicians in proving novel theorems. I don't think that's "impossible" either--obviously, Mathematicians are performing some describable algorithm in their heads to come up with novel proofs--but it's likely a Big Data problem in the same way most AI problems have turned out to be; not something you can ask your workstation to do.
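A toy illustration of option 1 -- everything here (the three-entry "repository", the example data) is made up, and a real version would be a distributed search over an enormous corpus, but the shape is the same: keep the known algorithms that are consistent with the user's examples.

```python
# A toy "algorithm repository": name -> implementation.
LIBRARY = {
    "sort": sorted,
    "reverse": lambda xs: list(reversed(xs)),
    "dedupe": lambda xs: sorted(set(xs)),
}

def candidates(examples):
    """Return the library entries consistent with every (input, output) pair."""
    return [name for name, fn in LIBRARY.items()
            if all(fn(inp) == out for inp, out in examples)]

# The user supplies examples instead of an algorithm:
examples = [([3, 1, 2], [1, 2, 3]), ([2, 2, 1], [1, 2, 2])]
print(candidates(examples))  # -> ['sort']
```

The second example is what does the work: it rules out "dedupe", which agrees with "sort" on duplicate-free inputs. That disambiguation step is exactly the "interrogate the user for more examples" loop, automated.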


That is an awesome rant.

People don't know what they don't know. And making things look easy seduces them into thinking that "looking easy" is the same as "is easy." Jon Livesey, an engineer at Sun and later SGI, used to quip, "It is easy to say, like 'largest integer' is easy to say, but actually pointing out what that is, now that is a different story entirely."

Engineers recognize when a detail is missing and ask about it. Annoying as hell to people who "just want it done" but essential to the task at hand.


http://www.cnn.com/2013/02/27/opinion/ted-prize-students-tea...

Get out of people's way and stop telling them they're stupid, and they will fucking amaze you.


I'm fine with letting everyone try to program. Everyone should try to paint, write, make movies, play music, all those good things.

But if nobody tells someone they're stupid after they've proved repeatedly to be stupid--if we, as a culture, are too nice to leave bad reviews of bad work; if we overlook that time that Bob's excel sheet cost the company five days of downtime, and how we had to hire three extra interns to do redundant data-entry for it--then Bob might think people should be paying him a professional programmer's salary for his time. Bob might put out his shingle as a programmer. And now, the market has one more lemon.

There are some things which require real, natural (or nurtured-in, at least) talents. Singing, for example. Everything I've experienced in dealing with other programmers tells me that programming skill, in the end, is just an outgrowth of the ability to think logically, systematically, and formally--and that these abilities are part of your mental architecture, and, if they're not determined genetically, at least can only be developed when you're young. By the time someone comes into their first high-school programming class, they already will or won't be an Engineer by mindset, and there's no switch you can flick on them, no number of facts and rules you can teach, to turn that around.

To use a slightly-sour analogy, it's like the cases of children found surviving in the wild, and taken into society. They can learn words, in the same way a chimpanzee does, but they never become able to grasp syntax--their mental architecture has already set, and that component wasn't included. Logical/formal/systems thinking seems to be like this. If anyone needs to teach it, it's parents, and probably around the same age as reading. But we literally don't know what it is that's needed, specifically; what exercise you can do with a kid to "induce" logical thinking. What did we do when we were young? Play with lego? Play pretend?


I'm not completely sure I agree with you yet, but this argument does mesh very well with the trope that so many programmers first started programming when they were very young. I was playing with BASIC at age 5, and anecdotally I've heard very similar stories from other programmers at a rate that exceeds what my expectation would be of a random distribution, and definitely exceeds corresponding anecdotal evidence from other professions (i.e. number of people in profession X who were doing X at a very young age).

The point of possible disagreement is that I'm still inclined to think that any person can learn how to program at any point in time, but that it is just exceedingly uncommon for them to actually do so. Typical CS classes are certainly not going to accomplish it. Like you said, it's about thinking logically, systematically, and formally. I think very few people actually try to learn how to do that late in life. If you don't already have it, you probably don't value it enough to try to get it. It's almost tautological: how would someone who doesn't think rigorously be convinced of the value of thinking rigorously?



