Paul Graham: Six Principles for Making New Things (paulgraham.com)
173 points by Darmani on Feb 16, 2008 | 118 comments



I enjoyed the article, as I enjoy most of Paul's articles.

But, I'm coming to the tech world after having worked in the health care industry for 15 years. And, I studied art for 6-7 years, thinking that I could make a career out of it. So, I guess that I have a different perspective on the geek/hacker culture. And, something about hacker culture that never really sat well with me was this--the nastiness.

Paul referenced it in his article, referring to the trolls. I just don't understand why people troll like they do. I didn't understand why people were so up in arms about Paul's trying to write a new language and offer it up for public consumption. I don't understand why people have been so quick to criticize Y Combinator. I don't understand the hating that goes on in language flame wars, or OS flame wars.

I understand that people are passionate about technology, and passionate about their language of choice, or OS of choice, but... really. Do people really need to get nasty about it? Why aren't people able to have a discussion about the merits of a language, or its strengths and weaknesses, without getting personal or mean about it? I just don't understand.

A great doctor that I worked with was explaining to me why he got out of teaching at a medical school. What he said struck a chord with me: The reason that there's so much back-biting and politicking in academic medicine is very simple--it's because the stakes are so low.

Which leads me to ask the question--Are the stakes really so low in the technology world, that people are so nasty? I have a hunch that for many trolls, that's really true.

I know that shortly after reddit got purchased the trolls made camp and set up a small troll swamp. There are a few cool sub-reddits, though. Why did that happen? Because the stakes became so low.


Interesting angle, but I really don't believe nastiness has much to do with it.

We hackers are just dying to let others know how smart we are. It's what makes us tick.

So we come here (and other forums and blogs) because most of us have trouble finding peers who even understand what we're talking about. Add in a lack of writing style and the anonymity of the internet, and our puffing and strutting APPEARS to be nastiness.

Put us in a room together to discuss the same subjects and I'm sure it would be much more civil.

Honestly ask yourself. Whenever you heard someone else getting a compliment for being smart, (Alan made the Dean's list!), (Joe is a great chess player!), (Fred wrote the best program I ever saw!), don't you get JUST A LITTLE BIT JEALOUS? You almost want to scream out, "Hey! What about me? I'm smarter than that!" We hold back in person because we're polite. But we don't hold back here, because most of us understand. Sometimes I think that if you DON'T think like that, maybe you shouldn't be a hacker.

Reminds me of an old story that Rabbi Harold Kushner told about a young man trying to temper his competitiveness, so he joined an ashram in Japan. He wrote to his father, "Dad, this environment has helped me evolve to the point of enlightenment where I no longer have to compete with others to achieve my bliss. The meditation has done it. I'm one of the top 5 meditators here, and hopefully, by next year, I'll be Number One."

[UPDATE: I just read the comments about this pg essay on Reddit, and realize that it's different over there. They ARE nasty. What's happened to Reddit? Whatever it is, I sure hope it never happens here.]


I hear what you're saying in terms of people not being able to write well, and that people are trying to beat their geeky chests in a sort of hacker bravado.

I also don't get a really nasty vibe from Hacker News, like I do on Reddit. I was referring to the Hacker culture at large when I was referring to "nastiness". I believe that there is a lot of bravado and competition in Geek culture. But there's a lot of nastiness, too. In fact, we are expanding the English language by devising new terms for people being assholes online: griefing, trolling, pwning, flames, flamewars, Greater Fuckwad Theory, Godwin's law, STFU, RTFM, STFW, etc... It's only a matter of time before these terms hit the mainstream to describe asshole behavior offline.

Even though he doesn't say it, I believe that Paul has been hurt by this often enough that several of his last essays have referred to these people as "trolls". Steve Yegge has had enough problems with this that he's turned comments off on most of his blog posts. Pmarca also thinks that this is enough of a problem that he (to my knowledge) never turned on comments on his blog.

For instance, I went to a boarding school for a year, and was in a dorm with 45 high school boys. There's friendly competition, and then there's getting 64 wedgies like I did that year. (I was the youngest kid in the school, and quite a nerd). A lot of the time, putting your work into the intarweeb slipstream of geek/hacker culture is like walking around with your fruit-o-the-looms sticking 2 inches outside of your pants at a boarding school. It's only a matter of time before some asshole gives you a wedgie. And, then you have brown streaks.

My question is this: why do we (not Hacker News, but Geek culture in general) tolerate that behaviour? There is no need for it. If you want to prove how smart you are, make something cool. Or, explain string theory in a way that mere mortals can understand. But, walking around saying that Arc sux because of this and that, or Yegge is a windbag, or what was J Gosling thinking when he designed the turd that Java is... This isn't productive.

If people make these comments maybe it's because their stakes are so low.


Hackers don't tolerate bad behavior. They avoid communities that accept mean people (unless they are also mean).

Paul probably only visits Reddit to read comments about something he's done. I left the chatroom #lisp because of the assholicism. (Leaving was an excellent decision. Everything about Lisp can be learned from CLHS, SICP, On Lisp, gigamonkeys.com/book, other books, and Google. It takes patience, but you'll learn far more than by asking someone.)

Communities are a source of power - if you've built a strong one, you can usually get rich. But they require huge amounts of time. Unless you're working toward a specific goal, it seems best to be a lone wolf (or part of a small team) and to operate without considering what everyone else has to say, or to offer you.

I learned to program almost exclusively by asking people and experimenting. It was a mistake to ask people how to do things. You can learn orders of magnitude faster by reading manuals and books than by trying to get someone to help you. Hacker culture isn't necessary, except to feel good about yourself, which isn't necessary. John Carmack knows so much because he spent most of his time in quiet isolation, meditating on problems, not hanging around communities.


I think a middle approach is optimal: Have one or two friends to study with, preferably people you know in real life. You get interesting conversations and gentle pressure to keep up without trolls or groupthink.


I think it's pretty hard to remove nastiness from competition, since it's such an easy 'win.'

On a tangent, is competition necessary or even beneficial for progress? The world today seems to be founded on this idea, but things would be much different if people didn't believe competition was important. Competition makes products ego-based, and I don't think ego-based products are the best. People have to aim beyond themselves and the felt needs of others to make something truly excellent.


I mostly agree with you. But I don't think it's a geek thing per se. Just look at the comments on political blogs. It's the same thing there.

It's the anonymity. Combined with a 95% male audience. If each reddit commenter had to sign with his full name, and his picture would be shown next to his comments, the tone would be quite different.


> Interesting angle, but I really don't believe nastiness has much to do with it.

My experience is that this is a limitation of communication without subtext, inflection, etc. I don't have a whole lot of time right now for details, but I have noticed a few tendencies that exacerbate emotions in online discussions:

- Deprecating humor seems especially biting without the wink and smile, and the inevitable response from others then appears like an attack

- Posters lump all critics together, and tend to forget who made which argument (but are especially sensitive to having their points glossed over, misinterpreted)

- Often those within technical circles don't realize that different people weigh evidence differently (i.e., two parties may reach separate conclusions simply because they have differing views of which facts are important). Unlike in the math world, discussions and conclusions cannot be reduced to a few simple axioms--there's always the experience of the observer necessary to interpret data.

- There's a tendency to skim critics' posts and respond to individual sentences/words (rather than the sum of what was actually written).


The reason is very simple. Most of the trolls are nerds, and so they never get laid. Because of sexual frustration they decide to take it out on someone by trolling. So trolling stems from nerd sex denial. Sigmund Freud explained the critical role sex plays in human psychology; even all the wars and bloodshed that happen are in one way or another connected to sex. In the end, it's Darwinian instinct.


Credit where credit is due... the quote is based on Henry Kissinger's original:

"University politics are vicious precisely because the stakes are so small."

http://www.quotedb.com/quotes/1477

As for why trolls exist at all, I think it is because it is so darned easy to be rude and generally small-minded and full of self-puffery when the troll has such near-anonymity and therefore no sense of responsibility. I also think that geek communication tends to be brief, and it is easy to be rude with a few short direct words... in the real world, such behaviour is not tolerated but in forums, the troll can just keep posting, and posting, and posting.

As readers, we also absorb rudeness better, and remember the smallest slights, whereas we skip over the fluffy bunny comments where someone chips in with a happy "me too!" or "hug" and a row of smiley faces.

Andrew.


I don't believe this effect is restricted to the computer-related industries. I distinctly remember there being some rather vicious altercations between artists, particularly during the Renaissance period, over whose style was better.

What it boils down to is reputation. Trolls seek only to reduce the reputation of their victims.


What are the stakes, why are they low, and what can be done to elevate them?


the stakes are very small in all of academia


I agree on general principle, but there's a big difference between Viaweb & YCombinator vs. Arc:

Market.

Viaweb and YC were each aimed at the overlooked low end of a market. Small businesses may not be sexy, but they had a pressing need for a web presence, and they couldn't build it themselves. College students and young professionals may not be experienced, but they have a real need for money and advice, and they (usually) can't fund it themselves.

Arc, however, is explicitly aimed at the high end. It's meant to be a "LFSP". And smart people can write Arc themselves (and have ;-)). They don't need a new programming language; if they wanted one, they could invent it themselves.

Hence the contemptuous reaction, and why I think Arc is in more trouble than either Viaweb or YC. It's not really a problem when your peers are contemptuous, because that just means they're less likely to compete with you. It's a big problem when your users are contemptuous, because that means they don't need your project.

If I were in charge of Arc marketing, I'd position it as the "PHP for the ones that PHP forgot". PHP initially got its start as a way to throw a quick & dirty webapp prototype up on the web, and then incrementally refine it. But around version 5, it started getting complex and adding all these features from Java. That's left a vacuum at the bottom of the web language market. Rails and Django tried to fill it, but the average Django user is quite a bit more sophisticated than the average PHP user circa 1999. And Arc's design is already pretty well-suited to throwing a design up on the screen quickly.

Right now, the majority of Arc users seem to be disenchanted Lispers. Disenchanted Lispers tend to be smart people, and they also tend to be interested in language design. So instead of doing things with the language, they do things to the language. On arclanguage.com, I've seen one person write an app with the language, and several dozen people write enhancements of the language itself, either in the form of macros or as hacks to the interpreter itself.


They don't need a new programming language; if they wanted one, they could invent it themselves.

Where is it, then? Where is the final, perfect Lisp that's so easy to write?

What you seem to be saying is that being smart automatically makes people good at language design. And you are just dead wrong. Being smart may make you a good language implementor, but there's little correlation between that and the kind of skills you need to be a good language designer. Some of the worst languages in the world were designed by smart people.

It's a big problem when your users are contemptuous, because that means they don't need your project.

The Arc users on arclanguage.org don't seem that contemptuous.

So instead of doing things with the language, they do things to the language.

That seems a good sign to me. Munging the language is what you do with Lisp, like numerical calculations are what you do with Fortran. So what this means to me is that the users are real Lisp hackers, using the language as Lisp is meant to be used.

Plus I explicitly said that at this stage I'm mostly interested in the core language, and that I want to hear new ideas in that department.


"Where is it, then? Where is the final, perfect Lisp that's so easy to write?"

Maybe there isn't a final, perfect Lisp. Maybe there are lots of individual Lisps that are each perfect for some class of problems. That's one of the great strengths of Lisp: if an existing implementation is almost-but-not-quite what you need, you can throw a few macros on it and adapt it into a new language that is what you need.
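
For instance, a minimal sketch in Common Lisp -- the anaphoric "if" from On Lisp is a classic example of this kind of adaptation:

  ; aif binds the result of the test expression to "it", so the
  ; branches can refer to it directly (the capture is intentional).
  (defmacro aif (test then &optional else)
    `(let ((it ,test))
       (if it ,then ,else)))

  ; usage: (aif (gethash key table) (use it) (handle-missing))

A handful of macros like that effectively gives you a slightly different dialect without touching the underlying implementation.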

It's one of the greatest weaknesses too: you don't see the same willingness to say "This is good enough; let's move on to more interesting problems" that you get with, say, Python/PIL or Ruby on Rails.

If you're going to create a new Lisp (and expect people to use it; creation for the sake of creation is another thing, and doesn't need justification), you've gotta answer why it's superior to throwing a few macros on top of an existing Scheme or CL implementation. After all, wouldn't an individual programmer know his specific problem better than you do? They don't have to worry about harmonizing a variety of concerns, because they don't have to worry about other people's concerns. They could just build the language that's best for their specific problem and keep it in their own private toolbox.


What you're doing is arguing semantics. Whoever feels like replying to a reply of a reply long enough that all meaning has been lost and the other person stops posting is the "winner." Whoever is smarter and can come up with great examples can easily make another person defensive and tired of arguing. This person can then "win" arguments simply because the person who just wrote about actually developing something is busy developing.

This is actually what causes that disparity between what people think of new technology when it first comes out and two years later, as long as it still exists: one party was busy developing, the other party was busy coming up with reasons it can't be done, getting pats on the back for being so smart, then riding that dopamine rush to nowhere fast for two years until they meet the product again and just say "oh, I was wrong." Then this cycle continues. That's what PG's essay is about.

But being unable or unwilling to argue semantics does not make the original poster's ideas wrong. The only way to prove a determined person wrong is to engage in direct competition, and from the rear, of course, as the other party is already ahead of you. For obvious reasons, it's easier and "smarter" to come up with "logical" reasons than to "do the full experiment" to show it's wrong. We're not talking about math equations here, so a productive person's potential is more than his writing but also his skills, experience, credibility, and dedication, which can't all be ignored.


Actually you're wrong because you're assuming that writing (trolls or anything) isn't a skill. It is, and if you got trolled it's your own fault and I'm sorry your feelings were hurt. In addition, you obviously don't understand the first thing about "semantics."


;) I think you won.


If you're going to create a new Lisp... you've gotta answer why it's superior to throwing a few macros on top of an existing Scheme or CL implementation.

If that were true, Scheme and CL wouldn't exist themselves. Everything you can do in those languages you could have done by writing a few macros on top of their predecessor, Maclisp.

What you seem to be saying is that the evolution of programming languages has now stopped. No one ever needs to make a new LFSP, because SPs can do whatever they need by writing macros on top of existing languages. Do you realize how unlikely this is, historically? Especially in a field like programming languages, which is at the moment in a period of ferment.

If you think CL is the last word in Lisp, you probably have a higher opinion of it than any of its designers. They were in the kitchen when it was being made, and they are all too aware of all the hacks and kludges that went into it.


I love your answer.


"Where is it, then?"

I don't think a comma is required after 'it'.


I think there's a very interesting thing happening around Arc. It appears that people there are less afraid of making mistakes or repeating past attempts. Maybe because pg broke some Lisp traditions, so others feel more free to step out of them as well. Maybe because people with other language backgrounds are gathering, so there is more divergence of minds.

I have my own Scheme implementation (Gauche) and I make my living writing software in it, and I keep trying new ideas with it. In that regard, I don't need Arc. But that kind of atmosphere at arclanguage.org, that's not something I can get without Arc.

Maybe most of the ideas tried there will turn out not to work, but new ideas need a place like that to come out. In a sense, Arc isn't a new programming language, but a medium that sheds new light on the language design process.

And I'm watching it closely to steal whatever good ideas that come out :-)


It's funny, I was commenting to a friend a couple days ago that I really liked arclanguage.com because people were hacking there instead of talking about hacking or arguing about hacking. Kinda at odds with what I'm saying on this thread. Maybe I'm just a natural contrarian.

I dunno. I could certainly be wrong - I was about Reddit. (Though I was also wrong about Xobni, and they did basically the opposite of everything PG suggests in this article.)

Maybe I'm just not the target market for Arc. I'd like to think I'm a smart person, and I did download it and play around with it and reimplement it. But right now, I'm finding that Python and JavaScript let me do more cool things, so I'd rather play around with them.


That's not true. I'm learning Arc to use it to build my first web applications. I'm the sort of small business you mentioned regarding Viaweb.


"Right now" doesn't and will never matter. "Right now" is incompetence, lazyness, lack of awareness, comfort, and politics all rolled into one. That's the definition of "right now." It is a horrible, horrible measure of what can or should be achieved.

I don't think Paul intended this to be a ground-breaking article. I doubt he's saying that those specific examples lead him to a certain process with a certain guaranteed successful result. In other words, he's not saying because a=b, and b=c, a=c. In that case, if you can disprove the examples, Paul's theories don't stand.

What I interpreted PG to say is that he has done things a certain way for a while, and has noticed correlation where people reject change time and time again, but that this is normal, and in fact, one of the steps in the process of innovation.

You need to:

1) Come up with a simple concept that benefits someone, especially yourself or someone you know who might care to look at it when it's ready (something you know you can do in a weekend even if it actually takes you a while).

2) Promise to that person, or anyone else besides you, that you will release the software you promised.

3) Release the crappiest version you can, with the fewest features, so unimpressive that it's stunning (esp. if you release before your deadline, which gets you feedback sooner).

4) Take criticism and make adjustments.

5) Come up with a system for making promises to do x by a certain date, and a parallel system for making and measuring forward progress towards any of the objectives.

6) Stick with it, because people really love it when you've been so dedicated to a project.

7) You're now an expert and have a lot of design and coding experience under your belt, as well as happy users who like that you're improving the app and listening to them.

Most people never get past step 1 or 2, even though at those steps you really don't have to do anything.

Oh, and step 8) you will learn what makes your body tick and become very efficient and feel like you're "improving exponentially" and 9) you will come up with ideas and insights that may make you sound insane, especially since you'll be so confident and focused you'll be blurting them out, but that 10) a year or two later will have been proven 100% true.


"Right now" is incompetence, lazyness, lack of awareness, comfort, and politics all rolled into one.

And yet that's what the market generally pays for.

Your list reminds me of Levchin's advice from http://venturebeat.com/2007/03/26/start-up-advice-for-entrep...

As a final word of product development advice, Levchin encouraged founders to think about the Bible’s seven deadly sins - especially greed, sloth, envy, pride and gluttony. These characteristics, he said, describe many of the primal motivations for users.


Upmodded for your first half. But Arc is definitely not a drop-in replacement for PHP. And you can use PHP 5 in basically the same ways you could use PHP 3.


Agree- very small market for Arc. I know hundreds of people who can benefit from Viaweb, tens of people who can benefit from YC, no single person who will use Arc.


Agree. There is a world market for maybe five computers.

Ok, I am joking; you may be right. But it is notoriously difficult to predict the market for programming languages. Had you attended LL2 on [November 9, 2002](http://weblog.raganwald.com/2007/01/where-were-you-on-saturd...), could you have predicted which of the languages discussed would be popular... five years later?


I don't know. The market for this kind of computer language is very small compared to an eshop solution and funding. Market means someone will pay for using Arc.


If that's what you mean by market, there's no market for any programming language.


Is it possible for a programming language startup to make money if you have a superior language? Say you have a programming language that is to Lisp as Lisp was to Fortran, how (if at all) can you make money from it?


The only way I can think of would be to keep it secret and write applications in it.


Matlab is actually a high-level language, so it is possible to monetize a computer language.


I just use Octave (open source implementation of Matlab)


Do you think that the language you use matters much? i.e. if you have a language that makes you twice as productive as in Arc, would that be a big advantage or a small one? (compared to having money, a good idea or being smart, etc.)


Assuming that a significant portion of your time in a software startup is spent programming, a language that makes you twice as productive means you can try twice as many ideas, or develop the same software and free half your time for finding capital or users, or shorten your time to market by 50%. Wouldn't any of these be a significant advantage?


> Assuming that a significant portion of your time in a software startup is spent programming

That's exactly the question I wanted to ask! So is a significant portion of your time spent programming? Do YC startups measure this?

I find that if I code in a Blub language I spend 4 hours coding, but in a non Blub language I spend 2 hours thinking and 1 hour coding.


From my experience, most startups (2/3) spend the first 4 months almost entirely on programming. Later stages are obviously more balanced as other activities begin to take the spotlight.

So according to your stats you get a 25% improvement in productivity by using a non-Blub language (don't worry, as you gain familiarity with non-Blub programming this will actually improve), and that means launching a month earlier.

The only reason to not use a non-Blub language is that you need to write in the Lowest Common Denominator of your team and if the entire team isn't comfortable in a specific language you'd better not use it.

The only exception to the above rule is if you separate your efforts in a very specific way: instead of using the entire team to build project A, you use part of the team to build tools that make it easy to build projects like project A, and the rest of the team uses the tools the other half created to actually build project A. In that case the tool-building group can use more powerful languages without any adverse effects.


It's been tried, e.g. Clean, Dylan, or Miranda. The results are not really encouraging, and the language often gets open-sourced after its owner realizes they can't make money off it. And those languages are really nice languages (Dylan is my favorite Lisp variant, and Clean and Miranda are in many ways friendlier versions of Haskell).

Companies tend to have much better luck if the language is attached to a product, either as a scripting language or as part of a RAD suite. Think of AutoCad/AutoLisp, Flash/ActionScript, or Visual Basic.

But for general-purpose languages, not so much.


There's one paragraph in there that I think sums up the entire essay. When I read it I was so struck by it that I put that paragraph on my desktop image (http://www.cocunderground.com/desktop.jpg).

Here it is: I like to find (a) simple solutions (b) to overlooked problems (c) that actually need to be solved, and (d) deliver them as informally as possible, (e) starting with a very crude version 1, then (f) iterating rapidly.


I'd add the closing paragraph to this as well:

So when you look at something like Reddit and think "I wish I could think of an idea like that," remember: ideas like that are all around you. But you ignore them because they look wrong.


Reading PG's essays makes me very happy. Quoting again:

Here it is: I like to find (a) simple solutions (b) to overlooked problems (c) that actually need to be solved, and (d) deliver them as informally as possible, (e) starting with a very crude version 1, then (f) iterating rapidly.

Perfect.


(g) Profit!

Hence the gnomes' predicament is solved.


"When I first laid out these principles explicitly, I noticed something striking: this is practically a recipe for generating a contemptuous initial reaction."

Jeffrey Rosen has some good stuff to say about this in his book The Unwanted Gaze. One of his defenses of the right to privacy is that new ideas often seem wrong at first, especially when they're still only half baked. And because of this, progress is seriously impeded when the government has the ability to go through our journals and sketchbooks at any time. This is especially true in our made-for-TV society where most people won't put more than thirty seconds into trying to understand something, esp. considering the factors PG mentions. He suggests that for these reasons innovation may not be possible in a fully transparent society.


The govt or anybody being able to read your scrapbook is not the same as them telling you to stop based on it.

I've put the book on my list.


I remember reading Paul's essays, pre YC. When he wrote the essay announcing it, I remember thinking, "This guy is crazy, all he's gonna get is a bunch of school projects." Sure enough, he did, and wrote about it. I felt vindicated by that, saying to myself, "I knew THAT was going nowhere." People like to be correct in predicting failure for others, for whatever reason.

After reddit, loopt, zenter, anywhere.fm, etc., however, I don't think I was completely accurate in my prediction....


People like to be correct in predicting failure for others, for whatever reason.

I know there are a lot of people like this, but there is also a perhaps equally-large number of people who feel the opposite way. Personally, I love to be proven wrong when I predict failure, unless I have some other reason for wanting someone to fail (they use unethical practices, they're my direct competition, etc.).


I think people like to be correct in predicting the failure of crazy-sounding ideas because they don't want to feel like they are missing out on something big. I know I predicted that Google was all hype when it was selling at $200/share and unloaded the few shares I had. I kick myself for it now obviously, and even though it made no financial difference to me whether it sunk to $25 or went up to $600 (since I no longer owned shares), I still couldn't help hoping it would tank so I would feel like I made the correct decision.

I think this is really why you don't have to worry about competitors. If your idea is good enough, everyone will think you are crazy/naive. If your idea is simply decent, you'll never pick it anyway (because it's not exciting enough to you) and it will end up getting done by some existing company. Either way whatever you actually choose to work on will be unique until it's proven, by which point no competitor is catching you anyway.


> so I would feel like I made the correct decision

It's not just about feeling. If Google had tanked, you would be more justified (in a statistical sense) to think that you're smart and to engage in more financial transactions.


The difference is that the negative folks are the most vocal, while the positive folks are trying arc instead of whining about their perception of Paul's behaviour.

For example, I was just over on programming.reddit.com. The majority of the comments on this essay are smearing Paul's character. If you went by that, everyone hates it.

Yet the essay has +115 points. So there are 115 people who liked the essay for each person who dislikes it. Where are their comments?


Maybe they vote because they want the story to stay on top a while, either to convince others (Ron Paul) or because they want to see the discussion. Reddit is like a reality show. You might vote for the candidate you liked best, but you might also vote for a troublemaker to ensure a good show.


I know what you mean, and I was happy to be wrong in this case.


The saying I have for this negative reaction is "people see only what is in front of them." A good concept with poor presentation will be dismissed, and in the case of a language like Arc, presentation could also be defined to include "number of features, immediate utility, etc."

I know my reaction was similar to the crowd, but even so I recognize that it's the wrong thing to look for. If Arc's initial release state could be summed up as "Lisp with some cleaned-up and rearranged features," then it's really not that different from say, Digital Mars D, which is best described as "C++ with some cleaned-up and rearranged features." The difference is, D has a massive compiler backing it, and its scope is unlikely to change. Arc's minimal implementation substance only makes it easier to change tack midway.

I've only gradually learned to restrain my sense of ego and take a similarly minimal approach with my own projects (indie games) - my newest process is to always start the final implementation with a text console interface. Besides encouraging better architecture in preparing for a later graphical interface, it's easier to debug the key elements this way than to have "stuff moving on the screen" at 30 or 60FPS and get frustrated figuring out which of those frames is the one where you're getting an error, let alone what routine is causing the problem.


If at first, the idea is not absurd, then there is no hope for it.

- Albert Einstein


Your point about having a version 1 and iterating quickly based on real world input reminds me of my experience as a mechanical design engineer. The most costly mistakes made in that field resulted from too much design and not enough prototype phases. The typical design process back in the 80's was to meticulously draw every aspect of a component, send it to a machine shop for tooling, get manufactured parts and put them on the test stand. Only then would you discover that the gage was too light, the material was the wrong spec or the welds had to be moved.

I introduced the concept of prototype early and often. Actually re-introduced, as this is how most products were engineered in the olden days. Someone would suggest a new gage, weld placement, cross section, and I would don my overalls, head to the shop, cobble up something and demonstrate its feasibility before the draftsmen could decide what size paper to draw it on. When I worked at Chevrolet I took this to the max and prototyped entire vehicles years before the official prototype build phase.

When I transitioned to the world of networks and software I was shocked at how little prototyping there is. Website developers would deliver project plans that had two weeks of "QA" built in before the launch date. It was crazy, any major problem uncovered in that two weeks would delay the launch. It is still challenging to get a programmer to demonstrate a prototype early. So, I think you are on to something. Good luck with ARC!


If Viaweb didn't process credit card transactions for the first year I'm curious what initially attracted merchants to using it.


They wanted to be able to generate a good-looking site with large numbers of products and a way for people to order them. Till about 1997 online order volumes were so small that it was no problem for merchants to manually punch in credit card nos on their POS terminal.


Maybe the ability to reach customers around the world much cheaper than any other way? Remember that the first (and most of the) customers were already-existing businesses, hence they presumably already had a method for processing transactions.

At least, that's my guess.


I don't think anyone was really processing credit cards through the web at that point, at least not unless you paid through the nose for a custom system. "Sales" could still be made over the web, you just couldn't receive payment for them - like an online catalog really.


I like a lot of what this essay says, but I still don't think it applies to Arc. Because the criticisms of Arc would have happened no matter what the design had been.

For the most part, Lisp people will never like any modification to the language that changes their pure abstract computation engine into something that more people would want to use.

Behold, a Usenet post from 2000 on the notion of a "lispscript" that could compete with perl or awk:

http://perl.plover.com/yak/12views/samples/brief/why-lisp-wi...

That said, I personally have no idea if Arc is all that good or not. I get it, but it doesn't inspire me the way Haskell or Erlang do right now. But Arc seems to be successful by the designer's own standards.


I like Arc, but there's just no way that it follows the six principles laid out here.

(a) simple solution - yes

(b) to overlooked problem - no. It's just a personal spin on Lisp/Scheme. It covers exactly the same territory.

(c) that actually needs to be solved - no. Programmers have many good languages to choose from. Arc is a nice little wrapper on top of Lisp to make it pretty, but any undergrad could do the same thing.

(d) deliver them as informally as possible - no. You don't write long essays about a language 5+ years before you show a prototype if "informality" is your goal.

(e) starting with a very crude version - yes

(f) iterating rapidly - no. You could get a master's degree in computer science in the time it took for the minimal prototype of Arc to show up.


It's a succinct summary of a method for developing new ideas/products. The closest I have seen for brevity is Bob Bemer (the inventor of ASCII) with

"Do Something Small But Useful Now"

which he also wrote as ((((DO SOMETHING!) SMALL) USEFUL) NOW!)

so that it became this sequence:

Do Something

Do Something Small

Do Something Small But Useful

Do Something Small But Useful Now

see http://www.trailing-edge.com/~bobbemer/ for more info


The Law of New Inventions by Mike Rozak (game designer) : http://www.mxac.com.au/drt/LawOfNewInventions.htm

It fits perfectly with your essay. Good write-up btw.


I'd like to add a 7th - the ability to ignore problems that are irrelevant to the problem at hand. PG actually mentioned it all over the essay, but seems to have left it out.

btw, thanks PG, this essay has had a very profound effect on me; it makes me rethink some of the "ideas" I have and how I will approach them.


Is it just me, or does Twitter fit perfectly in that definition?


Yes, when Evan came to speak at YC recently I was struck by the similarities. Here was an idea that was literally right under everyone's noses for about 10 years, and everyone ignored it.

Evan is a great product designer. That's his secret weapon.


Yesterday I was watching Smith & Jones. Britcom also has much of this: constructing ideas around the stupidly obvious and ignored, from the perspective of the comic writer.


This is an interesting take on (b) overlooked problems:

A few years ago, Gary Hamel and colleagues analyzed more than a hundred cases of business innovation to learn why some individuals, at certain points in time, are able to see opportunities that are invisible to everyone else. They learned that you need to pay close attention to four things that usually go unnoticed:

1. Unchallenged orthodoxies—the widely held industry beliefs that blind incumbents to new opportunities.

2. Underleveraged competencies—the “invisible” assets and competencies, locked up in moribund businesses, that can be repurposed as new growth platforms.

3. Underappreciated trends—the nascent discontinuities that can be harnessed to reinvigorate old business models and create new ones.

4. Unarticulated needs—the frustrations and inconveniences that customers take for granted, and industry stalwarts have thus far failed to address.

original @ http://discussionleader.hbsp.com/hamel/2008/01/innovation_ha...


This is a great list and nicely expands "(b) overlooked problems." In his book "Innovation and Entrepreneurship" Peter Drucker offered the following observation on the frame of mind needed to spot opportunities for innovation.

"Innovation requires us to systematically identify changes that have already occurred but whose full effects have not yet been felt, and then to look at them as opportunities. It also requires existing companies to abandon rather than defend yesterday."

He goes on to suggest seven places to search systematically for opportunities:

  The Unexpected
  The Incongruous
  Weak Link In Existing Process
  Industry Or Market Structure Change
  Demographics: Size, Age Structure
  New Zeitgeist: Perception, Mood, Meaning
  New Knowledge


I like to find (a) simple solutions (b) to overlooked problems (c) that actually need to be solved, and (d) deliver them as informally as possible, (e) starting with a very crude version 1, then (f) iterating rapidly.

I'm still trying to get a handle on which of these apply to Arc. Perhaps I'm looking at it the wrong way, but Arc feels a little light in areas (a-c).


Arc is a simple solution to the overlooked problem of core language design.

http://www.paulgraham.com/core.html


In general, at this stage, the amount of flak that Arc has received is so disproportionate to the normal amount of flak that new ideas receive because it hasn't solved any problems that real hackers have encountered, well enough that they'd want to switch to writing programs in it. You haven't really solved the overlooked problem of core language design until people start using your cleaner core language, and to do that you'll have to give people compelling reasons to start using it. If you don't agree, the BeOS people have solved the overlooked problem of operating system design and are on their way to meet you, together with the people who invented Esperanto.


It solves the problem hackers face in every program they write, and which high level languages exist to solve: making programs smaller.

Try translating some Arc programs into Common Lisp or Scheme and you'll see what I mean.
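
For a rough sense of it, here's the accumulator generator (a sketch; the Arc version is recalled from memory, so the exact syntax may be slightly off):

  ; Common Lisp: return a closure that adds i to n and returns the new total
  (defun foo (n) (lambda (i) (incf n i)))

  ; Arc: [...] is shorthand for a one-argument fn, ++ increments a place
  (def foo (n) [++ n _])

Same behavior, noticeably fewer tokens.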

And incidentally, I don't have to give hardened users of existing languages reason enough to switch. There are new people learning to program all the time, and to them, all languages are on a level playing field. If they look at Arc and CL and see that programs are 50% longer in CL, why would they choose CL?


Ok, I think I get it now if I look at the "Arc approach to language design" instead of "Arc in its present form". The problem isn't to come up with a better language, it's to set a good example of the better language production process and in doing so perhaps get a better language as a byproduct. Here are the morphisms I see:

a) Programs in "high level" languages can fail to be shorter than their lower-level equivalents. The simple solution: be merciless in keeping the important things short!

b) The overlooked problem in language design is high-level language brevity.

c) The "language design" problem that needs to be solved is... not sure about this one. I'm guessing it's the fact that the rate of change in the field is putting more programmers into the role of language designers at an increasing rate and anything that helps them avoid bad decisions based on ignorance is significant.

d) deliver informally as possible: when designing a new language, write enough to make the goals clear, address issues in a discussion group and put the code somewhere without sweating organizational details like setting up code repositories, bug tracking systems, regression frameworks, etc.

e) crude version 1: implement the minimal stuff using an existing system and don't worry about the fact that to the unaware it may seem just like a trivial program in that system.

f) iterate rapidly: don't get bogged down by things like release processes and backward compatibility. Just focus on getting feedback, experimenting, measuring and looking for improvements.


Arc solves the problem of making programs smaller, I agree. I'm not merely admitting that: that's exactly the thing that's impressed me about Arc 0 in the first place. But the problem with programming languages is that they aren't single-purpose honed tools: they're more like a swiss army knife, or a house that the programmer has to live in, or an operating system for the programmer's thoughts, if you will. What a programming language has in common with all of the above is that it has to solve very many problems at once: in order to be considered an unequivocal advancement, it has to solve _all sorts_ of problems at least as well as languages people are already using. Arc currently solves only one problem, albeit very well, and solutions to all of the other problems (copious libraries, that profiler thing you mentioned in one of your essays, abundant and welcoming documentation - all stuff you've identified as necessary for a successful programming language in your own essays, in fact) are being delayed until a later release, an unspecified date in the future.

Taking the analogy to a house, Arc is like a half finished construction project. The frame is up, they're still putting insulation in, and there's one finished room with a cot, a table, and a hot plate which Paul Graham is using to cook the news.yc software. Sure, you might attract scads of interior decorators by opening your house up to random strangers, but very few programmers will be willing to come over to live in Arc. They'll stay in their own houses which are already well stocked with comfortable furniture, decorations, etc., even if it is all a little cluttered and you have to go through the kitchen in order to reach the library from the bedroom.

Some of the people speaking out against Arc are genuine trolls, but some are just saying in a non-tactful way that they'd prefer to stay in a house with drywall, thank you very much.


Does Arc make JavaScript look really verbose? I'm guessing it does, if conciseness is its primary goal. How much of the delta is due to its macro system?


The Arc Challenge has shown that for many languages this translates into roughly as many lines of code (see the Seaside solution, for instance).


I think the six-principles approach is great, and it really shows why the initial reaction will always be contemptuous and therefore doesn't mean anything.

Of course, not everyone can easily copy the approach, you need to be an original thinker in order to not overlook overlooked problems.

I don't understand the last part of the essay about reddit as a classic example. "Tell people what was new", was that an overlooked problem? News agencies have been around for a long time. "Stay out of the way" was only possible because of the user-submission-and-voting system, which seems to me a clever and non-obvious idea. What I like most about this idea is the voting part, it tells me what's interesting, I don't care about whether it's new.


I still wonder why RTM doesn't say anything at all. Does he read HN? I would like to know his point of view, too.


I still wonder why RTM doesn't say anything at all.

People have been wondering about that since he was a small child.


If you work on overlooked problems, you're more likely to discover new things, because you have less competition.

I'd like some help interpreting this sentence.

Why does the lack of competition make it more likely you'll discover new things? (And what is things referring to?)

I assume it means you can attack a problem at a more leisurely pace, so you see solutions you would have missed if you were in a feature-copying war with competitors. But I doubt PG would advocate running your startup leisurely.


Why does the lack of competition make it more likely you'll discover new things? (And what is things referring to?)

Because if few are looking where you're looking, there's more chance that things you discover will be new.

By things I mean pretty much anything that could be the object of "discover", from mathematical concepts to narrative techniques.


Oh, right – overlooked problems.


From No Contest by Alfie Kohn:

"Robert L. Helmreich [tested] the relationship between achievement, on the one hand, and such traits as the orientation toward work, mastery (preference for challenging tasks), and competitiveness, on the other. A sample of 103 Ph.D. scientists were rated on these three factors based on a questionnaire. Achievement, meanwhile, was defined in terms of the number of times their work was cited by colleagues. The result was that "the most citations were obtained by those high on the work and mastery but low on the competitiveness scale."

This startled Helmreich, who did not expect that competitiveness would have deleterious effect. Could the result be a fluke? He conducted another study, this one involving academic psychologists. The result was the same. He did two more studies, one involving male businessmen, measuring achievement by their salaries, and the other with 1300 male and female undergraduates, using grade point average as the attainment criterion. In both cases he again found a significant negative correlation between competitiveness and achievement.

...

But Helmreich did not stop there. As of 1985, he had conducted three more studies. The first compared the standardized achievement tests of fifth- and sixth-graders to their competitiveness. They were negatively correlated. The second examined the relationship between performance of airline pilots and competitiveness. The relationship between the latter and superior performance was negative. The third looked at airline reservation agents and again found a negative correlation between performance and competitiveness.

...

Consider the question of artistic creativity. The little research that has been done suggests that competition is just as unhelpful here as it is in promoting creative problem-solving. In one study, seven- to eleven-year-old girls were asked to make "silly" collages, some competing for prizes and some not. Seven artists then independently rated their works on each of 23 dimensions. The result: "Those children who competed for prizes made collages that were significantly less creative than those made by children in the control group." Children in the competitive condition produced works thought to be less spontaneous, less complex, and less varied.

...

Consider journalism, a profession that, while no more competitive than many others, is worth exploring by virtue of its unusual visibility to outsiders. The frantic race for news generates terrific levels of anxiety ... on the part of journalists. Can we at least point to better reporting as the result of this competition? Setting reports against one another in a battle for space on the front page or the first block of a television news show probably lowers the quality of journalism in the long run, and so, too, does the contest among news organizations for subscriptions or ratings."


>> 'Startup funding meant series A rounds: millions of dollars given to a small number of startups founded by people with established credentials after months of serious, businesslike meetings, on terms described in a document a foot thick.'

This is a bit of a stretch in my opinion - there have always been companies taking smaller investments; whether seed from a VC, seed from an individual Angel, or borrowing money from family.


Hi Paul:

My father-in-law was an inventor. His one claim to fame was a cheap plastic pulley-like thing you could stick on the bottom of a telephone, and wrap the cord around it. It was called a "Cord Caddy." He made a small...no a medium fortune on this in the early 1970s.

I went to a few home trade shows in L.A. with him. The creative energy at these places was palpable. (I even got to meet Ron Popeil there once). You'd go around the booths, see all these incredibly simple....no STUPIDLY simple things all these guys were making fortunes at. I'd mutter at myself, "Heck, anybody can come up with this crap." And then a voice from above thundered down, and slapped me upside the head. My ears are ringing thirty years later. The Voice from above said, "Yes...but you DIDN'T, did you!"
That's the secret. ANYONE can come up with stuff...but almost nobody does. The real inventors in this country are a minuscule percentage of the population who actually DO something with their dumb ideas. I have unmitigated respect for Ron Popeil and his ilk, regardless of the obnoxious late night commercials.

eric


Reddit started out as a blatant copy of digg, which was 6 months old when reddit launched. I think that sort of invalidates large parts of this essay.

Some digg copy from Dec. 2004: "What's Digg? Digg is a technology news website that combines social bookmarking, blogging, RSS, and non-hierarchical editorial control. With digg, users submit stories for review, but rather than allowing an editor to decide which stories go on the homepage, the users do."

Reddit copy from August 2005: "A source for what's new and popular on the web--customized for you. We want to democratize the traditional model by giving editorial control to the people who use the site, not those who run it. All of the content on reddit is from users who are rewarded for good submissions (and punished for bad ones) by their peers. You decide what appears on your front page and simultaneously, which submissions rise to fame or fall into obscurity."

And, yes, I realize there was nothing completely original about digg, the idea was originally done by kuro5hin, yadda, yadda, and I actually have plenty of respect for PG and the reddit guys. I'm just saying....


IMHO the reason the response to Arc is "underwhelming" is because the anticipation has been building for 5 years, but it turns out the implementation is "just" a set of macros. And it certainly doesn't fit the "iterate rapidly" concept.

Now - this is not a criticism of PG's work - on the contrary, I believe his design matches up with his philosophy in On Lisp very well. Looking past the process, people are now actually starting to play with the language and make use of it.

What I'm mostly interested in, though, are the lessons learned, if PG is willing to share them. Why did it take as long as it did? Is it because this is not his day job, or were there additional design changes or problems that he had to solve along the way, etc.? Getting a sense of that will shed light for other aspiring developers.

I for one tend to shoot for just the right thing and have scrapped quite a few prototypes so I could go for the perfect one, so I would sure love to hear from PG about his experiences.


As with everything else that I can remember reading by you, I guess it's the fact that you write the way I'd advise people how to talk to me: don't be pedantic [don't treat me like I'm an idiot or just a "child"], and don't speak as though trying to impress me [don't waste time using big words when small ones will do; don't use many words when a few will do].

I like the way you share ideas and concepts with me, the reader and message receiver.

Obviously, Paul, the whole is greater or less than the sum of the parts. You somehow manage to make points that others couldn't with many more words. It's how you use the English language.

It's how you reason and think.

I think.

I think you annoy the hell out of insecure little twits whose "authority" they cling to like little kids afraid as hell that somebody (a "bully," perhaps?) is going to pry their "binky" (or toy or some other thingy [sic]) out of their sweaty little hands.

Or something like that.


You know PG, you can't discuss Arc beyond a point - the whole discussion gets trivial - not the content but just the fact that there have been so many discussions on the subject that one tends to skip or hurry through it.

I think the best response will be your continued work at Arc as opposed to a well written essay.


Simple, Informal, Crude and Fast: Lessons for many things, not just Web 2.0.

Small teams of hackers definitely present the fewest problems with this kind of approach, but if people in large, established organizations in fields from mechanical engineering to life science (with the possible exception of virology...which can take longer than a Y Combinator season just to incubate the test vector) don't start applying this approach, they will be overtaken by fast newcomers when instead they could have fostered and benefited from them.

http://thethreepercent.com/2009/05/04/paul-graham-simple-inf...


Yes yes yes! Once again the astute Mr Graham slices through conventional thinking.

I've received a standard reaction to new ideas all my life. In the early seventies I went looking for a sound system with dual cassettes ... "why would anybody want that?". A few years later I wanted to use VHS cassettes as the basis of a multi-track audio recorder ... "why would anybody want that?" In the early nineties I started writing radio automation software that ran on a standard PC ... "why would anybody want that?"

All these things (and many others) eventually had their season. And now when I hear that phrase, my ears prick up because I =know= I'm on to something.

::Leigh


How does this mesh with "don't try to create trends, find and ride them"? That's advice I've heard a lot.

The final note about Reddit seems to say the opposite: "The Reddits pushed so hard against the current that they reversed it". That is probably incorrect. They found a local maximum in the trend for social news before others spotted it.

This doesn't really repudiate anything you say in the essay. But people shouldn't get the idea that just because there is resistance, they are doing something good. There needs to be a reason why the critics fail to see the light for you to claim you're working on an overlooked problem.

The trick here is that while you're working on it, you don't know if it is something that needs to be solved.


How does this mesh with "don't try to create trends, find and ride them"?

You mean Joe Kraus's advice? That's advice for making money at something before your funding runs out. If burn rate is a factor, you may not be able to survive a contemptuous initial reaction, even if you're right.


Yes, I meant Joe Kraus's advice. It's interesting how I took it to mean something different than you did: a bad idea generally, not just if you don't have enough money for it.

I suppose you're right and I only found it general advice because it completely applies to my current state in life. Lots of people make that mistake: "this totally applies to me, it must be good advice generally"

After the first $10B, once I start working on a space elevator and robotic asteroid mining, I expect the luxury of a contemptuous initial reaction.


For more details, read the Innovator's Dilemma and Innovator's Solution.


Or if you have some time to kill, listen to the author's highly-rated talk on IT Conversations:

http://itc.conversationsnetwork.org/shows/detail135.html



Great essay Paul, I especially liked this part: "Don't worry about trying to look corporate, the product is what wins in the long run."

This is so true! I consider this one of the top reasons many startups fail, but at the same time, this is a result of the paradox of offline vs. online culture, as it still holds true that many offline businesses would fail without looking corporate. It's a transition, so we'll be fine in a couple of years. Do you agree?

Basar Kizildere Istanbul, Turkey


"... your solution can benefit from the imagination of nature, which, as Feynman pointed out, is more powerful than your own."

Which Feynman quotation is this referring to?



The bulk of the programming community substitutes "can we?" for "should we?" Your design philosophy is a result of asking the 'should' question. The work of the 'can' question people piles up opportunities for proper application of the 'should' question. This is one way of determining the trolliness of a person or point of view. The troll is incapable of asking the 'should' question.

Love your stuff, Paul.


“Perfection (in design) is achieved not when there is nothing more to add, but rather when there is nothing more to take away.” Antoine de Saint-Exupéry

Your essays are not bad, but there are still parts you can take away.

Creating a good design is very time consuming. One spends most of the time searching for the onions.

PB


I particularly agree with (a) creating simple solutions but even more so with (c) solving problems that actually need to be solved.

It’s amazing how often a complex solution can create more problems than it solves and / or completely miss the underlying problem.


The crux of the essay is the process Paul refers to (steps a-e). This model gives me great confidence that we are on the right track, and while we were not able to launch as fast as we would have liked, we still see the basic unsolved, overlooked problem that inspired us to begin.

At the same time, it's what annoyed me about being passed up last funding round: if this is how they judged ideas, how could they have glossed over us? Oh well, until we are a success they were correct.

@pg: thanks for the inspiration and clarity as always


Thanks, Paul! I always enjoy reading your insights. Some of the best free advice around. You might find that my company's design philosophy (for some reason...) is similar to yours!

www.ensigntech.com


The Design principles are solid. Thank you for expressing them lucidly. Personally, I am encountering the same forces, and this essay provides useful arguments.


Excellent article. I've been reading your essays for well over a year now and am always impressed by your insights.


But Hamming (www.paulgraham.com/hamming.html) in "You and Your Research" says exactly the opposite thing:

"If what you are doing is not important, and if you don't think it is going to lead to something important, why are you at Bell Labs working on it?"

Why do both approaches work?


Great words as ever. The best place to find feedback is indeed in hits and seeing who reads, not the comments of those that are too lazy to read. As for slashdot, it used to be my starting portal to the web, but it has been replaced by reddit :-)


In the 90's people were saying the same things about Perl that they are now saying about Arc.




Awesome essay.


Graham,

When will you stop ranting about Viaweb? I understand it was your single achievement, but it was essentially a fluke. There were hundreds of startups trying; one of them got lucky. And where's Viaweb now? It was a long time ago, and you should consider moving on with your life.

Now, as a self-proclaimed nerd, you know of Gary Gygax? Well, if Viaweb was your D&D, YC is your Lejendary Adventures: a heartbreaking attempt by a spent man to recreate his former success.

And just as Gygax died, Arc is your death.

You're dead, Paul. You're as dead as that Lisp of yours which you keep rambling about. Even people who leech your money can't keep up with your delusions of Lisp; Reddit, for example, was redone in Python as soon as the mad old grandpa loosened the sack.

And that's why Arc is depressing and smells of rot.

Rest in peace.


Wow! Tough words. For a lot of us, Y Combinator is showing us a way forward during troubled times. Paul is definitely not dead. His name and thinking come up nearly every day among my colleagues across many companies. Funny - even though Paul says he is a hacker, it is his impact on business model innovation that will last.


Thank you for yet another great essay Mr. Graham . . .

I think that - in a sense - your essays are programs for human minds.



