Linus Torvalds: Well, so this is kind of cliché in technology, the whole Tesla versus Edison, where Tesla is seen as the visionary scientist and crazy idea man. And people love Tesla. I mean, there are people who name their companies after him.
The other person there is Edison, who is actually often vilified for being kind of pedestrian and is — I mean, his most famous quote is, "Genius is one percent inspiration and 99 percent perspiration." And I'm in the Edison camp, even if people don't always like him. Because if you actually compare the two, Tesla has kind of this mind grab these days, but who actually changed the world? Edison may not have been a nice person, he did a lot of things — he was maybe not so intellectual, not so visionary. But I think I'm more of an Edison than a Tesla.
That's how the US got its start too. Well into the 19th century, most inventions came from Europe (England, France, Germany, etc.), from the steam engine to the refrigerator, and from the radio to the internal combustion engine, cinema and photography. All European inventions.
What "long time"? China has been the "factory of the world" for a mere 2-2.5 decades and has made huge leaps since the early '90s, when it started.
For contrast, it took from '45 to the mid-70s or so for Japanese companies to start innovating.
>Only duplicated others' work identically or with uglier, shoddier construction at lower prices?
Also, "uglier, shoddier construction at lower prices"? You seem to be under the impression that the Chinese just build cheap knockoffs.
Actually, they also build the high-quality, high-precision stuff you buy, from iPhones to BMWs (as of now, 1 million BMWs have been assembled in Chinese plants).
That's what they did in the past, as I stated. They were renowned for it. Later on, they started innovating a ton on top of that. The transformation was covered very well in the Wired Shenzhen documentary.
I think it's a really interesting phenomenon how quickly we as a culture are able to imagine that what we're experiencing right now is without precedent.
Historically, it only took a decade or so in Japan for copying to turn into innovation. China has already completed that decade.
Like Elon Musk for example (who Torvalds seems to be making a gentle dig at, in the quote above).
There's a myth that some people are born with amazing powers (Tesla, Einstein), but what people don't see is the hours they put in from a young age.
Therefore, both Tesla and Edison worked hard; it's just that Edison bragged about it.
Those who create? Or those who share their creations?
All production of electric power in the world, and most of its distribution, is today based on Tesla's methods. Not by accident: Edison's were simply ineffective.
(Some have also claimed that the first electrical AND logic circuits, today miniaturized in every chip in every computer, were the ones Tesla built as part of his wireless remote-controlled boat in 1898. The article here: https://www.computer.org/csdl/mags/dt/2007/06/mdt2007060624-... just mentions http://www.tfcbooks.com/teslafaq/q&a_024.htm, but I can't find other sources. So let's stick with electric power for today. Tesla.)
Tesla's response was: if you had thought a bit more, you wouldn't have had to sweat so much.
The Tesla vs. Edison narrative is always couched in terms of the idea guy vs. the more pragmatic (perhaps more business-oriented) guy, not unlike the popular Woz vs. Jobs narrative, or the Jobs vs. Gates narrative in the '80s. These are popular narratives and archetypes that reflect the people involved, but can lead people to mythologizing history rather than understanding it.
In a different field, Lennon vs. McCartney.
So, among my peers at least, conventional wisdom is that Tesla was 100% an amazing visionary and got screwed over by unfair forces of history, and Edison was the villain whose contributions are overrated by historians. There's some truth there, but more than anything else it's a historical narrative where people are slotting Tesla and Edison into archetypes.
When a lot of people talk about Tesla vs. Edison, they're really just talking about those archetypes, and revealing to what degree they value inspiration vs. perspiration. I think that's all Linus is doing here, saying that in his mind perspiration is undervalued and inspiration is overvalued among his peers. I don't think he's really trying to make a historical argument, which is what a lot of the commenters here are assuming.
I would also add: Torvalds is an enchanted unicorn from a resume perspective; it's not like he has followed a typical FOSS career trajectory. So seriously, what does he know that you can put to use? It's like a beautiful young woman telling you how to get just a warning instead of a speeding ticket next time: just do what she does (except if you follow Linus's lead, you will definitely get the speeding ticket).
Humans have a built in instinct to follow great leaders, but if you are not a great leader yourself, emulating them might not actually work for you.
I have tremendous respect for Linus and for the HN community; I'm just saying, let's use critical thinking. There are plenty of successful super-lightweight coding projects; he works on a kernel-level OS supporting a huge legacy of hardware; he was already first to market, you still need to be; it's always tradeoffs. Borrowing/paraphrasing an idea from finance or population genetics: maybe the world is the way it is because that's the right mix. Why believe you know a better way, when your believing so is already part of the right mix?
So I'm not saying don't think, reflect and improve; I'm saying don't throw yourself into these debates all on one side. One size doesn't fit all.
Apple's slogan may have been "think different", and they have the image of being radical innovators, but hardly any of their innovations actually originated with them.
Apple is 99% perspiration and 1% stealing good ideas :-)
There's even an infographic: http://mashable.com/2012/10/27/apple-stolen-ideas/#Fs4Q5gSS....
Even something like a simple library, published professionally, needs high quality: code, tests, documentation (user/developer guides), and examples at a minimum. Basically, the long tail of polishing is that 80% of the perspiration; 19% of the perspiration is getting to the stage (an MVP, maybe) where you can start polishing; and that 1% inspiration is when you kick your backside to get going!
Then, as I use the library, if a bug comes up that seems like it might come back, I add a test. If I break something accidentally, I add a test for that. And if I can't remember how an interface works, I document it in the readme.
That's it. Over time my documentation and test coverage come to match my use of the library. Anything I don't use gets deleted. And I end up with pretty ok documentation and test coverage.
The philosophy is: your first interface is never right anyway, so don't bother testing or documenting it. Just document the stuff you fix.
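A minimal sketch of that workflow in Python: the `slugify` function below is a hypothetical stand-in for whatever library interface you actually use, and the test exists only because a real bug once bit.

```python
import re
import unicodedata

def slugify(text):
    """Tiny stand-in for a library function (hypothetical example)."""
    # Strip accents, lowercase, collapse non-alphanumerics to dashes.
    text = unicodedata.normalize("NFKD", text).encode("ascii", "ignore").decode()
    return re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")

def test_unicode_input():
    # Regression test added only after the bug actually came up:
    # non-ASCII input used to produce broken slugs, so keep it covered.
    assert slugify("Café au lait") == "cafe-au-lait"

test_unicode_input()
```

The point is the ordering: the test follows the bug, and the docs follow the forgotten interface, so coverage tracks real use rather than a guessed-at first design.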
Edit: At the same time, there are also few options to make people believe it's yours besides exceptional execution, but that's not what the quote is about.
Apple is going to steal all the thunder again, this time in AR/VR, isn't it.
"When is copying flattery, when is it thievery, and when is it sheer genius? In this hour, TED speakers explore how sampling, borrowing, and riffing make all of us innovators."
As an example, I worked on a tablet in '99. As the referenced article indicates, we were not alone, nor were we first, and more came after. Around the same time there were a number of attempts at gluing together PDAs and phones. With e.g. Palm you had a UI that was not that dissimilar to the iPhone's (ironically, things like the memory model in PalmOS were inspired by pre-OS X MacOS - it was awful).
But all of these failed to grasp fundamental issues that Apple understood, and you have it right when you say they "got to work" on making it work properly, making it user-friendly, and so on. But I feel you undervalue how transformative what they did was.
When we started designing the Freepad, it's not like we were not thinking about usability, making it sexy, etc. But while I think we did reasonably well in terms of aesthetics, we were too locked into thinking about it as a phone replacement, and let ourselves be led down the wrong path by the technical limitations we ran into:
Battery tech had just barely made cellphones viable at that point. Instead of deciding what it would take, and waiting, like Apple did (not just on batteries, but also on screens and other components), we let the batteries available at the time dictate that this would not be a device to take out and about. The lack of a well-established wifi standard and extremely slow GSM data meant we were led down the route - since the battery meant you wouldn't run around anyway - of thinking of this as a home device, and of considering DECT (wireless home phones) with a data extension a viable solution, firmly locking you into using the tablet around the house or at best out in the garden.
We ended up with a resistive touch screen because it was the only alternative that was viable cost-wise. RAM and flash were woefully limited, which dictated a model without any app ecosystem, because we had to tune everything (hence things like Nano-X mentioned in the article, coupled with our own widget library) to make even basic applications fit.
So while you may say that Apple "just" stole the idea (the idea was not ours either - I grew up with sci-fi describing or showing tablet-like devices; we were never under any illusion that we were first), they did more than "go to work".
What they did was that they refused to accept the limitations, and rather than let the technology-limitations dictate the product and lead them down the path of turning a good idea into a bad one, they didn't compromise and instead waited it out. They did this with the iPhone as well as the iPad.
This is why so many who had seen the previous tablet and smartphone fad reach maximum hype and fade away looked at the iPhone and went "so what?" (I was guilty of that). We'd already had devices that came from a similar idea, and we failed to realise that while the germ of the idea was similar, what we had ended up with last time was an idea compromised in so many different ways that it was no longer the original idea but a corrupted, bastardised version that had lost the important bits when we tried translating it into reality.
I mean, the tablet I worked on was tethered to your house, for example, as mentioned above (and see the title of the article - it references Ericsson's "Screen Phone" - which basically says it all: first-generation tablets were either replacements for landline phones, in the case of Ericsson and Screen Media and others, or laptops with swivel screens in the case of PC manufacturers; the latter flawed in entirely different ways).
We didn't start out with that vision, but we let practicalities corrupt our vision, and convinced ourselves that the result would still be good enough, because we didn't see any alternative: It had to be good enough.
Part of the problem was that we failed to grasp which parts of the idea were essential, and which we could compromise on. Mobility was essential, and we compromised on it. Screen and touch quality were essential, and we compromised on them (the screen was great for the time, but awful by the standards of even the first iPad). An application ecosystem was essential, and we compromised on it and never really even thought seriously about it.
The result, to me, is that while the core of the idea might have started out the same, the iPhone and iPad were fundamentally different and innovative from the attempts that went before them, not just because Apple "got to work" but because they understood where you could compromise, and which parts of the idea were essential and had to remain no matter what.
But the main point I was trying to make was: taking Apple's slogan to mean "just think of something new and the world will be changed" is wrong. Apple hardly even did the "think of something new" part, but they did the hard work of getting it to a useful state. The 99% perspiration.
But you're right, Apple did have something extra, they didn't get to where they are just by doing the hard work.
So let's say "99% perspiration, 1% recognising and stealing good ideas, and 1% knowing where (not) to compromise"? :-)
Kay had a vision, but his vision was quite different from the way personal computing went. He was thinking of closed systems which would replace dedicated word processors. The Xerox Star was the result. Imagine a machine with Microsoft Office built-in, with all the software installed at the factory.
Kay also had a thing for discrete-event simulation as the killer app. Kay wrote, in Personal Dynamic Media, "In a very real sense, simulation is the central notion of the Dynabook." This matched well with Smalltalk, which was the successor to Simula-67, an ALGOL dialect with objects for discrete-event simulation. All that "message" stuff came from the simulation world, where you have many asynchronous blocks passing events around.
The real successors to the PARC work were the first generation of UNIX workstations. The Three Rivers PERQ, the Apollo, the Sun I, and the Apple Lisa all predated the Macintosh. They were all much better, but much more expensive. The UNIX workstation era tends to be forgotten, but those were the first good desktop computers. Macs were toys.
The original Mac was a flop. No hard drive, 128K RAM, too slow, and too expensive. The competition was the IBM PC/AT - 20MB hard drive, about 1MB RAM. This almost killed Apple. Not until the Macintosh SE (1987) did Apple have a built-in hard drive. In the Apple II era, Apple had a majority of desktop system market share. The Mac in the 1980s had about 15%, which gradually declined.
Genuinely hope that Apple or any other company does the same about VR tech.
Thanks for the detailed post, very informative reading.
The iPod was evolutionary. The iPhone - their #1 source of revenue - was a fairly big leap forward.
'Think Different' is a marketing slogan, not a modus operandi.
"It's a social project," said Torvalds. "It's about technology and the technology is what makes people able to agree on issues, because ... there's usually a fairly clear right and wrong."
EDIT: Just for context, HN thankfully edited the title. When I wrote this the post was using the article's title: "Talk of tech innovation is bullsh*t. Shut up and get the work done – says Linus Torvalds"
- Talk of AI is bullshit. Shut up and get the work done.
- Talk of Machine Learning is bullshit. Shut up and get the work done.
- Talk of VR is bullshit. Shut up and get the work done.
- Talk of Smart Contracts is bullshit. Shut up and get the work done.
- Talk of IoT is bullshit. Shut up and get the work done.
Not sure if I entirely agree with him but there's some truth.
I think this is what he means. Working on any of these ideas doesn't make you an innovator. You might do something in a slightly better or more novel way, but we aren't inventing crap.
And if you don't execute on your slightly better path, you'll still be beat by someone with a slightly worse idea who buckled down and delivered.
Silicon Valley is littered with companies with better ideas and better base products that lost.
In truth, there are no giants (or maybe very few). Even the giants of the adage are actually made up of innumerable little people supporting each other.
But the bullshit starts when people use the technology and claim they are making tech innovations.
But what truth though, can you be specific?
Talking about technology is necessary to attract people to our own work and find new ideas ourselves. Open source, vendor, Saas. All get value out of conferences, meetups, articles, etc.
Sure, not every word adds value, but the same could be said of code.
The same thing has happened with "thought leadership". There was, and still is really, a group of people who do the work and have useful insight. They became high profile, and presumably made money on it, and now the filler has appeared. Self-professed thought leaders who are endless sources of bombastic buzz words and constant self-marketing.
It's a natural pattern, I guess. Something becomes profitable and profit-seekers without much actual value show up. I think the thing for those who can do is to just ignore it and continue creating things of value. Maybe also guide junior engineers on the path of being actually effective and ignoring the crud.
Rage on it if you like, but you're yelling against the wind. My recommendation would be to save your blood pressure.
This is a general human rule that most people spend WAY too much time criticizing the effort of those of us who are working.
When it goes wrong is when it becomes sales led.
> I wish I could help others see over the shouting and point to what does work instead of seeing them fall for the hype every time.
That's the point of thought leadership vs sales. What you complain of is people using the term under the guise of sales.
I don't want to be a dick but you've summed up what's wrong with 'thought leadership'. If thought leadership consists of a bunch of buzzwords that everyone else is using, you aren't a leader, you are a follower.
It's a business in and of itself; there's no correlation between 'thought leadership' and success. Of course the traveling preacher would want you to think there is.
Edit: I'm not saying that he WANTS to be a multi-billionaire, but the fact is that he has captured disproportionately less value than he's created. By rights he should be one of the wealthiest people in tech. He might have $150m, but that's peanuts given what he's done. The wealth of the guy who made Instagram dwarfs it. The guy who made WhatsApp has a net worth of $8b.
Creation might be 90% perspiration as he says, but perspiration doesn't equal success, and success doesn't equal a career. Obviously everything isn't about money, and Torvalds's legacy will be timeless. But if you want to ensure earnings, at some point it's a good idea to sell.
Are you saying that because he didn't try to monetize his code as much as he could have, it somehow makes his opinion less valuable?
The man created Linux AND git, it doesn't matter whether or not he's got billions. He's got something more important than that, a legacy.
Why is that important?
It's important because he's actually delivered on the famous SV con-artist promise of "making the world a better place".
Not directly, as in curing diseases or revolutionizing energy production or consumption, but in ways that help people in developing countries access information due to falling costs of computers (Linux) and phones (Linux through Android) and people in business can thrive because of the diversity it brings to the table (versus the Microsoft quasi-monopoly we had before).
That's what I put under the umbrella term "legacy", something that has, in a way and ever so slightly, changed the world for the better.
Then again, we might say "it's just software", but in a software-centric world, I reckon it does matter.
That's beside the point. My point is that working without selling is not a viable strategy for 99% of people. Hell, it barely worked for him in terms of earnings, and he's one of the most impactful people in tech history.
Suppose that what matters is to monetize your work. Then by that metric, Linus is immensely successful (personal worth of over 100 million dollars, which likely puts him in the high brackets).
Now suppose (like I do) that the metric that actually matters is sharing the result of your work so that others will build upon that and end up creating even greater things. Well, by that other metric, Linus is still a HUGE winner.
So we're basically both right (unless of course we're ready to discuss obvious falsehoods such as "Having 100 million dollars means it barely worked for you").
But many of us will regret not spending more time with our loved ones, or not leaving something of value behind, something we created.
The truth is many of us aren't doing this for the money, because let's be honest: most of us have an above-average IQ, and we could work in the finance industry, which is far more lucrative for making actual money. Or we could build a local business selling products or services for local needs, not "disrupting" anything in the process. Or we could end up in the upper management of big corporations, in safe, high-paying positions, instead of doing the actual coding. Etc., etc.
We, the software developers, are creating, we've got the creator's virus. It's both a blessing and a curse.
Oh, and if any recruiters are reading this: those of us with passion, experience and the capacity to solve hard problems might be motivated by technologies or projects, but we aren't cheap or exploitable. So for as long as we are on the right side of the demand/supply curve, if you're looking for cheap, then GTFO!
You're saying 95% of people in the Western world can't afford food, shelter, clothing, education, transportation, healthcare and entertainment? That's awfully grim. It also doesn't really jibe with my experience of the Western world; you can get by fine on $50k/yr (median US household income) outside of expensive areas like SF Bay Area or NYC. If what you're saying is true then the rest of the world must be truly unlivable.
EDIT: I think something like 20% is closer to the mark.
To illustrate, the median house price in the US is around $250k. That's pretty much in line with the rule of thumb that your house should cost at most 5x your gross annual income. So it would seem (at a first approximation) that most houses in the US are affordable, in the strictest, 30-year-mortgage sense of the word, to most households (considering $50k to be the median household income). And a $250k house in most parts of the country is by no means "modest"; we're talking 1500-1800 sq. ft., 2-3 bedrooms, a yard etc. (again outside expensive areas). So if a median-earning household were to spring for a truly modest home (1-2 bedrooms, 1000-1300 sq ft, $100-150k range) it would actually be cheap relative to their income and they could pay it off in < 15 years.
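To make the arithmetic concrete, here is a rough sketch in Python. The 4% mortgage rate and the $125k "truly modest" price point are my illustrative assumptions; the income and house-price figures are the ones from the comment.

```python
# Rough affordability check for the numbers above. The 4% mortgage
# rate and the $125k "modest home" price are illustrative assumptions.
def monthly_payment(principal, annual_rate, years):
    """Standard fixed-rate amortized mortgage payment."""
    r = annual_rate / 12            # monthly interest rate
    n = years * 12                  # total number of payments
    return principal * r / (1 - (1 + r) ** -n)

income = 50_000         # median US household income (from the comment)
median_house = 250_000  # median US house price (from the comment)
modest_house = 125_000  # middle of the $100-150k "truly modest" range

# The 5x-gross-income rule of thumb: the median house sits right at it.
print(median_house / income)        # 5.0

# Paying off a modest home in 15 years at the assumed 4% rate:
pay = monthly_payment(modest_house, 0.04, 15)
print(round(pay))                   # ~925/month, about 22% of gross income
```

So under these assumptions, a median-earning household does pay off the modest home in 15 years while spending well under the usual 30%-of-gross housing guideline.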
Maybe I'm simplifying too much, or perhaps your experience is different. In which case, of course, we would have differing opinions on this matter.
There's a whole world out there where you can live quite a comfortable lifestyle for a half or even a third of your average SV monthly income.
Seriously, money is a hygiene factor, and if you're feeling the need for more, it's time to make a serious change, like changing city or profession, because startups are a lottery.
I'm from Romania, having worked remotely for EU and US companies and I never left because here I can have a much better lifestyle, I have freedom of movement when needed and recruiters coming with proposals which include relocation are simply not competitive.
Seriously, Silicon Valley is extremely overpriced and IMO quite toxic as an environment in which to live and raise children. That's because it is a bubble of really smart, well-paid engineers, scientists and business people who have created a highly competitive environment.
And if you suffer because of the bubble, the answer is not to fight your way to the top, because that doesn't solve the problem. No, if you don't like the bubble, the answer is to get out of it.
This kind of reductio ad absurdum is how we get people who think unprecedented mass poverty in the US (outside the Great Depression) is perfectly fine because people may have running water and TVs when most of the world doesn't. While true, it misses the spirit of the discussion and is a non-sequitur line of reasoning as a result.
I'm sure Linus is living a very comfy, happy life - even though his net worth isn't greater than that of the guy who made Instagram.
And I think a lot of that has to do with your culture. People should take this kind of thing more into consideration, try to understand what the "secret ingredient" is, and try to replicate the things that work somewhere else.
I mean if a world-renowned workaholic genius can't automatically generate a fair amount of wealth from decades of creating some of the most used and influential software in the world, what does that imply for the rest of us?
I also don't see him as a world-renowned genius (as you put it), because he isn't one, and from what I've understood from watching his talks and reading the mailing list, he does not want to be referred to as one either.
I think people should keep that in mind when they read articles about Linus, because most articles try to portray him as some kind of higher form of being, among the likes of Steve Jobs and Gabe Newell, which I believe is not the way he'd like to be perceived when you read something he said.
Tech-world renowned I should've said. And he may not be a self-proclaimed genius but he has a level of talent which is surely rare, at least.
This goes for an extremely large portion of humanity, probably > 99.99%.
Very few people manage to extract all of the value they created (and sometimes more), and most of those people are not the nicest ones.
$100M+ or so? Barely? Where do I sign?
You could not meaningfully spend more than $10M on yourself and your direct dependents in a lifetime.
The truth about most SV billionaires is that they captured far more latent value than they created. This ethos of building a fortune of absurd proportions cannot work for most, either.
That's got to be the most ignorant comment on HN in a very long time. Really? Think for just two seconds: Do you think a person would be employed at all if they did not make more value than they took home (including taxes and all that)?
The whole reason our economy works at all is BECAUSE people make more value than they take home. If not for that there would be zero employment.
Entrepreneurs build the machines. Everyone else is just cogs or highly specialised components.
Try looking outside of your little bubble once in a while. Just for one day pay attention to what the vast majority of working people are actually doing.
You're talking there first of all about valuations (aka pie-in-the-sky), not cash-in-the-bank-or-mattress money, and about other encumbered "assets". These usually only get "converted" in a mad rush to beat the avalanche of other "assets" seeking rapid conversion on a massive scale due to black swans, and then good luck converting those "billions" into (purchasing-power-equivalent, in-the-bank) billions.
> Obviously everything isn't about money, and Torvalds's legacy will be timeless.
So why the heck bring it even up? =)
Because he's saying shut up and work? But working doesn't just...work...
I say he marketed himself well.
That's a strange formula. Shouldn't there be some kind of saturation point for freedom?
It also closes with a great quote: code is easy, it's either right or it's wrong; people are the sticky wicket.
> "It's almost boring how well our process works," Torvalds said. "All the really stressful times for me have been about process. They haven't been about code. When code doesn't work, that can actually be exciting ... Process problems are a pain in the ass. You never, ever want to have process problems ... That's when people start getting really angry at each other."
When I read that I got the feeling that people were thinking to themselves "it either executes or has an error," but that is certainly not the case when Linus deems code right or wrong. Obviously not executing is an automatic disqualification.
Two patches can correctly execute and achieve the same goal, and yet one will be deemed "brain dead" and "moronic" and the other be deemed "right," solely on the subjective whims of Linus. Totally his prerogative, and I have no issue with it.
But don't think for a minute that code is "black and white."
Certainly true. IMHO, innovation is about orientation, while perspiration is about walking. They live in different timescales: GTD takes time while innovation is a spark. However, both are equally important: it would be useless to go forward in a wrong direction, it would be useless to identify a meaningful direction without going forward, and it would be of course useless to walk backward.
An acceptable - and subjective! - balance is hard to find these days.
p.s. As a side-snark: enough already about all these various dev technologies. So they enable still-shitty user experiences? So what. No one says, "Oh, I love that they use _____."
Users. Don't. Care.
So please, for the love of God and country, stop stroking yourself with your shiny new (dev technology) object. No one cares. The technology is a means; the experience is the end. Stop focusing on the wrong problem. Please?
In the US there is intense pressure right from school to colleges to work to be 'exceptional', and to be recognized and celebrated for it.
There is nothing necessarily wrong with that; excellence is worth pursuing, and it's good for individuals to believe they can achieve it. But there is a huge difference between motivation by passion and interest and motivation by social recognition and celebration.
There are pitfalls and side effects for a society in a toxic focus on 'winners' and 'losers': constant judgement, politics and one-upmanship, and the erosion of people's ability to work together without the need for self-congratulation, diminishing the collective. It takes a village and all.
Excellence always comes through; you don't need to do anything special. Individuals who are brilliant will always shine in a self-evident way, without labels or self-congratulation, through their work - throughout history, now, and in the future.
But you can't progress alone. Progress comes from a generational, interlinked collective, and there is a huge risk of diminishing the collective and brushing every other factor under the carpet with an extreme focus on individuals.
What works for me is a series of plan -> do -> review sequences with about 10 to 15% planning, 80 to 85% doing and 5 to 10% reviewing.
That's where the real work is, in the details. I respect those who walk the talk and he's one among them.
It's not so original; it sounds like it was copied/pasted from a quote from 100 years ago:
"Opportunity is missed by most people because it is dressed in overalls and looks like work."
+1 for that take on innovation.
We need innovation simply because it's fun.
Now ask someone in the VR/AR department. Every day they have to think up new 'innovative' ideas, because they are on the bleeding edge. We know innovation is needed because so far not everything is working.
What about neural nets, where there have been a lot of innovations to get from a single 'neuron' to what we now call deep learning? And the list goes on.