The fundamental defense made by many is that without patent protection for software, people would not put much effort into innovating.
1) If you look at science, open source, fashion, food, and other areas where humans continually build on culture, you see plenty of continued innovation without insane legal protection.
2) The amount of effort pales in comparison to the monopoly granted. You could make the argument for, say, pharmaceuticals, that if it takes 10 years from lab through human trials and hundreds of millions of dollars, a two-decade protection period might be needed. But there is no FDA for software, and Apple actually spends far less on R&D than other companies; with $100 billion in the bank, you can't claim they haven't gotten an incredibly good return on their investment.
Therefore, it is insane to grant 20-year protection to Apple for stuff like pinch gestures. 2 years, maybe. But 20? It's absurd.
You can read that patent here: http://www.google.com/patents/US7812826
And, even though that patent was _not_ referenced in the case, you will see it's a particular multi-touch behavior _associated with pinch-to-zoom_ that Apple is claiming a process invention against, not pinch-to-zoom itself. It's actually quite clever.
See: http://www.theverge.com/2012/8/30/3279628/apple-pinch-to-zoo... for good details on this myth.
Note - others patented pinch-to-zoom long before Apple came on the scene.
That said, Google was lucky that there was a loophole in the patent definition. Imagine a world where Apple had actually obtained an airtight patent on a system that listens for single- or multi-touch gestures and switches between scrolling and pinch-to-zoom appropriately (which is essentially what the patent in question does). So, you'd be free to implement pinch-to-zoom as long as it doesn't infringe on that. Good luck.
(the point here being that, de facto, Apple would essentially have a patent on pinch-to-zoom. In real life [not in the proposed alternate universe] this is almost true -- there are loopholes, but you have to hire a patent lawyer to find and/or confirm them).
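For the curious, the heuristic being described -- listen for touches, then route one-finger input to scrolling and two-finger input to zooming -- fits in a few lines. This is my own toy illustration, not the patent's claim language; real handlers add thresholds, hysteresis, and velocity tracking.

```python
# Toy sketch of the scroll-vs-zoom dispatch heuristic described above.
# Names and structure are illustrative, not taken from any real API.

def dispatch_gesture(touch_points):
    """Pick an operation from the touch points currently on screen.

    touch_points: list of (x, y) tuples, one per finger down.
    """
    if len(touch_points) == 1:
        return "scroll"   # one finger: translate the view
    if len(touch_points) >= 2:
        return "zoom"     # two or more fingers: scale the view
    return "idle"         # no fingers down

print(dispatch_gesture([(10, 20)]))          # scroll
print(dispatch_gesture([(0, 0), (5, 5)]))    # zoom
```

The point of the alternate-universe worry above is that even this trivial dispatch step could have been locked up by an airtight claim.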
I'd rather not; let's just stick to the facts and not build strawmen. It's bad enough to even discuss pinch-to-zoom or rounded rectangles when that's not what the trial was even about.
If we're discussing the Apple trial, let's figure out what really happened, not just repeat what we read in the press.
I don't care which phone I use, because they're (mostly) all pretty good now. But wow, I remember the Microsoft WM5 heap that I had just before my iPhone. It was a testament to terrible UI. It had more widgets than my old Palm5 but was somehow far less usable. Not some rubbish knock-off, the premier phone from HTC running the software from the world's software monopoly, and it was terrible. I made vows never to use another Windows phone again after that. Version 5 and it was still crap. With the iPhone, Apple's first version knocked it out of the park. First try. And changed the industry, yet everyone thinks it was all obvious or people should be able to use their ideas freely.
There is obviously a balance between Apple-owns-all and everything-goes. Apple should benefit from their patents like every other company does, but others should be able to license them under reasonable terms. Until we finally dump all software patents, this seems like a reasonable compromise.
PS. - pharmaceuticals usually get a full-on monopoly for the full period. Software makers usually cross-license, so there is a difference.
PPS. - Apple's $100bn in the bank just shows how far behind everyone else was at the time. It took ages to catch up.
BTW, I liked my WM5 phone.
Eep. Scrolling was through the tiny scrollbar on the right, and I'm left-handed to make it even more irritating to use. Closing a window with your finger required a fingernail in the corner (or stylus, add 2 seconds). It insisted on stuffing a Start Menu in there, with careful clickery required to not make mistakes. And a Windows Explorer, with tiny expand-contract [+] thingies. I forget most of the horror.
For others who forget, these were examples of the best apps at the time. Behold the menu bar for some apps, the scroll bar, tree menus (and tree folders, requiring styli or fingernails), [x]-to-close, the button on the bottom that says "Up". You can see the iPhone was out, and some apps were already influenced by it.
My hardware had a slide-out keyboard. Sometimes the flip didn't flip, though, so you'd open and close it multiple times until it woke up. The phone was capable of connecting to the web, but that just generally brought misery. It could do video calls, which was theoretically pretty cool. We tried that once.
Version 5, that was. I think I'm going to put it in a fire, right now.
I cannot find logic showing that $30 is reasonable, but I cannot find logic showing that $30 is unreasonable, either.
Here is some food for thought: iSuppli claims the iPhone 4S contains $188 of materials (http://www.isuppli.com/Teardowns/News/Pages/iPhone-4S-Carrie...). That "$188 BOM" 16GB phone sells for $649 on the Apple Store.
Part of the price difference will be the cost of shipping and handling, customer service, warranties, etc., but I think it is reasonable to claim that at least $200 will remain after factoring those in. People do buy these phones, so there must be some $200 of value in an iPhone that is not attributable to its hardware. People will have different views on where that value is, but IMO it is not unreasonable to assume half of it is in the iPhone UI.
Based on that logic, $30 seems not unreasonable.
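Making that arithmetic explicit (the retail and BOM figures are the ones cited above; the non-hardware cost figure is my assumption, chosen so the claimed ~$200 residual falls out):

```python
# Back-of-envelope arithmetic for the "$30 is not unreasonable" argument.
# `other_costs` is an assumed figure, not from iSuppli or Apple.

retail_price = 649  # 16GB iPhone 4S, Apple Store price in USD
bom_cost     = 188  # iSuppli materials estimate in USD
other_costs  = 261  # assumed: shipping, support, warranties, etc.

residual = retail_price - bom_cost - other_costs  # non-hardware value
ui_value = residual / 2                           # "half of it is in the UI"

print(residual)        # 200
print(ui_value)        # 100.0
print(ui_value >= 30)  # True: $30 fits within that estimate
```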
Whether $30 is a reasonable price for use of the UI elements that Apple claimed ownership of is a completely different question.
There need be no software patents: software takes time to reproduce, and that is your guarantee. The implementation is where the cost is. If the idea is obvious and trivial (like pinch gestures), then you probably don't deserve a head start.
He didn't invent, he discovered. I'm not just being pedantic; I think the approaches signified by these two words go to the root of the differences you're describing.
That reforming copyright and patent laws will reduce innovation is unproven scaremongering. There are other industries that don't litigate like this but nonetheless experience a wide diversity of innovation. With the exception of Nathan Myhrvold, most innovations in the field of cooking have not been patented. You don't go to a restaurant and get a patented entree; the chef cooks up new and original recipes by copying and tweaking the recipes of those who came before.
In a classical sense, science is an act of discovery, of learning and understanding that which already exists, that which is the fundamental nature of the universe. These are things done out of a quest for knowledge and understanding. The entire concept of a patent is foreign to such an approach.
Invention on the other hand is a concept rooted in taking a knowledge base and using it to create something new. Something that isn't just the fundamental nature of things but rather using such fundamental rules to create a novelty. The concept of a patent is founded on this ideal.
Ultimately what I'm saying is that "science marching along just fine without patents" is a poor comparison to use; it has a completely different basis than what patents were created to protect (theoretically, at least; I think they are just a mess at this point).
>Invention on the other hand is a concept rooted in taking a knowledge base and using it to create something new.
Replace "invention" with "science" and this sentence remains true.
You're drawing a distinction that doesn't exist.
You believe incorrectly.
If pinch and zoom is such a trivial thing then invent some other thing to use instead.
The patent system by-and-large doesn't pick and choose patent durations based on the kind of patent.
Theoretical physics and patentable technology are quite different things and the comparison is not helpful. Before patents we had a guild system where people who made almost anything -- pottery, telescopes, pots and pans -- would obfuscate their techniques and if possible their products. Knowledge of key manufacturing processes was a trade secret. This is the world that we'd be in without patent protections.
Pretty soon, consumers are confused, because every website they go to has a radically different user interface brought on by patents on ideas that should be basic commodities.
When we had the guild system, we also didn't have universal public education, and instantaneous near zero cost publication of ideas.
Plus, your analogy makes pinch-zoom patents look even worse. You can keep how it is implemented a complete trade secret, and from looking at it for a few seconds, I can produce an equivalent implementation.
Apple has already obfuscated their techniques anyway. Why not release iOS as open source then, if it's protected by patents? The reality is, these patents are read by no one except lawyers, and in many cases they underspecify the implementation by being very abstract.
I really don't see how anyone who writes software for a living can defend these things and defend the status quo of not even supporting reform. You are asserting that something like an XOR-cursor (another dumb patent) is equivalent to a manufacturing process deserving 20 year protection?
I agree that it seems like things are a bit screwy right now, but I don't claim to know what the solution is, and I'm not sure that things haven't always been a bit screwy (Alexander Graham Bell got the telephone patent because he was ahead of some other guy in line, Farnsworth died poor having invented TV).
Abolishing software patents is -- in my opinion -- not the solution since -- given the direction technology is headed -- this is going to be disturbingly similar to abolishing patents altogether. Most suggested "reforms" of the patent system seem to satisfy Mencken's criteria ("neat, simple, and wrong")
Note that genetic and chemical-engineering patents are pretty close to software too.
1-click is one of my least favorite patents, but it really comes down to an argument about obviousness -- an argument it appears to have lost in Europe (where the patent was never granted). The patent has been challenged, most of its claims thrown out, and its remaining claims narrowed.
I'd suggest that in the end it's a question of application, not theory. In this case the Europeans have arguably done a better job of applying patent law than the US.
There's absolutely nothing about the concept of "incentivizing innovation" that says you must punish other innovators by granting monopolies to individuals. Of all the ideas for promoting innovation you can think of, granting monopolies is among the worst. If, before the invention of intellectual property, you had asked people to come up with new ways to incentivize innovation, no one would have proposed "let's promote innovation by forcing innovators to pay fees to a select few". And in fact, no one did; that's not how IP was invented. It was the other way around: it started with UK monarchy monopolies with the explicit goal of making money for the monarchy. The excuse that IP protects innovation was made up later by those who were profiting from it after the monarchy fell.
Solutions to replace patents have always existed, still exist, and are working great. YCombinator is a great example of that. If you think it's the government that should grant some kind of incentives: I don't know about the US, but in my country we have many government programs for innovative startups. Many high-tech and biotech startups only exist because of government-granted funding, incubation, and mentorship.
Humans will always innovate. Solutions already exist and are working. Patents just need to stay out of our way and the rest will keep working fine.
So, without patent protection, pinch-to-zoom would have been a secret and the rest of us would never have been able to figure it out? Or, without patent protection, no one would have put in the R&D effort to come up with pinch-to-zoom?
The benefits that patents are supposed to provide to society just aren't benefits at all in this case. So, as a society, we ought to figure out a way to change the law so that it provides benefits where needed without doing more harm than good. It's clearly doing more harm than good in software right now.
Do you think Samsung needed to refer to Apple's patent to figure out those tricks? Or that those tricks needed significant R&D to come up with, of the kind that requires patent protection? I do not.
How you would implement pinch to zoom was immediately obvious to me after seeing it.
I've implemented pinch-to-zoom in a bunch of different contexts, and I agree that it's easy to do a half-assed version. AFAIK that's not what Apple patented, and you're free to implement half-assed pinch-to-zoom to your heart's content (although you may be infringing an earlier patent).
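For context on what "half-assed" means here: the naive core of pinch-to-zoom is just the ratio of finger distances, a few lines of code. This is my own minimal sketch; shipping implementations also anchor the zoom around the gesture midpoint, clamp the scale, smooth the input, and coordinate with scrolling.

```python
import math

def distance(p, q):
    """Euclidean distance between two (x, y) touch points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def pinch_scale(start_pair, current_pair):
    """Naive pinch-to-zoom: scale = current finger spread / starting spread."""
    d0 = distance(*start_pair)
    d1 = distance(*current_pair)
    return d1 / d0 if d0 else 1.0  # guard against coincident start touches

# Fingers move from 100px apart to 200px apart -> 2x zoom.
print(pinch_scale(((0, 0), (100, 0)), ((0, 0), (200, 0))))  # 2.0
```

Trivial as this is, it is exactly the part everyone could reinvent in an afternoon, which is the commenter's point about what was and wasn't patented.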
But isn't that the exact problem?
Pinch-to-zoom is so natural that any alternative would be less intuitive, which defeats the point of human-computer interaction.
Its invention cost no money, and society as a whole is not served by granting an exclusive monopoly over it to any entity.
The entire discussion is turning absurd. Is pinch-and-zoom (and friends) all that separates Apple from the competition?
In other industries manufacturers learn from each other (seatbelts, suspension systems, transmissions, etc.), but somehow when it comes to software and electronic devices we want to stop progress and grant monopolies over every oh-so-little idea.
(I have no opinion about how much Samsung actually copied from Apple or whether what they did was legal; I haven't looked carefully enough at the case. I'm not passing comment on that. Just saying that you can't reasonably say both (1) that what Apple have rights over is a very refined implementation of pinch-and-zoom that differs from anything you'll see on an Android device, and (2) that a maker of Android devices copied it. At least, not without a further explanation of how they screwed it up so badly.)
A published paper on the results of a trial may fall under copyright, but the factual observations and conclusions cannot.
The "clinical trials are expensive" argument is incredibly disingenuous.
Not that I consider software developers to be scientists but I wonder if there are any Apple engineers contributing to open source projects or even just publishing the occasional research paper.
Doesn't Apple have rules that keep all their employees from talking to anyone outside of Apple under threat of immediate termination?
No FDA for software. You can say that again. There's also no Medicaid, Medicare or health insurance that pays for it.
The government can sell the production method to other countries for a profit though.
The role of government is to protect life, freedom and property. As soon as the government gets involved in microeconomic transactions, efficiency declines. Comparing the US Postal Service and Amtrak to Southwest Airlines and FedEx illustrates this perfectly. The government should regulate markets only as much as necessary to ensure that the markets function. They shouldn't be running the markets themselves.
The lack of tort reform is one of the key drivers to the insane cost of drugs and medical care. When a drug company has multi-million dollar exposure for every drug they produce, it's going to raise the cost. OB-GYN docs pay several hundred thousand dollars per year in malpractice insurance because one mistake has them on the hook for multi-million dollar punitive judgements.
Government should be a referee, not a player. It should provide a safety net from complete destitution, but not serve as an insurance policy against failure. The GM bailout is a great example of what government should not be doing.
When the government starts meddling in private business (i.e., pouring money into it), it distorts the market and makes it much harder for innovative new players to enter. The Tucker car from the late 1940s is a perfect illustration. The big three in Detroit threw around political muscle to effectively shut down a car company that built safer cars. If government had stayed out, we would have had seat belts and pop-out windshields years earlier.
"I'm from the government and I'm here to help" are still the scariest words one can hear in America.
The market is a tool useful for some problems. It is not the alpha and the omega of society.
Government is not and has never been a referee in medical research. The drug companies would never allow it. Try removing $26.4 billion of free R&D and watch lobbyists do everything they can to stop you. Additionally, anyone who cares about sick people would join in stopping that bill. One does not simply remove $26.4 billion of medical research in the name of the free market.
Because my perception is that people accused Apple of that, but I don't remember them saying that...
Pharmaceutical companies do the same thing. It's not the pill that's expensive, it's designing it. Same with software.
But the individual features are the things that Apple didn't do. Yet that's what they sue over because that's how patent law is set up.
So now we see Samsung lose big in court and popular reaction is split, and here's why: People looking at the actual facts of the case are outraged that Apple could win that way because the actual grounds of the win had nothing to do with copying or Samsung's actions and everything to do with the fact that anyone with a million over-broad patents on obvious "inventions" and laws of nature and mathematics can win in court against anyone who produces a computing device, arguing that any actual copying on the part of Samsung is irrelevant. On the other hand, we have the people who look at the result and the fact that Samsung's devices do actually look entirely too much like Apple's and think Samsung got what was coming to them, ignoring that in order to do it Apple had to adopt a long list of bully tactics that they've now demonstrated that they or anyone else with a sufficient patent arsenal can successfully use against their competitors (including those whose devices aren't intentionally copied, because there are too many patents to possibly even attempt to avoid them all).
Nobody seems willing to say that Apple should potentially have some remedy against Samsung for actual copying but that what they got is the wrong remedy in the wrong way, not least which because the same tactics can be used against anyone whether they've done anything wrong or not.
That is, after all, what will make him want to share his methods instead of keeping them a secret. That's the fundamental principle of the patent system: you share, and we give you a temporary monopoly defended by law.
This patent system is obviously broken, but in the mayonnaise case it wouldn't be.
The law of protection of confidential information effectively allows a perpetual monopoly in secret information - it does not expire as would a patent. The lack of formal protection, however, means that a third party is not prevented from independently duplicating and using the secret information once it is discovered.
Like Coke. Ingredients and technique are, independently, common prior art.
Also, this is not to be snarky: the protections are different.
"Apple products are like a good classy restaurant or hotel chain. They take ingredients everyone has and put a lot of work into fit and finish, they make the customer feel special for a slightly higher price."
And I stand by it. I'll go even further actually, I think Apple is one of the least innovative big companies. Look at all the big research labs at Microsoft, Yahoo, IBM or Google. Anyone who seriously follows this stuff knows a) Apple doesn't have a profile in the academic world and b) knows enough computer history to know Apple is claiming things invented decades ago.
"Fit and finish" is as innovative as a new algorithm, it's shocking how much of the industry still treats it as a footnote and a detail, despite the entire history of the tech world since iPhone 1 would indicate.
No, the reason why Apple has eaten everyone's lunch up till now is because they're interested in being profitable, not innovative. The GP is correct in stating that Apple isn't innovative - they just take other people's innovations and monetize them by polishing things up and running effective ad campaigns to gain marketshare among the masses. And there's nothing at all wrong with that - if your primary interest is making money.
The devil is in the polishing up, evidently.
Your post is almost scarily indicative of the industry attitude that has allowed Apple to take over to the degree they have. Only hard, technical inventions are given any respect, and when we talk about UX we call it "polishing up", almost spitting those words out of our mouths in condescension.
Are you seriously going to hold up a clickwheel and say it wasn't innovative? Or the iPhone? Or the iPad? The fact that these products look and behave almost nothing like their progenitor technologies doesn't indicate innovation to you?
It really disturbs me how little respect we geeks have for the people who consume our products. When the general public votes with their wallet in a landslide victory for Apple, we blame them for being easily manipulable by slick ad campaigns and shiny baubles. The notion that Apple has actually satisfied a long-standing demand is somehow not allowed to enter this discourse.
The best place to be in business for profitability is to do that last 10-20% that produces a finished product, and Apple is great at that. The worst place is to do the first 50%, basic science which may enable great stuff in 20 or 40 years, but won't do much for your profits today. Hence why much of Silicon Valley is based around mining uncommercialized academic and research-lab work for raw material that can be turned, with additional work, into successful products. I don't think that means the raw material wasn't necessary or important (sometimes even key) to those products, though, so just looking at profits doesn't tell you the story.
--The 80/20 rule [Old saying]
Monetization does seem to be maximized at a bottleneck or stumbling block. Like bridge tolls. Or that final thing that makes the whole worth more than the sum of its parts. Apple does seem to do both.
So it's smarter (if you want to make a profit) to let someone else do that, e.g. someone who's paid as a researcher and isn't trying to turn a profit, and instead look for things that are 1-3 years out. You even see it within academia; applied math pays a lot more than pure math, for example, even though both are quite important to mathematical progress.
No one's saying that. All that is said is it's not entirely Apple's doing. In this case it was Synaptics.
For example, how different is the homescreen of the iPhone from this? http://images.yourdictionary.com/images/computer/_PROGMAN.GI...
Compared to that, Windows Phone with Metro is much more innovative.
Another problem is that many people attribute things to being invented by Apple because they first hear of them from Apple (and because they don't use non-Apple products). For example, I remember when Apple introduced hybrid graphics with a way to switch between the integrated Intel GPU and a discrete Nvidia/ATI GPU. Sony had that working before Apple, but a LOT of folks thought it was Apple that innovated it. Perhaps Apple added more polish, but those folks were certainly wrong.
Polishing and going the last mile is very tough (see OEMs with half-baked software and hardware), but does it deserve patent protection? Apple innovated and was rewarded by becoming the most valuable company in the world, with more than 100 billion dollars in the bank with which they can invest further in innovation instead of indulging in petty patent extortion over things like the bounceback effect or linking phone numbers in emails to the dialer.
This is decidedly backwards. From the start, Apple's focus was creating great products (by their definition of great product), and only later did they learn how to maximize their profit from them. There's a pretty good and influential taxonomy of "value disciplines": product leadership, operational excellence, and customer intimacy. Historically, Apple was good at the first and not so great at the other two. Since Jobs' return, they've really ramped up on all three.
A classic way in which companies undermine themselves is by pursuing profit through cost-cutting rather than by improving product/service quality. I don't have any hard data on this, but I suspect that's what happens to initially innovative tech companies where non-engineer/designer business types take over.
I'm confused. Any rational business is concerned with making money. Profit growth is a symptom of innovation.
> No, the reason why Apple has eaten's everyone's lunch up till now is because they're interested in being profitable
And Microsoft and IBM were not interested in being profitable?!?!
Certainly there is UI innovation, and certainly it can move mountains-- I think arguing that fact is a straw man-- no one is saying that you can't be innovative with UI. We're just saying that most/all of it isn't worthy of patent protection.
In any case, the notion that UX is important and innovative in no way justifies the gross abuse of IP law we've seen up till now from everyone involved.
Which leads to...
> "We're just saying that most/all of it isn't worthy of patent protection."
There's a false separation here. IMO almost nothing we do in the software industry is worthy of patent protection, UI-related or otherwise. To point at UI patents and scream foul, while giving a free pass to "real" patents (ones with academic papers behind them) seems misguided.
As a UX designer, the idea that UX can be patented terrifies me.
It's like patenting a specific color of paint. shudders
This definition does not tie innovation to invention. I think it would be hard to argue that Apple's work is not original relative to their competitors. Apple out-innovates their competitors in marketing, in supply chain management, in product lifecycle management and in design. It's impossible to invent every (or even most) components of a general purpose computer. But to select the right pieces, assemble them in a way that maximizes user experience and market them in a way that makes them stand apart from competitors' products made with almost the same components - that's innovation, just the same way that Netflix' model for mailing DVDs (they didn't invent the mailbox, the postal service or the DVD) was extremely innovative.
iOS is a great example of radical originality, even if not a technical breakthrough. The concept of the app - a lightweight, bandwidth-efficient, modular, reconfigurable element integrated into the OS - was certainly original. It was also thus highly innovative. It was reductive: smaller, lighter, less complex.
Note the features here: ARM CPU, Touch screen, Contextual media app, "Apps" button down the bottom, self contained apps which were modular, integrated into the OS. I know the OS well (RISC OS) and I've had my hands on an actual device.
It was smaller and lighter and less complex than anything else technologically possible at the time.
The basis was an EU funded project to build something like an iPad. Notes here: http://en.wikipedia.org/wiki/Acorn_computer#NewsPad
Arthur C Clarke innovated this particular nugget of technology.
Apple has invented or innovated precisely bugger all there.
Their only innovation is how to make it look pretty and extract money from people.
If you don't see any software innovations in iOS, you're either blinded by Apple hate or unable to see further than checkboxes on feature lists.
Sony have been doing this sort of shit for years. Someone invents something, Sony adds turd polish and a decent supply chain and manufacturing capacity, then takes the market share.
Apple just got better at it than Sony. There is no more story.
For ref, I neither hate nor like iOS devices - I've owned a couple and they've been pretty ok but nothing special. I can't see a single feature or innovation that didn't exist already somewhere else. The same applies for my current Windows Phone (the only innovation there is abysmal battery life - no wait my Treo 180G pioneered that in 2002).
Also, if a feature isn't a checkbox on a feature list, why do they market it like that? http://www.apple.com/iphone/ios/
But that's not a tethered iPhone, in terms of its overall ambition and functionality. Similarly, the Acorn with 8MB RAM, limited as it was, was not an integrated multimedia device (iPod, phone, etc). Let us not forget the power of the software (the YouTube app, for example).
Lastly, the innovation (in part on the business side) of the App Store and ecosystem should not be completely overlooked. There is seamless delivery/monetization etc. (not just collections, but outbound to developers).
In short, there is a lot of originality in how the puzzle is put together. Some of it is like the Swiss Army example. Some of it is in the conceptualization of the user experience. Some of it, quite frankly, is execution of the physical product (manufacturing details, etc.), as I have argued before. (e.g. http://news.ycombinator.com/item?id=4435490)
I look at an Acorn and a BlackBerry+phone on the desk. And I look at the iPhone. The latter looks like the Victorinox; the other items look like tools on the table.
The internet integration of the Acorn apps I'm not familiar with; e.g., it is not full-time connected online unless it had a built-in Ricochet or whatever. Clearly iOS is meant to fully integrate with live information without compromising its mobility.
The basic idea of the NewsPad was exactly the same though.
They decided to move into the cable/STB market after this as they could deliver the same experience with the connectivity that was already there. They did this successfully for a few years in the late 90's before they marketed themselves into a hole and gave up.
Apple's innovation wasn't in "the basic idea"; it was in the many implementation details that turn that basic idea into a usable product. If you don't see this, then there is no point in arguing further.
I could just as well say that there has happened exactly zero innovation in the cell phone industry in the last 30-40 years, since "the basic idea" was shown in a science fiction series in the 60s. By doing this, I would be ignoring all the inventions necessary to make that "basic idea" into a real product.
You are ignoring the inventions needed to make a finger-based multi-touch interface possible on a small device, and accurate enough to leave out a physical keyboard.
Without cellular data the iPhone would still be a killer device just as a cell phone with a music and video player, and the iPad with a complete office suite and other apps.
Cellular data was widely available in 1996, at least in Europe. I know because I had a PCMCIA card connected to my cell phone at that time.
But of course, you can keep twisting facts to fit your theory all day long if you want, I know nothing I can say will change your opinion.
If all Apple did was turd polishing, then they could have saved themselves a lot of effort by polishing a turd called Newton (which was also quite advanced for its time) instead of inventing an entirely new user interface.
What I was saying is that listing stuff like you did: "ARM CPU, Touch screen, Contextual media app, "Apps" button down the bottom, self contained apps" and using that as an argument to why iOS has been done before, is to be unable to see further than a check list of features.
Lots of tablets had the same checklist before the iPad, and most of them were completely unusable.
I suspect that you would be just as happy with a WM6 or Symbian device as with an iOS device, since there were no innovations in iOS?
I also don't see how this relates to Sony. They had a lot of innovations, including the Walkman, co-creating the CD, 3.5" diskettes, Video 8, DAT, MiniDisc and lots of other stuff.
I would be happy with anything, but not necessarily impressed with it. A paradigm shift would be innovation but there isn't one.
Ironically, I'm writing this on a device based on the Windows Mobile 6.5 kernel (a WP7.5 Lumia 710).
Sony's ability was to take poor-grade American products and package them up with Japanese reliability and quality. I'm considering their television range from roughly 1970 to 2000, primarily. The rest of their "innovations" were turd polish over existing products: Stereobelt, 5.25" floppy disks, Panasonic U-Matic, Mitsubishi ProDigi, Canon Ion Disks...
Or are you saying that Sony copied the basic idea of a television, so that makes it impossible for them to have contributed any innovation to the space of TV sets? If you read the history, the invention of something like Trinitron required a lot of work, a lot of trial and failure to make the basic idea of a single-gun color TV feasible.
That work isn't about "polishing the turd called color television", that's called innovation.
That's kind of the definition of innovation: take something that exists and improve upon it.
My wife had it and I went back to an S60 device (Nokia E51).
The iPhone was just prettier and substantially less functional.
Yet still, of course, there are lots of people who will claim that those restaurants can't possibly be worth the price, and that it _has_ to be mostly about making the customers feel special; after all, they use the same basic ingredients as the restaurant around the corner.
"those restaurants can't possibly be worth the price"
First off, there is more to a dining experience than the quality of the food, so even if the food is the same, a price increase can be justified. You could serve me a McDonald's hamburger for 20USD and leave me satisfied with the transaction, provided the burger was not all that I was getting... That burger would not be worth 20USD, but that would not necessarily say anything about the worth of the establishment itself.
Regardless, the question is not if the expensive food at high quality restaurants is particularly good, but rather "What is the relationship between quality and price?"
In the case of high class establishments, the food is certainly good and the price is certainly high. Is that a linear relationship though? The 50USD burger is undoubtedly better than many 5USD burgers, but is it 10x better?
Furthermore, does higher quality food always cause the same sort of price inflation? Or could similarly superb food, sold at a restaurant without the other things that high-class establishments offer, come at a noticeably lower price?
I would even dare state that, to some extent, high price can actually be one of the desirable services that a restaurant can offer. If you happen to be more concerned with appearances than (in the grand scheme of things) a small amount of money, then being expensive for expense's sake can be a feature.
But I think it's difficult to quantify quality differences like that. For instance, when watching the movie Jiro Dreams of Sushi, I had a hard time thinking that a sushi meal could be worth more than $400 (starting price). It would certainly be wasted money on me, since I'm not a big sushi fan. But to the reviewers, apparently it was worth a dedicated trip to just eat there, so they would probably say yes if asked if it was 10x better than a $40 sushi meal.
There are a lot of inventions, especially in those R&D labs you mention, but they rarely come to market, or are good enough for the regular consumer to use.
Perfecting fit and finish is quite obviously innovation.
If I say that the iPad is not innovative, I mean precisely that no element of it was an implementation of a completely new idea, but I don't mean to suggest that what Apple did with it is not amazing and transformative.
So yes, for me, innovative is usually a very high bar.
Jobs saw the mouse at Xerox and knew it was an idea whose time had come. The first mouse was innovative. The first optical mouse was innovative. The first mouse that reduced the number of buttons to one or reduced the cost to $30 was pragmatic and clever but not innovative.
But is it patentable?
A Mont Blanc pen is prettier than a Bic biro, but it doesn't really write any better. A fabulous meal is probably not as good for you as a plate of boiled vegetables.
The problem with the research labs at those example companies you cited is that those businesses have little incentive to introduce innovation that would compete with "yesterday's ideas" that are driving their profits.
This is why Bell Labs was so unique. They could basically do whatever they wanted (you might try to make the same claim with your example companies, perhaps) _but_ ... they also managed to release these ideas into the market. And not always to the satisfaction of AT&T. People once had to pay for UNIX. Not anymore.
Xerox PARC is another well-known case where people were "set free" to work on whatever they wanted. But their ideas did not manage to trickle out to the market very well. Instead, Microsoft got one of their key people, Excel was born and the rest is history.
Apple is _not_ an idea factory. If someone called them two-timing thieves and told us to watch our backs, I would be inclined to take it seriously. (The fact that Apple is not the idea factory is why the lawsuits are so offensive to anyone who knows anything about the history of computers. If these sort of broad patents should go to anyone, it should be people like the ones who worked at Bell Labs and Xerox PARC. But maybe patents were not their priority. Maybe they were more interested in research, or playing computer games, than money. [How many UNIX patents? 1?] Go figure.)
But Apple is a design house. And within IT, they do not have lots of competition in that area: e.g. the design of hardware casings. In addition, they go to great lengths to make the great ideas (namely the flexibility and stability of UNIX-like systems) easy to use. Another area that is lacking in IT: making the good stuff (like UNIX) easy to use.
Unfortunately, Apple feels the need to abuse the patent system to stay on top. It makes me think that if they didn't, they might be in for a big fall. Maybe they are surprised at their own success? And nervous about losing the top spot?
Incidentally you could argue IBM started all this software patent nonsense. Not sure many programmers would agree with you, but the number of filings and issued patents by IBM, most of them before Microsoft even had a patent department, tells the story quite clearly.
You are not going to see much innovation released from "research labs" at the likes of Microsoft or those other companies. They will not keep their patent department in the dark. Those guys want to keep their jobs, not take risks. "Microsoft Research" or "Google Labs" are not Bell Labs or Xerox PARC. It's a wonder that something like Kinect was even made into a product. And you could see how nervous they were about it.
Today, the "labs" and the idea factory is the world wide web.
That's where the risks are taken.
Apple wants to control your devices. How you use them after your purchase. The network you use to obtain content. And even the content you download: you don't own it, they license it to you. There have never been any legitimate reasons for all this and there never will be.
edit: Well, a feature checklist and $50,000 worth of sales dinners, games of golf and "gifts that do not violate the professional ethics rules of the company buying said tool."
Apple's mantra is selling a $1 piece of software to 10,000,000 people who require a simple piece of software.
The two are perfectly valid. In fact the latter would not exist if it wasn't for enterprise software (such as CAD systems, inventory, supply chain management etc).
This happens all the time in the "enterprise" world. Whether it's sold by IBM, BMC, etc etc.
Don't conflate "enterprise" software with anything complex or intelligent. It's typically some of the worst software you can find.
It's not all like that. You've been stung by a shitty purchasing and architecture team, probably put together from people who've ascended the ranks to the point they no longer understand the technology and want to suck up to management and wheel around the country in a company rollerskate with a ThinkPad and a caffeine problem. These are called ivory tower architects, or as we call them: asshat-itects.
Enterprise should mean certain guarantees about scalability and reliability and ability to adapt to the organisation.
If it doesn't, you've bought a lemon, not a piece of enterprise software.
Note: there are more lemons than not so you have to be careful.
Are you saying that Apple is succeeding because their customers are their users?
Or are you claiming Apple software sucks, and every ipad sold was a gift?
Yes, and they have different priorities than the CTO or IT manager, who may select enterprise software by feature checklist (or other priorities) rather than end-user experience.
The silly part is that complexity goes in cycles. New products need to differentiate themselves from old ones, so they add new features. After a while, you end up with a car stereo with 20 buttons, and suddenly a "new" competitor comes out with a clean design with just 3 buttons, and the 20-button design looks ill-designed in comparison.
Stereos are one of the earlier examples, but you can see the same phenomenon in web design with today's white and clean designs vs. the old dark and complex ones.
The iPod and iPhone have added lots and lots of functionality over the years, yet the interface has stayed simple. It's possible to add lots of new features in ways that don't make the interface more complex; however, it's a big challenge.
One way is to sue all competitors and have the only available store present the new product exclusively, but that only goes so far.
As for the Johnsphones, it has buttons. As interfaces go, it still looks complex compared to a smartphone. The most simplistic phone design is one with only one or zero buttons, like the third-generation iPod Shuffle if it had been a phone.
I wish HP would revive the iPaq.
My only imagined use for an iPad is as a portable display. I want the Retina quality, but I have more powerful hardware to attach and I need a real keyboard. There's nothing the iPad can do that my open, unlocked hardware cannot do.
I believe it was even possible to attach a real keyboard to an iPaq. That's the kind of flexibility I want. I can get data into and out of the device in any number of ways, without hassle.
> There's nothing the iPad can do that my open, unlocked hardware cannot do.
Your open, unlocked hardware cannot survive two days with my mother.
I actually test some of my ideas with people like your mother, and surprisingly (why should I be surprised?) they have little trouble catching on.
What's really amusing is that these things that I have them doing are things that many nerds cannot themselves do. I've got them using systems and techniques that many nerds won't touch because they think it's too "hard core". It's hilarious.
There are lots and lots of unfounded assumptions about what users can and cannot do.
There are facts, supported by evidence. And then there are assumptions. One requires a bit of work. The other is effortless: you just hit "Submit".
Your post is vague enough that I have a hard time parsing it. No evidence for what? That there is such a thing as a "non techy" user? The evidence is overwhelming, including anecdotal evidence from practically every "techy" person here who has ever had to help their family/friends with a computer issue.
And it's not a question of ability, it's a question of ability + caring enough. The non-tech people might not care to boot an OS from a CF Card -- certainly not enough to seek out how to learn it.
Here's an iPad, you can use it to easily email your friends, check facebook, and check the web.
Here's an iPaq 2012, you can do all the above, but it's a little harder to use, HOWEVER you can boot any linux distribution you want, add a USB device, attach ANY keyboard -- like one with mechanical switches! I personally prefer the Cherry Blues, but you might want to try the Topre ones. People who use those never go back. They cost a bit more (~$250), and you have to get it imported from Asia or find a U.S. distributor... but it's worth it.
linux? are you kidding? this is exactly what i was referring to: assumptions. how did you conclude by booting an os i meant linux?
a true apple fanboy. thanks for sharing.
The parent has made a different formulation, specifically:
"Every person belonging to group Y has had experience XXX"
This difference is significant, because the argument is basically stating that it is not only the experience of the person making the argument, but that the person making the argument is expecting that the readers of the argument are going to be able to confirm the experience for themselves. This is a much stronger argument than mere anecdote.
And for those that are already starting to lean on their keyboards to type "the plural of anecdote is not data", that platitude is a recognition that data is supposed to repose on a generalisable sample of reality, and if you are just going on anecdote, even multiple anecdotes, you are leaving yourself wide open to claims of cherry-picking. But this claim does not cherry pick; it says that a vast majority of "techy" people should be able to confirm the claim from their own experience.
what claim? can you be specific?
the original comment referred to a market for "people like [me]" being "not large". my response was that i have not seen any evidence to support that sort of claim. but i'm not even sure i know what he meant by "people like [me]". i had to assume i knew. the problem with assumptions is they can be wrong.
and i'm not sure i understand the reference to "techy people". i never mentioned such a group. i mentioned "people like the [commenter's] mom". presumably (another assumption), she's not a "techy person", whatever that is. but maybe i'm not a "techy person" either. what is the definition of "techy person" anyway? would the definition differ based on the person defining it? maybe i see no distinction between "techy" and "not techy". maybe i only see differences in how much a given person understands about what computers can do, and how to make computers do those things.
Are you being purposefully vague? No empirical evidence of what? I'll counter that there is overwhelming evidence, but I'll share specifics once I know what your claim is. :)
for one, what does "people like me" mean? people who can make use of non-apple hardware? what sort of uses? i don't know what he meant. i could take a guess. but then i would be making an _assumption_. and i might be dead wrong.
and that's what you did in your comment. you made some assumptions. what were they?
i already told you one: you assumed the os a "mom-type" would run would be various linux distros. what if it's not linux?
what are we debating? i'm happy to debate.
here's my guess: we're debating whether mdonahue's statement "people like you are not a large market, unfortunately" is true.
however as i pointed out, we haven't agreed on what "people like me" means. we cannot debate this statement until we have agreed on a definition for that. then we have to consider what is meant by a "large market". what is a "large market"? then we have to decide whether what is asserted in his statement, if true, is "unfortunate" or not. or maybe we can skip that since it seems like just a mdonahue opinion.
is this explanation still too vague for you? i'm not sure how much more specific i can get.
I have taught my mom how to do certain things, but she has no intuition. As soon as something is slightly wrong or different, she gets stuck and can't move on. The solution is always something simple, like relaunching the app, installing an update, power cycling the computer, jiggling the usb cord, modifying the permissions on a file... but there are only so many contingency plans I can teach her.
Maybe I just suck at teaching, but I think that technical people have an incredible curiosity and comfort with troubleshooting that people like my mom don't. We are basically playing on our computers.
With the iPad, my mom is finally playing too. She is really adept at it. Sending photos, checking facebook, downloading new apps, she was never comfortable doing any of this on the computer. Too many choices and settings and things to potentially screw up that she would be paralyzed, unable to explore and try things.
Ultimately I think the home button is the most important thing in the iOS ecosystem. If all else fails, go home and everything will be fine*.
(Unless your battery dies, or there is lint in the charging port, or your screen shatters, or you muted it, or you turned on airplane mode... it's not perfect...)
I think reasoning, in addition to basic instructions, is important even though some people might not care about it.
By leaving it out you deny those who do care an opportunity to learn.
And to me it just seems more respectable when someone asks you to do something and tells you why you are doing it than if they just give you bare instructions. (That said, the bare instructions should be able to stand on their own. They had better work, every time.)
Moreover, providing reasoning forces you to demonstrate you know the subject matter well enough to be able to explain it.
Bluetooth always works flawlessly, unlike cables.
It's also proprietary and far more secure than cables.
It's a good thing that Apple iPad can't use cables.
In fact any device that is smaller than iPad and can accept both cables _and_ Bluetooth is less useful and just plain bad.
If Apple is a company that uniquely has the talent and taste of a good chef, the patent protection is unnecessary. They will be able to continually outdo other companies that don't have the same talent.
Arguing that a company has so much talent and is so successful that it needs legal protection seems absurd to me.
Tablet makers (iPaq, MS, etc) tried too hard. Apple showed them don't try too hard. And now they can make tablets too.
So yes, the excellent chef can be the first to make mayonnaise; however, without patents on mayonnaise, so can everyone else now. The idea of a patent is that the master chef spent years learning not to mix the ingredients too hard; now that he has shown the world the way to make mayonnaise, the only thing stopping the world from stealing the fruits of his work are patents.
The point being, if Apple had never created the iPhone and iPad, we would still have phones like the iPaq and the clunky MS tablets in the year 2012. I wholeheartedly agree with this.
If it wasn't for Apple, every other phone and tablet company today would not be making anything of the flavor of the iPhone or iPad. Android and Win Mobile Phone are of the flavor of Apple.
So why is it right for them to taste like "L'Atelier de Joël Robuchon" (Apple) in the year 2012 if they would still taste like "Taco Bell" (MS, iPaq, etc.) in the year 2012 if it wasn't for Jobs' iPhone and iPad? Obviously, without a parallel universe for us to visit together, I can't prove this to you. However, I do feel the overwhelming facts of 20 years of failure by iPaq, MS, etc. show the trajectory they were heading for, and we can guess where they would be in the year 2012:
- Stylus pen, or clunky touch screen that you have to press very hard.
- Lots of ram, lots of CPU power.
- Very heavy
- Very large and thick
- Poor quality materials
- Mediocre software that does not come anywhere close to the current Android software.
- Buggy software
- No app store, lots of viruses and other security issues.
- Expensive and running full Windows 8, no RT version.
So if the products Samsung and MS were selling today fit the above recipe, I agree Apple shouldn't be suing them. But this isn't what is happening. Apple is being robbed blind. Every technique and recipe Apple created is being meticulously stolen and engineered into Android and windows phones. These are recipes that Jobs, Ive and the whole Apple company put years of effort into, they poured their heart and soul into these recipes.
The reason so much of the tech media don't see it this way is they have no taste buds. Most people have terrible taste buds. Well, they have great unconscious taste buds, but their conscious taste buds are almost worthless. So they see things like the iPaq and those old MS tablets and think "Well gee golly, that food sure was tasty; sure, Android is tastier, but not much tastier; Apple doesn't really deserve much credit for Android's improved taste."
As someone with excellent taste buds (I am a UX/product designer; I have predicted the success and failure of almost all major tech products that have come out in the past 10 years; you can read the comments in my HN history as I defended the iPad when almost 99% of HN thought it was stupid when it came out; I predicted the failure of the Zune, I predicted the success of Apple as a whole back in 2004, and predicted MS's current decline), I can tell you with a fair amount of certainty how different iPaq and Android taste.
What Android and Windows Phone are doing is sneaky. It is so sneaky you don't realize how many subtle but important details they have stolen. They can do this because you don't have the conscious taste buds necessary to notice it.
These details may be subtle, but they are far from small. If you ever watch a grandmaster play chess, every time he makes a great move you think to yourself, "Gee golly, that was an obvious move." No. No, it was not. Once you see a chess move, you can no longer look at it objectively. The way to objectively judge a chess move is to try to figure it out on your own before someone shows it to you. After spending hours and hours and more hours looking for this move before finding it, you then fully appreciate the move.
The tech media is watching a chess game from the sidelines. Apple is the grandmaster; Android and Windows Phone are the people building a database of the grandmaster's moves to beat people at chess.
Most people won't be able to comprehend or believe the next sentence: a non-expert chef or food critic (expert product designer) will never be able to appreciate the gigantic chasm between the taste of Apple's products and any of their competitors'; however, on an unconscious level, everyone will be drawn to Apple's taste so strongly that if the competitors don't copy it, soon Apple will have no competitors still in business. This is the natural monopoly the iPod almost had for a few years. In 2010, "The latest research by NPD Group claims that iPod had a 76 percent share of the MP3 player market in US in May this year." This monopoly would have held if Samsung, Google and Microsoft had not spent the past 5 years perfecting their ability to hire expert product designers to steal Apple's recipes. From 2001, when the first iPod came out, until about 2010, MS, Google and Samsung went through phases of denial, trial (trying to steal ideas) and then finally some success (actually stealing ideas). It took them about a decade just to get good at stealing ideas from Apple. The Zune was an example of how hard it is to steal ideas from Apple. A non-expert product designer would think the Zune was good thievery; however, it was a sloppy job. It wasn't until Windows Phone and Android that these companies started to be good at stealing ideas.
You are going to hate me for saying this, and you won't agree with me, but the truth is that you don't, on a conscious level, know what is going on. You are Unconsciously Incompetent. And Android and Windows Phone are using this to their advantage. It's the same way Europeans stole land from Native Americans. The Native Americans were Unconsciously Incompetent when it came to the idea of "owning land". They saw the land as owning them. Thus they signed documents that seemed to have little importance. This is what Android and Windows Phone are doing. They are using you to steal from Apple.
That said, Jiro doesn't patent his sushi; he simply makes the best sushi. And he does quite well for himself. If I were Apple, I would spend a lot more of my time in the kitchen making the best sushi in the world and a lot less time in the courtroom.
Note: I am sorry if I come off as arrogant by calling myself an "Expert Product Designer" and claiming most people won't understand what I understand, but I don't say this out of arrogance, rather out of fact.
I have spent more than a decade becoming as competent as I am in product design. This is no small feat. In college I was a talented student when it came to physics, in particular my introductory class on quantum physics and special relativity. If I had chosen to pursue the path of quantum physics, I am quite confident I would be an expert quantum physicist today.
And if that were so, it would not be arrogant for me to say such things as: "light is both a particle and a wave; most people will never appreciate how amazing this is; only expert quantum physicists, such as myself, will come close to appreciating this statement's full glory." This would not be arrogance, rather fact.
The sad fact is that "Product/UX Design" doesn't get the same respect as the hard sciences. From my point of view, it should. I use just as much if not more of my brain power to wrap my head around design solutions when I am designing a product as I did when I studied the wave-particle duality of light and the intricate details of space and time learned through special relativity.
> That said Jiro doesn't patent his Sushi, he simpley makes the best sushi. And he does quite well for him. If I was Apple I would spend a lot more of my time in the kitchen making the best Sushi in the world and a lot less time in the courtroom.
What I don't agree with is people saying there was prior art. No. Sorry, there was no prior art for most of Apple's inventions, if you take the subtle details into account.
So how can I agree with you and the above statement? I think we need patent reform. I believe in capitalism. Our current patent laws are anti-capitalism; they are pro-corporatism.
Anyone can come out tomorrow and write their own search engine. Like Bing. Hell, Bing has even been caught red handed copying. Yet still Google makes money. Anyone remember Google suing Microsoft over that? Nope.
Maybe Apple needs to find a better business model whereby they can thrive and out-innovate their competitors? That's what Google has to do; is Apple the special kid needing special treatment? What if Google started getting mad and suing every search engine competitor for infringing on their instant-search patents and other search patents? And without Google, you'd be searching and hoping like you did in the '90s. Imagine that: a company that innovated, innovates, and doesn't try to sue their competition silly!!
If I hear one more primadonna talk about how Apple is getting ripped off, I'm going to explode. Please, get over yourself.
Now go off and explode in a sealed room. That much bile won't be a pleasant sight.
You can make cookies and cakes from the same raw ingredients, doesn't mean that the person who invented cookies also invented cake.
Also the 'Software is all zeroes and ones' argument irks me. It's the bridge between an expensive paperweight and a practical device.
> "By this myopic logic, Einstein didn’t invent the theory of relativity..."
i.e. he agrees with you.
By NAILING it, Apple managed to popularize existing tech like the modern touch-screen interface, the app developer economy, and handheld computing.
So enough with this "Apple hasn't done anything" BS. The reason they print money is that they keep making future tech accessible; I hope the iPhone 5 (6?) lives up to their record.
The examples given by the OP actually support this point. It's true that Einstein didn't discover relativity first: both Lorentz and Poincaré had worked out the mathematics, but Einstein articulated the concepts best. I often use this case study as an example of how innovations often arise independently from different inventors simultaneously because the conditions are right. The other famous and illustrative example is Newton and Leibniz inventing calculus independently.
As a VC, I see this all the time - the market conditions are right for a new idea, and suddenly 4 or 5 companies appear doing variations on the same thing, none aware of the others. Let good execution and the market decide which one is best, not the date on a patent filing for something each came up with on their own.
That's a pretty big feat in itself.
Every time you see someone wearing those white earbuds, that's marketing. When your mom is always on her iPad, despite 20 years of lessons from you on how to use a computer, that's marketing. When you are sitting in an airport and you notice the glow of the Apple logo on the back of someone's MacBook Air, that's marketing. When the same dude is still using his MacBook, 6 hours into a flight without a power cord, that's marketing. When you walk into an Apple store and you feel the aluminum unibody... when you unwrap the sturdy yet smooth packaging and the bottom slides out to reveal the device... when the computer boots directly to a gray Apple logo on a white screen, instead of a DOS prompt for a few seconds, that's marketing.
It takes a lot of work to do Apple style marketing.
No other company can just show their product without having to say anything about it. That makes Apple unique in such a fundamental way, which is exactly why they are the most valuable (or one of the most valuable) companies in the world.
See their advertising for the iPod with the white headphones and the silhouettes.
Bose products sound good, yes, but not as good as their price tag. Bose also severely restricts the way you can use the product: no bass/treble knobs, for example. Very similar to Apple.
"No highs, no lows, must be Bose."
We know Apple is already using all the above techniques to protect their business. They have an excellent supply chain, they obviously have preferential trade partners which enable them to access new technologies before anyone else, and they have an iOS system which is well rounded for its purpose.
Net, they really do not need to leverage their patent protection. That could be an indication that they do not think they are going to be very innovative down the road, and therefore they just want to keep competitors as far away as possible until they pull their act together again... or that they have too many lawyers with too much time on their hands to investigate how much harm they could do with claims based on smoke and mirrors.
When a company resorts to such practices, it is usually not a good sign.
These morons have taken the article title as a literal synopsis; it is being ironic (and yes, we know that Americans famously don't get irony).
The authors (one of whom is a former Apple exec) are arguing that what Apple did was invention and does indeed deserve protection from the legion of lazy wannabes who would otherwise simply copy from the smartest kid in class.
"Concept products are like essays, musings in 3D. They are incomplete promises. Shipping products, by contrast, are brutally honest deliveries. You get what’s delivered. They live and die by their own design constraints. To the extent they are successful, they do advance the art and science of design and manufacturing by exposing the balance between fantasy and capability."
Anyway, I very much like JLG, and I liked and agree with the article's push.
so what's new
they did other stuff: made things better, connected in new ways, etc.
i still don't like their stuff - for aesthetic reasons
get a life