CPUs are produced to a single design within a processor series. The ones with no faults are clocked the fastest and have all their features enabled. Parts with lots of flaws are sold off as crippled budget processors. AMD's triple-core processors seem a bit strange until you realise that they're a quad-core part with one faulty core.
Overclockers have known about this for years, identifying countless 'good batches' of processors and GPUs that were perfectly capable of being clocked faster, but were sold as cheap parts to fulfil demand.
If this seems like a con, then the whole IC fabrication industry is a con - artificially crippling ICs has been standard practice for decades.
That's what makes it seem arbitrary and annoying. When you buy a low-spec chip, you get it cheap because it could be defective. But when you buy this, it's perfectly functional, just artificially limited. You aren't helping Intel increase their yield and getting some savings for helping them -- you are just being fucked because they feel like fucking you.
I hated Intel for a long time, but then they started making good products and contributing drivers to Linux... but if they keep this sort of thing up, that goodwill is all going to erode and AMD will have the opportunity to win back the enthusiast market. (Their "we'll sue you for using the HDCP crack" stance is similarly goodwill-eroding, especially because there is nobody they can even sue.)
Corporations are renting the privilege of using Exchange from Microsoft because they want a corporation to stand behind their email system. If they didn't want to maintain a relationship with a major corporation, they could just as easily have done it with open-source software.
Cisco is entitled to sell me more sophisticated software, but not to prevent me outright from using the hardware I bought the way I want to with my own software. And they don't. My company wants features that would be too expensive with a unified Cisco solution, so we run Cisco phones flashed with custom open-source firmware connected to a Trixbox.
Sadly this line reminds me of certain phones and media players
But from a consumer's perspective, this is a big shift. It's taking what was once an open secret in the industry and shoving it right in the faces of consumers. And I wouldn't underestimate the backlash that can come from something like that.
The fact that we know it's been going on a while means nothing to the average consumer. Crippling of CPUs used to be magic that happened at the chip fabs that only people connected to the industry knew about. Now consumers are being openly told that their chips have been crippled in software, and if they pay more, that magical switch will be flipped.
I don't know if there'll be a big backlash or not, but this is not business as usual. Intel's small step forward has suddenly made the practice visible.
And I see no problem with it.
It is good on two fronts.
1. If you're savvy, you can hack the chip and get more for less (a premium on your ability).
2. If you're not savvy, you can buy exactly what you're after at a price point that is acceptable to you (thus ensuring a proper price/performance ratio).
People feel cheated since they didn't get more than what they paid for? I always assumed that trade happens when two parties reach a point where both "agree" that they got what they expected out of the deal.
There is no need to go all ape on Intel -> IC manufacturing is a cutthroat business with insane risks and low margins. Intel can profit from different people being willing to pay more for the "same" thing (and yes, for anyone who knows and understands Amdahl's law, it's perfectly clear that within a microchip generation the difference between high end and low end is mostly superficial with regard to system performance).
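For the curious, the Amdahl's-law point above can be made concrete in a few lines of Python. The 40% CPU-bound fraction and the 30% clock advantage below are invented numbers, purely for illustration:

```python
def amdahl_speedup(bound_fraction, component_speedup):
    """Amdahl's law: overall speedup when only `bound_fraction`
    of the workload benefits from a `component_speedup`x faster part."""
    return 1.0 / ((1.0 - bound_fraction) + bound_fraction / component_speedup)

# Suppose the CPU is the bottleneck for only 40% of a typical
# workload (the rest waits on disk, memory, network, the user).
# A 30% faster high-end chip then buys very little end to end:
print(amdahl_speedup(0.4, 1.3))  # ~1.10, i.e. about 10% overall
```

Which is exactly the "mostly superficial" effect on system performance described above.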
I recommend reading Spolsky's Camels and Rubber Duckies: http://www.joelonsoftware.com/articles/CamelsandRubberDuckie...
Why is this a bad change? Currently if I buy a 1.33GHz and want to upgrade later to a more powerful one, I'm buying the second processor and dropping a lot more money than I'd be putting down to unlock more processing power of my existing CPU.
This would be like selling all Kindles with 3G built in. You can save, say, $10 off the activation of 3G if you buy it already activated, or you can wait and pay a little more in the long run, but less up front.
I don't see this as a bad thing, but I understand why people would mistake it for one.
If you put a restricted CPU in a budget computer, it's the same as putting a flawed under-clocked CPU in a budget computer, except that you can upgrade it later for less money than replacing the CPU with an equivalent. (Edit: note that it also puts less strain on CPU manufacturers, as they're not producing two CPUs every time someone buys a computer to upgrade, which in the long run would probably mean more quality control could be implemented in the production process, meaning fewer flawed CPUs.)
Actually, you're buying a 2.00GHz part on the super cheap, you just aren't allowed to use all of what you bought under this kind of payment scheme. This has two downsides:
1) Consumers know that the thing they purchased is not really under their control... many people want to be able to do whatever they want with what they purchased. This is basically the same as a pay-per-use license like with software today... which most people absolutely hate.
2) It tells the market that the 2.00GHz part is way overpriced. It's a powerful signal of how bad a value the company is offering, since you actually can buy the 2.00GHz part at the lower price; it's just artificially broken. How long till crackers figure out a way past such a lockout scheme?
If pricing did work this way, Photoshop Elements would have cost the same as Photoshop.
I think that's what I said. In the abstract, consumers will feel like they are getting ripped off by paying for the fully unlocked part, knowing that the exact same thing is available at a cheaper price -- part of the consumer's mindset will be, "we know chip manufacturer A has to make a profit, but they're clearly making a profit on the cheaper part or they wouldn't sell it; the premium part, then, is just a con job and manufacturer A is just getting greedy."
This reduces the desirability of the premium part. It'd be like selling a 6-cylinder car and a 4-cylinder car, only the 4-cylinder is actually a 6-cylinder with two of the pistons turned off (and can be enabled by paying some premium and having the other two pistons turned back on -- don't think something like this hasn't already been thought of in the car industry). People will just buy the 4-banger and take it to their local ricer shop and have the other two turned on and a type-R sticker put on the back.
In reality, it'll reduce the price of the premium part as demand shifts to the cheaper (but exactly the same) alternative OR, if the price of the premium part drops so low as to be extremely close or at parity with the slower part, people will stop buying the lower priced part. Either way, in reality, they should have just sold one part and not wasted everybody's time.
After going over every inch of the source code and finding nothing wrong, I'd almost given up when I discovered the Sig 11 FAQ (http://www.bitwizard.nl/sig11/). Clocking the CPU down to 90MHz: no more crashes. Turned out my gray-market dealer had sold me an overclocked 90.
Then again, here's a thought: This test marketing is being done with ancient-architecture Pentium chips. Perhaps this is a consequence of fabricating these chips with less-dense architectures on equipment built for the tolerances required by the latest generation. I don't know enough about the process to say for sure, but I could imagine this resulting in a significant improvement in yields compared to the Pentium's heyday (or to Core i7 chips today).
As for this CPU, recently Intel revived the Pentium name and positioned it as a low-performance chip sitting about midway between the Celeron line and the Core i3 line.
Intel doesn't want people to really compare which chip they are buying. They just want people to feel dirty for buying low-end chips.
Your basic subscription to Basecamp, your Photoshop trial, your Windows 7 Home Edition, the Keynote trial that comes with your Mac, your iPhone... they're all capable of doing a lot more (for no extra cost to the company behind it), and you can upgrade for a cost. Another good example that the article mentions is upgrades in video games. This is a pretty common practice for software, so why not hardware?
The formation of Standard Oil in the 1880s was a reaction to ruinous overproduction of oil, and its effect was to control and restrict the flow of oil onto the market to maintain prices. Same with the OPEC cartel today.
That people don't already know this says more about the educational system than the economic system. Then again, the educational system is itself a monopoly that seeks to restrict the flow of information so that it keeps itself relevant.
So the little Chinese kid will indeed get a kind of wisdom by learning that "in the great American Capitalist system, goods are damaged on purpose to fulfill the optimum of their system: [...] chip manufacturers damage their own goods on purpose. Why do they do this? Because the rules of American Capitalism dictate it."
Because for the most part, that's the most unbiased view you can get.
(But the rules of American Capitalism apply more and more to China too, so the little Chinese kid either won't be taught that, or will learn soon enough that he'll be part of the very same system regardless... (Not saying that real attempts to organize society by communism were better anyway))
Think of the logic. You want to produce a low end and a high end chip - because that's what the market is eating up. It's cheaper to have one fabrication process rather than two.... and suddenly, instead of busted, inferior chips being sold as lower end chips, we have the same chips in the same market niche, with the possibility of a simple software update to enable the high end features. It might smell funny at first, but as long as they are up-front about it, it's just business.
They have exactly one competitor (AMD), and because Intel is so much bigger and has so much more money, they always have newer and better fabs. The only ways AMD can be competitive with Intel are to be really daringly clever with their chip design (Athlon 64, and hopefully the upcoming Bobcat and Bulldozer cores), sell their chips at lower profit margins, take advantage of Intel's mistakes (Itanic vs Athlon 64), and to pick their battles wisely (AMD has never been able to field a whole product lineup that is competitive across the board). The barriers of entry to that market are so high that even AMD can't fully surmount them, and AMD definitely can't gain ground or even maintain solvency by doing what Intel does but slightly better.
But at the same time, doing only low-profit-margin manufacturing might be either unprofitable, or not enough to drive and push the development of new technology.
Unless processors are coming with EULAs now?
This could set a dangerous precedent for automobiles or any other physical good.
EDIT: I'm speaking of consumer, not business purchases.
...not that I think that this is an agreeable situation. But I am hardly surprised to see things evolve in that direction.
As long as they are up front about the cost & features of the product I'm buying I don't care. As a consumer, I don't like this practice, it smells bad - but it's supply and demand. Intel can price their chips however they want, it's up to them to price them in a way that's the most profitable for them - it's not like they have no competition.
Further, it delivers what the customer expects. It just turns out that Intel nicely provided additional functionality that you can unlock if you pay for your part of the R&D cost of it (which is the real cost of processors).
There is nothing wrong with this practice.
That isn't the case in this situation, though. The expectations of the device were entirely up front and this was more of an "aha!".
As an aside, my auto has a built-in system that monitors car telematics, has GPS functionality, a built-in cell phone, a "lojack" system, etc. I paid for this as a part of the car. It is completely useless the moment I stop paying a monthly service fee.
What is to prevent a rogue employee, system error, or virus from disabling your vehicle at the most inopportune time?
"Sorry, your [phone|cpu|car] is bricked. Just have it towed, and I can JTAG it."
Even at speeds of up to 40 MPH on the runway, the attack packets had their intended effect, whether it was honking the horn, killing the engine, preventing the car from restarting, or blasting the heat. Most dramatic were the effects of Device Control packets to the Electronic Brake Control Module (EBCM) — the full effect of which we had previously not been able to observe. In particular, we were able to release the brakes and actually prevent our driver from braking; no amount of pressure on the brake pedal was able to activate the brakes. Even though we expected this effect, reversed it quickly, and had a safety mechanism in place, it was still a frightening experience for our driver. With another packet, we were able to instantaneously lock the brakes unevenly; this could have been dangerous at higher speeds.
Software is different.
First time I've seen this on consumer kit tho'.
Upgrading the CPU is quite costly - you have to take out the old CPU and replace it altogether with a new one. With HDDs you can put the old one in an external enclosure and get some use from it (if the reason you're replacing it isn't failure), and with RAM you might have slots free, so you can keep using the old stuff in addition to the new.
Replaced CPUs tend to be pretty useless. The socket type changes all the time, so you have to eBay it off. Laptops are worse, the CPUs are often soldered on or you can't find replacements.
The only real problem I foresee is that the upgrades are likely going to be of limited use. Hyperthreading gives you a modest boost, anything more than that (clock speed, disabled cores, cache) will drive up power consumption and therefore thermal dissipation. Computers upgradeable in this way will have to contain cooling systems that can cope with the extra heat, even if 90% of people will never upgrade. Might still be feasible for the very low end (Atoms - especially as they're so slow even "normal" people might upgrade) or as CPUs become increasingly modular (either double your CPU or GPU cores in the same package, but not both).
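A rough sketch of why those upgrades drive up thermal dissipation: dynamic CPU power scales roughly as C * V^2 * f, so a clock bump that also needs a voltage bump costs disproportionately in heat. The 25% and 10% figures below are made up for illustration:

```python
def relative_dynamic_power(freq_ratio, voltage_ratio):
    """Dynamic power scales roughly as C * V^2 * f; relative to the
    baseline, that's (V2/V1)^2 * (f2/f1). Capacitance C cancels out."""
    return voltage_ratio ** 2 * freq_ratio

# A 25% clock bump that also needs ~10% more core voltage:
print(relative_dynamic_power(1.25, 1.10))  # ~1.51, i.e. ~51% more heat
```

Hence the need for cooling systems sized for the upgraded part, even if most owners never buy the upgrade.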
Deleting a certain gene in mice can make them smarter by unlocking a mysterious region of the brain considered to be relatively inflexible, scientists at Emory University School of Medicine have found.
* Nvidia intentionally capping double-precision floating-point performance at 1/8th on their GTX 480 cards, since they cost a fraction of their Tesla cards. Both are based on the latest Fermi architecture.
* In the automotive industry: two otherwise identical cars, but with different versions of the engine microcontroller firmware, can have many thousands of dollars difference in price.
Doesn't look to me like it covers CPU features.
Copyright is about the right to restrict copying - the clue is in the name. The software case is fundamentally different, because your computer must make a (temporary) copy of the software in order to use it.
(On the other hand, if they designed it such that you needed to upload a piece of (copyrighted) microcode to the CPU on each boot, then that could well bring it within the remit of copyright law. In that case, if you wanted to produce a third-party version, you'd have to write your own "clean room" version of the necessary microcode, which seems like a pretty high hurdle).
Of course DEC field engineers did not feel like waiting for the diagnostics to complete so they usually temporarily upgraded the machine to full spec to run their tools, then revert the changes before they left.
This was a funny little dance, because some of their customers had clued in to the trick and would do the same thing after the engineers had left: upgrading the machine, only to downgrade it just before a field engineer would arrive.
On the plus side, if this is a software thing I fully expect it to be hacked.
Excluding the friends I have in the tech industry, not a single one of my friends or family would be able to tell you the difference between the CPU and the software on their computer.
Besides, which is a better value proposition to the consumer (even if it is not a better value in reality)? Paying $50 for an online CPU upgrade that makes their processor appear 10% faster, or dropping $500+ on a brand new computer to get 40% faster?
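Putting numbers on that value proposition (reusing the hypothetical $50/10% and $500/40% figures from the comment above):

```python
def dollars_per_percent(price, percent_faster):
    """Cost of each percentage point of speedup."""
    return price / percent_faster

print(dollars_per_percent(50, 10))   # 5.0  -> $5 per % for the unlock
print(dollars_per_percent(500, 40))  # 12.5 -> $12.50 per % for a new machine
```

By cost per percent of speedup, the unlock is the better deal, even though the new machine delivers more total performance.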
Intel gets the opportunity to make a little more money and the customer gets a little future-proofing. Like I said, it seems like a great idea to me....
In any case, the processors were reportedly quite bad and the whole thing was eventually dropped.
It's a reasonably performing MIPS clone, and the project is still active.
With every paid service, one can get a list of features for a fixed price, and then pay more to "unlock" features that the very same software can already do.
I wonder why it is easier for people to pay extra for extra in software but not in hardware.
Does anyone have an idea?
Edit: My point is that Intel is intentionally reducing the value of their product. How can they afford to do that and remain competitive?
I could charge people $5 for the download and $25 for the code. Not a great idea, I think, but straightforward.
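A minimal sketch of that download-plus-unlock-code scheme (the secret string, the customer IDs, and the 8-character code length below are all invented for illustration):

```python
import hashlib

SECRET = "hypothetical-server-side-secret"

def make_unlock_code(customer_id: str) -> str:
    # Server side: derive a per-customer code from a shared secret.
    return hashlib.sha256((SECRET + customer_id).encode()).hexdigest()[:8]

def unlock(customer_id: str, code: str) -> bool:
    # Client side: the $5 download already contains everything; the
    # $25 code merely flips the switch. Note that if this check (and
    # the secret) ships inside the client, it can be cracked -- which
    # is the whole thread's point about software locks.
    return code == make_unlock_code(customer_id)

print(unlock("alice", make_unlock_code("alice")))  # True
print(unlock("alice", "not-the-code"))             # False
```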
Amazingly informative, taught me a lot about how pricing works in the real world.
It does seem a little dirty to sell something to somebody, but restrict how they can use it. But, Intel isn't just selling a chunk of silicon. They are selling the work that went into shaping that silicon just so. They could have designed it much more cheaply if it didn't need to do so much. So, by buying the low-end chip, you aren't paying them for the extra effort they put in for their high-end customers. If you want the extra benefit, then you must pay for the extra work.
If I bought a netbook that had an Atom 450 in it, and I knew what the processor was and what it was capable of, and then I took delivery and found out later that it really is an Atom 450+ that can morph into a Xeon 7500 with a payment, there's no loss or detriment to me to not take the option. If anything it's a convenience.
Intel develops a CPU, looks at the market, and sees that their chip is competitive at the $150 price point. So they go and sell it at that price. But the production cost of a CPU is probably closer to $10. And because there is a market of people who do not want to pay $150 for a CPU, Intel cripples some features of the CPU and sells it for $100. This way those who are willing to pay more will pay more, and those who are not will pay less.
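That segmentation logic is easy to put in a toy model (every figure below is invented; the comment's $10, $100, and $150 numbers are just reused for illustration):

```python
UNIT_COST = 10
# (number of buyers, willingness to pay in $) per segment
buyers = {"enthusiasts": (1_000, 150),
          "budget":      (3_000, 100)}

def profit_single_price(price):
    """One SKU: only buyers willing to pay `price` buy at all."""
    units = sum(n for n, wtp in buyers.values() if wtp >= price)
    return units * (price - UNIT_COST)

def profit_segmented():
    """Two SKUs (one deliberately crippled): charge each
    segment its willingness to pay."""
    return sum(n * (wtp - UNIT_COST) for n, wtp in buyers.values())

print(profit_single_price(150))  # 140000: budget buyers walk away
print(profit_single_price(100))  # 360000: enthusiast money left on the table
print(profit_segmented())        # 410000: crippling the chip beats both
```

With only one SKU, Intel either prices high and loses the budget segment or prices low and forgoes the enthusiast premium; the crippled second SKU captures both.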
In their market, it makes sense. You buy a cheap mainframe and, when the time to upgrade comes, the hardware upgrade consists of a phone call.
Now... Would you have to run a Windows program in order to "upgrade" the CPU?
I kind of like the idea of writable logic inside the CPU.