Resistor hack turns a Nvidia GTX690 into a Quadro K5000 or Tesla K10 (eevblog.com)
405 points by Breakthrough 1673 days ago | 190 comments

The corresponding reddit thread is quite insightful:

"It is far cheaper to make one very good chip for the highest market, and modify it slightly for lower end markets...

Now, this is the part you hate: this is the only viable alternative. There's no point demanding different designs for different market segments as that would significantly increase the cost you, the consumer, pays. You can demand chips be sold at lowest prices without being fused down, but then the company eats its own market, becomes unprofitable, and goes out of business at worst - or, at best, doesn't make enough money to fund further r&d as much as they want, which again hurts you, the consumer."


Here's a key insight on this issue that the general public seems to be unaware of: the most expensive thing about workstation cards like Quadro and Tesla is warranty and support. This is also the reason they are clocked down: they have to be more reliable than a gaming GPU.

Gaming GPUs have some parts fused off. These parts are not essential for gaming performance, and powering them off with fuses gives better thermal properties, which allows more gaming performance. These things run HOT.

Finally, this hack does not really make your card into a Tesla or a Quadro, it just fools the driver with a bogus PCI ID and enables some software features that are disabled. It does not enable hardware that has been fused out.

This is key, and I think that as long as you are just using it to get triple-monitor support like this guy did, you will likely be OK. However, if the software you work with relies on some fused-out feature of the Tesla or Quadro, you are just risking frying your card unnecessarily.

I don't understand why users think this is an evil practice. It's HARD to build hardware, and just as software engineers want reusable code, hardware manufacturers want reusable silicon. You are paying for the features you want. If you want a feature set that isn't mainstream and is "workstation" capable, you are going to have to pay more. If you just want to push pixels quickly, then of course they will disable some high-end functionality and sell it to you cheaper.

That said, I applaud the fact that they are digging in and learning to mod these things.

The analogy to software is a good one. Often the software is shipped in full, and you unlock features as you pay for them.

One thing to keep in mind is that the author mentions the same card does not perform equivalently in Windows and Linux, specifically for multi-screen support under Linux. From the forum post:

"I own a NVidia GTX 690 which I bought for two reasons, gaming, and multi monitor setup for work, NVidia made it very clear that this card would drive up to 3 screens in 2d, which it does quite nicely :-+... under windows :--! The tight asses have decided that if you want this feature under Linux you have to get a Quadro which has Mosaic support :palm:. So naturally I decided to look at how mod the card, as the price difference is over $1000 between the GTX 690 and the Quadro K5000 (same GPU) and, get this... the K5000 is only single GPU and clocked some 25-30% slower then the gaming card, what a joke"

It might be that he had expected 3 screens under Linux but didn't get it, so he set out to "fix" it. It seems a bit excessive to expect someone to spend $1000 extra for a card performing 25-30% worse just to get Linux support :)

Apparently triple monitors on Linux with Nvidia cards can be achieved without this hardware hack.

Source: http://www.reddit.com/r/techsupport/comments/13vkup/3_monito...

I have a GTX680, and it supports three monitors out of the box using the proprietary Nvidia drivers on Linux. No hacks needed.

I seriously do not understand how this issue is repeatedly brought up. I've been using multiple monitors on Linux with nvidia-settings for more than 10 years without any issues.

It's also worth noting that the electrical performance of your cheaper card's chip may not be as good as that of the higher-grade chip, so it may overheat more or fail earlier in its life.

On chip lines like these, there's usually a process called "speed binning": you test the performance of each chip and put the higher-performing ones in a different "bin" than the slightly less efficient ones. You then sell the highest-performing ones at a higher price, or put them in the more expensive product lines, as they will be less likely to fail.

i.e.: of all the chips that pass their tests, 85% are C-grade performance, 12% are B-grade, and 3% are A-grade. Intel does this to get the "Extreme Edition" chips, and I'm assuming Nvidia does this to select the chips for their higher-grade product lines.
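A toy sketch of how that binning could work. The clock thresholds and the distribution here are made up for illustration, just like the percentages above:

```python
import random

# Toy speed-binning model: each die gets a maximum stable clock (drawn
# here from a normal distribution) and test thresholds sort dies into
# bins, best bin first. All numbers are invented for this example.
BINS = [("A", 1150), ("B", 1100), ("C", 1000)]  # MHz thresholds

def bin_die(max_clock_mhz):
    """Return the best bin whose threshold the die meets, or None (reject)."""
    for name, threshold in BINS:
        if max_clock_mhz >= threshold:
            return name
    return None  # fails even the lowest bin: scrapped or die-harvested

random.seed(0)
dies = [random.gauss(1050, 60) for _ in range(10_000)]
counts = {}
for d in dies:
    counts[bin_die(d)] = counts.get(bin_die(d), 0) + 1
```

With a bell-curve spread like this, the C bin dominates and A-grade parts are rare, which matches the intuition that the top tier is a thin slice creamed off the top.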

I had no idea that the manufacturing variation among chips is large enough to create performance tiers. Pass/fail I get, but this sounds more extreme. Like, the difference between C and fail may be such that some non-negligible percentage of chips that test into the C bin should really be in the fail bin.

It's many, many different things that can make the quality vary so much: from the purity and quality of the raw silicon itself, to the design of the chip, where some critical part of the architecture is incredibly hard to fabricate to the highest standard.

Remember that the widths of the oxide tracks within the silicon are on average 40nm these days (that's only ~400 atoms across!) or even smaller, with hundreds of process steps. One big molecule from some tiny error in the production process on the wrong part of the chip may not cripple it, but it may impede performance; it's just probability at the end of the day.

With regards to potential fails going into production: it does happen. There are several test phases during production to catch as many as you can, but at the end of the day you won't get them all.

Semiconductor fabrication is fantastically expensive, new Fab plants cost several billion to build, so if you want to guarantee quality you have to pay for it.

It does make sense in that they are always pushing the edge as hard as they can, and it sounds like this process allows them to do just that. They get to sell the 'lucky' results for a premium, their 'average' for their bread and butter, and even the sub-par chips provide income.

It seems like a really good idea. It also seems like it must be pretty hard to do in a way that gives you a reliable (say) 10% emerging as 'lucky.'

When the PS3 first came out, they were having really bad yield issues. The design of the chip is one slowish "normal" processor, plus 6 "synergistic processing elements" which were really fast little vector processors. Well, not exactly 6... If you look at the chip http://www.trustedreviews.com/Sony-PlayStation-3_Games_revie... you'll see 8! OK so one was reserved by the OS for a hypervisor that would run in the background all the time and was not available for mere mortals to access. But that still only explains 7. Turns out they just disabled one of the SPEs. They tested each of them and if one happened to be broken, they would pick that one to be disabled. This N+1 redundancy improved effective yields even though a lot of the chips were still broken!

To be fair, I made up those percentages for that example, though I'd imagine they are something close to that. But it depends on the chip, the market, and the company selling them.

You design those percentage bands generally around the size of the market you're aiming each chip tier at. The silicon will generally be as good as they can get it; nobody wants to push bad products out. Recalls and returns probably cost more in the long term than failing more chips and suffering a worse yield.

However if only 0.5% of your customer base is interested in paying more money for a faster chip, you only cream off the top 0.5% of chips.

Keep in mind that chips can be tested at different frequencies with different amounts of input and expected output. So that C bin could have a perfect test score under certain reasonable conditions.

Clever little hardware hacks give you oodles of geek street cred, but they may mean putting up with occasional bizarre behavior, so that cred is well-earned. When silicon fails, the apparent effects can defy all logic. We can be talking about a logical AND that does something else entirely <0.001% of the time.

They do the same thing with regular electrical components. Resistors, for example, have tolerance bands, so they make a ton of them, and if one fits in the 1% range it gets marked with the 1% color band and sold for more than the 5%-tolerance resistors. It's easier to change the marketing strategy than the manufacturing process.

Reminds me of all those guys that were arrested for selling modified modems to give speed boosts to internet subscribers. The ISPs run the same pipes to everyone's house, they then check the device you're renting and tighten the spigot.

I'm fine with this as a business necessity, but I really wish someone would pitch a service that only throttles me during peak times. I feel like we're tossing a bunch of idle capacity down the drain.

The big difference between the graphics card modification and the modification of those modems is that unlike the Internet connection, you're stealing from nobody when you modify the card.

If you're going to spend the effort to modify the card as described in the hack, then it is really no loss to the manufacturer: they have already made a sale to you, and you're forgoing your warranty in the process, so they will be unlikely to need to spend any further money on you.

Contrast this with a cable-modem modification, where you modify the uploaded file to allow faster speeds, to the detriment of others. People who legitimately pay for the speeds you've effectively stolen could be hindered as a result. You're stealing from a service in this case.

So I don't think your comparison is fair here.

A better example would be those who have older video game consoles and are tapping into the RGB lines to produce a video signal superior to what is offered out of the box. The manufacturer never intended this, but physically it was possible due to different markets or how the video was processed to begin with.

Tangent: To anyone wondering about the "RGB lines" comment, apparently it has something to do with SCART cables ( http://en.wikipedia.org/wiki/SCART ).

Here are some screenshots: http://www.reddit.com/r/gamecollecting/comments/mcmsz/my_sne...

Found this out when I was wondering what he was talking about...

There's that and then there's this: http://www.made-by-bacteria.com/viewtopic.php?f=20&t=148

The SCART method is fairly straightforward, but getting at it on other, non-SCART-capable consoles is something else, hence "RGB lines". :)

The problem with such things is cross-financing.

They give free/cheap stuff to poor/normal people and expensive stuff to rich people.

Then the expensive stuff pays for the cheap stuff.

It's like socialism for capitalists.

It's just a situation where capitalism doesn't lead to the most economically efficient solution.

The company and the consumers are rational, yet for no additional cost, everyone could be benefiting from better hardware.

Unfortunately I don't have a better system to propose :)

It is a form of price discrimination. In a nutshell, price discrimination is always better for the producer and results are mixed for consumers -- sometimes it leaves them better off, and sometimes worse off.

The claim that this is not economically efficient isn't so straightforward. In particular, your counterfactual isn't "at no additional cost." It would reduce revenue to the card-maker as no one would pay the higher price for the more expensive cards. If the producer can only charge one price, then it would be somewhere in the middle, which would hurt the consumers buying the low end card. Worst case it would make the entire card unprofitable and it would never get produced in the first place.

I think nobody would call nerfing high-end chips for people with higher price sensitivity an optimal solution, but something's gotta give. Either there's a lot of consumer surplus (cheaper prices for better products), or there's a lot of producer surplus through market segmentation (bigger R&D budgets, better products in the long term). You can't have both.

This problem is more obvious with software, by the way: higher-end SaaS plans may cost marginally more (storage, support) but they're really mostly about price discrimination: getting people who can pay more to pay more even if you could provide all your customers with the top-tier experience.

The biggest problem with deciding whether a given system leads to "the most economically efficient solution" is in deciding what the most efficient solution is. Another way of viewing this is wealthy business users subsidising less wealthy gamers - which is a pretty good thing IMO.

Why is that? The major cost is the one-time cost of designing the chip, not producing it. Software is even more extreme: the distribution costs are essentially zero. That doesn't mean the most economically efficient solution is for the software to sell for nothing.

Say you have 2 audiences: Gamers and professionals.

For gamers, a low-performance card yields $10 of value, and a high-performance card yields $12.

For professionals, a low-performance card yields $5 of value, and a high-performance card yields $20.

If nVidia sold a single high-performance design to all of them, gamers would be willing to pay up to $12, and professionals up to $20.

If they want to sell to the entire market, they have to price it below $12, forgoing around $8 from each professional buyer.

However, if they sell 2 designs (e.g., one a crippled version of the other), they can derive around $10 of value from each gamer and $20 from each professional.

The reason this is inefficient is that for each gamer buying the crippled design, $2 of real value is lost. This loss could be avoided if there were some other way to extract the actual value from gamers and professionals without crippling the product.

In short, the inefficiency is not that the price is far from the production cost, but that the intentional crippling of the product (necessary to extract maximal value via market segmentation) causes an actual value loss in the economy.
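The arithmetic above can be checked in a few lines, using the same made-up dollar values as the example (prices set at each buyer's valuation, ignoring production costs):

```python
# Willingness to pay per (segment, performance tier); illustrative only.
value = {
    ("gamer", "low"): 10, ("gamer", "high"): 12,
    ("pro",   "low"):  5, ("pro",   "high"): 20,
}

# Strategy 1: one high-end SKU priced so both segments buy (at most $12 each).
one_sku_revenue_per_pair = value[("gamer", "high")] + value[("gamer", "high")]

# Strategy 2: segmented SKUs, each priced at the buyer's own valuation.
segmented_revenue_per_pair = value[("gamer", "low")] + value[("pro", "high")]

# The gamer now gets the crippled part: $2 of real value is destroyed.
deadweight_loss_per_gamer = value[("gamer", "high")] - value[("gamer", "low")]
```

Segmentation raises revenue per gamer+pro pair from $24 to $30, at the cost of $2 of destroyed value per gamer, which is exactly the trade-off the comment describes.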

However, in your latter scenario the net loss is only $2, versus your original scenario where the net loss is $8 (losing $8 per card from the professional, versus the gamer losing $2 of performance).

So it's socially optimal to continue to do what Nvidia is doing ;).

In the original scenario, there is no economic value loss, only monetary loss for nVidia; i.e., the surplus $8 of value is captured by the professionals rather than by nVidia.

If they sell at a >4:1 ratio in favor of the gamer version, it is still suboptimal.

If I had to guess, I would say their gamer card prices are by and large good approximations of the value they provide. No idea about professional card prices.

That's not really true. For small runs, design time is a significant cost, but at scale it's mostly a question of manufacturing and capital costs.

Consider: Intel produces a wide range of CPU designs specifically because production capacity is what is most expensive. Video cards are something of a special case because they are highly redundant, so companies can sell highly damaged chips as slightly different models. However, demand rarely matches the rate of defects, so chips are often sold below capacity.

What confuses the issue is "pro" cards that may perform worse than "gamer" cards but cost a lot more. However, that's more a case of different needs. Consider: stability is a lot more important to the "pro" market, which downclocking provides. Toss in some driver tweaks and a small market, and you end up with a high-priced product that does not cost more to manufacture.

Not necessarily. The people who were buying the high-end version with all the features enabled would then be paying a lot less, so total revenue is significantly reduced. Of course they could raise prices, but then the people who were buying at lower prices for just the features they wanted would suffer. The price increase might be so large that the lower-end consumers stop buying altogether, and then they have to raise prices even further to make up for the lost revenue. And then the cost per person will actually be even higher, since there are fewer consumers but presumably the total cost of production is the same.

Look at it this way. By decreasing prices in exchange for disabling some features, they increase the total number of consumers. With more consumers the total production costs can be distributed amongst more people. Which lowers prices for everyone, including the people paying for all of the features.

There's a "law" somewhere that says if a vendor can segment a market easily, it will. Hence academic software pricing, lunch menus, military discounts for movie tickets, and so on. If there's a simple way to segment a market, it will always be profitable to do so (unless you somehow piss off one of your segments, I suppose).

Academic Software pricing and military/age discounts for tickets are direct price discrimination, where you can identify groups and prevent any arbitrage.

nVidia chips are a classic example of indirect price discrimination, where you can't prevent arbitrage. In tech this is common because both hardware and software have huge upfront costs (new fab design; coding and testing software), and it's cheaper to produce one design and cripple it than to produce two different designs.

Without the "high-margin" fully unlocked segments, businesses might not pursue projects at all, or might greatly reduce the scope and budget for R&D. It feels wrong because humans are wired to feel losses much more strongly than unrealized gains, but it is the best solution for everyone.

Don't you feel a little stupid for bringing up the term "economically efficient" when you don't know what you're talking about?

"Economically efficient" is not equivalent to "the cheapest possible goods"

Could you elaborate in a less rude and more informative manner?


Lots of industries are able to survive without extensive market segmentation. Why should it not be possible for HW manufacturers?

And lots of industries aren't. Why should it be possible for HW manufacturers?

It's not relevant that some industries have different practices. It would be like asking why can't manufacturing let people work from home, since it works in programming. It's because they are different industries. Their attributes, particularly in terms of R&D and capital costs, aren't on the same scale.

Should we force our industries to survive, or should we allow them to charge what people willingly pay (for non-essential goods or services)?

Because it's about growth, not profitability.

Could you (or someone) expand on this?

Exactly, binning is good for the business AND for the consumer.

When "binning" refers to the practice of putting premium prices on parts that test as being capable of operating at higher clock speeds or lower voltages, that's good for the consumer. When it refers to die harvesting - selling the chip with a defective section disabled - that's also good for the consumer. When it refers to crippling a chip that has already passed QA, it's a symptom of insufficient competitive pressure. The best example of this is when Intel offered a $50 software upgrade that would enable HyperThreading and more CPU cache. What NVidia is doing here probably isn't as bad - the premium associated with Tesla and Quadro parts goes toward driver features and QA that is irrelevant to the consumer market.

> it's a symptom of insufficient competitive pressure

Exactly. That's as if a fast-food chain, just before serving food to a customer, added an ingredient that made the food taste worse (and/or be less healthy) in order to be able to sell you the same food without this ingredient at a premium price.

We don't see such behavior because the fast-food industry has heavy competition. Not so much in the GPU market, where two companies have most of the market and entry barriers are high.

> When it refers to crippling a chip that has already passed QA, it's a symptom of insufficient competitive pressure.

I don't know. As far as I'm familiar with it, early in the development of a process node they sell every top-bin SKU they can make, and as the process matures they just have too many top-bin parts. I can only guess your argument is that in a competitive market they would start dropping the price of the top-bin SKU to move them, but that reduces the overall profit of the venture, which may drive them to increase initial pricing if profit margins are thin enough.

Basically what I'd argue is that the pricing is the symptom, not the down-binning. Down-binning exists to maintain & stabilize SKU distinction and pricing, as yields change over the life of a product.

When a working chip is crippled, that's a destruction of economic value. It may be offset in the long run by helping sustain Moore's Law, but that claim is very much in need of evidence. You have to actually consider what the computer market would look like if we could now buy the flagship CPU from 3-4 years ago for little more than the marginal cost on a mature process. That level of performance is good enough for most users, and could be available for ARM kind of prices. Imagine if the Raspberry Pi were twice the price but powerful enough to be your primary machine.

> consider what the computer market would look like if we could now buy the flagship CPU from 3-4 years ago for little more than the marginal cost on a mature process.

If that was what the market wanted, don't you think that is what the market would get? I think the trouble is you are thinking "the marginal cost on a mature process" would be a few dollars.

You're right, the silicon would be a few dollars- but after testing and packaging and all that jazz, I believe most mainstream high performance desktop CPUs have a marginal cost of around $30. (Why not cheaper, like a 10MHz ARM chip? Package is expensive, due to cooling needs and pins for power & DRAM interface) So, perhaps you bring it to market for $40.

This chip you're selling probably performs like a mid-range part in today's terms, in the $100-150 range. But when you consider TCO due to power draw and cooling, the numbers start to get closer.

I haven't carefully laid this out on paper or anything, but point being, if such a chip would sell so well, why is nobody doing it?

It becomes much clearer if you increase the contrast by comparing today's parts to those from a decade ago. The clocks are similar, but the performance has come a long way due to improvements in IPC, multicore, updates to the memory interface, etc.

On the other hand, high-end CPUs on a warehouse shelf are of little economic value to anyone.

> ...it's a symptom of insufficient competitive pressure.

The extra value a producer is able to capture from price discrimination, due to a relative lack of competition, can be thought of as one of the ways in which the deal between producers and society called "intellectual property" functions to shift rewards for innovation toward the producer.

Something thought "insufficient" for one end might indeed be proper for some other end.

It's good in the sense that it's not clear there's a better alternative, but there is a painful point with a significant part of the market using a crippled design and losing value -- just to make the market segmentation possible.

If there was some other way to do the market segmentation: those who derive $X of value from the chip pay $X, and those who derive $Y pay $Y, without the crippling, then more economic value would be derived.

An example alternative for this approach could possibly be funding such developments with income tax. Then everyone can gain from the benefits of an uncrippled product, and the income tax already approximates how much value you gain from the R&D. This introduces a whole host of other problems, of course, but it does solve the crippling problem.

That's actually how we primarily fund the education system these companies rely on for their R&D hiring. Which itself utilizes a whole other layer of price discrimination for need based tuition cost adjustments.

It seems every company screws up on binning, sometimes repeatedly. AMD and Intel and nVidia have had 're-binning' issues on a lot of hardware.

Outside of a smaller community of tinkerers, it's not much of a trend. On the scale these companies operate at, it makes no difference to the bottom line. Few can be bothered with learning SMD soldering.

If you're linking to a specific Reddit comment, please link to the actual comment, not just the post it's a comment on.

It's possible for Nvidia to replace the resistors with one-time-programmable memory and program it at the factory. And it's cheap; it's being used in $1 FPGAs from SiliconBlue.

And they'll probably do this in their next chip.

I don't even know if they'd bother to do this, given the handful of people who are actually going to buy the gaming card to turn it into a pro card.

Most people who are probably doing this are hobbyists at best. Most professionals would probably just have their company buying the cards anyway, and this hack won't at all change that.

Actually, it being a hardware hack rather than a software hack makes it even less appealing even for small shops to try. "You want to take that new $400 video card that we just bought you and do /what/ to it?" vs. "ooh, it's just a software upgrade" (even though both have the very real chance of bricking the card in inexperienced hands).

> have their company buying the cards anyway

as if the company had infinite resources and didn't want to spend less.

This isn't true for startups or any kind of small business.

Some people could turn this into a small business I guess.

Doesn't the fact that they're selling the same chip for a lower price mean they are profitable with that low-end price, too? I doubt they are just dumping the leftovers for the low-end, and taking a loss.

Also how is this any different from the Adobe scandal in Australia from a couple of months ago? I believe the Australian government even took them, Microsoft and Apple to court over charging Australians a lot more than Americans or others for the same products.

You are thinking about chip costs as if you were selling bread or cars, where the majority of the cost of making the product is the marginal cost of manufacture. This is not true for microchips. The marginal cost to produce any modern Intel CPU is <$40. Most of the cost of making chips is in developing the process they run on, and in designing, testing, and validating the chips.

When the economics of making chips are $2B down to make the first one, then $25 for each that goes off the line, you cannot measure profitability like that. They need to recover R&D, and providing more expensive chips to the professional market through price discrimination is a good way to do this.
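That cost structure can be sketched in a couple of lines, using the rough figures from the comment ($2B fixed, ~$25 marginal; both ballpark numbers, not real Intel costs):

```python
# Rough cost model for a chip line: huge one-time fixed cost,
# tiny marginal cost per unit. Figures are illustrative.
FIXED_COST = 2_000_000_000   # process development, design, validation ($)
MARGINAL_COST = 25           # cost to stamp out one more chip ($)

def avg_cost_per_chip(units_sold):
    """Average all-in cost per chip at a given volume."""
    return MARGINAL_COST + FIXED_COST / units_sold
```

At a million units, each chip has to carry $2,025 of cost; at a hundred million units, only $45. That's why "is the low-end price profitable?" can't be answered by looking at marginal cost alone: which prices are viable depends entirely on how the fixed cost is spread, and higher-priced segments help defray it.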

Making the masks is really expensive. I had an external lecturer tell me it cost about $1 million to make a set of masks for their silicon; I assume Intel pays way more.

Once you have the mask though, pushing out chips is cheap and easy.

And the masks are just a really, really small part of the total cost. Intel has to pay for their fabs, they have to pay for the research into ever smaller line widths, they have to pay for materials research for new gate materials, and they have to pay for the massive research and development to design new, better chips.

It isn't about just being profitable, it's about extracting maximum value from every customer.

Ideally, the company would haggle with each and every customer, and get each customer to pay the highest price he in particular accepts. Since haggling is not feasible in mass markets, customers get grouped in segments, where the price is set to the maximum the segment will accept.

Clearly, if you misclassify a hacker into an expensive segment you get your hardware hacked :-)

Yep. In my view, if the product is already profitable at a lower segment, and they manage to sell it to the pros for a higher price, then the company is making undeserved money and, rightly so, bears the anger of its customers when found out.

> Yep. in my view, if the product is already profitable at a lower segment, and they manage to sell it to the pros for a higher price, then the company is making undeserved money, rightly so, bears the anger of their customers when found out.

I completely disagree. Even if it were as simple as just charging "pros" more money (it's not), the idea that this is somehow "undeserved money" is pretty ridiculous in its own right.

I can't break it to you any more easily than: You are flat wrong. Segmentation is micro-economics 101 and is an expected company behaviour.

Perhaps I can explain briefly. The "deserved"(sic) price is a value at which both buyer and seller are comfortable doing business. In most cases this is not a single value but a range of values: below the minimum threshold the seller is uncomfortable, above the maximum the buyer is uncomfortable, and in between the transaction is possible.

Now, while for a given product and fixed seller, the minimum value of the agreement range is fixed, the maximum is dependent on the buyer. Yes, it is possible to sell all products at the minimum, but then you are giving all shared value to the buyer. Even if you are aiming for fairness, the transaction should occur at the middle of this agreement zone (the buyer buys at less than his maximum price, and the seller sells above his minimum price).

You could tackle this difference between consumers on a case by case through individual negotiation. Obviously this is not practical, so the next best thing is segmentation.

You may think of this as unfair, but that is a result of seeing the glass half empty. Segmentation allows a company to subsidize "cheaper" products using "premium" products. If a cheaper product covers variable costs (but not fixed costs), and does not cannibalize the premium products, segmentation allows prices below what would be possible if the burden of fixed costs had to be assigned to the cheaper segment. In industries where most of the cost is fixed (as in semiconductors), segmentation is key to achieving large volumes without compromising the ability to profit.

The product is not already profitable at the lower segment, unless the higher segment is there to help defray the capital costs of building the fab.

Chips with more defects can be sold on the low end rather than scrapped, making production easier and more profitable.

Yep. AMD did this with its Phenom II chips: maintain the same fab process for a 4-core processor, but sell it as dual- or triple-core if it was poor quality. AMD motherboards famously had a feature to enable the locked-down cores. Obviously your mileage would vary, but going from 3 cores to 4 was a pretty significant performance boost, especially for the initial cost.

A while back there was a way to unlock an AMD 6950 GPU to a 6970 as well. Had something to do with unlocking some memory modules since the cards were identical.

I think it's incredible how some people enjoy getting screwed over. I have a friend that actually thinks it's a good thing that iPhones break easily.

That seems out of context; it is fairly common for people to abuse return policies like Apple's to get free upgrades after (un)intentionally breaking their hardware.

Interesting, I just finished reading Joel's article on pricing talking about this: http://www.joelonsoftware.com/articles/CamelsandRubberDuckie...

>The tight asses have decided that if you want this feature under Linux you have to get a Quadro which has Mosaic support :palm:. So naturally I decided to look at how mod the card...

I just absolutely loved reading this line. I had been away from 'hacker culture' for nearly two decades (probably ever since I became serious about my studies back in middle school and I stopped 'having fun' with my studies and interests) and finally seem to be growing back into the mindset. This kind of tinkering, exploring attitude is so wonderful, even just as an observer to this story it makes me feel like I'm regaining my childhood innocence again.

This made me smile. Also reminded me of http://prog21.dadgum.com/169.html

It's possible Nvidia manufactures the higher-end cards and then, if they fail the high-end testing but pass the low-end testing, marks them as the lower-end card. There is precedent for this in many areas of computer hardware (CPUs, RAM, hard drives). I would be wary of undoing this and expecting any level of reliability.

That is a good point. Something I really wouldn't have even thought of before even though I knew about the hard drives part.

In the post he says: "the GTX 690 and the Quadro K5000 (same GPU) and, get this... the K5000 is only single GPU and clocked some 25-30% slower then the gaming card, what a joke"

so wouldn't it be downgrading the card to make it usable on linux? maybe not idk much about linux, drivers, hardware etc.

no, he's only altering how the drivers perceive the card.

Frankly, throwing in resistors to modify how it gets identified by the computer is really a hardware hack for a software problem.

The drivers are stupid, and they're deliberately choosing to not use card features.

When you buy a high end card, you're buying the development time required to enable advanced features, you are not buying hardware; the silicon simply doesn't cost that much to produce. Most people do not want or need these advanced features, nor do they want to pay for them. The alternative environment has everyone paying $2000 for the same card and much fewer sales vs. premium users paying for premium features, and everyone else getting what they need at a much lower cost.

They're not just spending the development time on advanced features. NVidia appear to have intentionally crippled stuff like data transfer speeds to and from the GPU on their consumer models (but only for OpenGL/OpenCL and DirectCompute - you still get full speed when doing the exact same thing through DirectX) on the basis that it mostly only screws professional users.

This was compared to software that ships with all the features but requires a serial number to unlock the good ones. However I think this is different, and I think modifying hardware in this way is considerably more moral than using keygens to unlock software. I can't state exactly why I feel that this hardware mod is entirely moral behavior while software piracy is not; it's hard to put my finger on the right words. I think it's possibly something to do with Locke's notion of property rights and how one has an ownership claim to that which he created by his own labor. Am I the only one that feels this way? Sorry to ramble.

No, you're not the only one to feel that way. In fact, until you explicitly pointed it out, I wouldn't have thought at all that what this guy was doing was the moral equivalent of using a keygen to unlock features he has not paid for.

I think it's the physicality of what he has done that causes this. To me it 'feels' like somebody modifying a thing he had bought to make it more useful - we do this all the time without any moral qualms, such as modifying a pair of jeans so they fit you better. Or you can pull apart cheap AA batteries to get more expensive watch batteries contained inside [1]. Nobody would dare claim such behaviour is immoral.

Yet somehow what this guy has done is skirting a moral boundary and using a keygen is widely considered 'wrong'.

1 - http://www.howtogeek.com/95390/hack-apart-a-12v-battery-for-...

This makes sense because "buying" software is different than "buying" hardware.

When you buy (a license to use) software, (at least nowadays), you're paying for the right to use the software, not the actual bits on the disc/file.

When you buy hardware, you're paying for the materials and manufacturing effort that produced a physical thing.

Given that mass reproducing hardware is non-trivial compared to software, there is often little/no/paper-thin DRM on hardware. Thus the OP could modify his graphics card.

On the other hand, look at cell phones. Given their legal connection to a service contract that is enforced with SIM cards, providers have a convenient form of DRM to enforce limitations on modifications.

While you might have a right to mod the hardware you bought you probably don't have a licence to use different drivers?

Agreed - you've spent your dollars on an actual physical device. Then find out you've paid for the same components as a professional model, but can't use it. An article here on Wired is on the same subject ... http://www.wired.com/opinion/2013/03/you-dont-own-your-cellp...

If this hack well and truly only modifies the PCI ID, then it follows that all of the performance difference is in the drivers - maybe with different firmware downloads on initialization, or maybe just the API exposed to applications.

How would you feel about a binary patch that modified the driver to treat a card with the low-end PCI ID as if it were a high-end card?
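Mechanically, such a patch would amount to swapping a 16-bit device ID inside the driver binary. A deliberately naive sketch (the IDs are the ones quoted elsewhere in the thread; a real patch would need exact offsets, since blindly replacing bytes can corrupt unrelated data that happens to contain the same pattern):

```python
# Illustrative only: swap every little-endian occurrence of one PCI
# device ID for another inside a driver blob. Blind replacement like
# this is unsafe on a real binary; a real patch targets known offsets.
def patch_device_id(blob: bytes, old_id: int, new_id: int) -> bytes:
    old = old_id.to_bytes(2, "little")
    new = new_id.to_bytes(2, "little")
    return blob.replace(old, new)

# e.g. patched = patch_device_id(driver_blob, 0x11BA, 0x118F)
```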

I think although the effect is the same, it's not as justified. For one, you can distribute a patch to people who do not have to do any "labor" to get the same effect as the hardware mod. You can't distribute a hardware mod to others. Also, with software I feel there is an implicit trust agreement. Photoshop is just a different series of 1s and 0s than GIMP and both are just a different sequence of 1s and 0s than what exists on the empty, fragmented parts of your hard drive. However as a society we have agreed to pay for our copy of these binary sequences and this is the entire software market. We've also, mostly, agreed not to give the software we buy away to everyone else just because it's technologically possible.

However I don't think the same social contract exists for hardware. It might be part of the hacker spirit but hardware modifications are encouraged and often celebrated. People who make their BMWs drive faster than they should or make Blu Ray lasers into lightsabers are applauded in a much different way than software pirates. Again, I can't really explain the cultural distinction but I'm very aware of its existence and that's why I can't see any objection to this Nvidia hack.

I used to modify my printer cartridges so that they didn't cost me a fortune for all my university printing.

Some might suggest Epson sell the printer at such a low price because they expect to earn money from the ink.

If you own the hardware and you did not have to sign a contract to purchase it you're probably more than within your rights to modify the hardware to your liking (some exceptions to the rule do exist for things like circumventing copyright laws, modifying equipment to function in a manner harmful to others like operating a high energy transmitter in an emergency band etc).

What you're not getting when you hack your hardware to perform outside its bounds is the guarantee from the company. For example if you're sold a processor for $X dollars clocked at 2 GHz you're receiving an implicit guarantee that for Y years the product will be capable of performing every advertised combination of operations correctly. If you have a problem with this processor it is then reasonable for you to seek support for your problem/design etc. If you notice you can run this processor at 2.5GHz which would normally cost more money and for your application nothing appears to glitch you're welcome to do so. If you later encounter issues with occasional incorrect calculations you pretty much gave up the right to complain about that issue.

A similar example might be bolts for spacecraft. Many of the bolts for these spacecraft cost 100x more than the incredibly similar bolts one could obtain at a hardware store and often times may come from a very similar production line. The reason the spacecraft bolts cost more is because they've specified a very rigid set of minimum tolerances they require and stringent tracking requirements (if a bolt fails on one part of a spacecraft you may be able to get a list of every other bolt made in that lot, get the reference samples to test and find every individual location they were placed on that and other craft). It's not necessarily that the bolts cost substantially more to make but they do cost somewhat more to verify or select each piece from the line to meet the requirements. Even though hardware store customers might be able to buy bolts with spacecraft quality performance they just didn't pay to ALWAYS get bolts with spacecraft quality performance.

That said being a hardware person I'm not sure I feel the same way about software as many of the people here. It sort of feels like the primary argument is "hardware modification feels hard to me so its different". To me it seems like where it would be acceptable to modify a piece of hardware I own I should be able to open a debugger or disassembler and patch my software to enable any feature that's inherently part of the product (or new features if I wanted to add them). Now if that breaks updates, causes me to lose work or leads to problems I'd expect to be on my own like I would with hardware.

I'm sure my opinion isn't particularly popular, but changing a resistor and changing an if( premium ) to if( true ) just really don't seem all that different from where I am.

I don't entirely know how I feel about this hack. I feel like this is really similar to unlocking the "Professional" version of a piece of software with a serial key generator.

Although, I understand hardware is dramatically different than software, it's using the same principal -- the same reusable bits to assemble many different products, and then just using tools to mask features unless a user pays for them.

Then again, why should software be any different? Why shouldn't we be able to unlock the hidden features of our software with hex editors, and serial key generators?

What do you guys think?

It's not about principles, it's about economics.

This is a matter of price discrimination. Nvidia wants to be sure they have a chance to sell their gear to every single person who has money to spend on graphics cards at a range of price points.

Consumers have smaller wallets than professionals, but both have similar needs. So the company performs a "hack" that allows them to extract maximum revenues from each segment.

Let's take this to an absurd extreme. Sennheiser did this by crippling their low-end headphones with a piece of foam to limit audio quality. Savvy audiophiles simply popped open the cans and removed the foam. No quibble with that, right?

For me, companies have every right to practice price discrimination by fiddling with their hardware. And I think consumers have every right to mod that hardware. A few tinkerers are unlikely to break a meticulous price discrimination model like Nvidia's, and I think it's probably a net win for the company to have a chunk of their userbase who love them so much all they want to do is tinker in this way.

Yes. That's wonderful rhetoric, but really what it's saying is that the only people who don't deserve to be screwed are the people who make it a hobby to tinker.

There's many other areas of life to specialize in, not just technology/engineering side of things. I think general fair play trumps an incentive to learn and hack.

What makes price discrimination unfair? Nvidia needs to recover costs and earn a profit. They do so by segmenting the market. That they are able to do so in this way suggests that no one else is able to perform the R&D necessary to sell these tools at a lower cost.

Who's getting screwed? The gamers who lose out on pro features many don't even need?

or the pros getting massively overcharged because culture accepts the notion of sinking tens of thousands of dollars into the equipment of one's profession.

The answer is both.

You seem really excited about being angry about this, and I'd hate to deprive you of your rage-joy. So, sure.

I appreciate Hacker News preventing anyone under 500 karma from downvoting comments. It makes sense really. People use downvotes as an analog for disagreement, but those kinds of actions obfuscate proper discussion.

Really what they should be used for is dissuading bad thinking, trolling, or poor rhetoric.

What do you think?

I think whatever you're getting at has sailed right over my head. But I'm glad you're having fun! That's really what it's all about.

It's not "being screwed". If they didn't do this, they would have to either physically make different cards which would increase the price of both (or lower margins and reinvestment in r&d) OR keep the same hardware and sell it at only one price point somewhere in the middle (which would be more expensive for the consumer, who typically wants the cheaper gaming card). This is a win-win for pretty much the entire market.

We see this behaviour in lots of industries. Car manufacturers will nerf the lower end vehicles so that there is more 'value' in purchasing a higher end car for enthusiasts.

And to hit a little more close to home - think of all the web companies doing exactly the same thing. There are usually 3 or 4 price points, each segment operating within the same code, but features enabled for the higher price points.

I see the same thing happening here. The 'performance' of the cards are the same after this hack, but the driver software 'lights up' exposing extra features. You could say the price differences pay for the driver features.

In the case of software as a service, different pricing tiers are at least semi-justifiable because you can argue that additional featuresets require more system resources to be added to the hardware infrastructure undergirding the site. If you want the more demanding features, you must compensate the provider for the extra infrastructure hit they'll be taking. Therefore, it doesn't feel as lame as intentionally performing a simple downgrade with the hope that you'll be able to trick others into paying more money for something that's already there if you know how to remove the foam/hack the resistor/whatever.

As far as more demanding resources go, that's not always the case, but in a lot of cases it is. Think then about the various versions of Visual Studio or even Windows (Pro/Home) etc. The argument is still the same.

And with this nvidia case, the hardware hack only lights up new features in the drivers. It's exactly the same principle, and one others have shown is acceptable market practice.

Visual Studio and Windows aren't software as a service, and I suppose the argument in traditional client apps is that "the advanced features cost a lot more money to develop, so we need to recoup that cost". Neither case is as egregious as starting everything at a high-level baseline and then intentionally decreasing/damaging some parts to make them into "lower-end" models. Software is usually developed the opposite way -- you start at the baseline, get something together, and then say, "Oh, what about feature X? That'll take a while to develop..." and some guy says "Well, we can make it a premium feature". It just doesn't have the same feel as nvidia's hardware hack.

I can also see this in the obligatory car example: your car has the same engine as a nicer model but is just de-tuned with software. Is buying a Mercedes C63 AMG and flashing the ECU to unlock more horsepower effectively "stealing" the SL63 that has the full horsepower unlocked? Of course not, because you still have a C63 and not the SL. In the case of the GPU, you not only get no "professional level" support; you void your warranty (this second part probably also applies to the car).

Is a professional going to risk it to save a few bucks? I'd bet not.

Sennheiser may have very well manufactured that whole incident in order to sell more of their overpriced-at-any-level headphones. Marketing people love to make people think they are getting a steal.

I think that if I own a piece of hardware, I am the sole owner of every single chip and screw on it, including every hardware hack that it may have.

On the other side, usually you don't own the software but only a license that grants you permission to use it.

On the contrary, most individuals purchase a tangible copy of software in a retail store without any licensing terms attached at the time of purchase. (Granted: App Stores are changing this)

Shrink-wrap licenses attempt to end-run this by making the product unusable unless the user agrees to forfeit his existing property rights. The legality of these sorts of after-the-fact licenses on a traditional retail purchase is very unclear.

It's for this reason that many do not object to cracking or modifying software which they have purchased. Indeed, there's no definitive argument that one shouldn't.

I upvoted, but hope people understand that software shrinkwrap/clickwrap licenses are still a legal gray area. Yes, it's been this way for 25+ years, but it's still an exception to regular property rights.

This is a good point. When you buy hardware, you are now the owner of it so they shouldn't be able to stop you from doing it, just like once you buy a phone you should be able to unlock it or install whatever software you want on it.

> just like once you buy a phone you should be able to unlock it or install whatever software you want on it.

unless, of course you signed an agreement to not do that, in exchange for a cheaper price on the phone and some period of contract with a particular carrier...

Which is handled by contract law; it isn't clear why companies that want to do that should get a subsidy via criminal law DMCA penalties. Just like it isn't clear why white-only diners should have received free services from government thugs to throw out African American "trespassers".

yes, precisely.

What about my laptop? Do I own the RAM? Can I put any bytes I want into it? How about the ones that cause software to skip a license check?

> Can I put any bytes I want into it? How about the ones that cause software to skip a license check?

yes you can - as long as you didn't have a prior agreement not to do it. For example, you downloaded a piece of software off a torrent site. You didn't have any prior agreement with anyone about anything. The agreement between the copyright holder and the uploader who first distributed the software is the only place any obligation is valid, in my view.

Binning is good for everyone, so it seems kind of short-sighted to try to tear it down.

If people hacking their hardware were a significant threat to the practice of binning, hardware manufacturers could simply make it more difficult and dangerous to hack.

Sargun, your point was, I think, why is it that we treat software modding differently than we treat hardware modding?

The chorus of "The hardware is hardware, and we can do whatever we want!" misses the point.

It is a good point sargun. One I wish I had an answer for.

Perhaps one explanation lies in the fact that some level of risk and skill and ingenuity and research is required to actually mod hardware. So we feel like the hardware modders deserve the fruits. But we feel like the software "modders" don't, because most of them are thoughtlessly running some riskless script devised by someone else.

I am not saying this should be our attitude. I am just saying I too have the knee-jerk reaction "the hardware is the hardware" but am far less certain when it comes to software. And I don't really understand why.

It's about the volatility and potential danger to copyright holders. If something requires a hardware hack, it's always going to take that much effort to apply. There's no easy way to instantaneously apply that hack to millions of peoples' hardware with minimal human effort.

With a software hack, it's much scarier because once it gets packaged as a crack, it's four clicks to a perceived infringement on the copyright holder's rights. A barrier to entry this low makes software mods much more frightening to rightsholders.

Though the grandparent is right that these are similar practices in principle, people react differently because one is perceived as a widespread threat to the traditional mechanism of creative livelihood and the other is perceived as an advanced hack that will be done only by a couple of tinkerers. Most people are happy to provide encouragement and information to the latter group, but are more worried about the first group, as most software companies depend heavily on copyright law for their business model.

One possible future: Bits exist physically on my hardware. I get to twiddle them however I want. Furthermore, I can speak freely about having done so.

This reality is at least partly responsible for the emergence of software as a service. Can't crack what you don't have.

Rosetta Stone is a great example of this. After years of heavy piracy and failed client-side protections, their solution was to make the program web and subscription based. While rips of services like this are still possible, they're much more effort than cracking a client-side application.

This kind of stuff has been going on for a long time. I remember one of the Nvidia cards from 8 years ago could unlock extra lanes in the GPU just with a firmware flash.

Intel has been using the same die for multiple CPU SKUs for a long time now, though people have figured out how to move between the various SKUs with ease.

The one thing to keep in mind with hardware like this is that the different price points and SKUs are more than just making money. A lot of the time the SKU a card or chip is set at has to do with the yield quality of the pieces as they are manufactured. It's amazing to realize that when a new line of some cutting-edge tech comes around, they are going to have a 50% or less yield rate on their products. Sometimes they can use lower priced SKUs and take features away as the components will operate better without those features.
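For a sense of why yields matter this much, the classic first-order Poisson yield model (textbook material, not from the thread) says yield falls off exponentially with die area, which is exactly why big cutting-edge dies are so expensive and why partially defective ones are worth salvaging as lower SKUs:

```python
import math

def poisson_yield(defect_density: float, die_area: float) -> float:
    """Classic first-order Poisson yield model: Y = exp(-D * A),
    where D is defects per cm^2 and A is die area in cm^2."""
    return math.exp(-defect_density * die_area)

# With 0.5 defects/cm^2, a 1 cm^2 die yields ~61% good parts,
# while a 3 cm^2 die yields only ~22%.
```

Every die rescued by fusing off a defective region is a die that would otherwise count against that already painful yield number.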

There were unlocked lanes as well as the ability to use Quadro firmware with 8800GTS cards (which offered improved IQ in apps like Autodesk 3ds). As you allude to though, I'm pretty sure that the professional workstation cards are of a better binning.

I see it this way: were they giving the cards away for free, they wouldn't ever disable features to create different models. In this magic world everyone would get to use 100% of what the most current technology can do. The only reason they limit their cards is to extract more money out of the economic system. It feels wrong to us because brains don't reason via market economics. You can feel the waste in this solution, but it's the inefficiency inherent in the market economy.

With SaaS it sometimes feels kind of different probably because we really do see it as a service (as it states in the name, btw.). and feel that the company offering us the service deserves more compensation for more things they do for us.

> it's the inefficiency inherent in the market economy.

i think this inefficiency exists because there is no perfect competition (i.e., anyone can start a semi-conductors company and compete in the market).

We should push for a future where there _is_ perfect competition, and this problem will resolve itself. For example, 3D printing is a viable method of reaching such a future, if the printing tech keeps increasing in fidelity and durability etc.

Well, the original reason given by the modder is to unlock under one OS the same functionality available under another - that is all.

You're already paying an over-inflated price, and they want you to pay an even greater inflated price. It's the manufacturer's and software developers decision to bundle another product within each other. They are already charging too much for this GPU, and they try to squeeze the most profit they can from every chip.

It feels similar to unlocking software because, despite the different legal structure (ownership vs. license), the source of the difference is similar, as you point out: you have been given a purposefully broken (i.e. effort was expended to reduce its usefulness) product.

One would be tempted to argue that this costs the company the sale of an otherwise higher-priced product, but the only way to prevent it would be entirely separate design and/or fabrication for the two tiers of components. This is far costlier than the price gap afforded to a handful of technical, risk-taking customers. Unless the ease of the transition is reduced to "Flip the big red switch from 'Fast' to 'Faster'!", this sort of missed sale amounts to a rounding error for Nvidia, and certainly does not even match the cost that would be necessary to protect against it.

Hardware is hardware. If I buy hardware from you, I can do whatever I want with it.

I see no problem with either side, Nvidia or the hardware hacker.

As someone who spent nearly a decade in the semiconductor industry, testing wafers and packaged die, the economics of binning and market segmentation are understandable. As an electronics and RF geek, I've done plenty of mods and customizations that voided plenty of warranties.

The only issue would be if a third party did this mod in volume and repackaged the card as something it was not originally.

Wouldn't it be easier (and less risky to your hardware) to make the kernel lie about the device ID? Or modify the nvidia kernel blob to compare against another ID?

I'm sure they tried that. It is likely that the driver uses another "proprietary" way to read the ID as well and uses that as definitive answer.

Having said that, distinguishing in software would mean that open source drivers such as Nouveau won't see any difference between the cards and can advertise all the features always.

I suspect it might be possible by modifying the device field in the pci_dev struct [1] in include/linux/pci.h when the device is initialized.

[1] http://www.makelinux.net/ldd3/chp-14-sect-6

So this is basically a change of the PCI device ID. The capabilities are essentially the same; he just changed the device signature. A cool hack nonetheless.

Edit: After looking at the screenshots closer, that seems to be the case.

The device ID 0x11BA becomes 0x118F. Vendor ID remains 0x10DE on both. Memory, CPU and other important stuff are basically the same.
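For anyone wanting to verify this on their own Linux box, the reported IDs are exposed through the standard PCI sysfs attributes; a small sketch (the device path you pass in is just an example, and will differ per machine):

```python
# Read the vendor/device IDs a PCI device reports, via the standard
# Linux sysfs attributes. The path is hypothetical, e.g.
# /sys/bus/pci/devices/0000:01:00.0 on a typical system.
from pathlib import Path

def read_pci_ids(dev_path) -> tuple:
    """Return (vendor_id, device_id) as ints for a PCI sysfs directory."""
    vendor = int(Path(dev_path, "vendor").read_text().strip(), 16)
    device = int(Path(dev_path, "device").read_text().strip(), 16)
    return vendor, device

# Before/after the strap mod you'd expect vendor 0x10DE in both cases,
# with the device ID changing as described in the screenshots.
```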

Very cool!

Maybe the configuration is a little different too.

If the professional counterpart is actually worse for gaming, perhaps it's because the resources are allocated to supporting multiple displays instead? In that case, is it reasonable to consider this crippling the device?

Professional vid cards are usually about multiple displays so saying that's crippling might be a bit harsh IMO; I'd say it's "differently allocating" resources.

Is it worse for gaming only because of multiple displays (i.e. an actual difference in architecture), or is it different in how the functionality is offloaded by the software based on the device ID? Until there are benchmarks of an actual game being played before and after, it would be pretty hard to tell.

The professional versions of NVidia cards generally tend to be more conservatively clocked than the gaming ones. Obviously that affects performance.

It's funny how we feel about these things:

A fast-food monopolist adding too much salt to its cheapest food products so it can sell less salty versions of the same products at premium prices ... seems severely pathological.

A hardware monopolist crippling its cheapest products so it can sell non-crippled ones at premium prices ... sort of ambiguous. Legit market strategy or conning customers?

A graphic design software maker selling crippled versions of its software cheaply so it can sell the fully featured version at a premium price ... totally legit.

What's the difference?

You keep using that word "monopolist." I do not think it means what you think it means.

NVidia may not qualify for all that they're the stronger partner in a duopoly, but Intel's done this and much worse with impunity.

I know it's a bit inappropriate here. I mean a provider, or clique of providers, entrenched in the market and large enough to shape it.

The term you may be looking for is oligopoly - http://en.wikipedia.org/wiki/Oligopoly

Ah, so an oligopolist.

> Hardware monopolist crippling cheapest products so it can sell non-crippled ones at premium prices ... sort of ambiguous. Legit market strategy or conning customers?

How is anyone conned if the specs are honest?

How do people find hacks like this? Looking at the board, that resistor is one among tons of components.

He actually describes it in the thread, a little farther down:

"No, no schematic, what I did was look for resistors that looked like they had an alternative position, have a look at the photos and you will see what I mean. Any that I suspected of being a strap I used a meter to check if the resistor was connected to ground of 3.3V directly, and looked where the general traces were going in the area. If they went towards the GPU and connected to one of the rails it was a pretty good bet that it was a hard strap."


I once saw a networking IC that had a key strap-in pin multiplexed with an indicator LED, so if you wanted to change the brightness of the LED by varying the resistor value, watch out because you'll change the device behaviour! Good times...

Most likely inside or semi-inside info. Obviously anyone with a schematic and datasheet at any of the dozens of OEMs will be able to figure this out. But even without docs there were probably bunches of reference boards shipped out to integrators. It's not hard to visually diff a "Quadro" from a "Tesla" version of the same board.

What I'm more surprised by is the fact that this was done by external resistors at all. Almost all chip configuration like this these days is done with on-die fuses that can't be hacked.

If you read some of the comments, essentially: a) looking for resistors that could be placed in multiple positions on the board and b) following the traces from the resistors: going to GPU? probably not what you want. Going to PCIE interface? Probably correct.

You'll find that adding the resistor changes the ID the card sends to the drivers, and that there are other differences in the hardware (so it doesn't change the 'model' of the card). It just so happens that it enables 3 monitors in Linux (this is already enabled in Windows).

Entire post mirrored here: http://imgur.com/ZJt0VmT Save it as well.

Ow, my eyes, between the microscopic font and the JPEG artifacts.

HTML source of the print friendly version: https://gist.github.com/anonymous/5193769

Or less artifacty screenshot: http://i4.minus.com/itHdo0GR7zXw1.png (Sorry to steal any thunder!)

Not affiliated, but minus.com doesn't compress high-res PNGs into JPEGs like imgur does (it's only at a certain size that imgur does it, but it's annoying for screenshot threads, or in this case).

Here is an optimized version of that image: http://i4.minus.com/ivpBgQ8dNf5Ge.png (smaller filesize, loads quicker).

Why did you save the image with transparency?

Because that's what scrot did?

This is simply amazing! And since it's an actual hardware mod, Nvidia can't fix this right? On the other hand, those cruel corporate brats and their money making schemes.

I am astonished there is no law to put you away for 10 years for posting such disrupting stuff ;)

Please don't joke about this. Perception of ownership has changed so much over the last ~20 years, that I wouldn't be surprised if someone actually tried to jail people for improving hardware they bought, by claiming that you only 'license' the device instead of having bought it, or some other contorted reasoning (give it to the IP maximalists to come up with twisted lines of thought and sad analogy stories). Luckily NVidia isn't that sue-happy, but still...

Edit: see also this, currently on the frontpage: https://news.ycombinator.com/item?id=5394928

Laughing is a coping mechanism.

I am well aware of the seriousness of the assault on our freedoms on behalf of the corporate overlords.

And I would rather be chided for being ridiculous writing this comment than rewarded with an uneasy laugh of the people that recognize how close to home it hits.

Some people are comparing what NVIDIA has done to its hardware to crippled software. It's a bit different though. They didn't just cripple their hardware, they segmented their markets and crippled hardware and software as needed to make it almost impossible to use the same card for more than one application.

Geforce cards are crippled in software so you wouldn't want to use them for CAD work. Quadro cards are crippled in hardware and software so you wouldn't want to use them for gaming or computation, respectively. Although Teslas are also stress tested for data reliability so perhaps it's unfair to say the Quadro cards are unsuitable for computation because they're crippled intentionally.

Keep in mind that one of the finer points of making money in the semiconductor industry is yield management, especially when you're cutting larger dies.

These days you don't just toss a chip when it fails a test - you design the tests to exercise different physical regions of the chip to identify the location of a defect. You also design the chip in a way that allows you to power down different regions and behave like a lower priced part.

So a defective quad-core CPU might be sold as a dual-core part, or as a variant with a smaller cache. You have slightly finer grain control on a GPU, and very fine grain control on RAM or flash.
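The binning logic described above can be sketched as a simple decision function. This is a hedged illustration: the SKU names, core counts, and cache thresholds are invented for the example, not any vendor's actual test flow.

```python
# Hedged sketch of yield binning: thresholds and SKU names are
# invented for illustration, not a real vendor's test flow.
def bin_die(working_cores, working_cache_kb):
    """Map per-die test results to a sellable SKU, if any."""
    if working_cores >= 4 and working_cache_kb >= 8192:
        return "quad-core / full cache"
    if working_cores >= 2 and working_cache_kb >= 4096:
        # Fuse off the defective regions and sell it as a lesser part.
        return "dual-core / reduced cache"
    return None  # scrap

print(bin_die(4, 8192))  # fully working die -> top SKU
print(bin_die(3, 8192))  # one bad core -> sold as dual-core
print(bin_die(1, 2048))  # too defective -> scrapped
```

The point is that a die with a localized defect still generates revenue as a lower-end part instead of being thrown away, which is why fusing off regions is standard practice.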

It was possible to flash certain ATI 6950 cards "into" 6970s too: http://www.techpowerup.com/articles/overclocking/vidcard/159

Not too surprising - classic example of price discrimination: http://en.wikipedia.org/wiki/Price_discrimination.

Think of any product you see out in the world - and I guarantee you that there is some level of price discrimination going on. It really is very expensive to develop completely different lines of products - any rational company would just slap on a few restrictions and a different brand name and bam! whole new market being served, with the existing one remaining in play.

It's actually quite elegant.

He's changing the resistors so that the bits which identify the board to the system now identify it as something else. So if there is in fact some difference in the board or chip design, it could cause problems where unexpected faults occur. It's like saying you are capable of doing X, Y, and Z when really you can only handle X and Y; when someone asks you to do Z, you're screwed.

But I don't know enough about Nvidia GPU hardware design, and I'm sure he wouldn't be publishing this hack if it didn't work well.

I thought the common knowledge on why the "workstation" cards were more expensive than the gaming cards was that you're paying for the driver development for the pro cards (extra stability testing, CAD software compatibility or whatever). What this guy is doing here is just changing what driver the card tells the computer to use.

So really, this is equivalent to software piracy/unlocking software with a product key. He's not changing the actual capabilities of the hardware, just what driver software it unlocks.

Kind of reminds me of back in the day when you could flash your ATI X800 card (12 pipelines) to upgrade it to the 16 pipelines of the X800 XT model (http://www.techpowerup.com/articles/overclocking/vidcard/100). This corresponds with the quote in the top comment: "It is far cheaper to make one very good chip for the highest market, and modify it slightly for lower end markets..."

Why not patch the driver in software instead?

1. This is way easier.

2. You are unlikely to be breaking any bizarre U.S. reverse engineering law by doing it.

I would not be surprised if he runs into stability issues.

Silicon manufacturers have long used the approach of manufacturing only the top-of-the-line chip and then, after testing, deactivating parts of it to sell as a lower-end product. It was famous with AMD processors, where you could sometimes unlock more cores. The thing is, those cores were often disabled for a reason. I would not be surprised if a similar scheme applied here too.

It works under windows, with NVIDIA provided drivers. Just not under Linux.

"the K5000 is only single GPU and clocked some 25-30% slower then the gaming card"

Makes sense. Quadro cards are used for life critical visualizations, e.g. finite element analysis of a bridge. That's what justifies their expense. Rendering speed only needs to be "fast enough." The premium is on accuracy and reliability.

I thought the premium in Quadros these days is in double-precision performance. Since I need floats, not doubles, I opted for two GTX 680s. They are screaming compared to a Q6000, but I could always use more.

If an application uses floats in lieu of doubles might that be evidence that accuracy is not at a premium?

Indeed, 53 significant bits (doubles) vs 24 significant bits (floats). In my case float performance was what I needed. There are always other options, depending on your (precision) needs and needs in general, such as binary coded decimals, fixed point arithmetic...
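The 53-bit vs 24-bit difference is easy to demonstrate: an increment that float64 resolves simply vanishes when rounded to float32. A minimal sketch using the standard library's `struct` round-trip to emulate float32 (Python's native float is float64):

```python
import struct

def to_f32(x):
    """Round a Python float (IEEE-754 double) to the nearest float32."""
    return struct.unpack('f', struct.pack('f', x))[0]

tiny = 2.0 ** -30          # well below float32's 2**-23 resolution near 1.0

print(1.0 + tiny > 1.0)            # True: float64 (53 significant bits) resolves it
print(to_f32(1.0 + tiny) > 1.0)    # False: float32 (24 significant bits) rounds it away
```

Near 1.0, consecutive float32 values are 2**-23 apart, so anything smaller than half that gap rounds back to 1.0, while float64 keeps resolving increments down to 2**-52.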

I wonder if this can be done for Intel CPUs? I am sure one has to open the metal cover and change the resistors, since the device id comes with the complete package.

Off topic: it's sad that GPUs are so bulky, dissipate so much heat, and are so expensive; it discourages me from buying one to play games once in a while.

I don't understand why he can't just modify the driver if he's going to go through all that trouble. Also, NVidia does support 3 monitors under Linux which was the point of this hack so this is confusing on many levels!

This probably explains why Nvidia doesn't release open source versions of their drivers.

Oh, this is a classic move by Nvidia. I had a gaming PC with two GeForce 8800 GTX cards in it. I used a softmod to get them recognised as Quadro 4600s so I could work better in SolidWorks.

NVidia makes decent video cards and such, but they're almost all based on a single reference design. It saves money on component costs and production time for third parties. Why make three different boards, using three different GPUs, when all you need to do is determine how much RAM to stick on it and choose the component layout for the specific product?

When dies come out of the fab, they are binned into different lots based on test results, i.e. how well each die performs and where any defects fall. While one lot may not work as the top-of-the-line model, it may meet the requirements for a lower-power unit. Buyer beware: while you can possibly overclock your video card, you certainly run the risk of a higher number of issues.

Strange, I have seen another developer use 3 displays on Ubuntu with just a GTX 660 Ti. He said he plugged it in and it worked.

This is why you don't piss off your userbase. Also this should be mirrored as this will be taken down fast.

Wonderful! While I hesitate to break a $1,100 video card, a little hack generates so much value.

What other awesome hardware hacks radically improve the value of your equipment, be it digital or analog? We're talking 100%+ value improvements here.

I guess this is why GPU makers hate open source drivers so much: they would undermine artificial segmentation and drive prices down if they were competitive with the proprietary ones.

For the past few months my GeForce 450 drivers have been acting up, and my computer was actually unusable until I disabled just about all hardware acceleration for the card.

Is it possible that the new drivers for this card are so unstable just because nVidia is trying to thwart this type of hack?

If that's the case, totally unacceptable to cripple my machine just because they want to prevent other people from altering their cards (which I will immediately begin to look into how to do myself now).

That's a very bold assertion. I don't think they would push out a driver that recklessly just to stop one hardware hack. Besides, this was posted within a few days of him posting on the nVidia forums, and you said yourself that the issues with your card started a few months ago.

Even if they knew about it beforehand, it's not like a critical vulnerability or something; just an interesting hardware modification (rarely how intrusions happen) that very few people know how to implement correctly, and fewer still would want to do to their cards.

After reading a little more, it's clear that my original post was incorrect.

Even the original tone of "crippling" hardware seems a little far-fetched to me now; it seems like these cards are just optimized differently for specific tasks (gaming or workstation).

Still, it would be nice if they could get their drivers working better under Windows 7.

Does this improve GPGPU performance? I believe the GTX range was crippled in this regard; has anyone tried bitcoin mining?

I can't wait for someone to work out how to make the single GPU version of this work in a MacPro.

Is this caused by the fact there are only a few companies making graphics chips?

I repeat, this does NOT make your GTX 6XX card faster, nor does it make it slower.

So what is the difference then?

As the article clearly states the function of the hack is to allow the card to run with alternative drivers, providing lacking functionality under Linux.

Off-point comment ahead: I am wondering, is Bing smarter than Google? I searched for "how is 0x11BA equal to 20k resistor" and got nothing on Google, but Bing returned at least some useful pages!

If they only need to fool the drivers, you could patch the drivers themselves in software.
