The gamer problem is exaggerated. While GPUs are definitely hard to buy in quantities > 1, and prices are definitely much higher than 6 months ago, pre-built gaming machines (Alienware, Lenovo Cube, iBuyPower, etc.) are still affordable. It's almost absurd that these computers now sell for less than the sum of their parts (especially the RAM and the GPU).
So, one thing has definitely changed for gamers - it's actually much cheaper now to buy a whole gaming PC than to build one. So much so that "crypto enthusiasts" now buy these machines, swap out the GPUs for low-end versions, then put them on eBay.
I'm not being hyperbolic: $300 in 2015 bought you a better graphics card than $300 does in 2018.
In normal times, a card that cost $300 in 2015 would cost $150-180 in 2018. It's easy to see, then, that a $300 card in 2018 must be generations ahead.
However, today an nVidia GTX 1060 will run you around $350 if you can find one in stock, and its performance per dollar is literally inferior to, say, an AMD R9 390 from 2015, which could be found for under $300.
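The claim is easy to sanity-check with rough numbers. A minimal sketch, using the street prices quoted above; the FPS figures are hypothetical placeholders for illustration, not measured benchmarks:

```python
# Rough FPS-per-dollar comparison. Prices are the street prices quoted
# above; the FPS numbers are made-up placeholders, not benchmark data.
cards = {
    "GTX 1060 @ $350 (2018)": {"price": 350, "fps": 60.0},
    "R9 390 @ $290 (2015)": {"price": 290, "fps": 55.0},
}

for name, c in cards.items():
    # value = frames per second per dollar spent
    print(f"{name}: {c['fps'] / c['price']:.4f} FPS/$")
```

With any plausible pair of numbers where the two cards trade blows, the cheaper 2015 card comes out ahead on FPS per dollar at these prices.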
Very rarely in tech do we find ourselves in a market where three-year-old performance costs more today than it did at launch.
Would you be okay paying $600 or $700 for an iPhone 6 or LG G3 today? It's shocking when years and years go by and your dollar buys no more than it did before.
It's like the trends flipped on their head. CPUs were barely improving from 2011 to 2016, but now Ryzen has forced Intel's hand, and suddenly in a year the jump is substantial (20% per thread, with 50-100% more cores). That being said, now that we have the "new normal", aside from the 7nm jump in late 2019-2020 I don't anticipate substantial gains in CPUs going forward. 6-8 cores should become the average high end, but otherwise it's business as usual, hopefully with Spectre and Meltdown mitigations.
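Taking those figures at face value, the compounded uplift is worth spelling out. A back-of-envelope sketch, assuming an ideally parallel workload (a best case, not a measurement):

```python
# Combine the claimed ~20% per-thread gain with 50-100% more cores.
# Assumes perfect multithreaded scaling, which real workloads rarely hit.
per_thread_gain = 1.20

for core_factor in (1.5, 2.0):
    total = per_thread_gain * core_factor
    print(f"{core_factor:.1f}x cores -> {total:.2f}x total throughput")
```

So "20% per thread with 50-100% more cores" works out to roughly a 1.8x-2.4x multithreaded jump in a single year, versus the near-stagnation of 2011-2016.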
Likewise, GPUs had explosive performance gains from 2012 to 2015, but the 1000 series from Nvidia has now been out for almost 3 years with no new high-end GPUs, and AMD's last full suite of GPUs was the 300 series, 3 years ago as well. My understanding is the Vega cards are barely a sidegrade from Fiji, which was the last substantial improvement over the previous 290X series (and those were a substantial improvement over the 7900 series).
- If I want to spend roughly the same amount of money I've spent on graphics cards before (~200 €), I can't, because that price point does not exist. I could drop 100-150 € on something that's pretty much the same (±10 %) as my current card, though.
- If I want a significant upgrade, I'd have to at least spend twice as much as I want to, likely more (400-500 € range).
The price range that most gamers (at least those I know) buy their cards at simply does not exist for the time being.
That's true regardless of vendor. Even if I wanted to buy nVidia, the exact same thing applies to their lineup. There are cards <150 €, and then there are cards >>300 €.
A few years back I used to joke about why there were so many models of everything; why would they need a dozen or more models to cover the price range from 50-xxxx € in 30 € steps? That "complaint" seems ironic in retrospect...
I live in Japan, and last year I bought a GTX 1060 (6GB) for around 32000 Yen (a little under $300). This past weekend I was browsing around some computer shops and I found one shop with the same card I bought less than 12 months ago for 55000 Yen (~$510). They also had a GTX 1070 Ti for 77000 Yen (~$710), which is significantly above MSRP.
Every other shop I visited was sold out of everything above the GTX 1050. Those cards' prices haven't been affected much during the last few months. There are also no AMD cards in sight besides the super low-end ones.
Markets are markets.
Worse yet, if Nvidia does invest in more card production, they run the risk of bringing it online just as the crypto market crashes, when there will be much less demand for cards and a flood of old mining cards on the used market.
Worse yet, if gamers can't buy your cards you lose loyalty (can't build it with newbies and old hands slowly forget). If you don't believe that gamers are unhappy then you simply haven't been paying attention.
This is an appalling situation for Nvidia (and AMD - all the same reasons apply to them too).
Are you saying they shouldn't manufacture & sell more cards now because this will cost them future sales? Can't they just put that extra income in a bank and earn nice interest on the profits shifted from the future to the present by that crypto craziness?
How’s that relevant? nVidia has no chip fabs.
They use third-party foundries. Currently TSMC, but if TSMC is overbooked there are others; e.g., for the current-gen GPUs nVidia also considered Samsung but apparently got a better deal from TSMC.
As someone who’s trying to upgrade my gpu in the context of video editing, this situation sucks.
My SSD and memory intensive workload infrastructure increased 60% in cost after our deal dissolved.
Besides, some people who develop would probably use a 'scaled-down' system to test on anyway.
Maybe NVIDIA will launch a gamer registration system, that would help the gamers actually get their hands on some hardware.
(Ethereum is ASIC-resistant, meaning any ASIC needs to look a lot like a GPU in terms of a powerful memory subsystem, which is much more difficult to optimize. It has never been impossible to improve the efficiency of the processing side of things - after all, a GPU is a kind of ASIC. You can think of these as Ethereum-optimized GPUs. The fact that these ASICs are only 4x as efficient as a general-purpose GPU says that the ASIC-resistance is actually working, in comparison the first Bitcoin ASICs were thousands of times as efficient as GPUs.)
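To make the "powerful memory subsystem" point concrete, here is a toy sketch of a memory-hard mixing loop. This is not real Ethash (the real DAG is multiple gigabytes and regenerated each epoch); it only illustrates the data-dependent lookups that make the workload bandwidth-bound rather than compute-bound:

```python
import hashlib

# Toy stand-in for the multi-GiB Ethash DAG: a pseudo-random dataset.
DATASET_WORDS = 1 << 16
dataset = [(i * 2654435761) % (1 << 32) for i in range(DATASET_WORDS)]

def toy_mix(header: bytes, nonce: int, rounds: int = 64) -> int:
    """Hash a header+nonce, then repeatedly fold in random dataset reads."""
    seed = hashlib.sha256(header + nonce.to_bytes(8, "little")).digest()
    mix = int.from_bytes(seed, "little")
    for _ in range(rounds):
        # Data-dependent lookup: the next address depends on the current
        # mix value, so reads can't be predicted or prefetched; at real
        # DAG sizes this makes memory bandwidth the bottleneck.
        mix ^= dataset[mix % DATASET_WORDS]
        mix = (mix * 1103515245 + 12345) % (1 << 64)
    return mix
```

An ASIC for this workload still has to serve the same stream of random reads, which is why it ends up "looking a lot like a GPU" and only modestly beats one.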
That's assuming it's Ethash-specific, though. I guess you could also mine Ethereum Classic with it at least (are they planning to move to Proof-of-Stake? I haven't kept up).
Many Chinese miners are using electricity that's nearly or actually free; they will keep trying to recoup their investment even if it sends small miners into the red.
On top of that, most Ethereum people are mining with AMD GPUs, and AMD GPUs have garbage efficiency at all the other coins. Ethash is literally the only one they do with reasonable efficiency; NVIDIA is ~2x more efficient at everything else.
(numbers from WhatToMine.com and include undervolting for all cards, and BIOS mods for AMD cards)
(bandwidth consumption is what makes Ethash memory-hard/ASIC-resistant - same principle as the tuning parameters on bcrypt/scrypt, designed to make it difficult for an attacker to scale their processing)
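Python's standard library exposes exactly those scrypt tuning knobs, so the principle is easy to demonstrate (requires Python built against OpenSSL 1.1+; the parameter values here are arbitrary examples, not recommendations):

```python
import hashlib

# scrypt cost parameters: n = CPU/memory cost, r = block size,
# p = parallelism. Memory required is roughly 128 * r * n bytes, so
# raising n or r forces every implementation, honest or attacking,
# to touch more memory per hash.
cheap = hashlib.scrypt(b"hunter2", salt=b"some-salt",
                       n=2**10, r=8, p=1, dklen=32)   # ~1 MiB per hash
costly = hashlib.scrypt(b"hunter2", salt=b"some-salt",
                        n=2**14, r=8, p=1, dklen=32)  # ~16 MiB per hash

print(cheap.hex())
print(costly.hex())
```

Ethash applies the same idea at a much larger scale: instead of megabytes per hash, the working set is a multi-gigabyte DAG.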
The 1070 Ti is basically a 1080 with GDDR5 (dual-pumped) instead of 5X (and minus one SM), and it does much better, along with the 1070. The 1080 Ti has enough bandwidth to brute-force it even while throwing half of it away.
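The bandwidth gap is straightforward to compute from the published memory specs (bus width in bits times per-pin data rate):

```python
# Peak memory bandwidth = (bus width in bits / 8) bytes * data rate in Gbps.
# Bus widths and data rates below are the published specs for each card.
cards = {
    "GTX 1070 Ti (GDDR5, 256-bit, 8 Gbps)": (256, 8),
    "GTX 1080 (GDDR5X, 256-bit, 10 Gbps)": (256, 10),
    "GTX 1080 Ti (GDDR5X, 352-bit, 11 Gbps)": (352, 11),
}

for name, (bus_bits, gbps) in cards.items():
    print(f"{name}: {bus_bits // 8 * gbps} GB/s")
```

That works out to 256, 320, and 484 GB/s respectively, which is why the 1080 Ti can afford to waste bandwidth and still keep up on Ethash.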
There are other nascent cryptocurrencies that may make an effort worthwhile.
Vitalik can't actually deliver this product. He's a teenager who made a Bitcoin clone with a different hashing algorithm, not a super-genius software engineer. He's way out of his depth and he's not delivering.
It's even sillier to take him seriously at this point in time, and Bitmain is putting their money where their mouth is. The only real question is whether Vitalik will hardfork and screw up their implementation... but then next time Bitmain will just keep it in-house and not tell anyone. They already pre-mine with them before selling to customers; business is business.
(remember, Bitcoin includes a scripting language for smart contracts too! It was just partially disabled due to bugs in the original implementation that nobody ever bothered to fix. And nowadays the development process has dragged to a halt from infighting between various stakeholders; they are effectively incapable of making significant decisions. https://en.bitcoin.it/wiki/Script )
Most of the time it's an earthquake in Asia that takes power plants offline and impacts supply.
(DRAM/flash prices have been on a continual incline because capacity growth has not been matching expected demand growth, and this was expected to continue through next year. China told the DRAM cartel to knock it off or they'd bring state-sponsored fabs online to fill the gap... and a few weeks later Samsung signed a memorandum of understanding and now prices are expected to decline throughout this year. Just a stunning coincidence /s)
And fab time on cutting-edge nodes is expensive as well.
For example, Bitmain is buying up more 16nm fab time than NVIDIA right now, so certain classes of high-performance silicon, like 100 GbE switches, may be in short supply as well.
Also, after a while certain enterprise hardware ends up being worth a fortune as desperate IT guys try to buy spares for critical systems that haven't been upgraded for whatever reason.
Then there's stuff like Raspberry Pis, though they aren't really a computer part per se. That being said, computer parts do occasionally sell used for more than their list price.
It's happening, sadly... RAM prices are still stupid.
And I thought the prices were a joke, but an AMD RX Vega 64 was priced between $1,100 and $3,000 on 2/13/17.
As of today, the card I bought costs over $720.
I paid just over $300
On eBay, 580s are going for about $350 at the moment. That's still more than double what they were going for a year ago (you used to be able to reliably get the 8 GB model for $175), but it's not $720 either. That's 1080 money.
edit: or incredulous at how much they cost now
I've basically been using these as space heaters throughout the house, in the tool shed, etc:
Now I just need to figure out what to do with all the excess heat in the summer. Might heat the swimming pool. :)
Come on, don't let him get away with that lie. More miners bring no benefits to the ecosystem.
That's not a great investment considering you could just trade crypto itself and be far more liquid.
There is also stock available on the aftermarket, including whole rigs for sale.
No logging on the server side is guaranteed, as it is your own server. AWS could monitor, but I don't mind that.
I pulled out the ancient ATI video card and sold it for $50 on eBay.
Which means I got an absolute steal on my recent PC build, all things considered.