Hacker News new | past | comments | ask | show | jobs | submit login

PC gaming is a struggle for me right now. I have an older rig which I would love to upgrade but simply can't justify the cost. It is pretty frustrating when my friend's Xbox can run 4K games but the only comparable GPU is north of $500.

The ML and crypto hysteria, along with AMD's unwillingness to compete, is not healthy for the PC gaming industry long term. It is certainly keeping me out of it.

Most PC gamers don't play in 4K at even 30 frames per second, much less 60 fps. That's graphically taxing for many contemporary games, especially if other graphics settings are turned up.

The consoles also typically don't run at full resolution. In the 1080p era, I recall many games topping out at 900p. It was a great source of PC gamer smugness, since most of them could easily run 1080p at 45+ fps. I suspect modern console 4K is the same.

Since the generally accepted frame rate for consoles is 30 fps, which is pretty low, developers can compromise on resolution or on other things, like how many entities can be on screen at once or how large open areas can be. I suspect catering to console audiences is why, over the last ten years, a lot of games have felt either empty or constrained in scope.

> The consoles also typically don't run at full resolution. In the 1080p era, I recall many games topping out at 900p. It was a great source of PC gamer smugness, since most of them could easily run 1080p at 45+ fps. I suspect modern console 4K is the same.

Consoles tend to do dynamic resolution - they'll drive the display at 4K (together with the UI), but the actual rendering resolution drops and adapts to the load. It actually looks pretty good - the UI and text stay sharp, while you generally don't notice the resolution drop that much. Unfortunately, most games lose this capability on PC, so you're stuck with blurry-looking text if you have a good monitor or TV :(
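The feedback loop behind dynamic resolution is simple enough to sketch. This is a minimal, hypothetical illustration (the frame-time budget, step sizes, and names like `adjust_render_scale` are my own assumptions, not any engine's actual API): when a frame overshoots the budget, the internal render scale drops; when there's headroom, it creeps back up, while the UI is always composited at native resolution.

```python
# Toy sketch of dynamic resolution scaling against a fixed frame-time budget.
# All constants and names here are illustrative assumptions.

TARGET_FRAME_MS = 16.7   # budget for 60 fps
MIN_SCALE, MAX_SCALE = 0.5, 1.0

def adjust_render_scale(scale, last_frame_ms):
    """Drop internal render resolution when over budget, recover when under."""
    if last_frame_ms > TARGET_FRAME_MS:
        scale -= 0.05          # render fewer pixels next frame
    else:
        scale += 0.01          # creep back toward native resolution
    return max(MIN_SCALE, min(MAX_SCALE, scale))

# The UI would be composited at native 4K regardless of `scale`, which is
# why text stays sharp even while the 3D scene resolution drops.
scale = 1.0
for frame_ms in [20.0, 19.0, 15.0, 14.0]:
    scale = adjust_render_scale(scale, frame_ms)
    print(round(scale, 2))     # → 0.95, 0.9, 0.91, 0.92
```

Real implementations are fancier (they predict GPU load rather than just react, and scale in finer steps), but the reactive loop is the core idea.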

The other trick they use is checkerboard rendering - the console only renders approximately half the pixels each frame. The partially rendered frame is then combined with the previous one, with added filtering. The result is not unlike the interlaced CRT TV output of 80s-era consoles/micros - just with better filtering. Again, this capability tends to disappear in PC versions of those games.
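The half-per-frame idea can be shown in a few lines. This toy version (my own simplification; real checkerboard rendering works on 2x2 pixel quads with motion-vector-aware reconstruction, not a plain copy) renders only the pixels whose coordinates match the frame's checkerboard parity and reuses the previous frame for the rest:

```python
# Toy illustration of checkerboard rendering: each frame computes only half
# the pixels (alternating checkerboard parity) and fills the gaps from the
# previous frame. A simplified sketch, not a real console implementation.

def render_checkerboard(scene, prev_frame, parity):
    """Render pixels where (x + y) % 2 == parity; reuse prev_frame elsewhere."""
    h, w = len(scene), len(scene[0])
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            if (x + y) % 2 == parity:
                row.append(scene[y][x])        # freshly rendered this frame
            else:
                row.append(prev_frame[y][x])   # reused from last frame
        out.append(row)
    return out

scene_a = [[1, 1], [1, 1]]                     # what frame 1 should look like
scene_b = [[2, 2], [2, 2]]                     # what frame 2 should look like
frame = [[0, 0], [0, 0]]                       # start from a blank frame
frame = render_checkerboard(scene_a, frame, parity=0)   # [[1, 0], [0, 1]]
frame = render_checkerboard(scene_b, frame, parity=1)   # [[1, 2], [2, 1]]
```

Each output frame is half new data and half stale data, which is exactly why the filtering step matters: it hides the seams, much like deinterlacing did for CRTs.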

Lack of those two things means that in a lot of cases games on PC look/run worse on equivalent hardware than they do on the console :/

Some newer first-party Microsoft games have dynamic resolution scaling in the PC versions. I thought Battlefield 1 / V had it, but apparently not.

It works well in Forza Horizon 4. And yeah, I wouldn't mind the option for checkerboard rendering on PC. I understand purists want nothing to do with it, but hey, too many options is what PC is all about.

While 4K is still somewhat far away on a reasonable set-up, a surprisingly large amount of games can comfortably push 3440x1440@60.

Most gamers don't play in 4K, but a large number (maybe most? Surely the largest minority) on PC probably get 60 FPS. You can look at the Steam hardware survey results here: https://store.steampowered.com/hwsurvey/Steam-Hardware-Softw...

I've been holding out to do a big upgrade myself, but I got a Gigabyte GTX 1050 Ti last year for $147 as a comparable replacement for my dying AMD 7950. The same card today is $190 (on Amazon). It's more than capable at 1080p, and does just fine for some games at 4K too (or I survive with 30-50 FPS and/or tweak the graphics options; "high" is usually acceptable even if it's not "ultimate").

The new Xbox One X can do 4K pretty reliably, but that's $500, plus $60 per game. Bringing your aging PC (mine is nearly a decade old) up to par (and often beyond) graphics-wise is a cheaper investment in a GPU, and games on PC are routinely on mega sales.

30 FPS? AHHH mah eyes! ;-)

Eh. I can get similar performance as an Xbox One X out of my not-built-for-4K gaming computer if I too lower all my other graphics settings. Soul Calibur VI, for example, runs around PC's "low" settings on an Xbox One S and around "medium" on an Xbox One X.

Instead I play at 1440p and 144FPS with most of the knobs in most games turned up near max (and my graphics card is a couple revs out of date).

I agree. I used to build my own PCs and I was a big-time PC gamer, but these days I struggle to find the energy for it. I recently bought an Xbox One for $200, which is half what my PC GPU cost. Not to mention I prefer working on a laptop, but laptops that have beefy GPUs and CPUs tend to be heavier and more expensive with worse battery life. So to play PC games, I have to have a desktop in addition to my laptop, which doubles the cost and in effect just turns my desktop into a really expensive and oversized game console anyway.

It's so much easier, cheaper, and more convenient to have one box dedicated to playing games and one box dedicated to doing work. Instead of a $2000 laptop, or a $1000 laptop and a $1000 desktop, I can have a $1000 laptop and a $200 Xbox.

It's a shame the "PC Master Race" thing has become such a meme lately, fragmenting and radicalizing the gaming community. All I care about is playing video games to relax and enjoy myself. It doesn't matter one bit how I accomplish that goal.

I bought a Dell "gaming edition" laptop last year for about 700 bucks. It's my daily driver, dual-booted with Ubuntu and Windows, and it handles any modern AAA game at 30+ fps.

To me, the PC Master Race meme started with FPS games (CS 1.6/Source/GO vs. CoD on console), the cultures around the various platforms, and how video games in general just seem to be more enjoyable with a mouse and keyboard than with a joystick (not to mention you can be much more accurate with the crosshairs). The only exceptions to me are fighting and driving games.

The meme then slowly morphed into this pseudo-elitism about graphics and performance, but most "pc master racers" have at least one console, if not two.

I still don't understand how Microsoft can charge people a monthly fee to play multiplayer games on Xbox--or Sony doing the same on PS4--when the very same games can often be played on a gaming computer for no more than the cost of the internet connection and the individual game subscription fees (if any).

That type of blatant rent-seeking by gatekeepers can never happen in an open-source OS gaming ecosystem with unrestricted HID hardware. It's a structural impossibility. That's what I think about when I see the phrase. It's PC-master race, not PC master-race. Owning your own device means being its master, and master over any software that must petition you to run on it.

For that $60/yr you pay for Xbox Live, you also get $700 worth of games included with the price. Every month you get two or three free games, usually older AAA games like Assassin's Creed Black Flag or one of the Halo or Gears of War games. This month you get Hitman Blood Money and Overcooked for free, which are both great games. Xbox Live costs only as much as one brand new AAA game per year, and again you get games included that you can download and play forever, so it really offsets the cost.

You also get better anti-cheat systems (get caught cheating on PC, you have to buy another copy... get caught cheating on Xbox, you have to buy another Xbox). I tried playing CoD4 on PC when it came out and very quickly switched to Xbox because I didn't want to have to install an aimbot just to keep up.

But that's ignoring all the games that exist on PC that you do have to pay money to play online...

Great. If that's such a great deal, unbundle the games from the network access. I am perfectly happy not getting bonus games if I can use my own network connection without getting permission.

None of this addresses the fact that console manufacturers are erecting tiny tollbooths on the wire between the computing device that the customers own and the network router that the customers own, extracting rents from transactions that they no longer have any business being a part of.

The best anti-cheat system I have ever seen is the ability for users to refuse to play with other users whom they suspect may be cheaters. Whether they actually are or not is immaterial. You should be able to not play with someone who makes a game less fun for you (blacklisting). And you should also be able to play with only friends whom you know and have invited (whitelisting).

Needing anti-cheat measures beyond that is usually a symptom of not allowing customers to run their own private servers, likely because they won't pay for access to the main company servers if they don't have to. Get caught cheating or griefing on my server, and you can only play in the trashbag-exile instance, which is somehow still fun for some twisted weirdos. Get caught cheating on a centralized corporate server, and someone has to Report you, then there is a Process, and then that guy is banned and maybe just replaced by yet another guy just like him, doing all the same stuff that got the last guy banned. If your culture turns toxic, the good players abandon the game altogether, instead of just switching servers.
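The blacklist/whitelist scheme described above is, mechanically, just a matchmaking filter. A minimal sketch, assuming hypothetical names (`eligible_players`, `blocklist`, `allowlist` are illustrative, not any real game's API): the blocklist always excludes, and an allowlist, if present, restricts the pool to invited friends only.

```python
# Hedged sketch of per-user blacklist/whitelist matchmaking filtering.
# All names are illustrative assumptions, not a real matchmaking API.

def eligible_players(candidates, blocklist, allowlist=None):
    """Blocklist always wins; an allowlist, if given, restricts to friends only."""
    pool = [p for p in candidates if p not in blocklist]
    if allowlist is not None:
        pool = [p for p in pool if p in allowlist]
    return pool

candidates = ["alice", "bob", "mallory", "trent"]

# Blacklisting: exclude suspected cheaters, no proof required.
print(eligible_players(candidates, blocklist={"mallory"}))
# → ['alice', 'bob', 'trent']

# Whitelisting: only play with invited friends.
print(eligible_players(candidates, blocklist={"mallory"}, allowlist={"alice", "bob"}))
# → ['alice', 'bob']
```

The point of the comment is that this filtering happens per user (or per private server), so no central authority has to adjudicate who "really" cheated.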

I don't know of any games with both a PC and a console version that charge for online play on PC but not on console. That money to the game's distributor pays for server maintenance and development team salaries, and there is a clear line of demarcation between what I own and what I am paying for. But there have always been free servers, pay servers, prince/pauper servers, and donation-begging servers out there in the computing world, since shortly after 1 Jan 1983. It has been entirely possible to waste all your spare time online, from your home computer, without spending a single dime above the cost of the network link, for decades now.

Even the Xbox One X cannot run all games at 4K. Most games use dynamic resolution somewhere between 1440p and 4K and/or run at low frame rates.

You can grab a GTX 970/GTX 1060 and be able to run any game at 1080p-1440p.

Even the most powerful Xbox is only comparable to an RX 480. You can find an RX 580 on Amazon right now for $250.

If you compare specs, that is true, but if you actually run the games at 4K on high settings, the Xbox will be fairly smooth whereas the RX 480 will struggle. In reality you need a 1080 Ti to get the same 4K experience, and the new RTX cards are priced way too high.

If you actually run the games at 4K and at high settings, you will be running them at higher settings than the Xbox One X. Hence the problem with the comparison.

Is it an actual 4K experience, or an upscaled image?

I'm really hoping that this will change soon. The fact that AMD is making competent CPUs again is such a relief that I ended up upgrading from a 4790K to an 8-core Ryzen 2700X. I got a Vega 64 knowing that it won't be as fast as a 1080 Ti, but it is still adequate. I don't like Nvidia. I'm really hoping that once AMD regains some more satisfying market share, they'll do whatever they can to make Radeon Technologies Group competitive again.
