240Hz is the new 120Hz: It's time to buy a high refresh rate monitor (pcworld.com)
22 points by belter 7 months ago | 48 comments



Is there a term for people who aren’t especially sensitive to refresh rates?

I notice inconsistent frame times fairly easily because of the stuttering, but I can barely tell the difference between even 30Hz and 60Hz in a flatscreen game (though I do notice a difference when my desktop is running below 60Hz).

I’m a bit more sensitive in VR, but still can’t readily notice a difference above ~80Hz. Sometimes I’ll clock it up to 120Hz, but only for games that kick on motion smoothing, since I can easily notice the difference between frame-doubled 60Hz and 45Hz.

I feel like I’m a freak or something for not being able to instantly perceive high refresh rates.


I'd wager that you'd notice the difference when using higher refresh rates regularly and then downgrading. I think that's a cruel thing to do to yourself though.

I remember when I first saw a high-def Retina MacBook, I didn't appreciate it. I happily programmed on my low-resolution MacBook Air, snug as a bug in a rug.

I eventually upgraded to a Retina MacBook. One day I opened up my old low-res MacBook to recover some old files and I was horrified at how bad it looked. You could see the damn pixels! Text on that screen was so blocky. I couldn't believe I had happily used that laptop for so long.

Is a Retina screen useful? Sure. Did my happiness increase? I don't think so. Does my screen eat up much more battery lighting up many more pixels? Probably.

Sometimes it's nice to be happy with the minimum and not ruin that by chasing upgrades.


It wasn't actually that bad at the time. Check what version of macOS that old MacBook has been updated to. Apple nerfed macOS when they introduced Retina displays: it now only does grayscale anti-aliasing, where earlier versions did subpixel RGB anti-aliasing.

I just got BetterDisplay[0][1], which works great for driving non-Retina-density monitors at HiDPI resolutions, which handle the anti-aliasing better.

[0] https://betterdisplay.pro/buy

[1] https://github.com/waydabber/BetterDisplay


Everyone liked Retina. The case for higher refresh rates is sketchier.


I find I can't really tell the difference beyond ~90fps in non-interactive content, outside of artificial tests (like moving a hard edge and "counting" the visible edges; you can normally do this with a mouse cursor).

But beyond that, the latency improvement is still noticeable. Most games still tend to handle input and game updates in terms of frames, so if you have 3-4 frames of input-to-visible-result latency, that's 30-40ms at 100fps but 15-20ms at 200, and you can tell the difference in some situations. This is very dependent on the style of game though - many have sluggish movement and slow direction changes as part of the design, so it's less noticeable than in some super slide-y shooter like Doom.

But much of that is solvable by the app without higher FPS: with careful GPU submission timing and delayed input processing you can get pretty much the same latency at any FPS. Many games just don't do that (yet?).
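To put rough numbers on the frames-of-latency point, here is a minimal sketch (the 3-4 frame pipeline depth is the illustrative assumption from the comment above, not a measurement of any particular game):

    def input_to_photon_ms(fps, frames_of_latency):
        # One frame at `fps` lasts 1000/fps ms; the whole input -> simulate ->
        # render -> display pipeline takes `frames_of_latency` of those.
        return frames_of_latency * 1000.0 / fps

    for fps in (60, 100, 144, 200, 240):
        lo, hi = input_to_photon_ms(fps, 3), input_to_photon_ms(fps, 4)
        print(f"{fps:3d} fps: {lo:.1f}-{hi:.1f} ms input-to-photon (3-4 frames)")

At 100fps that prints 30.0-40.0 ms, and at 200fps 15.0-20.0 ms, matching the figures above.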


Interesting point about gaming input latency: I haven’t been really interested in twitchy flatscreen FPS since the early days of Overwatch 1 and my VR stuff is mostly bottlenecked by network and decode latency.


It varies so much between people (and age?). My dad has a very hard time telling 60hz and 120hz apart. Meanwhile to me 144hz and 240hz are trivial to tell apart, let alone 60 to 120.

One time my drivers for some reason reset my old 144hz monitor to 120hz, and that's still enough difference that it immediately felt weird when I logged into Valorant (so I began diagnosing the issue).


How was your system configured? If you frame-capped your FPS at 144 on a 120Hz monitor without V-Sync, that would be super noticeable.


Back then I played with vsync off, a FreeSync monitor, and unlocked FPS (around 300).


Noticing the difference vs valuing the difference


>Is there a term for people who aren’t especially sensitive to refresh rates?

Normal people.

>I feel like I’m a freak or something for not being able to instantly perceive high refresh rates.

I feel like I am the freak, being the only one who constantly wants to push for higher refresh rates and lower latency. I mean, the frame time of a 240Hz display alone is ~4.2ms, so in the worst case that could be up to ~8ms of delay between input and frame, assuming zero latency elsewhere in the system, which we know will never be the case.

I think most people would perceive it if they used an Apple Pencil on an iPad - the difference between an older iPad and a newer one. It's just that most don't care enough about it.
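Rough arithmetic behind those numbers, as a small sketch (assuming an idealized display where the only delays are waiting for the next refresh and then scanning the frame out):

    def worst_case_display_ms(refresh_hz):
        period_ms = 1000.0 / refresh_hz
        # An input that just misses a refresh waits up to one full period,
        # then the frame carrying it takes up to another period to scan out.
        return 2 * period_ms

    for hz in (60, 120, 144, 240):
        print(f"{hz:3d} Hz: period {1000.0 / hz:4.1f} ms, "
              f"worst case ~{worst_case_display_ms(hz):4.1f} ms")

At 240Hz that is a ~4.2ms period and roughly 8.3ms worst case, which is where the figures above come from.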


Me too. My guess is it's because I spend most of my time reading/coding and not in first-person games, so FPS only affects smooth scrolling.

Anything at 60Hz or more looks the same to me (well, I can spin my mouse in a circle and see jerkiness, but who cares, I'm not playing shooters). On the other hand, low DPI annoys me instantly. All my monitors need to have >130 DPI (e.g. 4K resolution at a 31.5" diagonal; scaling at 100% works fine). I also set all fonts really small, around 6px.


I haven't been sensitive to refresh rates or even resolution.

I still watch my movies in 720p.

My resolution doesn't go past 2000s.

I believe at some point it's just "number go up".

I'm probably on the low end but not by much.


It would be nice to see the comparison between 2160p/144hz and 1440p/240hz since they're priced similarly, so that's the choice consumers will be making. The article makes the point that the perks of 144hz for general computer usage are for scrolling text, but in my experience the difference between 144hz and 240hz in that area isn't a huge game changer.

Meanwhile the benefits of 4k seem more impactful with better text clarity and native resolution for 4k video/image content. Even with games, the argument for 240hz where "games display many small, fine details in character models, effects, and the interface" would be just as applicable for 4k.

I made this choice recently and decided on 2160p/144hz, already owning a 1080p/240hz monitor that I thought I would still use for competitive games like Counter-Strike and Rocket League. But I haven't noticed a significant difference between 240hz and 144hz so I'm happy using my 2160p for all types of content.

IMO you reach diminishing returns sooner with refresh rate than you do with resolution.


As a 16:9 refusenik/16:10 holdout, it's pretty disappointing how, with the possible exception of LCD backlights, the 16:10 display panel offering remains basically the same as it was in 2006 and hasn't inherited any of the upgrades the rest of the display industry has produced since then, like 120Hz or HiDPI. Though seeing 120Hz no longer considered "high refresh rate" nowadays is certainly a surreal feeling.

Will 16:10 continue to be treated like an unloved stepchild? Can vendors really just not wait to kill it?


3:2 has been getting some love and is a not-bad compromise between 16:10 and 4:3.


I recently received a 144Hz ASUS monitor from the IT department at my company. I don't game on the company computer, and yet the difference in regular day-to-day work was noticeable. Cursor stutter/delay stood out when I switched back to my personal laptop with its 60Hz screen. Time to look forward to 240Hz!


Yup. I do love my ProDisplay XDR from a photography perspective, but I miss my dual LG monitors - and there's a little Stockholm syndrome, as Apple broke DSC 1.4 in order to make the ProDisplay work. Previously, in Catalina, I could drive the two 27GN950-Bs in 4K HDR @ 144Hz; afterwards, I could not (this was reported hundreds of times, across many users, monitors, and GPUs). After Catalina, and at least through Monterey, I could only get HDR@60 or SDR@95 with DSC 1.4. Ironically, telling the monitors to only advertise DSC 1.2 support got me better results.

At the time, people were wondering how Apple was able to drive the ProDisplay given bandwidth constraints. The answer was "by breaking DSC 1.4 for everybody else".

But to your point, agreed. Whether coding or otherwise, 144Hz was just buttery smooth and made going back to 60Hz very noticeable.


Yeah my multimonitor setup includes both 60hz and 240hz monitors, and it's so much easier to use the cursor with the 240hz monitor.


I would like to see more rigorous studies done of refresh rates, especially w/ and w/o (free|g)-sync. 120 was definitely huge for me, but I'm skeptical that 240 vs 120 would make a material difference, especially w/ gsync, which I believe is much more impactful in real terms.


Gsync basically doesn't matter if you're fine with disabling vsync, and at high FPS the tearing is almost invisible.


G-sync isn't only about tearing; it's also about latency. The thing that matters in-game is click-to-photon latency, which is shortened by syncing the screen blank to frame availability.



Sure, sure - if you don't care about tearing, or you're rendering at an absurd framerate. In real-world situations, where your framerate is not 2x your monitor's refresh rate and you do care about tearing, G-sync is very helpful.


Back when I was cool, I used to love games like Quake 3 and Counter-Strike. I could absolutely tell when a game re-enabled vsync after an update, and I absolutely hated vsync; it made everything feel laggy/"smoothed".

I don't recall ever noticing tearing.


IMO, an even bigger difference is screen size. I’m not talking about 23” vs 27”, I mean a monster, like 43”. My work laptop and personal gaming desktop are both hooked up to an Acer Predator 43” 4K monitor. It is utterly massive, and is a delight to both code on and game on.

For work, it’s effectively like having four 21.5” monitors perfectly aligned to a grid. I usually have two browser windows up top, then a wide terminal on the bottom, usually sliced in half with nvim on one side. If I have a large refactoring project or the like, I can fill the entire screen with nvim and have buffers galore.

For gaming, nothing draws you into the world like it. On top-down games like Factorio, or old RTS, it’s almost cheating due to how much of the map you can see.

The fact that it also does 120 Hz (144 Hz with two DP cables connected, but I don’t really care enough to bother swapping cables all the time) is icing on the cake.


Is there any kind of objective study that shows:

1) What percentage of the population can distinguish between refresh rates in terms of the smoothness of normal object movement, and

2) Same question, but about mouse latency, given plenty of Hz headroom so it's measuring specifically the mouse and not the display?

At some point improvements don't matter any more, and I'm curious where that point is.

And of course there are always going to be ways to detect any refresh rate if the underlying object motion is fast enough, but I'm not talking about frame rate artifacts like wheels seemingly moving backwards. I mean regular linear motion at regular UX movement speeds.


"able to distinguish" is a very ambiguous description. Imagine the following scenario: you play a video game, where you are a cowboy in a standoff, as soon as your opponent grabs his gun, you need to click to shoot yours. You are punished for a false start, and rewarded based on your reflexes (maybe statistically: the sooner you shoot the more likely you're to win the duel).

There is no lower bound on latency, and therefore no upper bound on FPS, beyond which improvements stop helping you statistically. A study of a given resolution (i.e. sample size, and therefore statistical error) will conclude that "the population is able to distinguish" up to some refresh rate. But when you then increase the resolution of the test (the sample size), you will again start to see a difference.

Of course I'm not saying research like that wouldn't be interesting; all I'm arguing is that there's no definitive answer about the maximum FPS that makes sense.
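A minimal Monte Carlo sketch of that duel argument (the reaction-time distribution and latency values are made-up assumptions, chosen only to show that any extra display latency shifts the win rate once the sample is large enough):

    import random

    def duel_win_rate(extra_latency_ms, trials=200_000):
        # Both players draw from the same human reaction-time distribution;
        # player B's display adds `extra_latency_ms` before they see the cue.
        wins_a = 0
        for _ in range(trials):
            a = random.gauss(200.0, 20.0)
            b = random.gauss(200.0, 20.0) + extra_latency_ms
            if a < b:
                wins_a += 1
        return wins_a / trials

    for latency in (0.0, 2.0, 4.2, 8.3):  # e.g. one frame at ~500/240/120Hz
        print(f"+{latency:4.1f} ms for B -> A wins {duel_win_rate(latency):.3f}")

Even the 2ms case comes out measurably above 50% with enough trials, which is the point: the threshold you find depends on how hard you look.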


Think about the human brain's ability to distinguish two nearly identical sound frequencies. Our neural nets are masters of time-domain signal analysis. Even the slightest shift in phase or frequency is usually perceptible.

I think saccades play a big role in how we perceive smoothness of motion, and could also explain why some people don't really mind 60Hz whereas others find it an affront to civilization. Just a little difference between human visual systems could explain a large difference in final perception.


This heavily depends on whether or not there is any interpolation. The complication is that you can detect a strobe at essentially arbitrary frequency, provided it is illuminating a moving object, because you see the rate of the strobe as a spacing between images. Similarly, if you are rendering Pong with no interpolation and the ball is moving fast enough, you'll just see copies of the ball. This makes the test somewhat ill-posed. Interpolation is also somewhat ill-posed computationally, but if it were done very intelligently, or even only to a first approximation, I suspect we'd have more trouble detecting it.


Indeed -- is it time to start applying motion blur to scrolling, dragging, and cursor movement?

This wouldn't affect Hz for gaming, but would it be an improvement for general UX?

Can anyone tell the difference between e.g. scrolling at 960Hz without motion blur, and scrolling at 120Hz with the exact right amount of motion blur?
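One way to frame that question, as a toy sketch (it assumes an ideal sample-and-hold display, where the smear you perceive while tracking a moving edge is roughly the distance it travels in one frame; the scroll speed is an arbitrary illustrative number):

    def smear_px(scroll_px_per_s, refresh_hz):
        # Eye-tracking smear on a sample-and-hold display: distance the
        # content moves during one frame of persistence.
        return scroll_px_per_s / refresh_hz

    scroll = 3000.0  # px/s, a fast flick-scroll (assumed)
    for hz in (60, 120, 240, 960):
        print(f"{hz:3d} Hz: ~{smear_px(scroll, hz):5.1f} px of smear per frame")

Under this model, adding synthetic blur at 120Hz can hide the discrete steps, but the tracked content is still smeared over ~25px, whereas at 960Hz it's closer to 3px, so the two aren't equivalent.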


Sorry, I've had two 4K+ monitors for ~7 years now, and I'm not going back. I wouldn't mind 120Hz and squarer aspect ratios, but not at the price of half or more of the resolution.

I'm also skeptical of refresh rates higher than that. I remember I could clearly see flickering on 60Hz CRTs. 72Hz was enough to alleviate it significantly, but I could still see it in my peripheral vision. By 90 or 100 it was completely gone. Games are quite fluid at those rates as well, so I'm not expecting over 120Hz to be of much use.


Feels like somebody's trying to sell me something I don't need.


This continual push for upgrades drives resource exploitation and mountains of unrecyclable e-waste.

In 2023 we should be desperately trying to keep our existing technology in use, not making people insecure for having a 120Hz monitor (or making people insufferable, boasting that they have 240Hz).

I don't know what refresh rate I have, but it's just fine.


I share your sentiment, which is why I am thrilled with the progress on right to repair laws across the globe.

Also companies like Framework are letting us have our cake and eat it too. They’ve made CPU upgrades possible on laptops, and there’s a way to re-use old parts.

I don’t know if the same can ever be done for monitors - perhaps if someone came up with a clever tech to combine 4 displays into one big frame TV?


And here I am, still rocking 60Hz after all these years.


60Hz 1080p on my desktop; I've seen no reason to upgrade since 2018. Everything runs smoothly, and games run on high/very high at 60+ fps.

Upgrading my screen would mean spending $1.5k+ on hardware to be able to drive it properly. I'm sure 120Hz is cool, but I feel like, as with my Pixel 3, upgrades are very expensive for what they bring to the table.


If you tested a 120Hz screen, or a 4K screen, you would see a reason, or more than one. There are some cool experiences to be had at 120Hz, like movie interpolation, or playing video games where, even if you don't see a difference, you might notice you're suddenly a better player. When it comes to 4K, I think the most obvious difference is text quality: the lack of sharp (square, pixelated) edges might be odd at first, but the more natural, less colored (less need for subpixel rendering) look of the text is easier to read…


I used to clock CRT monitors up as high as possible because their flickering was extremely annoying. Since switching to LCD monitors, I have them set to 60Hz by default and have never had any reason to go higher, because an LCD just doesn't flicker. So I don't see the point of a 60+ Hz monitor.


My 144Hz panel has done nothing but instill an urge in me to upgrade my computer. I might have played myself.


Does it make all the slow-downs more noticeable?


Actually, VRR makes the dips and hitches smoother; the problem is my inherent annoyance at my system having trouble maintaining a framerate even approaching 100[0], let alone the 144Hz my screen is capable of.

[0]Or 60, depending on the game. Sigh.


240Hz makes sense, but I have to imagine the difference between 300Hz and 540Hz is placebo.


The Dota 2 screenshot being used is at least three years old.

Which I guess doesn't detract from the overall thesis, and the caption technically didn't claim it was taken on a current 240Hz monitor, but still.


I found that even at 60Hz, an OLED display solves all my complaints about ghosting.


What's the point of higher refresh rates when even the best LCDs have motion issues? You forget how bad the situation is until you see a CRT with perfect motion


What's even more dramatic is getting a vertical alignment (VA) panel rather than an in-plane switching (IPS) type. It's not only about the deep blacks; the colors are so much more vibrant.

If you're going to get a new monitor, go for 120Hz+ refresh and VA, but note that it's not good for bright environments; check the nits and see if that's enough. Or, if you have the budget, you can get something like the LG/Dell Nano-IPS Black, aka IPS Black (different from Nano-IPS), but it doesn't do high refresh rates.

Edit: if you only care about text documents, then get an office IPS monitor.


Meanwhile I'm still using QHD at 60Hz. Never got the hype over 120Hz.



