
DisplayPort and 4K - ecliptik
https://etbe.coker.com.au/2020/02/16/displayport-4k/
======
proverbialbunny
I'm an early adopter of 4k60. If you write code and you're not on 4k, you
don't know what you're missing. 4k is great.

Back then DisplayPort ran 4k at 60Hz and HDMI only at 30Hz. My hardware has
changed and moved on, but I still default to using a DisplayPort cable. It's
rare to find a graphics card without DisplayPort, and it's rare to find a
monitor without one, so I've never had a reason to try HDMI. As far as I can
tell it's a stubborn format that continues to fight to live on. Frankly, I
don't really get why we still have HDMI today.

~~~
dmos62
> If you write code and you're not on 4k, you don't know what you're missing.

I really don't. Would anyone care to share their experience? Currently, I'm
thinking that a vertical monitor would be an improvement for me, but I don't
see a reason to get a screen that's several times denser.

~~~
mlyle
They're probably suggesting to get a bigger monitor at the same time. You can
have somewhat higher DPI and smoother text, and more real estate.

If you're used to 24" 1080P, going to 32" 4K gets you 1.5x the DPI and 1.77x
the screen area. For what's now a $300 monitor, that is a pretty significant
improvement overall.

The DPI means my characters go from 9x18 pixels to 13x27, which is a big
difference in text rendering fidelity and just feels a little nicer. And the
additional screen real estate speaks for itself.
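
As a rough sanity check on those numbers (a throwaway sketch, not from the
article):

    # DPI and area ratios for 24" 1080p vs 32" 4K (both 16:9)
    import math

    def dpi(w_px, h_px, diag_in):
        return math.hypot(w_px, h_px) / diag_in

    def area(diag_in, aspect=16 / 9):
        h = diag_in / math.hypot(aspect, 1)  # panel height from the diagonal
        return h * aspect * h

    print(dpi(3840, 2160, 32) / dpi(1920, 1080, 24))  # 1.50x the DPI
    print(area(32) / area(24))                        # ~1.78x the area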

~~~
jhoechtl
I am still trapped in the mess Linux Wayland has brought, and all I see is a
blurry amalgamation of blurry fonts surrounded by blurry images.

The Linux desktop is dead.

~~~
proverbialbunny
That's odd. Maybe it's your hardware or your setup or something.

I'm on Linux, and as long as I have screen composition on in the NVIDIA
Settings panel and software vsync off, 4k60 is as smooth as butter and
pin-sharp, much like OSX. It's nicer than Windows.

~~~
384028345
Thank you for that tip. I had a lot of screen tearing before I enabled that
option. Do you have "Force Full Composition Pipeline" enabled or just "Force
Composition Pipeline"?

EDIT:

[https://wiki.archlinux.org/index.php/NVIDIA/Troubleshooting#...](https://wiki.archlinux.org/index.php/NVIDIA/Troubleshooting#Avoid_screen_tearing)

That's helpful! :)
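
For reference, the wiki's fix boils down to one nvidia-settings assignment (a
sketch; "nvidia-auto-select +0+0" is a placeholder for whatever metamode your
setup reports, and plain ForceCompositionPipeline is the lighter variant):

    # Applies at runtime and resets on restart; persist it via the
    # "metamodes" option in the Screen section of xorg.conf.
    nvidia-settings --assign CurrentMetaMode="nvidia-auto-select +0+0 { ForceFullCompositionPipeline = On }"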

------
chx
The real "fun" starts with HDR.

See, HDR on one hand requires ten bits of color per channel. So bandwidth-wise,
4K @ 60Hz 10-bit fits DisplayPort 1.2, but not even HDMI 2.0 has enough. See
the calculator [https://linustechtips.com/main/topic/729232-guide-to-
display...](https://linustechtips.com/main/topic/729232-guide-to-display-
cables-
adapters-v2/?section=calc&H=3840&V=2160&F=60&bpc=10&timing=cvtrb&calculations=show&formulas=show)
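
That calculator is roughly this arithmetic (a sketch with approximate CVT-RB
blanking; the link rates are the usable post-8b/10b figures):

    # Does 3840x2160 @ 60Hz, 10 bpc RGB fit DP 1.2 / HDMI 2.0?
    def cvt_rb_gbps(h, v, hz, bpc):
        pixel_clock = (h + 160) * (v + 62) * hz  # active + rough CVT-RB blanking
        return pixel_clock * bpc * 3 / 1e9       # RGB 4:4:4, in Gbit/s

    need = cvt_rb_gbps(3840, 2160, 60, 10)  # ~16.0 Gbit/s
    dp12 = 4 * 5.4 * 8 / 10                 # 4x HBR2 lanes, 8b/10b -> 17.28
    hdmi20 = 3 * 6.0 * 8 / 10               # 3x TMDS channels, 8b/10b -> 14.40
    print(need <= dp12, need <= hdmi20)     # True False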

But! HDR is more than just bit depth: some video standards have the necessary
metadata extensions, but it can also be done in software, and then only
compatible software will be HDR. Check
Check
[https://superuser.com/a/1335410/41259](https://superuser.com/a/1335410/41259)

> The HDR mode will automatically activate if the monitor receives a
> compatible signal, but the device only relies on a software-based
> implementation. When active, the HDR representation adds a nice touch of
> extra color pop and a seemingly deeper contrast range, although the
> monitor’s limitations will come into play here.

And this extra has been added in HDMI 2.0b (compared to HDMI 2.0a), and for
this reason HDMI 2.0b is said to support 4k @ 60Hz HDR when it can't, because
it doesn't have enough bandwidth (well, it does for 4:2:0), while DisplayPort
1.2 is said not to support it when it does have enough bandwidth.

Oh joy.

~~~
Latty
Yeah, it's kind of sad how close we are to the limits of the specs and how
little headroom they leave. Higher resolutions just chew through that
bandwidth quickly.

Apple had to really hack around to get the Pro Display XDR to do 6k at 10-bit
at 60Hz. They are using USB-C with Thunderbolt for that and actually pushing
two DisplayPort 1.4 signals down that cable. Clearly kludgy at best. Even
then, at 6k/10-bit/60Hz the other USB-C ports on the monitor can only work at
USB 2.0 speeds because the link is so saturated.

6k/10-bit is a bit niche right now for sure, but I'd sure love to be able to
daisy-chain my three 4k monitors, which I can't do, and 120Hz is becoming very
mainstream (although actually rendering 4k/120Hz is problematic on the
graphics-card side too).

~~~
chx
I believe Apple arrived at the Pro Display resolution based on the
Thunderbolt bandwidth! [https://linustechtips.com/main/topic/729232-guide-to-
display...](https://linustechtips.com/main/topic/729232-guide-to-display-
cables-
adapters-v2/?section=calc&H=6016&V=3384&F=60&bpc=10&timing=cvtrb&calculations=show&formulas=show)
It's 38.70 Gbit/s, which is suspiciously close to the bus bandwidth limit of
40 Gbit/s.
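
Plugging the XDR's numbers into the same rough model as the 4k sketch above
lands in the same ballpark as the calculator's 38.70 figure:

    # 6016x3384 @ 60Hz, 10 bpc RGB, with the rough CVT-RB blanking from before
    print((6016 + 160) * (3384 + 62) * 60 * 10 * 3 / 1e9)  # ~38.3 Gbit/s vs 40 for TB3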

~~~
amarshall
It also is the equivalent PPI of the Apple Cinema Display (which was 2560x1440
at 27”), but 2x for “retina” monitors. I eagerly await DP 2.0 becoming
widespread so the rest of the world can catch up to the XDR and LG UltraFine
5K.

------
dangus
This article does contain useful information, especially since a lot of this
is way less obvious than it should be to a non-technical end user.

My crappy Apple USB-C Multiport adapter simply refuses 60Hz 4K on an HDMI
cable that works out of the box with Windows 10 on a GTX 1060. This is despite
the fact that the product page says that this configuration with my particular
Mac is supported (edit: it’s actually because I have an old adapter).

Then I use a cheap $8 USB-C to HDMI cable and it works fine on two MacBook Pro
15/16” computers (be careful here as a 2016 USB-C MacBook Pro 15” doesn’t
support 4K 60Hz over HDMI but the identical-looking 2017 model and newer do).

Frustratingly, Apple doesn’t make a similar USB/power/video adapter for
DisplayPort, just for HDMI. Nobody else seems to make one either unless you
get into the world of expensive hubs.

For our older 2015 MacBook Pro, Thunderbolt 2/mini displayport to DisplayPort
is the option.

What I’m getting at, in a poorly organized fashion, is that the world of
single-cable charging, display, and device hub is definitely not here. There’s
one display on the market that supports USB-C charging above 85W, and it only
works with newer Macs.

You can get basically the same display for $1000 less with an LG 27” 4K
display and just use DisplayPort and HDMI with adapters. Saving yourself from
plugging in a second thing isn’t worth $1000.

I can’t for the life of me figure out why the people who make displays with
built-in power supplies for USB-C charging didn’t just default to the 100W
maximum that the spec allows. What short-sighted design!

In any event, I think I don’t mind keeping that piece of hardware separate
from my display, not just for reliability but for weight and bulk as well. I
can easily hide a power brick under the table.

~~~
brightball
I'm not on a Mac anymore, but this sounds very similar to something that I
experienced when switching over to Linux and trying to use the Apple adapters
that I had for my displays.

I can't remember the exact terminology, but there are basically two types of
adapters: active and passive. Passive adapters defer some of the work to
software on the computer while active have everything needed built in.

All Apple adapters are passive and because of that when you try to use them
with non-Apple computers that don't have the expected software/driver...they
don't work.

It's been a while, but I experienced this with Mini DisplayPort to DVI
adapters. I don't know if it carries over to other types as well.

~~~
mruszczyk
It's interesting: at least the Lightning to HDMI adapter from Apple
dynamically loads a bundled copy of iOS from the device itself. I'm unsure if
the USB-C multiport adapter is similar.
[https://hackaday.com/2019/07/30/apple-lightning-video-
adapto...](https://hackaday.com/2019/07/30/apple-lightning-video-adaptors-run-
ios-dynamically-loaded/)

~~~
__float
It doesn't seem to be dynamically loaded the same way, but it certainly has
software updates: [https://support.apple.com/en-
us/HT205858](https://support.apple.com/en-us/HT205858).

------
Stratoscope
For me the sweet spot is a triple monitor setup with all three monitors in the
neighborhood of 200 DPI, and running at 200% scaling in either Windows or
Linux (Cinnamon).

One monitor is my ThinkPad's 14" WQHD display. The other two are 24" 4K
displays.

One of those displays is horizontal, immediately above the ThinkPad display.
The other is vertical, to the left, with the bottom of the monitor close to my
standing desk. (Of course it could be on the right if that suits you better.)

Because I'm old enough that my eyes cannot adjust their focus like a younger
person's eyes, I make sure that all three monitors are at the same distance
from my eyes. And I have a pair of single vision prescription glasses adjusted
for that distance.

Since I also use the ThinkPad on its own, that determines the focus distance:
about 20 inches. The external monitors are also at that distance from my eyes.
Each of the three monitors is tilted at the appropriate angle so that the
plane of the monitor is perpendicular to the view from my eye position. In
other words, the "normal" at the center of each monitor points directly to my
eyes.

I can't use big monitors like a 32", regardless of the resolution. The focus
distance changes too much between the center of the monitor and its edges. But
24" is small enough that the entire monitor is in focus with my single vision
glasses.
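
To put rough numbers on that (a quick sketch, assuming a flat 16:9 panel
squarely facing you with its center 20 inches away):

    # How much farther the screen edge is than the center of the panel
    import math

    def edge_over_center(diag_in, dist_in=20):
        half_w = diag_in * (16 / math.hypot(16, 9)) / 2
        return math.hypot(dist_in, half_w) / dist_in

    print(edge_over_center(24))  # ~1.13: edge is ~13% farther than center
    print(edge_over_center(32))  # ~1.22: edge is ~22% farther than center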

Did I mention single vision glasses?!

Unless you are young enough that your eyes can easily refocus, do yourself a
favor and get these. Not a "reading" prescription - that is typically 16",
much too close for typical computer use. Bring your laptop to your optometrist
and get single vision lenses for that distance. It will probably be about 20".
Then when you use external monitors, make sure they are also at that same
distance.

Do not under any circumstances use progressive lenses for computer work! I
have seen far too many people tilt their head back and lower their gaze into
the bottom part of their progressives. This is a recipe for neck and eye
strain. You will be amazed at the improvement that a good pair of single
vision lenses give you.

~~~
hbarka
I agree wholeheartedly. My optometrist created an extra pair of single vision
glasses that were backed off by about one diopter from my true distance
prescription (YMMV) and it is much much more comfortable for computer work
than reading glasses.

------
nihonium
To anyone has a slightly older Mac and thinking about getting a 4k screen:
DON'T.

4k is not supported on many older models. Check your official specs. Most
older Macs have a max output of 1440p@60hz. 4k is only supported @ 30hz, which
is no good for daily usage. And the main problem is, if you get a 4k monitor
(to future-proof your setup) and try to use it at 1440p, everything will be
blurry and pixels will shift and distort.

Just get a native 1440p monitor.

If you have a newer Mac, getting a 4k 27" monitor may still be a bad idea.
Since 4k is too much for a 27" screen, you will need to use scaling in the Mac
options, ideally set to "looks like 1440p". But this will make your Mac do
1.5x scaling and put a burden on your GPU and CPU: it will render everything
doubled, at 5k, and try to scale it down to 4k. If you're using a MacBook,
your fans will never stop, even on idle. This is even worse for performance
than getting a 5k monitor and using it natively 2x scaled, which is easy on
the GPU.
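
In rough numbers (a sketch of the pixel counts involved, not Apple's exact
pipeline):

    # "looks like 1440p" on a 4k panel renders a 5k backing store, then
    # downsamples it, vs a true 5k panel mapping that same render 2:1
    native_4k = 3840 * 2160                # ~8.3 MP actually shown by the panel
    backing_5k = (2 * 2560) * (2 * 1440)   # ~14.7 MP rendered per frame
    print(backing_5k / native_4k)          # ~1.78x the panel's pixels, every frame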

One side note: there is no USB-C hub that offers 4k@60hz output; it's
technically not possible. You have to get a separate HDMI or DP adapter, or an
expensive Thunderbolt 3 dock. But there are some USB-C to HDMI or DP adapters
that also offer Power Delivery.

I've already wasted money and time figuring this out, so you don't have to :)

~~~
crooked-v
> If you're using a Macbook, your fans will never stop even on idle.

I have a 2017 MacBook Pro that runs 4K-at-looks-like-1440p fine with no fan
noise and without even turning on the dedicated GPU for normal web browser /
code editor / document stuff.

~~~
nihonium
MacBooks always turn on the dedicated GPU when connected to an external
monitor. Maybe you're using it in clamshell mode?

~~~
crooked-v
Huh, I guess I was incorrect on the GPU, then, but I've definitely never heard
any fan noise for anything less than a game with 3D rendering spinning up.

------
sgt
HDMI at 4K/60Hz is problematic on platforms other than just Linux, in my
experience. It just doesn't work well enough yet for most people and most
computer/screen combos.

Using 4K at 30Hz is like using your desktop through a VNC session; it's
absolutely horrendous.

------
trollied
On a Mac, the other way to sort this is with an eGPU.

I have a 13” 2017 MBP and use a Razer Core X Chroma with an AMD Vega 64.

It makes an awesome one-cable docking station - I have a mouse, keyboard &
external disks plugged into it, as well as the monitors and a Behringer
external sound card (for MIDI and studio speakers).

Just 1 Thunderbolt cable into the MBP, which also provides power to it. Makes
for a nice tidy desk too!

[https://egpu.io](https://egpu.io) has all the info you’d ever need on such
setups (no affiliation BTW).

~~~
pram
I have a similar setup but I found the USB ports on the Chroma to be
ridiculously unreliable, and that seems to be very common. They work without
issue for you?

------
m463
Related: I had an interesting experience with an older 4k monitor and UltraHD
Blu-ray.

I had a Dell UP3214Q monitor, and I got the bright idea to get a Blu-ray
player. The model I bought happened to support 4k UltraHD, so I hooked it up.

It looked pretty good... except... wait, it was downscaling UltraHD to 1080p
over HDMI.

So I tried making sure I had the right HDMI cables. Still 1080p.

Long story short -- nobody will tell you this straight out -- UltraHD Blu-ray
players require HDCP 2.2. Strangely, 1080p non-UltraHD discs upscaled to 4k
just fine.

Hollywood sucks.

I tried a U3219Q (HDMI 2.0 and HDCP 2.2) and everything worked. (Except the
monitor is not as high quality, with a slightly worse viewing angle and some
LED bleed at the lower edges.)

~~~
proverbialbunny
If it says anything, most 4k movies out right now have CGI rendered in 1080p
and then upscaled. Also, most camera work, even if recorded in 4k, is not
focused to 4k but focused for 1080p, so it's often fuzzy. Then some shots are
more focused than others, which is just annoying.

Avengers: Endgame is the first movie to render its CGI for 4k and to be shot
with 4k in mind, and I am uncertain if any movies have come after it meeting
that spec yet, so atm 4k bluray is sadly a gimmick anyway.

~~~
m463
I watched Gemini Man in 4k/60hz/HDR and it was really weird to watch.

It felt more like reality TV than a cinematic movie, and I'm uncertain whether
it was the HDR or the 60hz.

------
derefr
Do variable-refresh-rate (FreeSync, G-SYNC, etc.) monitors support being
driven at 24/25Hz? If so, can we just _do that_ and watch movies in 4K@24
(which would fit perfectly fine down an HDMI1.4 cable), rather than needing
this whole ridiculous chain of 3:2 pull-down + frame doubling + bandwidth
increases to get there?

~~~
amarshall
Not necessarily any reason to have FreeSync or G-Sync. Just set the output
refresh rate to the desired fixed value. Some media playback software can even
be set to do this automatically. You’ll get a flash during the refresh rate
switch, though.
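
On Linux/X11, for example, the fixed-rate switch can be a one-liner (assuming
the display exposes a 24Hz mode; the output name here is a placeholder for
whatever xrandr reports):

    xrandr --output DP-1 --mode 3840x2160 --rate 24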

------
nullifidian
Speaking of 4k, one question bothers me -- will we see HDMI 2.1 or DisplayPort
2.0 in the 2020 GPUs? Because if not, it would be another 1-2 year wait for a
solution capable of doing 4k HDR (10bpc) at 120 fps, or even just 4k 10bpc
4:4:4 in the case of HDMI.

Frankly, I don't want to buy a GPU that will be outdated in a year simply due
to its connectivity.

------
pyr0hu
I met this issue on my 2015 MacBook Pro: I tried to connect a 4k display over
HDMI, only to find out that it can only display 4k at 30hz. I thought it was a
configuration issue, but it seems the MBP can only do 4k@30hz when connecting
over HDMI (it uses HDMI 1.4 IIRC). Works properly with a DisplayPort cable,
though.

~~~
sebastianconcpt
Same scenario here. I'm about to buy two 4K monitors, but with
Thunderbolt-to-DisplayPort adapters for that very reason.

------
Kirby64
It's unclear why the author posits that the graphics cards (s)he has on hand
only support HDMI 1.1. Just because they only support 1080p output doesn't
really mean anything about the HDMI revision they support. PCI-e should have
nothing to do with that either.

Also, I'm not sure what spare parts bin the OP has, but if you don't have
something that supports anything higher than FullHD/1080p, you've got some
ANCIENT graphics cards.

For reference, here's one of the oldest graphics cards I could find on Newegg,
the GTS 450, first released in Sept 2010... according to Nvidia it still
supports a max output resolution of 2560x1600:
[https://www.nvidia.com/object/product-geforce-gts-450-oem-
us](https://www.nvidia.com/object/product-geforce-gts-450-oem-us)

~~~
taspeotis
The article says:

> Apparently some video cards that I currently run for 4K (which were all
> bought new within the last 2 years) are somewhere between a 2010 and 2013
> level of technology.

Which I find unhelpful, because I can buy a new GeForce GT 710, circa 2014,
from the local computer store today. (Dirt cheap, passively cooled; I assume
that's the niche.) Buying new doesn't mean it's contemporary. A cursory Google
suggests my old GeForce GTX 960 (which first came out in 2015; also I'd
consider it mid-range at best) supports HDMI 2.0.

Elsewhere it mentions a GT 630 which is ... from 2012. So yeah the 2012 era
GPU bought two years ago is "somewhere between a 2010 and 2013 level of
technology."

------
egdod
Having just dealt with the HDMI cable 4K refresh rate debacle, I can say that
it is indeed a mess. The only way to make it worse would be to incorporate
USB-C somehow.

~~~
jeswin
> The only way to make it worse would be to incorporate USB-C somehow.

Why? I run DisplayPort over USB Type-C port on my Linux Laptop and it works
quite well for 4K at 60Hz. DisplayPort over Type-C is a pretty common
configuration on laptops. [https://www.displayport.org/displayport-over-
usb-c/](https://www.displayport.org/displayport-over-usb-c/)

~~~
egdod
I was mostly joking. USB-C is fine if you get the right cables. But that can
be a big “if.”

It’s a far cry from the trivial “just plug anything into anything with any
cable and it’ll work” that we were promised.

~~~
izacus
> It’s a far cry from the trivial “just plug anything into anything with any
> cable and it’ll work” that we were promised.

It pretty much works exactly like that, except for very edge cases near the
end of protocol capability, things like 100W power supply and 4k/60Hz.

(Also, I don't know who exactly promised you that any cable would work for
high-performance protocols.)

------
pwython
I tested a bunch of different HDMI cables when I needed to record 10-bit 4K
4:2:2 @ 60fps from my camera, and found that the most reliable cables were
Monoprice brand, starting at just under $4.

[https://www.monoprice.com/product?c_id=102&cp_id=10240&cs_id...](https://www.monoprice.com/product?c_id=102&cp_id=10240&cs_id=1024021&p_id=24182&seq=1&format=2)

------
mncharity
"A roomy 5K desktop, on a bus."[1] is a recent silly tweet of mine. The system
as demoed is variously unusable: an inappropriate camera lens is terribly
blurring passthru; the underlying desktop capture is scaled, losing pixel
crispness; and the scaling and config are making head motion too sensitive.

But the motivating idea... Nreal glasses have a crisp 1080p stereo display, at
2 meters, with a laptop-like 50deg (5 held-out fists) field-of-view (vs this
demo's 90-ish). Used as a viewport into a larger (and shallow 3D) desktop...
I'm very much looking forward to trying that. One downside is that's still
only 1080p visible without head motion... but the motion mapping can be
variously aphysical.

[1]
[https://twitter.com/mncharity/status/1225091755667853318](https://twitter.com/mncharity/status/1225091755667853318)

------
dzink
I am able to run Windows 10 in 4k at 100fps with DisplayPort using the
Alienware curved 34in monitor (it supports 100+ fps). The hardware behind it
is described in this unfinished blog post
([https://blog.dianazink.com/constructing-a-deep-learning-
desk...](https://blog.dianazink.com/constructing-a-deep-learning-desktop-
in-2020)).

I built this rig for machine learning, and I spend a lot of hours coding every
day. Any other setup gives me headaches and eye exhaustion, including Apple’s
otherwise excellent Thunderbolt Display (60fps). The high refresh rate on the
Alienware is the number one criterion I now use for buying a monitor or TV. It
has made me more productive by far (feels like two monitors in one by real
estate).

~~~
Kirby64
You sure you don't mean the UltraWide 3440x1440 120Hz (or 100Hz, overclockable
to 120Hz for the old version) Alienware monitor? I'm not aware of any curved
monitor they make that is also 4k.

At 3440x1440 @ 120Hz, you're only pushing ~14.5Gbit/s. DP1.2 or HDMI2.0
supports it fine, but that's it.

It's even worse than a 4k 60hz monitor: you literally cannot use anything
besides HDMI2 or DP1.2 to drive the display.

------
djsumdog
I was searching for the date thinking this had to be old, but it's not. It's
very recent, but the author does seem to be using some older hardware. I did
have a problem with 4K-DCI and KVM switches at one time:

[https://battlepenguin.com/tech/4k-uhd-kvm-switches-the-
start...](https://battlepenguin.com/tech/4k-uhd-kvm-switches-the-startech-
sv231mdpu2-and-the-iogear-GCS62DP/)

Since then I've switched to a UHD monitor and a TESmart HDMI KVM switch. It
can do 60Hz with HDR on my Windows and PS4 machines, and 60Hz/non-HDR on Linux
with the amdgpu (open source) Radeon drivers. I had no issues at all with the
full resolution and refresh rate, either directly connected or with that KVM.

------
sebastianconcpt
Interestingly, this week I was planning to upgrade my setup by buying two
Samsung 4K monitors for the MBP. And from what I've investigated, I concur
that HDMI is a no-no. Everything points to Thunderbolt<->DisplayPort adapters
as the way to achieve 60Hz at 4K.

~~~
Kirby64
HDMI is fine for 4k 60hz if it supports HDMI2. Depends on the adapter. If you
have an old-school MBP that still has an HDMI port, that port definitely can't
do 4k 60hz.

Both Thunderbolt 2 and Thunderbolt 3 (USB-C) would be fine for supporting 4k
60hz. Whether it's over HDMI or DisplayPort ultimately doesn't matter, it's
the adapter that'll matter more.

~~~
sebastianconcpt
Yes. And to add something to this, you might be able to find a cable instead
of an adapter. One like this:

Monitor <-- male DisplayPort* ---- male Mini DisplayPort --> MacBook Pro

*It has to be DisplayPort 1.2 or newer (they are backward compatible)

------
jsntrmn
I work from home 1 or 2 days a week and am extremely comfortable with a 2019
MacBook Pro (15 in) and two Dell P2415Q monitors (24 in / 4K). I wish I had
gone with the 27 in model but had limited space in my office.

My personal laptop is a mid-2015 MacBook Pro. For connectivity, I use both
DisplayPort (DP) and mini-DisplayPort (mDP). DP goes to the 2019 MBP and mDP
goes to the mid-2015 MBP.

I run everything from the latest macOS to Windows 7 and 10 to multiple flavors
of Linux via an unRAID box that pulls double duty as a media server. I've had
very limited issues with driver support.

Very affordable and reliable solution.

------
altmind
The GT 630 seems to only support HDMI 1.4, not 2.0. I miss the point of this
article.

------
remmargorp64
For whatever it's worth, I recently attempted to run a 4K 60hz HDMI connection
from my gaming PC to my 4K LG OLED TV in the living room.

I ended up trying probably 4 different cables before I found one that was
actually able to transmit the signal correctly over 25-30 feet. The cable that
ended up working for me was a KabelDirekt cable that I bought off Amazon. The
beast of a cable is THICK. I also had a lot of luck with AmazonBasics HDMI
cables for shorter runs.

The most important thing for 4K HDMI cables seems to be that they need to be
well insulated and thick.

------
elif
>my testing has shown that a cheap “High Speed HDMI Cable” can work at 60Hz
with 4K resolution with the right combination of video card, monitor, and
drivers.

My testing was completely contrary to this. I bought around 10 cables that
claimed to support various DP and HDMI specs, trying to achieve 4k@60 on my
Mac Pro. In the end only one cable actually succeeded; the longer version of
the same cable from the same manufacturer would not work, and even the "good"
cable would flicker if the ambient temperature was too low.

------
nullc
If you think 4K 60Hz is a PITA, try getting 8k working in Linux. It's a super
PITA.

Then try relocating the computer 50 ft away to put it in another room for
noise reasons. Super super PITA.

But: it's possible.

------
gfrff
Does anyone know how I would find a graphics card, or set of graphics cards,
for Ubuntu that supports 5 monitors well, one of them being 4k?

I also don't want Nvidia's proprietary drivers due to bugs. They confuse
something about monitors versus screens, which breaks a lot of things. Nouveau
works fine. Haven't tried AMD but I assume it's fine too.

~~~
masklinn
AMD used to have special "Eyefinity" series cards which supported up to 6
displays (as in a single card / GPU would have up to 6 display controllers).

Though I don't know if that's still a thing, whether they were well supported
on Linux, or how well they deal/dealt with displays of different resolutions.
Also, I'm pretty sure it only worked over native DisplayPort, so you'd need
active adapters to plug in non-DP displays.

------
elchief
I just got a 32" 4k monitor. It works nicely on my Asus Zenbook UX430UA
running Ubuntu 18.

The sales guy kept trying to sell me an HDMI cable, but I knew I could only do
30Hz over it. I had to ask several times for USB Type-C to DisplayPort.

My next system will have Thunderbolt, and I'd like to run a 27" monitor at 5k.

------
shmerl
Always prefer DisplayPort over HDMI. There should be no reason to ever use the
latter, unless you are stuck with hardware that can't work with anything else.
HDMI is only being pushed by its creators, who still like to collect fees for
its patents.

------
ratsimihah
I've let go of HDMI for a while now. It works terribly on Macs, as they tend
to render at 1x instead of 2x at 60hz, or you can run 2x at 30hz, which is
super laggy.

Not a lot of people seem to be aware of that, yet it's like not wearing
glasses when you actually need them.

------
caymanjim
I use a Dell P2715Q with a GeForce GTX 1050 Ti and an HDMI cable and it's at
60Hz. Works fine in Linux and looks great. And this is some pretty old
hardware. I believe I had to do something on the monitor to make 60Hz work,
but I can't remember for sure.

~~~
prpl
I have one of these at home and a U2718Q at work. At work, I use the MacBook
USB-C multiport adapter (the new one; you have to double-check whether you
have an old one), and at home I got a uni USB-C to HDMI 2.0 adapter ($18 on
Amazon).

Both systems work well. Both monitors also worked with Mini DisplayPort on my
2015 MacBook at 60hz, which made moving to the newer MacBook kind of
embarrassing: the Apple adapter was expensive, and I had to look around before
I tried the uni adapter.

------
annoyingnoob
If you need a long cable, like more than 20 feet, it gets even more
interesting. Yay for 'standards', maybe.

~~~
Benji_San
I'm currently using a 10m (~33ft) HDMI cable to connect my desktop PC to my TV
at 4k/60hz. It works well without HDR but does cut out from time to time with
HDR enabled, so I mostly leave HDR turned off at 4k.

I spent quite a long time researching which cable to get, as the standard does
not really cover cables longer than around 5 meters, and most "legit" cables
explicitly state that they can't provide 4k/60hz at lengths over 5 meters, so
you mainly have to go on customer reviews. The price does not necessarily
indicate what the cable is actually capable of, so getting something that
works requires a lot of research... (+ YMMV)

------
JosephRedfern
I'm a bit late to the game here, but why would chroma subsampling give you a
poor experience when reading text?

------
choeger
Wait a second. You can watch 4k content from Netflix under Linux nowadays?

~~~
buster
Amazon Prime and Netflix work in the browser nowadays. Quite well on Linux,
too. I suppose it's some DRM that made it into firefox in the past? Anyway,
it's working quite well.

~~~
KingMachiavelli
At least Netflix is limited to 720p unless you are using the Windows 10 Store
app or (apparently) some Firefox extension that tricks it. But yes, the
DRM/Widevine has been working for quite some time now.

------
bluedino
Related question: why did DVI become popular first? DP is smaller and
"better"; was it the fact that DVI could carry analog as well?

~~~
Qiasfah
DVI predates DP by ~7 years:

[https://en.wikipedia.org/wiki/Digital_Visual_Interface](https://en.wikipedia.org/wiki/Digital_Visual_Interface)
[https://en.wikipedia.org/wiki/DisplayPort](https://en.wikipedia.org/wiki/DisplayPort)

