
My Adventures with “4K” 2160p and Linux - pmoriarty
http://lxer.com/module/newswire/view/200317/
======
darshan
I've been running 3840x2160 on the 23.8" Dell UP2414Q for the past month, and
I couldn't be more pleased. Well, I could if I had it running at 60 Hz rather
than 30 Hz, but having twice the PPI is worth the cut in the frame rate to me
without question. I'm using the open source Intel drivers and the on-die GPU
of my i7-4770K, and everything works out of the box at 30 Hz. Patches to get
everything running at 60 Hz are in the works, but it's pretty complicated
because the software needs to treat the display as two separate 1920x2160
displays due to limitations in the DisplayPort spec.
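For reference, an MST panel like this shows up to X as two independent outputs that have to be stitched together by hand. A sketch of the usual xrandr incantation; the DP-1/DP-2 output names are assumptions and vary by driver:

```shell
# The panel enumerates as two 1920x2160 tiles; place them side by side
# so they behave as one 3840x2160 desktop. Check your actual output
# names with `xrandr -q` first.
xrandr --output DP-1 --mode 1920x2160 --pos 0x0 \
       --output DP-2 --mode 1920x2160 --pos 1920x0
```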

My desktop is where I spend the most time, yet it was the last remaining
device I use where there were visible, ugly, distracting pixels everywhere.
Now I never see pixels in my daily life, and reading and writing on my desktop
have become significantly more pleasant. I highly recommend switching to a
high-DPI monitor for anyone who spends much time on the computer (which is
probably almost everyone here).

Oh, most software works just fine once I manually set the DPI in KDE's system
settings. Since upgrading to Plasma 5, the only software I use that isn't
scaled properly is Chrome. Apparently they're working on it, but I only use
Chrome for Netflix, and it's not too bad having tiny browser tabs for that one
use.
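Outside of KDE's GUI, the same manual override can be set through X resources. A sketch, assuming you want exactly 2x the conventional 96 DPI:

```shell
# Force Xft-based applications to render at 192 DPI (2 x 96).
# Adjust the value to match your panel's actual pixel density.
echo "Xft.dpi: 192" >> ~/.Xresources
xrdb -merge ~/.Xresources
```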

~~~
kevinchen
Possibly off topic, but what do you think about the Dell UP2414Q's color? I
saw the 32" version in a store and the viewing angle was pretty bad.

~~~
jacobolus
I’m really impressed by the Dell UP2414Q. I’d highly recommend it over just
about any other external monitor. Color is consistent across the front, and
quite accurate. It’s pretty bright with good contrast. And the high resolution
is just stunning.

The only computer display I’m more impressed with at the moment is the 5k
“retina” iMac display.

------
patcheudor
Hats off for trying this in Linux. I gave up on that path a long time ago and
am running nine monitors with a Windows 8.1 base OS, using DisplayLink USB
docks in combination with the DVI out on my main dock to drive the 2560x1600
30" display (the yellow one), with all the others running at 1920x1200 from
DisplayLink. I then run Linux in a VM, all from a laptop with 32GB of RAM and
three built-in SSDs giving me a bit more than 2TB of storage.

Interestingly, there appears to be an eight-monitor limit when using
DisplayLink for the OS, so the ninth monitor is driven from its own
DisplayLink dock, which I've associated directly in hardware with a VM. Of
course, because the host OS doesn't know about that monitor, I also need a
second mouse to access it. Here's the latest photo of my setup:

[http://defaultstore.com/mydesk.jpg](http://defaultstore.com/mydesk.jpg)

~~~
dustin999
Honest question: do you feel a little ridiculous with that setup? I'm a coder
and have 2x24" monitors, and found that to be borderline ridiculous with the
neck strain, which is why I'm going to a 27". But unless you're running
security at a place of business or something with 50 security cams, I can't
imagine what you're doing that requires a wall of monitors.

~~~
patcheudor
Our philosophy in the office is that alt-tab is for suckers. While my setup is
the most over-the-top, we have three other people with five monitors or more.
We are all security researchers and find the configuration saves a bunch of
time in reviews. As an example, I can associate a MitM proxy like Burp Suite
or Fiddler 2 with the server-side application, which might communicate with
web clients as well as with additional web services behind the scenes. That
takes up one monitor, typically the one at the far top left. Under that
monitor I can then associate another MitM proxy with the client. I can then
run the client from my laptop display. If I'm working on a fat client, on the
30" I'll run Wireshark, which will effectively be watching the client. On
another monitor I can run Sysinternals tools. What remains I use for writing
code necessary for the review, running additional tools like Metasploit,
e-mail, chat, and research. I arrange my workspace for the task at hand. Do I
use all nine on a daily basis? No.

Interestingly enough, with this sort of setup it's easy to see at a glance
what's happening when going after an application. Before going to the monitor
extreme I'd constantly alt-tab between my monitoring & exploit tools with
every action. Now I can run an action and see the results within one
workspace. Of course there is a massive downside: it makes competing in CTFs a
pain in the butt, as I can't drag this setup with me to physical events.

~~~
lttlrck
Surely you still use the keyboard to switch keyboard/mouse focus, just with
another key combo instead of alt-tab?

~~~
richardkeller
The moment I use more than one screen, I find alt-tab cumbersome. In a GUI
environment, having focus follow the window the mouse is hovering over is far
easier than using alt-tab. Especially with multiple windows open, cycling to
the correct window with alt-tab is usually slower than moving the mouse over
the window.
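On KDE (mentioned elsewhere in the thread) this behavior is a window-manager setting. A sketch for Plasma 5's KWin; verify the group/key names against your Plasma version before relying on them:

```shell
# Switch KWin's focus policy from click-to-focus to focus-follows-mouse,
# then ask the running KWin instance to reload its config.
kwriteconfig5 --file kwinrc --group Windows --key FocusPolicy FocusFollowsMouse
qdbus org.kde.KWin /KWin reconfigure
```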

------
dedward
I don't mean to be "that guy"... because this is, of course, about linux
running 4k.

But much of the discussion seems to be around what the point is, whether it's
useful, and so on.

As someone who's been using one of the new 5k iMacs since it came out (so
that's 5120x2880 in a 27" display)... let me add a bit, hopefully without
sounding like an Apple fanboy.

First - running it at actual 5k, rather than at 3200 or 2560 HiDPI, is
crystal clear and amazingly sharp, but too small at this size. With a 40"
display it would be perfect, I imagine. At full 5k you can fit an amazing
amount of stuff on a single screen, and it's handy for certain short
troubleshooting sessions - but stuff can be so small that it's straining. The
stuff is still crisp, though, if you can manage to focus on it. The detail is
there. 12-point fonts are barely readable to my 40-year-old eyes unless I get
closer.

3k is just fine, and 2k is what you'd expect - but both are using HiDPI
rendering and you get stunningly crisp fonts and detailed images and whatnot.

I wouldn't trade it for anything, even if I don't run in 5k all day long.

~~~
lrizzo
Speaking of screen size, having used it I can confirm that 39-40" is the
minimum size I find acceptable for 4K if you are not running in HiDPI mode.
It corresponds in pixel density to 1080p at 20" (which to my 50-year-old eyes
is almost below the threshold of what I can see).

------
lrizzo
I have tried 4K@30Hz over HDMI on five different TVs (not monitors): Seiki
39" US, Seiki 39" UK (the US and UK models are different), Hisense 42K680,
Samsung UE40HU6900DX, and LG 40UB800 (the latter three are in Italy and seem
available across Europe).

In all cases the additional display lag compared with my retina MBP screen
(measured with the small HTML/JS snippet at the end) was huge, in the
130-230ms range, i.e. 4-7 frames. As a comparison, even the low-end 1080p TVs
I tried (at 60Hz) only added 0-16ms of lag, i.e. at most one frame.

The problem is not 30Hz vs 60Hz, but a video pipeline which is too deep and
cannot be shortened in any of the models I tried (even the models that have a
"gaming" mode disable that control when running at 4K). Note that even at
30Hz one frame is only 33ms, so anything more than 66ms indicates excessive
pipeline depth. Based on my measurements, I think the chipsets that do 4K
(upscalers etc.) are still buggy/immature in terms of features, and we
probably need to wait for the next generation of silicon to get 4K TVs that
can be used as monitors.

I have become used to the 130ms lag but it is not pleasant.

Note that the Acer Chromebook C720 ($199) can drive the screen at 4K through
its HDMI port. I had no problem with FreeBSD using high-end Nvidia cards
(GT640 and GT750), whereas lower models seem unable to use pixel clocks above
165 MHz (you need about 290 MHz to run at 4K).
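The ~290 MHz figure can be reproduced from the standard 4K30 HDMI timing, which pads the 3840x2160 active area with blanking to a 4400x2250 total raster (those totals come from the CEA-861 spec, not from the comment itself):

```python
def pixel_clock_hz(h_total, v_total, refresh_hz):
    """Pixel clock = total raster size (active + blanking) times refresh rate."""
    return h_total * v_total * refresh_hz

print(pixel_clock_hz(4400, 2250, 30) / 1e6)  # 297.0 MHz for 4K30
print(pixel_clock_hz(2200, 1125, 60) / 1e6)  # 148.5 MHz for 1080p60, for comparison
```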

    <!DOCTYPE html><html><head><script>
    Object.prototype.d = function(l) { return (this + Math.pow(10,l)).toString().substring(1, l+1) } function g(x) { return document.getElementById(x); }
    function clock() { var t=new Date(); g('txt').innerHTML=t.getSeconds().d(2)+'.'+t.getMilliseconds().d(3); setTimeout(clock,1); }
    </script></head>
    <body onload="clock()">
    Set PC to mirror screens, take a snapshot with a camera, compare times<br/>
    <div id="txt" style="font-size: 120px;"></div>
    </body></html>

~~~
firloop

        displaylag.html:5 Uncaught SyntaxError: Unexpected token function
    
        displaylag.html:10 Uncaught ReferenceError: clock is not defined
    

Wasn't able to run that page you put at the end of your comment on Chrome
39.0.2171.95. Am I missing something? Here's a codepen:
[http://codepen.io/pen/yyOxXQ](http://codepen.io/pen/yyOxXQ)

~~~
sumnulu
Fixed: [http://codepen.io/anon/pen/GgZXYO](http://codepen.io/anon/pen/GgZXYO)

------
siliconc0w
I looked into this setup, but it gets pretty ridiculous if you want to do any
gaming. You want 4K @ 60Hz with 4:4:4 chroma sampling at a reasonable <40",
but it soon begins to feel like you're hunting a rare animal, with only hints
and whispers strewn throughout a dark abyss of forum threads and Amazon
reviews. The kind of places where people put AMD driver revisions in their
signatures. I don't think we're quite there yet.

~~~
marknutter
Can you get a 4k display and use it at its full resolution for everything but
gaming, and then just switch to a lower resolution for gaming when you want
the higher refresh rates?

~~~
bryanlarsen
According to reports, it scales 2560x1440 beautifully. 1920x1080 obviously
works well, since that's an exact 2x integer scale.

------
kenrikm
I ran the Seiki 4K as a monitor for about a year. Overall the quality is
fine; however, the 30Hz refresh and mouse lag make it less than optimal for
anything other than static content. Even scrolling a page is not easy on the
eyes.

------
shmerl
KDE will only fully support it in Plasma 5.0. As of now it's somewhat messy
and requires lots of manual tweaks besides changing the DPI.

See [https://community.kde.org/KDE/High-
dpi_issues](https://community.kde.org/KDE/High-dpi_issues)

However, one should be careful when getting monitors with such huge
resolutions, since they put more demands on the GPU. For example, if you want
to play games at native resolution, it will require at least a dual-GPU
setup, which is very pricey overkill for anything else. Current video cards
don't cope well with such resolutions yet. That's besides the fact that such
monitors are very expensive on their own.

Also, ergonomics-wise, 24-27" is a sweet spot for me personally. Anything
larger than that becomes uncomfortable to use unless you place it at some
distance.

~~~
RussianCow
> For example if you want to play games in native, it will require dual GPU at
> least which is a very pricey overkill otherwise. Current video cards don't
> cope with such resolution well yet.

Can't you just set the games to a lower resolution? Wouldn't that lower the
GPU load?

~~~
shmerl
You can, but on LCD monitors a non-native resolution usually degrades image
quality. And it basically means that in that scenario you aren't using what
you paid for. So it's something to consider when buying.

~~~
simoncion
The Windows version of ATI's Catalyst driver has an option that permits you to
use the GPU to scale less-than-native-resolution output to an attached LCD's
native resolution. It makes video games a little bit fuzzy, but works better
than the built-in LCD rescalers that I've seen.

------
bryanlarsen
If you're looking for a ~40" UHD monitor, you no longer have to buy a TV with
its attendant compromises. Check out the Philips BDM4065UC.

~~~
pmoriarty
How well is it supported on Linux?

~~~
jacquesm
HDMI 1.4 should drive it at 30 Hz.

~~~
pmoriarty
I think I'll wait, then. I need a minimum of 60 Hz for it to be bearable.

~~~
jacquesm
That works with that monitor, but only over DisplayPort. It's more of a
cabling thing than a Linux thing.

It also seems to support multiple input signals side by side or one above the
other (see the documentation):

[http://download.p4c.philips.com/files/b/bdm4065uc_61/bdm4065...](http://download.p4c.philips.com/files/b/bdm4065uc_61/bdm4065uc_61_dfu_eng.pdf)

That might be an alternative road to getting 60 Hz at full res.

------
PSeitz
I have the same monitor. 4K is cool, but the input lag makes it not so
efficient to work with. At least I couldn't get used to it.

------
nightcracker
> The myth I keep hearing is that you must go to larger fonts when scaling up
> to a 4K monitor. This is not exactly true. Do the math. If you double the
> screen resolution and at the same time you double the screen width, you have
> done absolutely nothing to the size of a pixel or the physical size of your
> fonts.

What the author seems to be missing here is that if you double your screen
size, you will likely put the screen at a greater viewing distance.

~~~
Someone
Even if you don't, pixels near the edge of a flat display will look smaller
because they are further away from your eyes.

For example, for a 20 inch diagonal monitor at 10 inches distance, the corner
pixels have about 0.7 times the angular extent of those in the center of the
screen, or about half the solid angle.

Replace that by a 40" diagonal monitor at 10", and that factor drops to about
1:sqrt(5), or 1:5 in solid angle.

So, I guess you will need larger fonts, _if_ you keep your eyes in the same
position on your larger monitor. There probably are publications on this in
studies on airplane ergonomics.
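The geometry above checks out. Assuming the eye sits on-axis at the screen center, the apparent size of a corner pixel relative to a center pixel falls off with the extra distance to the corner:

```python
import math

def corner_ratio(diagonal_in, distance_in):
    """Ratio of a corner pixel's angular extent to a center pixel's,
    for an eye on-axis at the center of the screen."""
    half_diag = diagonal_in / 2            # center-to-corner distance on the panel
    d_corner = math.hypot(distance_in, half_diag)  # eye-to-corner distance
    return distance_in / d_corner

r20 = corner_ratio(20, 10)  # 20" diagonal at 10": ~0.707 linear, ~0.5 in solid angle
r40 = corner_ratio(40, 10)  # 40" diagonal at 10": ~1/sqrt(5) linear, ~0.2 in solid angle
print(round(r20, 3), round(r20**2, 2))
print(round(r40, 3), round(r40**2, 2))
```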

------
vostrocity
No one has mentioned that the author of the piece likes to leave his monitors
on 24/7. I often see many monitors in offices left on overnight running a
screensaver. Is there a good reason for this, or is it simply ignorance?

------
eeZi
The real fun starts with high DPI/high resolution displays when you actually
have to scale up your desktop environment.

(which works reasonably well nowadays - at least with Gnome!)

~~~
kuschku
Gnome and KDE support it, but sadly there is still a lot of software that
doesn’t :/

~~~
Rondom
LWN had a nice article about the challenges of high-dpi displays and how
different OSes and DEs deal with them.
[https://lwn.net/Articles/619784/](https://lwn.net/Articles/619784/)

------
dustin999
"The myth I keep hearing is that you must go to larger fonts when scaling up
to a 4K monitor. This is not exactly true. Do the math. If you double the
screen resolution and at the same time you double the screen width, you have
done absolutely nothing to the size of a pixel or the physical size of your
fonts."

Yeah but it's a 39" monitor... On your desk! Seriously, I'm all for the
largest monitor and resolution and everything else, but there's a point where
I'd argue it's just too much. I think a 39" monitor on your desk is crossing
that line. I can't imagine the neck strain that's going to occur.

Instead of selling extended warranties, they need to start selling these with
chiropractic insurance.

Full disclosure: I just went through several days of research on 1440p vs 4k.
I went into it assuming I'd get a 4k monitor, but in the end, opted for the
1440p monitor because I refused to stick a 39" monitor on my desk, and the 28"
4k would require DPI scaling and all that mess.

I'd get a 4k for gaming, assuming I had a rig that could power games at that
resolution. Otherwise, I'm happy with my decision to get the Asus PB278Q 1440p
monitor.

~~~
pmoriarty
A 39" monitor is too big???

I'm impatiently awaiting the day when the norm is high-DPI monitors that
seamlessly extend to take up whole walls.. all my walls, to be precise.

Projectors are nice, but the resolution's not there yet. Not to mention the
poor color gamuts, poor dynamic range, and typical lack of 3D.

Basically, I won't be satisfied until I get the holodeck... or a neural jack.

39" monitors. psshhh

~~~
digi_owl
> Basically, I won't be satisfied until I get the holodeck... or a neural
> jack.

How about a Rift work space?

[https://www.youtube.com/watch?v=db-7J5OaSag](https://www.youtube.com/watch?v=db-7J5OaSag)

------
zaroth
I have a late-2010 Macbook Air with the nVidia 320M, which can handle
2560x1440 over the mini-DP. I have the Acer B326HUL which was $399 on Black
Friday. Seemed like a good compromise since driving 4K would mean a new
laptop.

Compared to the 28" 1920x1200 monitor it replaced, I'm very pleased with the
32" 1440p.

------
miduil
I wonder if a bigger screen would make that much of a difference for me. I've
been using a 1920x1200 24" screen and that was nice, but for the past 12
months I've been fine with my 1366x768 12.4" laptop. Though I'm using a
tiling window manager, which makes a difference compared to others, I guess.

(:

------
jrockway
So all that and it's only refreshing at 30Hz?

Here's my long post about the current state of the art:
[https://news.ycombinator.com/item?id=8549629](https://news.ycombinator.com/item?id=8549629)

------
netforay
Can someone tell me why it has to be 30Hz vs 60Hz? Why can it not be 40Hz or
50Hz?

------
ck2
How's that 30Hz refresh rate working out for you? I noticed there was no
mention of it.

You want to flash the firmware from the 50" version to fix it a little bit,
but without HDMI 2.0 it is always going to suck.

~~~
washadjeffmad
To explain this comment: flashing the 39" model to the 50" firmware enables a
true 1080p@120Hz display mode, as opposed to the stock firmware's frame
doubling.

2160p is limited to 30Hz, but that's hardly an issue for anything but games,
most of which are better played in other display modes anyway. Besides, it
doesn't make sense that someone who cared enough about high-res gaming to buy
a GPU beefy enough to run games at >30fps in 4K wouldn't just buy an
appropriate monitor as well.

~~~
thyselius
It's impossible to get used to the mouse pointer at 30Hz. I tried hard for
weeks but can't stand it.

~~~
washadjeffmad
If you still have the screen, the
ATSC_THTF_SY14343_ST2975C_CMI_V500dk1_P01_20140707 firmware improves the
apparent responsiveness of movements at 2160p. It's still 30Hz, but it feels
closer to 50. It includes other improvements as well.

