
Our Brave New World of 4K Displays - cocoflunchy
http://blog.codinghorror.com/our-brave-new-world-of-4k-displays/
======
nickjj
I wish the author had commented on whether he can view text comfortably at
1:1 scaling. 27" seems pretty small for that resolution, probably borderline.

As soon as you start playing with scaling you begin to lose the real estate of
the monitor. Sure, things will look smooth, but I wouldn't want to drop $700 on
a 4K monitor only to have to scale it. At 200% scaling you end up with the same
effective real estate as a 1080p monitor.
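
To put numbers on that trade-off, here's a quick sketch of the
effective-workspace arithmetic (my own illustration, not from the post):

    # Effective workspace of a 3840x2160 panel at common scale factors.
    # Sketch only: real OSes round differently and scale per-application.
    native_w, native_h = 3840, 2160
    for scale in (1.0, 1.25, 1.5, 2.0):
        w, h = int(native_w / scale), int(native_h / scale)
        print(f"{int(scale * 100)}%: {w}x{h} effective")
    # 200% -> 1920x1080, exactly the workspace of a 1080p monitor.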

In case anyone is wondering, Googling shows that this monitor is also SST,
which is a good thing. It means the display presents itself as one panel
instead of two tiles stitched together. A lot of 4K monitors still run MST
(the worse alternative to SST).

~~~
michaelt
I have a BDM4065UC [1], which is a 4K 40" display, and I scale things up almost
all the time. With the same number of pixels in even less area on a 27", I
don't see how you could avoid doing so.

I don't scale things as far as 200%, so I get more screen real estate, but if
you aren't willing to scale at all, you might be disappointed after upgrading
to 4k.

[1] [http://www.philips.co.uk/c-p/BDM4065UC_00/brilliance-led-bac...](http://www.philips.co.uk/c-p/BDM4065UC_00/brilliance-led-backlit-lcd-display)

~~~
joesmo
I use the same with no scaling. I just wish I had a lower desk.

------
vanderZwan
If we keep improving DPI, then just having an ever more beefy graphics card
isn't going to scale. People tend to forget that graphics cards are really
focused on rendering 3D games, and the vast majority of our daily computer
use doesn't involve that at all. Rendering 2D path graphics is much, _much_
more common.

We need hardware-accelerated path rendering if we want to render text at those
high resolutions fast and efficiently. Thankfully, there have been people
working on this for a while now, like Mark Kilgard at NVIDIA:

[http://www.slideshare.net/Mark_Kilgard/gtc-2014-nvidia-path-...](http://www.slideshare.net/Mark_Kilgard/gtc-2014-nvidia-path-rendering)

It's covered by patents though, so I'm not sure how open this particular
solution is.

~~~
moonchrome
Since when is desktop 2D graphics rendering a performance issue?

I understand the mobile story - they need to conserve energy and work on
low-end devices - but I doubt you're driving those 4K screens with ARM/Atom
CPUs, low-end integrated graphics chips and 2GB of RAM. Once you have those
giant screens you can fit a desktop machine somewhere as well, and those
aren't lacking the horsepower to draw 2D graphics.

~~~
vanderZwan
That 8K near the end of that post is _four_ times as many pixels to render as
4K. Pixel count is a surface area, so it scales quadratically with resolution.
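
The raw pixel counts, for reference (my own back-of-the-envelope arithmetic):

    # Each resolution step doubles both dimensions, quadrupling the pixels.
    resolutions = {"1080p": (1920, 1080), "4K": (3840, 2160), "8K": (7680, 4320)}
    for name, (w, h) in resolutions.items():
        print(f"{name}: {w * h:,} pixels")
    # 1080p:  2,073,600
    # 4K:     8,294,400  (4x 1080p)
    # 8K:    33,177,600  (4x 4K, 16x 1080p)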

On top of that, the more pixels you have, the more a high frame rate matters
to make animations look smooth. Sure, that might mostly just be scrolling on a
webpage, but people care about this stuff. It's also physically less tiring
for the eyes.

(Don't forget that mobile includes laptops too - I'd like a high-res screen
on my next laptop without the battery draining like crazy.)

~~~
jerf
Yes, doubling the screen resolution on each side raises the pixel count by
4x... but right now 3D graphics cards have at least a couple of orders of
magnitude more graphics power than they need to drive a conventional 2D
display, and 3 or 4 orders isn't out of the question, frankly. It really isn't
a problem in 2D. Compared to any modern 3D game it's still a walk in the park
for them. They've been overprovisioned for 2D work for a long time now. Even
non-3D cards were getting pretty powerful before it stopped being worth
shipping 2D-only cards, and that was a long time ago; graphics cards have
gotten scarily more powerful since then.

~~~
ethbro
I am not a graphics card hardware expert, but I thought the current tail end
of the pipeline (shaders) was basically a massively parallel pixel-handling
engine, and that the hardware for that has been getting more and more
generalized because of GPU compute.

Is it that inefficient to bend modern graphics card shaders to the task of
rendering obscene numbers of pixels without the front-end 3D geometry
calculations?

~~~
moonchrome
The problem is that GPUs are built for triangle rasterisation, and that part
is still baked into hardware for performance. You can hack stuff on top of the
3D pipeline to render vector graphics, but you need to transform geometry and
do triangulation, which isn't really efficient and can't be done efficiently
on the GPU. Couple that with the overhead of talking to the GPU through
drivers and you end up CPU-bound and not a lot faster than CPU rendering.
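
A rough sketch of that CPU-side prep work (my own illustration, assuming a
quadratic Bezier and a convex outline - real tessellators handle far nastier
cases):

    # Flatten a quadratic Bezier into line segments, then fan-triangulate:
    # the geometry work a CPU does before the GPU can rasterize a path.
    def bezier_point(p0, p1, p2, t):
        """Evaluate a quadratic Bezier at parameter t."""
        mt = 1 - t
        return (mt * mt * p0[0] + 2 * mt * t * p1[0] + t * t * p2[0],
                mt * mt * p0[1] + 2 * mt * t * p1[1] + t * t * p2[1])

    def flatten(p0, p1, p2, steps=16):
        """Approximate the curve with a polyline of `steps` segments."""
        return [bezier_point(p0, p1, p2, i / steps) for i in range(steps + 1)]

    def fan_triangulate(points):
        """Fan from the first vertex; only valid for convex outlines."""
        return [(points[0], points[i], points[i + 1])
                for i in range(1, len(points) - 1)]

    outline = flatten((0, 0), (50, 100), (100, 0))
    print(len(fan_triangulate(outline)), "triangles for one curved edge")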

The equation changes when you increase screen resolution: the cost of setting
up the GPU to render is determined by geometry, not resolution, and GPU
rasterization is practically free compared to the CPU, which gets slower as
you increase the resolution - so at a high enough resolution the driver
overhead might matter less.
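
As a toy model of that crossover (all constants made up purely for
illustration):

    # Pretend the GPU path costs a fixed setup fee while CPU rendering
    # scales with pixel count. The constants below are invented.
    GPU_FIXED_MS = 5.0        # assumed driver/setup overhead per frame
    CPU_NS_PER_PIXEL = 2.0    # assumed CPU cost to fill one pixel

    for name, (w, h) in [("1080p", (1920, 1080)), ("4K", (3840, 2160)),
                         ("8K", (7680, 4320))]:
        cpu_ms = w * h * CPU_NS_PER_PIXEL / 1e6
        print(f"{name}: CPU ~{cpu_ms:.1f}ms vs GPU ~{GPU_FIXED_MS}ms fixed")
    # With these numbers the fixed GPU cost loses at 1080p (~4.1ms CPU)
    # but wins at 4K (~16.6ms) and 8K (~66.4ms).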

------
neverminder
Correct me if I'm wrong, but DisplayPort 1.2 can only provide 4K@60Hz via MST,
which is something like simulating two vertical displays as one? I've heard
this approach causes problems with some apps/OSes. Why can't they just start
making displays and graphics cards with DisplayPort 1.3?

~~~
greggyb
DisplayPort 1.2 can handle 4K@60Hz without MST.

MST is, to my understanding, a cost saving on the panel manufacturer's side.

Driving a display over a line is a matter of bandwidth; an MST monitor and a
single-panel monitor both have the same number of pixels.
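
The bandwidth arithmetic backs this up (my own numbers; DP 1.2's effective
data rate is 17.28 Gbit/s after 8b/10b encoding):

    # Does uncompressed 4K@60Hz fit DisplayPort 1.2's effective data rate?
    w, h, fps, bpp = 3840, 2160, 60, 24        # 8 bits per RGB channel
    needed_gbps = w * h * fps * bpp / 1e9      # pixel data only, no blanking
    dp12_gbps = 17.28                          # 4 lanes of HBR2 after 8b/10b
    print(f"~{needed_gbps:.1f} Gbit/s needed of {dp12_gbps} available")
    # ~11.9 of 17.28 Gbit/s: it fits, which is why SST 4K@60Hz is possible.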

------
Theodores
I went for 4K recently; however, I went for real 4K, as in 4096 pixels across
rather than a mere 3840. So that is resolution dialed up 'to 11'. I also went
for 31" rather than anything larger, as I didn't want to 'crane my neck' to
see the top of the sea of pixels.

For me it is actually important to be able to see individual pixels, and this
I can do on my 4K mega-screen. For me, 31" was the 'sweet spot'; 27" or 24"
would just not be practical.

I also stayed with the one screen: I get 4 'HD laptop screens', which is what
I want, rather than the equivalent of 12 - I'm not sure I would be that
productive with a gazillion windows open.

What surprised me was how well different devices do. I plugged my Chromebook
into the HDMI input and it just worked. My Chromebook is the lowest of the
low, designed for an 8-year-old - I am not even sure it has things like a CPU
or memory - but it just works at 4K, very nicely.

Obviously I have a real PC with Linux to drive the screen; I did need to
create the 4096 modes as it would only do regular 3840 out of the box.
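
For anyone facing the same, something along these lines should work on X11 (a
sketch - the output name "DP-1" is a placeholder; check `xrandr -q` for
yours):

    # Create and attach a 4096x2160@60 mode using cvt + xrandr.
    import re
    import subprocess

    # `cvt` prints a Modeline with the timing numbers xrandr expects.
    out = subprocess.check_output(["cvt", "4096", "2160", "60"], text=True)
    name, timings = re.search(r'Modeline\s+"([^"]+)"\s+(.+)', out).groups()

    subprocess.run(["xrandr", "--newmode", name, *timings.split()], check=True)
    subprocess.run(["xrandr", "--addmode", "DP-1", name], check=True)
    subprocess.run(["xrandr", "--output", "DP-1", "--mode", name], check=True)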

One feature these monitors have is the ability to run more than one input,
with two inputs driving the screen side by side. So you can get your refresh
rates even with lame hardware, if you don't mind using two cables. (These can
go to the same graphics card or to two PCs.)

If you are a developer whose bottleneck is the speed of your own brain (rather
than the graphics card), then I highly recommend going 4K. Big is good, but
there is a limit to how big, which is tied to the vertical height - you don't
want to be staring at the ceiling.

------
brador
With the number of Amazon affiliate links this post has, how come it doesn't
get flagged as spam? Is it the author's celebrity?

~~~
ronjouch
> Is it the author's celebrity?

Kinda, yes. The author is Jeff Atwood, co-founder of Stack Overflow (you
probably don't need a link) and Discourse ([1], Ruby-based modern open forum
software). He's also a hardware "enthusiast", to say the least; posts where
he's all excited about his shiny new thingie are not uncommon on his blog. As
such (and because he's probably already full of $$$), I wouldn't say the
affiliate money is his primary motivation to blog this. I think he writes a
geek post _then_ puts in Amazon affiliate links because he can, but maybe I'm
naive.

[1] [http://www.discourse.org/](http://www.discourse.org/)

~~~
brador
Say we're looking at $500 per sale @ 10% Amazon commission = $50 a sale for
just the monitors/graphics cards he's pimping. But it's a 30-day cookie and
many here have Amazon Prime, so we'll call it $100.

Say the number 1 spot on HN gets him 10,000 views. 1% buy rate:

10,000 views * 0.01 buy rate * $100 commission per buyer = $10,000

Dude's expected to bank 10k from this single post.

Guess that explains why he doesn't write about USB sticks.

~~~
AlexeyBrin
_But it's a 30-day cookie and many here have Amazon Prime, so we'll call it
$100._

Last time I checked, an Amazon affiliate cookie lasts about 24 hours. Also,
honestly, what is your problem with his Amazon affiliate links? It's not like
you're forced to buy a monitor if you follow his link. I'm more interested in
what he has to say than in actually buying a display right now.

------
zitterbewegung
If you build a gaming computer with NVIDIA cards, isn't it basically
equivalent to an inexpensive deep learning setup?

------
bryanlarsen
27" is an annoying size for coding with multiple monitors, IMO. In landscape
mode you've got a lot of width but not enough height to fill your field of
view, but turning it vertical gives you way too much height and you crick your
neck.

The right height, IMO, is 22-24" in portrait, or 30" 16:10 or 32" 16:9 or
larger in landscape. If you're going to maximize your screen area, maximize in
both directions.
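
For concreteness, the physical sizes work out like this (my own geometry,
nothing from the comment):

    # Physical width/height in inches from diagonal and aspect ratio.
    from math import hypot

    def dimensions(diagonal, ar_w, ar_h):
        unit = diagonal / hypot(ar_w, ar_h)
        return ar_w * unit, ar_h * unit

    for label, d, arw, arh in [('27" 16:9 landscape', 27, 16, 9),
                               ('24" 16:9 portrait', 24, 9, 16),
                               ('30" 16:10 landscape', 30, 16, 10),
                               ('32" 16:9 landscape', 32, 16, 9)]:
        w, h = dimensions(d, arw, arh)
        print(f"{label}: {w:.1f}in wide x {h:.1f}in tall")
    # A 27" 16:9 panel is only ~13.2in tall; the suggested sizes range
    # from ~15.7in (32" landscape) to ~20.9in (24" portrait).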

~~~
lsaferite
This is why I totally prefer the Chromebook Pixel aspect ratio for coding. I
wish you could get 3:2, 240dpi+, 27-30" monitors. Being stuck with 16:9 for
anything other than watching a movie or show sucks.

------
PebblesHD
I'm curious for more in-depth detail about the differences between 1440p at
native scale and 4K at 2x scale: doesn't the loss of effective screen space
negate any gains in crispness? I've only recently upgraded from two 1080p
screens to a single 2560x1440 screen and I'm finding the extra space
incredibly useful. That being said, I absolutely love the fine text on my
Retina MacBook.
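
The density side of that comparison (my own arithmetic, both panels assumed
27"):

    # Pixels per inch for 1440p vs 4K at the same 27" diagonal.
    from math import hypot

    def ppi(w_px, h_px, diagonal_in):
        return hypot(w_px, h_px) / diagonal_in

    print(f'27" 1440p: {ppi(2560, 1440, 27):.0f} PPI')  # ~109 PPI
    print(f'27" 4K:    {ppi(3840, 2160, 27):.0f} PPI')  # ~163 PPI
    # At 2x scaling the 4K panel gives a 1920x1080 workspace - less room
    # than native 1440p - but every "point" is drawn with four pixels.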

------
TurboHaskal
I just don't see the point. I have a 2015 Retina MacBook Pro and a 1080p 24"
Dell monitor and can hardly tell any difference. It might have to do with the
scaling, though.

I wish they'd stop with the pixel race, or at least make it easy for the user
to opt out. 1080p is still pretty great for any mid-size laptop, and maybe
then I'd be able to run Mission Control at more than 5fps.

~~~
JustSomeNobody
Stop with the pixel race? Seriously? Monitors have been stagnant for a decade.
Look at how many laptop screens are STILL stuck at 1366x768. What pixel race
are you talking about?

~~~
backtoyoujim
I think manufacturers have been busy pushing small-screen pixel density for
tablets and phones, so there has been a race - just not at the forefront of
the large display or laptop markets.

~~~
JustSomeNobody
Agreed; however, the context wasn't tablets and phones.

~~~
backtoyoujim
It is not the context, but that context could help explain the lack of R&D in
those segments.

------
mistermann
Is there any fully featured software to handle chopping these up into 4
virtual monitors - or basically just highly efficient window management - so
one's not constantly fritzing around resizing things?

~~~
raisedbyninjas
Divvy?
[http://alternativeto.net/software/divvy/](http://alternativeto.net/software/divvy/)
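
On Linux, a few lines of scripting get you most of the way without a dedicated
tool (a sketch assuming the `wmctrl` CLI is installed and a 3840x2160 screen):

    # Snap the active window into one of four 1920x1080 quadrants.
    import subprocess
    import sys

    SCREEN_W, SCREEN_H = 3840, 2160  # assumed screen size
    HALF_W, HALF_H = SCREEN_W // 2, SCREEN_H // 2
    QUADRANTS = {"tl": (0, 0), "tr": (HALF_W, 0),
                 "bl": (0, HALF_H), "br": (HALF_W, HALF_H)}

    x, y = QUADRANTS[sys.argv[1]]  # usage: python snap.py tl
    geometry = f"0,{x},{y},{HALF_W},{HALF_H}"  # gravity,x,y,w,h
    subprocess.run(["wmctrl", "-r", ":ACTIVE:", "-e", geometry], check=True)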

------
alrayyes
Does anyone have experience coding on 21:9 screens like the Acer XR341CK?

~~~
brk
I have a 21:9 display, the LG 34UM95. Overall I like it, but I wish I had
gotten the curved version (I think). It's so wide that I find the edges a
little bit unusable for full-time stuff, but it's nice to keep notes and
things off to the sides.

I went this route because I didn't want a 3-monitor setup, and a 2-monitor
setup puts the bezels in your direct view, or one monitor off to the side. My
current setup is the 34UM95 with a 24" Apple Cinema Display (on a different
machine) directly above the 21:9 display.

The curved version has come down a bit in price; I may try swapping to that.

