
4K is for programmers - bhauer
http://tiamat.tsotech.com/4k-is-for-programmers
======
MetaCosm
> Today, you can still buy a 30-inch 2560x1600 display for over $1,000. Or you
> can get a 39-inch 3840x2160 display for $500. This choice is not even fair.

Yes, you can buy a wonderful, physically adjustable, mountable, color
calibrated, blissful contrast, delightful brightness control monitor with a
great matte finish like the Dell 3014... or you can buy an absolutely awful
30Hz, locked-in-place, blue-tinted 4K "TV" with horrible uniformity, terrible
brightness, awful contrast, and horrible black levels -- really, just horrific
to work on. I speak from first-hand experience: it is absolutely one of the
worst panels I have ever had the misfortune to stare at vim on.

The colors are so awful that some of my favorite themes (seoul256) were
ENTIRELY unusable: just a grey mush of nothingness, with bright spots and dark
spots to make reading code even harder. So I tried a black-background theme...
only to be horrified by the black level.

I agree with the author there is no comparison, obviously almost anything
beats the Seiki. Hell, most review sites won't even recommend it as a TV, let
alone a monitor.

~~~
leoc
Sad to say, for many programmers and other people who aren't AV types, the
step up in resolution - _plus_ the saving of over $500 - is likely worth the
dodgy colour and brightness. Many won't even really notice the badness. That
said, some of the $600-ish mystery-meat 2560×1600 displays apparently have
good display quality and so may be a stronger alternative to the Seiki.

~~~
MetaCosm
At $500, there are FAR better monitors. Bad contrast, bad brightness, bad
everything. If you buy your programmers this, buy them a year's supply of
pills to deal with the headaches it will induce.

Again, this isn't a "bad for picky art professionals" kind of bad. It's just
awful; I wouldn't even sell it to anyone I knew personally, because I knew it
would come back to bite me.

~~~
shiven
_At $500, there are FAR better monitors._

You keep repeating this as if it were some magic mantra. You have not defined
what "better" really means in a manner that can be quantified and compared
without introducing a user's subjective opinions.

Also, care to link to these "better" monitors, including sellers that sell
these for the $500 price, as well as non-subjective reviews?

~~~
f00_
Better monitors can very easily be objectively defined. Latency, refresh rate,
contrast ratio, brightness, and input options are all easily measured.

"Better" monitors are easily found through a simple search, and how do you
exactly determine a review to be "non-subjective"?

[http://www.displaylag.com/display-database/](http://www.displaylag.com/display-database/)
[http://thewirecutter.com/leaderboard/tvs/](http://thewirecutter.com/leaderboard/tvs/)
[http://thewirecutter.com/reviews/a-great-27-inch-lcd-monitor/](http://thewirecutter.com/reviews/a-great-27-inch-lcd-monitor/)

~~~
randallsquared
What latency is better than what brightness?

~~~
f00_
I can see where you're going with that, and I'll agree it's subjective as to
which suits your needs, but I think you can find TVs that are all-around
better.

------
XorNot
"Seiki is missing a golden opportunity to dominate the desktop display market
by removing the television tuner, speakers, and remote, and then reallocating
that budget to a 60Hz or better input (HDMI 2 and/or DisplayPort), a matte
screen surface, and instant-on DPMI support, all the while retaining the
market-wrecking price."

And while we're at it, I want a unicorn as well! Seriously, speakers, a remote
and a TV tuner are like, $5 in parts _max_, whereas everything he just listed
involves upgrading silicon fabrication processes and designing new ASICs.

~~~
bhauer
I'll take my unicorn in black, too.

I'm not a monitor manufacturer, just an opinionated blogger. But are you
telling me that a matte surface would cost more to manufacture than a glossy
surface? DPMI support costs a lot? Of my wishlist items, DisplayPort seems the
toughest to this layperson, but even that doesn't seem unreasonable.

In my earlier blog entry about the same monitor, I said I'd also be willing to
pay a tiny bit more. If I could get the "professional" version with the
attributes I listed above for $600, I'd do it in a heartbeat.

~~~
josephlord
> ...But are you telling me that a matte surface would cost more to
> manufacture than a glossy surface?

Ex TV product planner here - yes, matte costs more than glossy. If nothing
else there are the volume economics, but I think the production techniques for
glossy are intrinsically cheaper too.

~~~
bhauer
Thanks for posting this. It's good to finally have a rationale for why glossy
has dominated matte in recent years beyond the one I've heard over and over
("glossy looks better at first glance in a Best Buy"). If it's truly a matter
of price, I suppose it's fair that only professional monitors use matte
screens.

I for one would be willing to pay a modest premium for matte.

~~~
josephlord
It isn't the only reason: glossy's contrast ratio is higher too, so it's
needed to stay competitive on specifications. If you can control the lighting
so that it doesn't reflect toward you, glossy can be better, since matte will
scatter light in all directions. The intrinsic cost difference would be pretty
small without the economies of scale, I think. Matte would have to be
special-order panels, and you wouldn't be able to pick them up on the spot
market.

~~~
cma
"matte will scatter light in all directions."

Unless you are saying one pixel smears into another, doesn't scattering light
in all directions end up giving better viewing angles with good brightness
uniformity?

I always thought the reason a matte screen hurt contrast was because of
scattered light from the room raising the black level.

~~~
josephlord
Mostly I'm talking about ambient light, hence the worse blacks and lower
contrast ratios. It may also scatter outgoing light, reducing sharpness and
brightness a little (some may be scattered back into the panel), but that is
pretty minor.

------
r0h1n
The article lists the "ideal size" for monitors at 50 inches, and the one for
TVs at 100+ inches!

While I'm sure it's great to have such massive screen real estate (I wanna get
a 4K in 2014 too), I can't help but think this is the programmer equivalent of
consumers buying needlessly large TVs.

I work on a 24" desktop monitor and a 14" laptop screen. How is 50" the "ideal
size"? Even 39"? At what point does the law of diminishing returns kick in?
Sure, it's great to have 7 windows side by side, but is it that much of a
productivity jump from, say, 4 or 5 windows?

Personally and subjectively, I'd draw the line somewhere around the 28-32"
screen size.

To sum up, the core argument here seems to be that upgrading to a 39" or 50"
monitor is great because (a) it's cheap, and (b) more is always and infinitely
better.

~~~
sentenza
I don't really understand how a TV can be needlessly large. Also, I don't
understand why people still buy TVs in 2014.

When I replaced my TV about 4-5 years ago, I bought a 500€ beamer (projector)
and a 100€ retractable canvas screen. Sure, that beamer only does 1024x768,
but today you'll certainly be able to buy something significantly better in a
similar price range.

With my living room PC connected to the beamer, my "screen" is 1.7 meters
wide. I can watch movies and television, play games, and have internet and
Linux (rsync! ssh!), all of it significantly cheaper than any "decent"
television set.

Why do people (given they have the space) still buy televisions?

~~~
carlob
I did the same (but my screen is taller than yours is wide :P), but I think
there are a few reasons why this hasn't seen wider adoption.

People like to have their TV on all day long: with a projector you can't,
because it won't be very visible in daylight and it will burn through lamps
like they're popcorn.

Second, modern TVs usually have a tuner built in: you plug them in and they
work. Also, you don't need to adjust the image until it's rectangular.

You don't need to have a line of sight for a TV, chairs and people are not an
issue.

~~~
sentenza
You are right that, of course, not everybody has my usage pattern. Since my
daughter becomes a mindless, staring zombie whenever a TV is running in her
field of vision, we only watch TV late in the evening (for us, a lamp lasts
about two years).

Before I first set it up, I expected the configuration to be a lot more
tedious than it actually was. If you have the computer, some sort of sound
system and your TV receiver lying around, all you need is a switch for the
different audio sources and you can basically plug it all together. (In my
setup, TV does not go through the computer. We don't have Hulu, HBO or Netflix
over here.)

I'd say about 80% of the work was mounting the beamer to the ceiling and
getting the upper hand on the cable mess.

------
GigabyteCoin
This is quite the opinionated "article".

"4k isn't for couch potatoes", "you should buy a $500 monitor", "if you are
programming on something so old, [you suck]", etc...

I am fairly surprised to see this style of wording upvoted to the top of HN,
actually.

~~~
bhauer
I took this as a bit ironic because my writing style was tamer in years past
and I feel it has become more blunt (to use another euphemism) since I started
reading more Hacker News postings.

But yes, it's opinionated. I'm quite strongly of the opinion that in general
larger desktop displays increase productivity. They are finally arriving, and
at prices that are almost affordable. With several important caveats and
weaknesses that cannot be ignored, the market is finally moving in a direction
that I applaud. So this blog entry/rant is a form of celebration.

~~~
GigabyteCoin
Hey, no worries!

There is a time and place for everything.

I am not criticizing your writing style. I just found it odd to find one like
it at the top of HN is all.

~~~
xmlninja
Because this kind of writing style makes people with different opinions
frustrated, which leads to discussions and debates.

I actually like it. It reminds me of kenrockwell.com

~~~
hkmurakami
A debate without the accompanying upvotes would trigger the flamewar penalty
though.

------
fhd2
> At our office, we just equipped all of the programmers' workstations with
> Seiki 39" 4K televisions as monitors. [...] For the time being, there is no
> single higher-productivity display for a programmer.

Maybe it's just me, but I'm most productive on a 13" laptop. I can change my
workplace and position as I want, plus having only one app fit on the screen
actually helps me focus. Emacs/tmux/IDE users can do most programming work in
a single application anyway.

I've used a 27" monitor, and before that two 20" monitors, for years. They
were great for other activities, but when it came to programming, they only
distracted me.

~~~
NigelTufnel
It's not just you.

While reading the article I was thinking "wow, these guys are using 39"
monitors, while I'm perfectly happy with a single 17" monitor or a 13" laptop.
There must be something wrong with me."

Emacs is my screen space saviour. Every file or shell is 3 key presses away, I
just don't need extra monitors.

~~~
marcosdumay
Have you tried emacs in portrait mode on a bigger screen?

~~~
broken_symlink
I've tried it on a 24in monitor running at 1920x1200 in portrait mode. It's
great for editing text. I found that a good setup was to split the screen
vertically and have 2 windows, one on top of the other. It doesn't really work
well when splitting horizontally.

------
adriancooney
A 39-inch monitor? I'd feel pretty intimidated by such a beast at close
quarters. The extra real estate would be handy, but the constant glare from
all angles attacking my peripheral vision would be relentless. Nice to see
some 4K adoption, however.

~~~
grayrest
It's not that bad. I had a pair of 30" monitors in an open floor office and I
really liked the setup precisely because I wasn't distracted by watching
everybody walk around behind my monitors. The people who disliked it simply
used a dark background/editor scheme and kept the browser window towards the
center.

------
johndriscoll
This seems like flaunting your budget more than improving your environment.

Maybe I just need to get my eyes checked again, but for some reason my code
just doesn't perform any better or write itself any faster with more screen
real estate and I'm as productive on my laptop as I am on my big screen.

4K is not for programmers, code is for programmers.

~~~
nilkn
I use two 27" monitors at work and would definitely feel a productivity hit if
I were to go back to smaller or fewer monitors.

I dedicate one entire monitor just to a full-screen terminal with tmux running
inside. It's trivial this way for me to edit and view and compare multiple
files at once. I typically dedicate half of the second monitor to a browser,
for testing, and the other half to administrative applications, like email,
Jabber, and IRC.

The result is that the cost of context-switching is drastically reduced. I'm
able to maintain focus for much longer stretches of time because I don't
really have to ever completely switch contexts.

~~~
Sir_Cmpwn
A tiling window manager would blow your mind. Anyone whose productivity is
improved by using tmux should really try out something like i3 or xmonad or
awesomewm.

I use i3: [http://i3wm.org/](http://i3wm.org/)

My vim-like config:
[https://github.com/SirCmpwn/dotfiles/blob/master/i3/config](https://github.com/SirCmpwn/dotfiles/blob/master/i3/config)

~~~
BadassFractal
I use xmonad on 2 screens, it's amazing how much better it feels than without
a tiling manager.

------
nsxwolf
30Hz is a dealbreaker. I will keep my 1920x1200 until I can have a proper 4K
display. I really am surprised there are people who can tolerate that refresh
rate on a desktop.

~~~
silon3
"Proper" includes 16:10 or even better (15:10 would be great).

~~~
hatu
Honestly, I hate this new trend of making displays wider and wider. I feel
like 4:3, or something a little bit wider, is great for programming. Right now
my two widescreen displays waste space at the edges because it's just too far
to look at. Putting some of that width into height would improve the use of
screen real estate for work.

~~~
vacri
4:3 is 16:12. 16:10 is 'a little bit wider' :)

------
gjm11
Not only programmers.

ASIC designers. Analogue electronics engineers. Traders. Quantitative
analysts. Anyone whose work involves big complicated things that it's
difficult to hold all of in one's head: a large piece of software, the design
of a power station, the results of a finite element analysis.

For almost any sort of heavy brain-work, having _more pixels_ makes for more
productivity. So for almost any sort of well-paid heavy brain-work, employers
should be falling over themselves to provide lots of pixels. It is a
longstanding mystery to me why so many aren't.

------
qwerta
I've had 4K for years (4x 22" displays in pivot mode). It really improves
productivity on complex tasks such as debugging.

Soon I will upgrade to 3x Dell P2815Q for a combined 6480x3840 resolution.

~~~
pizza
That's the same number of pixels as 12 1080p monitors. Isn't there some level
of diminishing returns after about, say, half that resolution? Just curious.
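
A quick sanity check of the parent's arithmetic (all figures from the thread; the 6480x3840 total comes from three 3840x2160 panels in portrait):

```python
# Verify the claim: 3x Dell P2815Q (3840x2160 each) vs. 1080p monitors.
uhd_pixels = 3840 * 2160          # one 4K panel: 8,294,400 px
setup_pixels = 3 * uhd_pixels     # three panels side by side in portrait
fhd_pixels = 1920 * 1080          # one 1080p monitor: 2,073,600 px
print(setup_pixels / fhd_pixels)  # -> 12.0
```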

~~~
potatolicious
In terms of sheer workspace? Probably, but then you can dedicate the excess
pixels to increasing pixel density and making things easier to read.

After getting used to the high-DPI displays in tablets and phones, I have to
admit desktop text rendering has become very noticeably lacking.

~~~
XorNot
It's not just that - you can read _very_ small text on a high-DPI display. I'm
consistently amazed that I can basically read HN without zooming on my S4,
despite it being tiny compared to my desktop (but the same resolution
pixel-wise).

That's a big benefit for when you want to keep an eye on some debug output but
not work on it directly, or quickly parse big sections of text.

------
AshleysBrain
Heh. Lots of articles about "technology sucks" on a page that has broken
middle-click (so I can't open new tabs), no right-click to copy a link URL (in
the 'more' section), and a back button that breaks if the AJAX request
completes after you press back again...

------
pivo
I'd love to have a good 39" 4K monitor, but I really don't want any other
developer in my office to have one. For some reason, everyone else feels that
they should use the entire screen width for code.

So we end up with 200 character long lines on our 24" monitors. Can't imagine
the horror they'd produce with 39" monitors.

~~~
randallsquared
Not having an arbitrary length limit allows one to write code prose, instead
of being limited to code poetry... :)

------
leoc
4K isn't _only_ for programmers though, or even only for programmers and a few
other specialised technical or AV jobs. Bring the word of high-resolution
monitors (ideally through demonstration) to your friends and relatives who are
navigating tax law or writing papers about _Beowulf_ : 4K is for them too.
Doing any serious work or study in law, business or the humanities on a single
small, low-resolution computer screen is horrible: it's like being forced to
read only through a cardboard kitchen-roll tube. And the more people who are
demanding reasonably-priced high-res monitor screens, the more likely
manufacturers are to take note.

------
jmgrosen
Does anyone know if the 15" 2012 MBPR (that is, MacBookPro10,1) supports 4K
with its HDMI output under Mavericks? Of course, it will only be 30Hz, but
that's tolerable.

~~~
wyager
I've read officially no, but Apple frequently gives "official" specs that
claim a feature isn't supported when really it is.

For example, every MacBook Pro in the last 5 or 6 years that had replaceable
RAM had an actual RAM limit double what Apple said it was. I wouldn't be
surprised if 4K works, albeit perhaps a little sluggishly.

~~~
polymatter
Why would Apple do this? What possible reason could they have for selling
their own product short?

~~~
xutopia
Reducing the number of choices in front of people allows them to feel more
confident about the one they make.

If you give them options for 4GB, 8GB, 16GB and 32GB instead of limiting it to
16GB, lots of people will freeze and never make a choice, or will get
frustrated when they see the markup for 32GB and avoid buying altogether.

In limiting the amount of upgradeable RAM on their sales page, they show you
the choices that are most likely going to be easy to sell.

------
jheriko
how is this better than multiple monitors? the support in Windows (and at
least Ubuntu Linux) for e.g. maximising a window to fill a single monitor is
excellent.

The other thing: any article like this where Windows is not used for testing
makes me skeptical. Most programmers do not work in offices developing on Macs
or Linux machines, by a very large margin. Working on a Mac for iOS and OS X
development is something we grudgingly do because Apple enforces it... in fact
you could say that all programmers use Windows and be very close to accurate.

I've never used multiple monitors on a Mac, so I can't comment - but I can
imagine the support being just as good as in Windows or Linux.

~~~
knightni
An extremely large number of programmers use Mac and Linux machines. Linux on
the desktop is sustained to a substantial extent by a programmer audience -
and you only have to look at the availability of software for it to see that
it's popular among that group.

Personally, when I'm not using a unix-y environment, it's because I've been
forced. Using windows as a programmer feels like having one of my hands
chopped off.

~~~
jheriko
sure, it is a large number, but large != most.

every programmer i know works in a windows dominated environment. this is a
small sample though.

i believe that in back-end-only environments linux has some serious
popularity, but i can't comment on how many studios use it as their
development platform because i have no great experience there (actually i work
in an office where they develop such things, but they use windows
exclusively).

for desktop software, games and web front-end development, windows is king.
it's the platform that, to an excellent approximation, all of your target
audience are using. for AAA games there is an extra restriction: it's the only
practical choice (there are zero tools for working on any other platform).
sure, for iOS/OS X there is the same restriction to a mac, and iOS development
has become extremely popular - even so, there is quite some resistance to
using OS X and Xcode as anything more than a test configuration on
cross-platform projects - I've seen a lot of macs dual-booting to windows in
this context. for android, linux can give you a small edge, but it's not
much...

Linux is obviously meant for programmers, but that doesn't mean they use it
either - until quite recently even the best flavours were horrible user
experiences. i think ubuntu is great and really takes steps to get away from
the 'download source and build your app with archaic and buggy tools at the
command line' approach which has always dominated... in terms of doing your
job, though - unless you work on the backend of some web service, you can't do
any useful builds under Linux, unless you are targeting the Linux desktop,
which is exceptionally rare.

~~~
knightni
I don't dispute that windows developers are in the majority - I dispute that
other choices are as vanishingly rare as your initial comment suggested.

Anecdotally (I work at a large corp), slightly under half our devs use Linux.
You only need to look at the array of software-for-programmers that's
available under Linux to see that it's an extremely popular choice - the
availability of programming software and libraries is generally better under
Linux than it is under Windows.

While Linux is certainly a bit more hassle to maintain than Windows (I use a
Mac when given the choice, which I find gives me the best of both worlds),
it's still a clearly superior environment for highly technical users. It's
only recently, with the introduction of PowerShell, that Windows has stopped
being a substantial handicap for developers, imo.

------
npsimons
Flagged, for two reasons:

- Requiring JS for text. Okay, I'll turn it on this once, since you claim
"nothing dodgy."

- Disabling my control-click to open in a background tab, especially after
claiming "nothing dodgy"? You shouldn't be on the front page.

~~~
bhauer
Well, at least you posted why. Incidentally, I got a lot of flak for the
animation and for loading posts via AJAX and the history API rather than as
traditional pages. Lesson learned: a blog isn't really the place to experiment
with my meager UX skills. So I'll probably simplify it when I get some free
time.

------
Fomite
What about couch potato programmers? That strikes me as a pretty sizable
market segment.

------
mproud
Whatever. It’s not the number of pixels, it’s the pixel density that’s
important. Higher density is what is going to make text and pictures better.
At 39″, that’s 113 ppi. Anyone can rig up a multi-monitor display setup with
higher ppi. The only advancement is one physical screen that a good number of
GPUs can’t take advantage of.

For comparison, my Retina 13″ MacBook Pro has 2560x1600, 2.5K (or really ⅔ of
4K), but at 227 ppi.

~~~
harpastum
The real factor is pixels per degree of your vision, not ppi. Viewing distance
range with phones and laptops is pretty limited (an arm's length), but with a
monitor you can simply move it farther away on the desk to increase the
perceived pixel density.

Of course, moving the monitor farther away also makes the perceived monitor
size smaller. Since PPI and distance can be traded one for the other, what
ends up mattering is the total resolution. 4K is a lot of pixels.
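
The trade described above can be put in numbers. A back-of-the-envelope sketch in Python (the Seiki figures are from the article; the 13.3″ diagonal and 2560×1600 panel for the Retina 13″ are my assumptions; the half-degree formula is standard small-angle geometry):

```python
import math

def ppi(h_px, v_px, diagonal_in):
    """Linear pixel density from resolution and diagonal size (inches)."""
    return math.hypot(h_px, v_px) / diagonal_in

def pixels_per_degree(ppi_val, distance_in):
    """Pixels subtended by one degree of vision at a viewing distance."""
    return ppi_val * distance_in * 2 * math.tan(math.radians(0.5))

seiki = ppi(3840, 2160, 39)      # ~113 ppi
retina = ppi(2560, 1600, 13.3)   # ~227 ppi

# Pushed back to ~2 feet, the coarse 39" panel claws back a good chunk of
# the angular density a Retina panel has at arm's length (~18"):
print(round(pixels_per_degree(seiki, 24)))   # ~47 px/degree
print(round(pixels_per_degree(retina, 18)))  # ~71 px/degree
```

Doubling the viewing distance doubles the perceived density, which is exactly the PPI-for-distance trade the comment describes.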

------
slr555
The classification of this as a television reveals the schizoid nature of
display development. For programmers, the main requirement seems to be lots of
pixels without undue eyestrain. Where things start to get crazy is that 4K is
now the standard for movie projection in commercial theatres, and creative
professionals of all stripes are clamoring to produce content that takes
advantage not only of the pixels but of all the qualitative properties that
photographers, illustrators, and animators need.

For me as a photographer, having a 4K camera is a wonderful thought in terms
of resolution, but if the display can't integrate into my workflow to match my
printer then it is not so useful. It is this referenceability issue that makes
it possible for Sony to sell $10,000 reference monitors for television
production. I can only hope for the day that I can buy a $500 4K monitor for
creative professionals.

------
mentos
> Several colleagues found the display shockingly bright and were frustrated
> that the brightness adjustment did not actually reduce the backlight
> intensity.

In 2007 I had the idea to use my 28" flatscreen TV as a monitor; it took only
three days before I had terrible eye strain/migraines/light sensitivity.

~~~
lukeschlather
Yeah, I have a beautiful little ASUS netbook (the newer ones that ship with
Ubuntu) and its brightness ranges from reasonable to just barely on. It's
fantastic being able to tune the brightness to the ambient light in the room
(as well as save on battery life.)

I really don't care about 4K, I want a monitor that's got a wide range of
brightness levels with the same integration with my Ubuntu workstation that
that netbook has with its integrated screen. (I plug it in and the OS
immediately knows how to communicate with the monitor to adjust the brightness
based on a keystroke. )

------
onion2k
Does the alleged productivity boost come from the increased screen real
estate, or from the feeling of value that comes from your employer investing
in you? I had a distinct boost when I went from a 17" Dell laptop to a 13"
MacBook Air.

It'd be interesting to see whether any increase lasts.

------
dieg0
I agree that higher-resolution displays are a productivity enhancer, but I
also believe that saving money by purchasing a television as a monitor is not
a good idea… taking care of your eyes is worth every extra dollar.

Just recently I went through the same issue: I had to buy a new monitor for my
workstation and finally decided to go with a 2K Apple Thunderbolt display. I
also considered the Samsung alternative, but it doesn't have some of the
features Apple added, such as automatic brightness control and additional
ports.

My decision was based on taking care of my vision: TVs simply aren't built to
be viewed from desktop distance. They are just too bright… they were designed
to be used from a distance.

------
hatu
My two 23" screens are already too big for me; I could almost cut half off
each. They're nice IPS panels with minimal glare and really easy on the eyes.
Putting a cheap 39" TV in front of my nose 8 hours a day sounds ridiculous.

------
NDizzle
I don't see how anyone could use a 30Hz panel for a productive length of time.
At 40Hz, which is what my laptop drops to in power-save mode, even something
as simple as scrolling a web page looks like a torn, jagged mess.

------
brownbat
> This blog uses a little JavaScript. Nothing dodgy, though, and nothing
> hosted at third-party sites. Just some jQuery and animation bits. So please,
> if you'd be so kind, ask Noscript to call off the hounds. ... That said, if
> you insist on leaving script off, you can just scroll down a bit to read the
> content.

This site is my hero. It writes its "no JavaScript detected" message as if
blocking JavaScript is perfectly understandable (even if you secretly believe
we're all hatted in tin). You know your audience; what does it hurt to throw
them a bone?

~~~
brownbat
(That said, I don't block JavaScript to keep content providers from doing
dodgy things so much as to avoid the fallout when someone breaks into a
server, or slips something into a page in transit.)

------
catmanjan
Very tempting! Unfortunately I like to have my monitors close to my face, so I
think the large size of the television may cause viewing angle problems.

~~~
erikig
I thought the same thing, but now that I have the Seiki it is very similar to
having 4 x 20" 1080p monitors. The viewing angle is decent as long as you
place the monitor about 2-3 feet away.

------
dzhiurgis
An article about 4K monitors, with 300-pixel photos. It's just not right.

------
haar
I feel like I must be in the minority here: I enjoy multiple (smaller, 28"
max, sub-24" preferable) displays.

I run iTerm2 in full screen with Vim split alongside 2 or 3 console panes; I
find the separation from the rest of my OS a blessing. The second screen I use
for Chrome, HipChat, Flint, Sparrow, etc. I still notice chat notifications,
and keeping an eye on my dock (which displays notification counts across all
my programs) is a simple key combination away if I'm in 'the zone' and don't
wish to break out of my programming environment (iTerm2).

I've tried larger screens in the past, and found myself physically looking
around the screen a lot more, simply due to having all my content splatted
together.

Combine this with a keyboard-based window manager and 1/2- and 1/4-screen
segmenting, and I feel much more organised and efficient than when trying to
manage a 30"+ monster of a screen.

------
nbevans
I don't think I'd want to work somewhere where it appears to be compulsory to
use a ridiculously cheap $500 39-inch UHD display. How ridiculous! 28-30
inches is plenty big enough for a desktop computer.

The intentions are good. But seriously, why not just wait a few weeks for the
Dell 4K monitor that's priced well under $1000?

------
natch
Anyone know, is a Windows machine required for the step of applying the
mentioned firmware fix to this monitor?

~~~
bhauer
Nope! The monitor "boots" and patches from a file on a USB key. It's fairly
painless.

~~~
natch
Sweet, thanks!

------
ciofeca
A few weeks ago I got a brand new 27" 1920x1080 Asus display for some 200
bucks. Once at home, I realized it was a TV set without the TV receiver. This
is the worst thing a programmer may ever buy.

A programmer's monitor should have not only a fairly decent contrast, but also
the darkest black possible. This thing, even at the lowest
brightness/contrast/colours settings, outputs a lot of light (and its black is
just a gray), making it unsuitable for programming with any color scheme.

I switched back to the old 19" 1440x900 Samsung monitor bought some seven
years ago, which has a true black. I'll pop in a mediacenter box and throw the
27" in the dining room.

Before buying a monitor, you should check its "blackest black" in low-light
conditions, using a full-screen black image.

And -yes- that tsotech article just sucks.

~~~
bhauer
> _And -yes- that tsotech article just sucks._

It's just a blog entry. I'm not a journalist.

But thanks for the criticism. It's clear you and I just disagree about
displays.

------
avighnay
It is definitely a productivity booster for devs. However, if your software's
audience includes low-res users (enterprises, accessibility, etc.), ensure
that design/layout testing for low-res usability does not become an
afterthought. We have had this oversight on our team a few times.

------
maaarghk
My problem is that I have really bad eyesight, and I don't want to program all
day on a monitor with pixel density like that. I have a laptop with a high
pixel density (1080p at 13 inches), and it is just not ideal: I routinely zoom
to 150% or 200% on websites.

~~~
jkscm
High pixel density should, ideally, have no influence on text size, only on
the sharpness of the text.

You should increase the DPI setting in your OS.

------
pedalpete
I don't really understand monitor specs. Is the 30Hz limit due to the size of
the monitor, or is it a GPU limitation on most PCs?

I'm wondering if a 4K 36" monitor would have a better limit, 40Hz or
something?

Also, my laptop is a few years old with Intel HD integrated graphics. Any idea
if it would have enough power to run one of these?

What about a Chromebook?

I'm trying to get an idea of just how much power these things need.

~~~
lstamour
The panel is capable of 120Hz on both the 39" and 55", but the 39" is further
restricted to 60Hz at all resolutions. The 30Hz limit at 4K is the HDMI
connection. The TV does not have DisplayPort, though HDMI 2 is hoped for in a
future firmware update (at least based on Sony's announcement of such a
firmware update for their sets).
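
For the curious, the 30Hz ceiling falls out of back-of-envelope bandwidth
math (a rough sketch; the 8.16 Gbit/s figure is HDMI 1.4's usable video data
rate after 8b/10b encoding, and blanking intervals are ignored):

```python
# Rough check of why HDMI 1.4 caps 3840x2160 at 30Hz.
HDMI_1_4_GBPS = 8.16  # usable video data rate, Gbit/s

def active_video_gbps(width, height, hz, bits_per_pixel=24):
    """Bandwidth for the active pixels only (blanking intervals ignored)."""
    return width * height * hz * bits_per_pixel / 1e9

print(active_video_gbps(3840, 2160, 30))  # ~5.97 Gbit/s: fits
print(active_video_gbps(3840, 2160, 60))  # ~11.94 Gbit/s: needs HDMI 2 or DisplayPort
```

Real links need somewhat more than the active-pixel figure because of
blanking, but the doubling from 30Hz to 60Hz clearly blows past the HDMI 1.4
budget either way.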

~~~
houkouonchi
If you flash the 39 inch with the firmware from the 50 inch, it can do 120Hz
at all resolutions 1920x1080 and below. I even run it at 240Hz @ 720p, even
though it drops half the frames, to reduce the input lag from around 9ms to
4.5ms.

~~~
lstamour
Sounds too good to be true. 30 fps mouse cursors were starting to drive me
nuts. I'll have to try this soon. I wonder, if/when Seiki comes out with an
HDMI 2 version of the TV, whether we can reuse the firmware here? Edit: I see
that Seiki announced new 4K TVs with HDMI 2 at CES, but only 50"+, no 39". I
can only hope the new firmware has clues for getting HDMI 2 running on my 39"
eventually.

------
lnanek2
Honestly, I've long used a pair of 1080p 37" TVs as my preferred monitors. One
landscape, one portrait. I place them far away so my eyes don't have to work
as hard to focus. Fonts are big for the same reason. I don't need 4K
whatsoever. It would just make things too tiny to read, given the current poor
support OSes have for DPI-independent rendering.

------
vonseel
Well, this is just about the opposite of what many people have been moving
towards for the past decade (smaller, more portable, lighter).

I do think a single large monitor would be nice. I'm using dual 24's with a
13" retina display centered below and find it difficult to make smart use of
the screen space because I have to turn my head constantly.

------
Buge
Part of the problem might be gamers not having powerful enough machines and
not wanting a blurry image from scaling.

~~~
catmanjan
I think PC gamers would find the 30Hz much more of a disincentive. Most
console gamers are used to 24 fps anyway, but consoles don't even have the
option to reach that resolution (yet?).

~~~
ssully
Yup, 30Hz would be a no-go. And I am pretty sure Sony has said the PS4 is able
to play video at 4K but has no intention of running games at that resolution.
Hell, games for both systems aren't even 1080p across the board.

------
mhd
Does anyone know where you can get something like this at a similar price
point in Germany? The only listing I could find for the exact same TV was 1200
Euros (about $1600). Not even eBay's international sellers seem to have it,
and those proved pretty useful when it came to the Korean high-density IPS
monitors...

------
mathattack
Yes indeed. And programmers should get multiple screens too. This will only
get better over time.

------
Istof
For movies, last time I checked (about a year ago), it was still difficult to
get a Blu-ray that was real 1080p (they are usually DVDs upconverted to
Blu-ray, and the quality is not any better)... I wonder how many years it will
take for 4K content to become widespread...

~~~
sosborn
Not my experience at all, although I could see that being true for older
movies.

------
m0skit0
> _but if you are programming on something so old, you should first contend
> with that._

So for programming Android apps I need a top-of-the-line GPU? That's the most
ridiculous thing I've ever heard. Not even bothering to read the rest.

------
taternuts
I had to bring my own shitty 19 inch monitor to work just to have two :/

------
anigbrowl
No, 4K is for film editors and colorists. These are the people who think about
resolution, color, persistence of vision and so forth for a living, and who
ultimately guide the standards that result in quality monitors. Whingeing
about HD as if it had hindered some programmer-driven monitor nirvana is
laughable; if the needs of programmers were the guiding light of display
technology, we wouldn't be using GUIs, because so many programmers prefer a
command-line environment.

 _I want a 50-inch desktop display with north of 10,000 horizontal pixels._

Big whoop. I want an interactive holodesk and a bidirectional wireless neural
interface too. I also want dynamically generated, domain-specific function
graphing so I don't have to shoehorn my ideas into a one-dimensional text
stream.

------
cma
I'd say 4K monitors up to about 42in are for programmers; between that and
about 8 feet, 4K is a gimmick. 4K projectors are for couch potatoes. 4K VR
headsets will cover everyone.

~~~
bhauer
Of these, I feel the least certain about VR headsets. I've not yet tried one
so I remain skeptical. But I agree, if a terrific VR headset hits the market,
that would be compelling.

------
Kiro
They say that having a 120Hz monitor greatly improves the desktop experience
(it's not just for games), so wouldn't 30Hz be a real downgrade if you're used
to 60Hz?
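
For a sense of scale, the refresh rates in question translate to frame times
like so (simple arithmetic, nothing monitor-specific):

```python
# Frame time (the gap between screen updates) at common refresh rates.
for hz in (30, 60, 120):
    print(f"{hz:>3} Hz -> {1000 / hz:5.2f} ms per frame")
# 30Hz waits ~33ms between updates, double the ~17ms at 60Hz,
# which is why cursor movement and scrolling feel noticeably choppier.
```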

~~~
seabrookmx
Honestly, it depends on the person.

At work I have an old 75Hz 1024 x 1280 panel (portrait) beside a newer 60Hz
1200p panel. If you look closely, you can tell the old panel is smoother. But
it's just that: you have to look closely. It definitely isn't a jarring
difference. IMO I wouldn't get much more out of a 120Hz panel just for
programming and daily use.

As for gaming, I know the extra framerate definitely smooths out the
experience in FPS games. I'd take a 1440p IPS panel over a 1080p TN (120Hz)
any day, but I know others who say the exact opposite. It's totally
subjective.

------
transfire
Just wait 12 months. Better 4K options are on the horizon.

------
disbelief
This monitor is listed as $999 on Amazon, following the very link he posted in
the article. Is there somewhere else you can buy it for $500?

~~~
ryanfreeborn
He's talking about the 39 inch, which is $500. When you go to Amazon, it
defaults to displaying the 50 inch. Click the 39 inch button to the right of
the product image and you'll see the $500 price.

------
2gg2rg
Would I get the full resolution using the DisplayPort out on my laptop and a
cable to convert that to the HDMI on the TV?

~~~
erikig
I have been using the Seiki 39" for about 4 months now and I agree with a lot
of what @bhauer has to say.

There are a couple of caveats as a developer:

1. You end up writing really long lines of code that might upset other
developers.

2. I'm on my second Seiki; my first one died and it took a month to receive a
replacement.

------
mrfusion
Does anyone know if a macbook pro (1 year old) can support 4K monitors?

~~~
GuiA
Only the late 2013 Retinas do.

~~~
houkouonchi
Not true. The older 2012 rMBP will do it fine too on newer versions of OS X or
Linux/Windows, and that is just over the HDMI port. With an active DP 1.1 ->
HDMI 1.4 adapter, even very old Macs that have Thunderbolt will run the
display just fine.

------
airtonix
Alt+tabbing? Wut?

Synergy + 17" laptop as server + desktop with 2x 24" screens on desk arms...

Someone explain why a single large screen is better for a programmer, when
Synergy is dead easy to set up.

~~~
MetaCosm
Some people loathe the separation between screens (I am one of those people),
so I have a single nice 30".

------
goggles99
Who would disagree that more pixels are better? I DO disagree that a 39"
glossy monitor from a garbage brand, one that only runs at 30Hz at full res,
is a good option right now. Too big, too glossy, and every Seiki monitor that
I have come across has had problems within the first two years (everything
from displaying all green to catching on fire).

