4K is for programmers (tsotech.com)
449 points by bhauer on Jan 10, 2014 | 335 comments



> Today, you can still buy a 30-inch 2560x1600 display for over $1,000. Or you can get a 39-inch 3840x2160 display for $500. This choice is not even fair.

Yes, you can buy a wonderful, physically adjustable, mountable, color-calibrated, blissful-contrast, delightful-brightness monitor with a great matte finish like the Dell 3014... or you can buy an absolutely awful 30Hz, locked-in-place, blue-tinted, horrible-uniformity, terrible-brightness, awful-contrast 4K "TV" with horrible black levels that is really just horrific to work on. I speak from firsthand experience; it is absolutely one of the worst panels I have ever had the misfortune to stare at vim on.

The colors are so awful that literally some of my favorite themes (seoul256) were ENTIRELY unusable, just a grey mush of nothingness, with bright spots and dark spots to make reading code even harder. So, I tried a black background theme... only to be horrified by the black level.

I agree with the author that there is no comparison -- obviously almost anything beats the Seiki. Hell, most review sites won't even recommend it as a TV, let alone a monitor.


Thank you for the reality-check. I really do appreciate it and hope anyone reading my words reads yours as counterpoint.

I had tried to be even-handed in my judgment of the Seiki, but I admit that my desire for large form-factor displays, coupled with Seiki delivering one that I can actually afford, clouds my judgment some. I want high-quality large displays and the incumbent manufacturers have left me in the cold for a decade.

Not that it matters, but one area where I spoil myself is in monitors. I spend what I can afford (and then some, admittedly) on monitors in lieu of other luxuries. I own five 30" monitors, three of which are Dell U3011s. I love those monitors. As you point out, they ship from the factory with great color, a matte screen, and of course inputs that don't disappoint.

But on the flip-side, the 30" "professional" LCD monitor as exemplified by the Dell U3011 and U3014 has not decreased in price substantially since I picked up my first Dell 3007WFP, what was it, eight years ago?

Maybe it's telling that I have not yet replaced one of my home Dell monitors with a Seiki. But when my wife's Dell 30" died, we decided to give the Seiki a try. As I've said, the Seiki has several downsides. The 30Hz lag is foremost. It's noticeable and frustrating. But the screen size and delight of having so many pixels in a single display is so compelling. Also, after applying the Seiki firmware update, the backlight can be adjusted, allowing for better calibration.

It's not a perfect monitor. It's not even a great monitor. It's a good monitor and it does the thing I value most: give me pixels with which to see and interact with information quickly. As a programmer on a budget, it's my favorite monitor currently available. I really hope that competition heats up and better monitors are on the horizon. I will bite at opportunities to get large, high-DPI, high-quality, high-speed displays.

But for now, I personally would not buy the Dell 30" unless they severely cut the price. I'm keeping my eye on that because I am hopeful the new competition could start moving the higher-end prices downward, and that will make me happy.

(Incidentally, the U3014 is presently $1,600 at Newegg [1]. I could barely stomach the ~$1,000 price of 30-inch monitors eight years ago. In 2014, I can't bring myself to spend $1,600 for 2560x1600.)

[1] http://www.newegg.com/Product/Product.aspx?Item=9SIA25V0U169...


First of all, you can get the U3014 for around $1100, less on sale. http://www.amazon.com/gp/offer-listing/B00C2RPW8O/ref=dp_olp... ... additionally, it is a "professional" monitor (X-Rite tie in, factory calibrated, etc) and will likely maintain a premium price as corporations are the target customers.

Secondly, the Seiki is not a "good" monitor. It isn't even a "good" TV. The only thing Seiki has is pixel density and cheapness, it is awful in every other measurable way, without a doubt one of the worst panels I have ever purchased. I can't fathom how you can justify calling it "good". Excluding resolution -- can you even name a single thing you like about it, that you think is a cut above?

You can get a number of wonderful 27" (2560x1440+) monitors at that same price point. With good inputs, colors, refresh rates, adjustability, brightness, yada, yada. They go as cheap as $350 now, but you can get GREAT ones (really great, not just pixel density) for around $500.

Questions: (1) How many hours do you spend really working on the Seiki... not your wife, not on the phone, not at home on your wonderful Dell monitors, not walking around, not just in the office -- how many legitimate hours do you spend USING the Seiki? (2) Why do you choose for your developers? Why not give them a budget of X to buy whatever they want?


> Secondly, the Seiki is not a "good" monitor. It isn't even a "good" TV. The only thing Seiki has is pixel density and cheapness, it is awful in every other measurable way, without a doubt one of the worst panels I have ever purchased. I can't fathom how you can justify calling it "good". Excluding resolution -- can you even name a single thing you like about it, that you think is a cut above?

Do you have an ax to grind, or is this just typical HN pedantry?

The reviews on Amazon for this display are a solid 4.0 from 200+ reviews -- hardly the shite quality you seem to be suggesting. Given that the thesis of the article is that this would be great for programmers (i.e. not designers or the color obsessed), the fervor with which you're arguing the point seems misplaced.


The man used two physical calibration devices on the monitor. I think that means he values color reproduction much higher than the average buyer.


CNet compared the 50-inch model to a 50-inch 720p plasma, and came to the conclusion that the plasma was a better TV. The review was more focused on using it as a television, but it appears the product is low quality. http://reviews.cnet.com/8301-33199_7-57599449-221/budget-tv-...


Good to hear that the Dell prices have not actually increased to $1,600. I was a bit shocked by that when I searched at Newegg. Still, it's exceedingly frustrating that the price of a 30" professional monitor has been locked at $1,100 since they were introduced in 2005 (or was it even earlier?).

I suppose you and I just disagree on what is good and what is not. I've personally used the Seiki for the past few days. We just got them this week. I value the quantity of pixels and price a little more than color accuracy, balance, and uniformity. I want even better specs. I want 60Hz most of all. But given the balance of priorities, I personally think the scale tips toward size and resolution to maximize my own productivity. 60 Hz though, I'll take it in a heartbeat.

The blog entry is just my opinion and I understand your opinion is strongly opposed. That's fine.

Answers: (1) A total of about 6 hours a day for the past few days. They just arrived. Key quality of life improvements have been using the firmware patch to allow the backlight to be dimmed and using dark color schemes. Most of us were using dark color schemes already since we prefer those at any resolution. (2) I'm not going to bother answering this in detail. From the reactions I've seen, we're very happy with the upgrade.

Seiki has a lot of room for improvement. I take it your opinion is that they have a huge amount of improving to do. I really want the whole desktop display market to improve dramatically. It has been stagnant for so long that I really treasure these recent developments. As much as I acknowledge this display's weaknesses, I see it as a harbinger of some long overdue innovation. For that alone I am very happy.


I'd like to chime in as another Seiki user. I've been working on projects for 3-4 months with the Seiki and I end up using it 12-14 hrs on many days. I run it at full 2160p but have increased my Win8 scaling to 120%.


This is my problem. Pixel density is great if you have 20/20 vision.

I'm really happy with 1080p on 27" IPS (LG). People would probably think I'm mad. Just as I would consider myself mad looking at a blueish display all day long.


> I'm not going to bother answering this in detail. From the reactions I've seen, we're very happy with the upgrade.

Detail? This answer has cost you more words than the obvious two picks: "We made a vote", "I made a gamble"

?


> You can get a number of wonderful 27" (2560x1440+) monitors at that same price point. With good inputs, colors, refresh rates, adjustability, brightness, yada, yada. They go as cheap as $350 now, but you can get GREAT ones (really great, not just pixel density) for around $500.

Where can I find these?


... Everywhere? There are literally 41 under $500 at 2560x1440 or 2560x1600 on Newegg right now. The QNIX was already linked in this thread by someone else (http://www.newegg.com/Product/Product.aspx?Item=9SIA2RY0X598...)


I was looking for your opinions on some great options, not trying to pick a fight.


Then why didn't you say that?


How is that "trying to pick a fight"?


> They go as cheap as $350 now, but you can get GREAT ones (really great, not just pixel density) for around $500.

Are there any for $500 that are better than the $300 ones?


I agree the colors are a mess... until you spend an hour or so calibrating. I did over the last two days since I bought one, and it has the same look, color-wise, as my rMBP now.

(I own the 39" Seiki)


I tried calibrating it with both X-Rite and Spyder (both physical calibration devices) -- neither could get it anywhere near "average". It just swings all over the place. Also, calibration can't fix horrible blacks, uneven brightness and the finish.

What equipment did you use? Can you post your pre/post scores? I have NEVER met anyone who has been able to actually calibrate it into even "average" ranges. I realize that X-Rite and Spyder are "consumer grade" calibration devices, but they are all I have access to.


I didn't even use a device, I just eyeballed it. For my uses of Sublime Text, Chrome and Textual, colors are great. No qualms.

And now I'm pretty sure I'm out of my league.


The fact that you have access to two physical calibration devices makes me discount your opinion massively, FWIW. It means you're not really in the market for a programming monitor, but for design or some other thing where color is important.

I don't care for calibration at all, really. What I do care about is that all my monitors, when I have multiple monitors, have similar performance across the gamut. To the point that I lived with what in retrospect was an extremely out of whack 2407WFP for 6 years to match a cheap Acer until the Acer died. I only noticed how bad it was when the U2813HM that replaced the Acer was sitting alongside. Resetting to factory defaults pretty much fixed that.


>I tried calibrating it with both X-Rite and Spyder (both physical calibration devices) -- neither could get it anywhere near "average". It just swings all over the place. Also, calibration can't fix horrible blacks, uneven brightness and the finish.

If you don't mind me asking, which version of Spyder did you try?

Edit/Disclaimer: (I work for the company that produces it)


4PRO.


Sad to say, for many programmers and other people who aren't AV types, the step up in resolution—plus the saving of over $500—is likely worth the dodgy colour and brightness. Many won't even really notice the badness. That said, some of the $600-ish mystery-meat 2560×1600 displays apparently have good display quality and so may be a stronger alternative to the Seiki.


At $500, there are FAR better monitors. Bad contrast, bad brightness, bad everything. If you buy your programmers this, buy them a year's supply of pills to deal with the headaches it will induce.

Again, this isn't like a "bad for picky art professionals" type of bad. Just awful. I wouldn't even sell it to anyone I knew personally because I knew it would come back to bite me.


At $500, there are FAR better monitors.

You keep repeating this as if it is some magic mantra. You have not defined what "better" really means in a manner that can be quantified and compared without introducing the subjective opinions of a user.

Also, care to link to these "better" monitors, including sellers that sell these for the $500 price, as well as non-subjective reviews?


Better monitors can very easily be objectively conceived. Latency, refresh rate, contrast ratio, brightness, and input options are all easily measured.

"Better" monitors are easily found through a simple search, and how do you exactly determine a review to be "non-subjective"?

http://www.displaylag.com/display-database/ http://thewirecutter.com/leaderboard/tvs/ http://thewirecutter.com/reviews/a-great-27-inch-lcd-monitor...


What latency is better than what brightness?


I can see where you're going with that, and I'll agree it's subjective as to which suits your needs, but I think you can find TVs that are all-around better.


Better monitors in terms of brightness, color and other monitor-specific features. But not in resolution. And personally I really don't understand who in their right mind might want this kind of insane resolution for a TV. The resolution only makes sense for a monitor.

Fix the issues, turn this thing into a monitor, and corner the market for big monitors.


If it were 60Hz I would certainly consider it right now. But 30Hz really kills it; the other issues are no big deal for programming work. I've programmed using cheap LCD screens since the early '00s that were horrible 6-bit TN panels with only a very few different shades of gray. Still not an issue at all.

I'll wait for the ThinkVision Pro 2840m for $799 in April though.


I have 2 of these, and they have been 12 kinds of awesome:

http://www.amazon.com/Nixeus-27-Inch-Resolution-2560x1440-NX...

Adjustable (height and rotation) stand, packed-in dual-link DVI cable, all 4 major inputs.

I am definitely an amateur when it comes to display calibration, but I'm pickier than your average bear and these have acceptable picture quality when dialed in.


Thanks for pointing this out -- the article seemed to completely overlook the panel quality and configurability.

I'm personally really looking forward to a proper UHD/4K over 30" that does 144Hz with G-Sync, which I can use as my main desktop monitor. It'll finally be a reason to upgrade from my old Dell UltraSharp.

The Asus PQ321Q is almost there; it's just limited to 60Hz.


You will be waiting a very long time. UHD@144Hz requires 40Gbit/sec of bandwidth. DisplayPort 1.3 (which is not generally available yet) only supplies 25Gbit. HDMI 2.0 and DisplayPort 1.2 are 14 & 17, respectively.

http://emsai.net/projects/widescreen/bandwidth/
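
For a rough sense of why (a back-of-the-envelope sketch; the linked calculator also accounts for blanking intervals and link encoding, which I only approximate here with a fudge factor):

    # Raw pixel bandwidth for a UHD signal, ignoring audio.
    width, height = 3840, 2160
    bits_per_pixel = 24                      # 8 bits per channel

    def gbit_per_s(refresh_hz, overhead=1.0):
        # `overhead` loosely stands in for blanking + link encoding (assumption)
        return width * height * bits_per_pixel * refresh_hz * overhead / 1e9

    print(gbit_per_s(60))           # ~11.9 Gbit/s of raw pixel data
    print(gbit_per_s(144))          # ~28.7 Gbit/s -- already past DP 1.3's ~25
    print(gbit_per_s(144, 1.35))    # ~38.7 Gbit/s with a rough 35% overhead

Even before any overhead, 144Hz at 3840x2160 exceeds what DisplayPort 1.3 can deliver, which is the point above.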


This Asus ProArt series is also incredible.


There's absolutely no difference from 60Hz to anything above it.


For a counterpoint: http://www.tweakguides.com/Graphics_5.html

Citing: http://amo.net/NT/02-21-01FPS.html and: http://www.100fps.com/how_many_frames_can_humans_see.htm

The human eye's ability to detect change exceeds 200 fps, though "fps" isn't really a good match for how the eyes process information, which is highly dependent upon the scene in question.


Yes, there is. I have side-by-side 60Hz and 120Hz monitors, and window dragging for example on the 120Hz one is much much smoother (I had to buy a gaming mouse, because most normal mice won't update at more than 100Hz I think). And movies don't have this jerky motion where 60Hz doesn't quite exactly divide with 24 fps that movies use.


Well, in my day, when you moved a window all the computer showed was a grey square - no fps issues there.

Sorry, had to say it.


I started with a ZX Spectrum and was also very happy with whatever was there :) But baby steps towards moving pictures that are indistinguishable from reality... Eh, I'm just very picky. Ever noticed that the top side of a moving window moves before the bottom side on most LCD displays?


Haha, good one. Isn't trolling against the rules here though?


Admittedly 30 Hz sucks and the colors are not up to par with an IPS panel, but overall the Seiki offers a great deal for the price.

The brightness is great (only terrible if you like it really low), and the 112 PPI pixel density is higher than on 2x 23-inch monitors, for example;

The colors can be calibrated from within the OS so it's not as much of an issue;

And as for the black levels - if you found them horrible, it may be because you were using an nVidia card on Windows - the drivers are set to a limited RGB range (because the card thinks it's connected to a TV), giving you horrible black levels (and colors).

I spent a month wondering why my brand new monitors look like crap when every review said they were the best bang for the buck :-D.

Or maybe you just got a bad panel and/or the new revisions were improved...


30 Hz is awful. The brightness problem is blow-out and bleed. It is "very bright" -- so bright that when doing any sort of calibration the whites bleed onto everything. The contrast ratio is good due to the excessive brightness, which means the blacks still suck. Horrible highlights and horrible shadows.

The calibration you can do in the OS is limited; as I said, I have two different sets of hardware equipment for calibration and I still couldn't make it less than horrible. The display is just ick (as expected at that price point) -- every color is significantly off out of the box, and trying to fix it even with physical calibration devices is a lost cause. You can't get it anywhere near "normal".

I tried the TV against multiple forms of input -- from Windows to OS X to Ubuntu to QNX (I used to build command center software, which ran on TVs on top of QNX, so I have my own stack that is tested against TVs). From nVidia, to ATI, to Intel. I also tried it as a simple TV (my fallback so I didn't have to sell it) -- it even looked like crap as a TV... even with external upscaling to 4K.

I purchased mine (and used it for about a month before I was able to sell it on Craigslist) just about as soon as you could get one in the US. I doubt any radical changes have taken place since then, I have read of none outside of the price dropping. It is still generally reviewed as a poor TV and worse monitor.

Dells 4k 28" should be available in just a few weeks -- wait for that, don't purchase this lemon.


For a workstation, how is a 30Hz refresh a problem? I could understand that it wouldn't be ideal for gaming, but if your daily use is reading static text that doesn't move around, is it really a problem? If this were a CRT, I'd agree with you, but as an LCD panel, I don't understand why the lower refresh rate is a problem. It isn't as though the screen will flicker.


Go try it now. Connect your laptop to your TV and set the refresh rate to 30Hz. Your mouse will become extremely "laggy" (e.g. unresponsive), all 60fps animations like dragging windows around and scrolling will become stuttered, etc.

Just the productivity decrease alone makes this monitor extremely expensive in the end, despite its price.
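
To put numbers on the lag (simple frame-period arithmetic, not specific to the Seiki):

    # Frame period at common refresh rates.
    for hz in (30, 60, 120):
        print(f"{hz} Hz -> {1000.0 / hz:.1f} ms per frame")
    # 30 Hz -> 33.3 ms, 60 Hz -> 16.7 ms, 120 Hz -> 8.3 ms.
    # A pointer update can wait up to one full frame before being shown,
    # so 30Hz adds on the order of 17 ms of worst-case latency over 60Hz,
    # which is what makes the cursor feel unresponsive.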


No need for a TV set; some (most?) computer monitors can display 30 Hz, it just needs to be enabled using a "custom resolution" or similar setting in the drivers.

I just tried:

- Dragging windows - not an issue, but maybe I am used to it from the times when having window contents displayed during dragging lagged much more due to limited CPU/GPU power.

- Scrolling - it is slower, but with clickless scroll wheel and smooth scrolling it was not too noticeable.

But, oh god, the mouse cursor movement... It was like working through a high latency VNC.


It increases the latency for displaying the mouse pointer, so you may find it harder to quickly and accurately move your mouse.


Excessive backlight bleed - could be that you got a real lemon...

Manufacturers do have frequent revisions on the panels - for example LG's e-IPS panels had 6 revisions (or more), each pretty different, initially with backlight bleed and ghosting problems, then AFRC and brightness changes (it varies a lot between revisions) and the bad reviews went to Dell, Asus, Acer and other display manufacturers...


Well -- at the time (way back in 2013, when it was significantly more expensive) -- the reviews echoed what I saw -- just Google around for them. Due to the number of reviews matching my personal experience (blown-out brightness, bad color accuracy, unable to get it calibrated to good norms, etc.) -- I was delighted to sell it off for what I paid for it.

I can't prove, without access to a modern panel, that they didn't get exponentially and magically better. But I find that unlikely. They are fighting for the cheapest slot.

But who knows, maybe it is the best panel in the universe now! Maybe the extra $3000 people are dropping on Sharp and Dell 4k panels is all markup! But, I suspect not.


There is apparently a firmware patch from Seiki to make the backlight adjustable. I don't think I'd do graphics work on it that requires absolute color accuracy -- the 30Hz would already be too painful with a Wacom anyway.

What I'd like to know is if the viewing angles are so bad that you see different colors, inverted colors etc. just while sitting at a desk with it (since it is so large you are at pretty big angles from pixels on the edges).


Any IPS panel is better than TN (even the cheap $200 ones), but from what you're describing it sounds like the worst TV/monitor in the world :-). I've got an old Samsung LED TV that still has acceptable contrast/black levels; surely a new panel can't be that bad unless it's defective...


Reports indicate that the Dell 4k 28" will be 30Hz @ 4k also :(

The Lenovo Thinkvision Pro 2840m and Asus PB287Q have both been announced (60Hz @ 4k, US$799), but you might have to wait a couple of months for release.


> I doubt any radical changes have taken place since then, I have read of none outside of the price dropping.

Didn't the article mention something about a firmware patch to fix the backlight brightness?

(I appreciate your calibrated review/opinions btw)


Alternatively, you could buy a nice 27" 2560x1440 Qnix QX2710 (http://www.2560x1440monitor.com/index.php?route=product/prod...), which has an excellent Samsung PLS panel and can often be overclocked to 120Hz.


I like the "Up to 5 dead pixels" on the product image. Makes it sound like a feature, "5 Dead Pixels in every monitor, guaranteed!"


You can find retailers that sell "Pixel Perfect" models with none, but you pay a decent premium for it.


Well -- or just find a "no questions asked" retailer.


At work I have dual 30" Dell monitors, which I use with a tiling window manager (I use awesome [1]). To see the far right/left sides I almost have to roll the chair left or right so I tend to keep editor windows and browser windows where I'm actually reading text in my central field of vision, and keep other windows on the outer margins. Also they do consume nearly all my desk space, so sometimes I think a single larger monitor might be better.

1: http://awesome.naquadah.org/


I switched my dual monitor orientation to a primary 27" (iMac) running 2560x1440, with my secondary 24" running 1200x1920 in portrait orientation. It is, at first, disorientating to have the two monitors eschew but once you get used to it, the reduction in straight from looking far left or right out weighs the strange look. The added benefit is that you get 1920 pixels of vertical resolution for those times that you want to display long blocks of text without scrolling. Admittedly, I use that far less than I use it tiled with windows arranged over-under.


Eschew or askew?


Askew. Typed that on my iPad, so there are a couple of typos. Corrected 'eschew' and 'straight':

"It is, at first, disorientating to have the two monitors askew but once you get used to it, the reduction in strain from looking far left or right out weighs the strange look.


Not that I've tried it, but dual 30" seems like it'd be a bit much for me. Or the OP's dream 50". There's only so much you can focus on at a time, and only so far you can turn your head to change focus, though I guess depending on what you do you might want more areas of focus. 3 window columns seems to be where I'm most comfortable, but I've never tried a display bigger than 1x30". I do keep a 'dashboard' on a smaller side monitor that I can turn to look at as necessary though.


You don't focus on everything at once. Secondary windows can be viewed peripherally -- for instance you can tail a log, and your peripheral vision will notice motion. The other benefit is multiple windows as an alternative to alt-tab. Moving your eyes slightly is a lot easier than pressing alt-tab -- it can be done without conscious thought, so it doesn't interrupt your "zone".


A friend of mine swears by those little cheap USB monitors as little mini log viewers.


Interesting. I found I got a lot more productive when I switched from a large monitor with multiple windows to a smaller monitor with a single window switched via keyboard. And that was specifically because it was faster and required less thought to flip through windows than to remember where the window I wanted to look at was and to move my eyes to it.

However, I have poor eyesight and very poor peripheral vision, so that might have something to do with it. If I'm trying to watch for movement out of the corner of my eye I frequently get false positives from visual aberrations.


You're exaggerating. Why?


I had a 3014 at work and I stopped using it because it was terrible with Mac OS X. It had this thing where it would blue-tint a box around any sustained motion on the screen, like a video being played. It also came out of the box blue-tinted and with the sharpness cranked up. So while the Seiki might not be great, the 3014 is no panacea either.


There's some hacking that has to go on at the OS level with OSX to have any monitor > 27 inches. The trouble lies in that the operating system wants to natively believe that you're actually using a TV, not a monitor.

To get it working you need to fool the OS into thinking that it's a monitor again: http://embdev.net/topic/284710#3027030


Note that's only for DisplayPort. I use DualLink DVI to my large screens, and it works great without any hacks.


>with a great matte finish

Isn't that an oxymoron? Why would you want an intentionally blurry screen?


So you can see what is on the screen. I used to have the best Dell 27" and the Apple Thunderbolt 27" display side by side (the ones I had used the exact same panel).

The Apple was glossy and the Dell was matte. If you peer at the Dell, you can see the matte coating -- but everything was nevertheless much more readable for most of the day due to the glare from the glossy screen (it was an office with a lot of natural light). Only at twilight did the Apple provide superior visibility.

So matte finishes are extremely helpful in some circumstances (depending on the lighting and type of room), and some matte finishes are a lot better than others.


Aren't there low-reflectivity glossy screens now? For example, on my Nexus 4, I can barely see my reflection when the screen is dark, although a lamp or window in the right direction is visible (but not so much that it overpowers the backlight).


Seems like just fixing your lighting situation would be preferable to having a blurry display.


Lots of natural light is my ideal lighting situation. I just bought a Dell 27" IPS with a matte finish for $650 CAD over all the Korean monitors, specifically for the matte finish.


Well, I suppose I could have asked the company to re-partition their office without so many windows and get rid of their skylights to accommodate my glossy display.

But I figured it would be easier to just use a display with a good matte coating.

Then again, I am one of the people who used to always pay the extra $50 for the matte screen option on my MacBook Pros, back when Apple offered that option.


>to stare at vim on

Well, there's your problem right there...


ahahah wp sir wp


>>>/g/


"Seiki is missing a golden opportunity to dominate the desktop display market by removing the television tuner, speakers, and remote, and then reallocating that budget to a 60Hz or better input (HDMI 2 and/or DisplayPort), a matte screen surface, and instant-on DPMI support, all the while retaining the market-wrecking price."

And while we're at it, I want a unicorn as well! Seriously, speakers, a remote and a TV tuner are like, $5 in parts max whereas everything he just listed involves upgrading silicon fabrication processes and designing new ASICs.


I'll take my unicorn in black, too.

I'm not a monitor manufacturer, just an opinionated blogger. But are you telling me that a matte surface would cost more to manufacture than a glossy surface? DPMI support costs a lot? Of my wishlist items, DisplayPort seems the toughest to this layperson, but even that doesn't seem unreasonable.

In my earlier blog entry about the same monitor, I said I'd also be willing to pay a tiny bit more. If I could get the "professional" version with the attributes I listed above for $600, I'd do it in a heartbeat.


> ...But are you telling me that a matte surface would cost more to manufacture than a glossy surface?

Ex TV Product Planner here - yes, matte costs more than glossy. If nothing else there is the volume difference, but I think the production techniques for glossy are intrinsically cheaper too.


Thanks for posting this. It's good to finally have a rationale for why glossy has dominated matte in recent years beyond the one I've heard over and over ("glossy looks better at first glance in a Best Buy"). If it's truly a matter of price, I suppose it's fair that only professional monitors use matte screens.

I for one would be willing to pay a modest premium for matte.


It isn't the only reason; the contrast ratio is higher too, so glossy is needed to be competitive on specifications as well. If you can control the lighting so that it doesn't reflect toward you, glossy can be better, as matte will scatter light in all directions. The intrinsic cost difference would be pretty small without the economies of scale, I think. Matte would have to be special-order panels and you wouldn't be able to pick them up on the spot market.


"matte will scatter light in all directions."

Unless you are saying one pixel smears into another, doesn't scattering light in all directions end up giving better viewing angles with good brightness uniformity?

I always thought the reason a matte screen hurt contrast was because of scattered light from the room raising the black level.


Mostly I'm talking about ambient light, hence worse blacks and lower contrast ratios. It may also scatter outgoing light reducing sharpness and brightness (some may be scattered back in to the panel) a little but that is pretty minor.


Dell will beat them to it - the new Dell UltraHD, while only 28", is a proper desktop 4K monitor with what will (most likely) be far better color accuracy and usability, for $699.

Seiki just dropped their price to $500 which is attractive, but I think the Dell's build quality and internals will be worth the $200 premium.


The Dell 28-inch 4K monitor is also only 30Hz though. The 60Hz ones are still more in the $1,200-3,000 range.


The Asus 28" 4K is 60Hz @699, and the Lenovo 28" 4K is 60Hz @799. They won't be available until April, though.


All of these are using TN panels. TN's bad viewing angles in a 28" size must be a sight to behold.


I think you mean DPMS, DPMI is a little different[1].

[1] - http://en.wikipedia.org/wiki/DPMI

I'd need to dig a bit to know for sure, but I'd guess they're using off-the-shelf parts for HDMI that may or may not support DPMS, and either way it'll probably require some hardware changes. If they've already tooled up, yes, it will take some coin to make the transition.


Oops! You're right, I had that initialism wrong. I've corrected it to DPMS.

Also, I will concede that it wouldn't be cheap to add the features I want, but I'd nevertheless pay a small premium to see them happen.


It doesn't only cost money, it will also cost time. In this case the time is the important part, not the extra two dollars you are willing to spend. This could push the release date back 6 months or more, and by then they would have lost whatever market there was.


The fact that it can only do 30Hz in UHD mode is probably one of the reasons it's so cheap - going to 60 or 120Hz is a two to four-fold increase in processing.

It'll happen - it's just a matter of time (less than 18 months if Moore's law is anything to go by)


I'm sure they could do it now, no Moore's law required. The technology and parts are already available.

I think the lack of 60-120Hz has more to do with keeping costs down. Low production costs translate to lower prices for the consumer. By selling these TVs at absolute rock-bottom prices, they can sell them by the truckload and, more importantly, make a name for themselves. These last couple of weeks I've seen Seiki's name brought up dozens of times all over the internet, whereas a month ago I don't think I had ever heard their name before.

If this is part of their marketing plan, it's working beautifully.


>> I think the lack of 60-120Hz has more to do with keeping costs down

That's exactly what I meant - there have been displays for years that can do really high refresh rates at 4K, they have just been quite expensive.

Due to Moore's law, image processing ASICs with that kind of processing power should be as cheap as what's in the Seiki in a year or two.


I believe IBM produced a high resolution monitor which was a single panel, but acted as if it were 4 smaller displays, complete with 4 separate display input cables. Something similar could technically be done to re-use lower-res, but higher-speed existing display controller parts on 4K panels.


Two special display input cables (DMS-59). IBM T221. I have one. It's... hard. The max I could drive it at was 25Hz -- two 1920x2400@25Hz panes beside each other. Watching a movie on it was a weird experience, so I set it aside. If someone has 4 DVI outputs, you could surely drive it at 41Hz, which would make a huge difference (the monitor is up for sale :) ).


Speed your movie up from 24fps to 25fps; it should make a huge difference.


Some of the 4k displays out there are literally two panels glued together. Fortunately, DisplayPort supports multiple displays per connection, but DP hubs are expensive, presumably even if they're internal to the monitor. I'm not sure that using this feature would work for having two chips drive a single monolithic panel.


I believe the 4K displays are merely presenting themselves as if they were two displays glued together -- do you have a source that says it is actually two pieces of glass?


For some reason, I read your post in Sterling Archer's voice.


Daaaanger zooone


The article lists the "ideal size" for monitors at 50 inches, and the one for TVs at 100+ inches!

While I'm sure it's great to have such massive screen real estate (I wanna get a 4K in 2014 too), I can't help but think this is the programmer equivalent of consumers buying needlessly large TVs.

I work on a 24" desktop monitor and a 14" laptop one. How is 50" the "ideal size"? Even 39"? At what point does the law of diminishing returns kick in? Sure, it's great to gave 7 windows side-by-side, but is it that much of a productivity jump from, say, 4 windows or 5 windows?

Personally and subjectively, I'd draw the line somewhere around the 28-32" screen size.

To sum up, the core argument here seems to be that upgrading to a 39" or 50" monitor is great because (a) it's cheap, and (b) more is always and infinitely better.


Jakob Nielsen's research on "ideal size" is 48Kx32K (1200 dpi) @ 1200 fps for perfect image quality.

See http://www.nngroup.com/articles/how-much-bandwidth-is-enough...

His 2012 recommendation was 30" or larger screens for knowledge workers (http://www.nngroup.com/articles/computer-screens-getting-big...).


"A 48Kx32K pixel monitor with 24-bit-per-pixel color needs 4.6 GB videoRAM, so it will probably be about seventeen years (2012) before these perfect monitors are commonplace."


We probably would've had them now if display tech didn't stop progressing in 1999....


Only 120 fps actually in the article. He also doesn't talk about the color space, only about the granularity.

But anyway, quite nice that there are articles that take a little bit of a human point of view and not just the usual "existing technology, just 20% better is good enough for everyone"


It seems there's already a monitor which offers to upscale your 120Hz signal to 240Hz: http://www.eizo.com/global/products/duravision/fdf2405w/inde... . And the product page suggests that there's already a ready market for it, among viewers of high-res satellite imagery: apparently the blur when scrolling such images on a mere 120Hz (and high-persistence) display is notable enough to be distracting.


120 fps, not 1200 fps


> At what point does the law of diminishing returns kick in?

Well, diminishing returns have already begun by the time you get to the sunlit uplands of about 1024 × 768 (non-interlaced! in 16-bit colour!). Every further step up feels somewhat incremental in comparison. But are further steps worthwhile? Yes. There's a reason why people use maps and papers printed in 600dpi and above, then spread several of them out at the same time on a big table. Can you display this chart http://xkcd.com/657/large/ so that all of the text is clearly legible, and without having to scroll or zoom? Can you display this chart and two long excerpts from LOTR side-by-side so that all of the text is clearly legible and you can move between all three just by moving your eyes, without scrolling or zooming inside any window or moving, resizing or tabbing between windows? If not, then your monitor isn't as high-detail as it should be and/or doesn't cover as much of your visual field as it should. And that test doesn't come all that close to testing the upper limits of what you can usefully do with progressively-better displays.


I use 6 19" monitors (using a combination of display port, HDMI, display link adapters). It's a bit of a mess to set up, but I like the actual physical splitting of the monitors (I have the logfile monitor, the google chrome monitor, the firefox monitor, the jira monitor, the editor monitor, etc...). Also, it plays in very well with a tiling window manager where I can just switch windows from one monitor to the next without having to worry too much about tiling "inside" of a screen. On bigger than 19" monitors, I find I can fit more than 2 windows side by side and then it becomes a weird game of rearranging windows.

I have been using this for 3+ years now, and while people always look at me like I'm crazy when setting up 6-7 monitors on my desk (I freelance so I usually lug them around or just buy a used set of monitors for a few bucks), I could never go back. Not switching windows, and using your peripheral vision for things like database load, or just plain having 2 editor monitors and 2 reference browsers open at the same time is invaluable.

While I can work on a single monitor, even a huge one (I love the 3k x 2k notebook resolutions), I feel cramped because I don't have the physical separation that comes from the multiple monitors.


But, given a single large very high resolution monitor and a tiling WM, don't you get the same in the end but with vastly simpler hardware setups?

If you are so keen on the 'physical separation' then perhaps a WM that includes some synthetic black bars to give defined visual separation of tiles?

You didn't specify what resolutions your monitors are running. Out of curiosity, what are they?


With my WM, I can switch the workspace of each of my three monitors independently. If I only had a single monitor, such composition really wouldn't be possible (unless you specifically built a WM capable of doing so).


I have 2880x1800 on the MBP, and 4-6x 1440x900/1280x1024 or whatever on the others (depending on which I find).


Having had a six-display setup for about 5 years, I agree. The Mac Pro made it quite easy to set up, too. Having separate (preferably HD) panels, each dedicated to one window, feels excellent.

For about half a year now, I've been using a 3-display setup centered around a large iMac. While the iMac display is nice, I kind of miss my old work flow. The iMac resolution is quite awkward, actually, because it's too small to split in two or four separate work areas, but at the same time it's way too big for just one window. There is really no good way to use that screen efficiently. So I often end up splitting it into two overlapping halves with a code window and some other window, but on the whole the setup is not as satisfying as the 6-screen.


Out of curiosity, what iMac size? I've got the 22" one, which is perfect for vertical split of two windows, but I remember having the exact same issue you did with my previous 27".

I've now got my 2011 22" iMac sitting next to two 22" Dell UltraSharps, and it's amazing for my workflow :)

That said, I also do a heap of work on my 11" (1366x768) HP notebook, so I'm crazy, and use so very many virtual desktops it's not even funny.


As I said, it's the 27". Too big for one window, too small for 2. Its pixel density has the same problem by the way: it's denser than the usual low-density HD screens, but it's also nowhere near "retina". I have a low-cost HP Pavillon 23" HD monitor on both sides of it.


Nice, but 6 is a bit much for me given my situation.

On the road with 2 LG 23" monitors (6.5 lbs each) and a Dell M4700 laptop. i3 WM provides the tiling needs. Love the setup, not too much context switching (i.e. changing virtual desktops) and sufficient screen real estate to do my work.


Think I understand where you're coming from, but for me virtual desktops give the same separation.

Do you use virtual desktops as well?


Nope, just that layout. 6 really comes into play when doing web development. These days I'm usually running with 2 or 3 external monitors, as I don't have as many things to keep track of. Having virtual desktops hides the kind of peripheral-vision thing I like having, plus I have to think in terms of "hidden" things, which I usually tend to forget about anyway.


I don't really understand how a TV can be needlessly large. Also, I don't understand why people still buy TVs in 2014.

When I replaced my TV about 4-5 years ago, I bought a 500€ beamer and a 100€ retractable canvas. Sure, that beamer only has 1024x768, but today you'll certainly be able to buy something significantly better in a similar price range.

With my living room PC connected to the beamer, my "screen" is 1.7 meters wide, I can watch movies, television, play games, have internet, linux (rsync!, ssh!) and all of that significantly cheaper than any "decent" television set.

Why do people (given they have the space) still buy televisions?


I did the same (but my screen is taller than yours is wide :P), but I think there are a few reasons why this hasn't seen larger adoption.

People like to have their TV on all day long: with a projector you can't, because it won't be very visible in daylight and it will burn through lamps like it's popcorn.

Second point, modern TVs usually have a tuner in, you plug them in and they work. Also you don't need to adjust the image until it's rectangular.

You don't need to have a line of sight for a TV, chairs and people are not an issue.


You are right that of course, not everybody has my usage pattern. Since my daughter becomes a mindless, staring zombie whenever a TV is running in her field of vision, we only watch TV late in the evening (for us, a lamp lasts about two years).

Before I first set it up, I expected the configuration to be a lot more tedious than it actually was. If you have the computer, some sort of sound system and your TV receiver lying around, all you need is a switch for the different audio sources and you can basically plug it all together. (In my setup, TV does not go through the computer. We don't have Hulu, HBO or Netflix over here.)

I'd say about 80% of the work was mounting the beamer to the ceiling and getting the upper hand on the cable mess.


With many (most?) cable companies now encrypting even "basic" channels, you need to use their box to watch anything. At that point, you don't need a tuner, unless you're actually receiving over-the-air broadcast TV.


Because contrast and brightness are much higher on televisions.

Also, televisions often offer more features like USB, networking, sound and CEC. I'm not entirely sure about most beamers, but I don't believe they offer the same integration with a Chromecast or RaspBMC that my current 500 euro Sharp Aquos does.


My setup is a Samsung 42" TV + Apple TV + a PS4. The colours and clarity are amazing (the sound isn't).

I use the TV for BBC iPlayer, Netflix, and Viki (I'm into Asian soap operas), the Apple TV for YouTube, and the PS4 for Battlefield 4; I use AirPlay for anything else.


A couple of friends of mine have gone with that solution and, to be perfectly honest, the bad image quality really annoys me on the cheap setups. The one setup that I've seen and liked cost in excess of 2500€. I'd much rather have a 40" TV with the best image quality possible than a crappy 80" image from a cheap projector.


At my office they replaced the beamers with big ctouch touchscreens. It's a huge improvement even at the smaller diagonal because of the clear and sharp image. Plus, it's cool to stand there pulling stuff across the screen, Minority Report style.


Try watching a projected picture on a canvas in brightly lit room. It can be difficult to view it if the room is not dimly lit.

TV is still king in these situations.


Cables, for one. Most living rooms already have antenna plugs and power in one convenient wall or corner, where the TV is supposed to be located. The beamer needs to be opposite of this position (to beam at it). To not cast shadows, it will commonly need to be placed high. Not very popular with wives, this setup.


Ok, I was lucky in that regard. Now it looks all neat, but the first time I set it up I was really, really sloppy. Fortunately my wife is a lab-hardened scientist, so the conversation went like this:

Me: "Do you mind if I nail these cables to the walls in our living room?"

Her: "Whatever. Just be done by tonight so we can watch something."

This time, I worked with well-hidden cable conduits and it is not that noticeable any more.


The fan noise for a projector can get fairly annoying.


Although your brain erases the noise after a while, it is indeed pretty annoying.


I mentioned that I used a sound system. That is actually not quite accurate any more, since we had to make some adjustments when my daughter was born.

At the time, we were living with an unfortunate flat layout which placed the bedroom directly next to the living room. So in order to avoid waking up the child, we started using headphones (and have never stopped doing so since).

Since I had checked before buying the beamer that it wasn't too loud, using (good) headphones had the side-effect that beamer noise is now an absolute non-issue for us.

I find it enlightening how all these responses show that my setup really isn't the right thing for everybody. In retrospect, I was also not consciously aware of all the quirks that my wife and I were willing to accept (headphones, cable conduits, TV only in the evening).

Nonetheless, as beamers become brighter and less noisy (LED?), they will probably become attractive to a wider range of people.


Because projectors in a given price range have much lower image quality (contrast in particular) than TVs in the same price range.


I've had three 30" Cinema Display's side by side for a while and it was awesome; except for my neck hurting like hell after a while when looking at the outer half of the outer screens :)

That convinced me that, unless you're going to make a circular desk with a keyboard that moves along with your chair, the perfect setup is a 30" display with two 20"-ers next to it in portrait mode.


I'd take that to the next step, where in the future all workstations will be a nice tall igloo shape where the entire roof and walls are the display and you can swivel around in your seat (with embedded keyboard).


> At what point does the law of diminishing returns kick in?

You won't know until you try. I hesitated a lot about switching my CRT from 14" to 17". After half an hour of work, going back felt like staring at an ATM bulb-shaped display.

I'd like to try very large monitors but it's pointless without higher resolution so I'll wait until I can buy 4K as a toy.


Indeed. I remember in the 90s even going from 14" to 15" was noticeably better (running MacOS 7). 17" was better yet and the 19" on the Unix workstations - wow.

Now my Apple Cinema Display feels cramped and I'll probably add another...


I'd say in terms of productivity it adds almost nothing, because the biggest things holding a programmer's productivity back have nothing to do with display size. Better to get rid of interruptions, procrastination, bad tools etc.

That being said, programmer happiness is elemental, and a 42-inch 4K screen would add to that for me :)


I feel like I'm less productive on multiple monitors. I've tried them a number of times, but they've never felt natural. I just prefer looking straight ahead, managing a single screen, and clicking between the few programs I'm running in the taskbar. Also, I like some room on my desk, and big monitor displays just seem overwhelming.

Screen resolution is important though, otherwise everything just feels cramped. I'm perfectly happy with a nice 23-30" monitor. Anywhere within that size range is fine, but it needs to be a quality panel. I work with a lot of very subtle colors, and light gray tones. With a TN panel they're invisible half the time, or I'm constantly shifting up and down in my seat trying to find the sweet spot for more accurate colors.

If I had to pick my dream setup, it would be a single 30" monitor. If Dell gets those 4K 30" displays rolling out, I'm sure they'll be great. For now, I don't feel like my productivity is suffering in any way with a 24" monitor.

Personally, I find I'm most productive when I have a nice clean workspace, natural light, and a good chair. For example, I just grabbed a photo off Google...

http://articlesofton.files.wordpress.com/2011/12/01-04_home_...

Replace the chair, and it's ideal for me. Bright light, and minimal clutter.


>> I feel like I'm less productive on multiple monitors. I've tried them a number of times, but they've never felt natural. I just prefer looking straight ahead, managing a single screen, and clicking between the few programs I'm running in the taskbar. Also, I like some room on my desk, and big monitor displays just seem overwhelming.

You might (maybe, just saying what I like) want to try one 'main' monitor and one subsidiary. Can be the same size or smaller. Main monitor is for the work you're focussed on, the other monitor is where you have documentation or other reference material you need while working. I quite like this pattern rather than just throwing programs all over the two screens.


Tabbing between windows can be a distraction. The stack-like function of the tab mechanism does not help it either.


You should try a tiling WM in which you can switch between windows based on their location. eg, alt+{left,down,up,right}


Any recommendations? I dived into xmonad some time ago but I never found a good and fast window manager for OS X.


I use bspwm, which has a very simple design and follows the unix philosophy quite nicely. A single daemon process which manages windows, and all configuration is done with the shell via the `bspc` command, which talks to the daemon over a socket. A separate daemon (sxhkd) is used for keybindings, and the configuration for sxhkd is just a map of keybindings (including mouse buttons) to shell commands, with a very simple DSL to allow groups of commands, and the use of the cursor position. Example from my configuration:

    # focus the window in the given direction (arrow keys or vi-style hjkl)
    alt + {Left,Down,Up,Right,h,j,k,l}
        bspc window --focus {left,down,up,right,left,down,up,right}

    # alt+Tab focuses the next window, super+Tab the next monitor
    {alt,super} + Tab
        bspc {window,monitor} --focus next

    # swap the focused window with its neighbour in that direction
    alt + super + {Left,Down,Up,Right}
        bspc window --swap {left,down,up,right}

    # resize: push an edge of the focused window
    super + {Left,Down,Up,Right}
        bspc window --edge {left,down,up,right} push

    # resize: pull an edge of the focused window
    super + shift + {Left,Down,Up,Right}
        bspc window --edge {right,up,down,left} pull

    # close the focused window
    super + q
        bspc window --kill
Worth noting that bspwm treats monitors individually when using window commands, so you need a separate set to handle windows on diff monitors (unless using Xinerama).

I've no idea whether it works on OSX though. 2bwm seems to be popular among OSX users, but I've not tried it.


Wow, the tiling concept looks even more flexible than the tiling in xmonad. I will definitely try this.


I wouldn't say it's more flexible, as XMonad has all the power of Haskell behind it (+ the ability to create sub-layouts) and makes it easier to create more complicated arrangements than hacking together a bunch of bash scripts - but I think bspwm is a good balance of flexibility:simplicity. XMonad is pretty complicated to configure, even for pretty basic things.


There was some discussion about OS X window management options on another thread the other day: https://news.ycombinator.com/item?id=6998309



I personally use Divvy on my Mac, I needed something simple and it works perfectly for me :)


slate


Nope... hate that too. I've been using virtual desktops since the '90s to overcome that issue. I use up to 8 vdesktops that I address absolutely using numpad1..numpad8. Each number has a specific function (generic browser, email, editors, db, virtual servers, gimp, ...). Nothing generally overlaps, and switching is instantaneous.


Assuming your window manager has some overall spatial layout of desktops, or a preview that highlights the current desktop: why not add a 9th one? Then you can arrange them in a 3x3 grid in your window manager, the same way they already are on your numpad.

I do this and it gives an incredibly tactile feel to desktop management.


Totally agree. Using a single monitor works well if you use an IDE; otherwise a multi-monitor setup is better.


Even then, IDE on one monitor; web for reference material on the other.


For a lot of devs having a third monitor for the database IDE is very useful.


A 39" monitor on your desk can block a lot distractions... from behind the monitor.


I'm using a 30-inch monitor to program regularly, and it isn't nearly enough once you learn to use it properly. Have you really tried this out?


I've tried a 30-inch monitor for a couple of days and to be honest I much prefer 2x24" monitors.


I can't argue about a 39" monitor, but having more screen real estate boosts productivity for me, to the extent that I'm sure I could use 3x 27" over my current 2x 24". Avoiding alt+tab means I get my information quicker when I need it from another window, and I get back to the previous context faster. Sometimes I need both but don't have the space, and end up tabbing back and forth.


Yep. At my home office I have 3x Dell U2412Ms (1920x1200 i.e. 16:10) on an Ergotech stand and I'm absolutely loving this physical setup for programming. The side monitors float over my desk (an Ikea Galant, corner version), and having a central monitor with two surrounds is just ideal for me. I much prefer this to a dual screen setup.

I think that ideally the same 24s would deliver a 2560x1600 resolution, however I'm still quite happy with the setup as it is. 3x 27"s might be nice, but I fear it could become too much. Well, at least for my wallet, that is!

An issue with the 4k displays is that if you're going to have 3x of them, you're going to hit GPU walls very, very quickly.


Sure, but if one were to plot (a) the number of alt-tabs you do in a day, versus (b) screen size in inches...don't you think at some point the delta benefits would start leveling off?


Sure, diminishing returns kick in, but monitors are cheap and programmers are expensive. It's the same reason that 16+GB of RAM and SSDs are worth it. Not because most people need them, but because the cost is low enough that the productivity gains only need to be minuscule to be worth it.

AKA: assuming monitors last ~3 years and cost below $2,000, it takes a tiny difference to be worth it, because you're not just comparing productivity deltas but also cost deltas. In other words, a $2,000 monitor that improves productivity by 1/20th of 1% over a $1,000 monitor is clearly worth it over 3 years, even if the difference is simply eye strain. 3 x (salary + benefits + overhead) x .0005 > $1,000.

PS: Another way to look at it is that spending an extra $1-2k per year on salary over industry standard rarely tempts programmers, but an extra $1-2k per year on hardware might.


That wasn't the best-thought-out example. The total yearly cost for a programmer would have to exceed $666,666 to justify a $1,000 purchase, if the improvement is only 0.05%.

That said, I agree. The benefit doesn't have to be big to justify it.


Oops. Yeah, 1/2, not 1/20th of 1%. The .0005 number is correct, but I messed up changing that to a fraction of a %.


1/2 of 1% is 0.005
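
For anyone who wants to sanity-check that kind of break-even claim, here is the back-of-the-envelope version with everything spelled out (all the numbers below are made-up placeholders, not anyone's actual salary or prices):

    # Illustrative break-even sketch for a pricier monitor.
    # Every input here is an assumption picked for the example.
    yearly_cost = 150000     # salary + benefits + overhead, hypothetical
    lifetime_years = 3       # how long the monitor is kept
    price_delta = 1000       # e.g. a $2,000 monitor vs a $1,000 one

    break_even_gain = price_delta / (yearly_cost * lifetime_years)
    print("{:.2%}".format(break_even_gain))   # ~0.22% productivity gain to break even

With those placeholder numbers the required gain is roughly a fifth of one percent; plug in your own figures and the conclusion barely changes.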


Indeed it is subjective. It varies between persons, and for the same person it varies between tasks.

i use 4x2 24" in the office. some days it feels like i could use more screen space. but it has also reached a point i need to roll back and forth on my desk to read all ends. coding work is concentrated in 3.5 screens. spreadsheets, browsers, email, front ends, other software occupy the rest. and yet it still hasn't released me from alt-tab hell. there will always be that bunch of information that you must have in front of you no matter the cost, and the cost is that you still have to alt tab the less important windows -email, excel, hacker news.


I think the "ideal size" depends on the size of the monitor you are currently using. I was happy with 29" initially... now I can't wait to try something larger.

I am not sure where this process ends, but 50" sounds very nice indeed. :)


I found 24-27" the right size for me. Anything larger than that feels uncomfortable due to too much movement of the head.


This is quite the opinionated "article".

"4k isn't for couch potatoes", "you should buy a $500 monitor", "if you are programming on something so old, [you suck]", etc...

I am fairly surprised to see this style of wording upvoted to the top of HN, actually.


I took this as a bit ironic because my writing style was tamer in years past and I feel it has become more blunt (to use another euphemism) since I started reading more Hacker News postings.

But yes, it's opinionated. I'm quite strongly of the opinion that in general larger desktop displays increase productivity. They are finally arriving, and at prices that are almost affordable. With several important caveats and weaknesses that cannot be ignored, the market is finally moving in a direction that I applaud. So this blog entry/rant is a form of celebration.


Hey, no worries!

There is a time and place for everything.

I am not criticizing your writing style. I just found it odd to find one like it at the top of HN is all.


Because this kind of writing style makes people with different opinions frustrated, which leads to discussions and debates.

I actually like it. It reminds me of kenrockwell.com


A debate without the accompanying upvotes would trigger the flamewar penalty though.


You and me both. Not that I'm complaining. :)


> At our office, we just equipped all of the programmers' workstations with Seiki 39" 4K televisions as monitors. [...] For the time being, there is no single higher-productivity display for a programmer.

Maybe it's just me, but I'm most productive on a 13" laptop. I can change my work place and position as I want, plus only one app fitting on the screen actually helps me focus. Emacs/tmux/IDE users can do most programming work in a single application anyway.

I've used a 27" monitor and before that two 20" monitors for years. It was great for other activities, but when it comes to programming, it only distracted me.


It's not just you.

While reading the article I was thinking "wow, these guys are using 39" monitors, while I'm perfectly happy with a single 17" monitor or a 13" laptop. There must be something wrong with me."

Emacs is my screen space saviour. Every file or shell is 3 key presses away, I just don't need extra monitors.


How do you use frames and windows in emacs? Even on a single screen, I've generally got at least a couple of columns up, so I can understand the utility of greater real estate. With multiple monitors, you can open two frames up, use next-multiframe-window and it mostly behaves like a giant emacs across two screens.

That said, I almost exclusively work on a 15" retina MBP and don't find myself annoyed, so I guess you're right. :)


I use multiple monitors for having my editor on one screen and having on the other screen:

1. My database design for reference

2. A web browser for when I google how to do a particular thing

3. Specifications for the project I'm working on

4. Example code from other things I have written in the past

These things make me a more productive programmer. I don't see a legitimate reason as to why multiple monitors wouldn't make someone more productive but let me know what I'm missing!


Have you tried emacs in portrait mode on a bigger screen?


I've tried it on a 24-inch monitor running at 1920x1200 in portrait mode. It's great for editing text. I found that a good setup was to split the screen vertically and have 2 windows, one on top of the other. It doesn't really work well when splitting horizontally.


I prefer:

> 12" 1440x900 (X200s)

> 14.1" 1440x900 (T400)

Resolution is big enough, mobile, keyboard is normal sized and you can see your coworker while talking to him.


MBPr 15" @ 2880 * 1800, scaled to the smallest possible size. It's awesome.


I'm currently doing my hobby programming projects on a 10-inch screen (Asus T100), and have set up my flow so that I don't need a bigger screen.

I used to have a three-monitor setup, but at some point realized that all that real estate was preventing me from focusing. Now I only want one window on screen at a time. Alt+tab is my extra screen real estate. My usual screen is 22 inches and that's about the maximum size for putting windows fullscreen.


You must not work from lengthy spec documents that often.


I write lengthy spec documents on that 22 inch screen ;)

I use a Logitech M500 with the inertial scroll wheel, so scrolling through long documents is easy. Also, Word's navigation pane makes it easy to jump from point to point. The 22-inch screen is just large enough to see the full page width and the nav pane side by side.

When implementing long spec documents, I print them out and mark off the parts I've completed. There would be no added value for me personally in having a second screen where I do that digitally. (I did try it once.)


I'd rather scroll than tilt my head up and down to read. And I can only read one thing at a time, so it doesn't give me any benefit to have both documentation and code on screen at the same time.


This is the first I've heard of another programmer using the T100. Are you using it as a terminal to a more powerful computer, or as the actual computer? If the latter, what does your dev environment consist of, and what platform(s) are you programming for?


A 39-inch monitor? I'd feel pretty intimidated by such a beast at close-quarters. The extra real estate would be handy but the constant glare from all angles attacking my peripheral vision would be relentless. Nice to see some 4K adoption however.


It's not that bad. I had a pair of 30" monitors in an open floor office and I really liked the setup precisely because I wasn't distracted by watching everybody walk around behind my monitors. The people who disliked it simply used a dark background/editor scheme and kept the browser window towards the center.


This seems like flaunting your budget more than improving your environment.

Maybe I just need to get my eyes checked again, but for some reason my code just doesn't perform any better or write itself any faster with more screen real estate and I'm as productive on my laptop as I am on my big screen.

4K is not for programmers, code is for programmers.


$500 is such a small amount to pay, it is easily recouped over the course of a year. $100k to pay a developer? Just 0.5% to break even through reduction in time spent, or increase in comfort, or decreased eye strain, additional time spent working, or improved retention. It can be a signal to potential hires that you spend money where it counts, instead of fancy foosball tables or a 75" plasma in the reception area. I always tried to get my team the hardware they wanted because it's such an easy problem to solve as a manager and can pay in spades.

I agree it's not an automatic improvement, but I'll take that bet every time.


You seem like a very nice employer :)

For many of us who have worked (or still work) for less-than-very-nice employers, or ones without as much money available to spend on hardware for their team, there was/is always hope in improving your own practices. I have spent a lot of time improving my own practices and I wouldn't give up CLI, workspaces or Dvorak for enough 4K monitors to line a conference room with.


I find I am much more productive at home (1x 30" monitor + laptop display) than I am at work (1x 24" 1080p monitor + laptop display).

The screen size absolutely makes a difference when you are working on a large project with lots of disparate pieces. When you jump between project-wide search, debugging, showing git diffs, and so on.

This is especially true if you do graphics work and code. When you need to move graphics assets around your filesystem and into your IDE it's nice to have the room (i.e., not so many overlapping windows) so you don't spend time digging for assets, resizing assets, running scripts over assets.

I consider myself very productive on just my laptop screen, but sometimes I postpone large architectural changes or heavy instruments based debugging sessions until I get home. There's less back-and-forth and a considerable speedup under certain workloads.


I do all these things in a terminal. I used not to. But once I started doing things the CLI way, I never looked back and have yet to discover a tool offering even half the productivity gained from keeping my fingers firmly planted on the home row. However, if you are locked into using an environment where doing the things you've said requires mousing or multiple windows with tons of resolution, then you have my sympathies.


While I do a lot on the terminal, not everything is faster there.

Jumping between symbols in your IDE is not going to happen faster on the command line because it's a spatial operation. You've located the symbol visually, clicking it to jump to definition is far faster than keyboard-navigating to it, or even typing it.

Merging is better with a dedicated merge tool than a command line merge. I have to resolve conflicts one to three times per day and I couldn't imagine a non-visual merge interface for complex files where manual edits from both sides are necessary. Being able to see it all at once helps.

There is nothing quite as good as Xcode's instruments on the command line — providing easy to navigate visual representations of your performance and memory use, graphs and other metrics. Having the instruments panel on a second screen is incredibly useful, the extra screen real estate is of real value here. Having to keep switching on a single laptop display is a pain in the ass.

Obviously there are many tasks that cannot be easily sped up on the terminal, saving slices out from Photoshop (when your designer hasn't done it properly). And then getting the resulting assets into your pipeline. This stuff is benefited by extra screen space.

But the main benefit is being able to see more code at once. And it is especially helpful in large re-factoring operations or with extensive debugging / performance profiling. As I said, it depends on the type of work. I can code just fine on my laptop but I tend to prefer coding focused features, or fixing isolated bugs.

I find the previously mentioned profiling and architectural changes are better suited to large screens where more code / graphs / visual assets can be seen at once.

You simply can't do "all these things" in a terminal. You can do a hell of a lot, but sometimes you need visuals and sometimes you need space. When you can see more code you don't have to hold as much in your working memory, your eyes can flick back and forth over your code and it makes it easier to see the dependencies and realise what you need to change.


"Jumping between symbols in your IDE is not going to happen faster on the command line because it's a spatial operation. You've located the symbol visually, clicking it to jump to definition is far faster than keyboard-navigating to it, or even typing it."

When the lab I worked in moved between buildings, we each broke down our work stations and set them up in the new spot. I went through a good week of getting work done before I noticed that I hadn't plugged in my mouse - I just never reached for it, because virtually everything happens with a thought when my hands are on the keyboard.


Most of the time when I'm coding I use the keyboard to jump to symbols. But when you are getting acquainted with a large codebase, reading through and clicking to jump to symbol can be a more efficient way to parse the code and understand its interdependencies.

It may depend on what you use to position the cursor as well — I haven't used anything but a trackpad in the last five years, and even on that mostly I use gestures to navigate UI. Since the trackpad is almost part of the keyboard it feels very seamless to scroll/tap and type.


Yeah, I'm much more likely to bump a trackpad than reach for a mouse, even when the mouse gives me much more control (I always find trackpads a little finicky), just because of the distance. My hands don't need to actually go anywhere.

Of course, I also bump the trackpad accidentally sometimes, but that's a separate issue...


"Merging is better with a dedicated merge tool than a command line merge. I have to resolve conflicts one to three times per day and I couldn't imagine a non-visual merge interface for complex files where manual edits from both sides are necessary. Being able to see it all at once helps."

Being able to see it all at once absolutely helps, but on a big enough console you can see it all just fine. Vimdiff is great. A large monitor definitely adds value there, GUI or TUI.


Fair enough, I normally use DiffMerge and find it fairly quick at pulling relevant sections from the files being merged. But I agree that the three pane preview is a must.


DiffMerge is my all-time favorite visual diff tool, with 2 fatal flaws that upset me so much...

1. Unable to edit text on the left side, only the right. >.<

2. Related to 1, unable to simply paste two pieces of text into two views and compare them, the way you could on www.quickdiff.com. You have to manually create 2 files each time.

So sad, because it is so good in all other ways! And it's not open source, so I can't step up and contribute. :(


Although I don't have any problem stitching side-by-side commits in a terminal, or keeping multiple text buffers akimbo in general (it's what the terminal is ideal for), I can't argue with most of your points because you're probably 100% correct in the context of your environment and its limited modes of input (mouse-centric). And you have my sympathies. Being forced to use the mouse is a drag, and I would argue, a cruel thing to do to users from an accessibility standpoint. This (without going deeply into it) is what ultimately binds me to my "limited" set of inputs: keystrokes. Yet being bound to the keyboard interface has only helped my productivity. When I get improperly formatted assets from my designer (I know, right?) I can actually do all manipulation, resizing, splitting, and converting from the CLI faster than I can send ze an email! (Even some minor visual stuff! But my designer fucking hates when I do that so I tend not to anymore.)

But your environment and my environment are likely completely different and YMMV.


As I said, it has more to do with your working memory than with speed or ease of navigation.

If you are viewing more code at once (which a larger screen allows you to do) then you can be more productive at certain tasks. I find that large architectural changes, performance profiling, and complex debugging are particular scenarios that lend themselves well to a large screen. This is the point I was making to which you replied with your preference for the command line on a small screen.

I am fully aware of how to manipulate image assets on the command line. Most computer vision and computer graphics tools I have written are executed via the command line so they can be pushed into my build phase. But I deal regularly with many formats from many tools, and not all of them can be dealt with easily on the command line.

The command line is not immediately better than everything — most visual oriented tools use keyboard shortcuts. You may not realise this, but committing those shortcuts to muscle memory makes using graphical applications just as fast, if not faster, than many tasks on the command line.

Do you also do all your performance profiling on the command line? Because that is a situation where visual data is very helpful, and a visual interface is in general more efficient and faster to navigate than the same on a command line.

As I said elsewhere, I do not use a mouse at all, so I do not have a mouse-centric workflow. I use the trackpad centimetres from my keyboard for gestures. This only requires slightly more physical motion than pressing a key, and ties itself just as strongly to muscle memory. I also use a graphics tablet for design, painting, and 3D modelling.

And to repeat, my initial point relating to large screens still stands. If you can see more code at once (whether on the command line or not, who cares?) then you do not need to commit as much to your working memory, and you can more productively make large-scale changes that affect a lot of dependent code.

Same goes for performance profiling and debugging. The more info you can have up at once, the more you can dedicate your mind to reasoning about the problems at hand and the less you switch back-and-forth between views.


To your repeated point: It's been my experience that better knowing what the code is doing leads to improving it. I am not a smart or clever person. I can only see one focal point at a time. But with enough time (and a trivial screen resolution) I feel that I can understand and improve on something as complex as the operating system running my computer, which is some very big software with parts of it (big and small) written in the early '80s on 80-character-wide screens. This probably applies to everyone's operating system, so I'm just using it to juxtapose my next subject, which is the nightmare project (and I'm not making any of it up).

Looking back on the nightmare project almost makes me cry until I manage to suppress its memory. The difference between this project and my operating system is that my operating system is written by people who need to maintain software for a long time and, at some random point in the future, will need to rely on the maintainability of the code they wrote a lifetime ago. My nightmare project was not written with the project in mind, let alone the future of the project. Its problems were far removed from screen resolution. Its problem was the programmer. The programmer made it as painful as possible for me to understand what ze had written.

There was no abstraction and the code's entry point invoked every procedure in the project directly in various loops. There was no difference represented between the software's big wheels and tiny gears. This made large scale changes difficult because dependent code had no typed contract, just access to buffers and primitives passed around in what was essentially a big main().

There was no cohesion of modules in the project and procedures just appeared together, grouped seemingly at random and given names like project1.x, project2.x, etc. Even if I had enough screen real estate to have the whole project's source open simultaneously, I would have been just as lost as when I was trying to wrap my head around a single file.

There was never a consideration for inversion of control and all the flow parameters were hard coded paths in giant conditional nests all throughout the project. That made simply guessing which code paths to analyze for a given input difficult.

Seeing more bad code doesn't make fixing it any easier and seeing more good code doesn't help me understand it any faster. When it comes to code, there are many factors influencing how productive I will be at working on it and screen resolution should be down there on the list, hopefully near the bottom.

"You may not realise this, but"

Was that necessary?

"committing those shortcuts to muscle memory makes using graphical applications just as fast, if not faster, than many tasks on the command line."

What's even better about the command line is that you can pipe input and output between applications. Chaining commands and remaining on the CLI is faster for me than switching between applications and opening files. Most UIs have no way to define macros for commonly used procedures, whereas you can script anything CLI-based.

"Do you also do all your performance profiling on the command line? Because that is a situation where visual data is very helpful, and a visual interface is in general more efficient and faster to navigate than the same on a command line."

I do. I lack the sensory bandwidth to monitor more than a couple of metrics at once. I prefer profiling data that can be captured and analyzed later.


> To your repeated point: It's been my experience that better knowing what the code is doing leads to improving it. I am not a smart or clever person.

Here is the example I am dealing with right now. I have spent the last few months designing and implementing a complex system. It consists of hundreds of components and tens of thousands of lines of code. This project is reasonably small and I can keep most of the mental model "in my head" at once. So it's appropriate to edit on my laptop screen — or any screen. I can usually think of which symbol I need to jump to and within seconds I can be positioned in the editor, at that symbol.

Today I have had to take that component and push it into the rest of the system, which is hundreds of thousands of lines of code and it is a codebase that I have created over three years. I can't keep this all in my head, and this is where having a large screen produces noticeable productivity improvements and speed improvements to problem solving.

It is not a matter of not being able to jump around the code instantly from my keyboard — I can do that easily. It is knowing and remembering where to go and what I'm supposed to be working on. It is trying to maintain context while handling the many hundreds of errors that come about due to the necessary integration, API and architectural changes.

When I have a large screen I can afford to maintain a long list of files in a column on the left, I can see more large files at once. This allows me to constantly refresh my memory. I can glance at the data in front of me to determine what symbol I need to jump to next. I often mentally blank out when dealing with this level of scale — that "what was I about to edit?" feeling. When I am only on my laptop screen I have to start pulling up code, tracing a path back through it to redevelop context. When I have a large screen I can (usually) rebuild this context by glancing at the existing information.

(Note that this is not a messy project, I have spent a long time and given very careful thought to its design. It is just large and necessarily complex because of what it has to do. A bigger screen makes it faster to develop large scale interactions in such a project.)

> Was that necessary?

I'm sorry about that. It wasn't.

> What's even better about the command line is that you can pipe input and output between applications.

I have nothing against the command line — I use it all the time. I just like to take the most efficient path when working and sometimes that is not the command line. But that is tangential to the large screen point above.

> I do. I lack the sensory to monitor more than a couple metrics at once. I prefer profiling data that can be captured and analyzed later.

I had meant to ask if you analyse your profiling data on the command line. Do you?


My operating system is a huge and complex beast. But a lot of it, if not most of it, is modularized in a way that extending it never requires my touching any of its source files. It assumes a mission statement to the user (I'll make one up and say it's "I will organize your computer's resources and let you use them easily") and code contracts that allow multiple components to serve this goal. Internally, the components are abstracted into statements of a very simple logic-flow.

This paradigm persists in many components themselves, from the bootloader to the end-user. When I want to update something in user space, it involves, from my perspective, no interaction with the operating system (even though it is really the one doing everything). Conversely, if I want to do something as advanced as completely rewriting my filesystem implementation (something my operating system is very dependent on), I can, and without ever needing to touch my operating system's source. I could come up with any ridiculous way for a filesystem to work, and my operating system would be cool with it, and would operate the same way because of its modularized and abstracted design.

I would say these are incredibly good design features and that you would be extremely hard pressed to find a system that inherently cannot be designed in such a way as to ease maintainability and trivialize extensibility. Is your top-level project inherently so complex that adding smaller pieces requires that much screen real estate? If so, you have my sympathies.

> I had meant to ask if you analyse your profiling data on the command line. Do you?

Yes, my command-line editor has some powerful CSV support and manipulation features that make slicing the data up very easy.
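
Not that it matters which tool does the slicing; even a few lines of stock Python can do the same kind of thing on a captured profiling dump. This is just an illustrative sketch, not what the editor above does, and the column names are assumptions you would swap for whatever your profiler actually emits:

    # Hypothetical: sort a captured profiling CSV by total time and show the hot spots.
    # Assumed columns: function,calls,total_ms (adjust to your profiler's output).
    import csv, sys

    with open(sys.argv[1]) as f:
        rows = list(csv.DictReader(f))

    rows.sort(key=lambda r: float(r["total_ms"]), reverse=True)
    for r in rows[:20]:   # top 20 entries by total time
        print(r["total_ms"], "ms", r["calls"], "calls", r["function"])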


My argument is that in some cases large displays make coding more efficient.

In particular when navigating a large codebase, rewriting or refactoring an existing design, and other similar tasks. Your argument sounds to me like "If I write perfect code, I will never need to touch old code again and thus do not need a large screen."

It doesn't matter how big or well designed something is. There are going to be cases where you need to revisit, redesign, or rewrite. Your suggestion that you should just write code perfectly the first time assumes you know the perfect design for the problem at hand — the perfect design is not obvious in many cases, not until your code is used in production and many features have been added on top of it do you begin to think that you can revisit it.

In my case, I am writing a graphics engine, code editor, parser/AST builder, documentation viewer, and project management system. As well as managing documentation and strings localised in 20 different languages.

I have re-written the code editor three times now, each time I have carefully considered the design and how I could proceed with a multi-threaded, efficient editor free from rendering bugs and able to handle large amounts of data. At this point I think I have a great design and implementation, but it is far from perfect. Much of it was developed on my plain old small laptop screen.

Recently I pulled all old networking code from the system and re-wrote it using newer and better high-level APIs that have since become available. This task was benefited by a larger screen because I could look at old and new implementation, while looking at a second new implementation from a different area of the code base as reference — while also still having room for API documentation and related files. It just made things faster, it would still be possible on a small screen.

When I profile the OpenGL graphics engine being able to correlate performance graphs with locations in the code, while displaying GPU buffer contents is something that would feel quite clunky on the command line. It's also faster to be able to glance at this data with my eyes without having to flip between windows / contexts.

So my top level project is complex, but I never claimed it required much screen real estate. It can be more comfortable and faster to perform certain tasks with a large screen — which reduces the need for mental context switching. That's what I'm arguing and I find it hard to understand why you are arguing against this point.

> If so, you have my sympathies.

Your attitude seems quite condescending. As if my code is poorly designed and unmaintainable. This is not the case.

As I said: coding does not require extra screen real estate. It can just be more comfortable and more efficient when you have it.


I use two 27" monitors at work and would definitely feel a productivity hit if I were to go back to smaller or fewer monitors.

I dedicate one entire monitor just to a full-screen terminal with tmux running inside. It's trivial this way for me to edit and view and compare multiple files at once. I typically dedicate half of the second monitor to a browser, for testing, and the other half to administrative applications, like email, Jabber, and IRC.

The result is that the cost of context-switching is drastically reduced. I'm able to maintain focus for much longer stretches of time because I don't really have to ever completely switch contexts.


A tiling window manager would blow your mind. Anyone whose productivity is improved by using tmux should really try out something like i3 or xmonad or awesomewm.

I use i3: http://i3wm.org/

My vim-like config: https://github.com/SirCmpwn/dotfiles/blob/master/i3/config


I use xmonad on 2 screens, it's amazing how much better it feels than without a tiling manager.


For me, just turning my attention (and my head) to a second monitor is too much. I greatly prefer multiple workspaces/virtual desktops to multiple screens. I'm fine with completely switching contexts when the cost of context switching is 0.


I guess I can sort of understand this viewpoint, but I barely have to turn my head at all to go from looking at code to the browser. I can move my eyes between code and the browser (for instance) faster than I can switch between virtual desktops.


Well said. I wrote code back on a 640x480 just as easily as I now write code on my 1080p. The only differences in productivity have come from me, as a programmer.


I've always felt the same. A huge monitor has never felt any better or more productive than my Macbook Air -- just bigger.

I mean, I just never need to be looking at more than one window at once. And when I need to interact with multiple windows (editor, browser, e-mail, docs, whatever), then I need to shift the focus to each of them anyways to reload the browser, change the code, highlight a new email, etc. -- so I always have to cmd+tab to the window no matter what.

After all, you can only look at a single window at a time, and I never need to keep things open for constant monitoring or anything.

(I do a little 'trick' with OSX, however, which is to always resize every application to use the full screen area, and keep it on its own desktop, so I never have to deal with the clutter and confusion of overlapping windows. And remember, dragging and dropping files etc. works between desktops just fine!)


I like to have the following windows visible at the same time:

* Code editor
* Terminal with the logs of my app
* Web browser where I see my app running

It's really one task, and the focus is on doing this one thing (coding). But having these windows visible at the same time really saves my short-term memory.


What sort of programming do you do?

Doing web development in VS2010+ feels very cramped on anything less than 2 1080p monitors for me.


"What sort of programming do you do?"

Waddya got?

When I do web dev, I prefer a fullscreen terminal for editing and a virtual desktop with a fullscreen browser for proofing.


I was thinking this as well, and additionally praying I never have to work with code that requires debugging in 4K, which may be inevitable if one is writing in 4K.


30Hz is a dealbreaker. I will keep my 1920x1200 until I can have a proper 4K display. I really am surprised there are people who can tolerate that refresh rate on a desktop.


I accidentally ruined 60Hz for myself when I bought a 120Hz gaming monitor. Now 60Hz mouse movement feels choppy and horrible.


"Proper" includes 16:10 or even better (15:10 would be great)


Honestly I hate this new trend of making displays wider and wider. I feel like 4:3 or something a little bit wider is great for programming. Right now my two widescreen displays waste space at the edges because it's just too far to look at. Putting some of that space into vertical height would improve the use of screen real estate for work.


4:3 is 16:12. 16:10 is 'a little bit wider' :)


Why would you care about the aspect ratio instead of vertical pixel count? I think widescreens are okay as long as there is enough vertical space (to read code, articles, etc.). The window manager is a software problem and can be fixed. For example, Ubuntu's Unity had some clever solutions to utilize widescreens efficiently.


Assuming a sufficiently high pixel density, I think what most people care about most is physical height. Being too wide might make a display difficult to fit in to certain work spaces, but it's not nearly as big an issue with desktops as it is with laptops.


Especially if you're using your monitors for gaming and such, in addition to work. I was a bit disappointed at first, having just upgraded to 3x 27" 1920x1080 monitors, thinking I could have had 2x 39" 4Ks for about the same price. But 30Hz would basically eliminate gaming on them, much less Eyefinity/Nvidia Surround. With the 3x 27" set up you can easily move them between rooms when you're not using all three, too.


The Seiki supports true 120Hz at 1920x1080.


Advertised, but the 39" model does not deliver.

https://www.youtube.com/watch?v=4AClZklhqeA

Edit: Looks like crossflashing may fix it. Testing now...

http://www.avsforum.com/t/1482611/review-of-the-seiki-39-4k-...


Damn. That's weird it works on the 39" but not the 50".


Yeah, I'd love to have such a big, high resolution display, but it has to meet some basic standards. I agree with the article that it's a waste to market that kind of resolution as a TV (why would a TV ever need that kind of resolution?), whereas a cheap, good quality high-res monitor could sell very well.


Not only programmers.

ASIC designers. Analogue electronics engineers. Traders. Quantitative analysts. Anyone whose work involves big complicated things that it's difficult to hold all of in one's head: a large piece of software, the design of a power station, the results of a finite element analysis.

For almost any sort of heavy brain-work, having more pixels makes for more productivity. So for almost any sort of well-paid heavy brain-work, employers should be falling over themselves to provide lots of pixels. It is a longstanding mystery to me why so many aren't.


I had 4k for years (4x 22" displays in pivot mode). It really improves productivity on complex tasks such as debugging.

Soon I will upgrade to 3x Dell P2815Q with 6480x3840 resolution.


I'd love three of those Dell monitors. But did you happen to see the update to that Forbes article [1] that suggests that the P2815Q will also be limited 30 Hz refresh? I'm enduring 30 Hz now obviously, so I know it can be done, but I'd like to see Dell up the ante a bit and make 60 Hz 4K happen.

[1] http://www.forbes.com/sites/jasonevangelho/2014/01/07/dell-w...


30Hz is not really a problem for reading code. And it also makes things simpler for graphic cards.


It's a TN screen :(


That would be a deal breaker, but Lenovo and Asus make nearly identical IPS displays.


AFAIK all 28'' 4K screens are TN; that's why they are so cheap. If you want IPS, it's $1,300 for a 24'' (Dell), or $3,500 (Dell, Asus).


And the 32" inch $3500 ones are IPS on IGZO instead of traditional amorphous silicon.


Yes, sorry - I had a typo in previous comment, I meant $3500 for the 32'' from Dell and Asus.


The Lenovo ThinkVision Pro2840m looks like IPS; there are even videos which show good viewing angles.

Personally I only care about viewing angles, so cheaper technology such as e-IPS would be just fine.


That's the same number of pixels as 12 1080p monitors. Isn't there some level of diminishing returns after about, say, half that resolution? Just curious.


I personally think we ought to reconsider our whole desktop paradigms from scratch. Some of the ideas we're using were designed the way they are primarily because of display size constraints. For example, popup/tree navigation based application menus would probably seem silly when we could just dedicate a fraction of one side of the screen to always showing the whole tree.

I don't think we're anywhere close to the limit where something like Eagle Mode won't still benefit by adding more pixels (provided you can also supply the memory and processing capabilities for it to work responsively). The same kind of thing could apply to many tree structures, or graphs of complex relationships - being able to visualize large collections at once without the context-switch of scrolling is useful. Being able to view the whole source code of an application at once would be incredibly useful.

Another example might be reading or writing some research paper. Frequently you end up scrolling back and forth through a paper to cite or connect the bits together, which is pretty cumbersome, particularly when "back" buttons don't always have the expected or desired behavior (e.g, many viewers are missing a "push my current scroll position onto a stack before navigating" feature). Being able to view a whole document at once would ease some of that.

It'd be interesting to see some studies on the difference between the efficiency of using spatial layouts compared with sequential actions. It seems to me that spatial layouts quickly become rote actions that don't require much thought, but having to execute a set of commands is repetitive and error prone. (e.g, how many times have you clicked the wrong application in a task bar accidentally when you have a brain fart?).

It seems the industry is obsessed with heading in the opposite direction. I find mobile platforms a pain to use and I generally avoid them. Things like Gnome Shell's task switcher are also a regression from task bars, as the sequence to switch an application now has extra steps.


In terms of sheer workspace? Probably, but then you can dedicate the excess pixels to increasing pixel density and making things easier to read.

After getting used to the high-DPI displays in tablets and phones I have to admit, desktop text rendering has become very notably lacking.


It's not just that - you can read very small text on a high DPI display. I'm consistently amazed that I can basically read HN without zooming on my S4 despite it being tiny compared to my desktop (but the same resolution pixel wise).

That's a big benefit for when you want to keep an eye on some debug output but not work on it directly, or quickly parse big sections of text.


I usually use one window per display. The first display is for code navigation, the 2nd and 3rd for code windows, and the 4th for documentation and notes.

Also, a decent window manager with hotkeys and scripting helps (KWin on Linux).

I practically never use alt tab :-)


My current setup is 4x22" at 1050x1620. I originally started with two displays, but kept adding over years. I think each display increases productivity by half of previous (2nd display adds 50%, 3th adds 25% and 4th 12.5%).

My secondary screens are slightly damaged (dead pixels, scratches...) so they cost peanuts. The real limit is electricity bill.

Also with such large setup you need good light balance. I always use inverse color scheme (light on dark background). For example I even filter web page colors.

I am upgrading because my laptop and phone have retina displays and I love it. Also, LED displays should have lower energy consumption compared to my ancient displays.


Heh. Lots of articles about "technology sucks" on a page which has broken middle click so I can't open new tabs, can't right-click to copy a link URL (in the 'more' section), breaks the back button if the AJAX request completes after you press back again...


I'd love to have a good 39" 4K monitor, but I really don't want any other developer in my office to have one. For some reason, everyone else feels that they should use the entire screen width for code.

So we end up with 200 character long lines on our 24" monitors. Can't imagine the horror they'd produce with 39" monitors.


Not having an arbitrary length limit allows one to write code prose, instead of being limited to code poetry... :)


4K isn't only for programmers though, or even only for programmers and a few other specialised technical or AV jobs. Bring the word of high-resolution monitors (ideally through demonstration) to your friends and relatives who are navigating tax law or writing papers about Beowulf: 4K is for them too. Doing any serious work or study in law, business or the humanities on a single small, low-resolution computer screen is horrible: it's like being forced to read only through a cardboard kitchen-roll tube. And the more people who are demanding reasonably-priced high-res monitor screens, the more likely manufacturers are to take note.


Does anyone know if the 15" 2012 MBPR (that is, MacBookPro10,1) supports 4K with its HDMI output under Mavericks? Of course, it will only be 30Hz, but that's tolerable.


I've read officially no, but Apple frequently gives "official" specs that claim a feature isn't supported when really it is.

For example, every macbook pro in the last 5 or 6 years that had replaceable RAM had an actual RAM limit that was double what Apple said was its RAM limit. I wouldn't be surprised if 4K works, albeit perhaps a little sluggishly.


>Apple frequently gives "official" specs that claim a feature isn't supported when really it is.

"Supported" is not the same thing as "it works".


Why would Apple do this? What possible reason could they have for selling their own product short?


Reducing the number of choices in front of people lets them feel more confident about the one they make.

If you give them the option of 4GB, 8GB, 16GB and 32GB instead of limiting it to 16GB, lots of people will freeze and never make a choice, or get frustrated when they see the markup for 32GB and avoid buying altogether because of that.

In limiting the amount of upgradable RAM on their sales page they show you choices that are most likely going to be easy to sell.


They probably figure it won't necessarily work very well, so people would complain, so they'd rather just short-sell their machines.


Apple simply does not cater to the crowd that does aftermarket upgrades to their computers. The limits they list on the box are usually the top configurations they will sell you. They do not bother testing and certifying every combination of third party component that might happen to fit into the machine and work.


Really? Are you saying my 2012 MacBook Pro could use 16 GB? Do you have any source for that?


It does. I have the first 2012 Retina MacBook Pro (bought it the day it came out for its 2880x1800 display, back in June or July of 2012).

You need to be running Mavericks to use its native HDMI port. I am running the default version it came with (10.7.x), so it only works on the HDMI port under Linux (which I primarily use on it) and Windows. That being said, I use an Accell active DP 1.1 -> HDMI 1.4 adapter, which makes it work just fine even on older versions of Mac OS X. Actually, with that adapter you can use a lot of even much older Macs to drive the Seiki or any other 4K 30Hz display.

The adapter: http://www.amazon.com/Accell-B086B-008B-UltraAV-DisplayPort-...


The late 2013 drives at 30Hz over HDMI, and 60Hz over DP if you are running Windows.

I don't think the HDMI hardware in the 2012 is new enough for 4K, though in theory DisplayPort might work if the drivers get enabled for it.


From what I can tell (running the same unit you are), only the 2013 rMBPs will drive 4k panels. Would love to be wrong on that, though.


I've got an early 2013 rMBP with the NVIDIA GeForce GT 650M and it drives 4k at 30 Hz over HDMI.


Can you reply with your Model Identifier?

Click the Apple icon in the top left of the menu bar, choose "About This Mac", click "More Info", click "System Report", and on the first screen ("Hardware Overview"), the Model Identifier will be displayed. Mine is "MacBookPro10,1".

I would love to verify that it is capable of driving these 4k displays, if you wouldn't mind checking.


MacBookPro10,1


Huh. Apparently that one's identical to Mid 2012 except for a tiny improvement in CPU, so that sounds promising. Thanks!


Thanks for confirming!


My 2012 (MacBookPro10,1) works with Mavericks out of the box. I use SwitchRezX to get it up to 31Hz.


How is this better than multiple monitors? The support in Windows (and at least Ubuntu Linux) for e.g. maximising a window to fill a single monitor is excellent.

The other thing: any article like this where Windows is not used for testing makes me skeptical. Most programmers do not work in offices developing on Macs or Linux machines, by a very large margin. Working on a Mac for iOS and OS X development is something we grudgingly do because Apple enforces it... in fact you could say that all programmers use Windows and be very close to being accurate.

I've never used multiple monitors on a Mac, so I can't comment there - I can imagine the support being just as good as in Windows or Linux.


An extremely large number of programmers use Mac and Linux machines. Linux on the desktop is sustained to a substantial extent by a programmer audience - and you only have to look at the availability of software for it to see that it's popular among that group.

Personally, when I'm not using a unix-y environment, it's because I've been forced. Using windows as a programmer feels like having one of my hands chopped off.


Sure, it is a large number, but large != most.

Every programmer I know works in a Windows-dominated environment. This is a small sample though.

I believe that in back-end-only environments Linux has some serious popularity, but I can't comment on how many studios use it as their development platform because I have no great experience there (actually I work in an office where they develop such things, but they use Windows exclusively).

For desktop software, games and web front-end development, Windows is king. It's the platform that all of your target audience are using, to an excellent approximation. For AAA games there is an extra restriction that it's the only practical choice (there are zero tools for working on any other platform). Sure, for iOS/OS X there is the same restriction to a Mac, and iOS development has become extremely popular - even so, there is quite some resistance to using OS X and Xcode as anything more than a test configuration on cross-platform projects - I've seen a lot of Macs dual-booting to Windows in this context. For Android, Linux can give you a small edge, but it's not much...

Linux is obviously meant for programmers, but that doesn't mean they use it either - until quite recently even the best flavours were horrible user experiences. I think Ubuntu is great and really takes steps to get away from the 'you download source and build your app with archaic and buggy tools at the command line' approach which has always dominated... in terms of doing your job though - unless you work on the backend of some web service you can't do any useful builds under Linux, unless you are targeting the Linux desktop, which is exceptionally rare.


I don't dispute that windows developers are in the majority - I dispute that other choices are as vanishingly rare as your initial comment suggested.

Anecdotally (I work at a large corp), slightly under half our devs use Linux. You only need to look at the array of software-for-programmers that's available under Linux to see that it's an extremely popular choice - the availability of programming software and libraries is generally better under Linux than it is under Windows.

While Linux is certainly a bit more hassle to maintain than Windows (I use a Mac when given the choice, which I find gives me the best of both worlds), it's still a clearly superior environment for highly technical users. It's only recently, with the introduction of PowerShell, that Windows has stopped being a substantial handicap for developers, imo.


Every programmer I know works in a Windows-dominated environment. This is a small sample though.

Indeed, every programmer I know works in a Linux or Mac dominated environment!


Every web frontend dev I've ever worked with used a Mac.


Maybe you didn't read the whole thing? I use Windows on my workstation at the office and included two photos of that. It works just fine on Windows.


My bad. Not sure how I missed that. Must have reached a natural ending point at the bottom of my screen. :)


Flagged, for two reasons:

- Requiring JS for text. Okay, I'll turn it on this once since you claim "nothing dodgy."

- Disabling my control click to open in background tab, especially after claiming "nothing dodgy"? You shouldn't be on the front page.


Well at least you posted why. Incidentally, I got a lot of flack for the animation and loading posts via AJAX and the history API rather than just as traditional pages. Lesson learned: a blog isn't really the place to experiment with my meager UX skills. So I'll probably simplify it when I get some free time.


What about couch potato programmers? That strikes me as a pretty sizable market segment.


Whatever. It’s not the number of pixels, it’s the pixel density that’s important. Higher density is what is going to make text and pictures better. At 39″, that’s 113 ppi. Anyone can rig up a multi-monitor display setup with higher ppi. The only advancement is one physical screen that a good number of GPUs can’t take advantage of.

For comparison, my Retina 13″ MacBook Pro has 2560x1600, 2.5K (or really ⅔ 4K), but at 227 ppi.


The real factor is pixels per degree of your vision, not ppi. Viewing distance range with phones and laptops is pretty limited (an arm's length), but with a monitor you can simply move it farther away on the desk to increase the perceived pixel density.

Of course, moving the monitor farther away also makes the perceived monitor size smaller. Since PPI and distance can be traded one for the other, what ends up mattering is the total resolution. 4K is a lot of pixels.
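
If you want to put numbers on that trade-off, the geometry is simple enough to script. A rough sketch (the viewing distances below are assumptions I picked for illustration, not measurements):

    import math

    def ppi(width_px, height_px, diagonal_in):
        # Pixels per inch from resolution and diagonal size.
        return math.hypot(width_px, height_px) / diagonal_in

    def pixels_per_degree(ppi_value, viewing_distance_in):
        # Pixels covered by one degree of vision at a given distance.
        inches_per_degree = 2 * viewing_distance_in * math.tan(math.radians(0.5))
        return ppi_value * inches_per_degree

    seiki = ppi(3840, 2160, 39.0)   # ~113 ppi
    rmbp = ppi(2560, 1600, 13.3)    # ~227 ppi
    print(round(pixels_per_degree(seiki, 28)))  # ~55 px/deg at ~28" away
    print(round(pixels_per_degree(rmbp, 20)))   # ~79 px/deg at ~20" away

So the laptop still wins on perceived sharpness at typical distances, but pushing the 39" panel farther back closes the gap while keeping all 8 million pixels.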


For programming, pixel density will probably just get me smoother fonts. Previously it might have gotten me more lines of text, as I could've reduced the font size while maintaining readability. But I think I've reached a limit there, where further decreasing the point size just hurts. And honestly, with your usual fare of monochromatic fonts and current font smoothing tech, I'm quite okay even with "just" 1920x1200 resolution.

For other stuff, sure, more pixels won't hurt. Reading PDF at high ppi devices is a joy.

But getting back to programming, I wonder how I'd cope with a 39" display. At the same position where my monitors are now, I could probably replace two with just the one. With two monitors my viewing angle is too wide anyway, so again I'd mostly use either one half of the screen for lots of stuff while paused videos and web pages occupy the rest, or just push everything to the sides and focus on the center. Maybe TekWar had it right...
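
Out of curiosity, a rough estimate of how much actually fits on one of these panels at 1:1 scaling (the character cell size is a made-up assumption; real fonts and terminals vary):

    # How much text fits on a 3840x2160 panel, assuming an ~8x16 px monospace cell.
    width_px, height_px = 3840, 2160
    cell_w, cell_h = 8, 16

    print(width_px // cell_w)    # ~480 columns, i.e. four 120-column panes side by side
    print(height_px // cell_h)   # ~135 rows of text

Even with a bigger, more comfortable font you end up with something like a 2x2 grid of ordinary 1080p-sized editor windows, which lines up with the "4 x 20" 1080p monitors" comparison that comes up elsewhere in this thread.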


The classification of this as a television reveals the schizoid nature of display development. For programmers the main requirement seems to be lots of pixels without undue eyestrain. Where things start to get crazy is that 4K is now the standard for movie projection in commercial theatres, and creative professionals of all stripes are clamoring to produce content that takes advantage not only of the pixels but of all the qualitative properties that photographers, illustrators, and animators need. For me as a photographer, having a 4K camera is a wonderful thought in terms of resolution, but if the display can't integrate into my workflow to match my printer then it is not so useful. It is this referenceability issue that makes it possible for Sony to sell $10,000 reference monitors for television production. I can only hope for the day that I can buy a $500 4K monitor for creative professionals.


> Several colleagues found the display shockingly bright and were frustrated that the brightness adjustment did not actually reduce the backlight intensity.

In 2007 I had the idea to use my 28" flatscreen TV as a monitor; it took three days before I had terrible eye strain/migraines/light sensitivity.


Yeah, I have a beautiful little ASUS netbook (the newer ones that ship with Ubuntu) and its brightness ranges from reasonable to just barely on. It's fantastic being able to tune the brightness to the ambient light in the room (as well as save on battery life.)

I really don't care about 4K; I want a monitor that's got a wide range of brightness levels with the same integration with my Ubuntu workstation that that netbook has with its integrated screen. (I plug it in and the OS immediately knows how to communicate with the monitor to adjust the brightness based on a keystroke.)


Does the alleged productivity boost come from the increased screen real estate, or the feeling of value that comes from your employer investing in you? I had a distinct boost when I went from a 17" Dell laptop to a 13" MacBook Air.

It'd be interesting to see if any increase lasts.


I agree that higher-resolution displays are a productivity enhancer, but I also believe that saving money by purchasing a television as a monitor is not a good idea… taking care of your eyes is worth every extra dollar.

Just recently I went through the same issue. I had to buy a new monitor for my workstation and finally decided to go with a 2K Apple Thunderbolt monitor. I also considered the Samsung alternative, but it doesn't have some of the features Apple added, such as automatic brightness control and additional ports.

My decision was based on taking care of my vision. TVs simply aren't built to be viewed from a desktop distance; they are just too bright… they were meant to be used from a distance.


My two 23" screens area already too big for me. I could cut half off each almost. They're nice IPS panels with minimal glare and really good for your eyes. Putting a cheap 39" TV in front of my nose 8 hours a day sounds ridiculous.


I don't see how anyone could use a 30hz panel for a productive length of time. At 40hz, which is what my laptop goes to when it's in power save mode, even something as simple as scrolling a web browser looks like a torn, jagged mess.


> This blog uses a little JavaScript. Nothing dodgy, though, and nothing hosted at third-party sites. Just some jQuery and animation bits. So please, if you'd be so kind, ask Noscript to call off the hounds. ... That said, if you insist on leaving script off, you can just scroll down a bit to read the content.

This site is my hero. Write a "no JavaScript detected" message as if blocking JavaScript is perfectly understandable (even if you secretly believe we're all hatted in tin). You know your audience with these; what does it hurt to throw them a bone?


(That said, I don't block javascript to keep content providers from doing dodgy things quite so much as I do it to avoid fallout when someone breaks into your server, or slips something into your page in transit.)


Very tempting! Unfortunately I like to have my monitors close to my face, so I think the large size of the television may cause viewing angle problems.


I thought the same thing, but now that I have the Seiki it is very similar to having 4x 20" 1080p monitors. The viewing angle is decent as long as you place the monitor about 2-3 feet away.


Article about 4K monitors with 300 pixel photos. It's just not right.


I feel like I must be in the minority here: I enjoy multiple (smaller, 28" max, sub-24" preferable) displays.

I run iTerm2 in full screen with Vim split alongside 2 or 3 console panes; I find the separation from the rest of my OS a blessing. The second screen I use for Chrome, HipChat, Flint, Sparrow, etc. I still notice chat notifications, and keeping an eye on my dock (which displays all notification counts across my multiple programs) is a simple key combination away if I'm in 'the zone' and don't wish to break out of my programming environment (iTerm2).

I've tried larger screens in the past, and found myself physically looking around the screen a lot more, simply due to having all my content splatted together.

Combine this with a keyboard-based window manager and 1/2- and 1/4-screen segmenting, and I feel much more organised and efficient than when trying to manage a 30"+ monster of a screen.


I don't think I'd want to work somewhere where it appears to be compulsory to use a ridiculously cheap $500 39-inch UHD display. How ridiculous! 28-30 inches is plenty big enough for a desktop computer.

The intentions are good. But seriously, why not just wait a few weeks for the Dell 4K monitor that's priced well under $1,000?


Anyone know, is a Windows machine required for the step of applying the mentioned firmware fix to this monitor?


Nope! The monitor "boots" and patches from a file on a USB key. It's fairly painless.


Sweet, thanks!


A few weeks ago I got a brand new 27" 1920x1080 Asus display for some 200 bucks. Once at home, I realized it was a TV set without the TV receiver. This is the worst thing a programmer could ever buy.

A programmer's monitor should have not only a fairly decent contrast, but also the darkest black possible. This thing, even at the lowest brightness/contrast/colours settings, outputs a lot of light (and its black is just a gray), making it unsuitable for programming with any color scheme.

I switched back to the old 19" 1440x900 Samsung monitor bought some seven years ago, which has a true black. I'll pop in a mediacenter box and throw the 27" in the dining room.

Before buying a monitor, you should check its "blackest black" in low light with a full-screen black image.
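
If you want a quick way to run that check, a minimal Python sketch (standard library only, nothing monitor-specific assumed) that fills the screen with pure black is enough; dim the room lights and see how much the panel still glows. Press Escape to quit.

    # Full-screen pure black for eyeballing a panel's black level.
    import tkinter as tk

    root = tk.Tk()
    root.attributes("-fullscreen", True)              # cover the whole screen
    root.configure(bg="#000000")                      # pure black, no UI chrome
    root.bind("<Escape>", lambda _e: root.destroy())  # Escape exits the test
    root.mainloop()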

And -yes- that tsotech article just sucks.


> And -yes- that tsotech article just sucks.

It's just a blog entry. I'm not a journalist.

But thanks for the criticism. It's clear you and I just disagree about displays.


It is definitely a productivity booster for devs. However, if your software's audience includes low-resolution users (enterprises, accessibility, etc.), ensure that design/layout testing for low-res usability does not become an afterthought. We have had this oversight in our team a few times.


My problem is that I have really bad eyesight and I don't want to program all day on a monitor with pixel density like that. I have a laptop which has a high pixel density, 1080p at 13 inches and it is just not ideal. I am zoomed in 150% to 200% on websites routinely.


High pixel density should, ideally, have no influence on text size, only on the sharpness of the text.

You should increase the DPI setting in your OS.
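
To put rough numbers on that, using the sizes mentioned in this thread (and assuming the laptop is a typical 13.3" panel), the 39" 4K set is actually less dense than a 13" 1080p laptop:

    # Rough pixel-density comparison; PPI = diagonal pixels / diagonal inches.
    import math

    def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
        return math.hypot(width_px, height_px) / diagonal_in

    print(f'39" 4K TV:        {ppi(3840, 2160, 39.0):.0f} ppi')   # ~113 ppi
    print(f'13" 1080p laptop: {ppi(1920, 1080, 13.3):.0f} ppi')   # ~166 ppi (13.3" assumed)

So at the same pixel sizes, text on the 39" set is physically larger than on the laptop; DPI scaling matters more on the laptop than it would on the TV.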


I don't really understand monitor specs. Is the 30Hz limit due to the size of the monitor, or is it a GPU limitation on most PCs?

I'm wondering if a 4K 36" monitor would have a better limit, 40Hz or something?

Also, my laptop is a few years old with Intel HD integrated graphics. Any idea if it would have enough power to run one of these?

What about a ChromeBook?

I'm trying to get an idea of just how much power these things need.


The panel is capable of 120Hz on both the 39" and 55", but the 39" is further restricted to 60Hz at all resolutions. The 30Hz limit at 4K comes from the HDMI connection. The TV does not have DisplayPort, though HDMI 2.0 is hoped for in a future firmware update (at least based on Sony's announcement of such a firmware update for their sets).
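
A rough back-of-the-envelope on why the link, not the panel, is the 4K bottleneck (assuming the standard CTA 4K timing of 4400x2250 total pixels including blanking, and 8 bits per color, where the TMDS clock equals the pixel clock):

    # Why HDMI 1.4 caps this panel at 4K/30Hz: pixel clock vs. TMDS ceiling.
    HDMI_1_4_MAX_TMDS_MHZ = 340   # HDMI 1.4 TMDS clock limit
    HDMI_2_0_MAX_TMDS_MHZ = 600   # HDMI 2.0 raises the ceiling

    def pixel_clock_mhz(h_total: int, v_total: int, refresh_hz: int) -> float:
        return h_total * v_total * refresh_hz / 1e6

    # 4400x2250 = standard CTA-861 total raster (with blanking) for 3840x2160.
    for hz in (30, 60):
        clk = pixel_clock_mhz(4400, 2250, hz)
        ok_14 = "ok" if clk <= HDMI_1_4_MAX_TMDS_MHZ else "too fast"
        ok_20 = "ok" if clk <= HDMI_2_0_MAX_TMDS_MHZ else "too fast"
        print(f"4K @ {hz}Hz: ~{clk:.0f} MHz pixel clock "
              f"(HDMI 1.4: {ok_14}, HDMI 2.0: {ok_20})")
    # -> 4K @ 30Hz: ~297 MHz pixel clock (HDMI 1.4: ok, HDMI 2.0: ok)
    # -> 4K @ 60Hz: ~594 MHz pixel clock (HDMI 1.4: too fast, HDMI 2.0: ok)

The ~297 MHz figure is why 4K tops out at 30Hz on this set; an HDMI 2.0 (or DisplayPort) input would be needed to clear the ~594 MHz required for 60Hz.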


If you flash the 39 inch with the firmware from the 50 inch, it can do 120Hz at all resolutions 1920x1080 and below. I even feed it 240Hz @ 720p; it drops half the frames, but doing so reduces the input lag from around 9ms to 4.5ms.


Sounds too good to be true. 30 fps mouse cursors were starting to drive me nuts; I'll have to try this soon. I wonder whether, if/when Seiki comes out with an HDMI 2.0 version of the TV, we can reuse its firmware here... Edit: I see that Seiki announced new 4K TVs with HDMI 2.0 at CES, but only 50"+, no 39". I can only hope the new firmware has clues for getting HDMI 2.0 running on my 39" eventually.


Honestly, I've long used a pair of 1080p 37" TVs as my preferred monitors: one landscape, one portrait. I place them far away so my eyes don't have to work as hard to focus, and fonts are big for the same reason. I don't need 4K whatsoever; it would just make things too tiny to read, given the current poor support OSes have for DPI-independent rendering.


Well, this is just about the opposite of what many people have been moving towards for the past decade (smaller, more portable, lighter).

I do think a single large monitor would be nice. I'm using dual 24's with a 13" retina display centered below and find it difficult to make smart use of the screen space because I have to turn my head constantly.


Part of the problem might be gamers not having powerful enough machines and not wanting a blurry image from scaling.


I think PC gamers would find the 30Hz much more of a disincentive. Most console gamers are used to 24 fps anyway, but they don't even have the option to reach that resolution (yet?).


Yup, 30Hz would be a no-go. And I am pretty sure Sony has said the PS4 is able to play video at 4K but has no intention of running games at that resolution. Hell, games for both systems aren't even 1080p across the board.


Does anyone know where you can get something like this at a similar price point in Germany? The only price I could find for the exact same TV was 1,200 euros (about $1,600). Not even eBay's international sellers seem to have it, and those proved pretty useful when it came to the Korean high-density IPS monitors...


Yes indeed. And programmers should get multiple screens too. This will only get better over time.


For movies, last time I checked about a year ago, it was still difficult to get a Blu-ray that was real 1080p (they are often just DVD masters converted to Blu-ray, and the quality is not any better)... I wonder how many years it will take for 4K content to become widespread.


Not my experience at all, although I could see that being true for older movies.


> but if you are programming on something so old, you should first contend with that.

So for programming Android apps I need a top-of-the-line GPU? Most ridiculous thing I've ever heard. Not even bothering reading the rest.


I had to bring my own shitty 19 inch monitor to work just to have two :/


No, 4K is for film editors and colorists. These are the people who think about resolution, color, persistence of vision, and so forth for a living, and who ultimately guide the standards that result in quality monitors. Whingeing about HD as if it had hindered some programmer-driven monitor nirvana is laughable; if the needs of programmers were the guiding light of display technology, we wouldn't be using GUIs, because so many programmers prefer a command-line environment.

> I want a 50-inch desktop display with north of 10,000 horizontal pixels.

Big whoop. I want an interactive holodesk and a bidirectional wireless neural interface too. I also want dynamically generated, domain-specific function graphing so I don't have to shoehorn my ideas into a one-dimensional text stream.


I'd say 4K monitors up to about 42 inches are for programmers; between that and about 8 feet, 4K is a gimmick, and beyond that 4K projectors are for couch potatoes. 4K VR headsets will cover everyone.


Of these, I feel the least certain about VR headsets. I've not yet tried one so I remain skeptical. But I agree, if a terrific VR headset hits the market, that would be compelling.


They say that having a 120Hz monitor greatly improves the desktop experience (it's not just for games), so wouldn't 30Hz be a real downgrade if you're used to 60Hz?


Honestly, it depends on the person.

At work I have an old 75Hz 1024 x 1280 panel (portrait) beside a newer 60Hz 1200p panel. If you look closely, you can tell the old panel is smoother, but that's just it: you have to look closely. It definitely isn't a jarring difference. IMO I wouldn't get much more out of a 120Hz panel just for programming and daily use.

As for gaming, I know the extra framerate definitely smooths out the experience in FPS games. I'd take a 1440p IPS panel over a 1080p TN (120Hz) any day, but I know others who say the exact opposite. It's totally subjective.


Just wait 12 months. Better 4K options are on the horizon.


This monitor is listed as $999 on Amazon, following the very link he posted in the article. Is there somewhere else you can buy it for $500?


He's talking about the 39 inch, which is $500. When you go to Amazon, it defaults to displaying the 50 inch. Click the 39 inch button to the right of the product image and you'll see the $500 price.


Would I get the full resolution using the DisplayPort out on my laptop and a cable to convert that to the HDMI on the TV?


I have been using the Seiki 39" for about 4 months now and I agree with a lot of what @bhauer has to say.

There are a couple of caveats as a developer: 1. You end up writing really long lines of code that might upset other developers. 2. I'm on my second Seiki; my first one died and it took a month to receive a replacement.


Does anyone know if a macbook pro (1 year old) can support 4K monitors?


Only late 2013 retinas do


Not true. The older 2012 rMBP will do it fine too on newer versions of OS X or Linux/Windows, and that is just over the HDMI port. With an active DP 1.1 -> HDMI 1.4 adapter, even very old Macs that have Thunderbolt will run the display just fine.


alt+tabbing? wut?

Synergy + 17" Laptop as server + Desktop with 2x 24" screens on desk arms...

Someone explain why a single large screen is better for a programmer when Synergy is dead easy to set up.


Some people loathe the separation between screens (I am one of those people), so I have a single nice 30".


Who would disagree that more pixels are better? I DO disagree that a 39" glossy monitor from a garbage brand, which only runs at 30Hz at full resolution, is a good option right now. Too big, too glossy, and every Seiki monitor I have come across has had problems within the first two years (everything from displaying all green to catching on fire).



