Linus Torvalds: Make 2560x1600 the new standard laptop resolution (plus.google.com)
633 points by orjan on Oct 31, 2012 | 327 comments

Now that Windows 8 and Mountain Lion both credibly handle HiDPI for the first time, I agree: it's about time we switched. But laptop makers are hardly to be roundly chastised for not going super-density until now (though higher than 1366x768 was always workable), since the Windows experience of HiDPI was pretty broken. Fonts would get clipped inside too-small bounding boxes, things wouldn't line up, window chrome would be too small, and so on.

Now all we need is for Linux to credibly support it as well, or at least a Linux desktop built for mouse use. There are still many, many usability issues in GNOME with a high-PPI screen.

Also, let's not forget how far mobile GPUs have come in the last few years; until recently it would have been impossible to push that many pixels with anything but the most minimal of 3D use cases.

It's a much more complex problem than Linus suggests by simply having OEMs switch panels. Witness how relatively complicated Apple's solution is, which came after years of supposedly "somewhat" supporting HiDPI. Use the HiDPI MacBook Pro at 1920x on the Ivy Bridge GPU and it's still noticeably laggy at some 3D operations.

Care to share why you think Windows 8 handles DPI differently than Windows 7?

At least in terms of native applications I've seen literally no differences; maybe Metro handles it better but Metro on the desktop is, well, Metro on the desktop.

PS - I know it is now called "Modern UI" but that name sucks.

I do mean Metro, mostly. Aside from engine improvements, Microsoft has been really beating the drum with third-party developers to support high-PPI screens out of the gate. Booting the retina MBP into Windows 8 works surprisingly well from the start; I have it installed with Boot Camp.

A lot more detail about the mbp and 8: http://blogs.msdn.com/b/tonyschr/archive/2012/08/29/windows-...

MS has been trying to get third-party developers to make resolution-independent apps for a very long time now. Last time it was WPF that was supposed to solve all the problems; now it's Metro.

E.g., here is one blog post from 2006, which in turn refers to a guide written in 2001: http://blogs.msdn.com/b/greg_schechter/archive/2006/08/07/69...

IIRC there was a similar blog-post series about high-DPI stuff when Windows 7 was released, but I couldn't find it now with Google.

WPF and Metro are of course both XAML, so the same push from the WPF days applies. I've built apps in Silverlight and WPF that scaled just fine. You just need to make sure you start with proportional design and stay that way, using some key min and max height/width properties where necessary. Most desktop app devs are used to fixed-size GUI widgets, which is where the problems start.

Once you get into a proportional mindset, the hardest problems tend to be related to text and images. One trick I learned for the latter was using vector art as much as possible and converting SVG images to XAML (care of XamlExport[1]). Then scaling no longer becomes an issue. For raster art, I would hope Metro has something similar to iOS's retina graphic substitution magic. Silverlight 4 didn't which was a bit of a drag.

Text was always a little trickier but when dealing with full screen apps I would usually just have some code that changed the text size based on the delta between the new and old window size.

[1] http://www.mikeswanson.com/xamlexport/
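The window-size-delta approach to text scaling described above can be sketched as plain arithmetic (a minimal illustration with hypothetical names, not actual WPF code):

```python
def scaled_font_size(base_font_pt, design_width, actual_width,
                     min_pt=8.0, max_pt=72.0):
    """Scale a font proportionally to how much wider (or narrower)
    the window is than the size it was designed for, clamped to a
    sane range -- mirroring the min/max tricks mentioned above."""
    factor = actual_width / design_width
    return max(min_pt, min(max_pt, base_font_pt * factor))

# A 12pt font designed for a 1280px-wide window, shown at 2560px wide:
print(scaled_font_size(12, 1280, 2560))  # -> 24.0
```

The clamp matters: without it, shrinking the window would eventually scale text into illegibility.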

Vector art doesn't scale very well. Most icons are designed in Illustrator (as vector art) and then touched up in Photoshop for each scale used (as pixel art), mainly because real-time rendering of vector art just isn't at professional quality yet. So the only solution, which has been adopted by iOS and Metro (and probably Android), is to include multiple icons for each DPI class.

In my experience with converting to XAML you don't experience the same degradation, since your art is being actively redrawn. You can add gradient brushes and other embellishments that scale well. Perhaps I will dust off my Windows laptop and do a blog post about it.

That said, you mentioned that Metro will do image substitution à la iOS. That's fantastic, and something I sorely missed during my Silverlight dev days.

It seems more baked-in this time, with Metro automatically engaging two different scaling targets (full HD and 2560x) as well as the baseline ~100 DPI. So people will be in HiDPI modes without explicitly engaging them, and a good number of people at that, since 10" full HD is looking like a high-volume form factor. All of the UI elements are vector, fonts obviously are, and there is built-in support for swapping out bitmaps for the three stated targets, even in CSS from the internet. Visual Studio 2012 has these DPIs built into its UI builder previewer, along with the typical form-factor previews. I guess time will tell, but it certainly feels like the most credible attempt to date to me.

> Care to share why you think Windows 8 handles DPI differently than Windows 7?

I can't speak to the specifics of how rendering differs between 7 and 8, but the frameworks released with 8 have, or are moving toward, vector and resolution-independent rendering.

Yeah. Metro is resolution-independent; it can scale to 100%, 140%, or 180% (or at least, those are the suggested asset sizes).
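One way to picture those plateaus: the suggested asset variants are just the base (100%) dimensions multiplied by each factor. A quick sketch (plain arithmetic from the percentages above, not any actual WinRT API):

```python
SCALE_FACTORS = (1.0, 1.4, 1.8)  # the 100%, 140%, 180% plateaus

def asset_sizes(base_w, base_h):
    """Return the pixel dimensions an asset should be supplied at
    for each of the three scaling plateaus."""
    return [(round(base_w * s), round(base_h * s)) for s in SCALE_FACTORS]

print(asset_sizes(50, 50))  # -> [(50, 50), (70, 70), (90, 90)]
```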

Apparently it is called Microsoft Design Language by those presenting at the Build conference.


Another great post on the issue from Microsoft: http://blogs.msdn.com/b/b8/archive/2012/03/21/scaling-to-dif...

Problem solved in Metro. But now desktop needs the same love.

I used Win7 at 1920x1080 and had zero issues with the DPI handling. Oh yeah, except for Chrome, but then I don't use it. In Firefox I set layout.css.devPixelsPerPx to 1.2. Other apps all work with the system DPI setting.

I didn't spend a whole lot of time looking, but the High-DPI settings I played with seemed to provide half-assed results: fonts and some components of the window were scaled appropriately, but other bits (spacing between icons comes to mind; a bunch of text was prematurely truncated or something) were just not quite there. It's a big improvement from previous versions of the OS, but the implementation seemed a bit off compared to iOS and what I've seen on Retina MBPs.

> Now that windows 8 and mountain lion both for the first time credibly handle HiDpi I agree, it's about time we switched.

I made that decision back in 2004 when my laptop had 1680x1050 resolution. Since then I've only gone higher and I've yet to have problems.

Windows seems to have handled this pretty well long, long ago so I have no idea what people are complaining about.

You turn UI scaling up to 150% in Windows 7 or earlier and don't experience visual and alignment issues? Or do you mean that you don't find the elements too small when used at the default size? There's no argument that the latter works fine; obviously 1680x1050 isn't even particularly higher density than 1366x768. For most people, though, once you get to, say, 11" at 1920x or 15" at 2880x, you really need the scaling, and my experience is that it used to be pretty half-baked.
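For concreteness, pixel density is just diagonal pixels over diagonal inches; the sizes discussed in this thread work out roughly as follows (simple arithmetic; the panel sizes paired with each resolution are assumptions):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal resolution divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Typical panels discussed in this thread (sizes are assumptions):
print(round(ppi(1366, 768, 13.3)))   # roughly 118 PPI -- usable unscaled
print(round(ppi(1920, 1080, 11.6)))  # roughly 190 PPI -- scaling needed
print(round(ppi(2880, 1800, 15.4)))  # roughly 220 PPI -- retina MBP territory
```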

Look at this skype screenshot at 125% for example:


I've used DPI scaling in Windows 7 for years now, since I use Win7 as the OS on my HTPC. My television is the only surface large enough that, with the higher resolution, I've needed larger text.

That screenshot you link to is of Skype, and I would not use Skype as a standard example for anything. Even if MS owns it now, it is a poorly designed program by today's standards.

The only UI problems I ever had with scaling was with iTunes, and that was at >150%. I write it off as Apple not optimizing the UI for Windows.

> That screenshot you link to is of Skype, and I would not use Skype as a standard example for anything. Even if MS owns it now, it is a poorly designed program by today's standards.

How does this in any way negate its worth as an example of a problem? It's effectively saying, "That problem? It's an invalid problem because I said so."

Because the cause of the problem is Skype's (and iTunes') poor development, not Windows.

I didn't state the problem is invalid, I merely provided anecdotal evidence from my years of experience in using DPI scaling in Windows to demonstrate that it's a non-issue.

> Because the cause of the problem is Skype's (and iTunes) poor development, not Windows.

Which is the key factor here. It's not Windows' fault specifically, but it's a problem that occurs when using (very common software on) Windows. To the end user, what's the difference?

If the problem is with third-party software, then end users have a decent chance of finding acceptable alternative software without the problem. The problem is also likely to be limited to some specific applications they use, rather than coming up generally.

If the problem is with Windows, finding acceptable alternative software is unlikely, if it is possible at all. Even worse, users are likely to see the problem affecting most of the applications they use rather than just some specific applications.

I'd say end users would notice and care about the difference between those two scenarios.

If the commonly-available applications for an OS are ugly, that OS is going to be perceived as ugly. And vice versa.

If Skype and iTunes -- which together are probably installed (one or the other) on a huge percentage of home systems, and a nontrivial number of corporate ones -- are ugly at higher resolutions, then there's a serious problem.

Sure, they shouldn't write shitty software, but it's now Microsoft's problem due to how it impacts the platform as a whole. At least in the case of Skype, one assumes they could fix it (since they now own the product); iTunes is ... probably more complicated.

I'm not saying there isn't a problem. I'm just saying the problem can be perceived differently even by people who don't understand what's going on under-the-covers.

If people generally spend most of their time in IE/Chrome/Firefox, Word, Excel, PowerPoint... and they all look great while Skype and iTunes look bad, I'd expect people to complain about Skype and iTunes.

If, on the other hand, people spend a lot of their time in applications that look bad I'd expect them to have a broader complaint (which may or may not be about Windows since, by assumption, we are talking about people who don't understand what's going on under-the-covers).

I think you're all missing a key part of the equation here. It was the OEM who configured the OS, the OEM who selected the panel, and the OEM who advertised the feature that presumably went into my buying decision. If my (Windows+Skype) looks bad and my friend's otherwise identical (Windows+Skype) laptop from another brand looks fine, I'm blaming the vendor. Who, coincidentally enough, is the party that takes the economic hit if I return it. OEMs know this deep in their core, and this is one of the significant reasons you haven't seen them pushing devices that require scaling for normal eyesight.

> I'd say end users would notice and care about the difference between those two scenarios.

You haven't met very many end users, have you?

It's an issue if I need to use Skype. :(

The problem is that both the OS and popular apps need to support it, otherwise in most people's eyes it will seem like Windows is "broken".

Windows XP at 2560x1600 is OK on my 30" monitor, but it would be next to unusable on a 15" display. The icons didn't scale to the resolution, so they would have been tiny.

Lots of screen real estate though.

2560x1440 on my 27" iMac is OK, but I have to use Cmd-+ on almost all web pages. Only in Safari is Magic Trackpad zooming smooth and iOS-like, but I'd like to use Chrome.

With 175% zoom I get some weird (and reported) errors in Chrome: http://imgur.com/TCDMj

The HiDPI solution in ML is not really a solution for resolutions that are somewhere in-between normal DPI and HiDPI.

If your eyesight isn't good enough to handle low-DPI, you may want to investigate the accessibility control panel.

There's nothing there to force a minimum font size.

Because that would break every app. But you can zoom the whole screen.

Yes, and that is exactly the problem. The iMac's size and resolution is awkward-DPI.

I enabled HiDPI mode, but then I got 1280x720 on 27"... And without it much of the texts on web pages are too small.

You make it sound like I'm blind or something. I'm just tired of reading far too small texts.

I'm pretty sure HiDPI mode and accessibility zooming are not the same thing.

100 DPI is pretty much the industry standard, so if it feels awkward to you then it's not a problem with the iMac per se.

The size of the display makes you increase the distance, or one would get chronic neck pain quite fast. :)

Are you running Windows XP under BootCamp? I could see Windows XP being okay at 27" @2560x1440.

No, Mountain Lion. I am a bit disappointed, since NeXT had Display PostScript back in the '80s, and today we use bitmap icons far too much.

NeXT had bitmap icons (TIFFs) displayed using Display PostScript. The problem with vector graphics for icons is that you'll still need multiple versions for different resolutions or the icons will look terrible at most sizes, and it's horribly inefficient to render them on the fly all the time, so you'll end up caching them in memory and rescaling them... and -- oh look -- bitmaps.

Well, eventually it's going to come out as a bitmap—they'll still end up as pixels on the screen. Vector icons just give you the flexibility to do it really really well. Cache them, prerender them, I don't care what works best, just match the icon DPI to the screen DPI and we're good.

His point is that it's not good enough to simply match the DPIs and render. Small icons need actual visual differences, not just scaling, in order to really look good. Some icons completely change their character at small sizes, because what looks good when small is not the same as what looks good when large, regardless of DPI. So going pure-vector doesn't completely solve the problem, although it certainly can make it easier.

There is that, and there is also the question of how aliasing is handled by real-time vector art renderers. We just aren't there yet when it comes to relying on vector art for icons (even though they start out that way in Illustrator).

That's true at low-DPI, but not so much at high-DPI. I wonder if a bundle of low-DPI bitmap + a vector for high-DPI would work better than the standard practice of bundling multiple bitmaps...

I think it's still true, although less so. Certainly a certain amount of fiddling takes place in an attempt to align to pixels etc., but sometimes icons just need to change at different physical sizes. Small elements may need to be enlarged to remain visible and such.

But by choosing a certain style for all UI elements, vector icons could look and feel just smashing anyway.

Quartz 2D is effectively "Display PDF". Bitmap icons are necessary (at least in the icon portfolio) as they can degrade more gracefully at lower pixel counts.

NeXTStep/OpenStep used bitmap icons.

It seems like a chicken and the egg problem but it really isn't: there's far more support for HD res than there is HD hardware out there.

Go to any etailer and check the 1080p laptop options: out of 300 models you get 50 at best. There's very little choice too; AFAIK the Zenbook is the only 1080p ultrabook out there, and I could only find 2 AMD laptops with HD screens.

Like I said before I don't know what's crazier: that a 10" tablet has a 2560x1600 screen for $399 or that a 15" laptop for $800 might not have a 1080p screen.

The irony is that it's been years since I saw a sub-1080 monitor for sale, but XGA/720p laptops are still the norm.

Let's at least make 1080p the standard.

> There are still many, many usability issues in gnome with a high ppi screen.

Would you care to mention some? My impression has always been that Gnome handles different DPI settings relatively well, but I'm no expert.

I thought Gnome 3 has no DPI slider at all?

It uses the DPI setting from X, and you can then adjust the font size to your preference with the gsettings option 'org.gnome.desktop.interface text-scaling-factor'. I believe the idea is to set the DPI in X to the physical DPI of your monitor, and then use this setting to adjust the scaling of elements to your personal preference. That this setting is not exposed in the default config dialogs is a drawback, and an unfortunate result of GNOME's current philosophy. I think you can access it via gnome-tweak-tool though.
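The division of labor described here is just a multiplication: X supplies the physical DPI, and text-scaling-factor is a pure preference multiplier on top. As plain arithmetic (an illustration of the idea, not a GNOME API):

```python
def effective_text_dpi(physical_dpi, text_scaling_factor):
    """Fonts effectively render at the monitor's physical DPI times the
    user's org.gnome.desktop.interface text-scaling-factor preference."""
    return physical_dpi * text_scaling_factor

# A 96 DPI panel with a 1.25x text preference renders text as if at 120 DPI:
print(effective_text_dpi(96, 1.25))  # -> 120.0
```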

> Now all we need is linux to credibly support it as well, or at least a linux built for mouse use. There are still many, many usability issues in gnome with a high ppi screen.

I've never had a problem using high DPI screens on Linux. There's a setting in xorg.conf, but I haven't had to touch it in years because the correct settings are usually auto-detected.

I don't use Gnome, though, so maybe it's causing problems?

The hiDPI thing should only be meant as a transition. What we ultimately want, is for pages to look the same on the web, and the UI to look the same on the apps, but just have higher PPI. The hiDPI thing just enlarges them because Windows 8 and Mac OS are not resolution independent. What I want is for everything to work on 2560x1600 by default, with the same sizes we have today - not enlarged, and not smaller either.

My retina MBP displays everything at "the same sizes we have today" right now, on a 2880x1800 panel.

Well you're not supposed to use something as weak as Intel's integrated graphics.

The HD 4000 runs surprisingly well at HiDPI res, actually. Haven't experienced any lag myself, and when needed the discrete NVIDIA GPU kicks in automatically without a hitch.

Well, Sandy Bridge and Ivy Bridge integrated graphics is OK for most users (i.e. not for gamers and engineers, but for most people they are more than they need).

The new MacBook Pro retina 13" uses only the Intel HD 4000.

I picked one up last weekend and it's surprisingly quick, at least whilst playing with things like WebGL and three.js.

What usability problems do you see with DPI in Gnome? I can freely adjust the font-scaling across Gnome and... well for the past 2 weeks straight I can't remember having any issues.

I'm so damn happy people are starting to realize how awesome resolution is. I've been buying 1920x1200 15" Dell laptops for 10 years now, and never bought a Mac because they've always had terrible resolution. I run Linux anyway, but I'm going to buy Mac hardware next, unless a PC maker creates a competitive display (which I assume they will).

Next up: IPS LCDs everywhere.

For years, the things I have cared about in terms of computing have been, roughly:

1) Internet connectivity. Without it I'm mostly dead in the water.

2) Screen resolution. I want as much code/information on the screen as possible. I currently have a 1920x1200 Dell myself, and am a bit scared/saddened that I won't be able to replace it.

3) Memory. Always good.

After that come processor speed and storage space. I suspect my next machine will have SSD, because I've heard that it's such a dramatic improvement.

I completely agree with Linus though - I very much want a high resolution laptop screen.

Once you've gone SSD you'll be amazed you got by without it. It's the biggest single-part speed upgrade I've seen since the '90s. Even with just a small one for a system partition, and data kept on a traditional HDD, you'll see great gains.

(And yes, also agree, it's a shame that monitor and laptop resolution seemed to stall at 'HD' for ages)

Any laptop I buy for anyone, I always replace the hard drive with an SSD. Higher DPI does not matter much for a programmer, but more screen real estate, i.e. resolution, matters more.

Obviously if you're working with massive datasets or many, many VMs this may not be true, but everyone I know who has 16GB RAM has claimed that it's more than enough. I've got 8GB, and haven't had any complaints. I even know people with 32GB who say that with hindsight they'd have been perfectly happy with only 16GB.

An SSD however, is incredible. I originally wrote "essential", though I guess it's not... but it certainly feels like it is once you have one. The downside is that it will make working with non-SSD desktop machines irritating-as-hell.

RAM is so cheap today that I just maxed it out on my laptop. Granted, I have only seen about 22GB of the 32GB used, and that was when I had YaCy (a P2P search network written in Java) running, but I still expect the system to use the rest of the RAM for caches, which in theory should speed up access to the file system (on my SSD drive).

Hell, I had 2 GB RAM before and it seemed ok, 8 GB now is more than I need. I had to send the SSD for replacement, though, and it was easily the most frustrating week in recent memory.

Memory deduplication in Windows 8 [1] will lower RAM usage for VMs further.

[1]: http://arstechnica.com/information-technology/2012/10/better...

FWIW Windows 7 Home Premium (64-bit) has a maximum memory limit of 16 GB.

Professional, Enterprise, and Ultimate can go up to 192 GB. Windows 8 is limited to 128 GB, but the Professional and Enterprise editions can do 512 GB.

From http://msdn.microsoft.com/en-us/library/windows/desktop/aa36...

SSD on my MacBook Air is the one thing that has made more of a difference than anything else.

I can live with only running four or five major apps at a time in the 4 Gigabytes that came with my 2010 MacBook Air - but the SSD changed my life. I'd put that on your list above memory.

After having bought an SSD a couple of years ago I can honestly say it is at the top of my requirement list (after internet connectivity, but it's basically impossible to buy a computer that can't connect to the internet) for all my computers, easily above memory and screen resolution.

One of the reasons that I finally jumped over to the Mac was the fact that it was one of the few laptops that provided 1680x1050 at around 15".

My previous 2 Windows laptops had been that resolution, but when I came to upgrade about 2 or 3 years ago, that resolution seemed to have been abandoned for 15" laptops by pretty much everyone apart from Apple. Most of them had gone to 1440 or even less. A handful of pretty expensive (at least Apple-priced), and usually quite bulky, ones offered 1920x1200, but given that scaling didn't look too great on them at the time, I thought that might be just a bit too small for me. And if I was going to be paying about the same price for a Windows PC as for a Mac, and not get the resolution I wanted anyway on Windows, I thought I'd bite the bullet and see what all of the Mac fuss was about.

I am a little surprised by this - both Sony and Dell have had credible 1920x1080 options at 15" for several years (admittedly I prefer x1200, which has also been available) - both have also had 1920 13" options for at least a year.

Are these matte? Put one next to a glossy display and the difference is glaringly obvious (pun intended!).

Yes, matte! (well, at least on the ones I know about - my Sony is matte - Glossy is better for little more than the store)

Like I say, the 1920 seemed too small to me, given the not particularly great scaling in Windows at the time if I did want to use other resolutions, and the ones that I did find were usually around Apple prices anyway.

I was recently asked for laptop buying advice. I told them that if they're going to be using it more than casually, they should prioritise the screen over everything else: minimum 1080p IPS. I was shocked that there were fewer than 5 models (that you can buy new) out there that fit the bill. Such a sad state; I expected that to be the starting-off point.

1680 by 1050, which is the best you could get on a 15" Mac previously, isn't exactly "terrible resolution" (by historical standards anyway). But otherwise agreed. My main machine is a 15" Retina, and I don't think I could ever go back to a lower-PPI laptop.

I bought the first generation MacBook Pro, a 17", for that reason. Other resolutions were just too small for what I was doing, and 1680 seemed like just enough. Eventually, when they started offering resolution upgrades on 15" models, I upgraded to my current machine. The way I see it, there's no way I can be disappointed by the resolution of my next machine. No more pains over spending $100-300 for just a small bump to make things usable. Of course, the downside is that I can't let myself even try out the retina MacBook in store for fear that I won't look back; I'll wait until I can afford it.

Last time I went shopping for a machine, 1680x1050 was my target resolution for a 17" screen with dual hard drives. I'm still using it four years later (it has had an HD upgrade to 2x 500GB 7200RPM drives). It happens to have a couple of GeForce 8600M GPUs in there, so it's just about usable for gaming too.

After I dropped it (the laptop bag strap broke) and smashed the corner, I looked around for a replacement (the MacBook Pro was tempting, but oh, the $$$) and couldn't find one for sensible money. So I bought some new plastics, redid the thermal paste on the CPU (it had started thermal throttling during long compiles), and bought a second-hand screen off eBay (plus a new 4-port USB PCB that got damaged in the fall) to replace the broken clips that hold it together when it's closed. So I also have a spare screen now.

I'm dreading replacing it but I reckon I'm probably still good for a year or two yet.

I would consider the fact that they have only been able to increase the vertical resolution by ~300 pixels in 20 years to be pretty terrible.

Meanwhile the images we view on those same screens have increased in size by nearly 20 times!

20 years ago notebooks had 640x400 or 640x480 screens...

If you include the 15" retina MBP then that's 17x more pixels in 20 years. And that includes greyscale to 8bit color to 16 bit color to 24/32 bit color, passive-matrix to active-matrix, and dozens of other panel technology improvements not directly related to resolution.
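The "17x" figure checks out with simple arithmetic, using the two panels named in this exchange:

```python
# Early-90s notebook panel vs. the 15" retina MBP mentioned above.
old_px = 640 * 480      # 307,200 pixels
new_px = 2880 * 1800    # 5,184,000 pixels

print(new_px / old_px)  # -> 16.875, i.e. roughly 17x more pixels
```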

XGA (1024x768) was introduced in the early '90s. Maybe you missed the earlier discussion, but we've all been lamenting the crappy default resolutions since then. We were not discussing bit depth or black-and-white vs. color.

Not in laptops. Which is what everyone is talking about, if you missed it.

The problem is there are just as many bad IPS LCD systems out there as there are bad TN LCDs. The main problems seem to be uneven backlighting (especially on LED-backlit LCDs) and really bad anti-glare coatings that make grey colors sparkle. I just returned an Asus IPS LCD because of the latter.

There's also the problematic panel used in the Sony S15 and others, which is a 15" 1920x1080 IPS but has poor colour accuracy that makes reds look orange.

Can I have a link to your 10-year-old 1920x1200 15" laptop? I'd buy one just for the screen, to use in one of the many hacky projects!

Dell has had a number of them over the years; one I remember was the Latitude D820 (15.4", with a 1920x1200 option) back in 2006, but I know they had such offers before, as a colleague had one in 2005.

I'm not sure they offer WXUGA screens anymore though, it seems everyone has nerfed back to "full HD" (which offers 120 vertical pixels less) these days, which is bullshit. Since 2010 (when this trend started), the interwebs have been full of people trying to find out manufacturers who still provided WXUGA screens.

You mean WUXGA. I also detest the shrinking of vertical pixels and the manufacturers pushing it on to consumers. Even on desktops, the prices for non widescreens (more vertical space) seem to be priced excessively to drive people to get widescreens.

>> Since 2010 (when this trend started), the interwebs have been full of people trying to find out manufacturers who still provided WXUGA screens.

Exactly, it's like laptop vertical resolution in particular has gone in the wrong direction the past couple of years. What happened? Trying to view web sites on one of these resolution-challenged laptops means a lot of scrolling up and down; it feels like the '90s all over again. Glad I'm not the only one, and glad Linus raised the issue.

> Exactly, it's like laptop vertical resolution in particular has gone in the wrong direction the past couple of years.

Indeed it has: formerly-1920x1200 screens were dialed back to HD at best (so 120 vertical pixels lost), and where the standard laptop resolutions used to be WXGA (1280x800), WSXGA (1440x900) and WSXGA+ (1680x1050), all of that has been rolled back to the completely crazy 1366x768 garbage. TV's "HD" has been a blasted plague on computer displays.
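The vertical losses listed above are easy to quantify (resolutions exactly as given in the comment):

```python
# (old, new) vertical resolutions for the downgrades described above.
downgrades = {
    "1920x1200 -> 1920x1080": (1200, 1080),
    "1280x800  -> 1366x768":  (800, 768),
    "1680x1050 -> 1366x768":  (1050, 768),
}

for label, (old, new) in downgrades.items():
    lost = old - new
    print(f"{label}: {lost} rows lost ({100 * lost / old:.0f}%)")
```

The 1680x1050-to-1366x768 step is the brutal one: over a quarter of the vertical pixels gone.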

Agreed. Trying to find a decent high-resolution screen is a real pain today. Dial back 5 years, and it wasn't such an issue! Everyone makes "HD" up until 24"... 24" HD is nasty.

I would hope for something approaching 3000 x 1500 in a 24" screen :-(

I have the D830 with the WUXGA screen, and even though it's an old 1.8GHz Core 2 Duo, I still refuse to buy a new laptop because I can't find anything with a 16:10 ratio. Everything is 16:9, and it's difficult to find even 1920x1080 screens at a reasonable price.

Older thinkpads had these as an option http://support.lenovo.com/en_US/product-and-parts/detail.pag... It's not 10 years but they are pretty cheap now anyways (although getting your hands on the WUXGA or even WSXGA+ isn't the easiest task)

My Toshiba Portege ultralight back around 2001 had 1920x1200 on a screen about 12". After that I got 15.6" wuxga Dell Latitudes and now a Thinkpad W500 from a Lenovo outlet because their current offerings dropped down to short-screen rez.

The 14.1 inch Thinkpad T21 released in 2000 had a resolution of 1400x1050. Not as good as you were asking, but an awful lot better than nearly all 14 inch laptops you've been able to get for the last 5 years.

Ohh, I have exactly this one at home, unusable because of a dead motherboard.

I got a Dell C810 with a 1600x1200 screen around 2003. There are a couple of them still available on eBay.

I got a Dell Inspiron 8200 with 1,600 x 1,200 in 2002.

Check out the Asus Zenbook Prime series. They are 11" or 13" laptops with 1920x1080 IPS displays. I have one and it works well with Ubuntu. http://www.asus.com/Notebooks/Superior_Mobility/ASUS_ZENBOOK...

One of the deciding factors for purchasing my Elitebook 8560w was the 1920x1080 RGB LED screen.

I can never go back.

One of his comments far down the page just struck me:

"It's ignorant people like you that hold back the rest of the world. Please just disconnect yourself, move to Pennsylvania, and become Amish.

The world does not need another ignorant web developer that makes some fixed-pixel designs. But maybe you'd be a wonder at woodcarving or churning the butter?"

I, for one, am shocked and appalled. The Amish live in places other than Pennsylvania.

Like Belize, for instance. (Mennonites, but similar idea.)

Eeeeyup, that's Linus alright. He has absolutely no problem calling out stupidity when he sees it, and I love him for it.

Just Linus being Linus.

The 13" MacBook Pro is now 2560x1600. Give it 18 months and the entire Apple line will be a minimum of 2560x1600.

That's fine if you are willing to drop $1000 on a laptop; I don't expect we'll see $400 laptops at 2560x1600 for several years. Tablets have the advantage of a free operating system, lower computing requirements, smaller physical screens, and, in the case of Amazon/Google, a willingness to subsidize hardware sales in order to capture downstream content/search revenue.

While I have sympathy with your point, I haven't seen evidence that Google is subsidizing (as I think Amazon is) rather than just selling at cost. The reason this is an important distinction is that it highlights how much the other handset makers, especially Apple, have been gouging their customers.

Google's disconnect from the hardware also shows how you actually can have performance on older hardware - apparently the new real-time Google Voice runs on older iOS devices that Apple said couldn't handle Siri - if you don't have an interest in the upgrade cycle.

I wince a little at calling it price gouging. It's true that over the long term prices should be near the marginal cost of production. But these are short-term products. They need to cover their R&D costs - costs that come with risk that needs to be covered too. I feel a little milked when I buy a Mac, and kind of annoyed because I don't feel like I have an alternative (this is obviously an illusion; I don't like Windows or Linux personally, but they are both very reasonable alternatives). But Macs are much older products and they are still getting better (but not cheaper, aargh!) every year.

On tablets and phones, these high profits have been driving wonderful innovation, bringing us better devices at lower prices every year. It's not like we're getting overcharged for stagnant products. This December's devices would have been incredible last year.

Price gouging implies overcharging on essential items because the consumer has no choice. The only power Apple have over their consumers is that they really really want the stuff Apple makes, right now.

Any time Google (or any other major corporation) sells hardware at under a 20% margin, they are subsidizing their sales with profit from other divisions, or with the future potential of profit. It's important to note that you need to cover R&D, G&A, marketing, etc. beyond that marginal cost. The exception would be situations where you could make up the lost margin in huge volume (commodity sales, wholesaling, retailing other people's products). Google's average margin from the other units of its business is 30%+, so it would make sense for them to invest in those business lines, unless there are strategic reasons to focus on the tablet market - which, I'm sure, we all recognize there are.

To put it more clearly - Google is not selling tablets to make a profit on hardware. They are selling tablets so they can profit from search/advertising through those tablets.

Re: Google Voice on older iOS equipment. I installed it on my iPhone 4 today - it runs significantly faster than Siri on an iPhone 4S. Search results are instant, as well. Completely agree with you that this is an example of where the disconnect from hardware greatly benefits the consumer.

Apple and the others no more price gouge their customers than Ferrari or Porsche do.

"Tablets have the advantage of free operating system"

Which tablets do you mean? Certainly not those running Android, iOS or Windows.

In the case of Android, manufacturers pay Microsoft for the privilege. Android users pay with their personal information and by looking at ads. iOS isn't free either, development of the OS is included in the hardware price. And Windows RT is certainly not free, manufacturers license it from Microsoft.

The tablet that Linus is referring to, the Nexus 10, has no marginal license fee per device. Apple, likewise, doesn't pay marginal license fees, nor does Microsoft.

Laptop makers end up paying fees per device - typically to Microsoft. OEM fees are around $50 for Windows XP/Windows 7.

>Android users pay with their personal information and by looking at ads.

Are you suggesting ads are built into the Android software? I was considering getting a Nexus 7 but this would totally change my mind.

If he is suggesting that he is flat out wrong.

There are quite a few free apps that use ads as a way of supporting themselves but the OS has no ads itself. And you can almost always purchase premium versions of the apps that remove the ads as well.

Not to mention with Android there is the possibility of using something like AdAway

Using Google products with the idea that you can avoid giving out personal information to them is not wise. I was also considering the Nexus, but why bother fighting against the product's design.

Still, AFAIK, you won't see actual ads in the OS.

Yep, Ads in Google search, ads in Google Maps. Honestly, I consider Google Play to be full of ads now as well since I only ever open it for apps and it shows a bunch of music, movies, and hardware I don't want instead until I drill down/search.

Someone else in the thread is complaining you can avoid these things if you try, install alternatives, but it is still true Google makes money after the sale, and that they pay Mozilla a ton of money to be default search on that platform, so it is very valuable to them.

>Yep, Ads in Google search, ads in Google Maps.

Ads in apps is a lot different than ads built into the OS. What are your examples of ads built into Android itself?

"Ads in apps is a lot different than ads built into the OS."

If those apps come pre-installed, what's the difference?

Very few devices run stock Android, the Nexus series being the exception. I was referring to one of the most popular Android tablets, Amazon's Kindle Fire.

However, since Android comes with Flash, you're likely to see more ads than on other platforms, even if you use a Nexus device.

Weird, we have a Kindle Fire and it doesn't show us any special ads, and I've never noticed Flash ads being especially present on the Galaxy Tab. It sounds to me like you're trying to talk with authority about things you haven't used and don't really have much perspective on.

One can pay $15 to not have to see ads, but according to Amazon, most users don't.


You've never noticed Flash ads on a Galaxy Tab? Unless you turned off Flash, use a pop up blocker, or if you use a text browser like Lynx, I seriously doubt that: they're pretty hard to miss.

Nope, you've forgotten about the original Kindle Fire - you know, the one that's been on sale for a year and not a month or two. No ad subsidy program there.

Look, you obviously don't use an Android tablet, so please stop telling people who do use them what their experience is like. We understand that you find your Kool-Aid delicious; let's move on.

The Silk browser used on the Fire processes browsed pages on Amazon's servers. In a perfect world, Amazon would store your browsing activity and manipulate the content it passed along to you for its own profit. But fortunately, they have placed a Santa clause in the user agreement which prohibits them from doing so.

Android has not come with flash since Jelly Bean: Adobe long since stopped supporting it and it's absolutely not available on the Nexus 7 unless you sideload some legacy APK.

Jelly Bean is the latest version of Android. Only 1.8% of Android devices run Jelly Bean. Most new Android devices being sold today do not run Jelly Bean and will very likely come with Flash.


Adobe stopped supporting Flash for Android 2 months ago, hardly an eternity.


That's absolutely true, but you were quite explicit about Nexus devices also being included.

Edit: Adobe made their original statement that they were moving away from Flash on mobile devices a year ago: http://www.theverge.com/2011/11/9/2549196/adobe-flash-androi...

The Nexus One, Nexus S and Galaxy Nexus all come with Flash.

And yes, Adobe made their intentions known a year ago, but that's not quite the same as ending support. Especially since many new Android devices will still have Flash.

The Galaxy Nexus does not (and, to my knowledge, never has) come with Flash.

Some Galaxy Nexus phones run Jelly Bean now, but most came with Ice Cream Sandwich (which had Flash).


Google Nexus devices didn't ship with Flash preinstalled on the device.

Yes, but the Galaxy Nexus both preceded a working version of Flash for ICS and never thereafter shipped with it anyway (it had to be manually installed from the Store).

I've never seen ads in the Android OS. I've used versions 1.x up to 4.x.

Except in the case of Windows, OEMs don't have to pay licensing fees to put an OS on their tablet, and considering the state of Windows right now I can't imagine even its fees are prohibitive.

>by looking at ads

What a load of malarkey. You have full control over how much information you give Google when you use the device. You can even get apps with a throwaway Gmail account and give them no data at all. Further, there are no ads built into the OS and I never see any, period. It's just like on iOS; I don't know what on earth he's on about.

The Nexus comes, by default, with Google as the search engine, Google Mail as the mail client, and Google Maps as the map engine.

This is something that Google pays organizations like Mozilla, and Apple, hundreds of millions of dollars to have on their browsers.

The vast majority (95%+) of people take the default search engine and map engine that comes with their tablet/smartphone. Google is willing to take a loss on the 5% who might decide to use some other system (though, given that Google makes pretty good mapping and search, there is a better than average chance those 5% will stick with Google anyway).

That's why Google is willing to sell a tablet for $200 that others might have to sell for $250 or $275 in order to make a profit - they don't have the search engine revenue.

None of this is advertising in Android OS.

All it does is make you look like you have a bias against Google.

Bias against Google? I love their products! I use their search engine. I buy products from their ads. I use their Gmail client all day long - happy to get directed ads. I use their Maps client.

I love all the free stuff I get from Google.

Their latest Google Voice app is amazing - if anything, I have a huge bias for Google.

It's okay to be analytical about something you love. Apple makes money on hardware. Google makes money on services. Each have different business models, and will price their products accordingly.

Then why are you ranting about the default search engine as evidence of adverts in Android?

Apologies if I appeared to be ranting.

My objective was to emphasize that the entire business objective of Android, an operating environment that Google has spent billions of dollars developing and protecting, is to provide a platform for Google's advertising in search, in maps, and, soon, in voice.

Because Google gets little, if any, profit from hardware sales, and (to my knowledge) no license fees from third-party vendors, all of their profit has to come in the form of advertising.

What this means is that when comparing a hardware product from Google and one from Apple, we need to understand that Google's profit comes downstream from advertising revenue, whereas the bulk of Apple's comes front-loaded, from the hardware sale.

This will then have an impact on the margins that each of the organizations will be required to pursue at various stages of the product lifecycle.

Apple will start off with a high (30-40%) margin up front, but has less pressure to monetize the user eyeballs in its services.

Google will start off with a lower (approaching 0%) margin up front, but then has much more pressure to monetize user eyeballs in its services.

Neither business model is inherently good/bad/otherwise, but you can see how the incentives for Apple and Google are differently aligned - we've already identified one: Apple did not release Siri on the iPhone 4, even though the phone was more than adequate to run Google Voice, which was released for the iPhone 4. Google's got no skin in the game selling you more hardware. Apple isn't really incentivized to release its premium services on old hardware...

Again, it is more than easy to use an Android phone without using those. I know tons of people that use Android phones exclusively with personal accounts because they don't use Google for mail/calendar. You can even use maps without giving up your location, it just makes it damn near useless... but again... it's an unavoidable issue. For it to be useful, you're going to give up something, and if it's not to Google, it's to someone else.

"it's an unavoidable issue. For it to be useful, you're going to give up something, and if it's not to Google, it's to someone else."

One can choose to give one's personal information to a company whose core business isn't to sell personal information to other companies. That excludes Google and Facebook.

Right, like I've said 4 times now, it's incredibly easy to use an Android phone without sending a blip of data to Google.

Also, you're absolutely insane and bordering on fanboi territory to act like Microsoft and Apple aren't collecting the exact same usage data from location services etc. You have some bone to pick with Google over advertising yet ignore the fact that it's completely a choice to use it. You people act like Google is breaking into your house and installing a GPS chip in you. It's dishonest.

"it's incredibly easy to use an Android phone without sending a blip of data to Google."

Assuming that's true, that doesn't change the fact that most users will send loads of personal data to Google without knowing it, let alone knowing how to turn it off.

"like Google is breaking into your house and installing a GPS chip in you."

They don't need to. They know that most users don't understand computers and don't read end-user agreements.

But even if you don't use Android, they'll indeed drive by your home, take pictures and collect information about your wifi network, which enables them to link your digital life to a physical one.


I'm very happy to tell you that, like most people, I was upset to find that Google collected that data. I was even less impressed with how they handled the deletion and disclosure of their mistake. I'm happy to discuss that, but it's pretty desperate to reach for that when it started with more or less: "Android has ads and Google watches everything you do on it"

"it's pretty desperate to reach for that when it started with more or less: "Android has ads and Google watches everything you do on it""

That might've been your interpretation, but I wrote:

"Android users pay with their personal information and by looking at ads."

By which I mean a typical Android experience. Not the Android OS, or the official Google Android experience. Most Android users are subjected to lots of ads because they don't buy apps; they download ad-supported apps. They don't buy third party navigation apps, they use the ad-supported Google Maps app. Etc etc.

Maps could show ads, then they make a profit from you!

"You have full control over how much information you give Google when you use the device."

You don't even have control over how much information you give Google when you use their search engine on a desktop computer. Even if you sign out of Google, turn off cookies, and use a browser without geolocation, Google will continue gathering information about you and providing you with custom search results.

But even if you could turn off all data collection, the vast majority of users won't know about it. As a result, they pay Google by sharing their personal information, their interests, their location, their contacts, etc. Google's core business is advertising, they can't help themselves.

"there are no ads built in the OS and I never see any, period. It's just like on iOS, I don't know what on earth he's on about."

I was referring to Amazon's Kindle Fire, but the Android experience in general is filled with ads. Not only does a core app like Google Maps include sponsored links, Google pushes consumers and developers towards ad-supported apps by offering limited payment options in the Google Play store. Three quarters of all apps in the Google Play store are free, and many of those have ads supplied by Google or its subsidiaries.


" Google pushes consumers and developers towards ad-supported apps by offering limited payment options in the Google Play store."

Ok, and so does every single marketplace ever that charges a brokerage fee. I think it's completely acceptable to whine about an application that can give you a shortest path between any two spots in the world, 3d buildings, 3d perspective, vector tiles, the best business listings outside of the old white pages, etc, etc, for giving me RELEVANT sponsored results when I look for something. yeesh.

"so does every single marketplace ever that charges a brokerage fee."

That's true to some extent, but Google does both. Google charges 30%, just like the other app marketplaces. On top of that, it even makes developers pay for chargebacks.


I used to have a "huge" 22" Iiyama CRT (20" effective) boasting 2048x1152. I have forever been confused why current desktop screens try to satisfy you with 1080p. This crap was getting hyped up on screens years after I was already enjoying higher resolutions at good refresh rates. And guess what? My two screens at work are 27" Iiyama LCDs. And they don't go over 1080p... shouldn't it be even easier to get better pixel density with LCDs in bigger screens?

I agree with the "tiny font" bit. As someone with severe myopia, my MacBook Air is a lot more comfortable on the eyes than my 16" widescreen Acer.

Well, there are at least 27" TFTs with 2560x1440 and 30" TFTs with 2560x1600.

I guess part of it must be that single-link DVI and older HDMI don't allow for more than 1920x1200, and cheaper graphics units would be overwhelmed by anything over 2560x1600.
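As a rough sanity check on the single-link DVI limit, here is a sketch assuming its 165 MHz pixel clock cap and a round ~20% blanking overhead (an illustrative figure; real CVT/CVT-RB timings differ somewhat):

```python
# Rough estimate of the refresh rate a display link can drive.
# Assumes ~20% blanking overhead on top of the active pixels
# (illustrative; real video timings vary).

def max_refresh_hz(width, height, pixel_clock_hz, blanking_overhead=0.20):
    """Highest refresh rate a given pixel clock supports at a resolution."""
    pixels_per_frame = width * height * (1 + blanking_overhead)
    return pixel_clock_hz / pixels_per_frame

SINGLE_LINK_DVI = 165e6   # 165 MHz pixel clock cap per link
DUAL_LINK_DVI = 330e6     # two links in parallel

print(round(max_refresh_hz(1920, 1200, SINGLE_LINK_DVI)))  # 60 -> fine
print(round(max_refresh_hz(2560, 1600, SINGLE_LINK_DVI)))  # 34 -> too slow
print(round(max_refresh_hz(2560, 1600, DUAL_LINK_DVI)))    # 67 -> needs dual-link
```

Which matches the comment above: 1920x1200 is about where a single link tops out at 60 Hz, so 2560x1600 needs dual-link DVI, newer HDMI, or DisplayPort.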


> The IBM T220 and T221 are LCD monitors with a native resolution of 3840×2400 pixels (WQUXGA) on a screen with a diagonal of 22.2 inches


This was back in 2001; the screen ran off a G200, which was bundled with the T220 (http://en.wikipedia.org/wiki/Matrox_G200). A "modern" IGP is perfectly able to drive a desktop and desktop applications at that resolution (hell, an Eyefinity card supports 6 screens at up to at least 1920x1200 in a 3x2 configuration - that's a total resolution of 5760x2400... and you can play games on that).

A card with 6 outputs costs a few bucks. The Intel HD 4000, I believe, has support for two 2560x1600 outputs.

Note that you're also comparing components with vastly differing prices. Maybe the market for $2000 monitors just isn't big enough.

EDIT: Eizo offers 30" displays with 4096 x 2560 resolution. However, they are not exactly cheap. About $20-$30K. Meanwhile a 27" 2560x1440 display goes for $600, or as an import from Korea even for $300.

There were two components to my comment: the first was that there were high-DPI desktop screens back in 2001 which could be driven by 2D accelerators released in 1998, so a modern IGP would have no trouble driving them, and thus "cheaper graphic units" wouldn't be overwhelmed by "anything over 2560x1600".

And the second part was modern mid- and high-end dedicated (but still consumer) GPUs being able to drive much, much higher resolutions than that without breaking a sweat.

Nobody would accept 2D-only these days. And in a world where people buy $500 PCs, you can't expect a $250 GPU to be the norm.

I have to confess I am flabbergasted by your ability to misunderstand (willfully?) what I think to be plain and clear English.

1. I never said anything about people having to accept 2D-only (and the G200 was not 2D-only, by the way). I pointed out that GPUs nearing 15 years old were already able to produce resolutions you consider "overwhelming", and that a modern IGP would thus have more than enough power to handle them

2. The dedicated GPU note was to point out just how far beyond those resolutions you consider "overwhelming" a "modern" (Evergreen — and the first Eyefinity release — is 3 years old) dedicated GPU goes

I understand fully well that it is not _difficult_ to drive a lot of pixels if you want. Many modern cheap cards and IGPs however cannot, for whatever reasons: they physically have no support for dual-link DVI. Maybe they saved 50 cents on some connector that way, or whatever.

Note that I never said anything about overwhelming a dedicated GPU. That point was about cheap IGPs, specifically from Intel. (EDIT: OK I didn't explicitly say it, my fault.)

Ultimately I guess it boils down to priorities. PCs have gotten a lot cheaper, and some of that has been done by making things worse.

Personally I would like high-dpi as much as anybody here - although for now I think I'd first want a lot of screen estate (13" is a bit limiting at times), but I probably use my computer for reading a lot more than other people.

> Many modern cheap cards and IGPs however cannot

A $50 AMD card will get you at least 3 1920x1200 outputs. I am not sure if the HDMI port on them can be driven any higher, it possibly can though.

I am not sure what the max resolution on AMD's Trinity line is if they have DisplayPort or HDMI. Again, it may very well be up to the max DisplayPort or HDMI supports, which is pretty damn high.


Turns out Intel also supports 2560x1600 over DisplayPort.


Scroll down to "Display and Audio Features Comparison"

> I am not sure if the HDMI port on them can be driven any higher, it possibly can though.

HDMI itself can (from 1.3 onwards), but the card may not allow that for whatever reason.

> I am not sure what the max resolution on AMD's Trinity line is if they have Display Link or HDMI

I can't find anything on AMD's website, but the Asus F2A85-M Pro[0] is specced at:

- 1920 x 1080 over HDMI

- 1920 x 1600 over RGB

- 2560 x 1600 over DVI

- 4096 x 2160 over DisplayPort

[0] http://www.asus.com/Motherboards/AMD_Socket_FM2/F2A85M_PRO/#...

I've never seen the point of early adoption of HDMI. The first and last HDMI cable I bought constantly caused flickering and weird artifacts. I only kept using it for a while because of aesthetics and laziness: no extra audio cables needed. A really old VGA cable provided superior video quality on the same setup. And HD+ resolutions, no problemo.

I'd suggest that the one and only HDMI cable you bought was faulty.

I think it's mostly because the interfacing controller thingie is so ridiculously cheap - "thanks to" the "flat screen revolution". The same controllers can be used everywhere, costing next to nothing.

It is an interesting rant.

Pixel density affects many aspects of a system:

Storage - it affects the size of disk you need/want, because high-resolution images and video take up much more space.

Memory - you need more memory to build up screens for a higher-density display. Further, you need the bandwidth to shove that data around.

Compute - if you want to 'render' to the display rather than just copy bitmaps around, or composite complex bitmaps, you need to spend a lot of time computing, which can make other things slow.
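The memory and bandwidth points can be put in back-of-envelope numbers. This sketch assumes 32-bit RGBA pixels and triple buffering at 60 Hz (illustrative assumptions, not measurements of any particular system):

```python
# Back-of-envelope framebuffer cost at two laptop resolutions.
# Assumes 4 bytes per pixel (RGBA) and triple buffering.

BYTES_PER_PIXEL = 4

def framebuffer_mib(width, height, buffers=3):
    """Total MiB held in framebuffers for this resolution."""
    return width * height * BYTES_PER_PIXEL * buffers / 2**20

def fill_bandwidth_gib_s(width, height, hz=60):
    """Memory traffic just to rewrite every pixel once per frame."""
    return width * height * BYTES_PER_PIXEL * hz / 2**30

for w, h in [(1366, 768), (2560, 1600)]:
    print(f"{w}x{h}: {framebuffer_mib(w, h):.0f} MiB buffered, "
          f"{fill_bandwidth_gib_s(w, h):.2f} GiB/s at 60 Hz")
```

Roughly a 4x jump in both memory and bandwidth going from 1366x768 to 2560x1600, before counting any compositing or GPU working set.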

So when you look closely at tablets you will see interesting places where they have been adapted to support these densities.

But more importantly there is 'change' in the systems where there is new money being invested. So tablets are getting all of the 'change' now, less so with laptops, and hardly at all with desktops.

The reason this will change, though, is that I expect we've convinced display manufacturers that 'regular users' (the bulk of purchasers) want 'high dpi' displays. It's not easy communicating with an entire industry, but the success Apple has been having with 'retina' displays, and the more recent Android tablets with higher resolutions, means more people will jump in to support them. And more importantly, when the choice is available, folks reject lower-density displays. So in the great 'tuning' these guys do, where they calculate how to get the most money out of each hour of running their factories, the equation is tipping in favor of high-dpi displays.

That said, I'd love to have a couple of 32" 2560 x 1600 displays for my desktop, but I think that is still a couple of years off from being 'mainstream'

"That said, I'd love to have a couple of 32" 2560 x 1600 displays for my desktop, but I think that is still a couple of years off from being 'mainstream'"

Overclockers have a sale on 27" 2560x1440 displays today. They're selling for £311. I bought one about a year ago for £450 - worth every penny.


That's not a high-DPI/retina display though, just a regular DPI.

Ah yes, Korean IPS monitors. They are really amazing - I have one myself, and they give amazing bang for your buck. Read more about them here: http://www.codinghorror.com/blog/2012/07/the-ips-lcd-revolut...

I've been watching the Korean ones on eBay after reading the Coding Horror article. An acquaintance of mine has a 2560x1600 native resolution video projector. He was showing it off by projecting onto a 120" screen. One of the things that struck me about that setup is that 'the wall as monitor' had some very interesting stuff at that density. It felt like it could be a very productive environment.

I believe the issue with using it as your primary monitor is that projectors have limited bulb life, and the bulbs are often almost as expensive as the projectors themselves.

I have a Korean no-brand 27" 2560x1440 IPS screen that I picked up for 250,000KRW (that should be about £145). I'm in Korea so that makes the big difference on cost plus lower taxes. They will come down in price far more yet.

...and with a 6ms response time - Not bad. I'm in the market for a dual screen setup, so I've been watching these closely.

I'm probably going to wait for black friday though to see if I can nab a decent deal.

I got myself an HP ZR2740w (2560x1440) a few weeks back (the old ZR40w relegated to secondary screen). Definitely worth its price.

TBH I'm not sure if I want 2560x1600 yet if there is going to be a significant drop in battery life. I have an IPS 1366x768 on a 12.5" screen and it looks great. Fonts already go smaller than I can reasonably see for programming - I can't imagine a higher resolution would materially improve my workflow.

Personally I'd like to see a higher refresh rate. Even with triple buffering I don't think that horizontal scrolling is smooth enough. I'd love to have a 120hz laptop screen. The new Windows 8 start screen, and switching between workspaces in Linux would be so much nicer. Still it's not exactly necessary, just a nice to have.

Battery life seems to have held steady for the HiDPI/Retina Macbooks and for all the tablets we've seen. So the real sacrifice is in weight and thickness.

Linus's rant is compelling, but as the owner of a 2011 MacBook Air, I'm not completely sure I want to add 1.5 pounds (the difference between the MBA and the 13" Retina MBP). Granted, complaining about 4 pounds sounds ridiculous, but having something you can effortlessly carry from room to room with one hand is pretty great.

>>IPS 1366x768 on a 12.5" screen and it looks great

Some money advice: Don't use a retina display. :-)

I recently saw my old iPad 1 at my ex-girlfriend's. It was the best screen I had ever used -- now, after an iPad 3, it was just horrible.

Do you think that's just the resolution, or also the quality of the display? I think the iPad 3 is a better quality screen as well as higher resolution. I tried a lot of latpops when I was in the market, and I found the ones with a higher resolution TN panel were worse (to my eyes) than lower resolution on a high quality IPS.

Related aside: the consistency of lighting across the viewing angle on my iPad 2 was far better than on the 3. (It was scarily uniform on the 2.)

Yeah, the colour range is much better on the iPad 3 than on the iPad 1.

That is a bit irrelevant. I'd rather have 256 levels of grey on a retina than a good colour non-retina. I can't get good readability of text (both web and A4 documentation) together with quick page turns (no eInk) otherwise.

I could use something else to watch movies, etc.

I disagree with Linus on this one.

In an ideal world I would agree completely, a better DPI is amazing both in terms of font readability AND for watching full screen media (movies, TV shows, games, etc).

But in the real world higher resolution means smaller screen elements. At 1600x900 fonts are readable at 125%; at 1920x1080, even at 125%, fonts and some elements are literally too small to be comfortably read (you get eye-strain after less than an hour).

Now I would turn it up to 150% "text size" but that breaks SO many native Windows applications (e.g. pushing text off the viewable area) and does the same on Linux too (Ubuntu).

Ideally everything should remain the same size no matter what the resolution, and the DPI should just grow upwards. This is how it works on platforms like the iPad.

So, I disagree with him, I don't want higher resolution displays because Windows, Linux, and OS X still suck at handling resolution (and if you use a non-native resolution it hurts the performance since the GPU has to re-scale constantly).


So you just want to stop evolution forever. That's just silly. I use a 47" TV with KDE, I set DPI to like 200, and the resulting fonts and interfaces are very nice when viewed from couch.

I have to ctrl+ every site a few times, but it works flawlessly after that.

How does making things smaller and smaller with each resolution increase equal "evolution?"

I would argue that making everything a fixed consistent size regardless of resolution and then increasing DPI as the resolution is increased would be a better way to "evolve" things.

"Zoom" should be an OS function (e.g. 125%, 150%, 200%, etc). "Resolution" should be hidden from the end user entirely. "DPI" should automatically scale with the supported resolution.

That way if the user has trouble seeing they can "zoom" things, but aside from that everything would look identical no matter what resolution you had (e.g. identical at 1600x900, 1080p, or higher).
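The "everything stays the same physical size" idea above amounts to specifying UI in physical units and converting to pixels per display. A minimal sketch (the point sizes and PPI figures are illustrative, not any OS's actual values):

```python
# Resolution-independent sizing: specify UI elements in points
# (1 pt = 1/72 in) and derive pixel sizes from the display's PPI
# plus the user's zoom preference. Figures are illustrative.

def pt_to_px(points, ppi, user_zoom=1.0):
    """Pixels needed so `points` renders at the same physical size."""
    return round(points * ppi / 72 * user_zoom)

# A 12pt label keeps the same physical size on every screen:
print(pt_to_px(12, 96))                  # 16 px on a 96 PPI desktop
print(pt_to_px(12, 227))                 # 38 px on a ~227 PPI laptop panel
print(pt_to_px(12, 96, user_zoom=1.25))  # 20 px at a 125% accessibility zoom
```

With this scheme the resolution never reaches the user; only the zoom factor does, and higher PPI simply means more pixels spent on the same-sized element.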

You don't make things smaller. You make things sharper.

Right but you were arguing against that?

Why? I argued that big resolutions are good and interfaces will soon catch up. Browsers are ready, operating systems are ready. Some legacy software might become tiny, but you can live with it by changing resolution when you need to use it.

I disagree, desktop Operating Systems cannot handle high resolutions at all. They just keep making things smaller and smaller instead of sharper and sharper. OS X has JUST added basic support for a sharper lower resolution mode, but even that is highly limited to their new displays.

Resolution from an OS perspective needs to be scrapped and re-invented.

That's simply not true, OS level support has been improving for years and years.

There is a chicken-and-egg problem with getting ISVs to support high-resolution modes (i.e., for third-party developers to use the support the OS offers), since there isn't much incentive until their users have hardware to use the high-resolution modes. Apple has started shipping high-resolution screens; once others do too, software support will be more likely to catch up.

If that were true then first-party software would support it well, but it doesn't. A lot of Microsoft's software cannot handle 150%.

>If that was true then first party software would support it well

No no no no no. Hardware adoption comes first, followed by the optimized software functionality.

Just look at SSD and TRIM.

> Now I would turn it up to 150% "text size" but that breaks SO many native Windows applications (e.g. pushing text off the viewable area) and does the same on Linux too (Ubuntu).

On the web too. For too many years it was standard to use hard-coded font sizes and to specify measurements in pixels in CSS.

It's about time web designers started thinking about accessibility and using flexible font sizes and ems (a unit of length relative to text size) in web designs. Even when it means making small compromises in the design, so you can't make the web site pixel-perfect with regard to the photoshopped design mockups.

Pixels in web CSS are resolution-independent. They are basically defined to be equivalent to a pixel on a 96 PPI desktop display. So using pixels in CSS is fine, if a bit confusing. And of course browsers do all sorts of silly things; I can't recall if e.g. Gecko has PPI scaling enabled by default.

edit: reference: http://www.w3.org/TR/css3-values/#absolute-lengths
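To put rough numbers on the 96 PPI reference pixel (a sketch; the 220 PPI figure below is just an illustrative value, roughly that of a Retina MacBook Pro panel, and the function names are made up for this example):

```python
# CSS defines 1px as 1/96 of an inch, so a device's scale factor
# (its "devicePixelRatio") is roughly its physical PPI divided by 96.

CSS_REFERENCE_PPI = 96

def device_pixel_ratio(physical_ppi):
    """Approximate scale factor between CSS pixels and device pixels."""
    return physical_ppi / CSS_REFERENCE_PPI

def css_px_to_device_px(css_px, physical_ppi):
    """How many device pixels a CSS length maps to on a given panel."""
    return css_px * device_pixel_ratio(physical_ppi)

print(device_pixel_ratio(96))             # classic desktop panel -> 1.0
print(round(device_pixel_ratio(220), 2))  # hi-dpi panel -> 2.29
print(css_px_to_device_px(100, 192))      # 100 CSS px -> 200.0 device px
```

So a CSS pixel stays the same apparent size; only the number of device pixels behind it changes.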

Pixel dimensions and font sizes in CSS are fine because the browser applies a multiplication factor to the CSS values, so that the page and fonts display correctly on a high-res screen. This is how Mobile Safari works.

The problem with websites is the image files: JPG, GIF, and PNG. The resolution is locked in when the graphic is made, so files made for the low-res web will look blurry and pixelated when the browser scales them up for a hi-res display.
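The arithmetic behind that is simple (a sketch; the function name is made up for illustration): a bitmap that fills a fixed CSS-pixel box needs the scale factor times as many raw pixels in each dimension to stay sharp, otherwise the browser upscales it and it blurs.

```python
def required_image_size(css_width, css_height, scale_factor):
    """Raw pixel dimensions an image needs to fill a CSS-pixel box
    sharply on a display with the given scale factor."""
    return (css_width * scale_factor, css_height * scale_factor)

# A 400x300 CSS-pixel image on a 2x ("Retina") display needs a
# 800x600 source file; a 1x asset would be stretched to cover the
# same box and look soft.
print(required_image_size(400, 300, 2))  # (800, 600)
```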

> The problem with websites is the image files--JPG, GIF, and PNG. The resolution is locked in when the graphic is made, so files made for the low-res web will look blurry and pixelated when the browser scales them up for a hi-res display.

Such sites aren't blurry in absolute terms, just in relation to sites that have been optimized. That is, they look the same as they would on a standard screen.

Huh? That's not entirely accurate and besides, Gecko and Webkit have been doing full page scaling for years and I presume IE is as well.

>Ideally everything should remain the same size no matter what the resolution, and the DPI should just grow upwards. This is how it works on platforms like the iPad.

And on the "Retina" MacBook Pros. I'm not sure why Linus goes batshit over an Apple marketing name; it's a term Apple uses to market their HiDPI screens (iPhone, iPad and now Macs), and nobody else uses it to describe their high resolution screens.

The true HiDPI effect is only apparent in the "same size as before but 4 times the pixel density" default configuration.

Changing it to run one-to-one at 2880x1800 achieves nothing besides really tiny, unreadable font sizes.

One would hope that if screens of this resolution became ubiquitous, people would start to notice which apps behave poorly and fix them. It might also push forward an ecosystem for supporting such displays. E.g., tools to allow per app dpi settings to isolate badly behaved legacy apps for which the source is not available and on which no active maintenance is done.

>Ideally everything should remain the same size no matter what the resolution, and the DPI should just grow upwards. This is how it works on platforms like the iPad.

No that's not how it works on the iPad. Apple would not have increased the number of pixels by a factor of exactly 4 in one iPad update if the software gave them the freedom to choose any display resolution.

Most Linux DEs have actually been pretty good at high DPI, and OS X (now) handles it pretty well. Only holdout is desktop Windows.

I know people who tried Ubuntu on the new Macbook Pros (2880x1800) and everything was so tiny they were literally unable to alter the resolution using the GUI. They could not even click individual elements. They had to manually do it via the shell and even then they weren't able to see what they were doing...

I wouldn't call that "pretty good." In fact I would call that shockingly bad.

Linux still falls in the trap of re-sizing things to meet the resolution. As you increase the resolution things get small, and as you decrease the resolution things get bigger. This is a broken design.

Zooming should have no relationship to resolution. Resolution should increase DPI, not zoom or scaling (or however you wish to word it). A font should be the same size on 1600x900 as it is on 1080p.

Last time I tried, KDE measured everything in points and scaled just fine. Unity/Gnome2 might be broken.

It's true the default text size is too small even with a low-resolution monitor, but I've been dealing with that for years just by selecting a suitable font size within each of the half-dozen or so programs in which I actually read a significant amount of text. I find that works fine.

> a better DPI is amazing ... in terms of font readability

I haven't really found this myself: I typically prefer a nice pixel font at a lower DPI than a TrueType font at a higher DPI, especially for coding. The manual effort that goes into a good pixel font just makes for better readability imo.

I suspect that's because you haven't seen high enough DPI yet. The 200+ DPI that Linus calls for is really the bare minimum.

I've always been a sucker for high-quality displays. Apple's Retina MBP is what finally converted me from being a Windows user. I have to say, I really love it.

And... I may be the only one here... but I think they should go a little higher than 2880x1800. I know normal viewing distance is something like 15 inches, but I like to sit closer to my screen when coding, and it sure would be nice to have all semblance of "pixels" completely disappear. How cool would that be?

And if they started using AMOLED screens instead of IPS, then that would really be the perfect screen.

> And if they started using AMOLED screens instead of IPS, then that would really be the perfect screen.

Why? AMOLED would just make the colours ridiculously unrealistic without many other benefits.

It depends on what primaries you choose for your OLEDs. AMOLED has perfect black levels (LCD technology does not) and near-perfect viewing angles.

Unrealistic colors are a problem of reproducing the information you acquire correctly. All of the colors humans can see are represented on the CIE 1931 chromaticity diagram. Choose any three points within that color space and draw a triangle between them. The colors within the triangle are those that the primaries at the vertices of the triangle can reproduce. The rest is just a software issue (controlling the correct voltages to each LED).
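That triangle test is just planar geometry; here's a sketch using the standard sRGB primaries and D65 white point as example data (the helper names are made up for illustration):

```python
# A chromaticity p is inside the gamut triangle iff the three
# sub-triangles (p,r,g), (p,g,b), (p,b,r) all have the same
# orientation, i.e. their signed areas share a sign.

def in_gamut(p, r, g, b):
    """True if chromaticity p=(x, y) lies inside the triangle r, g, b."""
    def cross(o, a, b):
        # z-component of (a-o) x (b-o): twice the signed area of (o,a,b)
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    d1, d2, d3 = cross(p, r, g), cross(p, g, b), cross(p, b, r)
    has_neg = any(d < 0 for d in (d1, d2, d3))
    has_pos = any(d > 0 for d in (d1, d2, d3))
    return not (has_neg and has_pos)  # all same sign -> inside

SRGB_R, SRGB_G, SRGB_B = (0.64, 0.33), (0.30, 0.60), (0.15, 0.06)
D65_WHITE = (0.3127, 0.3290)

print(in_gamut(D65_WHITE, SRGB_R, SRGB_G, SRGB_B))     # True
print(in_gamut((0.08, 0.85), SRGB_R, SRGB_G, SRGB_B))  # False: outside sRGB
```

Moving the primaries further apart (as wide-gamut OLEDs can) just enlarges the triangle; everything else is calibration.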

IBM invented a 2000ppi monitor over a decade ago, but it was useless because software and video cards couldn't handle it.

Roll it out. Stop with this incremental horseshit for science's sake and make a LEAP.

Why do you need 2000 pixels per inch? No really. Photographic prints are between 300 and 600. Are you making scans from the screen?

But they can make so much more money dripping out advances. Look at Apple, that's been their entire MO with the iPhone.

His not the only one to think so:



And about every other hacker I know.

Interesting to note that Jeff was so very wrong with his prediction. Only half a decade after his prediction, we're already past 2560 x 1600 on the MBP Retina displays :). Pretty soon PC manufacturers will also catch up.

Technology generally improves faster than one presumes (accelerating returns and all).

s/his/he is/

Resolution matters when you need to work with text. For me, the Macbook Pro 15" Retina is the best thing that ever happened, and it is impossible to look back now.

Seeing as how Apple's "retina" displays are actually made by other folks (Samsung and LG for the 15" rMBP), I suspect we will see high-res panels on other laptops very soon.

In fact, Samsung recently demoed a few Series 9 prototypes at IFA with WQHD resolution (2560 x 1440, which is the 16:9 resolution you get from most 27" panels): http://www.theverge.com/2012/8/31/3282360/samsung-wqhd-2560-...

I don't get what point he's trying to make at all, and I don't know why it merits either a blog post or the top story on HN. Who cares what screen resolution Linus has his laptop set to?

I can work just fine on my 2009 MBP with 1280x800 (or whatever it is). The text is perfectly readable, there's no noticeable pixelation at distances past a few inches from the screen, and having everything shrink as a result of increasing the res would make it unusable.

He's probably exaggerating for effect, but it's not even remotely true that laptop resolutions have stagnated. They have steadily increased to the point where we now have retina screens on regular work laptops.

I think if Linus created a blog post saying he'd just set his background color to blue, it would make the top spot here!

It's a shame to see Linus stooping to mock Apple's use of the term Retina. They (Apple) name everything, like the Fusion Drive or any number of previous technologies. It's to humanize the tech so the average person walking into the Apple store doesn't have to talk in tech-speak. It's just a marketing term, and every company has them.

The definition of "reasonable resolution" changes over the years, VGA seemed reasonable compared to EGA.

Like most things, you don't realize the difference until you use it. If you spent a week working on a retina MBP at 1920x1200 you'd never want to use 1280x800 again. 4 vim splits at usable width, 1000px+ browser viewport width with inspector open to the right, etc.

I'm really happy with the 1440x900 quadrupled. Text is unbelievably gorgeous, and when I pop open QuickTime Player to check out video, I see my 1920x1080 data unscaled. STUPENDOUS.

One nice thing about the choice of 2560x1600 as a standard pick is that it is, to my knowledge, the largest resolution supported by DVI, and specifically dual-link DVI at that. This might start pushing the market towards DisplayPort, which used to be limited to 2560x1600 but is allegedly being expanded upwards.

(DisplayPort is my favorite display connector to date, and I hope to see adoption grow)
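A back-of-the-envelope check of the dual-link DVI claim (a sketch; the ~9% blanking overhead used here is an illustrative approximation typical of reduced-blanking timings, not a spec value):

```python
# Each DVI link carries at most a 165 MHz pixel clock, so dual-link
# tops out around 330 megapixels/s.
DUAL_LINK_DVI_MPX_PER_S = 2 * 165e6

def pixel_clock(width, height, refresh_hz, blanking_overhead=0.09):
    """Approximate pixel clock a mode needs, including blanking."""
    return width * height * refresh_hz * (1 + blanking_overhead)

needed = pixel_clock(2560, 1600, 60)
print(f"{needed/1e6:.0f} MHz needed vs "
      f"{DUAL_LINK_DVI_MPX_PER_S/1e6:.0f} MHz available")  # 268 vs 330
print(needed <= DUAL_LINK_DVI_MPX_PER_S)  # True: 2560x1600@60 just fits
```

Anything past that mode (higher refresh, more pixels) blows the budget, which is why 2560x1600 sits right at DVI's ceiling.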

Seems like this guy is not bothered about battery life. I have a 1366x768 display which works just fine. Most of the time I'm just running a terminal, and this resolution saves quite some battery.

The brand new 1700€+ 1080p 17" Dell XPSes around me at work barely manage to get 2h idling. For my colleagues it's basically an embedded UPS, while my MacBook Pro still handles 5+ hours after three years.

Like my MBP, Retina Macs still have 7 hours of battery, so I don't see the landscape changing much: many laptops, retina-class or not, cheap or not, will get crappy battery life, and a few will get it right.

"This guy" is Linus Torvalds. I assume he's often running a terminal as well... ;)

Yes please!!! I can't say how much I am disappointed by that 1366x768 crapsolution. Before my current laptop I had a FullHD laptop and that was faaar better. Even my first laptop from over 10 years ago had a better resolution than most standard laptops have now; how is that?!

(i had a 15" 1400x1050 display, then a 16" FullHD and now a 13" 1366x768)

Can somebody explain what is so bad about seeing individual pixels? Serious question from an oldschooler.

It's fine if you're okay with gigantic fonts that are usually seen on laptops. A lot of us prefer much smaller fonts to fit more code on the screen.

For example, the equivalent of 9pt on a 900p display allows reasonable split-screen editing. On an actual 900p display, though, it's much too blocky to be easily read without straining the eyes. At 1600p it's crystal clear.

Fonts are clearer and, assuming your vision is up to the task, you can fit more on the screen.

Other people may have other reasons for higher density displays, but those are mine.

Pixels are an artifact of our crappy technologies. They do nothing but detract from the actual point of a display.

I sure wouldn't want to spend my whole waking life wearing glasses that slightly-pixelized everything I saw. And since I spend a large amount of my waking life looking at "glowing rectangles", I'd rather they not be slightly-pixelized either.

Fonts are clearer, it stops me from getting eyestrain (because I don't register everything as slightly blurred), and it is a great help when working on photos.

Not just laptops - I'm sick of the standard PC monitor being such crappy resolutions as well. Even with dual screens on my work PC, there's not enough room for the xterms I need at a decent font size.

Computer displays have stagnated for too long.

You are confusing resolution with screen size.

> a decent font size

This is a function of screen size, not resolution.

No, I'm not.

I'm talking about the actual physical size of the font, which is about 5mm. If I make my font size any smaller, because the resolution is crap, the font is unreadable.

Higher pixel density would mean I could make the font size smaller and it would still be clearly legible.
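The rough numbers behind that point (a sketch; the DPI values are just examples, and 227 is roughly the Retina MBP's density): a glyph of fixed physical size gets more pixels to work with as density rises, so smaller physical sizes stay legible.

```python
# 1 pt = 1/72 inch, so pixel height scales with DPI while physical
# height does not.
MM_PER_INCH = 25.4

def glyph_pixels(point_size, dpi):
    """Pixel height of a nominal point size at a given pixel density."""
    return point_size * dpi / 72

def glyph_mm(point_size):
    """Physical height of a nominal point size, independent of density."""
    return point_size / 72 * MM_PER_INCH

print(round(glyph_mm(10), 1))           # 3.5 mm regardless of screen
print(round(glyph_pixels(10, 96), 1))   # 13.3 px on a classic panel
print(round(glyph_pixels(10, 227)))     # 32 px on a ~227 DPI panel
```

Same ~3.5 mm glyph, but drawn with well over twice the pixels: that's the headroom that lets you shrink the font without it turning to mush.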

I don't know what oneandoneis2 meant. But I suppose what font size is decent might depend a little on the resolution.

What is even sadder is that about 4 years ago the resolution of the average new desktop TFT was higher than it is today.

I doubt it. The average resolution today is probably 1920x1080. I guess you're referring to the fact that 4 years ago, 24" displays often came with 1920x1200 instead. But I doubt enough people bought them back then to make it the average resolution (don't blame me, I bought two). Most people probably bought craptastic 22" TN displays with resolutions worse than 1920x1080; I know I talked my share of acquaintances out of it.

1920x1200 displays are still available, incidentally. And overall, the desktop TFT market is much higher quality these days. TN panels are going out of style, for one thing, thank god. You can now get a 22" Full-HD display with an MVA panel for around 150 USD. Still pathetic compared to tablets, but we're getting there.

Agreed, screens need to gain higher resolution across all form factors. I would be ok with terrible frame rate for 1-2 years while the hardware catches up with 4x the pixels.

A Nexus 10 with a keyboard (and battery) casing and some desktop Linux on it sounds like a good cheap high-res laptop. Performance-wise it would be fine for the stuff I personally do; I guess that would scare other coders off.

Compile in the cloud, or use a dynamic language or use Go?

Yes, for some an option, but usually cloud doesn't fly here (I live in the mountains) and the languages are usually not the problem, the environment is. I like my tools :)

On my Pandora I do Scala/Clojure/Java (yep, Swing) coding (the new JVM for ARM by Oracle is really good; I never thought I would say that, but it's an amazing piece of work for such a 'small' memory and performance footprint as the OpenPandora's). Haskell works well, the whole Linux build chain works well, the LAMP stack works well, and I can run Rails, Django, Apache, Node.js. And amazing battery life, with the option to swap out batteries on the go. Only, the screen is way too small and doesn't work under anything more than almost darkness :)

The iPad 3 works fine everywhere, but I cannot do much work on it, and I haven't found an Android pad which works well enough for full-time coding, nor have I found a good enough case to work with for any of those. Most cases are really bad Chinese things which have a broken power supply (the batteries or anything between charging and the batteries) after a few weeks of usage. Anyone? :)

If the Nexus 10 sells anywhere near the amount the Nexus 7 does, there might finally be some good accessories. I would almost do a Kickstarter to make different form factor clamshell keyboard/battery/connection docks for android phones and pads. Almost...

Could you elaborate more on how you develop on a Pandora? I'm quite interested. Do you hook up an external monitor and keyboard?

No, that's what I meant (reading back, I didn't actually say that; I just thought it while writing): they didn't deliver cables for the monitors yet :) I have the components to make one, but didn't get to that yet (it's non-standard and probably I'll just get theirs when they finally create it). When I'm behind an actual desk I just work on it via another computer or via github (depends where I am). But I'm often 'on the road' (which can be actually on the road OR walking in the mountains) and then I just program straight on the little thing. It's actually quite comfortable if there is not a lot of sun.

But it's very much the opposite of high resolution, so it takes more planning about what you are going to do; still, for algorithmic work and hard problems nothing beats mountain walks and then typing/testing on that thing. And the occasional Crash Bandicoot III play, of course.

For me the optimum config would be something like an OpenPandora (before it I had (have, actually) the Zaurus c860, which did the same minus the games) with a docking 'station'. Or, but now I'm dreaming as it seems nothing comes close to this yet, AR glasses with a sufficiently high resolution connected to it.

I have been experimenting with a Twiddler 2 (http://www.handykey.com/); everyone complains that it's too slow for typing, but again, when you are thinking up stuff rather than churning out kilometers of boring (CRUD) code, typing speed, imho, doesn't matter too much. And I can type while walking with it, but no AR (or actually VR) yet.

I think a good solution could come from review websites. If they all agreed that any screen resolution below 2560x1600 (or maybe a little less) could only score a maximum of 5/10, it would certainly rock the boat.

Before this happens, it would be really nice if Windows supported different DPIs on different screens. IIRC Linux can already be hacked to do this; Windows cannot. The result is that I have a 21" 1080p desktop screen sitting next to my 14" 1080p laptop screen and I cannot read text on the laptop screen! (I'd move the laptop dock closer, but then it'd be sitting on top of my working space.)

As for Windows 8 doing high DPI, Tech Report has a decent article (http://techreport.com/review/23631/how-windows-8-scaling-fai...) on the shortcomings of Win8's high DPI settings even in Metro. Though feel free to ignore their complaints about browser scaling; each browser takes a different approach to how it breaks web pages when scaling. (Suffice to say 1 pixel borders and non-integer scaling don't go together well!)

They go into detail about the different scaling options, but the scaling options are all things that the user has to manually enable! Hardly auto-DPI. There is a balance to be struck between "more information on screen" and "better displaying information on screen" that Microsoft apparently decided to not even attempt, instead giving the user a blunt instrument with which to toggle between "way too big" and "way too small".

Sorry to sound silly, but, what laptop can do 2560x1600 for $399? I feel out of the loop!

He's talking about the new Google Nexus 10 tablet.

2560x1920 (or even 2132x1599) would be better, but I guess that fight was lost long ago.

I still have no idea why people bought into shortscreen laptops hook, line, and sinker - I guess that's marketing for you. Maybe we're just in a minority of people that want laptops, and not expensive DVD players? I'll never buy a laptop with that bottom seventh of screen real estate missing.

Where do you even look to find something otherwise? As far as I could tell the last time I was able to buy a laptop with a 4:3 ratio screen was in 2006, and they were significantly more expensive than 16:10 models. Since then I haven't ever even seen the option.


Well, the last time I bought a new laptop was 2007. I figured there had to still be some ThinkPad T models with 4:3 screens (given that they're used by people who are in touch with the realities of traveling and all), but alas I see that's one more thing Lenovo has crapped up. Yet another reason to keep this IBM-branded T60 going as long as possible (despite the periodic keyboard swaps).

How do those spacers on the sides of keyboards not just scream waste needing optimization? I guess that 7% area savings for LCD manufacturers overrides common sense. Well okay, I see the writing on the wall - time to learn to use ed(1).

The one issue that cannot be ignored is that a higher resolution display will consume significantly more power and cause the GPU to consume more power and generate more heat. The relationship is roughly linear to pixel count, in other words 2x more pixels is equal to 2x more power and heat. From 1366 x 768 to 2560 x 1600 it's roughly 4x more power/heat.
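Checking that 4x figure (a sketch; the linear power model is the comment's own approximation, not a measured figure):

```python
def pixel_ratio(w1, h1, w2, h2):
    """Ratio of total pixel counts between two display modes."""
    return (w2 * h2) / (w1 * h1)

# 1366x768 -> 2560x1600: ~3.9x the pixels, so under the linear model
# roughly 4x the GPU work, power, and heat.
print(round(pixel_ratio(1366, 768, 2560, 1600), 2))  # 3.9
```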

It WILL indeed be the new standard resolution, very soon, especially considering the "push" the PC industry is getting from Apple's Retina MacBook Pros. The demand for high-res panels is just too darn high and it has to be met. As in Apple's situation, the cost can be taken care of by charging a premium and then using economies of scale to bring down production costs, making these kinds of displays the new norm.

But exactly when this will happen? Well, it sort of already has: http://www.engadget.com/2012/08/31/samsung-Series%209-WQHD-U...

However, I believe we're going to see a huge blast in these super high-res panels right after Intel Haswell is released. It'll provide the Ultrabooks with graphic performance capabilities that'll be good enough for these high-res panels.

Nice to see, but obviously it is just a small step where a larger one is needed. They'll never scale it bigger to compete with their own 1080p hypnotoads.

I find it fascinating how Linus gets away with, and is actually rewarded for, flaming and bashing other users' opinions.

He has earned it.
