Now all we need is for Linux to credibly support it as well, or at least a Linux built for mouse use. There are still many, many usability issues in GNOME with a high-PPI screen.
Also, let's not forget how far mobile GPUs have come in the last few years; it would have been impossible to push that many pixels with anything but the most minimal of 3D use cases.
It's a much more complex problem than Linus suggests; it isn't just a matter of OEMs switching panels. Witness how relatively complicated Apple's solution is, which came after years of supposedly "somewhat" supporting HiDPI. Use the HiDPI MacBook Pro at 1920x on the Ivy Bridge GPU and it's still noticeably laggy in some 3D operations.
At least in terms of native applications I've seen literally no difference; maybe Metro handles it better, but Metro on the desktop is, well, Metro on the desktop.
PS - I know it is now called "Modern UI" but that name sucks.
A lot more detail about the MBP and Windows 8:
E.g. here is one blog post from 2006, which in turn refers to a guide written in 2001: http://blogs.msdn.com/b/greg_schechter/archive/2006/08/07/69...
IIRC there was a similar blog-post series about high-DPI stuff when Windows 7 was released, but I couldn't find it now with Google.
Once you get into a proportional mindset, the hardest problems tend to be related to text and images. One trick I learned for the latter was using vector art as much as possible and converting SVG images to XAML (care of XamlExport). Then scaling is no longer an issue. For raster art, I would hope Metro has something similar to iOS's retina graphic substitution magic. Silverlight 4 didn't, which was a bit of a drag.
Text was always a little trickier but when dealing with full screen apps I would usually just have some code that changed the text size based on the delta between the new and old window size.
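(A generic sketch of that resize trick, written in Python-style pseudocode since the original was Silverlight/C#; all names and numbers here are made up for illustration:)

    # Hypothetical sketch: keep text proportional to the window width.
    BASE_WIDTH = 1366     # width the layout was designed for
    BASE_FONT_PT = 12     # font size at the design width

    def font_size_for(window_width):
        # Scale the design-time font size by the resize delta.
        return BASE_FONT_PT * window_width / BASE_WIDTH

    print(font_size_for(1366))  # 12.0 at the design size
    print(font_size_for(2732))  # 24.0 when the window doubles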
That said, you mentioned that Metro will do image substitution a la iOS. That's fantastic and something I sorely missed during my Silverlight dev days.
I can't speak to the specifics of how rendering differs between 7 and 8, but the frameworks released with 8 have, or are moving towards, vector-based and resolution-independent rendering.
I made that decision back in 2004 when my laptop had 1680x1050 resolution. Since then I've only gone higher and I've yet to have problems.
Windows seems to have handled this pretty well long, long ago so I have no idea what people are complaining about.
Look at this Skype screenshot at 125%, for example:
That screenshot you link to is of Skype, and I would not use Skype as a standard example for anything. Even if MS owns it now, it is a poorly designed program by today's standards.
The only UI problems I ever had with scaling was with iTunes, and that was at >150%. I write it off as Apple not optimizing the UI for Windows.
How does this in any way negate its worth as an example of a problem? It's effectively saying "that problem is an invalid problem because I said so."
I didn't state the problem is invalid, I merely provided anecdotal evidence from my years of experience in using DPI scaling in Windows to demonstrate that it's a non-issue.
Which is the key factor here. It's not Windows' fault specifically, but it's a problem that occurs when using (very common software on) Windows. To the end user, what's the difference?
If the problem is with Windows, finding acceptable alternative software is unlikely, if it is possible at all. Even worse, users are likely to see the problem affecting most of the applications they use rather than just some specific applications.
I'd say end users would notice and care about the difference between those two scenarios.
If Skype and iTunes -- which together are probably installed (one or the other) on a huge percentage of home systems, and a nontrivial number of corporate ones -- are ugly at higher resolutions, then there's a serious problem.
Sure, they shouldn't write shitty software, but it's now Microsoft's problem due to how it impacts the platform as a whole. At least in the case of Skype, one assumes they could fix it (since they now own the product); iTunes is ... probably more complicated.
If people generally spend most of their time in IE/Chrome/Firefox, Word, Excel, PowerPoint... and they all look great and Skype and iTunes look bad, I'd expect people to complain about Skype and iTunes.
If, on the other hand, people spend a lot of their time in applications that look bad I'd expect them to have a broader complaint (which may or may not be about Windows since, by assumption, we are talking about people who don't understand what's going on under-the-covers).
You haven't met very many end users, have you?
The problem is that both the OS and popular apps need to support it, otherwise in most people's eyes it will seem like Windows is "broken".
Lots of screen real estate though.
With 175% zoom I get some weird (and reported) errors in Chrome: http://imgur.com/TCDMj
The HiDPI solution in ML is not really a solution for resolutions that are somewhere in-between normal DPI and HiDPI.
I enabled HiDPI mode, but then I got 1280x720 on 27"... And without it, much of the text on web pages is too small.
You make it sound like I'm blind or something. I'm just tired of reading far too small texts.
100 DPI is pretty much the industry standard, so if it feels awkward to you then it's not a problem with the iMac per se.
NeXTStep/OpenStep used bitmap icons.
Go to any e-tailer and check the 1080p laptop options: out of 300 models you get 50 at best. There's very little choice, too; AFAIK the Zenbook is the only 1080p ultrabook out there, and I could only find 2 AMD laptops with HD screens.
Like I said before I don't know what's crazier: that a 10" tablet has a 2560x1600 screen for $399 or that a 15" laptop for $800 might not have a 1080p screen.
The irony is that it's been years since I saw a sub-1080p monitor for sale, but XGA/720p laptops are still the norm.
Let's at least make 1080p the standard.
Would you care to mention some? My impression has always been that GNOME handles different DPI settings relatively well, but I'm no expert.
I've never had a problem using high DPI screens on Linux. There's a setting in xorg.conf, but I haven't had to touch it in years because the correct settings are usually auto-detected.
I don't use Gnome, though, so maybe it's causing problems?
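(For reference, the manual override lives in the Monitor section of xorg.conf; X derives the DPI from the physical panel size. The identifier and millimeter values below are hypothetical, for a ~15.4" 1920x1200 panel:)

    Section "Monitor"
        Identifier "LVDS1"
        # Physical panel size in millimeters; X computes DPI from
        # this (hypothetical values, ~147 DPI at 1920x1200).
        DisplaySize 331 207
    EndSection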
Next up: IPS LCDs everywhere.
1) Internet connectivity. Without it I'm mostly dead in the water.
2) Screen resolution. I want as much code/information on the screen as possible. I currently have a 1920x1200 Dell myself, and am a bit scared/saddened that I won't be able to replace it.
3) Memory. Always good.
After that come processor speed and storage space. I suspect my next machine will have SSD, because I've heard that it's such a dramatic improvement.
I completely agree with Linus though - I very much want a high resolution laptop screen.
(And yes, also agree, it's a shame that monitor and laptop resolution seemed to stall at 'HD' for ages)
An SSD however, is incredible. I originally wrote "essential", though I guess it's not... but it certainly feels like it is once you have one. The downside is that it will make working with non-SSD desktop machines irritating-as-hell.
Windows 7 Professional, Enterprise, and Ultimate can go up to 192 GB of RAM. The base edition of Windows 8 is limited to 128 GB, but the Professional and Enterprise editions can do 512 GB.
I can live with only running four or five major apps at a time in the 4 GB that came with my 2010 MacBook Air - but the SSD changed my life. I'd put that on your list above memory.
My previous 2 Windows laptops had been that resolution, but when I came to upgrade about 2 or 3 years ago, that resolution seemed to have been abandoned for 15" laptops by pretty much everyone apart from Apple. Most had gone to 1440 or even less. A handful of pretty expensive (at least Apple-priced) and usually quite bulky ones offered 1920x1200, but the scaling didn't look too great on them at the time, and I thought that might be just a bit too small for me. If I was going to pay about the same price for a Windows PC as for a Mac - and not get the resolution I wanted on Windows anyway - I figured I'd bite the bullet and see what all of the Mac fuss was about.
After I dropped it (the laptop bag strap broke) and smashed the corner, I looked around for a replacement (the MacBook Pro was tempting, but oh, the $$$) and couldn't find one for sensible money. So I bought some new plastics to replace the broken clips that hold it together when it's closed, redid the thermal paste on the CPU (it had started thermal throttling during long compiles), and bought a second-hand screen (and a new 4-port USB PCB that got damaged in the fall) off eBay. So I also have a spare screen now.
I'm dreading replacing it but I reckon I'm probably still good for a year or two yet.
Meanwhile the images we view on those same screens have increased in size by nearly 20 times!
If you include the 15" retina MBP then that's 17x more pixels in 20 years. And that includes greyscale to 8bit color to 16 bit color to 24/32 bit color, passive-matrix to active-matrix, and dozens of other panel technology improvements not directly related to resolution.
I'm not sure they offer WUXGA screens any more, though; it seems everyone has nerfed back to "full HD" (which offers 120 fewer vertical pixels) these days, which is bullshit. Since 2010 (when this trend started), the interwebs have been full of people trying to find manufacturers who still provide WUXGA screens.
Exactly, laptop vertical resolution in particular has gone in the wrong direction the past couple of years. What happened? Trying to view web sites on one of these resolution-challenged laptops means a lot of scrolling up and down; it feels like the 90s all over again. Glad I'm not the only one, and glad Linus raised the issue.
Indeed it has: former 1920x1200 screens were dialed back to 1080p at best (so 120 vertical pixels lost), and where the standard laptop resolutions used to be WXGA (1280x800), WXGA+ (1440x900) and WSXGA+ (1680x1050), all of that has been rolled back to the completely crazy 1366x768 garbage. TV "HD" has been a blasted plague on computer displays.
I would hope for something approaching 3000 x 1500 in a 24" screen :-(
I can never go back.
"It's ignorant people like you that hold back the rest of the world. Please just disconnect yourself, move to Pennsylvania, and become Amish.
The world does not need another ignorant web developer that makes some fixed-pixel designs. But maybe you'd be a wonder at woodcarving or churning the butter?"
That's fine if you are willing to drop $1000 on a laptop; I don't expect we'll see $400 laptops at 2560x1600 for several years. Tablets have the advantages of a free operating system, lower computing requirements, smaller physical screens, and, in the case of Amazon/Google, a willingness to subsidize the hardware in order to capture downstream content/search revenue.
Google's disconnect from the hardware also shows how you actually can have performance on older hardware - apparently the new real-time Google Voice runs on older iOS devices that Apple said couldn't handle Siri - if you don't have an interest in the upgrade cycle.
On tablets and phones, these high profits have been driving wonderful innovation, bringing us better devices at lower prices every year. It's not like we're getting overcharged for stagnant products. This December's devices would have been incredible a year ago.
Price gouging implies overcharging on essential items because the consumer has no choice. The only power Apple have over their consumers is that they really really want the stuff Apple makes, right now.
To put it more clearly - Google is not selling tablets to make a profit on hardware. They are selling tablets so they can profit from search/advertising through those tablets.
Re: Google Voice on older iOS equipment. I installed it on my iPhone 4 today - it runs significantly faster than Siri on an iPhone 4S. Search results are instant, as well. Completely agree with you that this is an example of where the disconnect from hardware greatly benefits the consumer.
Which tablets do you mean? Certainly not those running Android, iOS or Windows.
In the case of Android, manufacturers pay Microsoft for the privilege. Android users pay with their personal information and by looking at ads. iOS isn't free either, development of the OS is included in the hardware price. And Windows RT is certainly not free, manufacturers license it from Microsoft.
Laptop makers end up paying fees per device - typically to Microsoft. OEM fees are around $50 for Windows XP/Windows 7.
Are you suggesting ads are built into the Android software? I was considering getting a Nexus 7 but this would totally change my mind.
There are quite a few free apps that use ads as a way of supporting themselves but the OS has no ads itself. And you can almost always purchase premium versions of the apps that remove the ads as well.
Not to mention with Android there is the possibility of using something like AdAway
Still, AFAIK, you won't see actual ads in the OS.
Someone else in the thread is pointing out that you can avoid these things if you try and install alternatives, but it's still true that Google makes money after the sale - and that they pay Mozilla a ton of money to be the default search on that platform, so that position is very valuable to them.
Ads in apps is a lot different than ads built into the OS. What are your examples of ads built into Android itself?
If those apps come pre-installed, what's the difference?
However, since Android comes with Flash, you're likely to see more ads than on other platforms, even if you use a Nexus device.
You've never noticed Flash ads on a Galaxy Tab? Unless you turned off Flash, use a pop up blocker, or if you use a text browser like Lynx, I seriously doubt that: they're pretty hard to miss.
Look, you obviously don't use an Android tablet, so please stop telling people who do use them what their experience is like. We understand that you find your Kool-Aid delicious; let's move on.
Adobe stopped supporting Flash for Android 2 months ago, hardly an eternity.
Edit: Adobe made their original statement that they were moving away from Flash on mobile devices a year ago: http://www.theverge.com/2011/11/9/2549196/adobe-flash-androi...
And yes, Adobe made their intentions known a year ago, but that's not quite the same as ending support. Especially since many new Android devices will still ship with Flash.
What a load of malarkey. You have full control over how much information you give Google when you use the device. You can even get apps with a throwaway gmail and give them no data at all. Further, there are no ads built in the OS and I never see any, period. It's just like on iOS, I don't know what on earth he's on about.
This is something that Google pays organizations like Mozilla and Apple hundreds of millions of dollars to have on their browsers.
The vast majority (95%+) of people take the default search engine and maps engine that come with their tablet/smartphone. Google is willing to take a loss on the 5% who might decide to use some other system (though, given that Google makes pretty good maps and search, there is a better-than-average chance those 5% will stick with Google anyway).
That's why Google is willing to sell a tablet for $200 that others might have to sell for $250 or $275 in order to make a profit - the others don't have the search engine revenue.
All it does is make you look like you have a bias against Google.
I love all the free stuff I get from Google.
Their latest Google Voice app is amazing - if anything, I have a huge bias for Google.
It's okay to be analytical about something you love. Apple makes money on hardware. Google makes money on services. Each have different business models, and will price their products accordingly.
My objective was to emphasize that the entire business objective of Android, an operating environment that Google has spent billions of dollars developing and protecting, is to provide a platform for Google's advertising in search, in maps, and, soon, in voice.
Because Google gets little, if any, profit from hardware sales, and (to my knowledge) no license fees from third-party vendors, all of their profit has to come in the form of advertising.
What this means is that when comparing a hardware product from Google and one from Apple, we need to understand that Google's profit comes downstream from advertising revenue, whereas the bulk of Apple's comes front-loaded, from the hardware sale.
This will then have an impact on the margins that each of the organizations will be required to pursue at various stages of the product lifecycle.
Apple will start off with a high (30-40%) margin up front, but has less pressure to monetize the user eyeballs in its services.
Google will start off with a lower (approaching 0%) margin up front, but then has much more pressure to monetize user eyeballs in its services.
Neither business model is inherently good/bad/otherwise, but you can see how the incentives for Apple and Google are differently aligned - we've already identified one: Apple did not release Siri on the iPhone 4, even though the phone was more than adequate to run Google Voice, which was released for the iPhone 4. Google's got no skin in the game selling you more hardware, and Apple isn't really incentivized to release its premium services on old hardware...
One can choose to give one's personal information to a company whose core business isn't to sell personal information to other companies. That excludes Google and Facebook.
Also, you're absolutely insane and bordering on fanboi territory if you act like Microsoft and Apple aren't collecting the exact same usage data from location services etc. You have some bone to pick with Google over advertising, yet ignore the fact that it's completely a choice to use it. You people act like Google is breaking into your house and installing a GPS chip in you. It's dishonest.
Assuming that's true, that doesn't change the fact that most users will send loads of personal data to Google without knowing it, let alone knowing how to turn it off.
"like Google is breaking into your house and installing a GPS chip in you."
They don't need to. They know that most users don't understand computers and don't read end-user agreements.
But even if you don't use Android, they'll indeed drive by your home, take pictures, and collect information about your Wi-Fi network, which enables them to link your digital life to a physical one.
That might've been your interpretation, but I wrote:
"Android users pay with their personal information and by looking at ads."
By which I mean a typical Android experience. Not the Android OS, or the official Google Android experience. Most Android users are subjected to lots of ads because they don't buy apps; they download ad-supported apps. They don't buy third party navigation apps, they use the ad-supported Google Maps app. Etc etc.
You don't even have control over how much information you give Google when you use their search engine on a desktop computer. Even if you sign out of Google, turn off cookies, and use a browser without geolocation, Google will continue gathering information about you and providing you with customized search results.
But even if you could turn off all data collection, the vast majority of users won't know about it. As a result, they pay Google by sharing their personal information, their interests, their location, their contacts, etc. Google's core business is advertising, they can't help themselves.
I was referring to Amazon's Kindle Fire, but the Android experience in general is filled with ads. Not only does a core app like Google Maps include sponsored links, Google pushes consumers and developers towards ad-supported apps by offering limited payment options in the Google Play store. Three quarters of all apps in the Google Play store are free, and many of those have ads supplied by Google or its subsidiaries.
Ok, and so does every single marketplace ever that charges a brokerage fee. I think it's completely acceptable to whine about an application that can give you the shortest path between any two spots in the world, 3D buildings, 3D perspective, vector tiles, the best business listings outside of the old white pages, etc., etc., for giving you RELEVANT sponsored results when you look for something. Yeesh.
That's true to some extent, but Google does both. Google charges 30%, just like the other app marketplaces. On top of that, it even makes developers pay for chargebacks.
I agree with the "tiny font" bit. As someone with severe myopia, my MacBook Air is a lot more comfortable on the eyes than my 16" widescreen Acer.
I guess part of it must be that single-link DVI and older HDMI don't allow for more than 1920x1200, and cheaper graphics units would be overwhelmed by anything over 2560x1600.
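(A quick back-of-the-envelope check of the link-bandwidth part; the blanking overheads below are rough approximations of CVT reduced-blanking timings, so the numbers are ballpark:)

    # Approximate pixel clock needed for a given display mode.
    def pixel_clock_mhz(w, h, hz, h_blank=160, v_blank=35):
        return (w + h_blank) * (h + v_blank) * hz / 1e6

    print(pixel_clock_mhz(1920, 1200, 60))  # ~154 MHz: just under
                                            # single-link DVI's 165 MHz cap
    print(pixel_clock_mhz(2560, 1600, 60))  # ~267 MHz: needs dual-link
                                            # DVI, DisplayPort or HDMI 1.3+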
> The IBM T220 and T221 are LCD monitors with a native resolution of 3840×2400 pixels (WQUXGA) on a screen with a diagonal of 22.2 inches
This was back in 2001; the screen ran off of a G200 (which was bundled with the T220) (http://en.wikipedia.org/wiki/Matrox_G200). A "modern" IGP is perfectly able to drive a desktop and desktop applications at that resolution (hell, an Eyefinity card supports 6 screens at 1920x1200 or more in a 3x2 configuration, for a total resolution of 5760x2400... and you can play games on that).
Note that you're also comparing components with vastly differing prices. Maybe the market for $2000 monitors just isn't big enough.
EDIT: Eizo offers 30" displays with 4096x2560 resolution. However, they are not exactly cheap: about $20-30K. Meanwhile a 27" 2560x1440 display goes for $600, or as an import from Korea even for $300.
And the second part was modern mid- and high-end dedicated (but still consumer) GPUs being able to drive much, much higher resolutions than that without breaking a sweat.
1. I never said anything about people having to accept 2D-only (and the G200 was not 2D-only, by the way). I pointed out that GPUs nearing 15 years old were already able to produce resolutions you consider "overwhelming", and that a modern IGP would thus have more than enough power to handle them.
2. The dedicated-GPU note was to point out just how far beyond those resolutions you consider "overwhelming" a "modern" dedicated GPU goes (Evergreen, the first Eyefinity release, is 3 years old).
Note that I never said anything about overwhelming a dedicated GPU. That point was about cheap IGPs, specifically from Intel. (EDIT: OK I didn't explicitly say it, my fault.)
Ultimately I guess it boils down to priorities. PCs have gotten a lot cheaper, and some of that has been done by making things worse.
Personally I would like high DPI as much as anybody here, although for now I think I'd first want a lot more screen real estate (13" is a bit limiting at times), but I probably use my computer for reading a lot more than other people do.
A $50 AMD card will get you at least 3 1920x1200 outputs. I am not sure if the HDMI port on them can be driven any higher; it possibly can, though.
I am not sure what the max resolution on AMD's Trinity line is if they have DisplayPort or HDMI. Again, it may very well be up to the max DisplayPort or HDMI supports, which is pretty damn high.
Turns out Intel also supports 2560x1600 over DisplayPort.
Scroll down to "Display and Audio Features Comparison"
HDMI itself can (from 1.3 onwards), but the card may not allow that for whatever reason.
> I am not sure what the max resolution on AMD's Trinity line is if they have DisplayPort or HDMI
I can't find anything on AMD's website, but the Asus F2A85-M Pro is specced at:
- 1920 x 1080 over HDMI
- 1920 x 1600 over RGB
- 2560 x 1600 over DVI
- 4096 x 2160 over DisplayPort
Pixel density affects many aspects of a system:
Storage - it affects the size disk you need/want because high resolution images / video take up much more space.
Memory - You need more memory to build up screens for a higher density display. Further you need the bandwidth to shove that data around.
Compute - If you want to 'render' to the display rather than just copy bitmaps around, or composite complex bitmaps, you need to spend a lot of compute time, which can make other things slow.
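(A back-of-the-envelope illustration of the memory and bandwidth points, assuming a 32-bit framebuffer; the panel resolutions are just examples:)

    # Rough framebuffer sizes at 4 bytes per pixel.
    def framebuffer_mb(w, h, bytes_per_pixel=4):
        return w * h * bytes_per_pixel / 2**20

    print(framebuffer_mb(1366, 768))   # ~4 MB per buffer
    print(framebuffer_mb(2880, 1800))  # ~20 MB per buffer
    # Recompositing the 2880x1800 buffer at 60 fps moves on the
    # order of 1.2 GB/s of pixel data.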
So when you look closely at tablets you will see interesting places where they have been adapted to support these densities.
But more importantly there is 'change' in the systems where there is new money being invested. So tablets are getting all of the 'change' now, less so with laptops, and hardly at all with desktops.
The reason this will change, though, is that I expect we've convinced display manufacturers that 'regular users' (the bulk of the purchasers) want high-DPI displays. It's not easy communicating with an entire industry, but the success Apple has been having with 'retina' displays, and the more recent Android tablets with higher resolutions, mean more people will jump in to support them. And more importantly, when the choice is available, folks reject lower-density displays. So in the great 'tuning' these guys do, where they calculate how to get the most money out of each hour of running their factories, the equation is tipping in favor of high-DPI displays.
That said, I'd love to have a couple of 32" 2560x1600 displays for my desktop, but I think that is still a couple of years off from being 'mainstream'.
Overclockers have a sale on 27" 2560x1440 displays today. They're selling for £311. I bought one about a year ago for £450 - worth every penny.
That's not a high-DPI/retina display though, just a regular DPI.
I'm probably going to wait for black friday though to see if I can nab a decent deal.
Personally I'd like to see a higher refresh rate. Even with triple buffering I don't think horizontal scrolling is smooth enough. I'd love to have a 120 Hz laptop screen. The new Windows 8 start screen, and switching between workspaces in Linux, would be so much nicer. Still, it's not exactly necessary, just a nice-to-have.
Linus's rant is compelling, but as the owner of a 2011 MacBook Air, I'm not completely sure I want to add 1.5 pounds (the difference between the MBA and the 13" Retina MBP). Granted, complaining about 4 pounds sounds ridiculous, but having something you can effortlessly carry from room to room with one hand is pretty great.
Some money advice: Don't use a retina display. :-)
I recently saw my old iPad 1 at my ex-girlfriend's. It used to be the best screen I had ever used -- now, after an iPad 3, it just looks horrible.
That is a bit irrelevant. I'd rather have 256 levels of grey on a retina than a good colour non-retina. I can't get good readability of text (both web and A4 documentation) together with quick page turns (no eInk) otherwise.
I could use something else to watch movies, etc.
In an ideal world I would agree completely, a better DPI is amazing both in terms of font readability AND for watching full screen media (movies, TV shows, games, etc).
But in the real world, higher resolution means smaller screen elements. At 1600x900, fonts are readable at 125%; at 1920x1080, even at 125%, fonts and some elements are literally too small to be comfortably read (you get eye strain after less than an hour).
Now I would turn it up to 150% "text size" but that breaks SO many native Windows applications (e.g. pushing text off the viewable area) and does the same on Linux too (Ubuntu).
Ideally everything should remain the same size no matter what the resolution, and the DPI should just grow upwards. This is how it works on platforms like the iPad.
So, I disagree with him, I don't want higher resolution displays because Windows, Linux, and OS X still suck at handling resolution (and if you use a non-native resolution it hurts the performance since the GPU has to re-scale constantly).
I have to Ctrl-+ every site a few times, but it works flawlessly after that.
I would argue that making everything a fixed consistent size regardless of resolution and then increasing DPI as the resolution is increased would be a better way to "evolve" things.
"Zoom" should be an OS function (e.g. 125%, 150%, 200%, etc).
"Resolution" should be hidden from the end user entirely.
"DPI" should automatically scale with the supported resolution.
That way, if the user has poor eyesight they can "zoom" things, but aside from that everything would look identical no matter what resolution you had (e.g. identical on 1600x900 or 1080p, or higher).
Resolution from an OS perspective needs to be scrapped and re-invented.
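(A tiny sketch of what that model looks like: UI sizes defined in physical units, with the pixel mapping derived from the panel's real DPI. The numbers are illustrative, not from any actual OS:)

    # Convert a physical size in millimeters to device pixels.
    def px(size_mm, dpi):
        return round(size_mm / 25.4 * dpi)

    # A 4 mm-tall label stays 4 mm on every panel:
    print(px(4, 96))   # 15 px on a ~96 DPI desktop display
    print(px(4, 220))  # 35 px on a retina-class display
    # The user-facing "zoom" is then just a multiplier on the
    # physical size, independent of the panel resolution.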
There is a chicken-and-egg problem with getting ISVs to support high-resolution modes (i.e., for third-party developers to use the support the OS offers), since there isn't much incentive until their users have hardware to use the high-resolution modes. Apple has started shipping high-resolution screens; once others do too, software support will be more likely to catch up.
No no no no no. Hardware adoption comes first, followed by the optimized software functionality.
Just look at SSD and TRIM.
On the web too. For too many years it was standard to hard-code font sizes and specify measurements in pixels in CSS.
It's about time web designers started thinking about accessibility and using flexible font sizes and ems (a unit of length relative to the text size) in web designs. Even when it means making small compromises in the design, and you can't make the web site pixel-perfect with regard to the photoshopped design mockups.
edit: reference: http://www.w3.org/TR/css3-values/#absolute-lengths
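(A minimal sketch of the difference; the class name is made up for illustration:)

    /* Pixel-locked: ignores the user's preferred text size. */
    .sidebar { width: 240px; font-size: 12px; }

    /* Relative: everything scales together when the base
       font size (browser default or user setting) changes. */
    body     { font-size: 100%; }
    .sidebar { width: 15em; font-size: 0.75em; }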
The problem with websites is the image files - JPG, GIF, and PNG. The resolution is locked in when the graphic is made, so files made for the low-res web will look blurry and pixelated when the browser scales them up for a hi-res display.
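(One workaround that already exists for this: serve a double-resolution image to high-DPI screens via a device-pixel-ratio media query. The selector and file names below are hypothetical:)

    .logo {
      background-image: url(logo.png);
      background-size: 100px 40px;  /* CSS-pixel size stays fixed */
    }

    @media (-webkit-min-device-pixel-ratio: 2), (min-resolution: 192dpi) {
      .logo { background-image: url(logo@2x.png); }  /* 200x80 asset */
    }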
Such sites aren't blurry in absolute terms, just in relation to sites that have been optimized. That is, they look the same as they would on a standard screen.
And on the "Retina" MacBook Pros - I'm not sure why Linus goes batshit over an Apple marketing name. It's a term Apple uses for marketing their HiDPI screens (iPhone, iPad and now Macs), and it's not in use by anybody else to describe their high-resolution screens.
The true HiDPI effect is only apparent in the "same size as before but 4 times the pixel density" default configuration.
Changing it to run one-to-one on the 2880x1800 display achieves nothing besides really tiny, unreadable font sizes.
No, that's not how it works on the iPad. Apple would not have increased the number of pixels by a factor of exactly 4 in one iPad update (1024x768 to 2048x1536, i.e. exactly 2x in each dimension) if the software gave them the freedom to choose any display resolution.
I wouldn't call that "pretty good." In fact I would call that shockingly bad.
Linux still falls into the trap of resizing things to match the resolution. As you increase the resolution things get smaller, and as you decrease the resolution things get bigger. This is a broken design.
Zooming should have no relationship to resolution. Resolution should increase DPI, not zoom or scaling (or however you wish to word it). A font should be the same size on 1600x900 as it is on 1080p.
I haven't really found this myself: I typically prefer a nice pixel font at a lower DPI than a TrueType font at a higher DPI, especially for coding. The manual effort that goes into a good pixel font just makes for better readability imo.
And... I may be the only one here... but I think they should go a little higher than 2880x1800. I know normal viewing distance is something like 15 inches, but I like to sit closer to my screen when coding, and it sure would be nice to have all semblance of "pixels" completely disappear. How cool would that be?
And if they started using AMOLED screens instead of IPS, then that would really be the perfect screen.
Why? AMOLED would just make the colours ridiculously unrealistic without many other benefits.
Unrealistic colors are a problem of reproducing the information you acquire correctly. All of the colors humans can see are represented on the CIE 1931 chromaticity diagram. Choose any three points within that color space and draw a triangle between them: the colors within the triangle are those that the primaries at the vertices can reproduce. The rest is just a software issue (controlling the correct voltages to each LED).
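(To make the geometry concrete, here's a small sketch that tests whether a chromaticity lies inside a gamut triangle. The primaries are the real sRGB ones; the test point is D65 white:)

    # Point-in-triangle test on the CIE 1931 xy plane.
    def cross(o, a, b):
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])

    def in_gamut(p, r, g, b):
        # p is inside if it lies on the same side of all three edges.
        signs = [cross(r, g, p), cross(g, b, p), cross(b, r, p)]
        return all(s >= 0 for s in signs) or all(s <= 0 for s in signs)

    srgb_r, srgb_g, srgb_b = (0.64, 0.33), (0.30, 0.60), (0.15, 0.06)
    print(in_gamut((0.3127, 0.3290), srgb_r, srgb_g, srgb_b))  # D65: True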
Roll it out. Stop with this incremental horseshit for science's sake and make a LEAP.
And about every other hacker I know.
Technology generally improves faster than one presumes (accelerating returns and all).
In fact, Samsung recently demoed a few Series 9 prototypes at IFA with WQHD resolution (2560 x 1440, which is the 16:9 resolution you get from most 27" panels): http://www.theverge.com/2012/8/31/3282360/samsung-wqhd-2560-...
I can work just fine on my 2009 MBP at 1280x800 (or whatever it is): the text is perfectly readable, there's no noticeable pixelation at distances past a few inches from the screen, and having everything shrink as a result of increasing the res would make it unusable.
He's probably exaggerating for effect, but it's not even remotely true that laptop resolutions have stagnated. They have steadily increased to the point where we now have retina screens on regular work laptops.
I think if Linus created a blog post saying he'd just set his background color to blue, it would make the top spot here!
It's a shame to see Linus stooping to mock Apple's use of the term retina. Apple names everything, like the Fusion Drive or any number of previous technologies. It's to humanize the tech so the average person walking into the Apple store doesn't have to talk in tech-speak. It's just a marketing term, and every company has them.
The definition of "reasonable resolution" changes over the years, VGA seemed reasonable compared to EGA.
(DisplayPort is my favorite display connector to date, and I hope to see adoption grow)
Like my MBP, Retina Macs still have 7 hours of battery, so I don't see the landscape changing much: many laptops, retina-class or not, cheap or not, will get crappy battery life, and a few will get it right.
(i had a 15" 1400x1050 display, then a 16" FullHD and now a 13" 1366x768)
For example, the equivalent of 9pt on a 900p display allows actually reasonable split-screen editing. On an actual 900p display, though, it's much too blocky to be easily read without straining the eyes. At 1600p, though, it's crystal clear.
Other people may have other reasons for higher density displays, but those are mine.
Computer displays have stagnated for too long.
> a decent font size
This is a function of screen size, not resolution.
I'm talking about the actual physical size of the font, which is about 5mm. If I make my font size any smaller, because the resolution is crap, the font is unreadable.
Higher pixel density would mean I could make the font size smaller and it would still be clearly legible.
1920x1200 displays are still available, incidentally. And overall, the desktop TFT market is much higher quality these days. TN panels are going out of style, for one thing, thank god. You can now get a 22" full-HD display with an MVA panel for around 150 USD. Still pathetic compared to tablets, but we're getting there.
On my Pandora I do Scala/Clojure/Java (yep, Swing) coding (the new JVM for ARM by Oracle is really good; I never thought I would say that, but it's an amazing piece of work for such a 'small' memory and performance footprint as the OpenPandora's). Haskell works well, the whole Linux build chain works well, the LAMP stack works well, and I can run Rails, Django, Apache and Node.js. And amazing battery life, with the option to swap out batteries on the go. Only, the screen is way too small and doesn't work under anything more than near darkness :)
The iPad 3 works fine everywhere, but I cannot do much work on it, and I haven't found an Android pad which works well enough for full-time coding, nor have I found a good enough case to work with for any of those. Most cases are really bad Chinese things which have a broken power supply (batteries, or anything between charging and the batteries) after a few weeks of usage. Anyone? :)
If the Nexus 10 sells anywhere near the amount the Nexus 7 does, there might finally be some good accessories. I would almost do a Kickstarter to make different form factor clamshell keyboard/battery/connection docks for android phones and pads. Almost...
But yet it's very much the opposite of high resolution, so it takes more planning of what you are going to do; still, for algorithmic work and hard problems, nothing beats mountain walks and then typing/testing on that thing. And the occasional Crash Bandicoot III play, of course.
For me the optimum config would be something like an OpenPandora (before it I had, and actually still have, the Zaurus C860, which did the same minus the games) with a docking 'station', or (but now I'm dreaming, as it seems nothing comes close to this yet) AR glasses with a sufficiently high resolution connected to it.
I have been experimenting with a Twiddler 2 (http://www.handykey.com/); everyone complains that it's too slow for typing, but again, when you are thinking up stuff that isn't kilometers of boring (CRUD) coding, typing speed, IMHO, doesn't matter too much. And I can type while walking with it, but no AR (or actually VR) yet.
As for Windows 8 doing high DPI, Tech Report has a decent article (http://techreport.com/review/23631/how-windows-8-scaling-fai...) on the lackings of Win8 high DPI settings even in Metro. Though feel free to ignore their complaints about browser scaling, each browser takes a different approach to how they break web pages when scaling. (Suffice to say 1 pixel borders and non-integer scaling don't go together well!)
They go into detail about the different scaling options, but the scaling options are all things that the user has to manually enable! Hardly auto-DPI. There is a balance to be struck between "more information on screen" and "better displaying information on screen" that Microsoft apparently decided to not even attempt, instead giving the user a blunt instrument with which to toggle between "way too big" and "way too small".
Well, the last time I bought a new laptop was 2007. I figured there had to still be some ThinkPad Ts with 4:3 screens (given that they're used by people who are in touch with the realities of traveling and all), but alas, I see that's one more thing Lenovo has crapped up. Yet another reason to keep this IBM-branded T60 going as long as possible (despite the periodic keyboard swaps).
How do those spacers on the sides of keyboards not just scream waste needing optimization? I guess that 7% area savings for LCD manufacturers overrides common sense. Well okay, I see the writing on the wall - time to learn to use ed(1).
But exactly when this will happen? Well, it sort of already has: http://www.engadget.com/2012/08/31/samsung-Series%209-WQHD-U...
However, I believe we're going to see a huge blast of these super-high-res panels right after Intel Haswell is released. It'll provide Ultrabooks with graphics performance good enough for these high-res panels.
As someone with absolutely terrible vision, I'll have to disagree with this point. I've been keeping my screen res around 1024x768 for years because moving it higher just makes it so darn hard to see. I've now come to the point where some monitors and video cards won't even go that low. I'm probably a unique case. Still, I do wish accessibility was better for visually impaired users.
FWIW, Macs do magnification the best out of the box. ZoomText on Windows costs a bit, and I'm not sure anything exists for Linux that's even comparable to Mac OS's magnifier abilities. Even with a Mac, there's too much mouse movement involved for my tastes.
I think, and I might be missing a very obvious demographic, that developers trying to cram 12 vim buffers onto one screen would be the main use case, and that's just not a large enough market to justify developing ultra-high-res panels at 24" or 27" at an affordable cost. I'm sure you can make an outrageously expensive monitor of that quality, similar to the 28" that John Carmack paid 10k for in 1995, but not that many people would be able to justify the purchase.
It's pretty nice.
I have a 13" Retina MacBook Pro and scaled resolutions look blurry, so I'm stuck with 1280x800.
But yes, it's sad how the Retina MacBook Pro has the capability, but other than doing really, really sharp fonts (when it's using the built in font renderer) - it goes unused in most situations.
That will change. Somebody needs to write a Retina aware Terminal App.
Or maybe it was the Font Handling - I was hoping that it would look okay (just small) at Andale Mono 4 - but it pixelated.
Regardless - In my 5 minutes of futzing at the apple store I could get it to do what I wanted (Basically, make the Screen look 2560x1600 to Terminal.app, but have the OS treat it as 1280x800)
> I couldn't get extra screen real-estate the way I would if the screen was 2560x1600
The resolution of the 15" is 2880x1800, not 2560x1600 (that's the 13", is that the one you're actually talking about? I'll go with that for the rest as what you say doesn't seem to make sense for the 15"). You don't get access to the native resolution of the screen with the built-in tools as far as I know, there always is some scaling applied.
> It screen shrinks back to 1280x800, and my 2010 MacBook Air runs at 1440x900.
That's just the default scaling (1:2), if you go into the display settings you can change the virtual resolution to 1680x1050, 1440x900 or 1024x640.
> Or maybe it was the Font Handling - I was hoping that it would look okay (just small) at Andale Mono 4 - but it pixelated.
Andale Mono 4? As in 4 points?
> Basically, make the Screen look 2560x1600 to Terminal.app, but have the OS treat it as 1280x800
I fear I don't get what you mean anymore than previously.
What I want is for the screen to appear to be 1440x900 (the 1:2 scaling you mentioned), or 1680x1050, but for the Terminal Windows to appear to be as though they were on a 2560x1600 screen - really small.
I thought I could do that by making the font Andale Mono 4, and I was hoping that the font would be displayed properly because of the retina screen - but it was still pretty unreadable. You can't really get "microprinting" on a retina screen - it still has a ways to go in terms of resolution.
It's a trade-off between various things, as usual. What Linus thinks may work for him (I happen to agree), but I can easily see someone wanting an ultraportable with basically VGA resolution and battery lasting entire day of active usage.
iPads, on the other hand, are highly dependent on battery life. And they get to sacrifice a lot of specs and use worse hardware than a typical 'ultraportable' without people really noticing, as it's a different type of device. Still, you will find plenty of Android tablets that get stuck in the spec wars and end up with terrible battery life.
Also, the app ecosystem doesn't support those high resolutions even on Macs; imagine what a mess there would be with Windows applications. Augh.
Admittedly, this isn't something that could happen today. But Intel's IGP are the low end. In a few years, they'll be everywhere. It's difficult to argue that something that requires a modern integrated graphics chip is anything too demanding, when you look at how far ahead dedicated chips are.
I still don't want big luggable laptops, but that 1366x768 is so last century. Christ, soon even the cellphones will start laughing at the ridiculously bad laptop displays.
And the next technology journalist that asks you whether you want fonts that small, I'll just hunt down and give an atomic wedgie. I want pixels for high-quality fonts, and yes, I want my fonts small, but "high resolution" really doesn't equate "small fonts" like some less-than-gifted tech pundits seem to constantly think.
In fact, if you have bad vision, sharp good high-quality fonts will help.
It's just much more comfortable to read on it and pictures look amazing.
If you really wanted to you could make everything tiny, but you don't need to.
And finally, head over to an apple store and take a look at their retina macbook pro and try comparing it to a normal macbook pro.
That's why OS X only supports 1x and 2x HiDPI; everything else just resizes the whole screen. (E.g. the "looks like 1920x1200" mode on the 15" renders everything at 2x, 3840x2400, then downscales the result to the panel's 2880x1800.)
Obviously 2560x1600 on 11" is utterly useless as it is on tablets. 1366 is equally dumb. Whatever goes in between is generally fine. I start being happy at around 1920 for 13" (and it's 4:3 friend) since, after that, i can't see pixels at all and in native mode I can't see the text too well either.
I don't think that's a personal thing.
As you can see from many of the comments, people do want to buy higher pixel density screens on their laptops. Apple even came out with laptops where that is the distinguishing feature. But if your choices are outside of Apple, then you simply cannot buy high pixel density screens.
Laser printers when they first came out were 300 dpi. That is a very good indication that those kind of pixel densities make for better legibility. Sure you can read stuff at 75dpi, but it isn't as productive.
For example, my screen has 150 DPI. That's pretty good. I can't see pixels. I can see small text.
What I mean is that the resolution listed is overkill. Heck, not only is it what I meant, it's also exactly what I wrote ;-)
You are right that there are people for whom extra ppi, extra cpu, extra memory, extra power saving, better colour gamuts etc won't make a difference. But there are also a group who do want ppi improvements and Apple has done so across their laptop and tablet lines, Asus has done for some of its tablets, and the Nexus 10 has done so too. If nobody bought those then it would demonstrate lack of demand, but people have been buying them. And people really want them for non-Apple laptops too as Linus stated and others have concurred.
I have a "hi dpi" laptop since 3 years now (yeah, its a sony vaio Z)
150 DPI helps... 1000 DPI? Useless. Scale.
Maybe some games? Maybe some designery stuff? Maybe some video creation stuff? (It might be useful for doctors and medical images, but I kind of hope they're using special purpose monitors for that stuff).
And does pushing those extra pixels have a cost in energy use?
That kinda depends on what you use your computer for, doesn't it?
Or perhaps someone has links to great information. Because while that stuff is findable with a bit of www searching it can be tricky to sort the wheat from the chaff.
If we're limiting the question to the specific "me" (rather than a general hypothetic user) - I do a lot of work with text. Text in web browsers and text in word processors. I do a little bit of film watching - just DVDs. I don't particularly care about battery life, but I'm happy with lower performance if there's a clear eco benefit.
>Christ, soon even the cellphones will start laughing at the ridiculously bad laptop displays.
They've been beating the pants off of them for some time now.
But they're inching closer and closer; the standard Android resolution is 720p, which is only a step away from the bullshit 1366x768 laptop "standard".
I think this is mainly because of Apple's aggressive use of high quality displays in their iOS devices, competitors had to try to match or surpass them and the competition has given most phones really nice displays.
Hopefully once "Retina" displays in the MacBook Pros become slightly more affordable this will have the same effect on the laptop market.
This is nonsense. You need a certain number of pixels to have a good-looking picture, but the benefit of ever-larger resolutions is like a log curve: it flattens out as you go up and up, since you notice the pixels less and less.
I am surprised to see Linus making this kind of claim, he used to be more practically-focused. Now he sounds like a marketing guy from Apple.
I say, why stop at 2560x1600? This is ridiculously low. Make 10,000x7,000 the new standard laptop resolution. Yes, we can. Tomorrow, please. Even if the capacity and the plants to make it do not exist yet.
BS if I ever see it.
People really underestimate how far Intel has come.
I'll tell you what: if I can go from 2560x1440 on a 27" back to my MacBook Air and want to rip my eyes out... we can stand to improve.
Obviously, without scaling the UI, this resolution would be unusable on a 10" screen, you wouldn't be able to click much of anything. Linux is good at DPI scaling, Windows 8 is supposedly better.
And try running games at that resolution with your crappy GPU, and you'll have a nice slideshow that makes you feel like you have a 286 playing Doom all over again.