Then up to around the year 2000, resolutions increased, going through such stages as 800x600, 1024x768 and 1280x1024 (your CRT would flicker if much higher, and then LCDs came along pinning it at 1280x1024 for 17").
Then the resolution (and especially DPI) growth basically stopped for 10 years, imho ("HD" was not really any higher than what a year-2000 CRT could do).
But now there's some growth again :) Why the problems, though? Has 10 years of the same DPI made developers forget about different DPIs?
Expensive workstations already had big monitors and 1280x1024 resolutions in the 1980's.
Also, on a related tangent: prior to 1970, computers already ran with processor clocks over 30 MHz, with 32-bit (or larger) words, and virtual memory. These things did not first show up with the Intel 80386 and only start to be supported in Windows 95. :)
But they were complete monsters (deeper than they were wide or tall), not good for your eyes either; they needed filters, and some X-rays would still hit your face.
We replaced them with much cheaper (and not as good quality) LCD panels. They were so cheap that you could buy six for the price of one high-quality display.
What happened is very simple: LCD technology did not benefit from CRT advances in resolution, as they are completely orthogonal technologies, and people bought cheap panels.
The other change was 3D. We had 2D acceleration cards back in the day, but consumers made little use of them; what they did start doing was playing a lot of 3D games.
Comparing the original Lemmings ("sprites changing 100 small parts of the screen") with Call of Duty ("millions of polygons moving in 3D in real time") does not make very much sense.
So the cards evolved from accelerating a static 2D image or document on the screen, as professionals needed, to redrawing a 3D scene 60 times per second, which is much more complex.
After LED backlighting (too expensive before), and with smartphones, tablets and TVs driving mass production of even cheaper panels, quality has improved a lot.
VW bought Porsche, Fiat bought Ferrari etc. :D
Note that Porsche SE used to own Porsche AG directly, but now VW owns it.
To quote your link:
In July 2012, it was announced that Volkswagen AG was taking over the Porsche AG automotive company completely, which bears the same name, but is only a subsidiary of Porsche SE.
I guess that since nowadays every digital signal is routed through a content-protection filter, the technology processing these streams wasn't performing well, causing stagnation in display resolutions for a long time (well, there was also the DPI limit of CRTs and immature LCD tech). I guess only now, with HDMI 2.0, DisplayPort 1.3 or Thunderbolt, are we getting to the point where the digital signal can be "properly" controlled by content-protection chips, hence allowing 4K+ resolutions (though it's still a mess).
1. DVI has supported higher-resolution displays for a long time: 1920x1200 since 1999 on single-link, and 2560x1600 on dual-link. Note that this is more than '10 years ago'.
2. HDMI has supported 4K displays since May 2009.
Most of your post betrays a misunderstanding or underestimation of the technology involved. The bottleneck was never 'content protection filters' so much as the feasibility of building electronics that can handle higher-bandwidth signals over cables with no extra data channels (in a backwards-compatible manner), and the challenge of getting manufacturers to make hardware that would actually support it, at increased cost, for no practical benefit.
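To put rough numbers on the bandwidth point (back-of-the-envelope Python; the ~10% reduced-blanking overhead is my own assumption, so treat the outputs as approximate):

    def pixel_clock_mhz(width, height, refresh=60, blanking_overhead=1.10):
        # Approximate pixel clock, assuming ~10% overhead for blanking intervals.
        return width * height * refresh * blanking_overhead / 1e6

    print(pixel_clock_mhz(1920, 1200))  # ~152 MHz: just fits under single-link DVI's 165 MHz cap
    print(pixel_clock_mhz(2560, 1600))  # ~270 MHz: needs dual-link DVI (2 x 165 MHz)
    print(pixel_clock_mhz(3840, 2160))  # ~548 MHz: beyond HDMI 1.x, hence HDMI 2.0 / DisplayPort

Each step up is a big jump in signal rate over the same old connectors, which is exactly the electronics problem described above.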
For older LCDs, it was an issue of new technology being expensive. I remember when everyone I knew had 15" CRT monitors, and having a 15" LCD monitor was a luxury that almost no one could afford. It might be hard to remember, but consumer LCD displays were new once, and new technology is never cheap. On top of that, the cost of an LCD panel doesn't scale linearly with diagonal size, so going from 15" to 17" to 19" was a huge cost curve. Until LCD production was more consistent and people understood the point of buying them over CRTs, the market didn't really heat up, and so sizes/resolutions never grew.
As for DPI, LCD DPI depends on the size of the pixels, which are a mechanical element. In contrast, a CRT display is a printed screen of phosphors; paint a smaller, more detailed phosphor grid, adjust the electronics for more precise control, and boom, higher DPI.
The reason high DPI stayed a niche comes down to one thing, and one thing only: software. You had to have good eyesight and want a lot of screen real estate to even be able to make use of it.
Software didn't scale well. We had small laptop screens with 1920x1200 in the early 2000s. They barely cost more than lower-resolution displays (back then you typically had three different resolutions to choose from for every laptop (then came Apple)). It was really apparent even back then that high DPI wasn't costly at all; it just wasn't mainstream enough for anyone to care.
Fast forward a decade and we have the exact same problem. Software can't scale shit. The only reason Apple was first with high DPI was that they controlled the software, and they took the easy way out: they just scaled everything up 2x in each dimension (4x the pixels) and, look at that, exactly the same screen real estate with just a higher DPI. Same thing with the iPhone 4 - the only reason they increased the resolution so much was so that they wouldn't have to bother with scaling ("retina" was just an added bonus). Remember that niche that sought large screen real estate and bought high-resolution CRTs in the '90s? Well, they are still waiting for a decent LCD...
As for HDMI: no, HDMI was never intended for computers at all in the first place, but the hype, in combination with small laptops, forced it upon us anyway. HDMI also came much later; the battle for high-resolution/high-DPI displays was already lost (because of software). The real technical hurdle was the OSD and scaling chips. Again, the reason Apple was first with the 30" 2560x1600 display was that they were the first who could ditch both the OSD and all scaling from the monitor. It only had brightness adjustment (no OSD), and all of the scaling was done on the graphics card. That way you couldn't pair it with a regular PC - if you did, you would have needed another monitor to enter the BIOS, install the OS, enter safe mode, or game at any other resolution (which you pretty much had to). Of course, eventually most graphics cards could do that scaling, but Apple were the first who could assume the monitor would be paired with such a card.
That and the fact that Dual-Link-DVI was quite rare (hardly surprising since there were no monitors on the market that used it).
Oh, and people, I hope you (not parent, but lots of others) didn't run 1280x1024 on your 4:3 CRTs. The only 5:4 monitors that existed were 17" and 19" LCDs. You should have used 1600x1200 or 1280x960, that is, if you didn't want to stretch everything.
This time, the screen size stays the same, but the DPI goes through the roof.
LCDs introduced the problem of a fixed resolution, where basically you have to choose the right resolution at the time you buy it. There were 1920x1200 laptops and 4K 22-inch (IBM T221) screens 10 years ago, so it was clear this problem was coming, yet the software never changed to become resolution-scalable.
Game developers these days usually think of a PC or console game display in terms of pixel count and aspect ratio, not DPI.
That's why going to a high-pixel-count display for the same physical dimensions screws everything up.
The last 10 years have seen a major shift from desktops to laptops (and mobile/tablet). Those have different needs and constraints, e.g. power, form factor, and the capabilities of mobile GPUs. So I think that's part of the effect you've witnessed.
Resolution didn't simply stagnate; it regressed. In approximately 2000, Dell sold laptops with 1600x1200 LCD displays. Once "HD" appeared, display manufacturers lost interest in resolutions higher than 1920x1080. For several years to a decade, the most common top resolution was 1920x1080 and many mid and low-spec laptops were sold with shockingly poor 1024x768 and 1366x768 resolutions. (The 2560x1600 30-inch monitors appeared on the market as a prosumer option in ~2004 but these were tainted by their own blight--a fixed price of $1,100 that never wavered--and didn't see any compelling competition until the Korean and Chinese manufacturers disrupted the incumbents.)
Really? I used monochrome hi-res CRTs in the early nineties (made by Sun for its workstations, no less) and they were shite (not to mention the text rendering of the software at the time was shite too).
I actually think you're just seeing those things through rose-colored glasses. Try a modern 5K retina iMac or 4K dell monitor.
Browser windows on the laptop usually run at a zoom of 200%, whereas windows on the external display run at 90%.
Annoying, but the beautiful text is worth the bother.
xrandr --output <output_name> --scale 2.0x2.0
The output_name can be found just by running "xrandr".
A related and more powerful feature is specifying an arbitrary matrix for the screen-display transformation (e.g. rotation by any angle, shear) (xrandr --transform).
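For example (output name and numbers are placeholders), a plain 1.5x scale written out as a full transform matrix would be:

    xrandr --output eDP-1 --transform 1.5,0,0,0,1.5,0,0,0,1

The nine comma-separated values are the rows of the 3x3 homogeneous-coordinates matrix, so a rotation by an arbitrary angle just means putting the appropriate cos/sin terms in the top-left 2x2 block.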
Shameless plug: I have written an addon to work around this.
The part of the comment that I found the most interesting was his take on a solution:
1. Make everything vectors. You'll have to choose between weird antialiasing artifacts and potential pixel cracks; either way things will look bad. And you'll encounter bitmaps eventually, and have to deal with the necessities of resampling and pixel aligning at that point.

2. Scale only to integral sizes, and resample. You'll avoid antialiasing and pixel alignment issues, but pay a performance penalty, and things may look slightly blurry.
Which leads to the problem of centering. I wish to center a bitmap image within a button's border. The image is 101 logical points high, and the border is 200 logical points high. With a 2x scale factor, I can center absolutely, and still be aligned to device pixels. With a 1x, 1.25, 1.33, etc. scale factor, centering will align me on a partial pixel, which looks like crap. So I have to round. Which way? If the goal is "make it look good," then the answer is "whichever way looks good," which depends on the visual style of the bezel, i.e. whether the top or bottom has more visual weight. So now we need hinting.
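Just to make that arithmetic concrete (plain Python with the numbers from the example above; nothing here comes from any real toolkit):

    def center_offset_device_px(image_pts, border_pts, scale):
        # Top offset, in device pixels, that exactly centers the image in the border.
        return (border_pts - image_pts) / 2.0 * scale

    for scale in (1.0, 1.25, 1.33, 2.0):
        print(scale, center_offset_device_px(101, 200, scale))
    # 1.0  -> 49.5    (fractional already, since the 99-point difference is odd)
    # 1.25 -> 61.875  (fractional: must round, i.e. hint)
    # 1.33 -> 65.835  (fractional)
    # 2.0  -> 99.0    (lands exactly on a device pixel)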
And that's where things start to get really nasty. In order to make things look good at arbitrary resolutions, we want to round to device pixels. But the rounding direction is not a local question! Consider what happens if we have two visual elements abutting in logical coordinates, and they round in opposite directions: now there's a device pixel between them. That's very visible: you get a pixel crack! So you have to coordinate rounding.
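A toy illustration of the crack, with made-up numbers and the naive per-element rounding many toolkits do:

    scale = 1.2
    a_x, a_w = 7, 17             # element A: logical position and width (hypothetical)
    b_x = a_x + a_w              # element B abuts A in logical coordinates (x = 24)

    # Each element snaps its own geometry to device pixels independently:
    a_dev_x, a_dev_w = round(a_x * scale), round(a_w * scale)   # 8 and 20 -> A covers [8, 28)
    b_dev_x = round(b_x * scale)                                 # 29      -> B starts at 29

    print(a_dev_x + a_dev_w, b_dev_x)   # 28 vs 29: device column 28 is a one-pixel crack

Both elements rounded "correctly" in isolation, but because they didn't coordinate, a visible gap appears between them.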
Question: even with hinting, is it actually possible to make a font where two glyphs align perfectly without white space between them?
WPF is a good example of a framework that attempted resolution independence and encountered this problem. Initially it had the "SnapsToDevicePixels" property, which triggers rounding behavior at draw time. But draw time is too late, because of the "abutting elements rounding in opposite directions" problem. So they introduced the "UseLayoutRounding" property, which does...something. And the guidance is basically "turn it on and see if it helps; if not, disable it." Great.
I think a way out of this is much higher DPI with some blurring as the last step. That's what print does. At 1000 or 2500 DPI nobody will notice aliasing anymore, and the blurring would take care of small cracks between objects that are supposed to abut, make it less obnoxious when two objects that shouldn't overlap accidentally do, etc.
Or more precisely, this is a Desktop Environment thing, not a distro thing. So whatever DE /u/bitL is using in their Mint (Cinnamon?) is the culprit.
CTRL-+ can really break the page layout in surprisingly varied ways.
The Qt stuff is worse, since it cannot be fixed by adjusting only the font size.
The older (Xaw3D, Tk) stuff is really bad.
I have never seen different DPI on different displays working.
99% of my day is spent in the terminal (xfce4-terminal) though, and the crisp text makes it all worthwhile.
"Well, KDE can do what we are looking for, but GNOME does it in a hackish way. As we use GNOME, our review is all about it, and as a conclusion, Linux is no good."
Why do these people insist on using GNOME after all?
But I've seen that from people complaining about workflow, support for unusual options, and accessibility.
Anyway, yours is the first review I've seen of KDE on HiDPI.
In my mind, KDE is also superior to Windows in many ways anyway.
to take full advantage of that sweet sweet desktop systemd goodness.
I've had 1600x1200 CRT or LCD on my main desktop since... 00s? Late 90s? If you make money typing at a desk, you're foolish not to splurge on desk, chair, lighting, quiet, keyboard... and monitor. I've been dealing with this class of problem since the 80s although it was a little different back then...
Without GNOME, "Linux", or more accurately FOSS in general, has no problem with high-res displays; in fact it's really nice, far from a problem.
I don't know if KDE in general has resolution problems. Konsole looks beautiful at high res with a decent font. The only thing more flame-war-ish than emacs vs vi is font choice; however, I have noticed visible quality differences, such that a random font A in 2005 looks much worse at 24pt than it does at 10pt, and this varies by font and over time.
As an example of the above: I have FreeBSD on the other machine on my desk, and Monospace in Konsole at font size 12 looks "normal", but when you kick it up to 13 it looks perma-bold, because that's the 1-pixel-to-2-pixel stroke transition for Monospace on that platform and exact config. Luxi Mono displays "normally" at 11, perma-bold at 12, and at very large sizes like 17 it looks very peculiar: random-ish parts of letters display as 1 or 2 pixels wide. Nimbus Mono L Regular at 16 looks nice to my eyes, and that's mostly what I use on that machine at 1600x1200.
The biggest problem I have is finding really good UTF-8 fonts. There's a nice test file called UTF-8-demo.txt. It's mirrored and copied all over; here is one link:
I find this a true test of difficult display rendering on linux.
The ergonomics of widescreen for long term anything, including movie viewing, is another topic and I believe my multiple high res 4:3 monitors are a productivity booster for me, so until they break I'll keep them.
I miss working on my irix :(
I have a Dell UP2414Q (3840x2160 resolution, driven via DisplayPort 1.2) connected to a nVidia GTX 660 card, which was one of the cheapest ones that support DP 1.2.
With the proprietary nvidia driver, I need to manually edit the xorg configuration file to have the correct modes and most importantly disable XRandR in favor of Xinerama.
This in turn breaks e.g. GNOME shell on Fedora 20 (without RandR, you’ll just get an exception in your syslog), and in general prevents plenty of use-cases (e.g. redshift for controlling display brightness, or changing rotation settings without restarting X11).
The reason for having to disable RandR is that there is currently no standard way to represent multi-stream transport (MST) connections, and these 4K displays are driven as 2 simultaneous streams (tiles of 1920x2160 each). With RandR enabled, what you'll see is 2 connected outputs, and all applications will treat them as such, even though you have only one monitor connected.
Fixing this requires changes in RandR (i.e. the X server) and each driver. AFAIK, on the intel driver this should work, on nouveau there’s work under way, no clue about the proprietary nvidia driver.
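For reference, the Xinerama part of that xorg.conf edit is just the standard server flag (enabling Xinerama is what makes the X server turn RandR off); the exact mode/MetaModes lines depend on your driver and output names, so I'm leaving those out:

    Section "ServerFlags"
        Option "Xinerama" "1"   # enabling Xinerama disables RandR in the server
    EndSection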
Driver version: 346.16
XOrg server version: 1.16.1 (11601000)
The only issue I've run into is that Gnome Shell won't respond to clicks when I run in 30 bits/pixel mode.
In computer lifecycle terms that is a long time and it is a little bit embarrassing that Linux works as poorly as it does in these situations. (Windows ain't so great at it either, which reflects poorly on Microsoft, but it still handles the situation a lot better than Linux does.)
GNOME's HiDPI support doesn't allow float values, making this a serious issue for me. On top of that, you have to change the cursor size manually by diving into the config.
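For anyone searching, these are the knobs I mean (assuming the usual org.gnome.desktop.interface gsettings keys; adjust the values to taste):

    gsettings set org.gnome.desktop.interface scaling-factor 2          # integers only
    gsettings set org.gnome.desktop.interface text-scaling-factor 1.5   # fonts do accept a float
    gsettings set org.gnome.desktop.interface cursor-size 48            # cursor size has to be bumped by hand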
The iPhone 6 Plus display uses downsampling, so perhaps that's the way to go. But as things stand, I don't see myself using an external monitor in the near future, even after Wayland becomes mainstream.
A small issue is GTK apps that don't respect DPI settings.
The most significant problem AFAIK is for people who switch frequently between different displays. There are a few settings here and there you have to adjust every time you switch between Hi and Lo-DPI.
Having written some Qt apps and used them on scaled screens... eh. It's got some work to go, but we're making progress. Apparently Qt 5.4 will have further improvements in this field too.
Currently, on two monitors with the same physical dimensions but different resolutions, you can't get identical rendering.
But you can get a decent rendering from both of them, albeit a bit different; one, for example, will have a tad larger icons, the other maybe slightly larger fonts, etc.
Not necessarily a big deal for consumers, but vital for professionals that have preferred working screen layouts on their old machines and don't want to change that on the new ones.
And OS X is not Linux. I don't want a non-Linux OS.
Also, I get 7.5 hours on the battery, I wonder what went wrong between the UX31E and the UX31A?
The only quirk I have with the touchpad is that on Windows, with one finger idle on the touchpad and another finger moving, the cursor would still move with the moving finger. On Ubuntu, the cursor remains stationary any time there are two fingers on it. I probably would not have noticed if I weren't already so used to the Windows behavior, so it isn't a big deal at all.
ten@vekony:~$ xdpyinfo | grep dimensions
dimensions: 3200x1800 pixels (846x476 millimeters)
I've set the scale to 1.38 for menus and title bars (in the Screen Display settings) and everything looks great.
Chrome is a nightmare; there are some builds of Chromium that have better HiDPI support, but Firefox works perfectly with the setting "layout.css.devPixelsPerPx" set to 1.8.
I've also set the text scaling factor to 1.58 (Using Unity Tweak Tool).
There is a good Arch Linux wiki page with some HiDPI tips - https://wiki.archlinux.org/index.php/HiDPI
I'm completely happy with the results, and when I use my wife's MacBook Air I think there is something wrong with my eyes, as the difference is really noticeable.