
I used to have a "huge" 22" Iiyama CRT (20" effective) boasting 2048x1152. I have forever been confused about why current desktop screens try to satisfy you with 1080p. This crap was getting hyped up on screens years after I was already enjoying higher resolutions at good refresh rates. And guess what? My two screens at work are 27" Iiyama LCDs, and they don't go over 1080p... shouldn't it be even easier to get better pixel density with LCDs in bigger screens?

I agree with the "tiny font" bit. As someone with severe myopia, my MacBook Air is a lot more comfortable on the eyes than my 16" widescreen Acer.




Well, there are at least 27" TFTs with 2560x1440 and 30" TFTs with 2560x1600.

I guess part of it must be that single-link DVI and older HDMI don't allow for more than 1920x1200, and cheaper graphics units would be overwhelmed by anything over 2560x1600.
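For a rough sense of where that 1920x1200 cutoff comes from, here's a back-of-the-envelope Python sketch. The blanking totals are approximate reduced-blanking values, not exact spec timings, so the numbers are ballpark:

    # Why single-link DVI tops out around 1920x1200: the TMDS clock is
    # capped at 165 MHz, and the pixel clock (active pixels plus blanking,
    # times refresh rate) has to fit under that cap.
    SINGLE_LINK_DVI_MHZ = 165

    def pixel_clock_mhz(h_active, v_active, refresh_hz, h_blank=160, v_blank=35):
        # h_blank/v_blank are approximate CVT reduced-blanking totals
        return (h_active + h_blank) * (v_active + v_blank) * refresh_hz / 1e6

    for w, h in [(1920, 1200), (2560, 1440), (2560, 1600)]:
        clk = pixel_clock_mhz(w, h, 60)
        verdict = "fits" if clk <= SINGLE_LINK_DVI_MHZ else "needs dual-link"
        print("%dx%d@60: ~%.0f MHz -> %s" % (w, h, clk, verdict))

1920x1200@60 lands around 154 MHz, just under the limit; 2560x1440 and up need dual-link DVI (or a newer HDMI/DisplayPort revision).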


Meh.

> The IBM T220 and T221 are LCD monitors with a native resolution of 3840×2400 pixels (WQUXGA) on a screen with a diagonal of 22.2 inches

http://en.wikipedia.org/wiki/IBM_T220/T221_LCD_monitors

This was back in 2001; the screen ran off a G200, which was bundled with the T220 (http://en.wikipedia.org/wiki/Matrox_G200). A "modern" IGP is perfectly able to drive a desktop and desktop applications at that resolution (hell, an Eyefinity card supports six screens at at least 1920x1200 in a 3x2 configuration, for a total resolution of 5760x2400... and you can play games on that). The pixel counts below put numbers on that.
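A quick sketch of the comparison (just straight multiplication):

    # Pixel-count comparison: the 2001 T221 vs. a 1080p panel
    # vs. a 3x2 Eyefinity wall of 1920x1200 screens.
    modes = [
        ("1080p", 1920, 1080),
        ("IBM T221 (2001)", 3840, 2400),
        ("Eyefinity 3x2 wall", 5760, 2400),
    ]
    base = 1920 * 1080
    for name, w, h in modes:
        print("%s: %.1f MP (%.1fx 1080p)" % (name, w * h / 1e6, w * h / float(base)))

The T221 pushed about 4.4x the pixels of a 1080p panel, eleven years before this thread; the Eyefinity wall is about 6.7x.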


A card with six outputs costs a few bucks. The Intel HD 4000, I believe, supports two 2560x1600 outputs.

Note that you're also comparing components with vastly differing prices. Maybe the market for $2000 monitors just isn't big enough.

EDIT: Eizo offers 30" displays with 4096x2560 resolution. However, they are not exactly cheap: about $20-30K. Meanwhile a 27" 2560x1440 display goes for $600, or even $300 as an import from Korea.


There were two components to my comment. The first was that there were high-DPI desktop screens back in 2001 which could be driven by 2D accelerators released in 1998, so a modern IGP would have no trouble driving them, and "cheaper graphics units" wouldn't be overwhelmed by "anything over 2560x1600".

And the second part was that modern mid- and high-end dedicated (but still consumer) GPUs can drive much, much higher resolutions than that without breaking a sweat.


Nobody would accept 2D-only these days. And in a world where people buy $500 PCs, you can't expect a $250 GPU to be the norm.


I have to confess I am flabbergasted by your ability to misunderstand (willfully?) what I think to be plain and clear English.

1. I never said anything about people having to accept 2D-only (and the G200 was not 2D-only, by the way). I pointed out that GPUs nearing 15 years old were already able to drive resolutions you consider "overwhelming", and that a modern IGP would thus have more than enough power to handle them.

2. The dedicated-GPU note was to point out just how far beyond those "overwhelming" resolutions a "modern" dedicated GPU goes (Evergreen, the first Eyefinity release, is 3 years old).


I understand full well that it is not _difficult_ to drive a lot of pixels if you want to. Many modern cheap cards and IGPs however cannot, for whatever reason: they physically have no support for dual-link DVI. Maybe they saved 50 cents on a connector that way, or whatever.

Note that I never said anything about overwhelming a dedicated GPU. That point was about cheap IGPs, specifically from Intel. (EDIT: OK, I didn't explicitly say it; my fault.)

Ultimately I guess it boils down to priorities. PCs have gotten a lot cheaper, and some of that cheapness has come from making things worse.

Personally I would like high-DPI as much as anybody here, although for now I think I'd first want a lot of screen real estate (13" is a bit limiting at times); but I probably use my computer for reading a lot more than other people do.


> Many modern cheap cards and IGPs however cannot

A $50 AMD card will get you at least three 1920x1200 outputs. I am not sure whether the HDMI port on them can be driven any higher; it possibly can, though.

I am not sure what the max resolution on AMD's Trinity line is over DisplayPort or HDMI. Again, it may very well be whatever DisplayPort or HDMI supports, which is pretty damn high.

Edit:

Turns out Intel also supports 2560x1600 over DisplayPort.

http://software.intel.com/en-us/articles/quick-reference-gui...

Scroll down to "Display and Audio Features Comparison"


> I am not sure whether the HDMI port on them can be driven any higher; it possibly can, though.

HDMI itself can (from 1.3 onwards), but the card may not allow that for whatever reason.
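Roughly: HDMI 1.3 raised the TMDS clock cap from 165 MHz to 340 MHz, so the spec itself has the headroom even when a given card's port doesn't expose it. A quick sketch with the same approximate reduced-blanking math as before:

    # HDMI 1.3 raised the TMDS clock cap from 165 MHz to 340 MHz;
    # blanking totals below are rough reduced-blanking values.
    def pixel_clock_mhz(h_active, v_active, refresh_hz, h_blank=160, v_blank=35):
        return (h_active + h_blank) * (v_active + v_blank) * refresh_hz / 1e6

    clk = pixel_clock_mhz(2560, 1600, 60)  # ~267 MHz
    print("2560x1600@60 ~%.0f MHz: pre-1.3 %s, 1.3+ %s"
          % (clk, "ok" if clk <= 165 else "no", "ok" if clk <= 340 else "no"))

2560x1600@60 needs roughly 267 MHz, which pre-1.3 HDMI can't carry but 1.3 onwards can.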

> I am not sure what the max resolution on AMD's Trinity line is over DisplayPort or HDMI

I can't find anything on AMD's website, but the Asus F2A85-M Pro[0] is specced at:

- 1920 x 1080 over HDMI

- 1920 x 1600 over RGB

- 2560 x 1600 over DVI

- 4096 x 2160 over DisplayPort

[0] http://www.asus.com/Motherboards/AMD_Socket_FM2/F2A85M_PRO/#...
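That 4096x2160 DisplayPort figure is plausible, at least at reduced refresh rates on older DP revisions. A rough check, where the only assumptions are the standard lane rates, the 8b/10b coding overhead, and the same approximate blanking as above:

    # DisplayPort payload: 4 lanes with 8b/10b coding, so the effective
    # rate is 80% of the raw lane rate. Lane rates: HBR 2.7 Gbit/s
    # (DP 1.1), HBR2 5.4 Gbit/s (DP 1.2).
    def pixel_clock_mhz(h_active, v_active, refresh_hz, h_blank=160, v_blank=35):
        return (h_active + h_blank) * (v_active + v_blank) * refresh_hz / 1e6

    def needed_gbps(w, h, hz, bpp=24):
        return pixel_clock_mhz(w, h, hz) * bpp / 1000.0

    dp11 = 4 * 2.7 * 0.8  # 8.64 Gbit/s effective
    dp12 = 4 * 5.4 * 0.8  # 17.28 Gbit/s effective
    for hz in (30, 60):
        need = needed_gbps(4096, 2160, hz)
        print("4096x2160@%d: ~%.1f Gbit/s (DP1.1 %s, DP1.2 %s)"
              % (hz, need, "ok" if need <= dp11 else "no",
                 "ok" if need <= dp12 else "no"))

So 4096x2160 fits DP 1.1 only at around 30 Hz; 60 Hz needs DP 1.2 bandwidth.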


I've never seen the point of early adoption of HDMI. The first and last HDMI cable I bought constantly caused flickering and weird artifacts. I only used it for a while out of aesthetics and laziness: no extra audio cables needed. A really old VGA cable provided superior video quality on the same setup. And HD+ resolutions, no problemo.


I'd suggest that the one and only HDMI cable you bought was faulty.


I think it's mostly because the interfacing controller thingie is so ridiculously cheap, "thanks to" the "flat screen revolution". The same controllers can be used everywhere, costing next to nothing.



