This manifests primarily as a feeling of poor controls, or of the game not doing what you tried to do.
Good HDTVs will give you 100ms of lag or so; bad ones can be in excess of 400ms. With most games nowadays running at 30fps, that's between 3 and 13 game frames of lag.
The difference this makes is pretty absurd, and as a game developer I have to fight with it constantly. It is infuriating.
It's funny because a very old Westinghouse 37" LCD TV had between zero and one frame's worth of lag.
Also of note, the Onkyo receiver I measured added two more frames of lag. So you should always play direct.
Also, I know people will say "Game Mode!" but obviously, as a dev you can't just assume your end users will do that.
(I'm not a gamer)
That works to varying degrees of success on a per-model basis. Just check out forums discussing which television to buy when you want to play music/rhythm games (like Guitar Hero) to read a lot more details about specific displays.
$ sleep .1
$ sleep .4
One common offender is the "Sharpness" setting. On many TVs, setting it to anything besides zero makes the TV apply a sharpening filter to the image, generating annoying halos around edges.
Another common offender is the set of "color" (especially "tone" and "contrast") settings. These were meaningful for CRT TVs, or when decoding broadcast video, but applying them to digital input is simply ridiculous -- especially when there's no obvious neutral setting.
So if you have red/blue text (or anything with red/blue detail) looking awful, as if it were lower resolution than it actually is, you have the TV to thank for that.
EDIT: to clarify, this doesn't apply if the text is a part of a movie (then you blame the codec for that, not the TV)
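For the curious, a rough sketch (plain Python, with coefficients that are only approximations of BT.601) of why chroma subsampling hits red/blue detail: the color channels are stored at reduced resolution, so a sharp colored edge gets smeared when the chroma is upsampled again, even though the brightness stays sharp.

    # Illustrative 1D chroma subsampling on a sharp red-on-black edge.
    # This halves chroma horizontally (4:2:2-style); 4:2:0 also halves it vertically.
    def rgb_to_ycbcr(r, g, b):
        y  =  0.299 * r + 0.587 * g + 0.114 * b
        cb = -0.169 * r - 0.331 * g + 0.500 * b + 128
        cr =  0.500 * r - 0.419 * g - 0.081 * b + 128
        return y, cb, cr

    def ycbcr_to_rgb(y, cb, cr):
        r = y + 1.402 * (cr - 128)
        g = y - 0.344 * (cb - 128) - 0.714 * (cr - 128)
        b = y + 1.772 * (cb - 128)
        return tuple(max(0, min(255, round(v))) for v in (r, g, b))

    # One scanline with a sharp red-on-black edge in the middle.
    row = [(0, 0, 0)] * 3 + [(255, 0, 0)] * 3
    ycc = [rgb_to_ycbcr(*px) for px in row]

    # Keep luma per pixel, but average chroma over each pair of pixels.
    rebuilt = []
    for i in range(0, len(ycc), 2):
        cb = (ycc[i][1] + ycc[i + 1][1]) / 2
        cr = (ycc[i][2] + ycc[i + 1][2]) / 2
        rebuilt += [ycbcr_to_rgb(ycc[i][0], cb, cr), ycbcr_to_rgb(ycc[i + 1][0], cb, cr)]

    print(row)
    print(rebuilt)
    # The pair straddling the edge comes back as dark red bleeding into the black
    # side and a washed-out red on the bright side: the red detail is now
    # effectively half resolution even though the luma is untouched.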
Hell, some of those HD broadcasts are 4:2:0 in order to reduce bandwidth.
I meant mainly for consoles, computers, and subtitles (though for subtitles it also depends on your DVD player) so I amended my post.
The HD broadcasts are 4:2:2, more often than not. There are a few stations that have their video servers down at 4:2:0, but many of those are scheduled to upgrade over the next 2 years.
Edit: The servers are 422, but the broadcasts are 420, as brigade points out. (The edit tag on my original comment has already expired. Grrr.)
I'm admittedly not familiar with other countries, but I would be incredibly surprised if any of them allowed 4:2:2.
I'd assumed that the video settings of the server were what they were delivering, but there must be a down-sample on the way to the transmitter. (CableLabs is the other large distribution format that we encounter, which yields the same story.)
I'm talking about the time it takes for the TV to tell the pixels to change.
No, as disturbing as it sounds, real TVs really do have hundreds of milliseconds of lag. Games that care deeply about precision timing, notably rhythm games like Guitar Hero, actually have HDTV lag settings to change the offsets between audio, video, and input, so that you can actually strum based on what you see and hear and not miss every note.
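A minimal sketch of what that kind of calibration setting does (hypothetical names and numbers, not any shipping game's code): the scoring logic shifts the note timing by the measured display latency before judging the strum, so the player is graded against what they actually saw.

    # Hypothetical rhythm-game hit judgment with a display-lag offset.
    # Names and numbers are illustrative only; an audio offset would be
    # applied the same way against what the player hears.
    VIDEO_LAG_MS = 120   # user-calibrated display latency
    HIT_WINDOW_MS = 70   # how far off a strum can be and still count

    def judge_strum(strum_time_ms, note_time_ms, video_lag_ms=VIDEO_LAG_MS):
        # The player reacts to what they see, which trails the engine's clock
        # by video_lag_ms, so judge against the note time as perceived.
        perceived_note_time = note_time_ms + video_lag_ms
        return abs(strum_time_ms - perceived_note_time) <= HIT_WINDOW_MS

    # A note scheduled at t=1000ms isn't visible until roughly t=1120ms.
    print(judge_strum(1110, 1000))                   # True: on time for what was seen
    print(judge_strum(1110, 1000, video_lag_ms=0))   # False: "110ms late" without calibration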
Looking at that site, they tested a relatively small number of TVs (only 14), and it sounds like they enabled "game mode" on any of the TVs that had it.
I'll certainly acknowledge that 400ms represents the most extreme case, but 100-200ms or more happens pretty regularly, especially when not using "game mode" or equivalent.
30 fps is just 33ms per frame. At least one frame of delay is a necessary baseline to create, transfer, and render the image. After that, you're adding delays: 4 frames of processing is 133ms, 6 frames is 200ms. This is in addition to the console's frame rendering time, the HDMI transfer from console to the TV's frame buffer, and the TV's display, assuming a direct connection.
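To put the arithmetic in one place (nothing here beyond the numbers already quoted):

    # Display lag expressed in 30fps game frames.
    FRAME_MS = 1000 / 30   # ~33.3ms per frame

    for lag_ms in (100, 133, 200, 400):
        print(f"{lag_ms:>3}ms ≈ {lag_ms / FRAME_MS:.1f} frames at 30fps")
    # 100ms ≈ 3 frames, 200ms ≈ 6 frames, 400ms ≈ 12 frames -- roughly the
    # "3 to 13 game frames" range mentioned earlier in the thread.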
Now, the more interesting thing about all this for me isn't that most TVs and most new home video receivers lag; it's the lag in your own perception and how your mind compensates. As an avid gamer, you learn to compensate for the equipment's lag, and certain studies show a kind of precognition with which top gamers seem to anticipate moves before they should occur. (The reaction happens after the event in the CPU, but before your brain and muscles should have been able to respond to it, given normal lag time to perceive and react.) Your own perception of whether lag keeps you from playing a game, especially if you are a gamer, is a bad source of data on this one.
Fact is, home receivers have frame buffers, and home TVs have frame buffers. Most are two or three frames; many are as bad as four. This lets moving video, especially blocky video or video with mosquito noise, be cleaned up much more. These receivers and sets look the best, which is why this lag is still increasing. To an extent, the more frames buffered, the more temporal data there is to use to clean the frames.
Finally, as a gamer, forget FPS games; they're easy to anticipate. You lead your shots naturally and subconsciously. Instead, try something like Geometry Wars Evolved. With the frame lag, you will occasionally be dead in the console before you've seen it on your screen. All my high scores were set over component video to a Sony Trinitron. I don't come close on a modern set through a receiver. This is well known to the gamers in the GW:E forums.
1. Just one example of improved "anticipation" (as opposed to "reaction"), as I don't have time to look up the various citations... "A different study by Kuhlman and Beitel (1991) measured the anticipation of seven through nine year old children who were categorized as non-experienced, moderately experienced, or highly experienced video game players. The researchers found that children with extensive experience playing video games can more accurately and consistently anticipate the arrival of a stimulus." – http://clearinghouse.missouriwestern.edu/manuscripts/847.php
There's a long laundry list of things that the vast majority of gamers won't notice directly that can affect gameplay. They have neither the vocabulary nor the technical expertise to identify or explain the things they experience.
I do agree with you though, 400ms is unsuitable for realtime games. But that doesn't stop people from ending up in that situation. However, one mitigating factor is that if you are buying those super cheap TVs, it is likely that you don't own a 360 or PS3 either.
Then again I see a surprising amount of people watching stuff with the aspect ratio set wrong. And a surprising amount of broadcasters broadcasting things in the wrong aspect ratio, so maybe people would just be oblivious...
Also, doing that research is nearly impossible and will in general result in a TV that's inferior for the majority of what you'd do with it.
Also, I disable the vast majority of these post-processing features the TV provides. I'm a purist and a photographer by hobby, and I can see all the 'fixing' that is done in in-TV post-processing, and it looks bad, less sharp, etc., to me. That may affect my experience... Or perhaps I just buy nice TVs (for the same reasons).
A great litmus test is Assassin's Creed 2. I worked on that, and for all of development I had a CRT hooked to my dev kit. The game was always super responsive and I never missed a jump. And then, playing at home on my LCD TV... it was kind of insane the difference. It suddenly felt kind of mushy, and kind of like I was fighting the controls.
The reality is that we as developers do as much as possible to hide it, but it's still there if you know what to look for.
In fact, Assassin's Creed 2 has 100ms of built-in lag due to renderer multithreading. That means it takes a minimum of 100ms for the game engine to render the first frame of a reaction to your input press.
It sounds crazy when it's all listed out, but I assure you it is not only real but something you deal with every time you play a game, even though you don't notice.
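A minimal sketch (not AC2's actual engine, just a toy model with assumed stage counts) of where that built-in latency comes from: when simulation, command-buffer building, and GPU rendering run as overlapping frame-long stages, an input sampled now can't reach the screen for several frame periods, and the TV's own buffering stacks on top.

    # Toy model of input-to-photon latency with a pipelined renderer.
    # Purely illustrative; stage counts and timings are assumptions.
    FRAME_MS = 1000 / 30

    def input_to_photon(engine_pipeline_stages, tv_buffered_frames):
        # Each pipeline stage (simulate, build command buffers, GPU render, ...)
        # is a full frame long, and the TV adds its own buffered frames on top.
        frames = engine_pipeline_stages + tv_buffered_frames
        return frames, round(frames * FRAME_MS)

    print(input_to_photon(3, 0))   # (3, 100): ~100ms before the TV even gets the frame
    print(input_to_photon(3, 3))   # (6, 200): add a TV that buffers 3 frames of processing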
I expect Apple will solve this problem in their usual way: pick slightly less stupid settings and lock those in. In this case, the difference will be stunning and people will marvel at how those Apple TVs can look so good.
Hrrm ... that was RMS's point, way back when, with some printer, if memory serves ...
As someone who's dealt with the inimitable joy of trying to get an MBP to work with an HDTV, Apple's "usual way" appears to be to code to the standards and to hell with anyone who breaks them (or the users stuck with non-compliant products). This appears to apply to wifi as well. There's been an open bug in OSX for years involving OSX assigning its own DHCP lease when it fails to negotiate one with the router. This results in the dreaded "Self-assigned IP" message, which is nigh-on impossible to rid yourself of short of voodoo dolls and wifi dances.
Getting a bit off-topic here, but that's not a bug: it's assigning itself a valid zeroconf* address because the DHCP server is not responding.
I think bug in this case just means that the software does not work as intended even though it follows the spec.
Also, in my experience Apple's DHCP agent will re-request an IP address after having assigned itself a self-assigned IP address. This generally takes about a minute or so, which gives the DHCP server another chance to reply. I've never had issues with this at all.
Zeroconf is a historical mistake. It shouldn't be there any more; it does far more harm than good.
But getting back to the original complaint, if not for zeroconf you'd have no IP address at all; I don't see how that's any better.
I'll admit to having only a rudimentary knowledge of how wireless networks work, but it's better than 99% of users and I find OSX frustrating (in this regard). At the end of the day, the user doesn't care whether the router manufacturer isn't following the spec, or whether Apple's implementation is buggy. The simple fact is that it works on Windows but not on OSX, and that's a failure on Apple's part.
What Windows does do wrong is that it then also drops the self-assigned IP address, which may already be in use for communication; this can cause issues with other hosts that are communicating with it over zeroconf.
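For anyone who wants to check for that state from a script, the self-assigned addresses in question are the 169.254.0.0/16 link-local range; a quick sketch (how you enumerate interfaces is left out):

    # Is this IPv4 address a self-assigned ("zeroconf"/APIPA) link-local address?
    import ipaddress

    def is_self_assigned(addr: str) -> bool:
        return ipaddress.ip_address(addr).is_link_local   # 169.254.0.0/16 (fe80::/10 for IPv6)

    print(is_self_assigned("169.254.12.7"))   # True: DHCP never answered
    print(is_self_assigned("192.168.1.42"))   # False: a normal lease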
You are correct for Ad-Hoc networks.
In RMS's worldview, software freedom is an inherent good, not a derivative good. "Many eyes make all bugs shallow" is nice, but for him it's a side-benefit.
OK. This article is flawed and many of the comments are just as flawed. Having been involved in the design and manufacturing of LCD displays (down to writing all FPGA image processing code, scaling, deinterlacing, etc.) I think I can say that none of this is accurate if the intent is to apply it generally.
Caveat: If you buy a TV don't expect it to be a computer monitor. Most TV designs are just that: TV sets. They are made to do one thing reasonably well: Take a crappy satellite/cable/whatever signal and give you a reasonable image back.
EDID can be programmed with any resolution you want. Do you need 921 x 333 at 12 frames per second? No problem. There is no such thing as a resolution not being available in EDID. Standards are one thing, but the EDID mechanism isn't inherently limited by standards.
BTW, there are commercially available EDID modifier gadgets that allow you to modify the EDID readout from the monitor. So, the monitor says one thing and your computer (or whatever device) receives your programmed values.
If you need a TV that will play nice with a computer you need to find one that was explicitly designed to do so.
Most consumer TVs use one of a very few commercially available processor chips to do their image processing. With a few exceptions they all do the same kinds of things. And no, the signals are rarely converted to YCbCr 4:2:2 internally but for the absolute cheapest and crappiest of processors. All the good ones convert the input to a common internal integer RGB format. The nice ones might standardize at 12 bits per channel (36 bits total) internally. When I did custom FPGA video processing we went as far as 24 bits per channel in order to avoid truncation of calculated values until the very last moment. This can make a huge difference depending on the application.
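A toy illustration of that last point (made-up gains, not any particular chip's pipeline): round back to 8 bits after every stage and errors accumulate; keep a wide intermediate and round once at the end and they don't.

    # Two stages (a gain and its inverse) that should cancel exactly.
    def process_truncating(x, stages):
        for gain in stages:
            x = max(0, min(255, round(x * gain)))   # squeeze to 8 bits after each stage
        return x

    def process_wide(x, stages):
        acc = float(x)                              # wide intermediate (think 12-24 bits/channel)
        for gain in stages:
            acc *= gain
        return max(0, min(255, round(acc)))         # round once, at the very end

    stages = [0.3, 1 / 0.3]                         # mathematically a no-op
    for value in (11, 100, 201):
        print(value, "->", process_truncating(value, stages), "vs", process_wide(value, stages))
    # Truncating after each stage turns 11 into 10 and 201 into 200;
    # the wide pipeline returns every value unchanged.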
In general terms "monitor mode", if you will, should be a mode that bypasses as much of the internal processing as possible. You can force this bypass by using an EDID modifier gadget programmed for the actual resolution and timings of the panel. In other words, open the back of the TV, get the panel model number, get the data-sheet, and program the EDID modifier to output these values to your computer. The processor should push this straight to the panel and you get very little, if any, processing. Again, this does not work on all TVs. As I said before, they are designed to be TVs, not monitors.
That said, I've connected many computers to off-the-shelf, un-modified, consumer TVs via DVI and HDMI. I have yet to run into any real issues.
Indeed, as I said. I then went on to say that lots of 1366x768 devices don't provide a detailed timing block, so they're limited to what you can express in the standard timings - which limits you to horizontal resolutions that are multiples of 8, and so can't express 1366x768.
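To make the constraint concrete, a quick sketch of the standard-timing encoding (the byte formula is EDID's; the helper itself is just illustrative): the horizontal resolution is stored as a single byte equal to X/8 - 31, so only multiples of 8 are representable, and 1366 simply doesn't fit.

    # EDID standard timings store the horizontal resolution as one byte:
    #   byte = X/8 - 31, i.e. X = (byte + 31) * 8, so X must be a multiple of 8
    #   (and fall roughly in the 256..2288 range).
    def standard_timing_width_byte(h_active):
        if h_active % 8:
            lo = h_active // 8 * 8
            raise ValueError(f"{h_active} not representable; nearest are {lo} and {lo + 8}")
        byte = h_active // 8 - 31
        if not 1 <= byte <= 255:
            raise ValueError(f"{h_active} is out of range for a standard timing")
        return byte

    print(standard_timing_width_byte(1360))   # 139
    print(standard_timing_width_byte(1368))   # 140
    print(standard_timing_width_byte(1366))   # ValueError: nearest are 1360 and 1368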
That's the only thing you appear to disagree with about the article, so "This article is flawed" doesn't seem entirely fair. I'll admit to not being a video expert - I write a lot of code that interfaces with firmware at all levels, including displays, and I work closely with people who are video experts, but I've never built a TV. I don't think I've misrepresented any facts or presented any gross inaccuracies, but where I have made mistakes I'd love to be corrected.
> I have yet to run into any real issues.
Possibly because of hacks like the one this guy pointed to in Linux?
Tell that to Sony, Samsung, LG, etc. who all do it by default (outside of game/PC modes) across most (all?) of their lineup nowadays.
Or are you implying that my HX909 uses the absolute cheapest and crappiest of processors?
> In general terms "monitor mode", if you will, should be a mode that bypasses as much of the internal processing as possible
> As I said before, they are designed to be TVs, not monitors.
I think this is 100% the problem. TVs these days essentially are big monitors, and they should be designed as such.
I believe that if a TV is taking content from a digital source (i.e. HDMI, DVI, DisplayPort, etc.), it should perform the bare minimum of image processing possible. Things like scaling and gamma correction are necessary of course, but operations like sharpening filters just destroy images and are unnecessary for digital inputs, and things like motion interpolation (i.e. MotionFlow) are ridiculous (seriously, the only thing it improves is text like rolling credits, but it introduces serious motion artefacts to actual video and does strange things to noise in images, making the image look like what I can only describe as 'slushy')...
This is just my opinion from being both an engineer but also having done a lot of film-making and photography work. It just appals me to see the default settings on these TVs degrading the images so much, and it takes a lot of searching through menus to set it to something sensible, which the vast majority of the population won't do...
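For what it's worth, the halos those sharpening filters produce are easy to reproduce: the classic unsharp-mask trick adds back an amplified difference between each pixel and its blurred neighbourhood, which overshoots on both sides of a clean edge. A rough 1D sketch (arbitrary kernel and gain, not any TV's actual filter):

    # 1D unsharp mask: sharpened = original + amount * (original - blurred).
    def box_blur(signal, radius=1):
        out = []
        for i in range(len(signal)):
            lo, hi = max(0, i - radius), min(len(signal), i + radius + 1)
            out.append(sum(signal[lo:hi]) / (hi - lo))
        return out

    def unsharp_mask(signal, amount=1.0):
        return [round(max(0, min(255, s + amount * (s - b))))
                for s, b in zip(signal, box_blur(signal))]

    edge = [50, 50, 50, 50, 200, 200, 200, 200]   # a clean step edge
    print(unsharp_mask(edge))
    # -> [50, 50, 50, 0, 250, 200, 200, 200]: the pixels flanking the step
    # undershoot and overshoot -- exactly the halos you see around text and
    # other hard edges when "Sharpness" is cranked up.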
Many of the comments below discuss how people experience HDTV "lag"; I was wondering if you could shed any light on what causes this. Initially I was thinking that, if it's a commodity processor, then of course there is the potential for lag, but then you mentioned the FPGAs.
Now I am no fool; I know that it's going to take time to process things even with an FPGA. But you got me thinking: in the HDTV application, surely an FPGA will be coupled with a commercial DSP chip, and heck, for high volume I am sure the manufacturers will go the whole hog and get the FPGA netlist converted into an ASIC or even a full-blown foundry chip.
Wouldn't the FPGA coupled with a DSP kill most of the lag outright?
So where, in your experience, is the lag?
The de-interlacing block is where most of the delay comes from. Interlaced video (standard definition, 1080i, etc.) only transmits half the lines per field. This means that the processor has to synthesize the missing pixels in order to have full frames to display. A really cheap de-interlacer can be done with just a few lines of delay. You wouldn't want this as it would introduce bad imaging artifacts. Pretty much all consumer TVs use a method called "motion adaptive de-interlacing" (MADI). There are many implementations of MADI. In general terms you use data from frames before and after the one you want to process in order to detect pixels that have moved. Those that did not move can simply be replicated or averaged into the missing slots. Anything that moved requires different treatment. The key here is that you need to store a couple of frames worth of video before you can start with MADI. That's your two frames of delay (or more).
There might be other delays introduced by other subsystems such as the receiver.
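To give a feel for the idea, here is a very stripped-down toy for one column of pixels (real MADI is far more elaborate): missing lines are woven from a buffered earlier frame where nothing moved and interpolated from spatial neighbours where motion is detected, and the motion test is only possible because earlier fields are sitting in a buffer, which is exactly where the frames of delay come from.

    # Toy motion-adaptive deinterlacer for a single column of pixels.
    # `None` marks the lines this field didn't transmit. Illustrative only;
    # real MADI uses several fields and much better motion measures.
    MOTION_THRESHOLD = 10

    def deinterlace(current_field, buffered_frame):
        out = []
        for i, value in enumerate(current_field):
            if value is not None:                       # line arrived in this field
                out.append(value)
                continue
            above = current_field[i - 1] if i > 0 else current_field[i + 1]
            below = current_field[i + 1] if i + 1 < len(current_field) else above
            prev_above = buffered_frame[i - 1] if i > 0 else buffered_frame[i + 1]
            if abs(above - prev_above) < MOTION_THRESHOLD:
                out.append(buffered_frame[i])           # static: weave from the buffered frame
            else:
                out.append((above + below) // 2)        # moving: interpolate spatially
        return out

    buffered_frame = [10, 10, 10, 10, 10, 10]           # previously completed frame
    current_field  = [10, None, 10, None, 90, None]     # odd lines missing; motion at line 4
    print(deinterlace(current_field, buffered_frame))   # [10, 10, 10, 10, 90, 90]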
201 DPI for the T220/221.
If you think about it, they're pretty cheap too. Remember paying 2K for a monitor back in the 80s? It's basically the same price once you factor in inflation.
Other than that, I love mine and am considering a second one for dual head using eight DVI channels.
For computers, there are concerns about display connector bandwidth and LCD fabrication yields, but they are surmountable. The real limiting factor has actually been the terrible state of DPI scaling support in desktop operating systems. Higher resolutions make the UI elements too small to see; it's as simple as that.
The future is bright, though. Apple realized that their uniquely tight control over the hardware and software environment on iOS would allow them to easily switch to a higher resolution display and scale the UI without any compatibility problems. Now that they've released the retina display there's pressure on competitors (including themselves with OS X) to follow suit.
I just haven't met professionals who are sweating that particular detail of their workflow for several years. What industries are you seeing this problem in?
It's why I like my 1920x1200 14.1" Dell D830. You can find them for about $300 on Craigslist, and you can fit a module-bay battery plus a 9-cell which, with an SSD and the integrated video, can get 6-9 hours depending on use.
I actually spent a summer working with Edward Tufte, lived in a guest house on his premises. Got the job in part as a physics major who then went to medical school and knew at least enough about brains and eyeballs to be conversant. I think it's safe to say I'm a fanatic about transmitting as much data as possible in the eyespan. That said, what people are doing with the extra data is also non-trivial. We, the societal we, are not spending money in a vacuum: IBM discontinued the T221 monitor fleitz cited above.
My question stands: what industry is suffering mission failures due to a lack of screen resolution?
Every laptop I've had in the last few years has had a DP output. I've read that this is because it's cheaper to license than HDMI. (But DP to HDMI conversion boxes are like $20 now.)
Note also that QXGA is the widest of these, despite the others being "widescreen".
I'd be happy with 8 of these (http://barco.com/en/product/1219/specs) arranged in two 2x2 clusters on the sides of my webcam.
edit: Eizo has another cool one, color and 4096 x 2160: http://www.eizo.com/global/products/radiforce/rx840/index.ht.... I don't even want to know how much they cost.
It's the kind of money you spend to develop today against the kind of computer that will be mainstream a couple years from now. At least, that's one bet. Xerox bet correctly - GUIs, bit-mapped displays and object orientation got hot in the mid 80's but, nevertheless, they didn't collect their prize.
One of the HDMI ports reported this (extracted using SwitchResX):
720 x 400 @ 70Hz
640 x 480 @ 60Hz
800 x 600 @ 60Hz
1024 x 768 @ 60Hz
Standard Timing Identification:
#0: 1280 x 1024 @ 60Hz (8180)
640 x 480 @ 60Hz
My four-year-old Samsung has a "Just Scan" mode that maps 1:1 to the panel's pixels. I've never had a problem with overscan over HDMI, DVI, or VGA.
When I was home over the holidays, I noticed that even my parents' Black Friday mystery-name TV from Menards offered a 1:1 pixel mode. This mode was strict enough to map a 480i picture into a little box in the middle of the screen.
Poor purchasing decision I'll grant (though it is a decent TV), but my point is this area can still be a minefield.
It seems to be a toss-up whether you will hit this or not, all depending on the brand.
Study: 18% of people can't tell if they're watching true HDTV content or not
Still, I've never talked to anyone who has noticed the difference, so I assume it's a rare gripe.
All the cable providers are using the same set of transmission boxes from the same vendors.
Usually, they are employing statistical compression: Take 16 channels and determine which video needs more bandwidth in real time. So if your favorite show is opposite talking heads, then it looks fine. But if you're opposite action-packed hunting in jungle scenes (lots of high-frequency content), you're gonna see it.
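A rough sketch of the idea (made-up numbers, not any vendor's actual algorithm): the fixed pipe gets divided among the channels in proportion to how hard each one currently is to encode, so the quiet channels effectively donate bits to the busy one, right up until two busy ones land next to each other.

    # Toy statistical multiplexer: split a fixed payload among channels in
    # proportion to their current scene complexity. Real systems also enforce
    # per-channel minimum and maximum rates; numbers here are invented.
    def allocate(complexities, total_kbps=38_000):
        total = sum(complexities.values())
        return {name: round(total_kbps * c / total) for name, c in complexities.items()}

    quiet_night = {"talking heads": 1.0, "news ticker": 1.2, "jungle chase": 6.0}
    busy_night  = {"talking heads": 1.0, "sports": 7.0, "jungle chase": 6.0}

    print(allocate(quiet_night))   # the chase soaks up most of the pipe and looks fine
    print(allocate(busy_night))    # now it splits the pipe with sports, and the
                                   # high-frequency scenes start to show artifacts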
I think we're suffering because it's economically more appealing to treat the TV as a sales machine instead of an entertainment machine. No wonder there's no bandwidth left over on satellite or over those fat DOCSIS connections. Carriers are too busy selling us "lose weight now" bullshit over providing the service we're actually buying.
Toss in the "Public Interest" channels, which hold useless junk like religious programming and public access, and we have about 50 channels of non-entertainment nonsense. Everything looks like shit because the bean counters and MBAs think Billy Graham and "Look good in that dress for $19.99" should be in contention with the actual shows and movies I watch.
One of the channels in my area is now encouraging people to pester Dish to pay them per-subscriber money. Meanwhile, Dish has audio dropouts and video artifacts aplenty.
(My impression is that many of the channels you label "Public Interest" are getting a free ride because they don't charge the carrier anything but allow the carrier to increase their advertised channel count.)
Drop cable and lobby to end the Comcast/AT&T/Time Warner oligopoly.
It's hard to compare. Yes, they are higher bandwidth (15.6 Mbps or so OTA), but they are MPEG-2 encoded. Many of the stat-mux cable boxes are MPEG-4 at a lower rate. So which is better? It depends entirely on what is being broadcast (and where in the stat-mux channels you're looking).
If anything, I'll drop cable altogether and go OTA.
(It used to be the case that cheaper HDTVs didn't have "reality enhancement" stuff, but Moore's Law has probably eliminated that.)
I know from experience using an LG LCD TV with Windows 7 and ATI graphics that not all drivers are created equal. I was forced to downgrade my graphics driver when ATI decided to remove overscan settings from more recent releases. That being said, there are drivers out there that allow you to mold the picture to fit your screen.
Your best bet would be to get a high resolution PC monitor and just use that. The monitor will have much better quality for computer use. If you must get a TV I would say the primary concern is what your computer is capable of, more so than the TV.
Luckily I can get 1360x768 through the VGA port, but the TV only accepts HD resolutions over HDMI -- this is becoming more of a problem as many computers now come only with DVI-D or HDMI ports.
My original intention was to connect the computer via a DVI-to-HDMI cable but with only HD resolutions available with no 1-to-1 pixel mapping this is a no-go.
It reminds me of Joel Spolsky's rant about standards: http://www.joelonsoftware.com/items/2008/03/17.html -- the bit about headphone jacks in particular.
Such a general truth; it's why web devs have nightmares about old versions of IE.
My parents' HDTV has 8 input ports (4 are HDMI). All of them are crammed on the side of the TV and it looks like crap mounted on a wall. Not to mention being a pain to add new stuff. Why can't the TV come with a box that lies horizontal in my cabinet with all the in and out ports, and have one umbilical cord hooked into the bottom of the TV? I know you can buy boxes, but it just seems like they should start looking at the implications of flat screens sometime in this century, since they forgot to look in the last.
Considering how many people use receivers, I'm surprised there aren't good HD monitors with one HDMI input and no audio support.
Apparently individual screens are cut from larger sheets of pixels. Using the same vertical resolution for 4:3 screens (1024x768) and 16:9 screens (1366x768) makes it possible to cut them from the same sheet, pushing down the manufacturing costs.
And personally I have an even crazier problem because of that:
When I transfer a signal over HDMI from a 1280x720 notebook to an HD Ready TV, the TV assumes the 1280x720 signal really is HD Ready broadcast content, stretches it by that ~6% difference, overscan-crops it by ~6%, and stretches it again.
As a result I get roughly the ~1210x660 center part of the original 1280x720 signal stretched to 1366x768... Can't find any solution yet.
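Roughly what that does to the picture, in numbers (the overscan percentage below is a guess; sets vary from a couple of percent up to 7% or so):

    # How much of a 1280x720 source survives a 1366x768 stretch plus overscan.
    SOURCE = (1280, 720)
    PANEL = (1366, 768)
    OVERSCAN = 0.05   # assumed fraction cropped off around the edges

    scale_x, scale_y = PANEL[0] / SOURCE[0], PANEL[1] / SOURCE[1]
    print(f"stretch: {scale_x:.3f} x {scale_y:.3f}")        # ~1.067, the "6%" difference

    visible = tuple(round(d * (1 - OVERSCAN)) for d in SOURCE)
    print(f"visible source region: {visible[0]} x {visible[1]}")
    # ~1216 x 684 of the original 1280x720 ends up stretched across 1366x768,
    # in the same ballpark as the ~1210x660 estimate above, with every pixel
    # resampled at least twice along the way.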
But it's only possible with a signal designed for perfect scaling. Computers send crisper signals that are designed for no scaling at all, and the transformations hurt those signals.
Apart from radio. That was genius.
Turns out he is a game designer too, and I read lots of interesting stuff. I love this site; it has great people on it!