
TVs are all awful - sciurus
http://mjg59.dreamwidth.org/8705.html
======
mrcharles
If you are a gamer, TVs are now also awful for playing games, due to the built-in
lag between the TV receiving a signal and displaying it. On a CRT it was
instantaneous, as game console output was sent basically straight to the
electron gun -- nowadays there's a massive amount of processing before the
signal is on screen.

This manifests primarily as a feeling of poor controls, or of the game not doing
what you tried to do.

Good HDTVs will give you 100ms of lag or so; bad ones can be in excess of
400ms. With most games nowadays running at 30fps, that's between 3 and 13 game
frames of lag.

It's pretty absurd the difference this makes, and as a game developer I have
to fight with it constantly and it is _infuriating_.

~~~
kevinalexbrown
So 100 ms is roughly human eye-hand reaction time. Not that humans think in
discrete chunks, but 100-400ms is like 1-4 extra reaction times' worth of delay,
which makes game play quite different.

Also, I know people will say "Game Mode!" but obviously, as a dev you can't
just assume your end users will do that.

~~~
mnutt
It seems like this would also affect audio, depending on how you have it set
up: if it runs through the TV, the TV could delay the audio by 100-400ms to
compensate, but if it goes straight from the computer to the stereo there
would be a noticeable lag in the video. Is there something obvious I'm missing
about why this isn't a problem?

~~~
xsmasher
Many home theater systems have an adjustable audio delay for exactly that
reason.

------
noonespecial
TVs, too, should have _open_ firmware. Not to make RMS happy, but to protect
savvy consumers from monumentally idiotic or short-sighted decisions made
during the design. You can't make it perfect, so leave the door open so your
customers can.

I expect Apple will solve this problem in their usual way; pick slightly less
stupid settings and lock those in. In this case, the difference will be
stunning and people will marvel at how those Apple TVs can look so good.

~~~
redthrowaway
>I expect Apple will solve this problem in their usual way; pick slightly less
stupid settings and lock those in.

As someone who's dealt with the inimitable joy of trying to get an MBP to work
with an HDTV, Apple's "usual way" appears to be to code to the standards and
to hell with anyone who breaks them (or the users stuck with non-compliant
products). This appears to apply to wifi as well. There's been an open bug in
OSX for years involving OSX assigning its own DHCP lease when it fails to
negotiate one with the router. This results in the dreaded "Self-assigned IP"
message, which is nigh-on impossible to rid yourself of short of voodoo dolls
and wifi dances.

~~~
calloc
You can simply ask Mac OS X to request another DHCP address when it comes up
with a self-assigned IP. The self-assigned address is actually in a range
specified by RFC 3927 [1]. This is mainly done so that if the machine joins a
network without a DHCP server it can still communicate with other hosts on
the network, just not the outside.

Also, in my experience Apple's DHCP agent will re-request an IP address after
having assigned itself a link-local address. This generally takes about a
minute or so; in that time the DHCP server can reply once again. I've never
had issues with this at all.

[1] <http://www.ietf.org/rfc/rfc3927.txt>
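
If you want to check whether the address a machine picked up is one of those
RFC 3927 link-local addresses, here's a quick sketch using Python's standard
ipaddress module (the example address itself is made up):

    import ipaddress

    addr = ipaddress.ip_address("169.254.12.34")            # a hypothetical self-assigned address
    print(addr.is_link_local)                                # True: it falls in 169.254.0.0/16
    print(addr in ipaddress.ip_network("169.254.0.0/16"))    # the equivalent explicit check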

~~~
ajross
When it joins a what now? There _are_ no wifi networks without DHCP servers.
At least there are none in the consumer electronics market to which zeroconf
is targeted.

Zeroconf is a historical mistake. It shouldn't be there any more; it does far
more harm than good.

~~~
wmf
I think the use case was ad-hoc wi-fi networks.

But getting back to the original complaint, if not for zeroconf you'd have no
IP address at all; I don't see how that's any better.

~~~
redthrowaway
The problem is that OSX's behaviour (and, notably, not Windows') is to try
once and then fail as a "good enough" solution. It may indeed keep trying, but
it's a problem that Windows never runs into in the first place.

I'll admit to having only a rudimentary knowledge of how wireless networks
work, but it's better than 99% of users and _I_ find OSX frustrating (in this
regard). At the end of the day, the user doesn't care whether the router
manufacturer isn't following the spec, or whether Apple's implementation is
buggy. The simple fact is that it works on Windows but not on OSX, and that's
a failure on Apple's part.

~~~
calloc
This behaviour is the same on Windows. If Windows doesn't receive a DHCP
address it will assign one in the self-assigned range first. After about a
minute or so it too will re-request an IP address from the DHCP server, and if
it receives one it will assign that to the interface.

What Windows does do wrong is that it then also drops the self-assigned IP
address, which may already be in use for communication; this can cause issues
with other hosts that are communicating with it over zeroconf.

------
algoshift
Right. One of the rules is "be nice".

OK. This article is flawed, and many of the comments are just as flawed. Having
been involved in the design and manufacture of LCD displays (down to writing
all the FPGA image-processing code: scaling, deinterlacing, etc.), I think I can
say that none of this is accurate if the intent is to apply it generally.

Caveat: If you buy a TV don't expect it to be a computer monitor. Most TV
designs are just that: TV sets. They are made to do one thing reasonably well:
Take a crappy satellite/cable/whatever signal and give you a reasonable image
back.

EDID can be programmed with any resolution you want. Do you need 921 x 333 at
12 frames per second? No problem. There is no such thing as a resolution not
being available in EDID. Standards are one thing, but the EDID mechanism isn't
inherently limited by standards.
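
As a rough illustration of how compactly EDID packs a mode, here is a sketch of
decoding one of the two-byte "Standard Timing" entries (assuming the EDID 1.3
layout); the "(8180)" value in the dump further down this thread decodes exactly
this way:

    def decode_standard_timing(b1, b2):
        """Decode a two-byte EDID standard timing entry (EDID 1.3 layout)."""
        if (b1, b2) == (0x01, 0x01):
            return None                                   # 0x0101 marks an unused slot
        h_active = (b1 + 31) * 8                          # horizontal pixels
        aspect = {0: (16, 10), 1: (4, 3), 2: (5, 4), 3: (16, 9)}[b2 >> 6]
        v_active = h_active * aspect[1] // aspect[0]      # vertical pixels from the aspect ratio
        refresh = (b2 & 0x3F) + 60                        # vertical refresh in Hz
        return h_active, v_active, refresh

    print(decode_standard_timing(0x81, 0x80))             # (1280, 1024, 60)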

BTW, there are commercially available EDID modifier gadgets that allow you to
modify the EDID readout from the monitor. So, the monitor says one thing and
your computer (or whatever device) receives your programmed values.

If you need a TV that will play nice with a computer you need to find one that
was explicitly designed to do so.

Most consumer TVs use one of a very few commercially available processor chips
to do their image processing. With a few exceptions they all do the same kinds
of things. And no, the signals are rarely converted to YCbCr 4:2:2 internally
except on the absolute cheapest and crappiest of processors. All the good ones
convert the input to a common internal integer RGB format. The nice ones might
standardize at 12 bits per channel (36 bits total) internally. When I did
custom FPGA video processing we went as far as 24 bits per channel in order to
avoid truncation of calculated values until the very last moment. This can
make a huge difference depending on the application.

In general terms "monitor mode", if you will, should be a mode that bypasses
as much of the internal processing as possible. You can force this bypass by
using an EDID modifier gadget programmed for the actual resolution and
timings of the panel. In other words, open the back of the TV, get the panel
model number, get the data-sheet and program the EDID modifier to output these
values to your computer. The processor should push this straight to the panel
and you get very little, if any, processing. Again, this does not work on all
TVs. As I said before, they are designed to be TVs, not monitors.

That said, I've connected many computers to off-the-shelf, un-modified,
consumer TVs via DVI and HDMI. I have yet to run into any real issues.

~~~
bowyakka
Since you say you have been involved in the production of such things I have a
question for you....

Many of the comments below discuss how people experience HDTV "lag", and I was
wondering if you could shed any light on what causes this. Initially I was
thinking that if it's a commodity processor then of course there is the
potential for lag, but then you mentioned the FPGAs.

Now I am no fool, I know that it's going to take time to process things even
with an FPGA, but you got me thinking: in the HDTV application surely an FPGA
will be coupled with a commercial DSP chip; heck, for high volume I am sure the
manufacturers will go the whole hog and get the FPGA netlist converted into an
ASIC or even a full-blown foundry chip.

Wouldn't the FPGA coupled with a DSP kill most of the lag outright?

So where, in your experience, is the lag?

~~~
brigade
Rephrasing beambot, some of the algorithms employed use knowledge of the next
frame (or two or five) to filter the current frame. So for those it doesn't
matter whether you use a FPGA, DSP, ASIC, commodity processor, whatever to do
the filtering - you have to wait until you have the needed future frames to
even start.
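
A toy illustration of why that lookahead costs latency no matter how fast the
silicon is (a hypothetical filter that just averages the current frame with the
next two):

    from collections import deque

    def temporal_filter(frames, lookahead=2):
        """Yield filtered frames, but only once `lookahead` future frames have arrived."""
        window = deque()
        for frame in frames:
            window.append(frame)
            if len(window) > lookahead:
                current, future = window[0], list(window)[1:]
                yield (current + sum(future)) / (1 + len(future))  # toy "filter": an average
                window.popleft()

    # With numeric stand-in "frames": the first output only appears after two extra
    # inputs have arrived -- that delay is the display lag, regardless of compute speed.
    print(list(temporal_filter([1.0, 2.0, 3.0, 4.0, 5.0])))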

~~~
bowyakka
Ah, check, that makes perfect sense; so until I invent the Turing oracle we are
stuck :)

------
digitalsushi
Here's a non-facetious, completely honest question from someone who just
doesn't know why: why is it 2012 and my new TV and monitor each have about the
same horizontal resolution as my CRT monitor from 1998? It's 14 years later,
and I still only have about 2000 pixels to play with. I know the obvious
answer is that everyone is just matching the resolution movies are sold as,
but why can't I get a professional grade monitor with a "retina" quality
display for my desk?

~~~
rbanffy
If you don't need color, a quick googling gives you a monstrously expensive
display with 4096 x 2560 pixels:
[http://accessories.us.dell.com/sna/productdetail.aspx?sku=A5...](http://accessories.us.dell.com/sna/productdetail.aspx?sku=A5623001).
If you need color and are willing to live with just 3280 x 2048, you can spend
a lot less on a
[http://accessories.us.dell.com/sna/productdetail.aspx?sku=A5...](http://accessories.us.dell.com/sna/productdetail.aspx?sku=A5213768).

I'd be happy with 8 of these (<http://barco.com/en/product/1219/specs>)
arranged in two 2x2 clusters on the sides of my webcam.

edit: Eizo has another cool one, color and 4096 x 2160:
[http://www.eizo.com/global/products/radiforce/rx840/index.ht...](http://www.eizo.com/global/products/radiforce/rx840/index.html).
I don't even want to know how much they cost.

~~~
Tossrock
My jaw literally dropped when I saw the price tag on the monochrome Barco. You
could get _several_ new cars for that much.

~~~
rbanffy
But they are insanely cool, aren't they?

It's the kind of money you spend to develop today against the kind of computer
that will be mainstream a couple of years from now. At least, that's one bet.
Xerox bet correctly - GUIs, bit-mapped displays and object orientation got hot
in the mid-80s - but, nevertheless, they didn't collect their prize.

~~~
wmf
But better monitors _won't_ be mainstream a couple years from now; there has
been no improvement in the last five years, so why should the future be any
different?

~~~
rbanffy
There is a persistent rumor that Apple will equip its MacBooks and iPads with
retina-like displays. Once they do, everyone will have to do it.

~~~
Tossrock
I strongly doubt that Apple is going to make 300+ DPI screens 10+ inches
across. I could always be wrong, but it seems like the amount of horsepower it
would take just to draw the screen would destroy battery life. Maybe some
super high end, three prong "Cinema Display", but definitely not a mobile
device.

------
dgallagher
I used an HDTV once that had three different HDMI ports on it (0, 1, 2). Each
port reported a slightly different EDID for the same TV!

One of the HDMI ports reported this (extracted using SwitchResX):

    
    
        Established Timings:
        -------------------- 
        		720 x 400 @ 70Hz
        		640 x 480 @ 60Hz
        		800 x 600 @ 60Hz
        		1024 x 768 @ 60Hz
        
        Standard Timing Identification:
        ------------------------------- 
        	#0:	1280 x 1024 @ 60Hz 	(8180)
    

The two other HDMI ports reported this instead:

    
    
        Established Timings:
        -------------------- 
        		640 x 480 @ 60Hz
        
        Standard Timing Identification:
        ------------------------------- 
        

Almost all other EDID data matched, including additional timing data inside
the EDID extension block, so I'm not certain these differences were that big
of a deal. Nonetheless, it's weird when it all comes from the same TV.

------
baddox
Is his final point (about overscan) still relevant? Years ago I used to hear
HDTV enthusiasts urging everyone to check their TV settings, but in the last
2-3 years, I've only dealt with PCs hooked up to a few HDTVs (all with 1080p
native resolution), and I haven't seen a single one that overscans a 1080p DVI
or HDMI signal by default. The author acts like it's a certainty that your
1080p TV will by default overscan a 1080p signal.

~~~
culturestate
I have a one-year-old Samsung LED-backlit LCD and it overscans from a unibody
Mac Mini via HDMI _into the TV's PC-specific HDMI port._ I have to fix it
manually with the overscan slider in OSX's display preference pane.

~~~
acon
I also have a one-year-old Samsung and I managed to get it to not overscan for
at least one input. Put your Mac Mini's HDMI into the HDMI 1 / DVI input,
select that input in the Source menu on the TV, press Tools on the remote,
and choose PC as the name of the input. This magically turns off overscan. Hope
it works!

~~~
culturestate
Did that already; no joy. It's not a problem because of the overscan slider in
OSX, but it was a huge annoyance before I upgraded the mini (you need the
built-in HDMI to get the overscan adjustment).

~~~
nitrogen
Unfortunately you're most likely losing resolution in this case. If, for
example, your TV is cutting off 3% of the image, then the video card is
scaling your whole desktop down from 1920x1080 to ~1862x1048, which will
result in more than 3% apparent resolution lost due to less-than-optimal
interpolation.
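
Rough arithmetic for that, assuming a symmetric 3% overscan compensation:

    overscan = 0.03                       # TV crops roughly 3% of the image
    native = (1920, 1080)
    scaled = tuple(round(d * (1 - overscan)) for d in native)
    print(scaled)                         # (1862, 1048): rendered smaller, then interpolated back up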

~~~
culturestate
I'm fine with that; the Mini is used as an HTPC, so I really need the overscan
correction only so I can see the menus in VLC / Quicktime.

------
jpdoctor
Relevant: Viewers are awful too.

Study: 18% of people can't tell if they're watching true HDTV content or not

[http://techcrunch.com/2008/11/24/study-18-of-people-cant-tel...](http://techcrunch.com/2008/11/24/study-18-of-people-cant-tell-if-theyre-watching-true-hdtv-content-or-not/)

~~~
sehugg
Ugh, this. Comcast re-compresses some of their video streams so much that it
almost looks like YouTube. The best video quality I've seen on cable has been
HDNet, but many providers have dropped their channels because they can replace
them with three low-quality channels.

Still, I've never talked to anyone who has noticed the difference, so I assume
it's a rare gripe.

~~~
jpdoctor
> _Ugh, this. Comcast re-compresses some of their video streams so much that
> it almost looks like YouTube._

All the cable providers are using the same set of transmission boxes from the
same vendors.

Usually, they are employing statistical compression: Take 16 channels and
determine which video needs more bandwidth in real time. So if your favorite
show is opposite talking heads, then it looks fine. But if you're opposite
action-packed hunting in jungle scenes (lots of high-frequency content),
you're gonna see it.
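
A toy sketch of that idea: split a fixed transponder bitrate across channels in
proportion to how "busy" each one currently is (the complexity numbers below are
made up):

    def stat_mux(total_kbps, complexities):
        """Split a fixed pool of bandwidth across channels by relative complexity."""
        total = sum(complexities)
        return [total_kbps * c / total for c in complexities]

    # 16 channels sharing ~38 Mbps: mostly talking heads, plus two action-heavy channels
    complexities = [1.0] * 14 + [4.0, 6.0]        # hypothetical per-channel motion/detail scores
    rates = stat_mux(38000, complexities)
    print([round(r) for r in rates])
    # The two busy channels get ~6.3 and ~9.5 Mbps; each talking-heads channel gets ~1.6 Mbps.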

~~~
drzaiusapelord
Except 20 or so channels on my dish network setup are nothing but direct sales
garbage.

I think we're suffering because it's economically more appealing to treat the
TV as a sales machine instead of an entertainment machine. No wonder there's
no bandwidth left over on satellite or over those fat DOCSIS connections.
Carriers are too busy selling us "lose weight now" bullshit to provide the
service we're actually buying.

Toss in the "Public Interest" channels, which hold useless junk like religious
programming and public access, and we have about 50 channels of non-entertainment
nonsense. Everything looks like shit because the bean counters and MBAs think
Billy Graham and "Look good in that dress for $19.99" should be in contention
with the actual shows and movies I watch.

~~~
maxerickson
On satellite there are also the dozens or hundreds of local market broadcast
channels. They lobbied hard against a de facto national ABC, CBS, etc.

One of the channels in my area is now encouraging people to pester Dish to pay
them per-subscriber money. Meanwhile, Dish has audio dropouts and video
artifacts aplenty.

(My impression is that many of the channels you label "Public Interest" are
getting a free ride because they don't charge the carrier anything but allow
the carrier to increase their advertised channel count.)

------
jedbrown
I want to put a display on the wall and stand five feet away with a keyboard
and mouse while working (mostly Emacs, web browser, and reading pdfs). What
should I check to determine whether a TV would work well in this configuration
(without overscan issues and the like)? Is the only safe thing to go to a
physical store with the computer, set everything up, and check for artifacts?

~~~
moheeb
This is probably most dependent on the video card and drivers that you have
available. A good video driver should allow you to adjust things such as the
overscan %, centering, etc. to get the best picture possible.

I know from experience using an LG LCD TV with Windows 7 and ATI graphics that
all drivers are not created equal. I was forced to downgrade my graphics
driver when ATI decided to remove overscan settings from more recent releases.
That being said, there are drivers out there that allow you to mold the
picture to fit your screen.

Your best bet would be to get a high resolution PC monitor and just use that.
The monitor will have much better quality for computer use. If you must get a
TV I would say the primary concern is what your computer is capable of, more
so than the TV.

~~~
jedbrown
The reason a TV seemed attractive is that I could get a 46" TV for about
the same price as a 27" monitor, letting me work a step further from
the display with a similar perceived pixel and display size.
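
Back-of-the-envelope for that trade-off, with hypothetical numbers (a 46"
1920x1080 TV at about 5 ft versus a 27" monitor assumed to be 2560x1440 at about
28 in):

    import math

    def pixels_per_degree(diag_in, res_w, res_h, distance_in):
        """Angular pixel density: how many pixels span one degree of the visual field."""
        width_in = diag_in * res_w / math.hypot(res_w, res_h)
        pitch_in = width_in / res_w                          # inches per pixel
        deg_per_px = math.degrees(2 * math.atan(pitch_in / (2 * distance_in)))
        return 1 / deg_per_px

    print(pixels_per_degree(46, 1920, 1080, 60))   # ~50 px/deg for the TV at ~5 ft
    print(pixels_per_degree(27, 2560, 1440, 28))   # ~53 px/deg for the monitor at desk distance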

------
jodrellblank
Why would any technologist, reading the likes of this over and over and over,
have any faith in future brain-computer interfaces, mind-uploading, or similar?

~~~
7952
Just remember to keep your eyes closed when doing a firmware update.

------
wvenable
I have one of those terrible older 1366x768 TVs. This TV accepts input at
1080i and 1080p as well as 720p. What they don't usually tell you is that
some of these TVs will upconvert a 720p signal to 1080 and then downconvert
it back to 1366x768. So you're actually better off with a higher-resolution
signal.

Luckily I can get 1360x768 through the VGA port, but the TV only accepts HD
resolutions over HDMI -- this is becoming more of a problem as many computers
now come only with DVI-D or HDMI ports.

~~~
WiseWeasel
The VGA port is analog, meaning the signal from the computer gets converted to
analog by your video card, then re-sampled back to digital by your TV. This seems
like a lousy compromise compared to a computer monitor connected to that same
computer via DVI or HDMI.

~~~
wvenable
It's only lousy if digital->analog->digital conversion itself is lousy -- and
from everything I've read it's nearly impossible to see any difference.

My original intention was to connect the computer via a DVI-to-HDMI cable, but
with only HD resolutions available and no 1-to-1 pixel mapping, this is a no-go.

------
cs702
This would be incredibly funny if it weren't true :-(

It reminds me of Joel Spolsky's rant about standards:
<http://www.joelonsoftware.com/items/2008/03/17.html> -- the bit about
headphone jacks in particular.

------
jal278
"and so because it's never possible to kill technology that's escaped into the
wild we're stuck with it."

Such a general truth; it's why web devs have nightmares about old versions of
IE.

------
smackfu
My only complaint about my Samsung TV is that the input select menu takes
about 45 seconds to dismiss itself. Most equipment that drives your TV, like a
receiver, expects input switching to be instant and doesn't show a menu at all.
So you press the button to switch inputs and it sticks text on your screen for
almost a minute. Stupid.

~~~
raldi
Mine does that too, but changing the volume dismisses it.

~~~
smackfu
Good to know. The actual remote for the TV has a cancel or return button to
kill it, but the BluRay player remote we just picked up only has 5 TV buttons
(power, source, volume up, volume down, and mute).

------
protomyth
TVs also uniformly have one of the stupidest designs for input ports.

My parents' HDTV has 8 input ports (4 are HDMI). All of them are crammed on
the side of the TV, and it looks like crap mounted on a wall. Not to mention
being a pain to add new stuff. Why can't the TV come with a box that lies
horizontal in my cabinet with all the in and out ports, and have one umbilical
cord hooked into the bottom of the TV? I know you can buy boxes, but it just
seems like they should start looking at the implications of flat screens
sometime in this century, since they forgot to look in the last.

~~~
smackfu
I'm surprised that when you pay more for a TV, you get more ports, even as it
becomes more likely you only need a single HDMI to hook into your receiver.

~~~
wmf
The marginal cost (which is truly marginal) to add more ports can be easily
absorbed by a more expensive TV.

Considering how many people use receivers, I'm surprised there aren't good HD
monitors with one HDMI input and no audio support.

~~~
protomyth
I would actually like to see the breakdown, because I am not sure it is the
majority of people.

~~~
wmf
My intuition agrees with yours that less than half of HDTVs are attached to
receivers. But I suspect it's over 10%, which should be enough to sustain a
few specialized models.

------
Aga
The second comment on the target page has a nice explanation of why this weird
resolution of 1366x768 is so popular.

Apparently individual screens are cut from larger sheets of glass. Using the
same vertical resolution for 4:3 screens (1024x768) and 16:9 screens
(1366x768) makes it possible to cut them from the same sheet, pushing down the
manufacturing costs.

------
Tooluka
The person who decided that HDReady should be 1366x768 was a little bit insane.

Personally, I have an even crazier problem because of that: when I send a
1280x720 signal over HDMI from my notebook to the HDReady TV, the TV treats it
as an HDReady signal, stretches it by that ~6% difference to 1366x768, then
crops ~6% for overscan and stretches it again.

tl;dr: As a result I get roughly the ~1210x660 center part of the original
1280x720 signal stretched to 1366x768... Can't find any solution yet.

------
Derbasti
There are ways to achieve arbitrary upscaling without loss of information
(e.g. FFT-based). It would be interesting to know whether TVs out there use such
methods or whether they scale using some simple interpolation scheme.
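
For what it's worth, the FFT trick amounts to zero-padding the spectrum; a
minimal numpy sketch (this is ideal interpolation for a band-limited image, not
necessarily anything a real TV scaler implements):

    import numpy as np

    def fft_upscale(img, new_h, new_w):
        """Upscale a 2-D grayscale image by zero-padding its Fourier spectrum."""
        h, w = img.shape
        spec = np.fft.fftshift(np.fft.fft2(img))        # centre the low frequencies
        padded = np.zeros((new_h, new_w), dtype=complex)
        top, left = (new_h - h) // 2, (new_w - w) // 2
        padded[top:top + h, left:left + w] = spec       # new high frequencies stay zero
        out = np.fft.ifft2(np.fft.ifftshift(padded)).real
        return out * (new_h * new_w) / (h * w)          # keep overall brightness the same

    upscaled = fft_upscale(np.random.rand(768, 1366), 1080, 1920)
    print(upscaled.shape)                               # (1080, 1920)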

~~~
obtu
TVs have ASICs that do that, see this subthread:
[http://mjg59.dreamwidth.org/8705.html?thread=254465#cmt25446...](http://mjg59.dreamwidth.org/8705.html?thread=254465#cmt254465)

But it's only possible with a signal designed for perfect scaling. Computers
send crisper signals that are designed for no scaling at all, and the
transformations hurt those signals.

------
mcantor
I thought this was going to be an alarmist article about getting rid of your
TV and doing something else with your time.

~~~
obtu
The funny thing is, Matthew would write that article if he thought it would
work.

------
Craiggybear
Television is the worst -- and best -- invention of man.

Apart from radio. That was genius.

------
kinnth
Wow, I just read the etiquette guidelines on Hacker News and then clicked around
to this topic. I then just happened to click on user "mrcharles", as I never knew
people had profiles before.

Turns out he is a game designer too, and I read lots of interesting stuff. I
love this site; it has great people on it!

