
CES 2018: Look to the Processor, Not the Display, for TV Picture Improvements - teklaperry
https://spectrum.ieee.org/view-from-the-valley/consumer-electronics/audiovideo/ces-2018-look-to-the-processor-not-the-display-for-tv-picture-improvements
======
pjc50
Hmm. Reminds me of
[https://superuser.com/questions/1282598/what-sense-does-it-make-for-sharpness-to-be-adjustable-on-a-monitor/](https://superuser.com/questions/1282598/what-sense-does-it-make-for-sharpness-to-be-adjustable-on-a-monitor/)
- what's happening with these "picture improvements" is effectively
post-processing "photoshop". They will generally be displaying TV from digital
sources, so the need for denoising is questionable. Anti-banding might give
you nicer sunsets. But fundamentally you now have a TV that, rather than
displaying the input signal, decides to give you something it considers
better. Results are likely to be variable.
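As a concrete sketch of what that post-processing "photoshop" can look like
(a toy example of mine, not any vendor's actual pipeline), here's an unsharp
mask: it exaggerates edges that were never in the source signal, leaving
halos around hard transitions.

    import numpy as np

    def unsharp_mask(img, amount=1.5):
        # Cheap 3x3 box blur built from shifted copies (edge padding).
        padded = np.pad(img, 1, mode="edge")
        blur = sum(padded[i:i + img.shape[0], j:j + img.shape[1]]
                   for i in range(3) for j in range(3)) / 9.0
        # Sharpened = original + amount * (original - blurred)
        return np.clip(img + amount * (img - blur), 0.0, 1.0)

    # A clean step edge picks up overshoot ("halos") after enhancement:
    # a darker band appears on one side and a brighter band on the other.
    step = np.tile(np.array([0.2] * 4 + [0.8] * 4), (8, 1))
    print(unsharp_mask(step).round(2))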

Also:

"Samsung, though it didn’t tout a new processor, promised that all its
gadgets, big and small, will be a lot smarter this year—and will, by 2020, be
using AI and talking to the cloud."

Fully compliant with all major buzzwords. Also employing a black box user
surveillance engine.

~~~
lamlam
I was recently shocked to learn that you cannot buy "dumb" TVs from any of the
major brands any longer.

The only sources I was able to find were random brands from China and
commercial displays from companies like NEC. Unfortunately, commercial
displays are optimized for longevity, not picture quality :/

If anyone knows where I can get a decent dumb TV please share.

~~~
kitsunesoba
It’s been my belief for a while now that there’s a market for premium dumb
TVs. They would:

- Use high-end, best-binned panels (lowest possible image retention for OLED,
etc.)

- Hold traditional LCD models to very high standards for backlight
consistency

- Do little to no image processing

- Be designed around being hooked up to external media boxes

- Have a nice, modern OSD

- Allow you to use your phone as a remote via Bluetooth (and control other
media center components via IR cables and HDMI-CEC commands)

- Not have any means of internet connection or any other smart functionality
whatsoever

Effectively, they’d be to TVs as studio monitor headphones are to audio.
Enthusiasts would jump all over something like this despite inflated prices, I
have no doubt.

~~~
clairity
the reason the premium consumer market doesn't sell dumb TVs is that it would
wring all of the profits out of selling TVs. consumer electronics companies
want to be seen as "adding value" on top of the display. if they don't, then
panel manufacturers can just go direct to consumer, cutting them out of the value
chain and redirecting profits to the manufacturers (and some to consumers in
the form of lower prices).

if they priced dumb TVs higher than their smart TVs, it would signal the
negative value of the "smarts" in the TV, so they don't want to do that.

so none of the electronics manufacturers really want to sell dumb TVs to the
mass market, since it would be akin to committing corporate suicide (in that
particular product category).

of course one way around this is to brand a technical feature (as with sony's
trinitron back in the day) and make that carry the value premium in the dumb
TV, which would be how you'd approach the enthusiast segment you mention.

------
jerf
"In the shorter term, the technology would enable scenarios like someone
asking the TV what is in the refrigerator, then having it display a recipe
then send that recipe to the stove."

From now on, when something like this is the first idea floated as the use
for some IoT tech, I will interpret it as "we have no f'ing clue what we're
going to use this for".

Thirty years ago, this problem was perhaps unsolved. Now it's solved in at
least two ways, with tablets and voice interactions, and those are better.

Sorry, Sony, Samsung, etc., there is no way my TV, which is in the ENTIRELY
WRONG ROOM, is going to wedge its way into my recipe workflow, and if that is
the best idea you have, as evidenced by it being the first idea mentioned in
this article... _you have no ideas_. Even in the example I quote above, WTF
is the TV doing in that interaction flow in the first place? What should be
"asking" what is in the fridge is my cell phone, and by extension, since they
share OSes, optionally a tablet.

(The core problem here is that Smart TVs are now mature. Ignoring surveillance
and privacy issues, what we need are just more responsive UIs, and preferably,
some sort of plan for how we're going to maintain these TVs in the future
because especially at the high end these things are too expensive to be
treated like cell phones that don't get updates after a year and a half. But
that sure doesn't win any CES awards.)

~~~
dfee
I just want the display, not the device. More and more it’s like Goodyear
wants to sell me a car with the tires.

~~~
Zenst
Precisely that. It'd be great if they were more modular to a standard, but
then, that's what PCs are, just not at the Lego level for the common consumer.

Then consumers wonder why, when software support ends, their working TV
becomes less functional after a few years.

So it is like Goodyear selling you a car with tyres that they stop supporting
way before the tyres are even close to the end of their working life.

~~~
arthurfm
> be great if they were more modular to a standard

There is, actually: the Open Pluggable Specification [1], which is primarily
used for digital signage screens. It's a shame consumer TVs don't use this
standard (or something similar).

[1]
[https://en.wikipedia.org/wiki/Open_Pluggable_Specification](https://en.wikipedia.org/wiki/Open_Pluggable_Specification)

------
cooper12
Why on earth does a TV need denoising built in? I've noticed this trend, on
my rare visits to electronics stores, where all the display models have huge
vibrant screens, but whatever they show on them is always oversaturated,
sharpened, and generally overprocessed. On top of that, when I visit some
people's houses, they have everything set up completely wrong: stretching the
picture to avoid letterboxing, upscaling from a lower-resolution signal
(acceptable depending on your distance, but better to have an HD signal for
an _HD_ TV), motion interpolation, and so much more. Whatever happened to a
television naively playing what it was given? Why does everyone want a neon,
prickly picture and assume they know better than the people who mastered the
content? I know some will say "personal preference", but there's no
accounting for taste, and settings like these can ruin something carefully
crafted. Lastly, to get back to denoising: noise is actually detail, so stop
removing it (you'll almost never see it in modern material anyway, mostly
just in old films, where it's part of the actual negative). This trend
extends to photography as well, with editors not knowing when to ease off the
saturation (the greens would hurt your eyes IRL) and contrast sliders, which
might explain the public's expectations.
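To make the "noise is detail" point concrete, here's a toy sketch (mine, not
any TV's actual algorithm) of a 3x3 median filter, a classic denoiser,
flattening a single pixel of real detail right along with the grain:

    import numpy as np

    def median_denoise(img):
        # Classic 3x3 median filter: replace each pixel with the median
        # of its neighborhood (edge padding).
        padded = np.pad(img, 1, mode="edge")
        stack = np.stack([padded[i:i + img.shape[0], j:j + img.shape[1]]
                          for i in range(3) for j in range(3)])
        return np.median(stack, axis=0)

    rng = np.random.default_rng(0)
    frame = 0.05 * rng.standard_normal((6, 6))  # film-grain-like noise
    frame[3, 3] = 1.0                           # one pixel of real detail
    print(median_denoise(frame)[3, 3])          # the detail is gone too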

~~~
mstade
Yeah, it's unfortunate. I guess it started with TV makers implementing
processing to fix problems with analog broadcasts before digital came around,
and then it just got progressively worse over time as makers tried to outdo
each other with features that "improve" picture quality.

What's worse, at this point people are so accustomed to the soap opera effect
of these features that when they're turned off they think something is wrong.
Try it at a friend's or relative's house: change their settings to turn off
motion interpolation, noise removal, and the like, and I'm sure most will
start thinking something's wrong with their set. It's sad.
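For anyone curious what motion interpolation is doing underneath, here's a
deliberately naive sketch (real sets use motion-compensated estimation; this
is just frame blending) of how a synthetic in-between frame gets manufactured:

    import numpy as np

    def interpolate(frame_a, frame_b, t=0.5):
        # Linear blend: t=0.5 manufactures the "half" frame that turns
        # 24fps film cadence into smoother, soap-opera-ish motion.
        return (1.0 - t) * frame_a + t * frame_b

    a = np.zeros((4, 4)); a[1, 1] = 1.0  # object in frame N
    b = np.zeros((4, 4)); b[1, 2] = 1.0  # object has moved in frame N+1
    print(interpolate(a, b))             # ghosted double image in between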

------
dod9er
It's always about the buzzwords... A few years ago I tried to buy a TV
without 3D support because I didn't want to spend the money on a feature that
was useless (for me). Guess what: no major brand offered a TV without 3D, so
I ended up buying the usual suspect. But now the situation has improved! It
has become easy again to buy a TV without 3D support, at least. :D Same for
the current must-have features...

I think the common nerd would love a _dumb_ display with decent calibration
etc. out of the box, so he could just connect his favorite media box and have
fun, for a reasonable price. But this isn't where the profits are made!

edit: Because you could just replace your media box after a few years, when
e.g. Netflix stops playing on it. I'm really waiting for a new player who
gives us this _dumb_ high-quality display so many nerds are waiting for.

~~~
dsr_
If you block your TV's MAC addresses at your firewall, all the intrusive
"smart" features stop working.

Or have DHCPd map them to static IPs and block those.

Either way, most people on HN can do this... and most TV purchasers aren't
even aware that it's an option.
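A minimal sketch of the second approach, assuming a dnsmasq-based router (the
MAC and IP below are made up; substitute your TV's):

    # Python here just prints the two relevant config/CLI lines.
    TV_MAC = "aa:bb:cc:dd:ee:ff"  # shown on the TV's network-status screen
    TV_IP = "192.168.1.50"

    # dnsmasq reservation (goes in /etc/dnsmasq.conf):
    print(f"dhcp-host={TV_MAC},{TV_IP}")

    # Firewall rule so the router never forwards the TV's traffic upstream:
    print(f"iptables -A FORWARD -s {TV_IP} -j DROP")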

~~~
Simon_says
I'm curious why you would need to block MAC addresses. Why wouldn't just not
plugging in ethernet and not giving it the WiFi password be sufficient to stop
your TV from going online?

~~~
osaariki
Just a guess: having the TV think it has an internet connection might prevent
annoying prompts to connect it to the internet.

------
indescions_2018
Games designed to be played in 4K 60FPS HDR can feel pretty "future-proof".
Playing Horizon Zero Dawn, an open-world RPG with mech dinos, on the PS4 is
stunning, even if rather buggy. You are reminded that it has been some time
since you were last awed by the quality of snowflake particle effects.

I realize a $2K+ setup is beyond the reach of all but hard-core enthusiasts,
especially as gigabit connection speeds are all but required. But 4K video
will probably remain the standard for the next 4-5 years.

~~~
mstade
Man, HZD looks insanely great at 4K HDR on a 65" OLED; by far the prettiest
game I've ever played. It doesn't run at 60fps though; as far as I know it's
capped at 30fps. Presumably they can't guarantee stable frame rates higher
than that, which probably makes sense given the insanely great graphics. At
30fps though it's super stable; I've never seen it stutter, and I've finished
both the main game and the DLC. Great game, can't wait for the sequel!

------
seanalltogether
> The company gave a don’t-blink-or-you-might-miss-it peek at what it says
> will be the world’s first commercial MicroLED display, available sometime
> this year. MicroLED technology operates like the Jumbo Tron in a football
> stadium, with a dedicated LED for each colored subpixel—shrunk down to the
> size of a standard TV, that is, with subpixels so small your eye can’t
> distinguish them. It doesn’t require filters or backlights, so color and
> brightness can be exceptional.

Isn't that what OLED is?

~~~
papercrane
The big difference between them is chemistry. MicroLEDs use inorganic
semiconductors, which are supposed to give them better lifetime performance.

~~~
Doxin
Could you explain to me what, in the context of display technology, "organic"
even means? Presumably OLED displays don't contain organisms.

~~~
detaro
Same as in "organic chemistry":
[https://en.wikipedia.org/wiki/Organic_chemistry](https://en.wikipedia.org/wiki/Organic_chemistry)

Roughly: primarily carbon- or carbon-hydrogen-based, often complex,
molecules. If not naturally occurring in organisms, then made from the same
"toolbox".

(The liquid crystals in LCDs are generally organic compounds as well, but
traditional LEDs are not, hence the "organic" distinction for OLEDs.)

------
wz1000
Will these household appliances with general-purpose CPUs and WiFi chips
receive the necessary updates for their entire lifetimes? I have a CRT TV
that is over 17 years old and still works. Will manufacturers provide even 5
years of security updates? Because if not, the next big vulnerability could
mean we're all screwed.

~~~
sameAsYou
My two-year-old LG TV isn't receiving webOS updates anymore, and some apps in
their store aren't available for older OS versions. I don't know how long I
can keep buying 'other devices' like a Chromecast or Roku to get around the
problem.

------
legulere
Isn't one of the problems of all those picture improvement technologies that
they increase input lag?

~~~
dmart
Yes, making these "improvements" completely useless to anyone who occasionally
plays videogames (and therefore most likely leaves their TV in game mode).

I'd honestly pay extra for _less_ processing: give me a 65" dumb monitor with
as little latency as possible, please.
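Back-of-the-envelope numbers on why that matters for games (illustrative, not
measurements of any particular set): every frame the processor buffers for
its enhancements costs one frame-time of lag.

    refresh_hz = 60
    frame_ms = 1000 / refresh_hz  # ~16.7 ms per buffered frame at 60 Hz
    for frames in (1, 2, 4):
        print(f"{frames} buffered frame(s) -> +{frames * frame_ms:.1f} ms of lag")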

~~~
modeless
NVIDIA has a product for you:
[https://www.geforce.com/whats-new/articles/bfgd-big-format-gaming-display](https://www.geforce.com/whats-new/articles/bfgd-big-format-gaming-display)

------
amelius
Sadly, from a usefulness perspective, it seems these improvements are
incremental, and the increments are getting smaller every year.

