Hacker News

No, it is not just interpolating. The underlying algorithm uses machine learning by applying a trained deep neural network. So there is value added besides a mere upscale.

You're ultimately right, though, that a true HD version is only going to come from the raw film content. What the neural network gives us are essentially plausible higher-res hallucinations.

Edit: as per the other comment, if the original exists only on video and not film, perhaps this is the best we're going to get.




The neural network is applying what it "knows" about photos and inventing new data for the missing pixels. It's "creative interpolation" ;-)


> Edit: as per the other comment, if the original exists only on video and not film, perhaps this is the best we're going to get.

I don't think that's quite right, at least it doesn't jibe with what the DS9Doc people have been doing (which consists partly of remastering pieces of DS9 scenes):

https://www.indiegogo.com/projects/what-we-left-behind-star-...

I think the footage really was on film, but the issue was that it was composited with low-quality CGI effects, or something like that. So you can rescan the film, but you have to redo all the compositing (and probably with your own models because I'm guessing the original CGI didn't look that good). That's why a DS9 remaster is so expensive.


That's still interpolating, by the definitions I know.

The main difference here is that the interpolation algorithm on your TV is online. It's handling 30 frames per second, over 9 million pixels per second. Doing the interpolation offline (ahead of time), you can take as long as you want, look at multiple frames to try to make better guesses, try multiple things and use some fitness measure to pick a winner, even a frame or a pixel at a time.

It's still interpolation.
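The "try multiple things and use some fitness measure" idea can be sketched in a few lines. This is a toy of my own, not the actual tool under discussion: the candidates are just a temporal mean and median over neighbouring frames, and the "fitness measure" is crude gradient energy.

```python
import numpy as np

def gradient_energy(img):
    # crude "fitness measure": sharper images carry more gradient energy
    gy, gx = np.gradient(img)
    return float(np.sum(gy ** 2 + gx ** 2))

def best_of_candidates(frames):
    # offline, we can afford to build several candidate reconstructions
    # from neighbouring frames and keep whichever one scores best
    mean_frame = np.mean(frames, axis=0)      # temporal average
    median_frame = np.median(frames, axis=0)  # robust to shot noise
    return max([mean_frame, median_frame], key=gradient_energy)

rng = np.random.default_rng(0)
clean = np.zeros((8, 8))
clean[:, 4:] = 1.0                            # a sharp vertical edge
frames = [clean + rng.normal(0, 0.05, clean.shape) for _ in range(5)]
picked = best_of_candidates(frames)
```

A real-time TV chip gets one shot per pixel; offline you can run any number of candidates through any scoring function you like.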


No, if I interpolate the sequence 200, 400, 600, ... I might get 200, 300, 400, 500, 600, 700. I've not added information. But if I look at real-world data and find that, while the known figures fall on the even hundreds, the in-between values realistically fall 20 to 30 points below the odd hundreds, then I have added information, albeit statistically, and the resulting sequence, something like 200, 275, 400, 475, 600, 672, is no longer raw interpolation.

In this case they're using machine learning to add additional information about textures that isn't in the footage broadcast. They can add frames by interpolation, but the ML texturising and detailing is not interpolation.

Starting with a blob, if you interpolate you get a smoother blob, with this process you get a more structured figure.
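The number example above can be made concrete. This is my own sketch of the commenter's point; the 25-point offset stands in for the "learned" statistical prior and is made up:

```python
import numpy as np

known_x = [0, 2, 4]
known_y = [200, 400, 600]      # the known even-hundred samples

# Raw linear interpolation: the midpoints are just the averages.
interp_mid = np.interp([1, 3], known_x, known_y)
# -> [300., 500.]

# A prior learned from data *outside* the sequence: suppose real-world
# observations say the in-between values sit about 25 points below the
# hundred mark. Applying that shifts the estimates using information
# the sequence itself never contained.
prior_offset = 25
learned_mid = [v - prior_offset for v in interp_mid]
# -> [275.0, 475.0]
```

The first result is derivable from the three known points alone; the second is not, which is the whole argument.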


It's more like hallucination than anything. You're just forward-projecting your assumptions on what things ought to look like and hallucinating detail that just isn't there.

It can still look nicer than naive upscaling though.


I see what you're getting at, but it still seems to fall within the definition of interpolation. From Wikipedia:

> ... interpolation is a method of constructing new data points within the range of a discrete set of known data points
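That "within the range" clause is easy to check on a classic upscaler. Here is a toy bilinear upscale of my own: every constructed pixel is a weighted average of its four known neighbours, so the output can never leave the range of the input, whereas an ML texturiser can.

```python
import numpy as np

def bilinear_upscale(img, factor):
    # each new pixel is a weighted average of its four known
    # neighbours: new data points, but no new information
    h, w = img.shape
    ys = np.linspace(0, h - 1, h * factor)
    xs = np.linspace(0, w - 1, w * factor)
    y0 = np.floor(ys).astype(int)
    y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int)
    x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]
    wx = (xs - x0)[None, :]
    return ((1 - wy) * (1 - wx) * img[np.ix_(y0, x0)]
            + (1 - wy) * wx * img[np.ix_(y0, x1)]
            + wy * (1 - wx) * img[np.ix_(y1, x0)]
            + wy * wx * img[np.ix_(y1, x1)])

small = np.array([[0.0, 1.0],
                  [1.0, 0.0]])
big = bilinear_upscale(small, 2)
# every value in big stays within [small.min(), small.max()] --
# "within the range of a discrete set of known data points"
```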


Is there any evidence for this? Showing bad 480p DVD rips alongside 1080p up-resed video isn't really a fair comparison -- comparing it to a real TV upscaler's output would be fairer. And honestly, even the unfair comparison doesn't show a whole lot of benefit to me.


It's not upscaling. It's taking what it knows about other, non-ST pictures and creating new texture and information.


Right, but how much does it actually add over a decent upscaler?


are you saying this specific program, or the idea in theory?

http://screenshotcomparison.com/comparison/132311


If there's a group more obsessed with the specifics of what is and isn't quality than the anime community, I'd be surprised to find it.

There are entire catalogues of overlay comparisons of different releases, encodings, etc. [0].

Example: http://compare.bakashots.me/compare.php?setId=3896&compariso...

[0] http://compare.bakashots.me/


Interpolation looks bad enough without completely sandbagging its color correction. Why resort to that kind of nonsense?


Are you asking why I changed the exposure? Because I was playing around. That was just the comparison I happened to have uploaded; I have a copy without the change on a different computer.


Wow! That's a bigger difference than I expected.





