Of Apple, Google, Amazon, and Microsoft, Apple is the only one I would invite into my home to listen to my activities. Perhaps it's just marketing, but the other three have done nothing to earn my trust. Apple at least appears to care about my privacy.
Microsoft will never, ever have a live mic in my home. I have done everything I can find to neuter any background listening my Win10 PC can do.
Apple wins on the privacy front, but they have neither the broad knowledge available that Google does nor the 2 years of experience that Amazon does.
Oh, plus free upgrades to 4K versions of things I've bought? Awesome, that is the dream of buying online media. I once ripped my 300-movie DVD collection and couldn't be bothered to upgrade or redo anything in Blu-ray. The free upgrade was truly a best-case scenario for me, and I was excited to hear about it.
Likewise, if you're already deep into Google territory and are buying content from Google Play Music/Videos, subscribed to YouTube Red, have an Android phone, etc., a Chromecast is the obvious choice, since all of that works seamlessly in that environment.
Apple has always been at the forefront of technology, shipping things like Retina displays long before other vendors. I think they should've been there with 4K too.
They used to be. That is generally untrue for the last 5 or so years with only minor exceptions (mostly limited to the iPhone).
The biggest drawback is that damn glass remote. I dropped it on my hardwood floor and now it has a rather large chip in it. Thankfully it's not on the touch surface.
My biggest annoyance UX-wise is that the touchpad button registers clicks even when nothing capacitive is touching the pad. This means I'm inadvertently pausing videos all the time when the remote is between my couch cushions. It's completely baffling, since this should be a solved problem: my MBP's touchpad doesn't accept clicks without a capacitive touch.
I wonder how many people hit their data caps, switch down to 1080p, and realize they can't tell the difference from the couch anyway.
I have to stand closer than ~6 ft to my now-considered-tiny 42" 1080p TV to tell the difference between 720p and 1080p. At my couch's distance of ~12 ft I don't notice the difference between DVD quality and 1080p with most content. An online calculator confirms those numbers, so I think it's reasonably reliable; by its math, I'd need a 78" TV to get the benefit of 1080p at 12 ft. I can't even imagine what kind of absurd seating/TV arrangement I'd need for 4K to matter. A couch 4 ft from a 70" screen? Something like that?
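The rule those calculators apply is easy to check yourself. Here's a rough sketch assuming a 16:9 panel and the common 1-arcminute estimate of 20/20 visual acuity (the function name is mine, and real calculators may use slightly different acuity assumptions):

```python
import math

ARCMIN = math.radians(1 / 60)  # ~1 arcminute, a common 20/20 acuity estimate

def max_benefit_distance_ft(diagonal_in: float, vertical_px: int) -> float:
    """Farthest viewing distance (feet) at which a single pixel on a
    16:9 display still subtends one arcminute -- beyond this, extra
    resolution is unresolvable for a 20/20 viewer."""
    height_in = diagonal_in * 9 / math.hypot(16, 9)  # 16:9 screen height
    pixel_in = height_in / vertical_px               # size of one pixel
    return pixel_in / math.tan(ARCMIN) / 12          # inches -> feet
```

This puts the cutoff for a 42" 1080p set at roughly 5.5 ft and for a 70" 4K set at about 4.5 ft, broadly in line with the numbers above.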
In a monitor, yeah, bring on the extra pixels. In a TV though? Why?
It's similar to how hi-res audio formats don't sound better because of the high resolution itself, but because the studios chose to release better masters with higher dynamic range for the format.
The spec-bumped 4k one is $179.
It should have been $99/$149.
Steve made a major fuckup handing control to Tim.
Their cheapest iPod is now $199. Their cheapest iPhone is $349. Their cheapest TV-streamer is $149. They've steadily eliminated the lower-price options they had, by neglecting to offer older product generations, or by eliminating the simpler models. The iPod shuffle was $49, iPod nano $149, Apple TV was $99 (often just $49).
It's as if they no longer want to participate in the lower range of the supply-demand curves. Before the iPhone you could walk into an Apple store with any price in mind from $50 on up, and you would find some decent iPod that would fit your budget every $50-step up (once you had chosen the most expensive iPod, there were headphones available to absorb further spending interest). At some point they decided the spending floor should be $200 or more, instead of wanting everybody to buy Apple products and thus participate in the personal computing revolution. Even their HomePod is absurdly priced starting at $349...
People notice color more than resolution, and right now not enough people own wide-color-gamut TVs. Instead they have crappy 4K TVs, and that's going to limit the market, with people thinking those are good enough.
I'm not sure that will actually hurt them. Pretty much all TVs support HDR10.
Playing with the included remote is super limited, so games have to be designed with that in mind.
Game devs won't risk making a game that needs a third party accessory, so Apple again killed all gaming potential for the Apple TV.
Finally, I can use my 4K TV reliably.
I just want a clicky directional pad. Just put the Siri button on the old remote.
I just can't believe they are doubling down on their worst reviewed product. 1.5 stars on their own site.
It's being used to remove all that remains of the realism and life-like immersion of visual media in this insanely media-illiterate world. Right up Apple's alley, actually.
EDIT: People who are downvoting me are doing so because they don't know anything about video.
EDIT #2: I am very disappointed in the non-factual answers to my post, and that people would choose to downvote rather than jump in with anything even remotely accurate or helpful. These technologies are major components to the future of the web, and people on HN best become media literate if they want to keep their comfy jobs.
HDR video is usually recorded with the higher range. Are there videos where high dynamic range is compressed into standard dynamic range? Or where HDR is synthesized from regular video?
HDR now gives a lot more to play with from a post-production standpoint (when shot raw, or using a logarithmic transfer function to encode the sensor data), as more of the picture information coming off the camera sensor can be utilised (where exposure, recording format and compression allow).
From a technical standpoint, 8 bits per color was never enough to display anywhere near the full visible range. I've also personally been peeved by the flat, weak-ass dynamic range of most TVs, and anything that adds a few bits to the stream seems like a win in my book.
Unless, somehow, this ends up stuck in a rut in the uncanny valley (or someone wants to make the filesize argument), how is improved dynamic range a bad thing?
Can you explain how it's a gimmick?
If you are confusing HDR with the process of taking an existing Rec. 709 signal and gamut-mapping it into a subset of Rec. 2020, I agree that is a bit gimmicky.
This (ab)use of the technology has nothing to do with the HDR standards, however, nor with what true HDR video is or what material mastered for HDR is capable of, and it shouldn't define how you see the technology (or present it to others).
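For concreteness, the "container" step being described is just a fixed linear transform: linear-light Rec. 709 RGB can be re-expressed in Rec. 2020 primaries with the 3x3 matrix from ITU-R BT.2087 (the helper name below is my own):

```python
# Linear-light Rec.709 RGB -> Rec.2020 RGB (ITU-R BT.2087 matrix).
# This only re-expresses the same colors in the wider container; it
# adds no saturation or dynamic range by itself, which is why selling
# it as "HDR" is gimmicky.
M_709_TO_2020 = [
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
]

def rec709_to_rec2020(rgb):
    """Map one linear Rec.709 pixel [r, g, b] into Rec.2020 primaries."""
    return [sum(m * c for m, c in zip(row, rgb)) for row in M_709_TO_2020]
```

Note that each matrix row sums to 1, so white maps to white; a saturated Rec. 709 red lands well inside the Rec. 2020 gamut, which is exactly why such content looks no more vivid than before.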
HDR is a specific technique which involves cheaply processing pixels to fit a different color curve than the one they were captured in. It's not lifelike, it's candy-like.
It is true that these systems ship with fake-o postprocessing to take conventional content and smear them into these new colors, but that's a stopgap, and you aren't actually obligated to use them, you know. Just like TVs tend to ship in what I call "claw your eyes out mode" for display, but it's the work of just a few seconds to turn them back to something sane for any decent quality panel. (There are certainly panels that have no decent quality setting, no matter how much you fiddle; if you care, do some research. I do and did.)
Per what I see of your other comment, whether you like it or not, there are standards being called "HDR" that includes true expanded gamut support: https://en.wikipedia.org/wiki/High-dynamic-range_video#Stand...
Can we not even trust people on HN to only speak to what they know about?
So it appears all the "HDR" standards include higher gamut settings, making the simplification true enough for me to call it correct.
Not any more than we can trust the old curmudgeons to not beat their knowledge into others by force I guess.
Don't trivialize my specialized knowledge by saying I'm old. The only reason you can watch anything on youtube today is because people like me cared about standards and best practices -- and most importantly, cared about being right about things.
Typical "break stuff, stop reading books" mentality of engineers my age. It's sad.
I don't know the technical details of HDR, but this sort of curve compression is a common trick for matching a medium to our perception. Unless important info is lost, who cares?
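A textbook example of that kind of curve compression is Reinhard's global tone-mapping operator, which squeezes unbounded scene luminance into display range while keeping shadow detail. This is a standard sketch of the operator, not what any particular TV actually ships:

```python
def reinhard(luminance: float) -> float:
    """Reinhard global tone map: monotonically compresses HDR
    luminance in [0, inf) into [0, 1), spending most of the output
    range on the darker values our eyes are most sensitive to."""
    return luminance / (1.0 + luminance)
```

Midtones around a luminance of 1.0 land at 0.5, while very bright highlights asymptotically approach (but never clip at) 1.0, so relative detail is preserved everywhere.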
Or, can anyone present a head-to-head comparison between 10bpp+ and "HDR"?
HDR post-processing is not being used to standardize the color subsampling accuracy of digital video files. It is being used to coerce pixels into following one particular subset of high-gamut color curves. It is NOT the same thing as 10-bit color, nor as wide-gamut color spaces. It is a post-production trick used to make your eyes more enticed by the picture.
It is not about higher fidelity picture, it is about higher engagement viewers.
Still, I don't think HDR is a net negative even if it's just improvements in luminance -- LCD TVs are especially bad at representing shadows, and I've grown tired of watching thousand-dollar TVs treat dark images like a cheap laptop does. I was hoping HDR would do something about that :/
Bit depth gives you a value per color channel between zero (black) and the upper bound for that bit depth (2^n - 1, white). How the image data is encoded in those values is very much standardized for all display colourspaces (though whether the hardware is capable of displaying the full range of those values correctly is a different story).
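The per-channel ranges described here are trivial to compute (helper name is mine):

```python
def channel_range(bits: int) -> tuple[int, int]:
    """(black, white) code values per color channel at a given bit depth."""
    return 0, (1 << bits) - 1

# 8-bit video: 256 levels per channel; 10-bit HDR: 1024 levels --
# four times as many steps to spread across the luminance range.
```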
HDR is certainly not a fad: the current crop of displays is nowhere close to being able to show the full potential of the technology yet (100% of the Rec. 2020 gamut at 10,000 nits).
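That 10,000-nit ceiling comes from the PQ transfer function (SMPTE ST 2084) used by HDR10 and Dolby Vision. A direct transcription of its EOTF, mapping a normalized signal value to absolute luminance:

```python
# SMPTE ST 2084 (PQ) EOTF constants.
M1 = 2610 / 16384        # ~0.1593
M2 = 2523 / 4096 * 128   # ~78.84
C1 = 3424 / 4096         # ~0.8359
C2 = 2413 / 4096 * 32    # ~18.85
C3 = 2392 / 4096 * 32    # ~18.69

def pq_eotf(e: float) -> float:
    """Display luminance in nits (cd/m^2) for a PQ-encoded
    signal value e in [0, 1]; e = 1.0 maps to 10,000 nits."""
    p = e ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)
```

Since today's best consumer panels top out well under 10,000 nits, every PQ display currently has to tone-map the upper end of this curve down to what its hardware can actually produce.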
AndrewUnmuted - are you perhaps confusing 'true' HDR with upscaled / gamut-mapped Rec. 709 material? E.g. taking an existing Rec. 709 master and mapping the color into a larger colourspace?
Netflix, Amazon and several other OTT providers have their original content mastered in true HDR from the get go - no post processing 'tricks'. The main limiting factor is the acquisition source, which is why Netflix have gone down the route of specifying acceptable cameras for use on their original content productions.
What the hell is wrong with everybody here?
e: I feel like you might be confusing 'HDR' with 'HDR upsampling'.