
E3: Microsoft’s next Xbox: 8K graphics, SSD storage, and ray-tracing for 2020 - tosh
https://www.theverge.com/2019/6/9/18656608/microsoft-new-xbox-hardware-specs-price-release-date-e3-2019
======
Mirioron
8k gaming? At 120 fps?

They should try to get games running at 120 fps on ultra settings at 1440p
before 4k. Leave 8k to the next generation or maybe the one beyond that. Even
the most powerful desktop GPUs struggle with 4k.

An RTX 2080 Ti gets 43 fps in AC Odyssey at 4k, 51 in AC Unity, 82 in
Battlefield 1, 49 in Shadow of the Tomb Raider, 63 in The Witcher 3 etc.[0]

The numbers will change depending on who's benchmarking them, but they're not
going to change much. This is the fastest GPU you can get right now and it
costs over a thousand dollars, yet it still can't run many games at 4k 60 fps.
It even struggles at 1440p in some. Yet this press announcement says that in a
year to a year and a half there will be a console that will cost half as much
as this GPU and it will bring many times higher graphical performance? I find
it hard to believe.

[0] [https://www.eurogamer.net/articles/digitalfoundry-2019-05-07...](https://www.eurogamer.net/articles/digitalfoundry-2019-05-07-nvidia-geforce-rtx-2080-ti-benchmarks-7001)

~~~
joe91
I find this kind of comment strange in a community of self-professed hackers
:)

8k 120fps HDR is just the specification of the hardware. The hardware supports
an 8k framebuffer and can output it to the display at 120fps. Whether or not
you will choose to do that depends on what you are trying to achieve with the
available GPU cycles. It's exactly the same on current consoles (and PC
really). Maybe Forza 19 will choose to render simpler graphics at 8k 120fps,
but Witcher 5 will decide to run at 4k but use raytracing and dynamic global
illumination (these games may or may not exist :).

There are games that run at 4k/60 with HDR on Xbox One X (I worked on one),
and there are games that choose not to in order to do something a bit more
advanced.

No one is saying that all games will run at 8k/120, just as no one is saying
that all games will run at 8k/60 on the RTX 2080 Ti even though it's possible
to do.

~~~
StreamBright
In theory I can run at 120 miles/hour, but my muscles limit me to 10
miles/hour. So how fast can I run?

~~~
solotronics
I think your tendons would explode from the acceleration forces at some speed
well below 120 mph.

------
frou_dh
It's amusing that both the Xbox and PS5 stories are leading with "8K gaming",
which is of course empty Big Number nonsense that's not going to materialize
in real games.

I remember being a wide-eyed teenager who would get hyped up to a fever pitch
by this kind of game system marketing.

~~~
dahart
Even if it did, do people want 8k? I haven’t been pining for more than 1080p
with my Xbox.

Why isn’t HDR color moving very fast? I feel like I’d rather have more color
resolution than spatial resolution given the choice...

~~~
xienze
> Why isn’t HDR color moving very fast?

My theory on tech adoption is that the advantages of some new piece of tech
have to be _immediately_ obvious to the average person and to such a large
degree that they would actually care. E.g.:

* Records -> Tapes: smaller, portable.

* Tapes -> CDs: obvious sound quality improvements, never degrades, don't have to rewind/fast-forward.

* CDs -> iPod et al: "you can carry hundreds of CDs in your pocket."

* VHS -> DVD: just say "it does for movies what CDs did for music" and that's all it takes to convince someone of the benefits.

* DVD -> Blu-ray: now it gets a little dicier. Yeah it's an obvious jump in quality but a lot of people just don't care.

* Blu-ray -> UHD (just the resolution aspect): the average consumer isn't the most discriminating in terms of source quality, and even to someone who really cares it can be a bit tough to tell the difference (talking about movie transfers here, in particular). Generally the biggest benefit I've found here when going from Blu-ray to UHD is when the movie's existing Blu-ray transfer is 10+ years old and just terrible. The fact that the studio simply did a new scan is what brings out the improvement, and it would look pretty damn good on Blu-ray (just look at any Blu-ray sourced from a 4K scan).

* SDR -> HDR: "the colors are better, man." Forget trying to sell this to your average consumer, even if they notice they won't care.

4K -> 8K? It's really out there as far as diminishing returns go. It's not
going to be embraced in any meaningful way by the average consumer. It'll be
ubiquitous simply because hardware manufacturers are eventually going to make
nothing but 8K TVs, but the average person probably won't be able to tell it
from 4K, perhaps even 1080p.

~~~
ummwhat
This is true even outside video tech. To gain adoption a product doesn't just
need to be better. It needs to be so much better that it's worth the effort of
adoption.

That's why every "Facebook killer" dies on the vine. They just aren't better
enough.

That's why every Bitcoin killer fails to displace Bitcoin as number 1.

Etc

------
BrentOzar
> “We’re using the SSD as virtual RAM”

Swap file as a selling point, hmm? If you're leading with that, it makes me a
little nervous for the rest of the “innovations.”

Similarly, I laughed out loud when one of the talking heads in the video said
“the SSD and the solid state drive.” Okay then.

~~~
Mr_P
This may be a reference to texture streaming. While I've never developed for
consoles, I do know that texture streaming on desktop & mobile can be
challenging.

Being able to mmap a texture directly from the SSD (as if it were RAM), with
the hardware & OS taking care of all the details, could be a big deal for
developers.

Edit: To get a sense of how challenging this can be, see this recent GDC talk
on texture streaming in Titanfall 2:
[https://www.youtube.com/watch?v=4BuvKotqpWo](https://www.youtube.com/watch?v=4BuvKotqpWo).
Literally all of this complexity would go away if you could mmap textures
directly from disk and let the GPU handle it.
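
A minimal sketch of what that could look like with plain POSIX mmap (the asset
file name is invented for illustration; whatever API the console exposes would
surely look different):

    /* Hypothetical sketch: map a texture file straight into the
       process address space with POSIX mmap. Pages are faulted in
       from the SSD on first touch instead of being streamed by the
       engine. "terrain_albedo.ktx" is a made-up asset name. */
    #include <fcntl.h>
    #include <stdio.h>
    #include <sys/mman.h>
    #include <sys/stat.h>
    #include <unistd.h>

    int main(void) {
        int fd = open("terrain_albedo.ktx", O_RDONLY);
        if (fd < 0) { perror("open"); return 1; }

        struct stat st;
        if (fstat(fd, &st) < 0) { perror("fstat"); return 1; }

        /* Read-only mapping: nothing is copied up front. */
        const unsigned char *texels =
            mmap(NULL, st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
        if (texels == MAP_FAILED) { perror("mmap"); return 1; }

        /* Touching a byte pulls in just that page; the OS, not the
           engine, decides what stays resident. */
        printf("first texel byte: %u\n", (unsigned)texels[0]);

        munmap((void *)texels, st.st_size);
        close(fd);
        return 0;
    }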

~~~
fwip
To my ears, this sounds very similar to what AMD's "Vega" cards provided for
high-end compute. The "High Bandwidth Cache Controller" allows datasets
outside of the GPU's memory to be accessed in a transparent way.

I can't find a decent technical article at the moment, but here's a link to
the "Radeon SSG," a $7k graphics card that ships with 2TB of solid-state
storage attached, designed to be used to process absolutely massive datasets
as though they were in RAM: [https://www.amd.com/en/products/professional-graphics/radeon...](https://www.amd.com/en/products/professional-graphics/radeon-pro-ssg)

------
sirmike_
Maybe they mean some kind of software RAM disk? Whatever it is, it sounds like
gold-plated 90s marketing.

... fog lifts from stage

... a huge screen with flashing, strobing lights emerges, with diffused 90s
font writing ...

Some kinda music roars and the writing on screen is undiffused instantly
revealing

“SOLID STATE DRIVE POWERED VIRTUAL RAM”

~~~
mikeash
I read that as implying that the SSD is part of the standard memory hierarchy,
byte-addressable and not requiring a filesystem. No need to load assets: just
create pointers into the SSD's address space. Sort of a modern version of the
ROM on an NES cartridge.

I could, of course, be _way_ off.
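
For what it's worth, a toy sketch of that idea (everything in it is invented:
the base address, the struct layout; it compiles, but dereferencing the result
would fault on ordinary hardware):

    /* Toy sketch of "assets as pointers into the SSD", NES-ROM style.
       SSD_BASE and the Texture layout are pure invention. */
    #include <stdint.h>

    typedef struct {
        uint32_t magic;
        uint32_t width, height;
        uint8_t  texels[];   /* pixel data follows the header in place */
    } Texture;

    /* Pretend the OS maps the whole drive at one fixed base address. */
    #define SSD_BASE ((const uint8_t *)0x400000000ULL)

    const Texture *get_texture(uint64_t offset_on_disk) {
        /* No read(), no copy: "loading" is just pointer arithmetic,
           and the memory hierarchy pages data in as it is touched. */
        return (const Texture *)(SSD_BASE + offset_on_disk);
    }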

~~~
wmf
That's very unlikely and probably a bad idea anyway since NAND latency is too
high.

~~~
kmad
Sounds like a perfect application for Micron/Intel's (yet to be widely
commercialized) 3D XPoint[1] :) Although I find it highly unlikely a new Xbox
would be the place to debut the technology.

Regardless, claimed latency is between DRAM and NAND, so I wonder if it would
be performant enough for a gaming use case. To mikeash's point, 3D XPoint is
byte-addressable, so perhaps it's possible...

[1] [https://www.micron.com/products/advanced-solutions/3d-xpoint...](https://www.micron.com/products/advanced-solutions/3d-xpoint-technology)

~~~
opencl
3D XPoint has been commercially available for years under the Optane brand
name. It's much more expensive than NAND (16GB and 32GB NVMe drives are
available for ~$2/GB), but a smallish amount in the Xbox acting as some sort of
cache seems at least semi-plausible and would fit with the "next generation
SSD" and "virtual RAM" statements.

------
loteck
Hard to believe a gaming company in 2019 puts out promo material, even
(especially?) marketing hype material, about their vision for the future of
gaming, and never once even utters the words virtual reality.

~~~
majora2007
Virtual Reality or even game streaming, which I feel is going to change the
gaming scene if they can pull it off.

Having played VR games, I totally believe this will transform a huge segment
of gaming. Not all of it, but a huge swath of casual players.

~~~
asdff
Streaming is a tough cookie because it's so reliant on the end user's ISP
behaving nicely with advertised speeds. I was a beta tester for Google's
streaming project with Assassin's Creed Odyssey, and even though I had a
gigabit connection the game occasionally had to drop framerate. If even a
wired gigabit connection wasn't reliable enough, I can't imagine how terrible
the experience would be with a more modest internet package.

However, it would be sweet if LAN streaming took off. The Xbox becomes
something you just plug into the wall in a nook somewhere in your house, and
it streams to every screen you own on the Wi-Fi network: phone, TV, laptop,
whatever you have. That seems a little more tangible to me than expecting ISPs
to walk the talk on internet speeds.

------
manishsharan
Actually, the most interesting gaming innovations that I watch for are from
Nintendo. I found the Wii console and the Switch to be truly innovative,
whereas Sony and Microsoft just seem to be adding more muscle every iteration.

------
holografix
Excited about the new Xbox. I’ve had my classic Xbox One for a while now and
still enjoy playing it.

This promo video, though: I hate to generalise, but I didn't find anyone's
excitement believable at all.

~~~
toomuchequate
They are prob PC gamers anyway. What hardcore gamer mains consoles?

~~~
asdff
People who want to hand someone off Craigslist $150 for the entire console vs.
dropping serious cash on a marked-up GPU to play the same triple-A games.

------
DocSavage
Any idea how they're supporting ray-tracing? Haven't seen any details on AMD's
answer to Nvidia's ray-tracing cores, but I presume it'll be part of RDNA,
whatever that is, and use Microsoft DXR.

~~~
gouh
RTX is not necessary for ray tracing; AMD and Crytek made a demo ("Neon Noir")
of real-time ray tracing.

But still, I hope AMD will standardize a core, non-proprietary Vulkan API for
ray tracing, unlike Nvidia's closed, proprietary, vendor-prefixed RTX
extension for Vulkan.

A bit off topic, but it's sad to see so few people pushing back against
Nvidia's monopoly of closed & proprietary CUDA, OptiX, RTX, etc. This is the
equivalent of Flash, the vendor-prefixed web (-apple-, -ms-, -webkit-, ...),
or AMP, but for computer graphics.

------
trevbreak
As much skepticism as there is around their ability to truly do 8K 120Hz (I'm
equally skeptical), and as much as these technologies have been in PC gaming
for a while, please don't forget the benefits this brings.

As we saw with the 4k console push, it helped expand the 4k TV market, and
particularly 4k media content.

I'm all for pushing hardware manufacturers, developers and content creators to
think more about high resolution and high refresh as something expected - not
just something to strive towards.

So as much as I doubt there will be much 8k or high refresh content available
- I'm glad we have two big companies in Sony and Microsoft helping drive that
push. Kudos to them.

------
ChoGGi
By 8K, do they mean the display resolution will be 8K, but the scene will be
rendered at 2K and upsampled?

Even a 2080 Ti has trouble rendering 4K with raytracing...

------
rocky1138
Just a reminder, before we get too excited: this is the definition of
vapourware. We haven't seen _anything_.

------
dvfjsdhgfv
It's a pity they ditched Kinect; that killed it for me.

