
Intel EOLs Atom Chip Used for Microsoft HoloLens - msh
http://www.anandtech.com/show/11705/intel-eols-atom-chip-used-for-microsoft-hololens?utm_source=twitter&utm_medium=social
======
blackguardx
This isn't necessarily a death knell. If Microsoft wanted to continue with the
product, they can do what is called a "lifetime buy" where they buy up enough
chips to carry them until their next iteration with a different CPU.

This is especially common in industrial electronics, where volumes are lower and
product lives are longer. Some lifetime buys cover production builds for ten
years or more.

~~~
sremani
Is it possible that Microsoft has an HL 2.0 in the works that doesn't rely on
this chip?

~~~
tw04
It literally says that in the article. This is in no way a death knell.

> The next-generation Microsoft HoloLens will be different compared to the
> existing augmented reality platform, Microsoft revealed recently. While the
> device will run Windows 10 and will be equipped with an HPU, it will also
> feature an AI co-processor integrated into the latter that will use neural
> networks to enable object and voice recognition skills without the need for an
> Internet connection. The HPU 2.0 with the programmable AI co-processor will be
> a self-sufficient device that will run on battery power of the next HoloLens
> (hence, its power consumption will not be too high). The HPU 2.0 and the AI
> co-processor were designed entirely in-house and therefore are tailored for
> the usage model of the HoloLens.

~~~
WorldMaker
I also would not be surprised, given what I perceived to be Microsoft's
roundabout, passive-aggressive PR complaints about the Atom family, if
Microsoft made a play for HoloLens 2.0 to run on ARM instead of x86.

------
Const-me
Technically, Intel continues to sell similar products. For example, the
i5-7Y54 consumes the same 4 W; its CPU is about 2 times faster and its GPU
about 3-4 times faster.

The main downside is the price. These Atoms sold for $20-40; the newer parts
go for around $280. Not sure whether that's a huge problem for the MS
HoloLens, but everything cheaper than that is going to switch to ARM for sure.

~~~
ahmeni
For a moment I thought this kind of massive price increase on a core component
would matter for HoloLens and then I remembered the dumb thing costs $3,000
USD anyways.

------
ansible
It'll be interesting to see what they come up with to decrease rendering
latency. From what I can see, that's one of the bigger challenges with GPUs
lately: instead of focusing on raw throughput (to improve image quality),
latency is now more of an issue.

I personally would welcome an AR HMD that sacrifices whatever is necessary to
get a low-latency system that tracks the user's movements very smoothly.

~~~
captainmuon
I wonder if it would be possible to move some processing into the display.
It's not the same, but in particle physics pixel detectors, which are
basically CCD sensors, people have been working to move the readout
electronics into the pixels. There, the motivation is radiation hardness and
cheaper production. Here, it could be decreased latency. What if you could put
a super small shader in each pixel, and do a final processing step like
'timewarp' right there? Or a part of your graphics memory _is_ directly your
display, no bus in between?
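As a very loose sketch of that idea (everything here is hypothetical: the
single-axis shift, the tiny scanline width, and the clamp-to-edge sampling are
illustrative assumptions, not how HoloLens actually works), an in-display
"timewarp" pass could amount to re-sampling the already-rendered buffer using
the latest head pose, here reduced to a horizontal shift over one scanline:

```c
/* Toy 1-DOF "timewarp": shift one rendered scanline horizontally to
 * compensate for head motion that happened after the frame was rendered.
 * In a real system the correction would be a full reprojection per pixel. */
#define WIDTH 8

void timewarp_scanline(const unsigned char rendered[WIDTH],
                       unsigned char display[WIDTH], int shift_px) {
    for (int x = 0; x < WIDTH; x++) {
        int src = x + shift_px;          /* where this pixel was at render time */
        if (src < 0) src = 0;            /* clamp at the edges */
        if (src >= WIDTH) src = WIDTH - 1;
        display[x] = rendered[src];
    }
}
```

The appeal of putting something like this next to (or in) the pixels is that
the shift amount can be sampled from the head tracker at the last possible
microsecond, with no frame buffer round-trip over a bus.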

That being said, I have tried a couple of VR solutions and I don't really have
a problem with latency. What I find more irritating is that I cannot focus
well on different planes. You do get the effect, when you cross your eyes, of
putting images from different planes together - although badly, because even
if you don't see the pixels, you get some kind of moire pattern when shifting
planes (I can't describe it better). And what you don't have at all is depth
blur. I think both can only be solved by true light-field displays (if that is
even possible). I hope there will be a breakthrough in holographic projection
at some point.

~~~
sbarre
> What I find more irritating is that I cannot focus well on different planes.

You might be interested in this Oculus VR research then:

[https://www.oculus.com/blog/oculus-research-to-present-focal...](https://www.oculus.com/blog/oculus-research-to-present-focal-surface-display-discovery-at-siggraph/)

------
fapjacks
This whole comment section is basically people who didn't read the article,
and people copy-pasting article contents as replies.

------
0xbear
It was a shitty chip anyway. No SIMD of any kind. Good riddance, I hope
Microsoft picks something more respectable next time.

~~~
fulafel
The Cherry Trail Z8x00 chips are listed as supporting SSE too. Are you saying
the Z8100P is a custom design with the SIMD units removed and MMX disabled in
the x86 FPU? Interesting idea, but it would call for a reference, I think.

~~~
0xbear
I mean real SIMD, the kind you need these days: AVX, AVX2, FMA. And to add
insult to injury, MS was running the chip in 32-bit mode, which further
restricts the usable instruction set.

------
evgen
Sounds like what happened to the BeBox back in the day. They initially bet on
the Hobbit chip (multiple Hobbits, iirc, hence the emphasis on multi-
process/multi-threading) and had to delay quite a while to revamp their
architecture.

~~~
zeusk
Not sure where you're getting that from; x86 chips are swappable with few
design considerations apart from socket, TDP, and, in rare edge cases, IO.

As for HoloLens, the current version is approaching EOL anyway.

~~~
evgen
The Hobbit was _not_ an x86 chip, hence the delay its discontinuation
introduced.

I was not suggesting that an architecture change was necessary for HoloLens,
but is there a large range of low-power/mobile x86 SoCs out there to choose
from? I know ARM has sucked up most of the oxygen in this particular room, but
surely someone other than Intel is working on a similar line?

~~~
zeusk
> The Hobbit was _not_ an x86 chip, hence the delay its discontinuation
> introduced.

Exactly my point.

> I was not suggesting that an architecture change was necessary for HoloLens,
> but is there a large range of low-power/mobile x86 SoCs out there to choose
> from? I know ARM has sucked up most of the oxygen in this particular room,
> but surely someone other than Intel is working on a similar line?

ARM and x86 both have readily available implementations for Windows OneCore.

------
revelation
Products at Intel that are not laptop, desktop, or server x64 CPUs get cut
with extreme prejudice when the bean counters look at them, roughly 6 months
to a year after the initial announcement, and realize "hey, these are not i7
margins."

It's incredibly damaging to everything they do. You can't cut a chip platform
that your customers literally solder into embedded products with an
announcement three months before last order.

~~~
cwyers
> Intel asks its customers to place their final orders on the Atom x5-Z8100P
> SoC (belonging to the Cherry Trail family) by September 30 and says that the
> final shipments will be made on October 30. Given the fact that Intel seems
> to have only one customer using the microprocessor, the short amount of time
> between the announcement of the product discontinuance and the actual EOL
> was probably negotiated before. Moreover, since we are talking about a semi-
> custom chip, Microsoft was probably the initiator of the EOL, which
> indicates that the company is on track with its next-gen HoloLens.

