
Intel Skylake Review - p1esk
http://anandtech.com/show/9483/intel-skylake-review-6700k-6600k-ddr4-ddr3-ipc-6th-generation
======
bitL
Pros:

\- 64GB for desktop (finally!)

\- 26 PCIe lanes (i.e. multiple fast M.2 PCIe NVMe, making 5k/4k RAW video
previewing possible in real-time)

OK:

\- very small performance increase compared to Haswell from two generations
ago (with Broadwell in between)

\- DDR4 performance matches XMP DDR3 performance; to see any difference you
need to use a quad-channel configuration

\- SGX, the jury is out on this one (double-edged sword, could improve
security but also turn into a complete malware mess due to enclave isolation
and inability to detect running encrypted botnets once they gain ring 0)

Cons:

\- power consumption is up (!?)

\- no AVX-512 for desktop
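
A back-of-the-envelope sketch of the bandwidth arithmetic behind the DDR3/DDR4 point (theoretical peaks only, ignoring latency; note Skylake desktop parts are dual channel, with quad channel reserved for the enthusiast/HEDT platform):

```python
# Peak bandwidth = transfers/s x 8 bytes per transfer x number of channels.
def peak_gb_s(mt_per_s, channels):
    """Theoretical peak bandwidth in GB/s (decimal) for a 64-bit channel."""
    return mt_per_s * 8 * channels / 1000

ddr3_1600_dual = peak_gb_s(1600, 2)   # 25.6 GB/s
ddr4_2133_dual = peak_gb_s(2133, 2)   # ~34.1 GB/s
ddr4_2133_quad = peak_gb_s(2133, 4)   # ~68.3 GB/s
print(ddr3_1600_dual, ddr4_2133_dual, ddr4_2133_quad)
```

XMP DDR3 kits at 2133+ MT/s close the dual-channel gap entirely, which is why the difference only really shows up with four channels.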

~~~
pdkl95
SGX is terrifying. Botnets are not the main problem. Windows (N+1) using SGX
to make it hard/impossible to remove the spyware (and other junk)[1] is the
problem. As an example, take a "trusted" environment[2] and add SGX as a way
of keeping end users out of the TEE.

While it is still early, SGX may be a key battlefield in the War On General
Purpose Computing.

[1] e.g. the stuff discussed recently
[https://news.ycombinator.com/item?id=9976298](https://news.ycombinator.com/item?id=9976298)

[2] [http://i.imgur.com/rjbzWyB.jpg](http://i.imgur.com/rjbzWyB.jpg)

~~~
userbinator
Very terrifying indeed. IMHO these days there is far too much freedom being
traded away for security, and users are gradually conditioned toward it. The
thinking is almost like "How about we put everyone in prison, because some
percentage of them will become criminals anyway?"

It used to be relatively easy to patch your OS --- even if it was
proprietary and closed-source --- to make it behave how you wanted to, if you
knew what to change (see all the Windows customising forums for an example,
and the whole cracking community.) You didn't have to give up proprietary
software completely and move to something like Linux. Now it's become much
harder, and I feel that the middle ground between fully open-source and fully
closed has mostly disappeared as the two communities are gradually distancing
from each other.

[http://www.gnu.org/philosophy/right-to-read.en.html](http://www.gnu.org/philosophy/right-to-read.en.html)

~~~
MichaelGG
It's "terrifying" but it's also something that's often desired. For example,
SGX could enable things like a trusted Bitcoin mixer. Or a trusted telephony
provider. Remote code trusting is pretty cool for such scenarios where the
other option is "well, sure, you gotta trust the site, but...".

The downside is that it'll be too juicy for DRM and games to pass up, and
that's shitty and most likely outweighs the benefit. Perhaps Intel should
have limited SGX to server workloads and given us AVX-512 instead.

(I'm guessing on SGX here. I am unaware of any detailed info on how exactly
it'll work, how the remote attestation works.)
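
The rough idea of remote attestation can be sketched as a toy model (illustrative only; real SGX uses CPU-fused keys, EPID group signatures, and Intel's attestation service, none of which this mimics faithfully):

```python
# Toy model of attestation: the "hardware" signs a measurement of the
# enclave code so a remote party can verify *which code* is running,
# not merely *who* operates the server.
import hashlib
import hmac

# Hypothetical stand-in for a key fused into the CPU; in real SGX this
# is never visible to software.
CPU_KEY = b"secret-fused-into-hardware"

def measure(enclave_code: bytes) -> bytes:
    """MRENCLAVE-like measurement: a hash of the enclave contents."""
    return hashlib.sha256(enclave_code).digest()

def quote(enclave_code: bytes, report_data: bytes) -> bytes:
    """Hardware-signed (measurement || nonce), the 'quote' sent to a verifier."""
    msg = measure(enclave_code) + report_data
    return hmac.new(CPU_KEY, msg, hashlib.sha256).digest()

def verify(q: bytes, expected_code: bytes, report_data: bytes) -> bool:
    """Remote verifier checks the quote against the code it expects."""
    return hmac.compare_digest(q, quote(expected_code, report_data))

code = b"bitcoin mixer v1"
q = quote(code, b"session-nonce")
print(verify(q, code, b"session-nonce"))              # True
print(verify(q, b"tampered code", b"session-nonce"))  # False
```

That last check is the whole point: a tampered mixer produces a different measurement and the quote no longer verifies.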

~~~
ngneer
Lots of info here: [http://sgxisca.weebly.com/](http://sgxisca.weebly.com/)
Not a whole lot on remote attestation, though.

~~~
MichaelGG
Thanks, that looks very useful!

------
igravious
64GB! Sixty-four! For a desktop/laptop architecture! At 64GB (GiB?) main
memory we're getting to the point where you could copy the entire working set
of the OS to memory from disk on boot and then simply save to disk
periodically. So long as you have a battery/UPS that'd work. Is that crazy?

I'm currently on 3rd gen, the i7-3537U. My dream machine would be 13/14"
laptop w/ 1080p (though 4k-ish wouldn't hurt), Skylake i7, usb-c, hdmi out,
16GB RAM, and a decent 256GB SSD. Have I forgotten anything?

~~~
cm3
I would say it's very sad that we get only 64GB. We should be allowed to
install at least 128GB or 256GB on consumer-grade mainboards. Similarly, for
a developer workstation you have to choose a Xeon to get more cores; a
smaller GPU and twice as many CPU cores in the non-Xeon line would be great,
because enthusiasts don't use the iGPU anyway. Developers don't need a fast
GPU, and when they really do, any iGPU is too slow, so most chips should have
more cores in place of bigger iGPUs.

It's true that an office clerk's workflow doesn't benefit from more cores due
to legacy software, but developer workflows benefit a lot when building code,
running virtual machines, or running (modern) concurrent applications. It's
very, very sad indeed that AMD is the only x86 vendor who puts more cores on
consumer chips. Let's hope Zen will force a change.

~~~
nharada
Don't people use GPUs all the time now because of Netflix and movie streaming?
My old laptop's GPU would always kick in when I played something in 1080p.

~~~
soylentcola
I believe that streaming video can make use of GPU acceleration for decoding
(to take some load off of your CPU).

------
norea-armozel
I can't see myself retiring my i5-3570k, since this isn't a sufficient bump
in performance for me (even with DDR4 quad channel). So, I'll have to wait and
see what Intel does with its iGPU technology. If the iGPU technology gets as
good as a gtx 970 (or r9 290x) then I might buy one of their next generation
CPUs when they come to market. :/

~~~
Nursie
From what I've read, the Skylake GPU capability is a step backwards from the
Broadwell one, though that may be because the audience for the two chips
released so far is considered to be gaming enthusiasts, who will probably have
one or more discrete cards anyway.

~~~
norea-armozel
Yeah, it just seems like only AMD understands that you have to marry the GPU
with the CPU aspects of the processors now. And I don't know if AMD can keep
it up at this rate considering some of the projections hint at a potential
default around 2020.

------
kenrikm
CPUs seemed to hit their sweet spot around 2010 and have not made a huge
amount of practical progress since. Up until about two weeks ago my gaming PC
was running an i5 750, a processor released back in 2009 and still able to run
all but the most demanding games on max settings, GTA5 being the exception.
Even my video cards (GTX 660 Ti in SLI) are on the older side, having been
released back in 2012.

Recently my PSU blew and took the motherboard with it, so I purchased an i5
4460/Z97 combo for about $300 from Amazon, which clears up the bottleneck in
GTA5. This CPU is already a year old, but I'm glad to see it's within 1-2FPS
of the new chips in most games. I most likely won't even need to look at
upgrading for another 5 years.

I think to some extent Intel is backing themselves into a corner with this
release; their last generation was so good (i5 4690K or 4460) that Skylake
seems rather lackluster.

~~~
Nursie
Was looking at this earlier today as my machine is about 4 years old now.

From what I can tell, moving from my i5-2400 (Sandy Bridge) to the i7 6700K
(Skylake) would apparently buy me about a 70% performance boost. But then
moving to a 4790K (Haswell) gets me a 69% boost. And I could get a 40% boost
by buying a 3770K (Ivy Bridge), and then I wouldn't even need a new
motherboard, RAM etc....

~~~
kenrikm
DDR3 and DDR4 are about the same performance and Skylake supports DDR3. You
most likely don't need to upgrade your RAM regardless.

~~~
paulannesley
Skylake supports DDR4 and DDR3L but the slots are incompatible; motherboard
manufacturers need to choose one or the other. All the Z170 boards I've seen
do not have DDR3L slots. Even if they did, DDR3L is lower voltage than DDR3;
standard DDR3 will never work in a Skylake board.

------
mixmastamyk
As I'm slaving over a hot laptop in the middle of summer, I have to ask. Will
this generation allow for cooler machines?

To be honest I write text files and email for a living. Can anyone recommend
the coolest yet snappy laptops? With plenty of memory, min 16gb, "retina"
display. Perhaps next year's Macbook with no fan is a candidate? Though I
don't mind a fan.

I have an "Intel(R) Core(TM) i7 CPU Q 720 @ 1.60GHz" with a "Madison [Mobility
Radeon HD 5730 / 6570M]", which is a few years old. I'm looking for something
new that has half the heat output and is hopefully faster.

------
devmoat
Anyone know if Skylake comes with SGX extensions?

~~~
bitL
Yes, that was one of the main points of the architecture overhaul.

------
snake117
So I'm planning on building a new gaming rig because my current PC is still
running Sandy Bridge. Would you guys recommend Skylake? I found this MoBo +
CPU + Memory package on Newegg for $500:

[http://www.newegg.com/Product/Product.aspx?Item=19-117-601](http://www.newegg.com/Product/Product.aspx?Item=19-117-601)

From my understanding, the i5 CPUs are great for gaming and i7 is great for
running multiple applications at the same time. I appreciate any help, thanks!

~~~
talmand
I'm about to build a new rig as well and was waiting on Skylake to see if I
wanted to jump in. After seeing the reviews and price points, I'm going to
pass and go with Haswell. The pros do not justify the price premium to me.
Right now the difference for me is around $150, which I'd rather sink into the
video card.

Also, I've read that the Skylake CPUs don't come with a cooling solution; you
have to buy your own third-party cooler, which adds to the cost. I know many
people think the stock cooling sucks, which it does for overclocking. But if
you don't overclock, the best reason for a third-party cooling solution is to
reduce noise.

Currently I'm planning on an i5 as I've never seen an advantage to having an
i7 for gaming. An i5 and a good video card is all you need.

~~~
cdr
If you care about spending an extra $30 on a CPU cooler, I'd say Skylake is
definitely not for you.

If you're getting Skylake for gaming you should probably be planning to drop
$400 (or more for a bigger size) on an Intel 750 SSD.

~~~
talmand
When pricing on a budget, yes, $30 can be a big deal. Let's see: a $250 CPU +
$30 cooler (realistically, it should be higher) versus a $200 CPU that suits
my needs just fine. That gives me an extra $80 to maybe upgrade elsewhere,
where it might make a bigger difference to my goals.

Why exactly is a $400 SSD suggested to go with Skylake for gaming? Why not a
$100 SSD? What does the size of the SSD have to do with Skylake? I'm failing
to understand your point.

~~~
cdr
If you're not interested in spending the money to get top-tier performance,
there's absolutely no reason to buy Skylake right now.

The Intel 750 is NVMe, the next generation of storage interface. The 750 can
outperform a SATA SSD by 10x in reads, and even versus an outstanding SATA SSD
will get you at least a 4x increase.
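
As a sanity check on those multiples, the raw interface ceilings make the headroom plausible (back-of-the-envelope only; real drives land well below these limits):

```python
# SATA 3 runs at 6 Gb/s with 8b/10b encoding (80% efficiency).
sata3_ceiling = 6.0e9 * (8 / 10) / 8 / 1e6      # ~600 MB/s

# PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding; an NVMe
# drive like the Intel 750 uses four lanes.
pcie3_lane = 8.0e9 * (128 / 130) / 8 / 1e6      # ~985 MB/s per lane
pcie3_x4 = 4 * pcie3_lane                       # ~3.9 GB/s

print(round(sata3_ceiling), round(pcie3_x4))
```

So even a perfect SATA SSD is capped around 600 MB/s, while a PCIe 3.0 x4 NVMe drive has roughly 6.5x the interface bandwidth to work with.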

Load times are actually pretty important for games, and after a GPU I don't
think there's anywhere you could spend extra that would give you more quality
of life.

You'd already be looking at $250 for a similarly-sized high-performance SATA
SSD, so the price difference isn't that large. You aren't getting high
performance even as far as SATA for $100.

~~~
talmand
I can't disagree; I think you actually supported my point for the most part.

------
uxcn
Will latencies for DDR4 modules come down over time, or is there some
fundamental limitation to the standard?

I'd be curious if anyone knows what Intel's long term plan for L4 eDRAM is.

~~~
nitrogen
Are those absolute latency increases in nanoseconds, or just bigger numbers in
clock cycles? The clock cycle counts will increase as effective clock rates
increase, but the real latency in nanoseconds could still be the same or
lower.
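
The conversion is simple: first-word latency in nanoseconds is the cycle count divided by the memory clock, and for DDR the memory clock is half the data rate. A quick sketch (using commonly quoted JEDEC-style timings):

```python
def cas_ns(cas_cycles, data_rate_mt):
    """CAS latency in ns: cycles / (data_rate / 2) clock, with clock in MHz."""
    return cas_cycles * 2000 / data_rate_mt

ddr3 = cas_ns(9, 1600)    # DDR3-1600 CL9  -> 11.25 ns
ddr4 = cas_ns(15, 2133)   # DDR4-2133 CL15 -> ~14.06 ns
print(ddr3, round(ddr4, 2))
```

So DDR4's bigger CL numbers are partly just faster clocks, though in this example the absolute latency is still somewhat higher too.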

~~~
uxcn
Realized time might be the same or lower than with DDR3, but I was more
interested in the cycle counts (RAS, CAS, etc.). Latencies typically do
increase as frequency increases, but better modules usually still support
lower latencies.

DDR4 uses a lower voltage, which is typically bad for latency, but I'm
wondering if there's some other fundamental limitation to the standard.

