Yes, this may be good for the future of Intel, or even for the employees losing their jobs... but right now, 12,000 people are having their lives affected in drastic, unexpected ways.
Assuming people stay in a job 4 years, you can get a 25% reduction in headcount per year by just not hiring. Intel has over 100,000 employees. They are likely hiring 10k-25k people per year just to stay at a constant size.
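A quick sketch of that attrition math, using the comment's own rough assumptions (4-year average tenure, ~100,000 employees):

```python
# Back-of-the-envelope attrition math (illustrative figures from the comment above).
# If average tenure is 4 years, roughly a quarter of the workforce turns over each year.
average_tenure_years = 4
headcount = 100_000  # Intel has over 100,000 employees

annual_attrition_rate = 1 / average_tenure_years            # 0.25, i.e. 25% per year
replacement_hires_per_year = int(headcount * annual_attrition_rate)

print(f"Attrition rate: {annual_attrition_rate:.0%}")                        # 25%
print(f"Hires needed just to stay flat: ~{replacement_hires_per_year:,}")    # ~25,000
```

So the upper end of that 10k-25k hiring estimate is just headcount divided by tenure; a hiring freeze alone shrinks the company at roughly that rate.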
He's not "being let go".
From the article: Stacy Smith, who has been chief financial officer since 2007, will move to a new role as head of manufacturing and sales
I'm not sure what to make of the move, but if they really wanted to let him go they wouldn't have given him this different role.
Edit: just to elaborate, Intel is first and foremost a manufacturing company. Their manufacturing currently has a 61 percent gross margin. You don't put someone in charge of that if you want to let him go!?
But I'm not an Intel employee or a close observer of the company. Maybe some Intel insiders can chime in with what they think this means.
A good third of their assets are in property, plant, and equipment. The fair market value of that stuff can tank if they don't keep up the pace of growth: if a shift to other technologies means they can't sell as much of their product anymore, all that equipment is suddenly worth a lot less.
This is for sure a move because their revenue forecasts are grim in many areas they operate in.
The only time I've felt my computer to be "slow" was when I tried to use Photoshop and After Effects simultaneously.
Here's the problem. Running a five-year-old PC used to be an issue.
Today, a five-year-old PC is an Intel Sandy Bridge i7-2600K (Passmark score: 8,518), while a modern i7-6700K has a Passmark score of 10,987.
FIVE YEARS and FOUR generations of processors have produced a net gain of roughly 29% in multithreaded workloads. Far less for single-threaded applications (maybe 15%). And absolutely negligible for gamers (who are essentially 100% GPU-bound).
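Quick check of the arithmetic on those two quoted Passmark scores:

```python
# Relative multithreaded gain between the two Passmark scores quoted above.
sandy_bridge_2600k = 8_518   # i7-2600K (2011)
skylake_6700k = 10_987       # i7-6700K (2015)

gain = skylake_6700k / sandy_bridge_2600k - 1
print(f"Multithreaded gain over five years: {gain:.1%}")  # 29.0%
```

Call it roughly 29% over four generations, or about 5% per generation compounded.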
If you're running a 5-year-old i7-2600K, there is absolutely no reason to upgrade to Intel Skylake. None at all. Maybe you want a new GPU to play those VR games... but Intel isn't making gains anymore in processor speed.
Intel has been trying to get people to buy their power-efficient designs (Skylake is a hell of a lot more power efficient...), so Intel continues to sell laptops at a decent rate. But no one I know has major issues with their desktop speeds.
The only people I know who have upgraded their computers are those who have had hardware failures. There's still no need to upgrade a computer from Sandy Bridge.
While that's the prevailing opinion, I don't necessarily agree. I think it's Sandy Bridge owners trying to convince themselves more than anything else; really, they're just swept up in the groupthink.
Skylake is the first chip that makes a very strong case as an upgrade. You gain NVMe support for your PCIe SSDs, DDR4 (which has shown an improvement over DDR3 in some benchmarks), a roughly 20% IPC improvement (5% per generation, give or take), a DX12_1 feature-level IGP, CPUs with 128MB of L4 cache that absolutely destroy chips without it in gaming (Broadwell had it first, and Skylake's is improved), vastly better power efficiency, and Thunderbolt 3 support.
7 pretty good reasons off the top of my head.
You can dismiss each of these if you want, but this is all very attractive in reality.
The whole story is that Sandy Bridge is only competitive, in gaming, if you overclock to 4GHz+. You still lose out on the other improvements, though, and any stock SB system compared to a stock SL system will look pretty sad once you factor in the platform updates.
If it's gaming you care about, take a look at the benchmarks of the 128MB L4 Broadwell chips compared to Devil's Canyon. Let alone SandyBridge. Both get crushed where it counts and Intel is just now getting Skylake 128MB L4 CPUs out the door. If you don't care about gaming, Skylake still crushes SB.
If you care about NVMe, just get a $20 expansion card. Besides, NVMe SSDs are expensive. Mushkin Reactor 1TB for $210 yo.
Hell, the fastest NVMe SSDs directly go into PCIe lanes. So if I actually cared about the faster speeds, I'd jump to an Intel 750 SSD.
Name me an M.2 NVMe SSD that is more cost effective than a Mushkin Reactor (1TB, MLC, maxes out the SATA6gbps bus, $200), or has faster I/O speeds than a 750.
Yes, if I had a laptop which only had room for a M.2 card, then maybe I'd get the Samsung M.2 card. But even if one were given to me for free, I'd rather get the $20 PCIe expansion card.
I can't think of a single situation where I actually need the onboard M.2 card on the Skylake motherboards, aside from the $20 convenience.
> roughly 20% IPC improvement (5% per gen give or take)
I admit, this is a good thing. But it's very, very little, especially when you consider that the iPhone 5 to iPhone 6 jump was a 70% IPC improvement AND a battery improvement, yet many people don't consider that enough of a jump.
Soooo... FIVE years gets you +20% speed, while ONE year gets you +70% speed on phones. That's why desktops aren't getting upgraded.
> DX12_1 feature level IGP
You buy a $300+ CPU without buying a $100 GPU? The cheapest of GPUs are significantly better than IGP. Hell, if I cared about DX12_1 IGP, I'd get an AMD A10 for half the cost and twice the IGP performance with drivers that actually work on games.
Except I game in capacities that far exceed even AMD's superior IGP. I also care about adaptive sync / GSync technology, which isn't supported by Intel Iris. So I have a R9 290X. Intel's IGP doesn't even come close to a $100 GPU, let alone the midrange GPUs.
> CPUs with 128MB L4 cache which absolutely destroys chips that didn't have this for gaming
NOT on the desktop. Crystalwell is laptop-only, and 45W to boot. Compared to 20W Laptop chips, I don't see the Crystalwell L4 Cache actually being useful to the majority of people.
In fact, I don't even know of any laptops with Crystalwell that I'd recommend to anyone. Here's a challenge: name me a good laptop with Iris Pro / Crystalwell. Hint: MacBook Pros use AMD dGPUs for a reason.
And hell, we aren't even talking about laptops. We're talking about desktops, and Crystalwell is UNAVAILABLE in desktop form. It's irrelevant to the desktop user, even if you thought paying $600+ for a CPU was cost-effective (instead of buying a Sandy Bridge E5-2670 for $70 from Amazon).
Basically, you get DDR4 RAM and +20% IPC. That's all I think the last five years will actually get you. Or you can buy an 8-core, 16-thread E5-2670 for $70... hell, buy two of them, get a nice server board for $300, and have a BEAST of a machine.
The 45W i7 is a heavy burden to carry with Crystalwell. Might as well get better graphics if you're going for the 15" Pro.
You missed the irony of calling 45 watts a heavy burden. An R9 370X adds about 50 watts to your TDP by itself, along with its needless complexity. If you wanted to reduce TDP and that complexity, you could step down to the base Intel IGP. But if you're stepping up, Intel's solution makes a lot more sense.
Good luck with your overpriced Crystalwell failure. If you got actual benchmark scores to talk about, please respond to me here: https://news.ycombinator.com/item?id=11536519
But I actually know the benchmarks of everything you're talking about like the back of my hand. Your argument has no technical legs to stand on whatsoever. Don't feel bad; I'm just calling out your Bull$.
No one wants that AMD chip in their Macbook. It adds complexity both in engineering and software. There's PLENTY to talk about technically there and why that's a good idea. Not to mention Intel's best-in-class Linux support.
Yes. If you're bargain hunting for gaming hardware you should just buy a console. Or, if you're seriously suggesting to put an Intel 750 into some old system like SandyBridge.. no comment. I would never recommend someone bother doing that.
Step up to an NVMe setup and Skylake, and do it right. Skylake i5 setups can be had for cheap. You're just arguing to argue on that point, whether or not you have anything useful to add. The SB argument is common knowledge, an age-old argument with no new information or insight.
>Name me an M.2 NVMe SSD that is more cost effective than a Mushkin Reactor (1TB, MLC, maxes out the SATA6gbps bus, $200), or has faster I/O speeds than a 750.
I'm not into cost-effective bargain hunting. Anyone who would gimp a nice Intel 750 SSD on a non-NVMe system is a fool, and you've suggested it.
>The cheapest of GPUs are significantly better than IGP.
No they aren't. The point about DX12_1 IGPs is that it's there, it's modern and it has already sucked the life out of the low end space and moving into the midrange with Iris Pro. Your stance is the 2010-era view on computers. Same era as Sandybridge TBH.
>I also care about adaptive sync / GSync technology, which isn't supported by Intel Iris.
This demonstrates how much you know, and why people shouldn't listen to what you're saying. Which can be heard on any PC gaming forum a thousand times over. This is HN though and it won't fly.
Intel has already committed to FreeSync. It's incoming with Kaby Lake, and the rumor is that it may be enabled on Skylake.
>Intel's IGP doesn't even come close to a $100 GPU, let alone the midrange GPUs.
Wrong on its face. You just haven't cared to investigate recently.
>NOT on the desktop. Crystalwell is laptop-only, and 45W to boot.
Nope. The Crystalwell chips are going into NUCs from here on out. There's a 128MB L4 NUC coming in 2 1/2 weeks and a 256MB NUC coming in 12 months.
The fact that you're talking about gaming and recommending an E5-2670 for it is just silly. That might be a good machine for compiling code. But even if that's your goal, it's still a bad idea when distcc can utterly embarrass that old, power-hungry chip.
For gaming, Broadwell already demonstrated what Crystalwell adds with a standalone GPU. And it's a game-changer: it's faster than the i7-6700K. Yes, it is. And it definitely mops the floor with Sandy Bridge where it counts (99th-percentile frame times).
In 2 1/2 weeks you'll see Skylake with Crystalwell and Thunderbolt 3 absolutely delete any SandyBridge gaming rig you may have (even with an Intel 750, if you made the ridiculous decision to actually put one in SB). There might be more cost-effective ways to build a gaming rig, but if you're into saving money on hardware and gaming, buy a PS4.
I understand it's the prevailing thought among PC gaming kiddies, but holding your grip tighter on some old SandyBridge system won't change that in reality it's fallen pretty far behind in both overall platform performance and power efficiency.
The Iris Pro 5200 GT3e achieves Passmark 1,174.
If Iris Pro 580 GT4e is twice as good (Intel only claims 50% better), that's still not very good. That's utterly awful, actually.
A $100 GPU is the R7 360, just off the top of my head. http://www.newegg.com/Product/Product.aspx?Item=N82E16814125...
Exactly $99 on Newegg right now. It achieves Passmark 3,150.
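Comparing the two Passmark scores quoted above side by side:

```python
# How the quoted Passmark scores compare (numbers from the comments above).
iris_pro_5200 = 1_174   # Iris Pro 5200 GT3e (Crystalwell iGPU)
r7_360 = 3_150          # AMD R7 360, ~$99 on Newegg

ratio = r7_360 / iris_pro_5200
print(f"The $99 dGPU scores {ratio:.1f}x the Iris Pro 5200")  # 2.7x
```

Even if GT4e doubled the GT3e score, the $99 card would still come out ahead on these numbers.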
No one gives a care about a $600 Crystalwell chip that performs worse than a $100 dGPU. It's utterly awful. You'd be insane to actually recommend this product to anybody. You claim that you care about performance. Do you even look at the benchmark numbers, bro? Your claims are so far from reality that I really just don't know how I'm supposed to respond to you.
Yes, a $600 Chip. I'm assuming this, unless you can figure out a cheaper Skylake Iris Pro: http://ark.intel.com/products/93336/Intel-Core-i7-6970HQ-Pro...
EDIT: I see that you're an anti-AMD guy. Okay, whatever. That's why there's another company out there.
Nvidia GTX 750 Ti, $105 right now on Amazon. Passmark 3,686. Still utterly crushing your $600 iGPU with cheap-as-hell GPUs, no matter the brand.
Dude, I'm running a (what was at the time) high-end R9 290x, although this is more of a mid-range card now due to its age (Fury / 980 Ti). It has Passmark of 7,153, and you're seriously suggesting I "upgrade" to a Crystalwell Iris Pro that only achieves ~2000 Passmark?
PS: Skylake performing 20% faster than Sandy Bridge after five years of updates is awful.
> I'm not into cost-effective bargain hunting.
Then why the hell are you bringing up M.2? Intel 750 is the best of the best and plugs directly into PCIe. Sandy Bridge handles it just fine.
> In 2 1/2 weeks you'll see Skylake with Crystalwell and Thunderbolt 3 absolutely delete any SandyBridge gaming rig you may have
Crystalwell has to beat a $100 dGPU first, and the benchmarks say otherwise. My bet? Crystalwell fails to beat an R7 360 / Nvidia 750 Ti (Nvidia's LAST-generation BUDGET card) and I get to laugh at its worse-than-console performance numbers despite the $600+ launch price for the chip.
But hey man, show me those benchmark numbers if you disagree with my assessment.
But I have to say, anti-AMD! Yes, very perceptive as I type this on a machine with a Radeon in it. Consider the fact that other people can criticize products they have extensive knowledge with.
Judging from the rest of your response, my post went completely over your head. And quite the troll, trying to change the points I made in an attempt to "win" the argument. But it is amusing hearing some kid call Intel's 20% IPC boost from SB to SL "awful"; it shows how much you don't know. I have friends who work at Intel. Go back to PC Gamer, since you have no idea what you're talking about. Some dumb kid sees ONLY 20% alongside massive power reductions and doesn't realize that computers can do more than just play video games.
You also failed at reading comprehension and at holding a conversation. Congrats. Either way, there's no way around the fact that in 2 1/2 weeks I'll be benchmarking an R9 Fury against an i7-6770HQ with a PCIe NVMe SSD and some DDR4, and absolutely crushing any Sandy Bridge system you own.
Enjoy your old E5-2670 and Sandy Bridge with an Intel 750. What a total fruitcake. A better use of your time would be to go read about logical fallacies, as you just spent an hour typing about a strawman you created to beat on, with points I never made.
Here's what you want to hear, because you just want to argue: you're right, I'm wrong. Hope you feel better now. I'm not giving you any more help. I get it, you like your poverty gaming rig. See ya, kid.
Sure, that's the top-of-the-line $300 chip, in an era when a whole PC can be bought for $300. What if you're on a five-year-old Pentium G620?
1) you're not the sort of person who buys a new rig every two years, and
2) a $300 PC today will give you exactly the same performance as the one you bought 5 years ago: the minimal gains you get in iron are naturally offset by minor losses in software (which is now built by people with SSDs, so good luck with your little spinning disks...)
The market is now artificially segmented to such a fine level, and moving so slowly at the top, that performance simply does not "trickle down" like it used to. Add to that the move to "power efficient" CPUs (aka: less powerful overall) and you will basically see zero gains if you stick to the bottom of the market.
But yeah, it's peanuts. A 20% improvement over five years is pathetic. I'm just calling out your hyperbole, in case others didn't see it. Apple had something like a 50% improvement in a single generation of iPhones, so a 20% difference over five years is very ignorable.
SSDs and GPUs improved dramatically over the past five years. Well... more specifically... SSDs got dramatically cheaper while retaining roughly the same quality. So it's worth it to upgrade to an SSD or to get a new graphics card. But Intel doesn't have any GPU offering, and their SSDs are "enterprise" (aka overpriced). Mushkin and Crucial are better brands for consumers... even Samsung (although a bit more expensive).
A five-year-old Pentium G620 is only ~25% slower than the Skylake Pentium G4520. Both are dual-core CPUs that are cheaper than $100 aimed at the budget audience.
Frankly, the fact that AMD's Vishera FX-6300 still easily beats the Pentium G4520 in multithreaded benchmarks demonstrates the absolute lack of desktop CPU improvement. I'd only recommend the G4520 to someone who is really sure they care about single-threaded performance (i.e. gamers). Most people will appreciate the lower total cost of ownership that the FX-6300 offers at that price point.
* AMD FX-6300 Passmark: 6,342
* Modern Skylake Pentium G4520 Passmark: 4,261.
The G4520 is a $80 chip, Released October 2015. FX-6300 was AMD's 2012 entry: a FOUR year old chip, now selling for $80 to $90 at Microcenter.
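Putting the quoted scores and street prices together (prices are the comment's own quoted figures, not MSRPs):

```python
# Passmark-per-dollar for the two budget chips discussed above.
# Prices are the street prices quoted in the comment, not official MSRPs.
fx_6300 = {"passmark": 6_342, "price": 85}   # AMD FX-6300 (2012), $80-90 at Microcenter
g4520 = {"passmark": 4_261, "price": 80}     # Skylake Pentium G4520 (Oct 2015), ~$80

for name, chip in [("FX-6300", fx_6300), ("G4520", g4520)]:
    value = chip["passmark"] / chip["price"]
    print(f"{name}: {value:.0f} Passmark points per dollar")
```

At these numbers, the four-year-old AMD chip delivers meaningfully more multithreaded benchmark score per dollar than the brand-new Pentium.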
Microcenter has some $0 Motherboards if you buy an FX-6300 from them. That's the kind of benefit you get from buying "old". And since CPUs aren't really much faster, why the hell should you buy cutting edge?
Hell, why are you spending $80 on a new G4520? Facebook just decommissioned their servers. You can get a dual-socket-ready Sandy Bridge 8-core, 16-thread E5-2670 on eBay for $80, or on Amazon for $70.
Go get yourself a dual-socket, 16-core, 32-thread E5-2670 Sandy Bridge workstation, at just $80 per CPU.
Intel can't even compete against their own ghost from 5 years ago. Is it a wonder that sales are low?
I'm serious: about 80% of my day is meeting with real people in the real world. Mobile phones haven't changed that.
The other 20% of my day is sitting at my desk creating original work product (mathematical models and thoughtful memoranda) or reviewing the work product of others. Mobile phones haven't changed that, either.
No doubt the drought in PC sales is real and permanent. But I wonder how much of that is because people just don't need to keep their laptops up to date in the age of great cloud services.
As for entertainment though, are you watching Youtube extensively on your desktop?
There already is a relatively mobile tablet/desktop hybrid that works pretty great for both consumption and getting work done. It's called a laptop.
> As for entertainment though, are you watching Youtube extensively on your desktop?
Yes. I have a phone, a tablet, a desktop, and a laptop. The tablet is pretty much only used for netflix and textbooks, and the phone is for travelling. The tablet is absolutely worthless for browsing, coding, writing, or gaming; and the phone is only saved by the form factor. If I had (the space for) a TV then the tablet would be a completely unjustifiable purchase.
And yes, mobile devices have taken away entertainment share from the desktop as well as televisions.
I don't count time I spend out and about on my mobile, since that isn't time I was going to spend on my desktop anyway. (And I don't think having the option of going outside with a mobile has changed how often I do so.)
Plus, Intel provides almost all laptop chips, and I'm guessing most businesses still have either a desktop or a laptop per person, which will still get upgraded every so often. I doubt workers are going to be using iPads for data entry, although I suppose you never know.
None, since I don't want to have (and don't have or own) a portable surveillance and tracking device (also called mobile phone) in my pocket.