
Intel to Cut 12,000 Jobs, Forecast Misses Amid PC Blight - matt_wulfeck
http://www.bloomberg.com/news/articles/2016-04-19/intel-to-cutting-up-to-12-000-jobs-or-11-of-employees
======
tma-1
So Intel is cutting 11% of its workforce, Goldman Sachs just reported a 56%
drop in profits, Morgan Stanley had a 50% drop in profits, Netflix missed
subscriber growth estimates, etc. Yet the Dow just hit a 9-month high, and
the S&P 500 is now above 2100.

The whole market is overvalued, not just the tech unicorns.

~~~
djsumdog
I don't see why job losses are a bad thing. The last generation of machines
was built so well that people don't need new ones. Good! This should be a
great thing.

Valuing growth over sustainability makes these backwards ass goals of hiring
more, building more and growing. The way this is spun is pretty horrible.

Amazon and Wal-mart losing is a good thing for the environment and society as
a whole.

You yell, "But people lose their jobs..." There are more jobs, plus why does
everyone need a job anyway? Can people in Silicon Valley not make $150 ~ $250k
a year, everyone else get a minimum income and we work together to make better
art, smaller factories and a world that will last much longer by recycling and
rebuilding and not needing to buy an endless supply of shit?!

"Ending is better than mending. The more stitches, the less riches." -Brave
New World, Huxley.

~~~
teraflop
> You yell, "But people lose their jobs..." There are more jobs, plus why does
> everyone need a job anyway?

This is kind of silly. Job losses are a bad thing precisely _because_ we don't
yet live in a world where you can get by on a guaranteed minimum income
without a job.

If you think it's worthwhile to push society in that direction, then great,
but don't kid yourself that believing in it is the same as having already
accomplished it.

~~~
drewm1980
Some European countries have already achieved this: your unemployment benefits
after getting fired are a significant fraction of your previous salary, and
slowly ramp down. You pay for it in taxes (and then some), but at least you
know you can still make rent if you get canned.

~~~
ovi256
You also pay for a bureaucracy to perform administration and fight abuse. That
consumes a fraction of the system's resources. It's hard to argue that this
adds much value, or any.

What this system boils down to is a compulsory savings scheme. So an
alternative would be a voluntary savings scheme, which would have higher
efficiency by not consuming resources for administration.

~~~
jonathankoren
Of course the obvious downside to a voluntary savings plan is that it isn't
effective at actually creating savings. We know this, because in the United
States, we don't have compulsory savings plans, and thus have a voluntary
system. So how's that working out? Not well. Approximately half of all
Americans save 5% or less of their income[0], which isn't enough for emergency
situations. Another study in 2013[1] said 27% had no savings at all. Now the
main reason for this is quite simple: a lot of people don't make enough to
have anything to save.

[0] [http://money.cnn.com/2015/03/30/pf/income-saving-habits/](http://money.cnn.com/2015/03/30/pf/income-saving-habits/)

[1] [http://money.cnn.com/2013/06/24/pf/emergency-savings/](http://money.cnn.com/2013/06/24/pf/emergency-savings/)

------
tedsanders
This is what the end of Moore's Law looks like.

* Tick-tock is dead.

* 10 nm is severely delayed.

* EUV is severely delayed.

* Significant layoffs in R&D.

* The ITRS roadmap is vaguer than it's ever been.

* Giant mergers are up (Intel+Altera, KLA+Lam, etc.), concentrating the industry more than ever.

* And ultimately: A 5-year-old PC still works just fine.

When I say this is the end of Moore's Law, I'm not trying to be dogmatic. Of
course there will still be a semiconductor industry and of course there will
still be amazing technological progress. But it seems the rate of that
progress is slowing, and now the industry is adjusting.

~~~
dangero
Interesting point. If we are witnessing the end, would we recognize it at the
time? Moore's Law has been something we've become accustomed to, so it almost
seems hard to believe that it will end even if we are told that it will.

Makes me wonder what research money will be spent on instead of just faster
chips.

~~~
api
Worst case scenario: nothing. The R&D goes away. Look at aerospace for a
nightmare scenario.

In the early 20th century we got fixed wing flight.

In the teens and 20s we got motorized fighters and the first passenger planes.

In the 40s we got jets.

In the 50s we broke the sound barrier and orbited Sputnik.

In the 60s we landed on the Moon.

In the 70s we... stopped going to the Moon.

In the 80s nothing much happened except declassification of a few things
(stealth) that were developed in the 60s and 70s.

In the 2000s we grounded the Concorde. Passenger flight got _slower_ and _more
expensive_.

In 2016 we fly on passenger planes no faster than what we used in the 70s and
80s, and we're stuck in low Earth orbit.

1969 was the peak of the aerospace industry. With the exception of SpaceX
(which is really just picking up where NASA left off), we are _less advanced_
today than we were in the 1960s.

There are many places we _could_ go beyond conventional Moore's Law: multi-
dimensional chips, optical, quantum, exotic materials with very low power
consumption, etc. But if what we have is "good enough" and there is little
demand for anything faster, the R&D dollars won't be spent. If anything the
shift toward mobile computing and wimpy thin client endpoint devices might
actually lead to a pull-back and loss of capability similar to the one we saw
in aerospace after the 70s.

The consolidation we are seeing is not a good sign. This is what happens when
an industry decides it's now a cash cow and it's time to go out to pasture.

We also _could_ have a base on the Moon and Mars right now and be working on
our first interstellar probe to Alpha Centauri. Physics didn't stop us.
Economics and politics did.

~~~
discodave
That's not entirely wrong, but you're ignoring the massive price difference
for air travel between the 60s and "slower" 2016.

The price drop in air travel can be attributed to:

1. Technology: cheaper (per seat), more efficient planes.

2. Consolidation of airlines (and airplane manufacturers).

3. The disruption of full-service airlines by low-cost carriers.

In semiconductors we have been getting along with 1 and a bit of 2. I would
say that the disruption of Intel by ARM is an example of 3, because Intel is
not incentivized to compete at those low price points.

~~~
hga
_3. The disruption of full-service airlines by low-cost carriers._

In the US at least, this was due to political deregulation. The process was
started by Nixon but finished in law by Jimmy Carter (!). Note also the
leading lights of the Democratic Party in the signing picture, e.g. Teddy
Kennedy, second from right:
[https://en.wikipedia.org/wiki/Airline_Deregulation_Act](https://en.wikipedia.org/wiki/Airline_Deregulation_Act)

------
anilshanbhag
One must understand that this is restructuring. Intel is probably letting go
of some divisions it no longer intends to pursue. In the not-so-distant past,
Microsoft did an internal restructuring when Satya Nadella became CEO. It
isn't as bad as it is portrayed to be, as most people end up getting re-hired
in other groups or take the severance and join a new company.

~~~
theandrewbailey
News like this out of the blue is strange. I was pretty sure that the post-PC
era was a sham. It still is, right?

~~~
5ilv3r
How much of your time is now on a mobile that used to be on a desktop? Their
core business is drying up.

~~~
malchow
Am I the only person for whom the answer to this question is "almost none?"

I'm serious: about 80% of my day is meeting with real people in the real
world. Mobile phones haven't changed that.

The other 20% of my day is sitting at my desk creating original work product
(mathematical models and thoughtful memoranda) or reviewing the work product
of others. Mobile phones haven't changed that, either.

No doubt the drought in PC sales is real and permanent. But I wonder how much
of that is because people just don't need to keep their laptops up to date in
the age of great cloud services.

~~~
bduerst
Yeah, mobile hasn't figured out a good way to take over the workspace. Some of
the tablet/laptop hybrids are getting closer.

As for entertainment though, are you watching YouTube extensively on your
desktop?

~~~
duderific
I know this sounds crazy, but there are still some people who have cable
subscriptions and watch TV on a TV. Oh the horror!

~~~
matwood
Those TVs are now 'smart' along with cable boxes and other peripherals. Is
Intel Inside any of those?

------
robertelder
I'm not sure how much of an effect this would have, but from a consumer's
point of view, there isn't as much of a reason to buy a new PC every few years
anymore. The laptop I'm using right now was purchased in 2011, and the prices
in stores now are very comparable to what I paid back then. I've made up my
mind to buy a new one a few times, but then when I go to the store, it just
isn't worth it.

~~~
mark-r
Moore's law may become irrelevant long before it becomes invalid. Without
rapid increases in usable computing power, the reasons for upgrading have
largely disappeared. In particular the chips at the top of the profit margin
curve, where Intel loves to play, are much less compelling.

Intel might eventually find itself in the same shoes Kodak did - when your
primary business dries up, there's unlikely to be a follow-on that is
successful enough to keep the company going.

~~~
noir_lord
I'm not convinced that Moore's law has ended (using the "processing power
doubles" formulation rather than "transistor count doubles").

We've hit the physical limits of our current processes, but we haven't
remotely hit the physical limits of... well, physics.

It's possible that we are on the flat before the next curve starts. Whether
that curve will reach the consumer space I don't know, but at the top end of
computing (supercomputers) there is still massive demand for more processing
power.

~~~
prewett
What makes you think we haven't started to hit physical limits of physics? A
4nm transistor is 7 atoms wide. Obviously, a one-atom transistor is pretty
much the absolute smallest possible, which would be about 0.5nm. Anything less
than 7nm experiences quantum tunneling, which is going to really mess things
up, and will get worse the smaller the size. [1] I imagine the interference
between transistors will be fun...

Next, how exactly are you going to make these atom-sized objects in quantity?
You can't be manually placing atoms. But lithography has a hard physical
diffraction limit, forcing the use of higher frequency light. The higher the
frequency, the greater the energy. Etching single-atom features will require
hard x-rays, which we may not even have the technology to generate. Even if we do,
focusing x-rays is not trivial, as you can't just slap in a glass lens. Plus,
the focal lengths may be rather long. And what kind of photoresist do you use?
Hard x-rays are likely to blast pretty much anything, and the etching
characteristics are probably not neat little troughs. How do you ensure you
have your one atom stay put when everything around it is being blasted by
high-energy x-rays?

I think the physical limits are looming pretty large.

[1]
[https://en.m.wikipedia.org/wiki/5_nanometer](https://en.m.wikipedia.org/wiki/5_nanometer)
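As a rough sanity check on these numbers, here is a back-of-the-envelope sketch. The only inputs are the silicon lattice constant (~0.543 nm) and the photon-energy relation E = hc/λ; the specific feature and wavelength sizes are illustrative:

```python
# Back-of-the-envelope numbers for the scaling limits discussed above.
SI_LATTICE_NM = 0.543   # silicon lattice constant, nm
HC_EV_NM = 1239.84      # h*c in eV*nm, so photon energy E = HC_EV_NM / wavelength

def atoms_across(feature_nm):
    """Approximate number of silicon lattice cells spanning a feature."""
    return feature_nm / SI_LATTICE_NM

def photon_energy_ev(wavelength_nm):
    """Photon energy (eV) of light at a given wavelength."""
    return HC_EV_NM / wavelength_nm

print(f"4 nm feature ~ {atoms_across(4):.1f} lattice cells across")
print(f"0.5 nm light ~ {photon_energy_ev(0.5):.0f} eV (hard x-ray territory)")
print(f"13.5 nm EUV  ~ {photon_energy_ev(13.5):.0f} eV")
```

A 4 nm feature spans only about seven lattice cells, consistent with the figure quoted above, and light with an atom-scale wavelength is already in the multi-keV hard x-ray range.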

~~~
noir_lord
Maybe, but (and I could be way off, I'm a programmer, not a physicist) those
limits are pretty much the limits of our current technology. It's a bit like
saying "well, we've reached the point of diminishing returns with this steam
engine, that's it, no more progress" while over in a shed somewhere else
someone is inventing the AC electric motor.

I'm optimistic because I've seen "end of progress" reports on computing power
since I was a kid in the 80s.

We are a long way away from this:
[https://en.wikipedia.org/wiki/Bremermann's_limit](https://en.wikipedia.org/wiki/Bremermann's_limit)
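For a sense of scale, Bremermann's limit works out to roughly 1.36 x 10^50 bits per second per kilogram of matter; a quick sketch of the arithmetic (constants rounded):

```python
# Bremermann's limit: the maximum computational rate of a self-contained
# system of mass m is roughly m * c^2 / h bits per second.
C = 2.998e8      # speed of light, m/s
H = 6.626e-34    # Planck constant, J*s

def bremermann_bits_per_sec(mass_kg):
    return mass_kg * C ** 2 / H

print(f"{bremermann_bits_per_sec(1.0):.2e} bits/s for 1 kg")  # ~1.36e+50
```

Even a generous estimate of today's chips leaves dozens of orders of magnitude of headroom below this bound, which is the point being made here.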

We've known for ages that germanium and similar materials offer better
characteristics than silicon, but the cost of improving silicon has stayed
below the cost of retooling for germanium until now. If we do hit the limit at
5nm silicon, then the cost equation changes and alternate materials become
worth the investment.

~~~
nebusoft
They already use germanium in their manufacturing processes. It's not all
germanium because of the challenges that would create. There are also benefits
to mixing different-sized atoms into the same lattice to affect the speed at
which the electrons travel, so modern manufacturing often implants germanium
atoms into the silicon lattice. I do agree that material improvements can
change everything, so the oft-spouted doom and gloom of stagnation is often
unfounded.

------
aluminussoma
A good friend was laid off from Intel during their last round of layoffs.
Without going into details that would identify my friend, he was laid off
after losing a game of internal politics. His division's vice president even
apologized to him for his layoff.

My friend's story gave me the impression of Intel being a highly dysfunctional
company. My friend was sad to leave Intel, but I think it was good for him in
the long run.

For those about to be laid off from Intel, I hope it also works out for you.

~~~
mmagin
Are there companies with >100,000 employees who are not highly dysfunctional?

~~~
sounds
Number of employees as a metric for dysfunction? Why not number of court
injunctions? Number of union strikes?

I mean, if we're getting serious, we could write a dysfunction function, D =
dysfunctional points:

      D = (points for threat posed by dysfunctional government)
        + 4*(points for direct environmental harm done by company)
        + 2*(points for environmental harm caused by subcontractors)
        + (points for each lobbying dollar spent)
        + 2*(points for frivolous or non-FRAND patent lawsuits)
        + 10*(points for fighting against a free and open internet)
        + (points for trying to track users / violate privacy)
        + 10*(points for getting hacked, releasing users' private data)
    

etc.
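Half-seriously, the scoring above could be turned into a runnable sketch. The field names and weights are lifted from the list; the dictionary structure is my own illustrative assumption:

```python
# Illustrative sketch of the tongue-in-cheek "dysfunction function" above.
# Keys and weights mirror the comment; everything here is made up.
WEIGHTS = {
    "dysfunctional_governance": 1,
    "direct_environmental_harm": 4,
    "subcontractor_environmental_harm": 2,
    "lobbying_dollars": 1,
    "frivolous_patent_lawsuits": 2,
    "fighting_open_internet": 10,
    "privacy_violations": 1,
    "data_breaches": 10,
}

def dysfunction(points):
    """Weighted sum of a company's dysfunction points."""
    return sum(WEIGHTS[k] * v for k, v in points.items())

print(dysfunction({"lobbying_dollars": 3, "data_breaches": 1}))  # 13
```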

------
wINfo
There's an irrational belief system at hardware companies about Windows that
clouds their logic and leads to a sense of separation anxiety. Intel lost the
opportunity to get a lead in mobile because they viewed the Linux/UNIX client
device category as a side business rather than making it their core platform.
They continued to invest substantial resources in supporting Windows 8 and
Windows 10 despite the obvious reality that Windows is a dead-end platform
with no growth prospects. It isn't just Intel; nVidia/AMD lost the mobile
space for the same reason: too much Windows, not enough Linux/UNIX.

The Windows ecosystem has become corrosive to any industry or company it
touches. We now see that the end result of supporting a closed-source legacy
platform is 12,000 jobs lost at Intel, due to the lack of excitement and
innovation in the PC space. Perhaps Linux will revive the PC market, but in
the meantime Intel and their peers at nVidia/AMD have done little to make that
a reality in the mainstream sense.

~~~
WayneBro
Every single thing you said is complete hogwash and you have provided zero
evidence to backup any of your poorly formed opinions.

To begin with, Intel didn't lose an opportunity to get a lead in mobile
because of their views on Linux/UNIX. None of the players today have a "lead"
in mobile because of Unix. They have a lead because of the touch-based front-
end. It wouldn't have mattered what that was running on.

Furthermore, Windows is responsible for _creating_ numerous multi-billion
dollar industries and Windows obviously holds a ton of value for businesses
that Linux can't even compete with. That's why just about every business runs
Windows and not Linux.

The willful obliviousness of nix fanboys like you is boring.

~~~
dang
> _Every single thing you said is complete hogwash [...] obliviousness of nix
> fanboys like you_

This comment violates the HN guidelines egregiously. We ban accounts that do
this, so please don't do it again.

Proper use of this site means commenting civilly and substantively, or not at
all. Please read the rules and follow them:

[https://news.ycombinator.com/newsguidelines.html](https://news.ycombinator.com/newsguidelines.html)

[https://news.ycombinator.com/newswelcome.html](https://news.ycombinator.com/newswelcome.html)

------
tomohawk
H1B use is up, though.

[http://dailycaller.com/2016/04/19/intel-lays-off-12000-people-after-lobbying-for-more-foreign-workers/](http://dailycaller.com/2016/04/19/intel-lays-off-12000-people-after-lobbying-for-more-foreign-workers/)

~~~
kevin_thibedeau
Last I checked they were the #1 sponsor for EEs/Computer Engineers. It's
appalling that Congress looks the other way when abuse is clearly evident.

~~~
azaxacavaba
When you pay exactly the same money and benefits, and sometimes even more as a
joining bonus and stock awards, for an H1B employee, how can this be an abuse
of the system?

~~~
adrenalinelol
Making the argument "we can't fill all these open positions, please let us
bring in foreign workers" and then firing 10,000 people contradicts their case
for having the government deregulate the system.

~~~
dingo_bat
No, it just means none of the 10k people were qualified enough. They were not
fit for the open positions. Otherwise there is no sense in hiring an H1B at
the same salary plus all the H1B-related fees and expenses.

~~~
zmmmmm
> Otherwise there is no sense in hiring an H1B with the same salary plus all
> the H1B related fees and expenses

I am sure some employers will perceive them as more compliant employees, given
their visa status depends on their continued employment.

~~~
trustfundbaby
If you're looking for compliant employees, why go through the trouble of
hiring an H1B employee, bringing them over from a different country, paying
all the fees associated with processing visas and probably a green card
process, and paying them the same or higher (usually; the outsourcing body
shops are generally the places that pay below market for H1Bs)... if you can
just find an American to do the same thing?

Hell when you get rid of an h1b worker there are even costs associated with
revoking their visa and possibly paying for their flight back!

~~~
lmm
> If you're looking for compliant employees, why go through the trouble of
> hiring an H1B employee, bringing them over from a different country, paying
> all the fees associated with processing visas and probably a green card
> process, and paying them the same or higher (usually; the outsourcing body
> shops are generally the places that pay below market for H1Bs)... if you
> can just find an American to do the same thing?

The argument is that you can't, because American workers know their rights -
rights that every employee should have and stand on, but sadly those from
other countries don't always know about.

------
mikeyouse
Holy bloodbath..

From CNBC:

> _Shares of Intel were halted after the bell Tuesday as Intel announced it
> would cut 12,000 jobs, or 11 percent of its workforce_

> _The technology company also said the CFO would step down_

~~~
wolf_cook
From CNBC: "The technology company also said the chief financial officer Stacy
Smith would leave that role to lead sales."

~~~
shostack
Hmmm...CFO heading up sales is not something you see every day...

Anyone have any further context around this?

~~~
CamperBob2
Lateral moves, even downward ones, are part of the corporate culture at Intel.
At most large companies this type of job switching at the C-level would come
as a surprise, but if you read some of the stuff Andy Grove and others have
written about Intel's "matrix" practices, it's not too shocking.

Basically, they believe that experience across multiple areas of the business
is a good thing, and that no one should become too attached to a particular
position or title.

------
hiram112
Intel ranks #14 in H1B use and was a large proponent of the bill to increase
the allowance for foreign tech workers.

So I'm assuming their inability to find 'talent' is no longer an issue? Same
as Microsoft, IBM, and the numerous other big corps that have had massive
layoffs recently, while also claiming an inability to find enough US tech
workers?

~~~
lgessler
You're assuming that the 11% is coming from roles that are so hard to hire for
that it becomes more economical to look overseas for the talent -- is that
true?

In other words, I think it's just worth pointing out that Intel's a huge
company and it might be that these layoffs are coming from roles where market
demand is lower.

~~~
elliotec
I think this is spot on. A lot of companies I know that are doing layoffs are
keeping their hands entirely off developers, except maybe the occasional break
with a contracting company.

------
OliverJones
For years Intel got to ride the rocketship of routine chip-density doubling.
For years the new chips were so much superior to the old ones that it paid to
replace them and the machines into which they were built.

Now, not so much. I can put a SSD and more RAM in my eight-year-old laptop and
make it work just about as well as a new one.

I can switch off the old HP DL380/G5 boxes in my colo, hand them over to the
steel recycling guy, move the data to some cloud service, and come out ahead
on the electricity bill vs. the cloud bill. I'm not buying many processor
chips anymore. Neither is anybody else, except maybe the cloud services. And
their bargaining power makes Dell and HP look like the guys in the white-box
computer shop down the street.

The processor chip rocket ship has entered orbit; its occupants are now in
microgravity. Some other rocket ship will be the next big ride.

It's too bad those folks are out of work. It's too bad plutocrats always
behave as if les bons temps rouleront toujours.

~~~
teh_klev
> I can put a SSD and more RAM in my eight-year-old laptop and make it work
> just about as well as a new one.

I did exactly that with an old i3 based all-in-one Sony Vaio PC my dad handed
down to me. It previously had a 3.5" 5400rpm spinner and 4GB of RAM and would
take a month of Sundays to boot. I installed a 250GB SSD and another 4GB of
RAM and it totally transformed the machine. All it cost was GBP62.00 (I got an
amazing Black Friday deal on both SSD and memory). Hell, the thing can even
run Visual Studio 2015 and a couple of CentOS VMs on VirtualBox and still
feel quite responsive.

------
spriggan3
Didn't they spend $300 million on a diversity initiative recently to hire
women and minorities? Who are they firing now?

[http://fortune.com/2015/01/12/intel-diversity/](http://fortune.com/2015/01/12/intel-diversity/)

------
wrong_variable
Does anyone working within Intel know the reasoning?

It seems Skylake is doing really well.

Is it mostly electrical engineers working on the processors, or sales and
marketing people?

~~~
intelthrow
I hope it's the hardware side of the company waking up that the software and
"solutions" side of the company is a gangrenous fifth limb.

Things like trying to start an app store when they didn't have content (e.g.
Steam) or a platform (Google, Apple, Microsoft); trying to pivot that
abomination into a media store after Netflix had already won and the studios
were all trying to start their own; trying for way too long to deliver
"solutions" by putting parts in verticals (e.g. tablets, micros) where they
are at best competitive on only one of price, power, or performance, then
immediately face-planting, yet limping along seemingly oblivious to why any of
these projects fail.

The mention of IoT though makes me think this isn't the case. Definitely part
of the brain-dead trendy "solutions" thinking. I don't think you're going to
compete with commodity cheap micros with the best IP or the best process
size... price is king.

There are too many layers at Intel, and that leaves too many of those layers
too well insulated, able to believe it was the brand that won them their
markets (not the economics: Wintel monopoly in personal computing history,
having the best server and laptop parts currently), and that despite the
economics of new products and "solutions", the brand will somehow convince
people to buy in.

Like, why do I ever see you on Television, Intel? Who is the audience for
these ads, potential investors? Shouldn't they rather see higher dividends?
People don't go to the mall and buy Intel chips, they go buy Macbooks and they
use Instagram. Facebook and Apple buy your chips at a scale where it's all
ROI, and you deliver that, so what are the ads for?

This shit is why I left, and why I always did the quick sale on the stock
purchase plan.

Intel needs to just keep making the best parts and leave it to the market to
conjure up asinine things to do with them. Kill all the product lines where
there's no road to being #1 ROI within 10 years, keep those in the lab.

~~~
makomk
I'd still like to know whose idea it was to try and compete in the Internet of
Things market by pitting a buggy warmed-over 486 (the Quark) against ARM's
best, most modern chip designs. Sure, Intel got a bunch of headlines in the
tech press about their new IoT solution, but the technical details just didn't
add up. The performance and power figures were dire, it couldn't run existing
x86 code, it looked like a bear to integrate into anything, it just made no
sense.

~~~
petra
Intel's latest Quark has gotten to 1.5µA sleep current with SRAM retention.
That's not the best, but it's good enough for many applications. And it's
using 22nm, so it may be the cost leader, by far. And that's mostly what
counts in embedded.

------
mixedbit
What Intel needs is some reason for people to do compute-intensive tasks on
their computers. Today games are probably the only popular type of app that
requires abundant CPU power; for most other popular activities CPU usage is
low, which doesn't motivate hardware upgrades. Maybe VR can change this?

~~~
blakeyrat
Games have been GPU-bound for years and years. Even in 2011 when it came out,
Skyrim used (at most) 30-40% of CPU power. Fallout 4 (updated version of the
same engine) uses proportionally far _less_.

In short, no, gamers aren't buying new CPUs every 18 months like they used to,
either. GPUs, maybe, but Intel's weak in the gaming GPU space.

------
mattbillenstein
Large enterprise companies like this can lay off 10% anytime they want with
little effect on output -- there are oodles of people slacking in big
corporations basically not doing much of anything except collecting a
paycheck.

~~~
aluminussoma
And do you think these same companies are just as efficient as identifying
who's slacking off? I don't think so.

They may know that 10% of the company is slacking off, but they often end up
amputating a hand when they meant to just trim the fingernails.

~~~
michaelt
A cynical manager who knows his employer periodically clears out 10% dead
weight would keep some dead weight around, so they could deliver the cut
without losing the guys they need.

------
xienze
> “It’s acknowledging the reality that it’s a single-digit growth world,”

Oh, the horror.

------
JamilD
I wonder if this is related at all to the Altera acquisition, and how many of
those employees will be affected. Intel had to give up a lot of cash, and take
on quite a few redundant employees…

~~~
spriggan3
Didn't they acquire McAfee too? How did that ever make sense for a hardware
company?

~~~
takno
Since when did acquiring McAfee ever make sense for anybody?

~~~
maaku
It makes sense for McAfee shareholders...

------
randomname2
- Cuts outlook: sees revenue up mid-single digits, down from prior outlook of
mid- to high-single digits

- Also cuts full-year margin guidance: sees 62%, down from 63% before

- Generated $4 BN in cash from operations, of which it spent $1.2 BN on
dividends, $793MM on buybacks, and saved the rest for severance

- Notable difference in GAAP vs non-GAAP: GAAP net income $2.046BN (missing
expectations), non-GAAP net income $2.629BN

------
rubicon33
Am I the only one with a sinking, gut instinct, that this is indicative of a
deeper problem in the economy? It feels like a crash is lurking... Maybe I'm
just paranoid?

~~~
jcslzr
Interest at zero means that you are stepping on the gas pedal and the car is
hardly moving, so of course a crash is coming, a big one.

~~~
UK-AL
It's been like that since the last crash.

------
rdl
If they're cutting entire divisions, there will be a great mix of strong
performers in there -- great for other companies who are hiring.

I'd always love to talk to Intel people from the hardware security projects
(SGX, etc.).

------
neverminder
I really hope this is not going to affect their already ever-lagging release
schedule. If I'm not getting Kaby Lake in Q3 this year as promised, I will be
severely disappointed.

~~~
creshal
Kaby Lake in itself is already a disappointment. Intel must have _lots_ of
problems with 10nm production if they're willing to slip a full product
generation.

~~~
neverminder
I wouldn't call it a disappointment. It will add native USB 3.1 (Type-C)
support, which is a major thing for me, considering that USB Type-C is
projected to become the most popular socket in the history of humanity. Also:
support for Intel Optane
([https://en.wikipedia.org/wiki/3D_XPoint](https://en.wikipedia.org/wiki/3D_XPoint)),
Thunderbolt 3, HEVC, etc.
([https://en.wikipedia.org/wiki/Kaby_Lake](https://en.wikipedia.org/wiki/Kaby_Lake)).

~~~
Unklejoe
I thought Type-C was just a connector. How is this supported (or not
supported) by the SoC? There are USB 2.0 devices with a Type-C connector. This
is more confusing than it should be for me...

~~~
lucaspiller
Skylake supports all these, but needs a separate chip on the motherboard. With
Kaby Lake it'll be built into the chip. That should hopefully mean these
connections aren't restricted to just high-end laptops.

------
yeukhon
Intel's IT department is so big it was hard for me to believe it. On the other
hand, I am pretty amazed that they were so serious about keeping internal
application communication encrypted end to end, at every layer.

~~~
a3n
They're likely a very juicy target for economic espionage.

------
ashitlerferad
Hope some of those folks are chip designers and they decide to go work on the
RISC-V/lowRISC projects.

------
qaq
Maybe it's time to innovate a bit? I have zero reasons to upgrade my
6-year-old Mac Pro; I would need to spend $5K to get a mild performance boost.

~~~
superkuh
I agree but that's a bad example. Macs are notoriously expensive for the
performance they deliver.

~~~
qaq
That's a stereotype from a time long gone. The difference in price between a
Mac Pro and an identically configured Dell workstation is minimal. On the low
end, yes, there is a price difference. On the mid tier, the 27" 5K iMac costs
about the same as Dell's 5K monitor without the computer (they use the
identical LCD).

------
ArtDev
I bet VR will lead to more PC sales in the coming years.

~~~
datashaman
There's this thing called the stock market where bets like this are quite
common. If you're certain, now is the time... :)

------
mtgx
> The company said it’s shifting focus to higher-growth areas, such as chips
> for data center machines and connected devices.

Where have I heard this before? I think in a little book called _The
Innovator's Dilemma_. Can anyone predict what happens next?

I wonder if Intel will try to push Atom into "Core i3" and make _single-core_
Core i7's next to "increase profitability". They've already started making
dual-core Core i7s - I mean how ridiculous is _that_ idea?! Isn't a dual core
Core i7 supposed to be a Core i5? Do their brands still mean anything anymore?

------
tmaly
Intel is in a very cyclical industry. They over hire when the economy is doing
very well, and they cut when things are not. I got a pink slip from Intel back
in 2001 just after I graduated college when the dotcom bust happened.

------
mparramon
"Robots will soon begin taking human jobs in places like retail stores, fast
food restaurants, construction sites and transportation. The key technology
that will fuel the transition is inexpensive computer vision systems, and the
number of human jobs at risk numbers in the tens of millions. More than half
of the jobs in the United States could be eliminated."

[http://www.amazon.es/Manna-Visions-Humanitys-Future-
English-...](http://www.amazon.es/Manna-Visions-Humanitys-Future-English-
ebook/dp/B007HQH67U)

------
cmurf
Makes sense. My 5 year old laptop has an i7-2820QM (8 vCPUs, 4600 bogomips,
45W) and is plenty fast for the stuff I do. My recent upgrade was a NUC with a
Pentium N3700: 4 real cores, 3200 bogomips, 4W. Pretty impressive.

------
IBM
I think if there's any time Apple would switch to their own ARM designed chips
for Macs, it's now. This along with Intel slowing down from their Tick-Tock
schedule will probably do it.

~~~
billyhoffman
Why? The PowerPC->x86 jump was made largely because of the need for more
performance from the chips (raw power) and better performance per watt (for
laptops). Intel wooed Apple on what they had available at the time as well as
the impressive roadmap of where they were going. I often get 10 hours of
battery life on my MacBook Pro, so they did something right!

I totally think Apple will go to ARM, but I can't see Apple making the shift
until OS X gets similar performance on ARM as on x86. When Jobs introduced the
x86 switch, he said they had had an internal version of OS X running on x86
since day one. I'm sure they have a version on ARM right now as well, and are
just waiting for it to get "good enough". Apple's marketing on recent iOS
devices has also been very interesting: lots of use of the phrase "desktop
class". The Geekbench scores and AnandTech teardowns also support the hype:
Apple's ARM chips truly are best-in-class and rapidly approaching mainstream
x86 performance.

~~~
IBM
I think it's obvious why and you've stated part of it. Apple's designs are
rapidly approaching the level of Intel's and Intel is stretching out their R&D
and reducing their investment in this area. So Apple will be asking
themselves, why give away all that margin to Intel when we can do it ourselves
and keep it? (Intel's gross margin is 62%, no idea what it is for their
consumer processors specifically).

~~~
billyhoffman
I misunderstood you then. I read your post as "Intel in financial trouble?
This will make Apple leave them."

------
manav
While there are obvious market valuation issues at play, I think this signals
more about Intel and its future strategy. The layoffs come shortly after
their $16.7 billion acquisition of Altera was completed.

Intel really missed out on mobile and with PC sales rapidly declining it looks
like they are going to refocus on enterprise and data centers. ARM and
NVIDIA/GPU computing are also expanding rapidly in those areas and that will
pose a major threat to Intel.

------
seeing
I wonder how CEOs feel about their bottom line when they're off by such a big
percentage in the workforce they need.

Couldn't they have predicted this sooner?

------
sxcurry
As you read these comments, I'll repeat mine from an earlier post: Don't come
to HN for legal, medical, or economic advice!

------
nxzero
Being in hardware is getting harder and harder.

Increasingly, hardware's footprint is shrinking, with more and more of it
replaced by software.

It is time to push to make open-source hardware mainstream, from the point
where power comes on to where software picks up.

There are many, many really good reasons to do this, but in the end, to me, it
will define how free the world is.

------
testpass
This will have a pretty bad effect on the company's morale. Depending on how
long the layoffs drag on (one month vs. one year), everyone will feel
extremely uneasy going into work, knowing today could be their last day.

------
dman
Wondering if Intel is moving out of some markets completely. Mobile chips
(phones/tablets) is the one market that comes to mind; the margins on those
are currently almost nonexistent (on the low/medium end).

------
mixmastamyk
How competitive is Intel in mobile chips these days? Have they been improving?

~~~
_yosefk
I think the trouble with application processors is that they command lower
prices and margins than desktop and server chips, so even if Intel does great
there, it's not that great financially. Intel is a relatively high-margin
business, so it either needs to find markets where that works, grow volume
very drastically (which is hard given their already very high volumes), or
shrink.

------
kevin_thibedeau
Guess they should have kept going with StrongARM/XScale. They could've had a
mature, trusted product line by now.

------
known
Unlike Capitalism, Globalization is Zero-sum

------
jmount
Almost exactly decimation.

~~~
logfromblammo
Closer to nonation, technically, but decimation is the word everyone already
knows.
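
The arithmetic checks out as a rough sketch, assuming a headcount of roughly
107,000, which is what the reported 12,000 ≈ 11% figure implies:

```python
# Rough check of the decimation vs. nonation quip. The 107,000
# headcount is an assumption implied by the reported figures
# (12,000 cuts described as ~11% of employees).
workforce = 107_000
cut = 12_000

fraction = cut / workforce   # ~0.112
decimation = 1 / 10          # 0.100: remove every tenth
nonation = 1 / 9             # ~0.111: remove every ninth

print(f"fraction cut: {fraction:.3f}")
print(f"closer to 1/9? {abs(fraction - nonation) < abs(fraction - decimation)}")
```

At roughly 11.2%, the cut is indeed nearer one in nine than one in ten.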

~~~
msellout
"decimation" is also the word with meaning. "nonation" is just turning
"nineth" into a verb and then somewhat of a gerund, like "ninething". Without
the historical context, it has little meaning.

~~~
logfromblammo
Yet that is exactly the pattern used by decimation.

    
    
      decimus (tenth) + -ate (convert to verb) = decimate
      decimus (tenth) + -ation (resulting state of -ate form) = decimation
    

It could be used with any Latin-rooted ordinal number. Primate, secundate,
tertiate, quartate, quintate, sextate, septimate, octavate, nonate, decimate,
undecimate, dodecimate.
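
The root-plus-suffix pattern above is regular enough to mechanize; a
throwaway sketch, using only the stems from the examples in this thread:

```python
# Sketch of the Latin-ordinal pattern described above: ordinal
# stem + "-ate" (verb) or "-ation" (resulting state of the verb).
STEMS = ["decim", "non", "undecim"]  # tenth, ninth, eleventh

def ate(stem: str) -> str:
    """stem + -ate: convert the ordinal root to a verb."""
    return stem + "ate"

def ation(stem: str) -> str:
    """stem + -ation: the state resulting from the -ate form."""
    return stem + "ation"

for stem in STEMS:
    print(ate(stem), ation(stem))
# decimate decimation
# nonate nonation
# undecimate undecimation
```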

Primate, secundate, tertiate, and octavate have meanings other than either
"reduce by 1/Nth" or "reduce to 1/Nth". Tertiate is actually roughly
equivalent to "threepeat". Those words have about as much inherent meaning as
"tenthing", but some had meanings extrapolated from "decimate" as early as 200
years ago. The "reduce to 1/Nth" meanings would be more accurately
encapsulated with the "an-" (to) prefix, as in "annihilate": literally "to
nothing (verb)". That would give you antertiate, anquartate, anquintate, et
cetera for the reciprocal-valued words.

As I said, "decimate" is the most well-known of the bunch. "Annihilate" is
also well known. But every last one of them is a perfectly acceptable English
combination of Latin roots, and should be understood easily enough.

~~~
msellout
> should be understood easily enough

I think you'll find you have an easier time communicating if you stick to
terms that people have heard before, even if they're slightly less accurate
than new creations.

Of course, if you're trying to get famous, you should absolutely invent new
words. Take Thomas Friedman's "glocalization". That one didn't do so well.
Stephen Colbert's "truthiness", though, was a gem.

~~~
logfromblammo
I refuse to participate in the evisceration of English vocabulary. If you
can't guess the meaning of an unfamiliar English word from context, your
working vocabulary simply isn't large enough. A staggeringly large quantity of
English words are derived from common elements filched or inherited from
Greek, Latin, Old Norse, Anglo-Saxon, and Norman.

Knowing the root words is sufficient both to guess the meaning of many of the
600k+ English words that are in the dictionary but not typically used, and to
formulate new, English-sounding words with intuitively obvious meanings. This
is similar to knowing the IUPAC chemical naming rules, which allow chemists to
name a molecule in such a way that other chemists will know how it is
structured, just by reading the name. (Z)-Hex-3-en-1-ol, for instance: hex
means the longest carbon chain is 6 atoms long, 3-en means there is a double
bond following the 3rd carbon, (Z) means it is in _cis_ configuration, and
1-ol means there is an -OH alcohol group on the first carbon in place of a
hydrogen. English is less formally structured, but it still has rules.

Truthiness = truth + -y (similar to) + -ness (quality of being)

Therefore, truthiness is the quality of being similar to truth, which is as
Colbert's character described it. It follows the rules. It reads like an
English word with unambiguous meaning, and is therefore adopted as though _it
already was an English word_.

Glocalization = global + local + -ize (convert to verb) + -ation (state
resulting from the verb action)

This smashes two dissimilar words (global, local) into one portmanteau that
has ambiguous, unclear meaning (glocal), and then tries to extend it with
regular English suffixes. Portmanteaus are less readily adopted without
literary backup. Dodgson's _frumious_ (furious + fuming) never would have made
it without the Bandersnatch, and _slithy_ (slimy + lithe) required a bit of
explanation from Humpty Dumpty. Needless to say, Dodgson was much better at it
than Friedman.

This is why I like to say that people who know English well are
sesquilingual: you need to know a little bit of several other languages to
know that many words.

------
mtgx
Right now, the _only_ content in this article is "Developing...".

~~~
deanstag
I am actually curious. Did everyone who upvoted the title post not click on
the link at all?

~~~
RyJones
clearly, no.

~~~
cpeterso
HN's scoring algorithm should ignore or discount upvotes if the user did not
click on the article link before voting. For extra credit, HN could also watch
for page visibility changes that would indicate that the user, in addition to
opening the article, also switched to the article tab. :)
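
As a hypothetical sketch of what that discounting could look like (all names
and weights here are invented for illustration; HN's actual ranking code is
not public):

```python
# Hypothetical vote-weighting scheme for the suggestion above:
# ignore upvotes cast without opening the article, and discount
# ones where the opened tab was never switched to. The weights
# are arbitrary illustrative choices.
def vote_weight(clicked_link: bool, viewed_tab: bool) -> float:
    if not clicked_link:
        return 0.0   # voter never opened the article: ignore
    if not viewed_tab:
        return 0.5   # opened but never looked: discount
    return 1.0       # full-weight vote

def story_score(votes) -> float:
    """votes: iterable of (clicked_link, viewed_tab) pairs."""
    return sum(vote_weight(c, v) for c, v in votes)

print(story_score([(True, True), (True, False), (False, False)]))  # 1.5
```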

~~~
a3n
I sometimes read an article at work or home, and then later read the comments
and upvote later at home or work. Don't assume that everyone operates the
machine the same way you do, nor with the same goals.

------
vicinno
Bubble? Is the next crisis coming?

------
zump
GoodBYE!

------
angersock
Oh man, they're even better than AMD at _laying off people_.

