
One must understand that this is restructuring. Intel is probably letting go of some divisions it no longer intends to pursue. In the not-so-distant past, Microsoft did an internal restructuring when Satya Nadella became CEO. It isn't as bad as it's portrayed to be: most people end up getting re-hired in other groups, or take the severance and join a new company.

and I am not down-voting you, I am re-structuring your karma.

Yes, this may be good for the future of Intel or the employees losing their jobs... but right now, 12000 people are having their lives affected in drastic, unexpected ways.

I agree that it's likely the majority of those people are in unenviable positions. I've been through cuts; it sucks. But at the same time, as you say, it may be untenable for Intel to keep carrying them. Also, it's possible Intel should have been more cautious in hiring (and maybe some of those people would have opted for personally more stable jobs, or maybe they would have taken worse jobs -- it's hard to say).

It doesn't say that they are firing 12,000 people. Certainly some people will be laid off, but a reduction in head count of 12,000 is distinct from layoff count.

Assuming people stay in a job 4 years, you can get a 25% reduction in headcount per year by just not hiring. Intel has over 100,000 employees. They are likely hiring 10k-25k people per year just to stay at a constant size.
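Back-of-the-envelope, that attrition math works out. A toy model (the 107k headcount and the uniform 25% annual attrition are illustrative assumptions, not Intel's actual numbers):

```python
# Toy attrition model: if average tenure is ~4 years, roughly a quarter
# of employees leave each year, so headcount is a simple recurrence.
def headcount_after(start, years, annual_attrition=0.25, hires_per_year=0):
    """Project headcount given an attrition rate and a fixed hiring rate."""
    count = start
    for _ in range(years):
        count = count * (1 - annual_attrition) + hires_per_year
    return count

start = 107_000  # Intel-scale workforce (illustrative)

# Staying flat means hiring ~25% of headcount every single year.
steady_state_hires = start * 0.25  # ~26,750 hires/year just to tread water

# A one-year hiring freeze alone sheds more than double the 12,000 target.
shed = start - headcount_after(start, years=1)
print(round(shed))  # 26750
```

So a 12,000-person reduction is reachable by merely slowing hiring, before a single layoff happens.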

I doubt, with the CFO being let go, that most people will end up getting re-hired in other groups. Their revenues are down, and they're closing up shop in some areas or realizing they need to run others with fewer people. It's actually a pretty big deal.

the CFO being let go

He's not "being let go".

From the article: Stacy Smith, who has been chief financial officer since 2007, will move to a new role as head of manufacturing and sales

I'm not sure what to make of the move, but if they really wanted to let him go they wouldn't have given him this different role.

Edit: just to elaborate, Intel is first and foremost a manufacturing company. Their manufacturing currently has a 61 percent gross margin. You don't put someone in charge of that if you want to let him go!?

But I'm not an Intel employee or a close observer of the company. Maybe some Intel insiders can chime in with what they think this means.

Intel's profit has grown 3 percent and their revenue is up 7 percent compared to the same quarter a year ago.

You're right. I made a bold statement without looking into the details. Looking at their yearly balance sheet, though, their yearly liabilities are going up. They've got quite a bit of debt.

A good third of their assets are in property, plant, and equipment. The fair market value of that stuff can tank if they don't keep up the pace of growth: what if a shift to other technologies means they can't sell as much of their product anymore?

This is for sure a move because their revenue forecasts are grim in many areas they operate in.

News like this out of the blue is strange. I was pretty sure that the post-PC era was a sham. It still is, right?

I recently started working on a little eCommerce project. I found a way to indirectly estimate the sales of some of my competitors' products. So I downloaded about 13GB of this data, processed it down into about 80 million rows, and then ran a bunch of stats on it. Now, 80 million rows is not "big data" scale, but it's not super small either. With my 16GB of memory (which I paid about $400 for), my humble quad-core i7 ($300), and my 500GB SSD ($100), my less-than-$1k desktop crunched through this dataset with no effort at all. I consider myself a "power" user, and my humble machine can handle everything I throw at it, and more. It'll be a long time before I upgrade again, and when I do, I'll probably buy the cheapest processor on the market. I'm probably not alone here.
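For the curious: "crunching" tens of millions of rows on a desktop doesn't require anything exotic. A minimal sketch of streaming aggregation (the CSV layout, the column names, and the idea of summing estimated units per product are all my assumptions, not the commenter's actual pipeline):

```python
import csv
import io
from collections import defaultdict

def aggregate_sales(csv_file):
    """Stream a CSV row by row and total estimated units per product,
    so memory use stays flat no matter how large the file is."""
    totals = defaultdict(int)
    for row in csv.DictReader(csv_file):
        totals[row["product_id"]] += int(row["est_units"])
    return dict(totals)

# Tiny stand-in for the ~13GB download; real use would be open("data.csv").
sample = io.StringIO("product_id,est_units\nA,10\nB,5\nA,7\n")
print(aggregate_sales(sample))  # {'A': 17, 'B': 5}
```

At one pass over the file, even 80 million rows is bounded by disk speed, which is exactly why an SSD matters more here than a newer CPU.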

My 3+ year old Lenovo laptop, running an i5 with 8GB of RAM and a 500GB HDD (at 5400 RPM, no less), easily lets me multitask with Word, Photoshop, and 20+ Chrome tabs open.

The only time I've felt my computer to be "slow" was when I tried to use Photoshop and After Effects simultaneously.

Yup. I do plenty of "medium data" work on my overclocked i5-2500k and 64 GB of RAM on Windows 7 / Ubuntu mixed machine. I've had the processor since the 2500k was the thing to have (years ago). No reason to upgrade yet. Got a new GPU after 6 years but that was about it.

This sounds intriguing, care to share any insights? Is this for Amazon?

Yeah, I want to hear about estimating competitor sales too.

Maybe I'll do a write-up in the future. I assume everyone does it already, but on the off chance they don't... I don't see any upside in writing about something that might help my competition.

There are plenty of PCs.

Here's the problem. Running a five-year-old PC used to be an issue.

Today, a five-year-old PC is an Intel Sandy Bridge i7-2600K (Passmark score: 8,518), while a modern i7-6700K has a Passmark score of 10,987.

FIVE YEARS and FOUR generations of processors have produced a net gain of about 29% in multithreaded workloads. Far less for single-threaded applications (maybe 15%). And absolutely negligible for gamers (who are almost entirely GPU-bound).
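The arithmetic on those quoted Passmark scores:

```python
# Multithreaded Passmark scores quoted above.
sandy_bridge_2600k = 8_518   # i7-2600K (2011)
skylake_6700k = 10_987       # i7-6700K (2015)

gain = (skylake_6700k - sandy_bridge_2600k) / sandy_bridge_2600k
print(f"{gain:.1%}")  # 29.0% over five years and four generations
```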

If you're running a 5-year-old i7-2600K, there is absolutely no reason to upgrade to Intel Skylake. None at all. Maybe you want a new GPU to play those VR games... but Intel isn't making gains anymore in processor speed.

Intel has been trying to get people to buy their power-efficient designs (Skylake is a hell of a lot more power efficient...), so Intel continues to sell laptop chips at a decent rate. But no one I know has major issues with their desktop speeds.

The only people I know who have upgraded their computers are those who have had hardware failures. There's still no need to upgrade a computer from Sandy Bridge.

People repeat that meme that SandyBridge doesn't need to be replaced so often. You've said it two or three times yourself.

While that's the prevailing opinion, I don't necessarily agree. I think it's SandyBridge owners trying to convince themselves more than anything else; really, it's just being swept up in the groupthink.

Skylake is the first chip that makes a very strong case as an upgrade. You gain NVMe support for your PCIe SSDs, DDR4 (which has shown an improvement over DDR3 in some benchmarks), a roughly 20% IPC improvement (5% per generation, give or take), a DX12_1 feature-level IGP, CPUs with 128MB of L4 cache that absolutely destroy chips without it for gaming (Broadwell had it first and Skylake's is improved), vastly better power efficiency, and Thunderbolt 3 support.

7 pretty good reasons off the top of my head. You can dismiss each of these if you want, but this is all very attractive in reality.

The whole story is that SandyBridge is only competitive in gaming if you overclock to 4GHz+. You still lose out on the other improvements, though, and any stock SB system compared to a stock SL system will look pretty sad once you factor in the platform updates.

If it's gaming you care about, take a look at the benchmarks of the 128MB L4 Broadwell chips compared to Devil's Canyon. Let alone SandyBridge. Both get crushed where it counts and Intel is just now getting Skylake 128MB L4 CPUs out the door. If you don't care about gaming, Skylake still crushes SB.

> Skylake is the first chip that makes a very strong case as an upgrade. You gain NVME support for your PCIE SSDs


If you care about NVMe, just get a $20 expansion card. Besides, NVMe SSDs are expensive. Mushkin Reactor 1TB for $210 yo.

Hell, the fastest NVMe SSDs directly go into PCIe lanes. So if I actually cared about the faster speeds, I'd jump to an Intel 750 SSD.


Name me an M.2 NVMe SSD that is more cost effective than a Mushkin Reactor (1TB, MLC, maxes out the SATA6gbps bus, $200), or has faster I/O speeds than a 750.

Yes, if I had a laptop which only had room for a M.2 card, then maybe I'd get the Samsung M.2 card. But even if one were given to me for free, I'd rather get the $20 PCIe expansion card.

I can't think of a single situation where I actually need the onboard M.2 card on the Skylake motherboards, aside from the $20 convenience.

> roughly 20% IPC improvement (5% per gen give or take)

I admit, this is a good thing. But it's very, very little, especially when you consider that the iPhone 5 to iPhone 6 jump was a 70% IPC improvement AND a battery improvement, yet many people didn't consider that enough of a jump.


Soooo... FIVE years gets you +20% speed, while ONE year gets you +70% speed on phones. That's why desktops aren't getting upgraded.

> DX12_1 feature level IGP

You buy a $300+ CPU without buying a $100 GPU? The cheapest of GPUs are significantly better than IGP. Hell, if I cared about DX12_1 IGP, I'd get an AMD A10 for half the cost and twice the IGP performance with drivers that actually work on games.

Except I game in capacities that far exceed even AMD's superior IGP. I also care about adaptive sync / GSync technology, which isn't supported by Intel Iris. So I have a R9 290X. Intel's IGP doesn't even come close to a $100 GPU, let alone the midrange GPUs.

> CPUs with 128MB L4 cache which absolutely destroys chips that didn't have this for gaming

NOT on the desktop. Crystalwell is laptop-only, and 45W to boot. Compared to 20W Laptop chips, I don't see the Crystalwell L4 Cache actually being useful to the majority of people.

In fact, I don't even know of any laptops with Crystalwell that I'd recommend to anyone. Here's a challenge: name me a good laptop with Iris Pro / Crystalwell. Hint: MacBook Pros use AMD dGPUs for a reason.

And hell, we aren't even talking about laptops. We're talking about desktops, and Crystalwell is UNAVAILABLE in desktop form. It's irrelevant to the desktop user, even if you thought that paying $600+ for a CPU was cost-effective (instead of buying a Sandy Bridge E5-2670 for $70 from Amazon).

Basically, you got DDR4 RAM and +20% IPC. That's all I think the last five years will actually get you. Or you can buy an 8-core, 16-thread E5-2670 for $70... hell, two of them, get a nice server board for $300, and have a BEAST of a machine.


The base Macbook Pro 15" uses Crystalwell and has no dGPU.

Yeah, but would you seriously recommend it over the AMD Tonga (R9 M370X on the upscaled version)?

The 45W i7 is a heavy burden to carry with Crystalwell. Might as well get better graphics if you're going for the 15" Pro.

There is no way in hell I would prefer that AMD chip in my system over an all-Intel system. AMD chips are just terrible to use on Linux, and most people around here want the ability to run Linux natively without issue. Not to mention the added complication of tacking that AMD chip onto the laptop, both from an engineering/reliability stance and from a software standpoint.

You missed the irony of calling 45 watts a heavy burden. An R9 M370X adds about 50 watts to your TDP by itself, along with its needless complexity. If someone wanted to reduce TDP and that complexity, they could step down to the base Intel IGP. But if stepping up, Intel's solution makes a lot more sense.

Your loss man. The benchmarks don't lie.

Good luck with your overpriced Crystalwell failure. If you got actual benchmark scores to talk about, please respond to me here: https://news.ycombinator.com/item?id=11536519

But I actually know the benchmarks of everything you're talking about like the back of my hand. Your argument has no technical legs to stand on whatsoever. Don't feel bad if I'm just calling out your Bull$.

Wait, what are you talking about? That was in no way a response to what I said to you here. You don't need to change the topic just because you're wrong and you know it.

No one wants that AMD chip in their Macbook. It adds complexity both in engineering and software. There's PLENTY to talk about technically there, and why going all-Intel is a good idea. Not to mention Intel's best-in-class Linux support.

It's actually kind of annoying to have graphics card switching - it caused a number of problems in my old 15" MBP, to the point that I opted for integrated this time.

>Besides, NVMe SSDs are expensive

Yes. If you're bargain hunting for gaming hardware you should just buy a console. Or, if you're seriously suggesting to put an Intel 750 into some old system like SandyBridge.. no comment. I would never recommend someone bother doing that.

Step up to an NVMe setup with Skylake and do it right. Skylake i5 setups can be had for cheap. You're just arguing to argue on that point, whether or not you have anything useful to add. The SB argument is common knowledge, an age-old argument at that, with no new information or insight.

>Name me an M.2 NVMe SSD that is more cost effective than a Mushkin Reactor (1TB, MLC, maxes out the SATA6gbps bus, $200), or has faster I/O speeds than a 750.

I'm not into cost-effective bargain hunting. Anyone who would gimp a nice Intel 750 SSD on a non-NVME system is a fool and you've suggested it.

>The cheapest of GPUs are significantly better than IGP.

No they aren't. The point about DX12_1 IGPs is that it's there, it's modern and it has already sucked the life out of the low end space and moving into the midrange with Iris Pro. Your stance is the 2010-era view on computers. Same era as Sandybridge TBH.

>I also care about adaptive sync / GSync technology, which isn't supported by Intel Iris.

This demonstrates how much you know, and why people shouldn't listen to what you're saying. Which can be heard on any PC gaming forum a thousand times over. This is HN though and it won't fly.

Intel has already committed to FreeSync. It's incoming with Kaby Lake, and rumor is that it may be enabled for Skylake.

>Intel's IGP doesn't even come close to a $100 GPU, let alone the midrange GPUs.

Wrong on its face. You just haven't cared to investigate recently.

>NOT on the desktop. Crystalwell is laptop-only, and 45W to boot.

Nope. The Crystalwell chips are going into NUCs from here on out. There's a 128MB L4 NUC coming in 2 1/2 weeks and a 256MB NUC coming in 12 months.

The fact that you're talking about gaming and recommending an E5-2670 for it is just silly. That might be a good machine for compiling code. Even if that's your goal, it's still a bad idea when distcc can utterly embarrass that old power-hungry chip.

For gaming, Broadwell already demonstrated what Crystalwell adds for gaming performance with a standalone GPU. And it's a game-changer, it's faster than the i7-6700K. Yes, it is. And it definitely mops up where it counts (99th percentile frame times) on SandyBridge too.

In 2 1/2 weeks you'll see Skylake with Crystalwell and Thunderbolt 3 absolutely delete any SandyBridge gaming rig you may have (even with an Intel 750, if you made the ridiculous decision to actually put one in SB). There might be more cost-effective ways to build a gaming rig, but if you're into saving money on hardware and gaming, buy a PS4.

I understand it's the prevailing thought among PC gaming kiddies, but holding your grip tighter on some old SandyBridge system won't change that in reality it's fallen pretty far behind in both overall platform performance and power efficiency.

I can't find Iris Pro 580 on benchmark sites, because no gamer gives a care about that for gaming.

The Iris Pro 5200 GT3e achieves Passmark 1,174.


If Iris Pro 580 GT4e is twice as good (Intel only claims 50% better), that's still not very good. That's utterly awful, actually.

A $100 GPU is the R7 360, just off the top of my head. http://www.newegg.com/Product/Product.aspx?Item=N82E16814125...

Exactly $99 on Newegg right now. It achieves Passmark 3,150.

No one gives a care about the $600 Crystalwell chip that performs worse than a $100 dGPU. It's utterly awful. You'd be insane to actually recommend this product to anybody. You claim that you care about performance. Do you even look at the benchmark numbers, bro? Your claims are so far from reality I really just don't know how I'm supposed to respond to you.

Yes, a $600 Chip. I'm assuming this, unless you can figure out a cheaper Skylake Iris Pro: http://ark.intel.com/products/93336/Intel-Core-i7-6970HQ-Pro...


EDIT: I see that you're an anti-AMD guy. Okay, whatever. That's why there's another company out there.


Nvidia GTX 750 Ti, $105 right now on Amazon. Passmark 3,686. Still utterly crushing your $600 iGPU with cheap-as-hell GPUs, no matter the brand.


Dude, I'm running a (what was at the time) high-end R9 290x, although this is more of a mid-range card now due to its age (Fury / 980 Ti). It has Passmark of 7,153, and you're seriously suggesting I "upgrade" to a Crystalwell Iris Pro that only achieves ~2000 Passmark?


PS: Skylake performing 20% faster than Sandy Bridge after five years of updates is awful.


> I'm not into cost-effective bargain hunting.

Then why the hell are you bringing up M.2? Intel 750 is the best of the best and plugs directly into PCIe. Sandy Bridge handles it just fine.


> In 2 1/2 weeks you'll see Skylake with Crystalwell and Thunderbolt 3 absolutely delete any SandyBridge gaming rig you may have

Crystalwell has to beat a $100 dGPU first. And the benchmarks say otherwise. My bet? Crystalwell fails to beat an R7 360 / Nvidia 750 Ti (Nvidia's LAST-generation BUDGET card), and I get to laugh at its worse-than-console performance numbers despite the $600+ launch price for the chip.

But hey man, show me those benchmark numbers if you disagree with my assessment.

"Bro", "dude", "man". So I know I'm talking to some little kid at least.

But I have to say: anti-AMD! Yes, very perceptive, as I type this on a machine with a Radeon in it. Consider the fact that other people can criticize products they have extensive knowledge of.

Judging from the rest of your response, my post went completely over your head. And quite the troll, as you try to change the points I made in an attempt to "win" the argument. But it is amusing to hear some kid call Intel's 20% IPC boost from SB to SL awful; it shows how much you don't know. I have friends that work at Intel. Go back to PC Gamer, as you have no idea what you're talking about. Some dumb kid sees ONLY 20% alongside massive power reductions and doesn't realize that computers can do more than just play video games.

You also failed reading comprehension and the ability to hold a conversation. Congrats. But either way, there's no way around the fact that in 2 1/2 weeks I'll be benchmarking an R9 Fury against an i7-6770HQ with a PCIe NVMe SSD and some DDR4, absolutely crushing any SandyBridge system you own.

Enjoy your old E5-2670 SandyBridge with an Intel 750. What a total fruitcake. A better use of your time would be to go read about logical fallacies, as you just spent an hour typing about a strawman you created to beat on, with points I never made.

Here's what you want to hear, because you just want to argue: you're right, I'm wrong. Hope you feel better now. I'm not giving you any more help. I get it, you like your poverty gaming rig. See ya, kid.

>> i7 2600k

Sure, that's the top of the line $300 chip in the days of a whole PC being able to be bought for $300. What if you're on a five year old Pentium G620?

If you bought a $300 PC 5 years ago:

1) you're not the sort of person who buys a new rig every two years, and 2) a $300 PC today will give you exactly the same performance as the one you bought 5 years ago: the minimal gains you get in iron are naturally offset by minor losses in software (which is now built by people with SSDs, so good luck with your little spinning disks...)

The market is now artificially segmented to such a fine level, and moving so slowly at the top, that performance simply does not "trickle down" like it used to. Add to that the move to "power efficient" CPUs (aka: less powerful overall) and you will basically see zero gains if you stick to the bottom of the market.

Not quite "exactly" the same performance. A 20% improvement with today's stuff.

But yeah, it's peanuts. A 20% improvement over five years is pathetic. I'm just calling out your hyperbole, in case others didn't see it. Apple had something like a 50% improvement in a single generation of iPhones, so a 20% difference over five years is very ignorable.

SSDs and GPUs improved dramatically over the past five years. Well... more specifically... SSDs got dramatically cheaper while retaining roughly the same quality. So it's worth it to upgrade to an SSD or to get a new graphics card. But Intel doesn't have any GPU offering, and their SSDs are "enterprise" (aka overpriced). Mushkin and Crucial are better brands for consumers, and even Samsung (although a bit more expensive).

The cores are basically the same within the generations.

A five-year-old Pentium G620 is only ~25% slower than the Skylake Pentium G4520. Both are dual-core CPUs that are cheaper than $100 aimed at the budget audience.

Frankly, the fact that AMD Vishera FX-6300 still easily beats out the Pentium G4520 in multithreaded benchmarks... this demonstrates the absolute lack of Desktop CPU improvements. I'd only recommend the G4520 to someone who is really sure that they care about single-threaded performance (ie: Gamers). Most people will appreciate the lower total-cost-of-ownership that FX-6300 offers at that price point.


* AMD FX-6300 Passmark: 6,342

* Modern Skylake Pentium G4520 Passmark: 4,261.

The G4520 is a $80 chip, Released October 2015. FX-6300 was AMD's 2012 entry: a FOUR year old chip, now selling for $80 to $90 at Microcenter.
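Same quick arithmetic on these quoted Passmark scores:

```python
fx_6300 = 6_342  # AMD FX-6300 (2012), multithreaded Passmark quoted above
g4520 = 4_261    # Skylake Pentium G4520 (2015), multithreaded Passmark

advantage = (fx_6300 - g4520) / g4520
print(f"{advantage:.0%}")  # the four-year-old AMD chip scores ~49% higher
```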

Microcenter has some $0 Motherboards if you buy an FX-6300 from them. That's the kind of benefit you get from buying "old". And since CPUs aren't really much faster, why the hell should you buy cutting edge?


Hell, why are you spending $80 on a new G4520? Facebook just decommissioned their servers. You can get a dual-socket-ready Sandy Bridge 8-core, 16-thread E5-2670 on eBay for $80, or on Amazon for $70.


Go get yourself a dual-socket 16-core 32-threads E5-2670 Sandy Bridge Workstation, just $80 per CPU.

Intel can't even compete against their own ghost from 5 years ago. Is it a wonder that sales are low?

I think we're in a post-PC era. Yes, I spend my work days on a laptop. But most of my personal computing happens on an iPhone or an iPad. I think that PCs (and laptops) will increasingly become "things we do work on" and smartphones and tablets will become "things we consume stuff on". A lot of PC sales, I think, were coming from people buying them for personal use to consume stuff. That market is changing.

On the consumer side, it's possibly back to a shared-PC era. Instead of multiple PCs per household, there's just one roughly five-year-old PC to do work on, and each person has their own annually updated phone/tablet for everything else.

They're so much slower to interface with though. I couldn't live without keyboard shortcuts.

On top of that, productivity-app companies are chasing mobile in order to not get left out in the cold, further exacerbating the PC decline.

This isn't at all out of the blue. As the article states, Intel's revenues have been declining for over half a decade now, starting a few years after the first iPhone hit the market.

Are you sure about the revenues going down? According to http://www.statista.com/statistics/263559/intels-net-revenue... revenues seem to have gone up.

You can blame it on the end of Moore's law, or you can blame it on mobile computing. (The two are not unrelated.) But it is fundamental.

To me, it's more that Moore's law is over and that observable speedups are much less obvious to non-gaming consumers.

To me, Moore's law will end when we all have a mediocre CPU at home connected to fiber... and anytime you, or anyone else, needs to do work, we share our CPUs with each other without being aware of it. Then there is no more Moore's law, just one huge SPU (Sharable Processing Unit).

Out of the blue? There have been articles for a long time about the internal mess at Intel as they try to figure out mobile and IoT. One of the latest ones just last week:


I'm thinking it's not so much a post-PC era as a customizable-SoC era, and Intel does not want that at all.

How much of your time is now on a mobile that used to be on a desktop? Their core business is drying up.

Am I the only person for whom the answer to this question is "almost none?"

I'm serious: about 80% of my day is meeting with real people in the real world. Mobile phones haven't changed that.

The other 20% of my day is sitting at my desk creating original work product (mathematical models and thoughtful memoranda) or reviewing the work product of others. Mobile phones haven't changed that, either.

No doubt the drought in PC sales is real and permanent. But I wonder how much of that is because people just don't need to keep their laptops up to date in the age of great cloud services.

Nah. I think there are more than a few of us around. I mostly avoid using my smart phone and I don't have a data plan. I had to buy it because I was travelling abroad and it was a light device I could use to communicate and take photos with. Most of my work is done on my laptop.

Yeah, mobile hasn't figured out a good way to take over the workspace. Some of the tablet/laptop hybrids are getting closer.

As for entertainment though, are you watching Youtube extensively on your desktop?

I know this sounds crazy, but there are still some people who have cable subscriptions and watch TV on a TV. Oh the horror!

Those TVs are now 'smart' along with cable boxes and other peripherals. Is Intel Inside any of those?

> Yeah, mobile hasn't figured out a good way to take over the workspace. Some of the tablet/laptop hybrids are getting closer.

There already is a relatively mobile tablet/desktop hybrid that works pretty great for both consumption and getting work done. It's called a laptop.

> As for entertainment though, are you watching Youtube extensively on your desktop?

Yes. I have a phone, a tablet, a desktop, and a laptop. The tablet is pretty much only used for netflix and textbooks, and the phone is for travelling. The tablet is absolutely worthless for browsing, coding, writing, or gaming; and the phone is only saved by the form factor. If I had (the space for) a TV then the tablet would be a completely unjustifiable purchase.

Laptops typically aren't considered mobile devices (if we're being pedantic). Try running mobile apps on your laptop.

And yes, mobile devices have taken away entertainment share from the desktop as well as televisions.

Sure, say portable if that makes you happier. Not that it matters though, mobile programs are the ones trying to catch up to desktop programs, not the other way around.

No, you aren't the only one who barely uses mobile stuff. My desktop and laptop use is essentially required for my job (software engineering). I've seen one person switch to a tablet, but I'm not sure tablets have enough power to do the things I need. One day all of this will merge into a single unit, but right now I see separate needs/uses for both.

Great cloud services, or the fact that my work has in no way been accelerated by newer processors. The only noticeable workflow-related speed increase I've had in the last five years is from damn-near-zero-margin SSDs.

Almost none, the only time I use my mobile in lieu of a desktop is when my desktop is unavailable or my home Internet connection is down.

I don't count time I spend out and about on my mobile, since that isn't time I was going to spend on my desktop anyway. (And I don't think having the option of going outside with a mobile has changed how often I do so.)

All that mobile traffic increases server traffic. I don't think the switch to mobile is a big hit for Intel as long as they are the only major player in the high performance CPU market (let's see if AMD can turn it around with Zen, but I don't have my hopes up).

Also chiming in with a "none". All mobile devices have done for me is expand my use of computing devices into realms in which I never used one before.

Less than 5%. My phone stays in my pocket most of the time, even when I'm nowhere near any other device, and everyone else around me is heads down in theirs.

I pull out my phone to get my 2FA number every now and then :D

Yeah, I am not doing any development work on a mobile device. I have yet to go to an office where the cubes are filled with people working on iPads.

I think Intel's core business has transitioned from desktop PCs to servers for the farms, which shows no signs of drying up.

Plus, Intel provides almost all laptop chips, and I'm guessing most businesses still have either a desktop or a laptop per person, which will still get upgraded every so often. I doubt workers are going to be using iPads for data entry, although I suppose you never know.

For me, maybe 30%, mostly because I have a computer in front of me for coding most of the day anyway. For my brother's family it's nearer to 90%. He still has a PC for work, although it is on about a 5-7 year replacement cycle, and they no longer need the more-expensive second PC.

> How much of your time is now on a mobile that used to be on a desktop?

None, since I don't want to have (and don't have or own) a portable surveillance and tracking device (also called mobile phone) in my pocket.

It's probably possible for PC sales to decline while PCs remain dominant in computing. If so, Intel probably doesn't need to restructure; they just need to get smaller.

No. Computer sales are at historic lows. We've reached the confluence of computers being "good enough" (there are no longer much performance gains from buying new computers) and more and more consumer computing moving to tablets/phones.

The particulars of Intel's situation aside, there are some (notably not all) watching the markets that believe we may be well into the prelude to a market recession.

We're pretty much post-PC froth/churn.
