Intel to Cut 12,000 Jobs, Forecast Misses Amid PC Blight (bloomberg.com)
637 points by matt_wulfeck on Apr 19, 2016 | 502 comments



So Intel is cutting 11% of its workforce, Goldman Sachs just reported a 56% drop in profits, Morgan Stanley had a 50% drop in profits, Netflix missed subscriber growth estimates etc... yet, the Dow just hit a 9-Month high, and the S&P500 is now above 2100.

The whole market is overvalued, not just the tech unicorns.


Goldman's stock price is approximately 25% off its 12-month high. [0] Morgan Stanley is approximately 35% off. [1] Seems like the market is properly taking in their bad news. Some parts of the market do fine, others get hit. And yes, some might be over-valued. When interest rates are low, the future cash flows of companies are discounted back at a lower rate. So if all things are equal (which they never really are, but humor me) low rates imply higher prices for both equities and bonds, and especially startups whose cash flows are mostly in the future.

[0] https://www.google.com/finance?cid=663137

[1] https://www.google.com/finance?cid=660479
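The discounting point above can be sketched in a few lines. This is my own illustration (the cash-flow numbers are made up, not from the thread): the same future stream is worth more at a lower discount rate, and the effect is largest when the cash arrives late, as with startups.

```python
# Present value of a fixed cash-flow stream at two discount rates.
def present_value(cash_flows, rate):
    """Discount a list of annual cash flows back to today."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

# A "startup-like" stream: most of the cash arrives in later years.
flows = [0, 0, 10, 50, 200]

pv_high = present_value(flows, 0.08)  # higher-rate world, PV ~181
pv_low = present_value(flows, 0.02)   # low-rate world, PV ~237

# The same future cash is worth considerably more when rates are low.
print(pv_low > pv_high)  # True
```

The 8% and 2% rates are arbitrary; any drop in the discount rate mechanically raises the present value, which is the "low rates imply higher prices" argument.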


Both GS and MS (as well as other FIs) are off their highs because of all the volatility in the markets the past 6 months and worries about losses from bad energy loans. BAC, WF, and Citi are all off their 12-month highs by a significant margin despite doing relatively well in Q1.

Also, this has nothing to do with discounting the cash flows; it's mostly stock buybacks that are driving all the action:

http://www.bloomberg.com/news/articles/2016-04-19/early-warn...


> Both GS and MS (as well as other FIs) are off their highs because of all the volatility in the markets the past 6 months and worries about losses from bad energy loans.

No, they are off because they are making way less money (down 55% from a year ago!). In general volatility can be good for investment banks because it means higher trading volume. Revenue has been way down thanks to fewer deals (remember when tech companies had IPOs?) and reduced fixed income trading volume (GS revenue from fixed income trading is down 48% from a year ago).


It would be interesting if they could release metrics on trading or investing activity, the way some software companies release stats about active users.

Then you could calculate revenue per trade (or something more useful along those lines) to use as a signal in these cases.


They break out a lot of numbers, but I'm not sure how useful metrics on trading activity would be. Not all trades are equal, so something like revenue per trade is pretty meaningless.

http://www.goldmansachs.com/media-relations/press-releases/c...

The numbers are pretty brutal, especially in institutional client services.


These go hand in hand. The debt issued to buy back the stock is cheap. :-)


I think you may have overlooked the recent DOL Proposal and its potential impact on profit margins if it goes forward. Good for consumers, but bad if you have to improve processes, hire people to handle them, etc.

http://blog.emoneyadvisor.com/industry-news/trending/complet...


When similar rules took effect in the UK, profit margins went up. Matt Levine on Bloomberg View talks about it regularly.


>worries about losses from bad energy loans

GS has $11BB in oil sector exposure. MS is $4.8BB. BofA and Citi are around $20BB. JPM is around $15BB. These are fractions of total loans outstanding (MS is 5%, all the rest are much smaller). Each of these banks has more in reserves than oil loan exposure. Does that sound scary to you?

Again, relatively sophisticated investors understand these things. This is basic research. Where is your evidence?


How much exposure do they have to the financial sector? As in 2008, we should be wary of contagion.

The limits on the banks' exposures are dependent on counterparty risk. GS's exposure is far more, but they insured most of it with other banks, and please allow me the simplifying assumption that they insured all of it with BAC. Which means that (hypothetical scenario) if BAC goes bankrupt, suddenly GS exposure to oil goes up to, say, $100BB. Suddenly the reserves are woefully insufficient. Then there's the sudden risk that GS goes belly-up, which would increase everyone else's exposure. One of them goes and ...

Additionally, oil fuels the economy. Oil is what builds the roads, what keeps everything on the roads moving, what keeps planes in the air and boats going forward. Oil provides significant parts of our electricity supply, and so on and so forth. So you can look at the oil sector problem in 2 ways. Either you look at the supply side, which is producing somewhere around 2% more oil than the market is willing to buy (at any price). This is short-sighted. "If we'd all just put 2% more gas in our tanks, there wouldn't be a problem," which is not realistic.

The other way to look at it is from the demand side. The market is simply not buying 2% of the oil, except to store it. Why not? One explanation would be that there is a global recession and the oil price move is simply the result of that. The fact that the price crash happened with oil production/supply constant (even slightly declining) would seem to support this. For instance, the Baltic Dry Index plunged before oil started having problems, same with container shipping, and this explains quite a bit of the excess capacity in oil, and therefore, I say, caused the price drop in oil. Oil crashed because manufacturing (the source of the demand for shipping) crashed a few months before the oil crash (and hasn't recovered).

In other words : you've identified the wrong problem. Oil is a symptom of the underlying situation, not a cause. You say banks are capable of withstanding one aspect of a greater problem ? Well, I'm not saying that's bad, but it's not reassuring at all (and may not be true due to financial engineering).


You talk as if sophisticated investors can't read a balance sheet or don't understand buybacks.


The GAAP P/E of the S&P is now at 24.28; that's near, if not at, a record high. That's bad. It's very overvalued historically. Banking stocks are doing badly because interest rates are so low across the curve that they can't make money from the spread. But even at that, they are overvalued, along with various momentum stocks like Tesla and Netflix (which we saw implode yesterday).


I started investing my retirement money recently. That's about as good a signal as any that it's going to plummet.


Ah yes, the old maerF0x0 indicator. Beloved in some technical analysis circles


I've heard it many times in the back-rooms: "Wherever maerF0x0 goes, don't."


I hear the name even comes from the sound you make when you've just taken a mouthful of your breakfast toast, open the newspaper to the financial pages, and see what's happened to the value of your investments.


What a lovely mini thread! Thanks for the quick laugh :)


If you're investing for retirement into an index fund then where is the problem? Just wait until the recession is over (and perhaps buy more cheap stock). If you want to retire during a recession, don't liquidate all your assets, only as much as you actually need. If you believe that the market is never going to recover, why invest in the first place?


It was partially a joke. But yes, dollar cost averaging across a recession will make the total portfolio look OK. I do wonder if we can always continue to make higher and higher "highs" in the market though, meaning any investments I make at the top will make $0 gain.

I do agree with you overall about indexing; however, "Just wait until the recession is over" is market timing, and I don't believe I (or others) can do this dependably well.
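The dollar-cost-averaging point is easy to make concrete. A quick sketch with made-up prices (my own illustration, not data from the thread): buying a fixed dollar amount each month buys more shares when prices are low, so the average cost per share ends up below the simple average price.

```python
# Dollar-cost averaging a fixed $100 per month through a dip and recovery.
prices = [100, 80, 60, 80, 100]  # hypothetical index price each month
monthly = 100.0

shares = sum(monthly / p for p in prices)  # shares bought across all months
invested = monthly * len(prices)           # total dollars put in
avg_cost = invested / shares               # average cost per share, ~81

# The average cost is below the simple average price (84 here),
# which is why DCA "across a recession" makes the portfolio look OK.
print(avg_cost < sum(prices) / len(prices))  # True
```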


The high P/E is driven by interest rates being low.

One can argue the way to judge stocks value is not by P/E but by E/P relative to interest rates; that is whether their excess return relative to risk-free assets is justified by their risk.

Interest rates remain about 5 percentage points below their long-term historical average, which can justify a significantly lower E/P (that is, a higher P/E). Currently stocks offer a 3.5% return over some classes of T-bonds, well in line with historical norms.
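The E/P comparison above is just an inversion and a subtraction. A sketch using the P/E cited upthread; the bond yield here is an assumed round number for illustration, not a quoted rate:

```python
# Earnings yield (E/P) and its spread over a risk-free rate.
pe = 24.28               # S&P 500 GAAP P/E cited upthread
earnings_yield = 1 / pe  # E/P is just the inverse of P/E

treasury_yield = 0.006   # assumed yield on "some classes of t-bonds"
premium = earnings_yield - treasury_yield

# ~4.1% earnings yield minus ~0.6% gives roughly the 3.5% spread claimed.
print(round(earnings_yield, 3))  # 0.041
```

The point of the framing: a P/E of 24 looks historically rich on its own, but the *spread* of E/P over bond yields can still be ordinary when rates are unusually low.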


This is very true. It's earnings yield versus fixed income yield that matters most. If one is out of whack with the other, then there is arbitrage. (All things equal, if there's a much better yield in equities, then people will move money from bonds to equities, or vice versa.)

The current danger is that both will move down at the same time, and then where should one invest?


What else would you expect to happen in a world with negative interest rates?


Man, I do believe the market in general is overvalued, but not based on the P/E ratio. That metric looks almost normal:

https://www.quandl.com/data/MULTPL/SP500_PE_RATIO_MONTH-S-P-...

Check how high it was in 2000 and 2008; we aren't even close.


If you'll look at your chart, those peaks in ~2002 and ~2009 were market bottoms; earnings had already dropped precipitously, and they were also black swan events. Historically, from a forward P/E and trailing P/E perspective, the market is overvalued, even without a black swan. Also, when you look at the debt market right now, it does not look healthy at all. That's usually a sign that something is brewing, usually a recession. However, stocks can continue higher; blow-off tops are commonplace in the last phases of a bull market.

My fear is that the central banks are pumping so much liquidity into the market that they are driving up equities and pushing people out of safer assets into things like high yield bonds and momentum stocks. If we do have a recession, the pain could be worse than usual (for stocks) for the mere fact that the debt market could have liquidity problems when tons of funds begin to pull their money out at once from HY.

Not to mention all of the corporate buybacks that companies are doing by leveraging, because cash held overseas is too expensive to repatriate.

You can still invest in solid companies, but companies like Tesla, Netflix, anything with a super high P/E is going to be taken out back and shot (that doesn't mean the companies will go out of business, only that their stocks are much like Amazon in the 2000s.)


2002 was a black swan event (because of 9/11); 2009 not so much, many institutions had been calling the housing bubble since 2004 and even Greenspan acknowledged it in 2007. A bubble that is acknowledged as such 2 years in advance by the central bank is not a black-swan event (defined as a surprise event that is impossible to predict).

The 2008 bubble was eminently predictable. The problem was that no one knew the exact trigger, and no one had a politically acceptable means to deflate the bubble until the domino effect started, and then hedge funds and then major banks started collapsing.

Until the knife started falling, no one had a financial incentive to stop. After all, subprime mortgages have crazy interest rates, and if you're BoA or JP Morgan Chase, the government will probably step in to stop your collapse...


> companies like Tesla, Netflix, anything with a super high P/E is going to be taken out back and shot

Not that I disagree with you, but I believe a contrarian viewpoint would be something along the lines of "We are currently in the midst of an economic revolution as increasingly large swathes of activity are digitized and lingering mechanical/human processes are computerized. Companies likely to be successful in this new economy are unlikely to be the same ones which were successful in the old."

(To which the obvious rebuttal is probably "People are always saying things are about to be different, and they're usually wrong.")


Those gigantic spikes represent plunges in earnings, not spikes in share prices... If you look right before the spikes, you'll see PE ratios in the mid- to high-twenties.


Netflix at -12% is hardly an implosion. For a growth stock like Netflix, it's not too uncommon. Just look at their chart, it's had plenty of jumps in either direction when their quarterly report exceeded or failed to meet expectations.


>The GAAP P/E of the S&P is now at 24.28, that's near, if not, a record high

Not even close to a record high. It's slightly high, but is it surprising that people are willing to put a premium on earnings with negative interest rates?


No worries, the Fed will do SOMETHING and save us, SOMEHOW. No moral hazard to see here.


If you pick and choose stocks, you can tell any story you want. Intel cutting its workforce is more of a reflection on the state of the PC market rather than the US economy as a whole.

I'm not saying the market is overvalued or not, but this is in no way indicative of that.


To add to this, I'd say the overvaluation(s) are a side-effect of our monetary policy being in never-ending "stimulus" mode.

On Intel I think the "everything is going mobile" is PR for investors, the PC market doesn't have any foreseeable growth potential atm.


The stimulus mode is a permanent regime now. It's good in the short term, but I do not know if anybody understands the mid- and long-term consequences of this new permanent mode.


"The stimulus mode is a permanent regime now"

As a general FYI to those reading -- please fact check before commenting.

The Federal Reserve's quantitative easing program ended on Oct 29th, 2014 (538 days ago!) It is difficult to assert that a program which ended over 500 days ago is permanent.

Additionally, one may be willing to assert that "ZIRP is permanent regime now" too, to tack onto the culture that central banking has entered a new era.

For anyone who did not see that news, ZIRP (zero interest rate policy) ended on Dec 15, 2015 with a rise, and is currently believed to rise again at the next meeting.


Yes, it's permanent. Interest rates are still super low, meaning companies can still take on debt to buy back their stock. The effect of quantitative easing remains until interest rates go up and the Fed starts selling government bonds back to the banks.


>meaning companies can still take on debt to buy back their stock

Please explain the problem here. You're acting like there's some big deception imposed on the public due to companies choosing to return capital to shareholders via one specific mechanism. As if somehow they're buying back stock and "fooling" people into thinking they're making more money or something, and stock prices are irrationally rising. The buybacks, earnings, financials are all open for everyone to read. It's evident by some of your commentary that you can't be bothered.

If you don't agree with the price that other people are willing to sell for shares in this market, you are more than welcome to take the other side of that trade and sell into every buyer on the planet. I'm sure you're not doing that.


Shorting a stock isn't always practical. Stock buybacks can be deceptive if a company buys back stock with debt and doesn't have the cashflow to pay it off. Non-technical investors see the stock going up and keep piling more cash in. If you think markets are perfect, you have a lot to learn, my friend.
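The mechanics behind this complaint fit in a few lines. A sketch with made-up numbers: a debt-funded buyback shrinks the share count, so earnings per share rises even when the business earns nothing extra.

```python
# A 10% buyback with flat earnings: EPS rises with no operating improvement.
earnings = 1_000.0       # net income, unchanged year over year
shares_before = 1_000
shares_after = 900       # 10% of shares retired with borrowed money

eps_before = earnings / shares_before  # 1.00
eps_after = earnings / shares_after    # ~1.11

# EPS "grows" ~11% while the underlying business is flat, and the new
# debt service doesn't show up in this one headline number.
print(eps_after > eps_before)  # True
```

As the reply below the parent notes, the debt and cash flow are all in the filings; the dispute is over whether casual investors actually read them.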


>Shorting a stock isn't always practical.

Nor is it remotely the only way to assume a bearish position.

>Stock buybacks can be deceptive if a company buys back stocks with debt and doesn't have proper cashflow to pay it off.

You can't tell the cash flow and debt levels from the financial reporting? You don't account for this in your valuation? Where is the deception? Report it to the SEC.

>Non-technical investors see the stock going up and keep piling more cash in.

An equivalent number of people are "pulling cash out". A stock trade is just that: a trade. Again, what makes you a better judge of the "correct" price than those actually making the deal?

>If you think markets are perfect, you have a lot to learn my friend

Not sure where that comes from, but you're right, I do have more to learn. That said, it looks like I'm a few levels up on you (and far less confident in my ability to predict anything).


Interest rates have already started going up.


They went up once and the fed is signaling a lot of caution. The earlier analyst consensus was 4 hikes this year .. it is down to 2. The latest commentary I'm hearing says normal rates by 2019 .. which is psycho IMHO.


What they ended is increasing their balance sheet. They are still rotating into new debt as old debt matures, so they have not actually rolled back their old QE programs; they just have not increased them.


Just take things in in a perspective, not just based on the latest quarter news:

http://b-i.forbesimg.com/jessecolombo/files/2014/01/united-s...

http://moneymorning.com/wp-content/blogs.dir/1/files/2015/09...

http://www.naic.org/images/capital_markets_archive/2013/1305...

Whether or not they do a token 0.5% hike (they probably won't), it's obvious that something very basic has changed, and quite likely irreversibly.


Why did you link three pictures of the same graph?

They also don't support your hypothesis that something has changed irreversibly.

Raising the interest rate more than 0.5% would have likely had a negative impact. It's very likely the Fed will raise it again later this year.


Just because the interest rate is slightly above zero doesn't mean debt isn't cheap anymore.


There is nothing mysterious about the short, mid or long term effects of artificially low interest rates and quantitative easing.

edit

Here's a link to read about it. https://mises.org/library/unseen-consequences-zero-interest-...


Have you been following the current earnings season? We are witnessing horrible earnings while stocks keep going up and up. All the losses in August 2015/February 2016 have virtually disappeared after the oil price recovery.


You also have a large increase in corporate debt, some of which, maybe a lot, has gone into stock buybacks. It's a way of translating ZIRP into stock price while nothing was created.


Oil is a leading indicator of demand. The fact that it picked up is a sign that demand picked up. Stocks are continuing to rise because investors see the current drop in earnings as a blip, not as a market catastrophe.


In the old days broad economic demand moved faster than production declines. So production being vaguely constant the price told you a lot about main street employment.

In the modern era of frac wells where depletion rates are like 50% in 2 years, broad economic demand is now much slower than production declines.

That's the problem with secondary recovery... The "balance sheet" looks good in that you'll profit (unless prices tank), but the cashflow is terrifying: you don't drill a well and collect for 30 years like in the old days; you use exotic processes and get high production rates for about two years, then production drops to zilch.

On a large, century-size scale it screws up Hubbert curves. You can ramp up for a century, but extensive secondary production means the decline slope will take just a couple of years instead of being a nice symmetric century or whatever.

By analogy, it's like switching farmland from an olive orchard to corn production and being surprised at fundamental shifts in the financial sheets. (Yo, it's just another plant, right? Well, there's a bit more to it...)


No, oil has been spiking up due to incessant rumors of production cuts. Even though the latest talk about a production cut was a no-go, oil spiked a bit earlier this week due to the Kuwaiti oil workers' strike.


If they can't sell computers in the digital age where software has taken over everything, you can think whatever you want but that does reflect poorly on the economy. Yes, many factors are at play. It is still a terrible sign.


More computers are being sold than ever before. The market is absolutely saturated with computing devices. And aside from the intense competition Intel is facing, they're suffering the same issue that the GPU makers face -- I once upgraded yearly because the gains made it worthwhile. Now I upgrade pretty much only when something fails.

It is absolutely picking a narrative.


Intel screwed up and didn't become a major player in the mobile market. They could have, because they are still the world's best chip maker, but they didn't. The mobile market is experiencing the most growth and also the shortest product lifetimes. I still get a new phone every 1-2 years because the upgrades are worth it, but my laptop needs upgrades less often, and my desktop less often than that. Mobile phone processors today are where desktop processors were a decade and a half ago in terms of growth potential.


Intel has tried, and failed, and tried again in the mobile segment.

They sold off their ARM processors to Marvell. They've made various forays into wireless, though WiMAX didn't work out so well for them.


> I still get a new phone every 1-2 years because the upgrades are worth it

Really? About the only reason why I'm getting a new phone every so often is when the old one stops being updated. (My Nexus 4 lasted 3 years, and it would have done another year.)

Planned obsolescence, if you want.


The phone market is still rapidly evolving. For instance, I have a Nexus 5X right now. The fingerprint scanner on it alone is worth it. It was an afterthought to me when I got the phone, but now it's the single most important feature on the phone to me. The camera is also better than any phone camera I've had before.

Also, I play some games on my phone, and having better hardware helps a lot with that.


What is so useful about a fingerprint scanner?

I could see how an infrared sensor like the one in that new Caterpillar phone could come in useful, especially if you are in a building trade, but a fingerprint scanner?


You know that it's used for unlocking the phone, right? It's nothing like the use case you're describing.

You have to get a phone with one and use it for a week or so until its use becomes routine to really understand how big of a game changer it is. I subconsciously unlock my phone now every time I pick it up. It's amazing. Every day it saves me probably 60 seconds in total typing in stupid PINs, and those savings add up real quickly.


I'm still using my Nexus 4, that thing has pretty much already set the "good enough" bar we've come to reference for why PC sales are tanking. The only problem is a lack of security updates, but we can't blame the hardware for that.

Incidentally I had a HTC Desire HD before and that became unusable on newer versions, as is the Nexus 7 now. These two clearly didn't have the good enough hardware specs.


Is half of college graduates being unemployed or working a min wage job picking a narrative too? Is pointing out that wages are completely stagnant and not rising picking a narrative too? What information would you need to receive to admit the economy is not healthy?


Yes, don't go studying 17th Romantic Literature expecting a job.


Romanticism is a 19th century thing.


Obviously that's not what he studied in college.


Might be the joke?


Nah, that would be too clever.


I said 50% of graduates are min wage or unemployed. Not the 1% who majored in romantic literature. You also didn't address the fact that wages paid to employees are completely stagnant.


I think the point went over your head. It is not the system's fault that a person who chooses to graduate in a non-marketable field, e.g. Romantic Literature, does not have a job. Everyone I know who graduated in CS has had no trouble finding a job, one that is not at McBurger.


Where are you getting this 50% number from?

http://www.epi.org/publication/the-class-of-2015/ For young college graduates, the unemployment rate is currently 7.2 percent (compared with 5.5 percent in 2007), and the underemployment rate is 14.9 percent (compared with 9.6 percent in 2007).


Anecdote, I was an avid gamer that would upgrade whenever the price/performance was good enough for my budget. However, I bought almost top tier stuff back in 2008 and haven't upgraded the core of my system since. I don't game like I used to, but it still does everything I need it to do. I used to do yearly-ish new builds as well.


Me as well. I built an i5-2500K system back in 2010(ish), and it's still my main home desktop. With 32GB RAM and an SSD it's as fast as the machine at work that is two generations newer (or, more correctly, it's imperceptibly slower).

Things have really levelled out for average loads even developer loads (mostly do web dev, run vagrant machines that kind of thing).

I can't see me upgrading til this thing dies tbh.


Same boat. I built in 2009 and have a Q9450@3Ghz and a Radeon 5870 with 8GB RAM and a one-time top of the line Intel 160GB X25-M SSD. Still works great. There's only 1 thing I don't like about it, with 3 monitor outputs enabled the idle temp on the GPU is pretty hot (~86C). I'm hoping a newer machine will bring that down. Single output it's about 57C, so a huge difference.

But I got tired of 190°F air slowly being pumped out of my case and into the room. Its replacement is finally on the way. I preordered Intel's Skull Canyon NUC[0], with 32GB of DDR4-2800MHz memory and a 512GB Samsung 950 Pro PCIe/NVMe M.2 SSD. I'll be daisy-chaining a single DisplayPort cable to 3 new LCDs as well.

Pretty huge leap in performance. It just made sense to stop building new computers and jump on the NUC bandwagon. All I do is development, League of Legends, and the rare CS:GO. The ~GeForce 750 performance level that NUC will provide will be enough. The inclusion of the Thunderbolt 3 port for an external GPU case really put my mind at ease. Not that I intend to utilize it, but I'm glad it's there. Same upgradability as any other machine: SSD/RAM/GPU. The CPU is soldered, but I never once replaced a CPU after building a computer anyway, other than the few Athlons I killed from overclocking around 2001.

Probably upgrade more often if these new gaming NUCs are as good as I think they'll be. Next upgrade for me will be 10nm + Thunderbolt4 NUC. And the final perk, all-Intel so it'll work great with any Linux distro natively. That's worth a lot to me.

Unless Intel failed hard with this thing, which I highly doubt.. it's Intel.. I'm all-in on NUCs from here on out.

[0]http://www.newegg.com/Product/Product.aspx?Item=N82E16856102...


> Me as well, I built an i5-2500K system backend 2010(ish), it's still my main home desktop

Good pick: it's still one of the faster chips around. However, its TDP is 95W, which is fine for a desktop. Since then, Intel has been concentrating on delivering the same (or, often, much less) speed with lower TDP ratings.

These address the market need for thin, high-priced laptops -- preferably without fans -- that you can use in Starbucks.

You could "upgrade" to a new Intel Core i7-6600U that's actually slower than your i5-2500K but has a TDP of only 15W.


What you folks are saying is mostly true. VR and 4K are the drivers for a whole new compute cycle in the home. I think the current core counts for Xeon are good enough for cloud (by this I mean that I think CPU isn't the bottleneck for the average EC2, Google Compute, or Azure instance). Anyone care to comment?


In the sense of VR/4K the CPU is way less important than the GPU and we still have some headroom on those even with existing process, what will be interesting is when everyone else's process catches up with where Intel is now, they've generally stayed out in front of everyone else for a long time (except AMD for a spell).

I'm really looking forward to VR if it catches on, though. Having an insanely high-resolution headset so I can dump multiple monitors for programming is a big win; combine that with something that has the portability/form factor of an MS Book/MacBook Pro and you'd be able to program as capably from a hotel room as at your desk at home/work.

That would be the biggest shift in my work habits since I went from Windows to Linux in the late 90's.

Also, I think once everyone can get down to the same size as Intel we might start seeing more exotic architectures. Intel has often won with the "with enough thrust, a brick will fly" approach to engineering: it doesn't matter if your chip is clock-for-clock more efficient if Intel is operating at a level where they can put 5 times as many transistors down in the same unit area and ramp the clock speed way up.


Good point. Nvidia seems like the exciting company in this regard. I have a bad feeling they're going to tumble because the latest announcements related to Pascal have focused on deep learning instead of 4K for consumers. As far as I know, they haven't even announced their consumer Pascal cards yet. The rumor I've read online is announcements in June.


What would it mean for Nvidia to focus on 4k? Do the new screen resolutions require architectural innovations in the GPU? (Honest question. I'm not a graphics engineer)


Me neither, so take this with a pinch of salt.

Mostly it's about shuttling around 4 times as much memory for each frame, as well as doing 4 times as much processing.

1920x1080 has ~ 2 million pixels. 3840x2160 has ~ 8 million pixels.

Internally, IIRC, this is often done as vectors before being rasterised out and having various filters and shaders applied, but that step requires that you store multiple buffers etc. Same reason a card that will play a game just comfortably at 1024x768 will run away crying at 1920x1080, I guess.
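The pixel arithmetic above works out like this. A sketch assuming 4 bytes per pixel (RGBA8); real GPUs keep several such buffers (color, depth, intermediate render targets), so actual memory traffic is a multiple of these figures.

```python
# Raw framebuffer sizes at 1080p vs 4K, assuming 4 bytes per pixel (RGBA8).
def framebuffer_bytes(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel

fb_1080p = framebuffer_bytes(1920, 1080)  # 8,294,400 bytes, ~8.3 MB
fb_4k = framebuffer_bytes(3840, 2160)     # 33,177,600 bytes, ~33.2 MB

# Exactly four times the pixels, hence four times the memory per buffer.
print(fb_4k // fb_1080p)  # 4
```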


I can see how higher resolutions will require the GPU to have more memory or more FLOPS or both. I guess my question is, how, if at all, are the improvements required to support 4K different from improvements that target something like deep learning performance?


Not quite, about that last part. Clock rates have been more or less stationary over the last decade or so. What Intel has been focusing on for some time is cache, and how to practically always have the right data in the cache at the right time. But yes, having more transistors to work with does allow them to pack more cache onto the die.


Yeah that wasn't perhaps very clear, I meant historically they ramped the clock up (back in the P4 days).


And ran into a brick wall, iirc.


We still need more CPU in the cloud. There's an almost unlimited appetite; and we still often look for optimizations to our program that reduce CPU load. (At least at Google. Not sure how much I am allowed to say in more detail, though..)


Touché. I've also been involved in building clouds, but ours was focused on customers building web apps. Those are probably not compute-bound. I can see why data processing workloads would need more CPU.


> VR and 4K are the drivers for a whole new compute cycle in the home.

My i7 desktop will be four years old in June. I've done the research and all I need to run the Oculus or HTC Vive (or the upcoming games that look good) is a graphics card upgrade.


As a software developer I constantly find myself optimizing and scaling across CPUs. Sure, those programs I optimized 10 years ago run blazingly fast now, but as computers got faster the possibilities and expectations also increased.

I would like to compare CPUs with roads: roads increase traffic, not the other way around. So if you make a faster CPU, the demand for a faster CPU increases.

So if Intel stops making faster CPUs, the demand will (counterintuitively) go down.


Good point, except Facebook is targeting 6k per eye. People are definitely going to buy hardware, but maybe not in huge numbers.


That's what I've got as well. I'll probably build a new machine (actually, I'll just buy a NUC) this Spring. Another big factor in not upgrading is the effort involved in setting up a new Windows machine. I'm sure I'm in for at least 30 hours on that front.


That's precisely the "issue", and it's only a problem if you make it a problem. The early-adopter and rapid-early-development churn is slowing down because processor technology is maturing and has reached full market penetration, so the market for new devices is shrinking. It's like a hotdog stand calling it a "market crash" when they've sold everyone a hotdog and nobody's hungry after lunchtime.

The automotive industry faced the same issue a long time ago, and their solution was to turn cars into a fashion statement. You no longer upgrade your car because your old one is worn out, you upgrade to signal your wealth through conspicuous consumption. The mobile phone market is somewhat the same way. The desktop PC market? Not so much.


Ain't that a bit sad, all that energy and pollution going towards planned obsolescence so that he with the shiniest toys wins?

There is some legitimate utility derived from novelty -- I don't want to be walking around in my grandfather's battered rags either -- but I wonder if some day we can shift towards something more sustainable, with less of paying to dig consumerist holes and then paying again to fill them back in.


Yeah, I never went in for replacing things just for the 'new car smell'. That's probably why I drive a nearly-30-year-old car. :)

The point about your grandad's battered rags is that they're battered rags, though, not that they're old. His good leather jacket is probably still perfectly serviceable today.


My leather jacket is 20 years old and in great shape.


Maybe they should adjust their business to account for this new cycle of usage, and not fight it.


That's why they're cutting 11% of their workforce.


I must say, I'm surprised by how much their workforce swelled. Why did their workforce rise so much after 2009? Was this in the Bay area? Did this rise bring in additional revenue/create new business lines?


One of the things mentioned in the story is that the "period of [staff] growth includes its two largest acquisitions, McAfee in 2011 and Altera last year".

I'd assume these acquisitions added to revenues.


The PC isn't going anywhere, but the need to update every 30 months has gone away.

It's a sign of market maturity. Intel shouldn't be trading at a premium if it's selling a highly predictable, stable product to a consolidating marketplace.


Intel has always sold predictable, stable products. But the rate of increase in the core value delivered by their primary product line has slowed dramatically.


Computing devices are selling like hotcakes, far moreso than ever before.

But most of them are full of ARM chips fab'd by companies not named Intel.


>>Computing devices

hmm... only if I kid myself that mobile phones (even the smart ones) are actually computing devices on which I can compute whatever I wish to compute, the way I can compute on a PC. Most of the ARM devices sold are NOT computing devices for the general person who purchases them.

It reminds me of the fallacy of calling smartphones "supercomputers in one's pocket".

Maybe the ARM-based devices are "computing devices" for companies like Samsung, Google, or some car makers, but certainly not for the general public.


"sure UNIX is nice, but you need a mainframe for REAL work"

"sure NT is nice, but you need a UNIX workstation for REAL work"

"sure mobile is nice, but you need a Windows PC for REAL work"

I crunched to inbox zero with my phone while commuting to work, while I'm typing this at my "real work computer"


I read that statement more like in terms of the freedom you're allowed to build and run applications for such devices. Mobile devices usually discourage the user to do this.


> I read that statement more like in terms of the freedom you're allowed to build and run applications for such devices

Even if we disregard how arbitrary that definition of "computing device" is, it's still a false proposition. Anyone can easily get a browser, a word processor and spreadsheets on their mobile device or tablet (ARM). That's about as much computing as a significant chunk of the population requires.

I have a Debian chroot on my Android device which gives me Vim, GCC and javac out of the box.


How much "computing" do you think the average user does with a PC?


How much photosynthesis do you think the average planet does with its star? Enough to justify chloroplasts?

How much complexity do you think the average mind generates with its time? Enough to justify the ATP?

How much thinking do you think the average human does with his brains? Are brains worth it? Stay tuned.


Given the growth in trends such as ARM for mobile devices and dynamic resource allocation on the cloud, I still don't think you can make that broad a statement.


Can you make the statement that the economy had nothing to do with it? A business' success or failure is reliant on the economy on a fundamental level. I imagine if college graduates weren't struggling to find jobs and pay off their massive student debts they would have sold more computers, despite the trends working against them.


Because it isn't relevant. This is the impact of moving to the cloud. Those broke and indebted college kids are buying $800 iPhones instead of $800 PCs, because they don't need the hassle.

I'm a power user. I have a 4 year old MacBook and a 2 core * 4GB VDI session. I won't replace the MacBook for another 18 months.

Even in enterprise IT... Every project we did in 2004/2005 required a server purchase with two Xeon sockets and a NIC that might be Intel.

Now the average marginal unit of virtual server needs 1/50 of an Intel Xeon. And a lot of those Xeons are at AWS, which squeezes the margins.


They're focusing on being "the cloud", which definitely makes business sense when you consider that 10% of the world's electricity goes to data centers. Microsoft's next US-East Azure building will be a mile long! I think Intel will do fine if they focus on that market instead of PCs.


Yes, but most of those "computers" (that people do keep buying) happen to be smartphones these days.


What are you going on about?

Both Intel's sales and profit are even YoY in 2015, and up from 2013. This doesn't have much to do with computer sales.


> state of the PC market

This is funny because the margins on ARM processors are razor-thin, since everyone and their grandma compete with each other. Meanwhile Intel can charge a healthy markup for their server processors, which are increasingly in demand as more people on the planet have smartphones.


I don't see why job losses are a bad thing. The last generation of machines was built so well that people don't need new ones. Good! This should be a great thing.

Valuing growth over sustainability makes these backwards ass goals of hiring more, building more and growing. The way this is spun is pretty horrible.

Amazon and Wal-mart losing is a good thing for the environment and society as a whole.

You yell, "But people lose their jobs..." There are more jobs, plus why does everyone need a job anyway? Can people in Silicon Valley not make $150 ~ $250k a year, everyone else get a minimum income and we work together to make better art, smaller factories and a world that will last much longer by recycling and rebuilding and not needing to buy an endless supply of shit?!

"Ending is better than mending. The more stitches, the less riches." -Brave New World, Huxley.


> You yell, "But people lose their jobs..." There are more jobs, plus why does everyone need a job anyway?

This is kind of silly. Job losses are a bad thing precisely because we don't yet live in a world where you can get by on a guaranteed minimum income without a job.

If you think it's worthwhile to push society in that direction, then great, but don't kid yourself that believing in it is the same as having already accomplished it.


Some European countries have already achieved this: your unemployment benefits after getting fired are a significant fraction of your previous salary, and slowly ramp down. You pay for it in taxes (and then some), but at least you know you can still make rent if you get canned.


You also pay for a bureaucracy to perform administration and fight abuse. That consumes a fraction of the system's resources. It's hard to argue that this adds much value, or any.

What this system boils down to is a compulsory savings scheme. So an alternative would be a voluntary savings scheme, which would have higher efficiency by not consuming resources for administration.


Of course the obvious downside to a voluntary savings plan is that it isn't effective at actually creating savings. We know this because in the United States we don't have compulsory savings plans, and thus have a voluntary system. So how's that working out? Not well. Approximately half of all Americans save 5% or less of their income [0], which isn't enough for emergency situations. Another study in 2013 [1] said 27% had no savings at all. Now the main reason for this is quite simple: a lot of people don't make enough to have anything to save.

[0] http://money.cnn.com/2015/03/30/pf/income-saving-habits/ [1] http://money.cnn.com/2013/06/24/pf/emergency-savings/


I think maybe only the Swiss here have a really sustainable model; the others I know are the typical socialist let's-make-more-and-more-debt-with-unsustainable-social-benefits kind (happy to learn if it works somewhere else and can keep working in the foreseeable future without creating huge debt, forcing people to work into their late 60s, etc.).

What the Swiss specifically have: we all pay social deductions, within 3 pillars. If you end up unemployed, you are entitled to 1-2 years of unemployment benefits at 70% of your former income, with some reasonable cap. That is, if you worked full-time, on a permanent contract, for at least a year in a row. Otherwise you're not entitled, or the payments are much lower.

You have to actively search daily for a new job and prove it to your unemployment counselor, otherwise they will cut your benefits. The benefits are really structured to pay you while you are looking for a job, not to fund some extra-long holidays on everybody else's budget.

It's a motivating setup, and since there is a safety net of 70% of my income, I am not frightened by the prospect of losing my job and don't need to store enough cash to live X months/years without any income. Handy when one has a big fresh loan on one's shoulders.
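The capped replacement-rate scheme described above can be sketched in a few lines (the 70% rate is from the comment; the monthly cap is a made-up illustrative figure, not the actual Swiss number):

```python
# Sketch of a capped replacement-rate unemployment benefit,
# as described in the comment above. The cap value is invented
# purely for illustration.
def monthly_benefit(former_income, rate=0.70, cap=8000):
    """Benefit is a fixed fraction of former income, up to a cap."""
    return min(rate * former_income, cap)

print(monthly_benefit(6000))   # 70% of 6000 = 4200, under the cap
print(monthly_benefit(15000))  # 70% would be 10500, limited to 8000
```

Because of the cap, the effective replacement rate falls for higher earners, which is the "worrisome" property discussed further down the thread.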


There's nothing particularly Swiss about it, similar schemes exist throughout Northern Europe.

In Norway you can get 1-2 years benefits at about 60% your previous annual income with a cap near median national income.


These are nice schemes, but the caps are worrisome. Modern professionals in North America pile on so much debt (student loans of a few hundred K, mortgage debt of a lot more, etc.) that the country's average income doesn't reflect reality. In the Bay Area, I hear laments from people making 200K that they cannot make ends meet. It is a crazy world.


You can have your student debt repayments frozen here while unemployed, at the national bank's rate. Plenty of people with high mortgages too, but duh. We aren't living in a perfect world.


Right, they are obviously bad for the people losing their jobs and for the society. But are they bad for Intel? I think that was his point.


>I don't see why job losses are a bad thing.

You'll see it if you lose your job and can't find a new one, as happens to millions of people the world over.

Add a medical or family emergency to that, and you'll totally get it.

Unemployment in the real world is not about some Sci-fi automation fantasy. Might be, but not yet. And even when historically it leads to new jobs or other benefits (something not always a given) the pain and trouble is still there for those unfortunate to be left on the wrong side of those new jobs.


The more jobs are lost, the quicker the demand to fix it, with Universal Basic Income or something similar. We know it has to happen eventually, but the sooner we man up and go for it, the sooner millions of people can get relief from their suffering.


I don't see why job losses are a bad thing.

As long as we move quickly to "you don't need a job to live" because otherwise these job losses are destroying people's lives.


Unless these folks had degrees in Working At Intel, losing a job is not the same as becoming permanently unemployable. Job loss sucks. But it's pretty hyperbolic to say it's destroying people's lives. Particularly for skilled workers, finding a new job isn't all that far off.


>Unless these folks had degrees in Working At Intel, losing a job is not the same as becoming permanently unemployable.

A 50-year-old Intel employee might find it quite difficult to get a new job any time soon. Not to mention that where they live, they'll have thousands of recently fired ex-colleagues looking for similar jobs.

We're talking about thousands of people -- few of them will be highly sought-after chip gurus. In fact, obviously, it's the less sought-after ones that will be let go.

Add a recent mortgage, medical issue or kid at college, and there go their pension savings (or worse). Have the same layoff happen to their spouse near the same time (which is not that rare), and they're pretty much screwed.

Really, it's as if people have no real world experience with these things...


That depends. If you can apply for redundancy, you can lose the good people.

Most of the UK's mobile industry was staffed by ex-Cellnet/BT people who took redundancy with some massive tax-free payouts - senior guys could get 5-figure payoffs and an extra 6 years on their pension.

The head of Vodafone in the UK was an example, and as someone said, at his level they threw in a gold-plated wheelbarrow for you to take your cash away in.


This is largely a problem because of the pitiful savings rate in the US. If you've been saving 0% of your pay, being laid off at 50 is a disaster. If you've been saving 30-50%, this is likely to be a non-issue. Highly paid people in this country spend way too much of the money they earn.


Not only in the US, and not only because of being "highly paid" but "spending too much".

In most of the world, including large parts of western Europe, unless you're upper middle class and higher, it's either hand-to-mouth or smallish savings (nowhere near 30%).


Which parts of western Europe?

I see via Google that the minimum wage in Germany is 1473 euros per month for a full-time worker.

According to [1], food, housing, necessary household items, and public transportation look like they could reasonably total less than 800 euros per person. Add another 100 for having fun and for the occasional wardrobe expense, and you still get to save nearly 40%.

Having children ruins all of this, of course, but that's pretty easy to avoid.

[1] https://www.expatistan.com/cost-of-living/berlin
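The arithmetic above can be sanity-checked with a quick sketch (all figures are the commenter's rough monthly estimates in euros, not official data):

```python
# Quick check of the savings-rate arithmetic in the comment above.
# All amounts are the commenter's monthly estimates in euros.
income = 1473        # full-time minimum wage cited in the comment
essentials = 800     # food, housing, household items, public transport
fun = 100            # having fun plus occasional wardrobe expense
savings = income - essentials - fun
rate = savings / income
print(f"{savings} saved per month, a {rate:.0%} savings rate")
```

That works out to 573 euros a month, or about 39% -- "nearly 40%", as claimed, but only under these optimistic expense estimates.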


Germans save around 9 to 11% of their disposable income, of which the average (post-tax) amount is around €25-30,000. This puts their annual savings at around €2,200 to €3,300.

That's nowhere near being able to save 30-40% of your pay, as the parent said. And Germans are some of the biggest savers in Europe. For places like France, the crisis-stricken "PIIGS" (Italy, Spain, Portugal, Greece, etc.) and especially the UK, it's even worse.


There's a large difference between being able to and choosing to save that much. I didn't say that it was the norm, but it's certainly possible if you have a near-median salary. Not doing so is a choice if you make a reasonably good salary, albeit an extremely common one that most people don't make consciously. It requires consciously tracking your spending, and being willing to forgo or at least cut back on some of the luxuries that have become much more common in the US and presumably Europe in the past 30 years (like eating out frequently). In return, you can get better financial security, even without an astronomical salary.


You paint a contrived scenario and then lament it. What is the purpose?

Where do you have stats showing all of these job cuts will happen in the same location? Where are the stats that show when a person loses their job, their spouse is also likely to lose their job?


It is not a contrived scenario. My dad was laid off 5 years before retirement, and guess what? All his skills were tied to working in the Intel organization, and nobody is going to hire a senior manager in Folsom a few years before retirement. Please try to be a bit more compassionate; it actually affects real people.


The "contrived" scenario is something that has played out again and again, as opposed to the "no worries" attitude.

As for the same location: typically companies concentrate their offices in a few locations -- they don't distribute individual people equally around the country.

Plus, who said that their spouse is "equally likely to lose their job"? It's just an example of a thing that can happen, and does happen, not something that anyone claimed is inevitable or "as likely"...


20% of suicides are linked to unemployment.


You have a "turtles all the way down" bubble mentality.


A relative few Silicon Valley elites living like kings while everyone else lives in a tiny box trying to raise their families on their meager Basic Income stipend doesn't sound like a utopia to me.


How does that work? If I have a meager income I'm not buying an iPhone or a desirable target for ads from Instagram.


Apparently it works just as djslumdog says -- people like him/her will continue doing the small amount of important work that can't be automated, and will be compensated handsomely to do so. They will fly around the world, enjoy vacations, live in large homes, and eat lavish meals.

Everyone else will live in 100 square foot apartments in Soviet-style concrete bunkers, eating ramen (or possibly Soylent). But they won't need to work! They'll spend their entire day painting and composing music!

A lot of people here push this vision, and of course they always imagine themselves to be part of the elite. It really is insufferable.


Poor people are among the most overweight because food is so cheap in the US. I don't see why implementing Basic Income would magically make food expensive or make people have to "eat ramen or possibly Soylent."

Anyway, the world you are so worried about is already here. A small number of people do continue to do important work that can't be automated. Most other work is unnecessary busy work. Artists and musicians already live off of government largesse, either by taking "McJobs" that are artificially highly paid because of the government minimum wage, or by actually being on welfare.

Since people can't wrap their heads around the concept of Basic Income, we end up with hordes of people paid to do stupid shit that doesn't matter. The endgame is that we're all government employees in a Dilbertesque hell. Moving papers from one pile to another and back again. Digging holes and then filling them again, since letting people decide what to do during the workday would be too disruptive.


Using the big number of overweight people as a sign that healthy food is cheaply available is an amazing feat of backwards logic, IMO. A significant cause of being overweight is a diet so imbalanced that your body misses certain important nutrients while getting too much of others. That is the exact opposite of healthy food. And in fact, the "cheap food" most people consume is exactly that - fast food and high-calorie snacks. That sounds a lot more like the concrete-bunker-and-ramen theory again...


If you supplement with protein powder (which is also cheap), count calories, and lift weights, fast food is just fine for not getting fat. Lots of people on /r/fitness have done it on McDonald's or Chipotle.


> The endgame is that we're all government employees in a Dilbertesque hell.

I once heard about the British Colonial Office. The time when it had the most employees was precisely the time when it was dissolved because Britain did not have colonies anymore. They were all just shuffling papers around all day.

I sadly don't have a source for that. Does anyone have a link, by any chance?


> Does anyone have a link, by any chance?

C. Northcote Parkinson, 1955. http://www.economist.com/node/14116121

"A glance at the figures shows that the staff totals represent automatic stages in an inevitable increase. And this increase, while related to that observed in other departments, has nothing to do with the size - or even the existence - of the Empire."

What I don't know is where Parkinson got his own figures from. Some of them are cited in the article, but I think not this one.


But what's the alternative? Don't give anyone even the meager means of living? Communism? (That worked out well.) In the coming years, tons of people will be out of jobs and incapable of participating in the shrinking, not-yet-automated-away economy. What do we do?


You don't get it? Basic income IS communism. We/they also had a guaranteed income, because everybody had to have a job (= income), even if the job didn't make sense.

The elites (political ones under communism vs. the actually-still-working people in the BI scenario) have an amazing life with various luxuries. The rest will be allowed to exist, not die of starvation, and have a lousy place to sleep. Truly bright future...


You haven't answered the question: what is the alternative?

At least with BI, you don't have to toil away at a time-wasting job that's just busy-work: you can do something you enjoy, and maybe we'll all get lucky and you'll write the next Harry Potter series and become a multimillionaire, while paying a bunch of taxes to keep the system going. If not, at least you have the dignity of not digging holes and then filling them back in all day long to justify your paycheck.

If you don't like this, then let's hear your alternative. The only alternative to this that I can think of is to ban automation, or strictly regulate it so that any automation which does a job that humans can do is banned (allowing only automation which does things that humans cannot do). Do you really want to go that route?


Pedantic, but an iPhone is probably a way better purchase than most things if you are low income.

For $400 you get a device that will let you watch movies, play games, check email, get phone calls, etc.

Granted you could also get a cheaper phone but this replaces your TV and computer! Smartphones are really high-value!


Why do you think only an iPhone can do that? Pretty much any $50 used smartphone can do all those things.


Honestly, $50 smartphones tend to suck a lot. Even the highest-end Firefox OS phone for a while had major keyboard lag. Chances are trying to use things like Skype/VOIP over them would be painful....

I think if you're looking to invest in one useful thing, a good phone would be a great purchase. Maybe the Moto G at $200?

Amortized over a year, you're talking about a lot of bang for the buck.


I still use a Samsung Galaxy S3 with CyanogenMod that I bought for €60 and that still meets my needs. If you are on a budget, buying a new, shiny smartphone is a waste of money IMHO, as you can get a cheap, used but fully functional one for a much lower price.


So do I, on CyanogenMod 10 (Android 4.4.4), and it handles all web/music/video needs. I guess the latest games, which you wouldn't really want to play on a mobile anyway, might have issues.

The S3 generation of hardware has stood up very well.

My only reason to upgrade would be for some good security hardware that can support full disk encryption with no slowdown.


>> watch movies???

How do you expect a person with a meager income to pay for the network bandwidth and usage bill?

>>Smartphones are really high-value!

This I agree with wholeheartedly. At least for people like me, who enjoy reading books, smartphones (with sites like Project Gutenberg) provide such a utopia that I feel like I'm living in heaven. Of course, I know that smartphones don't solve all problems automatically, but from a knowledge-seeker's PoV, they are of great value.

Also, I understand that not all people like to read, but that's a different thing to ponder upon.


Don't worry. For BI to occur, the kings will first fall. Remember, they chose this course, and will follow it until the end-- they always do.


It's important to distinguish "a bad thing for the economy in general" from "a bad thing for an individual's share of the economy". Job loss without a corresponding drop in production is a good thing for the economy as a whole - it's getting what's demanded more efficiently. It's terrible for the unemployed individuals in question, since they lost the leverage they were using to extract a larger share of the total economic output.


Yeah, but what's the societal point of efficiency when wealth has become so concentrated, that the economic ladder has been pulled up?


While that might be a nice long term view, it isn't fun to go through the entire process for the average person, heck it's not even pleasant for many of the "1%ers".

If a lot of companies suddenly go bust that are today supposedly getting in the way of innovation, it might be good for long term innovation, but in the short term, say 10-15 years it can really be hell. Not to mention that in this 10-15 year period while the economy is hemorrhaging it makes the entire economic ecosystem very vulnerable, and significantly reduces its capacity to deal with external attacks.

That is while US companies may start failing, who's to say that a foreign company won't take its place. Then you have to start taking isolationistic measures, which is a whole other mess.

The ideal scenario would be to see these companies slowly, over a period of decades fade out, and be replaced by others, which I honestly don't see happening as of yet in SV, but things like this generally do come out of the left field.

So if we see these companies shrink rapidly, with no one else to take the mantle, I honestly think it would be a dangerous situation as it can be a catalyst for something bigger.


>The whole market is overvalued, not just the tech unicorns.

Yep. That's what Peter Thiel is saying as well: https://news.ycombinator.com/item?id=11485376


I am not surprised. I am a huge fan of Thiel, his book Zero to One was a real eye opener for me.


Aside from whether he's right or not, that is the belief one would expect him to hold; his other option is "looks like I'm a bad investor after all".


One can't be a bad investor and learn from their mistakes, presumably?


Yes, one can learn, but when one's investments are correlated with how one talks about them, it's best to keep those lessons to yourself and make better investments in future.

"My investments are over valued, oh the lessons I have learned"

"All investments are overvalued, oh the lessons we will learn"

are two very different statements.


You should start a new index, called the 3 index, that just looks at 3 companies. That should be enough to gauge the market, right?


The DJIA only has 30 companies, which is always surprising to me (to its credit, it tracks the S&P 500 pretty well).


The DJIA is also share-price weighted. It's an objectively bad index and I find it surprising that it's still widely followed.


> yet, the Dow just hit a 9-month high, and the S&P 500 is now above 2100.

It's because bad news is good news in the twisted upside-down world of speculating on the Fed's interest-rate policy direction via the stock market. Expectations of rate hikes get pushed further into the future with every bad economic indicator released, and there are many (retail sales, auto, housing, industrial production, rail traffic, oil and gas, capex). I will continue to wait for the day the market finally figures out that interest-rate policy frameworks are broken, the Fed has no more ammunition, and the stock market jumps out the window rather than using the stairs. At that point, I'm buying lads.


What's a lad? As someone who instinctually believes this is a house of cards, understanding how to prepare financially is a real challenge.


A group of males, like a flock of swans ;)


Deposit in an FDIC-insured account or US Treasury securities of short duration.


I love it when people make doom and gloom predictions. If you are so convinced, put your money where your mouth is. Short the S&P500 with everything you've got and post a screen shot here.


I've sold all of my stocks and RSUs in anticipation of a crash.


>The whole market is overvalued, not just the tech unicorns.

Who is it overvalued by, and how is the "true" value determined?


Suppose the market just tracks widget factories. Every factory produces the same quality widgets, all shipping is flat-rate by distance and there are no tariffs or taxes. In this market, it's easy to determine market value of any particular widget factory: market share of world-wide production.

Suppose everyone expects the demand for widgets to grow by 1% per year for the next 10 years. This growth might get priced into the market with some sort of net present value calculation. Capital will flow into building new widget factories to fund all of this new construction.

Now what if actually an asteroid hits a city with lots of widget demand. 50% of demand is wiped out instantly. Well in this case, everyone was overvaluing the market for widgets! All of those net-present-value calculations were way off. The market will correct.

Or maybe the asteroid takes out 50% of production. Suddenly much more investment in widget factories is required. The price shoots up to capitalize all of the construction, since demand has not changed.

In either case, before the asteroid the market was not correctly valued since no one had yet priced in the asteroid.
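The net-present-value mechanism invoked here (and the earlier point that low rates imply higher prices) can be illustrated with a quick sketch; the cash-flow figures are made up for the hypothetical widget factory:

```python
# Net present value of a stream of future cash flows:
#   NPV = sum over t of cf_t / (1 + r)**t
# A lower discount rate r raises the present value, most strongly
# for cash flows far in the future.
def npv(cash_flows, rate):
    return sum(cf / (1 + rate) ** t
               for t, cf in enumerate(cash_flows, start=1))

# Hypothetical widget factory: 100 per year for 10 years.
flows = [100] * 10
print(round(npv(flows, 0.08), 2))  # higher discount rate -> lower value
print(round(npv(flows, 0.02), 2))  # lower discount rate -> higher value
```

If the asteroid halves expected demand, the `flows` list shrinks and the same formula yields a much lower value -- the "correction" is just the NPV being recomputed with new inputs.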


It wasn't really overvaluing the price of widgets.

Valuation only includes all available information.

You can't expect random acts of God to be priced in.

Overvaluation is when psychologically everyone is bullish, but the facts don't back that up.


This is what I was getting at. "Valuation" of "the market" is far more subjective and psychological rather than objective and empirically measurable.


The actual information from markets is in estimated future prices, not the current bid/ask spread.

The true value is determined by what happens next, for short to medium values of "next."

Current fundamentals simply don't support a future of continuing growth.

A correction is certain. The only questions are when, how fast, and how much.


>Bubble

Four companies don't represent the market. Come on. I swear some users here will wish bubbles out of thin air to gripe about.


I don't think anyone above said "bubble". Overvalued is a different thing: markets are usually either over- or undervalued, and they're probably overvalued today. Bubbles proper are kind of rare, the last major clear one being the 2006 housing bubble. You could argue there's a bond bubble presently.


You're right - claiming the markets are overvalued like the unicorn bubble isn't literally saying there is a bubble; it's just implying there is.

There is most likely a corporate bond bubble, as we know from representative market signals. With the S&P 500 though, four companies are not representative signals.


Yup overvalued. The GMO report which is about the best source I've found for this stuff had the projected 7 year return for US equities at -2%/year which is not great. https://www.gmo.com/docs/default-source/public-commentary/gm... p9


Asset inflation caused by the Fed flooding the banks with cheap dollars.


That's what happens when you add more money to the system. It flows to highly liquid assets first.


Well, there's no point in putting your money into bonds right now. For all the interest you get you may as well use it to stuff your mattress.


To be pedantic: the market is perfectly valued at every moment. Of course market prices will change, sometimes very quickly.

BTW, this is a great site for tracking S&P ratios: http://www.multpl.com/


The efficient market hypothesis (the idea that the market is by definition perfectly valued at every moment) is highly controversial, and actively rejected by many finance and economics experts of all political views.


My point is simply a tautology: the price is the price. Now the price may change, someone may have their thumb on the scale, there may be shill bidders driving sentiment, etc. But at the moment you get a bid/ask spread, that is what you get.

Practically speaking, I personally don't believe the efficient market hypothesis due to the very obvious meddling by political actors, central banks, national treasuries, etc. Behavioral finance has also consistently demonstrated that humans do not react rationally in markets.


That may have been your point, but it's not what you said. "Perfectly valued" means that the price is what it should be (for some definition of "should").


"Perfectly valued" to me means that at that moment, the price reflects the actions of everyone. It includes the actions of all buyers/sellers who choose to participate and not participate, the regulators, as well as the meddlers.

My point is a pedantic one referring to the OP using the word "overvalued". The market price is never over/under valued. It is merely the market price at that moment.


Your definition of price and value, while correct to a certain standard, gives no illustrative power, nor does it lend itself to analysis or discussion.

As you said, it is merely tautological.


Your point sucks.


Please add 'capitalists' to that list of meddlers, to be fair. ;-)


Not the OP, but those finance and economics experts are contradicted by Gottfried Wilhelm Leibniz and his belief that we live in the best of all possible worlds.

In other words I think it's futile to try to attach moral attributes (for example saying that a market is "best" valued at the moment or not) to things like financial transactions, which don't have any intrinsic moral values associated with them. Otherwise we risk talking about long-dead German philosophers when trying to illuminate the problem.


You can thank the Fed for that. I always wondered why Janet Yellen worries so much about stock markets. I mean, isn't it her duty to look at the economy and make decisions? Instead, all I see are ways to placate the market by postponing things forever.


US Steel just got rid of 25% of its non-union workforce. ATI just announced that it would be laying off 1/3 of its non-union workforce.

Carrier is moving manufacturing operations to Mexico.

We're on the verge of something really big and really bad.


Of course it is... it has been, and is being, totally propped up by the Fed monetizing debt with equity purchasing, i.e. QE I, II, and III, with QE IV being considered soon[1].

This is nothing but an asset bubble that is sure to burst someday, akin to what the Gov't, the credit agencies, and WS did with housing and mortgages back in the 2000s that led to the meltdown and TARP.

[1]http://www.safehaven.com/article/39140/jobs-report-moves-fed...


How can the whole market be overvalued? Prices are relative, no?


It works like this: imagine you have a huge amount of money, enough to buy whole companies like Apple, Coca-Cola, and Walmart. You want to keep your money and even make it grow, so you're looking for good businesses to buy.

I have one company for sale at USD 329 billion; it grows 50%-100% a year for now and currently generates USD 3.29 billion.

I have another company for sale at USD 219 billion; it is shrinking at 1.4% a year and currently generates USD 14.69 billion.

And finally I have another company for sale at USD 200 billion; it grows/shrinks between -8% and 60% a year and generates USD 7.35 billion.

Most people prefer the second one, which makes you a good 7% return a year. The first one looks good, but will take at least 6 years to be as good as the second, and the last one is so-so. The companies were 1) Facebook, 2) Walmart, and 3) Coca-Cola. (We're ignoring how many assets and liabilities they have for simplicity.)

And as you can see, even though Facebook has a promising future, we don't know if it will reach as good a value/margin level as Walmart, so we may agree it's overvalued at its current price.

Hope this sheds some light. I'm not a trader or anything, so this info is very simplistic.

P.S.: this analysis is called Fundamental Analysis; you can go deeper, and honestly it has worked very well in my portfolio. I bought stock in Gerdau (GGB), a Brazilian steel company, because I studied it as a whole business and discovered it was priced very low. That was starting in November; I bought those stocks at 1.22 and recently they reached 2.44, so I basically doubled my money in 6 months. Remember, stocks aren't just tickets; they are little parts of a big business.
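For concreteness, the three-way comparison above can be run as simple arithmetic (using the rough price and earnings figures quoted in the comment, in USD billions; these are illustrative, not exact filings):

```python
# Earnings yield and simple payback (P/E) for the three companies above.
# Figures are the rough ones quoted in the comment, in USD billions.
companies = {
    "Facebook":  {"price": 329, "earnings": 3.29},
    "Walmart":   {"price": 219, "earnings": 14.69},
    "Coca-Cola": {"price": 200, "earnings": 7.35},
}

for name, c in companies.items():
    yield_pct = 100 * c["earnings"] / c["price"]  # annual return if earnings stay flat
    payback = c["price"] / c["earnings"]          # years to earn back the purchase price
    print(f"{name}: {yield_pct:.1f}% yield, ~{payback:.0f} years payback")
```

Walmart comes out around 6.7% a year (roughly a 15-year payback), versus about 1% (a 100-year payback) for Facebook at flat earnings, which is why Facebook's price only makes sense if the growth continues.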


That's very interesting, thank you for posting this!


sure, man. glad it helped you.


In one sense yes — if the sum total of the market is '1' and everyone is just some fraction of that, then there can only ever be relative price changes.

But the sum of all prices does in fact vary. I'm sure you're familiar with Tulip Mania. Prices can be bid ever-higher in a cycle, and they can also decline in tandem as in the depression.

For the entire market to have an expanding total price, new money has to be entering the market, or the net real value of the underlying financial assets has to be declining. Either way, the entire market can become overvalued or undervalued, especially compared to non-financial (or illiquid) assets denominated in the same currency — e.g., wages, energy, or land.


> How can the whole market be overvalued? Prices are relative, no?

Only if you subscribe to the "Greater Fool" theory[0]. On the other hand, if you think the stock price should reflect some sort of intrinsic value, say the net present value of the stream of all future dividends, then it could be the case that you would never recoup your investment with any stock currently.

[0] https://en.wikipedia.org/wiki/Greater_fool_theory
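A standard toy model for "net present value of the stream of all future dividends" is the Gordon growth formula, P = d / (r - g). A minimal sketch (all numbers here are invented for illustration, not a claim about any real stock):

```python
# Present value of a dividend stream growing at rate g, discounted at rate r.
# The geometric series only converges when r > g.
def gordon_value(d_next, g, r):
    """d_next: next year's dividend; g: dividend growth rate; r: discount rate."""
    if r <= g:
        raise ValueError("discount rate must exceed growth rate")
    return d_next / (r - g)

# The same $2 dividend stream justifies a much higher price when rates are
# low, which is the low-interest-rate argument made elsewhere in this thread:
high_rate_price = gordon_value(2.0, g=0.02, r=0.08)  # ~33.3
low_rate_price = gordon_value(2.0, g=0.02, r=0.04)   # ~100.0
```

Halving the discount rate here triples the "intrinsic" price, so the same stock can look fairly valued or wildly overvalued depending only on the rate assumption.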


Short answer: No. E.g. if you buy 'shares' in an index (via an ETF), you're effectively buying a weighted basket of the component shares in the index. If the individual shares are overpriced then so is the index as a whole. Hence the S&P500 has a price:earnings ratio, and you can decide for yourself if that ratio is looking 'toppy'.
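As a quick sketch of that point: an index-level P/E is just the components' total market cap over their total earnings, so overpriced components necessarily mean an overpriced index (toy numbers below, not real S&P 500 data):

```python
# Cap-weighted index P/E from component market caps and earnings (both in $B).
components = [
    {"cap": 500, "earnings": 25},  # component P/E 20
    {"cap": 300, "earnings": 30},  # component P/E 10
    {"cap": 200, "earnings": 5},   # component P/E 40
]

total_cap = sum(c["cap"] for c in components)
total_earnings = sum(c["earnings"] for c in components)
index_pe = total_cap / total_earnings
print(round(index_pe, 1))  # 16.7
```

Note the index P/E is the cap-weighted harmonic mean of the component P/Es, not their simple average, which is why a few huge expensive components can drag the whole index up.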


This is effectively a claim that the dollar is being undervalued.

Quite a few people would disagree.


The market movement can be explained somewhat by the fed's rate hike expectations changing (worsening economic conditions means that the fed is more cautious about raising rates, so discount rates are lower and valuation models are higher). It's also about what the market expected earnings to be. Yes, the tech earnings are beating up the NASDAQ, but people actually expected worse from the financial sector given the very low rates.


If helicopter money is the next step after 0 to negative interest rates, the stock market may be a pretty good deal right now.

Get paid a lot more money, receive double-digit gains on your retirement account year after year -- and be completely fucked in the process, ironically, because prices went up even faster and it's more profitable to simply hoard things than sell them.


People's QE seems to be the very last thing they want to do, which I think really tells a story here.

Considering the money the government is still pumping into the market directly, I don't think people's QE will have the same effect. It really depends on how it's executed, though.


Healthcare as a sector is doing pretty well. Consolidation down to 3 major players will likely increase their margins further.


Which aspects are you referring to?


> The whole market is overvalued, not just the tech unicorns.

To the extent that Intel and the tech unicorns are complements, Intel's profits declining may actually signal that tech unicorns are undervalued.


Can you explain why the Goldman and MS stocks didn't drop? Overvalued means the stock prices should drop.


I have no opinion on whether GS and MS are overvalued, but here's a fun quote from Keynes. "The market can stay irrational longer than you can stay solvent."


It feels mostly all fake.


Money's gotta get parked somewhere.


Spot on!


hard to understand the market.


This is what the end of Moore's Law looks like.

* Tick-tock is dead.

* 10 nm is severely delayed.

* EUV is severely delayed.

* Significant layoffs in R&D

* The ITRS roadmap is vaguer than it's ever been.

* Giant mergers are up (Intel+Altera, KLA+Lam, etc.), concentrating the industry more than ever.

* And ultimately: A 5-year-old PC still works just fine.

When I say this is the end of Moore's Law, I'm not trying to be dogmatic. Of course there will still be a semiconductor industry and of course there will still be amazing technological progress. But it seems the rate of that progress is slowing, and now the industry is adjusting.


And ultimately: A 5-year-old PC still works just fine.

I suspect that is really the dominant factor here. It's not that the progress within the chip industry has stopped; there is far more number-crunching power in the CPU and GPU in my latest PC than in the one from five years ago. And if you're doing things like playing demanding games or modelling skyscrapers in a CAD package, that progress is probably very useful. It's just that for what most people use PCs for, it doesn't matter, because they were already good enough so unless their old one broke they don't need a new one anyway.

Not only that, but when it comes to replacement, smaller and more convenient devices like smartphones and tablets will do everything a lot of people need without needing a PC at all these days. If you mostly used a PC for things like staying in touch with your friends or retrieving information from a web site, rather than anything creative beyond a quick bit of typing or anything that needs more powerful equipment, you might not even have a laptop any more.

Given that Intel has never had much penetration in the mobile device market compared to ARM designs, it doesn't seem that surprising that the demand for their products is waning as the mass market moves in that general direction.


Moore's law does explicitly talk about semiconductors - with the recent strides in light-based computing (however small) we might be able to continue the overall computational power trend in the future.

I don't think we have to dig this grave just yet.


Interesting point. If we are witnessing the end, would we recognize it at the time? Moore's Law has been something we've become accustomed to, so it almost seems hard to believe that it will end even if we are told that it will.

Makes me wonder what research money will be spent on instead of just faster chips.


Worst case scenario: nothing. The R&D goes away. Look at aerospace for a nightmare scenario.

In the early 20th century we got fixed wing flight.

In the teens and 20s we got motorized fighters and the first passenger planes.

In the 40s we got jets.

In the 50s we broke the sound barrier and orbited Sputnik.

In the 60s we landed on the Moon.

In the 70s we... stopped going to the Moon.

In the 80s nothing much happened except declassification of a few things (stealth) that were developed in the 60s and 70s.

In the 2000s we grounded the Concorde. Passenger flight got slower and more expensive.

In 2016 we fly on passenger planes no faster than what we used in the 70s and 80s, and we're stuck in low Earth orbit.

1969 was the peak of the aerospace industry. With the exception of SpaceX (which is really just picking up where NASA left off), we are less advanced today than we were in the 1960s.

There's many places we could go beyond conventional Moore's Law: multi-dimensional chips, optical, quantum, exotic materials with very low power consumption, etc. But if what we have is "good enough" and there is little demand for anything faster, the R&D dollars won't be spent. If anything the shift toward mobile computing and wimpy thin client endpoint devices might actually lead to a pull-back and loss of capability similar to the one we saw in aerospace after the 70s.

The consolidation we are seeing is not a good sign. This is what happens when an industry decides it's now a cash cow and it's time to go out to pasture.

We also could have a base on the Moon and Mars right now and be working on our first interstellar probe to Alpha Centauri. Physics didn't stop us. Economics and politics did.


That's not entirely wrong but you're ignoring the massive price difference for air-travel between the 60s and "slower" 2016.

The price drop in air travel can be attributed to:

1. Technology: cheaper (per seat), more efficient planes.

2. Consolidation of airlines (and airplane manufacturers).

3. The disruption of full-service airlines by low-cost carriers.

In semiconductors we have been getting along with 1 and a bit of 2. I would say that the disruption of Intel by ARM is an example of 3, because Intel is not incentivized to compete at those low price points.


3. The disruption of full-service airlines by low-cost carriers.

In the US at least, this was due to political deregulation. The process was started by Nixon, but finished in law by Jimmy Carter (!), note also the leading lights of the Democratic party in the signing picture, e.g. Teddy Kennedy 2nd from right: https://en.wikipedia.org/wiki/Airline_Deregulation_Act


There's no carbon on the moon. Pretty challenging for a human settlement.


This is pretty depressing.


Maybe it is. But if the general public is happy with what they already have, manufacturers tend to iterate instead of innovate. Getting from Europe to Australia two times faster, for ten times the normal ticket price, makes me settle for the slower option.

Even a 200% increase in CPU performance won't bring the general public back to desktop PCs they don't really need, because they are satisfied with what they have. We need something truly disruptive to get people interested in something else.

Right now there are a lot of experiments (AR, VR, wearables, transparent screens, lightfield, new batteries, etc) which will, probably when combined into a truly attractive package, change everything. Again.


One must understand that this is re-structuring. Intel is probably letting go of some divisions it no longer intends to pursue. In the not-so-distant past, Microsoft did an internal re-structuring when Satya Nadella became the CEO. It isn't as bad as it is portrayed to be as most people end up getting re-hired in other groups or take the severance and join a new company.


and I am not down-voting you, I am re-structuring your karma.

Yes, this may be good for the future of Intel or the employees losing their jobs... but right now, 12000 people are having their lives affected in drastic, unexpected ways.


I agree that it's likely the majority of those people are in unenviable positions. I've been through cuts; it sucks. But at the same time, as you say, it may be untenable for Intel to keep carrying them. Also, it's possible Intel should have been more cautious in hiring (and maybe some of those people would have opted for personally more stable jobs, or maybe they would have taken worse jobs; it's hard to say).


It doesn't say that they are firing 12,000 people. Certainly some people will be laid off, but a reduction in head count of 12,000 is distinct from layoff count.

Assuming people stay in a job 4 years, you can get a 25% reduction in headcount per year by just not hiring. Intel has over 100,000 employees. They are likely hiring 10k-25k people per year just to stay at a constant size.
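The attrition arithmetic above, spelled out (the 100,000 headcount and 4-year average tenure are the rough figures from the comment, not Intel's actual numbers):

```python
# With ~4-year average tenure, roughly a quarter of staff leaves each year,
# so a hiring freeze alone can shrink headcount dramatically.
headcount = 100_000
annual_attrition = 1 / 4  # implied by an average tenure of 4 years

departures_per_year = headcount * annual_attrition
print(int(departures_per_year))  # 25000
```

On that assumption, a 12,000 reduction is well under half a year of natural turnover; the real uncertainty is how far off the 4-year tenure figure is.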


With the CFO being let go, I doubt most people will end up getting re-hired in other groups. Their revenues are down, and they're closing up shop in some areas or realizing they need to run some areas with fewer people. It's actually a pretty big deal.


the CFO being let go

He's not "being let go".

From the article: Stacy Smith, who has been chief financial officer since 2007, will move to a new role as head of manufacturing and sales

I'm not sure what to make of the move, but if they really wanted to let him go they wouldn't have given him this different role.

Edit: just to elaborate, Intel is first and foremost a manufacturing company. Their manufacturing currently has a 61 percent gross margin. You don't put someone in charge of that if you want to let him go!?

But I'm not an Intel employee or close observer of the company. Maybe some Intel insiders can chime in with that they think this means.


Intel's profit has grown 3 percent and their revenue is up 7 percent compared to the same quarter a year ago.


You're right. I made a bold statement without looking into the details. Looking at their yearly balance sheet, though, their yearly liabilities are going up. They've got quite a bit of debt.

A good 1/3 of their assets are in property, plant, and equipment value. That means the fair market value of that stuff can tank if they don't keep up the pace of growth: what if they can't sell as many of their products anymore because of a shift to other technologies?

This is for sure a move because their revenue forecasts are grim in many areas they operate in.


News like this out of the blue is strange. I was pretty sure that the post-PC era was a sham. It still is, right?


I recently started working on a little eCommerce project. I found a way to indirectly estimate the sales of some of my competitors' products. So I downloaded about 13 GB of this data, processed it down into about 80 million rows, and then ran a bunch of stats on it. Now 80 million rows is not "big data" scale, but it's not super small either. With my 16 GB of memory (which I think I paid $400 for), my humble quad-core i7 that I paid $300 for, and my 500 GB SSD that I paid $100 for, my less-than-$1k desktop crunched through this dataset with no effort at all. I consider myself a "power" user, and my humble machine can do everything I can throw at it, and more. It'll be a long time before I upgrade again, and when I do, I'll probably buy the cheapest processor on the market. I'm probably not alone here.
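For what it's worth, 80 million rows is comfortably a one-machine job if you stream it. A minimal sketch of that style of pass with pandas (the data, file handle, and column names here are invented; the parent didn't describe their actual pipeline):

```python
import io

import pandas as pd

# Tiny stand-in for the multi-GB download; real use would pass a file path.
raw = io.StringIO("product,units\nA,3\nB,5\nA,2\n")

# Stream the CSV in fixed-size chunks so memory stays bounded regardless of
# total file size, accumulating per-product totals as we go.
totals = {}
for chunk in pd.read_csv(raw, chunksize=2):
    for product, units in chunk.groupby("product")["units"].sum().items():
        totals[product] = totals.get(product, 0) + int(units)

print(totals)  # {'A': 5, 'B': 5}
```

With a realistic chunksize (say, a million rows), the working set stays a small fraction of 16 GB no matter how big the input file is.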


My 3+ year old Lenovo laptop running an i5 with 8GB of RAM and a 500GB HDD (at 5400 RPM, no less) easily lets me multitask with Word, Photoshop, and 20+ Chrome tabs open.

The only time I've felt my computer to be "slow" was when I tried to use Photoshop and After Effects simultaneously.


Yup. I do plenty of "medium data" work on my overclocked i5-2500k and 64 GB of RAM on Windows 7 / Ubuntu mixed machine. I've had the processor since the 2500k was the thing to have (years ago). No reason to upgrade yet. Got a new GPU after 6 years but that was about it.


This sounds intriguing, care to share any insights? Is this for Amazon?


Yeah, I want to hear about estimating competitor sales too.


Maybe I'll do a write-up in the future. I assume everyone does it already, but on the off chance they don't... I don't see any upside in writing about something that might help my competition.


There are plenty of PCs.

Here's the problem. Running a five-year-old PC used to be an issue.

Today, a five-year-old PC is an Intel Sandy Bridge i7-2600K (Passmark score: 8,518), while a modern i7-6700K has a Passmark score of 10,987.

FIVE YEARS and FOUR generations of processors have produced a net gain of about 29% in multithreaded situations. Far less for single-threaded applications (maybe 15%), and absolutely negligible for gamers (who are almost entirely GPU-bound).

If you're running a 5-year-old i7-2600K, there is absolutely no reason to upgrade to Intel Skylake. None at all. Maybe you want a new GPU to play those VR games... but Intel isn't making gains anymore in processor speed.

Intel has been trying to get people to buy their power-efficient designs (Skylake is a hell-of-a-lot more power efficient...) so Intel continues to sell laptops at a decent rate. But no one I know has major issues with their desktop speeds.

The only people I know who have upgraded their computers are those who have had hardware failures. There's still no need to upgrade a computer from Sandy Bridge.
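For reference, the multithreaded gain implied by the two Passmark scores quoted above:

```python
# Relative gain from i7-2600K (2011) to i7-6700K (2015), per the quoted scores.
sandy_bridge = 8518
skylake = 10987

gain_pct = (skylake / sandy_bridge - 1) * 100
print(round(gain_pct))  # 29
```

Roughly 29% over four generations works out to about 6.6% per generation compounded, which matches the ~5% per-generation IPC figure cited further down the thread.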


People repeat that meme that Sandy Bridge doesn't need replacing so often. You say it 2 or 3 times yourself.

While that's the prevailing opinion, I don't necessarily agree. I think it's Sandy Bridge owners trying to convince themselves more than anything else; really it's just getting swept up in the groupthink.

Skylake is the first chip that makes a very strong case as an upgrade. You gain NVME support for your PCIE SSDs, DDR4 (which has shown an improvement over DDR3 in some benchmarks), roughly 20% IPC improvement (5% per gen give or take), DX12_1 feature level IGP, CPUs with 128MB L4 cache which absolutely destroys chips that didn't have this for gaming (Broadwell had it first and Skylake's is improved upon), vastly more power efficient and Thunderbolt3 support.

7 pretty good reasons off the top of my head. You can dismiss each of these if you want, but this is all very attractive in reality.

The whole story is that SandyBridge is only competitive, in gaming, if you overclock to 4Ghz+. You still lose out on the other improvements though and any stock SB system compared to a stock SL system will look pretty sad once you factor in the platform updates.

If it's gaming you care about, take a look at the benchmarks of the 128MB L4 Broadwell chips compared to Devil's Canyon. Let alone SandyBridge. Both get crushed where it counts and Intel is just now getting Skylake 128MB L4 CPUs out the door. If you don't care about gaming, Skylake still crushes SB.


> Skylake is the first chip that makes a very strong case as an upgrade. You gain NVME support for your PCIE SSDs

http://www.amazon.com/Ableconn-PEXM2-SSD-NGFF-Express-Adapte...

If you care about NVMe, just get a $20 expansion card. Besides, NVMe SSDs are expensive; a Mushkin Reactor 1TB is $210.

Hell, the fastest NVMe SSDs directly go into PCIe lanes. So if I actually cared about the faster speeds, I'd jump to an Intel 750 SSD.

http://www.newegg.com/Product/Product.aspx?Item=N82E16820167...

Name me an M.2 NVMe SSD that is more cost effective than a Mushkin Reactor (1TB, MLC, maxes out the SATA6gbps bus, $200), or has faster I/O speeds than a 750.

Yes, if I had a laptop which only had room for a M.2 card, then maybe I'd get the Samsung M.2 card. But even if one were given to me for free, I'd rather get the $20 PCIe expansion card.

I can't think of a single situation where I actually need the onboard M.2 card on the Skylake motherboards, aside from the $20 convenience.

> roughly 20% IPC improvement (5% per gen give or take)

I admit, this is a good thing. But it's very, very little, especially when you consider that the iPhone 6 to iPhone 6s jump was a 70% IPC improvement AND a battery improvement, yet many people don't consider that enough of a jump.

http://www.imore.com/a9-processor-iphone-6s-and-6s-plus-70-f...

Soooo... FIVE years gets you +20% speed, while ONE year gets you +70% speed on phones. That's why desktops aren't getting upgraded.

> DX12_1 feature level IGP

You buy a $300+ CPU without buying a $100 GPU? The cheapest of GPUs are significantly better than IGP. Hell, if I cared about DX12_1 IGP, I'd get an AMD A10 for half the cost and twice the IGP performance with drivers that actually work on games.

Except I game in capacities that far exceed even AMD's superior IGP. I also care about adaptive sync / GSync technology, which isn't supported by Intel Iris. So I have a R9 290X. Intel's IGP doesn't even come close to a $100 GPU, let alone the midrange GPUs.

> CPUs with 128MB L4 cache which absolutely destroys chips that didn't have this for gaming

NOT on the desktop. Crystalwell is laptop-only, and 45W to boot. Compared to 20W Laptop chips, I don't see the Crystalwell L4 Cache actually being useful to the majority of people.

In fact, I don't even know of any laptops with Crystalwell that I'd recommend to anyone. Here's a challenge: Name me a good laptop with Iris Pro / Crystalwell. Hint: Macbook Pro use AMD dGPUs for a reason.

And hell, we aren't even talking about laptops. We're talking about desktops, and Crystalwell is UNAVAILABLE in desktop form. It's irrelevant to the desktop user, even if you thought that paying $600+ for a CPU was cost-effective (instead of buying a Sandy Bridge E5-2670 for $70 from Amazon).

Basically, you got DDR4 RAM and +20% IPC. That's all the last five years will actually get you. Or you can buy an 8-core, 16-thread E5-2670 for $70... hell, two of them, get a nice server board for $300, and have a BEAST of a machine.

http://www.techspot.com/review/1155-affordable-dual-xeon-pc/


The base Macbook Pro 15" uses Crystalwell and has no dGPU.


Yeah, but would you seriously recommend it over the AMD Tonga (R9 M370X on the upscaled version)?

The 45W i7 is a heavy burden to carry with Crystalwell. Might as well get better graphics if you're going for the 15" Pro.


There is no way in hell I would prefer that AMD chip in my system over an all-Intel system. AMD is just terrible to use on Linux, and most people around here want the ability to run that natively without issue. Not to mention the added complication of tacking that AMD chip onto the laptop, both from an engineering/reliability stance and a software one.

You missed the irony of calling 45 watts a heavy burden. An R9 370X adds about 50 watts to your TDP by itself, along with its needless complexity. If someone wanted to reduce TDP and that complexity, they could step down to the base Intel IGP. But if stepping up, Intel's solution makes a lot more sense.


Your loss man. The benchmarks don't lie.

Good luck with your overpriced Crystalwell failure. If you got actual benchmark scores to talk about, please respond to me here: https://news.ycombinator.com/item?id=11536519

But I actually know the benchmarks of everything you're talking about like the back of my hand. Your argument has no technical legs to stand on whatsoever. Don't feel bad if I'm just calling out your Bull$.


Wait, what are you talking about? That was in no way a response to what I said to you here. You don't need to change the topic just because you're wrong and you know it.

No one wants that AMD chip in their Macbook. It adds complexity both in engineering and software. There's PLENTY to talk about technically there and why avoiding it is a good idea. Not to mention Intel's best-in-class Linux support.


It's actually kind of annoying to have graphics card switching - it caused a number of problems in my old 15" MBP, to the point that I opted for integrated this time.


>Besides, NVMe SSDs are expensive

Yes. If you're bargain hunting for gaming hardware, you should just buy a console. Or, if you're seriously suggesting putting an Intel 750 into some old system like Sandy Bridge... no comment. I would never recommend anyone bother doing that.

Step up to an NVMe setup and Skylake and do it right. Skylake i5 setups can be had for cheap. You're just arguing to argue on that point, whether or not you have anything useful to add. The SB argument is common knowledge, an age-old argument at that, with no new information or insight.

>Name me an M.2 NVMe SSD that is more cost effective than a Mushkin Reactor (1TB, MLC, maxes out the SATA6gbps bus, $200), or has faster I/O speeds than a 750.

I'm not into cost-effective bargain hunting. Anyone who would gimp a nice Intel 750 SSD on a non-NVMe system is a fool, and you've suggested it.

>The cheapest of GPUs are significantly better than IGP.

No, they aren't. The point about DX12_1 IGPs is that they're there, they're modern, and they've already sucked the life out of the low-end space, moving into the midrange with Iris Pro. Your stance is the 2010-era view on computers. Same era as Sandy Bridge, TBH.

>I also care about adaptive sync / GSync technology, which isn't supported by Intel Iris.

This demonstrates how much you know, and why people shouldn't listen to what you're saying, which can be heard on any PC gaming forum a thousand times over. This is HN, though, and it won't fly.

Intel has already committed to FreeSync. It's incoming with Kaby Lake; rumor is that it may be enabled for Skylake.

>Intel's IGP doesn't even come close to a $100 GPU, let alone the midrange GPUs.

Wrong on its face. You just haven't cared to investigate recently.

>NOT on the desktop. Crystalwell is laptop-only, and 45W to boot.

Nope. The Crystalwell chips are going into NUCs from here on out. There's a 128MB L4 NUC coming in 2 1/2 weeks and a 256MB NUC coming in 12 months.

The fact that you're talking about gaming and recommending an E5-2670 for that is just silly. That might be a good machine for compiling code; even then, it's still a bad idea when distcc can utterly embarrass that old power-hungry chip.

For gaming, Broadwell already demonstrated what Crystalwell adds even alongside a standalone GPU. And it's a game-changer: it's faster than the i7-6700K. Yes, it is. And it definitely mops up where it counts (99th-percentile frame times) versus Sandy Bridge too.

In 2 1/2 weeks you'll see Skylake with Crystalwell and Thunderbolt 3 absolutely delete any SandyBridge gaming rig you may have (even with an Intel 750, if you made the ridiculous decision to actually put one in SB). There might be more cost-effective ways to build a gaming rig, but if you're into saving money on hardware and gaming, buy a PS4.

I understand it's the prevailing thought among PC gaming kiddies, but holding your grip tighter on some old Sandy Bridge system won't change the fact that, in reality, it's fallen pretty far behind in both overall platform performance and power efficiency.


I can't find the Iris Pro 580 on benchmark sites, because no gamer gives a care about it for gaming.

The Iris Pro 5200 GT3e achieves Passmark 1,174.

http://www.videocardbenchmark.net/video_lookup.php?gpu=Intel...

If the Iris Pro 580 GT4e is twice as good (Intel only claims 50% better), that's still not very good. That's utterly awful, actually.

A $100 GPU is the R7 360, just off the top of my head. http://www.newegg.com/Product/Product.aspx?Item=N82E16814125...

Exactly $99 on Newegg right now. It achieves Passmark 3,150.

No one gives a care about the $600 Crystalwell chip that performs worse than a $100 dGPU. It's utterly awful. You'd be insane to actually recommend this product to anybody. You claim that you care about performance; do you even look at the benchmark numbers, bro? Your claims are so far away from reality I really just don't know how I'm supposed to respond to you.

Yes, a $600 Chip. I'm assuming this, unless you can figure out a cheaper Skylake Iris Pro: http://ark.intel.com/products/93336/Intel-Core-i7-6970HQ-Pro...

----------

EDIT: I see that you're an anti-AMD guy. Okay, whatever. That's why there's another company out there.

http://www.amazon.com/ZOTAC-GeForce-DisplayPort-Graphics-ZT-...

Nvidia GTX 750 Ti, $105 right now on Amazon. Passmark 3,686. Still utterly crushing your $600 iGPU with cheap-as-hell GPUs, no matter the brand.

http://www.videocardbenchmark.net/gpu.php?gpu=GeForce+GTX+75...

Dude, I'm running what was at the time a high-end R9 290X, although it's more of a mid-range card now due to its age (Fury / 980 Ti). It has a Passmark of 7,153, and you're seriously suggesting I "upgrade" to a Crystalwell Iris Pro that only achieves ~2,000 Passmark?

------------

PS: Skylake performing 20% faster than Sandy Bridge after five years of updates is awful.

-----------

> I'm not into cost-effective bargain hunting.

Then why the hell are you bringing up M.2? Intel 750 is the best of the best and plugs directly into PCIe. Sandy Bridge handles it just fine.

-----------

> In 2 1/2 weeks you'll see Skylake with Crystalwell and Thunderbolt 3 absolutely delete any SandyBridge gaming rig you may have

Crystalwell has to beat a $100 dGPU first, and the benchmarks say otherwise. My bet? Crystalwell fails to beat an R7 360 / Nvidia 750 Ti (Nvidia's LAST-generation BUDGET card) and I get to laugh at its worse-than-console performance numbers despite the $600+ launch price for the chip.

But hey man, show me those benchmark numbers if you disagree with my assessment.


"Bro", "dude", "man". So I know I'm talking to some little kid at least.

But I have to say: anti-AMD! Yes, very perceptive, as I type this on a machine with a Radeon in it. Consider the fact that other people can criticize products they have extensive knowledge of.

Judging from the rest of your response, my post went completely over your head. And you're quite the troll, trying to change the points I made in an attempt to "win" the argument. But it is amusing hearing some kid call Intel's 20% IPC boost from SB to SL awful; it shows how much you don't know. I have friends who work at Intel. Go back to PC Gamer, as you have no idea what you're talking about. Some dumb kid sees ONLY 20%, ignores the massive power reductions, and doesn't realize that computers can do more than just play video games.

You also failed reading comprehension and the ability to hold a conversation. Congrats. But either way, there's no way around the fact that in 2 1/2 weeks I'll be pairing an R9 Fury with an i7-6770HQ, a PCIe NVMe SSD, and some DDR4, and absolutely crushing any Sandy Bridge system you own.

Enjoy your old E5-2670 and Sandy Bridge with an Intel 750. What a total fruitcake. A better use of your time would be to go read about logical fallacies, since you just spent an hour typing about a strawman you created to beat on, with points I never made.

Here's what you want to hear, because you just want to argue: you're right, I'm wrong. Hope you feel better now. I'm not giving you any more help. I get it, you like your poverty gaming rig. See ya, kid.


>> i7 2600k

Sure, that's the top-of-the-line $300 chip from the days when a whole PC could be bought for $300. What if you're on a five-year-old Pentium G620?


If you bought a $300 PC 5 years ago:

1) you're not the sort of person who buys a new rig every two years, and 2) a $300 PC today will give you exactly the same performance as the one you bought 5 years ago: the minimal gains you get in iron are naturally offset by minor losses in software (which is now built by people with SSDs, so good luck with your little spinning disks...)

The market is now artificially segmented to such a fine level, and moving so slowly at the top, that performance simply does not "trickle down" like it used to. Add to that the move to "power efficient" CPUs (aka: less powerful overall) and you will basically see zero gains if you stick to the bottom of the market.


Not quite "exactly" the same performance. A 20% improvement with today's stuff.

But yeah, it's peanuts. A 20% improvement over five years is pathetic. I'm just calling out your hyperbole, in case others didn't see it. Apple had something like a 50% improvement in a single generation of iPhones, so a 20% difference over five years is easy to ignore.
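For scale, here's a quick back-of-the-envelope sketch in Python of what those gains mean annualized (the 20% and 50% figures are the ones quoted in this thread, not independently measured):

```python
# Annualized improvement implied by a cumulative gain over n years,
# i.e. the yearly rate that compounds to the total gain.
def annualized(total_gain, years):
    return (1 + total_gain) ** (1 / years) - 1

# Figures quoted above: 20% over five years vs. ~50% in one iPhone generation.
print(f"20% over 5 years  = {annualized(0.20, 5):.1%}/year")   # roughly 3.7%/year
print(f"50% in 1 generation = {annualized(0.50, 1):.1%}/year")
```

So the "20% over five years" complaint works out to under 4% per year compounded, which is why it feels invisible to anyone upgrading.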

SSDs and GPUs improved dramatically over the past five years. Well... more specifically... SSDs got dramatically cheaper while retaining roughly the same quality. So it's worth it to upgrade to an SSD or to get a new graphics card. But Intel doesn't have any discrete GPU offering, and their SSDs are "enterprise" (aka: overpriced). Mushkin and Crucial are better brands for consumers... even Samsung (although a bit more expensive).


The cores are basically the same within the generations.

A five-year-old Pentium G620 is only ~25% slower than the Skylake Pentium G4520. Both are sub-$100 dual-core CPUs aimed at the budget audience.

Frankly, the fact that AMD's Vishera FX-6300 still easily beats the Pentium G4520 in multithreaded benchmarks demonstrates the absolute lack of desktop CPU improvement. I'd only recommend the G4520 to someone who is really sure they care about single-threaded performance (i.e. gamers). Most people will appreciate the lower total cost of ownership that the FX-6300 offers at that price point.

https://www.cpubenchmark.net/high_end_cpus.html

* AMD FX-6300 Passmark: 6,342

* Modern Skylake Pentium G4520 Passmark: 4,261

The G4520 is an $80 chip, released October 2015. The FX-6300 was AMD's 2012 entry: a FOUR-year-old chip, now selling for $80 to $90 at Microcenter.

Microcenter has some $0 motherboards if you buy an FX-6300 from them. That's the kind of benefit you get from buying "old". And since CPUs aren't really much faster, why the hell should you buy cutting edge?
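To put the value argument in concrete terms, here's a tiny sketch comparing Passmark points per dollar, using the scores and street prices quoted above (prices vary by retailer, so treat the $85 FX-6300 figure as a rough midpoint):

```python
# Passmark scores and rough street prices as quoted in this thread.
chips = {
    "AMD FX-6300 (2012)":         {"passmark": 6342, "price": 85},
    "Intel Pentium G4520 (2015)": {"passmark": 4261, "price": 80},
}

for name, c in chips.items():
    ratio = c["passmark"] / c["price"]
    print(f"{name}: {ratio:.0f} Passmark points per dollar")
```

By that crude metric, the four-year-old AMD part delivers roughly 40% more benchmark points per dollar than the new Skylake Pentium, before you even count the free motherboard.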

--------

Hell, why are you spending $80 on a new G4520? Facebook just decommissioned their servers. You can get a dual-socket-ready Sandy Bridge 8-core, 16-thread E5-2670 on eBay for $80, or on Amazon for $70:

http://www.amazon.com/gp/offer-listing/B007H29FRS/ref=dp_olp...

Go get yourself a dual-socket, 16-core/32-thread E5-2670 Sandy Bridge workstation, at just $80 per CPU.

Intel can't even compete against their own ghost from 5 years ago. Is it a wonder that sales are low?


I think we're in a post-PC era. Yes, I spend my work days on a laptop. But most of my personal computing happens on an iPhone or an iPad. I think that PCs (and laptops) will increasingly become "things we do work on" and smartphones and tablets will become "things we consume stuff on". A lot of PC sales, I think, were coming from people buying them for personal use to consume stuff. That market is changing.


On the consumer side it's possibly back to a shared-PC era: instead of multiple PCs per household, just one roughly five-year-old PC to do work on, with each person having their own annually updated phone/tablet for everything else.


They're so much slower to interface with though. I couldn't live without keyboard shortcuts.


On top of that, productivity app companies are chasing mobile in order not to get left out in the cold, further exacerbating the PC decline.


This isn't at all out of the blue. As the article states, Intel's revenues have been declining for over half a decade now, starting a few years after the first iPhone hit the market.


Are you sure about the revenues going down? According to http://www.statista.com/statistics/263559/intels-net-revenue... revenues seem to have gone up.


You can blame it on the end of Moore's law, or you can blame it on mobile computing. (The two are not unrelated.) But it is fundamental.


To me, it's more that Moore's law is over and that observable speedups are much less obvious to non-gaming consumers.


To me, Moore's law will end when we all have a mediocre CPU at home connected to fiber... and anytime you, or someone else, needs to do work, we share our CPUs with each other without being aware of it. Then there is no more Moore's law, just one huge SPU (Sharable Processing Unit).


Out of the blue? There have been articles for a long time about the internal mess at Intel as they try to figure out mobile and IoT. One of the latest ones just last week:

http://www.businessinsider.com/intel-leaked-memo-murthy-rend...


I'm thinking it's not so much a post-PC era as a customizable SoC era, and Intel does not want that at all.


How much of your time that used to be on a desktop is now on a mobile? Their core business is drying up.


Am I the only person for whom the answer to this question is "almost none?"

I'm serious: about 80% of my day is meeting with real people in the real world. Mobile phones haven't changed that.

The other 20% of my day is sitting at my desk creating original work product (mathematical models and thoughtful memoranda) or reviewing the work product of others. Mobile phones haven't changed that, either.

No doubt the drought in PC sales is real and permanent. But I wonder how much of that is because people just don't need to keep their laptops up to date in the age of great cloud services.


Nah. I think there are more than a few of us around. I mostly avoid using my smart phone and I don't have a data plan. I had to buy it because I was travelling abroad and it was a light device I could use to communicate and take photos with. Most of my work is done on my laptop.


Yeah, mobile hasn't figured out a good way to take over the workspace. Some of the tablet/laptop hybrids are getting closer.

As for entertainment though, are you watching Youtube extensively on your desktop?


I know this sounds crazy, but there are still some people who have cable subscriptions and watch TV on a TV. Oh the horror!


Those TVs are now 'smart' along with cable boxes and other peripherals. Is Intel Inside any of those?


> Yeah, mobile hasn't figured out a good way to take over the workspace. Some of the tablet/laptop hybrids are getting closer.

There already is a relatively mobile tablet/desktop hybrid that works pretty great for both consumption and getting work done. It's called a laptop.

> As for entertainment though, are you watching Youtube extensively on your desktop?

Yes. I have a phone, a tablet, a desktop, and a laptop. The tablet is pretty much only used for Netflix and textbooks, and the phone is for travelling. The tablet is absolutely worthless for browsing, coding, writing, or gaming; and the phone is only saved by its form factor. If I had (the space for) a TV, then the tablet would be a completely unjustifiable purchase.


Laptops typically aren't considered mobile devices (if we're being pedantic). Try running mobile apps on your laptop.

And yes, mobile devices have taken away entertainment share from the desktop as well as televisions.


Sure, say "portable" if that makes you happier. Not that it matters, though; mobile programs are the ones trying to catch up to desktop programs, not the other way around.


No, you aren't the only one who barely uses mobile stuff. My desktop and laptop use is essentially required for my job (software engineering). I've seen one person switch to a tablet, but I'm not sure tablets have enough power to do the things I need. One day all of this will merge into a single unit, but right now I see separate needs/uses for both.


Great cloud services, or the fact that my work has in no way been accelerated by newer processors. The only noticeable workflow-related speed increase I've had in the last five years came from damn-near-zero-margin SSDs.


Almost none, the only time I use my mobile in lieu of a desktop is when my desktop is unavailable or my home Internet connection is down.

I don't count time I spend out and about on my mobile, since that isn't time I was going to spend on my desktop anyway. (And I don't think having the option of going outside with a mobile has changed how often I do so.)


All that mobile traffic increases server traffic. I don't think the switch to mobile is a big hit for Intel as long as they are the only major player in the high performance CPU market (let's see if AMD can turn it around with Zen, but I don't have my hopes up).


Also chiming in with a "none". All mobile devices have done for me is expand my use of computing devices into realms in which I never used one before.


Less than 5%. My phone stays in my pocket most of the time, even when I'm nowhere near any other device, and everyone else around me is heads down in theirs.


I pull out my phone to get my 2FA number every now and then :D


Yeah, I am not doing any development work on a mobile device. I have yet to go to an office where the cubes are filled with people working on iPads.


I think Intel's core business has transitioned from desktop PCs to servers for the server farms, which show no signs of drying up.

Plus, Intel provides almost all laptop chips, and I'm guessing most businesses still have either a desktop or a laptop per person, which will still get upgraded every so often. I doubt workers are going to be using iPads for data entry, although I suppose you never know.


For me, maybe 30%, mostly because I have a computer in front of me for coding most of the day anyway. For my brother's family it's nearer to 90%. He still has a PC for work, although it is on about a 5-7 year replacement cycle, and they no longer need the more-expensive second PC.


> How much of your time is now on a mobile that used to be on a desktop?

None, since I don't want to have (and don't have or own) a portable surveillance and tracking device (also called mobile phone) in my pocket.


It's probably possible for PC sales to decline while PCs remain dominant in computing. If so, Intel probably doesn't need to restructure; they just need to get smaller.


No. Computer sales are at historic lows. We've reached the confluence of computers being "good enough" (there are no longer big performance gains to be had from buying a new computer) and more and more consumer computing moving to tablets/phones.


The particulars of Intel's situation aside, some (though notably not all) market watchers believe we may be well into the prelude to a recession.


We're pretty much post-PC froth/churn.
