The whole market is overvalued, not just the tech unicorns.
Also, this has nothing to do with discounting the cash flows; it's mostly stock buybacks that are driving all the action:
No, they are off because they are making way less money (down 55% from a year ago!). In general volatility can be good for investment banks because it means higher trading volume. Revenue has been way down thanks to fewer deals (remember when tech companies had IPOs?) and reduced fixed income trading volume (GS revenue from fixed income trading is down 48% from a year ago).
Then you could calculate revenue per trade (or something more useful along those lines) to use as a signal in these cases.
The numbers are pretty brutal, especially in institutional client services.
GS has $11BB in oil sector exposure. MS is $4.8BB. BofA and Citi are around $20BB. JPM is around $15BB. These are fractions of total loans outstanding (MS is 5%, all the rest are much smaller). Each of these banks has more in reserves than oil loan exposure. Does that sound scary to you?
Again, relatively sophisticated investors understand these things. This is basic research. Where is your evidence?
The limits on the banks' exposures depend on counterparty risk. GS's gross exposure is far more than that, but they insured most of it with other banks, and please allow me the simplifying assumption that they insured all of it with BAC. Which means that (hypothetical scenario) if BAC goes bankrupt, GS's exposure to oil suddenly goes up to, say, $100BB. Suddenly the reserves are woefully insufficient. Then there's the sudden risk that GS goes belly-up, which would increase everyone else's exposure. One of them goes and ...
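A toy sketch of that mechanism, in Python. Everything here except the $11BB net figure quoted above is a made-up number, purely to show how netting hides gross exposure:

    # Toy numbers: only the $11BB net figure comes from the thread;
    # gross exposure and reserves are hypothetical.
    gross_exposure = 100.0   # $BB oil exposure before hedging
    hedged_via_bac = 89.0    # $BB hypothetically insured with BAC
    reserves = 15.0          # $BB hypothetical loss reserves

    net = gross_exposure - hedged_via_bac  # the headline $11BB number
    print(f"hedges intact: ${net:.0f}BB at risk, "
          f"reserves cover it: {net <= reserves}")
    print(f"BAC fails:     ${gross_exposure:.0f}BB at risk, "
          f"reserves cover it: {gross_exposure <= reserves}")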
Additionally, oil fuels the economy. Oil is what builds the roads, what keeps everything on the roads moving, what keeps planes in the air and boats going forward. Oil provides significant parts of our electricity supply, and so on and so forth. So you can look at the oil sector problem in 2 ways. One is the supply side, which is producing somewhere around 2% more oil than the market is willing to buy (at any price). This view is short-sighted: "If we'd all just put 2% more gas in our tanks, there wouldn't be a problem", which is not realistic.
The other way to look at it is the demand side. The market is simply not buying 2% of the oil, except to store it. Why not? One explanation would be that there is a global recession and the oil price move is simply the result of that. The fact that the price crash happened with oil production/supply constant (even slightly declining) would seem to support this. For instance, the Baltic Dry Index plunged before oil started having problems, same with container shipping; that explains quite a bit of the excess capacity in oil, and therefore, I say, caused the price drop. Oil crashed because manufacturing (the source of the demand for shipping) crashed a few months before the oil crash (and hasn't recovered).
In other words: you've identified the wrong problem. Oil is a symptom of the underlying situation, not a cause. You say banks are capable of withstanding one aspect of a greater problem? Well, I'm not saying that's bad, but it's not reassuring at all (and may not be true due to financial engineering).
I do agree with you overall about indexing; however, "Just wait until the recession is over" is market timing, and I don't believe I (or others) can do this dependably well.
One can argue the way to judge stocks' value is not by P/E but by E/P relative to interest rates; that is, whether their excess return relative to risk-free assets is justified by their risk.
Interest rates remain 5% below their long-term historical average, which can justify E/P going up significantly. Currently stocks offer a 3.5% return over some classes of t-bonds, well in line with historical norms.
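To make the E/P framing concrete, here's a minimal Python sketch. The P/E and bond yield are placeholder numbers I picked for illustration, not current market data:

    # Placeholder figures, not live market data.
    pe_ratio = 20.0                  # hypothetical market P/E
    earnings_yield = 1 / pe_ratio    # E/P = 5.0%
    risk_free = 0.015                # hypothetical long-bond yield

    premium = earnings_yield - risk_free
    print(f"E/P {earnings_yield:.1%} - risk-free {risk_free:.1%} "
          f"= {premium:.1%} excess return")
    # The same P/E looks far worse if rates revert to their long-term mean:
    print(f"at 5% higher rates: {earnings_yield - (risk_free + 0.05):.1%}")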
The current danger is that both will move down at the same time, and then where should one invest?
Check how high it was in 2000 and 2008; we aren't even close.
My fear is that the central banks are pumping so much liquidity into the market that they are driving up equities and pushing people out of safer assets into things like high yield bonds and momentum stocks. If we do have a recession, the pain could be worse than usual (for stocks) for the mere fact that the debt market could have liquidity problems when tons of funds begin to pull their money out at once from HY.
Not to mention all of the corporate buybacks that companies are funding with leverage, because it's too expensive to bring cash back from overseas.
You can still invest in solid companies, but companies like Tesla, Netflix, anything with a super high P/E is going to be taken out back and shot (that doesn't mean the companies will go out of business, only that their stocks are much like Amazon in the 2000s.)
The 2008 bubble was eminently predictable. The problem was that no one knew the exact trigger, and no one had a politically acceptable means to deflate the bubble until the domino effect started, and then hedge funds and then major banks started collapsing.
Until the knife started falling, no one had a financial incentive to stop. After all, subprime mortgages had crazy interest rates, and if you're BoA or JP Morgan Chase, the government will probably step in to stop your collapse...
Not that I disagree with you, but I believe a contrarian viewpoint would be something along the lines of "We are currently in the midst of an economic revolution as increasingly large swathes of activity are digitized and lingering mechanical/human processes are computerized. Companies likely to be successful in this new economy are unlikely to be the same ones which were successful in the old."
(To which the obvious rebuttal is probably "People are always saying things are about to be different, and they're usually wrong.")
Not even close to a record high. It's slightly high, but is it surprising that people are willing to put a premium on earnings with negative interest rates?
I'm not saying the market is overvalued or not, but this is in no way indicative of that.
On Intel, I think the "everything is going mobile" line is PR for investors; the PC market doesn't have any foreseeable growth potential atm.
As a general FYI to those reading -- please fact check before commenting.
The Federal Reserve's quantitative easing program ended on Oct 29th, 2014 (538 days ago!) It is difficult to assert that a program which ended over 500 days ago is permanent.
Additionally, one may be willing to assert that "ZIRP is a permanent regime now" too, to tack onto the notion that central banking has entered a new era.
For anyone who did not see that news, ZIRP (zero interest rate policy) ended on Dec 15, 2015 with a rise, and is currently believed to rise again at the next meeting.
Please explain the problem here. You're acting like there's some big deception imposed on the public due to companies choosing to return capital to shareholders via one specific mechanism. As if somehow they're buying back stock and "fooling" people into thinking they're making more money or something, and stock prices are irrationally rising. The buybacks, earnings, and financials are all open for everyone to read. It's evident from some of your commentary that you can't be bothered.
If you don't agree with the price that other people are willing to sell for shares in this market, you are more than welcome to take the other side of that trade and sell into every buyer on the planet. I'm sure you're not doing that.
Nor is it remotely the only way to assume a bearish position.
>Stock buybacks can be deceptive if a company buys back stocks with debt and doesn't have proper cashflow to pay it off.
You can't tell the cash flow and debt levels from the financial reporting? You don't account for this in your valuation? Where is the deception? Report it to the SEC.
>Non-technical investors see the stock going up and keep piling more cash in.
An equivalent number of people are "pulling cash out". A stock trade is just that: a trade. Again, what makes you a better judge of the "correct" price than those actually making the deal?
>If you think markets are perfect, you have a lot to learn my friend
Not sure where that comes from, but you're right, I do have more to learn. That said, it looks like I'm a few levels up on you (and far less confident in my ability to predict anything).
Whether or not they do a token 0.5% hike (they probably won't), it's obvious that something very basic has changed, and quite likely irreversibly.
They also don't support your hypothesis that something has changed irreversibly.
Raising the interest rate more than 0.5% would have likely had a negative impact. It's very likely the Fed will raise it again later this year.
Here's a link to read about it. https://mises.org/library/unseen-consequences-zero-interest-...
In the modern era of frac wells, where depletion rates are something like 50% in 2 years, production now declines much faster than broad economic demand does.
That's the problem with secondary recovery... The "balance sheet" looks good in that you'll profit (unless prices tank), but the cashflow is terrifying: you don't dig a well and collect for 30 years like the old days, you do exotic processes and get high production rates for like two years, then production drops to zilch.
On a large, century-size scale it screws up Hubbert graphs. You can ramp up for a century, but extensive secondary production means the decline slope will be just a couple of years instead of a nice symmetric century or whatever.
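Rough decline-curve arithmetic, assuming simple exponential decline. The 50%-in-2-years half-life is the frac figure from above; the 30-year conventional half-life is my own illustrative assumption:

    # Exponential decline: output halves every `half_life` years.
    def remaining(year, half_life):
        return 0.5 ** (year / half_life)

    for year in (0, 2, 5, 10):
        print(f"year {year:2d}: frac well {remaining(year, 2):6.1%}, "
              f"conventional well {remaining(year, 30):6.1%}")

By year 10 the frac well is at ~3% of initial output while the conventional well is still near 80%, which is the asymmetric cliff described above.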
By analogy, it's like switching farm land from an olive orchard to corn production and being surprised at fundamental shifts in the financial sheets. (It's just another plant, right? Well, there's a bit more to it...)
It is absolutely picking a narrative.
They sold off their ARM processors to Marvell. They've made various forays into wireless, though WiMAX didn't work out so well for them.
Really? About the only reason why I'm getting a new phone every so often is when the old one stops being updated. (My Nexus 4 lasted 3 years, and it would have done another year.)
Planned obsolescence, if you want.
Also, I play some games on my phone, and having better hardware helps a lot with that.
I could see how an infrared sensor like the one in that new Caterpillar phone could come in useful, especially if you are in a building trade, but a fingerprint scanner?
You have to get a phone with one and use it for a week or so until its use becomes routine to really understand how big of a game changer it is. I subconsciously unlock my phone now every time I pick it up. It's amazing. Every day it saves me probably 60 seconds in total typing in stupid PINs, and those savings add up real quickly.
Incidentally, I had an HTC Desire HD before, and that became unusable on newer versions, as is the Nexus 7 now. These two clearly didn't have good enough hardware specs.
For young college graduates, the unemployment rate is currently 7.2 percent (compared with 5.5 percent in 2007), and the underemployment rate is 14.9 percent (compared with 9.6 percent in 2007).
Things have really levelled out for average loads, even developer loads (I mostly do web dev, run Vagrant machines, that kind of thing).
I can't see me upgrading til this thing dies tbh.
But I got tired of 190°F air slowly being pumped out of my case and into the room.
Its replacement is finally on the way. I preordered Intel's Skull Canyon NUC. Got 32GB of DDR4-2800 memory and a 512GB Samsung 950 Pro PCIe/NVMe M.2 SSD. I'll be daisy-chaining a single DisplayPort cable to 3 new LCDs as well.
Pretty huge leap in performance. It just made sense to stop building new computers and jump on the NUC bandwagon. All I do is development, League of Legends and the rare CS:GO. The ~GeForce 750 performance level that the NUC will provide will be enough. The inclusion of the Thunderbolt 3 port for an external GPU case really put my mind at ease. Not that I intend to utilize it, but I'm glad it's there. Same upgradability as any other machine: SSD/RAM/GPU. The CPU is soldered, but I never once replaced a CPU after building a computer anyway. Other than the few Athlons I killed from overclocking in ~2001.
I'll probably upgrade more often if these new gaming NUCs are as good as I think they'll be. Next upgrade for me will be a 10nm + Thunderbolt 4 NUC. And the final perk: all-Intel, so it'll work great with any Linux distro natively. That's worth a lot to me.
Unless Intel failed hard with this thing, which I highly doubt.. it's Intel.. I'm all-in on NUCs from here on out.
Good pick: it's still one of the faster chips around. However, its TDP is 95W, which is fine for a desktop. Since then, Intel has been concentrating on delivering the same (or, often, much less) speed with lower TDP ratings.
These address the market need for thin, high-priced laptops -- preferably without fans -- that you can use in Starbucks.
You could "upgrade" to a new Intel Core i7-6600U that's actually slower than your i5-2500K but has a TDP of only 15W.
I'm really looking forward to VR if it catches on, though. Having an insanely high-resolution headset so I can dump multiple monitors for programming is a big win; combine that with something that has the portability/form factor of an MS Book/MacBook Pro and you'd be able to program as capably from a hotel room as at your desk at home/work.
That would be the biggest shift in my work habits since I went from Windows to Linux in the late 90's.
Also, I think once everyone can get down to the same feature size as Intel, we might start seeing more exotic architectures. Intel has often won with the "with enough thrust a brick will fly" approach to engineering: it doesn't matter if your chip is clock-for-clock more efficient if Intel is operating at a level where they can put 5 times as many transistors down in the same unit area and ramp the clock speed way up.
Mostly it's about shuttling around 4 times as much memory for each screen as well as 4 times as much processing.
1920x1080 has ~ 2 million pixels.
3840x2160 has ~ 8 million pixels.
Internally, IIRC, this is often done as vectors before being rasterised out and having various filters and shaders applied, but that step requires that you store multiple buffers etc. Same reason a card that will play a game just comfortably at 1024x768 will run away crying at 1920x1080, I guess.
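The raw arithmetic, with a simple assumption of 4 bytes per pixel (RGBA) for a single framebuffer; real pipelines keep several such buffers plus depth and intermediate targets, which multiplies everything further:

    def framebuffer(width, height, bytes_per_pixel=4):
        pixels = width * height
        return pixels, pixels * bytes_per_pixel / 2**20  # size in MiB

    for w, h in [(1024, 768), (1920, 1080), (3840, 2160)]:
        px, mib = framebuffer(w, h)
        print(f"{w}x{h}: {px / 1e6:4.1f}M pixels, {mib:4.0f} MiB per buffer")

That's the roughly 2M-to-8M pixel jump from 1080p to 4K, and the same 4x factor in memory per buffer.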
My i7 desktop will be four years old in June. I've done the research and all I need to run the Oculus or HTC Vive (or the upcoming games that look good) is a graphics card upgrade.
I would like to compare CPUs with roads: roads increase traffic, not the other way around. So if you make a faster CPU, the demand for a faster CPU increases.
So if Intel stops making faster CPUs, the demand will (unexpectedly) go down.
The automotive industry faced the same issue a long time ago, and their solution was to turn cars into a fashion statement. You no longer upgrade your car because your old one is worn out, you upgrade to signal your wealth through conspicuous consumption. The mobile phone market is somewhat the same way. The desktop PC market? Not so much.
There is some legitimate utility derived from novelty -- I don't want to be walking around in my grandfather's battered rags either -- but I wonder if some day we can shift towards something more sustainable, with less paying to dig consumerist holes and then paying again to fill them back in.
The point about your grandad's battered rags is that they're battered rags, though, not that they're old. His good leather jacket is probably still perfectly serviceable today.
I'd assume these acquisitions added to revenues...
It's a sign of market maturity. Intel shouldn't be trading at a premium if it's selling a highly predictable, stable product to a consolidating marketplace.
But most of them are full of ARM chips fab'd by companies not named Intel.
Hmm... only if I kid myself that mobile phones (even the smart ones) are actually computing devices on which I can compute whatever I wish to compute, the way I can compute on a PC. Most of the ARM devices sold are NOT computing devices for the general person who purchases them.
It reminds me of the fallacy of calling smartphones "supercomputers in one's pocket".
Maybe ARM-based devices are "computing devices" for companies like Samsung, Google, or some car makers, but certainly not for the general public.
I crunched to inbox zero with my phone while commuting to work, and now I'm typing this at my "real work computer".
Even if we disregard how arbitrary that definition of "computing device" is, it's still a false proposition. Anyone can easily get a browser, a word processor and spreadsheets on their mobile device or tablet (ARM). That's about as much computing as a significant chunk of the population requires.
I have a Debian chroot on my Android device which gives me Vim, GCC and javac out of the box.
How much complexity do you think the average mind generates with its time? Enough to justify the ATP?
How much thinking do you think the average human does with his brains? Are brains worth it? Stay tuned.
I'm a power user. I have a 4 year old MacBook and a 2 core * 4GB VDI session. I won't replace the MacBook for another 18 months.
Even in enterprise IT... Every project we did in 2004/2005 required a server purchase with two Xeon sockets and a NIC that might be Intel.
Now the average marginal unit of virtual server needs 1/50 of an Intel Xeon. And a lot of those Xeons are at AWS, which squeezes the margins.
Both Intel's sales and profit are even YoY in 2015, and up from 2013. This doesn't have much to do with computer sales.
Valuing growth over sustainability creates these backwards-ass goals of hiring more, building more and growing. The way this is spun is pretty horrible.
Amazon and Wal-mart losing is a good thing for the environment and society as a whole.
You yell, "But people lose their jobs..." There are other jobs, plus why does everyone need a job anyway? Can't people in Silicon Valley make $150 ~ $250k a year, everyone else get a minimum income, and we all work together to make better art, smaller factories and a world that will last much longer by recycling and rebuilding and not needing to buy an endless supply of shit?!
"Ending is better than mending. The more stitches, the less riches." -Brave New World, Huxley.
This is kind of silly. Job losses are a bad thing precisely because we don't yet live in a world where you can get by on a guaranteed minimum income without a job.
If you think it's worthwhile to push society in that direction, then great, but don't kid yourself that believing in it is the same as having already accomplished it.
What this system boils down to is a compulsory savings scheme. So an alternative would be a voluntary savings scheme, which would have higher efficiency by not consuming resources for administration.
What the Swiss specifically have: we all pay social deductions, within 3 pillars. If you end up unemployed, you are entitled to 1-2 years of unemployment benefits at 70% of your former income, with some reasonable cap. That is, if you worked 100%, on a permanent contract, for at least a year in a row. Otherwise you're not entitled, or payments are much lower.
You have to actively search daily for a new job and prove it to your unemployment counselor, otherwise they will cut benefits. Those benefits are really structured to pay you while you are looking for a job, not to fund some extra-long holiday on everybody else's budget.
Motivating setup, and since there is a social net of that 70% of my income, I am not frightened by the prospect of losing my job and needing to store enough cash to live at least X months/years without any income. Handy when one has a big fresh loan on his shoulders.
In Norway you can get 1-2 years of benefits at about 60% of your previous annual income, with a cap near the median national income.
You'll see it if you lose your job and can't find a new one, as happens to millions of people the world over.
Add a medical or family emergency to that, and you'll totally get it.
Unemployment in the real world is not about some sci-fi automation fantasy. It might be someday, but not yet. And even when it historically leads to new jobs or other benefits (not always a given), the pain and trouble are still there for those unfortunate enough to be left on the wrong side of those new jobs.
Fine, as long as we move quickly to "you don't need a job to live", because otherwise these job losses are destroying people's lives.
A 50 year old Intel employee might find it quite difficult to get a new job any time soon. Not to mention that where they live they'll have some other thousands ex-colleagues recently fired looking for similar type of jobs.
We're talking about thousands of people -- few of them will be highly sought-after chip gurus. In fact, obviously it's the less sought-after ones that will be let go.
Add a recent mortgage, medical issue or kid at college, and there go their pension savings (or worse). Have the same layoff happen to their spouses around the same time (which is not that rare), and they're pretty much screwed.
Really, it's as if people have no real world experience with these things...
Most of the UK's mobile industry was staffed by ex-Cellnet/BT people who took redundancy with some massive tax-free payouts; senior guys could get five-figure payoffs and an extra 6 years on their pension.
The head of Vodafone in the UK was an example, and as someone said, at his level they threw in a gold-plated wheelbarrow for you to take your cash away in.
In most of the world, including large parts of western Europe, unless you're upper middle class and higher, it's either hand-to-mouth or smallish savings (nowhere near 30%).
I see via Google that the minimum wage in Germany is 1473 euros per month for a full-time worker.
According to , food, housing, necessary household items, and public transportation look like they could reasonably total less than 800 euros per person. Add another 100 for having fun and for the occasional wardrobe expense, and you still get to save nearly 40%.
Having children ruins all of this, of course, but that's pretty easy to avoid.
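Spelling out the arithmetic (the expense figures are the rough estimates above, not official statistics):

    net_income = 1473    # EUR/month, German minimum wage, full time
    essentials = 800     # food, housing, household items, transit (estimate)
    fun = 100            # having fun + occasional wardrobe (estimate)

    savings = net_income - essentials - fun
    print(f"{savings} EUR/month saved = {savings / net_income:.0%} of income")
    # -> 573 EUR/month, about 39%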
That's nowhere near being able to save 30-40% of your pay, as the parent said. And Germans are some of the biggest savers in Europe. For places like France, crisis stricken "PIIGS" (Italy, Spain, Portugal, Greece etc) and especially the UK it's even worse.
Where do you have stats showing all of these job cuts will happen in the same location? Where are the stats that show when a person loses their job, their spouse is also likely to lose their job?
As for the same location: typically companies concentrate their offices in a few locations -- not individual people equally distributed around the country.
Plus, who said that their spouse is "equally likely to lose their job"? It's just an example of a thing that can happen, and does happen, not something that anyone claimed was inevitable or "as likely"...
Everyone else will live in 100 square foot apartments in Soviet-style concrete bunkers, eating ramen (or possibly Soylent). But they won't need to work! They'll spend their entire day painting and composing music!
A lot of people here push this vision, and of course they always imagine themselves to be part of the elite. It really is insufferable.
Anyway, the world you are so worried about is already here. A small number of people do continue to do important work that can't be automated. Most other work is unnecessary busywork. Artists and musicians already live off of government largesse, either by taking "McJobs" that are artificially highly paid because of the government minimum wage, or by actually being on welfare.
Since people can't wrap their heads around the concept of Basic Income, we end up with hordes of people paid to do stupid shit that doesn't matter. The endgame is that we're all government employees in a Dilbertesque hell. Moving papers from one pile to another and back again. Digging holes and then filling them again, since letting people decide what to do during the workday would be too disruptive.
I once heard about the British Colonial Office. The time when it had the most employees was precisely the time when it was dissolved because Britain did not have colonies anymore. They were all just shuffling papers around all day.
I sadly don't have a source for that. Does anyone have a link, by any chance?
C. Northcote Parkinson, 1955. http://www.economist.com/node/14116121
"A glance at the figures shows that the staff totals represent automatic stages in an inevitable increase. And this increase, while related to that observed in other departments, has nothing to do with the size - or even the existence - of the Empire."
What I don't know is where Parkinson got his own figures from. Some of them are cited in the article, but I think not this one.
The elites (political ones under communism vs. the people still actually working in the BI scenario) have an amazing life with various luxuries. The rest will be allowed to exist, not die of starvation, and have a lousy place to sleep. Truly a bright future...
At least with BI, you don't have to toil away at a time-wasting job that's just busy-work: you can do something you enjoy, and maybe we'll all get lucky and you'll write the next Harry Potter series and become a multimillionaire, while paying a bunch of taxes to keep the system going. If not, at least you have the dignity of not digging holes and then filling them back in all day long to justify your paycheck.
If you don't like this, then let's hear your alternative. The only alternative to this that I can think of is to ban automation, or strictly regulate it so that any automation which does a job that humans can do is banned (allowing only automation which does things that humans cannot do). Do you really want to go that route?
For $400 you get a device that will let you watch movies, play games, check email, get phone calls, etc.
Granted you could also get a cheaper phone but this replaces your TV and computer! Smartphones are really high-value!
I think if you're looking to invest in one useful thing, a good phone would be a great purchase. Maybe the Moto G at $200?
Amortized over a year, you're talking about a lot of bang for the buck.
The S3 generation of hardware has stood up very well.
My only reason to upgrade would be for some good security hardware that can support full disk encryption with no slowdown.
How do you expect that person with meager income to pay for the network bandwidth and usage bill?
>>Smartphones are really high-value!
With this I agree wholeheartedly. At least for people like me, who enjoy reading books, smartphones (with sites like Project Gutenberg) provide such a utopia that I feel like I'm living in heaven. Of course, I know that smartphones don't solve all problems automatically, but from a knowledge-seeker's PoV, smartphones are of great value.
Also, I understand that not all people like to read, but that's a different thing to ponder upon.
If a lot of companies that are today supposedly getting in the way of innovation suddenly go bust, it might be good for long-term innovation, but the short term, say 10-15 years, can really be hell. Not to mention that in this 10-15 year period, while the economy is hemorrhaging, the entire economic ecosystem is very vulnerable, with significantly reduced capacity to deal with external attacks.
That is, while US companies may start failing, who's to say that a foreign company won't take their place? Then you have to start taking isolationist measures, which is a whole other mess.
The ideal scenario would be to see these companies slowly, over a period of decades, fade out and be replaced by others, which I honestly don't see happening as of yet in SV, but things like this generally do come out of left field.
So if we see these companies shrink rapidly, with no one else to take the mantle, I honestly think it would be a dangerous situation as it can be a catalyst for something bigger.
Yep. That's what Peter Thiel is saying as well:
"My investments are over valued, oh the lessons I have learned"
"All investments are overvalued, oh the lessons we will learn"
are two very different statements.
It's because bad news is good news in the twisted, upside-down world of speculating on the Fed's interest rate policy direction via the stock market. Expectations of rate hikes get pushed further into the future with every bad economic indicator released, and there are many (retail sales, auto, housing, industrial production, rail traffic, oil and gas, capex). I'll continue to wait for the day the market finally figures out that interest rate policy frameworks are broken and the Fed has no more ammunition, and the stock market jumps out the window rather than taking the stairs. At that point, I'm buying, lads.
Who is it overvalued by, and how is the "true" value determined?
Suppose everyone expects the demand for widgets to grow by 1% per year for the next 10 years. This growth might get priced into the market with some sort of net present value calculation. Capital will flow into building new widget factories to fund all of this new construction.
Now what if actually an asteroid hits a city with lots of widget demand. 50% of demand is wiped out instantly. Well in this case, everyone was overvaluing the market for widgets! All of those net-present-value calculations were way off. The market will correct.
Or maybe the asteroid takes out 50% of production. Suddenly much more investment in widget factories is required. The price shoots up to capitalize all of the construction, since demand has not changed.
In either case, before the asteroid the market was not correctly valued since no one had yet priced in the asteroid.
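A minimal net-present-value sketch of that story; every number here (cash flow, growth, discount rate, horizon) is an arbitrary illustration:

    def npv(cash_flows, r):
        """Discount a list of future cash flows back to today."""
        return sum(cf / (1 + r) ** t for t, cf in enumerate(cash_flows, 1))

    base, growth, r = 100.0, 0.01, 0.05   # all hypothetical
    expected = [base * (1 + growth) ** t for t in range(10)]
    post_asteroid = [cf * 0.5 for cf in expected]   # 50% of demand gone

    print(f"NPV as priced in:   {npv(expected, r):7.1f}")
    print(f"NPV after asteroid: {npv(post_asteroid, r):7.1f}")

Same formula, same discount rate; the "correct" value halves the moment the cash-flow assumptions change.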
Valuation can only include the information that's available.
You can't expect random acts of God to be priced in.
Overvaluation is when the psychology is universally bullish but the facts are not backing that up.
The true value is determined by what happens next, for short to medium values of "next."
Current fundamentals simply don't support a future of continuing growth.
A correction is certain. The only questions are when, how fast, and how much.
Four companies don't represent the market. Come on. I swear some users here will wish bubbles out of thin air to gripe about.
There is most likely a corporate bond bubble, as we know from representative market signals. With the S&P 500 though, four companies are not representative signals.
BTW, this is a great site for tracking S&P ratios: http://www.multpl.com/
Practically speaking, I personally don't believe the efficient market hypothesis due to the very obvious meddling by political actors, central banks, national treasuries, etc. Behavioral finance has also consistently demonstrated that humans do not react rationally in markets.
My point is a pedantic one referring to the OP using the word "overvalued". The market price is never over/under valued. It is merely the market price at that moment.
As you said, it is merely tautological.
In other words I think it's futile to try to attach moral attributes (for example saying that a market is "best" valued at the moment or not) to things like financial transactions, which don't have any intrinsic moral values associated with them. Otherwise we risk talking about long-dead German philosophers when trying to illuminate the problem.
Carrier is moving manufacturing operations to Mexico.
We're on the verge of something really big and really bad.
This is nothing but an asset bubble that is sure to burst someday, akin to what the Gov't, the credit agencies, and WS did with housing and mortgages back in the 2000s that led to the meltdown and TARP.
I have one company for sale at USD 329 billion; it grows 50%-100% a year for now and currently generates USD 3.29 billion.
I have another company for sale at USD 219 billion; it is shrinking at 1.4% a year and currently generates USD 14.69 billion.
And finally I have another company for sale at USD 200 billion; it is growing/shrinking between -8% and 60% a year and generates USD 7.35 billion.
Most people prefer the second one, which makes you a good 7% return a year. The first one looks good, but will take at least 6 years to be as good as the second, and the last one is so-so. The companies were 1) Facebook 2) Walmart and 3) Coca-Cola. (We're ignoring how much assets and liabilities they have for simplicity.)
And as you see, even though Facebook has a promising future, we don't know if it will reach as good a value/margin level as Walmart. So we may agree it's overvalued at its current price.
Hope this sheds some light. I'm not a trader or anything, so this info is very simplistic.
P.D.: this kind of analysis is called fundamental analysis; you can go deeper, and honestly it has worked very well in my portfolio. I bought stock in Gerdau (GGB), a Brazilian steel company, because I studied it as a whole business and discovered it was priced very low. That was starting in November; I bought those stocks at 1.22 and recently they reached 2.44. I basically doubled my money in 6 months. Remember stocks aren't just tickets, they are little parts of a big business.
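If you want to play with the catch-up arithmetic yourself, here's a sketch comparing earnings yields using the figures above. It holds prices fixed and ignores everything else (Walmart's slow shrink, assets, liabilities, multiple changes), so treat it as illustration only:

    import math

    fb_price, fb_earnings = 329.0, 3.29     # $BB: ~1.0% earnings yield
    wmt_price, wmt_earnings = 219.0, 14.69  # $BB: ~6.7% earnings yield

    # Earnings Facebook would need for the same earnings yield as Walmart:
    target = (wmt_earnings / wmt_price) * fb_price
    for growth in (0.25, 0.50):
        years = math.log(target / fb_earnings) / math.log(1 + growth)
        print(f"at {growth:.0%}/yr growth: ~{years:.1f} years to catch up")

At a sustained 50%/yr it's roughly 5 years; if growth slows toward 25%/yr it stretches past 8, which is where a "at least 6 years" ballpark comes from.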
But the sum of all prices does in fact vary. I'm sure you're familiar with Tulip Mania. Prices can be bid ever-higher in a cycle, and they can also decline in tandem as in the depression.
For the entire market to have an expanding total price, new money has to be entering the market, or the net real value of the underlying financial assets has to be declining. Either way, the entire market can become overvalued or undervalued, especially compared to non-financial (or illiquid) assets denominated in the same currency — e.g., wages, energy, or land.
Only if you subscribe to the "Greater Fool" theory. On the other hand, if you think the stock price should reflect some sort of intrinsic value, say the net present value of the stream of all future dividends, then it could be the case that you would never recoup your investment with any stock currently.
Quite a few people would disagree.
Get paid a lot more money, receive double-digit gains on your retirement account year after year -- and be completely fucked in the process, ironically, because prices went up even faster and it's more profitable to simply hoard things than to sell them.
Considering the money the government is still pumping into the market directly, I don't think more QE will have the same effect. It really depends on how it's executed, though.
To the extent that Intel and the tech unicorns are complements, Intel's profits declining may actually signal that tech unicorns are undervalued.
* Tick-tock is dead.
* 10 nm is severely delayed.
* EUV is severely delayed.
* Significant layoffs in R&D
* The ITRS roadmap is vaguer than it's ever been.
* Giant mergers are up (Intel+Altera, KLA+Lam, etc.), concentrating the industry more than ever.
* And ultimately: A 5-year-old PC still works just fine.
When I say this is the end of Moore's Law, I'm not trying to be dogmatic. Of course there will still be a semiconductor industry and of course there will still be amazing technological progress. But it seems the rate of that progress is slowing, and now the industry is adjusting.
I suspect that is really the dominant factor here. It's not that the progress within the chip industry has stopped; there is far more number-crunching power in the CPU and GPU in my latest PC than in the one from five years ago. And if you're doing things like playing demanding games or modelling skyscrapers in a CAD package, that progress is probably very useful. It's just that for what most people use PCs for, it doesn't matter, because they were already good enough so unless their old one broke they don't need a new one anyway.
Not only that, but when it comes to replacement, smaller and more convenient devices like smartphones and tablets will do everything a lot of people need without needing a PC at all these days. If you mostly used a PC for things like staying in touch with your friends or retrieving information from a web site, rather than anything creative beyond a quick bit of typing or anything that needs more powerful equipment, you might not even have a laptop any more.
Given that Intel has never had much penetration in the mobile device market compared to ARM designs, it doesn't seem that surprising that the demand for their products is waning as the mass market moves in that general direction.
I don't think we have to dig this grave just yet.
Makes me wonder what research money will be spent on instead of just faster chips.
In the early 20th century we got fixed wing flight.
In the teens and 20s we got motorized fighters and the first passenger planes.
In the 40s we got jets.
In the 50s we broke the sound barrier and orbited Sputnik.
In the 60s we landed on the Moon.
In the 70s we... stopped going to the Moon.
In the 80s nothing much happened except declassification of a few things (stealth) that were developed in the 60s and 70s.
In the 2000s we grounded the Concorde. Passenger flight got slower and more expensive.
In 2016 we fly on passenger planes no faster than what we used in the 70s and 80s, and we're stuck in low Earth orbit.
1969 was the peak of the aerospace industry. With the exception of SpaceX (which is really just picking up where NASA left off), we are less advanced today than we were in the 1960s.
There's many places we could go beyond conventional Moore's Law: multi-dimensional chips, optical, quantum, exotic materials with very low power consumption, etc. But if what we have is "good enough" and there is little demand for anything faster, the R&D dollars won't be spent. If anything the shift toward mobile computing and wimpy thin client endpoint devices might actually lead to a pull-back and loss of capability similar to the one we saw in aerospace after the 70s.
The consolidation we are seeing is not a good sign. This is what happens when an industry decides it's now a cash cow and it's time to go out to pasture.
We also could have a base on the Moon and Mars right now and be working on our first interstellar probe to Alpha Centauri. Physics didn't stop us. Economics and politics did.
The price drop in air-travel can be attributed to:
1. Technology. Cheaper (per seat), more efficient planes.
2. Consolidation of airlines (and airplane manufacturers).
3. The disruption of full-service airlines by low-cost carriers.
In semiconductors we have been getting along with 1 and a bit of 2. I would say that the disruption of Intel by ARM is an example of 3, because Intel is not incentivized to compete at those low price points.
In the US at least, this was due to political deregulation. The process was started by Nixon, but finished in law by Jimmy Carter (!), note also the leading lights of the Democratic party in the signing picture, e.g. Teddy Kennedy 2nd from right: https://en.wikipedia.org/wiki/Airline_Deregulation_Act
Even a 200% increase in CPU performance won't bring the general public back to desktop PCs they don't really need, because they are satisfied with what they have. We need something truly disruptive to get people interested in something else.
Right now there are a lot of experiments (AR, VR, wearables, transparent screens, lightfield, new batteries, etc) which will, probably when combined into a truly attractive package, change everything. Again.
Yes, this may be good for the future of Intel, or even eventually for the employees losing their jobs... but right now, 12,000 people are having their lives affected in drastic, unexpected ways.
Assuming people stay in a job 4 years, you can get a 25% reduction in headcount per year by just not hiring. Intel has over 100,000 employees. They are likely hiring 10k-25k people per year just to stay at a constant size.
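The attrition arithmetic, spelled out (the 100k headcount and 4-year tenure are the assumptions from above):

    headcount = 100_000
    attrition = 1 / 4        # 4-year average tenure -> ~25%/yr turnover

    h = headcount
    for year in (1, 2, 3):
        h = int(h * (1 - attrition))
        print(f"after year {year} of a hiring freeze: {h:,} employees")
    # Conversely, staying flat at 100k means hiring ~25k people a year.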
He's not "being let go".
From the article: Stacy Smith, who has been chief financial officer since 2007, will move to a new role as head of manufacturing and sales
I'm not sure what to make of the move, but if they really wanted to let him go they wouldn't have given him this different role.
Edit: just to elaborate, Intel is first and foremost a manufacturing company. Their manufacturing currently has a 61 percent gross margin. You don't put someone in charge of that if you want to let him go!?
But I'm not an Intel employee or close observer of the company. Maybe some Intel insiders can chime in with that they think this means.
A good 1/3 of their assets are in property, plant and equipment. The fair market value of that stuff can tank if they don't keep up the pace of growth -- what if they can't sell as much of their products anymore because of a shift to other technologies?
This is for sure a move because their revenue forecasts are grim in many areas they operate in.
The only time I've felt my computer to be "slow" was when I tried to use Photoshop and After Effects simultaneously.
Here's the problem. Running a five-year-old PC used to be an issue.
Today, a five-year-old PC is an Intel Sandy Bridge i7-2600K (Passmark score: 8,518), while a modern i7-6700K has a Passmark score of 10,987.
FIVE YEARS and FOUR generations of processors have created a net gain of roughly 29% in multithreaded situations. Far less for single-threaded applications (maybe 15%). And absolutely negligible for gamers (who are essentially 100% GPU-bottlenecked).
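For the record, here's that gain annualized, computed from the Passmark scores quoted above:

    old_score, new_score, years = 8_518, 10_987, 5
    total = new_score / old_score - 1
    annual = (new_score / old_score) ** (1 / years) - 1
    print(f"total gain: {total:.1%}, annualized: {annual:.1%}/yr")
    # -> ~29% total, about 5%/yr compounded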
If you're running a 5-year-old i7-2600K, there is absolutely no reason to upgrade to Intel Skylake. None at all. Maybe you want a new GPU to play those VR games... but Intel isn't making gains anymore in processor speed.
Intel has been trying to get people to buy their power-efficient designs (Skylake is a hell-of-a-lot more power efficient...) so Intel continues to sell laptops at a decent rate. But no one I know has major issues with their desktop speeds.
The only people I know who have upgraded their computers are those who have had hardware failures. There's still no need to upgrade a computer from Sandy Bridge.
While that's the prevailing opinion, I don't necessarily agree. I think it's Sandy Bridge owners trying to convince themselves more than anything else; really, it's just being swept up in the groupthink.
Skylake is the first chip that makes a very strong case as an upgrade. You gain NVME support for your PCIE SSDs, DDR4 (which has shown an improvement over DDR3 in some benchmarks), roughly 20% IPC improvement (5% per gen give or take), DX12_1 feature level IGP, CPUs with 128MB L4 cache which absolutely destroys chips that didn't have this for gaming (Broadwell had it first and Skylake's is improved upon), vastly more power efficient and Thunderbolt3 support.
7 pretty good reasons off the top of my head.
You can dismiss each of these if you want, but this is all very attractive in reality.
The whole story is that Sandy Bridge is only competitive, in gaming, if you overclock to 4GHz+. You still lose out on the other improvements, though, and any stock SB system compared to a stock SL system will look pretty sad once you factor in the platform updates.
If it's gaming you care about, take a look at the benchmarks of the 128MB L4 Broadwell chips compared to Devil's Canyon. Let alone SandyBridge. Both get crushed where it counts and Intel is just now getting Skylake 128MB L4 CPUs out the door. If you don't care about gaming, Skylake still crushes SB.
If you care about NVMe, just get a $20 expansion card. Besides, NVMe SSDs are expensive. Mushkin Reactor 1TB for $210 yo.
Hell, the fastest NVMe SSDs directly go into PCIe lanes. So if I actually cared about the faster speeds, I'd jump to an Intel 750 SSD.
Name me an M.2 NVMe SSD that is more cost effective than a Mushkin Reactor (1TB, MLC, maxes out the SATA6gbps bus, $200), or has faster I/O speeds than a 750.
Yes, if I had a laptop which only had room for a M.2 card, then maybe I'd get the Samsung M.2 card. But even if one were given to me for free, I'd rather get the $20 PCIe expansion card.
I can't think of a single situation where I actually need the onboard M.2 card on the Skylake motherboards, aside from the $20 convenience.
> roughly 20% IPC improvement (5% per gen give or take)
I admit, this is a good thing. But it is very, very little, especially when you consider that the iPhone 5 to iPhone 6 jump was a 70% IPC improvement AND a battery improvement, yet many people don't consider that enough of a jump.
Soooo... FIVE years gets you +20% speed, while ONE year gets you +70% speed on phones. That's why desktops aren't getting upgraded.
> DX12_1 feature level IGP
You buy a $300+ CPU without buying a $100 GPU? The cheapest of GPUs are significantly better than IGP. Hell, if I cared about DX12_1 IGP, I'd get an AMD A10 for half the cost and twice the IGP performance with drivers that actually work on games.
Except I game in capacities that far exceed even AMD's superior IGP. I also care about adaptive sync / GSync technology, which isn't supported by Intel Iris. So I have a R9 290X. Intel's IGP doesn't even come close to a $100 GPU, let alone the midrange GPUs.
> CPUs with 128MB L4 cache which absolutely destroys chips that didn't have this for gaming
NOT on the desktop. Crystalwell is laptop-only, and 45W to boot. Compared to 20W Laptop chips, I don't see the Crystalwell L4 Cache actually being useful to the majority of people.
In fact, I don't even know of any laptops with Crystalwell that I'd recommend to anyone. Here's a challenge: Name me a good laptop with Iris Pro / Crystalwell. Hint: Macbook Pro use AMD dGPUs for a reason.
And hell, we aren't even talking about laptops. We're talking about desktops, and Crystalwell is UNAVAILABLE in desktop form. It's irrelevant to the desktop user, even if you thought that paying $600+ for a CPU was cost-effective (instead of buying a Sandy Bridge E5-2670 for $70 from Amazon).
Basically, you got DDR4 RAM and +20% IPC. That's all that I actually think the last five years will get you. Or, you can buy an 8-core 16-thread E5-2670 for $70... hell... two of them, get a nice server board for $300 and have a BEAST of a machine.
The 45W i7 is a heavy burden to carry with Crystalwell. Might as well get better graphics if you're going for the 15" Pro.
You missed the irony of calling 45 watts a heavy burden. An R9 370X adds about 50 watts to your TDP by itself, along with its needless complexity. If someone wanted to reduce TDP and that complexity, you could step down to the base Intel IGP. But if stepping up, Intel's solution makes a lot more sense.
Good luck with your overpriced Crystalwell failure. If you got actual benchmark scores to talk about, please respond to me here: https://news.ycombinator.com/item?id=11536519
But I actually know the benchmarks of everything you're talking about like the back of my hand. Your argument has no technical legs to stand on what-so-ever. Don't feel bad if I'm just calling out your Bull$.
No one wants that AMD chip in their Macbook. It adds complexity both in engineering and software. There's PLENTY to talk about technically there and why that's a good idea. Not to mention Intel's best-in-class Linux support.
Yes. If you're bargain hunting for gaming hardware you should just buy a console. Or, if you're seriously suggesting to put an Intel 750 into some old system like SandyBridge.. no comment. I would never recommend someone bother doing that.
Step up to an NVME setup, Skylake and do it right. Skylake i5 setups can be had for cheap. You're just arguing to argue on that point. Whether or not you have anything useful to add. The SB argument is common knowledge, an age old argument at that with no new information or insight.
>Name me an M.2 NVMe SSD that is more cost effective than a Mushkin Reactor (1TB, MLC, maxes out the SATA6gbps bus, $200), or has faster I/O speeds than a 750.
I'm not into cost-effective bargain hunting. Anyone who would gimp a nice Intel 750 SSD on a non-NVME system is a fool and you've suggested it.
>The cheapest of GPUs are significantly better than IGP.
No they aren't. The point about DX12_1 IGPs is that it's there, it's modern and it has already sucked the life out of the low end space and moving into the midrange with Iris Pro. Your stance is the 2010-era view on computers. Same era as Sandybridge TBH.
>I also care about adaptive sync / GSync technology, which isn't supported by Intel Iris.
This demonstrates how much you know, and why people shouldn't listen to what you're saying. Which can be heard on any PC gaming forum a thousand times over. This is HN though and it won't fly.
Intel has already committed to FreeSync. It's incoming with Kaby Lake; rumor is that it may be enabled for Skylake.
>Intel's IGP doesn't even come close to a $100 GPU, let alone the midrange GPUs.
Wrong on its face. You just haven't cared to investigate recently.
>NOT on the desktop. Crystalwell is laptop-only, and 45W to boot.
Nope. The Crystalwell chips are going into NUCs from here on out. There's a 128MB L4 NUC coming in 2 1/2 weeks and a 256MB NUC coming in 12 months.
The fact you're talking about gaming and recommending an E5-2670 for that is just silly. That might be a good machine for compiling code. If that's your goal, it's still a bad idea when distcc can utterly embarrass that old power-hungry chip.
For gaming, Broadwell already demonstrated what Crystalwell adds with a standalone GPU. And it's a game-changer: it's faster than the i7-6700K. Yes, it is. And it definitely mops up SandyBridge where it counts (99th-percentile frame times) too.
In 2 1/2 weeks you'll see Skylake with Crystalwell and Thunderbolt 3 absolutely delete any SandyBridge gaming rig you may have (even with an Intel 750, if you made the ridiculous decision to actually put one in SB). There might be more cost-effective ways to build a gaming rig, but if you're into saving money on hardware and gaming, buy a PS4.
I understand it's the prevailing thought among PC gaming kiddies, but holding your grip tighter on some old SandyBridge system won't change that in reality it's fallen pretty far behind in both overall platform performance and power efficiency.
The Iris Pro 5200 GT3e achieves Passmark 1,174.
If Iris Pro 580 GT4e is twice as good (Intel only claims 50% better), that's still not very good. That's utterly awful, actually.
A $100 GPU is the R7 360, just off the top of my head. http://www.newegg.com/Product/Product.aspx?Item=N82E16814125...
Exactly $99 on Newegg right now. It achieves Passmark 3,150.
No one gives a care about the $600 Crystalwell chip that performs worse than a $100 dGPU. It's utterly awful. You'd be insane to actually recommend this product to anybody. You claim that you care about performance. Do you even look at the benchmark numbers, bro? Your claims are so far from reality that I really just don't know how I'm supposed to respond to you.
Yes, a $600 Chip. I'm assuming this, unless you can figure out a cheaper Skylake Iris Pro: http://ark.intel.com/products/93336/Intel-Core-i7-6970HQ-Pro...
EDIT: I see that you're an anti-AMD guy. Okay, whatever. That's why there's another company out there.
Nvidia GTX 750 Ti, $105 right now on Amazon. Passmark 3,686. Still utterly crushing your $600 iGPU with cheap-as-hell GPUs, no matter the brand.
Dude, I'm running a (what was at the time) high-end R9 290X, although this is more of a mid-range card now due to its age (Fury / 980 Ti). It has a Passmark of 7,153, and you're seriously suggesting I "upgrade" to a Crystalwell Iris Pro that only achieves ~2,000 Passmark?
PS: Skylake performing 20% faster than Sandy Bridge after five years of updates is awful.
> I'm not into cost-effective bargain hunting.
Then why the hell are you bringing up M.2? Intel 750 is the best of the best and plugs directly into PCIe. Sandy Bridge handles it just fine.
> In 2 1/2 weeks you'll see Skylake with Crystalwell and Thunderbolt 3 absolutely delete any SandyBridge gaming rig you may have
Crystalwell has to beat a $100 dGPU first. And the benchmarks say otherwise. My bet? Crystalwell fails to beat an R7 360 / Nvidia 750 Ti (Nvidia's LAST-generation BUDGET card) and I get to laugh at its worse-than-console performance numbers despite the $600+ launch price for the chip.
But hey man, show me those benchmark numbers if you disagree with my assessment.
But I have to say, anti-AMD! Yes, very perceptive as I type this on a machine with a Radeon in it. Consider the fact that other people can criticize products they have extensive knowledge with.
Judging from the rest of your response, my post went completely over your head. And quite the troll, as you try to change the points I made in an attempt to "win" the argument. But it is amusing hearing some kid say Intel's 20% IPC boost from SB to SL is awful; it shows how much you don't know. I have friends that work at Intel. Go back to PC Gamer, as you have no idea what you're talking about. Some dumb kid sees ONLY 20%, with massive power reductions, and doesn't realize that computers can do more than just play video games.
You also failed reading comprehension and the ability to hold a conversation. Congrats. But either way, there's no way around the fact that in 2 1/2 weeks I'll be benchmarking an R9 Fury with an i7-6770HQ, a PCIe NVMe SSD and some DDR4, absolutely crushing any SandyBridge system you own.
Enjoy your old E5-2670 and SandyBridge with an Intel 750. What a total fruitcake. A better use of your time would be to go read about logical fallacies, as you just spent an hour typing about a strawman you created to beat on, with points I never made.
Here's what you want to hear, because you just want to argue: you're right, I'm wrong. Hope you feel better now. I'm not giving you any more help. I get it, you like your poverty gaming rig. See ya, kid.
Sure, that's the top-of-the-line $300 chip, in an era when a whole PC can be bought for $300. What if you're on a five-year-old Pentium G620?
1) you're not the sort of person who buys a new rig every two years, and
2) a $300 PC today will give you exactly the same performance as the one you bought 5 years ago: the minimal gains you get in iron are naturally offset by minor losses in software (which is now built by people with SSDs, so good luck with your little spinning disks...)
The market is now artificially segmented to such a fine level, and moving so slowly at the top, that performance simply does not "trickle down" like it used to. Add to that the move to "power efficient" CPUs (aka: less powerful overall) and you will basically see zero gains if you stick to the bottom of the market.
But yeah, it's peanuts. A 20% improvement over five years is pathetic. I'm just calling out your hyperbole, in case others didn't see it. Apple had something like a 50% improvement in a single generation of iPhones, so a 20% difference over five years is very ignorable.
SSDs and GPUs improved dramatically over the past five years. Well... more specifically... SSDs got dramatically cheaper and retained roughly the same quality. So its worth it to upgrade to SSD or to get a new Graphics Card. But Intel doesn't have any GPU offering, and their SSDs are "enterprise" (aka: overpriced). Mushkin / Crucial are better brands for consumers... even Samsung (although a bit more expensive)
A five-year-old Pentium G620 is only ~25% slower than the Skylake Pentium G4520. Both are dual-core CPUs that are cheaper than $100 aimed at the budget audience.
Frankly, the fact that the AMD Vishera FX-6300 still easily beats out the Pentium G4520 in multithreaded benchmarks demonstrates the absolute lack of desktop CPU improvements. I'd only recommend the G4520 to someone who is really sure they care about single-threaded performance (ie: gamers). Most people will appreciate the lower total cost of ownership that the FX-6300 offers at that price point.
* AMD FX-6300 Passmark: 6,342
* Modern Skylake Pentium G4520 Passmark: 4,261.
The G4520 is an $80 chip, released October 2015. The FX-6300 was AMD's 2012 entry: a FOUR-year-old chip, now selling for $80 to $90 at Microcenter.
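Passmark-per-dollar for the two chips, using the scores and prices quoted above (taking the FX-6300 at the $85 midpoint of that range):

    chips = {
        "AMD FX-6300 (2012)":       (6_342, 85),  # (Passmark, price $)
        "Pentium G4520 (Oct 2015)": (4_261, 80),
    }
    for name, (score, price) in chips.items():
        print(f"{name}: {score / price:.0f} Passmark points per dollar")

Roughly 75 points per dollar for the four-year-old AMD chip versus ~53 for the brand-new Pentium.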
Microcenter has some $0 Motherboards if you buy an FX-6300 from them. That's the kind of benefit you get from buying "old". And since CPUs aren't really much faster, why the hell should you buy cutting edge?
Hell, why are you spending $80 on a new G4520? Facebook just decommissioned their servers. You can get a dual-socket-ready Sandy Bridge 8-core 16-thread E5-2670 on eBay for $80, or on Amazon for $70.
Go get yourself a dual-socket 16-core 32-threads E5-2670 Sandy Bridge Workstation, just $80 per CPU.
Intel can't even compete against their own ghost from 5 years ago. Is it a wonder that sales are low?
I'm serious: about 80% of my day is meeting with real people in the real world. Mobile phones haven't changed that.
The other 20% of my day is sitting at my desk creating original work product (mathematical models and thoughtful memoranda) or reviewing the work product of others. Mobile phones haven't changed that, either.
No doubt the drought in PC sales is real and permanent. But I wonder how much of that is because people just don't need to keep their laptops up to date in the age of great cloud services.
As for entertainment though, are you watching Youtube extensively on your desktop?
There already is a relatively mobile tablet/desktop hybrid that works pretty great for both consumption and getting work done. It's called a laptop.
> As for entertainment though, are you watching Youtube extensively on your desktop?
Yes. I have a phone, a tablet, a desktop, and a laptop. The tablet is pretty much only used for netflix and textbooks, and the phone is for travelling. The tablet is absolutely worthless for browsing, coding, writing, or gaming; and the phone is only saved by the form factor. If I had (the space for) a TV then the tablet would be a completely unjustifiable purchase.
And yes, mobile devices have taken away entertainment share from the desktop as well as televisions.
I don't count time I spend out and about on my mobile, since that isn't time I was going to spend on my desktop anyway. (And I don't think having the option of going outside with a mobile has changed how often I do so.)
Plus, Intel provides almost all laptop chips, and I'm guessing most businesses still have either a desktop or a laptop per person, which will still get upgraded every so often. I doubt workers are going to be using iPads for data entry, although I suppose you never know.
None, since I don't want to have (and don't have or own) a portable surveillance and tracking device (also called mobile phone) in my pocket.