
AMD Returns to Full-Year Profitability, Forecasts Strong 2018 - artsandsci
https://www.extremetech.com/computing/263110-amd-triumphantly-returns-full-year-profitability-forecasts-strong-2018
======
xedarius
I recently bought a Ryzen. I was moving from a Mac to a PC and I hadn't owned a PC in over 10 years. Thing is, 10 years ago my PC had 4 cores... and at the time I chose Ryzen, it was the only consumer processor (with a reasonable price tag) offering more cores.

I'm very pleased, my Ryzen is fast and stable, I'm glad the company is reaping
the rewards.

~~~
soulnothing
Another Ryzen owner here. I came from mobile 2nd and 3rd gen i7s.

The price-to-performance is insanely good. I'm on an X370 desktop with 2x RX 550s and an RX 560, used as a virt/dev host. I have 16 Docker instances running on the host OS (Arch Linux).

Then there are 3 VMs with PCI passthrough. One is for Windows gaming/development; until about three weeks back I was getting a solid 60fps @ 1080p on high across most games, but recently it's been down below 10fps. The next VM is a Linux developer desktop. Lastly, there's an emulator box connected to a projector away from my desk.

To switch between them I just have dongles, 1x HDMI and 1x USB, behind my keyboard, and I alternate the plugs. I did this because I couldn't find a good HDMI KVM that does 4K above 60Hz.

This year has really been amazing for enthusiasts.

~~~
jagger27
> This year has really been amazing for enthusiasts.

I mean, graphics card prices have been pretty horrific, but I agree on the CPU
side.

~~~
ViViDboarder
Right? I can’t wait for Bitcoin to crash. Let the GPUs do something of real value other than mining coins.

------
matthewmacleod
This is great. It's super important to have multiple market players producing
CPUs; obviously AMD is not the same company as Intel, but we also really can't
afford to have a monopoly in something as important as CPUs.

~~~
dnautics
Don't forget, though, that Intel owns a sizeable chunk of AMD (iirc).

~~~
e1ven
I've heard that they have a patent cross-licensing agreement, which is why AMD can use x86 and Intel can use AMD64, but I hadn't heard about the ownership.

I don't see Intel listed at [http://www.nasdaq.com/symbol/amd/institutional-
holdings](http://www.nasdaq.com/symbol/amd/institutional-holdings)

Where do you see they have an ownership stake? How much is it?

~~~
pulse7
It was IBM who forced Intel and AMD to sign the agreement. From CNET:

"Early 1980s--IBM chooses Intel's so-called x86 chip architecture and the DOS
software operating system built by Microsoft. To avoid overdependence on Intel
as its sole source of chips, IBM demands that Intel finds it a second
supplier.

1982--Intel and AMD sign a technology exchange agreement making AMD a second
supplier. The deal gives AMD access to Intel's so-called second-generation
"286" chip technology."

------
dingdingdang
I'll be adding to AMD's bottom line, since I'll be buying their CPU/GPU next time my workstation needs updating. This is fundamentally due to the grief caused by Intel's handling (or lack of handling, which is perhaps the more accurate description) of the Spectre/Meltdown crisis.

I'm simply not interested in Intel's kind of mentality when it comes to the hardware that runs all my computing. Nah.

------
nirv
Great news! I hope AMD releases a reasonably priced Intel NUC equivalent with their new APU - in the same compact, businesslike case (without the gamer ridiculousness), with a couple of 40Gbps USB-C Thunderbolt 3 ports, and powered by a USB-C/PD cable.

~~~
bitL
AMD would have to license Thunderbolt from Intel, so it's unlikely.

~~~
nirv
Apparently not: "Intel to make Thunderbolt 3 royalty-free in 2018"[1].

[1] [http://www.zdnet.com/article/intel-to-make-
thunderbolt-3-roy...](http://www.zdnet.com/article/intel-to-make-
thunderbolt-3-royalty-free-in-2018/)

~~~
bitL
That's good news! AMD can finally have the full stack.

------
djsumdog
I realize they've always been an underdog compared to Intel, but they still dominated the console market. When every Xbox and PS4 sold has your CPU in it, how were they not profitable?

Were they dumping all that money into Zen core R&D? With the Meltdown craziness, will this be the year we start to see AMD return to the data centre?

~~~
zitterbewegung
The console market is usually where the chip maker in second or third place tries to make a play, because console makers offer basically assured sales, but they come with strange requirements.

The PS3, Xbox 360, and Nintendo Wii all used IBM chips. Console makers tried a bunch of new architectures, but for the most part the developer ergonomics were horrible.

So for the PS4 and Xbox One they decided they needed regular computer parts. I'm sure both AMD and Intel bid on it, but AMD would have been the one extremely motivated to make sales to Microsoft and Sony. As time goes on you can make higher and higher margins, since the chips don't change (though on the other hand we're now seeing console revisions within the traditional generation). But I think the margins will keep increasing.

~~~
dralley
Well, there's motivation on AMD's part, but they also had a value proposition that nobody else could match: they can provide both the CPU and GPU on one chip, which simplifies the console design pretty dramatically. More compact, easier to cool, etc.

Intel graphics solutions were (and remain) sub-par, and Nvidia has no x86
solutions (and their ARM chips would have been underpowered).

~~~
kimixa
I don't know if an ARM chip really would have been underpowered - remember, the consoles are using the ultra-low-power Jaguar cores. Those are probably not a million miles away from an A57, or whatever Denver-based core NVidia could pull out of the bag.

~~~
jrs95
It definitely wasn't underpowered for the Switch, and they managed to get Skyrim and Doom running on that. I'd imagine they could do even better performance-wise in a traditional console than in a portable hybrid.

~~~
freeflight
> they managed to get Skyrim and Doom running on that

The original Doom was already running on the SNES some 25 years ago.

Getting something to run is not really that big a hurdle, since you can always scale down resolution, graphical fidelity, or frames per second. The question after that is rather: who would want to play an obviously inferior version of the game? Because that's exactly what these Switch versions of Skyrim and Doom are: inferior.

~~~
girvo
Inferior, except for the fact that I can take them with me.

------
shiado
I tried to buy an RX 580 a while ago to play some games I'd been meaning to play, and they were sold out everywhere.

Part of me wonders if the long-term effects of the mining craze will be negative for the makers of graphics cards and, in turn, PC component makers as a whole. Many people might be turned off of PC gaming by the prohibitive prices of buying cards at 2.5x MSRP and turn to consoles instead. Or, in my case, I intend to wait and pick up cheap hardware when crypto tanks hard (i.e. the Tether fraud collapses), GPU mining becomes unprofitable, or altcoins switch to proof of stake.

~~~
softawre
Yes, this is a major concern for PC gaming. I have a couple of 1070s I bought for $400 each; now I could get double that for them used. In-freaking-sane.

~~~
solotronics
Just curious, but why not sell them now and buy newer ones later at non-inflated prices?

How is it possible that AMD seems to be at the limit of production for their GPUs but barely beat estimated earnings this quarter?

~~~
maksimum
If you're trying to be economically rational, you're better off simply using the GPUs to mine while you're not otherwise using them. ROI time is around 6-9 months for most GPUs these days.
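
For a rough sense of that payback math, here's a minimal sketch in Python; every number in it (card price, daily revenue, power draw, electricity rate) is a made-up placeholder for illustration, not a real quote:

    # Back-of-the-envelope GPU mining payback (all numbers hypothetical).
    card_price = 400.00    # USD paid for the GPU
    daily_revenue = 2.50   # USD/day of mined coin at current difficulty/price
    power_draw_kw = 0.15   # GPU draw in kilowatts while mining
    electricity = 0.12     # USD per kWh

    daily_power_cost = power_draw_kw * 24 * electricity  # ~0.43 USD/day
    daily_profit = daily_revenue - daily_power_cost      # ~2.07 USD/day

    payback_days = card_price / daily_profit
    print(f"Payback: ~{payback_days:.0f} days (~{payback_days / 30:.1f} months)")

With these placeholder numbers the payback comes out around six and a half months, consistent with the 6-9 month range; real profitability swings with difficulty and coin prices.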

~~~
outworlder
Does that include electricity costs? For which cryptocurrency?

~~~
solotronics
Yes, the electricity cost is almost insignificant. Currently the most profitable coin to mine on GPU is XMR; before that it was ETH. Profitability changes as prices fluctuate.

------
InTheArena
AMD is in a great place at the moment - but we need to see what the smaller Zen cores coming this year can do, and what the Zen 2 release next year looks like. If those bring solid, significant improvements, it means AMD is set for a while and can really compete and take market share from Intel.

Better yet, I am really hopeful about Epyc - it still doesn't seem to be shipping in huge numbers, but for someone really burned by Meltdown like me, the timing seems perfect for AMD to compete.

~~~
lawrenceyan
If you don't mind me asking, in what capacity were you affected by the Intel
Meltdown exploits? Was it an issue in terms of security, performance, etc?

~~~
e1ven
Not OP, but I personally was pretty heavily affected - I run Robohash.org,
which runs off a bunch of Digital Ocean droplets.

I needed to fire up ~30% additional nodes after they did their migration.

I didn't measure as carefully as I could have, but the existing nodes couldn't
keep up with demand after the upgrade.

------
guiomie
A lot of people are buying AMD/ATI GPUs to mine Ethereum; their GPUs do extremely well on this one specific cryptocurrency. With ETH moving to PoS this year, removing the need for GPU mining, I wonder how this will impact the price of AMD stock going forward. Lots of people use Nvidia to mine other GPU-compatible altcoins.

~~~
Filligree
AMD hasn't scaled up their production to meet that demand, and neither has Nvidia; they're both wary of a crash causing demand to stop as miners sell off their burned-out GPUs.

Right now that makes things hard for everyone who isn't a miner, but if AMD ends up being the only option for gamers while Nvidia still sells to miners, they might actually gain from the mess.

~~~
sangnoir
> they're both wary of a crash causing demand to stop as miners sell their
> _burned-out_ GPUs

Offtopic FYI: GPUs that are in use 24/7 would probably be in better shape than those that go through more thermal (on/off) cycles. There's obviously more wear and tear on the moving parts, but fans are easier to replace than mechanically stressed silicon/PCBs.

~~~
Yizahi
a) It's not at all certain, since both usage patterns do damage to the board and chips, and not a single person in this thread knows which will be more severe for each small part of the board or its connections.

b) Have you actually tried replacing the monstrous coolers on modern cards, or even buying spares? They're rare and expensive, hard to disassemble, and hard to fit back. Some cards even use adhesive stickers on the memory thermal interface; you can potentially tear chips off.

~~~
Fnoord
Regarding b), I just checked iFixit for Radeon and found 2 (two!!) guides for graphics cards: one for reapplying thermal paste, and one for fixing a noisy fan with, I think, tape. All the other guides are for laptops specifically. If you know of any other good written guides (preferably not videos) for cleaning and/or replacing fans on graphics cards, I'd like to learn about them.

------
UK-Al05
Not surprised. Ryzen was a hit and their GPUs are great for Linux and mining.

~~~
madengr
I think AMD GPUs are crap; at least their Windows drivers are. I recently built a gaming PC (MSI RX-580 GPU) for my son and never had so many frustrating crashes, black screens, etc. I waited a month for their 5.18 driver release, and there was no improvement.

What a piece of shit. Sold it on eBay in 30 seconds for $40 more than I paid
for it, and bought an Nvidia 1060; works flawlessly. The guy I sold it to said
the RX-580 is working fine for mining. He got a card for $300 and I got rid of
a headache.

~~~
m-p-3
> 5.18 driver release?

How many years ago was this? The latest AMD Adrenalin drivers are at 18.1.1. Oh, and on Linux the open-source AMDGPU driver (made by AMD themselves) was mainlined into the kernel not long ago, so any recent card will work out of the box on a distro with a recent kernel.

~~~
dragontamer
AMD's driver scheme is basically year.month.something

I haven't figured out what the last number means, but the 18.1.1 release basically means "2018 January". Similarly, their 17.7 release meant "2017 July".

5.18 seems to predate AMD's current naming scheme, so it was probably something from a long time ago...
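
If the scheme really is year.month.point (an assumption based only on the two releases named above), decoding a version string is trivial; the helper below is purely illustrative, not an AMD tool:

    # Decode an Adrenalin-style version, assuming a year.month.point scheme
    # (e.g. "18.1.1" -> January 2018). Purely illustrative.
    import calendar

    def decode_driver_version(version: str) -> str:
        year, month, point = (int(p) for p in version.split("."))
        return f"{calendar.month_name[month]} 20{year:02d}, point release {point}"

    print(decode_driver_version("18.1.1"))  # January 2018, point release 1
    print(decode_driver_version("17.7.1"))  # July 2017, point release 1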

------
dragontamer
Ryzen looks really, really good. I'm planning on upgrading to Team Red over Intel as soon as DDR4 prices come down a bit. Intel's only advantages seem to be AVX512 (which comes with a clock-speed penalty) and a truly unified L3 cache (good for databases).

But Ryzen's "split" L3 cache seems to be great for process-level parallelism (think compile times), and it scales to more cores at a cheaper price. They have an Achilles heel though: 128-bit-wide SIMD units mean much slower AVX2 code, and no support for AVX512.

But for most general-purpose code, Ryzen looks downright amazing. Even the slower AVX2 throughput is mitigated by having way more cores than the competition; AMD sort of brute-forces a solution.
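
As a toy illustration of that trade-off - the unit and core counts below are illustrative assumptions, not exact product specs:

    # Peak single-precision FLOPs/cycle: a 256-bit AVX2 FMA touches 8 floats,
    # a 128-bit datapath only 4, so per-core peak throughput halves.
    def peak_flops_per_cycle(simd_bits: int, fma_units: int) -> int:
        floats_per_op = simd_bits // 32       # 32-bit floats per vector
        return floats_per_op * fma_units * 2  # FMA = mul + add = 2 FLOPs

    per_core_256 = peak_flops_per_cycle(256, 2)  # 32 FLOPs/cycle/core
    per_core_128 = peak_flops_per_cycle(128, 2)  # 16 FLOPs/cycle/core

    # Half the per-core AVX2 throughput, but twice the cores, evens out:
    print(8 * per_core_128, "vs", 4 * per_core_256)  # 128 vs 128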

AMD's GPGPU solution looks inferior to NVidia's, but for my purposes I basically just need a working compiler. I don't plan on doing any deep learning stuff (so CuDNN is not an advantage in my case), but I'd like to play with high-performance SIMD / SIMT coding. So AMD's ROCm initiative looks pretty nice. The main benefit of ROCm is AMD's attempt to mainline it. They're not YET fully successful at integrating ROCm into the Linux mainline, but their repeated patch attempts give strong promise for the future of Linux / AMD compatibility.

The effort has borne real fruit too: Linux accepted a number of AMD drivers into the 4.15 mainline.

NVidia's CUDA is definitely more mainstream though. I can get AWS instances
with HUGE NVidia P100s to perform heavy-duty compute. There's absolutely no
comparable AMD card in any of the major cloud providers (AWS / Google /
Azure). I may end up upgrading to NVidia as a GPGPU solution for CUDA instead.

OpenCL, unfortunately, doesn't seem like a good solution unless I buy a Xeon Phi (which is way more expensive than consumer stuff). AMD's ROCm / HCC or CUDA are the only things I'm optimistic about in the near future.

~~~
sangnoir
> I'm planning on upgrading to Team Red over Intel as soon as DDR4 prices come
> down a bit.

You probably want to buy that DDR4 as soon as you can - memory prices (DDR3, DDR4) have been consistently going _up_, not down [1]. It's insane. The price-fixing fines the memory manufacturers paid were clearly not punitive enough; we need more anti-trust action in this area.

I recently did a build on a budget and had to snipe online specials on DDR4; waiting for the prices to come down is not a winning strategy.

[1] [https://pcpartpicker.com/trends/price/memory/](https://pcpartpicker.com/trends/price/memory/)

~~~
paulmd
That trend seems to be over. China's antitrust regulator has put the DRAM
cartel on notice, and they're willing to stand up state-sponsored fabs to deal
with it if necessary. Coincidentally just after that, Samsung announced
production is going up and prices are expected to decline over this year.

[http://www.china.org.cn/business/2017-12/23/content_50157479...](http://www.china.org.cn/business/2017-12/23/content_50157479.htm)

[https://www.theregister.co.uk/2017/12/21/china_memory_insour...](https://www.theregister.co.uk/2017/12/21/china_memory_insource/)

[https://www.reuters.com/article/us-samsung-elec-chips-
outloo...](https://www.reuters.com/article/us-samsung-elec-chips-outlook-
analysis/end-of-a-chip-boom-memory-chip-price-drop-spooks-investors-
idUSKBN1F406U)

Pretty blatant market manipulation by the DRAM cartel - and China has enough downstream manufacturing at stake that they're willing to go to the mat over it.

~~~
sangnoir
This is fantastic news! I hadn't heard of this - it means my future memory
upgrades will be reasonably priced :-).

------
shmerl
I do wish they'd produce a power-efficient Vega for gaming sooner, though. It looks like that's being pushed off to 2019.

The Sapphire Nitro version requires 3 (yes, 3!) 8-pin power connectors. That's why I'm staying with my RX 480 for the time being. It works very well with Mesa on Linux.

~~~
vetinari
That's up to Sapphire; they probably did it for overclocking headroom.

The reference design (including the LC version) and some other designs, like the Asus Strix, require only 2 8-pin power connectors.

~~~
shmerl
But they're still quite power hungry, at least according to the TDP specs. It gives the impression that Vega isn't really where it's supposed to be efficiency-wise.

And I prefer to avoid the reference design, which is usually too noisy.

~~~
philjohn
Many people are undervolting them, which brings power draw down considerably and leads to only a minuscule difference in performance.

~~~
shmerl
So why are they so overvolted by default?

~~~
paulmd
Unpopular truth: because that's where they need to be to ensure stability and
yields.

There is wide variation in the stability of hardware across different tasks. An overclock/undervolt that is stable in one task is not necessarily stable in others, as anyone who's overclocked can attest. For example, Just Cause 3 needed several hundred MHz less than I could get in TF2 or The Witcher 3.

The voltage is set where it needs to be to ensure that there's no instability _in any program_, on any sample in a batch. People look at one task and one sample and assume that their OC must be stable on everything, on every card in the batch. In reality it's not, or not to the degree that the manufacturer requires.

Yes, you can get extra performance from any given sample by eating into your safety margin and pushing closer to the limit of that specific sample's frequency/voltage curve. But as the economists say: if there were free money lying on the sidewalk, AMD would have picked it up already. They're not stupid; they ship the voltages they need.

~~~
shmerl
_> They're not stupid, they ship the voltages they need._

That's my point, then. Vega just needs too much by default. Hopefully the next iteration will be less power hungry.

------
ComputerGuru
For those that haven’t seen it, Intel is releasing a new chip that has an integrated AMD GPU on the package; I am eager to see how the sales for that go. It does seem to imply that Intel has given up on its own iGPU aspirations, which makes no sense given how much they’ve doubled down on GPU tech in recent years. But it may just be Intel playing the long game to compete with Nvidia, which has been killing it in the GPU department in recent years.

[https://arstechnica.com/gadgets/2018/01/kaby-lake-g-
unveiled...](https://arstechnica.com/gadgets/2018/01/kaby-lake-g-unveiled-
intel-cpu-amd-gpu-nvidia-beating-performance/)

~~~
def-
Intel's plan seems to be to do this temporarily, until their own high-performance GPUs are ready.

~~~
forgot-my-pw
Don't they say this every year? That their integrated GPU will be better, but
it's still crap.

~~~
paulmd
Well, to be fair, Intel HD Graphics _is_ a lot better than GMA. With Crystal Well, it's actually pretty close to something like a GTX 750 and outperforms even the new Vega APUs.

But this time they're actually making a play for the _discrete_ graphics market. They've hired Raja Koduri and everything. It's not the first time they've tried that either (see: Larrabee), but they do look to be making a serious attempt.

~~~
throwaway2048
If it genuinely outperformed the Vega APUs, why would Intel put a Vega GPU core on a new processor line...

~~~
paulmd
They're not in the same performance class. A Crystal Well Iris Pro 580 (and the 2500U) is slightly slower than a GTX 750 non-Ti. The new Intel APU is going to be ballpark RX 570 performance - or something like a 1060 3 GB on the NVIDIA side. That's about 4x the performance, which is absolutely necessary for the sorts of applications Hades Canyon is aimed at, like VR.

[http://gpu.userbenchmark.com/Compare/Intel-Iris-
Pro-580-Mobi...](http://gpu.userbenchmark.com/Compare/Intel-Iris-
Pro-580-Mobile-Skylake-vs-AMD-Radeon-Vega-8-Mobile-Graphics/m132950vsm378274)

[http://gpu.userbenchmark.com/Compare/Intel-Iris-
Pro-580-Mobi...](http://gpu.userbenchmark.com/Compare/Intel-Iris-
Pro-580-Mobile-Skylake-vs-AMD-Radeon-RX-Vega-M-GH-Graphics/m132950vsm422266)

[http://gpu.userbenchmark.com/Compare/AMD-RX-570-vs-AMD-
Radeo...](http://gpu.userbenchmark.com/Compare/AMD-RX-570-vs-AMD-Radeon-RX-
Vega-M-GH-Graphics/3924vsm422266)

Having a discrete graphics die, and especially having access to HBM2, makes a huge difference in performance. There isn't much you can do with 30 GB/s of bandwidth shared between CPU and GPU. The Crystal Well L4 cache is a huge boost, but it's still a half-assed fix compared to having proper VRAM available.

Of course, it's also a vastly more expensive part. Just the CPU+GPU package costs more than some entire 2500U laptops.

Presumably Intel is aiming for something more like Vega M GH with Jupiter
Sound/Arctic Sound - it makes little sense to design a low-end discrete part
with no room for future performance growth.

------
zwieback
1/3 of the sales increase from crypto, probably not sustainable.

~~~
pythonaut_16
I don't think it has to be sustainable. They just need more time for EPYC to
really take off and for Ryzen to continue gaining market share. I think their
strongest areas are going to be CPUs and APUs, and things like Kaby Lake G.
They shouldn't give up on graphics but it's going to take longer for them to
be super competitive there.

------
rdlecler1
Why are Nvidia’s margins so high in comparison? Given their technology and patent portfolio, I’m surprised that someone hasn’t tried to take them over.

~~~
lmm
Are AMD's margins actually any worse in the graphics card market? Or is it
just that AMD is also playing in the (lower-margin) CPU market, so their
overall margin is worse?

~~~
paulmd
Nobody here can tell you that officially, at least without breaking an NDA,
but the answer is very obviously "yes".

Comparing apples to apples, AMD sells their Vega 64 flagship at an MSRP of $499; NVIDIA sells a product with an equivalent die size at $1200. AMD sells their cutdown at $399; NVIDIA sells theirs at $799. And that's before you figure in that NVIDIA is using commodity GDDR5/5X while AMD is using expensive HBM2 on consumer products - NVIDIA charges between $3k and $15k for its HBM2 products. So: half the MSRP, with a more expensive BOM.

AMD's margins on Vega are trainwreck-bad. Some experts actually think they are losing money on every unit sold at MSRP - hence the "$100 free games bundle" at launch, and the de facto price increases above MSRP during the fall. They're banking heavily on HBM2 costs coming down, and probably also on NVIDIA not being aggressive with the launch of gaming Volta (aka Ampere). Apart from Vega FE, they don't really see any of the extra revenue from the inflated prices during the mining boom either; that all goes to AIB partners and retailers. All AMD gets out of it is volume, and up until now they've been reluctant to increase production.

In contrast, Ryzen is actually dirt cheap to manufacture due to its MCM layout. Their margins there are probably better than Intel's, even with prices significantly below the launch prices.

------
topspin
I sure hope AMD can stay on plane this time. The Athlon 64 was tremendous and forced a huge opening in the x86 market, but AMD squandered that opportunity in the following years. Competing with Intel long-term means keeping up on all fronts.

------
currymj
I'm optimistic AMD GPUs will be usable for general-purpose computing soon. ROCm seems to be coming along nicely, and they have various deep learning frameworks functioning to varying degrees.

In particular, I think there's buy-in from the framework maintainers: they're not going to go out of their way to port, but they also aren't averse to merging in code written by AMD engineers.

I don't think people in research have any particular loyalty to NVIDIA, and everybody's MacBook Pro now has an AMD GPU, so there are also personal incentives to get this stuff working properly.

------
enzolovesbacon
I was about to buy a Ryzen workstation, but ended up buying a used HP Z600 (2x E5620, 48GB ECC RAM, 2TB HD) because it was the best bang for the buck; just the Ryzen 1800X + motherboard would have cost me what I paid for the whole Z600 (I live in Brazil, prices here are crazy).

That said, I'm really rooting for AMD here. It's very nice to have this kind
of competition and customers will benefit a lot from this.

------
ythn
Been a loyal AMD user ever since my first laptop, which had an Athlon 64. They've always given me great performance at a fraction of the cost of Intel.

~~~
SG-
Not for laptops they haven't. Also never understood brand loyalty over so many
generations.

~~~
ZenoArrow
The key part of GP's statement again...

"great performance at a fraction of the cost of Intel"

Only two points need to be true for that statement to hold up:

* Performance levels that GP is happy with.

* Cheaper than Intel.

Which point(s) do you disagree with?

~~~
nottorp
Were there laptops with Athlon 64s back then? Maybe the huge ones with desktop parts.

However, at that time, on the desktop, AMD gave you 80-90% of the performance for 50-60% of the price. Fine with me; I was using one.

------
thisisit
Total debt of 1.4 billion dollars looks huge. So while revenues look great, I wonder if their cash flow has improved as well.

~~~
zdw
It might be relative. I'm no financial expert, but Intel has something like 26B in debt against 226B in market cap:
[https://ycharts.com/companies/INTC](https://ycharts.com/companies/INTC)

It doesn't seem so out of line for AMD to have 1.4B in debt against a 13B market cap.
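
Putting the quoted figures side by side as a crude leverage ratio (a quick sketch using only the numbers above):

    # Debt-to-market-cap from the figures quoted above (billions USD).
    companies = {"Intel": (26, 226), "AMD": (1.4, 13)}

    for name, (debt, market_cap) in companies.items():
        print(f"{name}: debt / market cap = {debt / market_cap:.1%}")
    # Intel: debt / market cap = 11.5%
    # AMD: debt / market cap = 10.8%

By that crude measure, AMD's leverage is actually a touch lower than Intel's.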

~~~
thisisit
The question is really whether they are generating enough cash. Intel, for example, has 11.8 billion in free cash flow:

[http://financials.morningstar.com/cash-
flow/cf.html?t=INTC&r...](http://financials.morningstar.com/cash-
flow/cf.html?t=INTC&region=usa)

whereas AMD still seems to be burning cash, currently at a 217 million outflow:

[http://financials.morningstar.com/cash-
flow/cf.html?t=AMD](http://financials.morningstar.com/cash-flow/cf.html?t=AMD)

------
tyingq
Totally rooting for AMD to be the Avis to Intel's Hertz. The early Opteron/x64 days were great, when I actually had a credible alternative to the status quo. Go AMD! Competition benefits everyone - even Intel.

------
matte_black
I recently doubled my position in AMD prior to earnings, mostly from
discussions on HN. Here’s hoping they will have continued success in the long
term.

------
davedx
Is it worth picking up some AMD stock?

~~~
lettergram
Probably not today, but soon yes

[https://i.imgur.com/bGtm6kt.png](https://i.imgur.com/bGtm6kt.png)

They weren't as impacted by the recent vulnerabilities, and Intel is likely going to _lose_ quite a bit of market share (Intel currently has ~95%+ of the server market). If AMD can break into the server market with even a 10% showing, then they're golden.

(Image and prediction(s) from:
[https://projectpiglet.com](https://projectpiglet.com))

Edit: words

~~~
ZenoArrow
> "(Image and prediction(s) from:
> [https://projectpiglet.com)"](https://projectpiglet.com\)")

From the site...

"Experts' Opinion Score, is a score representing how experts feel about a
given stock, product, or topic."

With all due respect, I don't think that's a very sensible metric to base
investment decisions on.

~~~
lettergram
I disagree, though I think we may have a misunderstanding about "experts". The system identifies experts as people who say things such as "I work at AMD and things are going well" or "Google probably won't have a good quarter, source: I work at Google".

There are other methods, but it typically picks out people who work at a given place, hold a large amount of an asset, or have some expertise in the field (say, CPU design). As they are literally the best sources of info, there is a high correlation with movements in price within roughly 45 days, i.e. at predicting quarterly results.

~~~
ZenoArrow
> "I disagree, i do think we may have a misunderstanding on experts."

I have no problem with "experts" in general, but I do think we have a
misunderstanding with regards to experts in the field in question.

What incentives are in play for the experts behind that website to be honest?
Let's conduct a role play. Consider that I'm the expert in question, and I decide I want to short the stock of a company (i.e. bet that the stock price will fall rather than rise). In this situation, what would be the best advice I could give on this website to make sure my bet works out? I would mark down my "confidence score" (which is effectively what the website publishes) in order to maximise my chances of making money. The financial health of the company is secondary: it will have some influence on how likely my bet is to succeed, but it's not the only factor at play.

Benjamin Graham (Warren Buffett's mentor) summed up the most obvious path for a successful investor by stating: "The intelligent investor is a realist who sells to optimists and buys from pessimists." Taking this a step further: without knowing the financial position of the "expert" you're getting advice from, how do you know if the "expert" is in the market to buy, to sell, or is neutral? I would suggest that you don't know that, and with that in mind, any advice you get from an anonymous investor should be taken with a large dose of salt.

Just in case you think I'm just describing a hypothetical situation, there
have been high profile cases where investment firms were caught betting
against the advice they gave their clients. For example:

[http://www.independent.co.uk/news/business/news/goldman-
bet-...](http://www.independent.co.uk/news/business/news/goldman-bet-against-
securities-it-sold-to-clients-1953406.html)

~~~
lettergram
[https://blog.projectpiglet.com/2018/01/causality-in-
cryptoma...](https://blog.projectpiglet.com/2018/01/causality-in-
cryptomarkets/)

Typically, I just run the numbers and it usually works out. I see your point
though, any suggestions?

~~~
ZenoArrow
> "Typically, I just run the numbers and it usually works out. I see your
> point though, any suggestions?"

I'm not an expert, but I believe Buffett's advice of 'invest in what you know'
is sound:

[https://www.simplysafedividends.com/warren-buffett-
investmen...](https://www.simplysafedividends.com/warren-buffett-investment-
advice/)

[https://www.marketwatch.com/story/the-genius-of-warren-
buffe...](https://www.marketwatch.com/story/the-genius-of-warren-buffett-
in-23-quotes-2015-08-19)

In other words, it pays to do your own research into the financial health of a company, to determine whether it is currently undervalued or overvalued. It should be noted that this works best over a long time horizon, as you may have to weather some short-term market irrationality.

------
Animats
Wow. 34% gross margin in a competitive hardware business. AMD is doing OK.

~~~
TomVDB
Intel and Nvidia have gross margins of 58% or higher. Other semiconductor
companies like Broadcom are around 50%.

34% is better than the 30% of last year, but it's still really bad.

------
m3kw9
Their P/E is now over 340. I’m not sure if that’s new territory for a stock that isn’t a penny stock - or is it just because they only recently became earnings-positive?

~~~
jnordwick
Probably the latter. PER is only really useful for a firm with a stable and positive net income (think industrials, where growth capex isn't such a major factor). For a firm fluttering around zero EPS you would use a more forward-looking measure of expected profit. The problem with PER is that the stock price is forward-looking while EPS is backwards-looking.
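
To see why P/E explodes for a firm hovering around zero EPS, here's a quick worked example (the price and EPS values are hypothetical):

    # P/E = share price / trailing EPS. As EPS approaches zero the ratio
    # blows up, so it says little about a barely-profitable firm.
    price = 13.60  # hypothetical share price

    for eps in (2.00, 0.50, 0.04, 0.01):
        print(f"EPS {eps:5.2f} -> P/E {price / eps:7.1f}")
    # EPS  2.00 -> P/E     6.8
    # EPS  0.50 -> P/E    27.2
    # EPS  0.04 -> P/E   340.0
    # EPS  0.01 -> P/E  1360.0

A few cents of EPS either way swings the ratio by hundreds, which is why a forward-looking measure is more informative here.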

------
Dolores12
It is due to crypto miners. Once crypto coins go down, the market will be flooded with used graphics cards; hence sales of new ones will decrease.

~~~
freehunter
Would you buy a used GPU if you knew it had been used for mining? I'd feel like it had been pushed too hard and could be ready to burn out. Makes me wonder if the secondhand market for video cards will tank for a while after mining stops.

~~~
vesrah
Cheaper GPUs for me, then. I don't see a difference between buying used parts from someone who browsed the internet versus someone who played games. CPUs/GPUs don't just 'wear out' on a normal time scale before they become obsolete. We've got companies that have been running the same mainframes for 50 years.

~~~
matte_black
It is in no one’s interest except the miners' to promote used mining GPUs as viable parts. If you simply continue the narrative that used mining GPUs are bad, the prices will be pushed down as people refuse to buy them.

If you are looking to buy those used GPUs for cheap, you should avoid any public language that downplays the impact of 24/7 mining.

~~~
ihattendorf
Maybe instead of trying to influence the market, he's trying to have an honest
discussion about the viability of GPUs used for mining?

~~~
matte_black
That is best done behind closed doors.

------
gkgicccj
Are there ThinkPads with AMD?

PS: how does one delete old HN comments? Seems like a major privacy issue to
me.

~~~
kuschku
> PS: how does one delete old HN comments? Seems like a major privacy issue to
> me.

There's no option in the interface; the best option right now is asking nicely via email.

Of course, you can also just wait until May and then use the rights the GDPR grants you to force them to delete your comments.

~~~
e12e
That's assuming the poster has rights under the GDPR, and that HN will adjust to follow it. It does seem likely they will (have to).

~~~
kuschku
The GDPR applies to (basically) any business on the planet that stores data of EU citizens, so it's quite easy to make that assumption.

~~~
kevin_thibedeau
How is social media commentary, willingly handed over for public exposure, a form of personal data? If the GDPR is allowed to have such wide applicability, there are going to be a lot of shakedown scams from bad actors.

~~~
e12e
> there's going to be a lot of shakedown scams from bad actors.

I'm not sure how that would work.

Any compliant service is likely to allow self-service (eg: a button to delete
a comment; a link to list out all data; an edit function to correct wrong
data).

If you're storing personal information and don't comply with the law, you risk
a fine. Just as you risk a fine for mismanaging health data, or risk
prosecution for storing data that is illegal, like child pornography.

~~~
e12e
You might also want to look at GDPR Chapter 3, Article 12, paragraph 5:

"Information provided under Articles 13 and 14 and any communication and any
actions taken under Articles 15 to 22 and 34 shall be provided free of charge.
Where requests from a data subject are manifestly unfounded or excessive, in
particular because of their repetitive character, the controller may either:

charge a reasonable fee taking into account the administrative costs of
providing the information or communication or taking the action requested; or

refuse to act on the request.

The controller shall bear the burden of demonstrating the manifestly unfounded
or excessive character of the request."

[https://gdpr-info.eu/art-12-gdpr/](https://gdpr-info.eu/art-12-gdpr/)

------
tsmarsh
Strong forecasts with Spectre still around seem very optimistic.

Can AMD or Intel release another generation of chips with Spectre vulnerabilities?

Do we think they're going to have a solution to Spectre within a year?

Can x86 survive without out-of-order processing? Can any architecture perform at modern levels without it?

Is x86 still relevant in the server space if you use Linux/BSD and can recompile your deployables?

Without Spectre I'd be very bullish on AMD. With Spectre, I'm bearish on the entire sector.

