
Chipmakers Rally in Talent War - bcaulfield
https://www.eetimes.com/document.asp?doc_id=1332865
======
slashink
Really, what has to happen is that the chip companies have to wake up to the
fact that levels of compensation at software companies have outpaced theirs.
If you want to hire top talent for half of what Google/FB/Amazon would pay,
then yes, there will be a shortage.

Trying to convince engineers with these fluff pieces how they are important
will not make a difference. Pay up.

~~~
whack
As an EE who left the hardware industry to become a software engineer, I
agree completely. It's also worth examining the reasons why chipmakers in
today's world can't ever compete with SW companies for top talent:

1\. The market size and opportunity in SW are tremendous compared to HW. SW
is eating the world in nearly every industry, and this is going to continue
for the next 10-20 years. HW used to be great during the era of Moore's law,
but with Moore's law coming to an end, it's going to be harder and harder for
chipmakers to compete with the same products they released 3 years ago.

2\. Entrepreneurial opportunities are severely limited in HW. A software
startup can be launched by 2 college kids in a dorm room, using a couple
thousand dollars of angel investments, and scaled nationally with 1-2 million
in VC funding. With HW startups, because of the engineering/logistical
nightmares involved in fabricating a chip and selling a physical product, you
need so much more money/credentials/connections to get a startup off the
ground. SW devs can realistically dream of building/joining a world-changing
startup, but HW engineers are mostly resigned to being a cog in a big machine.

3\. Paranoia and risk aversion. Strongly linked to the above is the culture
of rational risk aversion that exists in any HW project. With a software
project, the Agile/MVP methodology works extremely well because the cost of
making changes, or fixing prior mistakes, is so low. In HW, the costs of
making changes and of releasing a buggy product are astronomically higher.
Because of this, HW projects are extremely risk-averse. Junior developers are
micromanaged by their superiors. People are given little autonomy to innovate
and try new ideas. Project timelines are measured in years, not months. From
an artistic perspective, developers love having the freedom to build and to
get their product into the hands of users as soon as possible. This is so
much easier in SW than in HW.

Combine all 3 of the above, and it's easy to see why SW is so much more
lucrative and enticing. As an EE myself, I tell everyone I know that unless
they have a deep passion for the field itself, they should strongly consider a
career switch.

~~~
twtw
Re: Dead Moore's law and entrepreneurial opportunities.

It seems to me that the end of Moore's law may have a flip side that could be
awesome. Previously, it nearly always made more sense to do something on a
CPU and wait for performance to double than to develop a custom ASIC. With
the free lunch over, I suspect that we will begin to see a ton more custom
ASICs and architectural innovation. This has already begun for ML/DL/AI.

Without Moore's law, the semiconductor industry may get a lot more exciting.
It might also just fade to commodity even more, but that isn't as fun to think
about.

~~~
deepnotderp
Ding ding.

I've always been of the opinion that the end of Moore's Law will result in a
new wave of startup innovation. When the cost of newer nodes is a barrier to
entry and the advantages of startups can be obviated by incumbents just
moving to the next node (which the startup cannot do), then innovation
stagnates. This is largely what killed many FPGA startups (e.g. Tabula,
which btw had super cool tech).

------
fnbr
What's missing here is a discussion of compensation. I've seen at conferences
how proud senior Google employees (e.g. Jeff Dean) are of the TPU work that
their teams are doing, and I suspect that compensation is following
accordingly. I doubt that Intel or similar companies are compensating their
people at the same level. Whenever I see people talking about a talent war,
I'm always suspicious that they're getting outspent and are unwilling to
compete.

~~~
WWLink
At my university, the word on the street was to not waste your time with chip
design or device physics, because most of that work was heavily outsourced and
the employers didn't pay well. I thought that was a real shame, because our
device physics professor was brilliant and his research sounded fascinating.

Sadly, the people I met from a very popular microchip manufacturer (they
don't make CPUs though?) suggested that they had to work long hours and
didn't like their jobs. They were actually... surprisingly transparent about
that. I appreciated it, of course!

~~~
deepnotderp
Also if you _do_ make a device physics breakthrough then people will promptly
ignore you. Doubly so if you're young.

~~~
andars
Last I heard, you had big claims and had yet to test your ideas in physical
hardware. People will probably be less likely to ignore you if you can
demonstrate something that can be manufactured reliably and show that you
aren't just exploiting some simulation artifact.

~~~
deepnotderp
It's relatively hard to fool TCAD. With SPICE simulations, at least, I can
believe there can be scoping artifacts, but even those generally follow a
certain pattern. In any case, I'd be okay with a "this is great if it works
in hardware, but until then idc", but the response is more "you're a young
idiot who shouldn't bother with memory cell innovation". A lot of this is
superstition due to SRAM being finicky. I could go on about this, but let's
take it offline.

------
wsxcde
The unfortunate thing is that the hardware guys do all the hard work but the
software guys make all the money. And that's the real reason why Google pays
way more than Intel.

It's the likes of Intel who figure out how to put a billion little things,
each with a few tens of atoms, on a chip that somehow work together to
produce a useful computing device. And then Apple finds a way to create a
really pretty GUI and walks away with 90% of the profit in the industry.

As a hardware guy, I always feel like this is somehow unfair.

~~~
electricslpnsld
> And then Apple finds a way to create a really pretty GUI

Isn't Apple designing most of its chips in house these days, at least on
mobile, where the big money is? I suppose Apple isn't building its own fabs,
but describing them as 'software guys' seems a bit of a mischaracterization
of the company and of how/why they make oodles of money (hell, their software
has been pretty damn buggy of late anyway).

~~~
MagnumOpus
They do order up a custom SoC package that is then actually manufactured by
TSMC or Samsung - but do you think they would have to cut the price of their
phones if they had to use the standard chips that Samsung, HTC, etc. use?

I'd say no, they could charge nearly the same. Which means the complicated
hardware is not the differentiating "killer app"; the marketing around the
custom OS and the ecosystem is.

(Which is also why they don't sell iOS or macOS to other hardware firms - the
hardware is good, but the software is the USP.)

------
godelmachine
Engineer from India here, let me tell you my story.

In my final year of engineering, I dreamt of being an Integrated Circuit
Design Engineer. While the rest of the folks from my class took up
microcontroller-based projects, my team was the only one to take up an
FPGA-based project, because that was as close as I could get to VLSI.
(Obviously this was at my insistence; I had to fight with my partners over
FPGA vs. microcontroller.)

After engineering, I searched for a job in VLSI in Pune & Bangalore for 6
months, but in vain. Got to know they don't hire freshers. So I did a 6-month
Post Graduate Diploma in VLSI Design from CDAC (Centre for Development of
Advanced Computing - makers of India's supercomputers; it's like a finishing
school). After the course got over, interviews began. While folks who did
their course in .NET/Java got 20-30 interview calls, we got an average of
only 5 calls, and all 5 were very selective in interviews. Tried for a job in
VLSI for another year, and then finally took up a job in IT in Enterprise
Automation.

One year into my IT job, I spoke with a few classmates. Some people were
really doing well, making 70k INR/month, while my lab partner was stuck at
12k/month. I was making 35k/month. My lab partner said, "IT folks make much
more than we do. What was the point of getting into VLSI?"

This was one point. Another point for me is immigration.

I am planning to immigrate to Canada (starting my PR process soon - am 28,
unmarried). My father (a GC holder) has filed for my GC, but my priority date
is 10 years away, so the US is going to be tough.

But even if I had made it in VLSI, how many jobs are there in Canada for chip
designers / RTL design engineers, or maybe FPGA engineers? If I stay in IT
doing my BMC Software stuff, I know I may get a job at some ITSM company.
There aren't many VLSI jobs in every nook and corner of the earth, whereas IT
ensures that jobs are available in plenty everywhere.

Comments / feedback would be highly appreciated.

~~~
calvinbhai
FPGA and ASIC design as a field is very vast. Toolsets are expensive, and
once a company invests in an ecosystem, they generally don't switch. In such
a case, if you showcase your expertise with, say, Altera at a company that
uses Xilinx, it may not come across as impressive to those looking to fill a
new spot.

I think it's better to keep a foot in non-hardware fields if your first
priority is to be sure of getting a job.

To continue in FPGA/ASIC design, I'd suggest you do an MS degree at a US
university in a location that has a lot of chip design companies (I'd guess
Bay Area, Texas), because that's where you have a good chance of getting an
internship, and that's going to be your foot in the door to start a career in
the field of FPGA/ASIC design. Getting an internship is very, very crucial if
you want to get started with a career in hardware engineering, since the
situation is similar in the US too, where you need to hit the ground running
from day 1. With an internship/co-op it becomes much easier, since they give
you some slack because you are a student.

Whatever you do, don't get stuck in the cesspool of IT services jobs in India
just because your friends in those fields have magnificent titles and pay. If
you get to work on a product at a product company, great. But otherwise, for
someone like you, IMO a graduate degree in the US is the best bet.

(I'm from B'luru, did a grad course in FPGA design in the US, and later
switched to s/w dev because I was not awesome at h/w.)

~~~
godelmachine
Many thanks for your suggestions. I went through your reply a couple of
times. Would you kindly give me pointers on how to get scholarships for an MS
in the US, and how to successfully get an RA or TA position? Asking since you
have already finished yours.

As I've previously mentioned, I am in IT now working in Enterprise Software.

~~~
calvinbhai
I went to a good university with an OK EE department in a very bad location,
and didn't look for an assistantship.

But based on what I have seen, look for universities / departments that are
well funded for research (I don’t know how you can get that info), which often
means more assistantship opportunities.

Getting an RA/TA can also be competitive but since you have some work
experience, it depends on how well you market your experience to fit the RA/TA
role.

~~~
godelmachine
Thanks for the info. Much grateful _/\\_

------
deepnotderp
As a "young" person in the semiconductor industry, I can wholeheartedly
guarantee that no young person will ever want to go into this field (Nvidia
is the one exception, but only for DL software).

The field is essentially "no matter what the data says, the person with more
experience is right", looks down upon young talent, essentially refuses to
innovate and has too many clueless people making decisions.

Don't believe me? Ask your ex-Intel friend.

EDIT: this turned into a rant. Naturally, this is one view; there's another
perspective as well. E.g. ASML is extraordinarily innovative, and Intel's TMG
was an absolute engine.

~~~
twtw
The hubris is strong with this one.

> Looks down on young talent

Are you certain this "young talent" is looked down upon for youth, not in
response to their criticism of people who have actually consistently made
chips that work and work reliably?

fyi, people are more likely to listen if you don't call them clueless.

~~~
deepnotderp
> Are you certain this "young talent" is looked down upon for youth, not in
> response to their criticism of people who have actually consistently made
> chips that work and work reliably?

You could use the same line of reasoning for science in general. Was it
stupid to criticize the opinions of the scientists of classical mechanics?

Also: Meltdown.

~~~
twtw
A couple of things:

Science is not the same as engineering. Engineering requires a balance between
performance, reliability, cost, etc. I don't think the parallel to paradigm
shifts in science is a very good one.

There is a lot of wisdom behind listening to people with a proven track record
over cocky youth. The people with experience are typically a lot better at
striking an effective balance, even if the data suggests that some great new
innovation will benefit one characteristic.

Of course it was not stupid for scientists to move beyond old theories. But
the scientists with new theories almost always learned from and worked with
the previous generation and then moved beyond them, rather than criticizing
the character and competence of the old guard.

My main point was simply that if you adopt the same tone with the people who
"look down" on you as you did in your original comment, I am not surprised
that you don't get any respect.

~~~
deepnotderp
My original comment was rather ranty, I've calmed down since then :)

I think the comparison with science is fair; engineering in this sense is
very close to physics. If you don't like that example, we could try the shift
from vacuum tubes to integrated circuits, or the disrespect thrown at neural
nets before they got popular again. All analogies are crude, but they convey
some meaning.

I mean, I've shown people hard data (not even from simulations, but from
hardware) and they've still essentially ignored it, so I think it's fair to
say there's at least _some_ dysfunction in the industry, as epitomized by the
very article I'm responding to here.

I'm not usually like this I promise, it's just that this article struck a bit
of a chord with me :)

------
georgeburdell
Maybe if they paid competitively they would be able to attract new grads.
Many of my Berkeley and Stanford EE acquaintances are joining Apple,
Facebook, and Google as hardware engineers and getting 6-figure initial RSU
grants and $140k starting salaries (this is at the PhD level). Even Intel,
which has among the best pay in the traditional semi industry, pays about
20-30 percent less in total comp, so they are only attracting either the
people still intensely interested in their specialty (and willing to forgo
pay) or those who couldn't pass an interview at the aforementioned other
companies.

~~~
jarjoura
My friend who works in Apple Hardware had a competing offer from Nvidia. Both
offered the same pay and an expectation that he would work nights and
weekends, but apparently a follow-up team lunch at Nvidia didn't go well, and
he took the offer at Apple.

I don't know what the dollar amount was - I didn't ask - but I think it's
fair to say that Nvidia is at least interested in paying whatever the market
rate for EEs is.

------
kev009
I believe the fundamental issue in all of the technology industry is finding
the people who know how to take money and turn it into new technology that
begets more money. There is tremendous room in semiconductors to do so. In
terms of selecting winners, it's no harder than software, but software cycle
times (and profit taking) are much shorter, so it's faster to wash out the
wins and losses. That also makes it easier to appear to know what you are
doing (VCs) when you don't. A semiconductor upstart, whether a new entity or
within an existing company, requires a minimum of several years of investment
before you can even gauge long-term success. People (VCs, governance boards,
executives, GMs, etc.) are no better at picking winners in software than in
hardware, but you spin the wheel more often, and therefore it's more
lucrative.

------
curiouscat321
The issue of compensation extends far beyond just chipmakers. The large tech
companies (FB/GOOG et al.) pay top compensation for most technical fields and
have the ability to suck up talent worldwide.

Chipmakers, automotive, media... the question is how long incumbent companies
are going to continue poor compensation practices before they figure out how
far behind they are.

------
kikoreis
If Meltdown and Spectre do end up causing an industry-wide redesign of CPU
internals and the issuing of new generations of chips, I expect competition
for silicon design, verification, and enablement expertise to become red hot.

------
chiyc
I'm not surprised to hear that traditional chipmakers are having trouble with
their talent pool. I just recently left semiconductors to get into software.

Getting into software is much more accessible, especially with the abundance
of learning material online. It's also easier for people to imagine how
software can make a difference in people's lives, which is a common
motivator. We're more aware of how we use software on a daily basis. And it's
often software that goes viral in a consumer market, not hardware - Apple and
phones aside. Silicon fabrication and chip design sit at a level of
abstraction that's harder to grasp.

How much accessible information is out there on hardware design? And it isn't
easy to start fabricating your own chips, either.

> “A big part of the challenge is in awareness — we need to help them see the
> path.”

> That shouldn’t be difficult. “We make air travel safer, healthcare more
> predictable, we have an easy story to tell,” said Durn.

I agree that there needs to be more awareness. It's not, however, as easy as
you might think to sell a student on this story. Sure, it might sound great
at first. It sounds convincing. Almost everything is based on the hardware
these companies produce, right?

Even if you convince them and get these new engineers in, you have to figure
out how to keep them.

The scale of engineering that needs to happen at these companies is massive.
It takes many moving parts. Many of these companies start by allocating new
undergraduate engineers to grunt work that's required at any company of this
size.

> As our industry matured, operational efficiencies and a whole set of
> different DNA took hold

Good luck convincing your new engineers that being inserted into your company
as a cog and dealing with your business procedures for “operational
efficiencies” is making the world a better place.

> About 85% of chip vendors need new kinds of talent to keep pace with the
> rise of digital operations powered by automated systems, big data, and
> machine learning. However, 77% of them report a shortage of talent,
> particularly for EEs, according to studies by Deloitte Consulting done for
> the SEMI trade group.

Sure, there's going to be a shortage of talent when you often demand graduate
degrees at a minimum - even more so when you don't pay as much as the
software giants do.

There is a good point in here though. These companies do need these new kinds
of talent. And to be clear, it’s not the talent that needs to keep up with the
new age, it’s these companies that need to keep pace. They need new processes
and reorganization. And by reorganization, I mean something beyond just
shuffling around managers and fattening up middle management.

You have to somehow replace the processes that worked just ten years ago but
are slowing you down now. IT and your software departments can’t be seen as a
cost center anymore. Don’t hire bright young engineers to do things you should
be automating. Hire people to automate those things instead.

~~~
stevenwoo
I graduated in 1989 with an Electrical Engineering degree and three semesters
of co-oping at AMD on a project for one of their Intel drop-in replacement
processors. My ancient experience coincides with your more recent experience.
It taught me that (a) the really interesting stuff, to me personally, was
going to require a Ph.D. to get in the door: they had extremely bright people
working on lower-level projects, but everyone I talked to on the top projects
had advanced degrees; (b) each chip iteration was a huge process involving
lots of different people with different expertise at each stage, so each
individual contribution was going to be tiny; and (c) the software we used to
make/design chips was a lot more fun to work on and iterate, and didn't
require nearly as big a capital investment/gamble, even back then.

------
deepaksurti
Based on all the responses in this thread, I am now hardly surprised that my
Ask HN about switching to hardware received 0 responses.
([https://news.ycombinator.com/item?id=16054742](https://news.ycombinator.com/item?id=16054742))

Maybe @madengr could answer it, as I am of the exact same opinion: hardware
is way more fun than software. And even in software I am not doing the usual
boring things; still, I've had it. Any feedback highly appreciated.

------
peterburkimsher
"Today, kids dream about Google, Facebook, and Apple; they don’t dream about
us... his father brought home an early chip made at Western Electric as the
brains for a U.S. missile."

Right there is my reason for not working for them. I don't want to be involved
in military technology.

I'm an immigrant, and building weapons to kill foreigners is a conflict of
interest.

~~~
sgift
Good luck working on anything in software or hardware that isn't somehow
involved with military technology.

