Applied Materials Is a Major Reason Computers Keep Getting Faster (wsj.com)
58 points by Bostonian on Sept 24, 2023 | hide | past | favorite | 26 comments



I doubt there is any man-made object more complex than a state-of-the-art semiconductor. The amount of science and engineering that has gone into the evolution of its fabrication should not be taken for granted. "Exotic" is an understatement. Just for a sliver of insight, take a look at the ASML lithography tool and how it generates the light that exposes the nanoscale circuits on the chip:

https://www.youtube.com/watch?v=Jv40Viz-KTc

And that was filmed years ago; today, TSMC is using ASML machines at 7 nm and below.

Puff piece or not, the equipment companies have delivered technical wonders under extreme duress and endured brutal business cycles for many decades while still managing to enable these feats of science and engineering.


I'm a bit confused by this.

The video was posted to YouTube in 2021. It looks like a video shot in 2021. It has a copyright notice at the end that states 2021. Yet it's talking about a then-current 193nm wavelength lithography, which according to this post from ASML's newsroom was the then-current process for producing 100nm chips at the end of 2000:

https://www.asml.com/en/news/press-releases/2000/asml-announ...

According to the reporter's comments at the start of the video, this would have been a current process when he started at Intel 21 years ago, so why is he talking about it as a current process?

I'm sure I've missed something simple in the video, but I don't know what it is.


I haven't watched the video yet, but most things don't need state of the art process—that's pretty much just for the main processors of phones and computers. There are fabs still running much older processes for everything else that just doesn't need the smallest possible transistors—toasters, industrial equipment, etc.

Edit: watching the video though, it's about an EUV machine, which is definitely recent tech. The 193 nm figure is actually referring to its predecessors. It generates 13.5 nm light.


193 nm is the wavelength of the light used. The single-digit nm figure you're probably familiar with is the size of the features on the chip.


Yeah, I'm assuming that's what it is, given the reference to 13.5 nm light later. Thanks for clarifying.


Isn’t the transistor size on the “3 nm” node around 30 nm?


Wow I thought all the LHC hardware looked impressive. That ASML EUV machine is something else!


That's why semiconductor engineering is the current "rocket science".


https://archive.ph/rjaAe

Original title and subtitle (which don't fit here): "The Most Important Tech Company You’ve Never Heard of Is a Major Reason Computers Keep Getting Faster. Few people outside of semiconductor manufacturing have ever heard of Applied Materials and its competitors—but what they do is more essential than ever to maintaining the global pace of technological progress."


When I finished my PhD in materials science, I interviewed at Applied and got an offer for $120k/yr in 2019.

No thanks.

Throughout my time in grad school, I saw leaders in the semiconductor industry talk to professors and complain about how they can no longer get the best talent and that everyone is going to software.

Well, have you looked at what you're paying and what you're asking for in return? And then they complain that nobody wants to work anymore.

So yeah, when I see these BS PR pieces about how important these companies are, I have to ask: how come they're not valued higher and don't pay better salaries? The talk and the reality don't match.


As I see it, it's not so much the fault of research institutions and hardware companies that salaries are not competitive with software.

It's that the software industry has too much money. The economies of scale of the software industry allow excess central-bank-printed money to flow predominantly toward software companies and VC-backed mega-startups. You can be speculative in software startups with the standard VC formula: pump a couple billion into a company, get a couple hundred million users (just scale with AWS, no problem!), and then dump shares on a stock market full of investors holding free stimulus cash and looking for an alternative to their (formerly) <1% savings interest rate.

Hardware companies just can't scale that way. (How's the Arizona plant going, TSMC?)

The salaries in software are abhorrently inflated (and I'm speaking as a beneficiary of this phenomenon). It's "unfortunate" that most people with technical degrees have a relatively easy option of switching careers to software. If we take that out of the equation, $120k/year doesn't sound that bad, TBH.


> The economies of scale of the software industry allows excess central bank printed money to flow predominantly towards software companies and VC backed mega startups.

I don't know why people keep saying this. The central bank doesn't do this. It doesn't hand over money to investors. I don't like saying it but this sounds too much like a conspiracy theory to me. I mean, the pathway for why exactly this is supposed to happen is not explained at all.

In reality, commercial banks create the money, and they don't give a damn about small or medium-sized companies. The central bank is just there to make sure the money system doesn't collapse in either direction. That means they do tend to support commercial banks with excess reserves, but those banks make their decisions based on the expectation of getting the money back, which is in stark contrast to the free-money rhetoric. If anyone is doing this, it's not the central banks.

Have you tried working with hardware, by the way? Anything hardware-related has huge capital intensity, which means more of the money goes into paying interest or capital returns to investors instead of to employees. If the earning potential of software and hardware developers is the same, but each hardware job requires $400k in capital, then the usual 8% return expectation alone would cut the hardware developer's salary by $32k. Meanwhile, the software guy needs a laptop and monitors on a desk in an office. None of this has anything to do with central banks. If anything, cheap central bank money would disproportionately benefit the hardware guy.
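The back-of-envelope math above can be sketched in a few lines (the $400k capital requirement and 8% return expectation are the comment's own illustrative figures, not industry data):

```python
# Back-of-envelope: capital costs as a drag on hardware salaries.
# Illustrative figures from the comment above, not real industry data.
capital_per_hardware_job = 400_000  # capital tied up per hardware role
expected_return = 0.08              # annual return investors expect

# The annual cost of servicing that capital effectively comes out of
# what the employer can pay the engineer.
salary_haircut = capital_per_hardware_job * expected_return
print(f"Implied salary reduction: ${salary_haircut:,.0f}/yr")  # $32,000/yr
```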


> I don't know why people keep saying this.

Does it occur to you that maybe people keep saying this because that's the truth?

I think it suffices to say that at least for the COVID round in the US, the "money printing" was in the form of the Fed buying US treasuries and the US government paying out trillions of dollars to normal people in the form of COVID stimulus checks. Yes, a significant part of it directly went to normal people, and it wasn't in the form of commercial banks lending money out.

You're advised to look at the Fed balance sheet, covid stimulus checks, the market cap of big tech, and the flow of speculative money (VC capital [or startup valuations], BTC, GME, etc.), especially with respect to the dates and timings.

If you think you learned everything you need to know about economics from some 10+ year old textbook, let me give you an update: in the US there's no "fractional reserve banking" system any more, unless the "fraction" is zero. Basically it's QE all the way now. ( https://www.federalreserve.gov/monetarypolicy/reservereq.htm )


Software salaries have been higher than hardware salaries since well before covid. So that part of your theory is bunk.

Had it occurred to you some people repeat stuff because they just want other people to believe the things they think? That's kind of how discussion works. It isn't a measure of truth to see the same line repeated at all. That's a fallacy.


With the COVID part, I was replying to the GGP's claim that the central bank doesn't print money.

Generally speaking, QE has been going on since 2008.

> Had it occurred to you some people repeat stuff because they just want other people to believe the things they think?

Yes, that's a possibility, not a certainty. Why did you think that hadn't occurred to me? I'm not the one saying "I don't know why people keep saying this."

I specifically offered an explanation that involves the GGP possibly being wrong: "Does it occur to you that maybe people keep saying this because that's the truth?" Note the "maybe". It's not rhetoric. I meant what I wrote.

Of course there are other explanations, but I'm pretty sure their view of what central banks do is wrong, or at least out of date. And I explained why and gave pointers. I still wrote the "maybe".

Anyway, given your definite claim that people repeat stuff just because they want other people to think the same, I guess discussion with you becomes meaningless at this point.

Finally, I'll just note that quoting my counterexample about what central banks do as evidence that I got my original claims wrong isn't really an argument either way.

Have a nice day.


> The economies of scale of the software industry allows excess central bank printed money to flow predominantly towards software companies and VC backed mega startups

This is not why.

It's because of profit margins.

The income per employee at Applied Materials is ~$190-200K in FY24, but was traditionally around $60-100K in the 2010s. [0]

The income per employee at Google is ~$390-428K in FY24. In the 2010s it was around $200-350K. [1]

Both AMAT and GOOG are blue chips in their categories, yet a software driven company like GOOG has more money to spread to employees simply because the margins are much higher than in Hardware companies.

A company like AMAT needs to invest massively in lab space, commodities, manufacturing space, real estate, and machinery, which takes a massive dent out of its margins. Most spending in tech, on the other hand, can go directly to employees and marketing; the only major cost outlay is compute, which itself is increasingly offloaded to cloud providers at companies of this size, because F1000s are strategic enough accounts to get sweetheart deals with competitive pricing (cheaper than running their own DCs).
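A quick sanity check on the size of that gap, using the midpoints of the FY24 income-per-employee ranges quoted above (figures taken at face value from the csimarket links below):

```python
# Midpoints of the FY24 income-per-employee ranges quoted above.
amat = (190_000 + 200_000) / 2  # Applied Materials: ~$190-200K
goog = (390_000 + 428_000) / 2  # Google: ~$390-428K

ratio = goog / amat
print(f"GOOG earns ~{ratio:.1f}x more per employee than AMAT")
```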

The only hardware companies that can succeed in this kind of market tend to be Taiwanese, Korean, and Japanese, because the conversion rates are EXTREMELY competitive. A company like TSMC, TEL, Canon, etc. can earn in dollars but pay Asian-market-rate salaries in NTD/Won/Yen. With salaries at those companies being around $40-70K in Taiwan/SK/Japan, you can almost double your net income simply by having lower employee salaries. On top of that, all those Japanese, Korean, and Taiwanese companies got government support and backing, which wasn't in vogue in the US until recently. Btw, the low salaries in TW/SK/JP were a major reason engineers from there defected to PRC companies, who gladly paid US-market-rate salaries with 0% income tax, gave out mansions, company cars, etc.

Also, VC doesn't invest in public markets - that's an investment banking function. VC prioritizes software and pharmaceuticals because of the profit-margins issue above. Now that the CHIPS Act has passed, over the next few years we'll see better health for the hardware industry within the US, because it now has support comparable to what the pharma industry (a similarly expensive industry from a cost-outlay perspective) gets.

[0] - https://csimarket.com/stocks/AMAT-Income-per-Employee.html

[1] - https://csimarket.com/stocks/GOOG-Income-per-Employee.html


Yes, I totally agree with basically everything you said, except the VC point (though I don't think I implied that VC invests in public markets).

Thanks for elaborating on my point.


$120k? I've had $60-80k offers. Friends with PhDs and several postdocs accepted <$80k. I accepted the only offer I had, at $65k, and quit several months into the job. Understaffed, underfunded, engineers covering for operators who quit on the spot. Noped out and never looked back. Such a shame - the science is awesome, but the career outlook is just bleak.


This is the sad truth about the industry. The EE/CE department at my university was world renowned. Groundbreaking research being done. Professors who defined industry trends. Everyone I knew in the program was a genius and truly wanted to move the world forward.

Upon graduation the smartest in the class got $70K offers from Intel. If you had a PhD then maybe $80K. Today they are all working at FAANG or enterprise SaaS companies.


Anecdotally, when I worked at Samsung Austin Semiconductor (2017-2019), the top PhD engineers were paid lavishly. I don't know exact numbers, but I was told well north of $200K, plus hefty bonuses. Since mid-level engineers without a PhD could comfortably be in the mid-100s, that sounds about right to me. My guess is they wouldn't offer the $200K+ range to start, though.

I definitely don't blame anyone for not wanting to go there. Hell, that's where I pivoted into tech - I was a Shift Supervisor, realized I had no appetite for management and didn't want to be a technician, so I got an M.S. in SWE from UT Austin and then jumped ship. Instant salary boost, even though I had to start over on the career rung. Software does in fact pay massively well.


Is it their fault that VC goes to SaaS and pointless startups as fast as flies go to shit?

$120k/yr is a very good amount of money. The problem is the inflation of the tech sector because of the second dot-com boom we are experiencing.


Do you think it's the difference between hardware and software?

The competition for hardware is fierce, so margins and profits are low.


Moore's Law is all but dead. Will the 3D stacking that Applied Materials specializes in face the same obstacle?

Are they the leader of the field? I guess it isn’t clear to me why the article talks about Applied Materials.


Do they pay well?


Not at all


This is PR.



