Hacker News
Jim Keller to Depart Intel (intel.com)
192 points by periya 25 days ago | 107 comments



Jim’s talk with Lex Fridman was quite amazing. Highly recommend listening:

https://youtu.be/Nb2tebYAaOA


> If you constantly unpacked everything for deeper understanding, you're never going to get anything done. If you don't unpack understanding when you need to, you'll do the wrong thing.

I really liked that quote. It's a great talk indeed.


It echoes a Confucius quote I like: “Learning without thinking is useless. Thinking without learning is dangerous.” (Analects 2:15)


"Explore Vs Exploit Dilemma"


I wish that the interviewer had more background in computer architecture. A lot of these questions wouldn't be asked if he had taken an architecture class in school.

Keller says a lot of interesting things in this interview that aren't followed up on. He calls for more substantial architectural changes; I wonder what he thinks of spatial architectures.


I think the interviewer 'acts' dumb, because then the resulting video requires less pre-knowledge to understand.

The video goes from something that 100,000 people with a computer architecture background might watch to something that 10,000,000 people with a tech background might watch.


That's fair, I think you're right.

Selfishly, though, as someone who doesn't work at Intel/nvidia/AMD but is interested in architecture and digital design, it's frustrating how hard it is to find candid opinions from industry experts in comparison to software. Computer architecture just isn't the same in terms of attitude.

One of my favorite things about the computer architecture courses I took was to sit after class and ask my professors about something I'd read about, and more often than not they'd tell me that what I read was bullshit and nobody seriously considered it in the industry, or they'd tell me about something they were excited about that I hadn't heard of. It's hard to get a handle on where the industry is heading and what important people see as the next steps to iterate on.

A general conversation is probably more useful to most people, but there are a lot of times in this conversation where you get the feeling that Keller is right on the precipice of giving an insider prediction of the future, and the topic shifts instead. It's frustrating because those types of conversations are just straight-up difficult to find elsewhere.


Have you listened to https://oxide.computer/podcast/ ?

It has more depth than Fridman's podcast. I listen to it mainly for the computing history, because they interview people about their careers. I hope they do another season soon.


This podcast has a similar problem. Right when you think the guest is onto something, Cantrill rants about a tangential thing and the topic changes afterwards.

I normally enjoy Cantrill's rants, and I watch his talks on YouTube solely to listen to him rant, but it gets annoying when he's interviewing a guest.


Sorry about that -- still learning how to effectively interview! (If there are specific examples you can cite, that would actually be helpful; I am trying to get better at this.)


Hi, thanks for hearing me out!

Unfortunately I don't have specific examples in mind, since it has been a few months since I listened to the podcast, but I remember the episode with Jonathan Blow being a prominent one, because I was new to him, his rants, and game development in general. It felt like every time he was onto something interesting, you were ranting about a related issue, and afterwards he would proceed to his next point without concluding the previous one. At least it felt that way to me.

I hope that didn't sound like I'm complaining. I'm a big fan of you guys and it's a great podcast. I could listen to 100 episodes :)


Yeah, the Jonathan Blow episode seems to be the one where that criticism comes up. We generally haven't edited them at all (one take, no cuts), and that one was just too long to realistically edit -- but we probably should have edited some of me out. The perils of a three hour conversation!


Yup.

He asked plasma physicist Alexander Fridman (his father) to clarify what plasma is.

I love it.

Hearing Knuth's, Penrose's, and Fridman's own definitions of concepts I thought I knew is illuminating. It's like letting us normies get a peek into their genius.


Or at least listened a bit more, rather than "agree to disagree"...

That said, I haven't listened to him much, and he just rubbed me the wrong way here.


I love all of Lex's podcasts. They've been truly mind-expanding for me personally.


Lex is a bit quirky, but I like that.


Amazing how he didn't know how to read until age 8, and has read two books a week ever since (about 50 years).


This baffles me.

How do you trade off between learning and “doing”, i.e. writing code or putting what you have learnt into practice?

I can’t finish a technical book without getting the urge to implement


I just watched this entire thing. I'd never heard of Lex before.

It's pretty funny for the interviewer to say "agree to disagree" or "well, no" when he's clearly not the expert of the two.


amazing podcast episode for sure, I wonder where Keller will go next. It'd be insane if he went to AMD or Apple in their new ARM push.


Bloomberg has the following commentary:

“Keller’s departure is a big deal and suggests that whatever he was implementing at Intel was not working or the old Intel guard did not want to implement it,” Hans Mosesmann, an analyst at Rosenblatt Securities, wrote in a note to investors. “The net of this situation for us is that Intel’s processor and process node roadmaps are going to be more in flux or broken than even we had expected.”


Jim Keller giving a talk on Moore's Law at Berkeley,

https://www.youtube.com/watch?v=oIG9ztQw2Gc

And another

https://www.youtube.com/watch?v=8eT1jaHmlx8

And another

https://www.youtube.com/watch?v=Qnl7--MvNAM

DEC, PA Semi, Apple, AMD, Tesla, Intel. He's been everywhere.

https://en.wikipedia.org/wiki/Jim_Keller_(engineer)

The internal announcement memo:

https://wccftech.com/exclusive-intel-internal-memo-jim-kelle...


I like the moment when they're introducing him and he's praised for the “Zen Architecture”; you can see Jim in the background shaking his head - a sign of humility, acknowledging that he wasn't alone and that an amazing engineering team made it happen under his leadership. This kind of humility is rare.


Intel has some absolutely killer politics, especially at the higher levels: entire organisations will plan roadmaps deliberately contingent on deliverables they know other teams are going to miss. I wouldn't be surprised if this was the result of Keller just having had enough of it and giving up. The fact that the press release involves an already resolved organisational structure that looks highly political is a big red flag.


It's so funny to hear the interviewer explaining to Jim what he understands, all of this after he couldn't follow the answer Jim gave earlier in the interview when explaining branch prediction. He's also in no position to push back on ideas; it's not as if I really care what the interviewer thinks.

It's such a pity the interviewer didn't prepare more technical questions that touch on new architectures, compilers, caches, TPUs, and his design experience. I only care about him asking good questions.

Jim, for his part, seems incredibly engaged and patient with the questions, sometimes moving the game to a much higher level than the one at which the question was posed, without breaking a sweat.


The interviewer is a domain expert in AI, which maybe explains what he pushed back on and what he didn't delve deeper into.


If Jim is going back to Apple or AMD then the Intel struggle won’t stop anytime soon. As much as I like Apple, I hope his choice will be AMD.


Every time I criticise Intel, or rather point out facts, supporters will use Jim Keller as the excuse, as if he were a silver bullet.

Intel's struggle has nothing to do with its processors' design. Sunny Cove and Willow Cove (aka Ice Lake and Tiger Lake) were close to design complete before Jim Keller joined. Intel's problem is with their manufacturing, both technical and economic. And Jim Keller is not a fab guy; there was nothing he could do to fix that.


The great man theory of chip design. In a few months we'll get a ton of the "Xe sucks because Raja's GPUs are always too hot" theory.


He lovingly lays each atom and then kisses each box gently before it heads out the door.


Process didn’t cause all the security holes implemented as performance hacks; architecture did. AMD caught up through architecture; process was just icing on the cake.


No one thought of speculative execution and hyperthreading as performance hacks until the last few years. They were brilliant techniques. They still are; they were just found to have a cost. They are still used now, but more carefully.

Intel Management Engine and SGX on the other hand, are basically user-hostile parts of the hardware, with some bugs mixed in.


> No one thought of speculative execution and hyperthreading as performance hacks until the last few years.

"Everybody" who has some knowledge in security (espcially with respect to side channels) knew from beginning that these CPU features were a ticking time bomb in terms of potential side channels.

What was unclear was how this potential threat (played down at the time by CPU vendors) could be used to create real attacks.

Going from potential threat (that "everybody" knew about) to real attack is the central achievement of the authors of the Spectre and Meltdown attacks (and their successors).
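
For readers who want a concrete picture of what a cache side channel looks like, here is a minimal sketch (my own illustration, not code from the Spectre or Meltdown papers) of the timing primitive that flush+reload style attacks build on. The probe_cached() helper and the 120-cycle threshold are assumptions for illustration; a real probe has to be calibrated per machine.

    /* Illustrative only: times a single memory access to tell a cache hit
       from a DRAM access. x86 with GCC/Clang intrinsics assumed. */
    #include <stdint.h>
    #include <x86intrin.h>   /* _mm_clflush, __rdtscp */

    static uint64_t time_access(volatile uint8_t *addr) {
        unsigned aux;
        uint64_t start = __rdtscp(&aux);
        (void)*addr;                     /* load the target cache line */
        uint64_t end = __rdtscp(&aux);
        return end - start;
    }

    /* Returns 1 if addr appears to be cached (fast access), 0 if it had to
       come from DRAM. The threshold is a placeholder, not a universal value. */
    static int probe_cached(volatile uint8_t *addr, uint64_t threshold) {
        return time_access(addr) < threshold;
    }

    /* Flush+reload pattern:
       _mm_clflush((void *)addr);        // evict the line
       ...let victim or speculative code run...
       if (probe_cached(addr, 120)) ...  // line is back: something touched it */

The achievement of the Spectre-class attacks was getting speculatively executed code to touch attacker-chosen cache lines, so that a probe like this can read secrets back out.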


That’s not the impression that I got. Research on side channels was basically limited to timing side channels in cryptography. Everything else was not seen as practically exploitable.


> Everything else was not seen as practically exploitable.

For a concrete blog post, see https://cyber.wtf/2017/07/28/negative-result-reading-kernel-...

Note that according to https://en.wikipedia.org/w/index.php?title=Spectre_(security... Spectre was published January 2018, i.e. this blog post is indeed older.

As I wrote: It was unclear how this potential threat could be used to create real attacks.


ME is exactly what Intel's data center customers want. It's only present in consumer CPUs because it doesn't scale to have separate ME and non-ME SKUs.


Intel's problem is that they fired expensive, older American workers and replaced them with H1B contractors.

There's a video on AdoredTV about the topic: https://www.youtube.com/watch?v=agxSclh27uo


But, but it's good for the stock price!


[flagged]


As someone who used to hold an H-1B visa I can confirm that they are one of the dominant factors that brought Silicon Valley to where it is today. Workers on H-1B visas are fearful and obedient. Despotic managers who love to cut corners without being held to account thrive on such a workforce.


It might be your personal anecdote. I know many H1B employees who are well paid and not fearful or obedient.

Different cultures have different ways of being courteous and showing respect to others (e.g., the elderly). You must've had some misunderstandings based on your own culture. Silicon Valley has been a dominant force. Many H1B employees contributed to the valley's success.


It’s not about race—it’s about a shitty corporate culture. Within Intel there are full-time employees (blue badges) and contractors (green badges). Intel has shifted towards hiring more green badges while paying them less, offering them fewer benefits, and in general treating them worse in comparison to the full-time blue badges. And yes, many green badges are working for Intel under H1B visas. Because of the poor working conditions, there isn’t a great incentive for green badges to be doing their best work. Meanwhile, blue badges become proportionally fewer in number within the company.


I agree it's a shitty culture that color-codes employee badges and subjects a set of people to humiliation. I've worked as both an employee and a contractor; we all will at some point in our careers. You can't judge someone's work ethic or competence by the color of the badge they're wearing. H1B employees do compete for IT jobs and win them over Americans, but that's because of interview performance and skills. There's enough data out there for anyone who wants to find out how H1B employees are paid.


Often the problem isn’t nation of origin; it’s the nature of the training and knowledge transfer from one generation to the next.

My father used to work at Hughes in some areas that by US law required American citizens; he regularly told stories of someone in a department retiring before replacements had been hired for knowledge transfer, causing lost time and lost knowledge.


The mood of the country is racist propaganda

I mean a lot of things are obvious, why pretend they are not


Police killing black men is no longer racist propaganda. It's a reality of life for black people.


25+ years ago in Redmond, most of the H1B holders were Europeans.

The criticisms haven't changed. Just the available sources of cheapest labor.


If you had worked at Microsoft, you would know that H1B employees never came cheap. If you were dreaming of reaching Wall Street salaries in Redmond, then that's a mismatch in expectations. Microsoft was always making software for the whole world. If America didn't have the H1B program, they would've simply expanded the dev centers in Asia and Europe rather than pay Wall Street salaries in Redmond.


How does this relate to your statement that criticism of H1B is racist propaganda?


If this statement didn't have racist intent, how would you know?


Why do you say their problem is based in manufacturing?


They have been working on the 10nm manufacturing process for ages and it's still not completely finished. That gave AMD a huge opportunity to catch up (which it did).


Nothing stops Intel manufacturing their designs with third parties. Yet they don't.

Either it isn't worth it (too much coupling between fab and designs - not the right abstractions), or institutional inertia stops them doing it.


IIRC most of Intel's expenses are related to the fabs and semiconductor R&D. The chip designs (the computer architecture part that gets computer people excited) are almost a rounding error on top of that. From that perspective Intel designs chips so that they can sell the silicon real estate they manufacture, not the other way around (that they have fabs so they can realize their chip design ambitions).

So if they would fab their chips somewhere else, they would be sitting on a huge expensive asset producing nothing. If they couldn't find a productive use for their fabs it would likely mean the end of the company. And if they can't produce their own chips in their own fabs, why would anyone else want to use Intel fabs?

Further, the vertical integration of fab process and chip design is something Intel regards as a competitive advantage. For a long time this was very much true, but it seems the hard work by TSMC and others has made it possible to make top-end chip designs on a merchant foundry process nowadays.

Intel at some point tried to play the merchant foundry game, but it seems it wasn't successful and they shut it down. Which perhaps isn't that surprising, considering TSMC, and to a lesser extent GlobalFoundries, have been at that game for decades and they're good at it.

So all in all, I don't think fabbing their chips at some third party is a viable approach for Intel. Either they fix their process or they go under. "Go under" not necessarily meaning bankruptcy, it could also mean a massively, hugely downsized company doing chip designs to be fabbed at some third party. I think they're still far away from such a drastic step.


> Nothing stops Intel manufacturing their designs with third parties. Yet they don't.

It wouldn't surprise me if the reason Keller left is because he wanted to outsource manufacturing to a competitive third party fab, and got denied.

Keller is a computer architect, but architecture doesn't matter if you can't physically build the thing.


I'm not a hardware engineer/CPU designer/electrical engineer, but my understanding is that designing high-end CPUs requires engineering and designing towards the specific manufacturing process of the fabs you're using. Even if Intel did decide to, I don't think they could just send their designs to TSMC/GlobalFoundries or what have you.


Not my expertise either, but it seems the idea that vertical integration between fabbing and chip design confers large advantages was the "common wisdom" in the industry for a long time. Lately, though, AMD/NVIDIA/TSMC and others have demonstrated that fabless chip companies plus merchant foundries is a model capable of producing the highest-end chips as well.

And yes, if for some reason Intel would want to fab their chips at some merchant foundry (see my sibling answer to yours why I think that's unlikely, but just for the sake of argument), I'm quite sure they couldn't just email the RTL's to the foundry and get chips back. It would take a lot of work to adapt the chips to the merchant foundry's process.


They can't; they would need to redesign the whole thing with new design rules, masks, and testing. Not to mention using tools that Intel (at least their CPU teams) is likely extremely unfamiliar with.

It is more of a cost/economic issue, for which jabl above provides a decent explanation.


Ice Lake outperforms Zen 2 in IPC, but it can't run at higher clock speeds (or efficiently), and Intel can't ship many chips. Much of that is due to 10nm manufacturing.
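
To make the tradeoff concrete: single-thread performance scales roughly with IPC times sustained clock, so an IPC lead can be eaten entirely by a clock deficit. The factors below are made-up placeholders to show the arithmetic, not measured figures for Ice Lake or Zen 2.

    /* Illustrative arithmetic only; 1.18 and 0.85 are hypothetical
       IPC and clock ratios, not benchmark results. */
    double relative_perf = 1.18 * 0.85;   /* ~1.00, i.e. roughly a wash */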


There was a joke in the reddit thread, but I seriously think it'd be fantastic if he went to work for VIA to kickstart competitive processors. Even with a strong AMD, we would still benefit from more competition in x86.


I'm guessing a project he was working on either wrapped up or got canned. He had previously helped AMD with their Zen architecture.


My guess is he is going back to Apple.


He's been at Apple before. I doubt he'll go back to a place he's been before unless he thinks the landscape has changed enough that he can have a really big impact again. I don't think that's the case at Apple.

No - I think he'll go to Nvidia, because he hasn't been there before, hasn't really worked on GPUs before, and because I think GPUs could gain a lot of performance by having someone look at optimizing the big picture, adding the right abstraction layers, and generally making a GPU more CPU-like to allow more execution to be moved to it more easily.


> Doubt he'll go back to a place he's been before unless he thinks the landscape has changed enough he can have a really big impact again

He did two separate stints at AMD, and if the rumors of Apple switching the Mac to the A-series are true (which they probably are) then there will be some very interesting problems to solve over the next few years.


Also the timing w/ WWDC coming up and rumours of an Intel → ARM switch.


If Apple switches Macs to ARM this or next year as rumored, all the interesting problems w.r.t. chip design must have been solved at this point.


You could make this same argument for any sufficiently mature architecture, but the reality is that there's always something new around the corner. I can't imagine the processor in an eventual ARM-based Mac Pro will look much like the one in the first ARM-based MacBook, you know?


Lol my guess is as good as yours and everyone else’s.


Wonder if he'd be interested in doing stuff with Mellanox's tech?


I think he may go to Google? or Facebook?


My guesses are either 1) he's leaving because of a poor culture fit, or 2) a literal personal reason (e.g., health).

Intel has a culture which isn't exactly amenable to working flexibly, or to someone coming in and making lots of changes to the culture itself.


That’s a huge surprise. For reference, Intel’s stock is down 6.5% today (on what is admittedly a historically bad day for the market to begin with).


I don't know that those are necessarily correlated. The S&P 500 is also down by 6%. Nvidia is down by 6%. AMD is actually down by 8%.


A good person leaving a company drives the company's stock price down. A truly great person leaving a company drives entire industry's stock down.

(tongue in cheek)


Wishing him good luck and a healthy life.


And...back to AMD?


It says "for personal reasons," which is not text I would expect if he was just going to another company.


Also, he's agreed to stay on as a consultant for the next six months, which I assume is not something you'd do if you were planning on working for a competitor at the same time.


6 months is not an atypical non-compete clause length.


I obviously haven't seen his contract but it would surprise me if the noncompete clock started before his final day as a consultant. If it did, that would sort of eliminate the value of having a noncompete in the first place, wouldn't it?


I left a company with a noncompete in place (it was largely valid) and did consulting work for them. The two relationships are completely separate. As a full time employee, I had signed the noncompete. As soon as my full time employment ended, the term of the noncompete began counting down. My later relationship with the company was completely separate, with its own contract (which I ensured had no non-compete clauses).

I could easily see similar happening for Jim, here. The full time role has expired, and there is a new, separate contract for consulting work.


I've done similar things in the past, but my value to those companies' competitors was much lower than Keller's would be, presumably, to someone like AMD vis-a-vis internal problems and short-term strategic thinking. The importance of enforcing a noncompete and accompanying garden leave increases pretty linearly with how senior you are in your industry, in my experience.

Either way, I imagine we'll find out soon enough.


He could have had a falling out over Intel losing their edge.


These things are in development for many years before they hit silicon; I don't think his time at Intel has been long enough for that to be true. More likely, IMO, is that the same bureaucracy that led to Intel falling behind drives people like Keller away.

If scaling pipelines is hard, scaling pipeline pipelines is harder.


Accountant management. The current CEO came up through finance. Knows how to hold a dividend steady, but not a wafer.


I would say that actually this is the time when you have more freedom to get things done and can take bigger risks.

Intel has to change something to catch up with AMD. So I imagine the 'play safe' people have less power now.


He was probably there specifically to help Intel regain that edge. But he may not have received enough support to implement his vision, or simply he had a personal reason to leave.


I've personally known people who leave with that line and surface at a new company shortly thereafter.


When someone really has personal reasons, they usually "take a sabbatical" or are "off on sick leave".

For this kind of person, Intel gains a lot simply having his name associated with the company, so would have no qualms giving him years to travel the world on a yacht if he wanted to.

I think it really means "Intel didn't fire me, but I don't want to say any more".


Resigned effective immediately, announced the same day due to "personal reasons" is not a good look. Anyone who's spent a sufficient amount of time reading $BIGCORP press releases immediately sees how it stands out from the usual PR fluff.

In situations where I have seen that in the past, the person was caught in a grievously bad, unambiguous case of sexual harassment, racism, or something equally socially despised.

If this is not the case, Intel's PR people are doing a serious disservice to Keller in the way the announcement has been structured.

Generally if a higher level executive resigns due to actual "personal reasons" or a family tragedy, and it's an amicable departure, it's announced with at least a few weeks notice.


Your throwing of shade is unwarranted:

> Intel is pleased to announce, however, that Mr. Keller has agreed to serve as a consultant for six months to assist with the transition.


I agree with the shade bit; however, I've personally seen several high-visibility execs 'fired' with this exact line about staying on for 6 months for the transition. I've never seen anyone stay longer than 4 weeks.


If I'm being critical of anyone, it's the person who wrote the press release and decided on the "effective immediately" and same date, not Keller. They have to be aware of the optics of it.


I am wondering if it was opportunistic PR. They knew he was resigning for some time, but didn't want the bad PR from it, as Keller was known as the man who would save Intel. Given the market situation today, they decided to rush this announcement.

And yes, I do agree with your take on it. Intel has some of the best PR and marketing folks in the tech industry. The wording is surely worth looking into.


Or they find out they have an aggressive terminal illness.


That would be a reasonable explanation... But let's hope it's not that.


Wow. This likely throws a wrench in Intel's ability to design a revolutionary "Non Core" architecture free of all the security vulnerabilities that have been plaguing the Core family due to unsafe shortcuts in the name of performance.

Here's to hoping that in the 2 years Keller had been in Intel, he had left many good, realizable ideas on how to overhaul Intel's CPU architecture. If not, then this news might be the death knell for Intel's CPU might for the next decade.


Given the lifetime of CPU design, if Intel releases a revolutionary new architecture in two years, we’ll know where it came from.


Best guess: Jim Keller came in with guns blazing about how Moore's law is not dead and if you believe so you're stupid.

He was a comp-architecture guy counting on the device/physics folks to deliver. They didn't, while Jim put his reputation on the line. He probably resigned in disappointment and/or protest.

- Moore's law is dead at the physics level.

- Exponential tech progress doesn't stop but it won't be in the form of Si FETs, at least not in the foreseeable future.

- There is plenty of opportunity at the higher layers of abstraction though.

- Fortunately the AGI problem has escaped Moore's law (AGI can happen with existing node technology). And in my opinion that's all that matters for the next 10 years.


If AGI is solved, it most certainly won’t be in the next decade and will not be economical on current node tech.


What is the AGI problem? I've never heard of it.


Artificial General Intelligence.


Got a source for that AGI claim?


Source is me.

(Of all the naysayers whom I've come across over the past many years, if a tenth of them had been willing to fund me to work on AGI, by now I'd be well on my way to proving them wrong. But therein lies the rub: why would they spend money on being proven wrong?)


Naysayers have already decided they won't invest in AGI. Perhaps if you approached yaysayers instead of naysayers, you'd have a better chance of getting funded. The only question they would have to answer is, why fund you, specifically, instead of someone else.


That is my current strategy, i.e., to work with the yaysayers. "Why me" would be the least of my worries. My biggest hurdle was getting a Ph.D degree in a highly multidisciplinary field and that's now out of the way.


I get the impression you have a grudge towards something but it isn't clear to me what that is.

If you have already cleared the biggest hurdle you can think of, why the grudge? It should be smooth sailing for you from hereon out.


I just want to clarify (for the record) that I didn't mean investor community (which I have not approached yet).

I was mainly talking about my friends (most, actually all, of whom are naysayers) and people I interact with online.



