I really liked that quote. It's a great talk indeed.
Keller says a lot of interesting things in this interview that aren't followed up on. He calls for more substantial, architectural changes; I wonder what he thinks of spatial architectures.
The video goes from something that 100,000 people with a computer architecture background might watch to something that 10,000,000 people with a tech background might watch.
Selfishly, though, as someone who doesn't work at Intel/Nvidia/AMD but is interested in architecture and digital design, it's frustrating how hard it is to find candid opinions from industry experts compared to software. Computer architecture just isn't the same in terms of attitude.
One of my favorite things about the computer architecture courses I took was to sit after class and ask my professors about something I'd read about, and more often than not they'd tell me that what I read was bullshit and nobody seriously considered it in the industry, or they'd tell me about something they were excited about that I hadn't heard of. It's hard to get a handle on where the industry is heading and what important people see as the next steps to iterate on.
A general conversation is probably more useful to most people, but there are a lot of times in this conversation where you get the feeling that Keller is right on the precipice of giving an insider prediction of the future, and the topic shifts instead. It's frustrating because those types of conversations are just straight-up difficult to find elsewhere.
It has more depth than Fridman's podcast; I listen to it mainly for the computing history, because they interview people about their careers. I hope they do another season soon.
I normally enjoy Cantrill's rants, and I watch his talks on YouTube solely to listen to him rant, but it gets annoying when he's interviewing a guest.
Unfortunately I don't have specific examples in mind, since it's been a few months since I listened to the podcast, but I remember the episode with Jonathan Blow being a prominent one, because I was new to him, his rants, and game development in general. It felt like every time he was onto something interesting, you were ranting about a related issue, and afterwards he'd proceed to his next point without concluding the previous one. At least that's how it felt to me.
I hope that didn't sound like I'm complaining. I'm a big fan of you guys and it's a great podcast. I could listen to 100 episodes :)
He asked plasma physicist Alexander Fridman (his father) to clarify what plasma is.
I love it.
Hearing Knuth's, Penrose's, and Fridman's own definitions of concepts I thought I knew is illuminating. Like letting us normies get a peek into their genius.
That said, I haven't listened to him much, and he just rubbed me the wrong way here.
How do you trade off between learning and “doing”, i.e. writing code or putting what you have learnt into practice?
I can’t finish a technical book without getting the urge to implement.
It's pretty funny for the interviewer to say "agree to disagree" or "well, no" when he's clearly not the expert of the two.
“Keller’s departure is a big deal and suggests that whatever he was implementing at Intel was not working or the old Intel guard did not want to implement it,” Hans Mosesmann, an analyst at Rosenblatt Securities, wrote in a note to investors. “The net of this situation for us is that Intel’s processor and process node roadmaps are going to be more in flux or broken than even we had expected.”
DEC, PA Semi, Apple, AMD, Tesla, Intel. He's been everywhere.
The internal announcement memo:
It's such a pity the interviewer didn't prepare more technical questions touching on new architectures, compilers, caches, TPUs, and his design experience. All I care about is the interviewer asking good questions.
Jim, equally, seems incredibly engaged and patient with the questions, sometimes moving the game to a much higher level than the one at which the question was posed. Without breaking a sweat.
Intel's struggle has nothing to do with its processors' design. Sunny Cove and Willow Cove (aka Ice Lake and Tiger Lake) were close to design-complete before Jim Keller joined. Intel's problem is with their manufacturing, both technical and economic. And Jim Keller is not a fab guy; there was nothing he could do to fix this.
Intel Management Engine and SGX on the other hand, are basically user-hostile parts of the hardware, with some bugs mixed in.
"Everybody" who has some knowledge in security (espcially with respect to side channels) knew from beginning that these CPU features were a ticking time bomb in terms of potential side channels.
What was unclear was how this potential threat (played down at the time by CPU vendors) could be used to create real attacks.
Going from potential threat (that "everybody" knew about) to real attack is the central achievement of the authors of the Spectre and Meltdown attacks (and their successors).
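To make that concrete, here's a minimal sketch of the Spectre v1 bounds-check-bypass gadget (variable names follow the pattern popularized by the published paper; they are illustrative, not from any real codebase):

```c
#include <stddef.h>
#include <stdint.h>

uint8_t array1[16];
size_t  array1_size = 16;
uint8_t array2[256 * 4096];  /* probe array: one cache line per possible byte value */

void victim(size_t x) {      /* x is attacker-controlled */
    if (x < array1_size) {
        /* After the branch predictor is trained with in-bounds values of x,
           an out-of-bounds x still executes this load speculatively.
           The (secret) byte at array1[x] selects which cache line of array2
           gets loaded; the attacker later times accesses to array2 to
           recover that byte, e.g. via flush+reload. */
        volatile uint8_t tmp = array2[array1[x] * 4096];
        (void)tmp;
    }
}
```

Every ingredient here was publicly known; the achievement was turning that speculative cache-state change into a reliable, repeatable read across privilege boundaries.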
For a concrete blog post, see https://cyber.wtf/2017/07/28/negative-result-reading-kernel-...
Note that according to https://en.wikipedia.org/w/index.php?title=Spectre_(security... Spectre was published January 2018, i.e. this blog post is indeed older.
As I wrote: It was unclear how this potential threat could be used to create real attacks.
There's a video on AdoredTV about the topic: https://www.youtube.com/watch?v=agxSclh27uo
Different cultures have different ways of being courteous and showing respect to others (e.g., the elderly). You must've had some misunderstandings based on your own culture. Silicon Valley has been a dominant force, and many H-1B employees contributed to the valley's success.
My father used to work at Hughes, in some areas that by US law required American citizens; he regularly told stories of someone in a department retiring before a replacement had been hired for knowledge transfer, causing lost time and lost knowledge.
I mean, a lot of things are obvious; why pretend they are not?
The criticisms haven't changed. Just the available sources of cheapest labor.
Either it isn't worth it (too much coupling between fab and designs - not the right abstractions), or institutional inertia stops them from doing it.
So if they fabbed their chips somewhere else, they would be sitting on a huge, expensive asset producing nothing. If they couldn't find a productive use for their fabs, it would likely mean the end of the company. And if they can't produce their own chips in their own fabs, why would anyone else want to use Intel's fabs?
Further, the vertical integration of fab process and chip design is something Intel regards as a competitive advantage. For a long time this was very much true, but it seems the hard work by TSMC and others has made it possible to make top-end chip designs on a merchant foundry process nowadays.
Intel at some point tried to play the merchant foundry game, but it seems it wasn't successful and they shut it down. Which perhaps isn't that surprising, considering TSMC, and to a lesser extent GlobalFoundries, have been at that game for decades and are good at it.
So all in all, I don't think fabbing their chips at some third party is a viable approach for Intel. Either they fix their process or they go under. "Go under" not necessarily meaning bankruptcy, it could also mean a massively, hugely downsized company doing chip designs to be fabbed at some third party. I think they're still far away from such a drastic step.
It wouldn't surprise me if the reason Keller left is because he wanted to outsource manufacturing to a competitive third party fab, and got denied.
Keller is a computer architect, but architecture doesn't matter if you can't physically build the thing.
And yes, if for some reason Intel wanted to fab their chips at some merchant foundry (see my sibling answer to yours for why I think that's unlikely, but just for the sake of argument), I'm quite sure they couldn't just email the RTL to the foundry and get chips back. It would take a lot of work to adapt the chips to the merchant foundry's process.
It is more of a cost/economics issue, for which jabl above provides a decent explanation.
No - I think he'll go to Nvidia, because he hasn't been there before, hasn't really worked on GPUs before, and because I think GPUs could gain a lot of performance by having someone look at optimizing the big picture, adding the right abstraction layers, and generally making a GPU more CPU-like so that more execution can be moved to it more easily.
He did two separate stints at AMD, and if the rumors of Apple switching the Mac to the A-series are true (which they probably are) then there will be some very interesting problems to solve over the next few years.
Intel has a culture which isn't exactly amenable to working flexibly, or to someone coming in and making lots of changes to the culture itself.
(tongue in cheek)
I could easily see similar happening for Jim, here. The full time role has expired, and there is a new, separate contract for consulting work.
Either way, I imagine we'll find out soon enough.
If scaling pipelines is hard, scaling pipeline pipelines is harder.
Intel has to change something to catch up with AMD. So I imagine the 'play safe' people have less power now.
For this kind of person, Intel gains a lot simply having his name associated with the company, so would have no qualms giving him years to travel the world on a yacht if he wanted to.
I think it really means "Intel didn't fire me, but I don't want to say any more".
In situations where I have seen that in the past, the person was caught in a grievously bad, unambiguous case of sexual harassment, racism, or something equally socially despised.
If this is not the case, Intel's PR people are doing a serious disservice to Keller in the way the announcement has been structured.
Generally if a higher level executive resigns due to actual "personal reasons" or a family tragedy, and it's an amicable departure, it's announced with at least a few weeks notice.
> Intel is pleased to announce, however, that Mr. Keller has agreed to serve as a consultant for six months to assist with the transition.
And yes, I do agree with your take on it. Intel has some of the best PR and marketing folks in the tech industry. The wording is surely worth looking into.
Here's hoping that in the two years Keller was at Intel, he left behind many good, realizable ideas on how to overhaul Intel's CPU architecture. If not, this news might be the death knell for Intel's CPU might for the next decade.
He was a comp-architecture guy counting on the device/physics folks to deliver. They didn't, while Jim put his reputation on the line. He probably resigned in disappointment and/or protest.
- Moore's law is dead at the physics level.
- Exponential tech progress doesn't stop but it won't be in the form of Si FETs, at least not in the foreseeable future.
- There is plenty of opportunity at the higher layers of abstraction though.
- Fortunately the AGI problem has escaped Moore's law (AGI can happen with existing node technology). And in my opinion that's all that matters for the next 10 years.
(Of all the naysayers who I've come across in the past many years, if a tenth of them were willing to fund me to work on AGI, by now I'd be well on my way to prove them wrong. But therein lies the rub. Why would they spend money on being proven wrong?)
If you have already cleared the biggest hurdle you can think of, why the grudge? It should be smooth sailing for you from hereon out.
I was mainly talking about my friends (most, actually all, of whom are naysayers) and people I interact with online.