Intel Core Ultra Meteor Lake CPU Breaks Cover with 128 GPU Cores (hothardware.com)
40 points by rbanffy on May 2, 2023 | 50 comments



Core Ultra? Seriously? Next you’ll be telling me that the i5 will be called Core Max and the i3 will be called Core Pro.

Regardless of the actual basis for the name, you’d think that after spending two decades with the lower case “i” prefix which became culturally ascendant with the iMac, they’d be more sensitive to giving off the appearance of copying Apple’s homework again.


How can you say that Intel copied the initial lowercase 'i' from Apple when every Intel product from the 1960s forward was an i123 of some kind? The 3101 SRAM had the little 'i' etched on the top of the IC. Lowercase letter 'i' is their long-standing brand.


An IC marking is not the same as a brand logo.

AMD has etched/printed their arrowhead logo on practically every product they've produced since the Am2901, but the brand logo for Ryzen is the enso.

Intel did market some products as the "i386" and "i486", but consumer-facing identification was typically "Intel 80x86". The lowercase-i IC marking practically disappeared from Intel CPUs starting with the Pentium.

The "i" logo appears on early P5 models (like the P166 from 1996 that I keep on my desk) and then disappears, relegated to the bottom of the CPU.

Here's some P5 shots: https://en.wikipedia.org/wiki/Pentium_(original)#Models_and_...

Pentium Pros, PIIs, PIIIs, P4s, and Core2 products are all practically devoid of the marking.

I would argue that the resurrection of the "i" branding in 2008 with the i7-920 was eased along by iProducts, especially since it had practically disappeared from consumer/retail branding for over 15 years.


>How can you say that Intel copied the initial lowercase 'i' from Apple when every Intel product from the 1960s forward was an i123 of some kind?

For these people, the tech world has existed only since the launch of the iPhone. Nothing that came before or after it matters, and Apple is the sun around which all tech revolves.

Plenty of other companies have used an 'i' prefix in front of product names, even cars and dishwashers.


That's not true, they also pretend that tech which was created after the iPhone but only later incorporated into the Apple platform is a Jobs invention.


> How can you say that Intel copied

I didn't. I said "appearance of copying" and I was very careful to choose those words.


And IOS has been used by Cisco for their networking device operating system.


Yes, but not with a lower case "i".


If they want to mix things up, Intel could always take inspiration from early Android phone naming conventions[0].

[0]: https://web.archive.org/web/20120211122642/http://androidpho...


Not even the Space Cadet keyboard had an Ultra key. But it had Super, Hyper, Meta, and Greek!


It's not like "Ultra" was novel when Apple used it...


Correct, but that's not my point. I'm not saying Intel copied Apple, I'm saying it surprised me that out of the thousands of English words (and indeed millions of inventable words) Intel chose to pick one recently used by a quasi-competitor. They're entitled to, of course. But if I were Intel I would be careful to avoid a perceived pattern, because I wouldn't want to unintentionally give off follower vibes.

It's like Ford naming their electric car division "Ford Model e", which can be justified as a callback to the Ford Model T, but is also odd considering the product names of Tesla vehicles, and because it's a strange name for a corporate division. It gives off follower vibes.


Were you surprised when Apple chose to pick one recently used by a quasi-competitor?


They should've gone with Core Tall, Core Grande, and Core Centoventotto.


I'm partial to Core Common, Core Uncommon, Core Rare, and Core Legendary.


Nah. Common -> Magic -> Rare -> Legendary.

If you are gonna copy ARPG item tiers, do it right.

They could also add a Unique tier, but that would have a limit of 1 per customer.


> Core Centoventotto

My grandfather used to drive a Fiat 128 (Centoventotto) in the late '60s and early '70s.

Legend has it that my uncle drove that to Norway (from the south of Italy) on his honeymoon trip.

I've lost track of it, but I remember it was still in use around 2008 or so; today I think it's somewhere at my uncle's place.


> the lower case “i” prefix which became culturally ascendant with the iMac

I think you mean iPhone or iPod. Pretty much everyone knew what an iPod was. Basically everyone in the world knows what an iPhone is.

Lots of people have never seen or heard of an iMac.

iMac sales are likely less than 2M per year. iPhone sales are about 200M per year...

The iMac is about as culturally significant as the Microsoft Surface...


You shouldn't leak insider info. /s

With Intel dropping the Pentium and Celeron monikers for just "Intel <Model number>", I think you're right on the money.


Throwing the Core i brand away seems like a very bad move, even with Ryzen's R-numbering being a 1:1 copy. Intel has had a consistent, reliable naming and tier scheme in the desktop market for about 15 years. Things like "i5 x600K" are virtual fixed points.


It feels like unified memory for CPU and GPU is going to be a big trend going forward for laptops and desktops, which is why investment in Arc to help drive this Meteor Lake platform is all the more important. Console ports are already becoming a big problem for current GPU card memory, and Apple Silicon is clearly showing the benefits on the laptop and desktop side.


Do I understand correctly that three architectures of "AI-enabled hardware" (I couldn't come up with a better term) are the following?

1) separate cpu and gpu,

2) cpu and neural cores,

3) gpu-like cpu (like in this post).

In the long term, is any of these architectures potentially preferable for a) training, b) inference?

(I am guessing cpu + gpu is not ideal for consumer-level inference because of gpu prices and their space requirements; I don't know much about hardware.)


More interestingly, Intel is working on 2000 W cooling solutions. Imagine that, 2000 W in a few square centimeters.

https://www.tomshardware.com/news/intel-working-on-new-cooli...

I wonder if making the whole CPU package larger is another solution. Just like the best LED lighting is spread out (disc, square, linear) instead of being in the shape of a traditional light bulb. Maybe in the future we'll have A4 sized CPUs.
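
For scale, a back-of-the-envelope sketch (the die area below is my assumption, not a figure from the article):

    # Back-of-the-envelope power density. The die area is my assumption, not a
    # figure from the article.
    watts = 2000
    die_area_cm2 = 4.0
    print(watts / die_area_cm2)  # ~500 W/cm^2; a stovetop burner is roughly 5-10 W/cm^2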


It seems comical to have a CPU more powerful than a space heater people use to heat their rooms in the winter. I also wonder if it's even possible to use a 110V circuit in most houses for this, considering it would mean a computer pulling at least 2500-3000 watts, something most single outlets in a standard home can't handle.

Would be funny to start seeing people run 220V to their offices for their computers.


Some 2kW power supplies come with weird cables to handle 110V. Slightly annoying for a 230V domestic setup but 3kW from a single socket is fine in Europe. Not totally encouraged of course.


You're talking about 15-amp setups in both cases, which is what I was talking about, even down to the point that Americans might have to start wiring their offices with 220V, just like we currently wire our kitchens and laundry rooms.


Not in Italy; it's not normal here to draw 3 kW from a single socket. Most homes' supply maxes out at 3.3 kW.


I'm almost sure that is for servers. They've been pushing density for a while now, and as four sockets per U became the new norm, data center operators had a bit of a surprised-Pikachu face when they saw 40-60 kW per rack.

With Intel's recent innovation in die interconnects and silicon photonics, this seems a straightforward direction to push. They're going to increase density, it's what the semi industry does.


One problem with making the package larger is the relativistic effects of longer wires: signals simply take longer to get across.

Anyway, the idea of writing code to manipulate 2000 watts of power billions of times a second, on a desk, is kinda scary! Wow!


IBM has been pushing that envelope for quite some time, with both POWER and Z. The Telum cooling system looks scary.

https://twitter.com/PatrickMoorhead/status/15240319067944263...

In return, you get a 5 GHz base clock, but the "BMC" is two x86-based servers running Linux that, among other things, make sure cooling is working properly before applying power to the CPU/memory modules (and shut everything down as soon as anything goes off-nominal).

And, definitely, none of that gear is designed for home use.


A household circuit in the US maxes out at around 1700 W. I hope we don't end up in a future where we need to wire an L2 charging port to the home office just to run a faster workstation.


I live in a building renovated in 2010ish and all circuits are 20A minimum.


That can't be right, an air conditioner alone pulls more than a kilowatt.


He means the standard residential circuit (the kind you would plug a PC or TV into) is typically 15 amps. Larger appliances, like an AC, get dedicated circuits, usually at a higher amperage. It's not uncommon for an AC to be on 30 A or 60 A.


That is right, but it's per breaker, so your air conditioner will likely be on its own breaker.

(Assuming 15A fuse, 15A * 110V = 1650W)

Here in Sweden we have 240V from the wall but our fuses are usually 10A, so 2400W per breaker (so we can run Intel CPUs without special installations ;)
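
The arithmetic in this subthread is just volts times amps; a minimal sketch using the figures quoted above:

    # Continuous power available from a single breaker: watts = volts * amps.
    # Figures are the ones quoted in this thread; real electrical codes often
    # derate continuous loads (e.g. to 80% of the breaker rating in the US).
    def breaker_watts(volts, amps):
        return volts * amps

    print(breaker_watts(110, 15))  # 1650 W, the typical US outlet circuit
    print(breaker_watts(240, 10))  # 2400 W, the Swedish example above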


Large-draw appliances usually get put on a double-pole breaker, and a house will have many circuits.


I recently got a new PC with an i7-13700K, paired with a very good (but enormous) air cooler. When there is high CPU usage I can feel hot air coming from the computer! Yes, that air cooler is almost silent (BeQuiet Dark Rock 4), but when the CPU uses 200+ W, that heat has to end up somewhere. Hopefully, for development you rarely need 100% CPU. I wouldn't want anything more power hungry for my desktop.


What's so special about having 128 GPU cores when Nvidia has had thousands of cores in their GPUs for over a decade?


> What's so special about having 128 GPU cores when Nvidia has had thousands of cores in their GPUs for over a decade?

It's a CPU, not a GPU; that's what is special. (Also, counting shading units, which are what often get called cores on Nvidia GPUs, this has 1024. That still wouldn't be a lot for a GPU, but 128 vs. the usual way cores are counted for Nvidia GPUs is the wrong comparison.)


The CPU has 18 cores according to the link, so probably 9 or 18 x86-64 cores. I wouldn't like to guess how an Intel GPU unit (EU?) compares to an SM or CU from the other two.

edit: misc stuff online suggests an EU corresponds to a SIMD unit. It's sufficiently annoying to find an approximate ratio between that and the other two main GPU architectures that I'm giving up. Anyone know if 128 is a lot or trivial in this context? My outdated laptop has 8 CUs and the better workstation cards are on the order of 100.
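
A rough sketch of the counting mismatch, assuming 8 FP32 lanes per Intel EU (the ratio implied by the 1024 shading-unit figure in the sibling comment):

    # Comparing marketing "core" counts across vendors. Assumption: 8 FP32
    # lanes per Intel EU, the ratio implied by the 1024 figure in the sibling
    # comment; Nvidia counts each lane as a "CUDA core".
    intel_eus = 128
    lanes_per_eu = 8
    intel_lanes = intel_eus * lanes_per_eu
    print(intel_lanes)  # 1024 -- the number comparable to Nvidia's core count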


NVIDIA hasn't had thousands of cores, unless you count every SIMD ALU lane as a core.


B-but each SIMD lane needs an instruction pointer for mutexes!


This isn't a discrete GPU.


I hear you, but Apple today sells a laptop with 54 GPU/neural-network cores.

For a workstation, which this seems to target, having ~2x that doesn't seem like a particularly large amount.


Core isn't a standardized unit of measurement. ARM cores in the Apple Silicon line and x86 Intel/AMD cores behave very differently.


For the CPU cores, M and x86 are converging. The M1 had a very significant lead over 11th- and 12th-gen Intel because of multiple hiccups in Intel's product pipeline, but that is not going to hold for long. As for the GPU cores, I know very little about how they compare, and GPU cores tend to be wildly different between manufacturers.


>not going to hold for long

honest question: why do you think that?

My belief is that a good analogy for such leads would be an average marathon runner vs. Kipchoge. His "lead" will only increase with time.


> why do you think that?

Because, love it or lump it, Intel's manufacturing process is ludicrously advanced. Power hungry, particularly when they're locked on older nodes, but extremely capable when their interests align. Pat Gelsinger has even stated that it's his goal to get Apple back as an Intel customer. Their foundry services started working with ARM to get RISC cores manufactured, and if they stick to their roadmap, it's possible for them to leapfrog TSMC's density limitations.

We'll have to see where things go, but I do agree with the parent. Apple's lead was their investment in the TSMC 5 nm node, which as of the M2 is clearly difficult to iterate on. Apple needs advancements in density to make a faster chip; Intel does not.


> Apple's lead was their investment on the TSMC 5nm node

That's only part of the deal. Apple also made some really cool architectural innovations, as well as creating workload-oriented ISA extensions that proved extremely effective.


38 GPU cores. The neural network inference cores don't do floating point and therefore aren't used for training or games. Assuming these GPU cores are comparable, if >3x isn't a reasonable target, what would you suggest is?



