> I’d like to thank Intel here for making this possible. The CPU innovation stagnation between 2012-2017 has resulted in 4 cores still being an acceptable low-end CPU in early 2022. Without this, my laptop would likely be obsolete by now.

Hahaha, this last sentence seals the overall tone of the article.




What's hilarious to me is that Apple considers their own computers of this era to be "vintage"

No joke, that's the real term: https://support.apple.com/en-us/HT201624 -- "About vintage products: Products are considered vintage when Apple stopped distributing them for sale more than 5 and less than 7 years ago."

Well, not much has changed on the Intel side in that timeframe. My 2013 laptop is still getting OS updates and performs fine for my needs. And now I can brag that I'm running a vintage computer as my daily driver.


Vintage is a technical term for "we don't stock parts to repair these anymore" not "they're unusable and you should throw it out".


s/technical/marketing/g


Possibly legal.

I heard a story that the original Apple II machines came with a warranty that had no expiration and that California residents who purchased a machine in California are still entitled to hardware support on the machine.


Right, Apple specifically calls those "obsolete":

> About obsolete products

> Products are considered obsolete when Apple stopped distributing them for sale more than 7 years ago.


Planned obsolescence is Apple's bread and butter.


This sentiment gets thrown around a lot, but Apple has some of the best long-term support on their hardware in the business. 6-year-old iPhones are still getting updates. Someone in this thread mentioned that a laptop Apple describes as vintage is still getting updates. Apple is still selling a 4-year-old smartwatch as new.

I feel much better spending my money on Apple products trusting it will be supported for a long time than spending it on another device where I will be lucky to have support for more than 2 years.

I would say Apple's bread and butter is creating an ecosystem of hardware and software that works so well together that leaving it becomes harder than just sticking with Apple.


It's not Apple, it's any closed-source hardware. The moment the vendor no longer supports it, it's dead. Not counting those few rare cases where fanatics reverse engineer it.


It's easier to push obsolescence by bugging users to update. And you cannot disable the popups.


Seems arguable -- but if that's the case, they are one of the lesser offenders in the market right now.

The lifespan of much of Apple's hardware is considerably longer than most competitors' products. This is especially true with phones. I'm using an iPhone 7 right now, which was first released in Sept 2016. So, five full years and counting, and it's still running the latest iOS version. Not just "still gets security updates" on an old OS, but actually runs the latest release.

For hardware in 2022, that's not bad.

Personally, I think it's kind of a low bar, and we really ought to be able to do better. (I think a 5-year lifespan for a phone or a laptop, and 10 for a desktop, should be a starting point.)


No, it's not; continued innovation motivating upgrades, however, is. (Disclaimer: I was an engineer at a fairly senior level inside Apple for several years.)


> continued innovation motivating upgrades, however, is

Apple seems overly aggressive about breaking compatibility on iOS; as a result tons of apps break every year. This imposes an ongoing burden of maintenance costs onto developers, and many games and apps are simply abandoned. It seems to me that saving Apple time and money at the expense of developers and users may be the wrong trade-off in the long run.

And that's not even counting the 32-bit apocalypse, which broke most games on iOS and macOS. "This app needs to be updated," but of course it never will be... Many of them have been pulled from the App Store as well.

Windows does fairly well in terms of ongoing compatibility on the desktop, and (non-phone) gaming handhelds rarely break compatibility with old software (consider the 3DS, which went through 5 or more hardware revisions and many firmware updates with negligible game/app breakage). The few things that tend to break over time are online multiplayer games, when they pull the plug on the servers, or streaming apps like YouTube or Netflix.


To your point, I wonder how many of the "needs to be updated" apps just need a recompilation with current Xcode and resubmission; I don't know the answer.


This is a blatant lie that is so tiresome. How long is the life of an average Android phone? How long do they supply updates? Google's is 2 years of software updates and 3 years of security updates, IIRC. Apple routinely supplies updates to 5-year-old phones, and in lots of cases even longer. They now even provide a way for you, a private person, to get the tools and parts necessary to do certain DIY repairs/replacements. Quit your bullshit.


This.

Last January I converted my old (2012) quad-core Intel desktop into a gaming machine and gave it to my son. The only things I had to do were add a 1 TB SSD and put in an RTX 3060 Ti, and it runs every game (Fortnite, CoD, etc.) at 100+ FPS without any issues. I'm still amazed that with a simple GPU change the machine keeps up with his friend's brand-new AMD box (not sure what chip, just that it was new in 2021).

When the 12th-generation Intel chips came out last month, it was the first time I felt good about the future of chips. (Yes, I know AMD has been kicking their butt, but that's been in core count and power, which aren't as relevant for my son's gaming.)


The Ryzen 5600X is the gaming CPU of this gen, outside of the lower-budget Intel chips. Ryzen didn't just win on core count; they won on single-thread too, because competing Intel chips are 120W, which limits their performance on stock cooling. The 5600X is 76W actual.


It was. Now there's the 12400F, which offers just slightly lower performance (a few percent slower) at almost half the price. And there's the 12600K, which at the same price offers slightly better gaming performance than the 5600X, but in multithreaded workloads often outperforms the 5800X.


That chip can suck down 150W, needing a cooling setup to match (which is likely what the benchmarks will be running with).

Not going to argue semantics here, but the grandparent's point was that cooling is where the Intel chips tend to fall down (thus, the real world may differ significantly), even if there are paper gains in game benchmarks.

Offering a chip which takes double the cooling is just leaning into their point.


It's not just the cooling; it's power usage in general. That's how Intel moved ahead with this generation: the 12600K uses 224W compared with the 5600X's 134W during the same test (a SuperPi stress test, where the 12600K actually scored worse at 389s vs. the 5600X's 380s).

Don't get me wrong, I appreciate that the competition is heating up again; that can only be good news for consumers. But sucking down more power to deliver perf gains is totally the wrong direction.
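
A rough way to read those numbers is energy per completed run rather than instantaneous draw. A quick back-of-the-envelope sketch in Python, using only the figures quoted above and assuming the draw stays flat for the whole run:

    # Energy per SuperPi run, from the wattages and times quoted above.
    # Assumes constant power draw for the full run -- a simplification.
    chips = {"12600K": (224, 389), "5600X": (134, 380)}  # (watts, seconds)
    for name, (watts, seconds) in chips.items():
        print(f"{name}: {watts * seconds / 1000:.0f} kJ per run")
    # 12600K: 87 kJ, 5600X: 51 kJ -- roughly 70% more energy for a slightly slower result.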


The 12400F costs 180 dollars right now. The 5600X costs 300 dollars. They perform similarly in gaming tasks (the 5600X does have a larger L3 cache, and in cache-sensitive professional applications it is faster). The 12400F draws 117W (Intel spec). You won't be running either of these CPUs with its stock cooler, so the difference in CPU cooler cost will be 20 dollars at most (if you are not opting for some kind of water cooling on a mid-range build, which would be insane). Heck, even if you buy a CPU cooler you still end up spending less money than on a 5600X with its stock cooler.
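
Spelled out with the prices above (the $20 cooler delta is the commenter's estimate, not a measured figure):

    # Rough build-cost comparison using the prices mentioned above.
    intel = 180 + 20   # 12400F plus a budget tower cooler (estimated delta)
    amd = 300          # 5600X, using its stock cooler
    print(intel, amd)  # 200 vs 300 -- the 12400F route still comes in cheaper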


Game performance, and cooling in general outside of a very specific use case, isn't determined by comparing what Intel's marketing department says the 10-core AVX wattage of the 12600K is against what AMD's marketing department says the 6-core AVX wattage of the 5600X is.

When a single core is maxed out, the real-world measured difference between the two is roughly 12 watts, and remember the 12600K is doing slightly more per second than the 5600X. If your gaming device of choice were a laptop, that would be interesting, but both are desktop gaming CPUs. If 24/7 all-core workload power efficiency is your concern, I'd steer you away from either gaming-optimized option.

I really like what AMD has done; my last 2 desktop CPUs have been a 3900X and a 5950X. But if I were to buy an entry-level gaming CPU right now, the 12600K would be very interesting, regardless of what the all-core AVX thermals look like on a stock cooler.


A 12400F is slower than a 5600X in games?

That's really weird, because an 11400F has better performance...


There are also slightly more relevant metrics, like per-instruction cycle counts, which are much lower on Ryzen chips overall. Maybe their single-core clock speed is a little slower, but they make up for it with incredibly high-performance instruction optimization. 64-bit division on a Coffee Lake CPU is a 36-cycle instruction, with up to a hundred cycles of latency. The same instruction on a Zen3 CPU is no more than 17 cycles, with latency no lower than 12. That's a rather extreme example, but it definitely makes a difference.


My wife had been gaming on a fourth-gen i5 + RX 570 for a few months, and it was mostly fine until Halo Infinite; with that title it was pretty clear that she was CPU bound, with the system chugging in open areas, mob AI, particles, etc.

We upgraded her to a second-hand B450 mobo + Ryzen 2400 and that's been fine. Still running on low settings due to the limited GPU, but everything is at least playable. And this motherboard has firmware updates that support 5xxx-series Ryzens, so there's upgrade headroom there now as well.


Don't upgrade for Halo Infinite. That game also turned my otherwise fine machine into a potato. I've since fixed the issue by not playing Halo and playing Splitgate instead.


I have plenty of time in both Halo Infinite and Splitgate. Even if you take away the portals, Splitgate doesn't play like Halo Infinite at all. They look and sound similar, but they play and feel very different.


What other metrics are there for a processor in gaming, if not (compute) power per core, as well as core count?

Sure, a gaming computer doesn't need 8 cores as much as an encoder does, but AMD's single-core performance is right up there with Intel's, for much less money.

Disclaimer: AMD fanboy and stockholder


I was using power to mean electricity, not compute power. Thanks for noticing that ambiguity.

As for your comment about core usage in games, I'm still surprised how few cores it takes, even for modern games. Not my knowledge area at all, so all of that part of gaming and computing architecture is a bit opaque to me.


In more CPU-demanding games it will start running into trouble. Even at the single-threaded level, a Ryzen 5600X is about twice as fast as a Core i5-2500K from 2012.


Was it a Xeon processor :)?


Ha - it actually is!!!!

This is too funny - I was about to reply to you, and then decided to check my files. Lo and behold, I found that when I bought the components in March 2012, I bought an Intel E5-2609 (which is indeed a Xeon!) for $299. Had I not looked, I never would have guessed.

Thinking about it now (and I might be wrong), the machine has 32GB of RAM, so I might have bought the Xeon to go above 16GB. Can anyone confirm whether that would have made sense at the time (or if my memory is just fuzzy)?

Chip details here: https://ark.intel.com/content/www/us/en/ark/products/64588/i...


Second-hand PC dealers in my country (Turkey) used to put together affordable gaming rigs using old Xeon processors a couple of years back. That's why my first guess was that it was a Xeon.

I think Xeon processors also have special motherboards, so you can put TWO processors on some of them. That was a huge selling point for computer nerds like me.

Wikipedia says "Because of a hardware limitation not fixed until Ivy Bridge-E in 2013, most older Intel CPUs only support up to 4 gibibit chips for 8 GiB DIMMs". The E5-2609 is a Sandy Bridge-generation chip, so you probably bought the Xeon to support 32GB :).
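
The arithmetic behind that quote, as a quick sketch (the 16-chips-per-DIMM layout is an assumption for illustration, not something from the thread):

    # How 4 Gbit DRAM chips cap DIMM size, per the Wikipedia quote above.
    chip_gbit = 4            # largest chip density older CPUs supported, per the quote
    chips_per_dimm = 16      # assumed typical unbuffered DIMM layout
    dimm_gib = chip_gbit * chips_per_dimm / 8   # gibibits -> GiB
    print(dimm_gib)                             # 8.0 GiB per DIMM
    print(32 / dimm_gib)                        # 4.0 DIMMs needed for 32 GiB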


Indeed. Not only are my ThinkPad T420s still usable around home (see another post here), but my AMD FX-8350, approaching a decade old, is still going strong as my primary (gaming, photo editing, etc.) desktop.


Yeah, so something I used to do was buy used servers from ~2010 or so (5-6 years old at the time).

Because server CPUs really haven't changed a lot (at least for the stuff I was doing), it was a relatively cheap way to get 40+ core machines for less than a thousand bucks.


If it weren't for the cumbersome and outdated iDRACs and iLOs of those old servers, they'd still be perfectly usable for lots of things. If you don't need out-of-band management, it's probably best to ignore them as much as possible.

I used a PCIe-to-M.2 NVMe adapter and an EFI shim so that my Dell R820 could boot from NVMe (which is not supported by the BIOS). Stuff like that adds a ton of life to older systems. I still got rid of it because it was too loud and power hungry for what I was doing. Surprisingly, I was able to sell it for roughly what I paid for it over a year later. 32 cores and 192GB of RAM for under a grand is pretty incredible.


> I still got rid of it because it was loud and power hungry.

Same! I loved my 48-core server, but there was actually a glitch in the Con Edison system where I ended up getting free power for two years [1]. Once they fixed it and I got a power bill email for almost $900 while I was at work, I called my wife and said "UNPLUG THE SERVER NOW. RUN, DON'T WALK!"

[1] To be clear, I didn't know there was a glitch, I just thought power was cheaper in that apartment. I'm not greedy, I'm just dumb.


$900 for a month? That seems excessive. 720 hrs * 1.8 kW = 1296 kWh, and 1800W is about all a standard 120V circuit can continuously deliver.

I pay about $0.10 per kWh; that's $129.60 for a month.
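
The same arithmetic, parameterized so the summer-rate scenario in the reply below can be plugged in (the 1.8 kW and $0.10/kWh figures are the ones quoted above; the higher rate is a hypothetical, not an actual Con Edison tariff):

    # Monthly cost of a constant load, using the figures quoted above.
    def monthly_cost(draw_kw, rate_per_kwh, hours=720):
        return draw_kw * hours * rate_per_kwh

    print(monthly_cost(1.8, 0.10))   # 129.6 -> the $129.60 above
    print(monthly_cost(1.8, 0.30))   # 388.8 with a hypothetical surge-priced rate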


Well, $900 was the total power bill. I also had one window unit running in that room and another window unit running in my bedroom, and I also had an Apple Xserve running that I was using as a media server.

I think what caused such a catastrophic power bill was the surge pricing in NYC that happens in the summer, plus the servers pumping out a ton of heat, forcing the ACs to run harder and longer.


I went back to a 2008 R-series several years ago after having some newer laptops fail on me. I still had it and it kept working, so I kept using it. Last year I finally decided I needed a trackpad, so I upgraded to a W500 (2009), which, if you squint, is the same computer with everything upgraded slightly. It's low-end, but it can still do a lot of work and even load Facebook or recipe sites if necessary.

Overall, I've found that even in 2021 a Core 2 Duo can still be a sufficient low-end CPU. Thanks Intel!



