
Could she just get a friend on AT&T to put their SIM in her phone to have it updated?

If the Xeon counterpart isn't failing as well, then they most certainly know the problem: too-aggressive voltages for a more fragile transistor size.


I think there's a real concern Xeon E-2400 may be failing at this point too. It's an open question whether Emerald Rapids has the same issues (and EMR has a mesh, not a ring, so this is an interesting question for diagnosing the cause!), but W-2400 and W-3500 still use Golden Cove.

The leading theory at this point is not really voltage or current related but actually a defect in the coatings that is allowing oxidation in the vias. Manufacturing defect.

https://www.youtube.com/watch?v=gTeubeCIwRw

It affects even 35W SKUs like the 13700T, so it's really not the snide/trite "too much power" thing. Like, bro, Zen boosts to 6 GHz now too and nobody says a word. And believe it or not, if you look at the power consumption, both of them are probably fairly comparable in core power: both brands are consistently boosting to around 25-30W single-thread under load. AMD's highest voltages occur during these same single-core boost loads, and those are the loads of concern at this point. If it is just voltage that is killing these 35W chips, well, AMD is playing in the exact same voltage/current domains.

Furthermore, if it were power, it wouldn't be a problem limited to 10-25% of the silicon; it would affect all of them.

There was a specific problem with partners implementing eTVB wrong, and that was rectified. The remaining problem is actually pretty complex and potentially there are multiple overlapping issues.

It has just become this lightning rod for people who are generally dissatisfied with Intel, and people are lumping their random "it doesn't keep up with X3D efficiency!" complaints into one big bucket. But like, Intel actually isn't all that far off the non-X3D SKUs in terms of efficiency, especially in non-AVX workloads. "140W instead of 115W for gaming" is pretty typical of the results, and that's not "burn my processor out" level bad. The 13900K has always been silly, but the 13700K is like... fine?

https://tpucdn.com/review/intel-core-i7-13700k/images/power-...

https://tpucdn.com/review/intel-core-i7-13700k/images/effici...

https://tpucdn.com/review/intel-core-i7-13700k/images/power-...

https://old.reddit.com/r/hardware/comments/yehe1s/intel_rapt...

(Granted, this may be launch BIOS, and it sounds like part of the problem is that partners have been tinkering over time and it's gotten worse and worse... I'm dubious these numbers are the same numbers you'd get today, but they are pretty consistent across a whole bunch of reviewers; ctrl-F "CPU consumption" and the gaming and non-AVX power numbers are in broadly unconcerning ranges. 57-170W is, broadly speaking, fine.)

Again, even if there is a power/current issue, at the end of the day it's going to have a specific cause and diagnosis attached to it - like AMD's issues were VSOC being too high. Saying "too much power" is like writing "died of a broken heart" on a death certificate, maybe that's true but that's not really a medical diagnosis. Some voltage on some rail is damaging some thing, or some thermal limit is being exceeded unintentionally, and that is causing electromigration, or something.

You might as well just come out and say it: Intel's hubris displeased the gods, they tempted fate, and this is their divine punishment. That's really what people are trying to say, right? Don't dress it up in undeserved technical window-dressing.


Thai bananas taste at least three times as good as regular Cavendish. A much stronger fruity flavor.


It's definitely a generation issue. When there's no sun (overcast, nighttime), there's no energy. This doesn't even factor in solar's quick deterioration from peak performance, the cost of the panels, and the environmental damage from producing them.


Yeah, I think people looking at just the upfront cost are not really acknowledging or addressing that solar panel installations degrade faster than a nuclear reactor. They also take up more space and, as you mentioned, will likely end up causing more damage to produce at planetary scale. They are definitely great when coupled with batteries for many use cases, including decentralizing aspects of the grid for residential and smaller-scale usage, but the raw performance of nuclear is impressive and exciting. So few inputs needed for how much you get, and for how long too. There are old reactors still producing after over 50 years... that is mind-boggling.

Who knows how far the tech can be pushed with modern advancements and fewer blockers on developing the technology further. It should be in the toolbox as part of a strategy for renewable energy needs on Earth and beyond.


nuclear energy is very exciting and absolutely crucial for space exploration, but not economically competitive with solar in the foreseeable future on earth's surface

there are also solar panels still producing power after 50 years; they do degrade a little, especially in the first ten years, but the 20–30 year panel lifetimes you see published are more of a warranty and accounting issue than anything else. (of course some panels crack or yellow within a year or two)

it's true that solar farms take up a lot of space, but even in high-density countries like japan there is room for them. singapore might have a problem tho


What would the surface area requirements be for over-provisioning in the US? What if we want to scale energy production by 2x or 10x or 100x for advanced industrial and commercial needs in the future? I think the solar panel approach becomes limited on Earth at that point.

You're right that the panels don't degrade a ton. I read online that after 20-30 years they might drop about 15% in output. For residential usage that might be okay, but it does mean needing upkeep and worrying about baseline output dropping, which in some climates could be bad.

I do think a combination of the technologies is best, since scaling up energy production will be simpler, easier, and more resource-friendly with nuclear than with more solar. Moving up the Kardashev scale will require capturing all the energy that can be captured from all sources, so why let any go to waste? :)


If you want to scale to 100x power you're going to have to rely on solar power even more than we already do. You seem to be awfully attached to the idea that the reactor must be located on earth. Solar power is fusion power with the reactor being located in space. It is very unlikely that humanity can build a bigger reactor.


humanity, defined loosely, can definitely build a bigger reactor than the sun. the milky way is a trillion times bigger than the sun, and mostly made up of stars (as opposed to large black holes, which are probably effectively inaccessible), so there's plenty of material available

already-existing natural blue hypergiants can reach energy outputs several million times that of the sun, in large part because they're on the order of 100 times bigger, usually limited only by the eddington mass limit. bat99-98 is estimated at 226 solar masses. so designed artificial stars can clearly reach that size, and conceivably, with a better understanding of plasma dynamics, they could be stabilized. in fact, we already know† how to build an even larger star: if you build a star of very low metallicity (similar to natural population-iii stars, of which possibly none survive today), its eddington mass limit is much higher, around 1000 solar masses

more likely, though, the humans will instead build a larger number of smaller, safer reactors. microscopic black holes can convert mass into hawking radiation at manageable photon energies and useful power levels. the necessary experimentation poses no risk of creating a large black hole (the density of matter necessary to grow small black holes to macroscopic proportion doesn't exist outside of the cores of stars, and the necessary quantity of matter at those densities is also literally astronomical) but will surely involve many explosions as starving black holes explode in a final tantrum of high-energy gamma rays, and of course must be carried out in free fall to prevent your nascent black hole from simply falling between the atoms of your laboratory floor before exploding deep inside your chosen planet

constructing larger reactors, by contrast, does pose a risk of producing phenomena such as disappointing white dwarfs, neutron stars, and black holes, or worse, supernovae, rather than a useful power source

if we believe dyson's calculations, though, a much more worthwhile thing to do is to figure out how to slow down our entropy production enough to preserve life into the cold, dark post-stellar era

______

† i mean we know in scientific terms what the structure of such an artificial star would be, where to find the materials, and what would be required to bring them together in the right way. it's fairly simple, actually. the only difficult part is getting a large enough budget to build the necessary fleet of spacecraft to harvest 10³³ kg of hydrogen and helium, about a billionth of the milky way, and bring it together over a distance of several light years; plausibly you need on the order of 10³⁵ spacecraft, about 120 doublings of a von neumann probe


yes, as you approach kardashev type 1 you will definitely want to start harvesting sunlight from von neumann probes on solar orbit

including transportation, natural gas, etc., but not including foods like corn and canola, the usa uses 100 quads per year, or 3.3 terawatts in si units. its average utility-scale solar power capacity factor is 21%, so you'd need 15.7 terawatts peak of solar farms to supply that, before scaling up by 2× or 10× or 100×. 15.7 terawatts of 24% efficient solar panels would require 65 terawatts of sunlight, which is to say, 65 billion square meters or 65536 square kilometers (to pick a round number). this is of course 256², so, like the entire spectrum of mainstream political opinion in the usa, it would all fit between houston and austin. you could drive around it in a day

well, not quite; that's 29° latitude, so you need to space your panels apart by a factor of 1/cos 29°, about 14%, so they don't shade each other. also in texas, unlike any other phenomenon known to humanity, it would be a bit smaller, because texas has a 25% capacity factor; the reason the usa has an overall lower solar capacity factor of 21% is that some solar farms are in suboptimal places like maine (10%) so the power doesn't require long-distance transmission
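
a quick python sketch of the same arithmetic, in case anyone wants to poke at it (the ~1000 W/m² of peak insolation is my assumption, not something stated above):

    # rough check of the numbers above; 1000 W/m^2 peak insolation is an assumption
    us_average_tw = 3.3        # 100 quads/year expressed as average terawatts
    capacity_factor = 0.21     # us utility-scale solar average
    panel_efficiency = 0.24

    peak_tw = us_average_tw / capacity_factor    # ~15.7 TW of peak panel capacity
    sunlight_tw = peak_tw / panel_efficiency     # ~65 TW of incident sunlight
    area_km2 = sunlight_tw * 1e12 / 1000 / 1e6   # watts -> m^2 -> km^2, ~65,000 km^2
    print(f"{peak_tw:.1f} TW peak, {sunlight_tw:.0f} TW sunlight, {area_km2:,.0f} km^2")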

so right now it's really tough for nuclear to compete with solar on earth


I've never been able to get Ventoy to boot a modern Windows installation ISO; otherwise it's a very handy tool.


Not sure when they added it, but there's a "wimboot" mode that works for Windows ISOs. Just used it last week without issue.


I recently upgraded to using a faster USB stick for Ventoy. For some reason, Windows 10 wouldn't install from my newer, faster USB stick, but would install from my older, slower USB stick. I haven't yet figured out what the problem is.

When booting the Windows ISO, the boot process would never progress past the screen showing the blue windows icon on a black background.


Windows uses some weird install process where it needs two partitions on the install media. Basically, the main install image is too big for FAT32's 4GB file-size limit, so IIRC the way to make it work is to split the media in two, with the bootable part on one partition and most of the installation content on another.

https://github.com/WoeUSB/WoeUSB


I wonder if it's related to being a USB 2 stick rather than USB 3? Different drivers, and perhaps UEFI handles it differently.


Do you mind saying which island this was? I'd really like to live somewhere like this.

Technology has pulled us away from what's really important, and then we look for escapism.


Millport, Scotland. £3.60 for a return trip on the ferry. Ten mile ride around the island. Cheapest day out ever.


What a ridiculous argument. That's like saying the police shouldn't have weapons to defend themselves with because there's a history of police killing innocent people.

No, the real answer here is to address the actual corruption to expose and remove it.

Not to mention, if you have nothing to hide, how can someone abuse their power to use it against you?

These bills are put in place because they save lives, especially vulnerable children.


The only reason to consider type hints is for a performance increase, and there wasn't any mention of that. What can you really expect from using type hints accurately?


> The only reason to consider type hints is for a performance increase

No, the reason to consider type hints is that they make it easier to understand existing code and to write new code that interacts with it correctly. Comprehensibility and correctness are more important than performance.
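
A tiny illustration (my own hypothetical function, not from the article): the annotated signature alone tells a caller what goes in and what comes out, before reading the body or hunting for a docstring.

    def parse_port(value: str, default: int = 8080) -> int:
        """Parse a port number from a string, falling back to a default."""
        try:
            return int(value)
        except ValueError:
            return default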


In a dynamically typed language, that's what comments are for. Since you didn't need types specified in the first place, why would you have trouble interacting with it? Comprehensibility and correctness are just as important as performance.


Type hints are great when using FastAPI. Your inputs are automatically validated, and you get a /docs endpoint that tells people what to expect from your API.

I'd say performance is far from the only reason to consider type hints.
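
Roughly like this (a hypothetical endpoint, just a sketch of the pattern): FastAPI reads the hints, rejects non-conforming requests with a validation error, and publishes the schema at /docs.

    from fastapi import FastAPI
    from pydantic import BaseModel

    app = FastAPI()

    class Item(BaseModel):
        name: str
        price: float

    @app.post("/items/")
    def create_item(item: Item) -> Item:
        # a request body that doesn't match the hints is rejected
        # before this function ever runs
        return item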


Similarly for Typer, which is literally "the FastAPI of CLIs"[1]. Handy to type-annotate your `main` parameters and get CLI argument parsing. For more complicated cases, it's a wrapper around Click.

[1] https://typer.tiangolo.com/
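
Something like this (a hypothetical command, just a sketch): the hints on `main` become the CLI interface.

    import typer

    def main(name: str, count: int = 1) -> None:
        # `name` becomes a required argument, `count` an option with a default
        for _ in range(count):
            typer.echo(f"Hello {name}")

    if __name__ == "__main__":
        typer.run(main)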


mypyc does that: https://github.com/mypyc/mypyc

> Mypyc compiles Python modules to C extensions. It uses standard Python type hints to generate fast code. Mypyc uses mypy to perform type checking and type inference.

> Mypyc can compile anything from one module to an entire codebase. The mypy project has been using mypyc to compile mypy since 2019, giving it a 4x performance boost over regular Python.

I have not experienced a 4x boost, more like between 1.5x and 2x. I guess it depends on the code.
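
The kind of code that seems to benefit most is tight loops over annotated primitives, e.g. something like this (a small sketch):

    # fib.py: the int annotations let mypyc specialize the loop
    # instead of falling back to generic object operations
    def fib(n: int) -> int:
        a: int = 0
        b: int = 1
        for _ in range(n):
            a, b = b, a + b
        return a

Compile it with `mypyc fib.py` and import the module as usual.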


> Compiler instructions

> I don’t know how many people are doing this, but tools like mypyc will use type hints to compile Python code to something faster, like C extensions.


Using typed dependencies makes the Python IDE experience significantly nicer.


"...the problem they face going through cycles of discharging and recharging in a stable way."

They wear out quickly, like almost all new battery technologies. Until that's solved, they aren't useful.


The article is about a solution to make lithium metal anodes more stable. They discovered a coating that prevents the formation of dendrites, which were the cause of the short cycle life of previous attempts at lithium metal anodes.


It's much easier and more accurate to time your Python scripts with `time python script.py`. Cool write-up.


`time` is absolutely awful, with the minor exception of bsd’s time maybe.

If you’re going to benchmark scripts or executables, use hyperfine.


Very nice. Thanks for sharing.

https://github.com/sharkdp/hyperfine

