Hacker News | Flow's comments

If you play with friends and your cheats cooperate, I don't think honeypots would be fool-proof any longer. Unless you all get the same fake data.

Perhaps not related to the article, but I find it puzzling that Bluetooth in 2026 still sounds like a fax machine as soon as you use the mic. That, and latency is much too high in general.

Using the mic requires the device to drop to a much older headset protocol, designed for an obsolete Bluetooth generation.

Afaik that's one big reason why BT is such a mess. Different use cases are handled by different profiles, many of which are outdated, and two paired devices can only use a profile that both support. So the headphones can't just reuse the same nice connection and add a mic; they have to start pretending to be some Bluetooth 2.0 headset from 2005 or so.
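A toy sketch of that negotiation: the profile names below (A2DP, HFP, HSP) are real Bluetooth profiles, but the function and capability lists are made-up illustration, not any real stack's API:

```typescript
// Each device advertises the profiles it supports; the link can only use
// a profile both sides share. A2DP carries high-quality stereo but has no
// mic channel; only the old HFP/HSP headset profiles carry a mic, which is
// why turning on the mic forces the fallback to low-quality audio.
function negotiate(a: string[], b: string[], needMic: boolean): string | null {
  const shared = a.filter(p => b.includes(p));
  if (needMic) {
    // mic audio only exists in the hands-free/headset profiles
    return shared.includes("HFP") ? "HFP"
         : shared.includes("HSP") ? "HSP"
         : null;
  }
  // playback-only: prefer the high-quality stereo profile
  return shared.includes("A2DP") ? "A2DP" : shared[0] ?? null;
}
```

So a phone and headset that both speak A2DP and HFP get nice stereo for playback, but the moment `needMic` is true the best they can agree on is HFP.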


Looks perfect here. iOS Safari


StackOverflow hasn't felt like a welcoming and humane place for the last 10+ years, at least for me.

Actually I think it never did.

It started when I was new there: I couldn't write answers, only comments, and then I got blasted for posting answer-like comments. What was I supposed to do? I engaged less and less and finally asked them to remove my account.

And then it seems like the power-users/moderators just took over and made it even more hostile.

I hope Wikipedia doesn't end up like this despite some similarities.


I don't think the reputation system ever worked that way - new users could always answer questions, but comments required more reputation.


OK, you might be right and I got it backwards. It still felt wrong at the time before I got enough points.


I don't get why Nvidia can't do both. Is it because of the fabs' limited production capacity?


Yes. If you're bottlenecked on silicon and secondaries like memory, why would you want to put more of those resources into lower margin consumer products if you could use those very resources to make and sell more high margin AI accelerators instead?

From a business standpoint, it makes some sense to throttle the gaming supply some. Not to the point of surrendering the market to someone else probably, but to a measurable degree.
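A back-of-the-envelope sketch of that opportunity cost (all numbers are invented purely for illustration, not actual die counts, yields, or prices):

```typescript
// Revenue a single wafer can gross: usable dies times selling price.
function revenuePerWafer(diesPerWafer: number, yieldRate: number, pricePerDie: number): number {
  return diesPerWafer * yieldRate * pricePerDie;
}

// Hypothetical datacenter part: few huge dies, lower yield, very high price.
const ai = revenuePerWafer(60, 0.75, 30000);    // 1,350,000 per wafer
// Hypothetical consumer part: many small dies, better yield, low price.
const gaming = revenuePerWafer(200, 0.5, 800);  //    80,000 per wafer
// With these toy numbers every wafer diverted to gaming forgoes ~17x the revenue.
```

With a gap like that, shared bottlenecked inputs (wafers, packaging, memory) flow to the AI parts first, and gaming gets whatever is left.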


We will have to wait and see, but my bet is that Nvidia will move to the leading-edge N2 node earlier now that they have the margin to work with; it came too late in the design cycle for both Hopper and Blackwell. The AI hype means customers continue to buy the latest and greatest, leaving gaming on a mainstream node.

Nvidia using a mainstream node has always been the norm, considering most leading-edge fab capacity goes to mobile SoCs first. But I expect the internet / gamers will be angry anyway because Nvidia doesn't provide them with the latest and greatest.

In reality, the extra R&D cost of designing for the leading edge will be amortised across all the AI orders, which gives Nvidia a competitive advantage at the consumer level when they compete. That is assuming there is any competition: the most recent data show Nvidia owning 90%+ of the discrete GPU market, with 9% for AMD and 1% for Intel.


Ever thought of writing an emulator? On Reddit there's /r/EmuDev, which is a nice place.

For example you could start by writing a CHIP-8 emu, then a Space Invaders emu. After Space Invaders most people write a Game Boy emu (almost the same CPU as Space Invaders, and the hardware is well documented), but you could try an 8086 PC if you want to know more about "real" computers.
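To give a feel for why CHIP-8 is the usual first project, here's a minimal (and very incomplete) fetch/decode/execute sketch with only two of the roughly 35 opcodes implemented:

```typescript
// Bare-bones CHIP-8 core: 4K memory, 16 8-bit registers, programs at 0x200.
class Chip8 {
  mem = new Uint8Array(4096);
  v = new Uint8Array(16);   // registers V0..VF
  pc = 0x200;               // programs are conventionally loaded at 0x200

  load(program: number[]) { this.mem.set(program, 0x200); }

  step() {
    // each opcode is 16 bits, stored big-endian
    const op = (this.mem[this.pc] << 8) | this.mem[this.pc + 1];
    this.pc += 2;
    const x = (op >> 8) & 0xf;   // register index baked into the opcode
    const nn = op & 0xff;        // immediate byte
    switch (op >> 12) {
      case 0x6: this.v[x] = nn; break;                       // 6XNN: VX = NN
      case 0x7: this.v[x] = (this.v[x] + nn) & 0xff; break;  // 7XNN: VX += NN
      default: throw new Error(`unimplemented opcode ${op.toString(16)}`);
    }
  }
}
```

The whole machine is just this loop plus a 64x32 display, two timers, and input; that's why most people can finish one in a weekend.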

There are free BIOSes you can use, and FreeDOS, and the rest of the machine is pretty well documented.


I started a SvelteKit project last year.

The compiler and preprocessors are very picky about accessibility issues like these.

It was annoying in the beginning, but I tried to follow the rules it set out. It feels very professional, and I’m thankful for the guidance.

What’s annoying now is when my coworkers think I’m making things unnecessarily complicated by trying to follow these rules and guidelines. A <div> with an onclick is so easy compared to a <button> with extra styling and sometimes a preventDefault() call.
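To illustrate the gap: a clickable <div> also needs role="button", tabindex="0", and keyboard handling to match what a native <button> does for free. A rough sketch of just the keyboard part (makeDivActLikeButton is a hypothetical helper, not from any framework):

```typescript
// Minimal keyboard wiring a div-as-button needs: Enter and Space must
// activate it, and Space must be prevented so it doesn't scroll the page.
type KeyEventLike = { key: string; preventDefault: () => void };

function makeDivActLikeButton(activate: () => void) {
  return function onKeydown(e: KeyEventLike): boolean {
    if (e.key === "Enter" || e.key === " ") {
      e.preventDefault();  // stop Space from scrolling the page
      activate();
      return true;         // handled
    }
    return false;          // any other key falls through
  };
}
```

And that still leaves focus styling and assistive-technology announcements, all of which <button> provides natively.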


I have that experience when working with eslint-angular. Quite a few things I realised I hadn't been doing properly before.


Funny: the persistent memory image that everyone outside the Smalltalk/Lisp community seems to hate is basically your normal filesystem now.


The irony is that anyone using IDEs is using the same idea, as all of them use a virtual filesystem layer to simulate the same capabilities as the image approach.


Not really, because you still recompile and start the program from scratch, rather than modifying the code that executes on the still-existing data structures.

Edit: rather than just naysaying, it occurred to me to reference the notion of Orthogonal Persistence, which the image-based approach provides (not without drawbacks) but IDEs typically don’t. Previous HN discussion: https://news.ycombinator.com/item?id=39615228


Kind of, see the Cadillac model for Energize C++, born out of Lucid Lisp.

https://dreamsongs.com/Cadillac.html


Did you ever play Nebulus? The ZX Spectrum version looks as good as the C64 version, just with fewer colors. Nebulus is from 1988.

https://www.youtube.com/watch?v=ZAud8w5mTa4


I doubt that. It takes more or less 100% CPU on the affected scanlines.

