My first thought was ‘this already existed, I could have bought virtually the same product already.’ …Which the article acknowledges even. Odd title.
I really wish that e-ink could be unshackled. I really love e-ink; I ordered Dasung’s latest portable monitor because I read on my computer a lot (which I’d love to use for work, but for a myriad of reasons I can’t… at least I can use it personally). My smart[ish] watch has an e-ink display. I’d love to see more products and more competition in the space, but sadly that doesn’t seem like it’ll happen.
> My first thought was ‘this already existed, I could have bought virtually the same product already.’ …Which the article acknowledges even. Odd title.
This reminds me of how "the first color e-ink reader" came out a few years ago, even though I already bought one 10 years ago, and even THAT one wasn't the first.
The article already touches on this, but in the modern day the games that exist on physical media are pretty much useless without their zero-day patches. Putting physical media aside, companies rarely make older builds available, so even when media contains a game and servers are up, there is already plenty of ‘lost media’ if you consider the old and interesting (potentially hilariously broken) builds of virtually every game.
I’ve realized this at some point, but video games are ephemeral and should really be enjoyed in the now. Even if you can perfectly preserve a game, and the means to play it, tastes change so quickly in gaming that a game that’s fun today might not be enjoyable even a year later.
“Plays are ephemeral and should really be enjoyed in the now. Even if you can perfectly preserve a play, and the means to perform it, tastes change so quickly in theatre that a play that’s entertaining today might not be enjoyable even a year later.” - you, in the 1620s, probably.
Now imagine that the play cannot be altered (game build), it can only be performed on a specifically shaped stage (hardware requirements), actors can only be replaced by lookalikes (‘remaster’ tweaks), it can only be performed with a full theatre (online requirements), and the playwright retains the only copy of the play (source code).
Then you start to approach the problem that is gaming.
Which is exactly why more needs to be done by governments to stop this. A simple fix would be to say that in order to qualify for copyright protection, a national archive reserves the right to request a functional copy of the work in a form compatible with any future data migration projects.
People have been altering plays for ages without much issue (just look at the mod community), emulating and porting plays to different stages, and even reverse engineering their own copies of the script.
The only real catch has been online stuff, and even that is sometimes worked around or recreated. Making games playable into the future isn't an impossible problem, but it'll probably require legislation to force game companies to preserve their game and server code to allow for it.
> in the modern day the games that exist on physical media are pretty much useless without their zero-day patches
Most single and local multiplayer Switch games are playable offline - which is to say, not useless - without any zero-day patches.
And patches to those games can often (usually?) be downloaded while you are playing the game. (Sometimes there are issues however - I seem to remember some game, maybe SMT5, changing its game save format in some way.)
But it is true that patches can improve performance or fix bugs, and it seems like there is pressure to hit a ship date and assume that bugs can be patched later. It's not entirely bad though - some games that are heavily criticized on initial release (No Man's Sky, Cyberpunk 2077) end up fixing bugs and other issues and becoming successful games.
These day-0/day-1 patches and updates usually weigh a lot less than the base games; on the Switch, the appeal of cartridges is to avoid filling up storage.
Before the Switch, save files were also stored on cartridges, making physical media far more appealing than the mess digital was on the 3DS (if you owned more than one console).
No real offense intended, because I understand the feeling here, but this...
> I’ve realized this at some point, but video games are ephemeral and should really be enjoyed in the now. Even if you can perfectly preserve a game, and the means to play it, tastes change so quickly in gaming that a game that’s fun today might not be enjoyable even a year later.
This is horseshit.
It's a defeatist attitude, and it's not reflective of reality. Yes - some things go out of fashion for a while, but trends almost always cycle back. You might think something is out of style right now (and that's fine), but to be facetious: one man's trash is another man's treasure.
> Yes - some things go out of fashion for a while, but trends almost always cycle back.
Exactly, and this is even supported by Nintendo's own services offering emulation of their older systems. There is clearly demand for the ability to play older games.
Capitulation to an "inevitable" fate of download-only games is just taking the easy way out by not sticking to your own core values. I have personally pre-ordered a Switch 2, but I will not be purchasing any online-only cartridges or download-only software.
We haven't had the watershed moment that brings it into focus for gamers at large yet; The Crew came close. But Nintendo has kept the download servers going for all of their systems, which has provided a false sense of security. Once those start being shut down, maybe we'll see some actual response. Though with the introduction of GameCube emulation on the Switch 2, they are only a small step away from emulating the Wii and giving people another scapegoat for their lazy acceptance of the lack of ownership.
I guess it’s all relative. One of my code bases still has pylint running with only a couple of custom lint rules, and that one is slow as hell.
As for version bumping, maybe it’s just a me thing, but I hard-pin versions and only update occasionally. Sure, each update brings new warnings, but most of them are valid, and if you only do it a couple times a year… not that big a deal.
Try ruff instead of pylint. It will open your eyes to how performant software can actually be. All of us who have been working in Python and JavaScript for too long have really forgotten.
We use ruff for everything that isn’t a custom rule.
The only thing I found it ‘missing’ was an indentation check (it’s in preview, I don’t turn on preview rules), but I realized it doesn’t matter because we also have a formatter running on everything.
Please describe in more specific terms. Are we talking non-technical, intern, junior, or senior experienced humans?
Literally just yesterday I was diverted to help someone who is a senior developer, but a novice in Python itself, figure out why code that AI helped them write was completely busted. They were extra perplexed because most of the code was identical to a block of logic we use today (and works), but in this new context didn’t work at all. Turns out whatever the AI did, it didn’t have a concept of method overloads, so the types being passed around were suddenly wrong.
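To illustrate the failure mode with a hypothetical sketch (this is not the actual code; the `Loader` class and its methods are made up): in Python, ‘overloads’ are usually simulated with dispatch machinery like `functools.singledispatchmethod`, and copying the base implementation without the registered variants silently changes which types get handled.

```python
# Hypothetical sketch: dispatch-based "overloads" in Python.
from functools import singledispatchmethod


class Loader:
    @singledispatchmethod
    def load(self, source):
        # Fallback when no registered overload matches the argument type.
        raise TypeError(f"unsupported source type: {type(source).__name__}")

    @load.register
    def _(self, source: str):
        # Overload for file paths.
        return f"read from path {source}"

    @load.register
    def _(self, source: bytes):
        # Overload for raw payloads.
        return source.decode("utf-8")


loader = Loader()
print(loader.load("config.json"))  # dispatches to the str overload
print(loader.load(b"raw data"))    # dispatches to the bytes overload

# Copy only the base `load` body into a new context (dropping the
# `register` variants) and every call hits the TypeError fallback,
# even though the copied block looks identical to code that works.
```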
AI works well for people who know nothing (it can do things for them that work well enough), or people who know ‘everything’ (it can get them 95% of the way, they can use their experience to find and fill the remaining 5%). It’s absolutely terrible for people with middling experience.
For context, my experience is colored by the kind of work I do: building codebases from scratch that tackle ‘niche’ (read: not readily available as FOSS or described online) problems, usually in small teams or solo.
For a completely inexperienced dev, they may delegate to having AI draft the entire project for them. If a part doesn’t work, they just keep repeating the prompt until it does. They’re not tweaking and twiddling, so the mindset is ‘if it works then I’m done’.
For an experienced dev, usually they will define a structure and have a clear understanding of what the inputs and outputs of each component are. They’ll also write what are known to be critical code sections themselves. AI is usually used here as they might take advantage of an intern—to do the busy work—and because they have adequate experience it’s fairly trivial to review the code and manually fix problems before adding it to the codebase.
For people in the middle ground, they end up with a hybrid of these qualities, and it generally doesn’t turn out well. They might define a structure, but not well enough to know exactly which components to request from an LLM, which sections need to be done by hand, or how to spot deficiencies in the code they’re given. Because they have the ability to debug, they spend time debugging failures instead of just prompting again, and because they let bad code slip into the codebase, failures happen just as often as if the code were written by hand, but with the disadvantage of not having authored it in the first place.
This is political, and so are the tariffs. It’s the same cat-and-mouse game as with local/state regulations on restaurant staff compensation and ‘service fees.’
No love for Amazon in general; they’ve been gaming the system for a long time. But it’s not hard to see why they would do this: prices will go up, and this is an easy way to deflect the blame (and to be fair… it’s an accurate deflection).
A CLI argument specification and a parser that implements it. Mostly because I’m annoyed with Python’s argparse, so I’m adding blackjack and removing hookers. Also because I’m annoyed with the arguments used for programs at work and how the same author might use args a different way for each tool (To pull up a usage string, you might need to use `help`, `-h`, `--help`, `-help`, `+help`, or `+<prog>.help`… not all scripts display a usage string when no args are given).
It’s not much, but it’s all I can cram into the free time I have, there’s a possibility I might actually finish it for once, and it’s something I could actually use once it’s done.
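For a rough idea of the shape I’m going for, here’s a minimal sketch; `ArgSpec` and `parse` are placeholder names for illustration, not the actual design:

```python
# Minimal sketch of a declarative arg spec plus a parser that always
# recognizes -h/--help. Placeholder names, not the real project.
from dataclasses import dataclass


@dataclass
class ArgSpec:
    name: str               # e.g. "--output"
    takes_value: bool = False
    default: object = None


def parse(argv: list[str], specs: list[ArgSpec]) -> dict:
    by_name = {spec.name: spec for spec in specs}
    result = {spec.name: spec.default for spec in specs}
    i = 0
    while i < len(argv):
        token = argv[i]
        if token in ("-h", "--help"):
            raise SystemExit("usage: prog " + " ".join(s.name for s in specs))
        spec = by_name.get(token)
        if spec is None:
            raise SystemExit(f"unknown argument: {token}")
        if spec.takes_value:
            i += 1
            if i >= len(argv):
                raise SystemExit(f"missing value for {token}")
            result[spec.name] = argv[i]
        else:
            result[spec.name] = True
        i += 1
    return result


# parse(["--output", "out.txt"], [ArgSpec("--output", takes_value=True)])
# -> {"--output": "out.txt"}
```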
While there's certainly an argument for keeping a geographic map, the new map is loads clearer at a glance on connections and peak/exceptional route changes. I don't see a good way to do both at once.
I’m pretty sure part of the intent is that it should be easy to write (type) in this format. Separator characters are not that. Depending on the editor, they’re not especially readable either.
I appreciate this article a lot more because it contains an iota of benchmarking with an explanation about why this might be more performant. Especially since my first thought was 'wouldn't this require more instructions to be executed?'
The original post seems really weird to me. I would have dismissed it as someone's hobby project, but... that doesn't seem like what it's trying to be.
"More instructions to execute" is not synonymous with "slower".
NaN-boxing lets you use less memory in certain use cases. Because memory access is slow and caches are fixed size, this is usually a performance win, even if you have to do a few extra ops on every access.
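For a rough picture of the trick, here’s a sketch in Python; real implementations live inside VMs written in C/C++, and the constants and helper names here are only illustrative:

```python
# Illustrative NaN-boxing sketch: stash a small integer in the payload
# bits of a quiet NaN so one 64-bit word can hold either a double or a
# boxed value.
import struct

QNAN = 0x7FF8000000000000           # exponent all ones + quiet bit
PAYLOAD_MASK = 0x0000FFFFFFFFFFFF   # low 48 bits carry the boxed value


def box_int(value: int) -> bytes:
    assert 0 <= value <= PAYLOAD_MASK
    return struct.pack("<Q", QNAN | value)


def unbox(word: bytes):
    bits = struct.unpack("<Q", word)[0]
    # Crude tag check: quiet-NaN bits set plus a nonzero payload means
    # "boxed value"; it ignores the sign bit and real NaNs with payloads.
    if (bits & QNAN) == QNAN and bits != QNAN:
        return bits & PAYLOAD_MASK
    return struct.unpack("<d", word)[0]


print(unbox(box_int(42)))               # 42, pulled out of the NaN payload
print(unbox(struct.pack("<d", 3.14)))   # ordinary doubles pass through
```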
I mean, a modern computer operates in the gigahertz range. Adding a few extra bitwise instructions might cost something like a nanosecond, which is absolutely fleeting compared to memory operations.