None of these are performance concerns. Modern JS engines are plenty fast for most of my use cases.
It irks me that I can't trust it to be an integer within a given range. Especially with Number, I often have the sensation that the type system just doesn't have my back. Sure, I can be careful and make sure it's always an integer; I've got 53 bits of integer precision, which is plenty. But I've shot myself in the foot too many times, and I just don't trust it to be an integer even when I know it is.
As for BigInt, I default to it by now and I've not found my performance noticeably worse. But it irks me that I can get a number that's out of range of an actual int32 or int64, especially when working with databases. Will I get to that point? Probably not, but it's a potential error waiting to be overlooked that could be so easily solved if JS had int32/int64 data types.
Sound currency arithmetic is a lot harder when you constantly have to watch out for the accidental introduction of a fractional part that the type system can't warn you about, and that can never be safe with IEEE 754 floats. (This doesn't just bite in and near finance: use floating-point math to compute sales tax and you'll soon find out what I mean.)
Bigints solve that problem, but can't be natively represented by JSON, so there tends to be a lot of resistance to their use.
Not really. In my parent comment I tried to make clear that it's not a limitation for me in the real-world scenarios I encounter, but it's still something I see as a potential class of problems that could be so easily solved.
When I really needed dedicated integer types of a specific size, e.g. for encoding/decoding some binary data, I've so far been successful using something like Uint8Array.
> Especially with Number, I often have the sensation that the type system just doesn't have my back.
That's sounding dangerously close to dependent types, which are awesome but barely exist in any programming languages, let alone mainstream general purpose programming languages.
You could do this with a branded type. The downside will be ergonomics, since you can't safely use e.g. the normal arithmetic operators on these restricted integer types.
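A minimal sketch of such a branded type (the names `Int32` and `toInt32` are illustrative, not from any library):

```typescript
// A number the compiler won't accept as Int32 unless it went through
// the runtime check below.
type Int32 = number & { readonly __brand: "Int32" };

function toInt32(n: number): Int32 {
  if (!Number.isInteger(n) || n < -2147483648 || n > 2147483647) {
    throw new RangeError(`${n} is not an int32`);
  }
  return n as Int32;
}

const a = toInt32(42);  // ok at runtime and compile time
// const b: Int32 = 42; // compile error: plain number is not Int32
const sum = a + a;      // but sum widens back to plain number
```

That last line is the ergonomics downside: every arithmetic result has to be re-validated to get back into the branded type.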
> As for BigInt, I default to it by now and I've not found my performance noticeably worse. But it irks me that I can get a number that's out of range of an actual int32 or int64, especially when working with databases. Will I get to that point? Probably not, but it's a potential error waiting to be overlooked that could be so easily solved if JS had int32/int64 data types.
If your numbers can get out of the range of 32 or 64 bits then representing them as int32 or int64 will not solve your problems, it will just give you other problems instead ;)
If you want integers in JS/TS I think using bigint is a great option. The performance cost is completely negligible, the literal syntax is concise, and plenty of other languages (Python, etc.) have gotten away with using arbitrary precision bignums for their integers without any trouble. One could even do `type Int = bigint` to make it clear in code that the "big" part is not why the type is used.
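To illustrate the alias and the literal syntax:

```typescript
// The alias signals "this is an integer"; arbitrary precision is incidental.
type Int = bigint;

const a: Int = 10n;
const b: Int = 3n;
const q = a / b;       // bigint division truncates toward zero → 3n
const big = 2n ** 64n; // no silent overflow, unlike a real int64
```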
I assume the screenshots are converted into a textual description quite quickly, so presumably the only disk usage would be screenshots in the buffer waiting to be processed.
I wonder if magic-wormhole could be implemented as a layer on top of Syncthing:
- Generate a short code
- Use the code as the seed to deterministically generate a Syncthing device key + config
Since the Syncthing device key could be generated deterministically, sharing the code with both sides would be enough to complete a dir/file transfer and then discard the keys.
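A hypothetical sketch of the "code as seed" step, assuming some deterministic keypair generator exists (Syncthing doesn't expose one; its device ID normally comes from a freshly generated TLS certificate):

```typescript
import { createHash } from "node:crypto";

// Both sides hash the same short code (plus a fixed context string,
// an assumption of this sketch) into the same 32-byte seed, which a
// deterministic key generator (not shown) would expand into identical
// device keys on both ends.
function seedFromCode(code: string): Buffer {
  return createHash("sha256").update(`wormhole-syncthing:${code}`).digest();
}
```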
How well does it run on AMD GPUs these days compared to Nvidia or Apple silicon?
I've been considering buying one of those powerful Ryzen mini PCs to use as an LLM server on my LAN, but I've read before that the AMD backend (ROCm IIRC) is kinda buggy.