It's kind of ridiculous that the US has held onto pennies for this long. New Zealand phased out our 5 cent coins back in 2006, and those are worth roughly 3 US cents at present.
I agree with your sentiment, but according to another comment in this thread, the Treasury Secretary has congressionally approved latitude to adjust how many coins are minted.
The authority over U.S. coinage is established in the Constitution (Article I, Section 8), which gives Congress the power "to coin Money, regulate the Value thereof." Any major changes to U.S. currency, including eliminating a denomination, must go through Congress.
>“The Secretary of the Treasury shall,” Section 5111 reads, “mint and issue” denominations of coins “in amounts the Secretary decides are necessary to meet the needs of the United States.”
I'm gutted; I absolutely love the Sculpt Ergonomic keyboard, especially with the optional slightly inverted slope; it's even more comfortable than previous ergonomic models. I've been using Microsoft ergonomic keyboards exclusively for over 20 years, since my late teens. I'm tempted to buy a second Sculpt Ergonomic as a backup for when this one eventually dies, hopefully not for a long, long time!
I love the Sculpt keyboard, but the function keys are tiny and frustrating (I use them quite a bit for Maya). When some of the keys stopped working, I upgraded to the Surface Ergonomic keyboard, which has a similar shape and a built-in numpad, and took the riser that goes under the original ergonomic keyboard and, with the help of some velcro tape...

It's bigger on my desk, which I don't love, but I DO use the numpad, and the F keys are much more usable. (I did create a cover for the Fn key and the power-off button to avoid hitting them by accident, though.)
Similar situation here. I've occasionally seen all sorts of crazy-looking split keyboards around the internet. They've looked interesting, but they're pricey, I can't try them in person, and the MS ergo keyboards have served me well, so I've never taken it any further. This will be what makes me take it further; maybe some interesting keycaps too (maybe black!).
I've also heard of a trick of integrating WoL with DNS: when the DNS server receives a lookup that resolves to a local IP, it sends WoL packets to that host. You'd probably just need to set the TTL on the server's DNS record very low so that the lookup doesn't get cached.
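The WoL half of that is simple to sketch. A minimal Python example, assuming a standard magic packet over UDP broadcast (the MAC address and the DNS-hook call site are illustrative, not part of the original trick):

```python
import socket

def send_wol(mac: str, broadcast: str = "255.255.255.255", port: int = 9) -> None:
    """Send a Wake-on-LAN magic packet: 6 x 0xFF followed by the
    target MAC address repeated 16 times, over UDP broadcast."""
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    if len(mac_bytes) != 6:
        raise ValueError("expected a 6-byte MAC address")
    packet = b"\xff" * 6 + mac_bytes * 16
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(packet, (broadcast, port))

# A custom DNS resolver could call this before answering a query
# for the sleeping machine's name, e.g.:
# send_wol("aa:bb:cc:dd:ee:ff")  # hypothetical MAC
```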
I've had zero issues deploying .NET on Linux while developing on Windows. The Docker support is really good too, if you want to go that way, and I didn't need any platform-specific shenanigans to get it working.
JSON really is a terrible serialization format. Even JavaScript can't safely deserialize JSON without silent data corruption. I've had to stringify numbers because of JavaScript, and the corruption happened with no errors. Perhaps that's JavaScript's fault, but I find the lack of any encoding of the numeric storage type to be a bug rather than a feature.
Sounds like they've been bitten by IEEE 754 floating point problems. JS only supports encoding numbers that are representable in 64-bit ("double precision") IEEE 754. Most JSON standards make the same assumption and define JSON numbers to match. (There's no lack of an "encoding" standard there; it just inherits JS's, which is double-precision IEEE 754.) Some JSON implementations in some languages don't follow this particular bit of JSON standardization and instead output numbers outside the range representable by IEEE 754, but that's arguably much more an "implementation error" than an error in the standard.

This most commonly occurs with int64/"long" numbers towards the top or bottom of that range (since the floating point layout reserves bits for the exponent, it can't represent every integer there).

There is no JSON standard for numbers outside the range of double-precision IEEE 754 floating point other than "just stringify it", even now that JS has a BigInt type that supports a much larger range. But "just stringify it" mostly works well enough.
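To illustrate the "just stringify it" convention, here's a rough Python sketch (the field names are made up for the example):

```python
import json

# Encode an int64-range ID as a string so a JavaScript consumer
# never parses it as a double and silently rounds it.
record = {"id": str(9223372036854775807), "name": "example"}
payload = json.dumps(record)  # '{"id": "9223372036854775807", ...}'

# The receiving side converts back to an integer explicitly.
decoded = json.loads(payload)
assert int(decoded["id"]) == 9223372036854775807
```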
The JSON "Number" standard is arbitrary precision decimal[1], though it does mention that implementations MAY limit the parsed value to fit within the allowable range of an IEEE 754 double-precision binary floating point value. JSON "Number"s can't encode all JS numbers, since they can't encode NANs and infinities.
The "dual" standard RFC 8259 [1] (both are normative standards under their respective bodies, ECMA and IETF) is also a useful comparison here. It's wording is a bit stronger than ECMA's, though not by much. ("Good interoperability" is its specific call out.)
It's also interesting that the proposed JSON5 (standalone) specification [2] doesn't seem to address it at all (but does add back the other IEEE 754 values that ECMA 404 and RFC 8259 exclude from JSON: +/-Infinity and +/-NaN). It maintains that its numbers are "arbitrary precision" while also requiring those few IEEE 754 features, which may be even more confusing than either ECMA 404 or RFC 8259.
One example that's bitten me is that working with large integers is fraught with peril. If you can't be sure that your integer values can be exactly represented in an IEEE 754 double precision float and you might be exchanging data with a JavaScript implementation, mysterious truncations start to happen. If you've ever seen a JSON API and wondered why some integer values are encoded as strings rather than a native JSON number, that's why.
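That truncation is easy to demonstrate. A quick Python sketch, using parse_int=float to mimic a parser that stores every number as an IEEE 754 double (as JavaScript's JSON.parse does):

```python
import json

big = 2**53 + 1  # 9007199254740993, exact as an arbitrary-precision Python int

# Python's json module round-trips integers exactly...
assert json.loads(json.dumps(big)) == big

# ...but a consumer that parses numbers as doubles silently rounds
# to the nearest representable value, losing the low bit.
as_double = json.loads(json.dumps(big), parse_int=float)
print(as_double)  # 9007199254740992.0 -- off by one, with no error raised
assert as_double != big
```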
I use branches gratuitously and push to my forks regularly, as you do, but I also do interactive rebases and edits when preparing to push to upstream, to clean up my mess.