
Looks like this missed the opportunity to load the board in JS from the URL to be truly static.


> After a candidate's defeat in an election, you will be supplied with the "cause" of the voters' disgruntlement. Any conceivable cause can do. The media, however, go to great lengths to make the process "thorough" with their armies of fact-checkers. It is as if they wanted to be wrong with infinite precision (instead of accepting being approximately right, like a fable writer).

-- N.N. Taleb


> they've gone from barely stringing together a TODO app to structuring and executing large-scale changes in entire repositories in 3 years.

No they didn't. They're still at the step of barely stringing together a TODO app, and mostly because it's as simple as copying the gazillionth TODO app from GitHub.


I’ve used Copilot recently in my work codebase and it has absolutely no idea what’s going on in the codebase. At best it’ll look at the currently open file. Half the time it can’t seem to comprehend even the current file fully. I’d be happy if it were better, but it’s simply not.

I did use ChatGPT, most recently today, to build me a GitHub Actions YAML file based on my spec, and it saved me days of work. Not perfect, but close enough that I can fill in some details and be done. So sometimes it’s a good tool. It’s also an excellent rubber duck, often better than most of my coworkers. I don’t really know how to extrapolate what it’ll be in the future. I would guess we’ll hit some kind of limit that will be tricky to get past, because nothing scales forever.


Excellent advice.

On the technical side, I believe the waiting is due to the lock queue rather than to an already-acquired ACCESS EXCLUSIVE lock. The ALTER is specifically _waiting_ for any lock lower than ACCESS EXCLUSIVE to be released.


It also makes all new readers/writers wait for that lock, essentially leading to downtime until the lock is eventually acquired and released. This is the classic readers/writers library example, and you want to avoid starving the writers.

That's why the size of the data is the least of your issues; it's the access patterns/hotness that are the problem.
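
One common mitigation, roughly sketched below (psycopg2, the table name and the retry policy are just for illustration): give the ALTER a short lock_timeout so it bails out instead of camping in the lock queue and stalling every new reader/writer behind it, then retry later.

    # Rough sketch: retry the ALTER with a short lock_timeout so it never
    # sits in the lock queue blocking new readers/writers behind it.
    # Table name, timeout and retry policy are illustrative only.
    import time
    import psycopg2
    from psycopg2 import errors

    conn = psycopg2.connect("dbname=app")
    conn.autocommit = True

    for attempt in range(10):
        try:
            with conn.cursor() as cur:
                cur.execute("SET lock_timeout = '2s'")
                cur.execute("ALTER TABLE accounts ADD COLUMN note text")
            break  # got the ACCESS EXCLUSIVE lock briefly and finished
        except errors.LockNotAvailable:
            time.sleep(5)  # back off and retry during a quieter moment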


I'm not sure information theory deals with this question.

Since this isn't lossless decompression, the point of having no "real" data has already been passed. It _is_ inventing things, and the only relevant question is how plausible the invented things are; in other words, if the video also existed in higher resolution, how close the inferred version would actually look to it. It seems obvious that this metric improves with the amount of information from the source, but I would guess the exact relationship is a very open question.


Most modern navigation apps continue working in tunnels and other places without GPS. It's more like GPS augmented with dead reckoning.


Is that sensor based dead reckoning or simple interpolation based on the previous (or expected) speed of travel along the route though?


Many comments are missing the point here (although the article doesn't properly explain it either); it's not about resolution, but about fixing imperfections in filming:

> The recent Cameron restorations were based on new 4K scans of the original negative, none of which needed extensive repair of that kind. [...] The A.I. can artificially refocus an out-of-focus image, as well as make other creative tweaks. “You don’t want to crank the knob all the way because then it’ll look like garbage,” Burdick said. “But if we can make it look a little better, we might as well.”

The only movies that would require upscaling to 4K are those released between roughly the mid-2000s and the mid-2010s (the advent of native digital cinema) but filmed in 2K. Everything before was shot on 35mm film, which can be scanned to 4K with information to spare; everything after is filmed in native digital 4K or more.

Moreover, upscaling that deals only with resolution has absolutely no need of AI. Any TV will decently upscale a non-4K movie in _real time_, and more sophisticated techniques can give basically indistinguishable results. 2017's _Alien: Covenant_ was deliberately produced in 2K but released in 4K through upscaling, and the image looks just great.


> The only movies that would require upscaling to 4K are those released between roughly the mid-2000s and the mid-2010s (the advent of native digital cinema) but filmed in 2K. Everything before was shot on 35mm film, which can be scanned to 4K with information to spare; everything after is filmed in native digital 4K or more.

Good to call this out, I think this is something that's really lost on people.

It really blows my mind that George Lucas, for all of his apparent obsessive concern about his films supposedly looking dated, chose to shoot Star Wars Episode II in 1080p, in contrast to Episode I on 35mm film.


I guess 1080p was the big shiny cutting-edge thing back at the time. 35mm can supposedly be scanned beyond 8K, so you could theoretically consider 4K filming not good enough either.


> In some sense this is almost tautological.

Yes. The interesting property would be the reverse proposition: what percentage of victories are granted by not blundering?

In amateur level chess, that number is very high. That's the point the author was trying to make.


No. Use integers counting the currency's smallest unit, and store the currency name alongside.
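
Something like this, roughly (the class and field names are made up, not from any particular library):

    # Sketch: money as an integer count of the currency's smallest unit,
    # with the currency carried alongside. Names are illustrative only.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Money:
        amount_minor: int  # e.g. cents for USD, yen for JPY
        currency: str      # ISO 4217 code, e.g. "USD"

        def __add__(self, other: "Money") -> "Money":
            if self.currency != other.currency:
                raise ValueError("currency mismatch")
            return Money(self.amount_minor + other.amount_minor, self.currency)

    price = Money(2256, "USD")  # $22.56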


What happens if you’re sure that four decimal places is the smallest, then suddenly a partner system starts sending you 6 decimal places?


Precision should be part of the spec for integrations. Expressing values as an integer multiple of the minimal unit makes it clear in the API what they are.

e.g. it doesn't make sense to support billing in sub-currency-unit amounts just by allowing it in your API definition, as you're going to need to batch those until you get a billable amount larger than the fee for issuing a bill. Even for something like $100,000.1234, the bank doesn't let you do a transfer for 0.34c.

For cases where sub-currency-unit billing is a thing, the minimal unit should be agreed upon (e.g. advertising has largely standardised on millicents).


Just a note that precision is part of the standard if you use ISO 4217, which defines minor-unit fractions.

https://en.wikipedia.org/wiki/ISO_4217

Or choose a different standard, I don’t know what else is out there, but you probably should choose an existing one.
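
For illustration, a few of the minor-unit exponents that table defines (verify against the current standard before relying on them):

    # A handful of ISO 4217 minor-unit exponents, for illustration only;
    # the authoritative values are in the standard's published table.
    MINOR_UNITS = {
        "USD": 2,  # cents
        "JPY": 0,  # no minor unit
        "KWD": 3,  # fils
    }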


Implementation of fixed point decimals, using multiple integer representations encoded within a floating point system. Nice.


Well I mean, if your minimal unit is 1c, then a price like $22.56 should be encoded as 2256 cents.

If you're doing ads and going for millicents, something like $0.01234 should be encoded as 1234 millicents.

Obviously you have to agree on what you're measuring in the API, you can't have some values be millicents and others cents.
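
A rough sketch of that conversion with Python's decimal module (the exponents, 2 for cents and 5 for millicents, just mirror the examples above):

    # Sketch: convert a decimal price string into an integer count of the
    # agreed-upon minimal unit, refusing anything with extra precision.
    from decimal import Decimal

    def to_minimal_units(price: str, exponent: int) -> int:
        scaled = Decimal(price).scaleb(exponent)
        if scaled != scaled.to_integral_value():
            raise ValueError("more precision than the agreed minimal unit")
        return int(scaled)

    to_minimal_units("22.56", 2)    # 2256 cents
    to_minimal_units("0.01234", 5)  # 1234 millicents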


Yeah, I am more laughing at the fact that, once encoded in JSON as { "p": 2256, "dp": 2 }, you are using two floating-point numbers. But then JSON, and indeed JS, wasn't designed.


To be clear, I wasn't advocating for flexible decimal points. There is no "dp" parameter in the solution I was proposing. It's just documented in the API that "price" is denominated in cents (or satoshis or whatever you want)


Then you should store the time as well, because the number of decimals in a currency can change (see ISK). Also, some systems disagree on the number of decimals, so be careful. And of course prices can have more decimals. And then there are cryptocurrencies, so make sure you use bigints.


You store it as an integer, but as we just saw in the OP, for general interop with any system that parses JSON you have to assume that it will be parsed as a double. So to avoid precision loss you are going to have to store it as a string anyway. At that point it's up to you whether you want to reinvent the wheel and implement all the required arithmetic operations for your new fixed-point type. Or you could just use the existing decimal type that ships on almost every mature platform: Java, C#, Python, Ruby, etc.
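
For example, in Python (the field names here are made up):

    # Sketch: keep the price as a JSON string on the wire and parse it
    # straight into a Decimal, so no intermediate double is ever involved.
    import json
    from decimal import Decimal

    payload = '{"currency": "USD", "price": "12345678901234.5678"}'
    doc = json.loads(payload)
    price = Decimal(doc["price"])  # exact, no float rounding
    # (json.loads also accepts parse_float=Decimal if the field stays numeric)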


In dollars, what do you get up to with a double of cents without precision loss? It's in the trillions, I figure? So there's a very large space of applications where smallest-denomination-as-JSON-number is going to be fine.
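
For reference, a double represents integers exactly up to 2^53, which counted in cents comes out to roughly 90 trillion dollars:

    # Integers are exact in a double up to 2**53 = 9,007,199,254,740,992.
    # Counted as cents, that is roughly 90 trillion dollars.
    print(2 ** 53 / 100)  # 90071992547409.92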


Prices can certainly have more decimals than cents.

If you just store cents, you can't represent them. You either have to guess the smallest unit at the beginning or store the precision along with it.

Just use strings, it's much simpler.


The vast majority of humanity is Poor Brain and therefore actual money is what matters.

