This article claims that 1e2 is interpreted as a string, while this other article on the front page[0] claims that 556474e378 is interpreted as a number. What's correct?

The YAML "Scalars" section[1] says:

> A few examples also use the int, float and null types from the JSON schema.

And includes these examples:

    canonical: 1.23015e+3
    exponential: 12.3015e+02
So, is the "+" required here or not? Is a YAML parser buggy if it doesn't parse all JSON numbers as numbers?

Edit: Ah, further on, it says:

    Canonical Form
    Either 0, .inf, -.inf, .nan or scientific notation matching the regular expression
    -? [1-9] ( \. [0-9]* [1-9] )? ( e [-+] [1-9] [0-9]* )?
The example 1e2 clearly matches this regex, so his YAML parser is broken.

Edit edit:

In YAML 1.1, there were separate definitions of float[2] and int[3] types, where only floats support "scientific" notation, and must have a ".", unlike JSON.

So this article is talking about YAML 1.1, while the other article is talking about YAML 1.2. (A quick PyYAML check after the links below illustrates the difference.)

[0] https://news.ycombinator.com/item?id=41498264

[1] https://yaml.org/spec/1.2.2/#23-scalars

[2] https://yaml.org/type/float.html

[3] https://yaml.org/type/int.html
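
For illustration, a quick check with PyYAML, which implements the YAML 1.1 resolvers (a sketch, assuming PyYAML is installed; a YAML 1.2 core-schema parser would resolve 1e2 as a float instead):

    import yaml  # PyYAML uses YAML 1.1 implicit resolvers

    # No "." and no exponent sign, so the 1.1 float regex does not match:
    print(repr(yaml.safe_load("1e2")))     # '1e2' (a string)

    # With a "." and a signed exponent, it resolves as a float:
    print(repr(yaml.safe_load("1.0e+2")))  # 100.0

    # Under the YAML 1.2 core schema, 556474e378 resolves as a float,
    # which overflows to infinity:
    print(float("556474e378"))             # inf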


What's "correct" depends on whether your YAML parser defaults to YAML 1.1 or 1.2.

Most YAML parsers default to 1.1 for compatibility reasons: if they defaulted to 1.2, existing YAML documents expecting 1.1 behavior would be parsed incorrectly.

YAML is a difficult language to parse if you care about getting the correct data.


Which only goes to show that every replacement for XML eventually ran into exactly the same complexity issues it set out to solve in the first place. Another example of why "just use..." is a losing approach.


> The example 1e2 clearly matches this regex, so his YAML parser is broken.

1e2 does not match this regex. 1e+2 or 1e-2 would, though.


The "canonical" form indeed requires an exponent sign, but the tag resolution process in YAML 1.2 (section 10.2.2) does allow its omission. That said, the equivalent specification in YAML 1.1 [1] requires an exponent sign in all cases! It should be no surprise, then, that YAML 1.1 isn't a superset of any version of JSON, but I don't know whether this is intentional or simply an oversight.

[1] https://yaml.org/type/float.html


YAML 1.2.2 still isn't a superset of JSON, because it requires all keys to be unique:

> The content of a mapping node is an unordered set of key/value node pairs, with the restriction that each of the keys is unique
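
Concretely, Python's json module accepts duplicate keys (the last one wins), while a strictly conforming YAML parser should reject the same document. A sketch; note that PyYAML in practice doesn't enforce uniqueness either:

    import json
    import yaml

    doc = '{"a": 1, "a": 2}'

    print(json.loads(doc))      # {'a': 2} (valid JSON; the last key wins)
    print(yaml.safe_load(doc))  # {'a': 2} (PyYAML accepts it too, despite the spec)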


While JSON doesn't prevent duplicate keys per se, it doesn't fully specify their semantics and only states that duplicate keys are less "interoperable". And there is an explicit profile of JSON with this requirement (I-JSON [1]), so YAML 1.2.2 can be said to be a superset of some version of JSON.

[1] https://datatracker.ietf.org/doc/html/rfc7493


"X is a superset of some subset of Y" is a weak statement.

This is not about semantics, it's about grammar. While it's fair to say that JSON "usually" is valid YAML, it's still good to be strict about it, because the existence of a single counterexample can be used maliciously.


Agreed, though it is more like "X is a superset of some common but not de jure interpretation of Y". The real culprit is the ambiguity of Y...


A JSON implementation that rejects non-unique keys is a valid JSON implementation, so a valid YAML 1.2 implementation is still a valid JSON implementation.
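
A sketch of such an implementation, using the standard json module's object_pairs_hook to reject duplicates:

    import json

    def reject_duplicates(pairs):
        result = {}
        for key, value in pairs:
            if key in result:
                raise ValueError(f"duplicate key: {key!r}")
            result[key] = value
        return result

    json.loads('{"a": 1, "b": 2}', object_pairs_hook=reject_duplicates)  # fine
    json.loads('{"a": 1, "a": 2}', object_pairs_hook=reject_duplicates)  # ValueError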


Some things in the world of JSON depend on order, too. There is a Microsoft convention for indicating the type of an object using a key that must appear first in the JSON text.
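
For what it's worth, Python's json.loads preserves document order (dicts keep insertion order), so a parser can honor such a convention. A sketch, where "$type" stands in for whatever discriminator key the convention actually uses:

    import json

    obj = json.loads('{"$type": "Square", "side": 4}')

    # The convention: the type discriminator must be the first key.
    first_key = next(iter(obj))
    assert first_key == "$type"
    print(obj[first_key])  # Square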


Same situation as above - a JSON parser that ignores order is a valid JSON implementation.


And such a parser could work with that convention. However, if its complementary formatter printed the objects such that that type field doesn't appear first, it wouldn't be correctly implementing the specification.


> So this article is talking about YAML 1.1, while the other article is talking about YAML 1.2.

Precisely. The article is really noticing quirks and limitations of libyaml, the library doing the heavy lifting behind PyYAML, not YAML-the-spec proper.

Granted, in practice, library limitations are probably what you want to know about. AFAIK, libfyaml[0] (not libyaml) is the most spec-compliant library around. It's a shame more downstream languages aren't using it.

[0] https://github.com/pantoniou/libfyaml


No, the YAML parser is a valid YAML 1.1 parser, where this behaviour is totally correct and in spec.


Rust has a great package manager, so moving libs into std doesn’t bring much benefit.

On the other hand a change like this perf improvement can be released without tying it to a language version, that’s good too.


> On the other hand a change like this perf improvement can be released without tying it to a language version, that’s good too.

And you pay for that by having literally no way to parse something as ubiquitous as JSON out of the box on install, relying either on installing a third-party lib (which is yet another security attack vector, requires yet another approval for upgrade, and whose API can change on a maintainer's whim, among other cans of worms) or on using another language.


I think you can consider a few extremely common crates (serde, tokio, etc.) to basically not be "third-party". The risk that dtolnay will randomly break serde_json is not meaningfully different from the risk that the rust core team will randomly break the standard library.

> requires yet another approval for upgrade

Approval from whom?


The “inherently” means it's not just a matter of defaults, i.e., the runtime has to support moving green threads between OS threads itself.


That's not how I use "inherent"; maybe they should just say "default" then?

But that's like... C is also single threaded by default, what isn't?


They will never be transparently/fundamentally managed by the OS alone. The runtime will need to determine how to juggle green threads across multiple OS threads. In that way, this mapping is not inherent.

It can be designed around, but that itself is a runtime design decision, and I would not say it's akin to default vs. custom.


With respect, it's not particularly relevant how you use "inherent". It's a standard usage. Rather than asking the whole rest of the world to change, you should probably learn the definition.


“Inherently” means “intrinsically”, meaning it’s a characteristic that can’t be changed without changing the nature of the thing. It doesn’t mean “by default”.


Ah, that makes sense, thanks!


Of course logic gates apply logical reasoning to solve problems, they are not much use for anything else (except as a space heater if there are a lot of them).


"Reasoning" implies the extrapolation of information - not the mechanical generation of a fixed output based on known inputs. No one would claim that a set of gears is "reasoning" but the logic gate is as fixed in it's output as a transmission.


Inside a constructor you can access a partially initialised "this" value, and even call methods on it, which leads to rules like "Do not call overridable methods in constructors"[0], since such calls can cause surprising, non-local bugs.

Rust has functions associated with types which are conventionally used like constructors, but critically the new objects must have all their fields provided all at once, so it is impossible to observe a partially initialised object. (A Python sketch of the constructor trap follows the link below.)

[0] https://learn.microsoft.com/en-us/dotnet/fundamentals/code-a...
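
The same trap is easy to demonstrate in Python, where __init__ can likewise call an overridden method on a partially initialised self (a minimal sketch):

    class Base:
        def __init__(self):
            self.greet()  # dynamic dispatch: calls the override below

        def greet(self):
            print("base")

    class Derived(Base):
        def __init__(self):
            super().__init__()   # Base.__init__ calls greet() here...
            self.name = "world"  # ...but name is only assigned afterwards

        def greet(self):
            # AttributeError: 'Derived' object has no attribute 'name'
            print("hello", self.name)

    Derived()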


Virgil solved this a little differently. The initialization expressions for fields (outside of constructors), as well as the implicit assignment of constructor parameters to fields, happen before super constructor calls. Such initialization expressions cannot reference "this"; "this" is only available in _constructor bodies_. Initializing fields before calling super, plus the chaining of super calls, guarantees the whole chain of super constructor calls will finish before entering the body of a constructor, with all fields initialized. Thus, by construction, virtual methods invoked on "this" won't see uninitialized fields.

https://github.com/titzer/virgil/blob/master/doc/tutorial/Cl...


You can most likely use session types to soundly observe a partially initialized MaybeUninit<MyObject> in Rust. The proper use of session types could ensure that the object is only assumed to be initialized after every field of it has been written to, and that no uninitialized fields are ever accessed in an unsound way. The issue, though, is that this is not automated in any way: it requires you to write custom code for each case of partial initialization you might be dealing with.


> Where's the paradox?

Exactly. I propose that the paradox is in first-past-the-post voting: a 5% swing can lead to a 100% change in representation. How can that be?


Arrow’s impossibility theorem only applies to ranked choice voting systems.


Yes, but what I mean is that even if you move away from FPTP voting, the others all have compromises.


But for many, the drawbacks of ranked choice systems are far preferable to those of FPTP. Additionally, Arrow's theorem only states that the spoiler effect cannot be completely eliminated by ranked choice; it says nothing about how often such an event actually occurs, and in practice spoiler candidates will occur less frequently with ranked choice systems than with FPTP.

Additionally, rated choice voting systems are not subject to Arrow's theorem.


... which are almost never used in voting for governments.


FPTP is a ranked choice system.


No it isn't.


https://en.m.wikipedia.org/wiki/Ranked_voting

> The most commonly-used example of a ranked-choice system is the familiar plurality voting rule, which gives one "point" (vote) to the candidate ranked first, and zero points to all others (making additional marks unnecessary).


This is not what is usually meant with the term (plurality voting is different from ranked voting where there are non-trivial preference orderings) and in any case not what the Arrow theorem applies to.


Arrow’s theorem defines a ranked voting system as a function that takes a permutation of the candidates for each voter and outputs a single permutation of the candidates. As a special case, if you take the function which sorts the candidates by how many voters ranked them as first, you get plurality voting.
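
A sketch of that special case in Python: each ballot is a full ranking, but the rule only awards a point to the first-ranked candidate:

    from collections import Counter

    def plurality(ballots):
        # Each ballot is a full ranking (tuple of candidates, best first).
        candidates = {c for ballot in ballots for c in ballot}
        first_place = Counter(ballot[0] for ballot in ballots)
        # Output a single aggregate ranking, most first-place votes first.
        return sorted(candidates, key=lambda c: first_place[c], reverse=True)

    ballots = [("A", "B", "C"), ("A", "C", "B"), ("B", "A", "C")]
    print(plurality(ballots))  # ['A', 'B', 'C']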


And that makes plurality voting different from ranked voting where the voters can submit arbitrary preference orderings. In any case, as I said, the Arrow theorem doesn't apply to plurality voting, as there can be no cycles in the aggregate.


Only if you have infinite voters.


That was a pretty interesting twist in the story. There is a check, but due to a confusion in semantics it checks whether the user is root (uid == 0) instead of checking the pointer!


I’d recommend testing the Apply step in a staging environment before rolling out.

For releases, the plans don’t need to be committed to version control. Instead, they can be generated in CI just before release, and reviewing the plan goes on the checklist before confirming the release to prod.


FTA:

> The Houthis first attacked commercial ships connected to Israel, expanding to those with ties to the U.S. and United Kingdom following coordinated strikes by the two countries, in partnership with other nations. The Yemen-based group then began to attack any ship going to or from Israel, as well, before declaring they would attack nearly any commercial ship transiting through the Red Sea, as well as the Indian Ocean and Arabian Sea, USNI News previously reported. The Houthis have also claimed they would target some ships in the Mediterranean Sea.


> Hiding the truth seems like the exact opposite of that.

It seems like “true defamation” could be a lie of omission. If someone harps on about a past misdeed but omits N years of atonement, that’s misleading, even if true.


You can historically prove a misdeed, but how do you prove atonement other than a prison sentence? Even then, how does that prove change?


If the goal of your prison system is rehabilitation, then the fair thing to do is to assume they changed. Possibly even if the only purpose of imprisonment was punishment.

This depends on risk, of course. You and society shouldn't be exposed to undue risk because of that assumption. But limitations on the offender's freedom to, for instance, perform certain jobs after their release should be decided by a court, not the general population.


> but how do you prove atonement

Accumulated years of not repeating the mistake. And people will disagree on how many are necessary.


> Accumulated years of not repeating the mistake.

> And people will disagree on how many are necessary.

I think you've hit the nail on the head: there are unwritten, widely varying timeframes that people "need to have met" for atonement. It isn't an easy solution.

The radiation continues to have an effect long after the nuclear explosion.

