Hacker News | genneth's comments

I couldn't help but pursue the pun: https://github.com/genneth/monet


This is "law" from a European (EU) perspective. The foundations differ in English and US law. I've always thought it would be interesting to compare them the way computer scientists compare the design choices in different operating systems: at the top level the same outcomes are desirable, but the lower levels and choices of abstractions are different.


The book covers both. I think you were so hasty to get this criticism posted to HN that you judged the book by its cover. The author is in the UK and the book was published by Oxford, a common law jurisdiction.

For example, I turned to a random part about copyright:

"In the continental European tradition, the focus has been on the author and the work. This understanding of ‘authors’ law’ built on the Age of Romanticism of the eighteenth and nineteenth centuries, where the singularity of creative imagination of an individual author took precedence over the mundane business interests of a publisher. The idea was that ‘authors’ law’ is part of ‘natural law’ rather than being ‘posited’ by a legislator (positive law). The ‘authors’ right’, in that line of thinking, is constituted by the original act of creation of the author and should not be tied to formalities (such as registration), while the ‘work’ that is created belongs to the ‘author’s domain’. This is a matter of personality rights (droit moral or moral right), rather than a matter of ownership (as Locke would have it).

In the common law that inspired the United Kingdom and the United States, the focus was not on the author and their work, but on the original and the copy. This was less a matter of personality and romantic imagination than a matter of pragmatism. Copyright was simply a choice made by a legislator (positive law), rather than a natural right inherent in the author’s act of creation. This led to the requirement of registration and an emphasis on copyright as an economic, not a moral right. Here, copyright law is about the domain of the ‘work’ rather than the domain of the ‘author’, and such work is considered original in the sense of not being copied, rather than original in the sense of being creative or novel"


What does that have to do with actual US copyright jurisprudence?


It's really hard to compare them. Just like it's hard to compare one country to another. So many factors.


The common way people think about common law versus civil law is this:

- common law depends more on courts to make and refine legal decisions
- civil law relies more on regulators.

In civil law countries it's more common for the statutes (the governing text) to be longer and go into great detail. In common law countries you see some extremely short laws - the Sherman Act in US antitrust law, for instance, is like 2 sentences long.

That's the common understanding. These days, though, both the EU and the US are converging a bit in their approaches.


I do take your point, but the Sherman Antitrust Act is roughly one to two pages long[0].

Only wanted to provide an accurate take; I agree with your overall point though.

[0]: https://www.govinfo.gov/content/pkg/COMPS-3055/pdf/COMPS-305...


Thanks for the fact check. I'm writing on my phone so I can't do this justice, but the operative wording of the Sherman Act (or its follow-up, the Clayton Act) is something like: "unfair restraint of trade is unlawful." That's the one to two sentences I'm referring to. My point is that it's extremely limited phrasing. In the 100+ years it has been in place there have been thousands of court decisions explaining what it means - refining and defining what that sentence means in the US legal system.

If the act itself is only 2 pages that is a marvel, though. Usually they spend at least ten explaining why they are passing a law and who they are.


I wonder if this explains the propensity for lawsuits in the US. It's basically our regulation and enforcement mechanism.


Exactly. Yes. That's the idea.

The US legal system relies more on "ex post" legal enforcement - meaning, if you break the law then you get busted and you personally pay the victim. Europe is more of an "ex ante" system - they rely on regulators to strictly define exactly what the law should look like, and actually require industry to do very specific things to comply with it. If someone gets hurt, the system compensates them from a fund. The person who hurt them doesn't necessarily pay.

That's the theoretical underpinning and difference in our systems. But like I said, the systems have a bit of both these days. The EU is flirting with more class actions, and the US has more regulatory scrutiny in certain sectors - California privacy laws, for example, are very detailed.


UK doesn't seem to have quite as much suing as the US? And they also have a common law system.

In contrast, German military procurement is famous for its endless lawsuits.


One difference with the UK is that they use the 'English Rule' of fee-shifting: in English civil lawsuits, the losing party has to pay the other party's attorney's fees.

In the US, there's basically no downside of suing someone if you can keep your own costs down.


That makes sense, thanks!


I'm not sure why you feel the need to double-quote that. At any rate, the book seems to cover UK law, and in fact is published at Oxford?


I'm not sure why you're attracting downvotes for correctly stating that the way law works in the UK is very different than the way it works in continental Europe. The UK is a common law country, like the US, while many EU nations use civil law systems.

The basic mechanisms of UK law are more similar to US law than to French law. The actual laws on the books are probably the other way around, though.


Because the interval, on average, grows exponentially with the number of basic operations. So it quickly becomes practically useless.
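Assuming this refers to interval arithmetic (tracking a guaranteed lower/upper bound through a computation), here's a minimal illustrative sketch of the blowup - a made-up example, not anything from the thread:

```rust
// Toy interval multiplication: to stay sound, we must take the worst
// case over all endpoint combinations, so the width of the result
// compounds with every basic operation.
fn mul(a: (f64, f64), b: (f64, f64)) -> (f64, f64) {
    let ps = [a.0 * b.0, a.0 * b.1, a.1 * b.0, a.1 * b.1];
    let lo = ps.iter().cloned().fold(f64::INFINITY, f64::min);
    let hi = ps.iter().cloned().fold(f64::NEG_INFINITY, f64::max);
    (lo, hi)
}

fn main() {
    let x = (0.9, 1.1); // a value known only to within +/- 0.1
    let mut acc = (1.0, 1.0);
    for _ in 0..50 {
        acc = mul(acc, x); // 50 basic operations
    }
    let width = acc.1 - acc.0;
    // The bound is still sound, but it's now so wide (0.9^50 .. 1.1^50,
    // i.e. roughly 0.005 .. 117) that it says nothing useful.
    println!("{:?} width = {}", acc, width);
}
```

Fifty multiplications is enough to turn a ±0.1 uncertainty into an interval spanning two orders of magnitude, which is the "practically useless" failure mode described above.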


This is doubly unpersuasive in any common law jurisdiction. The whole point of such a system is the idea that correct outcomes are easier to arrive at in the specific than in the general, and that codification happens after precedents.

Even in civil law systems the fact is that case law is a coexistent partner to the interpretation of texts.

The fact that this guy is learning law should actually be a reason not to take this very seriously. I would be more interested in a practitioner making the case. (I am not one!) In fact, this applies strongly to the case he gives: I would bet no judge in Canada (outside of Quebec, maybe?) would take his argument seriously. The way he has approached this question honestly smacks of ivory tower academia.

If you made me steelman the argument, it would be that the usual methods predate the era of formal methods, and that the overall system could be made better. Then one would have to explain the benefit in an operationally important way for practitioners and workers in the system, usually by some convincing argument about time saved. Remember that arguing over definitions is a very small part of the time taken in the job!


Exactly this. The Platonic ideal of statutory interpretation is the “no vehicles in the park” statute. It’s simple enough on the surface but you can think of an infinite number of edge cases that demand nuance (e.g. Scooters? A war memorial featuring a tank? A wheelchair?).

Enumerating and handling these cases in a formal mathematical sense is at best futile and at worst leads to absurd outcomes. Law isn't left informal because lawyers don't know how to formalize it; it's because intelligently applied common sense is an essential part of a sane legal system. [0]

[0] For what it’s worth, it’s not as if the formalism vs. common sense debate is actually settled in the legal sphere. See, e.g., the debate about textualism vs. legislative intent in American statutory interpretation. Even the textualist position lies much, much further along the spectrum than formal specification though, and it has its own problems.


A compiler error if these are ever instantiated/need to be monomorphised? The problem is that this is very un-Rustic, in that the error becomes very non-local.

But maybe this is unavoidable once you have sufficiently complex const fn anyway...


There's a principle in Rust that only the function signature is needed to know if a function call is valid, without any need to look at the function body.

It makes things much easier to think about: at a minimum, if you change the body of a function it doesn't affect whether other code compiles or not. It's also one of the big differences between generic parameters (all in the function signature) and template metaprogramming (templates can do duck typing).
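As a sketch of that principle (a hypothetical example, not from the thread): the trait bound in the signature is the entire contract, so callers are checked against it alone and the body can change freely without breaking them.

```rust
// Callers are type-checked against `T: PartialOrd` in the signature,
// never against the body. With a C++ template, the body itself would
// impose requirements, discovered only at instantiation time.
fn largest<T: PartialOrd>(items: &[T]) -> Option<&T> {
    items.iter().fold(None, |best, x| match best {
        Some(b) if b >= x => Some(b),
        _ => Some(x),
    })
}

fn main() {
    assert_eq!(largest(&[3, 1, 4, 1, 5]), Some(&5));
    assert_eq!(largest::<i32>(&[]), None);
    // Swapping the body for a different algorithm cannot break these
    // call sites, because the signature is unchanged.
    println!("ok");
}
```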


In reality the application running on top of the database wants C-kinda-A. https://research.google/pubs/pub45855/

> Despite being a global distributed system, Spanner claims to be consistent and highly available, which implies there are no partitions and thus many are skeptical. Does this mean that Spanner is a CA system as defined by CAP? The short answer is “no” technically, but “yes” in effect and its users can and do assume CA.

The point is that if my application will actually go down anyway if the WAN craps out, then really, I don't actually need P. The application would also be significantly simpler if it assumes that whenever the application can work, the DB is also up. And it seems that realistic systems built on Spanner simply assume CA and then get on with life...


>The point is that if my application will actually go down anyway if the WAN craps out, then really, I don't actually need P.

Can you guarantee no further modification to the data in the database occurs when the WAN craps out? What happens to in flight requests, batch/scheduled jobs, etc.? There's no situation where your application continues modifying the contents of the database even when there's no WAN link?

I've seen very few real world services where there is a 0% chance of data changing just because the application servers can no longer talk with clients but can still talk with the database. What happens when those databases try to rejoin the cluster and their data is now inconsistent, if you did not take into account partition tolerance?


Since version 4, plotly isn't horrible. It's okay. It does basic things well enough that I use it almost exclusively. However, I mostly plot for personal analysis, not for presenting to others.

