Law already is a programming language. It just has (deliberate) compiler issues.
Where law is unclear, the drafters of the law were unclear about what they wanted, could not agree on what they wanted, or were unwilling to state explicitly what they wanted.
I don't think we need to rewrite our laws in software-friendly language.
We shall soon enough need law and software to go hand in hand - the obvious example is the self-driving car and the trolley problem: a car sees a child ahead in the road and cannot brake in time, so it must swerve, but swerving will kill the driver (or an old man on the sidewalk).
That must be explicitly coded into the car ahead of time - and that will need to be a regulation / law of the jurisdiction, because no car company will simply make that choice itself.
And so we are presented with the perfect problem - it will demand democratic involvement, it is impossibly hard ethically, and it will need the software to be inspectable / verified before and after the incident.
If we solved those problems, then the recent Post Office debacle would have turned out differently.
I agree on all counts; however, I've always taken "Law is a programming language" or, maybe better, "Law is a program" as more of a metaphor. I see many similarities between law and computer programs; it's just that a law is a program written for humans (judge, jury, ...) to interpret, whereas a computer program is written for a computer.
Thus I consider any attempt to translate a written law into something a computer can interpret a failure, because the interpreter is not optional in law. The interpreter is a vital part of it, and without it written law does not make much sense at all. Still, it is, in my view, a program which can be "executed", just not without a lot of context.
Actually, software can be "interpretable" as well. It just means that the outcome is not "he has to be convicted because of X and Y" but instead "if X and Y are relevant then he should be convicted". This means software cannot handle a case automatically; it needs to be interpreted - however, the interpretation will be more formal and less handwavy.
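To make that concrete, here is a minimal sketch (the offence, the rule, and the predicate names are hypothetical, not taken from any statute) of what a formal-but-still-interpreted rule could look like: the program encodes the logical structure, while a human interpreter supplies and justifies the factual findings.

```python
from dataclasses import dataclass

@dataclass
class Finding:
    """A human judgement about one factual predicate, plus its justification."""
    holds: bool
    reasoning: str

def theft_rule(took_property: Finding, intended_to_deprive: Finding) -> str:
    """Return a conditional conclusion, not a verdict: the software only states
    what follows *if* the human findings are accepted."""
    if took_property.holds and intended_to_deprive.holds:
        return ("IF the accused took the property AND intended to permanently "
                "deprive the owner, THEN the offence is made out.")
    return "On the findings supplied, not all elements of the offence are satisfied."

# The interpreter (judge, jury, ...) supplies the contextual, handwavy part:
print(theft_rule(
    Finding(True, "CCTV shows the accused taking the bag"),
    Finding(False, "The accused plausibly believed the bag was abandoned"),
))
```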
> Software is an axiomatic system. Human behavior is NOT.
It really isn't though. When I write software I have to consider the different interpretations by the OS, browser, etc. and draft accordingly. Ditto for when I draft legal documents and need to consider the different interpretations by opposing counsel, a regulator, etc. As someone who drafts legal documents for humans and writes code for computers I can tell you first-hand the two processes are remarkably similar.
I think the concept that is most useful here is that of fuzzy logic: rather than if statements requiring something that returns 1 or 0, conditions take a probability p ∈ [0, 1]. Some things are essentially certain. Others are not. Of course, the main point shown at trials is that these probabilities are themselves uncertain...
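As a toy illustration of that framing (my own sketch, with made-up numbers, using min() as the fuzzy conjunction - one common choice among several):

```python
def boolean_verdict(elements: list[bool]) -> bool:
    # Classical if-statement logic: every element must be strictly true.
    return all(elements)

def fuzzy_verdict(degrees: list[float]) -> float:
    # Fuzzy conjunction via min(): the conclusion is only as strong as the
    # weakest element of the case.
    return min(degrees)

case = {"took_property": 0.95, "intended_to_deprive": 0.6}
print(boolean_verdict([d > 0.5 for d in case.values()]))  # True
print(fuzzy_verdict(list(case.values())))                  # 0.6
```

The boolean version hides the difference between "almost certain" and "barely more likely than not"; the fuzzy version keeps that uncertainty visible all the way to the conclusion.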