
The Harmful Consequences of Postel's Maxim - ctz
https://tools.ietf.org/html/draft-thomson-postel-was-wrong-00
======
hyperpape
While this doesn't settle any of our debates, it's interesting to note that
there's a real question about whether today's debates have anything to do with
what Postel originally meant:
[http://www.cookcomputing.com/blog/archives/000551.html](http://www.cookcomputing.com/blog/archives/000551.html).

The robustness principle is so compressed that it invites the reader to
project an interpretation onto it.

~~~
StephenFalken
Great point.

The original Usenet _comp.mail.pine_ newsgroup post [1] by Mark Crispin
(father of the IMAP protocol):

    
    
      This statement is based upon a terrible misunderstanding of Postel's
      robustness principle. I knew Jon Postel. He was quite unhappy with
      how his robustness principle was abused to cover up non-compliant
      behavior, and to criticize compliant software.
    
      Jon's principle could perhaps be more accurately stated as "in general,
      only a subset of a protocol is actually used in real life. So, you should
      be conservative and only generate that subset. However, you should also
      be liberal and accept everything that the protocol permits, even if it
      appears that nobody will ever use it."
    

[1]
[https://groups.google.com/d/msg/comp.mail.pine/E5ojND1L4u8/i...](https://groups.google.com/d/msg/comp.mail.pine/E5ojND1L4u8/iYunDY6h_xMJ)
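
Read that way, the principle is about covering the whole spec, not about
tolerating violations of it. A rough sketch of the distinction, using ISO
8601 timestamps as a stand-in for "the protocol" (function names invented,
not from any of the sources):

    from datetime import datetime, timezone

    def emit(ts: datetime) -> str:
        # Conservative: generate only the plain UTC "Z" form, the subset
        # actually used in real life, even though the grammar allows more.
        return ts.astimezone(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")

    def accept(s: str) -> datetime:
        # Liberal: accept everything the format permits, including variants
        # "nobody will ever use" (lowercase t/z markers, numeric offsets).
        s = s.strip().upper()
        if s.endswith("Z"):
            s = s[:-1] + "+00:00"  # older fromisoformat() lacks "Z" support
        return datetime.fromisoformat(s)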

~~~
ironick
I don't give this reminiscence ANY credit. The very first version of Postel's
law makes it VERY clear that Postel intended his law to deal with non-
compliant behavior in a tolerant or "liberal" way: "In general, an
implementation should be conservative in its sending behavior, and liberal in
its receiving behavior. That is, it should be careful to send well-formed
datagrams, but should _accept any datagram that it can interpret_ (e.g., _not
object to technical errors where the meaning is still clear_)."

See my potted history of Postel's law:
[http://ironick.typepad.com/ironick/2005/05/my_history_of_t.h...](http://ironick.typepad.com/ironick/2005/05/my_history_of_t.html)
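
For a concrete reading of "accept any datagram that it can interpret",
consider TCP-style TLV options: a receiver can step over option kinds it
doesn't know (the length byte keeps the meaning of the rest clear) while
still rejecting a malformed length, where no safe interpretation exists. A
sketch, not taken from any particular stack:

    def parse_options(buf: bytes) -> dict[int, bytes]:
        # TCP option encoding: kind (1 byte), length (1 byte, >= 2 and
        # counting kind+length), then length-2 bytes of value.
        opts, i = {}, 0
        while i < len(buf):
            kind = buf[i]
            if kind == 0:       # end of option list
                break
            if kind == 1:       # NOP, single-byte padding
                i += 1
                continue
            if i + 1 >= len(buf) or buf[i + 1] < 2 or i + buf[i + 1] > len(buf):
                raise ValueError("malformed option")  # meaning unclear: reject
            length = buf[i + 1]
            opts[kind] = buf[i + 2 : i + length]  # unknown kinds kept, not fatal
            i += length
        return opts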

~~~
hyperpape
Nice to see that so well documented: I'd only seen the most common reference,
RFC 793.

That said, it's still unclear how far this extends: the example given is of an
unknown error code, which might lead you to think that the requirement is
"syntactically well-formed input where you can't 100% determine the
semantics." That's a far cry from the way browsers handle malformed HTML.
Similarly, you have to apply some judgment concerning what an agent can
interpret the meaning of.
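
HTTP later codified exactly that narrower reading for status codes: a client
must understand the class of a status code and treat an unrecognized code as
the x00 code of its class (RFC 7231, section 6). Roughly:

    KNOWN_STATUS = {200, 204, 301, 302, 304, 400, 403, 404, 410, 500, 503}

    def effective_status(code: int) -> int:
        # 499 is syntactically a status code but its semantics are unknown
        # here, so fall back to the generic code of its class: 499 -> 400.
        # Nothing in this rule excuses malformed syntax.
        return code if code in KNOWN_STATUS else (code // 100) * 100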

------
pierrebai
I have a hard time not being sarcastic about the author's naivete.

He does acknowledge that Postel's Maxim might be essential to any widely
deployed protocol that wants to be successful. He also acknowledges that his
alternative is inapplicable to the early life of a protocol.

The two main flaws in the reasoning are that incompatibility or bugs are not
intentional, and that success is contingent on something 'just working'. From
a thousand-foot view, you want errors, whatever their source, to propagate as
little as possible and to affect as little of the network as possible.
Postel's Maxim provides that effect. Being strict ensures that some process
somewhere, over which you have no control, will break your system.

Fortunately, it's being applied everywhere, notwithstanding purists. Your
house electrical input gets filtered to provide a standard voltage. Your
computer's power supply filters that and aims to provide stable voltage and
current. Your electronics are surrounded by capacitors... and it goes up the
stack. It's just good engineering.

~~~
gedejong
Fortunately, Postel's law is not applied everywhere. Electrical deviations are
formally specified so engineers design smaller error bounds, resulting in
lower cost of the end-product (or lower weight and size). Your computer power
supply will still produce magic smoke when connected to 330V or when subjected
to strong fluctuations. The 'liberal' part is actually the 'strict' part: your
electricity provider is obliged to provide you with an electric potential
within certain boundaries.

The reasoning of the author is simple: we want a POC of an idea ASAP (lacking
formal specifications of anomalies and error bounds) and when successful,
error bounds and boundary conditions including specifications thereof should
be communicated and implemented. That seems like a cogent and professional
point to make, given the complexity of our systems.

~~~
jacquesm
Plenty of computer power supplies will produce magic smoke at 175V too! They
interpret 'substantially less than 240V' as 'switch to 115V mode' and then
happily blow up with the excess voltage as input.

------
Animats
I've argued in the past for an intermediate position, especially for HTML.
Browsers should be moderately tolerant of bad HTML. But rather than trying to
handle errors invisibly, they should revert to a simplified rendering system
intended to get the content across without the decorative effects. After the
first error, a browser might stop processing further Javascript, display a red
band indicating defective HTML, and display all text in the default font. It
might also report the error to the server in some way.
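
As a sketch, the policy is just a bit of state the parser flips on its first
error (the names are made up; no browser exposes this):

    from dataclasses import dataclass, field

    @dataclass
    class RenderPolicy:
        run_scripts: bool = True
        author_styles: bool = True
        errors: list[str] = field(default_factory=list)

        def on_parse_error(self, msg: str) -> None:
            # First error flips the page into simplified rendering: no
            # further Javascript, default font, and a visible red band.
            self.errors.append(msg)
            self.run_scripts = False
            self.author_styles = False

        def red_band(self) -> str | None:
            if self.errors:
                return f"Defective HTML: {self.errors[0]} (+{len(self.errors) - 1} more)"
            return None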

Read through the error-recovery specification for HTML5. It's many pages of
defined tolerance for old bugs. Then read the charset-guessing specification
for HTML5, which is wildly ambiguous. (Statistical analysis of the document to
guess the charset is suggested.) The spec should have mandated a charset
parameter in the header a decade ago. If there's no charset specification,
documents should render in ASCII with hex for values > 127.
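
That fallback is easy to state precisely; something like:

    def render_undeclared(raw: bytes) -> str:
        # No charset declared: pass printable ASCII through and show every
        # other byte as an explicit hex escape instead of guessing.
        return "".join(
            chr(b) if 32 <= b < 127 or b in (9, 10, 13) else f"\\x{b:02x}"
            for b in raw
        )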

~~~
dagw
You've got two browsers to choose from: one that handles every site you visit
without a problem, and one which throws a bunch of obscure error messages on
about 20% of the sites you visit.

Which do you think most people will choose?

~~~
femto113
I think this would have been great if done from the beginning, but even in
early versions of Mosaic malformed HTML would still appear "correct" visually,
and since once it _looked_ ok most people figured it _was_ ok, we've been
buried under broken HTML from the start. The idea that the browser "handles
every site without a problem" is slightly misleading, though, since even if
everything looks ok the user is paying a price in lower performance and a
slower pace of innovation, as browser developers devote huge amounts of time,
money, and attention to not puking on all that broken HTML.

~~~
comex
To a large degree this is the crux of it. Nobody bats an eye if a misplaced quote
somewhere in a Python program causes the whole program to fail to start, but
XHTML breaking pages on syntax errors was considered a terrible idea because
the old way worked fine(tm).

However, Python source code is not typically dynamically generated, while HTML
is, increasing the probability of errors the site author could not trivially
predict and the user can do nothing about.

------
datenwolf
_groan_ …

"Fail early and hard, don't recover from errors" is a recipe for disaster.

That principle applied to critical systems software engineering leads to
humans getting killed. E.g. in aerospace the result is airplanes falling out
of the sky. Seriously. The Airbus A400M that recently crashed in Spain did so
because, somewhere in the installation of the engine control software, the
control parameter files were rendered unusable. The engine control software
failed hard, even though this would have been a recoverable error (just have a
set of default control parameters hardcoded into the software, putting the
engines into a fail-safe operational regime); instead, the engines shut off.
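
The recoverable-error version described above would look roughly like this
(parameter names invented for illustration):

    import json

    # Hardcoded fail-safe regime: conservative, but keeps the engines running.
    FAILSAFE = {"max_thrust_pct": 75.0, "fuel_trim": 0.0}

    def load_engine_params(path: str) -> dict:
        try:
            with open(path) as f:
                params = json.load(f)
            if not isinstance(params.get("max_thrust_pct"), (int, float)):
                raise ValueError("missing or invalid max_thrust_pct")
            return params
        except (OSError, ValueError):  # JSONDecodeError is a ValueError
            # Unusable parameter files are a recoverable error: degrade to
            # the fail-safe defaults instead of shutting the engines down.
            return dict(FAILSAFE)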

In mission and life critical systems there are usually several redundant core
systems and sensors, based on different working principles, so that there's
always a workable set of information available. Failing hard renders this kind
of redundancy futile.

No, Postel's Maxim holds as strong as ever. The key point here is: "Be
conservative in what you send", i.e. your implementation should be strict in
what it subjects other players to.

Also, being strict in what you expect can easily be exploited to DoS a system
(Great Firewall RST packets, anyone?)

~~~
pvg
_E.g. in aerospace the result is airplanes falling out of the sky. Seriously._

You're misinterpreting 'fail fast' - it doesn't mean 'entire system should
fail catastrophically at slightest problem' or 'systems should not be fault-
tolerant'. It just means that components should report failure as soon as
possible so the rest of the system can handle it accordingly instead of
continuing operation with an unrecognized faulty component leading to
unpredictable outcomes.
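
In other words: fail fast at the component boundary, stay fault-tolerant at
the system boundary. A toy illustration (the sensor interface is invented):

    class SensorFault(Exception):
        pass

    def airspeed(primary, backup) -> float:
        # The component fails fast: a sensor that flunks its self-check
        # raises immediately rather than returning a plausible-looking lie.
        try:
            return primary.read()  # may raise SensorFault
        except SensorFault:
            # The system stays fault-tolerant: switch to the redundant
            # sensor now that the fault was reported instead of hidden.
            return backup.read()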

~~~
jerf
In particular, "Just Sort Of Keep Going And Hope It All Works Out Eventually"
is also a great way to crash planes and kill people, if we're going to pretend
that everything's safety critical.

------
nabla9
I have always felt that Postel's Maxim combined with network effects leads to
complications in the long term, even as it promotes interoperability in the
short term.

It's a game-theoretically successful strategy to get your implementation to
work with everyone. But when you accept sloppy input, you allow sloppy
implementations to become popular.

Eventually the de facto protocol becomes unnecessarily complicated, and you
need to understand the quirks of popular implementations.

------
TeMPOraL
Like the equivalent rule for human interaction, this is a coordination problem
and it fails in the same way - when you're nice to people, you get overrun by
assholes. People who don't care about being "conservative in their sending
behaviour". This rule is really great _if_ you can make everyone stick to it.
In a competitive environment, it's impossible without some additional way to
ensure assholes will be punished.

------
kabdib
Basically, we shouldn't issue standards or RFCs without meaningful test
vectors and tests, updated if bugs are found in them (or in the RFC).

Expecting someone to (say) read the HTTP spec and write a compliant
implementation without tests that everyone else is using as well is lunacy,
and leads to the nightmare we have today.
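
Concretely, a spec could ship machine-readable vectors that every
implementation runs. A hypothetical fragment for an HTTP status-line parser:

    # Each vector: (bytes on the wire, expected parse result or "error").
    VECTORS = [
        (b"HTTP/1.1 200 OK\r\n", ("1.1", 200, "OK")),
        (b"HTTP/1.1 200 OK\n", "error"),     # bare LF: reject
        (b"HTTP/1.1 0200 OK\r\n", "error"),  # status code is exactly 3 digits
    ]

    def check(parse) -> None:
        for wire, expected in VECTORS:
            try:
                got = parse(wire)
            except ValueError:
                got = "error"
            assert got == expected, (wire, got, expected)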

Standards without engineering to back them up are bad.

Side effect: Committees that produce "ivory tower" standards that are
unimplementable will find that their work is ignored.

Another side effect: Standards will get simpler, because over-complex nonsense
will be obvious once the committee gets down to making an exemplar actually
work.

Not that it will ever happen...

[I helped write an HTTP proxy once. The compliant part took a couple weeks;
making it work with everyone else's crappy HTTP implementation was a months-
long nightmare on rollerskates]

------
InclinedPlane
What is missing from this is the ability to toggle a "strict mode" in a
browser, or even a "reference implementation" browser. Right now developers
have to rely on their knowledge alone, combined with the simple fact of
whether or not their work seems to render correctly on the major browsers.
That is where you run afoul of Postel's Maxim, because it should be possible
to use these tools (like browsers) in a way that provides the sort of "hey,
don't do that" feedback during development, even though during normal
operation the browser should do its best to make do with whatever it's given.
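
A toy version of that toggle, with one made-up lint standing in for a real
validator:

    import logging
    import re

    log = logging.getLogger("refbrowser")

    def lint(html: str) -> list[str]:
        # Toy check: flag unquoted attribute values (one of many possible lints).
        return [f"unquoted attribute value: {m.group(0)!r}"
                for m in re.finditer(r'\w+=(?![\'"])[^\s>]+', html)]

    def parse(html: str, strict: bool = False) -> str:
        problems = lint(html)
        if strict and problems:
            raise ValueError(problems[0])  # development: "hey, don't do that"
        for p in problems:
            log.warning(p)                 # production: tolerate, but say so
        return html                        # stand-in for real parsing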

That same pattern exists elsewhere too: people often need to do "API science"
to figure out how to use various tools, with the common result that they
discover how to use those tools in ways that seem effective but are incorrect.

------
sytelus
I think both of these philosophies represent extremes.

This one means forgive all mistakes: _Be liberal in what you accept, and
conservative in what you send._

This one means forgive no mistakes: _Protocol designs and implementations
should be maximally strict._

I would suggest an alternative, forgive most mistakes, but always let them
know they made a mistake:

_Be conservative in what you send, and as liberal as possible in what you
accept, but always let them know what they could have done better._
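
Sketched as an API, that third option is a parse that returns warnings
alongside the value (illustrative only):

    def parse_header(line: str) -> tuple[str, str, list[str]]:
        warnings = []
        if line.endswith("\r\n"):
            line = line[:-2]
        elif line.endswith("\n"):
            # Accept the deviation, but surface it to the sender instead
            # of silently normalizing it away.
            warnings.append("expected CRLF line ending, got bare LF")
            line = line[:-1]
        name, sep, value = line.partition(":")
        if not sep:
            raise ValueError("not a header field")  # meaning unclear: reject
        return name.strip().lower(), value.strip(), warnings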

~~~
eridius
Does that solve interoperability though? If you accept it and let them know,
you're still accepting it, meaning you're still going to tolerate that
protocol implementation continuing its bad behavior in the future. And if the
protocol implementation becomes widely-used, then we have the exact same
maintenance issue as we do with Postel's Maxim.

------
jmount
My take on this: [http://www.win-vector.com/blog/2010/02/postels-law-not-
sure-...](http://www.win-vector.com/blog/2010/02/postels-law-not-sure-who-to-
be-angry-with/)

Though I am beginning to think it comes down to "do you want to make it easy
to run on a dev-box or easy to run in production?"

------
poofyleek
Being liberal in acceptance was never meant to discount conservatism in
output. If even one side of the maxim were diligently followed, there would be
fewer issues. For example, if most implementations were conservative in
sending, that alone would be sufficient; allowing for liberal handling of
input then compensates for those who did not follow the conservatism in
output. Postel can't easily be blamed for long-term entropic outcomes,
accelerated by the abundance of implementations that ignore both sides of the
maxim.

------
pjc50
"Conservative in what you accept" works fine if the first implementation
shipped is complete and bug-free. If you have two implementations that have
incompatible bugs then third parties have to detect who to be bug-compatible
with. If your system is not forwards compatible (easily achieved in HTML by
ignoring new elements), then you have to do version detection as well.

------
protomyth
So, if the original spec for HTML had rejected ill-formed pages and enforced
nesting and end tags, would the web be better today?

I think the lack of formality actually hurt non-technical users, because it
made tools harder to program.
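
A strict-nesting checker is only a few lines, which suggests the early tools
could have enforced it; a sketch using Python's stdlib parser:

    from html.parser import HTMLParser

    VOID = {"br", "hr", "img", "input", "link", "meta"}

    class StrictNesting(HTMLParser):
        # Reject mis-nested or stray end tags instead of auto-repairing.
        def __init__(self):
            super().__init__()
            self.stack = []

        def handle_starttag(self, tag, attrs):
            if tag not in VOID:
                self.stack.append(tag)

        def handle_endtag(self, tag):
            if not self.stack or self.stack.pop() != tag:
                raise ValueError(f"mis-nested or stray </{tag}>")

    StrictNesting().feed("<b><i>oops</b></i>")  # raises ValueError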

~~~
walshemj
But you then end up with OSI instead of TCP/IP and the internet would look
very different.

