The IETF considered replacing Igoe. No consensus was reached. People with lots of IETF institutional credibility (i.e., people who had been instrumental in previous standardization efforts, ones in which nobody entertained tampering concerns) strongly disagreed with his ouster.
Also, from a political perspective, the effort to replace Igoe was hamstrung by the lack of anyone in the universe raising their hand and saying "I'm a respected cryptographer AND I want to punch myself in the face continuously for several years by chairing CFRG".
The less reverence the Internet has for the cryptographic wisdom of the IETF, the better off I think we all are.
I thought he explained pretty clearly that this is not a unanimous opinion of the IETF, and that there are very vocal groups that oppose it. Firing the chair of an IETF crypto research group would have only given those groups more ammunition, and probably would have prevented this RFC from ever being published.
That said, the very fact that the RFC was published sends a very strong message to those contingents about the IETF's priorities, and I for one am very happy to hear it.
Why? It's extremely common for industry/standardization groups like the IETF to come to conclusions that are contrary to the position of one of its members, and it usually doesn't result in said member being expelled or falling on their sword.
> if your application doesn’t support privacy, that’s probably a bug in your application.
Amateur radio is explicitly not for traffic that needs to remain private. It exists for limited purposes not including routine communication that can be served by other means (e.g. a phone or ordinary internet connection). It is chiefly for education and research/experimentation in radio. It is not for general personal communications or commercial use.
The applicable rule in the US says:
"(a) No amateur station shall transmit: [...] messages encoded for the purpose of obscuring their meaning"
This serves to ensure the amateur radio service is not used in violation of its rules and purpose.
The rule has exceptions elsewhere in Part 97, for example for remote control of satellites and model aircraft. And FCC rules as a whole pretty much go out the window when transmissions are for the purpose of protecting the immediate safety of life or property.
The rules are also susceptible of a particular interpretation: You can use encryption, provided the algorithm is documented, and you keep a record of the keys used. This has been used to block non-amateur access to WiFi access points operating within the ordinary WiFi band, but under Part 97 rules (e.g. non-FCC-approved equipment, or higher power than allowed for unlicensed users).
The rule also does not in any way prevent use of authentication and message integrity mechanisms, e.g. HMAC, because they are not intended to obscure the meaning of the message, merely authenticate it.
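To make that distinction concrete: an HMAC-authenticated message is transmitted entirely in the clear, with only a short tag appended, so nothing about its meaning is obscured. A minimal sketch in Python's standard `hmac` module (the callsign, message, and key below are made up for illustration):

```python
import hmac
import hashlib

# Hypothetical shared key between two stations. The key authenticates the
# message but hides nothing -- the message itself goes out as cleartext.
key = b"shared-station-key"
message = b"KD2ABC: gate relay OPEN"  # callsign and command, fully readable

# The tag is transmitted alongside the cleartext message.
tag = hmac.new(key, message, hashlib.sha256).hexdigest()

def verify(key: bytes, message: bytes, tag: str) -> bool:
    """Receiver recomputes the tag and compares in constant time."""
    expected = hmac.new(key, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

print(verify(key, message, tag))  # True for an unmodified message
```

Anyone listening can read the message; the tag only lets the intended receiver reject forged or altered commands, which is exactly the "authenticate, don't obscure" use the rule permits.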
If you need private communication, there are other avenues available than the amateur radio service. And if you want greater freedom for unlicensed use of the airwaves than now exists, you'll have my support in principle (there are real problems with a free-for-all, but there are myriad ways FCC rules and spectrum allocation practices could be greatly improved in this regard). But this rule is not a bug, it is a deliberate feature of the amateur radio service.
IANAL. The ham radio community needs to raise this with the FCC. This section was originally constructed a long time ago (1993?). I'm guessing it's biased this way to stop 1940s-era spies from operating.
The regulations prohibit various forms of commercial use, e.g. preventing a cab company from moving its dispatch onto the 2m ham band. That's a reasonable policy, since spectrum is limited and commercial parties could easily overrun it: but if the traffic is encrypted, how is community self-policing supposed to function?
Personally I'd like to see the regulations adopt special rules for highly directional or low-power, limited-range signals in the SHF+ bands, where there is plenty of spectrum, dropping all the content rules beyond requiring cleartext contact information. Without competition for spectrum the balance of interests is different, and it would be nice to be able to lawfully backhaul community internet access over some chunk of spectrum up at 3cm. Since no one would likely notice or care, you could already use crypto in these places, so it might as well be made permitted.
The amateur radio community is not universal in their dislike for this rule. I don't personally see any way in which the rule could be removed without altering the fundamental character of the amateur radio service.
You simply should not be making transmissions in the amateur radio service that require privacy. Permitting unrestricted encryption makes that basically impossible to enforce.
I’m too lazy to look it up, but I would be surprised if many of the founders of the internet were not ham operators. Again, I’m being lazy about looking it up, but, if not specifically in writing then at least in spirit, I believe one of the purposes of the amateur radio program has been to advance wireless communication and technology.
Right now strong encryption and authentication are where most of the efforts in the field seem to be focused. It should be at the forefront of the experimentation being done by amateur radio operators.
Not that I really have an answer to the problem of bandwidth abuse. I completely understand how this would be a problem and have no doubt that it would be abused.
Yes, some hams have been saying "please don't encrypt all Internet protocols because then I wouldn't be able to enjoy my hobby of running those protocols over ham radio", which seems incredibly selfish.
It'll be easy enough to run vintage protocols (like unencrypted HTTP or IP) over amateur radio, and bridge them to secure protocols on the other end. Amateur radio links just won't be able to route end-to-end-encrypted traffic, and that doesn't seem like a severe limitation. From what I've seen, amateur radio links typically form the last hop in the chain between a base station and a remote endpoint, so just terminate the end-to-end connection at the base station and transmit unencrypted to the endpoint.
If they've said "please don't encrypt", they're making a much broader request than is necessary to comply.
The only thing we really need is for a protocol to have some way for us to transmit a callsign in cleartext (otherwise there would have to be a break in data exchange every ~10 minutes for transmission of a callsign unencrypted). On WiFi this gets accomplished by setting the SSID to our callsign.
The actual data could be encrypted, but we would have to record (and arguably publish) any keys (including session keys) used in the process.
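One way to reconcile opaque payloads with the periodic identification requirement is to interleave cleartext ID frames into the data stream in software. A rough sketch of that idea (the callsign and frame format are invented; Part 97.119 requires identification at least every ten minutes, i.e. 600 seconds):

```python
import time

class CallsignBeaconer:
    """Interleaves cleartext station-ID frames into a data stream so the
    callsign goes out at least every `interval` seconds, even when the
    data frames themselves are encrypted."""

    def __init__(self, callsign: bytes, interval: float = 600.0,
                 clock=time.monotonic):
        self.callsign = callsign
        self.interval = interval
        self.clock = clock            # injectable for testing
        self.last_id = float("-inf")  # force an ID on the first frame

    def frames(self, chunks):
        for chunk in chunks:
            now = self.clock()
            if now - self.last_id >= self.interval:
                yield b"ID " + self.callsign  # cleartext identification
                self.last_id = now
            yield chunk                        # payload (may be encrypted)

# Demo with a fake clock: three payloads sent at t=0, t=100, and t=700.
fake_clock = iter([0.0, 100.0, 700.0]).__next__
demo = CallsignBeaconer(b"KD2ABC", interval=600.0, clock=fake_clock)
frames_out = list(demo.frames([b"data1", b"data2", b"data3"]))
print(frames_out)
```

The ID frame lands before the first payload and again once the interval has elapsed, so monitors can always attribute the transmission even if they can't decode the payloads.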
Internet protocols have been used by ham radio operators for decades. 44/8 was allocated to amateur packet radio in the 1970s. And as I alluded to in my comment, the same WiFi you probably used to post your comment is also frequently used under Part 97 (amateur radio) rules.
" . . . the IETF is putting this stake in the ground in May of 2014."
This isn't much of a stake in the ground, but it's a start.
So far, the disclosures have involved the NSA and GCHQ: intercepting hardware and modifying it; strong-arming companies into "coöperating"; pushing weaknesses known only to them into standards; and spending tens of billions to copy most of the Internet and have server farms sort it.
To the contrary, I think forcing every proposal to at least confront pervasive monitoring, and how it may affect what is being proposed, will force authors to be more conscious of PM and security in general with regard to their proposals. It's not the job of the IETF to stop monitoring, but it is their job to make sure the cards are on the table when putting forward a new RFC.
The time it took from 'common knowledge' to a formal proposal makes me a little worried. If the IETF isn't really a "council of wise folks" then in the long term, doesn't their effectiveness get eroded?
>>The time it took from 'common knowledge' to a formal proposal makes me a little worried.
As anti-NSA, pro-Snowden as I am (I proudly wear my Snowden t-shirts)... I think it's important for formal steps to filter out the hype rather than just react while the general public is on fire about the issue. Waiting about six months to a year after Snowden's disclosures seems fast enough. It gives folks time to digest the info, and gives anyone else out there thinking about "leaking" more info that (dis)proves Snowden time to weigh the consequences 'n such. This way, when formal steps begin to happen, nobody can complain that they didn't have enough time to respond/defend themselves, etc. The US government has been given plenty of time to quell concerns and hasn't done a particularly good job, so I feel that Snowden is mostly, if not completely, correct, and I can confidently consider the USGov/NSA the real villains in the world, attempting to misdirect everyone else with allegations of boogeyman terrorists and spying from Russia/China/Nigeria/whatever.
Not that I ever believed otherwise... just that this gives me more talking points in future debates with friends/family/random-online-comments.
I guess I kind of expect the IETF to be predictive, and not be pop-culture or mass-market/media driven.
I guess ultimately though, that's the only way to go. We knew years before Snowden that something fishy was up, but it took the leaks to really make people care at a level that could facilitate change.
>I guess I kind of expect the IETF to be predictive, and not be pop-culture or mass-market/media driven.
But that's the GP's exact point: this wasn't a media-driven RFC -- if it had been driven by media, it would have been published last summer. As it stands now, the body has carefully deliberated on the facts and yet still published a very strongly-worded RFC.
This pro-Snowden programmer is pretty happy to see the IETF step forward and take on a leadership role here. Let's not forget that the IETF was founded by a consortium of US government agencies and only went private in the 1990s.
With this RFC, they are asserting their independence in a surprisingly direct manner (for a standards body).
Standards bodies are, by necessity, slower-moving than those developing the works they standardize. It's remarkable that they successfully agreed upon a statement like this at all, let alone this quickly. As stated in the article, they had to deal with blatantly broken objections that ought to have been easy to dismiss; I'd be curious how the actual standardization effort managed to achieve rough consensus amid those objections.