Give an example, then, of the specific rule that blocks standard QoS techniques. I can't be expected to prove a negative.
Additionally, I'm not just talking about peering discrimination. Residential ISPs want the ability to restrict traffic to and from arbitrary sources and destinations at the last mile, even when capacity exists on the rest of their network and at their IX.
The caveat is that throttling is allowed if it qualifies as “reasonable network management,” which is defined as an action taken with a “technical network management justification.”
And this is where the legislation ends: in the ambiguity of “reasonable” and “technical.” Comcast tested this when a complaint was filed against it for throttling BitTorrent. I consider throttling BitTorrent “reasonable” and justifiable on purely “technical” grounds, but clearly other people disagree. Enough people apparently disagreed that Comcast had to deny any such throttling practices, which means that in practice ISPs would likely not be allowed to throttle on technical grounds alone.
> So what you're saying is that there was a process for figuring out the inevitable ambiguities, and you just disagree with the ruling?
The fact that what constitutes “reasonable” and “technical” QoS is ambiguous and strongly dependent on popular opinion is a huge design smell for net neutrality legislation.
> As an aside, I also don't see the "reasonable", "technical" justification for throttling bittorrent anymore than throttling https.
If 20% of your customers are using 80% of your network capacity because of BitTorrent, then yes, throttling them is both technical and reasonable.
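To make the 20/80 point concrete, here's a minimal sketch of the kind of mechanism involved: a per-subscriber token bucket that rate-limits only traffic already classified as BitTorrent, leaving everything else untouched. The 1 Mbit/s cap, the burst size, and the `should_forward`/`is_bittorrent` names are all illustrative assumptions on my part, not any ISP's actual configuration.

```python
import time

class TokenBucket:
    """Classic token bucket: tokens refill at `rate` bytes/sec up to
    `burst`; a packet is forwarded only if enough tokens remain."""
    def __init__(self, rate_bps: float, burst: float):
        self.rate = rate_bps
        self.burst = burst
        self.tokens = burst
        self.last = time.monotonic()

    def allow(self, nbytes: int) -> bool:
        now = time.monotonic()
        # Refill tokens for the elapsed interval, capped at the burst size.
        self.tokens = min(self.burst, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= nbytes:
            self.tokens -= nbytes
            return True
        return False

# Hypothetical policy: cap BitTorrent at 1 Mbit/s per subscriber.
BT_RATE = 1_000_000 / 8  # bytes/sec
buckets: dict[str, TokenBucket] = {}

def should_forward(subscriber: str, nbytes: int, is_bittorrent: bool) -> bool:
    # Non-BitTorrent traffic is never throttled in this sketch.
    if not is_bittorrent:
        return True
    if subscriber not in buckets:
        buckets[subscriber] = TokenBucket(BT_RATE, burst=64_000)
    return buckets[subscriber].allow(nbytes)
```

Under this scheme the heavy 20% only feel the cap when their BitTorrent traffic exceeds the configured rate; whether that counts as “reasonable network management” is exactly the ambiguity at issue.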