I wasn't responding directly to what you said because it's irrelevant. You were pointing out that hosting a personal website doesn't in fact expose its visitors to much risk, especially if it's a static site rather than a blog or something interactive. But that doesn't imply that people hosting their own personal websites will find it easy to comply with a regulatory regime tailored to raising the barriers to entry for "the next Facebook". More likely they will find it infeasible.
It's true that imposing liability for publishing defective software is logically independent from imposing liability for collecting unnecessary PII that leaks. But pjmlp's quote from the article we were commenting on explicitly proposed doing both of these:
> For example, if you want to see Microsoft have a heart attack, talk about the idea of defining legal liability for bad code in a commercial product. If you want to give Facebook nightmares, talk about the idea of making it legally liable for any and all leaks of our personal records that a jury can be persuaded were unnecessarily collected.
So my argument does not, as you say, "rely on a bit of a non-sequitur: [that] expanding the scope of data collection/handling regulations will inevitably extend to regulating the publishing of software." The proposal in question is to both regulate software publishing and also regulate data handling, so it's irrelevant whether or not the scope would thus "inevitably extend" from one to the other.
Probably it is true that the most favorable situation for the current incumbents would be to have no liability, as at present, or as minimal liability as they can get away with. But the second-most-favorable situation, and one that is definitely politically viable even if the current situation is not, would be to have a regulatory regime that raises the barriers to entry for new entrants as much as possible and prevents disruption to their markets, by enshrining in law the particular way they're doing business today: AI melody recognition for prior restraint of free speech, combined with armies of outsourced moderators to watch for terrorism and pornography, centrally-controlled app-store platforms, locked-down end-user hardware (with a grandfathered carve-out for desktops and laptops), real-name policies, fax-us-your-passport ID verification, "two-factor" authentication that turns out to be one-factor, and so on. Anything that encourages you to post stuff on your own blog or website would be a big drawback for GitHub, YouTube, and Fecebutt.