> If someone is using a VPN, they can go in any country, so it is going to be bypassing some of that
I love how the initial stated goal of the bill is to prevent involuntary access to pornography by children, but they're looking at how to prevent circumvention using VPNs.
call them and tell them they were wrong/right in their vote, and that you do/don't support their decision.
"your vote on X has seriously caused me to reconsider my support of the NDP and you as a candidate. this is a heavyhanded measure that bla bla bla bla..." or similar such statement.
Korea already has laws similar to Bill S-210, and it is extremely frustrating when I am visiting there. Every website that I want to use without any tracking is blocked by the South Korean government, so I have to use a VPN to reach certain websites that are otherwise unavailable.
When will they ever stop? The moment we (=the people) have fought one surveillance law, they (=politicians) simply come up with another. Until we are too tired to fight anymore...
That’s politicians’ strategy. It’s a time-worn political stratagem. Keep pushing the same bill through with a different name, different sponsors, and at a time few are paying attention.
There has to be some sort of punishment for them to stop. Otherwise they'll just repeat it. If they find themselves consistently primaried or voted out for supporting bills they'll get the message.
There's no voting this wacko out. The lady who keeps shoving these bills out (or onto other bills) is an unelected senator who is there until she's 75 or commits some gross misconduct or something.
I think blame may lie more on lobbyists than politicians. Though the fact that politicians haven't fixed the problems with lobbying doesn't look so good either.
Why not just have sites that serve explicit things advertise that? Have a large-scale blocklist, and if you serve explicit content, then you need to be on it.
Then have an option when people sign up for internet to block all the IPs associated with explicit stuff.
Adults and parents can choose "yes or no" when they pay for internet and then be done with it.
If children are still getting access, then they're either technically savvy, which nothing will stop, or an adult in their life is allowing unblocked internet.
Just make IP filtering easy and default so that parents can set it up without having to understand anything. It should be a required question when you request internet access: "Do you want to filter out explicit content?"
Some sites that have dual content may have to use separate IP addresses for non-explicit and explicit content. But that is relatively easy. And if they do not care, then they will end up on the explicit list by default.
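In concrete terms the opt-in filter amounts to very little logic. Here is a minimal sketch, assuming a hypothetical blocklist and a per-subscriber opt-in flag (a real ISP would implement this in routing or DNS gear, not application code):

    # Minimal sketch of an opt-in ISP-level explicit-content filter.
    # The blocklist entries and the opt-in flag are hypothetical stand-ins.
    import ipaddress

    EXPLICIT_BLOCKLIST = {
        ipaddress.ip_network("203.0.113.0/24"),    # documentation range, used as a placeholder
        ipaddress.ip_network("198.51.100.42/32"),  # a single hypothetical explicit-content host
    }

    def should_block(destination_ip: str, subscriber_opted_in: bool) -> bool:
        """Block traffic to listed ranges only for subscribers who chose filtering at signup."""
        if not subscriber_opted_in:
            return False
        dest = ipaddress.ip_address(destination_ip)
        return any(dest in net for net in EXPLICIT_BLOCKLIST)

    print(should_block("203.0.113.7", subscriber_opted_in=True))   # True: filtered subscriber, listed IP
    print(should_block("203.0.113.7", subscriber_opted_in=False))  # False: unfiltered subscriber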
A couple problems with that are that people will disagree on what is inappropriate for children (e.g. drugs, nudity, sex, simulated violence, real violence, simulated gore, real gore, foul language, educational material that's sexual in nature, hate speech, advertising), and parents may want unfiltered or differently filtered access for themselves while blocking it on a device/profile the child uses.
I imagine a huge firewall list is probably also technically challenging for ISPs unless there's some convention for IPs (like even/odd LSB).
But you can just have the site serve a content labels header. Specify what the content is, with a standard for topics which may be commonly considered inappropriate for children, and let parents configure controls at the OS level, which the browser should respect. This lets you even provide the info at the level of individual resources (so e.g. wikipedia could label images or pages with topics like nudity or violence, and parents could opt to allow/block topics).
You could also have a coarse header indicating that the site does not want minors to access it, and that could be an option for the browser to block.
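A rough sketch of what a browser- or OS-level check against such labels could look like, assuming a hypothetical "Content-Labels" response header (no such standard exists today; the header name and topic vocabulary are made up for illustration):

    # Hypothetical browser/OS-side filter keyed on a "Content-Labels" response header.
    import urllib.request

    BLOCKED_TOPICS = {"nudity", "violence"}  # configured by a parent at the OS level

    def is_allowed(url: str) -> bool:
        """Fetch only the headers and test the hypothetical Content-Labels header."""
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req) as resp:
            labels = resp.headers.get("Content-Labels", "")
        topics = {t.strip().lower() for t in labels.split(",") if t.strip()}
        return not (topics & BLOCKED_TOPICS)

    # e.g. is_allowed("https://example.org/article") -> True if the page sends no blocked labels

The same check could run per resource, so a site like Wikipedia could label individual images or pages rather than the whole domain.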
The one question I want to ask these government officials is "Where are the parents?"
If underage children are accessing explicit content, it is the parents, not the websites that are at fault. Why are you handing children devices that are not parental locked to only permit applications that have been approved by the parent?
If the parent fails to properly vet the apps they allow their children to use then it is the parent that needs to be questioned. Pushing this requirement onto websites and services is moving the goal posts. If the parents are too technically inept or unwilling to police their children's devices then they should not be handing them to their children.
Blaming the government for whatever their child does. It seems most parents have outsourced the primary education of their kids to the government and TikTok.
I told one such parent that the government is only in charge of educating your child to get a job, not teaching him basic behavior, values and life lessons, and that parent said "no, I pay taxes, I expect the government to teach him everything". Nuff said.
Why is this surprising? It's a society where the government is expected to step in and take care of you at every turn, from birth to death. That's what welfare is for, and free health care, and food stamps, government housing, government education, subsidized early child care, all the way to assisted death.
If the goal is to design a society where the government is responsible for everything, why does the population need to be responsible?
> If the goal is to design a society where the government is responsible for everything, why does the population need to be responsible?
Those people that want that sort of society shouldn't be allowed to do anything without explicit approval from a government-mandated device that tells them every single action that they should do, even breathing.
They want a government that's responsible for everything, then they get to live that idea: If they even do a single action that wasn't told via their device, straight to jail as a 'deviant'.
>The one question I want to ask these government officials is "Where are the parents?"
To be clear, this bill originated outside the government. The Canadian Conservative Party do not currently form the government. They are ideologically-driven religious fundamentalists who don't care "where the parents are". For them it's "porn == bad" and "government oversight == bad except if it's on an issue that my ideology happens to align with".
> How to say you don't have kids without saying you don't have kids.
> Unless you're the perfect kid, did you forgot what you did when you were a kid that you parent didn't agree on?
But just because you, as a child, unwittingly stumbled upon something a child shouldn't see doesn't mean that the website was at fault. Just because you got hurt doesn't always mean that someone should be sued.
Internet access is getting pretty easy to find. We may be past the point where parents can limit their kids to only accessing the internet through devices that the parents control.
It shouldn’t be about limiting access in the first place. The truth is that parents need to have those uncomfortable and direct conversations with their children and help guide them to make healthy decisions.
Which parental controls would, say, allow a kid to visit /r/roblox but not /r/nsfw? Best I can tell, Firefox literally does not have any parental controls built in, and they have a note that extensions to provide it are easily bypassed.
It seems reasonable to me to at least start requiring sites and commercial software vendors to build a system that makes parental controls possible. And if a commercial site like reddit is going to have child focused content like a Roblox forum, then it should be easily separated from porn without requiring parents to set up a MITM proxy with complicated filtering rules.
There are simple privacy and (adult) autonomy preserving ways filtering or access control could be done, but after 30 years it still seems to be an impossible task. Chuck-E-Cheese doesn't have a porn theatre in the back where an employee just asks if you're 18 and accepts any 7 year old saying yes.
reddit is a big no-no for children. It's full of porn, politically charged topics, and predators. You could also use wildcards in a DNS sinkhole to target keywords, perhaps.
I don't disagree, but also I don't think it's unreasonable that a site like that which also hosts children's forums is going to have parents wanting regulation.
Pornhub doesn't have a part of the website for minor teenagers or Minecraft. Neopets doesn't have a porn or gore section. Even 4chan separates their SFW and NSFW boards onto different sites (though obviously even their SFW boards are not child friendly and don't pretend to be).
You can't do anything at the DNS level for reddit (short of blocking it all) because it's all one site. And it uses TLS, so you'd need to MITM to do partial filtering, which is beyond the capability of most people. I assume Instagram and tiktok are similar/even harder to filter.
Porn sites at least used to ask for credit cards. Now they just ask yes/no are you 18, and they have children's sections. They should really be working to clean up their act in a privacy/autonomy friendly way (e.g. through labeling and partnering with browsers) before they're forced into these kinds of laws. Or stop targeting children as a market and ban anyone that hints that they are under 18.
Some consumer routers may have the ability to filter URLs I think. I'm not really experienced in networking/web technologies. The reddit schema seems easy enough to understand but yeah IG/tiktok, seem impossible. Could you force a particular build of chrome that has extensions baked in for blocking sites/urls? Maybe an endpoint management tool to keep those applications in check. Something like Jamf.
We can't even successfully curtail adverse behavior in the enterprise, with budgetary spends for DLP and communications monitoring measured in the millions.
Parents don't stand a chance. You could be running a fucking kiosk with only Minecraft on it. If it still has an internet connection, kids will be harassed, bullied and solicited.
It's the kids who taught me about the existence and purpose of Finstagram accounts.
The only effective parental control is unplugging them from the internet. But you can't do that, because our lives are inextricable from it now.
> Protecting Young Persons from Exposure to Pornography Act.
It also says senators were afraid to vote against the bill.
Can someone explain to me why children looking at porn is a massively evil thing. Why that control should be in the hands of governments and not something a parent should decide for their kid.
Somehow looking at genitals at age 17 is an abomination but at 18 it is totally fine.
Showing half naked men and women in billboards is totally fine, but no tits dicks or vaginas.
What is so wrong with tits dicks and vaginas that we have to “protect” the children against them? All teens have a subset of tits, dicks and vaginas so it’s not like they are unaware of their existence.
I don't have strong feelings either way, but I think in practice, the vast majority of parents would choose to have their kids not view it, yet do not have a practical way to enforce that choice.
I worked at a go kart racing track when I was a teen and the company installed something called "governors" that would limit the speed at which the go karts could go. The governors had a remote control which allowed an operator to selectively limit the speed of each individual go kart. As a 14 year old, if someone did a slight playful bump into another racer, I would govern them. It's the first instance of "power" I had over others. And... I abused it for the "good".
This being said, I have a belief that if you give someone a button to ruin someone's life, some people will push the button with thoughtless abandon.
Correct me if I'm wrong, but I'm not seeing a democratic pushback mechanism in this bill, and I think this is a disaster. This is also being said in the context that the most voted petition in Canadian history is e-4701, which is a vote of no confidence.
The Stanford Prison Experiment has been discredited and debunked for almost a decade now. It's interesting to me how little attention the discrediting got and how many people still point to it as a real unguided trial because of that.
No, the SPE has not been "discredited." To do that, you will have to run an experiment that demonstrates that people will NOT do what you tell them if the request is unethical.
The SPE was itself unethical to run, and was a terribly designed experiment, but that doesn't mean that its conclusions were invalid. They were very valid. If someone refuses to commit an atrocity for you under color of authority, all you have to do is ask the next person. You won't have to go very far down the line.
I think part of the Stanford experiment's hardiness is due to the fact that it ‘feels true.’ We have all had our chains yanked by petty tyrants… even if the science was rotten.
You are right about it not being properly discredited, though- I remember it being featured and commonly cited in textbooks; something that big and ‘influential’ is going to take a long time to die.
The reason it "feels true" is that it's one of several experiments designed not to determine if something is possible, but how something that happened (the Holocaust) actually happened.
People forget that when they criticize Zimbardo and Milgram. Their methods were problematic, but nothing they found was surprising. It's hard to replicate their work, but that's only because you can't get similar experiments approved by ethics committees these days.
sure it feels true... but are we sure that's not just a US or Western thing? is that just a low-level, client-facing role thing?
plus it's role-playing a prison, where people who legitimately, actually murdered a bunch of people are locked up in real life, and they play real games -- and guards play them back. Tell people to role-play a group of people who are violently anti-social, and another group to role-play the people who have to keep them in line (also violently), and the participants fall back on what they think those roles do and how they act.
just like with Freud, there may be some slivers of truth, but there is too much untestable, non-transferable bunk that comes with it; you can't call that science.
neat idea, but attempts at replicating it have failed, and manager's notes + participant testimony have shown that they were, at best, pushing to an outcome, and at worst demanding one. it's basically bunk.
This proposed law, and similar ones across the English-speaking world (Australia, UK, NZ, US), are the result of a decade of lobbying by Baroness Beeban Kidron [0].
An earlier comment of mine:
Baroness Beeban Kidron has been lobbying for stringent anti-CSAM measures in tech for years [0].
She's led a major pressure campaign in the US, Canada, and the UK for years [1]. Her charity 5Rights and WeProtect have both been able to back Labour and the Tories, so she's able to lobby across the aisles.
It doesn't hurt that the publishing company her parents founded (Pluto Press) has a strong niche in the political space.
The Molly Russell suicide also played a role [2], which she leveraged to highlight the need for restrictive anti-CSAM measures, especially as it became a top tabloid story in the UK.
> the most voted petition in Canadian history is e-4701, which is a vote of no confidence
Most Liberal Party members voted against this bill (S-210 [1]). Support for this bill comes from the remaining parties, i.e.: (1) the Conservative Party, (2) the NDP (Canada’s semi-socialist progressive party), and (3) the Bloc Québécois (Quebec’s nationalist party).
e-4701 [2] is a petition introduced by a conservative. Conservatives in Canada have a vested interest in pushing out Justin Trudeau, so there are no surprises there.
e-4701 has less to do with the actual popularity of the parties or the system than it does with the fact that we're 8 years in with a fairly progressive leader. c.f. the annual anti-Trudeau demonstrations that are astroturfed (yellow vests, "freedom convoy 2022", united we roll, etc). The e-petition system didn't exist when Harper was PM or anyone before him; and we're hardly at Mulroney level of discontent.
edit: also... the bill in question is opposed by the Liberal party at large (the ones this petition opposes). It's mainly being pushed by the Conservatives and the Bloc (which makes for an odd union already) as well as the NDP (which pushes it completely into nutty territory of voting patterns). And it was introduced in the Senate, where most Senators do not have party affiliation.
It's not censorship, it's age verification. You can still access this stuff if you can prove you're an adult. Same as how children aren't allowed to buy the same material in stores. It's still being published, there's no censorship.
> any site or service that makes sexually explicit materials available
so basically, the internet.
> Canadian ISPs required to ensure that the sites are rendered inaccessible
At best, this is regulatory capture for the current tech giants, at worst, basically ability to hand pick who gets to see what sites. So yes, censorship
under the cloak of "age verification" and "protecting kids". We have heard it all before. I'm surprised they didn't somehow stuff the "terrorism" angle in there as well.
>At best, this is regulatory capture for the current tech giants, at worst, basically ability to hand pick who gets to see what sites.
It hasn't happened with any other censorship bill Canada has passed.
This includes laws on pronoun use:
Canada’s gender identity rights Bill C-16 explained
>through a process that would start with a complaint and progress to a proceeding before a human rights tribunal. If the tribunal rules that harassment or discrimination took place, there would typically be an order for monetary and non-monetary remedies. A non-monetary remedy may include sensitivity training, issuing an apology, or even a publication ban, he says.
That's not a censorship bill, that's an anti-harassment bill. Harassment is illegal everywhere: I'm not free to follow you around calling you an asshole. I could get charged for that, especially if you're my employee, tenant, or in the presence of other exacerbating factors. Canadian hate law says I'm not free to follow you around making disparaging comments about your race. C-16 expands that to say that I'm not allowed to follow you around disparaging your gender identity. That's it.
This bill, conversely, gives the government explicit power to block websites that host content that is not child-appropriate. Completely different.
Requiring an onerous age verification scheme provided by government approved vendors is a lot closer to censorship than it isn't.
Say you post stuff to your own blog, and sometimes use colorful language. A parent decides to report you to the regulatory agency, and now you have 20 days to do whatever they demand you do to remediate, or else your site will be blocked at the ISP level.
In order to have age verification, you need identity verification, i.e. tying your identity to your activity. Classic chilling effects. If you're in the closet because you come from a religious family or have a religious boss, can you risk some random site or government bureaucracy getting hacked and outing you? Strong anonymity is essential for free expression.
Then the requirement to identify yourself is friction, so sites will want to avoid it, which can only be accomplished through censorship. Ordinary sites not solely focused on X-rated content will be "moderated" down to the level of children, even when they have adult audiences, because they don't want to be locked behind the porn filter.
Instead of having diverse communities tailored to all different kinds of people and ideas, you bifurcate the world into nerfed risk-averse corporate censorship and explicit smut. The only place you're allowed to have an adult conversation is Pornhub, which is not exactly known for quality intellectual discourse.
> In order to have age verification, you need identity verification, i.e. tying your identity to your activity
I don't see any reason age verification has to tie your identity to your activity. It should be possible with modern cryptographic techniques to make a system whereby the service that checks your age doesn't find out what site the check is for, and that site doesn't find out who you are.
There are two ways that can go. One is you get a unique token which can be tied back to your identity by someone who compromises the issuing service, so if they get compromised you're screwed.
The other is you get a generic token or one that otherwise can't be tied back to a specific identity, in which case the token leaks and there is no way to trace back who is leaking it, creating a generic bypass of the whole system.
Porn site issues you a unique token. You have an age verification service sign that token using a blind signature. You return the signed token to the porn site. The porn site can verify the signature using the public key of the verification service.
Verification service gets compromised, attackers record and eventually publish mapping between unique verification tokens and users, now all the porn sites or anyone who has compromised any of them can have stored its verification tokens and know who all the users are.
The verification service doesn't see the token from the porn site. The user takes the token from the porn site and from that generates a blinded token. The blinded token is what is sent to verification service. They sign it and return it.
It is the user that then generates the signed porn token, by applying the unblinding function to the signed blinded token.
If the blinded tokens leak the porn sites cannot match them up with the tokens they issued because they don't have the parameters to unblind them.
I don't see how it gets you out of the dichotomy. If you can't trace the tokens back to the user then somebody can set up a service that will proxy sign for anyone and you have no way to know who they are. If you can, attackers can unmask legitimate users.
You're also describing a real-time system, which is subject to timing attacks. Even if you couldn't compare the tokens with each other, you could see that each time User 32323 logged in, John Smith requested a token.
I'll talk about timing attacks at the end of this.
Here's how it could work, using an RSA-based signature. The age verification service is using RSA with a modulus of N, a public exponent of e, and a private exponent of d.
To produce a signature S for a message M the age verification service computes and returns S = M^d mod N. Someone who wants to verify that S is a signature for M computes S^e mod N and if that equals M then S was a signature for M.
1. Porn site issues a token to User 32323. Let's call this token T.
2. User 32323 picks a random number r that is relatively prime to N. Since r is relatively prime to N, User 32323 can easily compute r' such that r r' = 1 mod N.
3. User 32323 asks the age verification service to sign r^e T.
4. The age verification service, after receiving proof that the user is an adult, which probably involved the user providing government ID that shows their real identity, signs r^e T.
Remember, to sign, the age verification service raises the message they are signing to the power of d mod N, which in this case gives (r^e T)^d = r^(ed) T^d = r T^d mod N. The age verification service returns r T^d to User 32323.
5. User 32323 can multiply that r T^d by r', giving T^d mod N.
Note that T^d mod N is the signature that the age verification service would have generated if it had been given the token T directly to sign, instead of having been given r^e T.
The net result is that the age verification service has signed T without ever having seen T. They only saw r^e T.
6. User 32323 can return their token T back to the porn site, along with the signature S = T^d mod N, and a note telling the porn site which age verification service was used.
7. The porn site looks up the modulus N and public exponent e for that age verification service, computes S^e mod N, and sees that this equals T. That tells them that an adult used the age verification service to get T signed, so they allow the account to be created.
If someone is trying to figure out the real identity of User 32323 they might get T and T^d mod N from the porn site. And they could get all the messages that the age verification service signed between the time T was issued and the time User 32323 submitted T^d mod N.
But for each message M there will be some r such that r^e T = M, and so any such message could be the right one [1]. You get no information other than whatever you can infer from timing.
Same for someone starting with the age verifications of a particular person and trying to figure out if any of those are for some particular porn site.
I think that age verification would probably only be done at account creation, which would mean much less timing information would be available. The risk of a timing attack could be further reduced by using a high volume verification service so that there are more verifications going on at near the same time.
You would further reduce the risk by adding some delay on your end. Wait until several hours or even a day or two after receiving T from the porn site before you return the signed T to complete your signup.
[1] There is a very small possibility of a T where there is no r that maps it to M. That could happen if T happened to have a factor in common with N. Since the N for an RSA system is constructed by multiplying two large primes (thousands of bits each) together, the chances of accidentally hitting such a T are on the order of 1 in 2^thousands. And no one knows how to deliberately construct such a T without first factoring N.
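For anyone who wants to see the arithmetic end to end, here is a minimal sketch of the flow above in Python, with toy RSA parameters (tiny primes, no padding or hashing; real systems would use proper key sizes and padding, and the numbers are made up for illustration):

    # Toy RSA blind-signature walkthrough matching steps 1-7 above.
    import math
    import secrets

    # Age verification service's RSA key (toy sizes; real keys are thousands of bits).
    p, q = 61, 53
    N = p * q                           # modulus
    e = 17                              # public exponent
    d = pow(e, -1, (p - 1) * (q - 1))   # private exponent

    # 1. Porn site issues a token T to the user.
    T = 42

    # 2. User picks r relatively prime to N and computes its inverse r'.
    while True:
        r = secrets.randbelow(N - 2) + 2
        if math.gcd(r, N) == 1:
            break
    r_inv = pow(r, -1, N)

    # 3. User blinds the token and sends r^e * T mod N to the verification service.
    blinded = (pow(r, e, N) * T) % N

    # 4. Service (after checking the user's ID) signs the blinded message:
    #    (r^e T)^d = r * T^d mod N. It never sees T itself.
    blinded_sig = pow(blinded, d, N)

    # 5. User unblinds by multiplying by r', leaving S = T^d mod N.
    S = (blinded_sig * r_inv) % N

    # 6-7. Porn site verifies the signature: S^e mod N must equal T.
    assert pow(S, e, N) == T
    print("signature on T verified; the service only ever saw r^e * T")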
> The website blocking provisions are focused on limiting user access and can therefore be applied to websites anywhere in the world with Canadian ISPs required to ensure that the sites are rendered inaccessible. And what about the risk of overblocking? The bill not only envisions the possibility of blocking lawful content or limiting access to those over 18, it expressly permits it. Section 9(5) states that if the court determines that an order is needed, it may have the effect of preventing access to “material other than sexually explicit material made available by the organization” or limiting access to anyone, not just young people. This raises the prospect of full censorship of lawful content under court order based on notices from a government agency.
It's silly to contrast a narrow restriction on speech like "you can be charged for making statements which constitute harassment" or "this specific product cannot be advertised in certain ways" with a massive restriction on speech like "any website that does not check users' IDs can be blocked at an ISP level if it is found to have content on it which is not appropriate for children."
If you like free speech, am I "free" to "speak" loud screams directly into your ear? Am I free to speak lies to you about the safety features in my airplane, when you're buying tickets? Free speech absolutism is a childish position; what crosses a line is subjective.
I think any reasonable person would agree that this internet blocking regime crosses a line and unfairly stifles people's ability to communicate. Not every public space should be obliged to be child-appropriate; obscenity laws are best left in the 20th century.
When a kid can walk into an 18+ movie showing at a theater or buy a porn magazine from the corner store without proof of age, you'll have an argument.
Until then you're advocating for the continuation of a special exemption for online porn sites, which are worse than adult movies and magazines because they are known and habitual hosts of child sexual abuse materials, non-consensual porn, and child and adult sex trafficking victims.
Corner stores don't track your ID and purchases. Come on. We all know that any information you enter into a website will be used to create a profile on you. The fact that these profiles will be attached to very personal information about people's sexualities should be very troubling to you. This is a huge privacy violation.
On top of that, it's not just porn websites that this will apply to. It's any website featuring explicit material that could be "harmful to children." That includes most discussions of sex. You're going to have to register your ID to even have an R-rated conversation online. That'll have a huge chilling effect on free speech.
Children's internet access can easily be controlled by parents with readily-available tools. The problem here is parental negligence. There are plenty of ways the government can push parents into using these tools, and none of them are damaging privacy or free speech.
And don't bring child sex abuse material into this. If you're concerned about CSAM on porn sites, you should be advocating that the government investigate that. Not that they institute draconian ID-checking laws. Unless you think CSAM is only a problem when it's viewed by someone under 18?
It uses a one-drop rule to test websites. Once there's a teensy bit of adult content (and, as has been pointed out, this covers things like Google's unsafe search modes), the requirement is to block first and ask questions later.
Requiring people to be licensed or verified to access content is as much censorship as blocking it entirely. There are valid reasons to not want to be on a conservative government's list of porn users. You need to expose yourself to risk to access content? Censorship.
You expect that once this is law, that a single adult picture on the site and a complaint to the CRTC will result in a website being blocked or forced to implement age verification?
So if someone links to or sneaks a naked picture into the comment section of a tech site and then makes a complaint, that site faces serious consequences?
My underlying issue with the mechanism here is the use of affirmative defenses. From the article:
"Organizations (broadly defined under the Criminal Code) can rely on three potential defences:
The organization instituted a “prescribed age-verification method” to limit access. It would be up to the government to determine what methods qualify with due regard for reliability and privacy. There is a major global business of vendors that sell these technologies and who are vocal proponents of this kind of legislation.
The organization can make the case that there is “legitimate purpose related to science, medicine, education or the arts.”
The organization took steps required to limit access after having received a notification from the enforcement agency (likely the CRTC)."
(not quoted due to long lines and lack of line-wrapping in preformatted blocks)
Affirmative defenses shift the burden of proof from the accuser to the accused. The accuser does not have to prove that the accused didn't implement age-verification, or that the use was not educational.
The accuser only has to prove the presence of nude imagery (which is easy), and the accused then has to generate a case on how their age-verification is sufficient or their usage is educational (which is hard).
The burden of proof and work required thereby, rests almost entirely on the accused. The balance of effort required is heavily skewed.
It would also prevent any attempts to get a summary judgement prior to the trial, at least in the US. In the US, a summary judgement is only possible when the facts don't support the case. I.e. if we presume everything the plaintiff claims is true, it's still not a violation of the law.
Imagine someone says that you wearing purple was an assault on them; you could likely get a summary judgement because even if you were wearing purple and that caused them some trauma, it doesn't rise to the legal level of assault. The facts are irrelevant, because the entire case is wrong.
That doesn't work here, because the only thing the law cares about is whether nude imagery exists. In order to assert an affirmative defense you have to launch a defense, which requires a trial. There is no earlier point where you could say "hey, we're clearly a sex education site because x, y and z".
Imagine a sex education website. They very likely have indisputable nude content on their site. They're now in a situation where anyone in the country can meet a very low cost bar (find a nude image), and cause them to have to mount a difficult and expensive defense (prove one of the affirmative defenses at trial). The law doesn't require the plaintiff to prove or even identify the affirmative defenses that don't apply.
This law would be much better if the burden of proving that the provider did not implement age verification, has not resolved any issues pointed out, and is not educational fell on the plaintiff. Summary judgements could be back in play; the judge could look at the evidence and say "even if I assume what you're saying is true, that doesn't prove the site isn't educational" and toss the case before the defendant has to build a whole case.
In my greater opinion, I don't think children accessing sexual content is a big enough issue to shift the burden of proof onto the defendant. I think ideally children wouldn't be exposed to pornography until they're old enough to contextualize the content into a coherent world view (e.g. the BDSM community takes a lot of care around consent and emotional well being that is sometimes not obvious from pornography tailored to the market). That being said, I think the "damage" done by seeing it earlier than one should is too low to justify this dramatic of a response.
No, it isn’t. It’s about power. There’s nothing a Canadian ISP can do to prevent kids from finding porn online. If the Canadian government actually cared about this they wouldn’t use (hopelessly ineffective!) ISP level blocking. They would be funding an information campaign on device-level content blockers and be funding a browser add-on and block/allow list.
Not meaningfully. If a kid wants to find porn, they'll find porn. These sorts of measures are ultimately ineffective at blocking everything, and if there's even one thing that slips through, even if only for a little while, that's enough.
Accessing porn will be more difficult and require more effort. When things are more difficult and require more effort people do it less. Especially vices.
For the sake of discussion I’ll grant you porn is a vice but that is definitely just your opinion.
Isn’t this bill supposed to “protect children”? Accessing porn isn’t illegal for adults. If the debate is about restricting access to a legal “vice” then opinions will likely shift even farther to the “no” side. I agree imposing puritanical beliefs is the actual motivation but that’s not a great slogan.
Isn't it obvious it won't? Unless they go full-on China firewall style, there will always be websites readily available that won't enforce that crap and kids will find them easily.
Banning children from using the internet at all would decrease porn consumption among minors. That doesn't make it a good idea. Although, frankly, it would be more effective AND less of a privacy violation than the ID law would be.
Why are so many media organizations and news outlets quiet on this? I've been searching around for any hits of S-210 and it seems there's remarkably few! If you're Canadian let's spread the word!
Trudeau is subsidizing most of Canadian media. Who knows what Pierre will do, but I’ll take a bet that it could be a cost saving measure to stop handouts to ‘fake news’ MSM.
I had been following C-18 closely. Wanted it to fail. Facebook/Meta bailed out of the market, deciding not to pay anything. The other one was Google, which finally settled for paying $100m per year to a single institution, which will divide the money among others.
That's still far less than what Canada was asking: 2% of their yearly revenue, which was bumped up to 4% after Meta left. But Google's $100m is more like 0.5% of their yearly gross revenue in Canada.
It was only a matter of time before a certain class of people figured out how to use the internet and subsequently ruin it. I'm more surprised it lasted this long.
That person obviously knows what they're talking about and is much smarter than me, but I still think this is a weaselly thing to say.
> I think the best way to deal with the issue includes education, digital skills, and parental oversight of Internet use including the use of personal filters or blocking tools if desired.
That made me actually laugh. Internet access is ubiquitous, parental digital literacy is often low, none of those are actually helpful suggestions that would help address the symptom.
Imagine if we suggested that for other problems. "I think the best way to deal with gun violence includes education, gun skills, and parental oversight of gun use including the use of safety mechanisms and gun safes if desired."
I appreciate people spending their time fighting for our rights. But these arguments here weren't very practical in my opinion.
> That made me actually laugh. Internet access is ubiquitous, parental digital literacy is often low, none of those are actually helpful suggestions that would help address the symptom.
I think that works both for and against your point though. For the very same reason, it's not likely there will ever be a way to reliably solve this technologically. So while we may need some technological measures, we also need to educate parents and kids about this at the same time. Anyone trying to sell a technological 'this will solve the porn problem' solution should be viewed with a very critical eye.
> For the very same reason, it's not likely there will ever be a way to reliably solve this technologically.
How did you arrive at this? Because I'm optimistic and think that if the government sets the ground rules, porn distributors will find a way to implement it reliably (using technology) and continue delivering porn to their consumers.
We get age checked here in Canada all the time when we buy booze for instance. Why not enforce it here with something like an age verification?
Because the internet does not end at the Canadian border, and porn companies operating from other countries have no reason to implement such systems. And you can't block access to them -- VPNs are trivial to access these days.
> We get age checked here in Canada all the time when we buy booze for instance. Why not enforce it here with something like an age verification?
Because to buy alcohol you have to physically show up at a store with a physical ID that needs to pass as real, and you need to present yourself passably as an adult. Even then, it's not terribly difficult to get around even that -- plenty of high-schoolers are able to reliably get alcohol with cheap fake ID's because they look old enough and the ID is good enough. Online delivery services are also not terribly hard to spoof for a dedicated enough teenager.
>Why not enforce it here with something like an age verification?
While I'd love this technical utopia, the Internet is a global construct. All it takes is someone in another country to decide they'd like to prey on kids by selling them porn, and your age verification goes out the window.
To use your analogy, if the government can't even stop licensed stores in fixed physical locations from selling alcohol to kids (because it does happen), then how is one country going to enforce this in a globally distributed, virtual space?
We get carded in the US all the time when we buy booze too, but that system is still flawed: fake IDs often work, store clerks sometimes don't care as much as they should, underage people can sometimes get a random adult passing through to buy booze for them in exchange for 20 bucks, etc.
And it's a lot easier to enforce rules when an interaction is in-person. How can you believe that porn blocking on the internet would ever actually work? (If blocking online TV/movie/music copyright infringement hasn't worked, why would you expect something like this to be successful?) All it takes is for one site to ignore the rules (porn sites based outside of Canada will not care to implement any kind of age verification), and for teens to use a VPN or some other trick to bypass ISP-level blocking.
Then you have to ban or heavily regulate VPNs. At what point do you start realizing that you live in a Chinese-style internet censorship state?
This is also missing the glaring fact that dedicated porn distributors are not the only sources of pornographic material. There are plenty of users on social media who post nsfw content or borderline nsfw content.
So next you end up having everyone dox themselves to the site just to 'protect the children'. But then you have things like the fediverse and torrents, where obviously Canada can't really force much. At that point what are you going to do? Cut off global internet access? Start banning everything? Start heavily regulating software development past even what GDPR does?
I think in the end we should just stop caring so much about this stuff. Yes, educate children, with a frank discussion, about porn and why it can be a problem.
But if a teenager really wants to find porn, they're going to find porn, and... so what? I downloaded low-quality porn JPEGs from BBSes as a teen in the early/mid-90s, and then found so much more porn on the internet in the mid/late-90s, and I turned out ok. Sure, some people end up addicted and/or with unhealthy views on sex and intimacy because of porn, but Draconian blocking schemes (that ultimately don't work) aren't the answer.
Maybe parents should actually parent their children. Install parental controls where possible (which I know are far from perfect), but otherwise, maybe give your kids some measure of trust. This particular thing isn't like worrying about a child predator luring your kid out to meet in person after finding them online; there's very little harm in a kid seeing some porn and then a parent finding out about it and disciplining the kid, if that's what they want to do. When I broke the rules as a kid, I got TV or Nintendo privileges taken away for a time; parents can take away a teen's internet-connected devices as punishment.
> That made me actually laugh. Internet access is ubiquitous, parental digital literacy is often low, none of those are actually helpful suggestions that would help address the symptom.
And tools for enforcing any kind of rules on your own are hot garbage. Especially any that a normal person could possibly hope to figure out.
You either have to cut off the 'net entirely (which is less practical with each passing year) or commit to a second job of managing this crap.
I'm not a fan of bills like this, but also something's gonna have to change, or we'll get... bills like this.
My guess is nothing will change, and this sort of thing is what'll happen instead.
I don’t think equating kids watching porn with gun violence makes sense. One has a much greater and more permanent effect so the reaction should be stronger. The other isn’t, in my opinion, a big deal and the way the government is trying to deal with it has much farther reaching effects than the issue warrants.
The argument doesn’t work for gun violence because that’s other people using the equipment. It works if every kid has a gun (smartphone) and keeps shooting themselves with it though
Except that in this instance I can actually see the merit but obviously YMMV. I had access to porn since as far back as I can remember. In hindsight I wish something like this had been in place.
Also grew up with unrestricted internet access, I recall everyone in middle/high school sharing LiveLeak links through MSN. Sure, it desensitized us, maybe “survivor bias”, but if I quickly think of everyone I know, we’ve all ended up as semi-functioning adults.
What I’d rather see governments do is figure out how to solve decreasing attention spans in kids. They’ll eventually watch porn anyway, but being able to tolerate anything that lasts more than 5 minutes is more important.
Kids will get access if the really want to. I remember downloading "that scene" from American Pie over 1mbps using KaZaa back in 2001.
The only real solution is as others have suggested here: parents being more attentive to what their kids are browsing and making an effort to put in safeguards.
Like the article states, this law is an avalanche for abuse of power (that guaranteed will take place).
I also always had access to it but I never saw the harm in it, always knew it was unrealistic, the same way an action movie is.
FWIW, instead of protections that will eventually be breakable, what I would like is more of a mandatory disclaimer saying that everything about it is unrealistic and that it's pure entertainment.
If it did, the question is would you still be opposed? Is hiding knowledge better than allowing it? I think without knowledge you might have the opposite opinion
There is another problem there: since the legislation would apply to so many kinds of sites it would have the effect of cutting off young people from services they rely on. Sites which don't mainly host explicit content. It could have life-long economic repercussions for disadvantaged youth. Very evil and stupid, these conservative shits.
They're appealing to their "fear" base. A certain segment of the voting population lives in fear (no thanks to the politicians who stoke this fear) of ... well everything.
Base? This was a Senate private member's bill. Senators in Canada are appointed, not elected, and they serve until age 75. The senator who introduced the bill, Julie Miville-Dechêne, is a member of the Independent Senators Group, a non-partisan group of 39 senators.
It was voted against by the government. It's only getting traction with support from opposition parties. It might pass the house of commons without government support due to a minority parliament. All this despite the fact that the opposition spans the gamut from far left to far right.
There are no sides - there are nuanced and diverse issues and varying approaches which optimize for different values, and to use words like 'us', 'them', 'side' is detrimental to the political process and enables those who wield narrative control over media to have disproportionate influence in a democracy.
centrists are probably generally less fearful as retreating to one extreme of a polarized dynamic is a common fear response. But I think the point is more, everyone is partially motivated by fear, sometimes governments use that fear as a lever.
Not about sides. Some people are just afraid of things. Afraid everyone is out to steal their kids. Afraid of war. Afraid of immigrants. Afraid of not enough immigrants. This just appeals to those people.
I assumed this is what parental controls are invented for, but all too many parents don't seem to even have the knowledge to activate those in the first place.
Maybe they should learn how they work instead of supporting a thin veil of totalitarianism.
Which parental controls, and how would those even work?
Back in the IE6 days I remember there was some setting to tweak allowed content ratings for websites, which I believe was based on HTTP headers, but does anyone implement that anymore (did they ever)? I just checked reddit, and they don't seem to send any headers, and use the same domains for porn as they do for child-focused content.
The obvious thing to advocate for would be to require commercial porn sites to send some kind of standardized headers that would allow parental controls to work in the first place, and/or to require any domains with content targeted to children to not also host adult-only content (e.g. reddit should put either their child-focused subs or their porn on a separate subdomain).
Then require commercial browser vendors to implement content blocking settings based on those headers.
Alternatively, it would be possible for governments to provide an oauth style service that provides tokens that assert the user is an adult without revealing any other info about the user, and then require porn sites to check for that token.
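A minimal sketch of that token check, assuming a hypothetical government issuer that signs an identity-free claim with Ed25519 (using the third-party cryptography package; the issuer, claim format, and expiry window are invented, and something like the blind-signature scheme discussed elsewhere in the thread would still be needed to keep the issuer from linking verifications to sites):

    # Hypothetical age-assertion token: the issuer signs "age-over-18" plus an expiry,
    # with no user identity in the token; the site only checks the signature and expiry.
    from datetime import datetime, timezone
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    issuer_key = Ed25519PrivateKey.generate()   # stand-in for the government issuer's key
    issuer_pub = issuer_key.public_key()        # the only thing a site would need to hold

    def issue_token() -> tuple[bytes, bytes]:
        """Issuer side: sign an identity-free claim valid for five minutes."""
        expiry = int(datetime.now(timezone.utc).timestamp()) + 300
        claim = f"age-over-18;exp={expiry}".encode()
        return claim, issuer_key.sign(claim)

    def site_accepts(claim: bytes, signature: bytes) -> bool:
        """Site side: verify the issuer's signature and the expiry; learn nothing else."""
        try:
            issuer_pub.verify(signature, claim)
        except InvalidSignature:
            return False
        expiry = int(claim.decode().split("exp=", 1)[1])
        return datetime.now(timezone.utc).timestamp() < expiry

    claim, sig = issue_token()
    print(site_accepts(claim, sig))  # True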
> It is unrealistic to lock your kids out of all social media.
Excuse me, what?
Is it also unrealistic to prevent your kids from doing hard drugs? Sure maybe you can’t control them past a certain age, but you certainly can AND SHOULD within a certain age time frame.
Social media has been shown to be so hazardous to developing youth’s mental health I think not only is it realistic, but you have a duty and an obligation do prevent your kids from using social media in an unrestricted manner.
They didn't say unrestricted, they said locked out. As in, not accessible at all.
And you know what is also very hazardous to a developing person's mental health? Being left out of most social gatherings and interactions with their peers.
You can block your kids from accessing social media, but to succeed you also need to force their peers to communicate and interact with them in a unique and higher-friction way. And you have no right to force someone else's kids to do that. So the reality is that without a lot of like-minded families that do the same, your kids are going to be left out of a lot of things.
So is it unrealistic to control your children's access to social media? No, not at all. But locking them out from it entirely as you stated, is a good way to negatively impact their social development.
I disagree. I grew up without a cell phone while all my friends had them. Beyond that, my friends had online gaming where I barely even had access to the internet, and what access I had was limited.
I find the argument that kids are going to unanimously ostracize another kid without an Instagram account very hard to believe. Kids _will_ be assholes for any reason under the sun, so it's not a question of will my kid have a hard time socially from time-to-time—that's just life.
Hobbies, social outlets, groups in the community—these are all ways to ensure your kids aren't socially stunted without giving them the emotional / mental equivalent of heroin.
> I disagree. I grew up without a cell phone while all my friends had them. Beyond that, my friends had online gaming where I barely even had access to the internet, and what access I had was limited.
The world's changed since you and I were at the ages being discussed here. We grew up in a different world, frankly.
> I find the argument that kids are going to unanimously ostracize another kid without an Instagram account very hard to believe. Kids _will_ be assholes for any reason under the sun, so it's not a question of will my kid have a hard time socially from time-to-time—that's just life.
They won't be ostracized. They will just be forgotten. If you are messaging a group of 10 people, and for 9/10 you can just use one platform, but for the last person, you need to use something else, then eventually they will start being forgotten unless someone is very proactive about including them. Miss one get together here, one joke there, and quite quickly you find yourself on the outside of the group. It's real, and I have seen it happen. It's not that kids are being assholes, it's that they are just behaving as young humans do.
> Hobbies, social outlets, groups in the community—these are all ways to ensure your kids aren't socially stunted without giving them the emotional / mental equivalent of heroin.
Those are all good and well, but they are no substitute for your kids finding their own friends as they grow up. Having the autonomy to organize your own social networks yourself is very important for kids.
Loneliness and a lifetime of sadness have been shown to carry greater risks. Is cutting out social connections the answer? Because that's what ends up happening.
Please present evidence that denying teens access to social media causes them to have literally no meaningful social connections. I do expect social interactions to suffer for a teen in that situation. But I find it hard to believe that it causes complete isolation and (gimme a break) a "lifetime of sadness".
The parent didn't say "literally no meaningful social connections". You're asking for proof of a point that wasn't made.
Sadness, loneliness, isolation, these are not things that you can measure with such binary precision. They're emotions, and they vary, person-to-person, and moment-to-moment. Is it not plausible that for some people, 'social interactions suffering', even if not completely removed from social connections, is enough to make them feel a profound level of sadness or loneliness or isolation?
Is it unrealistic? Like, social media wasn't invented when I was a kid and I found some way to spend my time.
I don't have kids but am planning on it, and I don't see myself locking kids out of social media. I also don't super care if they see naked people on the Internet. If it becomes a problem somehow I'll deal with the problem myself.
I'm more worried about kids shows on TV with messages like "follow all the rules or you'll be bricked inside a tunnel". I grew up with those, and in retrospect, they are super creepy.
To me, the problem is that while you might block your children from using social media, are all of your children's friends going to be in the same situation? If your kid has a circle of friends that interacts a lot on social media, and your kid doesn't participate in that, eventually they will be left out of that group.
Yeah, exactly. That's why I think a categorical ban is not necessarily "good parenting". I think you have to peek in and see what they're watching on TikTok, and provide relevant information. If they think a video where someone burns down a store as "a prank" is funny, remind them of the consequences; hurting other people, prison, whatever. (OK, TikTok is a little less extreme than that, but you get the point.)
Certainly, I understand why people want to delegate work to the government here; parenting is hard work. But it's necessary work, you don't want the government's children to go out into the world and do their own thing. You want your kids to. And so your touch is going to be required in their formative years. There is no getting around that.
> That's why I think a categorical ban is not necessarily "good parenting".
I agree -- both because it's unlikely to be effectual, and because it's not really addressing the root problem of parent involvement.
> I think you have to peek in and see what they're watching on TikTok, and provide relevant information.
Exactly. You can't just expect to go into your firewall settings, ban some URLs and expect that to solve the problem. Or at the same time, to just completely ban them from accessing social media at all (though arguments could be made for monitored/timed access at certain age 'gates') -- I've written multiple other comments about why that can backfire too.
>It is unrealistic to lock your kids out of all social media.
Really? So the state blocks porn, but then that social media will have people who use bad language, so the state needs to also protect your kids from bad language. Maybe people on those social media sites say that Santa or Jesus does not exist, so the state should protect them again and ban this stuff too.
Do not allow your child on social media that is for adults.
I am sure PlayStation offers social features for children with strong moderation, so find similar social media that is targeted at children.
It is perfectly reasonable to lock your kids out of social media until they're mature enough to handle it. It's actually probably to their benefit to block them from the non-nsfw parts of social media as well, since social media has an extremely negative effect on mental health.
Parents are allowed to autocrat - it may feel icky but to parent well you need to occasionally put on your villain hat.
>Parents are allowed to autocrat - it may feel icky but to parent well you need to occasionally put on your villain hat
It's not about icky. If you autocrat too much as a parent your kids will simply resent you, and once they're 18/have left home they'll do whatever they want now that you're no longer around to control them, because they never learned any self-control. Like US college kids going all out with drugs and alcohol.
There is a place between a micro-managing helicopter parent and laissez-faire - your parenting should be in that place. And while it'd be awesome to be your kids' best friend forever it's more important that they learn boundaries, self-control and how to human. You can impart self-control without opening every door and, given how psychologically exploitative a lot of the internet and advertising is, it's hardly a fair fight to just let them sink or swim.
What a weird argument in the context of this bill. "If I'm too strict my child will resent me therefore I will vote so that my opinion can be enforced on everyone by the government and I can be my child's buddy".
Must be interesting with other "sinful" activities: "I'd totally let you do cocaine, but shoot, the government won't let me!"
My parents were pretty strict with me as a kid, and I missed out on a lot of things that my peers got to do. I did resent them, somewhat, at the time, but by my mid-20s I was able to recognize that they were just doing their best and, like literally all parents, were making things up as they went along.
When I went to college I was fine. I developed a new social circle quickly, and didn't end up becoming a drunk or a druggie. Sure, I did many of the things my parents never would allow me to do, but it was fine.
I totally get that some people end up in a worse place than I did, but that's not an excuse for blanket governmental bans on things. But I would absolutely 100% support any parent that decides to deny their children any and all access to social media. That shit is cancer, and IMO is worse for developing brains than nicotine or alcohol.
I hesitate to take such a hard line on social media when I'm kinda "whatever" about teens seeing some porn. The problem is that I see what social media does to adults with fully-developed brains, and I start to feel like social media is akin to heroin: no amount of it is safe, for anyone.
A more relaxed view might be to give teens access to social media, but only in a supervised setting. Parents should be monitoring what goes on with their social media accounts, and have frank (but calm and non-judgemental) discussions with the kid whenever anything concerning comes up. Also parents need to find a way to impress upon their kids that social media is not real life, and that people present whatever slice of their lives (often an unrealistic rosy picture) they decide to paint. And then there's all the misinformation and echo chambers, and... ugh, yeah, no, just don't let kids on social media.
I believe it's even more unrealistic to make the internet safe-for-work / safe for children.
While I do sympathize with the impossible difficulty of being a parent in the current decade, there is realistically no way I am going to just accept the death of the adult internet. That's likely a common sentiment, and realistically the people making these laws are too stupid to enforce them against motivated opponents with technical expertise.
I don't see a way out of this other than creating a whitelisted subset for children, with enforcement being the responsibility of the parents. Going for the death of anonymity as access control is a no-go either.
Looking on the bright side, to me this looks like just a lack of child-safe platforms that are still tolerable to use.
It's absolutely ridiculous that we consider nudity to be NSFW - but TikTok and Instagram have a fair amount of content on them that is intentionally as close to NSFW as you can get without getting banned. I agree with your literal point - but there's actual porn on social media.
Yeah, I mean why does the government get to tell me how I raise my kids? I wanna put them to work, drive heavy machinery, and marry them off to 90 year old "founders" all I like so stay outta my business.
The difference is that in the cases you mention, the government is protecting kids from bad actions that parents might take.
Requiring age verification won't stop a parent from giving a kid access to porn. And, frankly, if a parent wants to show a teen porn as a way to educate them about what can be bad about it and what the dangers are, that's a private matter between the parent and the teen.
Oh I'm sorry you're totally right, we need to ban working, driving heavy machinery, and marriage to protect your children. Because that's what this is all actually about right? Sock puppet guy has made that quite clear, his goal is to ban all porn.
There is a plethora of tools a concerned parent can deploy currently to limit web exposure - the main difficulty in doing so is more of a social pressure than a technical one, and the salves provided by this bill are trivial for a determined teen to circumvent.
Is it, though? I found and enjoyed a good amount of porn on BBSes and the early internet when I was a teen, and I turned out fine.
Porn absolutely has serious problems associated with it (both on the production and consumption sides), but teens getting exposed to porn is just not a big deal. It can turn into a big deal, but that's what, y'know... parenting... is for.
That's fair. And as another commenter pointed out, this is probably your responsibility as a parent. Of course, we also let the government tell us we can't let our kids work, so clearly there's a line we draw.
So I ask, why is it the government's responsibility in this specific case?
>Not wanting your children to be influenced by pornography at a formative age is a very reasonable fear in this era.
I agree. So, as a developer, do you think it is easier to block porn at the OS level and at your router, or do you think some regulation that only some websites will follow is safer and more effective?
I read about some proposal for adult websites to scan your face before granting access, and other idiotic schemes, when the super obvious KISS solution is to have Android, iOS, Windows, Ubuntu, etc. provide an idiot-proof way for a parent to set up a child account. This child account would set some cookie or whatever flag you want, so websites can refuse to serve content to them (roughly as in the sketch below). Also, the big tech companies could use their combined energy to maintain a blacklist of bad websites that do not respect the rules, like maybe advertisers do.
TL;DR: as a parent, do not wait for the government to protect your child from porn; open Google and figure it out.
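To make that concrete, here is a rough sketch of the website side, assuming a hypothetical "Sec-Child-Account" request header that the OS or browser would add for child accounts (the header name is made up; nothing like it is standardized today):

    # Hypothetical sketch: the OS/browser on a "child account" adds a header to
    # every request (the name "Sec-Child-Account" is invented, not a standard),
    # and a site serving adult content simply refuses those requests.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class AdultSiteHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # The flag would be injected by the OS/browser, not typed by the user.
            if self.headers.get("Sec-Child-Account", "") == "1":
                self.send_response(451)                    # refuse to serve
                self.end_headers()
                self.wfile.write(b"Not available on child accounts\n")
                return
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"(adult content would be served here)\n")

    if __name__ == "__main__":
        HTTPServer(("127.0.0.1", 8080), AdultSiteHandler).serve_forever()

The hard part, of course, is what happens to sites that simply ignore the flag, which is where the blacklist idea above would have to come in.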
It would be so much easier for people to ignore you if you just stated your values up front you know? Maybe next time you could even use your real account instead of multiple sock puppets.
There are probably reasonable ways for the government to do this. Some kind of government run validation website where a user can generate a code to validate their age to each individual website is probably ok (in the UK we have a similar system for providing driving licence information to rental companies for example).
Mandating that people do face scans or, even worse, send their IDs to potentially scummy websites is batshit crazy though.
If what a government really wants to do is ban access to pornographic content, then it should be able to make that case to the electorate.
>There are probably reasonable ways for the government to do this. Some kind of government run validation website where a user can generate a code to validate their age to each individual website is probably ok (in the UK we have a similar system for providing driving licence information to rental companies for example).
Why not just have the parent set the child's device to "kid mode" and that is it?
Otherwise, how do you make different national systems work worldwide?
Your idea is valid but seems much harder to implement. My proposal, where the parent and the OS vendor collaborate on this, seems much more flexible and would work even better; for example, religious extremists could configure their children's and their own devices to block much more than porn.
I'm mostly with you. In fact, I've often wished that browsers did more things like that, for example with cookie warnings. The cookie warning thing is a ridiculous misunderstanding of how things should work. The server doesn't store things on your PC; it just offers them to your browser, and it's up to the browser to return them or not. It doesn't even require new protocols (see the toy sketch below).
But of course kids are resourceful and will find a way of accessing content they want to access, for example maybe using a games console or something a parent might not immediately think of as having a browser.
Whether it actually makes sense to protect kids from themselves to that extent is a valid question to which I'm not sure I have the answer. I know passing around CDs full of such content was common when I was young (in the early days of the internet).
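For what it's worth, a toy illustration of that cookie point (a pure sketch, not how any real browser exposes this; the policy flag and helper names are invented):

    # Toy client-side cookie policy: the server only *offers* a cookie via
    # Set-Cookie; whether anything is sent back next time is the client's call.
    from http.cookies import SimpleCookie

    RETURN_COOKIES = False        # the user's (or parent's) local policy
    cookie_jar = {}

    def handle_response(set_cookie_header):
        # Remember what the server offered; this costs the server nothing.
        jar = SimpleCookie()
        jar.load(set_cookie_header)
        for name, morsel in jar.items():
            cookie_jar[name] = morsel.value

    def build_request_headers():
        # The client decides, unilaterally, whether to hand anything back.
        if RETURN_COOKIES and cookie_jar:
            return {"Cookie": "; ".join(f"{k}={v}" for k, v in cookie_jar.items())}
        return {}

    handle_response("session=abc123; Path=/")
    print(build_request_headers())   # {} -- the server never sees the cookie again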
What about consenting women who've not been coerced or pressured in any way? You want to take people's free will away; that seems to be another perspective on your belief.
I agree with defense in depth, but it contradicts privacy if I need to show my face or ID card to a porn website; or, since others claim that Reddit and TikTok are almost porn, I would have to show my ID to all social media.
As I said a parent should not allow the child to access social media that is targeting adults.
The OS can set a flag and the websites can reject those connections; no need to destroy privacy.
Not all porn is exploitation of women; these days there are OnlyFans women who are exploiting men. But who should decide what is and what is not exploitation?
There is also CGI or AI porn, and there are nudes and artworks that religious extremists complain about. You can't stop porn, so better not wait for that day: set up your child's devices properly if they need one.
Given how conclusively the article explains that this bill is supported by effectively everybody _but_ Trudeau, I'd like to take a moment to thank you for unburdening the rest of us so effectively with any ongoing need to give precious time or thought to anyone wielding that particular two word phrase, or, for those of us who have already wasted our time in dialogue with folks like you, reaffirming the conclusions we'd already drawn. Appreciated.
From the article, "In fact, government ministers voted against it. Instead, the bill is backed by the Conservatives, Bloc and NDP with a smattering of votes from backbench Liberal MPs."
Your anger seems to be misplaced in this instance.
> "The bill, which is the brainchild of Senator Julie Miville-Duchêne, is not a government bill. In fact, government ministers voted against it. Instead, the bill is backed by the Conservatives, Bloc and NDP with a smattering of votes from backbench Liberal MPs."
Remember, the opposition and the Senate can get bills through in a minority parliament.
That's a good point, but it's worth noting that the NDP is backing it, as well as some Liberal MPs. For non-Canadians, the NDP is our "far left" party. I don't mean that in a pejorative sense, just that our Liberal party tends to be your "mainstream" moderates and we have a third party for the less moderate left-leaning voters. That's what the NDP is. So if they're backing this, then that can only mean there is cross-party support for this.
Unfortunately if Poilievre actually did get into the PM office it'd be on the back of a CPC majority - so he'd have a lot more freedom to act than Trudeau's current minority government.
We'll see. Could also be a Harper-like minority, getting support from the bloc. Trudeau also has a very supportive NDP right now, so Trudeau does have a lot of freedom currently.
I'll never say never but the current state of the CPC is so departed from mainstream Canadian politics that it's far more likely that BQ would side with LPC... and the NDP and Greens would basically have to have a melt-down to side with the CPC.
I think it's almost vanishingly unlikely that the CPC form government unless they actually achieve a majority share of seats.
That's what I thought initially too, but reading the article, it seems like it's just requiring age verification for accessing adult material. If we are concerned about censorship, I have to say that ship sailed long ago. I don't have a problem with blocking porn or steps preventing kids from accessing porn.
>I don't have a problem with blocking porn or steps preventing kids from accessing porn
It's just the first step; once they have the infrastructure for real-ID verification in place, they'll roll it out to more and more of the internet until adults have no more anonymity online.
“Terrorism” and “protect the kids!” have been the most-used boogeymen by governments in the last two decades or so to further violate private citizens' privacy and freedom rights and to expand the monitoring and censoring of communications, and the accusation is ready if you dare to stand against it.
Growing up in a country with internet censorship, I can tell it is a huge slippery slope not to mention the dangers of having to upload government ID to access adult websites. I hope politicians come to their senses on this.
If they really want to do this without any privacy-invasion ulterior motives, then someone needs to push an anonymous credentials scheme. Have the government give people age verification keys. A person uses their key to verify their age on a website, but the website won't know who they are, and the government won't know they accessed said website (outside any other means of tracking internet traffic).
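One way to get both properties at once is a blind signature: the issuer certifies a token it never actually sees, and the site later checks the token against the issuer's public key alone. A toy sketch, not production crypto (raw RSA with no padding, no revocation, no standard format; it only shows the information flow, and assumes the third-party cryptography package is installed):

    import hashlib
    import math
    import secrets

    from cryptography.hazmat.primitives.asymmetric import rsa

    # Government / issuer: an ordinary RSA key pair.
    issuer_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    nums = issuer_key.private_numbers()
    n, e, d = nums.public_numbers.n, nums.public_numbers.e, nums.d

    def issuer_sign_blinded(blinded):
        # The issuer has checked the citizen's ID out-of-band, but only ever
        # sees a blinded value, so it cannot recognise the token later.
        return pow(blinded, d, n)

    # Citizen: create a random token and get it blind-signed.
    token = secrets.token_bytes(32)
    m = int.from_bytes(hashlib.sha256(token).digest(), "big") % n
    while True:
        r = secrets.randbelow(n - 2) + 2          # blinding factor, coprime with n
        if math.gcd(r, n) == 1:
            break
    blinded = (m * pow(r, e, n)) % n              # hide m from the issuer
    signature = (issuer_sign_blinded(blinded) * pow(r, -1, n)) % n   # unblind

    # Website: checks the signature against the issuer's PUBLIC key only.
    # It learns "holder is old enough", not who the holder is, and it never
    # contacts the issuer, so the issuer cannot log which sites were visited.
    def site_accepts(token, signature):
        m = int.from_bytes(hashlib.sha256(token).digest(), "big") % n
        return pow(signature, e, n) == m

    print(site_accepts(token, signature))         # True

The genuinely hard parts in practice are revocation and stopping people from sharing tokens, which is where real anonymous-credential schemes get complicated.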
In China they have already taken it a step further. Kids were using their parents' ID cards to play games, so those games had to implement face recognition in addition to the ID card.
Don't buy their framing - every website with user-uploaded content is an "adult website" in the eyes of this law. If one of your users uploads a single jpeg of porn/hateful political rant/description of self harm, you'll be liable for "not implementing adequate measures to prevent minors from being exposed to pornography/harmful material/incitement to violence".
"Adequate measures" are, of course, a complete loss of anonymity, adding your website to the surveillance state apparatus.
The bill doesn't say anything about uploading government ID.
It says that it is illegal for a company to give porn to kids. The company can defend itself against the charge if it has a "prescribed age-verification method".
There are blind, privacy-preserving ways that this can be done. A third party verifies a government ID and issues an age-verification token. The token is passed to the porn site, which has a way to verify the token without talking to the entity that issued the token.
That way the porn site doesn't know who you are (it just knows "this person is old enough to access this content"), and the age verification entity doesn't know what you used the token to access.
Of course, this scheme is more complicated than building an age verification system that involves uploading a government ID (or asking a third party directly to verify someone's age), so ultimately no one gets any privacy or anonymity.
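The "verify without talking to the issuer" half really is just offline signature checking. A minimal sketch under invented names and an invented claim format (again using the third-party cryptography package); making the token unlinkable to the issuer would additionally need blinding, as in the sketch upthread:

    # Minimal sketch: the verification entity issues a signed "over 18" claim,
    # and the site checks it offline against the issuer's published public key.
    import json
    import time

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # Age-verification provider: checks government ID out-of-band, then issues
    # a short-lived signed claim. It hands the token to the user, not the site.
    issuer_key = Ed25519PrivateKey.generate()
    issuer_pubkey = issuer_key.public_key()       # published somewhere well known

    claim = json.dumps({"over_18": True, "exp": int(time.time()) + 600}).encode()
    token = (claim, issuer_key.sign(claim))

    # Adult site: verifies offline. It learns only "over 18 until <exp>", not
    # who the visitor is, and the issuer is never contacted at viewing time.
    def site_accepts(token):
        claim, sig = token
        try:
            issuer_pubkey.verify(sig, claim)
        except InvalidSignature:
            return False
        body = json.loads(claim)
        return body.get("over_18") is True and body["exp"] > time.time()

    print(site_accepts(token))                    # True

Without a blinding step, the issuer could in principle recognise the exact signature it produced if it ever saw a site's logs, which is the gap the blind-signature variant closes.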
What age verification system exists other than government ID?
The only way we ever verify someone's age for legal purposes, at least here in Canada, is by checking government ID. Birth certificate, driver's permit, photo ID card, health card. They all have your birth date on it. A younger adult may need to show this ID if they look particularly young and want to buy cigarettes or alcohol.
Relevant to the topic, though: I would note that the same young adult may need to show ID to buy pornography on DVD or Blu-ray at a retail store. That's already established, and I would think few object to it. It's the security and privacy issues that arise when we start sending this data in a recorded and logged form over the Internet. This remains true whether it's a government ID or a privately issued ID.
>It's the security and privacy issues that arise when we start sending this data in a recorded and logged form over the Internet.
That's for consumers and distributors to figure out. The lack of trust between consumers and distributors is no reason to continue allowing online porn to be exempt from long established and agreed upon controls.
I think the security and privacy concerns absolutely are a reason to continue allowing this sort of thing to skate by. At least until the security and privacy concerns can be addressed.
Michael Geist is a very well known figure in Canada, and has built a solid reputation raising awareness of important privacy issues. He has testified in parliamentary advisory committees and CRTC hearings. The way you keep referring to him as "this blogger" unfairly makes it sound like some rando on the web wrote it.
This is not about verifying ages. This is about the government tracking people so they can use information against them in the future. Any kid that wants to bypass this will use a vpn or submit false documents.
Yes, because cross-party consensus from politicians is always an indicator of good change! /s
I think 'this blogger' (see other comments) is well-informed and up-to-date on their stuff, so they probably have a better-informed opinion than the person I'm replying to.
These kinds of solutions are so short-sighted because a) kids will get around it, and b) in 5 years we'll have someone come along trying to ban another set of websites, and then another. This has happened, and continues to happen, all over the world and throughout history. When you give people or institutions the ability to censor information, given enough time, that authority will be wielded against you.
And here is everything said about it, along with its status in the House: https://openparliament.ca/bills/44-1/S-210/