Today many argue that the state needs access to such correspondence to prevent crime, but that need is like an addict's: nothing good can come of it, and the people should not enable these institutions to satisfy an ever-growing demand for insight into their private lives. One must remember that democracy is founded on the belief that thoughts and words are not crimes, and that everyone must be free to express themselves in public, but even more so in private correspondence. A society that mistrusts its own citizens to the point where all those who whisper to each other are called criminals, dealers, traitors, or terrorists is rotten at its core.
And yet some still say: but if the state could read all private correspondence, it would be so much easier to catch criminals. And yes, it is true that these totalitarian methods are efficient in fighting street-level crime. However, for society as a whole, such methods enable a terror of the state that is a crime against humanity itself. They say "but the state will never abuse its power," and I say: it has, countless times before. Do not stray from liberty and freedom for promises of safety made by those who profit from oppression.
I feel disregarding the usefulness of surveillance is part of the problem. We should not be arguing that nothing good comes out of surveillance. It hands your opponent an easy strawman for a hollow victory. Because frankly, surveillance is a useful tool for law enforcement.
We need instead to argue that the moral cost and side-effects of public surveillance far outweigh its usefulness.
In this regard I recommend you look into the evidence on mass surveillance. Several reports have examined the mass surveillance programs that have been operating since 2001, and in report after report, mass surveillance has been found not only ineffective at producing any tips, but commonly to have tied up law enforcement resources that could have been spent on legitimate tips.
Here is a very well-sourced article referencing several FBI internal reports, a White House-appointed review group, reports from non-profits, and local police departments:
The actual, real point is that they're underestimating the downsides of surveillance, and that even if it worked, it would still not be worth it. That's the only argument that can hold, and the actual reason we're against it.
The problem is that we, the people, can never know what, if any, good is coming out of surveillance. Attorney General Barr admitted that in one of his speeches arguing for back doors in encryption. The government cannot reveal what is being discovered through surveillance without disclosing sources and methods that it (understandably) wants to keep concealed from adversaries. But without that information we and our elected representatives cannot exercise proper oversight. And without proper oversight any such capability will be abused.
Often that is far too much time for such revelations to be useful for oversight--25 to 50 years in many cases, since those are the time frames for declassification of classified information--especially given what encryption has become since the advent of computers and the Internet.
Before computers and the Internet, it was possible to have a reasonable tradeoff between strength of encryption and the ability of law enforcement to conduct surveillance, because perfect encryption was impossible and imperfect encryption got more expensive the closer you wanted it to be to perfect. So people were already making a cost-benefit tradeoff (difficulty of breaking the encryption and obtaining private data vs. cost), and it was reasonable for the government to ask that the potential benefits of surveillance be included in the tradeoff, since that would just adjust the balance of the tradeoff, and the adjustment could be periodically reviewed based on data on past surveillance that was revealed by things like FOIA requests.
But now, with computers and the Internet, perfect encryption is cheaper than imperfect encryption. Perfect encryption is just a mathematical algorithm, and it's straightforward to put that algorithm in computer code and verify that the code correctly executes the algorithm. Imperfect encryption requires adding code to that perfect algorithm, which adds cost, and also adds a risk that wasn't even there before: whatever back doors are in the code being exploited. So we users, to enable surveillance by law enforcement, would not be making a small, periodically reviewable adjustment to a tradeoff we have to make anyway. We would be adding a new tradeoff that we have no other incentive to make, and thus taking on a new oversight burden that is, if not impossible, at least extremely difficult to fulfill properly. That is simply not a bargain that free citizens of a free society should accept.
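The claim that perfect encryption is just a mathematical algorithm can be illustrated with the textbook case: a one-time pad, which is information-theoretically unbreakable when the key is truly random, as long as the message, and never reused. A minimal sketch in Python (the function names are my own):

```python
import secrets

def otp_encrypt(plaintext):
    """One-time pad: XOR the message with a random key of equal length."""
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return key, ciphertext

def otp_decrypt(key, ciphertext):
    """XOR is its own inverse, so decryption is the same operation."""
    return bytes(c ^ k for c, k in zip(ciphertext, key))

key, ct = otp_encrypt(b"attack at dawn")
assert otp_decrypt(key, ct) == b"attack at dawn"
```

Verifying that this code correctly executes the algorithm is a short exercise; it is adding a back door for later access that would require extra code, and extra trust.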
> embedding those checks/balances/required transparency in the surveillance processes in such a way that they cannot be circumvented by those in power.
The processes can't be transparent because, as I said, that would reveal sources and methods that should be concealed from adversaries. An application for a FISA warrant can't wait for the years it would take to allow a FOIA request to be fulfilled in the interest of transparency.
That's only part of what I mean about the goal of demanding expanded oversight. Maybe those time frames are too long, but the point is that they sometimes serve a useful purpose: slowing things down for the safety of the parties involved, among other reasons. A goal should be to find a healthy medium where a "Surveillance FOIA 2.0" still allows transparency, oversight, and review without hobbling the process. FOIA was just one example of an existing transparency tool to model from--the first to come to mind, not the only one--and you would hopefully expand to a larger suite of transparency/sousveillance ("watch the watchers") tools.
I'm also not claiming that we shouldn't fight surveillance attempts, simply that where surveillance seems inevitable, a foregone conclusion, or rough to fight, we also need to devote resources to fighting for increased sousveillance/transparency, because power will always abuse surveillance.
To me, breaking perfect encryption by putting backdoors in computer algorithms is precisely the kind of place where we should not treat surveillance as inevitable or a foregone conclusion. We should draw a line in the sand and say no, we're not going to accept this; law enforcement simply needs to up its game and figure out how to operate in this new environment, where anyone who wants to can use perfect encryption.
If X is the amount of utility coming out of surveillance, and you cannot know X, then you cannot argue that X = 0 or that X > Y (for any Y you want to pick, like downsides of surveillance) or that X < Y either.
Essentially it becomes impossible to rationally debate the issue on the basis of whether it is a net gain.
Which means you need to fall back on other forms of reasoning. A reasonable position is that freedoms shouldn't be sacrificed for something whose utility cannot be demonstrated. But that's an argument about what sorts of justifications are required for laws, not about how much utility the law would have.
The argument I make is that it is more cost-effective to develop a society where one does not need to commit crimes to get by in the first place. Law enforcement is reactive: it can only punish once crimes have already been committed. While we shouldn't get rid of law enforcement, because crime will always exist, let's look at societies with low crime (and societies with high crime) and see what we can learn from them (and improve upon). Such policy is far more advantageous for citizens.
Your idea seems predicated on people in general being benevolent towards others. That's not going to work: there's a significant, motivated cadre who want to do terrible things provided they 'win'. You don't enlist Cambridge Analytica when you think you're right; you do that sort of thing when you don't care about being right/moral/legal but only about subjugating others.
Be careful about how you frame that. While this is true of some people who engage in activities like this, there is also the "ends justify the means" group. The latter does believe what they are doing is right and moral, and that being right and moral justifies behavior that is illegal. It's easy to be cynical and assume that the latter group is just the former group deluding themselves, but there are people who genuinely think that way. Addressing them requires a different approach than addressing those who just want power and control by any means.
I think this is why we can see people gladly vote for those that they very much disagree with. I think this is why attacking someone's tribal leader makes them double down and strengthens their convictions rather than changing belief. I think the question is how to get people to realize that you have to fight fair to get others to fight you back with fairness.
> And yes, it is true that these totalitarian methods are efficient in fighting street-level crime. However, for society as a whole, such methods enable a terror of the state that is a crime against humanity itself
However I think it’s a very serious fallacy to split surveillance into a ‘useful’ component and ‘side-effects’.
They aren’t side-effects. They are the effects.
Reduced crime may be a consequence of a surveillance society.
In such a society you may discover that discussing crime statistics in a negative light would reflect badly on the party bosses and must be done with caution.
I know this is Hacker News, but not every argument requires infinite nuance, we don't need to sit down and examine the pros & cons of torture or any other clear and obvious abuse of government power. We don't need to dignify the position of "read all citizens private correspondence" with a cost/benefit analysis. This practice provides legitimacy to clearly unconscionable actions. It is permissible, even strategically valuable, to have certain positions that we are absolutist on, policies that aren't tolerated under any conditions.
Same with banning secure private communication.
Studies show torture is simply not effective. Similar to how surveilling all communication is not effective.
Once they have this ability, it will be much harder to make them give it up.
>Once they have this ability, it will be much harder to make them give it up.
There's an argument to be made that this is already happening and, in fact, has been happening for decades.
I'm of course referring to the "War on Drugs."
There's quite a bit of analysis in the literature showing that restrictions on mind-altering substances were explicitly introduced to disadvantage particular populations.
What's more, despite popular perception, the "crime" rate is at its lowest levels in more than 50 years, yet we continue to fund law "enforcement" at levels even higher than when we were at the peak of the "crime" rate during that time.
So. Since crime rates have plummeted, yet we're spending more than ever, it's likely your paraphrase is already the case right now.
More's the pity.
(But the detective who put together the residential warrant "bundled" it with a bunch of non-residential warrants, nearly burying it, when he took it to the judge to be signed. And now the SWAT team says the residential warrant's execution plan was buried in the same swamp of non-residential warrant executions. It's hard to keep from wondering, as all these details come out, whether that was malice or incompetence. Was it personal? Or was it dumb luck? I can't even tell which is worse at this point, because either shows a lack of responsibility, and both are worsened by what will likely continue to be a lack of consequences or atonement.)
Notable examples are RICO Laws, FISA courts and the PATRIOT act. So. Yes.
Or, just as likely, "once they can catch the people they don't like, they won't need to bother catching the people that you think are criminals".
Using the invasion of Iraq as an example: there have been many years where public opinion was negative, or at least lukewarm, towards the invasion, though not violently so. Casually reviewing the polling history, this can be observed as early as 2004. But I think one can safely say that the war hasn't been at the forefront of most people's minds during the last few decades, except at the very beginning. But then, there has been very little mention of the financial cost of the war in the media, if any. And why would there be, when the media also earns a lot of money from these "safe" wars?
The video “Troops Versus Building --- an Iraq War tale” by soldier grunt Blacktail should give you a pretty hands-on idea of the financial cost of the war, however. 
: Troops Versus Building --- an Iraq War tale, 24 Nov 2009, Blacktail, https://youtu.be/2N-1E2F9pmc
The root of 99% of crime in the US is poverty. Not private communication.
Trying to solve poverty by spying on everyone’s data is like trying to cure cancer with Tylenol. Even if you temporarily prevent a symptom from occurring, you're still dying of cancer.
So much of political thought in the US is focused on the futile efforts of treating symptoms, and not curing underlying causes.
And the worst part is, if you look at statistics in the rest of the developed world, poverty in fact has a cure! Like in most things, the US is the head-in-the-sand stubborn outlier here.
Politicians in general have shown they don't have the moral probity to be trusted to direct a democracy.
We need a sort of reverse-Stasi. Everything a senior politician does should be reviewed and only closed if it is provably personal and without public interest.
Maybe our politicians need to wear bodycams.
You give too much credit to police states.
What happens in reality is that criminals with connections and a minimum of self-restraint are folded into the "dark side" of the state, while their rivals are cracked down on hard. In this way, a number of low-impact, high-revenue illegal activities are tolerated (in exchange for bribes), crime syndicates are expected to self-police and not break whatever taboos were imposed from above, and this "dark side" of the government puts a lid on the deviant side of society, diverting its energies into activities that do not challenge the status quo.
Does it make for a safer place to live for the common citizen? Maybe. While it may be less likely that you will be injured in an armed robbery, you will also be more likely to have your money swindled by one scheme or another... and you will have less chance of redress when that happens.
The problem isn't "undesirables" (MISS ME with that shit), it's lies by omission and economic power brought to bear against people's rational expectations of privacy.
Holy shit, this is straight up racism. How the hell you got from protecting freethinkers to eugenics, I have no idea.
I'm not really sure what the EFF is unhappy with about this act, since their complaints don't seem to be reflected in the text.
From the act:
CYBERSECURITY PROTECTIONS DO NOT GIVE RISE TO LIABILITY.—Notwithstanding paragraph (6), a provider of an interactive computer service shall not be deemed to be in violation of section 2252 or 2252A of title 18, United States Code, for the purposes of subparagraph (A) of such paragraph (6), and shall not otherwise be subject to any charge in a criminal prosecution under State law under subparagraph (B) of such paragraph (6), or any claim in a civil action under State law under subparagraph (C) of such paragraph (6), because the provider—
“(A) utilizes full end-to-end encrypted messaging services, device encryption, or other encryption services;
“(B) does not possess the information necessary to decrypt a communication; or
“(C) fails to take an action that would otherwise undermine the ability of the provider to offer full end-to-end encrypted messaging services, device encryption, or other encryption services.”.
> Sen. Leahy’s amendment prohibits holding companies liable because they use “end-to-end encryption, device encryption, or other encryption services.” But the bill still encourages state lawmakers to look for loopholes to undermine end-to-end encryption, such as demanding that messages be scanned on a local device, before they get encrypted and sent along to their recipient.
The original Earn It Act was bad. But that bad stuff has been massively ripped out. Plus real protections for privacy added in. It's not the same as it was - look up the text and compare what's been struck through with what is left.
I think in the current form it's a definite win for privacy and common sense.
- HTTPS is more secure and private than HTTP
- Signal is more secure and private than SMS/Skype/Messenger
- Tor/Browser is more secure and private than Cloud/Chrome
- FileVault, BitLocker and LUKS are more secure and private than RAW unencrypted disk data
- 2FA and password hashing are better than raw passwords
- List goes on and on.
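To make the password-hashing item in the list above concrete, here is a minimal sketch of salted password hashing using Python's standard library (the function names and iteration count are illustrative; a real system should use a vetted library with tuned work factors):

```python
import hashlib
import hmac
import secrets

def hash_password(password):
    """Derive a salted hash; the server stores (salt, digest), never the password."""
    salt = secrets.token_bytes(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def verify_password(password, salt, digest):
    """Re-derive and compare in constant time to avoid timing leaks."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("hunter2")
assert verify_password("hunter2", salt, digest)
assert not verify_password("wrong", salt, digest)
```

The point of the list stands: even if the stored digests leak, the attacker still has to brute-force each password through 200,000 hash iterations, rather than reading it directly.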
All you have to do is get out of the desperate mindset that "everything's controlled", because it's not true, at least not everything. We don't have the best security and privacy, but we have some of it, AND we need to fight to keep and expand it.
Now go on and tell me how smart you are, how you see things differently, and generalize stuff again in replies. Or you can skip the talk, stand up, and contribute to improving it.
1) Yes, you can build better encryption and privacy preserving technology. That is a technology and adoption problem.
2) There is only so much you can do once the law says you can't encrypt certain kinds of things and that, if you do, the state will come after you. That is a social problem. Our elected leaders aren't serving us.
Not mutually exclusive. We have to do both: build better tech, and educate others, especially our politicians, who can change the fabric of society.
I hate this sooo much! Having children really makes some people entirely unable to think. I've heard people entirely willing to outlaw any form of encryption to "protect" their children from pedophiles, despite the likelihood of getting their identity stolen, going bankrupt, and becoming unable to feed their children being significantly higher than that of a pedophile targeting their child, even now. And it would increase by several orders of magnitude if we suddenly weren't allowed to encrypt things like e-banking and email.
In the real world, many states freely admit that they will fight against secure, private messaging between citizens (say, because law enforcement needs a backdoor to solve crimes). And while governments can, and do, make laws to that effect, improvement will be legislated away beyond a certain point. This also produces a chilling effect on engineering: why work on a technology that will likely be outlawed if successful?
In most cases, when the government is making laws to criminalize X, trying to overpower it with better engineering just does not work. My 2c.
It was a generation of pioneers who weren't quite so timid about working on new technology and fighting absurd, archaic laws that built the foundations of modern consumer/commercial encryption. The U.S. government especially tried very hard to chill these efforts too, and failed (so far, at least).
What I don’t understand is why you seem to be suggesting that as an alternative to opposing legislation which would prohibit such work.
However, the second part (about the fait accompli) is a very important point.
You’d have to achieve widespread usage among the general population for this to be effective.
Treading shaky legal ground is not the same as circumvention.
This time around, if encryption is banned, they will do more than just hound Phil Zimmermann for years on end.
They’ll come after the end users, and ‘circumvention’ won’t help.
We today probably have the best privacy tools in all of recorded history. Modern encryption means that anyone on the planet can send a message to anyone else on the planet without fear of government decryption in transit (asymmetric public keys, Tor, PGP, pick your tools). Using freely available tools I can encrypt a file on a USB drive in a way that even an NSA data center running for a billion years wouldn't decrypt. Those sorts of things were not possible a hundred years ago. They weren't really possible only a generation ago.
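As a concrete illustration of those freely available tools, symmetric file encryption with the widely installed openssl CLI looks roughly like this (a sketch only; the filename and passphrase are placeholders, and real usage should prompt for the passphrase rather than pass it on the command line):

```shell
# Encrypt a file with AES-256, deriving the key from a passphrase via PBKDF2
echo "private notes" > secret.txt
openssl enc -aes-256-cbc -pbkdf2 -pass pass:correct-horse-battery \
    -in secret.txt -out secret.enc

# Decrypt it again and confirm the round trip is lossless
openssl enc -d -aes-256-cbc -pbkdf2 -pass pass:correct-horse-battery \
    -in secret.enc -out roundtrip.txt
cmp secret.txt roundtrip.txt
```

With a strong passphrase, brute-forcing the resulting file is computationally infeasible, which is exactly the asymmetry the paragraph above describes.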
The VPNs people will actually use are a poor trade-off that will leave most people with a false sense of security at best, and probably with significantly fewer rights over what happens to their data regardless.
Don’t disagree with you on the rest of the above.
I don't have the technical skills to improve it. Someone I inform might.
None of your list is better than nothing, if the authoritarians want your data. Except, maybe, Tor, and only if people contribute to running exit nodes.
If it isn't end-to-end, with only you knowing and controlling your keys, you are already doomed. In other words, you cannot trust any service with your keys. That includes HTTPS and Signal.
Because our priority is kids! Honestly! Seriously I mean it!
Not that that would be bad, it's just that there are norms for who-does-what in gov't, and overlap of responsibilities is usually bad.
This kind of rhetoric is so boring at this point. Trying to destroy a proposal via scope creep is intellectually dishonest and does little more than weaken your own side's position. If I were a senator on the fence about this bill and this was the best argument you could muster against it, it would be an easy yea.
Nobody anywhere is challenging the laws against child abuse. What they are challenging is laws that attempt to decrease it, or catch people doing it, through means that don't actually work and that harm other important things.
To use your example, it's like writing a law "against murder" that's written in a way that actually doesn't make murder illegal and also bans seat belts.
In fact, this "scope creep" you mention is exactly how laws work. First you make murder illegal (done), then you start looking at the common causes for murder and find ways to fix those.
I like the idea of federal legislature ceding power to state legislatures.
Additionally, it looks like encryption is offered more protections in this bill. Considering federal laws preempt state law, especially with regard to telecommunications, I do not see what the risks are in passing this (regarding encryption).
The bill amends Section 230(e) of the Communications Act of 1934.
> CYBERSECURITY PROTECTIONS DO NOT GIVE RISE TO LIABILITY.—Notwithstanding paragraph (6), a provider of an interactive computer service shall not be deemed to be in violation of section 2252 or 2252A of title 18, United States Code, for the purposes of subparagraph (A) of such paragraph (6), and shall not otherwise be subject to any charge in a criminal prosecution under State law under subparagraph (B) of such paragraph (6), or any claim in a civil action under State law under subparagraph (C) of such paragraph (6), because the provider—
Snowden is great, but do your own research.
Edit: I usually trust Snowden as well. I could be missing something in this bill. EFF did not provide specifics. Hopefully someone here can.
> Just a few months ago, Senator Lindsey Graham (R–SC) delivered an ominous threat to Apple, Facebook, and any other tech company that might refuse to kill encryption programs that prevent malicious hackers, law enforcement officers, and others from accessing our private communications systems: "You're going to find a way to do this or we're going to do it for you."
> Graham has authored the Eliminating Abusive and Rampant Neglect of Interactive Technologies Act of 2019 — or EARN IT Act [...]
I looked and could not find this. That article did not offer any specifics either.
Scroll to summary, specifically "Section 230 immunity for CSAM can be earned via 1 of 2 “safe harbors”."
Lindsey Graham has repeatedly sought to weaken encryption and mandate backdoors and key escrow. In June this year, following the EARN IT Act, he also introduced the Lawful Access to Encrypted Data Act (LAED), which would mandate backdoors:
LAED is extreme and has little support. It is widely believed that the LAED was not intended to pass, but rather to help pass the EARN IT Act by making the EARN IT Act seem like a more moderate and reasonable piece of legislation.
The EARN IT Act is really a ploy by Lindsey Graham and others to bypass Congress on an issue they cannot otherwise get passed, and to allow a small group of people who are not even security experts to develop regulations and mandates (which will probably target encryption) under the guise of fighting child porn.
"Unpaid" here just means paid in other means, exchaging political favors, getting hired by the benefitted companies, etc.
Of course the same goes for paid commissions...
I've never heard the commerce clause explained that way. Very cool.
Maybe it has something to do with "Notwithstanding paragraph (6)"
“(6) NO EFFECT ON CHILD SEXUAL EXPLOITATION LAW.—Nothing in this section (other than subsection (c)(2)(A)) shall be construed to impair or limit—
“(A) any claim in a civil action brought against a provider of an interactive computer service under section 2255 of title 18, United States Code, if the conduct underlying the claim constitutes a violation of section 2252 or section 2252A of that title;
“(B) any charge in a criminal prosecution brought against a provider of an interactive computer service under State law regarding the advertisement, promotion, presentation, distribution, or solicitation of child sexual abuse material, as defined in section 2256(8) of title 18, United States Code; or
“(C) any claim in a civil action brought against a provider of an interactive computer service under State law regarding the advertisement, promotion, presentation, distribution, or solicitation of child sexual abuse material, as defined in section 2256(8) of title 18, United States Code.
> "(C) fails to take an action that would otherwise undermine the ability of the provider to offer full end-to-end encrypted messaging services, device encryption, or other encryption services.”
My guess is no, since even with a key escrow scheme the messages can still be encrypted end to end. It's just that there is another party which may be able to decrypt them later.
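A toy sketch of that structure (NOT real cryptography; the XOR "cipher" and key names are purely illustrative stand-ins): the message is encrypted once with a per-message session key, and that session key is wrapped separately for the recipient and for the escrow authority, so the ciphertext itself stays "end-to-end encrypted" in form while a third party retains later access.

```python
import secrets

def xor(data, key):
    # Toy stand-in for a real cipher / key-wrap operation
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet me at the usual place"
n = len(message)
session_key = secrets.token_bytes(n)     # per-message key
recipient_key = secrets.token_bytes(n)   # stand-in for the recipient's key
escrow_key = secrets.token_bytes(n)      # stand-in for the escrow authority's key

ciphertext = xor(message, session_key)
# The same session key is wrapped twice:
wrapped_for_recipient = xor(session_key, recipient_key)
wrapped_for_escrow = xor(session_key, escrow_key)

# The recipient unwraps the session key and decrypts as usual...
assert xor(ciphertext, xor(wrapped_for_recipient, recipient_key)) == message
# ...but the escrow authority can do the same later, without the recipient's key:
assert xor(ciphertext, xor(wrapped_for_escrow, escrow_key)) == message
```

This is exactly why "the messages are still end-to-end encrypted" is a technicality under escrow: the wire format is unchanged, but the trust model is not.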
I don't, because I'm sure most state legislatures are even less informed on the importance of online encryption than Congress is. Doubly so if you live in a red state.
They were easily forced to comply with wiretapping demands of the government.
The moment a central platform controlled (most of) our communication and was able to provide security to the users by using encryption, legislators realized the danger and also the solution. They would just need to force these central hubs with laws as they did back in the day. Legislation was/is needed to break users' encryption and give government the access to all communication that was lost with modern forms of communication and encryption.
So on one hand, it is "just" a return to a previous state of affairs. On the other hand, given for example today's methods of automatically listening in and transcribing voice to text, this would be a way more massive intrusion and control mechanism.
This is a needlessly negative outlook. Click the link, support the EFF. Educate your friends and family.
Defeatism is not a compelling philosophy.
And I see people suggest that we aren't completely screwed because we got 10,000 people to walk down a street one time. It's an insane position, and we need dramatic measures that the human race in its current form is unable to enact.
* Climate change is going to happen because people like cars and supermarkets and nobody is asking anyone to _actually_ sacrifice anything. Much of the climate change movement is "someone else should do something about it", and those who cut back by going "off-grid" just make themselves quieter and are replaced by twenty more people with growing carbon footprints.
* Online security is fucked because the vast majority of people can't handle more than a modicum of detail. It's on the technical class (i.e. the 5%-15%) to protect it by hook or by crook. A democratic battle to preserve it will be lost, because people aren't interested enough to care and it's too complex.
My suggestion is that defeatism posits a better question to answer: "What do we do once we've lost?".
The answer to that question yields something useful instead of the suspension of disbelief that the human race will suddenly turn over a new leaf.
That's not to say it's not worth trying, but we should assume failure. Otherwise you end up like the UK's Brexit negotiators, who were "all-in" on the "oh, they'll buckle as time runs out" strategy and had no plan beyond "so what now?" if that didn't pan out.
The basic strategy right now seems to rest on the belief that humankind will figure out it's slowly destroying its habitat and change accordingly. I don't think we've even all figured out the extent of the destruction, let alone started to change accordingly.
What habits have we changed since the 1990s within our population to cut global carbon emissions? I'm struggling to glue a couple of things together. If I'm not mistaken, global air travel has _soared_ within that time window and everyone is just sitting on their hands under the mistaken belief that renewables will bail us out while we all still drive in a car to go to the supermarket.
It's hard to make progress while denying the actual nature of the situation.
Not saying that's necessarily true in either of these situations, just that it's not a waste of energy to spread depressing truths.
It certainly is, if those "truths" are not absolutes. Find a way to redirect your -- and others' -- energies toward something positive and constructive. Spreading doom and gloom, full stop? Always a waste.
However, telling a sad or difficult truth and stopping there is still a big improvement on pretending there was no issue in the first place.
Saying negative things is just negative, full stop, as already stated. You can make this positive by suggesting a solution or asking someone qualified to suggest one and then helping them achieve it.
I dunno, I'm not sure that's always an absolute. Example: your boss says the budget has been slashed and the department head will be downsizing the whole department by 50% in the coming months. This is very obviously a negative thing, but is your boss doing you a disservice by giving you a heads up? "Oh, but he should give me guidance that I should work diligently but also prepare resumes and look into other options." Of course they should, and it would be really nice if they could also find out for sure whether your head is on the chopping block. But let me ask you this: even if your boss doesn't offer this very obvious advice or have any insight into the department head's persona non grata list, is knowing about this negative truth a net negative for any employee it concerns? Of course not.
Or even more relevant to HN: a very outgoing individual who loves the face-to-face environment of the office has just been told that the CEO has decided everyone will be 100% remote starting next week until 2022. Is being told this negative truth a net-negative for the social butterfly employee who now at least has a brief chance to wrap their head around the idea and mentally prepare for the change instead of being blindsided?
Sometimes simply raising awareness that a negative thing is happening isn't a bad thing.
This is very pessimistic. People still can understand the dangers (and are coming to understand them) and switch to decentralized platforms, which, by the way, are actively being developed: https://joinmastodon.org.
Facebook and Google actually do a relatively good job of protecting our data. They have to comply with specific government demands, which some people don't like, but otherwise it's pragmatically safe.
'A million servers' here and there, without process, oversight, 'a lot to lose', lack of transparency, and it might be a whole lot worse for most people.
'Many leaks' at small companies may not have the public and regulatory impact as a 'single big leak' at G or FB.
For 'the tech literate' who know how to manage themselves, it may open up avenues of greater security, but for the net-plebes, not so much.
Think about the freedom that comes with ample food and health and lifestyle choices. Some people are incredibly more fit and healthy than any other people in history, but most of us are somewhat more sedentary: we eat too much and don't exercise enough.
If another user blocks you, that's on them but you have no right to push your messages into another user's home system.
If all else fails, you can host your own. It's the ultimate form of free speech as intended, and as someone who hosts a Mastodon instance, it's a system that works pretty well; there is no noticeable hate speech in my federated timeline.
edit: I might have misunderstood the GP, but I'll let the comment stand. Banning Mastodon in the US is a non-concern since it's a French open-source project.
But even if someone really wanted to get Mastodon banned and created a server full of illegal material to give politicians the fuel they'd need, the law enforcement answer there would be to go after that host. The overall network would not be liable because it wouldn't be hosting/distributing the material.
But I think GP might have meant banned at a government level?
Already happened in some countries.
BTW, someone wrote long ago here on HN that the 2nd Amendment, which is about weapons, nowadays should be about cryptography instead. Maybe:
> A well regulated Militia, being necessary to the security of a free State, the right of the people to [use end-to-end encryption], shall not be infringed.
Companies that are willing to play dirty are the ones that get investment and also are the ones who retain users. They have an inherent advantage over any platform that tries to be moral -- or alternatively platforms that try to be moral have an inherent disadvantage. As long as privacy and respectful user experience is on the bottom of the list of priorities of most users they are the ones who will be able to build momentum, and I can't imagine what could happen to change that.
Even the fact that this bill is proposed proves that government lost some of the control that it had over private conversations.
I installed Element, but it required a username/password pair. I understand that it's a bit more secure than using a phone number + email for initial authentication, but it makes discovery of friends too hard. It trades too much UX for security, just like PGP.
Perhaps the creators of the app could partner with some big providers to allow the app to try creating your account on one of these providers at random (and keep trying different providers if your intended name is already taken on one that it tries).
I also agree with you about discovery of friends being hard unless users provide their phone numbers to a central server, so perhaps there should be an option for that when creating your account. This central database could be run by an independent, audited, third-party service. I'm not sure who could be trusted in that role (perhaps Let's Encrypt?), or how much it would cost, but it's an interesting thought experiment.
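As a rough sketch of what that lookup could look like, assuming clients upload salted hashes of contacts' phone numbers rather than the raw numbers (the names and salt here are hypothetical; note that the phone-number keyspace is small enough to brute-force plain hashes, which is why production contact-discovery systems add further protections):

```python
import hashlib

def discovery_hash(phone: str, salt: bytes = b"example-service-salt") -> str:
    # Normalize ("+1 555 123 4567" -> "+15551234567") and hash, so the
    # discovery server never stores or receives raw phone numbers.
    normalized = "".join(ch for ch in phone if ch.isdigit() or ch == "+")
    return hashlib.sha256(salt + normalized.encode()).hexdigest()

# Server side: registered users keyed by the hash of their number.
registry = {discovery_hash("+15551234567"): "@alice:example.org"}

# Client side: ask which contacts are already on the network.
contacts = ["+1 555 123 4567", "+1 555 987 6543"]
matches = {p: registry.get(discovery_hash(p)) for p in contacts}
# The first contact resolves to "@alice:example.org"; the second to None.
```

Who operates the server and holds the salt is exactly the trust question above; an audited third party only helps if the scheme itself can't be reversed by simply enumerating all possible numbers.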
That's similar to what email does: it trades UX for [nothing] (I would put decentralisation there if it were not for Gmail etc.).
U.S. users are leaving Facebook by the millions, Edison Research says (marketplace.org)
1293 points by rmason on Mar 6, 2019 | 616 comments
The question isn't "What do 90% of people do?". The question is whether or not those of us who care about privacy have the ability to be private.
So stop being another brick in the wall: stop using the platforms just because it is convenient. For example:
If we're talking photos:
One day IPv6 will get rid of the platforms.
Previously I had “business class” internet but that is just the name of the plan. Anywhere there was cable I could get business class plans.
The one time I tried to get a business-class connection from Time Warner they refused to offer any business-class service to a non-commercial address except for their "Home Office" plan, which was basically just their top residential plan with a better SLA.
Well, you already control who you share it with, as you're the one initiating connections to sites like Facebook. It just happens people give a lot away to browse sites these days (admittedly exactly what they're giving away remains quite opaque).
Unfortunately even if we did move to one-platform-one-person, the question of data control remains as murky as ever. Suppose you are hosting a party so you send your street address to your friends so they know where to show up. Then they play a fun quiz game that tells them their Harry Potter patronus based on the street address of their friends (that means you). Suddenly some anonymous quiz maker (let's call them Oxford Synthetica) has access to your street address and at least one of your friends' info through no direct fault of your own.
It really is a case that if you give up then it will be over. Just make it a part of your life. I'm not asking you to rally every week. Just don't forget about it and positive change will come.
Perhaps - and most fall into this category.
However, you do have the ability to form a (simple, cheap) SCorp/LLC in the US jurisdiction of your choice and provision your own mail/dns/vpn/etc. by that corporate entity.
So now a corporate entity is the provider, and you are the customer, and notices/subpoenas/takedowns/requests will be seen by you and you will take action on them.
At the very least, you can self-provide your own VPN this way if you don't want to run your own mail services.
And if that is to the point that it scares you that they'd lock you up for such behavior then you need better government, not better privacy.
You are mistaken, counselor (c) The Descendants. These three easy things improve your privacy 10x:
- Incognito / private browsing by default. It clears cookies used by trackers and platforms. Staying logged in is bad: log in, do your thing, and close the tab. Not Chrome, though; it has its own ID.
- VPN. Hides your browsing history from ISP and your IP address from trackers and platforms. Smartphone too, and disable Location and Background app refresh for most apps.
- Adblocker (uBlock Origin / AdGuard for Safari, iOS too). Prevents trackers and malware from executing.
How is that different from what the US does? It seems to me an error to think that US companies are somehow the default, and any other country using its own services is sidestepping the issue.
By 'default' I don't mean to imply that this is desirable or healthy though.
? They are quite different.
Facebook is a social network. On occasion, with a warrant approved by an independent Judiciary, an agency may request on a case by case basis, information relating to a specific concern for which there are specific indications warranting a search - much like the search of your car or home.
WeChat is a control, censorship and surveillance network.
- WeChat, like >90% of Chinese companies, has CCP party apparatus working within the company to oversee operations and ensure loyalty to the CCP agenda. The US 'equivalent' would be the CIA having staff at Facebook to intercede in policy decisions and to make sure 'The Man in the White House' has his policy objectives met.
- WeChat censors everything. If you, right now, start saying something negative about Xi, it will likely get censored by one of the massive army of censors. The US equivalent would be 100% of Facebook posts going through a 'large office in Virginia' where President's political operatives oversee censorship.
- WeChat censors anything they want, for whatever reason. Winnie the Pooh comparisons to Xi? Banned. References to Tiananmen or the Hong Kong protests? Banned. The US equivalent would be Facebook banning all memes mocking Trump, and of course, banning any and all activity related to BLM, social justice, protests, any kind of history that contradicts the ruling American party's official view of the world.
- WeChat is used to identify networks of civil antagonists. Have you said something about 'Hong Kong'? Well, your friends are going to be tagged and are more likely to be monitored. The US equivalent would be government operatives sifting through the Facebook DB all day, using that information to monitor your friends' messages, because you said 'Black Lives Matter'.
- WeChat is used for a host of other things, including payments, meaning the breach of privacy is considerably more significant: everything you buy and everywhere you travel is tracked and monitored at all times.
1. You think that US intelligence agencies don't have any staff inside Facebook?
2. I mean, Joel Kaplan is right there. Not even a secret.
The US (and surely Russia, China) all have clandestine operations within Facebook. This is not part of a 'deal' with Facebook, it's regular spy-craft.
But Facebook does not drive its policy around the wishes of clandestine operatives within the company!
The US and Russia surely have clandestine operatives within Tencent as well.
But the CCP has something entirely different: legit, out-in-the-open entities there to oversee and ensure that CCP party policy and Xi's orders are effectively executed. If the executive team at Tencent didn't 'get with the program', they would be forced out. While they probably don't care about regular operating matters, they do in fact 'hold the real power'. For example, if Tencent decided not to censor certain subjects, action would be taken very quickly.
Not only does the CCP have minders in Tencent/WeChat - they have them in almost every Chinese company. These 'inner minders' are literally the direct apparatus of state control within the ostensibly private economy.
When we talk about 'state control of the economy' - this is literally it.
The US corollary would be Government officials in every single US private company, overseeing that everyone adheres to GOP policy for example. So not just Facebook, but Cisco, Disney, GM, Mattel, CNN, Morgan Stanley, B of A, Goldman - etc. - many of which would also be directly owned by the US government, as of course in China most major banks are nationalized. If the CEO of Goldman doesn't do what Mr. White House and 'The Party' want him to do - he'd be 'out' and someone more 'pragmatic' about their capitalism (i.e. make money but bend the knee where they have to) would be 'in'.
From TheGuardian : "(in Xinjiang) there are QR codes on people’s doors for when the party goes in to check on who is in. If someone leaves through the back door instead of the front door, that can be considered suspicious behaviour." - one of many chilling examples of control that is enabled by private companies, controlled 'from the inside' by CCP staff.
Chinese avoid the problem of being spied on by their government?
I know where I'm watched and how not to be watched; I know how to very easily hide the odd activity. The average citizen no longer gets that for free, and it really sucks for them, but remember that's what this has always been about: trying to improve the lot of the average, not us. We're fine because we know and can do.
So it's elliptic-curve time for messengers, but everyone who "knows" will get some European or underground US piece of software on their phone via an unofficial app store.
Freedom only works with some responsibility, but it seems to me that state level actors just try to keep up with the ad and tech industry in collecting data.
The EFF tries to blame it on Barr, but I think the wish for control is bipartisan, especially since a generation is in charge that doesn't really understand the problems of this data collection. Not that their younger compatriots seem more promising in assessing actual problems.
There might be a point where using an Asian or African social network would be preferable, but that is not something on the mind of common social media users. Sure, they do it if the product has appeal, like TikTok. On bad days I wish users of social media (the self-presenting kind) would be banned from all other sites.
Feinstein is always on the wrong side of copyright/patent/encryption bills. Just look up her voting record. She co-sponsored PIPA. See also https://www.wired.com/2016/04/senates-draft-encryption-bill-.... She co-sponsored the Sonny Bono Copyright Term Extension Act, too.
She's hopeless on this.
A lot of big companies have old and crusty leaders / owners, who do not understand the terms of this problem.
I wouldn't be surprised to see this bill pass and then get nerfed in a few years, once big corporations actually experience the operational cost and lobby to have it rolled back / made optional for them.
The thing we have to come to terms with as a society is that sometimes [very big bad] will happen, but a lack of back doors is more important.
Yes, there are insecurities and holes in the above, pending the methods of implementation, but it's one level of potentially many.
And whilst it's good to know that there are technical workarounds such as this, the real work, the real progress for society, is to make these technical workarounds unnecessary by, as the EFF says, contacting your representatives and letting them know what their constituency thinks. Politics, sickeningly, is the only avenue for worthwhile change.
It seems that the world has somehow forgotten the lesson. People can only see the centralized systems and don't realize that the problem they are trying to solve is not solvable in general. Encryption is unstoppable.
Notarize your apps, buddy, and sign into your app account... ;)
You can make as many PGP identities as you want and associate them with any sort of identity you desire, so there might be a political point available with respect to anonymity as well in some situations.
If this passes you can be sure anything digital is no longer safe from alphabet agencies. Sadly the younger generation has no concern for privacy and the older generation likely has no clue their data is being siphoned. Advocates are too much of a minority to make a difference.
Yes, I'm aware of some of the issues surrounding IP copying. I won't call it theft because the word is contentious and there are those in this forum who don't think you can steal ideas.
I'm also aware of the US' own history of gathering IP to bootstrap its industrial efforts. I guess that "but you did it first" isn't a good defence in law (not really sure of that, after all IANAL) but it makes the US case weaker in my eyes.
Do you think that's related to China allowing zero domestic tech competition, but demanding that their companies be allowed to compete in the rest of the world?
The main issue with something like this though isn’t really about the rest of the world catching up to the US technologically. It’s a question of whether service providers would choose not operating in the US over following their rules. But given the size of the US market, you’d find a lot of service providers choosing to follow the rules.
Which is incidentally the same way EU nations enforce their laws on companies like Google. And how countries like France, Germany and the UK would enforce the anti-encryption laws they've been trying for years to get on the books.
I wonder if anyone in the last decade has _ever_ walked into their rep's office, talked to them, and changed their vote on a key issue. I really doubt it.
The age of true democratic participation is over. Politicians have no need to care about the ordinary constituent to win elections.
They just need to win over the elite, and know how to deliver their scripts.
That can only be broken in one of two ways. Either people don't care enough or they grow so distrustful of it that they don't participate in it or keep it in check.
When things like that happen you get elected officials decided by less than 40% of the country, anti-maskers, and asymmetric warfare in the streets to combat "the man". It's untenable.
We should all have to log on with our identification through a blanket "know your customer" law, like the one used to combat money laundering in banks, and our country of origin at least should be displayed. Other than that... You want encryption? Fine, but if your ID links you to 4chan offshoots planning domestic terror attacks, I should be able to report your address to the authorities, and there should be social repercussions for bad behavior. And I need to know whether you're Russian or from the US, and that you're human, before I engage in US politics with you.
Music got worse since Napster, Sean Parker was the guy who got Zuckerberg funded, Google is evil, and John Perry Barlow is dead.
Others who know history much better than me are far better equipped to debate against your position. If you really want a challenge, I would invite you to reply to the top comment in this thread by Jon_Lowtek to help open a discussion on the deeper merits of your position. For me personally, your arguments aren't very persuasive.
I am aware of history enough to know that the Internet was designed by a bunch of people who talked on ham radio and wanted a secure place to sell drugs to each other without being caught by the fuzz. They hated Ma Bell. That's why it's designed the way it is. Then later it became the RIAA and the rest. I know all of that, and if anything it's hardened my position on this, not weakened it. We don't live in the 1960s or the 1990s anymore, and what's worse is that everything people said during that time about how society would be if we went "full retard" into "fight the man" hipsterism and the subsequent deregulation of everything actually happened. Nobody laughs at Al Gore or Tipper these days, or Lars Ulrich for that matter.
Do you like musicians making a penny on a song, leaving no room for art, just lowest-common-denominator crap; the rise of the Marxist/fascist camps that are hurling what was once a promising country that was getting over its growing pains straight into 1920s/30s Germany at lightning speed; the totally unfiltered and unmoderated filth that is 4chan; etc.? Or do you just ignore it because "some day blah blah blah a dictator might take control", when we are the closest we have ever been to a dictatorship because of faceless social manipulation techniques where we have no clue who or where anyone we talk to is, whether they're paid, etc.?
Not everyone is an engineer out looking for how things work. It's the same error the founding fathers made with the Enlightenment: thinking everyone would become smarter if you just opened everything up. It's complete baloney; people will just use the power vacuum to seize it for themselves because they make the most noise. Nobody cared about Aaron Swartz except the hacker community. Snowden sold out to the Russians from the beginning. And you're sitting there, still in the year 2000, when all this stuff was new and the only ones who spoke were nerds and we all agreed on basically everything.
The point is that everything that this theory set out to solve, became worse. More nationalistic, more poor musicians with even bigger giant platforms controlling their art, more corrupt, more propagandized, etc. Not less. Everyone knows it. The other point is that instead of tearing stuff down and engaging in subversive activities for the same ends, maybe it's time to focus on making the systems you have better. Not turn it into some weird mix of The Matrix, Philip K. Dick, and William Gibson. They're just as bad of an instruction manual as 1984 was to be honest.
You know who benefits from VPNs, privacy, and security?
The proud boys. Russia. China. Steve Bannon. And their bots, that's who. They are a threat to civil society at present, and they're using that very privacy and security to organize to destroy the thing you're trying to "protect" by drinking Barlow's kool-aid.
- block access ala GFW, ensuring that most people will have difficulty accessing it or using it
- block access to any data you cannot decrypt or from an endpoint you cannot backdoor
- go after creators and ensure some kind of backdoor is inherent to the project
- shut down projects by exerting pressure on developers
- run some kind of propaganda campaign on the evils of using unsanctioned software (supporting terrorism etc.)
- do nothing, knowing that most users will avoid using anything that isn't one of the major web platforms
Depending on the sophistication of that solution, you could imagine some forms of tunnelling being effective against it (IP-over-X). Then of course, due to the complexity, this workaround will be used by a tiny fraction of users.
>- block access to any data you cannot decrypt or from an endpoint you cannot backdoor
steganography would be a solution to this: you can decrypt the cat pictures I'm exchanging with friends, but you may not be able to notice that those images have hidden content (which may itself also be encrypted)
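To make the idea concrete, here is a minimal LSB-steganography sketch in Python: the message (with a 4-byte length prefix) is written into the least significant bits of a raw pixel buffer, where it is visually indistinguishable from the original. Illustrative only -- real tools add encryption and work within actual image formats.

```python
def hide(pixels: bytearray, message: bytes) -> bytearray:
    # Prefix the message with its length, then spread the bits (MSB first)
    # across the least significant bit of each cover byte.
    bits = []
    for byte in len(message).to_bytes(4, "big") + message:
        bits.extend((byte >> i) & 1 for i in range(7, -1, -1))
    if len(bits) > len(pixels):
        raise ValueError("cover image too small")
    out = bytearray(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit
    return out

def reveal(pixels: bytes) -> bytes:
    def read_bits(n, offset):
        value = 0
        for i in range(n):
            value = (value << 1) | (pixels[offset + i] & 1)
        return value
    length = read_bits(32, 0)
    return bytes(read_bits(8, 32 + 8 * k) for k in range(length))

cover = bytearray(range(256)) * 8   # stand-in for raw RGB pixel data
stego = hide(cover, b"meet at dawn")
```

A censor inspecting the image sees ordinary pixel data, although statistical analysis can still detect naive LSB embedding; serious schemes randomize the embedding positions for that reason.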
>- do nothing, knowing that most users will avoid using anything that isn't one of the major web platforms
this seems to be a guaranteed-to-succeed solution. Probably much better than
>- run some kind of propaganda campaign on the evils of using unsanctioned software (supporting terrorism etc.)
since there is always a risk that this may backfire and encourage resistance in some groups
>shut down projects by exerting pressure on developers
The developer would probably just go to the press with that; no need to damage your own privacy-centric project. Sure, the NSA can say they never did that, so the developers can ignore it, OR they openly say it was them, and then the project can just change countries.
>run some kind of propaganda campaign on the evils of using unsanctioned software (supporting terrorism etc.)
>do nothing, knowing that most users will avoid using anything that isn't one of the major web platforms
Those are probably the only viable solutions, but on the other hand it's pure marketing for the product, like banning Telegram in Russia.
Because I don't like pressure from secret services... do you?
OK, with one exception... Xenia Onatopp
Plus if you really really want to know what's happening there, call the NSA. They'll send a few chaps to join the matrix dev community. Or they'll use other tools to access the machines in question. And you can do that to the 0.01% of people that actually switch after mainstream tools become insecure in a way you can't when there are billions of people using encryption by default.
Not sure if they'd give me any information as a Swiss citizen :-)
>They'll send a few chaps to join the matrix dev community
That's probably more on the James Bond side of reality
>Or they'll use other tools to access the machines in question
Yes, that's what they probably will do, but that is targeting, not a backdoor / breaking encryption by law.
They're also involved in the Coreboot alternative BIOS project:
One of the key revelations from Snowden was that all that stuff that seemed far fetched or paranoid to us, wasn't just happening, it was routine and much more advanced stuff was also going on.
It's also much easier to target Matrix etc. when you don't have to target all of Facebook because you have a backdoor there. Reducing the NSA's workload by 99% makes the other 1% much easier, because they have 100 times the resources to spend on it.
You should also consider as a swiss citizen that US law is increasingly world law. If you sit in Switzerland and contribute to a project banned by US law, they may well have you extradited to spend the rest of your life in a US prison. They're doing that to Assange right now. They've done it to others as well. Locking up foreigners is quite in-style in the US at the moment... I haven't read the bill, so I don't know what the criminal sanctions are for failing to provide backdoors...
Best of luck
>You should also consider as a swiss citizen that US law is increasingly world law
That's bullshit; it was probably half true in the Cold War era, but for sure not today.
> They're doing that to Assange right now.
That's England, you know those little dogs of yours.
In the Cold War it was true of the Western world, and the US's use was limited because the US wanted to look gentle and reasonable. Today it's true for most of the planet, and the US wants to look tough.
If you don't like the Assange case (I agree the Brits are lap dogs, but that case actually got kicked off in Sweden and was handled under EU law much more aggressively than in the UK), try the FIFA case. The US ordered Switzerland to arrest and extradite foreign nationals for trial, many of whom had never set foot in the US, and Switzerland obeyed...
I don't like any of this. But it's time to abandon our innocence. If you're working on any meaningful project in encryption, the NSA is at least aware of you. The US is happy to use covert methods to undermine you if they think it's worth it (and the bar is low). If you're important enough (or a DA wants to expense some flights), they'll use overt methods. Proceed at your own risk, and don't assume that America banning encryption doesn't make it illegal for you just because you're a Swiss citizen in Switzerland.
No problem with that, from the article:
>With wire fraud, one needs a wire that originates in the US
Sometimes we even ask for it...that's normal international business:
>Proceed at your own risk and don't assume that America banning encryption doesn't make it illegal for you just because you're a Swiss citizen in Switzerland
Yeah, I'll stop here...
BTW, from your first article, about Android:
>So, if it’s not looking to plant backdoors, what’s the NSA’s business with Android? Ironically, the agency has been working to make Android more secure.
>It is just as preposterous to think that the best way to gain access to any operating system is to publicly announce that you are contributing to the OS, and make the tainted code accessible to anyone with an interest in it.
So it was NOT Snowden, but NSA itself.
From the second article, about Coreboot:
>Myers published a paper about STM last year on how NSA’s STM implementation could work. All Coreboot code, including all the STM contributions from the NSA, are open source, so anyone could verify that there is no backdoor in there -- in theory.
Even if the project is open source and development is distributed, there is often a major entity behind it driving it on which the requirements can be enforced.
Again, I was talking about the open-source Android, not the Googlified closed version.
>for Matrix force the hosted Element.io instance to provide it
How? If they are not US citizens?
The parts that control doing calls, connecting to cells, etc.
They are called "Radio" firmware.