
Privacy underground: a class on privacy, held at a secret location - znpy
https://kategreen28.org/privacyug
======
jancsika
I've harped on this before, but I think it's relevant here, too-- I'd love
some kind of FLOSS sandbox, a FLOSS centralized social network, and a FLOSS
"good-faith" adversary group to help invalidate ineffective software/tech.

If someone starts down the rabbit hole of encrypted email, Tor, key-signing
parties, etc., there should be a safe place they can go (physically and/or
online) to get absolutely walloped by good-natured hackers who will do
everything short of permanently harming the person's hardware/reputation to
show them they don't know what the hell they are doing.

Without this, these efforts will fall flat because the users (and even devs in
many cases) aren't doing a good job of testing whether their systems work. The
only sure way to test them is to get the internet to hate you, and that's too
big a price to pay for testing a privacy system.

It's a bit like someone running tests and celebrating their algo as the most
performant without realizing the compiler simply optimized out the algo.
Chances are that person is going to have a bad time when the thing goes into
production.

There's also a fringe benefit, which is that users/devs who don't give up and
continue getting walloped will quickly realize the value of credential
revocation and tell others about that value.

Edit: just so it's clear-- the point is for the user/dev to be able to enter
the group/place when they wish to be taught, and _leave_ when they are done.
That's the current problem with testing by making the internet hate you-- you
can't turn it off.

~~~
CM30
Yeah, I'd like to see something like this too. The TV show Hunted might be a
real-life opsec version to some degree, but a system that is online and open
to anyone who wants to participate would be a lot better. It could even be
tied into a bounty system, where you put money in and ask the site to share
it among whoever manages to find your personal information or bypass your
systems.

------
pit2
You already failed the exam if you clicked the link from your Facebook-
signed/Google-signed browser.

~~~
blhack
You failed the test by knowing what sorts of information your browser shares
and making a decision based on that information?

------
ryandrake
> If you think you are local to me and want in on the location, please drop me
> a tweet and we can connect securely.

Secret class on privacy, organized through a public social network...?

~~~
znpy
No, the public communication channel is just used to establish a private
communication channel (e.g. gpg-encrypted email).
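The pattern znpy describes, bootstrapping a private channel over a public one, is essentially key exchange. A minimal Diffie-Hellman sketch in Python's standard library gives the idea (toy parameters for illustration only; this is not the gpg workflow itself, and real use should go through gpg or an audited crypto library):

```python
import secrets

# Toy Diffie-Hellman key exchange: the values marked "sent publicly" could
# travel over Twitter or any other public channel; only the exponents
# a and b stay private.
# NOTE: toy parameters for illustration -- real deployments use vetted
# groups (e.g. RFC 3526) or, better, an existing tool like gpg.
p = 2**127 - 1  # a Mersenne prime, fine for a demo
g = 3

a = secrets.randbelow(p - 2) + 2  # Alice's private exponent
b = secrets.randbelow(p - 2) + 2  # Bob's private exponent

A = pow(g, a, p)  # sent publicly: Alice -> Bob
B = pow(g, b, p)  # sent publicly: Bob -> Alice

# Each side combines its own private exponent with the other's public value.
shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)
assert shared_alice == shared_bob  # same key material on both ends
```

An eavesdropper on the public channel sees p, g, A, and B, but not a or b, so they can't derive the shared secret; that's what lets a public tweet safely bootstrap a private conversation.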

------
greggarious
Author claims to be privacy-conscious, yet has a site that requires
JavaScript to render properly... interesting.

------
generaltsos
It appears that this site has been embraced by the HN-hug-of-death.

~~~
aaronchall
Once you share something private, it's no longer private, is it?

~~~
msla
Being able to share a private thing and have it remain private is the whole
point of encryption.

~~~
ben509
Encryption deals with secrecy, though. We do talk about "private" and "public"
in encryption, but that's an abuse of the terms. "Private" is shorthand
meaning inside a secure perimeter, and "public" means outside the secure
perimeter.

~~~
msla
It's more interesting to differentiate the _social_ meanings of "private" and
"secret": When I send a letter in an envelope, it's private. It's not for your
eyes. It might not be secret, however, because I might not mind the
information in the letter being publicly revealed.

That distinction between "private" and "secret" is something the opponents of
encryption forever try to erase, by claiming that things which are not secret
should not be private, which has the effect of making privacy suspicious by
default, instead of unremarkable. It's the snail-mail equivalent of trying to
shame everyone into sending their mail using postcards, so someone who sends
"a lot" of envelopes is now a target of suspicion, for some arbitrary
definition of "a lot" which changes based on context.

------
tway20180412
Think of privacy as a coordinate on two axes: Action (how visible the act is)
and Actor (how visible the identity is). There's the intended coordinate, and
there's the actual coordinate. If an Actor performs an Action at an intended
coordinate, when should the actual coordinate be allowed to differ?

For example, Action (posting this comment), Actor (me). Intended (Action=High,
Actor=Low), Actual (Action=High, Actor=Low). I intend the Action to be
publicly available over the internet for anyone to read, and it is unlikely
Dang will delete this comment. I intend the Actor to be hidden, because this
account is a throwaway and has no other identifying comments. My VPN's IP is
in HN's private logs, which are unlikely to be leaked.

Some scenarios are clear:

- A murderer intending for (Actor=Low) should have (Actor=High) and be caught.

- Using the restroom intending for (Action=Low) should have (Action=Low).

Other scenarios are more nuanced:

- Offshore accounts (Actor=Low) leaked by journalists in the Panama Papers
(Actor=High).

- Business trade secrets (Action=Low) revealed during a product liability
trial (Action=High).
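The two-axis model above can be sketched as a small data structure (the names `Coordinate` and `PrivacyEvent` are hypothetical, just one way to formalize the comment's scheme):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Coordinate:
    action: str  # "High" = the act is publicly visible, "Low" = hidden
    actor: str   # "High" = the identity is known, "Low" = anonymous

@dataclass(frozen=True)
class PrivacyEvent:
    description: str
    intended: Coordinate
    actual: Coordinate

    def violated(self) -> bool:
        # In this model, a privacy failure is any mismatch between
        # the intended coordinate and the actual one.
        return self.intended != self.actual

# Posting from a throwaway: public action, anonymous actor, as intended.
post = PrivacyEvent(
    "posting a comment from a throwaway",
    intended=Coordinate(action="High", actor="Low"),
    actual=Coordinate(action="High", actor="Low"),
)

# Panama Papers: both the action and the actor were meant to stay hidden.
leak = PrivacyEvent(
    "offshore account exposed by journalists",
    intended=Coordinate(action="Low", actor="Low"),
    actual=Coordinate(action="High", actor="High"),
)

print(post.violated())  # False
print(leak.violated())  # True
```

The nuance the comment raises lives outside `violated()`: the murderer and the Panama Papers are both "violations" under this definition, yet most people would say those mismatches are justified.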

