
Snowden: Tech Workers Are Complicit in How Their Companies Hurt Society - AndrewBissell
https://www.vice.com/en_ca/article/wxqx8q/snowden-tech-workers-are-complicit-in-how-their-companies-hurt-society
======
seesawtron
Previously discussed here:
[https://news.ycombinator.com/item?id=23643777](https://news.ycombinator.com/item?id=23643777)

------
creato
> and a letter signed by at least 1,666 Google employees demanding the company
> stop selling technology to police departments.

I had to read this link to see what it was talking about. Maybe some
ridiculous AI crime prediction algorithm?

> Google employees also expressed frustration with Google's praise of police
> departments like Clarkston PD that used _G Suite software_

Err, what the hell? Is someone supposed to be ashamed of selling email and
docs/spreadsheet software to police departments? Of all the things Google
does, selling G Suite is probably one of the most "moral". It's a simple
exchange of money for basic useful services, with few/no strings attached.

What am I missing?

~~~
trevyn
complicit: “choosing to be involved in an illegal or questionable act.”

If you feel that the act is questionable, you know that you are involved, and
you choose to remain involved, you are complicit.

Trying to morally distance yourself by coming up with justifications is self-
hypnosis. Which is fine, but it’s deceiving yourself.

~~~
AnthonyMouse
Should Google and everyone else also stop paying taxes, because they fund the
police?

~~~
trevyn
If you are assuming that this would be absurd, that assumption and the belief
system it rests on may be preventing you from seeing and understanding what is
happening.

There is a long history of people doing personally uncomfortable things after
checking their internal moral compass.

------
ChuckNorris89
Unless your work involves helping people, treating or educating them, that's
basically most industries, no?

Automotive, aerospace, consumer electronics, travel, advertising, mining,
shipping, food, meat, chem, pretty much anything related to consumerism, all
hurt society or the environment to some degree in order to benefit their
customers and shareholders.

SW engineers are an easy target for the mainstream media since _"they're paid
huge sums for just sitting in front of a screen all day, so they can't be
adding as much value to society as a physical laborer, for example"_.

Edit: Of course, the main thing is whether the damage done to society or the
environment is outweighed by the benefits provided in return, as with the food,
energy and pharma industries.

~~~
balloob
I work in home automation (Home Assistant) and we can improve our customers'
lives with privacy and local control without hurting society. And I bet a lot
of other industries can too.

~~~
raxxorrax
How would you add privacy compared to a non-automated house? I developed a few
devices with voice interfaces.

In normal cases you exchange privacy for convenience.

------
WrtCdEvrydy
Of course we are, we get paid to sell out those beneath us for a dollar.

I liked to think I was helping people in my younger years but as I got more
and more involved in ad tech, we just optimize for maximum profit no matter
what else.

~~~
melvinram
> Of course we are ... more involved in ad tech

Not all tech is ad tech. Please don't lump all tech into your statement of
"get paid to sell out those beneath us for a dollar".

~~~
luckylion
The cynical response is probably that all tech is either ad tech, uses ad tech
or builds things used by ad tech.

You don't really need ad tech to make that statement about selling out those
beneath you applicable for most tech though. Work on self-driving cars? Good
bye to people earning money driving cars. Yes, you can rationalize that by
saying "no, we're just automating what can be automated to free human
productivity and make sure their capacity isn't wasted on these trivial
tasks", but this isn't Star Trek and they won't be freed from driving cars,
they'll be out of a job.

Yes, there are exceptions, but yes, they are exceptions.

~~~
paledot
While also saving thousands of lives a year. Morally, the calculus works.

------
wavesounds
As engineers in tech, we have way more power than we realize. It's not a
coincidence that the biggest, most successful tech companies also have some of
the best engineers working there. It's the engineers who made those companies
successful, not the other way around.

Wherever you work you have a responsibility to make sure your skills are being
used for something you support. If you can't convince the leadership at your
company to do the right thing then you need to leave and stop helping them. Go
work for their competitors who are doing the right thing. There are tons of
companies that pay just as well and aren't bad for society or the planet.

~~~
kmlx
> It's the engineers who made those companies successful not the other way
> around.

looks reasonable at first, but not as clear-cut as one might think.

> Wherever you work you have a responsibility to make sure your skills are
> being used for something you support.

i love how rosy the world looks when i read this.

i've got acquaintances that fully support the arms trade, spyware, gambling
etc. they love it; they moved house for these kinds of jobs.

moral of the story is don't assume too much.

------
gorgoiler
Some things are objectively bad and should be regulated by society. Subverting
the web for tracking and hoarding personal data are two big ones.

Regulation means laws proposed by a government, and it also means either an
impact on the profits of entrepreneurs (n., Fr. for _middle man_) or at least a
lot of press from entrepreneurs about how this will stifle innovation and stop
the next generation reaching the American Dream: _you know what’s cool? A
billion dollars_

It’s easy to hop, skip, and jump through all sorts of stages to end up at crazy
conclusions, but I don’t think it’s too much of a stretch to point to
materialism, wealth inequality, and love of money as being serious
corruptions. A little modesty and humility goes a long way. Maybe don’t [faux]
celebrate the next massive tech exit as the top news item on HN?

~~~
unishark
It seems like a tautology to say subverting is objectively bad. When does
tracking become subverting though?

Also it's a bit ironic to decry both materialism and wealth inequality at the
same time. I guess if you're concerned about the wealthy's souls or something.

~~~
gorgoiler
I don’t want to overcome wealth inequality so we can all own Porsche SUVs.

I’m not interested in facilitating mass materialism. Materialism is the
problem, and wealth inequality a symptom, in my experience.

~~~
unishark
> I don’t want to overcome wealth inequality so we can all own Porsche SUVs.

Indeed quite the opposite. But if some do care to own them, that doesn't have
to bother you.

------
AnthonyMouse
I still think this is primarily a result of the financial regulations that
effectively make it impossible to have an efficient anonymous digital payment
system for small transactions.

The result is that adtech is the only other means to fill that need (you pay
with data instead of money), so it does, so it's everywhere when it ought not
to be.

You can make the case that people should refuse to work on it, but that's a
lot easier to achieve in practice when the alternative isn't prohibited by
law.

We need something equivalent to cash that works on the internet.

------
ma2rten
The article seemed incoherent to me. It went from "people who write
spreadsheet apps are being subverted" (how?) to "the police are flying drones"
to "tech workers need to stand up".

I haven't watched the interview itself; I hope it's more coherent and it's
just VICE's summary that isn't.

------
JohnTClark
Every time I read this kind of message, I feel like it's propaganda trying to
make the West weaker by convincing people not to work on military technology.
What you make can hurt people, yes; some people deserve to be hurt.

~~~
jsinai
Just throwing this out there since we’re talking about morals: who deserves to
be hurt and how do we decide that?

One of the big issues with tech involvement in security and defence is the
shift in onus of responsibility from person to algorithm. Just this week we’ve
seen a story top HN about a black man who was falsely identified as a
terrorist by a facial recognition algorithm [1]. Closer scrutiny showed that
the terrorist score was 52%, barely more than a coin flip. Even closer
scrutiny showed that the image was grainy. Not once was a human called into
the loop to assess the algorithm.

Maybe the intention was good (let’s use facial recognition technology to catch
terrorists), but the tech itself was flawed and there was a failure to imagine
scenarios where an innocent person is denied their liberty because the tech
didn’t work. Add to this the racial inequality, and the lack of empathy
becomes deeper. Would anyone feel comfortable deploying facial recognition
technology if there was a 50% chance (a coin flip) that they themselves would
be hit? Finally, this goes beyond race. Who (in the West) decides who (not the
west) should get hurt (be killed really) because they are collateral? Again
the empathy gap decides that life is cheap on the other side.

The problem with tech culture is that we don’t like to imagine scenarios where
it just simply isn’t up to the task. This can become a matter of life and
death (or liberty) when security and defence are involved. At no point am I
saying that countries should not invest in technology for security and
defence, but that rather such systems are weak if they’re merely brute force
(even a conv net can be trained to brute-force its test accuracy) and not
highly accurate. And right now the technology isn’t ready for deployment.

[1]:
[https://news.ycombinator.com/item?id=23628394](https://news.ycombinator.com/item?id=23628394)

~~~
unishark
The application of AI in security is primarily needed due to the enormity
of the problem, not due to its ability to replace people. There simply isn't
enough manpower or time to train experts and have them examine all the data.
You can argue some of the data should not be examined at all for privacy
reasons. But there are plenty of scenarios where privacy concerns won't hold.
Such as comparing the security camera photo of a bank robber to mugshots.

The question of when it is reliable is a big one people have been working on
a long time. For example in medicine the stakes are even higher. The problem
is simply that recent technologies (deep learning) have taken a huge leap
forward in performance, but a huge leap backwards when it comes to being able
to assess confidence.

------
snarfy
Citizens are complicit in how their governments hurt society. I'm not sure
what he's getting at.

------
igravious
Leans back in rocking chair, tamps pipe.

When I was younger I used to code in C and C++. Borland compiler and IDE tools
were my favourite. From what I recall Microsoft stomped all over them. It was
my first introduction to Big Tech. I loved DOS (yes, DOS!) and I was getting to
learn Windows but I felt that Microsoft was stifling its competition by
abusing its dominance of the PC OS market. I was young and naive and had never
heard of Free (as in freedom) Software and Open Source was a term that was yet
to be coined.

(I'm going somewhere relevant with this.)

When Linux came along it made a few small ripples, I jumped on board when it
was maybe six or seven years old – a beat up PC running Slackware was being
used as a router connected to an ISDN line in a business park I interned at.
Some dude explained roughly how it worked and gave me a burnt CD. I had to
figure out how to load CD drivers manually, compile the kernel manually, I
learned so much. And in time I learned about FOSS and GNU and Stallman and the
GPL. I thought, wow – that's a neat hack – using the legal system to guarantee
freedoms. Giving people control over their devices, and way less of an
opportunity for a big company to stifle it. I was a convert.

We've been at the Ad/Data Lock-in/Surveillance Tech equivalent of Stallman and
the mythical printer for over a decade now. I personally think our modern
Borland moment was when Facebook was allowed to buy Instagram (2012). That
should _never_ have been allowed by regulators. Not to mention WhatsApp
(2014). Same for Google buying Android (2005), DoubleClick (2007), or Nest
(2014). Not to mention Amazon's acquisitions. Nor Apple's. I'm sure there are many
more such examples. Someone mentioned this article
([https://promarket.org/2019/12/09/the-lack-of-competition-has...](https://promarket.org/2019/12/09/the-lack-of-competition-has-deprived-american-workers-of-1-25-trillion-of-income/)) recently about the
concentration of corporations in the US, I made a point of bookmarking the
link. And that's before we even get on to the topics that Snowden brought to
light.

It's hard not to become totally cynical. We need the contemporary equivalent
of Stallman and Torvalds to do to Big Tech what GNU/Linux has done to
Microsoft – and it was for Microsoft's own good, they're a much better company
now! Linux could not have succeeded without the GPL. Do we need another legal
hack to spread from the USA to the rest of the world? I'd say probably. Back
in the day there were calls to break Microsoft up into a PC OS and Office
tools divisions. There appears to be a complete unwillingness in the US to
break up or prevent the formation of abusive monopolies in tech. Until
antitrust regulators get their you know what together I think we need to
legally mandate that key tech standards are federated. I cannot think of any
other solution. We need to force Facebook and Twitter to plug into an ITU
([https://www.itu.int/en/Pages/default.aspx](https://www.itu.int/en/Pages/default.aspx))
like standard for a start. (I think we can fix Amazon (and a whole lot more)
by forcing them to pay their lowest paid workers and sub-contractors a whole
lot more – but that's a whole other topic.) Google I don't know what to do
about – possibly get them to divest Android? Force Android to be more open?
It's already quite open though. And so is Chrome. They do have a complete
monopoly on video streaming and censor and demonetise content in
unpredictable, illiberal, anti-democratic and un-free ways – Facebook and
Twitter are both unacceptable in this regard also in my eyes.

And it'll only get worse. Microsoft didn't change until they were forced to
change. The lack of uptake of Mastodon and Diaspora proves that the FOSS model
is not enough. We need FOSS++.

[https://en.wikipedia.org/wiki/Template:List_of_mergers_and_a...](https://en.wikipedia.org/wiki/Template:List_of_mergers_and_acquisitions_by)

------
alkibiades
how can a 25 year old say no to 400k a year :(

i can FIRE with that in a few years of complicity

~~~
unishark
Well, if it's for the military or police, many will refuse, apparently. But
they don't have as big a problem with tech companies having the power.

