Ex-Facebook Executive: “You Don’t Realize It but You Are Being Programmed” (medium.com)
160 points by Jerry2 9 months ago | 39 comments

In a larger context beyond Facebook, how could human life be anything other than programmed? Any exposure to culture is literally a process of being programmed, from shopping at Best Buy, to watching a movie in a theatre, to taking a flight with first-class and economy cabins. Maybe the argument here is "the short-term, dopamine-driven feedback loops," but is there any study showing that long-term loops aren't just as powerful as short-term feedback programming? Or I guess the concern here is the centralization of programming done by one company, but then again, while the process is centralized, the content is still diverse and mostly created by those who would have done the programming outside of such a process.

A fair argument could be that the programming now is much more organized and can be done more efficiently toward a certain goal, but I don't think more efficiency in this context is necessarily a concern - most concepts that can be programmed can also be re-programmed just as quickly. If Facebook is willing to expose a button to de-program a user, it should be capable of doing that.

We might become inconsistent as individual human beings from this - what I believe yesterday differs from what I believe today or the next hour because of Facebook - but later on some other programming might ease this confusion and convince you that nature is just as inconsistent as you are.

You seem to think the problem lies only with people being programmed to buy things - the word choice probably isn't helping here - but the real issue is the validation loops people experience on social media.

It's not that someone else is programming you to vote for candidate X instead of candidate Y, or that now you want to buy brand Z; it's that we are rewiring our brains to follow short-term feedback loops in how we act within society. I.e., we no longer go places because we want to experience something new, but to share the fact of going somewhere with our networks and get validated by them. I think that's the bigger issue here.

> ... but to share the simple fact of going somewhere with our networks and getting validated by them. I think that's the bigger issue here.

This isn't a new phenomenon. We see this in the form of companies' branding strategies. The more you see a brand on the people you respect, the more likely you are to buy it.

A couple of examples.

1. Michael Jordan and Nike. Air Jordans became the shoe to own in the 1980s.

2. Apple gear. Apple designs their gear to be instantly recognizable. If you buy a MacBook Pro, say, you are extending their brand, unless you hide the Apple logo somehow.

You also see this with brands like Abercrombie and Fitch, American Eagle, Hollister, and RVCA, where they sell clothing with the brand front and center on it.

So in 2018, the game is being played on Instagram and Facebook instead of the college campuses and high schools of the 1980s.

> unless you hide the Apple logo somehow.

I once bought a MacBook Pro while providing tech support for an organisation that used them exclusively. One of the first things I discovered when setting it up was that it takes three layers of silver gaffer tape over the Apple logo to completely hide its glow.

I believe the distinction is this: primitive influences on behavior through enculturation happen because you experience and do things with other people. Being programmed by Facebook is different: you post things and get feedback in the form of likes, which boosts dopamine, and you end up in a vortex of false gratification, false relationships, and false experiences. It's about attention and the need for hormonal drugs, not about the actual experience or person.

How can I put this into better words? Maybe an analogy from the inventor of Twitter and Medium:

Social media/the internet is like a simple AI that gives you more of what you like. Now, imagine you see a car crash and you look, along with everyone else--you all stop to look. It's natural, it's inescapable. The internet notices this behavior and decides that humans want more car crashes--so it tries to supply more car crashes.

That's the kind of programming we're talking about. Humans are vulnerable and have psychological weaknesses. Social media is tailored to attack certain weaknesses because it's lucrative in an attention economy where advertisements = $$.
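The "simple AI" in the car-crash analogy above is essentially a greedy engagement loop. Here's a toy sketch of it - the item names, click rates, and the greedy rule are all illustrative assumptions, not any real platform's algorithm:

```python
import random

# Toy model of the "car crash" loop described above: the feed learns only
# from clicks, so anything reflexively clicked gets over-served.
# All names and numbers are illustrative, not Facebook's actual system.

def run_feed(click_rates, rounds=10000, seed=0):
    rng = random.Random(seed)
    clicks = {item: 1 for item in click_rates}  # optimistic prior
    shown = {item: 1 for item in click_rates}
    for _ in range(rounds):
        # greedily show whichever item has the best observed click rate
        item = max(click_rates, key=lambda i: clicks[i] / shown[i])
        shown[item] += 1
        if rng.random() < click_rates[item]:
            clicks[item] += 1
    return shown

# People reflexively click "car crashes" far more than content they'd
# say they actually value, so the loop floods the feed with crashes.
exposure = run_feed({"car crash": 0.9, "local news": 0.3, "long read": 0.1})
```

The loop never asks what users want, only what they click, so "car crash" ends up with nearly all the impressions - which is the programming the comment is describing.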

Isn't actual experience with a real person about attention and the need for hormonal drugs? And for that matter, isn't it about gratification too? Isn't a friend smiling at you and treating you more nicely when you say things he likes similar to him hitting the like button?

The dopamine hit after you see an upvoted post is kind of like the dopamine hit when you are popular in real life - except less powerful.

Because instead of things being obviously ads on TV and the like (which work, but we can still understand the nature of the source), things now effectively come from your friends ("trusted") with a tiny summary, such that you don't even need to read further if it fits your existing confirmation bias. Combine this with instant, relatively inexpensive, and totally anonymous micro-targeting, and you have the most advanced psychological warfare weapon yet.

Everything you said is true, and I'm glad somebody pointed this out, because I really wouldn't like this discussion to overshoot, like most discussions do, to the point where criticizing FB becomes the dumb thing to do.

That said, I think you are touching on all of the problems but missing most of them:

- Yes, it's about short-term loops vs. long-term loops. This is not about control; it's about what is a good way for people to spend their time. It's been shown time and time again that engaging in long-term activities has a great positive impact on your sense of fulfillment, and even if, sure, the whole internet is pretty dangerous in this sense, FB is definitely among the worst content suppliers out there, combining addictive attention-grabbing with mostly shallow and unchallenging content, I think.

- Yes, it's about centralization: democracy is great, and democracy works with as much plurality as possible. Sure, mass media have existed for a while now (radio was crucially exploited by all sorts of dictatorships in the 20th century, btw), but it has never happened that such a large fraction of the attention of a whole population was controlled by a single private organization. This is obviously dangerous.

- Yes, it's about efficiency. Like other commenters pointed out, it's also about the sneaky way in which any content arrives to you as if it came directly from your friend, rather than from a recognised news or content outlet. This catches us hugely more off guard, to an extent we are not prepared to handle, I think.

- Yes, it's also probably about the human relations between people. Here's where I'm not actually sure it's making things worse, but it's definitely making things different, just like the telephone or cellphones did. This is why we should at least be careful, especially with children.

Yeah, of course humans will still be humans in the end, I mean, come on. It's just about deciding what is best for us.

The visual image I like is that we're all puppets influenced by the media sources we consume. Each media source is shaped by its business model and other goals, which gives us a (statistically) biased view of the world (e.g., social feed algorithms optimized for engagement).

See the puppet images we made in this media literacy guide: https://github.com/nemild/hack-an-engineer

(Contributions welcome)

Ideally, we are individually able to realize this — and cut some of these strings.

Or go out and read more sources, as this system originally intended. It's not like they're unavailable. You can't blame Facebook for sheer laziness.

Every time I hear guff about how magnificently 'digital natives' negotiate technology, I have to refrain from being the pain in the arse who points out that perhaps one in a hundred knows how to do anything more than tap where entrained by a corporate interface.

I've realized, in regard to the 2016 elections and probably the Brexit vote, that when you can't hack the election machines to get the results you want, you hack the electorate instead. As a bonus, unauthorized entry to computer systems is illegal; putting up targeted ads and spamming Facebook/Twitter with bots? Not really illegal...

Politics in a democracy has always been about hacking the electorate. It just seems that the tools for doing so are becoming more and more powerful.

I think it's not actually about becoming more powerful, but about being obscure. Long before Cambridge Analytica, presidents were winning handily by hacking the electorate with expert use of mass media. Kennedy dominated television and won. Before him, it was Roosevelt on the radio.

The trouble is that, as with any other virus, people build up immunity to the tricks on established platforms. You're not going to see another Kennedy or Roosevelt using the techniques they used, and it'll even be difficult to pull off what Cambridge Analytica did now that the cat's out of the bag.

If you want to remain as effective as they were, you need to keep pushing and capitalize on the next mass-media platform before people realize what you're doing.

What scares me is that a large number of very bright people nowadays work at places like Google, Facebook and others whose ultimate business model is to hack people's psychology to entice them to buy more stuff. What they are working on will be the perfect toolkit for dictators to manipulate people.

Advertisers have been doing this as long as there have been advertisers. One that really blows some people's minds (although it's reasonably well known by now) is that although hardly anyone thinks twice about a diamond engagement ring, the notion we have of diamond rings today was invented by advertisers in the 1940s: https://www.theatlantic.com/international/archive/2015/02/ho...

Similarly, Americans will ask for a Kleenex even when they don't actually mean a Kleenex-brand tissue. I didn't even think twice about this until I moved to Europe and it was pointed out how weird that was.

It's true the big tech companies make it easier to target segments of people by collecting data, but pervasive mass manipulation is hardly new or unique to big tech companies and generally it doesn't need nearly the amount of data that they collect. People are generally similar enough to each other psychologically that you can paint with broad strokes like demographics and still hit most of your target (and interestingly, some of the most viral memes have no particularly targeted message).

Human existence has always involved a lot of fighting and killing, but a single technology qualitatively changed the outcomes: nuclear weapons. Even though it's just a more powerful bomb, it changes everything. Something can change our lives in unforeseen and devastating ways without being fundamentally new - it can be just a matter of degree.

I think that should read:

"Politics in a democratically elected aristocracy ...

Let's pretend the voting system were a binding function from the collection of ballots to the people in charge (which is not even the case):

The number of bits encoded per ballot, per period between elections, is the bit-rate of that voter's feedback/input. The bit-rate of the output is the number of bits (entropy) in (changes to) laws. The democratic grade (the dimensionless ratio of a voter's bit-rate to the bit-rate of law changes) is nearly 0.

Hacking the electorate is only profitable if we don't get to vote on ideas (at a high bit-rate) but on aristocrats.
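A back-of-the-envelope version of the bit-rate argument above, with every number a made-up illustration (ballot choices, election frequency, and the entropy assigned to law changes are all assumptions, not measurements):

```python
import math

# Illustrative numbers only - the point is the orders of magnitude,
# not the specific values.
candidates = 4                # choices on a hypothetical ballot
years_between_votes = 4

ballot_bits = math.log2(candidates)          # log2(4) = 2 bits per ballot
voter_bits_per_year = ballot_bits / years_between_votes   # 0.5 bit/year

# Suppose the legislature enacts ~200 nontrivial changes a year and each
# change carries, very roughly, a kilobit of entropy.
law_bits_per_year = 200 * 1000

# The "democratic grade": voter input bit-rate vs. law-change bit-rate.
democratic_grade = voter_bits_per_year / law_bits_per_year
```

Under these assumptions the ratio comes out on the order of 10^-6 - each voter's input channel is vanishingly narrow compared to the output it's supposed to control, which is the comment's point.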

<insert the aristocrats joke here>

On the one hand, I appreciate the desire to be protected, but on the other, I think the most amazing thing advertising has convinced people of is the effectiveness of advertising itself.

Personally, I would prefer that a fix for "hacking the electorate" stem from requiring additional information and transparency, rather than from setting up mechanisms to regulate what is allowed to be seen, i.e., censorship. Adding information seems much less prone to abuse than prohibiting information in an open democratic society.

Yeah, the results have to be hacked. No way the politicians can be so tone-deaf as to scuttle their chances from the word go...

Both the Trump and Brexit votes required swaying (or discouraging from voting) less than 2% of the electorate. That's certainly within the capabilities of foreign psyops, no matter the vote in question.

Or simply picking the wrong campaign theme...

> hack the electorate to accomplish that

What is the purpose of campaigning, if not that?

Palihapitiya is a great speaker, and very brave for owning up to what many try to downplay their involvement in. Here's a link to the video this article pulls from; it's succinct and really hits home:

https://www.youtube.com/watch?v=78oMjNCAayQ [4:05]

Brave, huh? I guess I will give him some credit for owning up to this, sure. But his only repercussion from helping create this is 'guilt'. And from the looks of his wealth and new ventures, I don't think he wastes any time with his guilt.

Meanwhile, the world burns..

We don’t know anything about his inner life, and the hell of real guilt cannot be overstated.

Palihapitiya and Parker seem to disagree about something important. Parker says that the psychological programming was deliberate:

“It’s exactly the kind of thing that a hacker like myself would come up with, because you’re exploiting a vulnerability in human psychology,” he said. “The inventors, creators [...] understood this consciously. And we did it anyway.”

But Palihapitiya says,

“Because your behaviors, you don’t realize it, but you are being programmed. It was unintentional, but now you gotta decide how much you’re willing to give up”.

Does Palihapitiya not realize that the programming he decries was built into the system deliberately by its designers? Or is Parker wrong?

I don't see how their views conflict. Parker is claiming that Facebook is intentionally manipulating its users, while Palihapitiya is claiming that the users are largely unaware of the manipulation. Their claims are orthogonal and can coexist quite easily.

This is as old as mass-media. The formula has been `money -> propaganda -> control` for at least two centuries now. Social media just put it on steroids.

Limiting paid propaganda would help, but that's sacrilege in our ad/media driven economy, so the ultimate lesson is probably just "don't be poor".

There's nothing new in this post that didn't exist 4 months ago when Palihapitiya said the words.

I have hope (but low expectations) that this sort of realization - that 'free' media is being bought and paid for by people who want to manipulate your opinions and actions - will encourage more people to subscribe to media sources with better transparency.

"He finishes this up by warning the audience not to think they're too smart to fall for the implications of social media, and stated that those who are best-and-brightest are the most likely to fall for it, "because you are fucking check-boxing your whole Goddamn life.""

What does he mean when he says "check-boxing" and how does that relate to the "implications of social media"?

Does he mean "approval-seeking" and the idea that this is what Facebook uses (e.g. "Like" buttons) to "engage" visitors (while gathering data about their lives, personalities, etc.)?

Anyone have a different read?

In this case I would read "check-boxing" as "reducing to a simple optimisation loop without considering the implications of what you're optimising for".

Granted, I'm doing some pretty heavy interpretational lifting there, but that optimisation loop is something I see as common to productivity, video games, social media, and many other obsessions of the nerdosphere. It feels good to give your optimiser a constant stream of feedback, even if trivial.

Of course, now our optimisers are being manipulated by much larger optimisers that run on engagement metrics rather than dopamine. Where did the intuition to build these myopic hyperoptimisers come from? Surely not the same people who grew up minmaxing character attributes or building spreadsheets for optimal daily task efficiency? When you gaze into the optimiser...

Weird that we managed to turn society into a GAN.

I've heard people describe Facebook as basically just filling out a bunch of forms. I take him to mean that we are reducing the rich contours of human experience and interaction to just filling out forms and tapping buttons on a screen.

Radio, a technology most of us would consider old, was used extensively to brainwash the populace in Rwanda into committing grave crimes. How is the current technology any different this time?

This reminds me of the novel Daemon.

Not to give too much away, but the main way that the antagonist in the book preys on people is not by hacking their devices, but rather by hacking them through complex, technology-assisted social engineering.

That just sounds like realism.

Hmmm, I guess Facebook is the IDE that's used to program me, and each post is a line of code.

I doubt that Mark understood what Facebook was before it became what it is. How could he?
