(I'd searched the FB link, didn't see the LinkedIn post.)
Per the guidelines: "Off-Topic: Most stories about politics, or crime, or sports, unless they're evidence of some interesting new phenomenon. Videos of pratfalls or disasters, or cute animal pictures. If they'd cover it on TV news, it's probably off-topic."
To me effectively everyone wins. Facebook users who want to curate their experience can do so. Facebook gets out of the censorship business so governments can lay off. Governments will have a hard time objecting because it is the users controlling their information, not the company. "Sources of truth" have to compete to get the customer's business.
It's hard but not that hard, and it's Facebook's core business.
There's also the practical problem with social networks - within a degree or two of connections, you'll at least have some people who are part of communities covered by your own blocks. But it is in Facebook's interests to show you those communities and extend those linkages; likewise, people would simultaneously object to 1) Facebook not showing them content from friends even if it violates their own content settings and 2) Facebook violating their content settings.
It would be a lose/lose situation. It is now, too.
They could then no more wash their hands of it than they can now: if people are using your platform to do bad things, or things society rejects, then you'll take the blame no matter what level of intervention (or lack thereof) you choose.
The amount of effort involved in telling a lie is pretty low. Going out to do good research to tell the truth is often much more expensive in terms of effort.
Or, said another way:
"The amount of energy needed to refute bullshit is an order of magnitude bigger than to produce it."
I think it's not only possible but relatively easy to have user settings that allow people to toggle on/off "1) Facebook not showing them content from friends even if it violates their own content settings".
Here, you have requested something fairly close to a general AI which understands written language, images, and, even more fun, images of language. Cartoon violence right on down to "Will no one rid me of this turbulent priest?" And will quoting Henry II of England set off your detector?
Let's unpack your COVID post thing -- note that WHO guidelines have changed a few times. Recall that era wherein they said that there was no evidence of human-to-human transmission. Also, don't buy masks. So you'd have to factor in a time domain as the guidelines change, in addition to your generalized AI as above.
In IT, I have frequently heard requests for "why can't we just get the things we know we will like?" and such. It all boils down to that mind-reading, future-predicting generalized AI.
Why don't we just turn it on?
What's so difficult about letting a user toggle those existing filters on and off? Or allowing users more control over their news feed, such as allowing it to be chronological? Or hiding items based on keywords?
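Mechanically, none of this is hard. A chronological feed with user-controlled keyword hiding is a few lines; here's a toy sketch (all names and the `Post` shape are hypothetical, just to make the point):

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    timestamp: float  # seconds since epoch

def build_feed(posts, blocked_keywords=(), chronological=True):
    """Hide posts containing any user-chosen keyword; optionally sort newest-first."""
    blocked = [k.lower() for k in blocked_keywords]
    visible = [p for p in posts
               if not any(k in p.text.lower() for k in blocked)]
    if chronological:
        visible.sort(key=lambda p: p.timestamp, reverse=True)
    return visible
```

The hard part, as the rest of the thread argues, isn't this filter; it's deciding who curates the keyword lists and whether the business model tolerates a chronological feed.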
It's our job as technologists to not only tell people if/when their request is technologically impractical/impossible, but more importantly, to make informed recommendations of what is technically feasible and most closely matches their intended goals (which they might not yet even fully understand themselves).
I hate seeing politics of any stripe in my news feed, for example. I get enough of it elsewhere. But how can Facebook take an image, recognize that it is a cartoon, and that the cartoon is based on something political? We're not there yet. We're not anywhere near there yet.
You and I, as humans in this time period, will likely recognize a drawing of a man with kind of mocha-y skin, ears that stick straight out, large-ish front teeth, and kind of a squint as a standard issue caricature of a particular former President of the USA. We can immediately go "aha, politics!"
So I suppose we could try to train some AIs on a historical set of caricatures of politically-relevant figures. Of course, caricaturists and cartoonists of all stripes are called upon to produce new stylized exaggerations of people as they step into the limelight (or are dragged into it), so our training set must be updated on a daily basis. Hrm, some more work there ...
But then it gets into "politics you would like to hear" versus "politics you would NOT like to hear," and that is just going to be something that will require a mental model of what you do and do not like.
Right now, Amazon and Netflix cannot get my recommendations even CLOSE and their data is much better curated than what someone might type into a Facebook text box.
Chronological? Reasonable! Probably doable. Keywords? Who is putting those in? Who is curating that set of keywords?
Allow 3rd-party individuals/companies to create user filters that users could load up at their preference. Allow a proliferation of filters to compete.
So, say, if you don't want profanity you could view Facebook with, say, the "Facebook-no-profanity" filter active. Or if you don't want antisemitic remarks you can load up Facebook with, say, the no-antisemitic filter. Filters could be loaded, unloaded, and switched on and off at the user's desire.
So everyone could see the Internet they want or, at any time, view everything or part of it. And if you want to see what a friend is talking about, you could load the filter(s) (s)he is using. Sets of filters could be grouped, e.g., TomT's filters, Mary's filters, Mom's filters, etc., so you can, so to speak, "see the (Internet) world through their eyes".
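To make the "load a friend's filters" idea concrete: a filter can be modeled as a predicate on a post's text, and a named, shareable filter set as the conjunction of its filters. A toy sketch, with hypothetical names and deliberately silly word lists:

```python
def no_profanity(text):
    # Toy word list standing in for a real, community-maintained one.
    return not any(w in text.lower() for w in ("damn", "hell"))

def no_spoilers(text):
    return "spoiler" not in text.lower()

# Named, shareable filter sets: a post is shown only if every active filter passes.
FILTER_SETS = {
    "toms_filters": [no_profanity],
    "moms_filters": [no_profanity, no_spoilers],
}

def visible(text, active_set):
    return all(f(text) for f in FILTER_SETS[active_set])
```

Switching `active_set` from your own to "moms_filters" is exactly the "see the Internet through their eyes" trick: the content is unchanged, only the conjunction of predicates differs.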
I see where this is going...in the long run the Internet will turn into a vast wasteland with gardens of little consequence and minimal membership. It will become defunct and all else will be simple P2P over TCP/IP.
In IT I infrequently hear anyone say IT.
So, let's just do it.
They are criminally negligent in allowing themselves to be what they are: a publishing platform designed to divide humanity into smaller and smaller subgroups in order to ease marketing, without regard to public safety or any conception of common good. Social media does not care about the blood it spills.
Want the good old web back? Enforce responsibility for content published on the publisher. Let's see Facebook moderate every post. Let hateful shitheads find a place willing to host them, and let them start their own blogs, with their identifiable information publicly available. Failing that, maybe we can do something like this idea I posted this morning. (https://www.reddit.com/r/socialmedia/comments/gwhiie/trust_a...)
When nearly everyone in your bubble has the same beliefs on politics and culture you will react violently towards threatening ideas.
"Those other people are idiots and will kill us all without some rules" is a basic human conceit. These people calling for censorship on Facebook don't want to see that stuff, sure, but way more importantly they want FB to censor so that other people CAN'T read that stuff.
They think that if they shout loud enough, they will get to decide what other, more gullible/radicalized/violent/whatever people are allowed to read.
It's really condescending nonsense to claim that you know better than another adult what they should be allowed to read.
Lots of really famous/successful people are thinking along these lines lately. It's quite popular amongst the "thought leadership" set to think that Facebook has some moral obligation to play publishing cop and shut down private groups where people say things they don't like, because people other than the ones calling for censorship are so gullible and malleable and childlike that they'll turn into a terrorist if they read enough about 5G or vaccines or antifa or whatever in private FB groups or something.
A handy example is John Gruber's recent post on the matter:
The correct answer to people being wrong is education, never censorship.
I defer, as always, to Heinlein:
> “Thing that got me was not her list of things she hated, since she was obviously crazy as a Cyborg, but fact that always somebody agreed with her prohibitions. Must be a yearning deep in human heart to stop other people from doing as they please. Rules, laws — always for other fellow. A murky part of us, something we had before we came down out of trees, and failed to shuck when we stood up. Because not one of those people said: "Please pass this so that I won't be able to do something I know I should stop." Nyet, tovarishchee, was always something they hated to see neighbors doing. Stop them "for their own good" — not because speaker claimed to be harmed by it.”
I'm having a hard time believing that they want to stop with Facebook, either. I suspect this will move on to hosting providers next.
I wish there were more companies like Cloudflare.
They did a complete 180 over the past couple of years, for particularly distasteful content (in their opinion). For better or worse.
What do you do when people reject education itself as a matter of course or principle?
Or can only view such attempts defensively, see them as patronizing at best, and so radicalize further as a result?
Heck, even "this post is misleading" approaches, which leave the original content intact, are seen by some as a form of censorship (and definitely as propaganda to be discarded) and I don't think that's an unreasonable response based on their existing point of view.
Fundamentally, if misinformation is more appealing and attractive than truths, how far can educational approaches go? If it's information junk food, then I see education helping no more than it helps for the obesity crisis - perhaps less so because that unhealthiness actually feels bad and prompts action. There's no equivalent with disinfo.
Shrug. Sometimes the wrong choices of others can't be helped.
> Fundamentally, if misinformation is more appealing and attractive than truths, how far can educational approaches go?
Well, based on approximately the last 1000 years of history, considering that we have an orbiting science lab 400km above the Earth and outdated superstitions seem to be waning worldwide, I would venture a preliminary guess that in the long run, truth will win out.
All of the other times we tried mass censorship, it ended very poorly, with millions upon millions of deaths.
Trust in your fellow human beings. You can't have a society without trust. Most people are not malicious, nor are they exceedingly dumb. Let people read.
If you're worried about people not knowing the things they need to know, or being misled by wrong or bad information, I encourage you to write a book, or make a website. Hell, I'll even help you set it up, for free, if you need assistance doing so.
I agree that people want to adjust what other people can read, but this is where Facebook can really sidestep this by making things user preferences. Because it sounds strange to campaign to force Facebook to "override a person's explicit preferences about what to show/hide".
Does this include "We killed Osama Bin Laden" [perhaps with some details and photos of the operation]?
This seems to be somewhat implicit in our society ("violence is bad, unless it's our heroes doing it to people another color far away, then woohoo bombs away!") but usually unsaid and unexamined. It's a good moral move by Twitter to point out the contradiction.
A better move would have been for them to simply ban everyone calling for violence, militaries included (if and only if they are going to be banning people selectively based on content, which they are, which I think is bad).
That depends on how you define "it." Media is not neutral. At least, we haven't yet had a neutral mass media. In 2020 that means defaults, recommendation engines and such. These are core to FB's product and business model. They're not neutral.
You need to take the scale of FB's influence into account. Elections, revolutions, mass protests, primary political narratives... all determined on FB. What happens on Facebook can determine presidencies and prime ministerships. They have more power than Rupert Murdoch. At this scale you can't just "get out of the censorship business."
Youtube, for comparison, appears to be trying to get out of the harder problems by exiting the space entirely. On hard news issues, they're defaulting to (more than defaulting to) television news sources. I don't think they want politics on their platform anymore. It's too dangerous.
FB can't do that as easily, but I expect they'll try something similar.
Basically... social media will try to hand the problem back to traditional media somehow.
In fact, if I were Facebook and wanting to reduce competition, I would try to pass legislation that requires these kinds of controls, making it harder for upstarts.
"Everyone", including racists, terrorists, drug cartels, human smugglers, child pornographers, scammers, phishers, hackers, gun nuts, anarchists...
I know you say you'd be fine with all that stuff, but if I write long enough I'll find something you're not OK with. And everyone else has a different something. Finding common ground on which we can all agree isn't a mistaken goal or an unnatural side effect of having a public forum. It is the purpose of having a public forum.
This is just another variant of the "forums should be unmoderated" argument. No one wants that. And the proof? You're making the point on a heavily moderated forum.
That's a terrible argument. HN already works pretty close to what mchusma proposes. See the "showdead" option in profile settings.
Which was exactly my point. "Facebook" is not "the world and Internet", is it? The question is about whether moderated forums have value to people (they do, you're reading one right now). Demanding that your moderated forum not be moderated affects whether or not people see it as valuable. Facebook wouldn't be Facebook without moderation to make it so. HN wouldn't be HN, Reddit wouldn't be Reddit, etc...
> you think you are the owner of the truth [...] For you free speech is a bad idea
Go. Away. This junk doesn't belong on HN. And I'm gonna bet you get flagged for that by the very moderation we're discussing.
For your solution, the NY Times will run stories about Facebook censoring us on Monday then stories about Facebook letting Nazis eat babies on Tuesday.
That's the issue Facebook faces here: not minimising social damage or maximising freedom, it's protecting share price and avoiding legislation due to economically motivated attacks.
Fyi, I actually really like your answer.
This means that Facebook has to provide a process to categorize posts into those categories. Facebook argues they don't want to do that categorization.
> e.g. "dont show me posts that glorify violence" or "only allow covid posts that match WHO guidelines" or "allow exceptions from political figures or of historical significance")
Glorifying violence isn't objectively measurable.
How do you identify matches to WHO guidance (including satire, valuable critique, ...)?
How exactly do you identify "political figures or of historical significance"? Sure, you can identify elected officials, but what about opposition leaders, who often are not formally designated, or cases where elections are disputed?
And to be clear: I think the stance of "we are neutral" is wrong. And I don't think this can be offloaded to the user, while Facebook stays "neutral".
I think perfect is the enemy of good in this case. Giving users more control is good.
I guess to be clear on my stance, I think Facebook should be trying to be as neutral as possible, and keep making progress on that front, knowing it is impossible to be perfectly neutral in every way.
Basically you can adjust the likelihood of something showing on your feed by severity, weighted by location and friend graph (showing less severe things from closer to you and more severe things from further away). This matches how I think humans actually process information.
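As a sketch of what that weighting might look like: assume a severity score in [0, 1] and a friend-graph distance in hops. The names and the exact decay function below are my own guesses, not anything Facebook actually does:

```python
def feed_multiplier(severity, graph_distance):
    """Severity in [0, 1]; graph_distance in hops (1 = direct friend).
    Severe content is suppressed most when it comes from close connections,
    per the 'less severe close, more severe far' idea above."""
    proximity = 1.0 / (1.0 + graph_distance)  # 0.5 for a direct friend, -> 0 far away
    return 1.0 - severity * proximity

def rank(posts):
    # posts: list of (post_id, base_score, severity, graph_distance)
    return sorted(posts,
                  key=lambda p: p[1] * feed_multiplier(p[2], p[3]),
                  reverse=True)
```

With this shape, a highly severe post from a direct friend loses half its score, while the same post from nine hops away barely moves, which is one plausible reading of the comment's proposal.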
That's cool, except for all of those people who believe that the WHO is secretly out to get them somehow. The issue goes deeper...
You find lone actors committing mass killings:
"New Zealand mosque shooter broadcast slaughter on Facebook"
You find genocides:
"A Genocide Incited on Facebook, With Posts From Myanmar’s Military"
You find radicalisation:
"Prosecutors say Dylann Roof ‘self-radicalized’ online, wrote another manifesto in jail"
You find disinformation with global consequences:
"How fear of the unknown sows disinformation during a pandemic"
Yes, the examples listed include more than Facebook, but Facebook, and its "they trust me, dumb fucks" unitary executive (https://www.vox.com/technology/2018/11/19/18099011/mark-zuck...) is very much at the heart of the problem.
Or, as some may better recognise the notion, "with great power comes great responsibility". Facebook and Zuckerberg consistently shirk theirs, and externalise the concomitant costs.
Anticipating the usual rebuttal that Facebook could not operate if it were required to act responsibly: if an organisation cannot exist or operate at scale without actively harming society, then it shouldn't exist.
The "individual responsibility" argument, going back to 1970s anti-littering campaigns and the crying (fake) Indian (fake) TV advert (and before), is a corporate gambit to shirk their own obligations:
Annie Leonard, "Moving from Individual Change to Societal Change" (2013) [pdf]
Don't buy it.
Unfortunately we are in the minority; the majority wants SOMEONE ELSE to choose what they can see, what is the "good speech" and what is the "bad speech".
They want someone else to tell them what the "good things" are that they should be supporting, and what the "bad things" are that they need to oppose.
In short, people no longer want to think critically for themselves, they want the magic box to do that for them
My statement is that Facebook should not be doing any promotion at all, and people arguing over the level and type of filtering/promotion is the problem.
Facebook painted themselves into this corner by choosing what people will see and not see instead of letting users choose this for themselves. That is what the grandparent comment was saying.
See, you desire them to censor and filter; you want that, you just disagree with their choices.
I do not want them picking anything at all for anyone.
Not sure how that makes me "ignorant" on this topic, or how my comment is somehow wrong or misunderstands what is happening. It seems more likely you are misunderstanding my comment.
Of course this is all really pointless because I have never had, and will never have, either a Facebook or Twitter account so...
Oh, the irony.
Racism is protected speech. Hate is protected speech. You seem not to understand free speech so you want it suppressed - it sounds a lot like the Inquisition in good ol' times.
Secondly what ranking you see is largely influenced by what you and your friends engage with.
Facebook won't change by inside. Zuck will change his behavior when he feels the pain in his pocket. Every decent person must quit now. Quit. Now.
The financial consequences seem to be overblown. People in Timothy's situation will certainly be able to make ends meet; at the end of the day, I'm sure he'll still be better off than well over half the country. This should be clear to most people reading this.
I'm not saying this to downplay Timothy's actions; I'm saying this to remind others that are on the fence - if you think you can't afford, morally, to keep working at Facebook, you can afford, financially, to quit.
Both "...I don't want to be part of it, despite how much they pay me" and "...I don't want to be part of it, but I know I'll be okay" complicate the issue.
Be a Zen master. Strive for directness, simplicity. Don't fight yourself.
Not judging this way or that, just saying.
Ultimately developers leaving and people refusing to work for Facebook, drive up Facebook's costs and speaks to their management in the only language Facebook management seems to understand: Profitability.
I think he's doing the right thing and I wish more people could feel like they are in a position of walking the walk, every day.
I agree that software developers have it pretty good. But let's not, on that basis, discount acts of moral integrity.
If nothing works he can probably open a gofundme campaign.
Number of Facebook employees: about 45k. Let's say half of them work in the US (20k). Let's assume people tend to stay at a job for 5 years on average and think about leaving for 6 months before they leave. That means there are 2000 Facebook employees thinking about leaving right now.
Now, what is the probability that if someone is already thinking about leaving they would take advantage of this situation? Let's give Facebook employees the benefit of the doubt and say that only 1% of them would do such a thing. That means we should expect to hear from about 20 Facebook employees making social media and LinkedIn posts about leaving Facebook, citing policy as a reason while just taking advantage of the situation. Is it really far fetched to say that this guy could be one of them? If anything I would expect 19 more posts like this!
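The estimate above is just multiplication; running the comment's own assumed numbers as a sanity check:

```python
us_employees = 20_000           # ~half of Facebook's ~45k staff, rounded down
avg_tenure_months = 60          # ~5 years at the job on average
considering_window_months = 6   # thinking about leaving for ~6 months before doing it

# Fraction of staff inside the "thinking about leaving" window at any moment:
considering = us_employees * considering_window_months // avg_tenure_months
opportunists = considering // 100   # assume only 1% would use the moment publicly
print(considering, opportunists)    # prints "2000 20"
```

So the comment's figures are internally consistent: roughly 2000 people in the window, and about 20 public posts under the 1% assumption.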
Especially since at some companies such a public statement can put you on a no-hire list. Some companies avoid potential "trouble makers", or avoid hiring anyone whose departure might be interpreted as making a point.
Still, the risk of a high total loss is quite low for individuals from FAANG companies.
He worked at Facebook, let's not get carried away.
Working at Facebook and then leaving because of the non-intervention with respect to the President is a bit like manufacturing landmines quite happily, but having a change of heart because you saw a dog stepping on one. You've still been manufacturing landmines all that time, making money and not giving a damn.
Facebook has been on my list of companies not to work at for a long time, but I have friends who work there. Accepting that other people value different things and being willing to interact with them on a basic, human level is a lot more likely to build bridges and lead to constructive conversation about important matters down the road than casting people aside because they don't conform to your worldview today.
I'll grant you that Facebook, among others, enables exactly the sort of thing I'm advocating against. I'll grant you that it seems to promote people yelling at each other over the internet instead of building the relationships that enable difficult conversations in time. And I'll grant you that they do it to sell ads for shit we mostly don't need.
But that doesn't make everybody who works there bad, and it doesn't mean that people who decide that continuing to work there is no longer consistent with their values should be equated with merchants of death when they publicly leave.
You're castigating somebody for voting with their feet, but presumably mostly because they didn't do it as soon as you did.
No. I'm saying they did always vote with their feet. And "get money, fuck society" was the vote they cast.
Facebook isn't the army, there is no draft, they had and have plenty of other options. It's greed and a general feeling of superiority that sees the general population the same way the owners of factory farms see pigs.
Instead of personal cost to him, I'd rather view it through the lenses of the impact of his action. It's inviting a lot of constructive conversation and making people re-evaluate Mark's decisions, which I think overall is a net positive.
Namely those who hold stock options or who used it to hyper target unsuspecting fools and scam them.
> Instead of personal cost to him, I'd rather view it through the lenses of the impact of his action.
Let's. The "constructive conversation" has been going on already. This doesn't move the needle on that conversation, it's not one with a lot of grey area where the world is on the fence. This is great - for him. But not much else.
Oh sure, those too. But small businesses benefit from it immensely, e.g. using FB Live and FB Groups to connect with their customers.
> Let's. The "constructive conversation" has been going on already. This doesn't move the needle on that conversation, it's not one with a lot of grey area where the world is on the fence. This is great - for him. But not much else.
If that's the case, what do you think he should have done, or we should have done?
This is not obvious to me.
Doesn’t mean it’s not the right thing to do, just that it would carry more weight if he was leaving to go work at Burger King. Then you’d say “man that guy stands tall”.
Edit: Ok reasonable criticism, I’ll withdraw.
Until he does it's just virtue signalling.
(yeah, I don't get some people either)
Resigning isn’t about sacrifice to the volcano god, and if that is what you’re looking for, you are not the audience.
It reminds me of the old joke about the skinflint who accidentally put a pound in the collection plate instead of a shilling. "Oh well", he says, "I'll get credit for the pound." The priest hears him and replies, "No, you'll get credit for the shilling."
Now imagine that OP inspires 100 employees to quit -- now Facebook has a problem that it must address, because it's threatening their bottom line.
You are literally the only person who thinks this.
Simply adding this to your post doesn't make it true. Clearly you've judged it to not be as valuable of an act as it could have been.
Not saying you're being dismissive, just saying.
Edit: Of course, the big 5 (FAANG) will not touch him even with a 10 meter pole, but there are other places that also pay well.
That he should also devote his efforts full time to combating the problems he speaks about for less compensation, or donate the excess compensation to related causes, otherwise the action is unworthy.
These are restatements of classic utilitarian dilemmas, as stated by, among many others, Peter Singer.
"In a society in which the narrow pursuit of material self-interest is the norm, the shift to an ethical stance is more radical than many people realize."
Yes, he's a privileged white male and will likely get a job offer soon, but that doesn't mean he's rolling in cash. Also, take a look at average salaries. Devs don't make $250,000 in the US unless they've got some really high profile public backstory.
In my experience, this is VERY far from accurate in high-paying markets like Seattle and the Bay Area
Of course at Facebook's scale it's a small number, but given the damage to their reputation, they will have to take note once this crosses critical mass.
Give him credit for what he did. He is losing a lot more than us armchair pundits.
Also I don't think it adds much to the conversation to try to minimize someone's sacrifice. You don't know their personal situation, and even if it's not that hard on them personally, who cares? It's still a principled stand about where to invest one's time and energy.
I don’t think that’s their fault, I’m happy they’re leaving, but the people commenting and upvoting these threads should question how much attention this one person deserves. There are people working jobs in far worse conditions with far worse bosses we never hear about.
Maybe the goal of praising him isn't to make his life better but is instead to encourage other people to also stand up for what they believe in.
Or maybe it's to contrast him with other people who are in a similar position of privilege (and feel the same about Zuckerberg's lack of action) but chose not to sacrifice anything at all for their principles.
I can’t help but think this actually is a judgement.
It is privilege having a stable life to get there, but we're calling out this dude's privilege like he's Jeff Bezos throwing a million dollar coin to a witcher.
It's definitely a sacrifice to quit a nice job during a pandemic and nationwide riots.
- Have two eyes? What about the blind?
- Can you speak English? What about those who never had the opportunity to learn?
- Are you alive? What about those who passed away?
The judgement never ends
For who has the privilege of judging others privilege and where does it end?
Excerpt From: J. Lee Porter. “Crypto Citizens.”
I'm surprised they didn't have him out on his ass the same day and gave him a whole extra week to potentially sabotage the systems.
OK fine. I agree it's totally an uphill struggle and in all fairness it's one we're all fighting.
It's one man versus a sea of apathy where all the incentives are geared towards productivity at all costs and bitter disempowerment and ostracism for anyone that steps out of line.
If I were him I'd go hardcore until they actually fired me, i.e. make it hard for them rather than take a gallant bow -- how's that for starters?
Whatever happened to fighting for what you believe in? How can you leave all your hard work in the hands of people you no longer trust?
I've actually gone all the way up and been fired by a roomful of directors and a vice president for calling out bullshit agile theatre.
Part of it rests on everyone's shoulders because I know for sure a lot of people reading this thread right now would jump at the chance to take his place, struggle be damned.
Look at what happens to anyone who expresses a dissenting opinion. I'd like to see more downvotes just to make sure I'm on the right side of this issue.
Our problem is we seek acceptance and agreement too much and don't fight for what others believe in on principle.
We're all to blame but now we have one less person who can do anything about it. Look at the world from the outside in for once and quit suffering your blasted anecdoche.
How many times have they tried to manage up only to be hamstrung by their own peers?
We should all know we're trying to change them and help each other, how else?
You may tell me you've tried and failed due to "politics", but have you ever stuck your neck out even a little?
I was responding to your claim that it was illegitimate to quit in protest, and that people should stay and change organizations from within.
Given this belief, I assumed you might have some insight into how to do this effectively. Maybe even examples of when it has been done?
And yes all our energies should be towards changing the organizations, as systems professionals, rather than competing on "look what I can do" while the world outside our gated communities crumbles.
Being willing to walk away is a fundamental bargaining tool.
If some people quit over an issue, it strengthens the bargaining position of those who stay behind.
Both tactics are required.
Zuck, on the other hand, faces an open mutiny and a damaging of the Facebook brand. Perhaps he’s just stuck between a rock and a hard place, though, because Facebook censoring POTUS can only invite more government scrutiny.
Real good deeds aren't announced.
And additionally, quitting your job on moral grounds is _absolutely_ something you should post about. He made a personal post on Facebook, as a current Facebook employee, noting that he's leaving. This will be seen by his coworkers, and as the very presence on HN demonstrates, the public at large. Virtue signaling is Old Navy changing their profile picture to a rainbow flag during Pride month, or an NFL team posting a black rectangle on Instagram to support BLM. Virtue signaling is _not_ quitting your job to protest your employer's ethical failings, or writing content that would violate a non-disparagement agreement.
Public announcements of resignation have two impacts: on fellow employees, and on the employer. Fellow employees may be waiting for validation of a sinking feeling that they're in the wrong place, before they quit themselves. When the employer sees numerous conscience quits in response to a bad action, they have an opportunity to reflect and change.
These guys lack any sort of self awareness.
There are also things that are not political. They are war. They are 'over my or your dead body' type of things like slavery, holocausts and treason. We don't vote on those and accept whatever the outcome is. We go to war. That's why that sort of stuff is in a constitution. Because it's not negotiable. Not up for popular vote.
Now. I do admit that I don't think that applies to Facebook. I don't think they are intentionally trying to end the US democracy and break the US society.
But never say never. There is definitely a point. One does not sell hamburgers to the KKK for example. Or help build their website.
That's also why you see so many apolitical brands speaking out and picking a side here. They do understand this is different. This is a war for human rights. Nobody deserves the privilege to stay neutral on this.
The problem with convenient trap doors like this is now we’re back to being unable to debate the merits of Snowden because he committed treason.
Never? If you worked for an employer that seated black people in the back in the 1960s, it wouldn't be admirable to quit?
I am not on some high horse here; I am not sure what I would be willing to put up with if I had no other options, but I certainly prefer working with companies that have values aligned with my own.
"Liberal" and "conservative" aren't the only political flavors, either. Lots of leftist activists don't work with liberals and lots of right-wing activists don't work with conservatives, for example. It's also difficult to work with someone whose beliefs conflict too much with your own or whose views are dehumanizing (like, I wouldn't want to work with someone who regards themselves as superior because of their race).
There are too many actions that are political when they come down to it (pricing products, marketing products, choosing partners, choosing tools, choosing customers, choosing features, choosing charities, etc). There is no possible way that all of them are going to align with your views.
The reason that works is that you’re just able to not care that they don’t.
The straw that broke the camel's back.
> For society to function, people need to be willing to work with others whose politics they find distasteful.
And how are we supposed to work with people that outright lie? Who are dishonest. Who cross a red line with that lie? Are we supposed to shrug and accept it? Would you accept people lying in a professional environment?
So stupid people love it.
"people acting stupidly" = people in temporary states of stupidity
That being said, HN does have logical and helpful technical discussion that does not resemble the kind of social media that people are talking about when they say "social media". The incidence of toxicity is much lower. That is why I believe HN is social media yet much healthier than, for example, facebook.
It seems like a needless exercise to classify which type each person is. We can just take both cases with a grain of salt, address their points skeptically, and move on from doing psych evals on a couple sentences about something else.
Even just pointing out that the word being said is 'stupid', but the actual in-group/out-group divide is technical literacy or systems-awareness. And really just in the frame of social media. Perhaps there should be a word to describe the local ignorance that everyone engages in when they are turned off and relaxing.
You can eat fast food once in a while and still be very healthy. They're not necessarily advocating for (metaphorically) shutting down all fast food restaurants, just making a point about over-consumption side effects.
At least that's how I read it.
The emotional swings I get from images far exceed what I get from text.
This kind of scare has happened with all media: games, TV, film, fiction books. If there's a difference between the fast-food social media and this website, I'd like to hear what it is.
Even if they spent all their time on social media, their warning against it wouldn't necessarily be hypocritical. If an alcoholic tells you that alcohol is addictive, you should believe them.
My understanding is that this website was created to make YC more famous among the kind of people who would post here. Posts are indexed by search engines and, after some period, undeletable. Like all social media - including all the way back to usenet, before the phrase existed - flaming, high tensions, and consequences are non-negligible. I've seen someone write a post as an insider that got a reply from their CEO asking them not to do it. The sum of this is that YC's advertising is creating a platform for people to harm their careers.
Yet, despite knowing this, I continue to browse because I find value in the content that exceeds the dangers. I imagine others feel the same and hope that, by viewing HN under the same umbrella as Facebook etc., they might not view the negatives as fatal. At least not to the point of demanding other people leave their jobs.
But honestly, if you can't tell the difference between Social Media and a forum, maybe you should step away from the net altogether. Read some books, learn a hobby. Go outside. Make some real life friends... For it seems as if you're too close to the forest to know that it's made up of trees.
It's just like fast food. If it's all you eat, you have no idea what real food tastes and feels like...
I'm aware of how targeted advertising works: it reminds me of seeing toy adverts between children's TV shows.
Thank you for the life coaching.
Edit: furthermore many users post with real or pseudo-real names that reward them with real life social clout.
I use an ad blocker so I do not see ads on HN. Correct me if this is different from your experience.
I can imagine, as you point out, the impact on "social life" for users with real/pseudo real names. Back in the day forums were used for anonymous discussions when you would not trust strangers on the internet. I guess now the opposite is trending.
>Back in the day forums were used for anonymous discussions when you would not trust strangers on the internet. I guess now the opposite is trending.
While writing this I was thinking of the famous 'LINUX is obsolete' thread.
I was getting dopamine hits on 4chan and GameFAQs in 2005 when I was 12. I'd call it social media.
Certainly don't target certain politicians while leaving others alone.
When you can't win your argument on merit, you censor.
It's up to the politicians and constituents to call out lies, why should some Ministry of Truth decide what's a lie?
Politicians will always address their counterparts and should always try to inform their constituents. It's up to the constituents to stay educated.
Censoring politicians because people might be too stupid is a really bad reason.
No, you've been told that because people aren't happy with the current status quo.
There has always been propaganda and the truth is always possible to find.
Unless you have something specific I feel like you're using hyperboles (think of the children!!!) to win your way.
It's not lawlessness, people aren't extremely misinformed (if they are that's on them), the truth isn't borderline impossible.
This truth hunt will die out when the people seeking it are back in charge.
Yes remove threats, but don't remove lies. Having lies on your platform is not lawlessness, it's the internet.
If you don't want lies then take 230 away from them and make them take down libel/slander/lies that EVERYONE says, not just the people they choose to target.
There are thousands of elected officials, so it’s not simply a theoretical question to ask what should be done when one of them posts dangerous misinformation. Shouldn’t Facebook prevent it from spreading widely?
Spreading widely is the only reason we should censor speech? If what a person says is absolutely ridiculous, it should be seen and exposed by sunlight and more discussion.
You can remove whatever you want from your own private platform. If President Trump wants to say something that is removed from these platforms, I'm sure he can host his own blog.
If you're going to use a social media platform, you have to abide by the rules of that platform. It's not censorship. He's still free to say whatever he wants, just not on that platform.
It absolutely is censorship, AND they are within their legal rights to do it.
Everyone is allowed to have an opinion on Facebook's policy.
I genuinely wonder if we should be giving this vile person a platform through which he can so effectively reach so many people so frequently.
What would have happened if Facebook had been around in the 1920s and Hitler had said, I want to round up all of the people of Jewish descent and murder them... in a really bad way?
Now what would have happened if that message was only shared locally, not globally? Or if the majority determined what message could be shared? Many people love comparing Trump to Hitler, but what they forget is Hitler was LOVED. He was eloquent. He was great at harnessing everyone's rage and controlling it. He was an EVIL controlling dictator, but he understood how to control the majority well. Trump does not. If he wins again it will not be by majority vote; it will be the electoral college.
Now Obama, in his first election knew how to influence people. Shit, sorry I miss that guy, I digress.
The problem is when a politician says something that can be backed up by force it's a fact, not an opinion. If I Said, Trump should, "do XYZ". It's an opinion and if it's egregious enough it should be discarded. If I say, "I want to do XYZ to This person." It needs to be kept if only as proof of intent (even if it's no longer shared via algorithm). Now if I were to say, I want to start a war, and I had the legal right, power, and capability to do so. It should be kept, spread AND commented on. Because one thing that comments do well is breed divisiveness and I don't want unity on posts which are hateful.
The thoughts of a leader need to be spread. Even if you abhor Trump, I don't want to stop his message unless he loses power over me. Because no matter what you want he has POWER over all of us. We have power too as has been seen recently. But information is power, to remove information is to dis-empower the people. He's not a redneck with a gun. He's a redneck with nukes.