Pinterest Blocks Vaccination Searches in Move to Control the Conversation (wsj.com)
73 points by vanderfluge 29 days ago | 77 comments



This is incredibly worrying. I understand that anti-vax conspiracy theories are dangerous, but I can think of a ton of ways banning "bad medical advice", as determined by a $10/hr contractor at Pinterest, Facebook, etc. could end up hurting people with health problems by preventing them from sharing useful information. I have gender dysphoria and at one point considered getting on testosterone, but thanks to conversations with transmen and detransitioned women on social media, I learned about less-commonly discussed side effects and decided against taking testosterone. This is an area where research is still developing and there aren't a ton of long-term studies, so it's incredibly helpful to be able to learn from other people's experiences, even if they aren't yet confirmed by research.

Likewise, I know people who've had a post or comment removed by Facebook because it was erroneously interpreted as racist or sexist. Off the top of my head, a friend of mine was sharing information on ICE raids with friends who were immigrants and got banned from Facebook for a couple of hours (he was unbanned after appealing). I know someone else who was sharing statistics on domestic violence that were interpreted as discriminatory. The corporations making these policies don't have a good enough understanding of political issues to determine whether a statement is hate speech - and hell, even people who are knowledgeable about politics and history disagree in a lot of areas.


It doesn't have to be a $10/hr contractor even - it could be a $500,000 per year MD.

For instance, for years, people recommended a high carb/low fat diet. Now that's being challenged. (I'm not weighing in on the argument, just stating that it is one.)

How would this debate take place if this type of information were blocked? For years (decades?) Atkins was treated as a fringe-food nut job. Now his early research is being promoted by many others, and the tide may yet turn in his direction. See [0] for one example.

How many other topics are out there that face the same issues of being labeled "fringe" before being commonly accepted?

[0] - https://journals.sagepub.com/doi/abs/10.1177/088307380934759...


Science gets it wrong sometimes. New information requires revision. The best we can do is go by consensus expert opinion. There will always be some probability of error but we needn’t be afraid to declare some things dangerous and wrong.


> There will always be some probability of error but we needn’t be afraid to declare some things dangerous and wrong.

But, who is we in this case? And what specifically are some things and how narrowly are they defined?

> The best we can do is go by consensus expert opinion.

Is that really the best we can do? How many times in history has consensus expert opinion been found later to be dangerous and wrong?

Should we assign that task to an AI or a customer service drone of unknown education and experience?

Or, should we assume that any reader of any particular "dangerous and wrong" things might be a better judge of whether those things are, in fact, "dangerous and wrong" as applied to their specific circumstance?


We is you, me, and society at large. Yes, it really is the best we can do. If a very large majority of the people who study an area of science agree on a conclusion in that area then it’s more likely they are right than someone who has no expertise in that area. Clearly, if a large majority of the experts in an area of science agree on something, then it suggests that if a person were to educate themselves in that area to the point of being an expert, they’d agree with the conclusion too.

Non-experts deciding for themselves what is right/wrong is a recipe for disaster. People’s intuition is usually wrong without a lot of experience to back it up. It’s why we don’t let just anyone practice medicine or structural engineering. Expertise matters and the opinions of experts matter much more than a nonexpert’s opinion.


>> But, who is we in this case? And what specifically are some things and how narrowly are they defined?

> We is you, me, and society at large.

But that is not who would be passing judgment in this brave new world of customer service reps and AI. Do we just ban everything mildly controversial?

> Expertise matters and the opinions of experts matter much more than a nonexpert’s opinion.

Are you a qualified, cited expert in this area that you are holding forth on?

> If a very large majority of the people who study an area of science agree on a conclusion in that area then it’s more likely they are right than someone who has no expertise in that area.

There is some room for intelligent debate in nearly any 'consensus' opinion. Some percentage of even experts nearly always disagree with the consensus, and consensus has often been proven wrong. If society went along with the expert scientific opinion concerning eugenics, for example, many of us might not even be alive today.

> Non-experts deciding for themselves what is right/wrong is a recipe for disaster. People’s intuition is usually wrong without a lot of experience to back it up.

Non-experts deciding for themselves what is right/wrong is exactly how the world has existed for thousands of years. You seem to be saying that the answer is to just shut down this debate if it occurs among the great unwashed.

> It’s why we don’t let just anyone practice medicine or structural engineering. Expertise matters and the opinions of experts matter much more than a nonexpert’s opinion.

And, yet, we do. In most free jurisdictions, you are free to practice medicine on yourself or to design your own structure or home.


Of course people are free to practice medicine on themselves. They are free to make up their own minds on what they think is right/wrong. They ought not be free, in my opinion, to unduly influence others. At least not necessarily free to spout off whatever ideas they think are correct. Of course this quickly gets into grey areas and situations where the right amount of suppression of ideas gets tricky. It’s OK for government to get involved in this too. For instance we don’t allow peddlers of snake oil to make whatever claims they desire to make. This is a good thing.

I’ll restate my point in a different way. When government is deciding what types of scientific information peddling ought to be banned or regulated it’s best for our leaders to consult the experts of that area.


> For instance we don’t allow peddlers of snake oil to make whatever claims they desire

The important word here is “peddlers”. We regulate the sale of medical products. (And advertising related to such a sale.)

But we do not regulate who may join in the argument about (say) whether stress causes ulcers, or low-fat diets prevent heart attacks. The self-proclaimed experts have at various points in time been quite sure about these things. But thankfully their self-confidence did not result in a ban on people questioning the data.


We don’t allow people selling certain homeopathic remedies to make certain medical claims while selling the product. So we do regulate speech.


> .. this quickly gets into grey areas and situations where the right amount of suppression of ideas gets tricky.

"Suppression of ideas" is a grey area? I have no words.


I should have said speech instead of ideas. I think it’s clear from what I wrote what I was getting at. All societies regulate speech. I don’t know anyone who thinks speech should never be regulated.


If speech is regulated to this extent, it hampers society's ability to discover others' experiences and learn from them.


'People's intuition is usually wrong without a lot of experience to back it up' -- people's understandings are based on their own experiences and those of their social circle. What makes you think it's always wrong? Do you mean we should keep our brains shut and our feelings suppressed about anything that isn't coherent with expert opinion? That's what happens under an autocratic government.


I never suggested, hinted, or implied that people’s non-expert intuition is always wrong. I never suggested or implied that people keep their brains shut and feelings suppressed if those feelings are not in line with expert opinion.


Society should be following reasoning from first principles rather than relying on anyone's opinions.

Experts should be better at explaining the reasoning behind their opinions from first principles, but we should not trust them until they do so. Experts can make mistakes and have biases, often against new ideas.


My area of expertise is mathematics. A number of times I’ve explained to someone that the concept of infinite sets is a well-defined one. There is a definition and it allows us to work with such sets. I provided the (from my perspective) simple definition and an explanation, but to no avail. My point is that oftentimes people outside of the area just don’t understand it. Personally I don’t care if someone doesn’t understand something, but I do care if their misunderstanding becomes normative and endangers others.

Pre-internet nutjobs existed in all communities. Cranks and whatnot. This is nothing new. What is new is the scale at which such people can propagate their nonsense. The cost of convincing others you are right has drastically declined. The speed at which such stupidity can spread has greatly increased.

We have entered an era in which regulation of stupid, crackpot ideas may need to happen. If and when we do decide to crack down on this, it’s best to rely on expert opinion. This is of course just an opinion of mine.

I submit to you that the vast majority of what you believe is due to knowledge you gained from others and not from first principles as you put it.


But in mathematics the experts really do know what they are talking about, for the most part. Many things aren't like this.

Just today there was a thread on mental illness, and the crazy grab-bag of ideas which passes for expert consensus:

https://news.ycombinator.com/item?id=19198396

And if that's not crazy enough, look up what they believed 60 years ago. Should those have been locked in, by government force? Or should we be free to mock the shrinks for their delusions of understanding, if we wish?


In an area as concrete and black/white as mathematics it’s still hard to convince some non-experts that we really do know what we are talking about. Imagine how much harder it is to convince anti-vaxxers to vaccinate. When society makes policy it’s best not to treat everyone’s opinion as equal. We are not always able to correctly deduce the best course of action on our own when it’s an area we have no expertise in.

I used to be a fundamentalist, right-wing Christian. Absolutely convinced that evolution was wrong. Eventually I was able to take the blinders off and ask myself, “Why is it that the overwhelming majority of people who study biology at the advanced level agree with evolution?”. It takes a great deal of arrogance to dismiss a conclusion that the overwhelming majority of the experts in a given area agree upon. Of course people get it wrong sometimes but we have to navigate life with imperfect information/knowledge. Who else do we rely upon? Keep in mind I’m not saying believe whatever an expert says. I’m saying that if the overwhelming majority of experts in a given area agree on something then that carries a tremendous amount of weight.

My wife is a psychiatrist. betulaq’s comment in the link you provided is one worth looking at.


But which year's crop of psychiatric ideas should we enshrine in law? In 1950 there were rebels who didn't buy the consensus; their ideas won, and things improved. Why wouldn't that change have been prevented?

Or worse, how do you know that the evolution side would win the battle to be selected as the official experts on this matter? We have these fights over school boards right now, and sometimes the biblical literalists have more votes. Who gets to decide how the head-count of experts is to be conducted? I think it pays to imagine these weapons being used by our enemies.

I don't know the solution to the anti-vax madness, but I think censorship is a much bigger battle.


I’ve only been talking about topics on which the overwhelming majority of experts in that area agree. If each year’s class of residency graduates in psychiatry all have different opinions on a given topic then this clearly is not in the scope of my comments. Also, I acknowledged that sometimes experts get it wrong. My point is that this fact ought not dissuade someone from relying upon consensus expert opinion. Few people have expertise in an area of science and very few have expertise in more than one area of science. We need to rely on what others tell us to be true. I don’t know anyone who has personally run the Michelson-Morley experiment but I know the overwhelming majority of physicists agree on the results of the experiment. I’m not arrogant enough to think they are wrong.

If 99% of oncologists think you have cancer then I hope you get treatment for cancer. And if 99% of them think option A is your best hope then I suggest you take their advice. You don’t have to. They may be wrong but in this world of uncertainty and imperfect information it’s the best option.


And the expert opinion is as usual: more studies need to be funded.


> I have gender dysphoria and at one point considered getting on testosterone, but thanks to conversations with transmen and detransitioned women on social media, I learned about less-commonly discussed side effects and decided against taking testosterone. This is an area where research is still developing and there aren't a ton of long-term studies, so it's incredibly helpful to be able to learn from other people's experiences, even if they aren't yet confirmed by research.

I'm sorry for using your actual problem as a jumping-off point, but this is a real issue:

There are people called TERFs, or Trans-Exclusionary Radical Feminists, who think "being trans" doesn't exist, that trans people are all mentally ill and/or (in the case of transwomen) men trying to invade female spaces to cause havoc, and who are organized enough to turn being trans into a huge political clusterfuck.

More than it already is, I mean.

My point is, the project of cleaning up "Bad Medical Information" runs into politics even quicker than most would imagine, and TERFs are very, very good at playing the "You're misogynistic!" card early, often, and loudly. How much courage would a tech company have in the face of that these days?

(They're also good at playing the "TERF is a slur! Cis is a slur!" card.)


TERF is most definitely a slur, and cis is often used as one as well. E.g. "cis white males!"


>TERF is most definitely a slur

Really? What else should we call self-professed radical feminists whose definition of a woman excludes trans women?


But I defend your right to believe and share that biological men who claim to be women are actually women and would never allow a tech company to censor you!


Radfem. You wouldn’t go around calling pro-choice and pro-life people “anti-life” and “anti-choice” would you?


Not all radical feminists are trans-exclusionary, hence the need for the term.

https://en.wikipedia.org/wiki/Radical_feminism#Views_on_tran...


I think the need for the distinction arises from the fact that nothing in the foundations of radfem presupposes trans-exclusionary ideology (especially considering the evolution of the science of gender).

Trans-exclusionary groups are pretty proud of the exclusion and how it differentiates them from other feminist (or radfeminist, for that matter) groups. So TERF seems pretty apt to me.

Granted, I'm pretty naturally biased given they'd say I don't exist, but the point stands.


Probably not a slur.


The concerns they have don't sound completely unreasonable.


I agree. It worries me that public conversation gets policed by companies whose main goal is to sell ads to the highest bidder. By definition they will have to prefer information that makes them money over information that doesn’t make them money. They are also very vulnerable to pressure from governments.

In addition you have no real recourse if the big machine doesn’t like what you posted.


This is true, but it's more about stopping scammers / snake oil salespersons selling placebos to desperate people.

A few years ago I reported (to Matt Cutts) people selling snake oil pills to people with chronic organ failure by buying PPC ads on the medication keywords that people like me were using.


"This is true but its more stopping scammers / snake oil salespersons selling placebo's to desperate people."

Snake oil salesmen are probably willing to pay good money for their ads so all incentives are for Google, Facebook and others to not stop these ads.


That was my first thought too, about how this can cut down on new information that isn't supported by medical literature. I have some weird food issues that I can't find any research on. Some people with similar issues go on an all meat diet, which might not be the most nutritionally complete, but it beats the hell out of their regular symptoms. It would be a shame for discussions like that to vanish under the premise that it doesn't match the food pyramid.


I was vaccine injured. It destroyed my life. This is not rare. I'm not anti-vax. I'm pro safer vaccination practices. But of course, I'm not allowed to talk about this despite overwhelming scientific evidence that vaccines are not 100% safe.


How would you like it if I called "gender dysphoria" a conspiracy theory? Despite me not being an expert in the issue?

That's what you're saying to people with vaccine injuries.


This is a really, really, really thorny issue.

On the one hand, it seems noble/responsible to suppress anti-vax, Russian influence, conspiracy theories, etc.

On the other hand, for centuries it's been recognized that the best antidote to "bad speech" isn't censorship, it's more speech. Don't ban, convince.

But on the other other hand, that's been exclusively argued in the domain of government action, that government censorship is ultimately worse than what it purports to cure.

In this case, tech firms/platforms like Facebook, YouTube and Pinterest aren't the government or society, they're private actors just like newspapers and members of the free press in general. Just like it could be irresponsible of the NYT to publish letters to the editor supporting anti-vax, you can argue it's equally irresponsible for tech firms to allow the same on their platforms.

Yet on the other other other hand, we're reaching a point where a great deal of discourse is concentrated on a few user-content-driven sites, so censorship on them feels like it's inching closer in spirit to government censorship.

But on the other x 4 hand, mainstream public conversation has always been driven mainly by merely a handful of newspapers and then news programs with their own editorial agendas, so a handful of tech actors exercising their own "responsible" (as self-interpreted) curation and promotion doesn't seem to be anything new.

In the end, free speech has never been an absolute right (e.g. yelling fire in a crowded theater, libel, etc.) and it's ultimately a question of finding the right balance between harms.


> On the other hand, for centuries it's been recognized that the best antidote to "bad speech" isn't censorship, it's more speech. Don't ban, convince.

For centuries there were fairly large barriers to widely disseminating your speech, and to lots of people having time or ability to pay attention. I think this may have limited the amount of "bad speech" that the truth had to counter.

It's now a lot easier to disseminate speech, and people have more time and ability to consume speech.

I probably encounter more opinions in a day now on any given topic, from people claiming to have above average or expert knowledge on them, than I did in a month a mere 40 years ago.

This may make what worked centuries ago (or even a few decades ago) ineffective today.


Thanks to AI, it's possible to algorithmically generate almost unlimited amounts of false or meaningless speech to drown out accurate information.


Thanks to AI can't we also do the opposite?

E.g. detect conspiracy theory speech and automatically provide relevant links/warnings to relevant curated sources?


The list of things that are true is orders of magnitude smaller than the list of things that appear true. Randomly generating plausible-looking bullshit is a thing we can do now, but ascertaining the validity of a factual statement is tantamount to solving the AGI problem.


"We" the public can't, but Google might. But that would require them acknowledging the problem. And once they've done that the easier solution seems to be to not recommend (which implies endorsement!) these things.


That's way, way harder.


> On the other hand, for centuries it's been recognized that the best antidote to "bad speech" isn't censorship, it's more speech. Don't ban, convince.

The problem with that here is that the medical industry is actually wrong. They want to censor speech because they are the bad science, and people are figuring it out. If they were right, then yes, "more speech" would be a great strategy.


This isn't their first rodeo. In 2008 Google launched Knol, which was a fairly major initiative to provide a service like Wikipedia, but without Wikipedia's problems around unattributed content. The idea was that authors would post with their professional affiliation, and get compensated with ad revenue. They hoped to attract actual medical professionals, as medicine was one of the areas where you want to be able to rely on the information.

It failed miserably. The incentives were favorable to spammers but not to actual professionals. No real expert would invest their time into the platform, putting their reputation on the line, for a few paltry ad dollars. But there were some people who mastered cranking out low quality content.

I'm not making any claims about the new effort, just pointing out that identifying a problem and putting resources behind a solution is often not enough.


I still feel like large tech companies don't understand the amount of power they wield beyond its capacity to influence buying decisions. I personally believe that YouTube (pre-2009) was the early breeding ground for conspiracy theorists, and Facebook groups/pages were the enriched gasoline added to the fire.


Everyone is saying this, but is there actually a measured increase in belief in conspiracy theories? I have only ever seen it stated as an assumption.


My father-in-law is a conspiracy theorist, and it came a lot more from Fox News and talk radio than YouTube. They were pretty mainstream well before YouTube.


The difference was that 1990s conspiracies were all about secret government agencies doing secret stuff in secret places. Modern conspiracies are out in public. Pizzagate, still believed by many, is different from Area 51. This visibility means belief is constantly reinforced.


Most of the people I've talked to aren't crazy alien conspiracy theorists, so I can't really comment on that brand of conspiracy theorist. Honestly my main experience is the standard 9/11, global warming, and Obama conspiracy theories (which were incredibly racist) from the early to mid 2000s, and those were all pretty similar to modern theories; at least for my father-in-law and his friends, they definitely came from Fox News and talk radio.

Edit: a good example was the start of the birther movement, which is a quintessential modern conspiracy theory to me, and I remember that coming from talk radio and Fox News.


Global warming and birtherism were a turning point. It was then that conspiracy theories became directly applicable to politics. Managing them then became a practical political weapon.


The John Birch Society predates both, and was a similar movement by similar people towards similar ends.


The question is a bit philosophical at this point given the boundaries, but is it truly their power?

The spread was driven by "conspiracy theorist" types looking for self-confirmation, first from others of the same type. It isn't like they could do anything to, say, get them to support sensible infrastructure upgrades, because it wouldn't flatter their egos.

YouTube Rewind got critically panned, and any overt efforts by YouTube tend to be poorly received.


The recommendation engine plays a very big role.

Fundamentally the ad companies (FGA) have a problem - their entire value proposition is that the little space at the side of the page and before the content is very valuable and has an influence on people's thoughts and decisions.

So if they then try to claim that the content itself, or the list of recommended "watch next" videos on Youtube, isn't influential, it fails the sniff test.

(Apple are mostly clear on this front. Netflix know the "watch next" is critical to their success, but they don't let people fill it with unvetted garbage.)


Pre 2009? Is this not true today?


Our society is so obsessed with the fear of censorship, or the "elite" hoarding all information, that we've now dived right into the opposite deep end. There's always too much of a good thing.

We created tech to usher in an era of new mediums. But we never stopped to ask ourselves what we should keep from the "old" systems. In any system that gets recreated/refactored, there is always a duplication of efforts. And that's what we're seeing right now. We're slowly realizing that editorialization is desperately needed.

In all these social graphs that have been created, we assume all "nodes" are rational actors, or are likely to be rational actors, and that information, if just set free, would steadily sort itself out. But that's not what we're seeing. Old diseases are new again and entire republics have fallen victim to misinformation.

It's well past time to address these issues. I thought 2008 was bad. I could never have imagined 2016. And now I'm severely worried about what nonsense we'll see in 2020.

Tech companies, and the employees in these companies, need to continually ask themselves what they're empowering.

EDIT: https://www.youtube.com/watch?v=PMotykw0SIk&feature=youtu.be...


Some firms are more on top of it than others.

Netflix's deal with Goop is not encouraging; Goop is just a "lifestyle brand" that is selling snake oil.


In the end they will always go for more money despite all the talk about values.


Well, I feel like it is also worth pointing out that all money starts to be a lot more persuasive when your business model is "lose other people's money until a miracle occurs". Honestly, if we have a system which permits/encourages sky-high valuations and seemingly interesting businesses to expand well beyond their users' desires or ability to pay, this is precisely what we get.

If you monetize on ads, you need damn expensive ads. If you want to have expensive ads, you gotta do something to actually make them work.

If sustainability vs. valuation were the goal, then there might be more incentive to avoid these kinds of odd, consumer-hostile behaviors. Users may not be the product, but they're not the direct revenue source either. As long as ads can trick people into clicking, there is an incentive to charge for those ads.


It's not all vapid nonsense - people are now tying branding to their personal identity more (probably as a side effect of social media, but I digress).

If a person feels a consumer-facing company doesn't align with their values then they won't be a customer.


The cure for bad information is usually more information.


Expanding on my own post... why is it any tech company's responsibility to protect people from their own bad decisions?


Are you serious? Because so far, in the US, the law demands it. Look at every product you buy: "Don't do X dangerous thing with said device". We regulate countless things that can be physically harmful to individuals. I can't go to the store and buy dynamite because I can harm myself and others with it.

So should information be regulated if it can cause physical harm to oneself and others? This is not an easy question to answer.


So law should dictate what we're able to say? There was a time when that was counter to the entire point.


Legal responsibility, or moral? The answer is of course "it depends," in this case on who you ask. Maybe they don't even think it's their responsibility, but they think it will make life better for their customers. I personally do a lot of things that aren't my responsibility, but which I think are good to do for various other reasons.


That makes sense in cases where you know what's best for others.


What if the bad information is the result of an already too low signal:noise ratio for the good information?

Kind of like this comic: http://www.poorlydrawnlines.com/comic/knowledge/


Thanks, this is hilarious. My response would be, maybe others' lack of information, or disagreement with my understanding of the truth, shouldn't bother me as much as people think it should.


In my opinion this hypothesis is totally wrong because the accuracy of information is the problem and not the quantity of it.

We can handle quantity, now we need to handle error correction and build better access permissions too.


The problem with censorship is that either the truth is obvious, and there is no need for it, or it isn't, so there is no way to decide fairly what to censor.


I don't know if censorship is the right branding here, because it implies an equivalence between authoritarian citizen-torturing regimes and how we effectively use and encounter "censorship" in our everyday lives when doing things such as registering a company name or posting porn on Facebook or something.

I can't sell vitamins that say "cures cancer" on the packaging, so why should a YouTuber get to sell ads while saying the exact same thing?


Eh, there is a problem: human bandwidth. If I spam-flood you with crap information you will never have time to take in good information. Fake/false/random information is far easier to create than good information; much like in any complex problem space, there are only a few correct answers and possibly an infinite number of incorrect ones.


I'm not convinced this is true. Take Wikipedia, for example: the overall information quality is extremely high, especially considering it is open to all to edit. One would assume (and many did at first) that fake/false/random information would reign supreme.

This is because Wikipedia's policies have made it clear that all information must be attributed to reliable sources. Multiple editors are aligned with that policy and work with others to enforce it.

The general populace has not been trained to do that, so false information proliferates. Maybe that reasoning skill is what we should focus on teaching.


No more than violence never solves anything.

Which is to say that it's a false aphorism.


This cracks me up. Violence solves things all the time. Speaking of aphorisms!


When the medical system actually starts curing people then they can claim their advice is the best. Otherwise it's just opinion. If you can't cure a disease you don't understand it.


Can they finish their other wars first?



