> I do not think there is much evidence that misinformation has become more widespread, that this increase in misinformation is due to technological change, or that it is at the root of the political trends liberals are most angry about. If anything, people seem to be better-informed than in the past — which is what you would expect because our information technology has gotten better [...]
This kind of framing makes me feel very weary. There is plenty of evidence, but I'm not interested in going on a hunt.
Rather, let's think for just one second about how to model this:
- The number of "bad actors" probably has not changed significantly over time
- The cost of producing documents has gone down significantly
- The cost of diffusing documents, or information, per audience headcount, has gone down significantly
- Overall, the total volume of sketchy information produced yearly has increased dramatically as a result of these falling barriers.
- Conversely, the total volume of quality information has clearly increased as well.
- The technological tools at our disposal, as consumers of content, have become more and more sophisticated, such that we now have access to more quality information than ever before in history.
- However, consuming quality content requires constant deliberate effort and prudence, and it is much easier to stumble upon crappy information than it is to find quality information.
- The less expert you are at a subject, the more energy is required to sift through the crap.
- Most people are either not expert at anything, or are expert in a narrow field of interest.
- Information no longer flows the way it once did: it now diffuses through a social graph that is prone to significant network effects. Non-experts can have a bigger influence than experts, even when the information they share is entirely meritless. Echo chambers form naturally and require a lot of deliberate, conscious effort to break out of. It is very difficult to reach certain pockets of the graph.
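That last point about diffusion can be made concrete with a toy sketch (the graph and every name in it are invented for illustration, not real data): on a social graph, reach is governed by connectivity rather than correctness, so a well-connected non-expert can out-diffuse a poorly connected expert.

```python
from collections import deque

# Hypothetical toy social graph as an adjacency list (all edges invented).
# "expert" is correct but poorly connected; "influencer" is well connected.
graph = {
    "expert":     ["a"],
    "influencer": ["a", "b", "c", "d"],
    "a": ["expert", "influencer", "e"],
    "b": ["influencer", "f"],
    "c": ["influencer", "g"],
    "d": ["influencer", "h"],
    "e": ["a"],
    "f": ["b"],
    "g": ["c"],
    "h": ["d"],
}

def reach(graph, source, hops):
    """Count how many nodes a message from `source` touches within `hops` re-shares (BFS)."""
    seen = {source}
    frontier = deque([(source, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == hops:
            continue
        for neighbor in graph[node]:
            if neighbor not in seen:
                seen.add(neighbor)
                frontier.append((neighbor, depth + 1))
    return len(seen) - 1  # exclude the source itself

print(reach(graph, "expert", 2))      # the expert's message reaches only a few nodes
print(reach(graph, "influencer", 2))  # the well-connected node reaches far more
```

Nothing in `reach` looks at whether the message is true; merit never enters the computation, which is exactly the point being made above.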
How come only experts are, or should be, allowed to speak on a subject? This leads to a technocracy, which, while enticing to those in the pseudo-aristocracy, sucks for everybody else involved.
Every time I see someone post this, all that goes through my head is that this person has an ego and thinks they should speak for the country/world.
As others have pointed out, I am not suggesting (nor do I think) that only experts should be allowed to speak on a given subject.
However, it is my (perhaps vain) hope that people understand that there is value in expertise. I highly recommend the following read: it's something so many people get wrong, and it's highly pragmatic.
There's value in expertise, but at the end of the day it's not the expert's life, and decisions should be left to the individual. Once you realize this, these types of comments become meaningless.
The person you replied to said nothing to imply they disagree with you. Perhaps we are forced to choose as a society between technocracy and misinformation. I'm suspicious of that line of argument myself, but "technocracy is bad" and "misinformation isn't a problem" are separate arguments.
Sure they did, it was implied in these three points:
> - The less expert you are at a subject, the more energy is required to sift through the crap.
> - Most people are either not expert at anything, or are expert in a narrow field of interest.
> - Information no longer flows the way it once did: it now diffuses through a social graph that is prone to significant network effects. Non-experts can have a bigger influence than experts, even when the information they share is entirely meritless. Echo chambers form naturally and require a lot of deliberate, conscious effort to break out of. It is very difficult to reach certain pockets of the graph.
That only argues that non-experts will be fooled. The universe is under no obligation to make misinformation less bad just because technocracy is even worse. Maybe we're just screwed!
Between you and me I suspect it's not that big of an issue, but no matter how bad technocracy is (and I agree that it's very bad) it still has no bearing on whether non-experts proliferate misinformation.
What you're really asking is "why do credentials matter, and why should randomly sampling information not result in an appropriate confidence in a result?"
After all, we have known for decades that the average guess of a cow's weight tends to be more accurate than any individual guess.
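That cow-weight result (the classic "wisdom of crowds" effect) is just error averaging, and is easy to simulate (the guessers below are synthetic, not real survey data): individual guesses are noisy, but independent, unbiased errors largely cancel in the mean.

```python
import random

random.seed(0)  # reproducible toy run

TRUE_WEIGHT = 1200  # hypothetical cow weight in pounds

# Simulate 1,000 independent guessers, each quite far off individually.
guesses = [random.gauss(TRUE_WEIGHT, 150) for _ in range(1000)]

crowd_estimate = sum(guesses) / len(guesses)
crowd_error = abs(crowd_estimate - TRUE_WEIGHT)
typical_individual_error = sum(abs(g - TRUE_WEIGHT) for g in guesses) / len(guesses)

print(f"crowd error: {crowd_error:.1f} lb")
print(f"typical individual error: {typical_individual_error:.1f} lb")
```

The crowd mean comes out far closer to the truth than a typical guesser, but this only holds while the errors stay independent and unbiased; it breaks down the moment someone deliberately skews the responders, which is the point made below.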
The problem you are responding to is how cheap messaging is and how effortless it is both to produce information and to entrain the body politic in its recitation.
These can be overcome by what you deride, credentialing and weighting sources, because in the example of the cow there are no people interested in giving wrong answers or in manipulating responders to effect a skew in the answers.
We know, and have directly seen, that that is a naive point of view. As such, your response takes a simplistic example and broadly interprets everything naively.
> What you're really asking is "why do credentials matter, and why should randomly sampling information not result in an appropriate confidence in a result?"
No, not at all. What I'm saying is that one's ability to speak on one's experiences with some subject should not be gated by credentials. One should also not create a rank structure based on credentials. Is it not apparent to you yet that one can be an expert without any formal credentials?
> The problem you are responding to is how cheap messaging is and how effortless it is both to produce information and to entrain the body politic in its recitation.
> These can be overcome by what you deride, credentialing and weighting sources, because in the example of the cow there are no people interested in giving wrong answers or in manipulating responders to effect a skew in the answers.
What are you fighting so hard to protect here? You want more people to take the COVID vaccine? What, exactly? There are many more solutions than just listening to your closest high-ranking technocrat.
> If anything, people seem to be better-informed than in the past
[citation needed]
More information ≠ better informed, in the same way that drinking from a firehose doesn't let you ingest more water than drinking slowly from a glass. This point is, ironically, missed by a website called "slowboring.com".
I would suggest that a slightly better firehose analogy is that you're drinking from a firehose that includes multiple streams of good drinking water, urine and poison all combined together. When you drink from that firehose, what do you think the odds are that you'll just get the drinking water part of the stream?
Hehe I really love the firehose visual, and I think it's quite apt... Except that there's also a "pollution" aspect to information.
If you consume enough misinformation, it doesn't matter who you are or how smart you are, it's going to affect the way you think. Just as in terms of physical fitness, "you are what you eat", in terms of mental fitness, "you are what you read/hear/consume".
I believe he was using the source article. One example was a leap in public polling around how our government functions (33% jump in being able to identify the three branches of the US federal government).
Similar numbers can be found about faith in democracy, views toward fellow citizens, etc. It's not just a lack of knowledge; it's the embrace of polarizing extremes.
Have you read the article? He talks specifically about this very subject, and why this narrative is wrong.
> When I asked Twitter followers to suggest the best evidence they had that misinformation has become worse than it was 30 years ago, a lot of people expressed their frustration with the people who won’t get Covid-19 vaccines. I also find this extremely frustrating.
> That said, vaccination rates for kids have actually risen since the mid-1990s [...]
> A lot of people know that the licensing of the polio vaccine in the 1950s was widely greeted with celebratory headlines and the ringing of church bells. [...] Eighteen months after authorization, vaccine uptake was still slow, and that was after a much longer development process.
The COVID vaccine is not the same as other vaccines. One can have hesitancy toward the COVID and other mRNA vaccines and go get the flu shot the same day. If you want to solve the hesitancy problem, you must recognize this first; otherwise you’ll be stuck in a loop like you are now, insisting they’re safe just like every other vaccine in history.
mRNA isn't new (it was discovered in the 90s) and has arguably been studied far more than other types of vaccines. Also, there are non-mRNA COVID-19 vaccines available as well, if that is your issue.
Accounting for both of these facts, how is vaccine hesitancy justified?
Actually the 1960s, and yet I’m still not wrong as the first mRNA vaccine approved for use was Pfizer’s COVID vaccine. So an actual large scale usage of an mRNA vaccine just happened.
> and has arguably been studied far more than other types of vaccines
Gonna need a source for that.
> Also, there are non-mRNA COVID-19 vaccines available as well, if that is your issue.
> Accounting for both these facts, how is vaccine hesitancy justified ?
And this is explained by a lack of fear of the virus. Many people have caught it and recovered, substantially more than have died; therefore, the vaccine is not needed.
The promise of the internet is that more information will improve our accuracy. This doesn't take into account the propensity of some to want information that confirms their beliefs, not information that is confirmed accurate.
The poll backing that tweet appears to be exactly the problem you're highlighting, though. It's a bunch of academics polling people on questions selected specifically to target a minority of right-wing people, which they then use to "prove" that Democrats are better informed. And also that young parents are less informed than older non-parents, which they discover for reasons we'll get to in a moment.
But hey, anyone can do that. Here's a poll that shows the opposite - that right wing people are better informed about COVID than left wing people:
Except that's actually a professionally done poll.
The "Covid State Project" is an academic consortium (so we can guess that their work won't replicate anyway), and these people REALLY want to believe that left-wing academics sit at the top of the informational totem pole. This sort of polling is a waste of time because you can get any result you like; it has confirmation bias all over its design. Want to show that <outgroup> is stupid and dumb? Find the most extreme and stupid beliefs held only by a tiny minority of members of that outgroup and poll everyone on only those questions (an actual question they asked: "do vaccines contain microchips").
In fact, not all of their questions are even correct. They only asked four, and one of them was "is there any impact on fertility", which was rated false. This is itself a lie, and I know this because my fiancé's best friend lost her period for nearly six months. No periods = no fertility, so I have seen with my own eyes that this can happen. Not surprisingly, this is the question the highest number of people said yes to, yet they rated this belief false and claimed anyone who believes it is "misinformed".
This probably explains their otherwise unexplainable finding that young parents are amongst the "wrongest" people. Young parents are exactly the sort of people most likely to know lots of women who care about periods and fertility, and thus the first to pick up on widespread disruption of menstrual cycles. When they answer, correctly, that vaccination can affect cycles and thus fertility, they get graded as misinformed.
It's garbage-grade research like this that makes academics look so corrupt and terrible. Fourteen academics worked on this, and a poll with four questions, one of which has the wrong answer, is the best they could do.
> The "Covid State Project" is an academic consortium (so we can guess that their work won't replicate anyway)
I'm not sure what you mean by that (perhaps just a simple case of discrediting the messenger in order to ignore the message), but this fact:
> The single biggest predictor, by far, of belief in vaccine misinformation? Identifying as a Republican, reflecting the widespread misinformation and conspiracy theorizing about Covid on the right.
Is supported by polling, hospitalization rates, death rates, etc. Denying it because it makes your 'side' look bad is not only silly, but dangerous. People are dying.
I wish I could upvote this 10x, though I see atm that is not possible.
A very nice summary of priors and predicates that need to be grokked!
To these I myself often add several aspects looking more specifically what "social graph" means and what the so-called algorithm means, with lemmas about "engagement" and its industry...
...and the underbelly of how the algorithm works, the constellation of surveillance tracking entity resolution and targeting that make up the actual functioning of "teh algorithm."
These regularly lead me back to trying to find pithy summations of specific paths through this mire, such as "deplatforming is not censorship, it is failure to provide or sell a megaphone," and, "we need to reason about 1A the way we do about 2A," (ie in terms of understood liberty vs aggregate societal harms).
Another such summation is simply to observe that the subset of "the information out there" that any individual sees, and which is amplified within their own social circles online and off-, is the most hotly contested property in the market today. In many markets.
Against this, individual integrity and good faith often appear to be but a reed in the wind.
> These regularly lead me back to trying to find pithy summations of specific paths through this mire, such as "deplatforming is not censorship, it is failure to provide or sell a megaphone," and, "we need to reason about 1A the way we do about 2A," (ie in terms of understood liberty vs aggregate societal harms).
This is one of the scariest paragraphs I’ve seen on HN in some time.
Who exactly gets to reason about these changes? How do we change 1A and not cause censorship? Our political parties have thrown the entire nation into division, and one side wants to limit speech while the other doesn’t. So what do we do? Implement a dictatorship?
While we’re here: 2A has not caused societal harms, the humans did. Separate them already. The margin between the number of gun owners and those shooting up places is very, very wide.
By "we" I mean, we as a society. Me, you, all of us.
By "reason" I mean, we need to have open debate about where the appropriate line today is in various domains between competing goods and aims.
The conflict is often as here, between collective interest, and individual liberty.
Not the time or place to reopen/rehash well-trod arguments,
but I will observe that from the perspective of someone who believes in gun control, and to my point about the analogy,
our collective problem actually is precisely that humans, regardless of good-faith intentions, are defined by predictably acting out of accord with those intentions,
and hence we as society have reason to seek to regulate and mitigate the impact of force multipliers.
In the US, guns are the most well-known force multiplier in the domain of interpersonal conflict.
But today the force multiplier which arguably has orders of magnitude more impact, is "speech."
The distinction between "people" and their tools is IMO specious. Humans have agency; what is at issue is that given that agency, what do we agree about tool use?
Reasoning about "free speech" does not === censorship. It means, we need to educate people to have a better mental model of what our industry (here on HN) is capable of doing and is doing, in the feedback system which marries surveillance and the construction of personalized information streams.
The short-hand way for describing this is a good start: it amounts to stochastic mind control. No individual may be "programmed," but our industry and its clients put billions of dollars and literal and political fortunes among other resources, into the presumption that our current tooling is capable of producing discernible changes in collective behavior.
We argue here and elsewhere regularly as professionals about the ethics and consequences of using this system to e.g. provoke consumption.
When I say we need to reason about speech, I am myself most interested in us coming to terms with, and finding a way to survive as a society which e.g. values individual agency (call it liberty if you prefer), when that same tooling over the last decade especially has increasingly been used and with exponentially increasing power, to drive social and political behavior.
In the US this is most evident and most clearly a present danger wrt the fomentation of tribalism, in specific tribes which are largely defined by their anti-democratic, anti-communitarian authoritarianism, married by convenience and inclination to racism, homophobia, misogyny, and the exhausting catalog of other permanent social horrors.
Twitter, or the online ad market, are no more "free speech" in an Originalist sense than a contemporary semi-automatic weapon is a musket.
In both cases, we need to understand and apply Constitutional law to find the appropriate balance between collective and individual interests,
and what I am saying is, we need to do so with an honest discussion of evolving power dynamics, given that individuals now have at least hypothetical access to force multipliers.
To make this concrete: the 2016 and 2020 elections were subject to asymmetrical information warfare. Asymmetry in the terrorist sense: a small but motivated group, such as the GRU, is capable of provoking catastrophic disruption.
I do not posit any particular answers.
But we need to have the hard discussion, the sooner the better.
> in the feedback system which marries surveillance and the construction of personalized information streams.
You've continued to terrify me. How can we construct "personalized information streams" without censorship? By definition you either allow someone to view information outside their stream, or you don't. It clearly sounds like you're saying we shouldn't allow people to construct their own streams, but have someone else do it for them. Therefore censorship.
> I do not posit any particular answers.
> But we need to have the hard discussion, the sooner the better.
And again, what would the results of this conversation be? You have one side demanding control over speech, and the other demanding no control over it. One side, once some perceived threat arises, wants total control and extermination of the perceived threat. Yet the other side doesn't perceive this threat. The other side believes this threat to be an authoritarian attempt to stifle non-aligned speech and thoughts.
What great societal damage has you wanting to control speech so much? COVID? Vaccine hesitancy?
Exactly. The fundamental difference in advertising today compared to 30 years ago is that for the entirety of history until the last two decades, it was expensive to target individuals. Ads on TV and in newspapers have to adhere to standards established by organizations like the Advertising Standards Council. Malicious ads, or propaganda that targets individuals with misinformation or malware or phishing, can be bought on modern platforms for pennies or mere fractions of a penny. Nobody is being held to account for the damage they are doing to our society. In the past, at least the cost of postage was a barrier to entry, but today? This desperately needs to be fixed.
What misinformation have you been convinced of because you saw a targeted advertisement? Or a slightly different version: What views do you hold that originated from a targeted ad?
Or for some more specific examples:
In elections, do you start out looking to elect one candidate and then you see a targeted ad and you change your vote because of it?
For COVID vaccines, did you start off with a stance and then you see a targeted ad and you change your stance because of that ad?
As I post every so often here on Hacker News, have a look at this ad my elderly father got a couple of years ago just before COVID hit: https://imgur.com/a/Lk85lST . It's a perfectly innocent search by someone looking for help with a problem that directs towards an ad for a phishing site. The worst part is that it's highly targeted. I tried doing EXACTLY the same search from my own phone and laptop and got completely different results despite coming in from the same wifi network at the same location and the same time. This isn't acceptable. If malicious ads couldn't be targeted like that so that more people saw the same unacceptable content, it sure as hell would be reported a lot more. Instead, someone has made a decision that nobody should be held accountable. It generates a profit, so it's OK, right?
Oh totally, that ad seems malicious. I was just curious about the misinformation angle that you mentioned and how you in particular have been duped or had your opinions changed because of a targeted ad.
I don't see those ads. My filter bubble is completely different. Personally, I think it would be beneficial for society if all of us were exposed to information that falls outside of what our individual filter bubble normally shows us from time to time.
Another example that came up in my life yesterday relates to the trucker protest here in Ottawa. One of the news stories from last month about restrictions is being heavily promoted by people sympathetic to the cause. However, in the intervening 4 weeks since this particular story was published, those restrictions were lifted and are no longer being enforced. The people promoting the old article do so because it benefits their narrative, but they are effectively spreading misinformation now by failing to acknowledge that the content is no longer true. That's a problem, and that kind of behaviour is not easy to solve. Maybe someday our web browsers will have the ability to flag content that is obsolete, but that's a pipe dream at present.