The 'attention economy' corrupts science (bigthink.com)
323 points by respinal on Oct 3, 2022 | 167 comments



The attention economy corrupts everything it touches: not just science, but journalism, politics, and even childhood.

Being famous used to be rather difficult. Of course there were exceptions (writing To Kill a Mockingbird, being the guy who dove into the river to save a drowning child, for example), but for the most part, you were going to live your life known only to the few hundred or thousand people you met personally.

Even though you could pick up a phone and dial anyone in the world who owned a phone, you wouldn't, and if you did, they'd hang up on you. Now you can force your idiotic, or great ideas onto the screens of millions of people you'll never meet.

Is that a good or a bad thing? It's certainly bad in some ways, and this is one of them.


The problem seems to be more fundamental. Attention is not inherently bad; the issue is how, and what kind of, attention is rewarded. Many platforms reward engagement with attention-seeking behaviour, both good and especially bad, because it easily evokes primal emotions in the audience. And so there are incentives for content creators to continue peddling shitty content.


...it is more fundamental.

It's Moloch - the optimization for one criterion (https://slatestarcodex.com/2014/07/30/meditations-on-moloch/) on the grandest scale. It's attention in the "attention economy", but attention is just a signal of revenue opportunity, which is profit extraction, aka the optimizing measure for Capitalism.

To be clear, I am not even against Capitalism. It's been a powerful tool for driving market economies, and in conjunction with social justice it has lifted all boats. The problem now is that it's been eroding foundational societal elements that make contemporary society (and Capitalism itself) possible in the first place... like attention, science, community, political sense-making, and more.

This is different from, but related to, Goodhart's Law (https://en.wikipedia.org/wiki/Goodhart%27s_law).


> market economies

One particular market economy is well ahead of the others in the experiment to erode democracy. In the marketplace, truth, fact, lies, and propaganda are all just information.

But democracy depends on education and instruction. These two critical types of information are undermined by for-profit purveyors of falsehood. The dysfunctional cycle is complete when people themselves crave the lies and propaganda more than they want the truth and facts.

Belief is repeatedly rewarded; critical thinking becomes an ancient habit that only the signatories of the constitution appear to have exercised.


This is a result of capitalism with unlimited credit-based money supply.

Modern internet businesses rely on the equation CAC < LTV, that is, “is the cost of acquiring a customer less than the lifetime value?” If so, they’ll spend money on advertising and thus support an attention-based feed.

The fact that advertising now leads to increased revenues, which can be used as a basis for credit, completes the feedback loop.

With money creation coming from credit, interest rates can be unreasonably low. Bond yields, even for dismal companies, have been ludicrously low by historical standards. Advertising being a tool to turn money into revenue streams, anything that produces an emotional response will be monetized to the extreme.

A company can raise cash by borrowing at essentially zero cost, use this cash to buy ads, and increase the demand for attention.

With fixed money supply, the interest rates wouldn’t get nearly so low. That would reduce demand for advertising because the cost of borrowing the money to acquire a user might exceed the LTV of the user.
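
To make the CAC < LTV logic concrete, here is a minimal sketch with purely hypothetical numbers (not taken from any real company): treating LTV as the discounted sum of a user's future revenue shows how a lower cost of capital inflates LTV and lets more ad spend clear the bar.

  # Hypothetical illustration of the CAC < LTV ad-spend decision.
  def lifetime_value(monthly_revenue, months, annual_rate):
      """Present value of a user's revenue stream, discounted monthly."""
      r = annual_rate / 12
      return sum(monthly_revenue / (1 + r) ** m for m in range(1, months + 1))

  CAC = 60.0      # assumed cost to acquire one user through ads
  revenue = 2.0   # assumed revenue per user per month
  months = 36

  for rate in (0.01, 0.05, 0.15):  # near-zero credit vs. historically normal rates
      ltv = lifetime_value(revenue, months, rate)
      print(f"rate {rate:.0%}: LTV = {ltv:.2f}, buy ads: {ltv > CAC}")

With these made-up numbers the ad buy pencils out at 1% and 5% but not at 15%, which is the mechanism described above: cheap credit makes more attention worth buying.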


If I say: Then attention becomes more valuable under Capitalism, so a significant portion of people will choose to have low screen time? Think of a "Deep Work" Movement or so.

I don't intend the above to be read as a would-be theorem in economics ("if-then"), I just want to know how you reason here.


> Then attention becomes more valuable under Capitalism, so a significant portion of people will choose to have low screen time?

I'm not the person you've responded to, but I don't think that these sentences are logically connected.

If the former is true then the hustle to get this attention will become even stronger, increasing the attention economy. Unless you're not talking about valuable in a monetary sense, but that would mean you've moved on from capitalism.


I think the point is that if everyone is hustling for attention, the ability to do deep work becomes more valuable.

Humans are weird, attention isn't nearly the only way to make money, and in fact, attention itself _depends_ on these other ways to even exist.

Think of it like the rich. Even though they themselves would never build a road, nor do they value building a road, they benefit hugely from them, such that those who are able to build roads make money. And the fewer people there are who can build roads, paradoxically, the more valuable the ability to build roads becomes.

It's for these reasons that things are so cyclical, this too shall pass.


I don't know, it implies people would be more conservative with their attention if they had some better way to spend it. I mean, there are obvious ways like studying or being with good family and friends or exercising, but they're hard or undesirable for x/y/z. Maybe some of these difficulties are an inherent part of the value; sometimes they're probably not.

Technology should be modulating these difficulties, and instead some of it is short circuiting our more basic emotional systems. Lower hanging fruit. Though I wonder if the more difficult tasks that make us happier, more sustainable as individuals could be pursued in a purely capitalistic incentive scheme, provided it's viewed on a time scale of ten or twenty years instead of one. What businesses take such a long term perspective though?


The thing that scares me the most is that unlike reading a novel, or pensively writing a letter in the 19th century, we focus much less now on one thing for long periods. Even looking at HN, but look especially at TikTok, YouTube “shorts” — the dopamine pinball game going off in our brains and constant change of focus is robbing us of a skill which I fear will have unforeseen consequences at the scale of our global society.


And yet, in my career, I've noticed the rewards are increasing for being the person who is willing to focus on one thing for a long time (for several weeks, or months). For instance, I've never been the kind of software developer who could write obviously clever code. But I have written code that was admired and praised, and sometimes seen as the salvation of the company I was working for -- not because I'm especially skilled as a software developer, but only because I was willing to think about specific problems, deeply, for longer than anyone else at the company.

In 2012/2013, to the extent that I helped re-invent the tech stack at Timeout.com, it was because I was willing to spend weeks thinking about exactly why we'd reached the limits of what we could do with various cache strategies, and then what would come next. I then introduced the idea of "an architecture of small apps", which was the phrase I used because the phrase "microservices" didn't really become widespread until Martin Fowler wrote his essay about it at the very end of 2013. Likewise, I now work as the principal software architect at Futurestay.com, and my main contribution has been my willingness to spend weeks thinking about the flaws in the old database schema, and what we needed to do to streamline our data model and overcome the tech debt that built up over the 7 years before I was hired.

We live in a world where there are large economic rewards for the kinds of people who are willing to think about one thing, deeply, for weeks and weeks or even months and months, until finally understanding a problem better than anyone else.

I have to hope some young people eventually escape the attention-sucking technologies that try to sabotage their concentration, and eventually discover the satisfactions of thinking about complex problems, continuously, for months and months and months.


A very similar experience here (although not to quite the same level of success!).

Even for simpler problems the ability to just sit with it for a few hours/days dramatically increases your ability to write quality solutions.

Good code isn’t about the most difficult or complex CS algorithms. It comes from thinking about the problem until you understand it so well that the code is almost self-evident.

Of course, once you’ve thought things through so much it seems trivial to you - which I blame for the annoying verbal tic we all tend to get when explaining things where we say “Basically it’s…”, or “it’s really quite simple…”

The real fun starts when you get to the problems that _do_ need weeks of thinking, and multiple prototypes to validate your theories. That’s when you start to feel like the “science” in “computer science” might not be a misnomer :)


"which I blame for the annoying verbal tick we all tend to get when explaining things where we say “Basically it’s…”, or “it’s really quite simple…”"

Very true. I notice this in advanced deep-dive books on graph theory, queueing theory, or neural nets, where the writers use the word "obviously" about things that are obviously not obvious.



I can relate, I've experienced this: after thinking for a long time, the solution just clicks. It's about the intricate details.


yep. Another post of mine in this thread is pointing out that the less people do of this type of work, the more valuable it becomes because the work is necessary.

Things are cyclical, and as it becomes obvious that sort of work has become more valuable, more people will start doing it.


Pretty sure being famous is still pretty damn hard. For every virally famous TikTok influencer there are thousands of wannabes that nobody cares about.

> Now you can force your idiotic, or great ideas onto the screens of millions of people you'll never meet.

I'm sorry, did a hand reach out of your phone and force you to install and view the app of the week?


Still hard but I think the parent is arguing that there are more famous people today than say 50 years ago. One of the things the internet did is create more niche groups. There's way more B list and C list celebrities than ever before. Being an A list is still very hard, but being a celebrity in general is easier. Plus there's all the one hit wonders and many more of them. Even the "runs into burning building" famous people are much more likely to become known across the country or globe than before


How would you quantify that?

Take 50 years ago. There were more newspapers then - every town had its local columnists, who were the bloggers of the day. Big cities had multiple daily newspapers.

TV and radio broadcasting wasn't so centralized. There were a lot more local radio DJs and locally produced TV shows.

But if we look back at history, it's mostly the A list who come to mind.

As 99% Invisible suggests, "always read the plaque" - for example, from https://99percentinvisible.org/article/always-read-plaque-ma... :

] Roman: If you’ve never heard of Ellis Chesbrough, you’re not alone. In fact, as I record this, he doesn’t even have a Wikipedia entry. ...

] Dan: But back in the 1800s, Chesbrough was the man. And no one has ever worked harder to save Chicago from its own poop.

(Though he did have a Wikipedia entry, says https://en.wikipedia.org/w/index.php?title=Ellis_S._Chesbrou... ).

A better 99% example is from the mid-20th century, when a college student was so locally famous the student body voted to name a building after him. But I can't find that example.


I expect the nature of niches has shifted to be less localized and more subject-matter based. Until recently, there really wasn't an equivalent of Instagram or Tik-Tok influencers that I have doubtless never heard of but which have many thousands of followers.

To the 99 PI point, I'm always struck walking around a city like London how much significant statuary there is of people I've never heard of in spite of being reasonably familiar with British history. And, yes, some of them probably don't have a Wikipedia article and if you were to create one, some admin would probably decide it was insufficiently notable.

>TV and radio broadcasting wasn't so centralized. There were a lot more local radio DJs and locally produced TV shows.

I'm not sure I agree with this in general though. No, you're less likely to know of local DJs today. But go back a few decades and "everyone" watched the same lineup of TV on a Thursday night and it would probably have been something of a cultural knowledge shortcoming if you didn't know who the network news anchors were. (I could name them from 25 years or so back. Really wouldn't know today.)


But there were other sorts of fame which are less popular today than yesteryear.

Secret lodges were once very popular, with their own hierarchies and (internal) fame.

Newspaper kiosks would carry a wider range of newspapers and magazines, but have now effectively disappeared.

Local clubs were also more common. When my old neighborhood was built in the 1950s, one of the lots was set aside as a clubhouse, with square dancing events. (It's since been turned into a pool.)

> go back a few decades and "everyone" watched the same lineup of TV on a Thursday night and it would probably have been something of a cultural knowledge shortcoming if you didn't know who the network news anchors were.

Sure, but godelski was talking about 50 years ago.

In the 1970s, a local TV station in Miami had "Toby the Robot" in a show to read the Sunday comics - https://www.pbase.com/donboyd/image/132365543 . Toby would also appear in local parades.

At https://www.pbase.com/donboyd/memories_tv_and_radio you can see some of the local Miami TV shows produced in the 1950s and 1960s .

Rick Shaw was an important local radio DJ - https://pbase.com/donboyd/memories_rickshaw . Read the comments about how locally famous he was.

Go back a few more years and there was even less network programming. If you saw the musical "Hairspray", that portrays a show based on Baltimore's Buddy Deane Show, which was one of several local teen dance television shows later replaced by national shows.

Remember, it wasn't until 1951 that we had a nation-wide microwave system that could carry TV broadcasts, and at the beginning most TV shows were still created locally. The Prime Time Access Rule went into place about 50 years ago to try to prevent that centralization.

(For another fictional movie portrayal, O Brother, Where Art Thou? shows The Soggy Bottom Boys achieving local fame because of their song on The Flour Hour.)

So even though there are new methods now, my observation is still that there were other ways to get famous, and some of these ways are no longer so common, making them somewhat hidden to modern viewpoints.

How then would you determine the validity of "There's way more B list and C list celebrities than ever before."?

I certainly don't know how.


Many of your examples don’t seem to fit fame. People may be known locally, but fame requires someone to be widely known which implies a wide geographic range.

Put another way, at what level is the winner of a beauty contest famous? I don’t think people would universally agree except at the extremes. Miss America is famous; the winner of a spring break wet T-shirt contest isn’t, barring something unusual happening.


https://en.wiktionary.org/wiki/famous#Adjective gives this counter-example:

  2. In the public eye.

    Some people are only famous within their city.
A DDG search for examples easily finds things like:

"Throughout Newberry’s long history, there has been an accumulative list of memorable and locally famous people. Machinist and inventor, Reginald S. “Reg” Ruggles was one of them.", at https://mynewberrynews.com/community/newberrys-machinist-and... :

"Most of the letters ended with a PS containing one of Aunt Agatha's aphorisms, which became famous throughout the county" - https://en.wikipedia.org/wiki/Sidney_Grapes


>Miss America is famous,

Is Miss America even famous these days? I sure couldn't name one--though perhaps I'd recognize names from years past.

I tend to agree that if you go back a number of decades more local people were probably fairly well-known locally but I'm not sure I'd put the top guy at the Elks Lodge in the "famous" category. Draw a small enough circle in geography or niche interest and a lot of people are notable to some degree.


I think they are famous in the same way Nobel prize winners are very well known within their branch of the scientific community.

In both cases they easily qualify for a Wikipedia page though I doubt most people can name last years winners.


To some degree, of course, it's because fame/notability in the Miss America case isn't really separable from relatively reliable third-party sources (though I wouldn't put a lot of money down on whatever life story agents and PR people have concocted). The same tends to be true of actors, pro athletes, and politicians above some minimal level. Even senior company executives and academics may not have much written about them.


The population is also larger than it was 50 years ago


Which also means that the number of people who are known by >1k people also increases. The question is about proportions and locality. I feel pretty confident in saying that the rate of non-local celebrities has dramatically increased faster than population. I would also wager that the number of total celebrities (as defined above) has also increased faster than population and accelerated through the Instagram and even more through the TikTok age. We have far more ways to communicate than previously, especially non-local. I can talk to someone in Japan without a HAM radio or an expensive long distance phone call. Hell, I wouldn't be surprised if someone living in Japan reads this comment.


You can't actually be confident until you run a statistical test.


Being famous doesn’t require my personal attention, just the attention of people who actually downloaded the app of the week.

The barrier to being famous dropped as people spend less time on any one thing. A few seconds of attention is qualitatively different than reading a novel or even watching a movie.


That's ridiculous. You think that influencer fame just happens? Influencers put lots of effort into being famous. Probably a similar amount as Hollywood starlets of yesteryear put into being "discovered".


How many musicians do you think put in extreme effort without becoming famous?

It’s not that it’s easy, it’s that more people make it to B and C list fame.


I don't understand this attitude. Who is the judge of what ideas are great and what ideas are idiotic?


[flagged]


>Are you suggesting that it is difficult to judge what ideas are great and what ideas are idiotic

that's quite an arrogant thing to say and probably will rub a lot of people the wrong way.

but let's give it the benefit of the doubt.

Let's say that there is a quality of greatness separate from the qualities of being likely to succeed, easy to implement, monetizable, and so forth.

Where idiotic is concerned, I would wonder if this is a quality separate from being physically impossible, illegal, or having obvious undesirable side effects. Does being guaranteed of business failure for an idea you hope to build a business on make that idea idiotic?

Now, once we have this question as to what exactly comprises greatness and idiocy in ideas, we can say there are many intelligent and successful people who have thought an idea was brilliant and would be hugely successful (two qualities we have separated), only for that idea to resoundingly fail.

The Segway comes to mind. I remember when that was first revealed, I thought hey this thing is genius, amazing. I sent it round the office, everyone had a big laugh about how stupid it was, including our lead designer who went into a big tirade about how people have bikes (in Denmark), Americans should just get bikes, they were not going to redesign their cities to accommodate the Segway if they don't redesign to accommodate the bicycle etc. etc.

So, was the Segway an idiotic idea, a brilliant idea, or a mediocre one?

I submit it was both brilliant and idiotic.

Our designer who saw the idiocy did not see how we would end up with the computerized autobalancing of the Segway in everything (except evidently bikes), how we would have autobalancing electric scooters, skateboards etc.

The people who saw the technical brilliance of the Segway did not realize how it was just not going to be a successful consumer product.

So - would you have judged the Segway as brilliant, idiotic, or just mediocre, and why?

on edit: clarification, fixed some grammar


The underlying idea made sense. It just looked incredibly dorky compared to the small unicycle things I see zipping around these days. I remember back in the day people used to joke that only an American would be willing to use those things.

That said, it will only worsen the obesity epidemic.


No, that's not what they're suggesting. That's a straw man that begs the question. They're suggesting there's no such thing objectively. Not everyone agrees on what's great because it's subjective and different people value different things. You're trying to reify your taste and in the process confounding your subjective values with objective values (which can't exist.) PG does the same thing in one of his essays, whereupon this was repeatedly pointed out.


"Not everyone agrees on what's great because it's subjective and different people value different things."

You are confusing people's disagreement with the competing definitions of "objective" and "subjective". These are separate things. Something can be objectively true and yet people will still disagree about it. Objectively, vaccines for Covid-19 can lower the risk of death for those who get a case of Covid-19 -- people disagree about it, but it is not a subjective question. Objectively, Darwin's theory of natural selection explains much of the development of the diversity of life on Earth, but people still reject the theory. There are many objective facts in the world that people still disagree with, because some people are irrational. What you seem to be confused about is the role disagreement plays in determining whether something is objectively true, which is to say, it plays no role at all -- many things are objectively true, yet fiercely contested by irrational people. The fact that irrational people exist does not mean that everything in the world is subjective, it only means that some people have some experiences which they experience subjectively.

George Orwell made the remark that some people could not be brought around to facing reality till their wrong beliefs were tested on a battlefield, and it is true: if you shoot someone in the brain, the argument that everything is subjective comes to a sudden end.


Except we know for a fact people value different things. It is undeniably subjective. No, you have no argument here. People disagree because they value differently.


You are simply playing with words in a manner that is pointless. There are clearly objective facts about this universe that can be plainly stated. And there are rules of logic that allow us to objectively identify certain sets of assertions as containing a contradiction. Therefore, as I said above, “Are you suggesting that it is difficult to judge what ideas are great and what ideas are idiotic? I actually find this easy.”


Nope. I value different things than you value, because we feel differently about different things (subjectivity!) That's it. It's not complicated.


You keep making the same mistake, which is to assume the existence of feelings disproves the existence of an objective reality. Reality exists outside of our feelings.


Feelings are the only way anyone values anything. There is no evidence of any other kind of value. It's as invisible and non-existent as any god.


>Now you can force your idiotic, or great ideas onto the screens of millions of people you'll never meet.

No you can’t. Attention is scarce. That’s why it’s called “attention economy”. In fact, it was far easier to force idiotic ideas onto the screens of millions of people before the internet came along. That’s what TV commercials were.


I think you're thinking about corporate/org-scale attention seeking. Putting things into historical context, getting attention today is certainly easier for a random individual. In fact anyone could directly compete with large organisations and even surpass them in some cases.

You can set up a YouTube account then do episodic dumb shit, and I can guarantee you'll get thousands if not millions of views eventually. You simply couldn't do that 30 years ago.


Exactly. And it is, indeed, still hard to become famous, but not as hard as 30 years ago.


> That’s why it’s called “attention economy”.

Those are your words. How many people do you think have already read them? How many more do you think will have read them after 10, 20, or 50 years?

If you'd said the same thing before we were using computers to communicate with each other, who would you have said it to? How many people would have heard it? How long would those words have been discoverable? What chance would you have had to have your words reach millions of people?

Forget about you, what chance would I and everyone else have? Speaking as someone who was born before the internet, it's much easier now than it was when I was a child for my words to reach a massive number of people, and every child alive today has a better chance than I did at their age.

Attention is scarce, but when you've got near-instant communication over a global network of billions of people, you only need a tiny fraction of them to be looking your way and you've got your ideas onto the screens of millions of people.


> and if you did, they'd hang up on you

Not so much as you'd think. Try it. You'd be surprised.


You should write a blog post about that, with transcripts. I bet it'd get lots of attention and go viral.

See how easy it is to become famous? I just told you how :)


these days I'd be surprised if they even pick up in the first place. If I don't know you and I'm not expecting your call you get voicemail. Some folks basically never answer their phone because 'everyone I want to talk to knows to text me'.

I don't doubt you'll eventually reach someone who would stay on the line and chat a bit, but unless you're lucky it could take a while.


It is well-established that the human animal is evolved to live in small groups. When people come together in large numbers, it is a special occurrence, limited in time and space. The idea of living "as if" one is always in this situation is unnatural and arguably unhealthy. It is not something we should be promoting or even allowing. We should be promoting small groups.

If I was asked to "regulate" this problem of so-called "altmetrics", and the "attention economy" in general, here is how I would do it.

Twitter used to be based on SMS, but since 2020 it is just a gigantic website like Facebook. These two mega-sized websites are the primary sources of "altmetrics". If we take away the right to create these gigantic outsourced websites, what would happen.

I would place limits on websites that are comprised of so-called user-generated content. For example, if someone wants to run a website with millions of pages, they are free to do so. (If they could actually produce enough content to justify so many pages.) However, they are not free to have millions of different people author the pages. A website could not be a mass scale middleman (intermediary) for people who wish to publish using the www. A mega-website doing no work to produce content, financed solely by selling people out to advertising could not, for example, supplant an organisation that employs journalists and editors to produce news.

By regulating creation of these mega-websites we could reduce the incentive for advertising. The mega-websites would lose their traffic and disappear. They would be replaced by normal-sized websites that cater to specific audiences.

Allowing a few websites to grow to enormous size while not having to do any work to produce content has been a mistake. Of course they can make billions in advertising revenue. It also allows any notion of egalitarianism in the www's design to be compromised in favour of a sharecropper model, so-called "platforms".

Without oversized websites no one would be able to publish their content to every website in existence. No website would be able to do zero work to create content and yet act as a middleman drawing massive traffic that can be monetised via advertising. That is what these mega-websites like Twitter and Facebook do. They sit between people who create content and people who consume it and sell these people out to advertisers.

The cost of publishing data/information to the www will continue to fall. The technology to make it easy will continue to advance. We do need to be able to communicate in small groups, as we have always done. That is possible. We do not need to collectively use mega-websites with billions of pages run by a handful of third parties in order to do it. The follow-on effects of millions of people communicating via these third party websites are obviously harmful.


Some excellent ideas here. Can you make an argument that they're constitutional, though? Because that would be the legal problem.

Let's take a pre-web TV show: America's Funniest Home Videos. That showed UGC to millions of people, allowing millions to propose their own (of course, the producers were the deciders, at least up to the point where the videos went on the air). So how is that show different from the websites you'd prohibit?


It was untargeted multicast with no back channel for comments/feedback. This is the diametric opposite of Facebook or Twitter which are all about soliciting and encouraging as much back channel communication as possible, and feature content heavily curated to your particular interests to make it more of a dopamine hit than your average America’s Funniest Home Video watching session.

And here’s another difference: at least in my country, this programme was something that might be shown for an hour, maybe each day or maybe each week (it’s been a while, I don’t remember which one it was). There wasn’t a 24/7 channel that would show a steady stream of such videos without a pause - in other words, the show started, and ended. Twitter and Facebook never end.


You're right about the last part, for sure.

As for comments/feedback: au contraire : A lot of those UGC shows had (or have) phone call-ins and phone voting, and piles of fan mail. The producers would gauge audience reaction and modify the selections accordingly.

But now it's much more efficient, unfortunately.


You may be interested in the upcoming Gonzalez v Google now on the Supreme Court docket. The Supreme Court has never taken a Section 230 case and here they are taking one even when there is no split in the circuits. It is likely that the interpretation of 230 by courts is going to change.

Twitter and other Big Tech websites comprised of UGC and pointers to articles on newspaper websites have grown too large to moderate. Section 230 gives them immunity from lawsuits that the TV show does not have. The TV show has producers, actual people who make editorial decisions. Big Tech has people too but they hide their decision-making in "algorithms".

Gonzalez is probably going to change what Big Tech, i.e., mega-sized websites, can get away with under Section 230 immunity.


Is there a legal problem. What's the basis for the opinion. What is the precedent.

Let's take a pre-web example: Use of the early Internet was restricted to military, academic and later other institutions. Pre-1993, advertising was generally not permitted.^1 Were those rules legal problems.

1. https://eric.ed.gov/?id=ED350986

"ICANN", a secretive non-profit coporation with some very well-compensated staff, domiciled in the US, supposedly regulates "the Internet", at least in part. But no one can tell us where ICANN's "authority" comes from. Maybe acquiescence. I don't know.

When ICANN started handing out "domain names" in the early 1990s there was a rule that certain obscene strings could not be registered. Was that a legal problem. Later the restriction was inexplicably lifted. At first the registrations were free. Then they were $100. Then they were $50. All apparently arbitrary decision-making accepted without legal challenge. Where do these rules come from. Under what authority are they made. If ICANN had some rule about how websites can use "their" domain names, would that have been a legal problem.

The Internet has all manner of "rules" and "limits". Some may be technical in nature, but some are policy-based, at least in part. And those policies may come from a variety of non-governmental sources, including mysterious ones like ICANN. How can anyone challenge these "rules".

Let's say I want an IPv4 address block, but a "regional registry" says I am not allowed to receive one. How does this registry even have any authority to set rules and tell me I cannot have a block. Who "owns" the rights to network addresses. What legal recourse do I have when I am refused.

Whatever the answer, the fact is that there is an enormous amount of cooperation and acquiescence to restrictions imposed by sui generis "authorities" that goes into creating a single "Internet". And AFAIK these registries and other organisations like ICANN have few if any "legal problems".

Obviously the rulemaking is not only limited to made up "Internet authorities".

For example, Cloudflare can "kick a website off the internet". Cloudflare makes its own rules. The website may continue to publish via Tor or some other option, but the point is that most "rules" of the Internet are not found in any legal system.

Or how about when .org was going to be sold off to a shell company. Public protest stopped the sale. Although ICANN would have us believe they stopped it. Who "owns" .org. What are the rules. Who makes them. Can they be challenged through legal process.

Websites like Facebook and Twitter need the cooperation of many parties to do what they do.

The Internet, including the few mega-websites, operates according to cooperation and compliance with self-appointed "authorities" whose "rules" are generally never subjected to legal analysis.

In the rare cases where such "rules" are legally challenged, the defendants almost invariably settle to keep the novel issues out of the courts and retain their uniquely derived "power".


Historically ICANN had authority via two mechanisms. The first is that it was appointed as the Internet Assigned Numbers Authority by the Internet Architecture Board (part of the IETF). At a hard minimum that gives it the authority to run the IETF IANA functions, and historically would have been what gave it authority to issue IP addresses and AS Numbers. In performing the IETF IANA functions it has zero regulatory role, it is just a registry. I believe it has slightly more say over policy in issuing AS and IP addresses to the regional internet registries.

The other (no longer applicable) source of authority was the contract from the US Department of Commerce. This is no longer applicable, as the US government decided it did not need to be involved with this, especially because it got a lot of criticism for being involved, but the contract really offered the government no real control over ICANN.

The place where ICANN has the most say is domain names. Here ICANN acts as a full blown policy maker, in addition to running the naming IANA functions (like creation of the root zone file). To the extent that ICANN "regulates 'the Internet'", it is restricted to its policy setting over the DNS.

It is less easy to give any current source of authority for this without the commerce contract. But it really comes down to everybody accepting ICANN as the provider of the root zone file, and the fact that ICANN would reassign the TLD delegations to a different registry if a currently assigned registry does not want to follow its rules.

-----

The regional internet registries nominally get their authority to issue IP address ranges from a delegation from the IANA (in its numbering function). As mentioned before, one can trace ICANN's authority to run the numbering portion of the IANA back to being designated as the IANA by the Internet Architecture Board (part of the IETF), but this could be a little misleading, as the "numbering community" now exists, and ICANN claims that it is that community that would be allowed to appoint a new organization to run the IANA numbering functions.

(Also ICANN has created a stand alone nonprofit to actually run the IANA functions. This is a membership based non-profit with ICANN as the sole member, making it basically like a subsidiary without legally being one, because ICANN wanted to make it very clear that the operation of the IANA functions is separate from the policy making part of the organization once the Government contract was stopped.)

------

Now perhaps you want to know how the IETF has authority? Quite simply, they are an outgrowth of, and assumed the functions of, the Network Working Group, a group of early ARPANET researchers. As the group that defines protocols like TCP, IP, HTTP, etc., they derive their authority from just that: they develop the standards that underlie the internet.


You are saying that we should prohibit large group gatherings because they are "unnatural" and "unhealthy", yet you provide not a single source or even a definition of what a large group is supposed to be. That is quite a grandiose and all-encompassing statement to leave unbacked by even a single mentioned source.


"Large group" could be as small as 200 or 300, or perhaps as low as 151.

A website operator who wants to publish pages managed and edited by 150 of his "friends" would be acceptable.

A website operator who wants to hand out pages to 100,000 people would not be acceptable.

https://journals.plos.org/plosone/article?id=10.1371/journal...

https://arxiv.org/abs/1105.5170

https://arxiv.org/abs/1011.1547

https://pdodds.w3.uvm.edu/teaching/courses/2014-01UVM-303/ou...

https://www.newyorker.com/science/maria-konnikova/social-med...


None of the links concludes that interactions with groups above the size of the Dunbar number are detrimental. None of this says that large gatherings are "unnatural", and it certainly does not provide arguments why we should not allow gatherings or interactions beyond a certain number of people involved. It states that there is a certain number we can actively and comfortably manage, but nothing about the quite wide reaching things you postulated.


Another idea is to prohibit mega-websites that comprise thousands of pages with content generated for free by distinct users from (a) using metadata generated from access to those pages or (b) serving advertising on those pages. Generally, prohibit rent-seeking by websites that delegate ongoing content generation to users on a large scale.


Isn't it already breaking down country-wise? Most poorer countries don't have the infra/resources to run these sites. So they have to latch on to the mega nets. But the more developed ones are all inching towards their own nets.


Came here to say this. There is nothing meaningful in our lives which goes completely unscathed from the cultural destruction of the attention economy.


> Being famous used to be rather difficult. Of course there were exceptions (writing To Kill a Mockingbird...

Shots fired!


Humans ... us people things, etc ... are prone to our most immediate concerns. It used to be that predators were our concern, now it's social media. We react, we make noises, we continue. The unfortunate thing is that there are no lions to take out the weak anymore.


Having spent over 10 years in a university and been a professor, the problem isn’t attention seeking behavior but a lack of accountability. For example, you can literally make up any data you want in a grant proposal and so long as it sounds right no one can or will double check it. The foundation of academia is rotting, but maybe it’s always been like this


It has not. And "a lack of accountability" is a band-aid on the real problem: bad gatekeeping. People getting into science, not for the search for truth, but in search of respectability, green card, money, or whatever else. Trying to whip them into real scientists through transparency and accountability is like trying to achieve security in your home by flinging the gates and doors wide open but slapping cameras and motion detectors everywhere. Either they win, or you get fatigued.


>People getting into science, not for the search for truth

I got into science in order to discover truth and develop new things.

Left after less than a year, got a job as a SWE then eventually started my own business selling used video games of all things.

Have never met a group of people less interested in the truth, or a system that goes to such extreme effort to actively inhibit discovery, than what I experienced in academia.

A friend who works for a private non-profit "researching" cures for cancer has reached the same conclusion there too.

Getting money from rich old people/the government takes precedence over science and it's just sad.


"Have never met a group of people less interested in the truth, or a system that goes to such extreme effort to actively inhibit discovery, than what I experienced in academia." Totally true!


Mirrors my experience. The good ones leave and the vultures remain.


Hi to any charities.


> Getting money from rich old people

that's sort of shabby treatment of a group of people who step forward as willing to give money. You're singling them out because virtually nobody else is stepping forward. But you want the money (that's called greed, btw) so I guess you'd rather have money disbursed to you... how?

the proper functioning of an economy is essentially defined as allocating resources in a way that leads to the best outcomes for the most people. Look around the world and describe a better system than the one the US has had. I agree it's headed in the wrong direction, but it's not at all clear how to fix it.


>that's sort of shabby treatment of a group of people who step forward as willing to give money. You're singling them out because virtually nobody else is stepping forward. But you want the money (that's called greed, btw) so I guess you'd rather have money disbursed to you... how?

They aren't giving money for truth, they are giving money because they want to hear their opinions repeated by someone with a PhD. Be it climate change denial, sex denial or whatever the flavor of the moment is you will find someone willing to give you money to say whatever it is they already believe.


> the real problem: bad gatekeeping. People getting into science, not for the search for truth, but in search of respectability, green card, money, or whatever else.

What possible gate could keep out people who get into science for the wrong reasons? As long as there are people willing to pay off a scientist, there will always be scientists willing to pocket the money in exchange for lies. As long as scientists are well respected, people who want to be respected will give consideration to becoming a scientist.

All we can do is open the gates and doors, keep an eye on everyone who comes in, and kick out the ones who cause problems. In fact, it's better not to be aggressive about gatekeeping when it comes to science because otherwise you risk kicking out someone who could have discovered something important. Transparency and verification make for good science anyway so it's not as if anyone would be saving money on 'cameras and motion detectors' by locking the gates and doors and only letting their friends in.

Accountability is the key here and the lack of it is not just limited to scientists. Not long ago in the US a doctor who believes that tumors and cysts are caused by the sperm of demons found her way into international headlines. She is also on record saying that medications contain the DNA of aliens. She still got to keep her medical license and she has a medical practice. There is almost no accountability at all.


I think there are very few people who are truly scientists in that they spend their lives trying to solve multiple problems. I think most people have 1 or 2 grand ideas and they execute them. I've seen "scientists" who are truly intellectually bankrupt; they just need to do certain things throughout the year to pass some checks and get paid.


> People getting into science, not for the search for truth, but in search of respectability, green card, money, or whatever else.

I doubt that. It's much easier to get all of those things elsewhere (ok, maybe not the green card, but the fact that this is not a US-only problem also discards this).

If it is an issue of systemic corruption, people are getting corrupted after they walk in, not before.


Not even a mentally sick person would get into science for a green card. I see prejudices behind your opinion.


I disagree on every point.


> lack of accountability.

I think this also applies at all levels. I'd argue that it is the big reason people feel very frustrated with reviewing, and especially in hyped areas (e.g. ML). There's plenty of incentives to reject papers (low acceptance rates mean "higher quality" publications, advantage yourself by rejecting or being overly critical of your competitors, you get no clout for reviewing and no one will get upset if you take all of 5 minutes to review), but very few incentives (I can't even name one) to accept papers. It is fairly easy to dismiss papers as not novel because we all build off the shoulders of giants and things are substantially more obvious post hoc.

Metareviewers and Area Chairs will 99/100 times stand with reviewers even if they are in the wrong and can be proven so (I had a reviewer give me a strong reject claiming I should compare to another work, which was actually our main comparator and which we compared to in 5+ tables and 5+ graphs). I can't see these issues being resolved until we all agree that there needs to be some incentive to write high quality reviews. The worst of this is that the people it hurts the most are the grad students and junior researchers (it prevents graduating and career advancement). I'm not saying we have to accept papers, but I am saying we need to ensure that we are providing high quality reviews. Rejections suck, but rejections that don't provide valuable feedback are worse.

If the publication system is a noisy signal then we need to fix it AND recognize it as such. There's been plenty of studies to show that this process is highly noisy but we're all acting like publications are all that matters.

This is all before we even talk about advantages linked to connections even in double blind reviews, collusion rings, or citation hacking. I feel we can't even get the first step right.


We all get papers rejected. Nobel prize winners also get papers rejected. And rejection is one of the very few tools we have to stop the crap from flowing in. This is not to say that all rejected papers are crap. Sorry to hear about your bad time with rejections, this is a universal thing.


I don't think this response really is getting to what I'm complaining about. Rejects suck, but often they are deserving. I'm not saying that we need to throw out the peer review system. I even specifically said I don't think we need to accept more papers. But I do think we need a mechanism to ensure that reviews are good and high quality. Especially the reject ones. If your paper is being rejected without feedback that can convey what needs to be fixed to become a good paper, then this wasn't a peer review.

I've had plenty of good reviews and plenty of bad reviews. A reject doesn't sting nearly as much when I think "maybe they have a point." But if a reviewer tells me to compare to something I'm already comparing to and is in many graphs and tables, I'm entirely unconvinced that they even read the paper and I think we can all agree that that type of reviewer is not benefiting anyone. They are the system failing.

You may disagree with the number of bad reviewers there are (and that's okay) but I highly doubt you would actually defend them. I'm just saying we need a system where we either encourage these people to change the _quality_ of their reviews or stop reviewing (I am _not_ arguing that we need to change their scores).


"I do think we need a mechanism to ensure that reviews are good and high quality. Especially the reject ones." This is a very important point, IMHO more for proposals than for articles. The latter deserving less time because time is at a premium, although I am aware that this is a most painful truth!


The biggest newspapers are writing stories with no proof or facts, and no one cares anymore.

No one cares. When you can watch 5 movies in a day and feel all sorts of emotions people used to feel only a few times a year, watch porn, play videogames, and read about anything you want, you have little time or neurochemistry left to be mad at scientists, journalists, or politicians lying to you.


Let’s not pretend that yellow journalism is a recent thing. It’s been around since before the automobile. The difference is that we have internet. So it’s easier to fact check, but it’s also easier for lies to spread more widely before being corrected.


Not only that, but the power of suggestion is corrupting by itself. “Is colejohnson66 working hard enough?” is sufficient to put a doubt in someone’s mind. I mean you probably do, but you have to defend yourself after such an utterance. Some media use this and it’s disgusting.


I have read a lot about the history of science in Britain. Even before the creation of the Royal Society, people doing science generally all knew each other and communicated extensively, so I think there was a high degree of accountability. Even in the 19th century, scientific sub-communities were small. I suspect that the problem of accountability arose in the 20th century with the expansion of institutional science.


> but maybe it’s always been like this

The older I get, the more I believe this is the truth. For most institutions we've been taught to hold in high regard.


This. I'm actually optimistic (I know, Unpopular Opinion) about the future for a bunch of reasons, but one of them is the increased transparency that we now have on our institutions.

It used to be that the media controlled how much of what went on that we saw, and the media was part of an establishment that valued the status quo and "stability" so didn't report on some things.

Now, the blinders are off and we're seeing what was always there.

For Science, this means that the institutions that were set up by a bunch of rich men who could afford to spend their time satisfying their curiosity, are now visibly falling apart. Because we can see this, we can change it to something better. The pressure to kill off the journal system is growing. The various crises that the article mentions, etc. This is all good, in the long term.


I strongly agree.

But I'm also increasingly worried that we're not very far from losing this. It already seems like social media sites like Twitter, Facebook, and Reddit are increasingly controlled by US propaganda and corporate interests. It's hard to know how far gone what we think is true freedom of speech and democratization of voice already is.


I don't know. I've been around academics for awhile now and really tried to get a good answer to this and I still don't know. If anything I don't think it's always been like this.

There's a variety of indicators suggesting that something changed from the mid-90s to about 2000 with academics (grant success rates, publications, etc). When I talk to much more senior colleagues, I also get the sense something changed even if they describe it in different terms.

Was everything better before? No, I think many things are better now, but it feels like the fundamentals of the system are worse. To me it feels like being in a house that's been remodeled with a new roof and HVAC system, but where the ground is sinking and foundation is in the process of collapsing.


While I hate to parse words, by definition, fabrication of data is not science. It's fraud.


That's just restating the problem in a single, flashy sentence.


Not really. The difference is, it's not giving credit where credit is *not* due.

We have the same problem with the misuse of the words leader and journalist. Allowing violators to claim a respected title *that they don't deserve* only enables the problem.

Put another way: if your pet barked, would you still call it a cat?

The point is, a scientist does science. Fraud is not science. If you do fraud you no longer deserve the right to be called a scientist.

Words matter.


As far as I can tell, there's way more accountability in science now, just like there was more accountability in a Soviet factory than an American one in the Cold War. It just doesn't always achieve its intended results.


You have any data/research to support your experience?


What most people fail to realize is that fundamentally, most principal investigators, the people who actually run the research world, are primarily fundraisers. Their day-to-day job is a mix of grant and proposal writing, relationship building/organizational meetings, and checking in on their postdocs, candidates, and lab techs.

Your main goal, as a PI, is to keep your lab running, and thus research flowing, by any means possible. For some PIs, this means milking every drop of available talent and time out of their doctoral candidates, which is the most common cause of the horror stories you hear about people leaving academia. On the other end of the extreme, they can embed themselves so deeply in fundraising with private or public capital that their lab staff don't see them for more than 15-30 minutes a week, because they are essentially living their lives hopping from sales meeting to sales meeting.

This wouldn't be a problem if the job of the PI was explicitly meant to be that of a salesman, but the actual role of a PI is to define the research being done. They draft the hypotheses, the expected impact, etc, because that is their intended role, but in reality these will always be constructed in a way that makes it easier for the PI to solicit funding.

It's impossible for the attention economy to not play into the research funding loop then, because every set of eyeballs is another potential revenue source for future research, or a tool to justify growing the footprint of your lab. I wouldn't go so far as to call the superstructure corrupting of science though, not in those words. I'd say it forces science to be mission focused, where the mission is a subtle negotiation between the people funding the research and the people performing it, and often times the person with the capital lands much closer to their ideal.


Just speaking from personal experience, what you describe gets murky really fast in a couple of ways.

The model you're describing implicitly is this idealistic "executive-mentor" model of the PI, who has the ideas and the postdocs or doctoral students are just implementing it. Basically scientist wants to produce, so needs help and outsources the work to others.

In my experience, though, this is not at all what happens many times. Ideas come from those doctoral students or postdocs or whoever, the PI takes them, and then they get credit for those ideas. I've seen PIs who really don't fundamentally understand the research areas, who are kind of just "black holes" for the credit of the ideas and work of others around them, and then because they're more senior, they end up getting the credit. Sometimes this process seems intentional, in that the PI cultivates a false impression of what's going on, and sometimes it just happens because of the nature of the attention economy.

So although the "executive-mentor" model is a good one, what's closer to reality in many cases (although not all) is more of a "public liaison-mascot" system, or some kind of hybrid.

Because of this mismatch between reality and the assumed schema, the attention economy then incentivizes abuse and corruption.

This isn't even getting into issues about how chasing grants as a fundamental scientific endeavor distorts what is researched. Even if you have a pure executive-mentor PI who is just trying to get their own independent ideas researched with funding, you then have to ask: what is rewarded? Is it good, rigorous science, or what is popular?

The problem I think is that what garners attention is not what is rigorous or innovative. Sometimes those things overlap, and maybe they're correlated, but they're not the same.

Maybe this isn't unique to science, but that doesn't make it okay, and it seems like changing it to prevent these problems is necessary.


Attention is not the problem; it's the lack of accountability. Social platforms care about engagement, not quality of content (there's virtually no mechanism to ensure content meets any standard of quality other than what can be measured in the moment).


Quality is subjective, but there’s no accountability about harmful or illegal content either, so platforms don’t only promote “general purpose” spam, but actively harmful content that intentionally seeds outrage or encourages violence as that generally leads to more engagement.


Absolutely, it should include a range of indicators like spam, scams, known-falsehoods and unsubstantiated claims.


Quality is subjective? Is it information or a product? Or is all information now a product?


Why go on such a long tangent, when you could make a solid case about the legacy grant distribution system[1] corrupting science for decades? It is as close to funding and career success as it gets.

https://newscience.org/nih/


Sometimes I am left to wonder about the widespread criticism of science. Slow progress, broken publishing and career progression, terrible working conditions, prestige farming, poor mental health and exploitation of students, fake data, statistical war crimes, bullying, sexism, racism, elitism, harassment...

The question is, does any of this matter on a historical scale? Is Science doomed to fail? In 200 years, our descendants will probably look back at us with the same mix of condescension and slightly horrified fascination with which we view our 19th century counterparts. Our stupid scientific publishing system will be viewed similarly to the plumbing in London in the 1850s. The chimney sweep and the graduate student suffer similar plights. It is terrible, immoral, we should do better, but then this is always the case.

On one level, our future descendants should be grateful - we eradicated smallpox (that alone would be enough, really, an unprecedented gift to all future humanity, a boundless alleviation of suffering), discovered antibiotics, greatly reduced child mortality, invented quantum mechanics and relativity, drastically increased our computational capacity, left the planet for the first time, and connected almost everyone in the world. All of this under conditions considerably less ideal than they are today, despite still being far from desirable. Maybe that's all that matters?

Science being Science will sort itself out most likely. We should mainly try to reduce the human suffering involved, and allow greater diversity in how and where research is done.


My main concern is that perhaps there is an "attention threshold" on fundamental breakthrough results in physics and mathematics, and that we're diminishing the chances of ever getting such results again because theorists don't have enough attention to spare. If you spend all day writing grant proposals and trivial papers (just for the sake of publishing), then go home and veg out on Netflix and Twitter, when are you supposed to have that quality "sitting and thinking" time?


That society is moving away from the concept of a calling or vocation is also a problem. As a scientist (or physician or nurse or whatever) you are no longer rewarded for devoting your life to such things. You will not be supported (you may even be shamed for your enthusiasm, or for failing in the areas of life that you are sacrificing), no one will watch your back, the administration will blame you for whatever they can, and you will not be paid overtime, of course.


"Science being Science will sort itself out most likely" defies logic of current events.


Most of the problems are with the Academies. They are of course deeply flawed institutions like all human institutions, and generally overrated. But Science isn't merely the Academies.


To be honest, I've been working at a university for 10 years and I still don't understand how Academies work.


" Scientists list media exposure counts on résumés, and many PhD theses now include the number of times a candidate’s work has appeared in the popular science press." This is a mandatory requirement to a EB1A green card. Maybe the government can do something from its side to reduce the fluff.


The problem is, any system that replaces it, and has some measurable/countable metric, can effectively be gamed in the same manner.


"The attention a scientist’s work gains from the public now plays into its perceived value. Scientists list media exposure counts on résumés, and many PhD theses now include the number of times a candidate’s work has appeared in the popular science press. Science has succumbed to the attention economy."

Sitting on a tenure and promotion committee at an R1 university, I can say this type of stuff is just as likely to torpedo you as it is to boost you.


I agree. I've almost never seen this, and I read a lot of scientific vitae. (The one time I did see it, it was because a major documentary by a major scientific org had been made from the results.) I think they mostly made this observation up. (There's no citation.) That calls the whole thing into question, and it sounded pretty simplistic to begin with.


There are only two places I call it out in my CV:

1) A pandemic-related paper, where press coverage was a big deal, because it was meant to influence policy decisions on a very short time frame.

2) One paper where we are in the 99th percentile for media and social media mentions for a very good journal, and we call that out specifically.

But as a general rule? You're far more likely to trigger "Is that what you're spending your time on?" and "Not serious about scholarship" than you are to find someone impressed with your altmetric score. Indeed, I'm worried about several of my colleagues who are leaning hard in that direction (being the Social Media Editor of a journal, etc.)


The question is when? Surely, with the passage of time, it's more likely scams will be revealed. But if the professor has already retired, the damage and lost effort are irreversible.


> "Science and scientists are part of society. Neither sit on a lofty perch that makes them impervious to societal shifts."

Ironically, this is not the general public's perception. Even more ironic is the fact that celebrity "scientists" like Bill Nye and Neil Degrasse Tyson are fond of pushing the "science is better than the rest of us" narrative.


I disagree with the premise of the article; interconnectedness is a great benefit to science. Yes, there are inequities, but that also means more collaboration and more scrutiny. Science is deeply corrupt, but for other reasons, such as a lot of money in certain areas of interest, pressure from governments, and an ivory tower mentality.


It's nice that this has finally come to the public's awareness. For a few years, I was afraid that the attention economy would be an 'invisible hand' which would have a significant impact on everything but which nobody would notice.

People used to be uncomfortable discussing 'attention' or 'the media' - I suspect because it was too abstract or not relevant enough to them. Now that many people are struggling to get any attention for their work, attention and the media seem more relevant.

When someone publishes the best work of their career and it receives less recognition than their early work, it sometimes makes them wonder what has changed.


Well, yes, but what else can we do? Certainly we shouldn't give out Nobel prizes on the number of like buttons clicked on TikTok, but at some point the most influential science is the science that influences the most people. Sure, it's possible that someone has written a great paper that will be super influential in three or four hundred years, but we have no way to measure or accurately predict that. So we're stuck with the citation counts and the votes for best paper at the conferences. It's all we've got.


It suggests that science is an artifact of attention. That bears study. Maybe get a nice paper out of it.


if a scientist writes a paper in a forest and nobody's around to retweet the headline, does it even exist at all?


Don't forget to tweet about it with paper alert emojis! (U+1F6A8)


> Twenty-five years ago, it was projected that, in an ever-more interconnected world, money would no longer be the prime currency, attention would be.

While it does make for a good opening line, using a single prediction from 25 years ago that arguably worked out to some extent (probably not everyone prefers likes over cash), without giving any context on how many similar predictions were made at the time, just feels a bit odd.


Money is still the prime currency. Attention is just a proxy for it.

The problem is that, now more than there ever has been, there is a direct link between attention and money - in particular advertising - on a scale that is measured down to micro-engagements. To maximise profit you need to do whatever you can do legally to get your ad revenue (or whatever).

Calling it the "attention economy" gives the impression that the problem is the consumer demanding an endless sea of changing content, whereas actually there's a profit motive to generate a lot of content in the cheapest possible way, to push it on people as much as possible, and the quality of that content only matters if you care about retention.


The "economy" corrupts science too.


> “get your science the attention it deserves.” (On Google, that search term garners nearly 500 million hits.)

So 500 million pages have one or more words similar to those. That's pretty flimsy evidence. Putting quotes around it I get 5 results (not 5 million, 5), one of which is this article. I don't doubt the overall conclusions of the article, but I do wonder how well supported some of it is.


Having multiple family members working actively as scientists and academics, I've been pretty blackpilled about what "science" actually is for the most part.

Of course there are heaps of interesting papers and progress out there, but at least 90% of money and time seems to be spent on politics, careerism, and working actively for some disproportionately funded but "profitable" niche.

It's about getting ahead in the game, "earning money for investors", or furthering some industry-astroturfed cause. Also, a lot of PhDs use them to grift like cheap salesmen these days, unfortunately.

Probably has something to do with the corporate incentive structures that have emerged.


I'm curious if you think there is more of this in science than in other places. Or is it that we want to think of science and academia as better than/more idealistic?

This comment is similar to comments I've heard about nonprofits, govt work, and quite a few large businesses. For a long time I thought nonprofits were generally good; in college I learned more about what nonprofit means and how that kneejerk reaction of mine could provide cover for a huge range of behaviors.

"money and time seems to be spent on politics, careerism and working actively for some disproportionally funded but "profitable" niche"


Good question. No, I don't think it's much different in other sectors.

And the comparison to NGOs is a good one, because the same realisation can be made about big parts of that world - i.e., lots of them exist to further geopolitical or corporate ends or niches, or to keep upper-middle-class do-gooders busy instead of focusing on political or systemic issues. (I have lots of respect for a lot of people in that world still.)

I guess that leads me to the actual problem: their image as somehow pure or isolated from the rest of societal power dynamics.

Everywhere it's about resources, realpolitik, and (media) power, but the internet is filled with people blindly believing in "science", "NGOs", or people with diplomas, for that matter.


> I'm curious if you think there is more of this in science than in other places. Or is it that we want to think of science and academia as better than/more idealistic?

As it is, Science has developed a rather substantial reputation (lots of marketing during the COVID phenomenon), and has a large and hyper-confident fan base spreading The Good Word on social media.

I often wonder why Science seems unable / uninterested in addressing this non-ideal state of affairs, considering its substantial intellectual power (a fact I am regularly educated on by my superiors).


Science and technology provide the most important examples of surrogate activities. Some scientists claim that they are motivated by “curiosity” or by a desire to “benefit humanity.” But it is easy to see that neither of these can be the principal motive of most scientists. As for “curiosity,” that notion is simply absurd. Most scientists work on highly specialized problems that are not the object of any normal curiosity. For example, is an astronomer, a mathematician or an entomologist curious about the properties of isopropyltrimethylmethane? Of course not. Only a chemist is curious about such a thing, and he is curious about it only because chemistry is his surrogate activity. Is the chemist curious about the appropriate classification of a new species of beetle? No. That question is of interest only to the entomologist, and he is interested in it only because entomology is his surrogate activity. If the chemist and the entomologist had to exert themselves seriously to obtain the physical necessities, and if that effort exercised their abilities in an interesting way but in some nonscientific pursuit, then they wouldn’t give a damn about isopropyltrimethylmethane or the classification of beetles. Suppose that lack of funds for postgraduate education had led the chemist to become an insurance broker instead of a chemist. In that case he would have been very interested in insurance matters but would have cared nothing about isopropyltrimethylmethane. In any case it is not normal to put into the satisfaction of mere curiosity the amount of time and effort that scientists put into their work. The “curiosity” explanation for the scientists’ motive just doesn’t stand up.

The “benefit of humanity” explanation doesn’t work any better. Some scientific work has no conceivable relation to the welfare of the human race—most of archaeology or comparative linguistics for example. Some other areas of science present obviously dangerous possibilities. Yet scientists in these areas are just as enthusiastic about their work as those who develop vaccines or study air pollution. Consider the case of Dr. Edward Teller, who had an obvious emotional involvement in promoting nuclear power plants. Did this involvement stem from a desire to benefit humanity? If so, then why didn’t Dr. Teller get emotional about other “humanitarian” causes? If he was such a humanitarian then why did he help to develop the H- bomb? As with many other scientific achievements, it is very much open to question whether nuclear power plants actually do benefit humanity. Does the cheap electricity outweigh the accumulating waste and the risk of accidents? Dr. Teller saw only one side of the question. Clearly his emotional involvement with nuclear power arose not from a desire to “benefit humanity” but from a personal fulfillment he got from his work and from seeing it put to practical use.

The same is true of scientists generally. With possible rare exceptions, their motive is neither curiosity nor a desire to benefit humanity but the need to go through the power process: to have a goal (a scientific problem to solve), to make an effort (research) and to attain the goal (solution of the problem.) Science is a surrogate activity because scientists work mainly for the fulfillment they get out of the work itself.


I agree with some of the problems listed (over-hyping minor results), though personally I think the link to the attention economy feels a bit contrived. There are much greater forces leading to these problems - notably the emphasis on metrics for science work (as mentioned) and politicization. This didn't convince me the attention economy lens adds anything.


While this article feels true, I wish it provided more of its own researched evidence to prove the point. I went through several links and possibly the most relevant ones are "Why are medical journals full of fashionable nonsense?"[1] and the book "Science Fictions"[2].

Most of the other links talk about tangentially related topics. The subheading "How the attention economy corrupts science", which should contain the meat, actually has little to no research of its own that can convince me of the title (unless I'm willing to read the book "Science Fictions"). I read the article "Why are medical journals full of fashionable nonsense?" and found it to have a similar vibe, although it had more concrete evidence. Still, the need for more than a few examples is something I feel is fair to call for. Basically, I don't find it to fully support this original article about the "attention economy" corrupting science.

Overall, I think this is quite an ironic state. The article seems to hold on to what feels like an idea that is socially popular, something we all suspect, and it presents it as true with evidence that is either vague or indirect (look, this book exists on the topic, therefore it is true). The article fails to clearly draw the differences between "working hard to get attention to one's science" vs "the act of getting attention is corrupting science". I'm overall unconvinced that this article really does anything much to support its premise.

[1] - https://bigthink.com/health/medical-journals-fashionable-non...

[2] - https://us.macmillan.com/books/9781250222688/sciencefictions


The article being mediocrely researched (and click-baity) is the ultimate evidence of what it is trying to prove.


“What information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention, and a need to allocate that attention efficiently among the overabundance of information sources that might consume it.” - Herbert Simon


Reminds me of this Science article in 2017:

http://science.sciencemag.org/content/357/6354/880.2#1504269...

on why scientists need social media influencers …


Prove how much you agree with this by not discussing it on hacker news.


Yes. The "prestigious journals" aren't helping. Nature used to mostly cover biology, and had reviewers who understood that. Now, "Nature" is a brand covering a wide range of topics, badly.

Then there are university PR departments, who hype every little lab result into a world-changing breakthrough. This is particularly bad in energy-related areas. (I want some publication to reprint, each month, battery PR from 1, 5, and 10 years ago.)


Not even Nature dares to launch a journal about economics.


I for one am super happy that I now have access to world class talks, documentaries, and instructional videos on YouTube. Don’t complain about bad content. Instead choose to only watch the quality stuff.


At the point where more than 50% of results are not reproduced and/or reproducible, is science really science?


We have changed the environment that we live in. And in doing so, we created our own Achilles' heel. Possibly the future (or non-future) of civilisation will be the outcome of a natural selection of the fittest, of the culture that deals with this in the best way.


> Revenue from subscriptions and from authors, who pay to be published, are no longer the only profit sources.

It's not obvious to me what this means. What are the other profit sources for scientific journals, and why do they depend on things like social media virality?


The problem with science is that there are too few discoveries for too many scientists. This caused the reproducibility crisis. This caused grad student overwork and abuse. It also means working scientists have had to turn to the "attention economy" to save themselves.

We forget that this notion that anyone could (and everyone should) be a scientist is a really new and (ironically) quite untested idea. Well, this error mode is called a "glut", and it sucks for everyone involved. Maybe science is better off being either a hobby for rich weirdos (and/or the savants they patronize) or a game that big business keeps paying to play to secure that all-important new intellectual property. Whoever decided you could make a science factory that produces discoveries was an idiot, and now we have proof.


So what can we do? As always, nothing; let's throw our hands up in the air and pretend "it is what it is". Meanwhile, celebrity scientists keep using funds to enrich themselves.


The author recommends greater consumer awareness and voting with our clicks (identify the better content and reward only that).


The problem is integral to science and scientists; I don't think the audience is to blame.


The article is very insightful and explains a lot about why you get a growing number of useless, inflated-headline arXiv papers trying to gather views from Twitter or LinkedIn.


I find these arguments kinda funny. Women have known for a long time that attention is a huge catalyst for steering their partners' perceptions. Their partners have been freely giving attention to women all their lives. Most of them didn't realize it was a problem then. It is highly unlikely that they will realize it's a problem now. After all, we give it freely; nobody has a gun to your head forcing you to pay attention to your girlfriend all day or watch shitty content.


Should we instead call it the "status economy"?


Absolute tautology corrupts absolutely.


That's just the eddys in the attention-flow talking.


Isn't this also related to how the vaccines-cause-autism conversation started? The study involved only a handful of subjects (a few of whom were very unqualified), and then a big important journal (The Lancet, IIRC) picked it up for the novelty.

The article mentions the attention economy (media, TikTok, etc.) playing a role before "community assessment." But it's not like scientists don't also gravitate towards the new shiny thing in their own ways.


Yeah. Andrew Wakefield was stripped of his medical license in 2010 for publishing fraudulent research and it was later discovered that he was paid to discredit the MMR vaccine.[0]

And yet, ~10% of Americans still believe the study. [1]

[0] https://briandeer.com/mmr/lancet-summary.htm

[1] https://news.gallup.com/poll/276929/fewer-continue-vaccines-...


I’m frankly shocked it’s only 10%


In the linked study, 10% said they believed vaccines caused autism, and while they were not asked whether they accepted Wakefield’s claims, presumably just about all who did were included within that group. 45% said they were unsure, leaving open the important question of which way they would go when faced with a choice.


in all fairness, where is columbia now?


Science was already a badly governed and "corrupted" arena. This just adds more trash to disentangle/deal with.


This boils down to a fundamental question: why do we spend any time doing science to begin with? Historically, scientists were drawn to the field in order to improve human understanding of our reality. These individuals often died quite poor and unknown, but advanced us forward. Now the goal is popular science and huge money grants. The goal is no longer the pursuit of knowledge; it's a money game. Like journalism. The only useful science done at the moment is at tech companies who will use it to build better products.


> The goal is no longer the pursuit of knowledge, it's a money game.

I don't know about that. All the PhDs I know are dirt-poor (or were until they left science to get tech jobs), and are in the game because they are passionate about science and the project of advancing human knowledge.

It's true that your ability to get a tenure-track position is very dependent on your ability to successfully obtain grant money, but most of the scientists I know view that as a necessary evil, not the game in itself.

> The only useful science done at the moment is at tech companies who will use it to build better products.

This is trivially demonstrably false. https://en.wikipedia.org/wiki/James_Webb_Space_Telescope for the first example that came to mind from recent news.


I definitely exaggerated, but a great example of the opposite case is CERN. Physicists knew the limits of what could and couldn't be tested with it, but they hyped it up to convince governments to spend money anyway. The JWST is a much more fruitful project, but how many of those are there compared to projects that just focus on getting grant money? We should build more telescopes and fewer particle accelerators, but the grant money doesn't match the need, since it's hard for politicians to understand.


It is true in the field of computer science, at least, where the market has destroyed the academic sphere in terms of innovation by building better products.


As a small software vendor, this doesn't seem accurate to me. My whole field reads the papers, but very often the scientists are ahead. It's their job to discover new territory, while we also have to deal with more mundane "product" tasks. Industry will do the "fine-tuning" but not really the innovation, or just small innovations. So from a consumer point of view, it might seem that industry people are ahead with their product output, but the new science is light-years ahead of that.


Diego Ongaro wrote the Raft paper as his PhD dissertation at Stanford, so this is also false.

I’m sure the original claim can be narrowed even further than you just did to make it true; I don’t deny that there is a trend for more research, particularly CS and biotech, to be done privately. It’s hyperbole to claim academia is doing nothing though, even in CS.


Pretty sure the money a scientist can personally earn with grants is far less than they can at a tech company.


This is absolutely true.

I could leave for industry tomorrow and likely double my salary.

The money I've personally earned from grants is... $0. And I've been very successful in getting grants.

I only got a job offer at one university where the PI of a grant directly got a monetary benefit from it, and while it was nice, it was never going to be more than "That's a nice little bonus" money.

If you want to make money as a scientist in academia, consulting or a startup is where it's at.


> Historically scientists were drawn to the field in order to improve human understanding of our reality.

I doubt that in the general case. People have always been driven by base selfish motives.


True, but such selfish motives can be generic (e.g. personal glory) - and if directed into scientific endeavours can still result in genuinely passionate enquiry.



