There is far more content than there used to be but no more hours in the day. Our filters must reject more.
Yes, there are costs -- deep work & study both suffer -- but there are benefits too: informational content that can be compressed does get compressed. An introduction to a concrete skill that would at one time have been padded out to fit an hour-long movie or lecture might become a 30-minute youtube video and then a 30-second tiktok, by which point it has become a snap-cut between the critical actions and a finger-wagging rundown of pitfalls. You can look it up, watch it multiple times until it's committed to memory, and you don't have to spend hours torturing yourself with irrelevant tangents and nonsense. This is an astonishingly compact form of communication and it's beautiful to see.
The flip side is that when people get used to consuming content in 30 second blipverts, they become unable to maintain attention through a 10 second break in the action.
I don't know for sure about causation, but the students I see incessantly consuming tiktok completely lose state and working context in a very short time. It's a very strong correlation.
(And, I disagree a bit with your premise: for those of us who have become literate at skimming directions, the 30 second tiktok is still slower and more context-switch heavy than we're accustomed to... also, the risk that the tiktok is just quickly presented snap-edited bullshit that we don't have time to adequately question is high).
Developing some skills requires focus and careful study. We're robbing youth of the patience needed to conquer these skills.
> The flip side is that when people get used to consuming content in 30 second blipverts, they become unable to maintain attention through a 10 second break in the action.
I see this written so frequently. Are there any studies to back up this claim? Please forgive me: normally, I abhor the "citation please" type of response, but this claim seems misleading to me. It just sounds like a grumpy old person complaining about the speed of the world and young(er) people.
Example: I tried Googling for "does consuming short content make it harder to focus on longer content?". None of the content is scientific research, just a bunch of blowhards writing "it's never been worse" blog posts.
You can try it yourself. Use the most short-form content media - one of HN, reddit, or tiktok - for 8 hours per day for one week. I can guarantee you won’t be able to concentrate on anything after that one week.
How anyone could read that article, even just skim it, and come out thinking that it was presenting evidence that this is a well-researched topic is beyond me. The entire point of the article is that they couldn't find studies on the topic...
I doubt there are controlled studies, but you can probably make a reasonable hypothesis based on viewing habits at different ages. Old people in 1993 watched Bonanza reruns; in 2023 they’re hooked to the constant crisis of cable news.
Anecdotally, I’ve definitely seen a shift in corporate comms as people gravitate to IM and text as opposed to email, driven both by habit and by avoiding accountability as email audit has become common.
Google is a terrible tool to find scientific research. Rather, use Google Scholar (https://scholar.google.com) or some of the fancy new AI-assisted literature survey tools such as Elicit (https://elicit.org). I used both to find these results:
- Gen Z and Millennials in the Workplace: How are Leaders Adapting to their Short Attention Span and How Will they Keep them from Leaving a Qualitative Study (https://digitalcommons.fiu.edu/etd/4800/)
- Short-term mindfulness intervention reduces the negative attentional effects associated with heavy media multitasking (https://doi.org/10.1038/srep24542)
- Caught in the Loop: The Effects of The Addictive Nature Of Short-form videos On Users’ Perceived Attention Span And Mood (http://essay.utwente.nl/95551/)
Disclaimer: this is not my field, and I only spent 15 minutes on this, so these may not be the best articles around. On the other hand, the post seems to mention quite a lot of research on this topic, so you could read that, too. To read more, check articles that cite, or were cited by, these. Also, for obvious reasons, there do not seem to be many studies available on long-term effects.
Well, I can't give you a neurological point of view on it, but if you train an LLM to be great at handling a context size of 512 tokens, then when you ask it to do a task which requires a greater context length (or in this context, attention span), the responses become incoherent, scattered, and perplexing as hell.
Which I imagine to be what would happen to a human if they trained their brain to process information of a certain context length exclusively.
So yeah, perhaps your brain becomes amazing at processing succinct information that's been engineered to be short, to the point and has none of the subtext or subtlety that you could include in a larger context.
That being said, perhaps that's exactly the kind of mental capacity you need for high context switching professions, I'm not sure which those are yet, but I'm sure there has to be someone in the world that has a use for that.
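To make the analogy concrete, here's a toy sketch (purely illustrative; the 512-token window and the GPT-2 tokenizer are stand-ins, not a claim about any particular model):

    # Hypothetical illustration: a model trained with a 512-token window
    # never sees anything beyond it. Older tokens are simply cut off
    # before inference - the mechanical equivalent of losing working
    # context mid-task.
    from transformers import AutoTokenizer

    MAX_CONTEXT = 512  # the only context length the model was trained on
    tok = AutoTokenizer.from_pretrained("gpt2")  # stand-in tokenizer

    def prepare_input(task_text: str) -> list[int]:
        ids = tok(task_text)["input_ids"]
        # Keep only the most recent MAX_CONTEXT tokens; everything
        # earlier is silently discarded.
        return ids[-MAX_CONTEXT:]

Whatever fell outside the window never reaches the model at all, no matter how relevant it was.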
TL;DR: if we keep our child's direct environment clean and limit distractions, she's calm and focused. If we don't, she gets whiny and somewhat unbearable.
I'm raising a child right now and it's easy to observe a few simple patterns:
1. If there is stuff in her visible range during any activity that is even mildly interesting, she wants it, and it distracts from the activity, be it her own scooter or her own toys lying around. Denying her access to it makes her upset.
2. If there is any food in her visible range while eating dinner - cookies, fruit, anything - she wants it, and it makes feeding harder. Denying access to it makes her upset and she screams.
3. If there is a screen of any sort or a bright light on, she will stare at it. Denying her access to it makes her upset unless I turn it off and put my devices away.
4. We don't have a TV, so she enjoys books a lot, and we have improved our eating habits, which makes it A LOT easier to feed her the same healthy food.
5. Grandparents think they should bring a bunch of gifts every time they visit, meaning our place is filled with junk and infinite amounts of clothes, and they get upset when they don't get to give gifts, much like the kids who receive them.
What I'm trying to say is that even taking away all the digital distractions, a lot of these simple things are things I and many people around me did not have in abundance as children.
It's not just because of the digital stuff that attention spans decline. It's because EVERYTHING is noisy. Even children's books and toys are flashy and noisy.
It's not like you wake up one day and have attention or not, it's something that is learned over long periods of time.
How is anyone supposed to grow up learning focus with all that junk around? I think a lot of these things are designed to grab attention from children. The toy, book, and even children's clothing manufacturers compete with each other on attention, ultimately, I would argue, with the goal of making the biggest profits.
A "neat" thing is that this exact same behavior is present in adults. Numerous studies have demonstrated a negative impact on focus (and other factors) when a person's phone is in their vicinity, even if it's completely inactive. I'd hypothesize it's our bodies becoming somewhat acclimated to the little micro-dopamine bursts you're going to get each time you get a bzzt, brrt, beep, or other sort of "engagement" (read: addiction) optimized notifier letting you know that something has happened. Even just seeing your phone, your body just starts anticipating it, and waiting for its next hit - in the same way your mouth may begin to salivate when looking at a delicious plate of food.
I suspect we'll look back at this era we're living in, the same way we might look back at the 19th century. In the 19th century cocaine started showing up in just about everything. It was used in medicinal tonics, casual drinks like Coca Cola (cocaine + kola nut), and more. People would use it recreationally, employers would give it to their workers, inventors and others (among them, Thomas Edison) would use it to improve their productivity, and more. Given the dramatic side effects of cocaine, people certainly realized there was a problem - but inertia is one hell of a beast.
Small kids struggle with focus, and here it's a good thing: they work to expand their environment and explore.
And, yes, you're totally right: too many toys and other things seem to exist to exploit this in the name of keeping kids quiet for a couple of minutes and extracting money from guardians. Not to mention the excessive use of tablets and screen time for kids under 5.
But I'm talking about something somewhat different: I know quite a few 17-19 year olds that genuinely want to absorb information, but literally cannot hold focus through a 15 second interruption. This is way more prevalent now than a few years ago, and it seems to be the students who are obsessed with short-form video that struggle the most.
I think all the things you're mentioning are relevant, but the big thing is that with mobile devices and short-form content we've created an environment where distraction never lets up. Most of us had to learn to be bored during some of our summer and sit with the feeling: that opportunity doesn't really exist now for youth.
Something I've learned from raising children is that the behaviour patterns children follow are ultimately the same behaviour patterns us adults follow too. We just like to pretend that we know better or have overcome the childish instincts, but in many cases we have not. We act exactly the same way and then come up with some bullshit excuse for it after the fact that makes it seem like we are not exactly the same species of monkeys acting on instinct like our children are.
Regarding your point 5. Gifts are a slow form of cluttering that snuck up on me. Children have multiple occasions per year where they are expected to get gifts, often more than one from more than one individual, which leads to sometimes dozens of gifts per child per year, most of which they rapidly lose interest in, added to an ever growing pile. It is rude to throw away gifts once received, and donating comes with its own caveats and effort. Unless the gifts pile is actively and continuously curated it grows over time and clutters the house. By the time I realized what was happening, we had already lost the fight and old toys were everywhere.
We recently moved and used the occasion to declutter. We flipped the script and took only what we wanted: just the toys the kids still wanted, just the books worth keeping, just the clothes worth wearing, and so on. I would estimate we kept less than a quarter of the stuff in our old house. All that stuff clutters not just the house, but the mind as well.
I think we are on the right side of the bell curve with our kids' discipline (healthy eating, little screen time), but the picture you paint sounds miserable.
A sparse environment, no TV, simple toys, denying gifts from grandparents isn’t something noble.
Of course don’t let them eat cookies before dinner and sit on iPads all day, but there has to be some balance for everyone’s enjoyment and sanity.
> A sparse environment, no TV, simple toys, denying gifts from grandparents isn’t something noble.
Providing an environment conducive to personal growth IS noble. Of course it's not only a matter of removing damaging attention sinks, but also substituting those with more healthy alternatives, like books, activities, education, quality time with family, etc.
> when people get used to consuming content in 30 second blipverts, they become unable to maintain attention through a 10 second break in the action.
I keep hearing this but is there actual evidence? My anecdata is that I can watch tiktoks and read programming books all day without one impacting the other. I honestly have trouble believing that our attention mechanism is so flawed it can be broken so easily.
I think the more likely explanation is that consumed content is just more efficient these days. In other words, it's not our attention span that's changing but our data culture. I think that's a good thing too.
Well, we're discussing a post that looks at a bunch of moderate-quality evidence in this area. Unfortunately, no one before 2000 had the foresight to realize that attention span measures would be very important, so we lack high-quality controlled evidence from back then.
So we have some moderate quality measures that say that attention span has become lower over time. And we have higher quality measures that show that low attention span is correlated to consuming short-form video. For example, https://www.mdpi.com/1660-4601/18/16/8820
> our attention mechanism is so flawed it can be broken so easily.
In my opinion, our attention mechanism is very weak compared to the demands of academic study and modern knowledge work.
> In my opinion, our attention mechanism is very weak compared to the demands of academic study and modern knowledge work.
Or it could be that academic study and modern knowledge work are just severely outdated in their data delivery methods compared to the contemporary techniques used in tools that "cause attention span issues" like Tiktok.
Maybe the UX of a traditional science paper has to be reviewed instead of trying to fault the end user for not torturing themselves trying to ingest something that has UX from half a century ago.
To me the issue seems pretty clear. We got new information delivery methods that are significantly better, and when going back to old methods we naturally get dissatisfied. Does that mean we are getting "dumber" as the "attention deficit" memes imply, or that certain fields are just failing to catch up?
We can simply accept that ingesting old data types will be more difficult for the new generations, or update them to match new expectations. Either way this sounds like a whole lot of nothing for most of us.
> Or it could be that academic study and modern knowledge work are just severely outdated in their data delivery methods compared to the contemporary techniques used in tools that "cause attention span issues" like Tiktok.
Some things take time. You don't perfect a painting technique, explore a family of variations in a musical theme, analyze a complicated social issue, or solve non-trivial equations in a 25 second slice.
The students that I'm talking about-- they can ask their peer a question that they're interested in learning the answer to, and then have their attention wander and lose state in the time it takes their friend to finish chewing. This used to be pretty rare; now it's distressingly common.
I'm all for multi-modality and different ways to present information. But most people need to develop the skills to show up and think deeply for >20 minute spans.
That's a very good point. I guess the real new problem here is learning to identify and manage different types of discussions and information exchange formats, which is a challenging but totally solvable issue imo once people start working on it instead of pointing fingers and fear-mongering.
> We got new information delivery methods that are significantly better, and when going back to old methods we naturally get dissatisfied. Does that mean we are getting "dumber" as the "attention deficit" memes imply, or that certain fields are just failing to catch up?
The new delivery methods are not significantly better at delivering knowledge, only at diverting attention. They are engineered for that outcome, not for information retention after a significant amount of time, or even for comprehension of that information. So yes, as a result, society is getting dumber, because our intellectual resources are rerouted to futile bits of nothingness.
Hard disagree with you. New methods are objectively better. One obvious illustration is that online books/websites are better than paper books at information delivery and teaching people in general. The "society is degrading" meme is as old as time itself and frankly it's getting a bit boring.
What kind of knowledge are you all getting from Tiktok? Technical knowledge? Philosophical? I thought the app was for lip syncing + lazy cheerleading and the occasional Chinese data mining. But I am perfectly happy hopping on the bandwagon if it has substance!
I just use all the various sites as tools, instead of having loyalty anywhere. Search for something you're interested in and see what shows up. YouTube is definitely no longer the central repository of everything that it once was. A good way to demonstrate the average difference between YouTube and TikTok is to show the same video from the same guy, but optimized for different platforms. This is a video on deadlift form (in weight training):
No idea what the deal is, but I'm guessing the YouTube algorithm is optimized for longer form videos so a lot of stuff ends up with just a lot of filler. The TikTok video is everything in the YouTube video, but with all the fluff removed. I've also used TikTok for foreign language lessons with excellent results, largely for the same reason. The videos tend to have a lot better information density than what YouTube optimizes for.
----
Also, just watching that YouTube version again. Perhaps one of the best things is no more "If you like this video be sure to like, subscribe, and comment below. It really helps the channel out." An algorithm that relies on such a stupid, gameable, opt-in metric is always going to be inherently dysfunctional.
The “meta game” of optimal length on YouTube has been shifting towards longer videos for quite a while - one of the more important metrics towards monetising is “watch hours”. So you get a lot of filler.
YouTube also has shorts, which it measures somewhat differently (as views as opposed to watch hours).
Great links! Thank you for this comparison. I am a huge fan of spreading content across platforms so as to avoid a monolith. I will be looking into this Tiktok. And I truly cannot stand the "smash the like" whatever every. single. damn. video.
It's a totally underrated platform imo - there are a lot of high quality creators!
In particular I'm following UX (@designertom), CX, and design channels as I'm still transitioning from backend to full stack development. Tech news, Producthunt-like content (especially in bleeding-edge areas like AI), art stuff, technical gardening (@transformativeadventures), science stuff (@hankgreen), and health/workout (@dr_idz).
The initial problem with tiktok is that you need quite a bit of time to train the algorithm to actually give you the stuff you want, as the search and other discovery areas are really bad on purpose.
Also worth noting that Tiktok now supports long-form content, so some videos can get pretty long. The player also has 2x video speed and a good seek bar, so it's easy to roll through a lot of information very efficiently!
I use it to figure out AI image techniques. Basically the theory is in papers, the nearly-accepted stuff is in blogs, but the people on the cutting edge are on youtube and tiktok trying 100 things before they can post 1.
The hour-long video on a thing that covered most of the bases and edge cases gets cut down into a 30-minute (more likely 15-minute) YT video with important information missing.
The 15-minute YouTube video turns into two 30-second TikToks that speed-run through 70% of what you need to know, sure, but is the 30% they didn't cover (or know about!) actually important? Who knows!
Example: I was cleaning our jacuzzi bath tub the other day. The previous owners had never cleaned it, so black gunk came out of the jets.
A YouTube video (that was actually a TikTok!) suggested unscrewing the jet nozzles (amongst other things), which she demonstrated as a really simple "just unscrew them" sort of deal.
As it happens, not every jet nozzle is meant to be removed! And not every jet nozzle should be removed! Also, it is really hard to get replacement nozzles after you crack one because it was affixed to the housing!
Yeah. I got myself a songwriting course and it's certainly not compressed, more like 20-30 hours in total length. And those guys do a lot of, let's call it, meandering in the topics. Like, one of the basic topics was time signatures and time signature notation. And the basics are somewhat simple indeed, but after a minute or two, there's a detour into why German Volksmusik is different about on- and off-beats, and later on there was a detour into Irish music and shanties. This very much reminds me of some of the more advanced university classes - very much stringent about a topic, but entirely ready to look at the flowers left and right.
And while it makes the course longer, I find it helps my retention. For one, I have to allocate 40 minutes to an hour for a course part - and that's enough time to make it a conscious decision. It also helps to put new information into context much better, which very much helps retention. Heck, even something like a cat crashing into a keyboard helps with remembering things, haha.
Yeah, I have the same experience with retention. The extra time spent on the detours helps to contextualise the theory. To expand on your flowers analogy, looking at the flowers helps to remember the path better.
I appreciate your optimism. A lot of people point out that education hasn't changed meaningfully in hundreds of years: professors, long lectures, textbook readings, homework, and exams. I am curious whether this trend will be the catalyst for a new education system to topple the current status quo.
You're right: the sum total of human knowledge is larger than it has ever been so to reach the boundaries of our understanding requires more learning than ever. Compressing that learning process therefore seems necessary to continue our upward trajectory.
I'm both excited and terrified to see what a "TikTok-ified" engineering curriculum would look like.
One thing has changed dramatically in education over just the past 100 years. This [1] is an entrance exam for Harvard in 1869. It was expected that the applicant would be fluent in Greek, Latin, and English, that they would have an exceptional grasp of history and geography across the entire world, and that they would be able to carry out complex mathematical calculations, compose geometric proofs, and more. That trend also held as you went down to high school and even middle school.
The reason the curriculums were like this is that education was largely optional, and so it was designed for the exceptional over-achievers who would voluntarily, with no extrinsic force, opt into it. Nowadays education, including tertiary, is designed for anybody with a pulse. And this creates a terrible scenario for overachievers and underachievers alike. The overachievers are bored senseless in lengthy classes because "Yes, I got what you said 40 minutes ago. Why can't we move on?" By contrast, the underachievers lack the attention span and focus to follow a 50, let alone 90, minute lesson, so they struggle even given the much slower pace.
A 'TikTok-ified' education is really just a desire to stop wasting so much time, but we waste that time because of this change in education. And far from creating a nation of scholars - middling or otherwise - this change has instead just created a nation where your barista probably has a college degree, and 6 figures of debt to show for it.
I studied quite a bit of Latin in high school (not in the USA though) and I have to say I'm not overly impressed by the test. It certainly doesn't test for fluency in Latin or Greek; most of the exercises are purely about grammar. The exercises that do require translation skills don't require any vocabulary either. The sentences themselves are quite long, but Latin grammar lends itself to complex run-on sentences.
I think the better students in my class would have done quite well on this test even though Latin wasn't one of their main subjects.
The Greek portion seems fairly similar if not a bit easier.
Every single history question is about ancient Rome or Greece.
Every single geography question is about the location of rivers.
How does this constitute an "exceptional grasp of history and geography across the entire world"?
The math section does seem to have some difficult parts; we didn't do any proofs in the regular high school curriculum, for example. The other math sections just show how much the requirements have shifted: a lot of the exercises in the arithmetic section got replaced by calculators, but I don't see any calculus on this test.
Looking at this, I think today's Harvard students are more educated overall than the ones in 1869. Our priorities just shifted.
I have to disagree. Your example illustrates that the quality demands of Harvard entry have declined and it's reasonable to assume that other universities have also become more open, as you claim. You're right about that. Universities are less elitist than they used to be.
Nevertheless, the overall demands in higher education have increased tremendously and the courses have also become more focused. In most disciplines, what formerly would have counted as a Ph.D. thesis is nowadays at M.A. level. I've been on committees for hiring postdocs and Ph.D. students in the humanities, and it's insane what kind of demands are put on the students. I know a center of excellence in my discipline that (unofficially) requires one publication in a good international journal to be taken into consideration as a Ph.D. student. The majority of postdocs who apply for a one-year grant nowadays have CVs that would almost certainly have made them assistant professors 30 years ago. There are also more courses and more subjects to learn because progress in science has accelerated very rapidly. For example, someone who studies CS nowadays will learn proofs in complexity theory that were cutting-edge research 40 years ago.
Overall, I don't buy your negative claims. Over-achievers can quickly finish their undergraduate studies and will move on. At most universities it is possible to cram your curriculum and exams and pass "easy" undergraduate courses very quickly. Lack of elitism is an imaginary problem, probably often made up for political reasons. If you want to see real problems, look at the high student fees, accompanying debt, and the resulting desire to focus on practical job skills and overly fast-paced studies. These are detrimental to science, of course.
Why would you call it elitist? From my perspective something like this is the ultimate 'democratization' of education. If you can pass the test, you're in, regardless of who you might otherwise be (or not be). For instance, one individual who would have passed a similar exam was Richard Greener [1]. He was the first black man to graduate from Harvard. He did so in 1870, shortly after the Civil War, for some social context. He would, unsurprisingly, go on to lead an extremely distinguished life. Passing the test would obviously require a rigorous dedication to the pursuit of academic knowledge, but what is that if not the purest definition of what college ought to be?
I didn't talk about democratization (why the scare quotes?). I called it elitist because it's elitist. Harder entry exams are a way to only allow higher education for a much smaller percentage of the population, i.e., to an elite. You're obviously not one of them but there are people who think that the largest percentage possible should benefit from higher education. This also requires it to be affordable economically, of course.
I understand that this idea is not necessarily appealing to elitists and to people who only consider education in terms of the economic benefits it might provide.
The scare quotes as I think that's an inappropriate, though common, use of the term.
Rather than argue, I'm quite curious about your take on something. Imagine we have a hypothetical individual - Bob. Bob comes from a single-parent, poor family. He's the first of his family to attend college. Bob, like most kids, doesn't really like school at all. But he ekes by. And he graduates. Having no clue what to do with his life, he decides to whimsically apply to Whatever U, where he's accepted.
Once in he looks for the easiest major - and settles on psychology. Made even easier after noticing all the cute girls in Psych 301. Bob squeezes by mostly using pickaprof, ratemyprofessor, and so on sites to just find the easiest profs who pretty much hand out 4.0s for just showing up. So naturally he ends up a solid 3.0 student. Finally he graduates.
From my perspective, he's now in pretty much the same situation as he was 4 years ago, except now he has 6 figures of debt and a sheet of paper. How would you say Bob has benefited from higher education, in concrete ways? Would you expect his life to be better or worse relative to him just deciding to e.g. pick up an electrician apprenticeship out of high school?
> You're right: the sum total of human knowledge is larger than it has ever been so to reach the boundaries of our understanding requires more learning than ever. Compressing that learning process therefore seems necessary to continue our upward trajectory.
This happened a while ago and the solution to that was specialization and not tiktokization. As much as I am for modernizing, uhm, everything, I'm not sure I see an advantage in the current trend that's happening right now.
Quality has been declining on the software side, but that's just the side I'm exposed to as a software developer. The quality of everyday items is on the decline too (my Nike shoes - they will be the last ones - deteriorated within 3 months, just ridiculous).
So here we have it: a combination of short attention spans, a system that rewards the short term, and a political class that does not care. It's a mystery how our societies have not collapsed yet. Or maybe we are close?
If there's one thing I've internalized about learning, it's that consuming content and reading doesn't mean learning. In fact, nowadays, if I want to learn something I make summaries in my own words and create pages where I connect other things I think are related to that concept. (Discussing it with ChatGPT is also a way to practice handling the subject in a back-and-forth conversation.)
“Compressing” content to the bare minimum to be quickly consumed in 30 seconds can be good for news or being aware something exists maybe, but it’s terrible for learning. If you’re not actively engaged in a task and struggling with it you’re not learning anything new.
> I'm both excited and terrified to see what a "TikTok-ified" engineering curriculum would look like.
That's what it already is, isn't it? It's not like Galileo could just read a small description on the Universal Law of Gravitation and understand what was going on. Think about explaining solving a linear system of equations vs using a matrix inverse to calculate a solution. As phenomena become better understood, we literally condense disparate observations into more general rules and theories that offer us more clarity and understanding.
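As a small illustration of that compression (a sketch using numpy; the matrices are made up for the example), pages of hand elimination collapse into a one-liner once the general theory exists:

    import numpy as np

    # "Solve Ax = b" two ways: the compressed, theory-backed call and the
    # explicit matrix-inverse formulation. Both condense what used to be
    # a long grind of elimination steps.
    A = np.array([[2.0, 1.0],
                  [1.0, 3.0]])
    b = np.array([3.0, 5.0])

    x_elim = np.linalg.solve(A, b)  # preferred: factorization under the hood
    x_inv = np.linalg.inv(A) @ b    # the inverse-based formulation
    assert np.allclose(x_elim, x_inv)

The condensed form isn't dumbed down; it's the disparate observations rolled into a general rule, exactly as described above.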
I like to follow a bunch of woodworkers on TikTok and once the algorithm gives you the right set of people, you can use the quick videos of how they setup jigs or some interesting joinery they do and riff off it in your own work. Sure if I were a beginner it would be different, but it's a very useful resource for someone with experience.
> There is far more content than there used to be but no more hours in the day. Our filters must reject more.
Attention span cannot be measured by what we don't pay attention to. And there has always - always - been more information than anyone could process. I think it quite obviously is determined by how long we pay attention to things we choose to engage with. Clearly, watching 15-second short clips instead of reading books has had a detrimental effect.
This rhetoric is partly why reading gets such a bad rap. So many people put "reading books" on a pedestal, and cite random educational studies on how the mere act of reading stimulates the brain and is beneficial in itself.
But 90% of the books in this world are fucking boring to downright garbage and not worth reading for most people. Telling people to read "books" is as helpful as advising suburban kids to go "places" and then seeing them stay home after the fifth trip to Walmart.
Would a random kid be better off reading "Rich Dad Poor Dad" than trying to fix the brake cable of his bike? Should they read "The Boys from Biloxi" or go to a theater with their friends and have an actual social exchange with a real human? Are the dozens of self-help books published every week better than their Substack equivalents?
They should definitely read whatever interests them so when they want to acquire useful knowledge (like what to do when the obvious brake cable fix doesn’t work), they have access to tons and tons of written material on that topic.
It’s odd how you’re acting like you’re making an argument against the general imperative to read books, but your examples are people being requested to read specific books which may or may not be useful and/or interesting to the reader.
> like what to do when the obvious brake cable fix doesn’t work
The interesting thing: that useful information probably isn't seen as "books" even when it's in written form. The obvious example is the repair manual in PDF form on the brake maker's site. Or the blog post explaining how someone dealt with their vintage bike's brake cable.
Of course, that information is probably actually a youtube video (I refreshed every single component of a bike with a toolkit from amazon and hours of youtube videos, with Shimano's PDFs on the side).
> It’s odd how you’re acting like you’re making an argument against the general imperative to read books, but your examples are people being requested to read specific books which may or may not be useful and/or interesting to the reader.
You're right. I should be clear that I find the very premise of pushing people to do a category of activity pretty odd. I wouldn't tell people to "go watch internet videos" or "go study academics" or "do some sport". If I'm close to them, such generic advice would be dumb, and if I don't know their life, I shouldn't be throwing random platitudes at them either.
From that point of view, "read books" doesn't make any sense, and if I had to say something I'd come up with an actual thing that could benefit them.
The same way I'm kinda wary of people explaining they read X books a month. This feels like they don't see books as individual items, but as a KPI, which is weirding me out.
I think it's a decent point. Maybe we generalise from "the sort of people who read books" suggesting it's something virtuous. Or we have certain books in mind (rather than trashy romance novels).
Reminds me a bit of the idea of tea drinkers as some cultured ideal.
You’ve snuck in another conditional: when you’re close to someone, correct, you shouldn’t offer imperatives like “read books.”
You’d ideally be recommending books that are tailored to their needs and interests (which, remember, is not the same as your prior examples, which were specific books pushed on people regardless of their needs and interests).
You’d be hard-pressed to find a single decent educator who disagrees with you, though it’s easy to come to the idea that you’re contrarian on this point.
There are two factors: 1) educators have to suggest a small set of books for a large group of students due to the scale of mass education. AFAICT almost every educator regards this as a problem but it’s not exactly helped by views like “reading books is a dumb goal.” 2) when educators are talking about overall goals and policy, they obviously cannot say “well for person X we want them to read book X; for person Y we want them to read book Y, etc.” This washes out as “we want students writ large to read books writ large.”
The overall objective is to build a baseline ability for every student to be able to find, navigate, and interpret information. A lot of our society’s information is stored in a particular format called “book” that is navigated differently from other formats. Despite the generic-sounding imperative to read books, it is self-evidently the case that the only way to achieve it is by meeting each individual student where they are in terms of both skill level and interests, so that’s what educators try to do. Again: this is extremely challenging in mass education environments!
Other information formats should be and are also part of a good educational system, obviously, including manuals and instructional videos. I have clear memory of manuals and cookbooks even in my early education, which was low quality by US standards.
Agreed re weirdness of # of books as a meaningful metric, and I suspect most educators would agree with us as well.
I agree with what you are saying. Much like a muscle, when you work out, you're not lifting weights so you can lift more weights later.
Your general strength is higher, so most physical tasks are easier. Same for reading, if you practice reading books and enjoy looking at the written word, then reading developer documentation doesn't seem as intimidating which opens more doors for you.
My counterargument would be: why do you need to lift weights in the first place?
Some people like lifting weights, sure. But if your goal is to carry injured big dogs in your medical center, you won't be lifting random weights, nor doing generic workouts. And you might not even do that in a gym; you could just as well train with healthy dogs and get an actual feel for their weight distribution.
It's the same for books. Reading stuff that will actually help you and bring you closer to your goals will help a lot more than reading for reading's sake. On the other side, you'll be able to go through thousands of pages if the subject matters to you, whereas reading dumb prose will kill your attention in a matter of minutes.
Learning to plow through uninteresting books for the sake of it doesn't sound like a worthwhile use of someone's time. But yes, just like working out, some people enjoy the effort in itself and like to feel the pain.
> Learning to plow through uninteresting books for the sake of it doesn't sound like a worthwhile use of someone's time.
Of course not, but that's just a very specific edge case of "go read books".
When I recommend for someone to go read a book, I consider it implied that they should find good, interesting and constructive books to read, not the opposite.
I personally think reading is as objectively good for the mind as walking is for the body.
Walking is objectively good and healthy for everybody and anybody, bar physical conditions and edge cases.
However, you could go walk in weird ways in boring or dangerous and unhealthy places and that's not ideal, but who would purposefully do that?
I think I see your point but 10% (or heck even 1%) of books being engaging and worthwhile is still more books than anyone could realistically read in a lifetime. Good books are a fair bit more accessible than good urbanism or rural activities are to a suburban kid (speaking from experience sadly), and I don't think people extolling the values of reading are suggesting doing it completely in lieu of other productive activities like fixing a brake cable (fwiw probably an activity better assisted by YT or maybe even TikTok than books).
I have tried fixing the brake cable on my bike (and other bike repair tasks) using the myriad of youtube videos available for the task. It's actually quite difficult to hold a brake cable in tension while forcing an old-school front derailleur not meant for the bike into millimeter-perfect position and at the same time pause/unpause/rewind a youtube video.
It is honestly easier to put a book down, open to the right page, in line of sight. It requires no interaction, and book authors are forced to write instructions that can be understood primarily without pictures, so there's no pause/unpause/rewind mentality. Books engage the mental tools to infer what happens next from textual description, an ability videos degrade in the viewer. The implications of that affect decision-making subtly.
Books are also a welcome release from the trance-like state induced in consumers by most electronic displays.
I agree most books are garbage (like most of everything); the filtering of low-quality books has failed. In my limited understanding, most of the great literature was written between the 1700s and 1900s anyway. Recent literature does not grant the same broad understanding and height over the subject matter.
It is somewhat problematic to rely on multiple youtube videos and find they all have 'small detail' gaps in their performances, gaps that exist because the video format trains the user along a generic path of action. Those cracks often reveal a chasm of difference in understanding, and in the end the video is replacing turn-key parts while I'm angle-grinding off a stuck cup-and-cone bearing, even though on the surface the problems look identical.
> It's actually quite difficult to hold a brake cable in tension, while forcing an old school front derailleur not meant for the bike into millimeter perfect position
Maybe you are missing the tension adjustment screws on the Bowden tube? You adjust the tension of the cable as close as you can with a pair of pliers in one hand and a screwdriver to lock the cable in the other. Then you use the fine adjustment screw to precisely align the chain over the cogs.
> Clearly watching 15 second short clips instead of reading books has had a detrimental effect.
This isn't clear and very much depends on what you're trying to accomplish. Want to get deeply engrossed in fiction? Sure. On the other hand, I'm currently reading a book about sleep training a baby that's ~20 years old, and it's way too long - just jam packed with repetition and fluff that doesn't help at all. There's really a few critical pieces of information that you need to understand in order to sleep train your baby, and then a few more secondary pieces of information that are interesting/useful context around why sleep training works and the studies that support the methods in the book.
Particularly since I have a four month old baby and thus very little free time, 15 second videos/blog posts/whatever other short form content on this topic would clearly be superior to reading the book.
But how would you judge the quality of 15-second videos? It's crucial to be able to push through the longer material in order to make a good judgement about whether the content is any good. The inability to think and judge critically is a huge problem nowadays, in my opinion.
How do I judge the quality of the needlessly long book? I would argue that 15 second videos are better for this, because I can easily watch a bunch of them from different sources to corroborate what I see. I can also research the quality of the video's creator, just as I can research the book's author.
But also now I have a magical machine in my pocket where I can tap the screen a few times and get a book read to me sped up to my optimal input speed.
Those two things (audiobooks on my phone, the default feature to enable play at 2x-3x speed) have vastly increased the information I absorb.
Now if only someone could come up with a screen/document reader with decent text-to-speech and decent content filtering, it would be truly magical (read just the bulk text, and don't vocalize every other piece of text on the page, most of which is a major distraction to flow and doesn't need to be read).
Part of the reason I'm learning a bunch of ML things is so I can make this for myself.
On that topic, does anyone know of a really good, open text-to-speech model? All of the ones I have been able to find have ranged from garbage to mediocre, none near "good enough" for the thing to be useful to me.
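For what it's worth, the pipeline I have in mind is roughly the following sketch (the library choices - trafilatura for content filtering and pyttsx3 as a stand-in offline TTS engine - are my assumptions, not recommendations):

    # Sketch of "read just the bulk text": strip the page down to the
    # article body, then speak only that at an accelerated rate.
    import trafilatura
    import pyttsx3

    def read_page_aloud(url: str, rate_multiplier: float = 2.0) -> None:
        html = trafilatura.fetch_url(url)
        body = trafilatura.extract(html)  # drops nav bars, widgets, footers
        if not body:
            return
        engine = pyttsx3.init()
        base_rate = engine.getProperty("rate")  # words per minute
        engine.setProperty("rate", int(base_rate * rate_multiplier))
        engine.say(body)
        engine.runAndWait()

    read_page_aloud("https://example.com/some-article")  # hypothetical URL

Swapping in a better neural TTS model would just replace the pyttsx3 calls; the content-filtering half stays the same.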
Though reading is more efficient, perhaps they just have more time to listen to audiobooks, to the point that it adds up to more total information gathered. Multi-tasking for the win!
While I can utilize speed-reading techniques for some content somewhat faster than I can listen, it is an all-hands-on-deck situation attention-wise and extremely sensitive to interruption.
On the other hand I can push 3x or faster depending on the narrator for audio content in most situations and 2x while doing nearly anything (the only exceptions being literally trying to carry on a conversation with a person or driving fast on a mountain switchback road).
I don't do "ordinary" reading particularly well due to my own brand of vision issues + ADHD/neurodivergence/whatever label is in vogue. This is a bit sad for me, but I've obviously got workarounds.
Depends on whether or not you have the time to read. Listening to audio books can be done while walking, vacuuming, etc. Also while commuting, where reading might cause motion sickness.
I used Amazon Polly (text to speech engine) on a project a few years ago - the “neural” voices were decent. I’ve since heard much better synthetic voice engines, but don’t know names. What’s the best quality one you have found recently?
That's an interesting take on having books read to you!
I am genuinely curious though: what do you do with the information you absorb? Are you able to retain all of it and use it at will? If so, do you have any tips on how to do that?
Keep notes on the things I really want to keep, some details and random facts that happen to trigger my memory, and a general sense of the work, with most of the rest of the details faded away. Also, you know, entertainment.
No, I don't have eidetic memory, or whatever the version of that is for audio or text. I was closer to it when I was younger than I am now. I have a feeling this is something you can train with effort, but I'm not exactly sure what that effort is or whether I have the motivation to try it.
To remember the most out of reading you have to engage with the material. Keep notes while reading or after, try to summarize what you have read after each session of reading or at the end of each chapter or each day. Ask questions about the material. Physically write these notes and questions down, ideally with actual pens and paper. Additionally engage with other people about what you have read vocally. Reading, writing, and speech engage different but connected pieces of your brain. Engaging with material in multiple modes reinforces memory from multiple directions as does the practice of summarizing, questioning, and discussing. Gets you into somewhat subconscious habits of paying attention to material differently.
But also, not everything is worth remembering. If I need a recording of the material I already have it on paper or in an audio file; it doesn't have to be accessible to my consciousness from my own memory when it's right there, easily found.
I'm reading a book on sleep training a baby that's ~20 years old, I think, and it's like 75% unnecessary. I attribute it to the fact that when it came out, there were fewer books/other media competing for attention, plus it wasn't easy to go find additional information by Googling, so people attributed value to extra, borderline-useless info in books.
The method we leaned on (with success) was a one-page PDF shared by another parent. So many books of this sort are a bullet-point list fleshed out to justify the sale price, and little else. Case studies and history and whatever else.
I think this is just a result of too many things grabbing attention - especially since, in almost everyone's career, this has been the norm for the last 15 years or so. The jobs we do, the tools we use, and the structuring of companies and employee hierarchy, projects, and general culture have all introduced elements that force you to context switch all the time. Add to that things on the internet like social media, apps pushing useless notifications all the time, Tiktok, Instagram, Twitter, etc., where things are updated very frequently, and if you're participating on those platforms, it creates somewhat of an urge to keep checking new stuff as it comes.
Engineers can choose to have some flexibility; however, managers and upper managers get no breaks, and it kind of sucks for them to zoom in and out of contexts between meetings, which often makes them lose track of things. If workload is not shared, it very quickly becomes a shitshow altogether. All my managers at my last 3 companies were struggling to keep up with what was going on and had no rest; they would get pinged about issues even while they were on PTO, which seemed inevitable.
Other high context-switching jobs may be those of doctors, stock brokers and analysts (they have to keep tracking a million things every day), lawyers, and sales guys. They come with their own way of causing mental overload quite frequently unfortunately.
Honestly I feel it's the circumstances, some of which are in our control, that can be adjusted to minimize distractions and remove some of the useless things that are not needed at all. It's no wonder that those "distraction elimination" browser plugins and mobile phone apps are a real thing!
Yes! The Tiktok format has been fantastic for getting creators to consider what the core of their message is and edit out everything else. A well-made Tiktok video watched 5x is more valuable than a 10-minute YouTube video where the camera is left running and filler is added just to pad out the monetization minimum time.
True, but the devs of the '80s understood transistors and other things I never needed to know as a millennial. As these things get abstracted, complexity keeps increasing at the top of the stack, so that's what keeps us employed (for now).
I see recipe Tiktoks get readily debunked in longer-form content, such as by channels like "How To Cook That"¹; the Tiktoks amount to little more than content farming. What real information might be present is drowned out by fakes and bad advice that exist for no other reason than to soak up eyeballs.
I get far more out of longer-form videos, IMO.
(And yes, the bloated recipe blogspam is also a form that is rapidly approaching 0 bytes of information per byte transferred.)
¹(Even this is "cheap" content, IMO; debunking stuff leaves the viewer back where they started, although How To Cook That specifically will occasionally show the "no, here's how you actually do this, and it takes less time than the Tiktok's time-saver" version. There's an endless stream of junk to debunk, so you don't really have to worry about running out, per se.)
Any examples of fake tiktok recipes? All the top "How To Cook That" debunking videos seem to be about other youtube videos or just weird food videos that aren't recipes.
I follow some YouTubers that try to recreate TikTok recipes and most of them come out just fine. If you're an experienced cook, it's really easy to judge whether a recipe is gonna be decent or not.
I wouldn't call them fake recipes, but here is a youtube channel that highlights Tiktok cooking videos, some of which seem to be made just for the video... that is, I don't believe even the people showing those "recipes" are eating them.
That only works for dishes that are related or similar to other dishes you've made in the past.
As soon as a banana leaf, raw fish, etc etc enters the picture you'll want to read the long form. Also if you want to actually get better at cooking, tiktoks just won't cut it as there's no time to discuss all the nuances - and quality food is an amalgamation of many small nuances.
Doesn't sound very different from Grandma's recipe cards in terms of information density, potentially harder to follow if you have to deal with video playback. Recipe blogs also have compressed instructions, so it's not like you need to read all the fluff, but when first stumbling upon a recipe, a short video is definitely better at grabbing your attention and showing you the major points and the result, in a way that traditional blogs and cooking sites fail to replicate. Video is also better for conveying technique, but I prefer paper or cards while cooking something that's not completely new to me, for fewer handwashing interruptions.
> An introduction to a concrete skill that would at one time have been padded out to fit into an hour long movie or lecture might become a 30 minute youtube video and then a 30 second tiktok,
> it's beautiful to see.
I totally agree. Fleshing out subject matter is seen everywhere: films, books, education. What the internet has delivered is near-instant communication and access to a much wider array of knowledge and things to see and do. More people can travel the world with ease and at relatively low cost, so why not try to be more informed about the laws and culture?
When I look at a country that I might want to visit, I have never seen a TL;DR of its laws for inhabitants or tourists, and that puts me off travelling because I don't want to fall foul of the law.
I have however been able to establish that some parts of the German Autobahn have no speed limit when conditions permit, and there is extra car insurance which can be purchased if one wants to drive at faster than normal speeds.
The same goes for driving on the Nordschleife, there is an extra insurance option available to purchase in case an accident happens when driving around there.
Considering all the tech and knowledge that exists, and Google, the information is still incredibly fragmented, but some of the older professions like law and medicine rely on this knowledge being fragmented in order to justify their existence, and I think that gives away their subconscious bias.
You are correct! We may be dismayed by how fast we approach new information (and thereby the speed of action); however, productivity has been steadily rising. On top of that, we are able to act on and learn more variety than ever before. There are clear downsides and upsides to the speed. Overall, faster has been shown to be better.
> An introduction to a concrete skill that would at one time have been padded out to fit into an hour long movie or lecture might become a 30 minute youtube video and then a 30 second tiktok
It's a thin illusion. Brain candy masquerading as real food. Those snap-cut tiktok cooking instructionals aren't teaching my girlfriend to cook the dishes any more than a snap-cut BJJ youtube short could teach me how to do a berimbolo. She's gonna have to read a recipe and spend hours in the kitchen, and I'm gonna have to spend hours on the mat with a training partner.
> She's gonna have to read a recipe and spend hours in the kitchen
You are missing the point. Something might take hours in the kitchen to make, but you don't need to have the entire thing on video. I don't need to see someone make all the shapes with the bread. I don't need to see them wait 45 minutes for something to cook.
Most recipes aren't hours long. Most of them are a list of ingredients and a few short paragraphs; most online recipes reflect this as well.
And if someone is really unfamiliar, they can look up additional resources. "Best way to dice an onion" or "how to peel a tomato" both have plenty of videos.
I'm going to guess that it is the same for a berimbolo: you don't need to watch hours of someone else on a mat. You just need the instructions so that you can do the practice - just like you don't need to watch someone else practice an instrument, though it might help to see how they play the basic stuff.
Permit me to tie in a couple of personal observations on this topic.
I do not think we "lost" our ability. I think we changed our thinking.
I think the world has mostly accepted "good enough" versus "perfect". As we have all heard, to become an expert we need, on average, to pursue the subject for 10 thousand hours (? Malcolm Gladwell). But we do not need to spend 20 years practicing. We can obtain "good enough" in a few weeks, or even a few hours (obviously depending on the subject).
For example, to win the Grand Prix de la baguette de Tradition Française de la Ville de Paris (i.e., French Baguette Competition of Paris), many spend a lifetime perfecting their craft. I can teach you in a day how to make an edible baguette that the average consumer will enjoy.
I think our "attention span" has shifted to "good enough" in many instances. I do not think this destroyed our attention span capability, it just made it slightly different.
Final anecdote to "prove" my point that we did not lose our attention: I have taken ADHD-diagnosed boys camping and fishing. Of the twelve (ages 12-16), only one could not sit patiently and watch the line and bobber for an extended period. He became bored and instead whittled for the same amount of time. Once they returned to "civilization", they "became" ADHD again.
As someone already noted, in my experience humans cannot multitask. We can context switch, some very slowly, and some very fast. But, we do not multitask.
> Final anecdote to "prove" my point we did not lose our attention. I have taken ADHD-diagnosed boys to camp and fish. Of the twelve (ages 12-16), only one could not sit patiently and watch the line and bobber for extended period.
You’re completely misrepresenting ADHD, so I don’t see how this anecdote proves your point.
ADHD isn’t the inability to focus. In fact, it often comes with the ability to hyperfocus better than neurotypical minds. ADHD is the inability to regulate focus on specific activities, particularly ones that are boring and not what the individual finds stimulating. Camping and fishing are not what I would typically consider a difficult task to focus on for someone with ADHD. Especially because it’s a physical activity, which are often better suited for an ADHD mind rather than mental tasks that involve being sedentary.
Perhaps what you’re unintentionally getting at is people with ADHD are much better suited for specific tasks than neurotypicals, and society is largely set up to favor neurotypicals at the expense of those with ADHD.
> society is largely set up to favor neurotypicals at the expense of those with ADHD.
As someone with diagnosed ADHD, I think this misrepresents ADHD. There's no form of society in which ADHD is an advantage; in a hunter-gatherer society, my tent would still be untidy and I'd still procrastinate fixing my spear before the hunt until the day before, and I'd still lose focus during the hunt and miss the deer walking past, because I was watching ants move sticks around.
ADHD is a disability, not a difference of ability. It's not very enjoyable for me to accept, as I'm reasonably bright, but I'm mentally disabled in a way that someone of equivalent intelligence without ADHD is not. They can more easily achieve what they want in life than I can. I don't think there's any task to which someone with ADHD is better suited than someone without; there are tasks which are less difficult for the person with ADHD than other tasks, but ADHD doesn't bestow many, if any, advantages over a neurotypical person.
There was a recent brief blog post[1] on this from Scott Alexander, if you're curious to read more, but I've come to the same conclusion independently.
Entirely pet theories that may be untrue but worth considering:
- Nicotine helps mitigate the negative symptoms of ADHD, but US society has seen a major drop in tobacco usage
- Having access to a secretary greatly compensates for workplace related tasks that are difficult for those with ADHD. I have had a few conversations with people at my company much older than myself about how the number of secretaries has vastly shrunk. My boss’ secretary used to reach out to me about a meeting she had scheduled for us. Now my boss doesn’t have a secretary, and I have to schedule those meetings, update them, and communicate this with multiple parties while also having a full plate of my own dev work.
If I'm reading the parent comment correctly, I think they're saying that the high rates currently being diagnosed require one of their three conditions to be true.
That is, if ADHD is accurately diagnosed, and never had an evolutionary advantage, and isn't caused by some modern factor, then why would ADHD be so highly diagnosed today? Which is an interesting thought in my view
It could share a root cause with something that was evolutionarily adaptive or simply not have been sufficiently maladaptive to be strongly selected against.
There are other conditions. There's at least a cultural component in that we haven't always embraced mental health care in the US as much as we do today. Many other parts of the world still don't.
Whether over or underdiagnosed now, it was underdiagnosed in the past.
I’m a fellow ADHD sufferer and you’re 100% right that it is disabling in many contexts where “normal” function is optimal. But I want to encourage you not to despair entirely -- there are legitimate ways in which the ADHD mind has an edge. Mainly in recognizing lateral connections that neurotypical people might not see -- in essence, creativity, innovation. Not that neurotypical people can’t be creative, they obviously can be. But the ADHD mind has been shown to have this unusual quality. It’s why people with ADHD are often especially good conversationalists and jokers, even if they are terrible at getting their taxes done on time.
I agree with GP. My apartment is a mess. I neglect serious notices from the government. I can't hold a job doing boring work.
However, give me a programming project I'm interested in and I can work on it 14 hours a day. I'm not sure if it would be ADHD or autism but I'm quite happy with it.
I'm not the person you were responding to, and I completely agree with you, but I think there is a more charitable reading of that statement which I also think is true.
There is no form of society in which ADHD is an advantage. But I think there are forms of society in which ADHD is less of a disadvantage. Another way of putting it: ADHD is inherently a disability on its own, but a lot of our society is set up in ways that exacerbate its problems. Just as there are ways to improve accessibility in spaces for those with physical disabilities, without that making them any less disabilities, there are also cultural shifts and choices that could improve accessibility for those with ADHD.
> There's no form of society in which ADHD is an advantage
> I don't think there's any task in which someone with ADHD is more suited to than someone without; there are tasks which are less difficult for the person with ADHD than other tasks, but ADHD doesn't bestow many, if any, advantages over a neurotypical person.
Your experience is totally valid! I can only speak for my own experience, but I find that, compared to my neurotypical peers, I am much more capable of learning a breadth of topics. My peers are significantly better at performing 40 three-way invoice matches in a work day, while I would totally fail at that. But let me make my own video game, where I have to learn how to program it, design art, and create music, and I’m great at that! My power comes from the ability to bounce around things depending on what I want to focus on that day. I create an environment for myself where I have dozens of tasks I can do, so that I always have something that will grab my interest.
The way this usually plays out in the actual working world is that I struggle with boring, repetitive tasks, but now that my boss knows how I work, he has let me set up a friendlier arrangement, and he frequently sees me deliver major changes that would have taken another developer weeks to accomplish, given the diversity of tasks I have solved. Meanwhile the other developer delivers on certain types of tasks I would be much worse at accomplishing. Our team plays to our strengths.
I’m great at deeply learning various concepts then hyperfocusing and delivering significant code changes at work. Some days I’ll complete 3 major jira tickets complete with testing because I’m so focused. Now of course there are other days where it’s much harder to get things done. But setting up that environment where I have diverse things to work on lets that aspect of my ADHD shine.
I will note that I’m properly medicated and have a great diet that helped a lot. School was always really tough for me. Then I discovered when I was studying for the CPA exam, that I learn much better from books than from a teacher in a classroom. I experienced this again when I self-taught CS. These days, I view my ADHD as less of a hindrance and more of a different approach. It’s true that it has downsides compared to a neurotypical, but I feel that it absolutely has its upsides as well when in a healthy state.
I was going to write the same thing you did. I've been diagnosed with ADHD and while it is a disadvantage for certain things (I've lost quite a bit of money due to invoices not being sent on time to clients as a contractor, fines for not submitting tax on time, etc...), it's also been hugely beneficial in other areas.
When I was in university, I quickly realized that I had a hard time paying attention to lectures but that I would be passionate about reading the reference books on the various subjects. So I skipped a lot of classes (it was a French engineering school, attendance is not mandatory if your grades are good) and I read all the relevant textbooks. I was studying CS so I was passionate and very interested in what I was reading so I could easily hyperfocus on those books and did very well. On the other hand, if I had been forced to attend lectures instead, I would have not done nearly as well.
In my work life, I've been the go-to person who solves the hard problems, because when I find an interesting problem to work on, I can completely focus on it until I solve it. On the other hand, I have a very hard time dealing with boring repetitive tasks, but that's why an effective manager makes sure each person plays to their strengths.
So I do take issue with the statements "As someone with diagnosed ADHD, I think this misrepresents ADHD. There's no form of society in which ADHD is an advantage" and "ADHD is a disability, not a difference of ability". On the contrary, ADHD IS a difference of ability, and effectively managing ADHD means finding ways to adapt your life and work life to take advantage of the good sides of ADHD while finding workarounds for the problematic aspects of it.
I'm honestly not sure if I'd have been as successful without ADHD, I would have lost less money for sure but maybe wouldn't have earned as much as I did. It would have been less stressful though for the first 36 years before I got diagnosed when I didn't understand why I couldn't just do certain tasks.
No, it's a disorder. What you're saying may be true to an extent, but once you understand how the condition works and how people with it function, it's a disorder. I think that's a step beyond what you said.
Not everyone with unmedicated and untreated ADHD feels like they have trouble functioning or have a disorder, but many do. It's obvious if you look at their lives. Many kill themselves over it.
It may be true that a better fit with society could be incredibly helpful. In fact, I think it may be the single most beneficial thing we could do for ADHD. That said, lots of people with it feel it's a hindrance, i.e. they can't live meaningful lives.
Suppose there was a perfectly accepting society, what would that mean? I just drift around starting and stopping hobbies, not forming deep friendships, contributing to society, or even big pointless achievements like building some big sandcastle? This is really the ultimate solution?
Ideally, people with ADHD should be able to choose to conform or not based on how they feel the condition affects them. We can find solutions that are practical (unlike the one previously described).
To get back to your key point, perhaps there is some hypothetical society where ADHD thrives. It's not now. I doubt it was ever for some people with ADHD. It's harsh, but basic life is really hard for people with severe ADHD.
My theory is that there are advantages too, which may be completely disconnected from the drawbacks. It could just be that some people have a type of ADHD that really has only negative symptoms. I'm also confident it's heavily under-diagnosed: a small percentage, but a huge number of people, slip detection because the symptoms read as "their personality".
So I guess I have two opinions on this, maybe they balance out.
Opinion 1: adults should have options and facilities available to them that enable their functioning in modern society, including administering ADHD treatment (medication) if they so desire. After all, every living thing dies alone.
Opinion 2: children (in the US, and teenagers in particular) are over-diagnosed with ADHD and various other disorders, which is a direct consequence of the obsession with academic achievement.
Bottom line, if everyone needs to be on medication in order to be productive in a society then it's the society that has the disorder not the individuals.
> Bottom line, if everyone needs to be on medication in order to be productive in a society then it's the society that has the disorder not the individuals.
A huge percentage of the population aren't even suspected of having ADHD, because they clearly don't. This has become a silly trope, but it doesn't match reality for people with the disorder. It's a far cry from "everyone". There's a very specific cluster of behaviors that may get better or worse over time, but rarely go away.
Our obsession with academic achievement is harder to address than it seems. It would require a fundamental change in society on the scale of capitalism or communism. Even if you get rid of the academic component, it will simply be replaced with some other filter (a smokescreen for the elite?), with some unable to keep up.
We would need it to be possible for people to live doing pretty much nothing that generates profit. The system doesn't support large numbers of people in that position, and they often don't do very well, either. Some things could help, and we should do them.
It seems that we don't really value academic performance itself. We want a high-stakes game to decide who gets to be rich and who gets to be poor. That would fundamentally need to change, and I don't see an end in sight. We need ladders and nets, and we need to support careers that don't follow the high school -> university -> career -> retirement track with no room for interruption.
I recall a really sad comment on reddit from someone who recently wound up depressed in high school. What he described was a grind of classwork, fitness, and chores that seemed impossible to bear. Get home from a run, try to force your brain to study, then immediately sleep... and for what? You do this forever now!
> I think we changed our thinking. I think the world has mostly accepted the "good enough" versus "perfect". [...] For example, to win [a famous baguette competition] many spend a lifetime perfecting their craft.
Hasn't it always been that way, though?
Decades or centuries ago, people did things (e.g. breadmaking) with fewer resources, worse tools, and tighter margins, so "good enough" was probably even more important, rather than less. Great works were often made despite those limits, rather than in concert with them. Surely the techniques and investments used in competition are not the same ones a baker would use to feed a large hungry crowd.
"Perfect" probably only showed up either (A) where that's just the next frontier for a successful professional to stay engaged in their craft or promote their brand and (B) products commissioned by figurative if not literal royalty.
My attention span is making me re-read your comment to decipher whether I agree or disagree.
Jury is still out on this one :)
Anyway, over a decade back, I was watching a recording of a musical artist from either the 70s or 80s, and was surprised at the quality of the presentation (audio, music, harmony within the team). I was thinking along the same lines - good vs. good enough, and how this team working in the 70s made musical magic.
I think the constraints of those days made the masters really practice, practice, practice, so that they could be great in the moment. A lot of people today may be more expert at synthesizing music: they can take a second's worth of snippet here, 5 seconds of snippets there, and eventually make something good. But it is all editing, and they may not be able to do it live. The old-timey greats had to perform live, and for that they had to be great.
I am not saying no one does live music nowadays - quite the contrary - just that a lot of great music comes from people who never play live, because they can afford the luxury of 'editing' (which is kind of similar to a divided attention span). Similarly, for the baker, once the bread has gone bad there is nothing to do but start over, and that would be the incentive to get everything right.
I think my underlying thesis probably renders to something like:
The idea that people in the past valued the pursuit of perfection more highly than we do now is often assumed, but good evidence is seldom given to support it. In many cases it may be explained by survivorship bias, or by applying inaccurate modern assumptions about the relative costs of materials and labor and the prestige or rarity of events.
> Anyways, over a decade back, I was watching a recording of a musical artist from either the 70s or 80s, and was surprised at the quality
Example "humans weren't fundamentally different" hypothesis: When it is more expensive to make multiple attempts, people invest more effort into making sure they do very good attempts, in order to avoid the downside risk. This does not mean they would not have gladly chosen an average-result if dramatically lower production costs were possible. Therefore the difference does not stem from a change in mentality, but from economic factors.
___________
It's kind of like looking at airplanes of the 1950s, with beds and champagne and caviar, and ruefully comparing them to the ones of the 2020s.
Is it because airlines became greedier? Did the average passenger (in the same income-bracket) lose their desire for fine things? Nah: It's probably because once you remember to adjust for inflation, those 1950s passengers were paying $7,300 per flight instead of today's $600.
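For the curious, here is a rough sketch of that adjustment in code. The CPI levels and the nominal fare below are illustrative assumptions from memory, not sourced figures:

    # Real-price adjustment: scale the nominal price by the ratio of CPI levels.
    # Both CPI values and the fare are illustrative assumptions, not sourced data.
    def adjust_for_inflation(nominal_price, cpi_then, cpi_now):
        return nominal_price * (cpi_now / cpi_then)

    fare_1955 = 645.0                  # assumed nominal transatlantic fare, mid-1950s
    cpi_1955, cpi_today = 26.8, 305.0  # rough CPI-U levels, then vs. now
    print(f"~${adjust_for_inflation(fare_1955, cpi_1955, cpi_today):,.0f} in today's dollars")

Swap in real CPI data and the exact historical fare if you want precision; the order-of-magnitude ratio is the point.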
> The idea that people in the past had personalities that valued the pursuit of perfection more-highly than now is often assumed, but seldom is good evidence given to support it.
I think it's not hard to show that people have come to prioritize the low cost of goods and services at the cost of quality, as a way of artificially propping up lifestyle habits as available disposable income goes down.
An appliance like a refrigerator or clothes washer had to be good in 1960 not just because of its cost (which in today's money accounting for inflation is huge) but because consumers had the ability to financially demand it.
But if household disposable incomes go down (for a myriad of reasons), to where even two adult fulltime earners can barely tread water, a "good" appliance is unthinkably cost prohibitive in the short term. A cheap one that will barely outlast the warranty is their solution. And when it fails (and it will) you go buy another and it goes to a landfill.
It's like the typical example of the poor person who buys shitty work boots that only last a few months because they can't drop a couple hundred up front for ones that will last decade(s).
In the US (can't speak for elsewhere), your stereotypical 1950s middle-class sitcom-style family lifestyle died around 1970. It's been kept going ever since by the combo of decreasing quality wherever possible and using debt to wallpaper over the lack of micro-economic sustainability (hence the huge % of the population who have no real savings, little or no retirement investments, and will lose their home if they get a bad medical diagnosis).
> hence the huge % of the population who have no real savings, little or no retirement investments, and will lose their home if they get a bad medical diagnosis
Was this not also true in earlier generations?
Also, most people from highly developed countries with sound social safety net (G7 and close friends) do not have such a doom-and-gloom outlook. I assume your view comes from the US.
> where even two adult fulltime earners can barely tread water
This is certainly not true in the vast majority of the G7 and close friends -- only the US.
I think we used to strive more for perfection. Changing sheets, ironing, folding clothes... But also in the trades: just look at all the ornaments, the attention to detail, the care for tools and materials. When was the last time you got a suit made?
At some point we compromised and slapped a shiny finish on it. I blame advertising.
Why are you so certain it must be a personality change from unclear causes, when a bunch of huge and proximate economic factors haven't even begun to be ruled out?
> Are you making your bed everyday?
Is your bed still made out of hay/down, or is it made out of something far easier to maintain and keep clear of pests without the same level of fuss?
> Changing sheets, ironing, folding clothes.
Do you still invest the same X hours of wages into buying a shirt that someone would have spent back in 1723? (Hint: No.) Why would you ever expect the same degree of maintenance "perfection" when it doesn't represent the same kind of up-front investment?
Do you try to impress people by wearing powdered wigs like a proper and hygienic individual, or do you sometimes Just Not Shower Today because we have nowhere near the same kinds of problems with head-lice and body-odor?
> I think the world has mostly accepted the "good enough" versus "perfect".
I think you're correct, but I would also argue that this makes everybody worse off, and is therefore the wrong trend. It cancels, to some extent, the advantages of the division of labor, because society is chasing diversity of experience rather than depth of it. It is also impractical, because now each individual has to deal with more flaws in "good enough" products, which pulls everybody down.
What happened in the past is that you obtained "good enough" with a lot more hours and effort than now. And perfection, however you define it, was even further out of reach.
I think it is related but not the same - instant gratification is what leads our brains to shorten the attention span. I could see it happening to me over the last few years. I stagnated in almost every endeavor because I was picking low-hanging fruit, if you will, and did only the things whose end result I could easily see…
Until I started training for a half marathon. Not having run seriously in my life, I challenged myself to join my friends for a 10K last year. The training was an unexpected lesson in humility - I could not just go run the 10K, regardless of how motivated or pumped up I was. I needed to train myself to get there properly… 2K on day 1, 5K by day 15, and 10K by day 30 or so, with enough rest in between. I happily completed the 10K last year.
This year I am training for the 21K. I’m practicing for 3 months slowly improving my pace and endurance.
I’m not what one would call athletic. I’m still doing it and it makes me incredibly happy.
The analogy I'm going for is that the same applies to learning anything new, or mastering something: it takes time and continuous effort, not instant gratification. It sounds very logical and simple in hindsight, but I had to learn it as an adult.
I follow Natália on twitter, and she's repeatedly tried to get them to address her criticisms, to no avail as far as I've seen. It's not a good look for a supposedly scientific blog, especially given that her criticisms are detailed and data-based.
I don't think we should dismiss things based purely on the source, but caution seems warranted (as always, I suppose.)
I still think it's important to note that the blog's work is shoddy. The fact that this blog post is #1 on HN means that multiple people will read SMTM's other blog posts, which contain several falsehoods that they've refused to fix.
>The fact that this blog post is #1 on HN means that multiple people will read SMTM's other blog posts, which contain several falsehoods that they've refused to fix.
LessWrong is obviously not a cult, the allegations are nonsense. Well established researchers post there (e.g. Paul Christiano) and some even work or worked for OpenAI or DeepMind (e.g. Richard Ngo).
It's rather that the cited source "RationalWiki" is a highly politically biased source which routinely attacks anything which is in any way associated with the rationalism community, or which engages with things that are deemed unacceptable by its far-left authors (such as intelligence research). They have in the past attacked highly respected bloggers such as Scott Aaronson and Scott Alexander.
Well, Aaronson is also a leading quantum computing researcher, and Alexander is read by respected scholars, such as the psychologist and linguist Steven Pinker, or the economist Bryan Caplan.
So? That means nothing as to the quality or even topic of their blog content.
Unless blogs started receiving peer review while I wasn't paying attention, there's likely a reason any given content lives there and not in an academic paper.
Well, yeah, blogs are not academic journals. And if a blog post disagrees with academic consensus, the academic consensus will be right the vast majority of the time. But this applies to all blogs on the internet, not only those on LessWrong.
He's been more of a quantum fanboy of late. He does still call out obvious fraud, but he's largely uncritical of the bigger players' offerings. Likewise, he's on OpenAI's payroll; no hard-hitting critique to be found there. His Busy Beaver content is top-notch, though.
But when he shares his opinions on cancel culture, gender, etc.? He's just another blogger.
It's not just bloggers. There are computer scientists with publications in top CS conferences that post on LW. For example, Alexander Turner, Dan Hendrycks, Paul Christiano, and Jacob Steinhardt all post there.
Then the question you should be asking is why choose to post on LW when the value of publishing a paper is substantially higher.
Perhaps because what they post doesn’t hold up to scrutiny from their peers, so they prefer to play in the mud with the folks who won’t notice the issues with their ideas.
This excerpt from the article on Scott Alexander on RationalWiki is the only thing I need to quote to make others understand the stakes here.
>As usual, you can make anything worse by adding Reddit. /r/slatestarcodex is an unofficial fan forum for the blog. Scott comments occasionally and is a moderator. The culture wars (a regular weekly thread, until it was recently branched off to the Scott-endorsed /r/themotte) and pseudoscientific racialism of "human biodiversity" are regular and upvoted topics (literally advocating the Fourteen Words will get you 40+ upvotes[45] and admiring replies). Of course, much more offensive than the racism is objecting to the racism, which gets you a day's ban.[46] According to one moderator, "A belief in HBD doesn’t automatically equate to racism", somehow.
I also invite people to visit citations 45 and 46. The corresponding claims in the article might technically be true, but I think they're misleading, to say the least.
> "Safety issues" are to him more like, "the AI will kill everyone to fill its objectives"
Yes, and he was way ahead of the curve here, since similar positions got a lot more mainstream in the past years and months. E.g. two of three AI Turing award winners (Hinton and Bengio) now say that superintelligent AI poses a serious extinction risk. OpenAI has also identified the alignment problem as a major issue. Even former AI skeptics like Douglas Hofstadter now broadly agree with this assessment. Yudkowsky's opinion was quite prescient.
I know how to find the statements by Hinton and Bengio that superintelligent AI poses a serious extinction risk, but I can't find any statements by Hinton and Bengio--or anyone who is taking the risk seriously--suggesting that air strikes would not be warranted, so any clues on where to look would be appreciated.
So in other words, people who've spent their entire lives thinking and talking about AI praise other people who also have spent their entire lives thinking and talking about AI.
This is a bubble conversation that makes no sense to people outside of that bubble, and for good reason; it doesn't matter outside of that bubble.
Trying to read classic literature really makes this apparent. Hemingway's The Sun Also Rises must have been a riveting adventure story when it was published in 1926, but how can it compete with 10,000 hours of adventure travel on youtube, netflix, etc.? Same with Moby Dick in the 1850s... these were glimpses into exotic lives rarely heard of back then, but today you can find those stories, or similar ones, in vivid moving pictures and audio everywhere, in much more digestible forms.
I really wish I could appreciate these great human achievements in the arts, but at least for books, I don't think my tech-atrophied brain has the ability.
I don't think you read Hemingway or Melville for riveting stories per se, but for their prose. The way they convey their stories in words that pique your curiosity or tickle your aesthetic sense. Words that express an idea or feeling you'd not encountered before, or express an idea or feeling you are familiar with in a totally fresh and unfamiliar way (btw their styles are very different, so you may very well find one engaging while the other leaves you flat, and of course some people will enjoy both or neither).
For example, I have never read "The Sun Also Rises," but I looked at the sample on Amazon and came across this on the second page: "I mistrust all frank and simple people, especially when their stories hold together, and I always had a suspicion that Robert Cohn had never been middleweight boxing champion, and that perhaps a horse had stepped on his face..." After a few matter-of-fact paragraphs, the narrator suddenly slaps the reader with this frank and funny statement of his utter cynicism. That kind of thing pulls me in. I want to know more about this narrator and see what other shocking things he may have to say.
The average person in 1926 didn't and wouldn't read Hemingway either. The first print run of Sun Also Rises was 5000 copies. Most people didn't read much and a significant proportion couldn't read at all, and I suspect most of the silent movies of the time would seem quite trivial compared with much amateur YouTube content today
I suspect that in 100 years time, bestselling books particularly popular with today's tech addled brains will also be considered a bit dry and hard to relate to by the average reader.
Hemingway was never JK Rowling, but he wasn't an obscure writer known only to academics and literature aficionados either. Your print run figure undersells his popularity quite a bit. "The Sun Also Rises" was his first novel when he was an unknown. Wikipedia goes on to say that it was on its 8th printing two years after publication.[0] This says that it had sold a million copies by 1961 [1]. The first edition of "For Whom the Bell Tolls" was 75,000. [2]. "The Old Man and the Sea" was published in Life magazine, with a circulation of millions.[3]
I certainly wouldn't claim Hemingway was obscure, but I don't see any of those figures undermining my point that the average person in 1926 wouldn't feel the inclination to read him over more digestible stuff (and the Bible) if they read at all.
Sure, the first eight print runs of The Sun Also Rises probably had a total circulation equivalent to the playthrough of some tediously-narrated niche videos on YouTube, and by 1961 when Hemingway was firmly established as a Great American Novelist it had as many copies over more than quarter of a century as Where the Crawdads Sing sells in a quarter, but I don't think you can infer anything much about attention spans from the appeal of literary fiction.
Agatha Christie is one of the bestselling authors of all time, and I know my grandparents used to read her books while growing up in the '30s and '40s.
Those books have pretty good prose; it's perhaps not on Hemingway's level of complexity (I've never read Hemingway, so I couldn't say myself), but it is reasonably sophisticated, so I think there's definitely something to the argument that entertainment has 'dumbed down'.
Hemingway was probably the best-known of the ex-pat Paris crowd during his lifetime. I expect some of the Algonquin Round Table writers may have been at least as well-known, though, as many wrote for popular magazines. Dorothy Parker's first volume of poetry sold 47,000 copies. [1]
5,000 copies was a lot for the time. The book was culturally significant, and Hemingway became a celebrity of the era. I think this was as close to being a minor "influencer" as someone could get back then:
> Still, the book sold well, and young women began to emulate Brett while male students at Ivy League universities wanted to become "Hemingway heroes." Scribner's encouraged the publicity and allowed Hemingway to "become a minor American phenomenon"—a celebrity to the point that his divorce from Richardson and marriage to Pfeiffer attracted media attention.
> I suspect most of the silent movies of the time would seem quite trivial compared with much amateur YouTube content today
I can't say what constituted "most film" in 1926 (are we going by what sold the most tickets? big blockbusters versus daily newsreels and cartoons?), but if you look at the era of 1925-1927, it includes Phantom of the Opera, Battleship Potemkin, The Vanishing American, The Bat, Metropolis, and Wings. Many of these are iconic to this day, to the point that they continue to be referenced and emulated in pop culture that many people recognize even if they haven't seen the originals.
Granted this discussion will be hamstrung by how much film has been "lost" with so much of that content no longer in living memory at all (to say if it was masterpieces or crappy filler).
Books don't have to be experienced as vicarious adventures. That's what YA lit is, mostly, but we can read books for their insight rather than fantasizing about being participants in a series of events that they're narrating. There's no reason to transform their thoughtfulness into the sort of disjointed juvenile power fantasies that modern movies are attempting to appeal to.
Also, you don't have to read literature or novels. Read the narratives and nonfiction around what people experienced in times and places that will never be experienced again, and that youtube and netflix don't care about. Read about thoughts and reasoning that exceptional and forgotten people had in the 19th century that are ripe for rediscovery.
The death of attention span is real, but the idea that the substance of "content" now is of better quality than the writing in 1890 is a slander. It's just the difference between a quick, tasty, and a bit vulgar value meal at McDonald's vs. an actual high quality meal. The laziness gets addictive.
I'd like to believe it is reversible. It's not a genetic problem so it is a problem of environment. If you tech detoxed for a whole year living in the woods or something then tried to read Moby Dick it would likely be very tolerable again.
You could study some people who have gone to prison and have little opportunity for endless media consumption.
Relatedly, I've found my life-long "sleep problems" go away very fast if I stop using electronics or electric lights after sundown.
Go figure, you light up rooms with hundreds of candle-power like it's friggin' daytime, and have world-class entertainment of most any kind available at the press of a button like you're living in a World's Fair crossed with Vegas crossed with a Red Light district crossed with a video game arcade, and it's hard to sleep and you don't feel tired as early as you do if you don't do those things. Live like it's pre-war (more or less) and the problems vanish. Who'd have guessed?
Hard to keep that up in a modern world with two working adults who need to Get Shit Done at night and zero other people you know are living on that kind of schedule—plus, Winter nights are way too long—but it worked. Sun goes down, read or play cards or whatever by candle light (I found two beeswax candles next to each other were enough to read by—and you'll quickly figure out why really-old fireplace mantles often had mirrors behind them, if you didn't already know!) for an hour or so, and the yawns are coming hard and fast, time to go to bed.
Shit for air quality, so, that's a problem. Never did find a cheap battery-powered warm-light not-brighter-than-three-or-four-actual-candles lantern to replace the candles with, while I was trying it.
Once you're used to it, whole-room lighting seems blindingly bright and totally insane. Interesting for getting another perspective on ordinary modern life.
Same! I can't stand bright light at night unless I'm doing something that warrants it (cleaning dishes, searching for something, ...). I have a single desk lamp in my living room and just enough light to not bump into things in my bedroom. And my TV is not that bright (no HDR). I match the brightness of my devices to this amount of light, and sleep comes easily. Another thing I swear by is blackout curtains. When I turn off the lights to sleep, it should be dark.
Anytime I turn on the main lights, it's like a shot of adrenaline as everything is just so bright.
There's not much to write. I tried it for about two weeks, consistently, approximately no cheating: nothing but candles for lighting, no "screens", period (save, like, double-checking the alarm on my phone for the next day right before bed—but no browsing, no messaging, none of that, total use measured in seconds once the sun was down). No TVs or computers whatsoever. By night two I was already getting to sleep much earlier, and sleeping better. The effects of the experiment hit fast and strong.
You can still: read, write, play music, play card and board games, lift weights to some extent (I didn't—a bit contrary to the getting-tired, plus I just didn't keep it up long enough to build in anything like that, though nb you probably don't want to be attempting max-weight lifts in low lighting or otherwise doing anything that's not very-safe) or do many bodyweight exercises, take walks or run (night walks are nice!), talk, do limited chores or cleaning, plan meals (assuming written recipes, or books), and so on. There is a ton of stuff to do by candlelight, or in no lighting (including, ahem, partner activities). You won't run out of stuff to do if you've got any kind of varied tastes in activities at all—but you also won't be doing stuff so extremely engaging that it keeps you from noticing when you're tired.
If I'd kept it up, I'd have found a way to listen to "the radio" (but actually streaming or flacs or whatever) and had exceptions to the no-screens for, at least, setting that up initially. Goal would have been to still barely look at the screen, though: set an album or a few albums or a long playlist of podcast episodes or old radio show episodes (some of these are really fun!) to play, and mostly leave it alone. I'd also have probably made an exception for e-ink book reading devices, provided no cheating by using them to browse the web—that seems well within the spirit of the thing, to me, and unlikely to veer back into hyper-stimulative or way-too-bright territory.
Things I noticed:
As mentioned, full-room lighting is stupid-bright. Way brighter than needed, and that's an understatement. Made me recall some of my older relatives' long-owned houses and how many rooms didn't have built-in lighting, but just had a switch-controlled outlet with a lamp attached—even just having the light come from somewhere lower made the room less intensely-bright, all else being equal. That seems far saner to me now, than what's normal in new houses. Are you trying to take daytime-like photos indoors at night? No? Then your lights are probably way too bright, likely by an order of magnitude, than what's remotely necessary. There's no way that's not affecting sleep—right?
You do need to have a way to carry one or more candles. You don't want to have dedicated candles lit in every room (including bathrooms!) unless, I dunno, you have people over maybe. Could probably safely remove a lot of this need with a small number of very-dim night lights (some are way brighter than a couple candles, though! Gotta shop carefully) without ruining the whole exercise.
Candles are painfully bright if they're your only light source, they're nearby (say, you're reading), and you look right at them, even in passing. Very unpleasant. If I'd kept it up, I'd have had to find a way to shade them somehow.
In fact, though, to keep it up longer-term, I'd have had to find or build an electric alternative. Unscented candles that don't smell like absolute ass (once you've smelled anything other than the common petroleum-based ones, those smell like a chemical waste dump) are fairly expensive to burn, are a fire hazard (obviously), and harm air quality even when they're a relatively-decent material. Most e.g. portable camping lights are far too bright and are a too-cold temperature to replicate candlelight. I expect I'd have ended up building a few little battery-powered lanterns with the dimmest warm-temp LEDs I could find, or maybe even dim and under-driven filament bulbs to get a smoother light curve—they'll last a long time if you run them way under-spec, and the inefficiency hardly matters at powers that low.
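(If anyone wants to try building one: sizing the series resistor for a deliberately dim LED is just Ohm's law. A minimal sketch with made-up part values -- forward voltage and current rating vary by LED, so treat every number here as an assumption to check against a datasheet:)

    # Series-resistor sizing for a deliberately under-driven (very dim) LED.
    # All component values are illustrative assumptions; check your LED's datasheet.
    v_supply  = 4.5    # volts: e.g. three AA cells in series
    v_forward = 2.0    # volts: ballpark for a red/amber LED
    i_target  = 0.002  # amps: 2 mA, far below a common 20 mA rating, so quite dim

    r_series = (v_supply - v_forward) / i_target  # Ohm's law: R = (Vs - Vf) / I
    p_resistor = i_target ** 2 * r_series         # watts dissipated in the resistor

    print(f"use ~{r_series:.0f} ohms ({p_resistor * 1000:.1f} mW in the resistor)")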
One persistent change I've made as a result of this is having a very-dim-bulb warm-temp floor lamp (some cheap Ikea thing, with about the dimmest warm-temp LED bulb I could find that'd fit it) for my office. It's the only nighttime lighting I use in there, most of the time—rarely use my overhead light. Point it at the wall if I just want to avoid stubbing my toes on things, aim it almost at me if I'm reading. It's more than enough light. Been meaning to swap out the end-table lamp bulbs in the bedroom with something about 1/4 as bright, but keep forgetting.
Another thing is that the past is a little more relatable. I've lived with candles (if not for long—but it's enough to notice some things!) and know the pain points, and I really get and appreciate tools & design features that were built with them (and oil lamps, rushes, all that stuff) in mind, in a way I didn't quite before. Stuff that's uncommon now but used to be somewhat common, like family read-alouds of novels, makes more sense to me now (not only were books more expensive, so you might have fewer of them—it takes way less lighting for one person to read than for five, which really matters when light is expensive! IIRC the Brontës were supposed to have done a lot of this, but I gather many people did).
What makes it hard to keep up is life. In the Winter, especially, even if I'd tried to keep doing this, I'd have had to set a cut-off later than sundown, no question—you can't realistically go no-electronics-or-lights at like 5:30PM. Nobody else lives like this. (Almost) no place you go with others at night is like this. Chores pile up during the day because stay-at-home spouses aren't the norm anymore, so by the time you have even a second of free time it's already night, or else you've deferred all the chores to night, and adding friction to those is not fun. Going out after dark anywhere (say, grocery shopping) kinda screws it up. It's just fairly inconvenient given that all of life is now built around having day-like lighting available at night.
However, I think getting a good chunk of the way toward it is feasible, and would probably bring many of the same benefits.
I grew up a bit like this. For long periods we turned the electricity off - to save money, not on principle - so in the evenings I read by candlelight. Indeed it's hard. It wasn't so difficult to do because we lived in the country and didn't need to "sync" with other people.
I'm thinking how to replicate some of this in my life now. It's difficult because I also want to write, blog, do research, and other screen-related things. OTOH I feel there's something very wrong about spending 10 hours of your day on-screen.
I wonder if there's room for a community to try to do some of this stuff together.
Moby Dick was never meant to be read the way it is read now. It was originally a serial, like a webcomic or a fanfic that's actively being worked on. Those are still very popular formats.
I don't need to spend a year in the woods for my attention span to come back. A day or so is enough to get me to the place where I can get lost in a good book. As soon as I'm back within short reach of the internet, my attention span goes to hell again.
They're different epistemologies. Tools of storytelling - video vs. book in this case - are being conflated with their use. They are tools, not exclusive to a single epistemology: many ways to explore many perspectives of the 'what'.
Contrast, for example, how either of these books' narration allows a reader to fill in a lot of the blanks with experience, both real and imagined, constructing it for themselves down to the temperature of the air, the shade of light at dusk, the smell of the air, the exact tone and pitch of a speaker's voice. Versus a video, where the experience is fully narrated and the scenes are presented in 4K. Is the video 'more digestible' when it's a different thing being digested?
Not that these tables can't be, and often aren't, reversed. Some vloggers say little, sometimes 'scaffolding' only some understanding through a bit of backstory, or allowing their audience to similarly construct the pieces of a journey for themselves. And some books are instruction manuals, or unintentionally read as such.
A tool, be it video, text, audio, or game, is only that. It's just a medium, a method that is surface over the deeper world view of the 'maker'.
Popular literature of years past looked more like those trashy-cover, cheaply-made, deteriorating, fits-in-a-suit-jacket-outer-pocket-without-wrecking-the-drape, thin genre novels you sometimes see carefully preserved in bookstores today, which never saw a hardcover printing (LOL, why? Pick one up, and 99% of the time they're clearly hastily-written formulaic crap), than like Hemingway or Melville (the latter of whom, famously, had to be "re-discovered" in order for us to recognize his name today—he'd vanished from pop consciousness very fast). Or "penny dreadfuls" (similar deal) before that.
I am listening to the unabridged audiobook of Les Miserables. Victor Hugo goes into tons of unnecessary detail about things - as an example, when the characters arrive at a monastery, he tells the entire story of the place, including the details and rules for who can wear what color of clothing.
I reckon it is considered one of the great and famous books, but it could have been drastically improved if he had had an editor. A modern author would have had one and would have produced a better book.
In fact, why shouldn't modern books be better? Nearly everything else is (compare central heating to a fireplace), and the rest disappear once you account for the price you pay for it.
I am reading the same book right now, but with my eyes rather than my ears. I don't actually mind the detail, in retrospect, because it's used both to add flavour to the story and to reflect on ideas that are very novel, at least to me. I do agree that it can be tedious at times, though. The part about the monastery did bore me.
So don't read those ones. Read books about things that can't be replicated by video, like Joyce's wordplay. It's like how photography liberates painting from realism.
And I fully empathize with your difficulty in appreciating such works of fiction. Maybe I ought to try reading one this summer myself...
What you're mentioning has little to do with technology and nothing to do with attention span. Those books just don't give you anything you want. Well, why read them? I think microprocessors are great achievements, but I struggle to think how a work of fiction can be a great achievement. I don't think the world would be very different if neither of them had ever been written.
I find this a very frustrating topic. I've studied psychology so I always feel compelled to look at the data first, but here I'm going to focus more on personal experience.
For me, the short answer is: it depends. Sometimes I do have a short attention span--I'll start watching a video but then also look at the comments and suggested videos at the same time.
But if I find a truly interesting article or video, I will give it my complete attention, even if it's a longform article or a half-hour documentary. I listen to hour-long lectures and podcasts.
What I don't like about this entire phenomenon is that people saw attention spans were declining and then decided to create stuff that doesn't require much attention.
I think instead of creating more short content, we should have made it scarce. People will consume what is available. There is much more short-form content available today than there was 5 years ago, because social media turned it into an advantage.
We should have continued creating long-form content. People would have paid attention to it if that was the only thing available.
I'd also partly blame the idea that you must be reading a lot, watching a lot, keep up with everything, and in general get more things done. So people turn to summaries and bullets, just to avoid being "left behind."
It is a lot like the author theorized -- it uses Mechanical Turk to measure how long people will attend to a task given an incentive.
Anyone interested should be sure to check out this survival graph, which depicts the length of time people will attend to a task, given characteristics of the task and the value of the goal they are trying to achieve: https://invisible.college/attention/dissertation/survival.pn...
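If you want to reproduce that kind of curve from your own data, the standard tool is a Kaplan-Meier estimate of "time until the participant abandons the task". A minimal sketch, assuming Python with the lifelines library and fabricated numbers (the dissertation's actual dataset and method may differ):

    # Kaplan-Meier survival estimate: P(still attending to the task) over time.
    # The durations and events below are fabricated for illustration only.
    from lifelines import KaplanMeierFitter

    durations = [12, 45, 60, 60, 90, 120, 150, 300]  # seconds spent on the task
    abandoned = [1, 1, 0, 1, 1, 0, 1, 1]             # 1 = quit; 0 = censored

    kmf = KaplanMeierFitter()
    kmf.fit(durations, event_observed=abandoned, label="time on task")
    print(kmf.survival_function_)  # fraction of participants still on-task over time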
I may be reading this incorrectly, but in the article the 65% appears to be the author's confidence in the statement that attention spans appear to be declining, as denoted by the subscript. Whereas in the HN title it reads as if it's saying a "65% decline in attention span".
Various other assertions in the post also have subscript confidences associated with them, e.g. "my guess: yes90%".
I could totally believe that there has been a 65% decline in attention span. "Stolen Focus" by Johann Hari certainly makes 65% seem conservative!
1. It will definitely rain, on 75% of the relevant area.
2. It will definitely rain, for 75% of the relevant time period.
3. It will rain with an intensity of 75% of the maximum our instruments can measure.
4. Three out of four meteorologists think it will rain.
5. It will rain on 75% of the population.
6. It will rain on everyone, but 75% of the population forgot their umbrella.
7. It will rai
8. 25% chance of dry.
9. 25% chance of snow.
10. When you become trapped in a Groundhog Day-type loop and are forced to repeat today three more times, then a subsequent analysis will show that it rained on exactly three of the four total days. Probably.
For related reasons, I like the blog's claim that much of the difficulty in establishing whether the proposition is true is that none of the wealth of literature on attention span came in the form of long-term studies. Perhaps the researchers got bored and moved on to something else!
(I'm not sure if you were joking or not and I know it's probably not in the same spirit you intended it here / a bit OT but...) I've been using literally that exact expression for a while to describe the situation in which, during somewhat complex discussions within a group, in order to not be perceived as jerks participants are forced to follow an unnecessarily long, repetitive, trivial and most often also completely pointless "line of reasoning" just to have their own attention completely derailed from any productive/actually-interesting argument anyone was trying to make, often ultimately resulting in giving up because recalling those lost mental threads is by then even more difficult and there is only so much mental energy (for you and collectively) to dedicate to that discussion.
Just saying, imho it's already a thing (with different incarnations in different contexts).
This is definitely a thing, but at least in my experience, it is also a thing that narcissists do. They can dig up emails and examples from the dark caverns where you were just having a water-cooler chat, which they somehow took as deadly serious and as something you should have been prepared to defend if it left your lips.
Reordering and paraphrasing what they actually wrote:
I've been using that expression when giving up on participating during discussions.
Other people's line of reasoning are unnecessarily long, repetitive, trivial and most often also completely pointless. This derails my own productive/actually-interesting argument because I only have so much mental energy.
Thanks for this comment; yes, this title is wrong and should be changed. The article's conclusion is:
> It seems likely to me that individual attention spans have declined (I’d give it ~70%), but I wouldn’t be surprised if the decline was relatively small, noisy & dependent on specific tests.
Context since this has now been fixed: the original title as submitted was "Have attention spans been declining? – Yes, 65%". The bit after the dash was erroneously added by the submitter and was not part of the article's actual title.
But as OP mentions, the 65% as printed in the title conveyed the false impression that there's been a 65% decline in attention spans, whereas the actual tl;dr should have been this sentence from the end:
> It seems likely to me that individual attention spans have declined (I’d give it ~70%), but I wouldn’t be surprised if the decline was relatively small, noisy & dependent on specific tests.
Disagree. Article titles on the web are often very bad. Often this is for clickbait reasons, but also frequently just because the author was not writing for the HN front page as their audience. Almost always, I prefer the rewritten headlines on HN. However, this seems labor intensive to accomplish, and there is usually a delay before the edited title appears. What I wish is for article submitters to consider the use case, and rewrite the headline to conform to HN guidelines on submission.
Well, it's a balancing act. The original title may represent the article more accurately than the submitter's title, or it may be misleading clickbait and the submitter is trying to improve that. The policy here seems to be the best middle ground we can do, mostly go by the original title but also be ready to edit away from clickbait (which of course is subjective.)
There are some good reasons this isn't the case. Often the submission title itself has something wrong in it, or is click-bait-y, or just needs some pointless fat trimmed from it to get to the point.
But if the source can't get the title right in a clean and objective sort of way, isn't that a signal that there's got to be a better source? For example, how many times have we seen a click-bait-y title followed by content or narrative reflective of that mindset?
It would be nice if they either did or didn't. The current system where submitters are encouraged to carefully choose a title and moderators are encouraged to stomp on it is the worst of both worlds.
Thanks for mentioning that book. I am trying to decide if it's worth reading. The negative reviews agree with the main premise of the book but say it's short and superficial. Is there information in there worth reading beyond the usual tips, i.e. keeping your phone in another room, avoiding news first thing in the morning, no screen time 2 hours before bed, long cardio workouts, etc.?
I certainly wouldn't recommend it as a practical title; though there are a few practical tips along the lines you mention, the point of the book is more about the societal problem than the individual one. But as someone who gets highly frustrated with my inability to focus on occasion, I would say it's reasonably cathartic.
You develop the skill of quickly determining what deserves your attention and what does not. Having a long attention span doesn't imply you give everything your full attention.
My point is the feel-good effect, whether it is achieved by avoiding fluff, by (over-)simplifying, by some special content that can only be shown in short form, or by feeling more knowledgeable. If you can read more of them, you feel better. I just wondered whether this is the reason few develop the skill - it is mostly not worth the time, with no one around usually caring (whether you are superficial or not).
Another commenter mentioned that modern humans are exposed to more content, and necessarily we must apply filters.
I would go further to suggest that not only is there more content, but that content is expressed in a manner intended to only consume a brief amount of attention.
Typically, such content is also presented in a way that encourages the user to continue providing their attention, except that their attention is directed to new content. And so while it may be argued that the person "is still paying attention", the rules of attention are simply different when engaging with such content.
To be clear, I find such touch-and-go content to be generally reprehensible. It lacks nuance, and leaves little room for intelligent discourse. I am simply pointing out that it's very possible that humans are just as capable as they've always been, and it is simply the medium that has changed.
I find a lot of content online (videos and blogs) as well as offline (various books) is too long, to the point of being disrespectful of my time.
If you’re writing a blog and incentivized to stuff in as many SEO terms as possible, you will do so at the expense of my time and irritation.
If you need to make a 10 minute video but really only have 2 minutes of content, you value those 8 minutes of my life at zero, and unless I do too, I’m skipping through the filler.
Same thing with books. If your book contains one central argument, respect your reader enough to cut out the fat. I don’t need a comprehensive history of the subject if your argument can boil down to a few paragraphs.
> Same thing with books. If your book contains one central argument, respect your reader enough to cut out the fat. I don’t need a comprehensive history of the subject if your argument can boil down to a few paragraphs.
I tend to politely disagree with that. In most cases "comprehensive history of the subject" is important to understand the subject fully and, most importantly, reasons for the argument.
Currently we have an epidemic of people who suggest doing things a particular way, but when you question the reasoning, there is no answer. It's picking up ideas here and there without understanding their premises, and that's very dangerous, because an idea without its premises is likely inapplicable once the premises change even slightly.
I went to a theater to watch Mission: Impossible last night. It has a really long runtime (2h 48m) and, toward the end of the film, I started seeing multiple people pull out their phones for a time check; I had to stop myself from doing the same.
> Very, very few action movies need to be more than 100min.
I’ve read somewhere that blockbuster movies are trending longer at the moment, because people are ever less willing to pay the high prices for cinema tickets, drinks and snacks, so movie studios need to offer a really big experience to make it seem worthwhile.
While I watched a lot of films on the big screen at school (I was involved in the film group), over the years I've watched fewer and fewer. These days my maybe once a year criterion is whether it's something I'd want to see in IMAX if I could. Though I'm not sure how much length has to do with it.
I'm honestly surprised that movie theaters have come back to such a degree. I suppose the lesson is that the pandemic didn't really change things all that much--certainly not to the degree a lot of people expected. Even more-flexible work arrangements, much less fully remote work, are mostly confined to a few bubbles.
I mean, not that many films that I really want to watch come out per year, so if Oppenheimer is 3h instead of 2h or whatever, then let it be; I'd rather spend one more hour at the theater and watch the movie knowing the director didn't need to cut an hour off the film.
(Although I do have to say Oppenheimer got somewhat tedious in certain parts)
Don't forget the ads. They have gotten obnoxiously long. The last time I went, the ads for products and trailers went on for 45 minutes with an intermission so they can sell ice cream and snacks. I haven't been to the movies since.
Years ago I went with a friend to see Transformers. He is an absolutely massive Transformers fan, so he insisted that we arrive no later than an hour before the movie would start. My girlfriend and I met up with him and a few others 90 minutes before, got our tickets, went for dinner, and arrived about when the commercials ended and the trailers started. My friend was completely out of it; in his mind we had missed part of the experience. I think he's the kind of customer the theaters want.
For me, I just get my ticket in advance and show up about 20 minutes late, seems to work out fine.
Movies with a compelling story keep your attention, but with action, your brain can only take so much stimulation. Unless it has a really compelling story, your mind will wander.
Oh man. I wish more directors knew this. I remember watching one of the Fast and the Furious movies. It had something like a 30 minute long action sequence. That is long enough for the adrenaline to wear off, and now it’s all just tedious. I think I literally yawned during what should be edge-of-the-seat action. I’m convinced what it needed was more breaks in the action, or at least more changes in tempo.
It's incredibly difficult to keep a viewer's attention for 3 hours without them taking a single break, even if the movie is compelling. I was starting to get antsy in general around the 2h 30m mark.
3 hours is probably getting towards the upper limit but it's hardly historically anomalous. Lawrence of Arabia was over 3 1/2 hours and other "epics" of that era were often in the 3+ hour range. (Though movie theaters often had intermissions which most would probably never do today.) Live theater (with an intermission) is often in that range as well although many newer plays seem to be more like straight through 90 minute length.
ADDED: I originally read sittings as parts but if you mean intermissions, I agree. That's what long movies used to do and it's pretty standard for live plays much over two hours.
Definitely with the intermissions. But I really mean that I often like to stop a long movie, think about it awhile, and then come back to it later. This is how short TV series work, for example. This is also how books work, by their nature. It's a kind of "intellectual digestion" - an intermission is good, but a night's sleep is even better.
Yeah, but no way really to do that for a theatrical release. You can have a Part 1 and a Part 2 spaced a year+ apart if the total length is 3 1/2 to 4 hours a la Dune. That's not what you're asking for though.
Agreed. When I went to watch Mission: Impossible, I pulled out my phone once, right at the beginning, to put it on silent. My friend might have checked his maybe three times for texts or the time.
All I'd like to add to this conversation is a request to not treat all humans as "the average human" as deduced by any scientific study.
I must admit that the fifth vibration finally got me to check my phone during Oppenheimer. Biopics are generally accepted to have long run times compared to action flicks, though.
> Very, very few action movies need to be more than 100min.
Not to mention an action scene gets really really boring after about 2min. Just more and more of the same. Then add the usual hyperactive editing popular in modern blockbusters and it's like "Yo can you slow down a little? I'd like to actually see a punch happen sometimes"
New movies are just long for no reason. I feel like 1.5hr was typical before. Longer ones like Star Wars 5 and Lawrence of Arabia were worth it, but now everything is 2h at least. Marvel movies are the worst offenders, often 3hr with maybe a decent plot the first hour followed by filler action.
Also, somehow the art of making dialog audible was lost in the 2010s, to the point where everyone uses subtitles at home now.
>Also, somehow the art of making dialog audible was lost in the 2010s, to the point where everyone uses subtitles at home now.
Yeah, I don't have the world's greatest hearing but I basically always keep subtitles on. I forget what I was watching a few weeks back, but I was basically continuously cranking the volume up so I had a chance of understanding the mumbled dialog, and then down when some avalanche of sound threatened to destroy what was left of my hearing.
My regular reminder that 2001: A Space Odyssey was only 2h 23m, and that movie had an intermission at the cinema. Tess of the d'Urbervilles? 18 minutes longer than M:I, and that one had an intermission, too. Gandhi had an intermission, but $DEITY, no human bladder can hold that much, so it was kind of required.
Your Marvel movie is approaching 3 hours, and no intermission? Fuck that, I'll watch it at home while your CEO whines that no one goes to the movies anymore.
(Wait a minute, Oppenheimer is three hours long? Sorry, ChrisN, but that one's getting streamed in my living room, too.)
> (Wait a minute, Oppenheimer is three hours long? Sorry, ChrisN, but that one's getting streamed in my living room, too.)
That's a smart decision. Oppenheimer was a good movie (I give it a B+), though very long, and the director (or editor?) gives the audience very few moments to take a breather. I'd have enjoyed it more if I knew I could pause to take a leak or fast forward through the boring part about the affair.
Weren't intermissions originally because movies were on physical reels, which had a maximum length, so they needed an intermission to change reels?
I think this says more about the quality of the movie than it does people's attention spans. A movie that doesn't keep you sucked in until the end is just a bad movie.
Look at Avatar (2009) vs the sequel. The original was ~3 hours long, and a pretty ok movie. Kept the attention of everyone in my family from beginning to end. The sequel was 15 minutes longer, but we gave up before the end of the first hour because it was atrocious, particularly the paper-thin "dude, bro" writing.
The look of the original Avatar was something you had never seen before but the story was essentially a reworked version of the overrated Dances with Wolves. The latest Avatar is basically a remake in an aquatic setting and it has none of the visual freshness that the original had. I dutifully watched it when it hit whatever streaming service but it was sort of a waste of time.
~3hrs is a long-ass movie! I'm fine with epics, but, yeah, in most circumstances I'm looking for a ~2hr experience, especially if I'm having to sit in a theater. I don't think it's an attention thing; I just don't think most movies require that long to tell their story and/or show a few cool action set pieces.
M:I:DR has some very weird editing in the train sequence as a consequence of the production wildly over-scoping it originally, so I'm not surprised some people would end up with their sense of how much more movie is left totally thrown off.
> I do focus on stuff for extended periods of time like watching a movie, reading a book, working on something, etc.
Note that watching a movie requires far less attention than reading a book. Movies, first, are compressed stories (movie scripts are less than a hundred pages) where, second, nothing is left to the imagination. I'd bet that people read far fewer books today than before video streaming became popular, or before TV went mainstream. (I rarely watch movies, but I mostly stopped reading books once I bought a smartphone.)
Yeah, an e-reader is better than reading on a tablet or smartphone, since the limited OS means there are fewer distractions. But for me, the mere existence of a smartphone in my pocket triggers my Web addiction.
I like the theory that the reason attention span has declined is because the demand for quality has hugely increased.
Since there are so many blogs, films, books and videos out there, we no longer want to waste time on things that don't entertain or provide value to us.
If in the past you might've given in and read or watched through boring parts because there were fewer options available, now people demand less B.S. and instant value. Because if you don't provide that, there's someone else a click away who will.
Lord of the Rings. An absolutely genre-defining work, but if you read it, it's really friggin slow and not at all compatible with today's attention spans. It's not a demand for quality, it's a demand to be endlessly entertained without having to invest the time or have the patience to wait for a backstory to unfold. Which is fine, I'm opening my phone to doom scroll for a second too, I just don't think it's quality that's being sought after, but instant-gratification, dopamine-inducing entertainment. It's bad. Until I took the time to retrain my brain, I couldn't read books because I didn't have the attention span for them.
I think it's more a matter of investment vs. payoff. I read TLotR from beginning to end once and I liked it, but it's such a time sink for so little payoff that I wouldn't do it again. There's so much else I could do with that time instead. I'd rather put the movies in the background while I do something else over the course of nine hours, or maybe reread a specific bit I liked. It's not about instant gratification, it's that we're no longer constrained in our media selection. 70 years ago you might have reread the books right after finishing them because you had nothing else to do. That's not true today.
The Return of the King in particular has a real structure problem. Multiple endings after the actual ending. And then really the final wrapup/epilogue is in the Appendix. The movie tried to clean a lot of this up--most controversially by axing the scouring of the Shire--but still didn't wholly succeed.
To be honest, the scouring is one of my favorite bits because it shows how much the main hobbits have grown over the course of their adventure. I think the multiple endings thing worked better in the book for multiple reasons. For one, you could tell at any time that there was still a considerable chunk left to go, so the narrative didn't create any false expectation that it was about to end. But also, unlike in the movie where it does the usual swelling soundtrack and post-climax wrap-up like it's about to finish but then just continues and does it again, I feel like the book doesn't do a literary equivalent of that at all. It doesn't feel like multiple endings, but rather multiple conclusions to separate story arcs, which is something the books do all throughout. The book is just forced to cram several right at the end, because it has to finish somehow. It's unfortunate that the structure didn't translate well to the big screen, at least not without cutting things out.
I'm not totally convinced, but it's been a long time since I read the books, and I at least accept that a book can weave together multiple threads and story arcs more easily than a film can.
I never read lord of the rings. I read the hobbit though.
The Tolkien work I struggled with was The Silmarillion. I understand it’s not much like the others and was a posthumous publication, but middle school me struggled to stay interested enough with the way it was structured.
I believe the Silmarillion was really more of a reference-type book originally, so yeah, not particularly something you sit down and read all the way through.
Not GP here, but I've gone through a similar brain reset exercise recently. There's no good way to do it, because it's essentially just dopamine deprivation. Caffeine withdrawal, internet withdrawal, gaming withdrawal - whatever stimulus you're using, you take it away and suffer for a while.
Quitting my corporate gig has also been massively helpful - our big tech overlords were committing Guantanamo level atrocities on our minds via Slack overload, and I will never return to that kind of environment. I'm barely exaggerating.
If you're a podcast kind of learner, Huberman has an episode on this "dopamine fasting" concept.
Read a bit in bed before sleeping, make this a daily habit; leave all electronic devices outside of your bedroom. No screens in bed; let your mind learn to accept that.
Start with books you can focus on now, gradually move on to more complex works. Don't feel bad about picking a genre which really draws you in. You don't have to read Dickens on day one (although he is a great writer, older works ask a lot in terms of focus and frame of reference, including some understanding of society back then and there), and it's fine to indulge in Young Adult fiction if that works for you (Harry Potter is not too demanding to read, and pretty good too).
I don't necessarily buy that. TikTok and a lot of content on Netflix, etc. is cringe, or some kind of "grub TV" or "trash TV" if you will. That doesn't stop humans from consuming it to distract themselves from what they need to be doing, maybe, or from what might be more useful and wholesome. To be sure, trash TV does have qualities that make people watch it, just not necessarily great ones.
It's not like people are reading high quality things, though. They are watching TikTok. There is no demand for high quality work. People aren't reading gonzo journalism books; they are eating up soundbites and clickbait and are very pleased about it.
I dunno. I think the quality of movies and TV has gone down to a large extent. The average script is so bad that basically mediocre dialog gets praise. Fights and stunts are better than before, but other than that, the last few years of production are just meh.
I'd measure attention span by the length of time one can dedicate to a task one enjoys during the day.
It would be entertaining maybe to compare high school automechanics programs over years, for instance, as one learning culture.
Btw, I wouldn't be shocked if it found our attention spans are fine, and what we're dealing with is environments that are getting harder and harder to do sustained activities within - because so many different demands on attention are going on all the time.
While not without its problems (which is to be expected considering the impossibly complex subject matter), it does a fantastic job detailing how the obesity epidemic is not some kind of oversimplified moral sin issue where we all suddenly became gluttonous (“just eat less!”) and slothful (“just do more”).
The major thing I find lacking is the omission of just how much the many different types of fatty and amino acids also act as major chemical signals that ultimately[0] affect whether your body tries to burn energy “wastefully” for heat, or tries to store it. The article leans toward environmental chemicals as being the primary effectors subtly acting on our metabolism, but the very food we eat is itself a massive load of chemicals with complex effects that we can’t ignore.
[0] The next time someone wanting to feel clever and smug about it says “calories in, calories out” or “muh thermodynamics”, send them this: http://biochemical-pathways.com/#/map/1
Those posts have grave errors, which the authors have been made aware of long ago but have refused to fix. For example, they literally made up the claim that wild animals have been getting more obese (there is no evidence of that). See this post I wrote for more details: https://www.lesswrong.com/posts/NRrbJJWnaSorrqvtZ/on-not-get...
My attention span has grown. I've watched multiple hour interviews by a number of people on a variety of topics I've found interesting.
My tolerance for narrative disguised as news has vanished. I stopped watching TV and mainstream news. I deliberately follow some people I strongly disagree with, to keep from falling into an echo chamber.
Curate your sources, and it's a wonderful world. If you let the algorithm decide what you see, it controls you, eventually.
I feel like my own attention span has declined quite a bit, but I notice that younger people (including my kids) turn to Youtube for education - I've _never_ had an attention span that would allow me to sit through, say, a video programming tutorial (dear God, give me a book), but it seems to be the preferred approach for the next generation.
I don't know that it's much of a positive tho, basically going from active to passive learning. Sitting through videos instead of reading books (much more efficient too IMO) seems like a prime example of attention span erosion to me.
And on top of that they aren't realizing any of the gain of the modern technology here. They buy a $1000 laptop, with $100 a month internet plan, to go watch a 15 minute youtube video with 4 minutes of discrete ads and 4 minutes of sponsor product mentions cut into the actual video, all to gleam a fraction of the understanding that a single page of whatever someone in their position a few decades ago would have checked out for free from the library. Whats all of this effort and technology even for then? The only people who eek out a gain here are the advertisers who now have put themselves front and center of what used to be an entirely unmonetized trip to the library, a known good source of truth.
Well, local libraries don't really have that much in the way of good academic texts - you have to have access to a university library, or be willing to fork over $60 of your own for a print copy, for that. But I get what you mean.
You aren't getting anything from a youtube video thats in academic texts either. But you can certainly get academic textbooks from library systems. Are they on the shelf at your local library branch? Maybe not. Can you order it from the library system for free and get it sent to that branch in a few days? Sure. I just looked online at what one of the seminal works for a field I don't understand but might be generally useful to homeowners or people seeking to upskill: carpentry. I found the title "Building Construction Illustrated" recommended a few times. My library system has just about every edition of this book published over the decades, in good supply, even in ebook form. This is what I mean with how the library system can easily serve you very high quality information on a variety of topics.
This article was incredibly interesting to me given the chain of events that have unfolded at Reddit.
I used to be a hardcore Reddit user. On the site daily for ten years or so. While I didn't spend hours on the site per day (though there were some days where I definitely did), I _had_ to visit it every day or I would be irritable.
Note that I felt like this despite unsubscribing from every subreddit in favor of multireddits (so that my front page would be empty), not enabling notifications on the site, disabling infinite scroll, and having a very limited set of subs in my multis, few of which I'd consider classic time-wasters (like /r/consulting, which is mostly consulting memes, or /r/programmerhumor).
A problem with Reddit (for me) is that despite doing all of this, there is essentially infinite content, and all of it is interesting!
Since I also browsed Hacker News, what ultimately happened was that I would speed-run HN and Reddit in an attempt to be "caught up" during the "limited free time" I had to myself.
Like many here, I quit Reddit cold-turkey a month ago when Reddit committed to their new API pricing strategy and Christian committed to sunsetting Apollo. I figured that it was only a matter of time until Reddit sunsets old.reddit, I had no interest in being forced to use new.reddit, and all of the subs I cared about were (rightfully!) going dark and losing subscribers in the process.
Since I browse Hacker News via skimfeed.com (excellent curator) and there are, at most, 15 posts on there at a time, I don't spend nearly as much time on "Internet things" as I did before. However, my tolerance for reading (and writing!) long form like this has gone way up! I think this is due to me not feeling as if I need to split my time as aggressively as I did before.
I'm also much more chill about "needing" to browse daily. I still hit up HN every day, but because there is so little to catch up on, I feel like I got much more of my time back now.
(One more thing. Reddit, by and large, rewards short, quippy, knee-jerk responses. Longform is typically skipped over. You don't have to think very much to use the site. Ironically, it is very easy to scroll through a Reddit thread while also feeling rushed to speed-run through it to get caught up.
Very different from HN, where the content is very interesting, and longer comments tend to be upvoted more.)
The data by age show a consistent decline in attention:
34 to 55 year-olds were significantly lower than either 18 to 34 or 55+
This is interesting because it fits with none of the hypotheses and none is suggested. What could it be? To me it suggests that working and/or raising children have negative impacts on attention span.
I feel increasing frustration every time I see a graph with a clear upward trend being waved off as "probably just better awareness". I am even more frustrated when it's a graph tracking some mental health issue. If adult ADHD numbers have doubled, that should mean something.
That's all well and good, but it's a plain ol' fact that it could all be from increased awareness. If someone is saying "no, it's too significant a change to be just awareness," it's kind of a nothing statement.
What's the point of bringing awareness to undiagnosed mental health issues if we brush off signs that it helps people get diagnosed as a WORSENING of mental health?
ADHD has only been a recognized disorder for ~40 years. It's only been accepted by the medical community for about 5 to 10 years, depending on the country. It's hardly surprising that diagnosis is increasing for something that has essentially been ignored by everyone until very recently.
Keep in mind the first person diagnosed with autism died last week.
Cynically, I'd say it means that if you want to make staff, Adderall is going to give you a considerable advantage over the next guy, all else being equal. Anyone smart enough to be in the running for staff or higher is smart enough to google ADHD symptoms and present with them at a psychiatrist.
ADHD numbers have doubled because people like drugs that make them productive, and the victimhood points you get for being "neurodivergent"... a.k.a. it's all fake.
This idea that people are getting drugs to acquire victimhood points is so detached from reality, I am pleading with you, please spend less time on social media. It is not healthy and is changing your entire worldview.
Your first swing at a reason doesn't make any sense either, as the rise in ADHD numbers is not limited to adults. Even if you account for those "just wanting to be productive" by apparently ruining their entire ability to regulate dopamine naturally, the numbers are still increasing.
The number of people in my life diagnosed with various disorders, with accompanying drugs, via short meetings with doctors they've met once is staggering.
Re: children, do you know many parents? I know many, and many of their children are medicated for ADHD. They are usually active and healthy boys who don't like sitting still in class or have grown accustomed to video games and screen time, so when they don't have those things, they flip out. On-demand entertainment and smartphones have created a screwed-up reward system for young children. ADHD medication is usually the prescribed solution to that problem. By the time they reach middle school, they have no chance. It's an uphill battle to stay ahead with school and maintain all the hobbies/extracurriculars. I blame the parents, and while they're responsible, the FDA/"doctors"/pharma companies facilitate it.
I'm not saying people are faking things like schizophrenia or bipolar disorder. Mostly ADHD, anxiety, and depression disorders. Seems like 1/3rd of people I interact with have one of those, or all of them.
All your linked article suggests is that some are turning to social media to find explanations for their behaviors. Nowhere does it suggest that there is a notable increase in fake diagnosis that results in accessing pharmaceuticals, or that anyone takes a fake diagnosis more seriously than the traditional route.
There is zero evidence that there is an increase of people who are diagnosed via short meetings and immediately drugged especially when the drugs are under far heavier restrictions than say Ozempic.
The rest is anecdotal, just as easily as you say you've met so many who have been diagnosed and drugged, I can say I have not met many. Most of them seem very anti-diagnosis even when it's clearly affecting them due to this stigma. Yet you'll discount my position because it's anecdotal.
If you think your worldview holds water, why are there so many studies proving the efficacy of both non-stimulant and stimulant drugs in trials that are double-blind, placebo controlled, and using parallel groups? (eg https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3000197/) Shouldn't it be identical to placebo for reducing ADHD symptoms?
Far more interesting to me is how you've met so many neurodivergent people. Many studies have talked about subconscious biases of those who are neurotypical (eg: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5286449/) which goes both ways, as we gravitate to those most similar to us. Unless you're a social worker or creating statistical anomalies in your head of course.
Are there people out there who have been drugged by their parents without having a real problem? Probably. But these are parents who would go through a hundred doctors to find one who can be bought, as now they would have to specifically avoid the plethora of alternative solutions doctors offer due to ADHD stigma (stigma noted in papers like https://sites.ucmerced.edu/files/laura-hamilton/files/metzge...).
This is a fool’s opinion. Spend some time reading what people diagnosed with ADHD say about their disorder. Medication merely makes them able to keep up some of the time.
I've heard and met many - it's just hearsay to me. Fools believe hearsay. Until there's some advanced blood test or biological marker, the diagnosis is complex social engineering or, at best, pseudoscience.
The perception that ADHD is nothing more than the inability to "sit still and pay attention" is very wrong and very out of date. Before you comment further I would strongly recommend you watch a series presented by one of the leading researchers into the neurological basis of ADHD, Dr. Russell Barkley, called "ADHD: The 30 Essential Ideas Every Parent Needs to Know" [1].
I struggled with many of these things -- primarily time blindness, executive function and impulsiveness -- for most of my almost six decade life, and finally understanding what was happening -- and getting help for it -- has been life-changing.
As one friend put it, he would cut off his left testicle if it meant having working executive function. The idea that I -- and friends and acquaintances who share similar experiences -- are faking it is ignorant and insulting. That you do not suffer is a gift; be grateful. Maybe don't be so dismissive of the struggles and suffering of others.
I'm curious what the shape of the distribution actually is in terms of attention span, and if there is skew.
I used to spend hours and hours poring over math problem sets. There isn't a 'tiktok'ed way to be good at solving linear signal problems by hand, as far as I can tell. It is simply a matter of focusing and practicing, being able to recognize patterns and take the appropriate steps. I'm not saying it was a useful skill to learn, but it was a difficult one to pick up, at least for me.
While I do feel that my attention is often "pulled" by multiple things, the times I am truly able to focus are when I have sole responsibility, rather than many conflicting demands. In that sense, it's less that my attention span has shrunk and more that my attention is drawn by multiple things.
> In 2004, in our earliest study, we found that people averaged about one hundred fifty seconds (two and a half minutes) on a computer screen before switching their attention to another screen
Maybe it's just me, but I look at several different screens each day (Phone, Workstation, Raspberry Pi for home network logging using Pi-Hole, and a tablet). Not to mention our main TV. This is all normal for people these days. One screen for everything doesn't cut it, unless you are very disciplined and force yourself to use one device for everything, but I don't see the main purpose of that.
Maybe if you're running a business with tricky logistics, you need to be pinned to a single device and need real-time interaction with your systems to get things done?
I don't know if they have or not, but I just spent the last few minutes reading about the neologism ephebiphobia and feeling vaguely annoyed by the assumptions and general tone of the wikipedia article. Now I'm closing this tab and going to get a burrito.
I remember reading a study about a year ago (I've been trying for a while to find it and post it here, without luck). Basically, it observed that people, specifically kids, now have instant access to the information they need, so when something takes longer than expected to absorb, it gets passed over for some faster method of absorbing the info.
The gist of the study was that it wasn't so much that attention spans are in decline, but patience for consuming information. I am not of that particular generation, but for sure, if I'm looking for some info these days, I get frustrated if I don't find it quickly enough.
I listened to this podcast over the weekend on NPR's "How I Built This" (though it's from early-to-mid 2022). It's an interview with Johann Hari, the author of "Stolen Focus". The book sounds like it's worth a go, even at close to 400 pages!??!?
This is entirely anecdotal, and as I have ADHD this may not be the norm; but I have noticed throughout my life so far that my attention span is a product of what I pay attention to.
If I am primarily consuming content to enrich myself or critically analyze something, my attention span tends to be longer for a period of time. Conversely, if I am consuming content to socialize or out of boredom, my attention span shortens significantly.
In both cases, there seems to be a “cooldown period” of at least several days before my attention span reverts to somewhere in the middle. I wonder if others share in this phenomenon and how common it is.
Yeah I, subjectively, have experienced the same. But with the same caveat about having ADHD. What I find more substantial is my stress level correlates with what media I consume. Books, long form writing, or even just any text medium (as opposed to video) tend to be more relaxing and allow me to focus better.
I wonder if stress is the causal mechanism for focus here? Maybe not entirely but a proportion of it.
I think speculation is allowed here since the author was unable to draw substantial conclusions either. But did distill a lot of useful information together.
There's also the problem that "mindfulness" is itself obfuscated to mean something specific about autonomy, individual power, and personal freedom. This is a problem because it assumes attention occurs within a fixed cultural and economic framework as an individual commodity that can be traded. It's very transactional.
The practice of attention beyond what the mindfulness movement has co-opted includes attention to the ethical and ontological structure underlying our experiences. It is subversive in the sense that deep attention requires questioning assumptions. That's something the purveyors of distractions would prefer to discourage.
For me mindfulness is about being present and aware of myself (including my sensations, thoughts etc.) and my surroundings (arguably the same thing as myself, since we are the ones perceiving) in the present moment.
I genuinely don't go beyond that. Otherwise it overcomplicates itself, which is also when you tend to lose it in the long term.
Being aware is the action. It's not a particularly complicated thing, in 99% of contexts it's about looking at things through a curious lens. Paying attention. Observing, without judging.
From an implementation point of view, I evaluate myself based on how mindful I managed to be during specific moments/activities during the day i.e. brushing my teeth, working etc.
I think where a lot of people go wrong is that they continue to evaluate their day based on what they achieved or what goals were met. That's one of the quickest ways to dissolve mindfulness and forget about it entirely.
People hack themselves to find the quickest release of feel-good chemicals in their brain but most do it at the expense of not realizing that it numbs their entire existence.
And now it’s more prevalent than it ever has been. Grabbing a phone is a nice way to stay in the shell that you have built around yourself and pretend like you have control over your life, but no one has control over their lives.
The more you build walls of ideas around yourself the easier it becomes to make you discombobulated with the slightest gust of wind.
It’s honestly pathetic we live this way and continue pretending it’s normal and that’s how it should be.
>> Grabbing a phone is a nice way to stay in the shell that you have built around yourself and pretend like you have control over your life, but no one has control over their lives.
Probably a major cause of the increased levels of anxiety younger generations have. Simple things like blocking out the world with headphones, or standing in line engrossed in your phone reduce the likelihood of somebody interacting with you in a way you don't control. Avoidance breeds anxiety.
And the problem is that the more closed you become, the more amplified are your experiences which you deem hostile.
A member in my family is like this and I try to leave her alone for the most part, but there are opportunities where I try to remind her that no one is out there to get you.
Just do nice/good things for yourself and the rest will present itself.
Exactly. And sometimes you will enter a cave that is so full of terror it might knock you out for a while, but there are two things to this:
1) you actually made it that far. Life doesn’t treat you like shit when you are trying, it simply ups the ante and then you see how far you can go.
2) you now have an opportunity to learn from yourself. How did you end up in this place and how bad was it really? How would you feel about going back and how would you approach the situation this time?
And the treasure you find is wisdom that you can then pass onto other people, particularly those close to you.
I have been fortunate to meet people who have had a lot of experience with this but I consider myself an absolute apprentice even if I have dared to take a few big leaps.
Many people's lives are not that good; there are many people with bad jobs they hate, or partners they can't stand. It's easy to imagine why humans act this way.
Your comment resonates a lot, at least with me. I don't enjoy my life, any distraction I can get and any negative interaction I can block is a positive for me.
I don't care if someone finds it problematic or pathetic, at the end of the day I just need to get through the day, something that sometimes is really, really difficult.
I really don't care for extremely judgmental people like the one you replied to, they either don't get it, or somehow figured our different ways to cope and consider themselves far superior to the rest of us.
Because I happen to come from the gutter like most people. I was manipulated, bullied, I was a drug addict and I did a lot of other dumb shit. I am so extremely experienced in how gullible humans beings are it’s honestly sometimes hard to live in this world.
And if you’re wondering why it’s hard, it’s because those experiences define who you are. Do you think I chose them because I thought it would be fun? Nah, those are the things I had to experience and learn from.
Most people don’t want to learn from their mistakes because it means you have to give up a lot all at once. How are you going to be a decent human being if you can’t give up the old to let in the new? Your plan is to spend the rest of your life feeling sorry for yourself?
>Your plan is to spend the rest of your life feeling sorry for yourself?
You are the only one feeling sorry for other people in this conversation.
Even people with what would be generally considered great lives chase dopamine. Believe whatever narrative you need to, but in general this behavior is not that deep or problematic vs an actual drug addiction or real problems. It seems you have an axe to grind with anything "self-control" related.
You are implying that my argument isn't from experience when it is. My argument is that you are biased and projecting your own self-control problems on others and treating your solution as a universal truth to be learned. You are also conflating people looking at their phones to your previous drug addiction.
>And please don't use what a person has said
Ok, I'll remember to not include any topics or details anyone ever brings up in conversation. I don't think any more or less of you based on your past you mentioned. I'm just attributing it to your bias.
Criticism is not disrespectful. I should probably stop replying to your messages however, this is going somewhere unhelpful so I apologize for that.
All this being said, if your disdain for that behavior keeps you personally away from bad behavior, I could see how promoting that strategy internally and externally is important to you. Disdain can be a useful tool.
No, the brain is self-optimizing for lower calorie usage, thus the state of affairs is normal. The actor for the interests of all (the democratic state) used to have the task of pushing everyone to their limits, for the benefit of all.
Long-form content is still around (e.g. https://www.lrb.co.uk/), but the ruined intellectual commons is unable to parse it.
Lowered attention spans may be widely regarded as a net negative, yet I propose to reframe the conversation:
• Call it "whiffreading" after the term in the Book of the SubGenius (1983). A way to intuitively, instinctively and logically determine if you can get the sense of something quickly. If you sense it is worthy, you will invest the time in it. If not, and you feel you are wasting time, you skip it.
Quicker abandonments, glossing, and skipping ahead are all natural human adaptations to the increasing and constant bombardment of data and stimuli we are presented with. This is why we want the "tl;dr." This is why you want a 1-minute explainer video instead of a 30-minute or 1-hour lecture.
Now, if you catch someone's attention, they'll follow you down the rabbit hole and will keep reading or watching. Maybe for hours. Look at TikTok lives and Twitch streamers. People doomscroll, next, next, next... but as soon as they encounter the right content — whabam! They will stick there until the livestream ends.
Similarly, look at Hacker News itself. If a 1-sentence topic grabs your interest — whabam! You'll read the article (which could be thousands of words long) and you may engage in the threads of conversations, which could eat up your day.
Conversely, if a content creator bores people, or the audience starts to sense that what they're talking about is sus or totally outside their interests, they'll just drop it. A post on Hacker News will sink like a rock in the middle of a very deep ocean.
Again, I propose this is an adaptation to the environment. Time is speeding up. The onslaught of content trying to grab your attention is relentless. So this is the natural result.
I agree. I would also posit that for some, it may be less a case of 'attention deficit' and more 'attention prudence'. There is way too much garbage content out there, and information overload can easily lead to processing overload. Not speaking to the very real medical conditions out there but to societal generalizations - I place a very high personal value on my time and have little tolerance for wasting it on content I feel is uninspired and/or lacking depth or any sense of meaning to me.
Then there is the issue of there being a high quantity of damn good quality content out there (curses, HN). In that case, it's just too much to ingest while working to maintain other forms of focus in life.
I immediately think of the endless marketing spam notifications that result from daring to allow an app like Uber to send push notifications at all.
Attention spans are an umbrella representation of something that is far more granular - complexity of task, context length, prior know-how (which affects complexity at a personal level), environmental noise.
If our cognition is shaped by information, then attention span is, by that very fact, a function of the information in our environment. I quite subscribe to the idea that noisier environments lead to a decline in attention spans.
Yes. Our brains have been rewired. We consume information in smaller bits - the brain can no longer focus on long-form content. I used to be able to read a chapter of a book - now I can't. That's NOT an accident, and I didn't somehow acquire ADHD at 40 (and I will argue that most people who are hogging Adderall right now didn't either).
Yeah I recently started reading again in my adult life (highly recommended btw), and I had to retrain my brain to sit through the slow drip of information. First a page, then a couple, finally a chapter.
One thing that stood out to me is that books are incredibly verbose. You wade through an ocean of verbiage on the given subject and saturate yourself with the knowledge - and in the end you come out far more of an expert than any internet perusing could allow.
I know my attention span has declined. What's difficult for me is that I work in an IT role, so it's very interrupt-driven, through tickets or direct messages. (Got a Slack message while writing this message!)
One of my favorite things about cycling is it forces me to stay focused while I'm on the bike, I can't look at my phone or do something else.
My pet-hypothesis is that part of the problem isn't the tech per se, but that office workers do parts of jobs that would have been the specialty of several different people before, say, the 1980s. And not only that: they switch between those, and aspects of their own job, much faster than before.
A person who had two kinds of tasks to worry about in a given day in 1975 might have twenty today—technology didn't so much eliminate work as allow it to be more concentrated. Someone's still doing the work, it's just five people instead of twenty-five, and none of them is as focused or specialized as before. Everyone's a secretary now, in other words—plus whatever else they do. Everyone is the mail room. Little bits of jobs like project management or plain ol' management get devolved down to ordinary workers. And so on.
My biggest problem with attention span is my eyesight. It has been deteriorating badly from myopia to having an added farsightedness, so that there is only a small range where I see clearly with or without my glasses.
I can't motivate myself to get new glasses with progressive lenses because apparently they bring their own new problems.
I find it incredibly difficult to focus on movies while at home, I like being forced to not use my phone, etc by being in a theatre. I think this applies to most things; I don't know if attention has been declining so much as there are so many other things you can do at the same time which limits your attention.
Why is attention span even so important? At a survival-mechanism level, shouldn't immediate threat priority assignment be of more value long term than attention span? Since we spend less time focusing on one thing now, maybe our survival performance as a species increased. What am I missing?
I think I just know what the thing is going to be about. I just need to read a headline. If I think there's more to it I force myself to read a bit. Something that comes with repeating patterns over and over again for decades.
I'd guess that the shocking "Yes, 65%" in the HN title is what catapulted it into the #1 front page spot -- but that bit isn't currently in the title on TFA (nor in the 2 archive.org captures).
This may be partly due to television. TV gets eyeballs by providing a constant stream of quick cuts that pander to the brain's addiction. See Neil Postman "Amusing ourselves to death".
Do you think children should be in a classroom? I tend to believe that children should be outside personally, with very little indoor activity. Even if it's snowing. I learned way more by doing my own thing, whatever that is, than I ever did stuck inside of a classroom having a teacher read word for word from a textbook.
I think there are different styles of learning to be considered; some don't learn the same way as others. And that should be considered as well.
But generally I don't think attention spans have anything to do with classroom. People were not paying attention in classrooms WAY before cell phones were invented. The problem is the "room" part of a classroom in my opinion.
"I learned way more by doing my own thing, what ever that is, then I ever did stuck inside of a class room having a teacher read word for word from a textbook."
And I learned a lot more by reading stuff from Wikipedia at my own pace than I learned reading textbooks. So, good point.
My attention span is extremely short; I can't even sit through a 5-second short video. But I think it has benefited my ability to improvise music (which is the only thing I care about now): it forces me to jump from idea to idea and key to key quickly, and it kind of formed my style.
I think this is just an unusual attempt to formulate their guesses more precisely without relying on linguistics. You could see it as somewhat earnest, as in they can be shown to be wrong to a certain extent, and can't try to weasel out of their specific claims. In this kind of meta study without a quantitative result, I think it conveys the intention rather well.
The author tries to find specific evidence to answer whether individual people overall had their attention span reduced, in the time period where internet + social media became widespread. They go to some effort to define terms and review existing literature.
They do not end up with a specific way to test for attention span, and they do not find definitive answers in existing literature. However, everything, even if somewhat flawed, points to some form of reduced attention spans.
More interestingly, each element of the question is kinda questioned and widened during the article (“is there such a thing as attention span?”, “is each measure of shorter time spent on an activity actually a measure of shorter attention span?”)
Imo, what the intro touches on but fails to explore in sufficient detail is what role gratification plays. I think so much time is spent on quantifying existing results that, when trying to design a good new test, not enough time is spent applying game theory. I feel like there could be some good ways to properly quantify the link between gratification and attention, but I don't have the game theory / psychology background to go any further.
Anyone else read the headline and go straight through to the comments?
I think that HN, which has added deliberate friction elsewhere on the site, should consider hiding the comments link until you have clicked through to the article.
Recognize that your most focused mindstate is first thing in the morning. Do tasks that require the longest attention span right then. Don't turn your phone on until they're done.
Generally don't keep your phone in your pocket, put it in another room on silent. Disable lockscreen notifications for all apps but the essentials. Do the same to your computer. No screen time 2 hours before bed.
Do long cardio workouts outside.
Recognize that it's OK to be bored, you don't need to fill every gap by looking at your phone. It's OK just to let your mind wander around.
You give up looking at screens in your free time; that's the only real answer. It sucks, it's hard, and I've had no success at it, but that seems to be the key.
Curb usage of attention-seeking technologies. Shift your habits. Start by acknowledging the problem and its sources. Distance yourself from them psychologically, then physically.
You just pay attention longer. It takes effort but that's all you really need to do. For example, go observe and pay attention to a plant for 45min. Think of it like a workout, it's uncomfortable and a PITA but that's all there is to it.
Whew! I saw the original title about attention spans declining by 65% and was worried, but now I see the updated title and consistent with Betteridge’s Rule can be confident they have not been actually declining. Thankfully I don’t have to read the article as it seems long.
Edit: Okay. That was a "low effort" response. So I'll expand. That we even have such an abbreviation makes me think that maybe (contrary to Betteridge's Law[0]) the author of TFA might be on to something.
With so many different things fighting for your attention, it can be difficult to stay focused. It takes practice and discipline to shut out other things and maintain your focus on something. I don't claim to be expert in doing so, but I do try -- and fail some of the time.
I believe that maintaining focus/attention is a learned skill. One that isn't considered important to acquire/teach.
Once upon a time, there were fewer immediate distractions which enabled (forced?) us to focus our attention for extended periods. Nowadays, there are so many things fighting for our attention that it's more difficult to learn that skill.
Is that good or bad? I suppose that depends on your point of view.
It's not enough to just base this on screens. It's also workplace environments.
Even something as "simple" as working in a pizza place, say Domino's, is increasingly a frantic assembly line where orders can come at a breakneck pace through internet applications.
You'll still be working somewhere perpetually understaffed that gives you basically no training but expects you to pay attention to multiple threads at once, all day long. You're a delivery driver, but you're also expected to do kitchen prep, take phone orders, take in-person orders, do dishes, cut and box pizzas, help on the prep line and generally be on-call for anything else that needs to be done in the store.
When your workflow is literally constantly being interrupted by other parts of the workflow, because you're always expected to be paying attention to multiple parts of the workflow, you lose the ability to focus on just one thing for an extended period.
Anyway, that's my two cents, it's not just social media, phones and screens. It's also a way of life in America, to be expected to manage numerous expectations all at once and always be on your feet moving. If you can't do it, you're likely to lose your (shitty) job, so forcing yourself to be able to focus on numerous things at once without giving your whole focus to one thing is literally pounded into your head in your workplace.
Yep. I worked at a software shop where they liked developers to have roughly 6 simultaneous projects at any time. The constant context-switching drove me insane. By the time I 'loaded into mental RAM' all the context for one project, it was time for a meeting about another project.
For an attention deficit programmer maybe this is a bonus. Not for me. I like to focus in on the task, take notes, figure it out, and get it out the door and off my plate. I don't like to nibble on code.
Most jobs don't allow for deep focus or long-term thinking. How do we expect people to be good at it without practice, while we encourage the opposite behavior?
I'm calling it: in 5 years, emails will be read as bullet-point summaries of the actual formal email, which the sender had an AI write from quick notes in the first place.
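For what it's worth, the round trip is already trivial to sketch. Here's a toy Python version; llm_complete and the two prompts are hypothetical placeholders I made up, not any real API, so treat this as a sketch of the idea rather than a working implementation:

    # Hypothetical notes -> formal email -> bullet-point round trip.
    def llm_complete(prompt: str) -> str:
        # Placeholder: swap in a call to whatever LLM client you actually use.
        return f"[model output for: {prompt[:40]}...]"

    def inflate(notes: str) -> str:
        """Sender side: expand terse notes into a polite, formal email."""
        return llm_complete("Write a formal email from these notes:\n" + notes)

    def deflate(email: str) -> str:
        """Receiver side: compress the formal email back into bullet points."""
        return llm_complete("Summarize this email as terse bullet points:\n" + email)

    notes = "meeting moved to thu 3pm, bring q3 numbers"
    bullets = deflate(inflate(notes))  # a lossy round trip back to roughly the notes

The funny part is that the two prompts are near-inverses of each other, so all the added formality is pure transmission overhead.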