On the other hand, tech in the classroom has never been demonstrated to improve educational outcomes, except for students with disabilities, who genuinely have been helped by it.
In one of the best programming courses I took, we used only pen and paper to write pseudo-code. Not having actual computers let us focus on what we were learning, rather than wasting time on the arbitrary details of a particular language's syntax.
> rather than wasting time with the arbitrary details of a particular language’s syntax.
Learning strict syntax and semantics is among the hardest parts for students just starting out with programming. It's not a waste of time at all. Programming on paper can work great, as long as the syntax is well defined and actually graded.
In an intermediate theory course, a student asking to "not be bogged down by syntax" would be a big red flag (to me). Perhaps we just have different attitudes toward this, and that's okay.
We spent many hours each week on four homework problems that were progressively harder and built on each other.
Not sure how to explain it, but there was no need to spend any extra time getting the syntax right. If you could figure out the logic, that was what you were supposed to be learning.
People who learned by preparing punch cards and getting the results back another day probably got their programs correct on the first try more often, though. For me at least, there's a lot of cycling on stupid errors, and it bites me when I work in a language that's slow to compile or an environment that's slow to run (or both, like Arduino with its C++ and flash-before-run cycle, or some corporate development stacks).
I'm currently on HN while waiting for my latest pull request to go through the test suite. Many times I try to work on another PR instead, but am very bad at context switching between two branches, and inevitably screw both up.
I even set up my command prompt to display which git branch I'm on, but that's not enough.
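A branch-aware prompt like that takes only a few lines in bash. This is just an illustrative sketch (the function name and prompt layout are my own, not the commenter's actual config), and it assumes git is on the PATH:

```shell
# Print the current git branch, or nothing when outside a repository.
# git's contrib completion scripts ship __git_ps1 for the same job;
# this variant just asks git directly.
parse_git_branch() {
  git symbolic-ref --short HEAD 2>/dev/null
}

# user@host:dir (branch)$  (the parentheses stay empty outside a repo)
PS1='\u@\h:\w ($(parse_git_branch))\$ '
```

The single quotes around PS1 matter: they defer the `$(...)` substitution so the branch name is re-evaluated every time the prompt is drawn.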
> All of my programming classes have been helped tremendously by having access to a computer…
That's sort of like saying that writing is helped tremendously by access to pen and paper.
Usually the question of "does tech in the classroom help" is about overall educational outcomes. And what makes it unclear is that overall educational outcomes in the US have been on a downward trend for decades, even as access to tech in the classroom has only increased over time.
Overall educational outcomes in the US have been on an upward trend for all groups, with the percentage of test takers achieving a perfect score on the SAT or the ACT steadily increasing year over year. People who went to school decades ago were dumber than people who went to school recently, and it isn't even close.
You're looking at averages, which are strongly affected by the fact that more people are going to college. The percentage of perfect scores shows that even with more people going to college, people at the top (though it's true at every level) are smarter.
I dunno. The people who learned APL were arguably better off than the people forced to twiddle a bunch of knobs on some high-level framework with no understanding of what happens underneath.
I'd say this one is probably a wash, whereas in the other subjects it's actively harmful.
You are not representative of the majority of children being schooled. You are actually a 1% rarity which should not be used as a template for the majority.
Wow, so reading the comments, it seems like HN is now one of those sites where people just comment on the headline without reading the article.
The article spends about a third of its length on whether tech might be good for kids; the rest (and the meat) is an argument against "the techies as thought leaders because we live in a tech world," and against the idea that tech people are uniquely able to understand the consequences of tech on society just because they built it. Reflecting on that, it's even more ironic that the comments in this tech forum fail to address the point, and the top comment even talks about the author's "inferiority complex."
Well, I'd argue tech people have a unique perspective on tech: we've used it the longest and understand it better than most people.
I for one have a fundamental creed from using the internet since the 1980s and computers since the 1970s:
99% of all problems are best solved without technology, and if your first go-to is a technology solution, you are an idiot, a shitty engineer, and questionable as a solution creator.
Why? Because I've seen technologies come and go, all bleating the same promises that are never kept. And if you truly examine the problems they are trying to address, you see most of the time that the question is wrong or the answer is ill-posed. Most problems are solvable by communication between humans in a negotiated fashion. Those are also the cheapest and fastest solutions. The details of what negotiation means are pretty wide open, but technology should still NEVER be your first, second, third, or fifth solution proposal!
It sounds like the author is confronting their own inferiority complex: "techies" are somehow the smartest people in the room, but also they actually aren't and are full of hubris.
I don't get this weird journalist hate-fest about "techies," as if average software engineers call the shots on what their employers do. It's such a well-orchestrated distraction. The average engineer's job is to execute, not to decide what to build. Just like in every other industry, a few people at the top call the shots on what to do; the rest of us are executing, or making tiny decisions on how to do it. Me and Mark Zuckerberg are both "techies"... no difference, right?
I agree. Something about the tone of this article felt fundamentally
dishonest (perhaps an edu-tech puff piece) and too keen to prove a
point about a whole class of people using vague allusions.
> "technologists know how phones really work, and many have decided
they don’t want their own children anywhere near them"
I do. And I have. Although I speak for only one technologist.
> "These articles assume that techies have access to secret wisdom
about the harmful effects of technology on children."
Some of us do because we _design_ it, and under conditions of NDA.
Only recently, leaked emails proved this precise point. Many more of
us have "not-secret" wisdom as we proudly publish scientific papers
about it. Moral judgements aside, we know what this stuff does. But
the author attempts to insinuate the language of conspiracy.
> "Based on two decades of living among, working with, and researching
Silicon Valley technology employees, I can confidently assert that
this secret knowledge does not exist."
Then either the author has not been living, working and researching
very well, or his confidence is misplaced.
> I do. And I have. Although I speak for only one technologist.
Same here. I'm a programmer, and my goal is to delay phones for my kids until the last possible hour.
My kids spend time drawing, doing sports, reading books, swimming, playing with friends and animals.
Phones and devices are like digital crack for kids, and they prefer to keep hitting that lever over any of these other activities when the option is there.
We're taking a different approach, where we teach limits on tech. TV is not inherently bad if the right content is played for the right amount of time. Phones are not inherently bad if used for the right activity for the right amount of time. Evening gaming is not inherently bad, if used for the right amount of time.
Yeah, we don't want to raise a phone zombie. But I believe you can do that without totally shielding them from tech.
They're not being raised without tech at all, they just don't have their own devices, or any open access to one.
Their screen time is limited to a small amount of passive consumption per week (two half-hour episodes on Netflix, or a movie), plus engaged or creative use of a tablet for about two hours a week. Lately that's been drawing tutorials on YouTube, Duolingo, and Wordle.
We're pushing off allowing them to have their own phones for as long as possible. We're friends with a lot of families with the same approach, but also a lot of friends of my older (10yo) kid do have phones already. At some point it will be very difficult when they start getting left out of the party (so to speak).
My best case scenario is that the extra time this frees up will give them the space in their lives to grow talents and health that they wouldn't otherwise. I'm much happier to see my sons getting really good at pencil drawing and playing soccer than having a social media profile or being good at some video game. I feel having a really healthy body and some old school analogue skills like drawing will carry them further than whatever general digital distractions have to offer.
In my anecdotal experience the difference has been strongly tied to socioeconomic factors. And the first study I was able to find [1] on this topic confirmed this is the major factor. See page 18.
There's an extremely strong inverse correlation between education/income and how long people allow their children to use screen media devices. Low income, low education = high screen media time with a sharply increasing trend. High income, high education = low screen media time with a mostly flat trend. This is also happening at the same time that low income families are also having a declining rate of home access to laptop/desktops.
Suffice it to say that if it turns out that screen media really is detrimental to the development of children, then we're basically ensuring that the socioeconomic divide turns into a chasm.
I agree that high SES parents tend to better regulate screen time. I don’t think that’s the same thing as banning phones/tablets/video games generally, though.
The cultural meme is, “these techies don’t give their kids phones or other tech, period,” which I think is unlikely to be a widespread phenomenon. I mean sure, some techie parents do that, no doubt, but I’m guessing it’s far from the majority.
> if it turns out that screen media is indeed detrimental to the development of children then we're basically ensuring that the socioeconomic divide is going to turn into a chasm
It depends on what the alternative to all that screen time is. If these parents could substitute healthy family or educational activity for screen time, then sure, media time is making things worse. I'm pretty convinced that the extra screen time at the lower end of the socioeconomic scale is a consequence of parents lacking the time, education, and resources to provide that kind of alternative. So the actual alternatives to screen time might well be worse (inattention, boredom, early exposure to the adult world, etc.).
> > we're basically ensuring that the socioeconomic divide is going to turn into a chasm
Yes I think that positive feedback cycle is already well established.
> Depends on what's the alternative for all that screen time.
I'm pretty convinced that the extra presence of screen time
at the lower socioeconomic scale is a consequence of parents' lack
of time,
What is "lack of time"? A careless reader would assume you meant they
are busy working [1]. No. The parents are on their phones. From birth all
these kids see is mum's face lit from below with a blue light.
To start with, the kids try to break through neglect and inattention
by calling "mum! mum!" and trying ever more naughty things to get
attention. Eventually they are given a phone of their own (aged 2 is
the youngest I have seen) as a pacifier. The parents give their kids
the same digital drugs to be free of their "disruption" so they can
indulge their own addictions.
> So, the actual alternatives to screen time might likely be worse
.... early exposure to adult world
I know what you meant (sex, violence, alcohol...), but the stark
reality is that the "adult world" is staring into a 6 inch screen for
12 hours. Adults don't have time for anything else nowadays. These
pacifiers work for all ages. It's natural that children want to copy
what they see adults doing.
[1] It's actually when the parents are working that these digital
orphans get better childcare, as they're usually at school or with an
adult caregiver who has them in mind.
> What is "lack of time"? A careless reader would assume you meant they are busy working [1]. No. The parents are on their phones. From birth all these kids see is mum's face lit from below with a blue light.
Sure, that may exist. But your generalisation is so sweeping as to be in effect just a slur against the poor.
> I know what you meant (sex, violence, alcohol...), but the stark reality is that the "adult world" is staring into a 6 inch screen for 12 hours. Adults don't have time for anything else nowadays. These pacifiers work for all ages.
Ah, OK, so it's a slur against everybody? I suppose that's better... Somehow.
Or are you still talking mainly about the less-well-off? Then my point above remains.
You're assuming things like boredom are not only bad, but worse than endless entertainment. I'd claim that necessity is to invention as boredom is to creativity.
> necessity is to invention as boredom is to creativity.
Well said.
We have a no B-word rule in our house.
"Boredom" is a cultural malaise. Neither my partner nor I ever use the
word. Until age 5 my kid never had the concept of "being bored". But
after starting school she started coming home, huffing and shrugging
like a teenager and saying "that's boring!".
She gets sent to her room to "find something interesting to think
about", and 30 seconds later she's happily playing a dolls game or
shooting her bow and arrows at the wardrobe.
To take a Wittgensteinian approach - words "stand in" as templates for
as yet unformed concepts - young children use words with no
understanding of their adult meaning, so they'll say, "that's gay", or
"that's spazzy", (or whatever word they picked up in the playground)
which means something like "I want to be in control, and so I reject
whatever it is you think I want". In 2022 "boring" is the worst
accusation that can be levelled against anything.
It's adults who create and propagate the conceit of "boredom". How
could anybody actually be bored in this world, with all the marvels
and freedom we have?!
What adults mean is something different; "Don't leave me alone with
myself, exposed to the existential chasm in my soul, and my abject
learned helplessness".
> What adults mean is something different; "Don't leave me alone with myself, exposed to the existential chasm in my soul, and my abject learned helplessness".
I've come to accept and adopt my late father's idea about the purpose of portable music players, which he formulated shortly after the Sony Walkman had got competition from other brands of cassette players, and which now, after an (in retrospect rather short) interlude of dedicated MP3 players, applies to the music player apps in smartphones:
That's why everybody[1] nowadays walks around constantly marinating their brains in a drizzle of background music[2]: To drown out their own thoughts -- or to cover the lack thereof.
___
[1]: i.e. more or less "everybody", at the time of writing, under the age of... Idunno, ~50?
> I'd claim that necessity is to invention as boredom is to creativity.
This is a good point, but I would say that while one is fairly necessary for the other, it is far from sufficient. In the kind of context we were describing, it's certainly possible for people to pursue creative endeavors, but in most cases not in a way that opens up many avenues to break out of that environment.
"it sounds like the author confronting their own inferiority complex."
CONTRIBUTORS
Morgan G. Ames
Morgan G. Ames's book, The Charisma Machine: The Life, Death, and Legacy of One Laptop per Child (MIT Press, 2019), chronicles the history and effects of the One Laptop per Child project and explains why - despite its failures - the same utopian visions that inspired OLPC still motivate other projects trying to use technology to "disrupt" education and development. Ames is on the faculty of the School of Information at the University of California, Berkeley.
Martin Heidegger’s The Question Concerning Technology is relevant here.
I do not disagree with you, but Heidegger makes some very important points about how “just executing” does not imply guilt but it does not imply innocence either. It’s complicated.
Tech destroyed journalism's ad-driven revenue model over a decade ago, and their attitude since then has been a mixture of inferiority complex and white-hot rage and contempt.
Also, hearing "learn to code" on Twitter pissed them off.
"Learn To Code" was aimed at journalists after they directed some derisive pieces at blue-collar workers, saying that laid-off coal miners should "learn to code."
Having it thrown back in their face is what irritates them.
Because it's like telling a poor person "just don't be poor," or a depressed person "just lighten up." Coding simply does not come easily to large swaths of the population; one has to respect that fact and not assume we're all neurologically wired to talk to machines all day.
Also, people might well be better suited to very different activities. Would we be richer, as a society, if Van Gogh had decided to drop this "paintings" business and start a boring but profitable civil-servant career instead?
Last but not least, the value of each individual programmer, in a capitalistic society, inevitably decreases as the labor pool increases, with obvious consequences in the long run for the profitability of the profession. Car mechanics used to be respected and expensive professionals too.
Elite politicians and journalists were the first to happily bark "learn to code" at freshly-unemployed blue collar workers with all of the apparent condescension you described.
Others flinging the phrase back at later-unemployed journalists are merely returning proverbial fire, with subsequent Twitter bans meted out by those that can't take what they dish out.
It’s funny because the truly wealthy and powerful seem to rely more on favors and intangible relationships than on putting a price tag on things. Indeed, we see money as a corrupting influence.
That is not what I wrote. I asked what the better alternative is. Simply complaining makes for uninteresting conversation.
I am not opposed to a universal basic income, provided that all other subsidies cease.
How would society increase the number of surgeons, who require a decade of intense study and risk to become one? How would society incentivize people to live 2 hours from the nearest hospital in order to farm the land and create food for the country? What would society have to do to incentivize people to work jobs where they have to commute somewhere to work, or many places for work, rather than work from their own home?
It was in response to them telling others the same thing. It was never pointed at the blue-collar people who lost their jobs, but at elite people who had no compassion for those who were having the world they grew up working in ripped away from them.
Oddly I see this as a "for the worker" phrase. Don't tell my 60 year old welder dad that just lost his 40 year job, that he should learn to code, and get a better job. Empathise with him, and come up with real plans.
It was used to taunt journalists, and others who spouted the platitude.
> It was in response to them telling other the same thing. It was never pointed at the blue collar people who lost their jobs, but instead pointed at elite people who had no compassion for those that had the world they grew up working in, being ripped from them....
> It was used to taunt journalists, and others who spouted the platitude.
No it wasn't. It was used to taunt journalists generally [1]. Your justification, frankly, is just bullshit meant to help justify acting like an asshole. The most charitable interpretation has the taunters relying on sloppy stereotypes, akin to someone who generalizes that all black people are criminals based on seeing one black guy mug someone. Except in this case, the taunters didn't even see someone get mugged; they just misinterpreted someone loaning someone money. Look at the origin timeline: it was Mark Zuckerberg and Michael Bloomberg who were being condescending and aloof, and the journalists were just investigating the questions that condescension raised (e.g. Mike Bloomberg says they're not capable, but he's wrong).
That was confirmed for me the last time someone tried to justify this taunting: the only "example" they could provide of the supposed condescension of journalists was a sympathetic profile of some coal miners who did, in fact, learn to code, which the taunter-apologist thoroughly misinterpreted through their bias.
Here are some notable examples of no- or low-tech households of high-tech individuals:
Steve Jobs, Sundar Pichai, Susan Wojcicki, Alexis Ohanian, Mark Cuban, and Evan Spiegel to name a few.
What I don't personally understand is how this isn't news per se. Many of the "tech elite" don't get high on their own supply. It's not hard to see why: tech is addictive. Even the WHO declared, right before the pandemic, that screen time for kids under 4 shouldn't exceed an hour... Montessori schools have definitely been on the rise for a while too.
Montessori schools have been on the rise for a while because traditional schooling is dehumanizing and authoritarian, not because traditional schooling has too many electronic devices.
I don't know enough to comment on whether or not their educational outcomes are 'better', or what definition of 'better' we should use.
But that's what the author meant with high tech individuals mirroring the opinions of their middle class peers.
Besides, the jury is still out about the outcome of said low-tech upbringings. Steve Jobs' kids are now adults (the oldest is older than 30), and while it's already an accomplishment for somebody in that income & privilege stratum to be leading an entirely scandal free life (as far as I can tell), it's not like any of them seem to have done anything truly impactful with their lives either.
So maybe the reason for that low-tech upbringing was that they were not really expected to do anything major, but were being raised, essentially, to be modern gentry: to clean up nicely, sit on the boards of family foundations, keep a clean nose, and remain somewhat detached from the grubby details of the actual running of the world. Nice work if you can get it, to be sure, but not a line of work available to non-billionaires.
I think it’s very, very hard to judge what a child of someone like Steve Jobs could be capable of, or do, in isolation from that fact, which makes it a useless comparison.
Because our public schools are buying, creating, and adopting this type of 'crack' software, and we're advised to buy software to keep our kids engaged.
Steve Jobs also went for naturopathy to treat his cancer (which apparently did pretty much nothing). I don't think one can simply assume rational, good decisions from these actors outside the areas where they proved successful.
I just finished "Becoming Steve Jobs" (great book, by the way). With all the truly impressive stuff Steve pulled off, the way he went after his cancer is probably among the most stupid things I have seen anyone do. I had cancer myself, colon cancer to be precise, so a totally different ballpark than Steve's. Between diagnosis and surgery were 10 days, followed by 6 months of bi-weekly chemotherapy sessions, since it had spread to some nearby tissue (all of which was removed). Steve got really lucky: his cancer was one of the rare ones with a halfway decent chance of being cured. And then one of the smartest guys around, one who in business really learned not to wait around and drag his feet, waits 10 months doing nothing. And then, after surgery, when he knew it had spread, he decides to forgo chemotherapy. Quite interesting if you think about it.
But then I know this stuff can be really scary, it was for me when for a short moment there was the possibility cancer was back. So I'm not judging him, not by a long shot, even if it sounds that way. Fear can really blind you and throw your normal reasoning into chaos.
Just to add something more meaningful to the discussion: from the book, Steve struck me as a guy who liked an almost simple life, maybe as a balance to the Apple high-tech environment. Maybe tech people not having that much tech in their private lives is the same as cooks not regularly cooking at home.
He attributes a lot of his success to reading software manuals. That opened doors for him that he never expected (including his first real success in business).
He may not be an engineer, but I'd say he's a high-tech individual.
The big thing that got him most of his wealth was creating an internet radio website, broadcast.com. He got lucky and sold it right before the dot-com bubble crashed.
It wasn't just that. Cuban also had 14.6 million Yahoo shares at $95 valued at $1.4 billion. He had put options at an $85 strike price, so when Yahoo cratered to $13 during the crash, he made bank.
To clarify, he didn't make bank off of Yahoo crashing, he just locked in the value of his shares at $85. It's not like he profited from Yahoo crashing, it's more that after buying the puts he guaranteed a worst case outcome if Yahoo crashed. If Yahoo didn't crash and just stayed at $95, he'd still have made the same amount of money.
If Yahoo went up in value by a significant amount, he'd have actually been better off (he'd have lost value on his options but regained all of it and then some on the value of his stock).
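The floor those puts create is easy to sketch numerically. This is a simplified illustration using the figures above (14.6 million shares, $85 strike); it ignores the premium paid for the puts and any other legs of the actual trade:

```python
# Value of 14.6M Yahoo shares hedged with $85-strike puts, evaluated
# at a given spot price at expiry (premium ignored for simplicity).
SHARES = 14_600_000
STRIKE = 85.0

def portfolio_value(spot):
    stock_value = SHARES * spot
    put_payoff = SHARES * max(STRIKE - spot, 0.0)  # puts pay only below the strike
    return stock_value + put_payoff

# Crash to $13: the puts top the position back up to the $85 floor (~$1.24B).
# Flat at $95: the puts expire worthless; the shares alone are worth ~$1.39B.
```

Below the strike the portfolio value is constant at the floor; above it, the shares participate fully in any upside, which is exactly the "worst case guaranteed, upside kept" point made above.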
The people interviewed in The Social Dilemma hit on some of this. Being closest to the source and truly understanding how different techniques were used to manipulate people's attention scared the daylights out of them. The sad thing is that now, having made their buck, they're turning on the industry to try to look like saints.
Giving them the benefit of the doubt: maybe they didn't understand when they entered the industry. It takes time to realize the problems, then more time for that to really sink in, followed by still more time deliberating whether or not it's really that bad, and then some more time to finally conclude what to do about it. And they decided to do something about it, unlike the rest, who stay in the industry also knowing what's going on.
> "The more important point here is that believing techie parents have secret insider knowledge about the harmful effects of children’s technology usage reinforces the dangerous myth that techies are always the smartest people in the room — and that the more technical they are, the more wide-ranging their expertise."
This is a strawman argument. The point is that the people closest to the genesis, application, development, and conversations around tech are in a uniquely good position to understand its effects (the very effects that were designed to happen). It's not that we think developers are superhuman and have secret knowledge. It's that the people in question have designed platforms specifically to addict their users, and if these same people are not letting their own kids use the technology, that speaks volumes.
It's obvious to anyone that bringing screens is just bringing in distraction. And even in the fields where they may be of some benefit (maths for example) the benefits are so minor as to be pointless.
“it’s obvious to anyone” is not a substitution for proof and that statement makes it obvious that you’re just speaking from intuition. Screens are not inherently…anything.
I find that a cop-out answer. Screens are inherently distracting. They literally flash and flicker. The devices those screens are attached to have buttons and little jiggery bits. There's a ton of distracting software on those devices: browsers, games. The supposed learning software is all gamification, slidey-swoopy menus, and other completely distracting BS.
So yes, "screens" as a pars pro toto for digital learning are inherently distracting. It's only when you reduce the term so far as to be meaningless ("a screen is just some LED lights arranged in a square; how's that different from paper?") that you can say they're not.
Yes, screens aren't inherently anything, but I think they're using "screens" as shorthand for the phone/social media/apps ecosystem that does have addictive qualities and is fairly well studied. Tons of good references in here: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4240116/
OTOH I think it's fair to say that something like an e-reader or a dumb watch has low addictive potential, yet still has a screen.
Where he speaks from intuition, I speak from personal experience. Throughout my entire high school experience, iPads and computers were more often used in class to play Clash Royale or Tetris than for actual work.
I'm interested in the opposite: not just teaching via technology, but trying to create an empowered, enriched relationship with technology.
In my fantasy world, classes are setting up & administering their own clusters, playing with userscripts & social web protocols, and creating nodebots.
I'm not really sure where to look, where the vanguard forces might be. Huge respect to projects like the FIRST Robotics Competition, but that still feels like a narrow course, one where the digital is merely coincidental; it feels extra-curricular & dominated by other mechatronics concerns.
Learning to actually use technology "actively" and/or creatively isn't the same thing as the "passive" use that gets pushed on the public through infinite scroll, constant surveillance, and the psychological manipulation people worry about.
You are absolutely right! We expect all information to be given to us, instead of us seeking the specific information we need.
The solution is to - like your maxim implies - reverse the paradigm, by treating technology like it was initially designed: a tool for us to use with intent.
Even the apps and websites which have been optimised for engagement can be valuable. The value is what we make of it, just like how the websites make value out of us.
Advocating for quitting technology unfortunately becomes more and more impractical advice, as technology is further ingrained into our lifestyles.
What matters the most is being mindful and honest with ourselves. The moment we lose sight of how we use something, it starts using us.
It seems like your book focuses on a similar mindset. I've stumbled upon it before, but now I'll definitely be giving it a read! :)
> The solution is to - like your maxim implies - reverse the paradigm, by treating technology like it was initially designed: a tool for us to use with intent.
I agree strongly with this assessment on logical, emotional, & spiritual levels. Ursula Franklin's lectures on holistic, enabling technologies versus prescriptive, directing technologies [1], or her work-versus-control distinction, are a good perspective to assess tech by. They define this struggle between what lets us work with intent and what molds its intent upon us very strongly, very sharply, & very smartly, and are essential reading for socio-technology today, in my view.
It is all very complicated right now. Technologies like Instagram or Tiktok are both highly creative & enabling, presenting a rich ability to craft, but extremely limiting & process/control oriented in terms of distribution/viewing/engagement, as fixed as can be. Understanding how walled & limited we are on some fronts while enabled on other fronts presents a serious challenge of understanding, and this challenge is far heightened by the severely limited alternatives and options we have in the world today. Even if someone can set up or gain access to a good PeerTube instance, they'll have less creative tools available & face a myriad of other distribution/engagement challenges. The tech may be less limiting, especially if they run their own instance, but at other great costs.
Genuinely trying to create a technology which can be engaged with & learned from requires not just good teaching intent, but a software ecosystem which is learnable & explorable. We need technology which is capable of being engaged with. The web, here, is remarkably strong & powerful, with userscripts being a powerful demonstration of how some small tidbits of common knowledge can give us wide-ranging power over most every corporate/institutional property on the planet. There's nothing like it! It's purely incidental though- few web sites are specifically implemented to enhance understanding of how the site itself functions. There's a big-ball-of-mud/complexity under the covers, in a variety of different programming languages/libraries as we travel about, & almost none of it is designed for external use.

I like to imagine a web where components could more clearly state their intent, be poked & prodded & understood by outsiders using common devtools. To me, this is one of the latent & most inspiring hopes of could-be web platform systems like WebComponents: that they could help make the web more broadly understandable & explorable for everyone, that it could make the hypermedium itself rich not just in what it expresses, but internally as well. This idea of making the technology open & exposed & engageable is a precondition for a society capable of growing a healthy, non-poisonous, non-treacherous, resilient relationship with technology.
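The userscript point can be made concrete with a tiny sketch. Everything here is hypothetical and illustrative: a helper that strips common tracking parameters from a URL, the kind of few-line intervention a Greasemonkey/Tampermonkey script might apply to any site's links. The parameter names are just well-known examples, not an exhaustive list.

```javascript
// Illustrative sketch: the small, reusable logic a userscript might
// apply across sites -- stripping common tracking parameters from
// links before you share or follow them.
const TRACKING_PARAMS = new Set([
  "utm_source", "utm_medium", "utm_campaign", "utm_term", "utm_content",
  "fbclid", "gclid",
]);

function stripTrackingParams(href) {
  const url = new URL(href);
  // Copy the keys first, since deleting while iterating would skip entries.
  for (const key of [...url.searchParams.keys()]) {
    if (TRACKING_PARAMS.has(key)) url.searchParams.delete(key);
  }
  return url.toString();
}

// In an actual userscript this would loop over
// document.querySelectorAll("a[href]"); here we just call it directly.
console.log(stripTrackingParams("https://example.com/post?id=42&utm_source=feed&fbclid=abc"));
// → https://example.com/post?id=42
```

That a dozen lines of common knowledge can rewrite any page you visit is exactly the "purely incidental" power described above.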
> They define this struggle for what lets us work with intent, versus what molds its intent upon us very strongly
I often couch this struggle as the relation between IA (intelligence amplification) and AI (artificial intelligence). I am not the first to draw the lines that way, and I don't literally mean AI (machine learning) in the sense that most of us techies would understand. But I think it's a good demarcation to show the way computing is splitting/polarising along socio-political lines. For BigTech, it's a means of control, not discovery.
> Ursula Franklin's lectures
Thanks for this. I've heard of her, I think, but obviously have some more reading to do.
> Work versus Control technologies distinctions, is a good perspective
In Digital Vegan I talk some about the transformation of digital technologies from enabling tools to instruments of enforcement.
> The web, here, is remarkably strong & powerful, with userscripts being a powerful demonstration of how some small tidbits of common knowledge can give us wide-ranging power over most every corporate/institutional property on the planet. There's nothing like it! It's purely incidental though- few web sites are specifically implemented to enhance understanding of how the site itself functions.
Ted Nelson (and some of the other WWW pioneers) had similar visions of a "lisp machine-like" queryable distributed structure.
> imagine a web where components could more clearly state their intent, be poked & prodded & understood by outsiders using common devtools. To me, this is one of the latent & most inspiring hopes of could-be web platform systems like WebComponents, that they could help make the web more broadly understandable & explorable for everyone, that it could make the hypermedium itself rich not just in what it expresses, but internally as well.
Web1.0 carried that hope but failed. Web2.0 tried (at least it brought interactive participation) but was usurped by commerce. Is Web3.0 trying again? Is the arc of web history on the side of your vision? I hope so.
I used to think that until my kids got to be 9+ years old and the negative effects of technology became very real. Not all technology is bad, but it is hard to draw a line between good and bad. Impossible, IMO.
What negative effects, out of curiosity? My kids are a fair bit younger and I'm generally pretty positive on technology but am interested in what you've noticed.
Being the devil's advocate for a second: why? If they become adult white collar workers they will be indoors most of the time, interacting with technology. I loved being outside as a kid and teen, but is it really deleterious if the new generations don't get exposed to that?
> If they become adult white collar workers they will be indoors most of the time, interacting with technology.
Which is bad enough. But if they've led an active life as children, their base health profile will be vastly better than if they'd been sedentary since they were toddlers, and many of them might even carry the habit of physical exercise with them into adulthood. This will vastly improve their quality of life from as early as their late twenties, and definitely by their forties.
(Source: I was a bookworm from [before?] first grade, never took an interest in sports, and obese from my teens... And I was still probably more active as a child than many children today. Basic military training was Hell, and the metabolic syndrome I was diagnosed with in my mid-late forties blossomed into full-on type II diabetes in my early fifties. Not super fun.)
> I loved being outside as a kid and teen, but is it really deleterious if the new generations don't get exposed to that?
If it's expected that children of today will spend most of their adult time indoors, then maybe it's a very good idea that children and teens get outside now before the opportunity to live outdoors passes!
I use a screen all day as an adult, and am very glad I rarely watched TV as a kid, and of course had no computers. Lack of that didn't impair me in the least as an adult.
I remember wanting badly to watch "Green Acres", but my parents put their foot down and said it was garbage and I couldn't watch it.
When Netflix rolled around, I finally had an opportunity to watch it. About 10 minutes in, I concluded my parents were right and that was the end of that.
Until the end of elementary school I was an unmoored, uncaring kid with no connection to the planet, the future, or the world. Yeah I'd play outside, but I felt no real identity, & longed to understand how I might begin to grow or to have purpose to engage myself in.
I had a public school teacher who went way above & beyond, introduced us to technology & computing (bringing his own PCs into the classroom, which we only used very occasionally) & ham radio (although the planned club never really materialized). This gave me a much stronger relationship to the world I'd face, & sparked a genuine interest in the world in me.
Kids are blank slates- tabula rasa- & they need models of engagement & things to grow interested in over time. Simply expecting play & fun time, without specific encouragement to go deeper, without examples of others about them who are interested & are going deep into things- is an extreme danger. It's been so hard seeing so many so adrift, wishing so badly these people had had more exposure to different things in the world where they might moor themselves.
Technology is fantastically difficult to root into in today's world; it is oppressively, technocratically, nightmarishly resistant to exploration & playing with at any genuinely technical level. But it's still more accessible, a bigger open field, with so vastly much wider-open potential (in part because this rotten technocratically driven core is so low-nutrient & exploitative & unwilling to do anything but suck down our life energies, not contributing value) than so much else in the world, that I yearn to see some of the near breakthroughs that could happen anytime. I have deep reverence for biology, medicine, chemistry, material science, design, UX, for music & culture & art, but wow, so many of these require so much more dedication & luck to break through in, to run the gauntlet of. Computers still seem so rife with potential as a place I'd want people to seek engagement, in spite of their sad, hulking, rebuffing outer shell of "user interface" that lies & betrays, that is Plato's Cave's marionette scene, endlessly lying to us.
Kids have so much cognitive surplus. They have so much potential. Pretending that you know better, denying them mediums of thought they might explore & forcing them into simplicity, prolonging their childhood forcibly, is not something I am sympathetic to. Clay Shirky had a 2008 conference talk that electrified the world, on cognitive surplus, contrasting watching Gilligan's Island versus editing Wikipedia[1]. These words weren't used, but it was a contrast of inert, useless, goes-nowhere activity versus the chance to engage in something, the start of the ability to find your own meaning. This process should not be force-started upon kids, and there are a lot of inert, dead-end uses of technology they'll hook themselves on & click endlessly. But to deny the whole project, to opt them out of exploring the deeper angles of the world- that's a bad mistake I'd wish on no one. Tech, in spite of all the bad I've said, still has the lion's share of open & explorable angles. There's a huge question about how we open the real explorable, interesting angles, versus letting ourselves be sucked into inert activities, but IMO the risk of denial is worse than the risk of what is possible.
In general it’s very attractive to see the entertainment value in technology and succumb to babysitting your child by sitting them in front of an iPad for hours at a time.
>>> These articles assume that techies have access to secret wisdom about the harmful effects of technology on children. Based on two decades of living among, working with, and researching Silicon Valley technology employees, I can confidently assert that this secret knowledge does not exist.
LOL, this didn't age well. See for example Facebook's attempt to suppress reports on how their technology is harmful to teen girls.
I think the idea that Facebook is "harmful to teen girls" is just media sensationalism. That's not an accurate representation of what Facebook's internal reports showed. The report had 12 measures of well-being and found that Instagram had a positive impact on 11/12 and a negative impact on 1/12. The media, naturally, sums this up as "Facebook knows it harms girls and does it anyway!!!"
In my view, it's quite positive that Facebook is studying the impact of their platform on teens. I don't think that's a standard that major media companies, e.g. The Wall Street Journal or The New York Times, live up to. I also think it's commendable that Facebook internally confronts problems with their product in terms of its impact on teens. I have some concerns that the media's hysterics will disincentivize companies from studying this kind of thing for fear that it will be exploited by the media.
> See for example Facebook's attempt to suppress reports on how their technology is harmful to teen girls.
Pretty sure that was confirmed as sensational reporting at best almost immediately, though the major news organizations didn't let facts get in the way of dunking on Facebook.