This reads like Plato's warning (through Socrates' words) 2,400 years ago that writing will make people forgetful:
"For this invention [writing] will produce forgetfulness in the minds of those who learn to use it, because they will not practice their memory. Their trust in writing, produced by external characters which are no part of themselves, will discourage the use of their own memory within them. You have invented an elixir not of memory, but of reminding; and you offer your pupils the appearance of wisdom, not true wisdom, for they will read many things without instruction and will therefore seem to know many things, when they are for the most part ignorant and hard to get along with, since they are not wise, but only appear wise." [1]
Plato's criticism is very true on an individual basis. What he didn't consider was the effect the written word would have on a social scale: the ability to exchange ideas across physical and temporal distances.
It turns out that benefit was great enough to make up, to some extent, for losing the exceptionally well-trained memory of scholars from a verbal tradition. Not that the latter isn't a loss, though, and his warning against false wisdom is more relevant than ever today, when it feels like half the people you talk to online are basing their world view on Wikipedia articles they google up on the fly.
I think it's better to think of these as different rather than one superior to the other... especially given our era's proliferation of mediums with differing cognitive implications.
Is Twitter > books? Is HN > podcasts? Every medium has its tendencies... positive, negative and complicated. We don't actually choose between them. It's useful to keep an eye on losses, not just gains.
Oral traditions, for example, naturally evolve and diversify. A fairy tale can literally spread over the whole world and develop thousands of versions, each one adapted and localised. Written stories also spread, but remain identical... even more so after printing.
This is, largely, the story of monotheist religions. Whether or not they had sacred scrolls in ancient times, the religions were mostly based in folk traditions, temple authority, prophets, judges... people. As they moved from oral tradition to authoritative canon, the religions became extremely stiff and unyielding.
> This is, largely, the story of monotheist religions. Whether or not they had sacred scrolls in ancient times, the religions were mostly based in folk traditions, temple authority, prophets, judges... people. As they moved from oral tradition to authoritative canon, the religions became extremely stiff and unyielding.
The Catholic Church is an interesting counterpoint. Their religious teaching still openly draws upon tradition, institutional church authority and individuals like the Pope.
> The Catholic Church is an interesting counterpoint. Their religious teaching still openly draws upon tradition, institutional church authority and individuals like the Pope.
IIRC this was their main contention against the Protestant reformation. They believed that just pointing at scripture without some practice of custom and tradition to ensure you're properly interpreting it in a sensible way would lead people to folly and extremism/fundamentalism.
Granted, it's sort of a pick-your-poison deal. You either get the fundamentalism of hardline scripturalists or you get the corruption that comes with a hierarchical institution that requires people to offload some of their critical thinking. Or you can blend them together and get the worst of both worlds by creating something that looks like the more noxious elements of the American Evangelical movement.
> You either get the fundamentalism of hardline scripturalists or you get the corruption that comes with a hierarchical institution that requires people to offload some of their critical thinking.
I don't think it is fair to say the latter leads to "corruption" or "offloading some of their critical thinking". This phobia is largely a feature of radical individualism and the lack of humility that the notion of radical self-sufficiency (intellectual or otherwise) entails. The paradox of individualism is that it actually weakens the ability to be an individual. Human beings are social animals whose lives are relatively short and therefore whose experiences are limited. And because individualism atomizes the individual, the effect is that this social animal becomes subject to greater levels of fear and stress about his isolation, and when isolated from the social and from tradition, he becomes easy prey for weird ideologies.
In this specific case, scriptural canon was compiled by the Catholic Church. You obviously cannot appeal to the canon to codify the canon, so some extra-biblical body of knowledge and understanding must exist that would allow such a canonical body of scripture to be identified. This body of knowledge and understanding is called Tradition, and it is guarded, refined, and transmitted by an unbroken chain of authority found in the apostolic succession. (Tradition is the basis for the transmission and development of all knowledge in human history; there is likewise a scientific tradition without which there could be no science.) The notion that an individual could do better alone without relying or drawing on tradition is belied by the presence of over 40,000 Protestant sects in the US alone, each claiming to be able to interpret scripture. And how do they interpret scripture? However they want! In most cases, it means interpreting it according to whatever happens to be fashionable at the time. So much for "critical thinking". This is very visible in the mainline Protestant churches that, not so incidentally, are shrinking very rapidly. Where this Protestant mindset has leeched into Catholic circles, it has produced the same deadly effect.
But the body of canon law would have been impractical to standardize and disseminate without the printed word.
See: how many schisms and papal crises there were in medieval times (admittedly, the Catholic Church was also much more a political-nation-religious entity then vs now)
I largely agree about the administrative difficulty, although I'd emphasise the written word over the printed word in this context.
Using thousands of monks to copy documents by hand instead of printing presses was incredibly resource-intensive, but affordable for the medieval Church.
It'd be fascinating to see an estimate of the person-hours it took to copy one book in medieval times.
And I mean fully-loaded, so how much time needed to be spent {farming/military/political administration} to give a monk time to learn to read and write, and then allow them time to copy the book itself.
I think they viewed the transcription itself as a form of meditative/devotional practice and didn't prioritize efficiency as much. The illumination on some of these manuscripts can get pretty elaborate and, sometimes, kind of whimsical. So they were clearly doing some creative expression while doing it.
It's an interesting thought. Challenging because of the fixed costs though. The cost to produce one book would be almost identical to the cost to produce 1000 books; it is the infrastructure more than the marginal resource cost per book.
Could see it going either way. Agricultural efficiency (in terms of person_fed : person_farming) in Europe was pretty low. So the dedicated headcount (monks) might dominate the net cost, which would make it scale more linearly with number of books.
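Hedging heavily: here's a toy sketch of the two cost intuitions above, with completely made-up numbers (nothing here is a historical estimate), just to show how the fixed-vs-marginal split changes the scaling:

    # Toy model of the two views -- all numbers invented, not historical.
    def total_person_years(n_books, fixed=200.0, per_book=1.0):
        # fixed: training scribes, building and running the scriptorium
        # per_book: fully-loaded labour to copy a single volume
        return fixed + per_book * n_books

    for n in (1, 10, 100, 1000):
        total = total_person_years(n)
        print(f"{n:>4} books: {total:7.0f} person-years total, {total / n:6.1f} per book")

    # If the fixed cost dominates, cost per book collapses with volume (GP's view);
    # if per-book labour dominates, total cost scales roughly linearly (parent's view).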
> But the body of canon law would have been impractical to standardize and disseminate without the printed word.
Would it? The Bible, which is much larger, was practical to standardize and disseminate without the printed word, and it's not like there was a proximate relationship between the availability of printing (presumably with movable type, in the 15th century) and the codification of canon law in the 20th. The idea that this was waiting on printing is… hard to defend.
> See: how many schisms and papal crises there were in medieval times
The really durable schisms after the 11th Century emerged after or just before (and became crystallized because of) printing, though.
I'm not following. The parent, I believe, was not talking about canon law specifically (which is changing and to be distinguished from doctrine which only undergoes development, but never revision). I also don't understand what papal crises have to do with anything in this context. And what is meant by "political-nation-religious entity"? The Church is a religious institution whose bishops (including the pope) have also exercised distinct secular political authority in addition to religious authority. It was never "national" in character, only supranational (hence, "catholic").
Parent was observing that the Catholic Church was an exception to GP's point that the written word stiffened and froze monotheistic religious practice, as it blended oral and written transmission.
I was observing that the modern Catholic Church (post-printed word) has ossified in exactly that way, with the printed word allowing direct dissemination of the "agreed" canon (i.e. pope / hierarchy) to the entire church, thereby unifying it into a single practice.
In counterpoint I offered the number of papal crises and schisms that abounded during the medieval and pre-medieval period (i.e. before printed material became cheap and available).
Ergo, the use of the printed word allowed the Catholic Church to unify and standardize in a way it had not before -- GP's point.
It's indicative that the majority of religious splits within Catholicism happened before the printing press, when intervening human parties could reinterpret the canon as it flowed through them.
As for "political-national-religious entity": the Catholic Church / Holy Roman Empire partnership allowed the Church to become a nation in all but land, through the power its blessing conferred upon a chosen leader.
> This is, largely, the story of monotheist religion.
This is largely the story of religions in general, especially old pagan religions. Myths and folklore came about; the moment a religion went scriptural, the message became firm and technical, and gave rise to law as a field of study.
Judaism preceded the pagan Roman kingdom. Judaism is a scriptural religion. Judaism gave rise to law as a study.
Non-scriptural religions had evolving myths and folklore, like the Greek and Roman gods and goddesses. Those religions did not contribute to law in the way the prior comment mentions.
I don't really get your point. The Roman law system was pretty advanced without the influence of Judaism.
India developed a complicated law system before Judaism did. It seems to be part of how human societies grow in complexity, not something special to Judaism.
Protestantism definitely had a big impact on the development of Western empiricism and probably modern law. In general, our current Western individualistic, skeptical mindset is the heritage of the Reformation.
Protestantism led to direct literacy and scholarship in Bible studies. Again, this is influenced by religion. The adoption of literacy is what led to Northern Europe's success.
It's also useful to point out that the historical verbal tradition trained a very specific type of memory recall, but that doesn't automatically make anyone wise.
Just because you memorized 10000 random articles on Wikipedia, doesn't mean you now have the wisdom to apply that in a particular circumstance.
I think the contrast should be between studying and internalizing a subject versus having the ability to look up a subject. That seems the most true to Plato's intention.
It's common and easy to fall into considering the things you could look up as things you already know.
What's the difference, one might ask? What's the problem with offloading some of this knowledge and freeing up space in your head? Well, the thing about learning something is that it doesn't just permit access to the information; it also permits synthesis of new ideas. The sum of knowledge is greater than its parts.
A very concrete example: as someone who only speaks English, you may look up the Latin terms 'manus' (hand) and 'facere' (to act/do/make); but unless you actually do, you'll probably not immediately grok the etymology of the English term 'manufacture'.
Exactly. My compsci prof was forcing us to learn so much by heart, but then it's internalized and you start to think in those terms. Right now I am writing my PhD thesis in management and in the beginning I didn't have all of those papers really in my head. But now slowly that knowledge accumulates and I can think through things I couldn't think before. But on the other hand, I now think, how could I not understand that, it's trivial. And to add, it is the same for literature and poems. If you know a poem by heart, it's not just fancy to recite it, but that you start to really incorporate part of that language.
There are some computations (synthesis) that require so many (non-front-loadable) memory accesses that it's impractical to do them from high-latency storage (books), because number_of_accesses * access_time dominates the project time.
Instead, you must have a working set of core information (or at least pointers to information) in low-latency memory (your brain).
Example: How much longer would it take me to do a multi-digit multiplication if I had to look up the process for multiplication in a book for every digit multiplication I did? And what if that multiplication were just one of many in the higher-level math problem I was trying to solve? (Then generalize to any problem that requires a core base of knowledge)
It is very similar to caching performance impacts. And like you say, sometimes caching just makes things faster, and sometimes it actually enables functionality…
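A rough sketch of that latency argument, with invented numbers, purely to show how number_of_accesses * access_time can dominate:

    # Toy model: a task needing many small fact lookups.
    def task_time_seconds(n_accesses, access_time_s, think_time_s=2.0):
        # access_time_s: how long one lookup takes (recall vs. fetching a book)
        # think_time_s: the actual work done with each retrieved fact
        return n_accesses * (access_time_s + think_time_s)

    from_memory = task_time_seconds(500, access_time_s=0.5)   # facts already in your head
    from_books = task_time_seconds(500, access_time_s=120.0)  # walk to the shelf, find the page

    print(f"from memory: {from_memory / 3600:4.1f} hours")
    print(f"from books:  {from_books / 3600:4.1f} hours")
    # Same computation either way, but the high-latency version is dominated by
    # lookups -- the working set has to live in low-latency memory (your brain).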
Strong memory is almost always an indicator of exceptional skill. Whether it's chess players, musicians, writers, poets, mathematicians or programmers, people who excel generally have an astonishing ability to recall.
That's not an accident. Wisdom emerges out of practice and the effortlessness that comes with it. The genius piano player isn't that good because of some pie-in-the-sky wisdom about music; just like the AI, they played tons of scales and training pieces. Literally meaningless stuff. This rejection of rote memorization as some sort of lower skill, the idea that students should be 'smart and lazy', is one of the stupidest modern tendencies.
>Just because you memorized 10000 random articles on Wikipedia, doesn't mean you now have the wisdom to apply that in a particular circumstance.
It's much more likely that you can than someone who would only look them up "on demand", however, since you are at least aware of the possibilities in those domains.
This is actually what Plato also mentions, a few paragraphs later:
> He who thinks, then, that he has left behind him any art in writing, and he who receives it in the belief that anything in writing will be clear and certain, would be an utterly simple person, and in truth ignorant of the prophecy of Ammon, if he thinks written words are of any use except to remind him who knows the matter about which they are written.
Cued recall in the form of verbal tradition is often criticised for being subject to inaccuracies as, for one example, stories accrue embellishments over time.
Plato is specifically thinking of the social impact, but he's also not actually arguing against writing; after all, he's making his argument in writing. He's trying to push the reader to look into their souls for the truths that he thinks are already there. "Remembrance" here is specific, and so is "memory": it doesn't mean remembering facts and things that happened, but reaching back into one's mind, beyond sensation, to "remember" the eternal forms.
Plato gives multiple hints across his works about the esoteric, unwritten teachings [1]. We can only speculate about the content; for me, they were surely tied with the Eleusinian Mysteries [2]: early, myths-driven psychonautics based on ergot.
If we never had writing, would the Huns have carried forth Platonic wisdom? Would it really have survived thousands of years of cultural evolution and the decline of Greek civilization itself? Who even is the "we" who should consider it important?
> that would be another benefit: it would also filter unimportant works over time.
How would it even be possible to know what's been lost in the oral tradition without anyone writing things down? Only takes one autocrat to massacre followers of a specific tradition to effectively erase it. Things get filtered based on their ability to survive, which is partially a function of "importance", whatever that actually means at a given time.
> a third benefit: only people interested and caring enough would know Plato. Not dilletantes.
This just seems like unabashed elitism. A "benefit" of the oral tradition is that knowledge is reserved for the deserving? No thanks.
Tired Alfred North Whitehead quotes aside, he's basically the single most influential author in the Western tradition, arguably even eclipsing the Bible.
Well that's the human brain for you. It's very good at extrapolating new information from things it's seen and heard that may not be true, without actually telling you that it's basically making shit up as it goes. It's yet another reason why education matters.
Would this be different in a world without writing? We generally synthesize incoming information into a knowledge base and a set of beliefs - I'm not convinced we're less susceptible to that when all knowledge is oral.
What's the difference between having read something last year and believing it, vs having heard someone tell you something last year and believing it? Would people stop in a conversation and go try to find the source of their information if they had to track down a specific person?
Socrates (for it is he who was opposed to writing) did not appreciate how the act of writing exercises our abilities of both memory and reasoning, at least as much as speaking and certainly more than listening. Someone who has written extensively on a topic is almost certainly well-versed in it, and more so than when they began (even crackpots are generally masters of their own flawed theories.)
I'm sorry, Plato wasn't thinking about the effects of the written word on social issues over time and space?!
The whole point of his writing it all down, the very literal foundation of ~2500 years of philosophy and the bedrock of Western (and other) civilizations, was so that he could have an effect on social issues over time. That was the whole point! He is super clear about this in his writings and throughout his life and travels.
> You have invented an elixir not of memory, but of reminding;
> What he didn't consider was the effect the written word had on a social scale in terms of being able to exchange ideas over physical and temporal distances.
Why would his words be less potent when applied to billions of people, as opposed to only a few? Are those billions not just as susceptible to fallacy as the elite few?
Plato's warning is true, actually. Since I started to write, I remember less and less about the things I have written down, because the brain replaces them with a pointer.
However, writing for thinking and writing for storage are two different things, and they activate different parts of the brain.
I'm an avid pen and paper user, and using a real pen on paper allows me to think much more deeply. It regulates thinking speed, so things don't escape and my focus doesn't get derailed. It also changes how the brain works, so it can think better.
Also, blogging and writing documentation at work made me a much better thinker, because converting abstract concepts into meaningful sentences with context and an easy-to-understand structure also affects how clearly you communicate in other parts of your life.
Offloading this really robs you of the joys and power of meaningful communication on many mediums.
I know this is HN, but can we please stop drawing these vague parallels between basic programming concepts and the functioning of the human brain?
No, the brain does not "replace written information with a pointer". I cannot tell you with complete confidence whether writing stuff down improves recall or diminishes it, but I can tell you that you can construct computer analogies in support of either case.
I could posit that writing actually improves your memory, because the minute movements of your arm require more of your brain to compute, therefore creating more neural connections and possibilities of recall.
This is likely just as wrong, but demonstrates that thinking up some vague explanation is insufficient to make a point about a system as complex as the human brain.
> I know this is HN, but can we please stop drawing these vague parallels between basic programming concepts and the functioning of the human brain?
No, because I literally don't remember my to-do list after writing it in my EDC notebook. The only thing I know is that it's in the notebook. So I only know the address of my to-do list. I'm a literal sitting duck if I leave my notebook at home.
> I cannot tell you with complete confidence whether writing stuff down improves recall or diminishes it ...
I can tell you, because I've observed myself over the 5+ years since I started consistently writing my daily plans in my notebook.
> I could posit that writing actually improves your memory, because the minute movements of your arm require more of your brain to compute, therefore creating more neural connections and possibilities of recall.
What it improves is not the "verbatim memory" but the concepts you work on. Working on something slowly, with a reflective medium (i.e. pen & paper), allows your brain to refine the thing you're working on and store it as a concept, rather than a memorized string.
> This is likely just as wrong, but demonstrates that thinking up some vague explanation is insufficient to make a point about a system as complex as the human brain.
I have read a paper supporting the claims I made in this comment, but I failed to find it on a whim. Will update/reply to this comment if I find it. I use these analogies because of that paper.
> I know this is HN, but can we please stop drawing these vague parallels between basic programming concepts and the functioning of the human brain?
Why? Those are perfectly good parallels, and quite specific at that. Forgetting the thing, and replacing that memory with the memory of the location of that thing - that's an "indirect reference", i.e. a pointer in programming terms.
> I cannot tell you with complete confidence whether writing stuff down improves recall or diminishes it, but I can tell you that you can construct computer analogies in support of either case.
You can construct analogies to anything in support of anything, what matters is whether the analogy is bringing the reader towards or away from the truth.
Also, even when you have two similar analogies pointing in opposite direction, that doesn't mean either is wrong - they may be talking about two different things. For instance:
> I could posit that writing actually improves your memory, because the minute movements of your arm require more of your brain to compute, therefore creating more neural connections and possibilities of recall.
I believe that to be true just as much as GP's pointer analogy. In fact, I attribute my learning of irregular verbs in English directly to my grandfather insisting I copy the verb table by hand several times. Doing it, I went from near-zero recall to near-perfect recall in two sessions (with each having me copy the table once or twice).
Now the difference here is, in case of irregular verbs, I wanted to memorize them. The act of writing the information down myself somehow helped commit it to memory, perhaps because the brain was more focused, or because I was more invested, or a bit of both.
In case of GP's (and mine too!) experience with TODO list, the act of writing serves the explicit purpose of getting the thing out of your head. This is the opposite of memorization - you literally want to forget about tasks you write down, so they stop circling around in your head, distracting and stressing you.
Incidentally, when I write some TODOs by hand, my immediate recall of them actually improves, if I try to check. But I don't, because I don't want to reinforce those thoughts. My brain understands, and before I know it, I no longer remember most of them in detail.
A librarian might say they create a reference card, a web person might say they're creating a hyperlink.
In all cases I think parent post is trying to communicate that they are putting the large mass of information elsewhere, and only retain a reference locally. (implementation details notwithstanding)
A fun fact about the biological machine is that we could maybe make it "replace written information with a pointer". In planarian worms we can use a voltage-sensitive dye to see a "memory" the cells fall back to in case of injury, and they regenerate based on that "memory"; however, we can alter that "memory" to say "build a worm with 2 heads" instead of the regular worm with a head and a tail [1].
The abstract idea of a 'pointer' is a piece of information to the location of some other information. It is not even an analogy - it is literally the same thing, differing only in implementation details.
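A minimal sketch of that kind of indirection (the notebook file name is hypothetical, just to make the idea concrete):

    # Keeping the information itself vs. keeping only a reference to where it lives.
    todo_list = ["buy milk", "call dentist", "renew passport"]   # the data itself

    notebook_path = "edc_notebook.txt"                           # hypothetical notebook
    with open(notebook_path, "w") as f:
        f.write("\n".join(todo_list))

    todo_pointer = notebook_path   # all you keep "in your head" is the location

    # Recall now requires dereferencing the pointer -- no notebook, no list.
    with open(todo_pointer) as f:
        print(f.read().splitlines())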
An LLM is not a conscious entity, and while it can babble and assemble meaningful sentences, it has no guarantee of correctness, a baseline honesty, and other small bits and pieces we value in a conversation. It's just a sentence builder.
As a result, it can't replace a real human being backed by real experience and thoughts, hence it can't be as useful. I converse with myself, too, but it's not the same either with a different human being or an artificial construct which can babble meaningful sentences.
Currently, we simply ignore or don't understand the fact that the meaning of a sentence is not built by word order alone. There are higher orders of information carried implicitly, and they are not articulated in words. Hence an LLM cannot replace, or even mimic, a real human being in a conversation.
> no guarantee of correctness, a baseline honesty, and other small bits and pieces
Humans for sure do not guarantee this either.
I think of LLMs as an amazing rubber duck. It's heard of everything, and it always responds with something that sounds like it came from the same sphere. You have to use your own mind to figure out if it's meaningful, but this is not so different from conversing with a person. People can babble too.
> There are higher orders of information carried implicitly
You're not always looking for those in a conversation. Sometimes you really are just checking that you've thought things through. Like if someone asks you what the arguments in favor of democracy are, you want a list of points so you can check you haven't forgotten something.
However, humans are more nuanced than that. One might remember wrongly, or act in bad faith. This is why I said "baseline honesty". A person's traits definitely affect how their words are perceived, and this is not carried in the sentence itself.
> I think of LLMs as an amazing rubber duck. ... People can babble too. (Snipped for brevity)
Humans are not as random as an LLM.
> You're not always looking for those in a conversation. Sometimes you really are just checking that you've thought things through.
Again, the person you're asking or answering has their own character, and both their words and yours are affected by that implicit knowledge. This is a background process we're not aware of unless we dig into ourselves and look for it.
> Humans you think you're conversing with online, not so much.
Two decades of experience show me otherwise. Even your two comments show a consistent tone. We've only just started to discuss, yet I have already begun to build an image of you thanks to your comments.
This is one of the mechanisms we don't fully understand and don't dare to dig into much, because tinkering with people's minds is dangerous.
But it's a pretty amazing rubber duck that has read the entire internet and thus can correlate your ideas with the ideas of all of humanity in a split second on the spot.
Whilst by no means a new idea, physical and digital zettelkasten may interest some folks more than conversing with an LLM. Some people describe it as conversing with yourself or having a second brain. I don't find discourse with an LLM to be productive, but I do find surfing my zettelkasten to yield new and novel ideas, especially when it comes to problem-solving and research.
However, I'm still on industry forums and communities like HN to hopefully have new and conflicting thoughts thrown my way by peers. That's my primary concern with LLMs, a lack of fresh perspective that has all the nuance of experience and understanding behind its output.
If convincingly arguing about something was the only thing we needed for proving correctness, we wouldn't have the scientific method.
Or, if word order alone built the meaning, devoid of character, we wouldn't be praising authors for their character-building skills and for embedding things not written or spoken into their stories, regardless of the medium in which they're presented.
To apply the scientific method and prove correctness, you need to define the subject first. Meaning is ill-defined, just like consciousness, intelligence, and others.
LLMs can clearly grasp higher-order abstractions and concepts just by reordering words. In fact, embeddings (developed much earlier) are specifically intended to represent the semantic meaning extracted from text using just statistical methods; until the introduction of transformers, they lacked a good architecture to demonstrate the usefulness of that (a toy sketch of the idea follows below).
This makes many people argue that being a sentence builder is enough to be intelligent, as they also received most of their intelligence from the same source (concentrated experience of someone else - social intelligence).
"Honesty" and other forms of self-introspection is just a high level construct which current models aren't trained for, and likely not a fundamental issue with being a stochastic parrot.
I agree. An instance from today: I have a very novel task at work that nobody seems to know how to approach. I had a hunch that metaheuristics could play a role, so I used ChatGPT to help me better formulate the problem, and at least now there is a way to tackle it.
It made a few errors in at least presenting my ideas, but these errors were a consequence of my misunderstanding and lack of clarity.
The errors themselves, in my opinion, are invaluable because they force you to think, to be clear, and to guide the model into giving you useful ideas to look into.
For all intents and purposes, it was an example of the Socratic method, albeit inverted, where the student asks questions of an all-knowing teacher, and the teacher / LLM responds with ideas and hints. Ultimately, it's up to the student to synthesize the solution, be critical of the data, and tie everything together.
Most of the comments here are assigning this as Plato's opinion.
I'd just like to point out that Plato very rarely wrote in his own voice, so it's very hard to say whether the views being expressed are his or not.
In this case however, this is almost certainly an expression of Socrates' views, not Plato's. Not only because it's in the voice of Socrates but also by what's transparent in their actions: Socrates didn't leave anything in writing and Plato left us arguably the most important written corpus of classic Greek philosophy.
Maybe he felt ambivalent about it, but he certainly thought there was a value in the writing.
Plato turned out right. I'm not extremely old and I still remember time before the Internet, when to know something, you had to look it up, which took ages, so you simply had to know it. Yourself.
We measured intelligence by the things you knew. We still have shows with questions based on this concept. This concept no longer makes sense in the modern world. But it used to.
Writing is a crutch, Internet even more so, and AI even more so. Eventually you can build the entire thinker out of those crutches and you need no humans anymore.
We measured intelligence by the number of memorized facts we'd internalized, but little did we know that collecting more facts doesn't make us more intelligent
Well, "intelligent" has always been a murky term, and as tech is rapidly surpassing us in every way, it'll become even more meaningless as times goes on.
It's true. People can memorize hundreds of thousands of words of text, and this ability was commonplace among learned people; the Pali canon is an example. After a month in a forest monastery, and during the four months I was there, I found myself remembering huge amounts having just read a text, with no intentional effort to recall. But afterwards, when exposed to the internet and sensory stuff again, I found my ability to recall texts had gone.
"What is the cause of people forgetting texts they had learned before?" I can't remember the precise answer but that was one such line in the texts.
Today if you say that, you sound kinda crazy.
But like Socrates, there are still a few wandering, mendicant, homeless-like orders out there (the Theravada forest tradition being an example).
> [W]hen exposed to the internet and sensory stuff again [I] found my ability to recall texts had gone.
The overstimulation could be part of it as well, couldn't it? There aren't a lot of other information-rich inputs in a monastery.
Holding a summer school in a forest or on an isolated resort seems like exploiting a similar idea, and it does work quite well (for a month or so, before isolation sets in).
It may be true that writing makes people worse at remembering unaided, but better if they can check their diary. I imagine ChatGPT-type things may likewise make people lazier at unaided thinking but better at AI-aided thinking.
I tried asking ChatGPT what it thought and it came back with:
>... it's also important to note that technology is a tool, and its impact largely depends on how it's used. For instance, AI writing tools ... can be used to augment human creativity and productivity, rather than replace it. These tools can help writers brainstorm ideas, overcome writer's block, write more quickly, and even learn to write better by providing examples and suggestions.
>Moreover, the use of AI tools might also stimulate new forms of thinking. Just as calculators didn't eliminate the need for mathematical understanding but rather allowed for more complex problems to be tackled, AI writing tools could help individuals refine and expand their thinking, enabling them to tackle more complex writing tasks or express ideas more effectively.
Which is kind of interesting - I didn't think of writer's block on my own, for example. Also, lazy thinkers at the moment probably tend to just go along with what they see in the media. AI-aided thinking could improve on that.
I’ve argued that if ChatGPT means that motivated students spend less time doing expository writing, and more time:
* learning to ask good questions
* learning to fact check
* learning to edit the writing of others
Then they have truly learned the Socratic method, which is a much more valuable tool than writing from scratch solipsistically: being able to actively engage with the writing of others.
Whether this applies depends on what situation you are supposedly replacing by writing.
If you replace intense IRL dialogue and discussion with writing, this might be the case.
On the other hand, if you replace just letting your thoughts fade into the void with writing them down as coherent, thought-out ideas, I would argue that the writing has actually both triggered retention from memory (thus exercising the memory) and helped you critically assess and clarify your own thinking.
Thus, I think writing is an extremely helpful tool for processing your thoughts for most knowledge workers who are not already involved in constant dialogue with others (which is most of us I guess).
Actually, I always found this part much more interesting:
You know, Phaedrus, that is the strange thing about writing, which makes it truly correspond to painting. The painter’s products stand before us as though they were alive. But if you question them, they maintain a most majestic silence. It is the same with written words. They seem to talk to you as though they were intelligent, but if you ask them anything about what they say from a desire to be instructed they go on telling just the same thing forever.
Because that is no longer the case: an AI trained on a corpus of many books can in fact say something new.
Plato had it easy... When we were young, there was no memory. You had to work everything out yourself, each time. And it was a hard life having none of that fancy thing of memorizing but, frankly, life was better for it.
I have seen this happen before my eyes:
When I was young (~17 y.o.) I was able to remember all my family IDs, my friends' phone numbers and a lot of data about them. Now I rely a lot on my smartphone to tell me their phone numbers, their birthdays, their addresses, etc... (Sadly, today I don't even know my gf's phone number, out of pure laziness.) As I've become lazier and let the tools do the job, I'm losing my own skills at it, "because I need my brain for bigger things - sure".
Maybe kinda pedantic, but this is not Plato/Socrates saying this here; it's Socrates telling Phaedrus what King Thamus said to Thoth. It's kinda important, I guess, because Socrates is ultimately somewhat ambivalent on this issue by the end of the dialogue.
Sure, the game of différance [1] can be played till the end of the universe. Socrates is well known as an aporia [2] lover, but at the end of the day even he had to take an executive decision, leave the agora, and go home to the nagging wife [3]. Ancient Greek misogyny aside, Xanthippe can be interpreted as Socrates' aporia solver, in the same manner his 'daemon' [4] would only tell him "No", Xanthippe would tell him "Yes".
Especially for this quote, I like attributing it to Socrates as Plato did. We obviously don't know the details and nuances. However, I feel it captures the (perhaps inexact) tension between Socrates' old, mostly oral tradition of philosophy and Plato's newer tradition of written philosophy.
It's ancient Marshall McLuhan. Oral philosophy and written philosophy create different philosophies, not just a different medium for the same philosophy.
I can only begin to wonder what his opinion on Google search would be, since we (developers) always practice "don't memorize what you can easily search", and there's a sizeable number of people claiming that memorization shouldn't even be a concern most of the time - advice given to people who are new in the field (like myself).
I think understanding is more important than both. Even if you're cut off from the internet somehow, you can still re-derive things if only you understand them. And understanding also allows you to solve novel problems.
Do people really practice this? Like I usually physically unplug my computer from the network as I develop as it's such a distraction to have emails and stuff popping up. Doesn't really feel like a limitation to not be able to look stuff up.
I go in the completely opposite direction nowadays. I have chatgpt [*] up, I use google searches, I have online api documentation and tutorials open. I log on to IRC and chat with (other) skilled developers. (psst... I even read stack overflow ... sometimes [*]) ... so ... I definitely use the internet a lot while developing.
As mentioned above, I do try to research things to the point of understanding. Which is something human contact definitely helps with. To stay on topic: ChatGPT is obviously less good, but is often good enough, is patient, and guaranteed to answer you in a couple of seconds.
[*] My exact workflow would not quite fit into this footnote, but I seldom cut-and-paste answers, unless it's for something exceptionally non-critical. I try to understand what is written and then type out my own version (which should be very close), or at very least I type it across by hand rather than use ^C-^V. (the latter is an edge case). In general, there's this AI-koan about how some things only work if you also understand them, which has a kernel of truth to it. ( http://www.catb.org/jargon/html/koans.html first one about tom knight)
> For this invention [writing] will produce forgetfulness in the minds of those who learn to use it, because they will not practice their memory. Their trust in writing, produced by external characters which are no part of themselves, will discourage the use of their own memory within them. You have invented an elixir not of memory, but of reminding;
That is correct. I mean, some of that is exactly why I write at least 5000 words a day of journal notes while working. Because I want to offload everything from memory to a written form that is indexed.
> and you offer your pupils the appearance of wisdom, not true wisdom, for they will read many things without instruction and will therefore seem to know many things, when they are for the most part ignorant...
This part is where we veer into predicting how it will alter people, and it is both right and wrong. I think the education system has shown us it's possible to have a great memory, to remember many facts, and still be ignorant. At the same time, there's so much material to consume that people have surface knowledge of many things, and once you dig into a conversation with them, the ignorance pops up and it turns out they don't know much about the topic. I know this. This has been me on many occasions and is still me on some occasions.
>... and hard to get along with, since they are not wise, but only appear wise.
Again, there's some truth to this. I think about the times I've jumped into a conversation about politics with some half-baked knowledge that I read somewhere but failed to understand or recall correctly, and I think I must have looked a fool at those times.
But then again, I've course corrected here and there and I've spent time writing and sharing thoughts with others where I've used previous writings to pull up facts and my own synthesis, and I feel like I've been a better participant at times like that. In that way, writing has helped me steer away from ignorance.
I can't comment further though on the quote because I've never read the rest of the context so I may be missing out on some stuff.
I think Paul is on to something here though. I can't count how many thoughts I've had that I thought were clever but when I started to write the thoughts down and organize it on my own, I discovered for myself that either I didn't have enough evidence to prove myself entirely or that I was just flat out wrong. Writing is a form of thinking in my opinion. And if we don't write, we do lose that side of thinking too. I do appreciate though that Paul doesn't go further into predicting the consequences as Plato's warning does here. But I'll be willing to bet that Paul is right that there will be some negative consequences for sure.
Socrates' main argument is that the word itself cannot teach, and that teaching takes two parties, which with writing are often a student and a text instead of a student and a teacher. If Socrates sat down and explained something to a student, he could impart not only the written word but other examples from their own lives. But the word can be uprooted from its context and read by many who will learn just the word instead of the meaning of the whole. There's even a whole class of philosophy that argues _inside_ the definitions of words.
Books (largely) aren't written for memorisation these days, with possible exceptions of some poetry, or children's books.
Classic oral literary traditions are based on rhyme, meter, repetition, simile, metaphor, and references to cultural touchstones and themes, for the most part, all of which strongly assist with memorisation. These are not only necessary for a literature which isn't written down and instead is passed on through generations orally, but also quite likely represents a survivorship bias in that works which didn't exhibit these patterns didn't survive. And of course the versions we know are the end of a long pipeline of transmission (or the world's largest and oldest game of telephone), captured in writing and then passed on to us as (among) the first written traditions.
I've committed a few poems and homilies to memory, one example for nearly thirty years now. I do refresh that recollection from time to time (and realised I was dropping a stanza consistently). It's an interesting counterexample.
Then again, I turned up a piece earlier today I'd written myself about eight years ago and had all but no recollection of.
The point was that with Plato (but also ~2,000 years before his time, with the Sumerians, who probably invented writing) we were already discovering that we don't have to bring the exterior into an inside which we call "self", memory, we can actually leverage stigmergically [1] the environment to enhance and extend the "self", i.e. by writing, leaving a mark; the Greek word grapho γράφω [2] meaning literally to carve.
Not sure if LLMs will enhance our current "self" by themselves, but once we get a chip or two inside our brains, once we start controlling the cells to regenerate or grow specific limbs [3], we will surely consider our present "selves" as tiny as we consider the "self" of one of our cousins, the chimpanzee, in a zoo today. The journey and expansion of the "self" is merely starting.
[1] "the trace left in the environment by an individual action stimulates the performance of a succeeding action by the same or different agent", https://en.wikipedia.org/wiki/Stigmergy
The thing is our bandwidth is limited and we allocate it differently now. However, not all replacements are beneficial and adding external dependencies adds points of failure.
Challenging yourself is a conscious decision. Make choices to challenge yourself, and don't build an echo chamber.
It's remarkable what some 200,000 people (maybe fewer?) in ancient Athens were apparently able to come up with.[1]
They even did pretty well with computing with the Antikythera mechanism and its predecessors; though, like Babbage, they were limited by the manufacturing technology of the day.
Unironically one of my favorite uses of chatgpt is to tell it to rewrite things as a Homeric epic in the style of Tennyson so it's more pleasant to read aloud and I have a better chance of remembering it
There were the same warnings about the printing press, both books and newspapers; "kids don't play anymore, they read books all day", or "people don't share news anymore, they all read their own newspapers".
I'm not going to deny it's a change, but it's too early to decide if it's a bad change. Every generation has its own big shift in the past 100-200 years, and every generation complains about the change.
"For this invention [writing] will produce forgetfulness in the minds of those who learn to use it, because they will not practice their memory. Their trust in writing, produced by external characters which are no part of themselves, will discourage the use of their own memory within them. You have invented an elixir not of memory, but of reminding; and you offer your pupils the appearance of wisdom, not true wisdom, for they will read many things without instruction and will therefore seem to know many things, when they are for the most part ignorant and hard to get along with, since they are not wise, but only appear wise." [1]
[1] Phaedrus 275a-b, http://www.perseus.tufts.edu/hopper/text?doc=Perseus%3Atext%...