> There's no one thing, not even a collection of things, that can be identified with what we think of as the conscious mind. Instead, we've got a whole bunch of different things, none of which has "consciousness" in a traditional sense, and these come together in a way that makes it seem as though we're conscious.
I just don't buy this at all. This seems precisely as I described it:
> "well, we have detectors for this and that and these predictive capabilities and these modelling systems, and so ... ta-da, we're conscious!"
Also, note the heavy lifting being done by "makes it seem" from the Reddit quote. This goes back to the basic problem: Dennett (and the authors in TFA) are describing what we are conscious of, what makes up our consciousness, but he and they are not addressing how it is possible for there to be any subjective experience at all.
I would go a little further, even: the whole reason why there is a sense of self is precisely because there is a singular subjective experience. You can figure out what drives that experience, and even note that it isn't rooted in any kind of singular and/or stable physical system, and that's actually really interesting. But that's not addressing how subjective experience is possible at all.
> I just don't buy this at all. This seems precisely as I described it: "well, we have detectors for this and that and these predictive capabilities and these modelling systems, and so ... ta-da, we're conscious!"
No, it's actually, "ta-da, we're not conscious! but here's why we think we are!"
> but he and they are not addressing how it possible for there to be any subjective experience at all.
Because neuroscience will do this by elaborating the mechanisms. Like in this paper:
An analogy for tech nerds would be how the illusion of multitasking on a single CPU machine arises from imperceptibly fast context switching. Something similar happens in that theory, where our perceptual faculties are constantly switching between signals from our internal representations and our senses, thus producing a simplified but false conclusion that subjectivity is present.
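To make that single-CPU analogy concrete, here is a minimal sketch (hypothetical, not from the linked paper; the names "senses" and "internal model" are just illustrative): one "processor" alternates between two streams in short slices, so a coarse-grained observer concludes both are running at once even though only one ever runs at any given instant.

```python
def stream(name):
    """An endlessly progressing 'stream' of work, one small step per next()."""
    n = 0
    while True:
        n += 1
        yield f"{name} step {n}"

def interleave(streams, steps_per_slice=1, total_slices=6):
    """Round-robin between streams, running each for a short slice."""
    trace = []
    while len(trace) < total_slices * len(streams) * steps_per_slice:
        for s in streams:                     # "context switch"
            for _ in range(steps_per_slice):  # run one short slice
                trace.append(next(s))
    return trace

trace = interleave([stream("senses"), stream("internal model")])
print(trace)
# ['senses step 1', 'internal model step 1', 'senses step 2', ...]
# Each stream shows steady progress; sampled coarsely, it looks like
# both ran concurrently, though that was never literally true at any instant.
```

The point of the sketch is only the shape of the argument: fast alternation plus coarse observation yields a simple but false conclusion of simultaneity, which is the role the theory assigns to the conclusion that a unified subject is present.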
> I would go a little further, even: the whole reason why there is a sense of self is precisely because there is a singular subjective experience.
And I'd say you're just telling yourself a retroactively edited story that there is a singular subjective experience in order to make sense of our own thoughts and behaviours. In fact, this sort of retroactive editing has been demonstrated multiple times.
> And I'd say you're just telling yourself a retroactively edited story that there is a singular subjective experience in order to make sense of our own thoughts and behaviours
So look, "The Intentional Stance" is for me one of the most important books I've ever read in this general area, and I totally buy all the stuff Dennett and others have built up around the idea that what we are conscious of is an edited, self-created, intention-injected model of our own selves (to whatever extent there is a unitary self to be a model of).
But I don't think that any of that addresses "how can we be conscious of anything at all".
In the quote I included above, who is the "you" that is telling and who is the "yourself" that is being told? But more importantly, what does "being told" mean? How does one have an experience (whether it is being told, or being cold, or being old)? It's not enough to say "we're not conscious, we just think we are" - the conundrum of consciousness is not about how humans think, but the fact that we have subjective experience (which may include lies told to ourselves by ourselves).
> In the quote I included above, who is the "you" that is telling and who is the "yourself" that is being told?
This is still begging the question by the use of "who". There is no "who", there is no self, there are only thoughts that refer to a "self", but the referent does not actually exist in the way that's implied by these thoughts; there's no spirit or homunculus in your mind to which "self" actually refers.
> the conundrum of consciousness is not about how humans think, but the fact that we have subjective experience (which may includes lies told to ourselves by ourselves).
I think the paper I linked is a good start on answering this question. Per my other reply to you, whether this kind of answer is satisfactory depends on what you take "subjective experience" to mean.
If you buy the thought experiments (p-zombies, Mary's room) that suggest some sort of "ineffability", then this explanation will not be satisfactory. Personally, I find none of those thought experiments remotely convincing.
> We argue that the attention schema theory provides a possible answer to the puzzle of subjective experience. The core claim of the theory is that the brain computes a simplified model of the process and current state of attention, and that the content of this model is the basis of subjective reports.
Sure, that's all fine. Subjective reports are interesting. But they are not the same as subjective experience. What we say about what we experience is no doubt complex, and has a complex relationship with actual brain behavior. But consciousness, at its heart, is not about what we report, it's about the experience of being something.
> No, it's actually, "ta-da, we're not conscious! but here's why we think we are!"
And of course that "think" has exactly the quality the hard problem is about. It's interesting how illusionists and eliminativists explain away aspects of subjective experience (SE) by invoking (other) aspects of SE. "You merely have an illusion of being conscious" - that illusion is the hard problem, so now explain that illusion. I could be having an illusion of an illusion of consciousness.
Imagine something that doesn't exist in the usual physical sense, e.g. a dinner table on the Moon. Does that table exist? Not in the usual physical sense. Your thought or imagination of it does, though. What is that thought or image in your mind's eye "made of"? Sure, you might be able to correlate it precisely with certain neurons, and yet you've not answered the question. You might call the mind's-eye table an illusion, but you're not going to deny that the picture of it exists in some sense.

Three things exist: the physical table, your neurons and, separately, although not entirely independently from the neurons, (the picture of) the mind's-eye table. Hence the latter is part of the universe, and the fundamental substrate of the universe must support it somehow, in a way that's different from ordinary physical-matter tables and neurons. Is your visual brain circuitry involved in the imagination, perhaps even generating the image in your mind's eye? Perhaps, but this doesn't answer the question. If we're nothing but our perceptions, then what the heck is that imaginary table that I'm visualizing quite well while there's no perception of an actual table? What are the physical laws characterizing such mind's-eye objects, somehow coupled to ordinary physical matter and yet not made of the same "stuff"?
Models like the one linked don't explain why SE exists in the universe. They posit certain physical/mathematical structures and claim that if this or that structure is present, then ta-da there is SE (or the illusion of it, which is the same thing). People in the Stone Age had a model of that kind: "this piece of matter, structured with two arms and legs - it's conscious". At some point we developed language and the model got a bit more precise by demanding the piece of matter emit certain sounds from a specific location on their body. What we have today is no different in kind. We've just become more precise at locating the pieces of human matter to verify the presence of consciousness (or illusions). None of that says why that configuration of neurons experiences or has illusions, only that it does. Science tells us that experience is in the nature of certain pieces of matter and we just have to accept that without further explanation, like the fact that electric charge exists and follows certain rules. Deeper "why" answers are out of the scope of current science.
> It's interesting how illusionists and eliminativists explain away aspects of SE by invoking (other) aspects of SE. "You merely have an illusion of being conscious" - that illusion is the hard problem, so now explain that illusion.
I've explained this elsewhere, but will repeat here: this argument relies on a definition of "illusion" that begs the question on the existence of a subject, just like Descartes. Define illusion as "a perception that directly entails a false conclusion", and there is no subject needed, and no hard problem remains.
It's like you're asking me to explain the dinosaur you saw while you were hallucinating. Sure, I agree we should explore the biochemistry and neurology involved in dream-like states that yield distorted perceptions that imply false conclusions about reality. Let's not go so far as to posit that those distorted perceptions are real if there's no corroborating evidence of their existence.
> It's like you're asking me to explain the dinosaur you saw while you were hallucinating.
No, it's not like that at all. We're not discussing the dinosaur. We're discussing the existence of hallucinations (and SE in general). The dinosaur is irrelevant; the fact that it was possible to have the experience is the central question.
Again, this comes back to my fundamental argument with Dennett (and one that he graciously conceded in an email back in the 90s; not sure he would do so now): trying to figure out what it is that we are conscious of, rather than how we are conscious of anything at all. I'm 110% ready to concede that everything we are conscious of is an illusion, an error, a projection, an intent-laden stance etc. I'm 110% ready to concede that everything we think we experience as a "self" is wrong.
None of that helps to explain how experience is possible. So you're either denying that SE exists, or like Dennett insisting that mysterious SE can be explained by non-mysterious stuff.