Communicate the concept of "green" to me in text.
The sound of a dog barking, a motor turning over, a sonic boom, or the experience of a Doppler shift. A Beethoven symphony.
Sour. Sweet. What does "mint" taste like? Shame. Merit. Learn facial and object recognition via text.
Tell a boxer how to box by reading?
Hand-eye coordination, bodies in three-dimensional space.
Look, I love text, maybe even more than you do. But all these things imbue, structure, and influence our text, yet they are not contained in it.
To make substantial inroads toward something that looks like human-esque AI, text is not enough. The division of these fields is artificial, based on our currently limited tech and the specialisation of our researchers and faculties.
When we read, we play back memories, visions, sounds, feelings, etc., along with inherent ideas gained through experiencing ourselves as physical bodies in space.
Strong AI, at least to be vaguely recognised as such, must work with algorithms and machinery that understand these things, and then work at the next level of abstraction to combine them into proper human-type concepts.
Of course, there is the question of why we would want to create a human-like AI. It's my contention that human-like AI isn't actually what many of us would want, but that's another topic...
I won't touch the qualia aspect, but everything necessary to flawlessly pretend to understand the color green, the sound of a dog's bark, or the experience of hearing a sonic boom can be represented with text. As an existence proof, you could encode an adult human as a serial stream of characters.
But if you must pretend to be sighted and hearing, there are many descriptions of green, of dogs barking, of motors, etc., scattered through the many books written in English (and other languages).
Are these descriptions perfect? Maybe not. But they are sufficient to mimic or communicate with humans through text. That's sufficient to beat a Turing test, to answer questions intelligently, to write books, novels, political arguments, etc. If that's not AGI, I don't know what is.
I don't propose that a human could lose all of their senses and still be able to communicate. But I do believe computers could, if they are designed for it. Humans are not designed to function without those senses.
Now we are just speculating. We believe a computer might be able to understand things it has no sense for, but that is speculation, totally untested, and it certainly can no longer be justified by pointing to human minds as an example.
That seems hard to believe.