Sorry if my comment didn’t give enough context. I’m not the OP, so I’m not asking any questions.
I was interpreting the parent comment as saying the spark of consciousness only needed a cost function.
Personally, I disagree that our current neural nets are accurate representations of what goes on in the human brain. We don’t have an agreed-upon theory of consciousness, yet ML businesses spread the idea that we have solved the mind and that current LLMs are accurate incarnations of it.
More than AI replacing current human jobs, I worry about what we will lose if we stop wondering about the universe between our ears because we think we already know everything there is to know.
Just add a cost function.