Four years ago, I determined that while development work might seem to be near the top of the food chain, there will come a point when my work is replaced by AIs.
This is not so different from how word processors replaced the specialist job of typesetters. Word processors make "good enough" typesetting. You can still find typesetters practicing their craft; the rest of us use word processors and don't even think about it.
At the time, I was learning to put the Buddhist ideals of emptiness and impermanence into practice, and to become more emotionally aware: the _main_ reason I had thought I would never be replaced by AI writing software had more to do with wishful thinking and attachment than with any clear-sighted look at the situation.
I also made a decision to work on the technologies that accelerate this. Rather than becoming intoxicated by the worry, anxiety, and existential anguish, I decided to try to face it. Fears are inherently irrational, but just because a fear is irrational does not mean you are not experiencing it. Fears are not so easily banished by labeling them irrational. Denial is a form of willful ignorance.
Now, having said all that, whether our tech base will actually get there, who can say?
Since then, I have been tracking things like:
- Viv, a chat assistant that can write its own queries
- DeepMind's demonstration of a Turing-complete machine built with deep learning and a memory module
I watched a tech enthusiast write a chat bot. He does not write software professionally. Talking with him over the months as he tinkered with it in his spare time, I realized that in the future, you won't have as many software engineers writing code; people will instead learn how to _train_ AIs once they become sufficiently accessible to the masses. Skills in coaching, negotiation, and management become more important than some of the fundamental skills underpinning software engineering. And like typesetting, I can see development work being pushed down the eco-ladder.
It's not surprising to me to see that Wired article about coding becoming blue-collar work. And even that will eventually be pushed further down.
Nor am I surprised by Google's site-reliability engineering book, branding, and approach. I have done sysadmin work in the past, and I can already see traditional, manual sysadmin work being replaced.
It's easy to get nihilistic about this, but that isn't my point here either. I know human potential is incredible, but I think we have to let go of our self-serving narratives first.
The second idea that interests me is the idea of very high technology. It is built upon layer after layer of very clever tech, year after year, and I wonder how long it would take to start again from scratch if some disaster rendered a large part of one of those layers unusable.
For instance, if you were on a desert island, could you (would you want to?) build some piece of tech? An electric generator would be useful, perhaps. How long would it take to build? You'd need knowledge, raw materials, plant, fuel, etc. It's not an easy problem. And that's way down the tech stack, before you even start talking about AIs. I suppose what I'm saying is that the AI layer sits on such high tech that it is inherently fragile, because it is so hard to rebuild.
I don't know! :-D
I don't know what society would look like from a purely technological point of view. From a spiritualist point of view, though, it could go either very well or very badly. When everything is automated, would people have enough time and space to really start asking the big questions? Or would it accelerate and intensify existential anguish?
> There are a small number of people reaping the benefits, and huge swathes of the population being marginalised and disenfranchised as a result.
Yeah. Arguably, this has already happened.
> The second idea that interests me is the idea of very high technology. It is built upon layer after layer of very clever tech, year after year, and I wonder how long it would take to start again from scratch if some disaster rendered a large part of one of those layers unusable.
The stuff of sci-fi :-D Alt-history novels among them (what happens when someone drops into a lower-tech era; you'd have to start from 0 ... literally, 0, as in the Arabic numerals).
Open Source Ecology is trying to preserve some of this tech base. I find their aims awesome, though I am not sure how effective they are.
The flip side is a perspective voiced from well outside the techno-sphere (by shamans and mystics, for example): that the further evolution of human consciousness will, at some point, no longer require technology or artifacts. Technology is seen as the last crutch. The collapse of a high-technic civilization then sets the stage for the removal of that crutch, and humans learn to stand on their own two feet (so to speak).
This isn't a direct rebuttal of the point made above, but I believe we will find the next big challenge for software to solve as soon as traditional problems are commoditized, automated, and considered solved. Also, just knowing how to code will not be enough. You must complement it with domain expertise to solve challenging, unsolved real-world problems.