Like, we'll go from calling this thing "AI" to just thinking that systems lacking these basic cognitive functions are stupid and tedious.
Decades ago, you might have considered spell checking a form of "AI". Now, maybe we would consider an advanced grammar checker "AI". Maybe in five years, we will appreciate "semantics checkers" that check if what we say actually makes sense, or "pragmatics checkers" that make sure we aren't using an inappropriate tone of voice.
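To make the point concrete, here is a toy version of the kind of spell checker that once would have counted as "AI": suggest the closest dictionary word by string similarity. This is just a sketch using the stdlib's difflib; the tiny word list is my own illustration.

```python
# A toy "spell checker": suggest the closest dictionary word.
# The word list here is illustrative, not a real dictionary.
import difflib

DICTIONARY = ["semantics", "pragmatics", "grammar", "checker", "sentence"]

def suggest(word):
    """Return the closest dictionary word, or the word itself if nothing is close."""
    matches = difflib.get_close_matches(word, DICTIONARY, n=1)
    return matches[0] if matches else word

print(suggest("grammer"))  # -> grammar
```

Nobody would call this AI today, which is exactly the point: the label slides off techniques as soon as they become routine.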
In a way "AI" is just a name for the frontier of making computers work for us.
But just like you don't need to be a compiler engineer to use advanced programming languages, you probably won't need to be a machine learning engineer to build user-facing systems.
So a lot of "AI" companies are focusing on building APIs for other systems to utilize. Double buzzword whammo: AI in the cloud!
Buzzwords are essentially generated by investors, used by companies, and then consumed by the media. By the time you hear about something from the media, it's already too late.
I am not in the field and I am not even a developer, but I do enjoy building trading algo robots on the side (they are quite successful, so I am not sure which is the side anymore, my real job or the side project that earns me more each month), and I keep a close eye on the so-called AI space, because I believe my systems will be profitable as long as humans are involved in trading the market (either manually or with algos)... Essentially, what we see today are rule-based and data-driven systems, but the logic behind them is all human, with its own greed and fears (especially in the stock market context).
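A "rule-based, data-driven" trading signal can be as simple as the sketch below, a moving-average crossover I'm inventing purely for illustration (the thresholds and window sizes encode a human hunch, which is the commenter's point: there is no machine intelligence in it).

```python
# Toy rule-based trading signal: a moving-average crossover.
# All parameters (fast=3, slow=5) are arbitrary human choices.

def sma(prices, n):
    """Simple moving average over the last n prices."""
    return sum(prices[-n:]) / n

def signal(prices, fast=3, slow=5):
    """'buy' when the fast average is above the slow one, else 'hold'."""
    if len(prices) < slow:
        return "hold"  # not enough data for the slow average
    return "buy" if sma(prices, fast) > sma(prices, slow) else "hold"

prices = [100, 101, 99, 102, 104, 107, 110]
print(signal(prices))  # fast SMA (107.0) > slow SMA (104.4) -> buy
```

The "data-driven" part is just the price history; every decision boundary was hand-picked by a person.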
I think real AI will come from the military (as have all the other great life-changing inventions), so what I am looking for, to determine whether the singularity has already occurred, is something like this: a news report that a drone swarm entered an ISIS-controlled city and killed all the terrorists without a single civilian casualty...
I hope you get the idea, and who knows, real AI might be as greedy and fearful as us humans, which would not reflect well on the human race, to say the least...
Your singularity indicator example is very Hollywood-esque. Considering what an AI system would have to do just to determine who to target in such a scenario is much more illuminating. There are many paths to the singularity and strong AI, and not all of them involve software running on hardware; human augmentation and group minds are two others. Your scenario may very well be possible today, though not desirable.
As for my singularity scenario, it is just an example, but I do think it will come from the military, just like the jet engine, digital photography, the internet, GPS... only this time around it might not be the US military complex, which could make it even more dangerous... In the end, we all know what is most likely to happen when we meet a more advanced civilization, especially one home-grown and "infected" with "human DNA" (greed, fear, etc.).
The best thing you can do as a developer to determine if AI is relevant to you, is read about how AI is being applied (the use cases) and if they sound interesting, learn the methods. Eventually you will discover if and how they apply to your work.
I can't speak to blockchain but that is orthogonal to AI.
Working for a "digital innovation lab" where a big part of my job is sorting out things that developers can actually use to make cool products from Gartner report buzzword hell, I have a few thoughts about this.
Historically, AI has seen boom and bust cycles. AI, unlike other buzzword technologies, has a large cultural mythos around it - Asimov didn't write "I, EC2 Instance", after all. Thus, non-technical people have a very strong and often unrealistic set of expectations around AI, so every time there are technological advances, people expect that general AI is right around the corner, and are disappointed when it doesn't materialize. Right now, we're experiencing a boom cycle as a result of the emergence of deep learning techniques. After every boom, funding dries up and investors lose interest, but the techniques that really work stick around and become part of a developer's everyday toolkit.
Like all "black sorcery" technologies, AI has a ways to go in terms of building convenient tooling. Tensorflow is a huge improvement over writing low level CUDA code, but it's still too low level for folks without a strong background in mathematics and machine learning. That said, it's been improving in usability, documentation, and tooling, and just a couple of weeks ago I was part of a hackathon where we turned their pet recognition demo into software that detects objects of interest to my company in satellite images.
At an even higher level are a number of startups (such as Clarifai) that offer AI-as-a-service. In Clarifai's case, you can train your own image recognition models and apply them with a few lines in your favorite programming language (yeah, yeah, I'm shilling them a bit, but I really like their product ;-).
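To give a feel for what "a few lines" means at this level of abstraction, here is a sketch of calling a hosted image-recognition service over HTTP. The endpoint URL, payload shape, and header format are my own hypothetical stand-ins, not Clarifai's actual API; the point is only the shape of the integration.

```python
# Hypothetical AI-as-a-service call. The endpoint and JSON schema below
# are invented for illustration; consult the vendor's docs for the real API.
import base64
import json
import urllib.request

def build_predict_request(image_bytes, api_key,
                          url="https://api.example.com/v1/predict"):
    """Package an image into a JSON request for a hosted recognition model."""
    payload = {
        "inputs": [
            {"image": {"base64": base64.b64encode(image_bytes).decode("ascii")}}
        ]
    }
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": "Key " + api_key,
            "Content-Type": "application/json",
        },
    )

# Actually sending it (not done here) would return predicted labels:
# resp = urllib.request.urlopen(build_predict_request(img_bytes, key))
```

All the model training, serving, and GPU wrangling lives behind that URL, which is the whole pitch of AI-as-a-service.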
So at the end of the day, I think we'll all be building various kinds of AI into our products in the not-so-distant future, but you won't really need to go deep into tensorflow and similarly low level tools to do so.
Even if AI stalls out again for a while, you'll probably still have to integrate some piece of ready-made AI into a product at some point during your career.