* AI potentially poses an existential risk to humanity, i.e., it might wipe out our species.
* AI is a trendier topic to write about than other threats: there are lots of possible pop culture references, and the general public doesn't need to feel guilty about it.
In comparison, something like climate change probably doesn't pose an existential threat to our species: it might merely wipe out a fraction (20%? 80%?) of the human population over the next hundred years.
Edit: to be a bit more on topic, do you think there are very dangerous technologies around today that would have been very difficult to anticipate, or seemed ludicrous to consider, 100 years ago? Would nuclear weapons in the Cold War have seemed like a serious concern or a viable development back in the 1850s?