This feels like alarmism to me. Technology-driven mass calamity almost always happens in the span of generations, not weeks or even years, so the idea of 'preparing' seems silly.
Alarmism is a perfectly valid response to alarming developments.
Technology-driven mass calamity, as you put it, follows no rule or law that forces it to play out slowly, and frankly, I can think of half a dozen examples where it played out fast. Most notably, nuclear weapon proliferation became a crisis within a few years of the invention of nuclear weapons (and those required enormous capital inputs to reproduce, whereas I already have a copy of LLaMA). Other examples include the arrival of firearms; the invention of crack cocaine; and the radio, which, we must be at pains to recall, was one of the key tools the National Socialists used to weaponize the latent hatreds of 1920s Germany.
Given that with minimal effort I've thought of several plausible counterexamples, and that you self-report as starting from your feels about alarmism, it seems to me that you're operating from a general dislike of alarms -- "just unalarmism", I could say. A good, solid, pragmatic stance that is simple, straightforward, and, given the technology in question, probably wrong.
I'd love to keep the conversation focused on facts rather than feelings, and on predictable, inferable consequences rather than on historical rules of thumb (unnamed and uncited, at that).
Maybe invest in some AI companies, if you want.