When a weapon has the ability to kill indiscriminately (as in the above categories), it should be banned. AI should never be considered sufficiently able to discriminate to be trusted with a kill decision.
If there is no kill decision at all — it just waits and fires — then it is equivalent to a mine.
100% should regulate / should ban.
It would likely pass and be agreed to easily, just like the treaties on mines and on chemical and biological weapons.
Seeing how the US has yet to sign the Ottawa Treaty, I wouldn't call that "easily agreed on" by any means.
We already have computer-controlled missiles, right?