> One source stated that human personnel often served only as a “rubber stamp” for the machine’s decisions, adding that, normally, they would personally devote only about “20 seconds” to each target before authorizing a bombing [...]
Specifically: If _most_ of a task is automated, human oversight becomes near useless. People get bored, are under time pressure, don't find enough mistakes, etc., and just stop doing the review job they're supposed to do.
Nothing a few rounds of war crimes trials in The Hague, complete with gallows, won't sort out, especially if the court decided (oh, I can dream) that the execs and major investors of the company that implemented the system are culpable.
Given their system is in the decision tree for operations, any war crime committed during the course of said operations should apply. Did you want an abridged list of likely candidates? Because I just finished spending an hour poring over ICC Article 8 and the Geneva Conventions, and I have copypasta on hand ready to go.
I asked for a single, strongest form of your argument. That means an event and a law. You provided a reference to the law. This sounds like you don’t have an argument, just the most generic of sources.
Really? Because from where I'm sitting it sounds like you're trying to avoid the reading assignment. You don't wanna do your homework, that's your business, but don't expect me to let you crib my notes. Having directly addressed your nuisance attempt at shifting the focus of the conversation, let me bluntly remind you the original point was: IF war crimes are committed AND a company's product features prominently in the planning of said crimes, THEN it stands to reason that the executives and major investors of the company should share a slice of the responsibility for the war crimes their product helped enable. If you're looking to pick a fight over whether the Israeli army's evergreen struggle with correctly identifying aid convoys, UN aid warehouses, and bog-standard emergency response vehicles (all explicitly protected under international law) constitutes a war crime, take that nonsense to Facebook or X.
> where I'm sitting it sounds like you're trying to avoid the reading assignment
I’ve worked at the UN. I know the Rome Statute. You’re citing it wrong. (Also, your link doesn’t work.)
The operative law is also NOT Article 8, but the Geneva Conventions. Art. 8 is about giving the ICC jurisdiction, not about what is and isn't illegal. (The entire Rome Statute is about establishing the ICC as a venue. Again, not what is and isn't illegal.)
> IF war crimes are committed AND a company's product features prominently in the planning of said crimes, THEN it stands to reason that the executives and major investors of the company should share a slice of the responsibility for the war crimes
This isn’t how the Geneva Conventions work. (“Features prominently” doesn’t factor into jurisdiction nor criminality.)
But again, do you have an example of even an alleged war crime being committed where Lavender is being blamed? (10% error rate isn’t a war crime.)
I’ve been genuinely asking for facts on the ground, not misquoted international law. To my knowledge, Lavender hasn’t been cited in the targeting of an aid convoy—if anything, having that happen in code would make intent trivial to demonstrate.
Article 8 is about giving the ICC jurisdiction over prosecuting war crimes, and then it goes on to provide a list of them. I'm not filing a brief over here, so again, dispense with the pedantry. To the best of my knowledge Lavender hasn't been cited in anything yet; that would take a fairly comprehensive investigation, thus the IF featuring very prominently.
> I'm not filing a brief over here so again dispense with the pedantry
You repeatedly referenced a single piece of law and did so incorrectly. Now you're failing to bring any on-the-ground facts to the table. (Not asking for conclusive facts, just even a reasonable accusation.) It's fair to say you don't have an alleged war crime.
I mean, at the end of the day, whether it's an AI being rubber-stamped or a human rubber-stamping "minor intel" for a drone strike, it's still bullshit.
But blaming AI is just easier than acknowledging that at every step of this there's a human being OK'ing it: the war is OK'd by a human, the target list is OK'd by a human, the missile launch/bomb drop is OK'd by a human, the fucking trigger is pulled by a human.
But sure, because the target list was vetted by an AI, it's the AI's fault.
Brings the Ironies of Automation paper to mind: https://en.m.wikipedia.org/wiki/Ironies_of_Automation
Specifically: If _most_ of a task is automated, human oversight becomes near useless. People get bored, are under time pressure, don't find enough mistakes, etc., and just stop doing the review job they're supposed to do.
A dystopian travesty.