Let's put aside the war crimes stuff for a second and think about this as Hackers.
Israel's military actions are so awful that they obfuscate the instruments of surveillance and warfare Israel's tech and defense industries have been quietly building.
Israel has a captive population at its borders that it is using to prototype tools to identify and track individuals, map their familial/professional/social networks, break open their communications and strip away any privacy of speech, and remotely execute (via drone) anyone who ends up marked for termination. All with stunning scale, if not necessarily accuracy. Some of this is mentioned in the WaPo article, some elsewhere (consider coverage of persistent surveillance tech deployed in the West Bank as well as Gaza).
This technology is not staying in Israel. The NSO Group's software is probably used by law enforcement near you to break into iPhones. Elbit Systems has been contracted by DHS for surveillance of the US-Mexico border, using camera and drone systems deployed in Gaza and Lebanon over the past decade.
The question you have to ask is whether you're ready to give up your last vestiges of privacy of movement or communication. Are you ready for the chilling effect that comes from that? Are you ready for what a government that has this technology, and the companies and capital that guide it, will ask of you?
If you are referring to the 20% of your comment after the anti-Israel bits, I agree that intrusive technology, security theater, etc. is concerning. That a company in Israel is a leader in this stuff, but certainly not its exclusive developer, is less relevant.
A few months ago I stated that we might be witnessing the first information technology genocide in history, similar to how the Holocaust was the first industrialized genocide. I received a lot of backlash for that statement; people (rightly) pointed to the Rohingya genocide in Myanmar or Nazi use of information technology, and afterwards I wasn’t as convinced.
Reading your post, I am convinced again. The use of information technology in the Gaza genocide permeates the entire chain of execution, from propaganda and information gathering to controlling the population, targeting civilians, and carrying out the killings and exterminations. There has been no genocide in which information technology was used for such a wide variety of tasks on such a massive scale, until the Gaza genocide.
An estimated 70% of buildings in Gaza have been either damaged or completely destroyed. The majority of the population has been displaced from their homes. Civilian casualty estimates range from 40,000 to 186,000 (according to earlier projections from The Lancet medical journal), though accurate counting is difficult since most hospitals are no longer operational and it's hard to retrieve bodies from under rubble. Polio and various other diseases are resurging among the civilian population.
Given that more bombs have been dropped on Gaza than were used in the World War II bombings of Dresden, Hamburg, and London combined, claims about AI-enabled precision targeting capabilities seem difficult to reconcile with the scale of destruction.
This article seems more like marketing for Israeli "defense" tech. And the AI here is really just a "get out of jail free" card for war crimes.
It turns out that you can use "precision strikes" to flatten whole cities if you carry out enough of them. Which you can do if you use AI to keep giving you targets until you run out.
Off the top of my head, I recall that the US has sent Israel 14,000 Mk84 2,000 lb bombs alone since Oct 7. The lethal fragmentation radius of the Mk84 is 400 feet. The Strip is about 140 mi^2, so you're talking about 100 Mk84s per square mile, and that's just this particular munition sent from the US to Israel for this specific purpose.
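A rough back-of-the-envelope check of that density figure (purely illustrative; it just re-runs the numbers as quoted above, which I haven't independently verified):

    # Back-of-the-envelope density check using the figures quoted above (assumed, not verified)
    bombs_sent = 14_000       # Mk84 2,000 lb bombs reportedly sent since Oct 7
    strip_area_mi2 = 140      # approximate area of the Gaza Strip in square miles
    print(bombs_sent / strip_area_mi2)  # -> 100.0 Mk84s per square mile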
Bogus stats. By those statistics, even one bullet scratch on a building would count it as damaged.
From all the videos I've seen, maybe parts of Rafiah and the far northern parts of Gaza are partially flattened. In one year, just a couple of hundred buildings is really nothing, and at that rate it would take them decades to flatten everything.
They would need something like the Dresden bombing campaign, but that won't happen because of the hostages... and Hamas knows it, which is why it drags out the ceasefire talks. It will never end.
> They need something like Dresden bombing campaign but that won't happen because of the hostages
You cannot bomb Lebanon like Dresden because fucking civilians live there, not because Israeli hostages are held there. If you advocate for carpet bombing Beirut then you shouldn't act surprised when dozens of countries advocate for cratering Tel-Aviv. It is unacceptable behavior and tantamount to terrorism, regardless of who does it.
Thankfully, the rest of the international community is holding both sides accountable since Israel refuses to hold anyone to a moral standard.
> In 2014, the IDF’s acceptable civilian casualty ratio was one civilian for a high-level terrorist, said Tal Mimran, a former legal adviser to the IDF. In the Gaza war, the number has grown to about 15 civilians for one low-level Hamas member and “exponentially higher” for mid- and high-level members, according to the Israeli human rights organization Breaking the Silence, citing numerous testimonies from IDF soldiers. The New York Times reported the number as 20 earlier this week.
This is key new information (to me; maybe it’s old news to people following the situation closely) that’s more horrifying than the AI angle. And given the liberal criteria for being classified as a combatant, and the probably downplayed civilian casualties, the ratio has gotta be substantially higher than the already appalling official threshold.
It's a variation of the tragedy of the commons problem... The rules of war are meant to protect civilians, but what do you do when an opponent abuses it by using civilians as human shields? And when civilians even harbour the enemy?
Should Israel just let Hamas attack them without consequences since Hamas hides among civilians?
That's not what's going on. Read about Lavender and Where's Daddy, and how they were configured to kill whole families + neighbors.
It's more similar to collective punishment: demolishing the houses of terrorists, as in the West Bank.
But now killing the whole family, plus neighboring families, instead.
It just doesn't make sense to accept 15 civilian fatalities to kill a Hamas militant who didn't even participate in the October terror attack himself. (Maybe he would have if he could, but he didn't, for whatever reason. About 3,000 attacked Israel, but there are something like 30,000 in Hamas in total.)
Consider the same situation with individuals rather than states. If you're attacked, and the attacker hides behind an innocent bystander, is it okay to shoot the attacker along with the bystander to defend yourself? FWIW, laws pretty much everywhere say no to that.
Most countries have laws saying if you aid a criminal, it's the same as committing the crime yourself. Gazan terrorists are hiding among their friends and families, not random people they took hostage.
They are trying to separate out the bad guys; you are making a strawman. If Israel were trying to kill them all, it would have done so a long time ago.
Per TFA, they consider 20 civilian deaths to be acceptable collateral for one Hamas combatant. I don't know in what world this could possibly be called "trying to separate out the bad guys", but I don't live in it.
An individual Gazan, or Gazans as a people? Before the war, many western countries took in Palestinians so that was a way for individuals to opt out.
Anyhow, the answer right now is fight back. Stop hiding hostages (assuming any are still alive), stop supporting family members who are terrorists, kick them out of your neighbourhood and grab a gun and fight back. Presumably the terrorists are outnumbered by those who aren't terrorists...
The question isn't if you're attacked and the attacker hides behind an innocent bystander and does nothing.
It's if you're attacked, the attacker hides behind an innocent bystander and continues attacking you, while at the same time he's holding your kid hostage and torturing him.
In that situation, I think most moral and legal systems say it is ok to risk the innocent bystander to save your child, if there is no other way to get him back.
(Of course, this isn't a perfect analogy either. Hamas is still holding around 100 hostages while hiding behind innocent people, but it's very much arguable whether "there is no other way to get the hostages back" at this point, among other things.)
Perfect analogy, except that instead of killing the attacker and the bystander to get the hostage back, you also take out 40 other bystanders in a 200 m radius and, just to be sure, level every building in that radius to the ground.
Of course a simplistic analogy won't be good, on either side. In reality it's not a lone attacker, it's a paramilitary that is also the governing body of Gaza. It's not like a cartel invading Texas - it's like the Mexican government sending its army to invade Texas, then hiding its army amongst civilians.
And all of this is happening in the broader context of the Mexican government believing that the US has stolen its land and trampled on its rights for the last 75 years.
If millions of people organized and did that, it would be different; extremely rare events are handled very differently than massive organized behavior.
If you refused to shoot innocent people when fighting evil dictators, it would be trivial to beat you in a war: make a million soldiers each hold an innocent person in front of them, and you can't win. They would go and take over the world, and then you would be ruled by terrorists. Nobody wants that.
The navel-gazing about AI is a stupid angle. It sounds like a typical corporate tale of the untouchable new boss who has determined that the software he brought in is infallible, “emperor has no clothes” style.
The real story is that they formulated a much higher acceptable ratio of collateral deaths and changed the internal processes to make it impossible to escalate problems. To the point that they allowed Israel to be attacked because nobody could report what they saw up the chain of command.
End of the day, if I’m allowed to blow up a crowd to kill one dude, more people are gonna die. The AI angle is a volume knob.
Unasked questions are:
Who exactly demanded the higher target count?
Did the flooding of the combat arms teams with targets allow important targets to escape?
Did this program advance the goals of executing a war? Or did it cause a needless war? Or did it primarily serve a political purpose?
This is important to think about, but more important is to never, ever let this sort of nonsense stop you from blaming a human.
If people outsource "target detection" to some software, they personally are responsible for what it does.
If people outsource the actual murdering to some software, they personally are responsible for those deaths.
A human made the decision to permit this, and the entire consequences of it should be on their head. We can never let the people who wrote and/or approved the software escape blame for what it does.
We're already seeing this with self-driving cars - which individual at Uber should be serving a sentence for their cars killing someone? Or did we accidentally make murder completely legal as long as you obfuscate the causal chain enough?
Indeed. We also need to introduce personal responsibility in corporations at large. Corporations = people, but for now they have a free pass at poisoning the living, killing whoever they need to in places devoid of the rule of law, and manipulating politics at large. Heck, some corporations even benefit from in-house police powers and privatized legal systems (where permitted). But that wouldn't be enough. We also need a reformed legal system where crimes are not considered the fault of someone, but, more aptly, an evenly shared social failure. Because in the end, we are all responsible for the crimes we allow to be perpetrated. When someone rapes a woman, kills another, or does anything against anyone, we should always question society and ask: how come we the people let that happen? Crime is only an answer to social disengagement, never an isolated incident. Yes, put aside perpetrators until they realize what they've done. But never lose track of the fact that society is sick and must be healed. Justice as we know it is nothing but an organized revenge system. It will never lead to sanity unless root causes are systematically examined.
What are these dramatic sentences that people are supposedly serving for killing someone? Perhaps we can have Travis or Dara serve the same sentence as someone from their city, SF, who drives drunk onto the other side of the road and kills a bicyclist: exactly nothing.
Or perhaps we should have them serve the same sentence as someone who takes a turn across a pedestrian walk signal and flattens a toddler and her father. No jail time? Community service?
What’s this idea that this is a crime in any sense? I’m sure Uber can hire someone to do community service per killed pedestrian. That does simplify matters.
Now, let's continue this line of thought. What if people die because your AI told you to not take action when you could have easily intervened to save a life?
That's not a serious issue, and, going by US law at least, you're never obligated to intervene to save someone unless you created the situation that threatened them, have a specific special relationship with that person (e.g., a parent), or have accepted a role where duty to rescue is part of your obligations (e.g., a firefighter).
As an uninvolved third party witnessing some harm you will face no penalty for not intervening.
It is, of course, encouraged to intervene and laws have been written to ensure that a reasonable effort to intervene won't cause you legal issues but you are never obligated to do so.
Oh - well in that case a health insurer has a contract to provide reasonable care... but I think it's pretty clear (especially with UHC's absolutely insane 90% inaccurate rejection bot) that there are bad faith actors involved here.
To be slightly more depressing - before AI auto-rejectors existed companies would just hire doctors and pay them in a way that encourages (I'd hazard to say intentionally) doctors to deny claims as quickly as possible - Cigna was recently caught[1] doing this to an extreme.
Far from being an original line of inquiry, this is covered by the legal concept of Duty to Rescue. Short story is it differs by country, but generally you're not required to save someone's life. In some places you are required to at least contact authorities or emergency services if able.
In the real world, “computer said to” provides just enough plausible deniability for _no_ human to be _ever_ blamed by the action the computer decided on.
Israel used ML to generate targets for its genocide in Gaza.
Fixed the title, given that it's so "off the mark", to use the IDF's words. Let's break it down.
AI to generate targets:
To maintain the war’s breakneck pace, the IDF turned to an elaborate artificial intelligence [...] which could quickly generate hundreds of additional targets.
Flawed, since any engineer reading this knows how inaccurate these models can be. Imagine killing people based on those predictions.
Reviewing reams of data from intercepted communications, satellite footage, and social networks, the algorithms spit out the coordinates of tunnels, rockets, and other military targets. Recommendations that survive vetting by an intelligence analyst are placed in the target bank by a senior officer.
Another machine learning tool, called Lavender, uses a percentage score to predict how likely a Palestinian is to be a member of a militant group, allowing the IDF to quickly generate a large volume of potential human targets. Other algorithmic programs have names like Alchemist, Depth of Wisdom, Hunter and Flow, the latter of which allows soldiers to query various datasets and is previously unreported.
Genocide since:
Genocide charges against Israel brought to The Hague by South Africa question whether crucial decisions about bombing targets in Gaza were made by software, an investigation that could hasten a global debate about the role of AI technology in warfare.
Are we going to forget that Gaza used to literally be Egypt? And that Egypt blockaded Gaza recently, even before the war?
Israel literally let hundreds of thousands of Gazans work in Israel, allowed travel, and let goods through... Egypt also has a border with Gaza. What did they do? Did they help them?
> I thought Israel's strategy as it related to Gaza was to "mow the grass" every couple of years. The Dahiya doctrine.
The Dahiya doctrine relates to the fact both Hamas and Hezbollah have been lobbing rockets at Israel, more or less continuously, for the last 15 years.
> Israel literally let hundreds of thousands of Gazans work in Israel, travel and let goods through... Egypt also has a border with Gaza. What did they do? Did they help them?
I'm baffled by your wilful ignorance. You paint Israel as this benevolent entity. Here are some numbers from this latest conflict alone[1]:
Gaza Strip:
45,484+ killed
Indirect deaths likely to be multiple times higher
6,000 to 10,000+ missing
108,090+ wounded
16,300+ detained
1,900,000 displaced
West Bank:
835 killed
6,250+ wounded
12,100+ detained
Militants inside Israel:
1,609 killed
200+ captured
Lebanon and Syria:
Total killed: 52,368+
Israel:
952 civilians killed
902 security forces killed
13,572 wounded (as of 22 Jan. 2024)
251 captured or abducted
200,000–500,000 displaced
Hamas breaks way more rules of war than Israel does; this is the punishment for that. It is impossible to "play fair" when the opponent is hiding its active soldiers among civilians; there is a reason that isn't allowed.
This doesn't change the rules of war that states agree to, no matter what some group in the conflict does. Especially when that group is a minority and you are the significantly more powerful party.
> Are we going to forget that Gaza used to literally be Egypt?
We could exhume a number of territorial mistakes that wouldn't be very kind to either side here. I think that it's best if we leave that can of worms where it lies, for the sake of Israel's territorial neighbors and particularly for the sake of Israelis.