Israel's AI Targeting System [Original Post, ~April 2024]

I just read this article in the Guardian about Lavender, the AI system Israel has reportedly used to select bombing targets in Gaza. If you work in AI, I recommend you read it, because this could have a material impact on the profitability of your business.

The system described in the article consists of multiple parts. First there is the surveillance, which was used to generate a list of 37,000 targets. The article does not specify how this surveillance was done, but I would imagine AI was used to process the raw data: speech-to-text models, plus language-understanding models that can determine, automatically, what is being said. Given the state of the technology, I would be surprised if Israel did not have some way of accessing the cellphone of every Palestinian to retrieve raw data from emails and the microphone.
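Purely to illustrate the kind of pipeline I am imagining (none of this is from the article; the library choice and the flagging rule are my own assumptions), an automated audio-triage step could look something like this:

```python
# Illustrative sketch of a hypothetical audio-triage step -- NOT the
# system described in the article. The open-source Whisper library
# (pip install openai-whisper) stands in for "some speech-to-text
# model", and the keyword list is a placeholder for real language
# analysis.
import whisper

model = whisper.load_model("base")  # small general-purpose speech-to-text model
WATCHLIST = {"placeholder-term-1", "placeholder-term-2"}  # hypothetical terms

def triage(audio_path: str) -> bool:
    """Transcribe one intercepted audio file and flag it if any
    watch-listed term appears in the transcript."""
    transcript = model.transcribe(audio_path)["text"].lower()
    return any(term in transcript for term in WATCHLIST)
```

A production system would presumably replace the keyword match with a language model scoring the transcript, but the overall shape, bulk audio in and a small set of flags out, would be the same.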

The second part of the system is the selection of targets, which the Guardian article covers in detail. Unfortunately, this part is necessarily statistical. From the article we can see they have measured a 90% accuracy rate in their predictions, which is considered acceptable for their use case. Additionally, the system gives the operator a control that limits the number of civilian casualties that are acceptable per strike. In this case, it appears they have configured the system for 10 to 100 casualties per strike, depending on the political situation at the time of the strike and the emotional state of the Israeli leaders.
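It is worth doing the arithmetic on that 90% figure (my arithmetic, and my assumption that "accuracy" here means the share of flagged people who are correctly identified; the article does not define the metric):

```python
# Back-of-the-envelope arithmetic on the reported numbers. Assumption
# (mine, not the article's): "90% accuracy" means 90% of flagged
# people are correctly identified.
flagged_targets = 37_000   # number of targets reported in the article
accuracy = 0.90            # reported accuracy of the predictions

expected_misidentified = flagged_targets * (1 - accuracy)
print(f"Expected misidentified people: {expected_misidentified:,.0f}")
# prints: Expected misidentified people: 3,700
```

This is what I mean by the system being necessarily statistical: any acceptable-error threshold, applied at this scale, is a decision about how many misidentified people are tolerable.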

I am personally opposed to almost every aspect of this system. I think even Hamas members deserve trials, and I do not believe in capital punishment. But set that aside: I think this system is not sustainable, and if you are trying to predict the future, that matters. This technology is so dangerous that I cannot see a realistic future timeline that does not involve worldwide regulation to prevent people from building these kinds of systems. If we do not regulate it, we get destroyed, and so that timeline is not the one to bet on from a business perspective.

If regulation is inevitable, then maybe the tech industry should get started on figuring out what the solution will be. Any company that participates in shaping the solution will have a say in the form it takes and can better protect its interests.

Article: ‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets


Reply 1 - Shane O'Connell

I probably should have linked the original article that the Guardian got the information from:

https://www.972mag.com/lavender-ai-israeli-army-gaza/

This appears to be an Israeli publication. I am pleased to read articles by like-minded Israelis.


Reply 2 - Shane O'Connell

Alright, I just finished reading the article at the +972 website. It is quite long, but worth reading. I had never heard of this publication before, but the Guardian describes it as an Israeli-Palestinian publication, and the author lives in Jerusalem. The piece quotes multiple Israeli military officers expressing regret for what has occurred.

In my opinion, it is important for peace that people can see there are others on the opposing side who do not want to cause harm and who can empathize with their situation. The Israelis who admitted feeling regret in this article did a brave thing, and one that I think ultimately helps Israel.


Reply 3 - LinkedIn Connection #6

Completely agree. This tech has been in development for many, many years by many of the same companies that pioneered AI and autonomous driving (in particular). It has also always been used for surveillance, weapons, and killing. The industry has known, and has either turned a blind eye or knowingly supported it.

I’m glad that people are finally being heard when they speak out. Responsibility, representation, and regulation of AI are needed now. Thank you for highlighting this and sharing your thoughts.


Reply 4 - LinkedIn Connection #7

v dangerous, just like scifi movies