The Israeli army is using AI in its war with Hamas in Gaza. AI was used heavily in the early days of the war, and this is believed to have contributed to higher civilian casualties. An AI system called Lavender has been used by the Israeli army to identify potential targets in Gaza; at one point it marked about 37,000 Palestinians as having possible ties to Hamas.
How is AI being used?
An investigation by +972 Magazine, an Israeli-Palestinian publication, has revealed that the Israeli army is using this tool to select targets for strikes. According to the magazine, the Lavender program was developed by Unit 8200, the elite intelligence division of the Israel Defense Forces.
How does Lavender work?
Lavender is heavily data-driven. The program has flagged some 37,000 Palestinians as suspected militants, alleging ties to Hamas or the Palestinian Islamic Jihad. The Israeli army acts on this data: according to the report, soldiers decided to drop a bomb as little as 20 seconds after receiving the system's output.
Are there flaws in Lavender?
Lavender is said to have an error margin of up to 10 percent: roughly 10 out of every 100 people it flags may be wrongly identified, meaning innocent people are sometimes caught up in its output. According to the report, the army initially authorized the killing of about 15 to 20 civilians as collateral for each Hamas fighter targeted.
Even before Lavender, another AI system was developed, named The Gospel. According to the report, The Gospel marks buildings from which militants operate, while Lavender targets the militants themselves.