Insider Report: Israel Utilizes Artificial Intelligence for Target Selection in Gaza

The Israeli military is facing scrutiny over its use of an artificial intelligence system, code-named “Lavender,” to select bombing targets in Gaza. The system reportedly has a 10% error rate, and human review of its suggested targets is said to be cursory, with personnel spending only about 20 seconds per target.

While the Israel Defense Forces (IDF) have not denied that the AI tool exists, they have denied using AI to identify terrorists. The IDF has emphasized its commitment to reducing harm to civilians and to complying with international law and internal restrictions in the target-identification process.

However, there have been reports of thousands of Palestinians, including women and children, killed in nighttime strikes on homes. The army has also allegedly used unguided missiles against junior militants, a practice that poses a greater risk to nearby civilians. The IDF has countered that it works to reduce harm to civilians by reviewing targets and selecting appropriate munitions.

Officials argue that heavy munitions are necessary to eliminate Hamas fighters, who killed more than 1,200 people in Israel. Even so, the IDF’s actions have raised concerns about the toll on civilians and whether the use of AI in targeting has increased the risk to innocent lives.

As the situation continues to escalate in Gaza, questions remain about the ethics and effectiveness of using artificial intelligence in military operations. Insider Wales Sport will continue to monitor the situation and provide updates as more information becomes available.
