Gaza as a testing ground: Israel’s AI warfare

by Sibel Düz

Jul 03, 2025 - 12:05 am GMT+3
"Rather than minimizing harm, AI tools have accelerated the pace and scale of deadly force, often against non-combatants, turning decision support into decision automation with devastating human costs." (Illustration by Erhan Yalvaç)

In Gaza, Israel lets AI decide targets and treats civilian deaths as 'acceptable casualties,' making accountability disappear

A new kind of war is taking place in Gaza, one led not just by missiles and drones, but by computer algorithms. The Foundation for Political, Economic and Social Research's (SETA) report “Deadly Algorithms: Destructive Role of Artificial Intelligence in Gaza War” reveals a disturbing reality: Israel’s growing use of artificial intelligence (AI) in military operations is changing how wars are fought. In this new model, machines, not people, decide who lives and who dies. This shift is causing more civilian deaths and breaking international laws meant to protect innocent lives during conflict.

The main point of the report is clear: Israel’s use of AI in war has removed human judgment from many decisions, especially in Gaza. High-tech systems like Lavender and Habsora are being used to identify targets and carry out attacks. But this process, led by machines, often fails to tell the difference between civilians and fighters, leading to devastating results.

Accelerating mass killings

Israel is known for using advanced technology in its defense systems. One example is the Iron Dome, an air defense system that uses AI to stop rockets in mid-air. While this system helps protect Israeli cities, the same technology is also used offensively with serious consequences.

After Hamas’ Operation Al-Aqsa Flood on Oct. 7, 2023, Israel used AI tools like Lavender and Habsora to identify bombing targets in Gaza. According to the report, the Israeli military’s use of AI-based decision support systems like Habsora and Lavender has led to a dramatic increase in violence against civilians and a dangerous erosion of accountability in warfare.

Habsora, introduced in 2021, was built to speed up target selection using data from drones, satellites, communications and social media. While it was presented as improving precision, the reality was very different: most attacks were carried out with unguided bombs, and there was little evidence of efforts to protect civilians.

After Oct. 7, the Lavender system was deployed, significantly increasing the scale and speed of airstrikes. Lavender could identify and approve targets in just 20 seconds, often without any meaningful human review. It reportedly created a list of 37,000 people labeled as Hamas members, many of them low-ranking or without confirmed military roles. These individuals were frequently targeted in their homes at night, when their families were present. For example, a strike on a United Nations school in Nuseirat on July 7, 2024, killed 23 people and injured over 80; the Israel Defense Forces (IDF) later claimed that only eight of the dead were Hamas members. Sources from within the IDF admitted that operators referred to low-level targets as “garbage” and accepted that most casualties were women and children. The system also allowed for a "collateral damage threshold" of up to 20 civilians per strike, applied automatically without assessing the actual threat posed by each target.

AI systems even tracked individuals using software like “Where’s Daddy?” to mark the exact moment they entered their homes, triggering attacks. These practices suggest a deliberate strategy of mass killing under the cover of algorithmic efficiency. Rather than minimizing harm, these AI tools have accelerated the pace and scale of deadly force, often against non-combatants, turning decision support into decision automation with devastating human costs.

In addition, investigative reports confirmed in March 2025 that Israel’s surveillance systems were collecting biometric and behavioral data on Palestinians in real time. This included tracking social media activity, phone records and daily routines. AI then used this data to help choose targets, turning daily life into a source of military intelligence. The report stated that facial recognition systems in cities like Hebron were linked to movement prediction tools that fed into targeting programs such as Lavender. This means surveillance wasn’t just about monitoring; it was directly connected to attacks.

When no one is responsible

One of the most significant challenges of using AI in war is accountability. Who is to blame when a machine causes a civilian death? The Israeli military says human officers approve AI-generated targets. But according to the report, many operators trusted Lavender so much that they approved its targets without checking them. Some strikes were allowed to cause up to 20 civilian deaths and were still approved as “acceptable.” This creates a dangerous situation in which no one is clearly responsible. Under international law, someone must be accountable for civilian deaths in war. But with AI making the decisions, responsibility becomes unclear. The report calls this a “responsibility gap,” where soldiers, commanders and software engineers can all avoid blame.

The report also shows how Israel uses AI for constant surveillance in Palestinian areas. Systems like Blue Wolf, Red Wolf and Wolf Pack collect facial recognition data and track movements, turning cities into digital prisons. In Hebron, for example, the Wolf Pack system was used by holding detained Palestinians in front of closed-circuit television (CCTV) cameras to match them against a central database. Blue Wolf, introduced later, gamified data collection; soldiers were ranked and rewarded for capturing the most Palestinian faces using a mobile app. Red Wolf, currently deployed at checkpoints, scans faces and assigns color-coded status indicators (green, yellow, red) that determine whether individuals are allowed to pass. This data is then used to support targeting and attacks.

During the Gaza war, facial-recognition cameras were reportedly set up at makeshift checkpoints to scan people fleeing from north to south, with some being detained or beaten based on algorithmic matches, some of which later turned out to be false. People in these areas live under constant watch. Every movement could be logged, analyzed and flagged as a threat. Soldiers have even admitted to deferring to the system over their own judgment, treating people as digital profiles rather than human beings. This kind of monitoring not only limits freedom but also increases the risk of wrongful targeting, reinforcing a system of mass surveillance that strips Palestinians of basic rights like privacy, movement and dignity.

Moreover, investigative reporting adds to this picture, explaining how AI-equipped drones were used to watch for “suspicious behavior.” The AI tracked people’s routines and flagged them if something seemed unusual, even without real evidence. These alerts could then lead to those individuals being targeted by military strikes, with little or no human review.

International silence

Despite these findings, the international community has taken little action in response. Some human rights groups and U.N. officials have criticized Israel, but there are no strong laws in place to regulate AI weapons. Most war laws were written long before this technology existed and don’t apply well to today’s automated systems. The report calls for urgent action. New international laws are needed to ban fully autonomous weapons and make sure people, not machines, remain responsible for decisions in war. Human rights organizations have warned about the abuses of facial recognition technology, but their statements alone are insufficient. There must be legally binding rules that all countries follow.

Gaza has become a testing ground for AI-driven war, and the results are deeply troubling. The report shows a battlefield where human judgment has been replaced by speed and automation. Ethical standards have eroded under the pressure to act quickly and aggressively. Israel’s use of AI may offer short-term military advantages, but it comes at a great cost. Civilian lives are being lost, and the foundations of international law are being undermined. If the world doesn’t act soon to regulate this technology, we may enter a future where machines lead wars, and no one is held responsible for their actions.

About the author
Sibel Düz holds an M.A. in Middle East Studies from Middle East Technical University (METU) and is a senior researcher at the SETA Foundation.
Keywords: Habsora, Lavender, AI war technologies, Israel-Palestine conflict, Gaza, surveillance