At the end of 2023, Israel wanted to kill Ibrahim Biari, a top Hamas commander in the northern Gaza Strip who had helped plan the Oct. 7 massacres. But Israeli intelligence could not find Mr. Biari, who they believed was hiding in the network of tunnels beneath Gaza.
So Israeli officers turned to a new military technology steeped in artificial intelligence, according to three Israeli and American officers briefed on the events. The technology had been developed a decade earlier but had not been used in battle. The hunt for Mr. Biari provided a fresh incentive to improve the tool, so engineers in Israel's Unit 8200, the country's equivalent of the National Security Agency, soon integrated AI into it, the people said.
Shortly thereafter, Israel listened to Mr. Biari's calls and tested the AI audio tool, which gave an approximate location for where he was making his calls. Using that information, Israel ordered airstrikes to target the area on Oct. 31, 2023, killing Mr. Biari. More than 125 civilians also died in the attack, according to Airwars, a London-based conflict monitor.
The audio tool was just one example of how Israel has used the war in Gaza to rapidly test AI-backed military technologies and deploy them to a degree that had not been seen before, according to interviews with nine American and Israeli defense officials, who spoke on the condition of anonymity because the work is confidential.
In the past 18 months, Israel has also combined AI with facial recognition software to match partly obscured or injured faces to real identities, turned to AI to compile potential airstrike targets, and created an Arabic-language AI model to power a chatbot that could scan and analyze text messages, social media posts and other Arabic-language data, two people with knowledge of the programs said.
Many of these efforts were a partnership between enlisted soldiers in Unit 8200 and reserve soldiers who work at technology companies such as Google, Microsoft and Meta, three people with knowledge of the technologies said. Unit 8200 set up what became known as "The Studio," an innovation hub and a place to match experts with AI projects, the people said.
Yet even as Israel raced to develop its AI arsenal, the use of the technologies sometimes led to mistaken identifications and arrests, as well as civilian deaths, the Israeli and American officials said. Some officials have struggled with the ethical implications of the AI tools, which could result in increased surveillance and other civilian killings.
No other nation has been as active as Israel in experimenting with AI tools in real-time battles, European and American defense officials said, offering a preview of how such technologies may be used in future wars, and how they may also fail.
"The urgent need to cope with the crisis accelerated innovation, much of it AI-powered," said Hadas Lorber, the head of the Institute for Applied Research in Responsible AI at Israel's Holon Institute of Technology and a former senior director at the Israeli National Security Council. "It led to game-changing technologies on the battlefield and advantages that proved critical in combat."
But the technologies "also raise serious ethical questions," Ms. Lorber said. She warned that AI needs checks and balances, adding that humans should make the final decisions.
A spokeswoman for Israel's military said she could not comment on specific technologies because of their "confidential nature." Israel "is committed to the lawful and responsible use of data technology tools," she said, adding that the military was investigating the strike on Mr. Biari and was "unable to provide any further information until the investigation is complete."
Meta and Microsoft declined to comment. Google said it has "employees who do reserve duty in various countries around the world. The work those employees do as reservists is not connected to Google."
Israel has previously used conflicts in Gaza and Lebanon to experiment with and advance tech tools for its military, such as drones, phone-hacking tools and the Iron Dome defense system, which can help intercept short-range ballistic missiles.
After Hamas launched cross-border attacks into Israel on Oct. 7, 2023, killing more than 1,200 people and taking 250 hostage, AI technologies were quickly cleared for deployment, four Israeli officials said. That led to the collaboration between Unit 8200 and reserve soldiers in "The Studio" to swiftly develop new AI capabilities, they said.
Avi Hasson, the chief executive of Startup Nation Central, an Israeli nonprofit that connects investors with companies, said reservists from Meta, Google and Microsoft had become crucial in driving innovation in drones and data integration.
"Reservists brought know-how and access to key technologies that weren't available in the military," he said.
Israel's military soon used AI to enhance its drone fleet. Aviv Shapira, founder and chief executive of Xtend, a software and drone company that works with the Israeli military, said AI-powered algorithms were used to build drones that could lock on and track targets from a distance.
"In the past, homing capabilities relied on zeroing in on an image of the target," he said. "Now AI can recognize and track the object itself, whether it's a moving car or a person, with deadly precision."
Mr. Shapira said his key clients, the Israeli military and the U.S. Department of Defense, were aware of the ethical implications of AI in warfare and of the need to use the technology responsibly.
One tool developed by "The Studio" was an Arabic-language AI model known as a large language model, three Israeli officers familiar with the program said. (The large language model was earlier reported by Plus 972, an Israeli-Palestinian news site.)
Developers had previously struggled to create such a model because of a dearth of Arabic-language data to train the technology on. When such data was available, it was usually in standard written Arabic, which is more formal than the dozens of dialects used in spoken Arabic.
The Israeli military did not have that problem, the three officers said. The country had decades of intercepted text messages, transcribed phone calls and posts scraped from social media in spoken Arabic dialects. So Israeli officers created the large language model in the first few months of the war and built a chatbot to field queries in Arabic. They merged the tool with multimedia databases, allowing analysts to run complex searches across images and videos, four Israeli officials said.
When Israel killed the Hezbollah leader Hassan Nasrallah in September, the chatbot analyzed the responses across the Arabic-speaking world, three Israeli officers said. The technology differentiated among dialects in Lebanon to gauge public reaction, helping Israel assess whether there was public pressure for a counterstrike.
At times, the chatbot could not identify some modern slang terms and words that had been transliterated from English into Arabic, two officers said. That required Israeli intelligence officers with expertise in different dialects to review and correct its work, one of the officers said.
The chatbot also sometimes gave wrong answers, for instance returning photos of pipes instead of guns, two Israeli intelligence officers said. Even so, the AI tool significantly sped up research and analysis, they said.
At temporary checkpoints set up between the northern and southern Gaza Strip after the Oct. 7 attacks, Israel also began equipping cameras with the ability to scan and send high-resolution images of Palestinians to an AI-backed facial recognition program.
This system, too, sometimes had trouble identifying people whose faces were obscured. That led to arrests and interrogations of Palestinians who were mistakenly flagged by the facial recognition system, two Israeli intelligence officers said.
Israel also used AI to sift through data amassed by intelligence officials on Hamas members. Before the war, Israel built a machine-learning algorithm, code-named "Lavender," that could quickly sort data to hunt for low-level militants. It was trained on a database of confirmed Hamas members and was meant to predict who else might be part of the group. Though the system's predictions were imperfect, Israel used it at the start of the war in Gaza to help choose attack targets.
Few goals loomed larger than finding and eliminating Hamas's senior leadership. At the top of the list was Mr. Biari, the Hamas commander whom Israeli officials believed had played a central role in planning the Oct. 7 attacks.
Israel's military intelligence quickly intercepted Mr. Biari's calls with other Hamas members but could not pinpoint his location. So they turned to the AI-backed audio tool, which analyzed different sounds, such as sonic bombs and airstrikes.
After deducing an approximate location for where Mr. Biari was placing his calls, Israeli military officials were warned that the area, which included several apartment complexes, was densely populated, two intelligence officers said. An airstrike would need to target several buildings to ensure Mr. Biari was killed, they said. The operation was greenlit.
Since then, Israeli intelligence has also used the audio tool alongside maps and photos of Gaza's underground tunnel maze to locate hostages. Over time, the tool was refined to pinpoint individuals more precisely, two Israeli officers said.