How facial recognition is being used in the war in Ukraine

    In the weeks after Russia invaded Ukraine and images of the devastation there flooded the news, Hoan Ton-That, the CEO of facial recognition company Clearview AI, began thinking about how to get involved.

    He believed his company’s technology could provide clarity in complex war situations.

    “I remember seeing videos of captured Russian soldiers and Russia claiming they were actors,” said Mr. Ton-That. “I thought if Ukrainians could use Clearview, they could get more information to verify their identities.”

    In early March, he reached out to people who could help him contact the Ukrainian government. One of Clearview’s advisory board members, Lee Wolosky, a lawyer who has worked for the Biden administration, met with Ukrainian officials and offered to deliver a message.

    Mr. Ton-That drafted a letter explaining that his app “can immediately identify someone from a photo” and that police and federal agencies in the United States were using it to solve crimes. That work has made Clearview a lightning rod for privacy concerns and for questions about racism and other biases within artificial intelligence systems.

    The tool, which can identify a suspect captured on surveillance video, could be valuable to a country under attack, Mr. Ton-That wrote. He said it could identify people who might be spies, as well as the dead, by comparing their faces against Clearview’s database of 20 billion faces from the public web, including from “Russian social sites like VKontakte”.

    Mr. Ton-That decided to offer Clearview’s services to Ukraine free of charge, as previously reported by Reuters. Now, less than a month later, New York-based Clearview has created more than 200 user accounts at five Ukrainian government agencies, which have conducted more than 5,000 searches. Clearview has also translated its app into Ukrainian.

    “It was an honor to help Ukraine,” said Mr. Ton-That, who provided emails from officials at three Ukrainian agencies confirming they had used the tool. It has been used to identify dead soldiers and prisoners of war, as well as travelers in the country, confirming the names on their official IDs. Fear of spies and saboteurs has led to heightened paranoia in the country.

    According to an email viewed by The New York Times, Ukraine’s National Police obtained two photos of a dead Russian soldier on March 21 and ran his face through Clearview’s app.

    The app surfaced photos of a similar-looking man, a 33-year-old from Ulyanovsk, whose profile photos on Odnoklassniki, a Russian social media site, showed him wearing a paratrooper uniform and holding a gun. According to a national police official, attempts were made to contact the man’s relatives in Russia to inform them of his death, but there was no response.

    Identifying dead soldiers and informing their families is part of a campaign, Ukraine’s Deputy Prime Minister Mykhailo Fedorov wrote in a Telegram post, to bring home the cost of the conflict to the Russian public and to dispel “the myth of a ‘special operation’ in which there are ‘no conscripts’ and ‘no one dies.’”

    Images from conflict zones, of massacred civilians and of soldiers left behind on city streets turned into battlefields, have become more widely and immediately available in the age of social media. Ukraine’s President Volodymyr Zelensky has shown world leaders graphic images of attacks on his country to make his case for more international aid. But in addition to conveying the grim reality of war, those kinds of images can now offer something else: an opportunity for facial recognition technology to play an important role.

    However, critics warn that tech companies can take advantage of a crisis to expand with little privacy oversight, and that mistakes made by the software, or by those using it, could have serious consequences in a war zone.

    Evan Greer, deputy director of the digital rights group Fight for the Future, opposes any use of facial recognition technology, saying she believes it should be banned worldwide because governments have used it to persecute minority groups and suppress dissent. Russia and China, among others, have deployed advanced facial recognition cameras in their cities.

    “War zones are often used as testing grounds, not only for weapons, but also for surveillance tools that are later deployed on civilians or used for law enforcement or crowd control,” Ms. Greer said. “Companies like Clearview are eager to exploit the humanitarian crisis in Ukraine to normalize the use of their malicious and invasive software.”

    Clearview is facing several lawsuits in the United States, and its practice of using photos of people without their consent has been declared illegal in Canada, Britain, France, Australia and Italy. It faces fines in Britain and Italy.

    Ms. Greer added: “We already know that authoritarian states like Russia are using facial recognition surveillance to quell protests and dissent. Expanding the use of facial recognition doesn’t harm authoritarians like Putin — it helps them.”

    Facial recognition technology has become more powerful and accurate in recent years and is increasingly accessible to the public.

    While Clearview AI says it makes its database available only to law enforcement officers, other facial recognition services that search the web for similar faces, including PimEyes and FindClone, are available to anyone willing to pay for them. PimEyes surfaces public photos from across the web, while FindClone searches photos scraped from the Russian social media site VKontakte.

    Facial recognition vendors are taking sides in the conflict. Giorgi Gobronidze, a professor in Tbilisi, Georgia, who bought PimEyes in December, said he had blocked Russian users from the site after the invasion began, citing concerns that it would be used to identify Ukrainians.

    “No Russian customer is allowed to use the service now,” said Mr. Gobronidze. “We don’t want our service to be used for war crimes.”

    Groups such as Bellingcat, the Dutch investigative research site, have used facial recognition services to verify claims about the conflict and about Russian military operations.

    Aric Toler, director of research at Bellingcat, said his favorite facial search engine was FindClone. He described a three-hour surveillance video that surfaced this week, allegedly from a courier service in Belarus, showing men in military uniforms packing items, including TVs, car batteries and an electric scooter, for shipment.

    Mr. Toler said FindClone allowed him to identify several of the men as Russian soldiers who were sending “loot” to their homes from Ukraine.

    As Ukraine and Russia wage an information war about what motivated the invasion and how it is proceeding, journalists like Mr. Toler sometimes act as arbiters for their audiences.

    Mr. Fedorov, Ukraine’s deputy prime minister, tweeted a still from the same surveillance tape showing one of the soldiers at the counter of the courier service. He claimed the man had been identified as a “Russian special forces officer” who committed atrocities in Bucha and “sent all stolen items to his family”.

    Mr. Fedorov added: “We will find every killer.”

    The technology has possibilities beyond identifying victims or tracking certain units. Peter Singer, a security scholar at New America, a Washington think tank, said the increasing availability of data on people and their movements would make it easier to track down those responsible for war crimes. But it could also make it harder for citizens to hide in a tense environment.

    “Ukraine is the first major conflict where facial recognition technology is being used on such a scale, but it is far from the last,” said Mr. Singer. “It will become increasingly difficult for future warriors to keep their identities secret, as it will for ordinary citizens walking their own city streets.”

    “In a world where more and more data is collected, everyone leaves a trail of dots that can be connected to each other,” he added.

    That trail is not only online. Drone footage, satellite images, and photos and videos taken by people in Ukraine all play a role in discerning what is happening there.

    Mr. Toler of Bellingcat said the technology was not perfect. “It’s easy to get it wrong — that goes without saying,” he said. “But people get this right more often than they get it wrong. They have figured out how to confirm identifications.”

    Faces can look alike, so secondary information, such as a distinctive mark, a tattoo or clothing, is important to confirm a match. Whether that kind of verification will happen in a tense wartime situation is an open question.

    Mr. Toler isn’t sure how much longer he’ll have access to his favorite facial recognition tool. Because FindClone is based in Russia, it is subject to sanctions, he said.

    “I have about 30 days left on my service, so I’m desperately trying to add more juice to my account,” Mr. Toler said. “I have a friend in Kyrgyzstan. I’m trying to use her bank card to top up my account.”