“What we do is transform CCTV cameras into a powerful monitoring tool,” says Matthias Houllier, co-founder of Wintics, one of four French companies that won contracts to deploy their algorithms at the Olympics. “With thousands of cameras, it’s impossible for police officers to react to every camera.”
Wintics won its first public contract in Paris in 2020, collecting data on the number of cyclists in different parts of the city to help Paris’s transport authorities plan where to build more bike lanes. By pairing its algorithms with 200 existing traffic cameras, Wintics’ system, which is still in use today, can identify and count cyclists in the middle of busy streets. When France announced it was looking for companies to build algorithms to improve security at this summer’s Olympic Games, Houllier saw it as a natural progression. “The technology is the same,” he says. “It analyzes anonymous shapes in public spaces.”
After training its algorithms on both open-source and synthetic data, Wintics adapted the systems to count, for example, the number of people in a crowd or the number of people falling to the ground. When that number exceeds a certain threshold, the system alerts a human operator.
“That’s it. There’s no automatic decision,” Houllier explains. His team trained French Interior Ministry officials to use the company’s software, and those officials decide how to deploy it, he says. “The idea is to get the operator’s attention so they can double-check and decide what needs to be done.”
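Wintics has not published its code, but the workflow Houllier describes, counting anonymous shapes per camera and flagging a human operator only when a threshold is crossed, can be sketched roughly as follows. Every name and number below is an illustrative assumption, not the company’s actual system:

```python
# Illustrative sketch only: Wintics' real system is not public.
# It mimics the workflow described above: an upstream model counts shapes
# (people in a crowd, people on the ground) per camera, and a human operator
# is notified when a count crosses a threshold. No automated action is taken.

from dataclasses import dataclass

@dataclass
class Alert:
    camera_id: str
    metric: str      # e.g. "crowd_count" or "people_on_ground"
    value: int
    threshold: int

# Hypothetical thresholds chosen by operators, not taken from Wintics.
THRESHOLDS = {"crowd_count": 300, "people_on_ground": 1}

def check_feed(camera_id: str, counts: dict[str, int]) -> list[Alert]:
    """Compare per-camera counts against thresholds and return any alerts.

    `counts` would come from an upstream detection model; only aggregate
    numbers are used here, never identities.
    """
    alerts = []
    for metric, value in counts.items():
        threshold = THRESHOLDS.get(metric)
        if threshold is not None and value >= threshold:
            alerts.append(Alert(camera_id, metric, value, threshold))
    return alerts

# The operator sees the alert and decides what, if anything, to do next.
for alert in check_feed("cam-042", {"crowd_count": 412, "people_on_ground": 0}):
    print(f"[ALERT] {alert.camera_id}: {alert.metric}={alert.value} "
          f"(threshold {alert.threshold}), operator review required")
```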
Houllier argues that his algorithms are a privacy-friendly alternative to the controversial facial recognition systems used at past global sporting events, such as the 2022 World Cup in Qatar. “Here we’re trying to find a different way,” he says. For him, having algorithms comb through CCTV footage is a way to keep the event safe without compromising personal freedoms. “We’re not analyzing any personal data. We’re only looking at shapes: no faces, no license plates, no behavioral analysis.”
Privacy activists, however, reject the idea that this technology protects people’s personal freedoms. In the 20th arrondissement, Noémie Levain, a member of the activist group La Quadrature du Net, has just taken delivery of 6,000 posters the group plans to distribute, warning Parisians about the “algorithmic surveillance” taking over their city and urging them to oppose the “authoritarian occupation of public spaces.” She rejects the idea that the algorithms don’t process personal data. “When you have images of people, you have to analyze all the data in the image, which is personal data, which is biometric data,” she says. “It’s exactly the same technology as facial recognition. It’s exactly the same principle.”
Levain worries that AI surveillance systems will remain in France long after the athletes have left. She says these algorithms will allow police and security services to monitor wider swaths of the city. “This technology will reproduce the stereotypes of the police,” she says. “We know they discriminate. We know they’re always going to go to the same area. They’re always going to harass the same people. And this technology, as with any surveillance technology, will help them do that.”
As drivers in the city center rage over security barriers blocking the streets, Levain is one of many Parisians planning to head to the south of France as the Olympics take over. Still, she worries about the city that will welcome her back. “The Olympics are an excuse,” she says. “They—the government, businesses, the police—are already thinking about what happens afterward.”