Washington, DC, is home to the most powerful government on Earth. It’s also home to 690,000 people — and 29 obscure algorithms that shape their lives. City services use automation to screen housing applicants, predict criminal recidivism, identify food aid fraud, determine whether a high school student will drop out, inform juvenile sentencing decisions, and many other things.
That snapshot of semi-automated urban life comes from a new report from the Electronic Privacy Information Center (EPIC). The nonprofit conducted 14 months of research into the city’s use of algorithms and found that they were used by 20 agencies, with more than a third deployed in law enforcement or criminal justice. For many systems, city departments did not provide full details of how their technology worked or was used. The project team concluded that the city probably uses even more algorithms that it could not discover.
The findings are notable beyond DC: they add to evidence that many cities have quietly put bureaucratic algorithms to work in their departments, where they can contribute to decisions that affect citizens’ lives.
Government agencies often turn to automation in hopes of adding efficiency or objectivity to bureaucratic processes, but it is often difficult for citizens to know whether those systems are in use or how they work, and some prove discriminatory, leading to decisions that upend people’s lives. In Michigan, an unemployment fraud detection algorithm with a 93 percent error rate caused 40,000 false allegations of fraud. A 2020 analysis by Stanford University and New York University found that nearly half of federal agencies use some form of automated decision-making system.
EPIC dug deep into one city’s use of algorithms to give a sense of the many ways they can affect citizens’ lives and to encourage people in other places to undertake similar exercises. Ben Winters, who leads the nonprofit’s work on AI and human rights, says Washington was chosen in part because about half of the city’s residents identify as Black.
“Often, automated decision-making systems have a disproportionate impact on Black communities,” says Winters. The project found evidence that automated traffic enforcement cameras are disproportionately placed in neighborhoods with more Black residents.
Cities with significant Black populations have recently played a central role in campaigns against municipal algorithms, particularly in policing. Detroit became an epicenter of facial recognition debates after Michael Oliver and Robert Williams were falsely arrested when algorithms misidentified them. In 2015, the deployment of facial recognition in Baltimore after the death of Freddie Gray in police custody sparked some of the first congressional investigations into law enforcement’s use of the technology.
EPIC hunted for algorithms by seeking public disclosures from city agencies and also filed public records requests, asking for contracts, data-sharing agreements, privacy impact assessments, and other information. Six of the 12 city agencies responded, sharing documents such as a $295,000 contract with Pondera Systems, owned by Thomson Reuters, which makes fraud-detection software called FraudCaster that screens food aid applicants. Earlier this year, California officials found that more than half of the 1.1 million claims by state residents that Pondera’s software flagged as suspicious were in fact legitimate.