WASHINGTON – Author Cathy O’Neil warned that discrimination is built into, and transparency missing from, the design of many computer systems. Her new book, Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, addresses how little the public understands the algorithms we rely on every day.

“I think of myself as a translator,” O’Neil said at the nonprofit New America think tank Monday. “The goal of the book was to translate what was understood essentially by technicians, by insiders, to the public.”

According to O’Neil, that insider information is that algorithms are not fair and objective. The computer programs that determine everything from teacher ratings and credit scores to employment eligibility and recidivism risk are inherently problematic.

For example, O’Neil points to teacher rating systems that are supposed to gauge teaching ability and that impose severe penalties on those with low scores. But the scores seem maddeningly inconsistent, she said. Tim Clifford, a New York City public school teacher interviewed for the book, received a 6 out of 100 one year and a 96 the next, without any clear indication why.
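Why would a score swing that widely? One likely mechanism, which the following minimal sketch illustrates, is that value-added ratings estimate a teacher’s effect from a single small, noisy cohort of students. This is a hypothetical model, not New York City’s actual formula; the cohort size, noise level and percentile conversion are all assumptions made for illustration.

```python
import random
from statistics import NormalDist

random.seed(1)

def yearly_estimate(true_effect, n_students=25, noise_sd=12.0):
    """Estimate a teacher's effect as the average score gain of one
    cohort, where each student's gain is mostly individual noise."""
    gains = [true_effect + random.gauss(0, noise_sd) for _ in range(n_students)]
    return sum(gains) / n_students

# Where a yearly estimate falls among teachers of identical ability,
# i.e., the 0-100 percentile a rating system might report.
noise_only = NormalDist(0, 12.0 / 25 ** 0.5)

for year in range(1, 6):
    est = yearly_estimate(true_effect=0.0)  # a genuinely average teacher
    print(f"Year {year}: percentile rating = {noise_only.cdf(est) * 100:.0f}")
```

Under these assumptions, the same average teacher can land near the bottom of the scale one year and near the top the next on sampling noise alone, the pattern Clifford described.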

O’Neil earned her Ph.D. in mathematics from Harvard and worked as a quant on Wall Street when the 2008 recession hit. She witnessed firsthand what she referred to as the “mathematical lie” of the mortgage crisis, which launched her interest in “weaponized mathematics.”

Flaws in data selection processes can create algorithms that reflect inaccurate assumptions about their users, according to a May 2016 White House report, which identified such flaws as a threat to civil liberties.

“Big data techniques have the potential to enhance our ability to detect and prevent discriminatory harm,” the report said. “But, if these technologies are not implemented with care, they can also perpetuate, exacerbate, or mask harmful discrimination.”
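The report’s warning about data selection can be made concrete with a small, entirely hypothetical sketch: a lender estimates repayment rates per group from historical data in which one group barely appears, so the model’s “assumption” about that group rests on almost no evidence. Nothing here is drawn from the report itself.

```python
import random

random.seed(0)

# Hypothetical training data: repayment outcomes (True = repaid).
# Both groups truly repay 80% of the time, but group B is barely sampled.
training = (
    [("A", random.random() < 0.80) for _ in range(1000)]
    + [("B", random.random() < 0.80) for _ in range(3)]
)

def estimated_rate(data, group):
    """Naive per-group estimate: the observed average repayment."""
    outcomes = [repaid for g, repaid in data if g == group]
    return sum(outcomes) / len(outcomes)

for group in ("A", "B"):
    print(f"Group {group}: estimated repayment rate = {estimated_rate(training, group):.2f}")
```

Both groups repay at the same true rate, but the three-person sample for group B can yield an estimate of 33, 67 or 100 percent; the error comes from how the data were selected, not from the group’s actual behavior.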

O’Neil coined the term “weapon of math destruction” to describe this outcome: mathematical algorithms that, often unintentionally, lead to discriminatory practices in arenas ranging from commerce to criminal justice.

To qualify as a weapon of math destruction, an algorithm or program has to meet three criteria: opacity, scale and damage. In other words, it has to lack transparency and have a large, quantifiably negative impact on a group of people.
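The definition is simple enough to state as a checklist. The sketch below paraphrases it; the field names and the scale threshold are illustrative inventions, not anything from the book.

```python
from dataclasses import dataclass

@dataclass
class ScoringSystem:
    """Hypothetical summary of an algorithmic scoring system."""
    is_opaque: bool       # those scored can't see or contest how it works
    people_affected: int  # rough number of people scored
    causes_harm: bool     # low scores carry real-world penalties

def is_wmd(system: ScoringSystem, scale_threshold: int = 10_000) -> bool:
    """O'Neil's three criteria: opacity, scale and damage."""
    return (system.is_opaque
            and system.people_affected >= scale_threshold
            and system.causes_harm)

# A district-wide teacher-rating model would meet all three.
print(is_wmd(ScoringSystem(is_opaque=True, people_affected=75_000, causes_harm=True)))
```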

“Mathematics has the weird position where it’s trusted but it’s also feared,” O’Neil said. “It’s kind of like the perfect mechanism to keep people from asking questions.”

Daniel Castro, vice president at the Information Technology and Innovation Foundation, warned that blaming algorithms for societal inequalities would distract from the more important search for real solutions.

Fellow panelist K. Sabeel Rahman, an assistant professor at Brooklyn Law School and a New America fellow, suggested instead that skepticism toward these algorithms can be beneficial to activists looking to make change.

“If you care about criminal justice reform… you really have to pay attention to how data and algorithms are operating inside the criminal justice system, otherwise the reforms we might propose or be thinking about are not going to hit the mark,” Rahman said.

“It’s not that the technology itself is a problem,” he said, “it’s the way that it enables certain uses and renders opaque other forms of accountability.”