Despite near-universal trust in the science of mathematics, human biases can readily creep into data analysis and the design of algorithms. That reality has led author and algorithmic auditor Cathy O’Neil to warn that biases baked into algorithms can carry sweeping consequences for individuals and for society as a whole.

Speaking at the Brookings Institution on May 22, O’Neil argued that algorithms can be thought of, at a basic level, as opinions: constructions built on predictions about the future and trends extrapolated from past events. At their foundation, she said, an algorithm needs only two components: historical data and a definition of success. That definition of success, no matter who supplies it, rests on human judgment, not hard science.
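
To make that framing concrete, here is a minimal sketch, with entirely invented data and field names rather than anything from O’Neil’s talk, of how those two components combine: the historical records supply the patterns, and a human-chosen definition of “success” decides what the resulting score rewards.

```python
# A toy illustration of O'Neil's framing: an "algorithm" is just
# historical data plus a human-chosen definition of success.
# All records and field names here are hypothetical.

historical_hires = [
    # (years_experience, referred_by_employee, stayed_two_years)
    (5, True,  True),
    (1, False, False),
    (3, True,  True),
    (4, False, False),
]

# The definition of success is a human judgment call. Choosing
# "stayed two years" (or "was promoted", or "was never late")
# changes what the model rewards, and whose traits it favors.
def is_success(record):
    return record[2]  # stayed_two_years

def score_candidate(years_experience, referred_by_employee):
    """Score a candidate by similarity to past 'successful' hires."""
    successes = [r for r in historical_hires if is_success(r)]
    score = 0.0
    for exp, referred, _ in successes:
        score += 1.0 / (1.0 + abs(exp - years_experience))
        if referred == referred_by_employee:
            score += 0.5
    return score / len(successes)

# The baked-in opinion: if past "successful" hires were mostly
# employee referrals, the model penalizes outsiders -- a bias
# inherited from the data and the success definition, not the math.
print(score_candidate(years_experience=2, referred_by_employee=False))
```

Swap in a different `is_success` and the same data yields a different “objective” ranking, which is precisely the point: the judgment lives in the definition of success, not in the mathematics.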

Too often, O’Neil said, people put their trust in data that is biased in some way. Computer scientists, in turn, write algorithms that wrap that biased data in the armor of mathematical science. The result: the outputs those algorithms produce may be afforded more trust than they warrant.

O’Neil said that result can have dangerous consequences.

“The people who are in power define what success is,” she said. “What we have now is a situation where we have a bunch of people building algorithms hiding behind the authority of mathematics.”

O’Neil said algorithms that incorporate bias, whether intended or not, can produce results that unfairly harm people by limiting their employment and other opportunities. She dubbed those kinds of algorithms “weapons of math destruction.”

Algorithms are playing a growing role in processes like hiring, particularly through personality tests. O’Neil cited a teenager with bipolar disorder who tried to get a minimum-wage job bagging groceries. He had to take an algorithm-driven personality test that detected his mental illness, and the test advised managers not to hire him.

Under the guidance of his father, a lawyer, the teenager then applied for minimum-wage positions at other companies that used similar personality tests, and he was rejected without an interview at every firm. His father filed class-action discrimination lawsuits against the companies on his son’s behalf, and they won all of the suits, O’Neil said.

Algorithmic bias also occurs when important data is missing from the equation, O’Neil said. One example, she explained, is predictive policing built on arrest data rather than crime data. Even though marijuana use is roughly equally prevalent in black and white communities, she said, black people are arrested for marijuana possession four to five times more frequently than white people.

When such algorithms are trained on arrest data, they tend to reinforce the apparent wisdom of pursuing more arrests in already high-arrest areas, which can lead to predictive policing models that never look more broadly for the prevalence of crime in low-arrest areas, O’Neil argued.
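
That self-reinforcing dynamic is easy to simulate. Below is a minimal sketch, with invented numbers and an invented allocation rule rather than any real predictive-policing model: two neighborhoods have identical true crime rates, but patrols follow arrest counts, so the initial arrest disparity perpetuates itself.

```python
# A simplified sketch of the feedback loop O'Neil describes, using
# invented numbers. Two neighborhoods have the same underlying crime
# rate, but policing effort is allocated from arrest counts, so the
# neighborhood with more historical arrests keeps generating more.

true_crime_rate = {"A": 0.05, "B": 0.05}   # identical by construction
arrests = {"A": 100, "B": 20}              # unequal historical arrests
total_patrols = 10

for year in range(5):
    total = sum(arrests.values())
    # "Predictive" allocation: send patrols where arrests were made.
    patrols = {n: round(total_patrols * arrests[n] / total)
               for n in arrests}
    # New arrests scale with patrol presence; since the crime rates
    # are identical, the model's past outputs become its future inputs.
    for n in arrests:
        arrests[n] += int(patrols[n] * true_crime_rate[n] * 200)
    print(year, patrols, arrests)
```

Run the loop and neighborhood A keeps drawing roughly four times the patrols and arrests of neighborhood B year after year, even though, by construction, crime is no more common there.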

To combat algorithmic bias, O’Neil said, building diversity on data analysis and IT teams is essential. And because some communities, such as the poor and less educated, are less likely to have the skills to enter the IT field, she said it is important that IT teams build paradigms that account for diversity in ways that include those groups.

Building diversity has been identified as a long-term challenge across IT fields. The House Homeland Security Committee’s Cybersecurity, Infrastructure Protection, and Innovation Subcommittee held a hearing on May 21 to parse the challenges of building diversity in the cybersecurity field, for instance, where witnesses and House members said that women and people of color remain significantly underrepresented.

The brighter side of the story, O’Neil said, is that carefully considered and constructed algorithms can work to redress societal harms rather than reinforce them.

“[An algorithm] is a codified version of our human process,” O’Neil said. “You can think of algorithms as tools that help us improve ourselves. … I think that’s an opportunity. It doesn’t have to punish people.”
