How Big Data Impacts Black Lives

Written by: Madeleine Smith

March 2, 2021

What is Big Data?

We’ve all spent time on datasets: big, medium, and small. But imagine a dataset so large that no number of screens in your office could display it all. ‘Big data’ refers to datasets so colossal that they require many machines to process and store. These armies of machines are usually linked by cloud platforms that manage them as a single unit.
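
To make that “many machines acting as one” idea concrete, here is a minimal sketch of the pattern underneath most big data platforms: split the dataset into shards, process each shard independently (on separate machines in a real cluster; in local worker processes here), then merge the partial results. The shards and the counting task are invented for illustration; platforms like Hadoop and Spark implement the same pattern at cluster scale.

```python
from multiprocessing import Pool

# Each "shard" stands in for a chunk of a dataset too large for one machine.
# Real platforms distribute shards across a cluster; this sketch simulates
# the pattern with local worker processes. All data here is invented.
shards = [
    ["energy bill", "hospital trip", "energy bill"],
    ["app usage", "job application", "app usage", "app usage"],
]

def count_records(shard):
    """Map step: compute partial counts for a single shard."""
    counts = {}
    for record in shard:
        counts[record] = counts.get(record, 0) + 1
    return counts

def merge(partials):
    """Reduce step: combine the partial counts into one result."""
    total = {}
    for partial in partials:
        for key, value in partial.items():
            total[key] = total.get(key, 0) + value
    return total

if __name__ == "__main__":
    with Pool(processes=2) as pool:
        partial_counts = pool.map(count_records, shards)
    print(merge(partial_counts))  # {'energy bill': 2, 'hospital trip': 1, ...}
```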

Big data is everywhere. Companies collect it constantly through our daily actions: hospital trips, energy bills, cell phone and mobile app usage, advertising, and even job applications. One of the attractions of big data is the low cost of collecting it. Many companies and decision-makers use proxy variables from big data sets to make decisions at scale. A proxy variable is a measurable stand-in for something that cannot be measured directly (e.g., a credit score based on past financial behavior standing in for trustworthiness). This is where the danger of big data comes in: proxies are prone to losing qualitative context in the ocean of quantitative numbers. This is the danger that author Cathy O’Neil addresses in her book Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy.

“This is where the danger of big data comes in: proxies are prone to losing qualitative context in the ocean of quantitative numbers.”
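
To make the proxy idea concrete, here is a toy sketch of a lending decision built on one. The field names and the threshold are invented for illustration; real scoring models are far more elaborate, but the blind spot is the same.

```python
# A toy lending decision built on a proxy variable (all names and
# thresholds invented). We cannot measure "trustworthiness" directly,
# so the rule substitutes a credit score based on past financial behavior.

def approve_loan(applicant: dict) -> bool:
    # The proxy: credit score stands in for trustworthiness. Qualitative
    # context (a layoff, medical debt, a one-time emergency) is invisible.
    return applicant["credit_score"] >= 650

applicants = [
    {"name": "A", "credit_score": 700},
    {"name": "B", "credit_score": 610},  # denied, with no context and no appeal
]

for applicant in applicants:
    print(applicant["name"], "approved" if approve_loan(applicant) else "denied")
```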


What is a ‘Weapon of Math Destruction’?

Besides being a pun on “weapons of mass destruction”, a “weapon of math destruction”, or “WMD”, is a harmful algorithmic model built on these big data sets. Data models are created by mathematicians and statisticians, and they can encode the implicit biases of the people who build them. This may not seem like a problem on a case-by-case basis, but when applied at scale to millions, or even billions, of people, these models produce what O’Neil calls “toxic feedback loops”, which reinforce the original bias. These toxic feedback loops disproportionately impact the poor and marginalized while simultaneously reinforcing systemic issues like racism, xenophobia, and poverty.
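
A toy simulation shows how such a loop amplifies a small initial bias. Every number below is invented, and the all-or-nothing attention rule is a deliberate caricature; the point is only the mechanism: the model’s output decides where new data gets collected, and the new data then “confirms” the output.

```python
import random

random.seed(0)
TRUE_RATE = 0.1      # both groups have the SAME true rate of incidents
records = [12, 10]   # group 0 starts with slightly more recorded data

for step in range(20):
    # The model sends all 100 units of attention to whichever group has
    # the larger historical record -- this is the feedback mechanism.
    target = 0 if records[0] >= records[1] else 1
    # Incidents are only *recorded* where attention is directed, so the
    # favored group's record keeps growing while the other's stands still.
    observed = sum(random.random() < TRUE_RATE for _ in range(100))
    records[target] += observed

print(records)  # roughly [210, 10]: a tiny head start became "proof"
```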

WMDs have three major characteristics that O’Neil outlines in her book: opacity, scale, and damage. O’Neil explains how algorithms used by large companies are shrouded in mystery and heavily protected, sometimes worth billions of dollars (e.g., Google’s search engine algorithms). Their opaqueness makes these algorithms dangerous and lets the companies that use them escape accountability for the consequences their products create. As mentioned earlier, the scale at which WMDs are deployed allows them to harm millions of people. Consider the case of Michelle Rhee, who was hired to improve educational outcomes in Washington, D.C. and used test scores to judge teacher effectiveness with no consideration of the context in which teachers were working. Her approach cost dozens of good teachers their jobs.

Some advocates of these secretive algorithms claim that the number of people who benefit from their product or service outweighs the number who are harmed. O’Neil rebuts this, writing that “…the point is not whether some people benefit. It’s that so many suffer. These models, powered by algorithms, slam doors in the face of millions of people, often for the flimsiest of reasons, and offer no appeal. They’re unfair.” It is this level of scale that creates the third characteristic of WMDs: damage. Toxic feedback loops act as self-fulfilling prophecies for the people who fall on the wrong end of the algorithm.

“…These models, powered by algorithms, slam doors in the face of millions of people, often for the flimsiest of reasons, and offer no appeal. They’re unfair.”

WMDs negatively impact many different demographic groups around the world. This article focuses on the inequitable experience of Black Americans in the United States. Let’s take a look, shall we?

Predictive Policing – PredPol

If you haven’t heard of predictive policing, let me enlighten you. Predictive policing uses a police department’s historical crime data to direct officers’ time and energy more efficiently. The theory behind this type of policing is that studying trends in previous crimes will inform the department about potential future crimes. O’Neil points out the unfortunate reality of predictive policing by analyzing the impact of a software package called PredPol. She explains how the software works: “The program processes historical crime data and calculates, hour by hour, where crimes were most likely to occur.” The software’s predictions are presented as squares on a grid. When a square lights up, police are encouraged to patrol that area of the grid.
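
PredPol’s actual model is proprietary, so the sketch below is only a guess at the general shape of a grid-based hotspot predictor as O’Neil describes it: count historical incidents per grid square near a given hour, then light up the squares with the highest counts. The incident log, square labels, and ranking rule are all invented.

```python
from collections import Counter

# Hypothetical incident log of (grid_square, hour_of_day) pairs.
# This is NOT PredPol's real model, just a minimal illustration of
# predicting "hot" squares from historical counts.
history = [
    ("B2", 22), ("B2", 23), ("B2", 22), ("A1", 9),
    ("C3", 22), ("B2", 21), ("A1", 14), ("B2", 22),
]

def hot_squares(history, hour, top_n=1):
    """Rank grid squares by historical incident count within an hour of `hour`."""
    counts = Counter(square for square, h in history if abs(h - hour) <= 1)
    return [square for square, _ in counts.most_common(top_n)]

# At 10 p.m. the square with the most history "lights up" and officers
# are sent there, which is exactly where the next round of incident
# reports will then be generated.
print(hot_squares(history, hour=22))  # ['B2']
```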

This approach to reducing crime and increasing police efficiency has led to the over-policing of low-income and historically Black neighborhoods. It has also opened the door to more invasive police behaviors, including standardized stop and frisk (a practice widely criticized as racist) and even photographing civilians to upload to facial recognition software. O’Neil tells us that in New York City alone, 85% of stop-and-frisk encounters involved young Black or Latino men, and only 0.1% of those men were linked to any type of violent crime. What makes programs like PredPol dangerous is that the model behind the algorithm is built on historical police department data, which carries the biases of past policing. That data goes back generations, to times when policies like segregation were legal. So PredPol’s algorithm becomes a prime engine of the toxic feedback loop: over-policing in Black and brown neighborhoods, more arrests of Black and brown men and women, and more hurdles for those communities to overcome.

For-Profit Colleges

Another example of how big data disproportionately impacts Black people in the United States is the predatory advertising used by for-profit colleges. Have you ever seen an ad for DeVry University or the University of Phoenix in your Google search results? These for-profit schools use algorithmic models that specifically target vulnerable populations. According to O’Neil, a similar institution, Vatterott College, “directs recruiters to target ‘Welfare Mom w/Kids. Pregnant Ladies. Recent Divorce. Low Self-Esteem. Low Income Jobs. Experienced a Recent Death. Physically/Mentally Abused. Recent Incarceration. Drug Rehabilitation. Dead-End Jobs—No Future.’”

While O’Neil doesn’t explicitly discuss the racial component of these targets, it’s easy to find with a little research. According to the National Center for Education Statistics, Black Americans consistently make up between 12% and 25% of individuals on welfare. According to the Bureau of Justice Statistics, 38% of state prison inmates are Black. And according to the United States Census Bureau, 18.8% of Black Americans live in poverty. These numbers may not sound alarming, but when we apply the second characteristic of WMDs (scale), the percentages equate to millions of people being targeted by for-profit colleges. These colleges are notorious for inflating program costs to astronomical levels, and their degrees carry little credibility in the job market, leaving graduates with high-interest student loans and little ability to earn a higher income.
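
To see how the scale characteristic turns a modest-sounding percentage into millions of people, here is the back-of-the-envelope arithmetic. The population figure is my assumption, not from the article; only the 18.8% poverty rate is quoted above.

```python
# Back-of-the-envelope scale arithmetic.
# Assumption (not from the article): roughly 40 million Black Americans.
black_population = 40_000_000
poverty_rate = 0.188  # Census Bureau figure quoted above

in_poverty = black_population * poverty_rate
print(f"{in_poverty:,.0f}")  # 7,520,000 -- nearly eight million potential targets
```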

Conclusion

The inequities of big data extend well beyond what Cathy O’Neil covers in her book. Racial discrimination is rampant in areas like our healthcare system, facial recognition software, and online hiring platforms, to name a few. The good news is that people are beginning to highlight the inequalities caused by these once-unmonitored algorithms. Better yet, they’re working to fix the algorithms that cause such blatant inequity in the first place. Not only do we need more diversity and representation in data science, we also need those who are not affected by poverty and marginalization to educate themselves on the needs of others and to help build algorithms that lift people out of poverty and dismantle systems like racism.

