As technology plays a growing role in our lives, so too does data. On the internet, virtually every click, scroll and like is tracked, often used to maximize your engagement. But as it turns out, other forms of data — criminal history, income, even location — are used in the criminal justice system to determine who gets stopped and who goes free.
The problem, however, is that although the numbers seem innocent, their backstory reveals a much more nuanced and disturbing truth: the data used in these seemingly harmless, objective policing systems perpetuates racial bias in criminal justice, especially against Black communities.
Today, we’ll explore some history that most of us don’t learn in school: racist mid-20th century housing policies known as redlining — the dark, discriminatory underside of the New Deal — and racial profiling in policing. Next, we’ll unpack the effect of these historical discrimination tactics in modern crime data. Finally, we’ll look at how racially biased crime data is used in high-tech criminal justice — predictive policing — and how it treats poor racial minorities unfairly. In the criminal justice system, any form of injustice ought to be abolished.
The New Deal and Redlining
In 1933, between the two world wars, the US faced an unforgiving beast: the Great Depression. Banks were failing, people were starving and the housing market was running dry.
To combat this economic crisis, President Franklin D. Roosevelt proposed the New Deal, creating the Federal Housing Administration (FHA) and the Home Owners’ Loan Corporation (HOLC) among other progressive reforms. The FHA aimed to create new housing while minimizing financial risk, but ended up segregating America.
Through policies known as redlining, the FHA refused to back mortgages in or near African-American neighborhoods because of perceived economic riskiness. The HOLC colored these ‘hazardous’ areas on the map in red.
These efforts were “primarily designed to provide housing to white, middle-class, lower-middle-class families,” Richard Rothstein explained in his book “The Color of Law.” It was a “state-sponsored system of segregation.” A May 2017 NPR article added that “at the same time, the FHA was subsidizing builders who were mass-producing entire subdivisions for whites — with the requirement that none of the homes be sold to African Americans.”
Redlining created a network of discrimination in virtually every American city. Racist housing policies left Black people with poor housing and no federal investment or funding to improve the conditions. “Whites-only” neighborhoods increased the economic and social status of white people while forcing racial minorities into low-quality, underfunded neighborhoods.
In one case, Harvey Clark, a Black World War II veteran, tried to move into Chicago’s all-white Cicero neighborhood in 1951. The police harassed his family and threatened arrest, but the court ordered that the Clarks be allowed to move in. As soon as they did, a white mob of around 4,000 invaded and pillaged the apartment, throwing out the Clarks’ belongings and eventually setting the place on fire. Over 100 people were arrested, but instead of indicting the rioters, the grand jury indicted Clark, his real estate agent, his attorney and the white landlady who rented to him. The offense? Inciting a riot and attempting to lower property values, as Rothstein notes in “The Color of Law.”
Although the Fair Housing Act of 1968 effectively made redlining illegal, segregation remained. Notice, too, how the police in Clark’s story threatened his family with arrest from the start, even though they had done nothing wrong. This fits into a broader pattern of police targeting racial minorities, with or without the intention to do so, a practice still evident to this day. It’s better known as racial profiling.
Racial Profiling
Racial profiling is by no means new. It’s been a fact of life for African-Americans since the first organized police force: the slave patrol, founded in 1704 in South Carolina. Its job, you can probably guess, was to hunt down enslaved African and African-American people on the run to freedom. It was a heavy-handed form of social control imposed on African-Americans and was undoubtedly race-based. As policing evolved, this racial bias and over-policing of Black people persisted and evolved with it.
Fast-forward to the 1980s. While working to halt drug smuggling, Florida State Trooper Bob Vogel started to notice factors that kept arising when he stopped a driver and found drugs. The 1985 Florida guidelines for police on “The Common Characteristics of Drug Couriers” noted that rental cars, “scrupulous obedience to traffic laws,” drivers wearing “lots of gold” or who don’t “fit the vehicle,” and “ethnic groups associated with the drug trade” were characteristics police should treat as suspicious. What? Police were encouraged to use this overtly racialized profile in stops.
However, Vogel’s tally didn’t include the times the drivers he stopped turned out to be innocent. Had he counted those interactions, the proportion of innocent people matching the profile may well have outweighed the guilty drivers. Because he didn’t, his assumptions had no statistical basis.
Drug enforcement through minor traffic violations is technically legal: under the Fourth Amendment to the U.S. Constitution, once police stop someone for a traffic violation, they can then investigate other crimes such as drug trafficking. Officers just wait for a suspicious-looking vehicle or driver to commit any minor traffic violation, then pounce.
It’s often hard for drivers to make racial discrimination accusations because those charges fall under the 14th Amendment’s Equal Protection Clause, which requires a long, expensive civil lawsuit with no success guaranteed. Many people simply don’t have the time or the money required to pursue legal action.
Vogel received much positive national media attention for identifying the drug courier profiling factors. And in 1986, the Drug Enforcement Administration introduced profiling techniques to highway patrolling nationwide that implicitly encouraged targeting of nonwhite drivers. In the same year, the DEA also initiated Operation Pipeline to train tens of thousands of officers across the nation to use traffic stops as a pretext for drug and weapon searches. These tactics especially affect poor racial minorities, including in St. Louis.
A 2019 report by the Associated Press found that Black Missouri drivers are 91% more likely — nearly twice as likely — to be stopped by the police than white drivers. They’re even more likely to be stopped in their own communities, segregated through racist policies like redlining. Among St. Louis County drivers, Black drivers are over twice as likely to be pulled over compared to white drivers. Among Kansas City residents, Black drivers are almost three times as likely to be stopped.
The data is clear: disproportionate policing is undoubtedly present in Missouri’s police forces, whether or not it’s intentional.
Interestingly enough, a recent five-year Stanford study analyzed over 95 million traffic stops across the nation and found that while Black people “tend to get pulled over more frequently than whites, the disparity lessens at night, when a ‘veil of darkness’ hides their face,” and their race. Clearly, Black drivers face a higher likelihood of being stopped by the police, especially when their highly melanated skin is visible in the daylight.
Instead of leaving implicit or explicit racial prejudice in the past, however, automated predictive policing algorithms perpetuate these biases in the modern era of high-tech policing.
Predictive Policing
Algorithms claiming to predict the time and place of future crimes use racially biased historical crime data to spit out location targets for police to patrol. Instead of replacing human bias with an objective data-driven solution, these algorithms only perpetuate historical racial disparities in policing, caused in part by racial wealth gaps resulting from redlining and inaccurate racialized criminal profiles from the ‘war on drugs.’
In 2015, Aaron Shapiro went on a three-hour ride-along with a St. Louis County police officer in the suburbs around Ferguson, Mo. to test out a predictive policing program the department was about to start using. Because of racist housing policies like redlining, it’s a highly segregated, majority Black region. Shapiro had just started researching predictive policing, a practice in which algorithms use historical data to predict when and where crime is most likely to occur next. It’s a good idea — in theory.
During the ride-along, Shapiro noted how “unremarkable predictive policing is in practice.” The officer drove to the marked areas that the algorithm highlighted, looked for suspicious activity and then left. In one instance, Shapiro said “the officer spotted a rental car and tailed it for a few blocks. Rentals, he explained, can be a sign of trouble.” Sounds a lot like Vogel’s descriptions. “When the driver turned without using his signal, we pulled him over, using the traffic violation as a pretext.” After looking at the driver’s license and registration, the officer called for assistance because he smelled marijuana. After searching and not finding any, the police let the driver go.
This interaction was typical “war on drugs” style, using a minor traffic violation as a pretext for a drug search. The area around Ferguson is predominantly Black and low-income — the median income is $41,657 — a combination that yields racially biased traffic stops and subsequent searches. Cars are expensive, and because redlining policies economically excluded people of color from accruing wealth over time, they are less likely to own a car. For those on a tight budget, rentals or cheaper, lower-quality used cars are a practical option. But these cars can also be targets for police.
Even though data shows that white people are more likely to possess drugs, Black people are the ones caught for it since they have a higher stop and arrest probability. White drivers just aren’t getting stopped as often. This unfair pattern is perpetuated with predictive algorithms.
The St. Louis County Police Department has been using a predictive policing program called HunchLab since 2015, in the aftermath of Michael Brown’s death in Ferguson and the protests that ensued.
Predictive policing is only as accurate as the data it’s fed, and that data often disproportionately represents racial minorities. To make statistically accurate crime predictions, the algorithm takes in historical crime records and outputs patterns: the times and places crime is most likely to occur next.
From the outside, it looks like a fair, objective way to predict and prevent future crime. However, the historical data is not objective, so the system would produce biased targets even if police officers were completely unbiased. What’s more, scholars have shown the algorithms to be “fundamentally flawed.”
Since poor minority neighborhoods have historically been racially profiled and overpoliced, the data put into the algorithm marks these areas more, continuing the cycle in what’s known as a “runaway feedback loop.” Biased inputs lead to biased outputs that are justified when the police catch a criminal.
The result: disproportionately higher arrest rates in historically overpoliced neighborhoods get put back into the algorithm. This data is then used to justify continuing to disproportionately target these neighborhoods. And so the wheel of injustice rolls on.
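To see how this feedback loop compounds, here is a minimal, hypothetical simulation in Python. It is a sketch of the mechanism described above, not the actual HunchLab software, whose internals are not public; the neighborhoods, crime rates and patrol rule are all invented for illustration.

```python
# Toy illustration of a "runaway feedback loop" in patrol allocation.
# All numbers are hypothetical; this is not any real department's system.
import random

random.seed(42)

# Two imaginary neighborhoods with the SAME underlying crime rate.
true_crime_rate = {"A": 0.10, "B": 0.10}

# Historical over-policing: neighborhood A starts with more recorded arrests.
recorded_arrests = {"A": 30, "B": 10}

for year in range(10):
    total = sum(recorded_arrests.values())
    # The "prediction": allocate 100 patrols in proportion to past recorded arrests.
    patrols = {n: round(100 * recorded_arrests[n] / total) for n in recorded_arrests}

    # Arrests can only be recorded where officers are sent, so more patrols
    # produce more recorded crime, even though the true rates are identical.
    for n in recorded_arrests:
        new_arrests = sum(random.random() < true_crime_rate[n] for _ in range(patrols[n]))
        recorded_arrests[n] += new_arrests

    print(f"Year {year + 1}: patrols {patrols}, cumulative arrests {recorded_arrests}")
```

In this sketch, the neighborhood that starts with more recorded arrests keeps receiving more patrols and therefore keeps generating more arrest records, even though both neighborhoods have identical underlying crime rates.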
It’s important to note that some people in communities with high crime rates feel more policing is necessary. Reacting to The Marshall Project’s investigative piece on predictive policing, one commenter noted that “as one who lived and worked for over 30 years in a high-crime, predominantly Black inner-city neighborhood, the increased presence of police in the high crime area … was welcomed and sought.”
It’s also important to note that the technology doesn’t factor in race. But because of redlining, the systematic segregation and disinvestment in communities of color by the federal government, geography and income are proxies for race. Poor neighborhoods are more often populated by minority groups, and wealthy neighborhoods are generally populated by white people.
“It’s a vicious cycle,” John Chasnoff, program director of the Eastern Missouri American Civil Liberties Union chapter said. “The police say, ‘we’ve gotta send more guys to North County’ because there have been more arrests there, and then you end up with even more arrests, compounding the racial problem.”
Chasnoff proposed a solution. He said, “I don’t think anyone, in the abstract, has a problem with figuring out where crime is and responding to it. But what’s the appropriate response? The assumption is: we predicted crime here, and you send in police. But what if you used that data and sent in resources?”
Former St. Louis County Police Chief Jon Belmar, who retired in 2020, agreed that other resources are necessary. “We can’t just have the criminal justice system solve our problems,” he said.
Fortunately, police departments across the nation have begun to take action against these unfair algorithms. New Orleans shut down its program in 2018, after it had operated for six years without public knowledge. Chicago followed suit in 2020. The same year, Santa Cruz, Calif. banned predictive policing. In St. Louis, however, the program is still active. And some cities are adopting systems that predict not which places but which people are more likely to commit a crime, based on age, gender, criminal history and other factors.
We can all agree that community safety is of utmost importance, and stopping crime before it happens would make a lot of people’s lives better. But with the tools we have now that perpetuate historical inequities by continuing to overpolice minority neighborhoods in a runaway feedback loop, we have to ask, “at what cost?”
If the cost is increased racial profiling through this biased technology that hides behind a façade of objectivity and fairness, is it worth it? If the technology in the criminal justice system is inherently biased, is it really creating justice?
Maybe technology isn’t the answer to all of our problems. Maybe we’d be better off facing our unsettling past and moving forward from there. Data-driven technology like predictive policing is a reflection of our history, the good and the bad. So, for the time being, we have to take a good look at the continuing history of bias and discrimination in our justice systems before handing over the reins to technology. Just like we have to push backward on the floor to walk forward, we have to go back and understand the past to move forward as a community, nation and world.
Throughout history, our national systems have denied justice for Black people. Slavery and the slave patrol denied them the right to freedom, a fundamentally American value. Racist mid-20th century housing policies made homeownership and good quality, affordable housing nearly impossible to attain, widening the racial wealth gap. Most recently, the ‘war on drugs’ has exacerbated racial profiling in policing, in part because suspicious-looking vehicles — rental cars and cheap used cars in semi-disrepair — are targets, and are more prevalent among people who don’t have the money to buy a new car or pay for repairs. And because of redlining, low-income neighborhoods are generally populated by racial minorities. Regardless of individual police officers’ beliefs and actions, the entire system indisputably works against Black people. Adding technology to the mix doesn’t fix these inherent structural issues.
In an institutional framework that continues to deny Black people their civil liberties, it’s no surprise that seemingly objective technologies used within these unjust structures only perpetuate and augment injustice. Bias existed before predictive policing and only continues and grows with the technology. Future policing innovations will also claim to provide an unbiased, data-driven approach to high-tech policing. But based on our criminal justice system’s 300+ year track record of malicious actions towards racial minorities, how can we be so sure? Ultimately, an algorithm can only be as just as the data and the system it is built on.
To truly ensure liberty and justice for all, we must either dramatically change the objective of predictive policing tools to uplift communities by sending in essential resources, or eliminate these machines of injustice altogether. Either way, justice for Black Americans is long overdue.
For more information on how technology can endanger human rights, you can check out the critically acclaimed Coded Bias documentary, now available on Netflix and the free PBS app.