Bias In Algorithm Costs Patients Precious Time, Money and Treatment

Brooke Brown
June 3, 2020

Institutions that rely heavily on human decision making, such as hospitals and courts, turn to data-driven algorithms as a shortcut, hoping to lessen the effects of flawed, and costly, subjectivity. The problem is that applying these algorithms can end up encouraging the opposite.

In 2019, a study published by Science revealed that “a widely used algorithm [used to manage the health of populations]... exhibits significant racial bias.” The bias was revealed to have come from the assumptions fed into the programmed sequence, which is used to quickly evaluate and then recommend care options for patients with complex healthcare needs.

Optum, the developer of the algorithm, failed to account within its data sets for the effects of systemic racism, under which Black patients face disparities in access to quality preventive, diagnostic, and treatment care. That oversight led to a mistake that cost millions of patients dearly.

The result, the study revealed, was that under Optum’s algorithm, Black patients had to be significantly sicker than white patients to be recommended for the same level of care. We see this challenge playing out on the front lines of the fight against the coronavirus (COVID-19).

“Conditions such as diabetes, hypertension and asthma that tend to plague African Americans more than other groups could contribute to more Covid-19 deaths,” reported CNBC.

Undertreatment costs Black patients money, time, and quality of life. These expensive sacrifices could be avoided if algorithmic bias were checked before it could distort healthcare administration decisions.

The problem doesn’t show up only in healthcare administration. Algorithms such as COMPAS, used to assess whether people released from jail are likely to commit another crime; PredPol, used to predict where crime will occur; and those behind police facial recognition technology have all been shown to exhibit biases with harmful consequences, including over-policing, criminalization, and disproportionate sentencing of certain groups.

Errors at the hands of justice administrators cost defendants their freedom, reputations, employment and income, and custody of their children, not to mention the cost of competent legal counsel itself.

Hiring is no different: the same qualified candidates who have historically been screened out by human bias can also be screened out by algorithmic bias, perpetuating homogeneous hiring.

A Brookings Institution report on algorithmic bias offered this: “Algorithms, by their nature, do not question the human decisions underlying a dataset. Instead, they faithfully attempt to reproduce past decisions, which can lead them to reflect the very sorts of human biases they are intended to replace.”

This is bad news for people from marginalized groups, like Black Americans, who are subject to bias from the same humans who are building the algorithms that may define our society’s future.

We have a quick favor to ask:

PushBlack Finance is a nonprofit dedicated to raising up Black voices. We are a small team but we have an outsized impact:

  • We reach tens of millions of people with our BLACK FINANCIAL NEWS & ECONOMIC EMPOWERMENT STORIES every year.
  • We fight for ECONOMIC JUSTICE to protect our community.
  • We run VOTING CAMPAIGNS that reach over 10 million African-Americans across the country.

And as a nonprofit, we rely on small donations from subscribers like you.

With as little as $5 a month, you can help PushBlack raise up Black voices. It only takes a minute, so will you please give today?
