Posted on November 4, 2019 at 9:10 AM
Can algorithms show racial bias? According to a recent article published in Science by Obermeyer et al., entitled “Dissecting racial bias in an algorithm used to manage the health of populations,” the answer is yes.
According to Science, the algorithm’s goal is “to predict complex health needs for the purpose of targeting an intervention that manages those needs.” Fair enough. That certainly sounds like a worthy goal, especially in these days of complex medical conditions with equally complex treatments. However, the problem Obermeyer et al. identify is that “the algorithm predicts health care costs rather than illness, but unequal access to care means that we spend less money caring for Black patients than for White patients. Thus, despite health care cost appearing to be an effective proxy for health by some measures of predictive accuracy, large racial biases arise.” The study concluded that reducing this bias would allow a much larger percentage of Black patients to receive the advanced interventions that the health care system offers.
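The mechanism the study describes, cost standing in for illness, can be sketched in a toy simulation. Every number below is invented purely for illustration; none of it comes from the study's data or from the actual algorithm:

```python
import random

random.seed(0)

# Toy sketch of the proxy problem (illustrative assumptions only):
# give both groups the IDENTICAL illness distribution, but, mirroring
# the unequal-spending pattern the study describes, assume costs for
# Black patients run about 30% lower on average.

def make_patient(group):
    illness = random.gauss(10, 2)                    # same for both groups
    spend_factor = 0.7 if group == "Black" else 1.0  # assumed spending gap
    cost = illness * 1000 * spend_factor * random.uniform(0.9, 1.1)
    return {"group": group, "illness": illness, "cost": cost}

patients = [make_patient("Black") for _ in range(500)] + \
           [make_patient("White") for _ in range(500)]

# Stand-in "algorithm": flag the top 10% by predicted COST for the
# extra care-management program, as a cost model effectively does.
selected = sorted(patients, key=lambda p: p["cost"], reverse=True)[:100]
share_black = sum(p["group"] == "Black" for p in selected) / len(selected)
print(f"Black share of flagged patients: {share_black:.0%}")
```

Because illness is identical across groups by construction, a bias-free selection would flag roughly half Black patients; ranking by cost instead flags far fewer, which is the shape of the disparity the researchers report.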
In addition, the Minneapolis Star-Tribune reports that “New York regulators are calling on Minnetonka-based UnitedHealth Group to either stop using or show there’s no problem with a company-made algorithm that researchers say exhibited significant racial bias in a study.”
In its note on the research, Nature states that Optum (the algorithm’s developer) raised questions about the study’s conclusions: “The cost model is just one of many data elements intended to be used to select patients for clinical engagement programs, including, most importantly, the doctor’s expertise.” Nature also reports that “Obermeyer is working with the firm without salary to improve the algorithm.”
The results of this study deserve to be examined closely. If we truly want to affirm the dignity of each individual, bioethics must address these areas of disparity whenever possible. It’s not likely that we will ever achieve a bias-free world, but it is surely helpful to be made aware of our biases so that we can better serve those in need.