
Many computer algorithms designed to make complex decisions in place of humans are replicating the same racial, socioeconomic and gender-based biases they were built to overcome, reports have said. Software widely used in the healthcare industry is affecting access to care for millions of Americans, according to new research from the University of California: the program, which determines who gets access to high-risk health care management programs, routinely admits healthier white patients ahead of less healthy black patients.

The study, published in the journal Science, obtained algorithm-predicted risk scores for 43,539 white and 6,079 black patients enrolled at a hospital and compared them to more direct measures of a patient's health, including the number of chronic illnesses and other biomarkers. It found that for a given risk score, black patients were in significantly poorer health than their white counterparts.

"Instead of being trained to find the sickest, in a physiological sense, [these algorithms] ended up being trained to find the sickest in the sense of those whom we spend the most money on," explained lead author Sendhil Mullainathan, the Roman Family University Professor of Computation and Behavioral Science at Chicago Booth.

The study also suggested that fixing the bias in the algorithm could more than double the number of black patients automatically admitted to these programs. Ziad Obermeyer, acting associate professor of health policy and management at UC Berkeley and co-author of the study, said the algorithm encoded racial bias by using health care costs as a proxy for patient "risk," that is, for who would most likely benefit from care management programs.

"Because of the structural inequalities in our health care system, blacks at a given level of health end up generating lower costs than whites," Obermeyer explained, adding that black patients were sicker at a given level of the algorithm's predicted risk. The researchers said they were able to correct much of the bias initially built into the algorithm by tweaking the software to predict patient risk from other variables, such as the costs that could be avoided through preventive care.
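The mechanism can be illustrated with a toy sketch. The data below is synthetic, and the 30 percent cost gap is an illustrative assumption, not a figure from the study: when cost stands in for health, a group that generates lower costs at the same level of illness is scored as lower risk and pushed out of the high-risk tier.

```python
import random

random.seed(0)

# Synthetic patients: (group, chronic-condition count, annual cost).
# Illustrative assumption: black patients generate ~30% lower costs
# than white patients at the same level of illness, mirroring the
# access disparities the study describes.
patients = []
for _ in range(10_000):
    group = random.choice(["white", "black"])
    conditions = random.randint(0, 10)
    cost_factor = 1.0 if group == "white" else 0.7
    patients.append((group, conditions, conditions * 1_000 * cost_factor))

def black_share_of_top(patients, key, frac=0.03):
    """Share of black patients among the top `frac` ranked by `key`."""
    ranked = sorted(patients, key=key, reverse=True)
    top = ranked[: int(len(ranked) * frac)]
    return sum(1 for p in top if p[0] == "black") / len(top)

# Cost as the label: the cost gap pushes equally sick black patients
# out of the top risk tier.
cost_label = black_share_of_top(patients, key=lambda p: p[2])
# A direct health measure (chronic conditions) as the label instead.
health_label = black_share_of_top(patients, key=lambda p: p[1])

print(f"black share of top 3% (cost label):   {cost_label:.0%}")
print(f"black share of top 3% (health label): {health_label:.0%}")
```

In this toy setup, ranking by cost leaves the top tier almost entirely white, while ranking by the underlying health measure restores a share close to the group's actual prevalence among the sickest patients.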

Obermeyer said that incorporating routine audits into developers' workflows would help fix bias in prediction algorithms, which are often designed by private companies and kept proprietary, making it difficult for data scientists and researchers to analyze them. The researchers said that patients whose risk scores landed at or above the 97th percentile were automatically identified for enrollment in the care management program, and that correcting for the health disparities between black and white patients raised the percentage of black people in the automatically enrolled group from 18 to 47 percent.
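A routine audit of the kind Obermeyer recommends can be sketched generically: given risk scores and group labels, check the group composition of everyone who lands above the enrollment cutoff. The function and data below are a hypothetical illustration, not the study's code or figures:

```python
def audit_enrollment(scores, groups, percentile=97):
    """Each group's share among patients whose risk score lands at or
    above the given percentile cutoff -- the auto-enrolled population."""
    cutoff = sorted(scores)[int(len(scores) * percentile / 100)]
    enrolled = [g for s, g in zip(scores, groups) if s >= cutoff]
    shares = {}
    for g in enrolled:
        shares[g] = shares.get(g, 0) + 1
    return {g: n / len(enrolled) for g, n in shares.items()}

# Hypothetical example: group "B" is systematically scored 30% lower
# than group "A" at the same underlying position.
groups = ["A" if i % 2 else "B" for i in range(1_000)]
scores = [i if g == "A" else i * 0.7 for i, g in zip(range(1_000), groups)]
result = audit_enrollment(scores, groups)
print(result)  # group B vanishes from the auto-enrolled top 3%
```

Running such a check on each group every time the model or its training label changes would surface the kind of disparity the study documents before the algorithm is deployed.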

Obermeyer said training the algorithm to determine risk based on other measurable variables, such as avoidable cost, or the number of chronic conditions that needed treatment in a year, significantly reduced the racial bias.