
U of T researchers uncover equity gap in diabetes-related AI

U of T researchers Quynh Pham (left) and Joseph Cafazzo (right) say the vast majority of research into AI-based diabetes interventions does not include, or report on the inclusion of, ethnic or racial training data (photos courtesy of IHPME)

From minimally invasive robot-assisted surgery to training computers to detect breast cancer, the potential for artificial intelligence (AI) to transform health care is breathtaking. But what if the data used to develop some AI-based tools – such as those that help a clinician predict whether a patient will go on to develop a disease – are incomplete or unsuitable?

Researchers at the University of Toronto’s Institute of Health Policy, Management and Evaluation (IHPME) at the Dalla Lana School of Public Health asked themselves this very question.

A paper written by Quynh Pham and Joseph Cafazzo shows that the vast majority of research into AI-based diabetes interventions does not include, or report on the inclusion of, ethnic or racial training data – an important finding given that diabetes patients from certain ethno-racial groups are more likely to have poor outcomes.

The paper, titled “The Need for Ethnoracial Equity in Artificial Intelligence for Diabetes Management,” was published recently in the Journal of Medical Internet Research.

“A lot of artificial intelligence is done by training models on retrospective data and historically, those datasets poorly represent Canadians,” says Cafazzo, a professor at IHPME and executive director of the Centre for Global eHealth Innovation at University Health Network (UHN).

“People who are associated with academic centres tend to get enrolled in research trials. We tend to gather data on them, but people who are more rural and our Indigenous populations don’t get asked to be in research trials, so their data is never collected and therefore never incorporated into these models.”

Pham, who is an assistant professor at IHPME and a scientist at UHN, notes there are cultural and biological factors that make it more difficult for certain racialized communities to manage diabetes.

About one in three Canadians has prediabetes or diabetes, which puts people at greater risk of heart disease, stroke, and kidney failure.

Pham, Cafazzo, and colleagues Anissa Gamble and Jason Hearn conducted a secondary analysis of a highly cited review paper published in 2018, “Artificial Intelligence for Diabetes Management and Decision Support.” The 2018 review examined research articles on AI-based diabetes interventions, including applications for clinical decision support; for identifying adverse events; for self-management tools that, for example, prompt a person to make a lifestyle change; and for predicting the risk of developing diabetes from genetic or lifestyle factors.

The team found that of the 141 articles included in the 2018 review, 90 per cent made no mention of the ethnic and racial makeup of the datasets used to inform AI algorithms. Only 10 of the articles in the original review reported ethnic or racial data, with the average distribution being 70 per cent white, 17 per cent Black and four per cent Asian.

Skewed training data are problematic because of a phenomenon called distributional shift: a model learns its patterns from the population it was trained on, so a diabetes prediction tool developed using data from a group that is unlike the people it will be applied to may perform poorly – or be flat-out wrong.

“If you train your model on a data set that looks nothing like the population that you intend to apply it to, you’re going to have a massive mismatch,” says Pham, who is the first author on the paper.

“Some of these studies were 99 per cent white populations – if you take that and apply it to Markham or Mississauga or Scarborough, obviously that’s not going to work because of the demographic makeup of those communities.”
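To see what that mismatch can look like in practice, here is a minimal sketch – not drawn from the paper, and using entirely synthetic data with invented group effects – in which a risk model is trained on a sample that is 95 per cent one group and then evaluated on a community with a very different mix.

# Minimal sketch (not from the paper): how a risk model trained on a skewed
# sample can degrade under distributional shift. All data here are synthetic
# and the group-specific effects are invented purely for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

def make_cohort(n, group_weights):
    """Simulate a cohort with a chosen group mix (groups 0 and 1).
    Group 1's risk depends more strongly on feature x2, so a model fit
    mostly on group 0 learns the wrong weighting for group 1."""
    group = rng.choice([0, 1], size=n, p=group_weights)
    x1 = rng.normal(size=n)                    # e.g. a BMI-like feature
    x2 = rng.normal(size=n)                    # e.g. a biomarker
    logit = 1.5 * x1 + np.where(group == 1, 2.0, 0.2) * x2 - 0.5
    y = rng.binomial(1, 1 / (1 + np.exp(-logit)))
    return np.column_stack([x1, x2]), y, group

# Training sample: 95 per cent group 0 (mirrors the skewed datasets described above).
X_tr, y_tr, _ = make_cohort(20_000, [0.95, 0.05])
model = LogisticRegression().fit(X_tr, y_tr)

# Deployment population: a community with a very different mix.
X_te, y_te, g_te = make_cohort(20_000, [0.30, 0.70])
probs = model.predict_proba(X_te)[:, 1]
for g in (0, 1):
    mask = g_te == g
    print(f"group {g}: AUC = {roc_auc_score(y_te[mask], probs[mask]):.2f}")

The specific numbers are arbitrary; the point is simply that evaluating performance group by group exposes a gap that a single overall accuracy figure would hide.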

The researchers recommend using representative training datasets for digital health interventions to improve accuracy and generalizability.
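One simple, hypothetical way to act on that recommendation is to audit a dataset's composition against the population a model is meant to serve before any training begins. The sketch below is illustrative only – the column name, target shares and threshold are assumptions, not part of the researchers' tool.

# Illustrative check (not the authors' tool): compare a training dataset's
# ethnoracial composition against the population the model will serve.
# The column name, target shares and 50-per-cent threshold are hypothetical.
import pandas as pd

def audit_representation(df, target_shares, col="ethnoracial_group"):
    """Report each group's share of the training data versus the target
    population, flagging groups that are badly under-represented."""
    observed = df[col].value_counts(normalize=True)
    report = pd.DataFrame({
        "training_share": observed,
        "target_share": pd.Series(target_shares),
    }).fillna(0.0)
    # Flag a group if its training share is under half its population share.
    report["under_represented"] = report["training_share"] < 0.5 * report["target_share"]
    return report

# Made-up example loosely echoing the skew described above.
train = pd.DataFrame({"ethnoracial_group": ["white"] * 70 + ["Black"] * 17 + ["Asian"] * 4 + ["other"] * 9})
print(audit_representation(train, {"white": 0.45, "Black": 0.15, "Asian": 0.30, "other": 0.10}))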

“I think the important thing for research, especially in Canada, is to have more inclusive prospective datasets to train these models on,” says Cafazzo. “It brings it back to: How inclusive do we want to be in research and be honest about our differences?”

The team has also developed a tool – a set of five questions – that researchers can use to assess how they are collecting data and whether their AI algorithms are relevant to the ethnic and racial groups they aim to serve. The goal of this work, says Pham, is to ensure equity is built into how health innovations are designed so that all communities can benefit from them.

“In five years, AI-based interventions will be standard of care,” Pham says. “They will be federated eventually to a level where everybody, to some degree, is accessing care where something has run through a model to assist a clinician in making a diagnosis or a judgement call. We want to make sure everybody is cared for equally.”

By Alisa Kim

2021 marks the 100th anniversary of the discovery of insulin, highlighting the collaborative effort by U of T, its affiliated hospitals and industry partners to develop, advance and distribute this life-saving treatment to millions worldwide.