AI can help diagnose some diseases – if your country is...

Artificial intelligence promises to expertly diagnose diseases from medical images and scans. However, a closer look at the data used to train algorithms to detect eye diseases suggests that these powerful new tools can perpetuate health inequalities.

A team of researchers in the UK analyzed 94 datasets, containing more than 500,000 images, that are commonly used to train AI algorithms to detect eye diseases. They found that almost all of the data came from patients in North America, Europe, and China. Only four datasets came from South Asia, two from South America, and one from Africa; none came from Oceania.

The skewed sourcing of these eye images means AI eye-screening algorithms are less reliable for racial groups from underrepresented countries, says Xiaoxuan Liu, an ophthalmologist and researcher at the University of Birmingham who was involved in the study. “Even when there are very subtle changes in the disease in certain populations, the AI can fail pretty badly,” she says.

The American Academy of Ophthalmology has expressed enthusiasm for AI tools that promise to help improve standards of care. However, according to Liu, doctors may be reluctant to use such tools with ethnic minority groups if they learn the tools were built by screening predominantly white patients. She notes that the algorithms could fail because of differences too subtle for doctors themselves to notice.

The researchers also found other problems with the data. Many datasets did not include key demographic information such as age, gender, and race, making it difficult to assess whether they are biased in other ways. The datasets were also usually built around only a handful of diseases: glaucoma, diabetic retinopathy, and age-related macular degeneration. Forty-six of the datasets used to train algorithms did not provide the data.


The U.S. Food and Drug Administration has approved several AI imaging products in recent years, including two AI tools for ophthalmology. According to Liu, the companies behind these algorithms usually don’t provide details about how they were trained. She and her co-authors urge regulators to consider the diversity of training data when examining AI tools.

The bias in eye image datasets means that algorithms trained on this data are less likely to work properly in Africa, Latin America, or Southeast Asia. That would undermine one of the great supposed benefits of AI diagnostics: its potential to bring automated medical expertise to poorer regions that lack it.

“You get an innovation that only benefits certain parts of certain groups of people,” says Liu. “It’s like a Google Maps that doesn’t cover certain zip codes.”

The lack of diversity in eye images, which researchers refer to as “data poverty,” likely affects many medical AI algorithms.

Amit Kaushal, assistant professor of medicine at Stanford University, was part of a team that analyzed 74 studies of the medical use of AI, 56 of which used data from US patients. They found that most of the US data came from three states – California (22), New York (15), and Massachusetts (14).

“If subgroups of the population are systematically excluded from AI training data, AI algorithms will tend to perform worse for these excluded groups,” says Kaushal. “Problems with underrepresented populations may not even be investigated by AI researchers because no data is available.”

He says the solution is to make AI researchers and doctors aware of the problem so they can seek out more diverse datasets. “We have to create a technical infrastructure that enables access to diverse data for AI research, and a regulatory environment that supports and protects the use of this data by researchers,” he says.

Vikash Gupta, a research scientist at the Mayo Clinic in Florida who works on using AI in radiology, says that simply adding more diverse data might remove the bias. “It is currently difficult to say how this problem can be solved,” he says.
