AI can tell when you have coronavirus just by listening to...

The difference between the cough of someone with coronavirus and that of a healthy person is undetectable to the human ear, but it can be “heard” by a machine-learning algorithm.

Researchers at MIT used thousands of samples of coughs and spoken words to train an artificial intelligence model, which can now detect people with COVID-19 with 98.5 percent accuracy.

For people who tested positive for the coronavirus but showed no symptoms, the AI identified the infection every time.

Before the pandemic, researchers used similar technology to detect signs of Alzheimer’s disease.

Alzheimer’s is best known for damaging human memory, but it also weakens the vocal cords.

In addition, Alzheimer’s patients are more likely than healthy people to show emotions such as frustration, or to display shallow affect (decreased emotional expressiveness).

The ResNet50 neural network – an algorithm loosely inspired by the human brain – was trained to distinguish between sounds produced with different degrees of vocal cord strength.

Two other neural networks were trained to recognize emotions in speech, such as frustration, happiness and calm, and to recognize changes in lung and respiratory performance from the sound of a cough.

By combining all three models with an algorithm that detects muscle breakdown, the researchers obtained an artificial intelligence model that could identify Alzheimer’s cases – and one that could be adapted to diagnose Covid-19.
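To make that description concrete, here is a minimal sketch of how such a combined model might be wired together, assuming PyTorch, mel-spectrogram inputs, and a ResNet50 backbone for each of the three biomarker branches. The module names, feature sizes, and the scalar muscle-breakdown score are illustrative assumptions, not the researchers’ actual implementation.

```python
# A minimal sketch (not the authors' code) of the architecture described above:
# three ResNet50-based feature extractors -- vocal-cord strength, sentiment, and
# lung/respiratory performance -- whose outputs are concatenated with a
# muscle-breakdown score and fed to a binary Covid/no-Covid classifier.
import torch
import torch.nn as nn
import torchaudio
from torchvision.models import resnet50


class BiomarkerBranch(nn.Module):
    """One ResNet50 branch operating on a mel-spectrogram of the cough audio."""

    def __init__(self, feat_dim: int = 256):
        super().__init__()
        backbone = resnet50(weights=None)          # pretraining choice is an assumption
        backbone.fc = nn.Linear(backbone.fc.in_features, feat_dim)
        self.backbone = backbone

    def forward(self, spec: torch.Tensor) -> torch.Tensor:
        # spec: (batch, 1, mel_bins, time) -> repeat to 3 channels for ResNet50
        return self.backbone(spec.repeat(1, 3, 1, 1))


class CoughCovidModel(nn.Module):
    """Combines the three biomarker branches plus a scalar muscle-breakdown score."""

    def __init__(self, feat_dim: int = 256):
        super().__init__()
        self.vocal_cords = BiomarkerBranch(feat_dim)
        self.sentiment = BiomarkerBranch(feat_dim)
        self.lungs = BiomarkerBranch(feat_dim)
        self.classifier = nn.Sequential(
            nn.Linear(3 * feat_dim + 1, 128), nn.ReLU(), nn.Linear(128, 1)
        )

    def forward(self, spec: torch.Tensor, muscle_score: torch.Tensor) -> torch.Tensor:
        feats = torch.cat(
            [self.vocal_cords(spec), self.sentiment(spec), self.lungs(spec),
             muscle_score.unsqueeze(1)],
            dim=1,
        )
        return self.classifier(feats)              # logit: positive vs. negative


# Example: turn a 10-second, 16 kHz recording into a mel-spectrogram and score it.
melspec = torchaudio.transforms.MelSpectrogram(sample_rate=16000, n_mels=128)
waveform = torch.randn(1, 16000 * 10)              # stand-in for a real recording
spec = melspec(waveform).unsqueeze(0)              # (1, 1, 128, time)
model = CoughCovidModel()
logit = model(spec, torch.tensor([0.3]))           # 0.3 = placeholder muscle score
print(torch.sigmoid(logit))
```

In this sketch the three branches share the same spectrogram input and differ only in what they are trained to pick up, which mirrors the article’s description of separately trained networks being combined by a final classifier.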

“The sounds of speaking and coughing are both influenced by the vocal cords and the surrounding organs. This means that when you speak, part of your speech is like coughing, and vice versa. It also means that things we can easily infer from fluent speech can be easily picked up from a cough, including things like gender, native language, or even the person’s emotional state,” said co-author Brian Subirana, a researcher at MIT’s Auto-ID Lab who worked with Jordi Laguarta and Ferran Hueto.

“Indeed, your mood is embedded in the way you cough. So we thought, why shouldn’t we try these Alzheimer’s biomarkers [to see if they’re relevant] for Covid?”

The researchers collected over 70,000 recordings and a total of 200,000 coughs – the “largest research dataset we know of,” said Subirana.

Approximately 2,500 samples were from confirmed coronavirus patients.

These samples, along with 1,500 others, were used to train the model. Another 1,000 were selected to test the model for accuracy.
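As a small illustrative sketch of the split described above, the following is not the authors’ pipeline: roughly 2,500 Covid-positive cough recordings plus 1,500 other recordings form the training set, with a further 1,000 recordings held back to test accuracy. The file names and label field are hypothetical.

```python
import random

# Hypothetical index of the collected recordings; the label field is illustrative.
recordings = [{"path": f"cough_{i}.wav", "covid_positive": i < 2500} for i in range(70000)]
random.shuffle(recordings)

positives = [r for r in recordings if r["covid_positive"]]
others = [r for r in recordings if not r["covid_positive"]]

train_set = positives[:2500] + others[:1500]   # 2,500 confirmed cases + 1,500 others
test_set = random.sample(others[1500:], 1000)  # 1,000 further recordings for evaluation
random.shuffle(train_set)

print(len(train_set), len(test_set))           # 4000 1000
```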

They found four biomarkers – vocal cord strength, mood, lung and respiratory performance, and muscle breakdown – that are specific to Covid-19. The researchers believe the coronavirus changes the way people produce sounds, even when they are asymptomatic.

The researchers are working on integrating the model into an app, which would have to be approved by the US Food and Drug Administration before release.

If successful, users could cough into their phones and get instant information about a possible infection.

“The effective implementation of this group diagnostic tool could reduce the spread of the pandemic if everyone uses it before going to a classroom, factory or restaurant,” Subirana said.
