The apps help make medical diagnoses, but they’re still a work in progress

The same tools used to take selfies are being repurposed and commercialized to quickly access the information needed to monitor patients’ health. A heart rate can be measured by pressing a fingertip to the phone’s camera lens. A bedside microphone can detect sleep apnea.

In the best version of this new world, data is transmitted remotely to a healthcare professional for the convenience and comfort of the patient, all without the need for expensive hardware.

However, the use of smartphones as diagnostic tools is still a work in progress. Although doctors and their patients have found some real-world success, experts said the technology’s broader potential remains unfulfilled and uncertain.

Smartphones are equipped with sensors that can monitor a patient’s vital signs. They can help assess people for concussions, watch for atrial fibrillation, and perform mental health checks, to name a few emerging application areas.

Enthusiastic companies and researchers are taking advantage of phones’ built-in cameras and light sensors; microphones; accelerometers that detect body movements; gyroscopes; and even speakers. The apps then use artificial intelligence software to analyze the collected sights and sounds, creating an easy connection between patients and doctors. Grand View Research reports that more than 350,000 digital health products were available in app stores in 2021.

“It’s very difficult to put devices into a patient’s home or a hospital, but everyone just walks around with a cell phone that has a network connection,” said Andrew Gostine, CEO of sensor network company Artisight. According to the Pew Research Center, most Americans own a smartphone, including more than 60 percent of those 65 and older. The pandemic has also made people more comfortable with virtual care.


Manufacturers of some of these products have applied for approval from the Food and Drug Administration to market them as medical devices. Others have been exempted from the regulatory process and placed in the same clinical classification as Band-Aids. But how the agency handles AI- and machine learning-based medical devices is still evolving to reflect the adaptive nature of the software.

Ensuring accuracy and clinical validation is crucial to securing buy-in from healthcare providers. And many tools still need to be refined, said Eugene Yang, a professor of clinical medicine at the University of Washington.

Judging these new technologies is difficult because they rely on algorithms built with machine learning and artificial intelligence to collect data, rather than on the physical instruments typically used in hospitals. So researchers can’t make “apples-to-apples” comparisons against medical industry standards, Yang said. Failure to build in such safeguards could undermine the technology’s goals of reducing costs and expanding access, because a physician would still need to verify the results, he added.

Major technology companies such as Google have invested heavily in the field, catering to clinicians and home care providers as well as consumers. Currently, users of the Google Fit app can check their heart rate by placing a finger over the lens of the rear-facing camera, or track their breathing rate using the front-facing camera.

Google’s research uses machine learning and computer vision, the field within AI that relies on information from visual inputs such as videos or images. So, for example, instead of using a blood pressure cuff, the algorithm can interpret subtle visual changes in the body that serve as proxies and biosignals of blood pressure, said Shwetak Patel, director of health technologies at Google and professor of electrical and computer engineering at the University of Washington.
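The article doesn’t disclose Google’s actual algorithm. As a rough illustration of the general idea behind camera-based vital signs (often called remote photoplethysmography), the toy sketch below recovers a heart rate from tiny periodic brightness changes, here a synthetic per-frame green-channel signal standing in for real camera frames; the function name and data are hypothetical.

```python
import numpy as np

def estimate_heart_rate(green_means, fps):
    """Estimate heart rate (BPM) from per-frame mean green-channel
    brightness by finding the dominant frequency in the cardiac band."""
    signal = np.asarray(green_means, dtype=float)
    signal -= signal.mean()                       # remove the DC offset
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    # Restrict to a plausible heart-rate band: 0.7-4 Hz (42-240 BPM)
    band = (freqs >= 0.7) & (freqs <= 4.0)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0

# Synthetic demo: 10 s of frames at 30 fps with a 1.2 Hz (72 BPM) pulse
fps = 30
t = np.arange(0, 10, 1.0 / fps)
frames = 128 + 0.5 * np.sin(2 * np.pi * 1.2 * t)  # tiny brightness ripple
print(round(estimate_heart_rate(frames, fps)))     # → 72
```

Real systems must also handle motion, lighting changes, and skin-tone variation, which is exactly the accuracy concern raised later in the article.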


Google is also investigating the effectiveness of its smartphones’ built-in microphones for detecting heartbeats and murmurs, and using the camera to preserve vision by screening for diabetic eye disease, according to information the company published in 2022.

The tech giant recently acquired Sound Life Sciences, a Seattle start-up with an FDA-approved sonar technology application. It uses a smart device’s speaker to bounce inaudible pulses off the patient’s body to identify movement and monitor breathing.
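The sonar approach described above works because chest motion from breathing modulates the echo of an inaudible tone. Sound Life Sciences’ actual method isn’t public; the sketch below is a simplified, hypothetical model of that principle, demodulating a synthetic amplitude-modulated echo and reading off the breathing rate.

```python
import numpy as np

def breathing_rate_from_echo(echo, sample_rate):
    """Estimate breaths per minute from a received ultrasonic echo whose
    amplitude is modulated by chest motion (a simplified sonar model)."""
    envelope = np.abs(np.asarray(echo, dtype=float))  # crude AM demodulation
    envelope -= envelope.mean()
    spectrum = np.abs(np.fft.rfft(envelope))
    freqs = np.fft.rfftfreq(len(envelope), d=1.0 / sample_rate)
    band = (freqs >= 0.1) & (freqs <= 0.7)            # 6-42 breaths/min
    return freqs[band][np.argmax(spectrum[band])] * 60.0

# Synthetic demo: near-ultrasonic 20 kHz tone, chest motion modulating it
# at 0.25 Hz (15 breaths per minute), sampled for 20 seconds
sr = 48_000
t = np.arange(0, 20, 1.0 / sr)
echo = (1.0 + 0.1 * np.sin(2 * np.pi * 0.25 * t)) * np.sin(2 * np.pi * 20_000 * t)
print(round(breathing_rate_from_echo(echo, sr)))  # → 15
```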

One Israel-based company also uses a smartphone’s camera to calculate vital signs. Its software studies the region around the eyes and analyzes the light reflected from the blood vessels back to the lens, said company spokesperson Mona Popilian-Yona.

Applications also cover disciplines such as optometry and mental health:

  • With a microphone, Canary Speech uses the same underlying technology as Amazon Alexa to analyze patients’ voices for mental health conditions. The software can be integrated with telemedicine appointments and allows clinicians to screen for anxiety and depression using vocal biomarkers and predictive analytics, said Henry O’Connell, the company’s CEO.
  • Australia-based ResApp Health received FDA clearance in 2022 for an iPhone app that can screen for moderate to severe obstructive sleep apnea by listening to breathing and snoring. SleepCheckRx, which will require a prescription, is minimally invasive compared to sleep tests currently used to diagnose sleep apnea.
  • Brightlamp’s Reflex application is a clinical decision support tool that helps with concussion and vision recovery, among other things. The mobile app uses an iPad’s or iPhone’s camera to measure how the pupils react to changes in light. Through machine learning analysis, the images give practitioners data points for evaluating patients. Brightlamp sells directly to healthcare providers and is used in more than 230 clinics. Clinicians pay a standard annual fee of $400, which is not covered by insurance. A Department of Defense clinical trial using Reflex is underway.
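The pupil measurement described in the last bullet boils down to tracking pupil diameter over time and summarizing how it responds to a light stimulus. Brightlamp’s actual analysis is proprietary; as a hypothetical sketch of the underlying measurement, the code below takes a pupil-diameter trace (as a camera-based tracker might produce) and reports constriction amplitude and time-to-minimum after light onset.

```python
import numpy as np

def pupil_response(diameters, fps, light_on_frame):
    """Summarize a pupillary light reflex from a pupil-diameter trace (mm):
    constriction amplitude and time from light onset to minimum size."""
    d = np.asarray(diameters, dtype=float)
    baseline = d[:light_on_frame].mean()          # pre-stimulus diameter
    min_frame = light_on_frame + np.argmin(d[light_on_frame:])
    amplitude = baseline - d[min_frame]           # mm of constriction
    time_to_min = (min_frame - light_on_frame) / fps  # seconds
    return amplitude, time_to_min

# Synthetic demo: 4.0 mm baseline constricting to 2.5 mm over 0.5 s
fps = 60
trace = [4.0] * 60 + [4.0 - 1.5 * min(i / 30, 1.0) for i in range(120)]
amp, ttm = pupil_response(trace, fps, light_on_frame=60)
print(round(amp, 1), round(ttm, 2))  # → 1.5 0.5
```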

In some cases, such as the Reflex app, the data is processed directly on the phone — not in the cloud, said Brightlamp CEO Kurtis Sluss. By processing everything on the device, the app avoids privacy issues, as streaming data elsewhere requires the patient’s consent.

But algorithms must be trained and tested by collecting data sets, and this is an ongoing process.

For example, researchers have found that some computer vision applications, including monitoring heart rate and blood pressure, are less accurate with darker skin. Studies are ongoing to find better solutions.

“We’re not there yet,” Yang said. “That’s the point.”

This article was produced by Kaiser Health News, a program of the Kaiser Family Foundation, a nonprofit organization that provides information on health issues to the nation.
