Potentially fatal math errors are common in mobile applications used in clinics and emergency rooms, but a team of researchers from the New Jersey Institute of Technology’s Ying Wu College of Computing has found mathematically demonstrable solutions that could save lives.
The applications, also called medical score calculators, can be downloaded by anyone and are popular among less experienced healthcare workers. But the apps are riddled with errors, sometimes due to flawed source data from medical reference tables and sometimes due to poor implementations by developers who don’t understand the science.
Examples of scores used for early warning, intensive care units, and triage include HEART (history, ECG, age, risk, troponin), PAS (pulmonary asthma score), and SOFA (sepsis-related organ failure assessment).
Computer science professor Iulian Neamtiu, who supervised graduate students Sydur Rahaman and Raina Samuel, now at Google and Montclair State University, respectively, said they began uncovering such flaws several years ago during broader work on event-based mobile applications.
“The barrier to entry for publishing an app is very low. Anyone with marginal programming skills can publish to mobile app stores and call that a medical or health app. We also had a few preliminary articles looking at the claims these apps make, and we saw that the claims can be outrageous. We saw apps that are supposed to diagnose cancer, cure cancer, cure your DNA,” he said.
Overall, the team found significant bugs in 14 of the 90 Android applications they examined. They also expect to find bugs in iOS apps and in web-based applications, Neamtiu explains. In addition to identifying incorrect data, the team built new software specifically for this study.
“The informatics [aspect] is to put the problem into a mathematical model or mathematical framework. So we know we have a problem: there are ages or age groups, or physiological parameters, that are misinterpreted or mishandled, and one of our most important contributions is to cast this in such a way that it can be checked mathematically and therefore rigorously,” he added.
In mathematical terms, they treated the apps as guilty until proven innocent, testing them with a tool called an automated theorem prover.
“Essentially, we formulate the problem like this: the score must meet certain correctness criteria, and if it doesn’t, find me an example. And if the automated theorem prover finds an example, you’ve essentially managed to poke a hole in the score.”
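To make the idea concrete, here is a minimal sketch of counterexample-finding, not the authors’ tool, using the off-the-shelf Z3 automated theorem prover through the z3-solver Python package. The age bands below are invented for illustration; the prover is asked to find a plausible age that the rule fails to assign a score.

```python
# A minimal sketch of counterexample-finding with an automated theorem prover.
# NOTE: the age bands here are hypothetical, not taken from any published score.
from z3 import Int, And, Or, Not, Solver, sat

age = Int("age")

# Hypothetical scoring bands: 0 points if age < 45, 1 point if 45 <= age < 65,
# 2 points if age > 65. Note the unintended gap at age == 65.
bands = [age < 45, And(age >= 45, age < 65), age > 65]

s = Solver()
s.add(age >= 0, age <= 120)   # restrict to plausible patient ages
s.add(Not(Or(bands)))         # correctness criterion violated: no band applies

if s.check() == sat:
    print("Counterexample: age =", s.model()[age])   # the prover reports 65
else:
    print("Every age in range receives a score.")
```

In the same spirit, the prover can be asked for an age covered by more than one band, which would flag an overlap rather than a gap.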
“And then we realized that these apps are just implementing something that doctors have been publishing for fifteen, twenty years. We said, let’s take a look at those papers, and to our surprise, and shall I say disappointment, we discovered that the original sin was actually in the medical papers, because the medical papers contain errors: they contain parameter ranges and patient ages that are simply not covered. So those errors come from the medical literature, and those papers have been cited and used for twenty years,” Neamtiu noted.
“They are implemented in emergency care systems. Yet they contain errors, and these errors persist. So every time a new system is built or a new article is published, these errors are reproduced. That’s another problem, one that has nothing to do with the apps.”
The trio published their paper, “Diagnosis of medical score calculator apps,” in Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies last year. Several offshoot papers have also been written, and others are on the way, including some that address errors in body surface area calculations, which are important because they are used to determine chemotherapy dosages, Neamtiu said.
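For context, and not drawn from the paper itself, body surface area is commonly estimated from height and weight with the widely used Mosteller formula; a slip in units or in the constant feeds straight into a chemotherapy dose. A minimal sketch:

```python
import math

def bsa_mosteller(height_cm: float, weight_kg: float) -> float:
    """Body surface area in square metres, using the Mosteller (1987) formula."""
    return math.sqrt(height_cm * weight_kg / 3600.0)

# Example: a 170 cm, 70 kg patient has a BSA of roughly 1.82 m^2.
print(round(bsa_mosteller(170, 70), 2))
```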
Several application developers responded positively to the NJIT team’s research and made necessary updates, although errors in a popular medical manual have not yet been corrected, Neamtiu noted.
Looking ahead, Neamtiu says, “Frankly, we were surprised that papers published in prestigious journals just didn’t pass muster in terms of basic mathematical rigor. So that’s an area of work I plan to continue, possibly with funding from the National Institutes of Health, on finding these types of errors in the medical literature.
“I think the scrutiny needs to be stricter. There needs to be more mathematical rigor, because those errors are relatively easy to spot, so I’m surprised not only that they made it in, that they passed peer review, but that these errors have continued to exist.
“We use all these calculators: everything that has anything to do with fitness, health or medical calculations. Medical scores, dosing, anything that’s based on formulas or whatever kind of calculations are involved, and the goal is to just take that problem away.
“Imagine you have a monolith the size of a mountain, and that’s medical errors. We’re out there with our picks, with our jackhammers, and we’ve managed to have a positive impact. My goal is public health.”
More information:
Sydur Rahaman et al., Diagnosis of medical score calculator apps, Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (2023). DOI: 10.1145/3610912
Citation: Researchers fix critical flaws in medical mobile apps (2024, November 25) retrieved November 27, 2024 from https://medicalxpress.com/news/2024-11-critical-medical-mobile-apps.html
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.