Predictive analytics will push the frontiers of clinical care; the question is whether FDA regulation will promote or stifle innovation in the name of consumer protection.
FDA regulation in this area is a moving target. Let’s see what we know so far.
Predictive Analytics in Health & Medicine
Predictive analytics extracts information from data and uses it to predict trends and behavior patterns. It combines statistical techniques from modeling, machine learning, and data mining to analyze current and historical facts and make predictions about future or otherwise unknown events.
In healthcare, predictive analytics can tap patients’ electronic health records and mine Big Data to assess which types of patients might respond most favorably to a given intervention. Life sciences companies might use predictive analytics to market pharmaceuticals to specific sub-populations, for example. Both pharmaceutical and medical device companies may look to super-targeted solutions for individual patients.
Predictive analytics can improve healthcare in several ways:
Predictive analytics (PA) uses technology and statistical methods to search through massive amounts of information, analyzing it to predict outcomes for individual patients. That information can include data from past treatment outcomes as well as the latest medical research published in peer-reviewed journals and databases.
Not only can PA help with predictions, but it can also reveal surprising associations in data that our human brains would never suspect.
In medicine, predictions can range from responses to medications to hospital readmission rates. Examples are predicting infections from methods of suturing, determining the likelihood of disease, helping a physician with a diagnosis, and even predicting future wellness….
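To make the idea concrete, here is a minimal, purely illustrative sketch of how such a prediction might be computed. The feature weights, intercept, and feature names below are invented for illustration only; they are not drawn from any actual clinical model or product.

```python
import math

# Hypothetical feature weights for a toy hospital-readmission risk model.
# These numbers are invented for illustration; real models are fit to data.
WEIGHTS = {"age": 0.03, "prior_admissions": 0.40, "smoker": 0.80}
INTERCEPT = -4.0

def readmission_risk(patient):
    """Return a 0-1 risk score using a logistic model over patient features."""
    score = INTERCEPT + sum(WEIGHTS[k] * patient[k] for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-score))  # logistic (sigmoid) function

# Two hypothetical patients: the score rises with age, prior admissions, smoking.
low = readmission_risk({"age": 40, "prior_admissions": 0, "smoker": 0})
high = readmission_risk({"age": 70, "prior_admissions": 3, "smoker": 1})
print(round(low, 3), round(high, 3))
```

A real system would learn the weights from thousands of historical records rather than hard-coding them, but the shape is the same: patient features in, individualized risk score out.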
I think about the Bayer TV commercial in which a woman gets a note that says, “Your heart attack will arrive in two days.” The voiceover proclaims, “Laura’s heart attack didn’t come with a warning.” Not so with predictive analytics. That very message could be sent to Laura from her doctor who uses predictive analytics. Better yet, in our bright future, Laura might get the note from her doctor that says, “Your heart attack will occur eight years from now, unless …” – giving Laura the chance to restructure her life and change the outcome.
Predictive medicine is the ultimate patient-centric healthcare. Bioinformatics, genomic medicine, and proteomics are all part of predictive medicine.
FDA Regulation of Mobile Medical Apps
In its mobile medical app guidance, FDA specifically announced that it would not address software that performs patient-specific analysis to aid or support clinical decision-making.
FDA also stated that mobile apps that are intended for general patient education and facilitate patient access to commonly used reference information are unlikely to be FDA regulated as medical devices. FDA noted that apps can be patient-specific, but intended for “awareness, education, and empowerment, and ultimately support patient-centered health care.”
These kinds of apps, FDA explained, are not medical devices, because the apps are “intended generally for patient education, and are not intended for use in the diagnosis of disease or other conditions, or in the cure, mitigation, treatment, or prevention of disease by aiding clinical decision-making (i.e., to facilitate a health professional’s assessment of a specific patient, replace the judgment of a health professional, or perform any clinical assessment).”
Examples FDA cited (of apps that are intended to educate, not diagnose/treat, and therefore are not regulated medical devices) include mobile apps that:
- Provide a portal for healthcare providers to distribute educational information to their patients regarding their disease, condition, treatment, or upcoming procedure
- Help guide patients to ask appropriate questions to their physician
- Provide information about gluten-free food products or restaurants
- Help match patients with potentially appropriate clinical trials
- Provide tutorials or training videos on first-aid or CPR
- Allow users to input pill shape, color, or imprint and display pictures and names of matching pills
- Find the closest medical facilities and doctors
- Provide emergency hotlines and physician/nurse advice lines
- Provide and compare costs of drugs and medical products
Note that none of the above are involved in diagnosis and treatment, and none aid the clinician in assessing or treating any patient. These are all consistent with FDA’s position on the types of apps that are not likely to be considered regulated medical devices, i.e., apps that:
- Provide general information about disease
- Organize and track health information (log blood pressure measurements)
- Help patients document, show or communicate (e.g., video-conferencing)
- Automate simple tasks for providers (like calculating BMI)
- Enable patients or providers to interact with EHR
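The “simple tasks” bullet is worth pausing on, because it marks the low-risk end of the spectrum. A BMI calculator is exactly the kind of computation FDA treats as unlikely to need regulation. A minimal sketch (the formula is the standard weight-over-height-squared definition; the function name is ours):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight in kilograms divided by height in meters squared."""
    return weight_kg / (height_m ** 2)

# A 70 kg person who is 1.75 m tall.
print(round(bmi(70.0, 1.75), 1))
```

Nothing here diagnoses or treats anything; it automates arithmetic a provider could do by hand, which is precisely why it sits outside the likely-regulated zone.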
Regarding the last bullet, FDA states on its website that its mobile medical app policy does not apply to mobile apps that function as an electronic health record (EHR) system or personal health record system. What the Mobile Medical App Guidance actually says is that FDA “intends to exercise enforcement discretion” (meaning that FDA does not intend to enforce the requirements of the Food, Drug & Cosmetic Act relating to medical devices) for mobile apps that enable patients or providers to interact with electronic health records.
In Appendix A, Examples of Mobile Apps for which FDA Intends to Exercise Enforcement Discretion (i.e., apps FDA does not intend to actively regulate as medical devices), FDA’s examples include:
- mobile apps that “provide patients a portal into their own health information, such as access to information captured during a previous clinical visit or historical trending and comparison of vital signs”
- mobile apps that allow a user to collect, log, track, and trend data, such as blood pressure, heart rate, weight, or other data from a device, to eventually share with a health care provider or upload to an online (cloud) database, personal health record, or electronic health record.
Apps that FDA regards as likely to be mobile medical apps (regulated medical devices) are those that:
- Transform the mobile platform into a medical device by using a built-in feature (such as the camera) to diagnose or treat disease. Example: an app uses an attachment to measure blood glucose levels.
- Control the operation or function of a medical device. Example: an app controls the inflation or deflation of a blood pressure cuff.
- Display, transfer, store, or convert patient-specific medical device data from a connected medical device. Example: an app connects to a bedside monitor and transfers the patient’s data to a central viewing station.
Apps that FDA regards as possibly mobile medical apps (MMAs) include those that:
- Provide behavioral tips (example: for smokers to quit)
- Provide patient-specific recommendations once patients input their characteristics such as age, sex, and behavioral risk factors
- Record a clinical conversation and send it to the patient afterward
- Keep track of medications and provide reminders, or compare vital signs
- Allow individuals to log, record, track, evaluate, or make behavioral decisions relating to general fitness, health, or wellness, such as apps that provide meal planners or recipes, calculate calories burned in a workout, or help people track sleep.
Again, this list consists of patient-centered apps, but the key is that these apps edge closer to the realm of diagnosis or treatment decision-making.
FDA at the Crossroads of Regulation and Innovation
FDA currently faces a digital health dilemma: regulators are struggling to keep pace with health-care technology innovation. Digital health is “at the nexus of clinical innovation, behavioral science innovation, pharmaceutical innovation, and consumer electronics and gadget innovation … That requires a complementary structure of policy and regulation and data security and privacy.” (HIPAA creates its own regulatory burdens, apart from FDA regulation).
FDA cares about patient safety, and classifies medical devices according to risk.
Accordingly, FDA recently announced a more hands-off approach to general wellness products that pose a low risk to consumers.
But FDA retains enforcement discretion, and looks not only to risk but also to the intended use of the product, as shown by all the marketing materials (including claims on the manufacturer or distributor website).
If the intended use is to diagnose or treat disease, then the product may be considered a drug or medical device, subject to all the regulation that goes along with this regulatory categorization.
This means that for industry, it can be a guessing game as to whether FDA will or will not regulate the product along certain lines. Legal counsel can help with critical business decisions including important marketing judgments and fine-line edits of claims on the website, as these will all be fodder for an enforcement action if they run afoul of regulatory lines.
FDA Regulation of Clinical Decision Support
Look for FDA to issue new guidance this year on clinical decision support systems. Thus far, FDA has largely held back, and there are moves to carve certain categories out of the enforcement process.
FDA’s classification of a recent recall gives us some clues as to what the agency considers to be high-risk clinical decision support (CDS) software, and high-risk bugs that might be found in that CDS. More specifically, FDA considers software that helps highly trained anesthesiologists make clinical decisions (such as spotting the potential for adverse drug reactions) to be in the moderate range of the risk scale. And when it comes to defects, FDA concludes that a bug that populates a patient’s record with the wrong data, for example, creates significant risk. Combine those two factors and you get a Class I recall…
The McKesson Anesthesia Care software product seems to be pure clinical decision-support software, albeit with the option to download data automatically from various sensors. From the company’s description, the software simply provides analytic decision support, as opposed to software that controls the operation of another medical device or is designed and marketed to pair with a specific, branded medical device to analyze data from that device. The software does not automatically control the anesthesia equipment; it simply informs the anesthesiologist.
The issue with the McKesson software is a classic one that has been identified with electronic health records (EHRs): a bug that substitutes one patient’s information for another patient’s data. So far, FDA has chosen not to regulate EHRs, despite taking the position that EHRs qualify as medical devices. So, if this problem were to happen with an EHR, presumably FDA would not get involved in the recall; it would simply leave it to the manufacturer to handle. But here, given the significance that FDA attaches to the functionality of the software—informing the decision of an anesthesiologist looking for a possible adverse drug reaction—this type of bug is enough to place the recall in the highest risk category. That’s a big deal.
The article concludes that “even clinical decision-support software aimed at supporting the most educated of healthcare professionals can be high risk,” and thus subject to FDA enforcement as a medical device, if the software makes the healthcare professional (and patient’s health) dependent on its proper functioning.
In other words, some clinical decision-support software poses no or low risk, while other clinical decision-support software poses patient risk because the health professional’s dependence on its accuracy leaves the patient vulnerable.
It’s no secret that everything is in flux.
Technology changes, and FDA gets squeezed between calls by industry for rule-making that accelerates innovation in the globally competitive marketplace, and the traditional, bureaucratic stance of going the distance to protect consumers against dangers. From digital health to nanotechnology, the pressures will only increase.
As a result, we see a patchwork of guidance: FDA announces it will back off here, but retains enforcement discretion to pursue rogue actors there. We see a Class I recall and read the tea leaves from sometimes ambiguous and conflicting FDA guidance.
It seems that software is steadily encroaching upon the ageless sanctum of the doctor. You’ll be wearing your doctor on your sleeve (so, apparently, is our President, but that’s another story), and he or she (or “We?”) will be continually providing you updates on your health and fitness status. Meanwhile, how will FDA regulate your product – or can you fall outside its enforcement discretion?
Areas of controversy include:
- EHRs and apps that include clinical decision algorithms
- Apps that make diagnostic or treatment recommendations
- Software that uses Big Data to make individualized treatment suggestions
- Software that uses machine learning and has an impact on patient safety
- Software that coordinates care
The more artificial intelligence and machine learning move us into the Singularity, the more these regulatory conundrums will accelerate. I already feel us sliding into the Singularity, like home plate. Meanwhile, it’s important to read the bright lines set down by “the Agency” and calculate wisely how far a venture can move into the brave new world, without tripping into an unwanted regulatory burden.
Assertive action by industry will continue to push the boundaries, and that’s where legal strategy and counsel can help position a product to assess whether it’s a regulated drug, medical device, or falls within some other regulated category. Ask us how we can help assess and position your venture within this evolving marketplace so as to manage the FDA unknowns.