At its heart, healthcare has traditionally revolved around just a couple of basic practices: Looking for signs of illness in a patient, and listening to what the patient has to say about their symptoms. But these are by no means simple undertakings. A wide range of symptoms can be difficult to detect accurately, and can combine in baffling ways with respect to diagnosis; as for patients’ own accounts, all kinds of communication barriers can prevent the flow of accurate information from patients to doctors.
Meanwhile, the kinds of sophisticated biometric technologies that have emerged over the last several years are generally designed to perform one particular function: To scan particular body parts and match them against previous samples — for example, matching a face to a previously registered biometric template. And as artificial intelligence has evolved, machine learning-based computer vision has further enhanced the ability of these biometric systems to spot patterns in human (and even non-human) physiology.
The implications of these developments for healthcare are clear enough — at least so far as looking for symptoms goes. And some cutting-edge applications have already started to emerge.
Looking for Symptoms
One of the most striking examples came into focus last year through a collaboration between researchers at the United Kingdom’s Moorfields eye hospital, University College London, and, crucially, AI specialist DeepMind. They worked together to develop a system that uses machine learning algorithms to detect the physical signs of eye disease as they appear on patients’ retinas. In testing on real medical retinal scans, the system detected eye disease with an accuracy of 94 percent, exceeding human medical professionals’ diagnoses in both accuracy and speed.
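The article doesn’t detail how the Moorfields system works internally, but the general pattern it relies on — train a classifier on labelled scans, then score new ones — can be sketched in a few lines. What follows is a deliberately toy illustration, not the DeepMind system: the real work uses deep neural networks on 3D eye scans, whereas this sketch uses a simple logistic-regression classifier on synthetic 8x8 “scans” where diseased examples carry a bright patch. All names and data here are assumptions for illustration only.

```python
import numpy as np

# Hypothetical sketch of the train-then-score pattern behind
# ML-based scan classification. NOT the Moorfields/DeepMind system,
# which uses deep networks on real 3D retinal scans.

rng = np.random.default_rng(0)

def make_scans(n, diseased):
    """Generate synthetic 8x8 'scans'; diseased ones get a bright patch."""
    scans = rng.normal(0.0, 1.0, size=(n, 8, 8))
    if diseased:
        scans[:, 2:5, 2:5] += 2.0  # crude stand-in for a visible lesion
    return scans.reshape(n, -1)

# Labelled training set: 0 = healthy, 1 = diseased
X = np.vstack([make_scans(100, False), make_scans(100, True)])
y = np.array([0] * 100 + [1] * 100)

# Train logistic regression by plain gradient descent
w = np.zeros(X.shape[1])
b = 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted disease probability
    w -= 0.1 * (X.T @ (p - y) / len(y))      # gradient step on weights
    b -= 0.1 * float(np.mean(p - y))         # gradient step on bias

def predict(scans):
    """Return the estimated probability that each scan shows disease."""
    return 1.0 / (1.0 + np.exp(-(scans @ w + b)))

# Score held-out synthetic scans against their known labels
test = np.vstack([make_scans(20, False), make_scans(20, True)])
truth = np.array([0] * 20 + [1] * 20)
accuracy = float(np.mean((predict(test) > 0.5) == truth))
print(f"held-out accuracy: {accuracy:.2f}")
```

The point of the sketch is the statistical framing: the system never “understands” eye disease, it simply learns which pixel patterns co-occur with the diagnoses in its training labels — which is also why the quality of the labelled training scans matters so much in the real clinical work.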
This is one of the clearest indications yet of how AI-powered biometric technology could be used to perform the first critical function of healthcare practitioners: scanning the human body for the known symptoms of illness and making statistical estimates about what diagnoses would be appropriate based on those scans.
Of course, not all illness is visible to the naked eye, but even in these cases biometric technology can help. And a perhaps surprising area in which such technologies have already started to offer clinical diagnosis is one of the least physically obvious ailments of all — brain injury.
The US Army was already experimenting with related technology back in 2014, when researchers were trialling a solution called the Blast Gauge System designed to detect concussions and other signs of brain trauma. Researchers have only become more interested in using biometric technologies to detect brain-related health issues in the ensuing years, and in 2017, a group at New York University’s Langone Medical Center turned their attention to PTSD through the unusual avenue of voice biometrics. The idea was that PTSD and other kinds of brain trauma could surface as subtle changes in patients’ vocal patterns, and preliminary research found that their machine learning system could identify individuals who had been clinically diagnosed with PTSD with an accuracy of 77 percent.
Listening to Patients
And this is where AI-driven biometric technologies start to demonstrate applications in the second main aspect of healthcare practice — listening to patients, literally. Around the same time that the NYU researchers were delving into PTSD diagnosis, researchers at the Mayo Clinic were taking a similar approach to heart disease. There, too, they reasoned that symptoms of illness could be reflected in voice patterns, based primarily on the idea that chest pain would naturally affect how patients talked. And their initial research found 13 distinct vocal patterns that could be associated with heart disease, offering a promising start to this new research avenue.
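The article doesn’t describe the Mayo Clinic’s 13 vocal patterns, but voice-analysis work of this kind typically begins by reducing a raw waveform to a handful of acoustic descriptors — loudness, noisiness, pitch — which can then be fed to a classifier. The sketch below computes three such textbook features; the feature choices, sample rate, and synthetic test tone are all assumptions for illustration, not the researchers’ actual method.

```python
import numpy as np

# Illustrative sketch of basic acoustic feature extraction, of the
# kind voice-biometric studies often start from. The Mayo Clinic's
# actual heart-disease-associated vocal patterns are not public here.

SAMPLE_RATE = 16_000  # Hz — an assumed recording rate

def acoustic_features(signal, sr=SAMPLE_RATE):
    """Compute three simple descriptors of a mono waveform."""
    # Short-time energy: a rough measure of overall loudness
    energy = float(np.mean(signal ** 2))
    # Zero-crossing rate: a crude proxy for noisiness vs. voicing
    zcr = float(np.mean(np.abs(np.diff(np.sign(signal))) > 0))
    # Pitch estimate via the autocorrelation peak, searched over
    # lags corresponding to 60-400 Hz (typical speaking range)
    ac = np.correlate(signal, signal, mode="full")[len(signal) - 1:]
    lo, hi = sr // 400, sr // 60
    pitch = sr / (lo + int(np.argmax(ac[lo:hi])))
    return {"energy": energy, "zcr": zcr, "pitch_hz": float(pitch)}

# Synthetic half-second 200 Hz tone as a stand-in for voiced speech
t = np.arange(0, 0.5, 1 / SAMPLE_RATE)
tone = 0.5 * np.sin(2 * np.pi * 200 * t)
feats = acoustic_features(tone)
print(feats)
```

In a study like Mayo’s, descriptors of this sort would be computed for many patients and correlated against diagnoses — the “13 distinct vocal patterns” would emerge from exactly that kind of statistical association, not from any single measurement.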
Around that same time, private sector companies were starting to look into the potential of voice and speech recognition technologies to help healthcare practitioners save time and energy by automatically recording patient stories for medical documentation. One of the biggest tech companies in the world, in fact — Google — teamed up with researchers at Stanford University to develop a system that not only ‘listens’ to patient-doctor conversations, but summarizes the patients’ stories in note form. Going a step further, last summer a Singapore-based digital healthcare platform provider called Doctor Smart announced a partnership with language technology specialist LangNet to work on a system that can transcribe patient stories and apply AI to those transcriptions for potential diagnoses. What’s more, in announcing their collaboration the companies noted the “promising results for disease diagnosis” that voice biometrics technologies have shown, suggesting that this will be the natural next step as their technology progresses.
The Future of Medical Diagnosis
All of this seems to be pointing pretty clearly in one direction. Just ahead of this year’s HIMSS conference, Nuance Communications announced an AI-driven “Ambient Clinical Intelligence” system for hands-free clinical documentation that is slated for commercial launch in early 2020; and while the company has not yet mentioned any biometric components to the system, its proven expertise in voice biometrics suggests that this kind of technology could be put to diagnostic use in the background while doctors converse with patients. Even if Nuance doesn’t go there, it’s a fair bet that someone else will.
In parallel, there’s every reason to expect that the kind of computer vision technology used to diagnose eye disease at Moorfields is going to emerge for other maladies, too, with a similar machine learning approach likely to yield similar results. And all the while, other biometric technologies traditionally used in clinical settings, such as ECG analysis, will continue to evolve, especially as they benefit from the kind of machine learning technology that has revolutionized facial recognition.
Healthcare is always going to involve looking closely at patients and listening to them carefully, but it’s increasingly going to be done with the help of AI. And as is the case in other areas of healthcare where biometrics are booming, that’s going to free up resources for doctors and improve outcomes for patients.
February 21, 2019 – by Alex Perala