Thoughts About the Future of AI Applications on Apple Devices and Their Impact on Public Health

Benefits of AI for public health

Apple has recently extended the personal safety features on our wearables and personal portable devices. These new features exploit the capabilities of on-device artificial intelligence (AI), and they are only the beginning. I have written before in the comments to John Martellaro’s Particle Debris columns about where I thought these AI-enabled personal safety features were headed, some of which are now featured in Apple’s new product line. Let me opine, based on current and near-ready AI use cases, on what I believe are near-horizon capabilities and features to aid public health. I am not necessarily predicting that this is what Apple or others will do. Rather, I’m outlining what they could do in order to reduce prominent but preventable causes of morbidity and mortality.

Background 

At Apple’s recent “Far Out” event, the Cupertino-based tech giant introduced some new and enhanced AI features that extend the capabilities of its new iPhone and Apple Watch offerings. The previous safety features are still there, such as Fall Detection and monitoring of blood oxygen saturation and heart rate/rhythm. To these was added Crash Detection, a feature that exploits the new 3-axis gyroscope in the Apple Watch Series 8. It also utilizes the gyroscope in the iPhone and the accelerometers in both devices’ chipsets to detect catastrophic-level shifts in momentum. These can include head-on, side and rear collisions, as well as rollovers.

As with Fall Detection, the AI on either the Watch or the iPhone serves public health by alerting the user that it has detected such an event. Then it initiates a countdown and waits for the user either to confirm that they are okay or to disable the countdown. Failing that, the device initiates a call to emergency services for assistance. Further, on the new Apple Watch Ultra, the onboard AI logs hiking routes. This enables back-tracking to prevent one from becoming hopelessly lost. (As a former Boy Scout, I can imagine my Scout Master considering this a “cheat.” He would make us disable this feature on our wilderness survival outings.)
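
To make that flow concrete, here is a minimal Swift sketch of the detect, prompt, count down, and escalate sequence. Everything in it is a placeholder of my own, including the countdown length and the hand-off to emergency services (which, notably, third-party apps cannot trigger directly); it is not Apple’s implementation.

```swift
import Foundation

/// A minimal sketch of the detect -> prompt -> countdown -> escalate flow
/// described above. The sensor trigger, the prompt, and the emergency
/// hand-off are placeholders, not Apple's actual implementation.
final class CrashResponseCoordinator {
    private let countdownSeconds: TimeInterval = 20   // assumed value
    private var escalationTimer: Timer?

    /// Called when on-device motion analysis flags a possible crash.
    func severeImpactDetected() {
        promptUser()                                   // "Are you OK?" style alert (placeholder)
        escalationTimer = Timer.scheduledTimer(withTimeInterval: countdownSeconds,
                                               repeats: false) { [weak self] _ in
            self?.escalate()                           // no response: call for help
        }
    }

    /// Called if the user confirms they are okay before the countdown ends.
    func userConfirmedOK() {
        escalationTimer?.invalidate()
        escalationTimer = nil
    }

    private func promptUser() {
        print("Possible crash detected. Respond within \(Int(countdownSeconds))s to cancel.")
    }

    private func escalate() {
        // Placeholder: a real system would hand off to emergency services with
        // location data; apps cannot place such calls programmatically.
        print("No response received. Contacting emergency services with location…")
    }
}
```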

Current AI Applications in Public Health

Recently, the World Health Organisation (WHO) held a global conference on pandemic preparedness. Amongst the presentations were several on the role of AI in pandemic preparedness. These applications include using past mutations of the virus that causes COVID-19 to predict future changes to the region of its spike protein that binds the human ACE2 receptor. Changes in this binding region help define new variants; predicting them allows researchers to anticipate and prepare vaccine modifications. Another was the use of AI to identify therapeutics with a high probability of effectiveness against novel pathogens. The idea here is to expedite appropriate animal and human clinical trials, helping researchers avoid the slower, conventional trial-and-error discovery process.

Yet another example was using AI to make epidemiological projections of global hotspots. Through developing these models, we can prepare logistical interventions for those sites. In this way, health providers can offer the most efficient, cost-effective and life-preserving approach to intervention. Next we have the use of AI to model and predict which classes of pathogens (viral, bacterial or other) have the greatest potential for spread from animals to humans, directly or indirectly (via an intermediary species), in order of likelihood and possible timelines. Yet another was mapping epidemics in real time, including the rapid identification of high-risk groups for priority intervention.

These examples provide insights into where a company like Apple, which places a premium on health applications, might lean.

Acute Injury

Apple’s Fall Detection and Crash Detection address two events responsible for a large fraction of traumatic injury worldwide, and they are a great start. We can also teach AI to detect other abnormal patterns associated with acute illness or injury. For example, targeted programming and machine learning could teach our devices to read the changes in peripheral body temperature and heart rate most commonly associated with shock, whether from blood loss or other conditions, and query the user on how they feel. If needed, the device could go on to place an automated call to emergency responders after a failure to respond. This assessment could also include heart rhythm.
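
As a purely illustrative sketch of what such a rule might look like, the snippet below compares a baseline reading against a current one and flags the falling-skin-temperature, rising-heart-rate pattern described above. The data types and thresholds are my own assumptions, not validated clinical criteria.

```swift
import Foundation

/// Illustrative only: a naive rule for the pattern described above
/// (falling peripheral temperature with a rising heart rate).
/// Thresholds are assumptions for the sketch, not clinical values.
struct VitalsSample {
    let skinTempCelsius: Double
    let heartRateBPM: Double
}

func possibleShock(baseline: VitalsSample, current: VitalsSample) -> Bool {
    let tempDrop = baseline.skinTempCelsius - current.skinTempCelsius
    let hrRise   = current.heartRateBPM - baseline.heartRateBPM
    return tempDrop > 1.5 && hrRise > 30       // assumed thresholds
}

let baseline = VitalsSample(skinTempCelsius: 33.5, heartRateBPM: 72)
let now      = VitalsSample(skinTempCelsius: 31.6, heartRateBPM: 118)
if possibleShock(baseline: baseline, current: now) {
    print("Pattern consistent with shock detected; prompting user…")
}
```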

Using Cameras and AI to Improve Public Health

Another possibility is stroke detection. A fall during a stroke would already invoke Fall Detection where it is enabled, but Apple devices also have cameras, and we spend a good deal of time staring at our screens with a front-facing camera pointed at us. There are several ongoing studies using eye movement to predict and/or diagnose stroke. If and when these methods are validated and standardized, Siri could use them for even more life-saving potential.

The operating system could enable the camera to periodically train on the face and torso of the user sitting in front of it. Abnormalities of eye movement, or of facial or limb symmetry, could provoke a query from Siri to elicit speech from the user. Confused, slurred or aphasic speech patterns, alone or accompanied by these other signs, could prompt Siri to notify emergency responders and provide the user’s location.
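
For a crude illustration of how an app might begin to approach the facial-symmetry piece using Apple’s existing Vision framework, the sketch below compares the vertical positions of the two mouth corners returned by face-landmark detection as a rough proxy for facial droop. The threshold is an assumption, and real stroke screening would require validated models and clinical oversight.

```swift
import Vision
import CoreGraphics

/// Sketch, not a validated screening tool: uses Vision face-landmark
/// detection to compare left/right mouth-corner heights as a crude
/// proxy for facial asymmetry. The threshold is an assumption.
func checkFacialAsymmetry(in image: CGImage, completion: @escaping (Bool) -> Void) {
    let request = VNDetectFaceLandmarksRequest { request, _ in
        guard let face = (request.results as? [VNFaceObservation])?.first,
              let outerLips = face.landmarks?.outerLips else {
            completion(false)
            return
        }
        let points = outerLips.normalizedPoints
        guard let leftCorner = points.min(by: { $0.x < $1.x }),
              let rightCorner = points.max(by: { $0.x < $1.x }) else {
            completion(false)
            return
        }
        // A large vertical offset between mouth corners -> flag for follow-up.
        let droop = abs(leftCorner.y - rightCorner.y)
        completion(droop > 0.08)                   // assumed threshold
    }
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```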

Abnormalities of eye movement are also related to neurodegenerative diseases like Parkinson’s Disease. On-device front-facing cameras could pick up many of these types of movements. Aided by AI, they could potentially detect the tell-tale abnormal eye movements associated with these conditions. In this way, the iPhone could provide users with an early warning of disease onset. Those with predisposing conditions, such as previous brain trauma, and/or those with a family history, could enable such a feature.

Further Use of Accelerometers

Another option, possibly using a watch’s accelerometers, could be tonic/clonic seizure (convulsion) detection. This could involve identifying abnormal, spastic and violent movement of the extremities and torso. Such a capability would aid those who suffer from this class of epilepsy by notifying family members and/or first responders during more prolonged seizures, which can result in permanent brain damage without intervention. These uses of cameras and accelerometers target just four common sources of acute morbidity and mortality, and with little modification to its sensors beyond new built-in detection algorithms, Apple could address them with AI-enabled crisis intervention. Most of these applications should be feasible in the very near term.
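
Returning to the accelerometer idea for a moment, the sketch below shows how a naive first pass at convulsive-motion detection might look, estimating the frequency and amplitude of wrist movement over a short window. The frequency band and amplitude threshold are assumptions for illustration, not clinically validated parameters.

```swift
import Foundation

/// Illustrative sketch only: flags sustained, high-amplitude rhythmic wrist
/// motion in a window of accelerometer magnitudes (in g). Real seizure
/// detection would need validated models; these thresholds are assumptions.
func looksLikeConvulsiveMotion(magnitudes: [Double], sampleRateHz: Double) -> Bool {
    guard magnitudes.count > Int(sampleRateHz * 5) else { return false }   // need ~5 s of data

    // Count zero-crossings around the mean to estimate oscillation frequency.
    let mean = magnitudes.reduce(0, +) / Double(magnitudes.count)
    var crossings = 0
    for i in 1..<magnitudes.count where (magnitudes[i] - mean) * (magnitudes[i - 1] - mean) < 0 {
        crossings += 1
    }
    let windowSeconds = Double(magnitudes.count) / sampleRateHz
    let frequencyHz = Double(crossings) / (2 * windowSeconds)

    // Amplitude: peak deviation from the mean.
    let peak = magnitudes.map { abs($0 - mean) }.max() ?? 0

    // Assumed: rhythmic motion of roughly 2–8 Hz with large excursions.
    return (2.0...8.0).contains(frequencyHz) && peak > 1.0
}
```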

Immediate Preventive Intervention

There are conditions that physicians monitor to predict impending illness or catastrophic events. If, for example, Cupertino succeeds in adding glucose monitoring to the Apple Watch, it could monitor blood glucose to alert the user to potential new-onset diabetes. It will undoubtedly be used to aid known diabetics in monitoring their glucose levels in order to stave off diabetic ketoacidosis (DKA), a life-threatening condition. For many people, DKA is their first alert to new-onset diabetes, often with fatal consequences from delayed intervention.
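
For illustration only, here is a minimal sketch of the kind of threshold logic such alerts might start from. The numbers and categories are assumptions, not medical guidance; a shipped feature would rest on clinically validated models.

```swift
import Foundation

/// Sketch under assumed thresholds (not clinical guidance): inspects a
/// series of glucose readings (mg/dL) and flags values that warrant a prompt.
enum GlucoseAlert {
    case none
    case possibleNewOnsetDiabetes   // persistently high readings in a new user
    case possibleDKARisk            // very high reading in a known diabetic
}

func evaluate(readingsMgPerDL: [Double], knownDiabetic: Bool) -> GlucoseAlert {
    guard let latest = readingsMgPerDL.last else { return .none }
    let persistentlyHigh = readingsMgPerDL.suffix(6).allSatisfy { $0 > 200 }   // assumed

    if knownDiabetic && latest > 300 {            // assumed DKA-risk threshold
        return .possibleDKARisk
    }
    if !knownDiabetic && persistentlyHigh {
        return .possibleNewOnsetDiabetes
    }
    return .none
}
```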

Adding More Sensors to Increase What Apple Watch Can Monitor

A feature that would require a new sensor could have multiple applications: odor detection. There is an entire industry devoted to training canines to sniff out a variety of acute infectious and chronic illnesses, exploiting the body’s release of unusual organic chemicals known as volatile organic compounds (VOCs) that are associated with a number of illnesses. These include acute infections (including COVID-19), metabolic and genetic disorders, and environmental exposures.

Adding the detection of a VOC like acetone to a separate glucose sensor on the wrist, for example, could aid in early detection of DKA in diabetic patients. Currently, VOC detection requires large, cumbersome apparatus and multiple steps. Apple would need to adapt these in creative ways, relying heavily on AI for probabilistic modeling. It might initially need to focus on a single volatile or class of volatiles, like ketones, to address a high-priority disease, like diabetes. Start small, get one important volatile right, with huge benefits, and expand from there as needed.
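
As a toy example of that probabilistic modeling, the sketch below combines a glucose reading with a hypothetical acetone measurement into a single DKA-risk score via a logistic function. The weights are invented for illustration; a real model would be learned from clinical data and validated before use.

```swift
import Foundation

/// Toy sketch of the idea above: combine a glucose reading with a hypothetical
/// breath/skin acetone signal into a single DKA-risk score via a logistic
/// function. The weights are made up for illustration, not trained values.
func dkaRiskScore(glucoseMgPerDL: Double, acetonePPM: Double) -> Double {
    // Assumed weights; a real model would be learned from clinical data.
    let z = -8.0 + 0.02 * glucoseMgPerDL + 1.5 * acetonePPM
    return 1.0 / (1.0 + exp(-z))
}

let risk = dkaRiskScore(glucoseMgPerDL: 350, acetonePPM: 2.4)
print(String(format: "Estimated DKA risk: %.0f%%", risk * 100))
if risk > 0.7 {                                   // assumed alert threshold
    print("High-risk pattern; prompting user and suggesting urgent care.")
}
```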

Sensors to detect these volatiles, whether exhaled in human breath, exuded from the skin, or present in sweat, urine and faeces, would require modification to accommodate a wrist- and/or phone-borne application. But that is the kind of problem that a well-resourced and imaginative company like Apple could elegantly solve, thereby potentially saving millions of lives per year.

Most of these AI uses would involve longer-term projects, investment and testing. The dividends, measured in disability-adjusted life years (DALYs) averted, would be incalculable.

Long-term Prevention and Participation in Prospective Cohort Studies

Amongst the current uses of the Apple Watch is the opportunity to participate in a number of ongoing prospective studies. Heart health, sleep studies, and studies related to women’s health are all on offer. As sensors become more capable, the range of these studies will substantially increase. For the types of measurements that the Apple Watch can provide, these sensors will give physicians and scientists more reliable, higher-quality data, with multiple samples over time, and without the “white coat syndrome” that can adversely affect such measurements in a clinic. Moreover, these data can be collected throughout the day and correlated with different points along our circadian rhythms for better predictive modelling.

Volunteers should soon be able to participate in other forms of disease and outbreak mapping exercises to improve our models of how, and the speed at which, communicable diseases spread, as well as detect other trends in both communicable and non-communicable diseases. Medical researchers and developers can use these, in turn, to identify unsuspected environmental exposures earlier than is currently done. The possibilities are limited only by our device capabilities and our imaginations. 

Conclusion

Tim Cook has stated that Apple’s greatest contributions will be in health. With these few examples of potential future AI-enabled applications, in addition to those already available, this is no exaggeration. With all of its devices switching to Apple’s proprietary SoC chipsets, each equipped with on-device machine learning hardware, every one of these machines can be pressed into service with user consent. User consent is, of course, critical. Thus far, Apple has been consistent in using opt-in for such features, and in matters of health, this is an essential ethical requirement.

Having switched its entire lineup to its ARM chipsets, Apple has the capacity to creatively provide interventions against acute injury; immediate, near-term and long-term prevention; and early discovery of disease trends using numerous device combinations. Given the worldwide installed base of Apple users, Apple may have a global reach that few, if any, can match. This can provide access to higher-quality data from under-represented populations and under-appreciated diseases and injuries than at any point in human history.

While this provides Apple with an unprecedented opportunity to better global health, if the impact is anywhere near as far-ranging as suggested above, it will raise an ethical question: whether access to such technology should be considered discretionary or essential, particularly for those most at risk. Perhaps by that point, with proof of principle in hand, economies of scale will have made these technologies, like today’s generic smartphones, accessible to the masses, thus avoiding that debate altogether.
