Grief Tech? Digital Twin?

My sons are conversant with Artificial Intelligence (AI); my older son works with machine learning and AI every day. Most of us have watched AI headlines grow at an explosive rate as the technology is applied to more and more contexts.

Now there is news about AI in palliative care, and specifically in bereavement care: a new and controversial practice of creating a virtual version of a deceased loved one. Called “Grief Tech,” it uses a lifetime of digital data to make the life-like replica interactive.

Tools can be ethically neutral; how they are used can be a different matter. Dr. Alan Wolfelt, director of the Center for Loss & Life Transition in Fort Collins, Colorado, advised that “those who are grieving a loss should listen with caution. The key is that as you do that, you’re still acknowledging their death… And you are not listening to their voice 24 hours a day.”

Would the virtual presence of a loved one prolong grief, soothe the pain, or both? Every person grieves in their own way, so perhaps what is harmful for one is helpful to another. We have long reminisced over two-dimensional photographs; what makes a 3D representation so different? Interaction certainly takes things to a different place.

The interactive “death tech” companies are studying how to create a “digital afterlife.” AI blogger Ginger Liu says these companies are trying to create a digital twin based on a person’s knowledge. She points out that what we truly recognize in another person is their essence, and our essence is far more than our knowledge. AI poses questions about the attributes that make us exactly who we are.

Consent and data ownership also become thorny. The death tech companies will charge fees for capturing a digital twin. But what happens to that data if a family no longer wants to pay to view the uniqueness of an individual? Liu says, “By the time photography was affordable to the masses in the late 19th century and throughout the 20th century, family snaps filled photo albums, and often of deceased relatives. These albums were vessels of remembrance, identity, and storytelling. Death tech companies like StoryFile represent the recent shift from photography to AI as tools for remembering the deceased.” The implication is that these will be tools for the wealthy, at least until Moore’s Law kicks in.

AI is also being tested for detecting pain in patients’ faces. Bedside clinicians and physicians have always observed non-verbal signals of pain, particularly when the patient lacks the verbal capacity to express it. The researchers developed the AI initially using the Critical Care Pain Observation Tool (CPOT). The tool looks at facial expression, body movements, and muscle tension, but comes with “pearls and pitfalls.” The original study included cardiac patients who were relatively healthy, and likely without the kind of pain experienced by someone with pancreatic cancer or late-stage MS. That original CPOT study had other limitations.
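
For readers curious what a structured observational scale looks like in code, here is a minimal sketch of CPOT-style scoring in Python. The behavioral domains and their 0–2 point ranges follow the published CPOT; the class name, helper methods, and cutoff value are illustrative assumptions, not part of any study’s software.

```python
from dataclasses import dataclass

# Illustrative sketch of CPOT-style scoring, not any study's actual code.
# The domains and 0-2 point ranges follow the published Critical Care
# Pain Observation Tool; names and the cutoff below are assumptions.

@dataclass
class CpotObservation:
    facial_expression: int  # 0 relaxed, 1 tense, 2 grimacing
    body_movements: int     # 0 absent, 1 protective, 2 restless
    muscle_tension: int     # 0 relaxed, 1 tense/rigid, 2 very tense/rigid
    vocalization: int       # 0 normal, 1 sighing/moaning, 2 crying out
                            # (ventilator compliance is scored instead
                            # for intubated patients)

    def total(self) -> int:
        return (self.facial_expression + self.body_movements
                + self.muscle_tension + self.vocalization)

    def suggests_pain(self, cutoff: int = 2) -> bool:
        # A total above 2 (out of 8) is a commonly cited signal that
        # pain may be present and warrants clinical follow-up.
        return self.total() > cutoff


obs = CpotObservation(facial_expression=2, body_movements=1,
                      muscle_tension=1, vocalization=0)
print(obs.total(), obs.suggests_pain())  # -> 4 True
```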

The AI approach to pain observation hinges in part on staff shortages. Medscape author Jake Remaly says, “Clinicians typically must be with patients to complete those measures, however, which is not always possible in a busy hospital or nursing home, and this can lead to delays in administering care. That's one reason researchers are training artificial intelligence (AI) models to take up the task.”

Patient-reported measures still beat AI, whose readings match them only 66 percent of the time. Patient-reported pain captures existential suffering: the emotional, psychological, social, and spiritual components of pain. But non-verbal patients cannot name their pain overtly.

Heat maps of facial expressions of people in pain are used to develop the algorithms. One characteristic of all forms of AI or machine learning is the iterative ability to become increasingly accurate as data is fed back in with each repetition. When compared with the results from the CPOT, the AI’s readings were 88 percent accurate.
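
As a rough illustration of that feedback loop, the sketch below trains a tiny logistic-regression classifier by repeatedly feeding labeled examples back through it. The features stand in for summarized facial heat-map values and the labels for CPOT judgments; every number here (features, labels, learning rate) is invented for illustration, and real systems train far larger models on actual images.

```python
import math
import random

# Toy illustration of the iterative loop that lets a model improve as
# labeled data is fed back in. Features stand in for summarized facial
# heat-map values; labels stand in for CPOT "pain present" judgments.
# All values are invented for illustration only.

random.seed(0)
data = ([([random.gauss(1.0, 0.5), random.gauss(0.8, 0.5)], 1) for _ in range(50)]
        + [([random.gauss(-1.0, 0.5), random.gauss(-0.8, 0.5)], 0) for _ in range(50)])

weights = [0.0, 0.0]
bias = 0.0
lr = 0.1  # learning rate, an arbitrary choice for the demo

def predict(x):
    z = sum(w * xi for w, xi in zip(weights, x)) + bias
    return 1.0 / (1.0 + math.exp(-z))  # probability that pain is present

for epoch in range(20):              # each pass over the data refines the fit
    for x, label in data:
        error = predict(x) - label   # feed the labeled outcome back in
        for i in range(len(weights)):
            weights[i] -= lr * error * x[i]
        bias -= lr * error

accuracy = sum((predict(x) > 0.5) == bool(label) for x, label in data) / len(data)
print(f"training accuracy after 20 passes: {accuracy:.0%}")
```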

Medscape author Jake Remaly continues: “An Australian company, PainChek, recently announced plans to study its pain assessment tool in nursing homes in the United States in an effort to obtain regulatory clearance. The tool has been studied in infants and in adults and already has received clearance in the European Union, the UK, Canada, and several other countries.”

For the US study, the Australian researchers will enroll 100 nursing home residents and compare PainChek ratings to the Abbey Pain Scale for dementia patients.

Physicians are voicing caution. Patients may not show their pain, and studies have found that clinicians tend to underestimate it. Pain is an exceedingly personal and complex issue, and one hopes that clinicians will use these new tools with care.

Palliative care often intersects with ethical concerns, and the use of AI is likely to heighten the need for ethical frameworks to govern these tools. The experience of serious illness is ever so human.

Pat Justis, Washington State Department of Health