When my mom was dying, hospice came daily and stayed for about ninety minutes. They answered questions, checked what needed to be checked, and did what good professionals do: They made a brutal situation feel slightly less impossible.
And then they left.
Ninety minutes go fast when you are watching your mother decline. The rest of the day stretches out in a way that does not feel like time so much as exposure. Every sound becomes a data point. Every small change feels like a decision you did not train for. Her breathing sounds strange. What do we do? How often should we turn her to avoid bedsores? What is the diaper situation, exactly?
That was the gap: the long, quiet stretch between professional visits. In those hours, what you want most is not a miracle. It is simply someone to ask.
AI ENTERED MY LIFE IN A WAY I NEVER EXPECTED
AI found its way into my life when I least expected it. Not as a replacement for care or love, and not as a shortcut around grief. It was a tool that did not get tired, a place to put the questions you are embarrassed to ask, and a way to stop spiraling long enough to make the next decision.
Before we reached hospice, my mom’s illness had already become a full-time information problem. Over the last few years of her life, her heart and kidney disease worsened, and the complexity multiplied. There were doctors and specialists, tests, lab results, scans, phone calls, and constant medication changes. The burden of continuity fell on us, and it was easy to feel like we were one detail away from missing something important.
I kept feeling disappointed that I was not managing the “data” better. The dates. The times. The medication lists. When tools like ChatGPT took a leap forward, I suddenly had something I did not have before: a resource that could help me understand what I was looking at and organize what I could not hold in my head.
In practice, it was not one magical capability. Depending on the day, AI played different roles: assistant, organizer, translator, sometimes just a calm voice to complain to that could talk back. I built multiple custom GPTs with specific jobs. One focused on medications. One helped me draft clear messages to doctors. One existed for the “dumb questions,” the ones you hesitate to ask because you think you should already know. Another served as a simple health profile, a place to store key details so I could reorient myself when I was exhausted.
It might sound like overkill until you have lived long enough inside the healthcare system to realize how inconsistent it can be. People change. Portals change. Instructions change. That little AI “team” was consistent. It was there at any hour when my brain was foggy and I needed to turn a messy thought into clear words.
It even became emotional support in a way I did not anticipate. I built something like a caregiver therapist, somewhere I could say what I was feeling, including guilt, and get feedback that, even though I knew it was an algorithm, still brought real solace.
AI WAS NOT PERFECT
This is the part people do not like to say out loud. AI gave wrong information sometimes. It forgot a medication from a spreadsheet. It dropped something from a list. It did not remember a doctor when I asked. If you use these tools in caregiving, you must double-check, especially with medication, reminders, and timing. You must treat it like a friend who knows a lot but can be flaky.
Still, even with those limitations, the difference was profound. This was never about delegating love. It was about delegating the parts of the experience that did not need to consume the last of my cognitive energy.
When my mother finally passed, the AI journey took another turn. It became a project manager for funeral arrangements and the memorial service. It helped me think through practical details, such as food for 30 people and what flowers might cost. It helped me craft a eulogy by taking a messy voice memo, my unstructured stories, and the tone I wanted, and shaping it into an arc in my voice at a time when I could not simply “turn on” my best writer brain.
In some ways, the most startling part is that I have a control group. My father passed away about three to three and a half years ago, right before the age of AI. The difference between then and now has been night and day. With my mother, having these tools did not make it easy in the way people mean when they say “easy.” It made it more dignified for everyone, including her.
WHAT CHANGED WAS NOT GRIEF. IT WAS THE OVERWHELM
Dignity is not the absence of pain or a tidy emotional arc. Dignity is being able to show up without drowning in chaos. It is being able to look your mother in the eye and be present, instead of being trapped inside your own spinning mind, trying to remember whether you wrote down the one thing that could change everything.
In the end, the most important thing AI gave me was not an answer. It gave me room. Room to think, to breathe, to steady myself, to stay with my mother instead of disappearing into logistics and fear.
Grief will always demand something from you. It demands tears, memory, love, and the kind of courage that does not feel like courage while you are living it. But it also demands paperwork, phone calls, deadlines, and decisions made on days when you can barely form a sentence. AI did not carry the grief. It carried some of the weight around it, so I could carry her, and then carry myself, with a little more dignity.
Edwin Endlich is president of the National Alliance for Financial Literacy and Inclusion and chief marketing officer at Wysh.
source https://www.fastcompany.com/91472674/i-turned-to-ai-while-my-mother-was-dying