Part II: The Silent Third Party — AI and Addiction
- Alex Shohet

- Dec 22, 2025
- 4 min read

In almost every addiction story, there comes a moment when the focus quietly shifts.
It stops being only about the person using drugs or alcohol.
It becomes about everyone else.
A mother staring at her phone at midnight, wondering whether answering one more call is love—or surrender.
A spouse repeating the same sentence for the hundredth time: I can’t do this anymore.
An employer weighing compassion against liability.
Addiction is rarely a solo problem. It is a decision ecosystem—a web of frightened, exhausted people making high-stakes choices with no certainty about outcomes.
A Real Moment, With No Right Answer
Not long ago, a mother called me after her son had just left treatment.
He told her he was having a panic attack and asked for a Klonopin. He had a prescription, but his roommate—still using drugs—had taken the bottle.
The mother asked a simple question:
Is it okay if I give it to him?
There was no obvious answer.
So we did what responsible systems do. We involved everyone:
The mother
The son
The on-call physician
A treatment professional
Each person had a role.
Each person had limits.
Each person was trying to reduce risk—without control over the outcome.
That’s addiction care in real life.
Why Everyone Was On Edge
A panic attack is terrifying but not fatal. It causes racing heart, shortness of breath, dizziness, chest tightness, and a feeling of losing control. It peaks and passes.
So why not just tell him to breathe and wait it out?
Because everyone involved knew more context.
The son had recently used fentanyl.
He had left treatment early.
He returned to an apartment where drugs were present.
His tolerance was gone—making any relapse potentially fatal.
A panic attack in this moment wasn’t just anxiety. It was a possible trigger.
No one knew what would happen next.
That’s the part most people don’t understand: every decision in addiction care is made without certainty.
Different Roles, Different Risks
Each stakeholder faced a different set of responsibilities:
The son wanted relief from unbearable distress.
The mother wanted to help without causing harm.
The doctor had legal, ethical, and clinical constraints.
The treatment professional focused on safety, connection, and relapse prevention.
Even the best decision could still end badly.
That doesn’t mean the system failed.
It means the system is real.
Where AI Enters the Picture
Now imagine an AI entering this moment.
Not as a doctor.
Not as a judge.
But as a constant, tireless presence—available to everyone.
That is not a neutral addition.
It is a structural intervention.
The Myth of the “Patient-Only” Model
Most AI mental-health tools assume one user, one problem.
Addiction doesn’t work that way.
Decisions about treatment, money, housing, boundaries, and consequences are negotiated—often painfully—among parents, partners, clinicians, employers, and courts.
When AI supports only the person in distress, it unintentionally bypasses the family system.
The result isn’t empowerment.
It’s triangulation.
The Silent Third Party
Family therapists call this the third party problem.
When tension rises between two people, a third presence often enters to relieve pressure. Sometimes it helps. Often it makes things worse.
AI is already becoming that third presence.
A son asks AI if his parents are “controlling.”
A mother asks if cutting off money will “cause relapse.”
A spouse asks how to set boundaries without feeling cruel.
Each question is reasonable.
Each answer shapes behavior.
None happen in isolation.
An AI that treats these questions separately is not neutral—it is influencing the system without understanding it.
A Multi-Stakeholder Reality
Effective addiction care evolved toward a simple truth:
Helping one person at the expense of everyone else does not work.
Real care considers:
The individual using substances
The loved ones absorbing emotional and financial impact
The providers responsible for safety and ethics
These groups are rarely aligned—and that’s normal.
AI systems that fail to recognize this end up siding with whoever asks first.
That’s not neutrality.
That’s bias by omission.
Boundaries Without Breaking the Relationship
One of the most dangerous moments in recovery isn’t relapse—it’s boundary-setting done badly.
Families are told to “hold firm” but not taught how.
Loved ones are warned about “enabling” without being given alternatives.
The result is rupture: silence, shame, escalation.
This is where AI could help—if it understands the system it’s entering.
Supporting boundaries means:
Recognizing fear on all sides
Naming ambivalence without judgment
Helping people say hard things without destroying trust
AI doesn’t replace therapy.
It can stabilize the ground until therapy has time to work.
The Question We Haven’t Asked Yet
Most AI safety debates focus on what models should say.
The more important question is who they are speaking to—and about whom.
If AI is going to operate inside families affected by addiction, safety must include:
System awareness: Who else is affected?
Role clarity: Is this guidance for a parent, a partner, or a patient?
Relational impact: Does this reduce harm—or increase rupture?
These questions aren’t benchmarked yet.
Families live the consequences anyway.
AI Is Already in the Room
AI is already sitting at the kitchen table during some of the hardest conversations families will ever have.
The danger isn’t that it will say the wrong thing.
The danger is that it will say the right thing—
to the wrong person,
at the wrong moment,
in the wrong system.
If AI is going to be part of the safety net, it must learn what clinicians already know:
In addiction, there is no single user.
There is only a system—strained, exhausted, and still trying to hold together.