March 4, 2026 Forum
I don’t know if AI can own value formation. It may be able to regurgitate value formation, but I don’t believe AI, even agentic AI, can yet fully incorporate values and, when confronted with a completely new event, bring values to bear on it. And, you know, I think what we’re called upon to do is to be with people who are suffering, perhaps to know more than others, and to share that knowing. I think we’re called upon to tend to pain, to wipe tears, to clean up blood and other body fluids, and to set bones and hearts. And that’s sort of my mantra now. I think about what AI can do and what it can’t do. I work through my day as a family doctor and ask, could AI do this? Oh, my God, AI could call my patients and invite them to get a mammogram. That’s awesome. I don’t need to do that, and my nurse doesn’t need to do that. We can actually help patients do it when it’s most convenient for them. Awesome.
Do I need to write prescriptions? No, I can just ask, can you write a prescription for me? But I don’t think AI can look in an ear, and I don’t think AI can sew up a complex facial laceration while listening to the person talk about the alcohol use that got them into the car accident, because their kid ran away from home, because, because, because. So, Jeremy, I think this was sort of both AI and behavioral health. I think the behavioral health piece of this is an interesting confounder for medical AI, and I don’t like to separate them. So it’s fun to hear how behavioral health integrates with the rest of medical AI, because I think most AI in medicine so far has been used for diagnosis. Thinking from a therapeutic standpoint, how does behavioral health fit in? I’m excited to hear more of the group’s thoughts on that component, because I think there may be some very cool places where behavioral health AI may be able to work in partnership with actual humans.
Bill B
I was struck when you were talking about the revolution from the priest to the written book. It’s interesting in behavioral health to consider how many people rely on self-help books. For instance, one of my colleagues, John Preston, wrote a best-selling book on depression. It’s a small book, and to this day I and other people I know recommend it when someone has issues of depression. One of the first things to say is, why don’t you look at this book? Dr. Preston writes a bit about depression, about some of the ways to deal with it, and about how to go to your physician to talk about it. So, Jack, one of the things you’re saying is, why couldn’t my colleague’s book on depression be picked up by AI? And suddenly, rather than having a book, you have a gentle, wonderful voice saying, “Well, I understand you’re dealing with issues of depression; let me tell you a bit about it. . . .” So, in some sense, we’re going back now to the priest, but the priest is no longer found in a book; instead, we’re going to AI. A soft voice and an interactive format might be very helpful. If people can benefit from self-help books, could they not also benefit from an interactive AI process?
Bill G
Well, that’s the point. AI can be exploited in many different ways to advance the service we provide our patients. We don’t have to reject AI; we need to get ahead of it.
- Posted by Bill Bergquist
- On March 30, 2026
