
Why a Culture of Patient Safety Matters: Part 2 [PODCAST]


The Hospital Finance Podcast

In this episode, we welcome back Dr. David Feldman, Chief Medical Officer for The Doctors Company and TDC Group, for part 2 of our conversation about why a culture of patient safety matters.

Learn how to listen to The Hospital Finance Podcast® on your mobile device.


Highlights of this episode include:

  • Economic benefits of establishing wide-ranging patient safety measures
  • The most important step healthcare can take to maximize patient safety
  • AI in EHR systems
  • Establishing patient safety as an industry-wide concern
  • Parting thoughts

Kelly Wisness: Hi, this is Kelly Wisness. Welcome back to the award-winning Hospital Finance Podcast. We’re pleased to welcome back Dr. David L. Feldman, Chief Medical Officer for The Doctors Company and TDC Group. In addition, he is senior vice president and chief medical officer at Healthcare Risk Advisors, or HRA, a strategic business unit of TDC Group. Previously, Dr. Feldman was vice president for patient safety, vice president of perioperative services, and vice chairman of the department of surgery at Maimonides Medical Center in Brooklyn, New York. Under his leadership, HRA provides resources and a collaborative environment designed to minimize claims and lower premiums for HRA clients by preventing patient harm, enhancing teamwork and communication, and improving documentation. Dr. Feldman currently serves on the steering committee of the American College of Surgeons for retraining and retooling of practicing surgeons. Dr. Feldman received a Bachelor of Arts degree and Doctor of Medicine degree from Duke University, completed training in general surgery at the Roosevelt Hospital, now Mount Sinai West, and plastic surgery at Duke University Medical Center. He earned a Master of Business Administration degree from New York University. He’s back to continue our discussion around why a culture of patient safety matters. In part one of our discussion, we laid out the problem. In today’s part two, we will dig into what we do about it. Thank you for joining us again, Dr. Feldman.

Dr. David L. Feldman: Thank you so much, Kelly, for having me back. I appreciate it.

Kelly: Hey, it’s great to have you back. And today, we’re going to jump right into it again. We’re looking at what would be the economic benefits of establishing wide-ranging patient safety measures.

Dr. Feldman: Well, some of this is sort of obvious, Kelly. Some of it, maybe not so much. There's this constant tension that those of us in patient safety feel about how we prove to finance people that improving quality actually reduces costs, and part of that is the timing. Typically, some of the projects that we get involved with take a while to show their results. We talked in our last podcast about the fact that there's a time delay and that culture change takes a while in healthcare, in particular. And finance people don't really have the patience to wait so long for this. So that's a very tricky thing to do. I think there are a couple of things to keep in mind. One, it's incumbent upon us who work in this field to try and demonstrate these things even if it's a number of years later. Our podcast is timely because we literally just released an article in the [inaudible] College of Surgeons that looks at a number of years of reduction in retained surgical items. That is, leaving things inside patients during surgery, which is one of those safety events that really should never happen. And what we did to try and reduce that was two things. One, training OR teams to work with each other better and communicate better so that these things wouldn't happen. And two, using a technology that actually puts a chip in the sponges so that at the end of the operation, you have a check in addition to the typical, traditional way of preventing retained items, that is, counting the sponges, the instruments, the needles, all those things.

But in addition to the counting, the soft goods, as we call them (the sponges, the lap pads, towels, and so forth), had little chips in them, and you wave a wand over the patient, and it will detect if there's a chip left in the patient. So, with the team training and the radio frequency technology together, we looked over literally a 10-year period, almost a million operations across all of our institutions here in New York, and saw a 50% reduction in the number of retained surgical items. And of course, along with that, a reduction in the number of lawsuits. Now, those are costly things to have happen. As we mentioned in the last episode, you don't get paid for that. When that happens, there's additional care. There may be a lawsuit, which is also expensive. But it took us a long time to do this, so I think we need to, number one, make sure that we let people know that this takes a while. And number two, do the research so we have the data to show people that it was worth spending the money. These chips obviously cost more. You have to buy sponges with chips, and there's a cost to that. I don't think anybody at this point would argue that the cost isn't worth it. But there are still places that don't have these kinds of things, and I think this data helps in convincing them. The other thing I would mention, which I think is really interesting and a helpful way to engage with our finance folks: I've really been impressed with the financial folks who work in hospitals, especially in many of the hospitals that I work in, where so much of the income is really out of our control.

It's Medicare and Medicaid, and those fee schedules are determined, obviously, by the states and the federal government. They change from year to year. The fact that a finance expert would work in a hospital, which is really a difficult place to be, to me means that they really are interested in doing their part to improve patient care, which I think is great. And all of these folks that I've worked with, to a person, have really been terrific and very engaged. Why not bring them into some of these clinical situations? When things go wrong in a hospital, we often do what's called a root cause analysis. It's a pretty intensive review of what happened. Why not invite your finance folks to those meetings and have them listen to what went on in this particular operation, in this case? And why would that be? Because invariably, at the end of this analysis, we usually develop what's called a corrective action plan. That is, what are we going to do to prevent this from happening again? And often that requires resources. It requires money. If you leave a sponge in somebody, somebody might say, well, why don't we put the chips in the sponges? Well, who's going to pay for that? So, a few weeks later, when the folks who are assigned to address these corrective actions go back to talk to the finance department, wouldn't it be nice if the finance people had been sitting in that room and had heard about this case? I think it makes a compelling argument for them to perhaps direct some of the resources to help prevent these things. So, I think there's generally a way to establish this. It's not easy; nobody said it's easy. But those are some of the things I would suggest people focus on to get the attention of the financial folks who really hold the purse strings and the funds to make some of this happen.

Kelly: Those are some really great ideas. Dr. Feldman, would this have a wider impact on patient care generally?

Dr. Feldman: Well, I think so. I think the general feeling is that if you can improve patient care, if you can reduce the number of things that go wrong, people are going to be much more satisfied with the system. They're more likely to engage with the healthcare system. A lot of patients are afraid to come to a hospital, to see a doctor, to have an operation, because they hear stories from friends and family about when things go wrong. To know that we've reduced the likelihood of these things happening, I think, makes it more likely for them to seek care, to have their routine colonoscopies, mammograms, those kinds of things. And then there's this whole tension with the current system, most of which is a fee-for-service system. We pay healthcare providers based on what they're doing. And unfortunately, when there is an adverse event, when something goes wrong, most of the time (not always, but most of the time) you get paid more for taking care of that patient, which is, incentive-wise, a little bit perverse. It's interesting. I'm a plastic surgeon, and in cosmetic surgery, which is not reimbursed by insurance, patients pay for it themselves. I would typically tell a patient that anything related to their cosmetic surgery afterwards, I would not charge them for.

I'm making a commitment to get them where they want to be even if things unexpectedly go wrong. Now, there are certain fees I wouldn't have control over, such as additional surgery fees or anesthesia fees. But I would not charge them more. So, it wasn't quite a guarantee. I guess you'd call it more of a warranty. And that's what happens in a value-based care system: you get paid for delivering quality care. So, in my cosmetic surgery example, if I do a successful tummy tuck, let's say, and I don't have to treat any complications, I don't have the additional work, and the patient's happy. It's a win-win. Value-based care has the same sort of idea: doctors and nurses and other healthcare providers get paid for keeping patients healthy. And that, I think, is something we're all hoping to head towards. But it's not so easy. And it is cost-efficient, if you think about it: you're incentivizing people to deliver higher-quality care.

Kelly: So true. Yes. What is the most important step that healthcare can take to maximize patient safety?

Dr. Feldman: Well, we talked about this last time. I think this applies to all of us responsible for patient safety, and there's a range here. I spend a lot of time with big hospitals, small hospitals, and some large groups. But a lot of healthcare is delivered in small places, small practices. The majority of healthcare in this country is still delivered outside a hospital, not in one. And more and more people are pushing that, because nobody wants to be in a hospital. So, assess where you are on those four elements we talked about last time, right? Developing a culture of respect, so you can start getting the folks involved with delivering care to talk to each other and be upfront with each other. And then focusing on getting people to work in teams together, even in an office. We did an interesting video on this a number of years ago here at Healthcare Risk Advisors, to see what team training looks like in an office setting where you might only have a handful of folks: a receptionist, maybe a nurse, a medical assistant, maybe an advanced practice clinician like a PA or a nurse practitioner, and a physician, maybe one or two physicians. How does that team work together? Do they huddle at the beginning of the day to plan what the day's events are going to be like? Do they have a debrief at the end of the day? Do they talk about how they can do things better?

That’s a really important way for even a small team to get engaged. And you can take those principles all the way up to large institutions where the teams are even bigger. And then what we talked about last time, the human factor side of this thing, have you done that? Have you thought about ways that you can use technology, use system fixes to make it easier to do the right thing and less likely for there to be a problem?

So, here's an example. As a surgeon, I tend to use surgical examples. A typical patient coming to a doctor's office has a problem with their knee. They've tried some physical therapy and so forth, and the orthopedic surgeon feels the patient needs to have an arthroscopy, an outpatient procedure to correct whatever they believe is wrong. So, they have the conversation with the patient. The patient walks out of the examining room, and then there's usually a conversation with somebody at the front desk, and there's a process for calling the ambulatory surgery center. There are about five or six different steps along the way, and there may be an opportunity for a right knee arthroscopy to become a left knee arthroscopy. We can all see how that would happen: a busy day, maybe the front desk person has five of these operations to book. Well, what if you gave the surgeon the opportunity to schedule the operation from his or her device, an iPad, an iPhone, whatever it is? Those devices are certainly ubiquitous. The doctor and the patient are sitting there, and the doctor says, "I think you need to have this left knee arthroscopy," and, in front of the patient, types left knee arthroscopy into the OR schedule on their iPad. That's one step, with the patient there. Think about how much more likely it is that when the patient shows up the day of surgery, the form will say left knee arthroscopy, the correct side. That's just a small example of how we can use technology that already exists to make things much more likely to go the right way and not the wrong way. Those are the kinds of things you want to incorporate into your practice. And then there's the whole just-culture principle: asking, do we really have the kind of culture where people are willing to speak up, to say things when things go wrong, or even when they almost go wrong?

Close calls. Many would say we should be studying close calls, when things almost go wrong, because there are more of those, so there's more data to look at. But you need people to be willing to point those things out. In a culture where people don't think they're going to be punished for saying those things, you're much more likely to be able to study them. So those are those four elements again. For people who are really interested in this, the most important step you can take is to assess where you are on those four things and then take baby steps. You can't do everything all at once, but try to achieve all of them, ultimately, to get to that safety culture.

Kelly: Dr. Feldman, I’m sitting here just shaking my head at you. Yes, yes, yes.

Dr. Feldman: I hope so.

Kelly: Yes. If you could see me, I'm over here just nodding away. Just great information and great tips. Is there anything new in patient safety that we should be aware of? For example, the use of AI in EHR systems, RFID in surgery, anything like that?

Dr. Feldman: Right. Well, I mentioned earlier the study we've just published on retained surgical items. That technology is not so new, but not everybody uses it. And I think that's a great way of matching, or marrying, if you will, humans counting and machines detecting to create what engineers would call defense in depth, to make it less likely that something will go wrong. Honestly, there are lots of basic things in patient safety. The two things that I always talk about, team training and simulation, have been around for a long time, but we still have places in healthcare where those things don't exist. We've done simulation programs with anesthesiologists, who really were the first to the patient safety table, if you will, and among the first around simulation, and with ICU folks and emergency medicine physicians, gynecologists and surgeons, obstetricians, all doing simulation programs. It's very interesting. When I was at Maimonides, I used to drive in from New Jersey every day, and people would say, "Well, do you warm up before your operation?" Many professionals warm up before their event, whether it's a musician or an athlete. And my warmup was an hour on the Staten Island Expressway. Patients probably don't want to hear that kind of thing. Instead, imagine if we had a warmup where you actually could do the operation on a simulator the day before, or an hour before. And we did this at Maimonides a number of years ago, where we gave our hospitals, including Maimonides, a simulator for vascular procedures, so that you can actually take the patient's CAT scan, put it into the simulator, and practice doing the procedure.

This was what's called an endovascular abdominal aortic aneurysm repair. So, you're repairing the abdominal aorta not through an open operation, but by putting a stent and then a device in through an artery in the groin. It's a much less invasive procedure, but you're looking at all of this radiologically. So, imagine if you could practice on this simulator before you did the operation. And this happened; it was actually in the local press. It was very cool. It's one case, but it's the idea that this is really how to optimize how we do things. It's not all that new, but not many people are taking the opportunity to actually use simulation and team training. And now the latest thing that we're involved with is something called an operating room black box, like an airline black box; it was sort of modeled after that. You gather all these different inputs from the things that are in an operating room and actually study them using some AI. Imagine you have four hours of all this data from the anesthesia machine, and you have cameras in the patient if it's that kind of surgery, you've got microphones, you've got temperature controls. If you really want to go all in, you can have a vest that the surgeon wears and monitor their heart rate and their breathing and so forth. And all this gets coordinated so that you can see, for every step of the operation, where things either went wrong or, maybe more likely, almost went wrong, so that you can do things to adjust them, to understand how we can make things better. That's something we've never had.

I think it's really the wave of the future, having cameras. There are cameras everywhere in our lives. Why don't we have them in operating rooms? And I do think that most of the time that's really a helpful thing for us to learn how we can do things better. So, there are lots of opportunities for this. There's the use of AI in understanding radiology examinations, not to eliminate radiologists, but to help them focus on the things that the humans need to do, so that they don't miss things they may not be focused on. It gets back to what we talked about earlier, this idea of using systems to help humans prevent errors, to keep humans from making mistakes. So, you need this combination of the human touch, the human sense of where things need to be, with a system to keep those humans from making mistakes.

Kelly: Completely agree. And those are some really cool ideas that you mentioned. So how do we establish patient safety as an industry-wide concern?

Dr. Feldman: Well, this podcast is one way we can do that, Kelly, right? We need visibility. We need the ability to get the word out, if you will, and, for better or for worse, that means things like podcasts and webinars. The pandemic has really caused us to communicate in different ways without being in person. But maybe that's not such a bad thing for this kind of approach, where you can work through social media and all these ways of getting to people quickly, right? You want to be quick; healthcare people don't have a lot of time. If you can get them snippets of information that they can find useful, and then give them the ability to bridge the gap between the clinicians, the physicians, the nurses, and others who usually understand this and have some ideas, and the administrators who run the systems and often hold the purse strings, then you can really start addressing some of these things in small steps. I'm a big believer in small steps that, I think, will ultimately, as we said earlier, reduce costs. It just makes sense for all the reasons we talked about.

Kelly: And any parting thoughts related to our discussion?

Dr. Feldman: Well, if I were to think about the main points of what we've talked about, I have to always start with respect. I think it's true not just in healthcare, but everywhere: people work better if they feel like they're being treated well. Nobody likes to go to work and be treated poorly, and that includes people who work in healthcare. So, all of us should really pay attention to that. And don't get me wrong, that is not easy to do in a high-stress environment, and healthcare is a high-stress environment. So, it's about understanding that we're all under stress and avoiding the human instinct, when we're under stress, to go to what we learned in college psychology: fight or flight. That does not work in this situation, but that's our instinct. So, overcoming those instincts, right? That would be the first thing I really focus on: how we can be nicer to each other.

And one of the things that I like to talk about comes from when I was at Maimonides, working with somebody there in quality improvement. What does the patient see? What does the patient want, right? The patient wants three things. They want to get better; that's why they're in the hospital. So: heal me, don't hurt me (don't make things worse), and be nice to me. I think those three things are the patient's view, and from our perspective, there's a translation of how we deal with those three things, right? Make sure that we're focused on quality, so that we get patients better, and don't do things that might cause harm. That's patient safety.

And of course, there's the whole respect piece about being nice to people. I think that's an interesting way of thinking about how patients think, and it allows us to tailor what we do when we treat patients. The other thing I would say, as a physician who comes from the malpractice world, is that it's much more likely for doctors to be sued when they don't treat patients well. There's been data about that for years. Doctors who draw more unsolicited patient complaints are more likely to be sued; more of those complaints predicts more lawsuits, and it predicts bad outcomes. So, there is a direct relationship between how we treat not only each other but also our patients, and lawsuits. And sometimes that's an incentive for physicians and others to be nice to people, and then to focus on all the things we talked about, those four elements of patient safety. It's certainly part of our mission at The Doctors Company to advance, protect, and reward the practice of good medicine. That is core to what we do and why we're so interested in helping others do the same, and why we're happy to help advance this mission through podcasts like this. Thank you for having me.

Kelly: No, it’s great. I love the foundation of kindness and respect. I think we could use a little bit of that everywhere, right?

Dr. Feldman: Yeah. Yeah. Not so easy today.

Kelly: No, you’re right, it isn’t. It’s lacking in a lot of areas. But Dr. Feldman, we’re so appreciative of all this fantastic information. If someone wants to get in touch with you, how best can they do that?

Dr. Feldman: I’d be happy to communicate by email. My email address is dfeldman, my first initial last name @tdchra.com (dfeldman@tdchra.com). Delighted to hear from listeners with additional questions, thoughts, always looking for new ideas. I think that’s the way we all get better by listening to each other and so happy to communicate with anybody who cares to reach out.

Kelly: Well, it was really a delight to have you on the show. Just thank you so much for joining us today, Dr. Feldman.

Dr. Feldman: Thanks for having me, Kelly.

Kelly: And we appreciate you all joining us for this episode of The Hospital Finance Podcast. Until next time.

[music] This concludes today’s episode of the Hospital Finance Podcast. For show notes and additional resources to help you protect and enhance revenue at your hospital, visit besler.com/podcasts. The Hospital Finance Podcast is a production of BESLER, SMART ABOUT REVENUE, TENACIOUS ABOUT RESULTS.

 

If you have a topic that you’d like us to discuss on the Hospital Finance podcast or if you’d like to be a guest, drop us a line at update@besler.com.
