The “A.I.” Doctor is in the House – Part II

Nurses and doctors are now being challenged in their medical decision-making by computer systems known as artificial intelligence, or “A.I.” That’s the subject of this two-part series on changes in medicine.

In Part I, we learned that nurses and doctors face scrutiny of their decisions when computer alerts tell them to respond one way but their medical experience and training tell them to respond another. Humans are not perfect, but neither are computers. Which one would you prefer to put first in decisions about your healthcare?

Lisa Bannon, writing for The Wall Street Journal, says this is a real dilemma that nurses and doctors have faced daily since A.I. algorithms were introduced into medical practice.

In a survey of 1,042 registered nurses published this month by National Nurses United, a union, 24% of respondents said they had been prompted by a clinical algorithm to make choices about issues such as patient care and staffing that they believed “were not in the best interest of patients based on their clinical judgment and scope of practice.” Of those, 17% said they were permitted to override the algorithm’s decision, while 31% weren’t allowed to and 34% said they needed a doctor’s or supervisor’s permission.

Naturally, hospitals deny pressuring staff to follow A.I. algorithms, but nurses and doctors feel differently. “If a nurse feels strongly this does not make sense for their patient, they should use their clinical judgment” and contact the doctor, the UC Davis Medical Center said. “The ultimate decision-making authority resides with the human physicians and nurses.” But one of its nurses said, “I’m not demonizing technology, but I feel moral distress when I know the right thing to do and I can’t do it.”

Nurses have enough stress in their jobs already without this additional burden. Since the Covid-19 pandemic, the ranks of nursing have been depleted, and hospitals in our area have put up billboards begging nurses to apply for work. Hospitals are chronically understaffed, and many nurses complain of high stress and exhaustion. In a November 2022 survey of more than 12,500 nurses by the research affiliate of the American Nurses Association, 43% of nurses said they were burned out. Add to this the growing doctor shortage and you have a real crisis in healthcare.

Jeff Breslin, a registered nurse at Sparrow Hospital in Lansing, Mich., has been working at the Level 1 trauma center since 1995. He helps train new nurses and students on what signs to look for to assess and treat a critically ill or severely injured patient quickly. “You get to a point in the profession where you can walk into a patient’s room, look at them and know this patient is in trouble,” he said. While their vital signs might be normal, “there are thousands of things we need to take into account,” he said. “Does he exhibit signs of confusion, difficulty breathing, a feeling of impending doom, or that something isn’t right?”

Like most trauma centers, Sparrow uses algorithms to alert nurses to changes in patient conditions. Over the past several years, Breslin has noticed that newer, digitally native nurses often trust the algorithm over their own observation skills. If new nurses over-rely on A.I.-based decisions, he said, “you’re not going to have the same assessment skills to look at a patient and know, ‘I’ve got to do something right away.’ ”
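To see why an alert driven only by vital signs can disagree with a bedside assessment, here is a minimal, hypothetical sketch of a vitals-based warning score, loosely modeled on early-warning schemes such as the National Early Warning Score (NEWS). The thresholds and names are illustrative assumptions, not Sparrow’s actual system.

```python
# Hypothetical vitals-based warning score, loosely modeled on early-warning
# schemes such as NEWS. Thresholds and names are illustrative assumptions,
# not any hospital's actual algorithm.

def warning_score(heart_rate: int, resp_rate: int, systolic_bp: int) -> int:
    """Add one point for each vital sign outside a 'normal' band."""
    score = 0
    if not 51 <= heart_rate <= 90:
        score += 1
    if not 12 <= resp_rate <= 20:
        score += 1
    if not 111 <= systolic_bp <= 219:
        score += 1
    return score

# A patient whose vitals all sit in the normal bands scores 0, so no alert
# fires, even if the bedside nurse sees confusion, labored breathing, or a
# "feeling of impending doom," none of which the score can measure.
print(warning_score(heart_rate=78, resp_rate=16, systolic_bp=124))  # prints 0
```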

Hospitals defend these A.I. algorithms as intended to support nurses’ clinical judgment, not replace it. That may be true in some cases, but not in others. Whether a nurse is confident enough to trust her own judgment and override an algorithm often depends on hospital policy. Clinicians who are penalized for making a wrong decision may start deferring to the computer, nurses say.

Cynthia Girtz, a registered nurse at Kaiser Permanente Medical Group, was on duty one afternoon at a call center in Vallejo, Calif., where she worked as what the company calls an “advice nurse,” alongside more than 200 other nurses, according to proceedings from an arbitration case last year. Her job was to answer calls from Kaiser members who felt sick and offer clinical advice on what they should do next. Advice nurses at Kaiser use algorithms to categorize a caller’s illness, selecting answers from drop-down menus based on the patient’s symptoms, according to the arbitration decision. Those answers determine the next steps for treatment.

When Kenneth Flach, a retired professional tennis player from Marin County, called Kaiser complaining of a cough, chest pains and fever, Girtz chose Kaiser’s cough/cold and flu algorithm. That protocol doesn’t provide an option for an emergency room or in-person doctor visit “unless the patient was spitting up at least 2 tsp of frank [visible] blood,” the arbitration decision said. Since Flach wasn’t doing so, the nurse followed the algorithm’s directions to schedule a phone appointment with a doctor several hours later.
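As a rough illustration of the structural flaw the arbitrator identified, here is a minimal, hypothetical Python sketch of a rigid drop-down protocol of this kind. Only the 2-tsp blood threshold comes from the arbitration decision; the function name, dispositions and logic are illustrative assumptions, not Kaiser’s actual software.

```python
# Hypothetical sketch of a rigid triage protocol of the kind described in
# the arbitration decision. Only the 2-tsp blood threshold comes from the
# source; everything else is illustrative, not Kaiser's actual software.

def cough_cold_flu_protocol(spitting_blood_tsp: float) -> str:
    """Map a caller's drop-down answers to a disposition.

    The structural flaw: the only path to emergency care is the blood
    threshold. Fever, chest pain and the nurse's own concern have no input.
    """
    if spitting_blood_tsp >= 2.0:
        return "Send to emergency room"
    return "Schedule phone appointment with a doctor in several hours"

# Flach reported cough, chest pains and fever, but no frank blood, so a
# nurse following the protocol literally has only one available disposition.
print(cough_cold_flu_protocol(spitting_blood_tsp=0))
# Schedule phone appointment with a doctor in several hours
```

Allowing the nurse to escalate on clinical judgment alone would take only one more branch; whether Girtz was free to take it is precisely what the arbitration turned on.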

Flach was subsequently diagnosed with pneumonia, acute respiratory failure and renal failure, and he died several days later. The arbitrator in the case held the nurse responsible, deciding she should have exercised her clinical judgment to override the algorithm. “Pressured by this policy, Nurse Girtz viewed it as a directive,” the arbitrator found. The nurse, “notwithstanding the Kaiser policy, owed a duty of care to provide a reasonable nursing assessment.” Kaiser, as the nurse’s employer, was ordered to pay the family about $3 million. In a written statement, Kaiser said the algorithmic recommendations are “guidelines for triage, not directives,” that nurses are expected to adapt them to the unique needs of each patient, and that nurses have the authority to override protocols based on their clinical judgment.

Is it any wonder that hospitals are having difficulty hiring nurses and doctors are retiring early?