National Nurses United

National Nurse magazine April-May-June 2024

Issue link: https://nnumagazine.uberflip.com/i/1521950


Indigenous, and other people of color because it draws from data generated by an already prejudiced health care system.

In a winter 2024 survey of 2,300 RN members about A.I. and technology, National Nurses United (NNU) revealed disturbing experiences. About 40 percent reported that their employers had introduced new devices, gadgets, and changes to the EHR over the past year, but 60 percent disagreed with the idea that their hospitals would prioritize patient safety when implementing them. About half said their facilities use A.I. algorithms to determine patient acuity, but about two-thirds of nurses said the computer-generated acuity measurement did not match their real-world assessment. Almost a third, 29 percent, said they could not override the system when they disagreed with determinations and categorizations generated by the A.I. software.

As Big Tech and Big Health seek to adopt these technologies at breakneck pace, NNU believes it is imperative that our union proactively help nurses educate themselves and the public about them, organize and fight back when they threaten patient safety and nursing practice, and participate in developing legislation and policy to regulate their use. Unlike a new drug or treatment, which must be tested and approved by the Food and Drug Administration for safety and efficacy (even compared to a placebo or to the status quo of doing nothing), almost all A.I. currently undergoes no testing, verification, or validation.

To that end, NNU has developed a "Nurses and Patients' Bill of Rights: Guiding principles for A.I. justice in nursing and health care," a document that will serve as a touchstone for evaluating upcoming policy and legislation to regulate A.I. These rights include the right to high-quality, person-to-person nursing care; the right to safety; the right to privacy; the right to transparency; the right to exercise professional judgment; the right to autonomy; and the right to collective advocacy for workers and patients. See the sidebar and NNU's website for more details.

"It's actually incredibly shocking when you consider that hospitals are deploying these technologies immediately into real-life patient care settings where people's lives are on the line and actual harm can be done to patients," said Michelle Mahon, RN and assistant director of nursing practice at NNU. "Patients are not guinea pigs and most companies are offering no proof that their products are safe or effective."

What exactly is A.I., the acronym for "artificial intelligence"? It has meant different things at different times, but the definition that NNU uses now is as follows: A.I. is a machine-based technology that attempts to mimic human intelligence by taking inputs in the form of massive amounts of data, processing them through algorithmic software, and generating outputs from that process. Those outputs may take the form of predictions, content, recommendations, directions, or decisions. In the health care context, A.I. often analyzes and generates recommendations or other conclusions based on patients' electronic health records and other sources of data collected from patients, health care workers, and the environment.

This type of data aggregation and processing has been going on for decades, but what has catapulted A.I. into the headlines lately is the phenomenon of "generative A.I.," where a user can prompt the computer or system to draw upon the data available to it to produce what appears to be an original piece of content, whether an essay, a piece of artwork, or a nursing care plan.

The goal, as always, is profit. Nowhere was this more blatant than in a March 2024 announcement by software company Nvidia that it was partnering with Hippocratic AI to offer health care institutions generative A.I. "nurses" that cost only $9 per hour to operate. In its promotional video, Nvidia shows an A.I. "nurse" named Rachel interacting with a patient who has just been discharged home after an appendectomy. They discuss medications, and the avatar on the tablet offers some general recovery education.

A.I. is already being used in software systems for staffing and scheduling, clinical prediction, remote patient monitoring, automated charting and nursing care plans, and more. A.I. also makes possible many programs that at first blush would not appear to be related to A.I., such as tele- or remote sitting and hospital at home, because those programs rely largely on A.I.-driven software to alert humans when something is wrong.

One of the most frustrating aspects for nurses of confronting A.I. on the job is the lack of transparency about when A.I. is in use. Does this scanner use A.I.? Does that pump that's always squawking alerts use A.I.? Employers usually do not disclose or make announcements about the deployment of A.I. technologies, even though they should and, in many of our hospitals, are contractually obligated to do so and to bargain over their effects. But if you browse the websites of the companies creating these technologies, many very clearly tout that their products integrate A.I.

It's probably safe to say that almost every nurse is currently encountering A.I. through whichever electronic health records system their hospital uses. Nurses' documentation feeds directly into the EHR and the patient classification systems that determine acuity. The two big players that share market dominance in EHR management are Epic and Oracle (formerly Cerner), whose product is named Clairvia.

The main problem nurses report with these EHR systems is that they classify patients at a lower acuity than they actually are, leading to lower staffing levels than are actually needed to do the care work and keep patients safe. Every four to five hours, the system analyzes whatever the nurses have charted and uses that to predict what staffing should be for the next shift. However, nurses are often unable to chart in real time when their first priority is providing safe patient care, and they point out that certain aspects of the care they provide simply are not, or cannot be, accurately captured by the EHR. For example, Kaiser Permanente nurses report that nursing care hours for treatments like continuous bladder irrigation or intravenous immunoglobulin, which require nurses to monitor constantly and enter the patient's room frequently, are not properly accounted for within Epic.
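The staffing effect of this snapshot approach is easier to see with a small worked example. The sketch below is purely illustrative: the entries, the workload weights, and the scoring function are hypothetical and are not Epic's or Clairvia's actual algorithm. It simply shows how an acuity figure computed only from what has been charted by a snapshot time understates the workload when care is charted late.

```python
# Illustrative sketch only: hypothetical data and scoring, not any vendor's
# actual algorithm. It shows why acuity computed from a charting "snapshot"
# can understate the nursing workload that actually occurred.

from datetime import datetime

# Each charted entry carries the nursing hours it represents.
charted_entries = [
    {"time": datetime(2024, 4, 1, 8, 15), "task": "medication pass", "hours": 0.5},
    {"time": datetime(2024, 4, 1, 9, 40), "task": "wound care", "hours": 1.0},
    # Continuous bladder irrigation monitored from 10:00 to 14:00, but only
    # charted at 14:30 because the nurse was at the bedside the whole time.
    {"time": datetime(2024, 4, 1, 14, 30), "task": "CBI monitoring", "hours": 4.0},
]

def acuity_at(snapshot, entries):
    """Sum the workload of everything charted on or before the snapshot time."""
    return sum(e["hours"] for e in entries if e["time"] <= snapshot)

snapshot = datetime(2024, 4, 1, 13, 0)  # the system samples charting every few hours

seen = acuity_at(snapshot, charted_entries)
actual = sum(e["hours"] for e in charted_entries)

print(f"Workload visible to the system at the snapshot: {seen} hours")   # 1.5
print(f"Workload actually delivered during the shift:   {actual} hours")  # 5.5
```

A staffing prediction built on the 1.5 hours the system can see would call for less nursing time than the 5.5 hours of care actually delivered, which is the mismatch nurses describe.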
"I don't ever trust Epic to be correct," said Cedotal, who works at Kaiser Permanente Oakland Medical Center. "It's never a reflection of what we need, but more a snapshot of what we've done." This type of "snapshot" staffing is particularly inappropriate for certain kinds of units, said Allysha Shin, an RN in the neuro ICU at Keck Medicine of USC and a California Nurses Association/National Nurses Organizing Committee board member. "Our unit is always very busy," said Shin. "We can easily get six to eight admissions in one A P R I L | M AY | J U N E 2 0 2 4 W W W . N A T I O N A L N U R S E S U N I T E D . O R G N A T I O N A L N U R S E 21
