July 20, 2020 08:50 pm

Patients Aren't Being Told About the AI Systems Advising Their Care

At a growing number of prominent hospitals and clinics around the country, clinicians are turning to AI-powered decision support tools -- many of them unproven -- to help predict whether hospitalized patients are likely to develop complications or deteriorate, whether they're at risk of readmission, and whether they're likely to die soon. But these patients and their family members are often not informed about or asked to consent to the use of these tools in their care, a STAT examination has found. From a report:

The result: Machines that are completely invisible to patients are increasingly guiding decision-making in the clinic. Hospitals and clinicians "are operating under the assumption that you do not disclose, and that's not really something that has been defended or really thought about," Harvard Law School professor Glenn Cohen said. Cohen is the author of one of only a few articles examining the issue, which has received surprisingly scant attention in the medical literature even as research about AI and machine learning proliferates.

In some cases, there's little room for harm: Patients may not need to know about an AI system that's nudging their doctor to move up an MRI scan by a day, like the one deployed by M Health Fairview, or to be more thoughtful, such as with algorithms meant to encourage clinicians to broach end-of-life conversations. But in other cases, lack of disclosure means that patients may never know what happened if an AI model makes a faulty recommendation that is part of the reason they are denied needed care or undergo an unnecessary, costly, or even harmful intervention. That's a real risk, because some of these AI models are fraught with bias, and even those that have been demonstrated to be accurate largely haven't yet been shown to improve patient outcomes.

Some hospitals don't share data on how well the systems work, justifying the decision on the grounds that they are not conducting research. But that means patients are being denied information not only about whether the tools are being used in their care, but also about whether the tools are actually helping them. The decision not to mention these systems to patients is the product of an emerging consensus among doctors, hospital executives, developers, and system architects, who see little value -- but plenty of downside -- in raising the subject.



Original Link: http://rss.slashdot.org/~r/Slashdot/slashdot/~3/OIvf5t91p8I/patients-arent-being-told-about-the-ai-systems-advising-their-care

