Artificial intelligence is entering healthcare. We should not let it make all the decisions.


AI is already used in healthcare. Some hospitals use the technology to help identify at-risk patients. Some use it to help diagnose conditions or develop treatment plans. But Sandra Wachter, a professor of technology and regulation at the University of Oxford in the UK, says the true scale of AI adoption is unclear.

“Sometimes we don’t know what systems are being used,” Wachter says. But we do know that their adoption is likely to increase as the technology improves and health care systems look for ways to cut costs, she adds.

Studies suggest that doctors may be putting too much faith in these technologies. In one study published a few years ago, oncologists were asked to compare their skin cancer diagnoses with the conclusions of an AI system. Many accepted the AI's results even when those results contradicted their own clinical opinion.

We are at risk of over-relying on these technologies. And this is where paternalism can come in.

“Paternalism is captured by the idiom ‘the doctor knows best,’” write Melissa McCradden and Roxanne Kirsch of the Hospital for Sick Children in Ontario, Canada, in a recent issue of the journal Science. The idea is that medical training makes a doctor the best person to make decisions for the person being treated, regardless of that person's feelings, beliefs, culture, and anything else that might influence the choices any of us make.

“Paternalism can be perpetuated when AI is positioned as the highest form of evidence, replacing the all-knowing doctor with the all-knowing AI,” McCradden and Kirsch continue. There is a “rising trend toward algorithmic paternalism,” they say. This is problematic for several reasons.


