‘Rectal garlic insertion for immune support’: Medical chatbots confidently give disastrously misguided advice, experts say


Popular AI chatbots often fail to recognize false health claims when those claims are delivered in confident, medical-sounding language, according to a January study in the journal The Lancet Digital Health. As a result, the chatbots can dispense dubious and potentially dangerous advice, such as a recommendation that people insert garlic cloves into their butts. A second study, published in February in the journal Nature Medicine, found that the chatbots performed no better than an ordinary internet search.

The results add to a growing body of evidence suggesting that such chatbots are not reliable sources of health information, at least for the general public, experts told Live Science.
