‘Rectal garlic insertion for immune support’: Medical chatbots confidently give disastrously misguided advice, experts say


Popular AI chatbots often fail to recognize false health claims when they are delivered in confident, medical-sounding language, according to a January study in the journal The Lancet Digital Health. The result is dubious advice that could endanger the general public, such as a recommendation that people insert garlic cloves into their butts. Another study, published in February in the journal Nature Medicine, found that chatbots were no better at answering health questions than an ordinary internet search.

The results add to a growing body of evidence suggesting that such chatbots are not reliable sources of health information, at least for the general public, experts told Live Science.
