Google’s AI-generated search results include justifications for slavery and cooking tips for Amanita ocreata, a poisonous mushroom known as the “angel of death.” The results are part of Google’s AI-powered Search Generative Experience, or SGE.
It seems like Google’s AI errs on the side of helpful over harmless, being too quick to provide answers to controversial questions, whereas something like ChatGPT is too unwilling to do so.
In terms of honesty, there are only two clearly false statements of fact: the Amanita ocreata one (where it clearly answers for A. muscaria) and the Toblerone one (which I don’t understand at all). The benefits-of-slavery one is mostly correct; it’s just that those benefits are massively outweighed by the harms of slavery (namely, the slavery bit). The pro-gun one is basically the common pro-gun arguments. All the “best X” lists look at the most famous ones and the ones on the most “best X” lists, and so reflect that bias.