"42% of the AI answers were considered to result in moderate or mild harm, and 22% in death or severe harm." A damning research paper suggests Bing/Microsoft Copilot AI's medical advice could actually kill you.
https://www.windowscentral.com/microsoft/microsoft-copilot-ai-medical-advice-danger
Very clickbaity headline. Answers don't kill people.
Someone pitching an AI product was pleased to announce that it would produce a valid result with 87% accuracy. We didn't sign, crazy.
Yep, tried asking AI how to cure my covid, it told me to inject bleach. Horrible advice, absolutely awful.
So castration isn’t a cure for baldness? Dammit!
A pattern in these sorts of studies is that they always seem to test the worst AI products, the ones least suited to what they're asking for.
I literally cannot believe this is being allowed to happen.
microsoft copilot? is that even designed for giving medical advice? there are a dozen other ai chat and search engines that are specifically designed for that type of open ended question. microsoft copilot?
Super glad the most recent windows 10 update pinned copilot to my taskbar without me asking it to