"42% of the AI answers were considered to lead to moderate or mild harm, and 22% to death or severe harm." A damning research paper suggests that medical advice from Bing/Microsoft Copilot AI could actually kill you.

https://www.windowscentral.com/microsoft/microsoft-copilot-ai-medical-advice-danger

8 Comments

  1. onceinawhile222

    Someone was pitching an AI product; he was pleased to announce that it would produce a valid result with 87% accuracy. Didn't sign. Crazy.

  2. Yep, tried asking AI how to cure my COVID; it told me to inject bleach. Horrible advice, absolutely awful.

  3. SoldierOf4Chan

    A pattern with these sorts of studies is that they always seem to ask the worst AI products, the ones least suited to what they're asking for.

  4. stuartullman

    Microsoft Copilot? Is that even designed for giving medical advice? There are a dozen other AI chat and search engines that are specifically built for that type of open-ended question. Microsoft Copilot?

  5. StriderHaryu

    Super glad the most recent Windows 10 update pinned Copilot to my taskbar without me asking it to.
