Ouch.

    • OsrsNeedsF2P@lemmy.ml · 4 hours ago

      Holy smokes I stand corrected. The chatbot actually misunderstood the context to the point it told the human to die, out of the blue.

      It’s not every day you get shown a source that proves you wrong. Thanks kind stranger

      • Mog_fanatic@lemmy.world · 7 minutes ago

        One thing that throws me off here is the double response. I haven’t used Gemini a ton, but it has never once given me multiple replies — it’s always one reply per prompt. You can see at the end here there’s a double response, which makes me think some user input is missing. There’s also missing text in the user statements leading up to it, which makes me wonder what the person was asking in full. Something about this still smells fishy to me, but I’ve heard enough goofy things about how AIs learn weird stuff to believe it’s possible.

      • megane-kun@lemm.ee · edited · 3 hours ago

        No problem. I understand the skepticism here, especially since the article in the OP is a bit light on the details.


        EDIT:

        The details in the OP article are fine enough, but it didn’t link its sources.