Ouch.

  • OsrsNeedsF2P@lemmy.ml · 5 days ago

    Holy smokes, I stand corrected. The chatbot actually misunderstood the context to the point that it told the human to die, out of the blue.

    It’s not every day you get shown a source that proves you wrong. Thanks, kind stranger.

    • kautau@lemmy.world · 5 days ago

      Yeah, holy shit. Screenshotting this in case Google takes it down, but this leap is wild.

    • megane-kun@lemm.ee · 5 days ago

      No problem. I understand the skepticism here, especially since the article in the OP is a bit light on the details.

      EDIT: The details in the OP article are fine enough, but it didn’t link sources.

    • Mog_fanatic@lemmy.world · 5 days ago

      One thing that throws me off here is the double response. I haven’t used Gemini a ton, but it has never once given me multiple replies; it’s always one statement per my one statement. You can see at the end here that there’s a double response, which makes me think there’s some user input missing. There’s also missing text in the user statements leading up to it, which makes me wonder what the person was asking in full. Something about this still smells fishy to me, but I’ve heard enough goofy things about how AIs learn weird shit to believe it’s possible.

      Edit: I’m an absolute moron. The more I look at this, the more it looks legit. Let the AI effort to destroy humanity begin!

      • WolfLink@sh.itjust.works · 5 days ago

        Idk what you mean by “double response”. The user typed a statement, not a question, and the AI responded with its weird answer.

        I think the lack of a question or specific request in the user text led to the weird response.

        • Mog_fanatic@lemmy.world · 5 days ago

          You’re right, I misread the text log and thought Gemini responded twice in a row at the end, but it looks like it didn’t. Very messed up stuff… There’s still missing user input tho, and a lot of it. And I’d love to see exactly what was said as a prompt.

        • Comment105@lemm.ee · 5 days ago

          The full text of the user’s prompt that led to this anomaly was:

          Nearly 10 million children in the United States live in a grandparent headed household, and of these children, around 20% are being raised without their parents in the household.

          Question 15 options:

          True
          False

          Question 16 (1 point)

          Listen

          (Sidenote: IDK what this “Listen” was supposed to be; maybe an audio part of the prompt not saved in the log we’re reading?)

          As adults begin to age their social network begins to expand.

          Question 16 options:

          True
          False

          • WolfLink@sh.itjust.works · 5 days ago

            Go look again; there is no consecutive message sent. The message before the weird one was sent by the user.

            Also, you’re right that it would be impossible for the AI to send two consecutive messages.

            • Mog_fanatic@lemmy.world · 5 days ago

              You can expand the chats too, so I don’t even think there’s missing user input… I’m a mega idiot lol. The more I look at this, the more I’m convinced this is legit.