Google is coming in for sharp criticism after a video went viral of the Google Nest assistant refusing to answer basic questions about the Holocaust — but having no problem answering questions about the Nakba.

  • kromem@lemmy.world · edited 2 months ago

    That’s not what happened. Middleware was invisibly modifying the prompts behind the scenes to add requests for diversity.

    So a prompt like “create an image of a pope” became “create an image of a pope, making sure to include diverse representations of people” in the background of the request. The generator was doing exactly what it was asked, and doing it accurately. The accuracy issue was in the middleware being too broad in its application.
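    A rough sketch of what that kind of prompt-rewriting middleware looks like (purely illustrative — the function names and the exact clause are assumptions, not Google’s actual code):

    ```python
    # Hypothetical prompt-rewriting middleware, sketched for illustration.
    # Not Google's implementation; the clause and logic are assumed.

    DIVERSITY_CLAUSE = ", making sure to include diverse representations of people"

    def rewrite_prompt(prompt: str) -> str:
        """Append a diversity request to an image-generation prompt.

        Applied unconditionally, so even historically specific prompts
        (e.g. "create an image of a pope") get rewritten before the
        generator ever sees them -- which is exactly the "too broad"
        problem described above.
        """
        return prompt + DIVERSITY_CLAUSE

    # The generator then receives the rewritten, broader prompt:
    print(rewrite_prompt("create an image of a pope"))
    ```

    The generator faithfully renders whatever text it receives; the mismatch people saw came from this rewriting step, not from the image model itself.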

    I just explained a bit of the background on why this was needed here.