• Hildegarde@lemmy.world · +251/−1 · 11 months ago

    The point of verification photos is to ensure that nsfw subreddits are only posting with consent. Many posts were just random nudes someone found, in which the subject was not ok with having them posted.

    The verification photos show an intention to upload to the sub. A former partner wanting to upload revenge porn would not have access to a verification photo. They often require the paper be crumpled to make it infeasible to photoshop.

    If an AI can generate a photorealistic verification picture, it cannot be used to verify anything.

    • RainfallSonata@lemmy.world · +71/−3 · edited · 11 months ago

      I didn’t realize they originated with verifying nsfw content. I’d only ever seen them in otherwise text-based contexts. It seemed to me the person in the photo didn’t necessarily represent the account owner just because they were holding up a piece of paper showing the username. But if you’re matching the verification against other photos, that makes more sense.

      • RedditWanderer@lemmy.world · +70/−1 · 11 months ago

        It’s been used way before the nsfw stuff and the advent of AI.

        Back in the day, if you were doing an AMA with a celeb, the picture proof was the celeb confirming this was the account they were using. It didn't need to be their own account, and it was only useful for people with an identifiable face. If you were doing an AMA as some specialist or professional, showing your face and username proved nothing; you had to provide paperwork to the mods.

        This is a poor way to police fake nudes, though; I wouldn't have trusted it even before AI.

    • oce 🐆@jlai.lu · +31/−2 · 11 months ago

      Was it really that hard to Photoshop well enough to get past mods who aren't experts in photo forensics?

    • DominusOfMegadeus@sh.itjust.works · +13 · 11 months ago

      On a side note, they are also used all the time for online selling and trading, as a means to verify that the seller is a real person who is in fact in possession of the object they wish to sell.

    • trolololol@lemmy.world · +2/−1 · 11 months ago

      How did traditional photo verification — as in, before AI — know the image wasn't manipulated? In this post the paper is suspiciously flat, and I've seen many others like it.

      • Hildegarde@lemmy.world · +10 · 11 months ago

        From reading the verification rules for /r/gonewild: they require the same paper card to be photographed from different angles while being bent slightly.

        Photoshopping a card convincingly may be easy. Photoshopping a bent card held at different angles that reads as the same in every image is much more difficult.

        • stebo@lemmy.dbzer0.com · +8/−1 · 11 months ago

          That last part will still be difficult even with AI. You can generate one image that looks convincing, but generating multiple images that are consistent with each other? I doubt it.

            • EldritchFeminity@lemmy.blahaj.zone · +2 · 11 months ago

              I feel like you could do this right now by hand (if you have experience with 3d modelling) once you’ve generated an image. 3d modelling often includes creating a model from references, be they drawn or photographs.

              Plus, I just remembered that creating 3d models of everyday objects/people from photos taken at multiple angles has been a thing for a long time. You can make a setup that uses just your phone and some software to produce 3d-printable models of real objects. Nothing prevents someone from using a series of AI-generated images instead of photos they took, as long as the series is consistent enough to produce a base model; you can then do some touch-up by hand to fix anything the software messed up. I remember a famous lady in the 3d printing space who I think used this sort of process to make a complete 3d model of her (naked) body, and then sold copies of it on her Patreon or something.

      • KneeTitts@lemmy.world · +1 · 11 months ago

        Just ask for multiple photos of the person in the same place. AI has a hard time with temporal coherence, so in each picture the room's items will change, the face will change a bit (maybe a lot), hairstyles will change, etc.
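
The consistency check suggested above can be sketched in code. This is a minimal illustration, not a real moderation tool: it uses synthetic NumPy arrays in place of real uploads, treats the image border as a crude stand-in for "the room" (assuming the subject stays near the center), and the flagging threshold is entirely made up.

```python
import numpy as np

def background_consistency(img_a: np.ndarray, img_b: np.ndarray,
                           border: int = 8) -> float:
    """Mean absolute pixel difference over the image border
    (a rough proxy for the background), normalized to [0, 1].
    Low values mean the two shots share a consistent setting."""
    def edge_pixels(img: np.ndarray) -> np.ndarray:
        # Collect the top/bottom rows and left/right columns.
        return np.concatenate([
            img[:border].ravel(), img[-border:].ravel(),
            img[:, :border].ravel(), img[:, -border:].ravel(),
        ])
    a = edge_pixels(img_a.astype(np.float64))
    b = edge_pixels(img_b.astype(np.float64))
    return float(np.mean(np.abs(a - b)) / 255.0)

# Two shots of the "same" room: identical background, subject moved.
rng = np.random.default_rng(0)
room = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
shot1 = room.copy(); shot1[20:40, 20:40] = 200  # subject in frame
shot2 = room.copy(); shot2[25:45, 25:45] = 180  # subject shifted

# A shot from an entirely different (or regenerated) scene.
other = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)

print(background_consistency(shot1, shot2))  # near 0: consistent scene
print(background_consistency(shot1, other))  # much larger: suspicious
```

A real check would need face and scene features rather than raw border pixels, but the principle is the same: one convincing image is easy, a set of mutually consistent ones is the hard part.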