It looks like the ex-DDG employee got the details wrong, and read the slides backwards.

  • 0xD@infosec.pub · 1 year ago

    Could have done it with proper encoding, no need to remove it lol o.O
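
    A minimal sketch of what the commenter means, using Python's stdlib `html.escape` (the input string here is a made-up example, not the actual character the article is about):

```python
import html

# Hypothetical tainted comment text; the real character involved
# in the article isn't reproduced here.
raw = 'nice post <script>alert(1)</script>'

# Encoding neutralizes the markup instead of deleting anything:
encoded = html.escape(raw)
print(encoded)  # nice post &lt;script&gt;alert(1)&lt;/script&gt;
```

    The original text survives the round trip: a client that decodes the entities gets the exact input back, which stripping can never offer.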

    • MotoAsh@lemmy.world · 1 year ago

      Allowing tainted data into the dataset means every single client has to get every single spot of content rendering exactly right or else be vulnerable to easy exploitation. Keeping it out of the dataset means not every client has to be perfect for Lemmy to be a secure place.

        • MotoAsh@lemmy.world · 1 year ago

          I’m a fan of the Swiss cheese model of safety. While blindly blocking arbitrary characters is a bit silly, failing to filter/encode the data even on the output from web services can end in disaster.
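
          A rough sketch of that layered approach, assuming a Python service (the function names are hypothetical, not Lemmy's actual code): filter suspect characters on ingest, and still escape on output in case the first layer has holes.

```python
import html
import unicodedata

def sanitize_input(text: str) -> str:
    # Layer 1: drop control characters (Unicode category Cc)
    # before the text ever enters the dataset, keeping newlines/tabs.
    return ''.join(
        ch for ch in text
        if unicodedata.category(ch) != 'Cc' or ch in '\n\t'
    )

def render_html(text: str) -> str:
    # Layer 2: escape on the way out, even though layer 1 already ran.
    return html.escape(text)

stored = sanitize_input('hi\x00 <b onmouseover=alert(1)>there</b>')
print(render_html(stored))
# hi &lt;b onmouseover=alert(1)&gt;there&lt;/b&gt;
```

          Either layer alone catches most payloads; together, a hole in one (a client that forgets to escape, or a filter that misses a character class) is covered by the other.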

          It’s an open API that serves publicly-sourced data. I wouldn’t want to serve up anything richer than markup content even if every single API call had perfect handling. At least not without a lot more sophisticated filtering in front of it. Even certain totally valid arrangements of HTML can be vulnerable as all hell.

          Even certain markup systems have problems, but I doubt this one has huge vulnerabilities to exploit. Certain wiki systems in the past had to be completely retired over such things.