Fake4000@lemmy.world to Technology@lemmy.world · English · 9 months ago
Reddit started doing what they always wanted to do: sell user content to AI.
www.reuters.com
cross-posted to: technology@lemmy.world, technology@beehaw.org
JohnEdwa@sopuli.xyz · English · 9 months ago
Robots.txt has always been ignored by some bots; it's just a voluntary guideline originally meant to prevent search-indexing bots from using excessive bandwidth. The Archive.org bot, for example, has ignored it entirely since 2017.
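To illustrate how the voluntary side works: a well-behaved crawler fetches robots.txt, parses it, and then chooses to respect the rules before requesting a URL; nothing in the protocol enforces this. A minimal sketch using Python's standard `urllib.robotparser` (the domain and bot name here are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# A typical robots.txt body: disallow everything under /private/ for all bots.
rules = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A compliant bot checks can_fetch() before crawling -- but a bot that
# skips this check (like archive.org's since 2017) faces no technical barrier.
print(parser.can_fetch("MyBot", "https://example.com/private/page"))  # False
print(parser.can_fetch("MyBot", "https://example.com/public/page"))   # True
```

The check is purely advisory: the server serves both URLs either way, which is exactly why robots.txt can't stop AI scrapers that decide not to ask.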