Alb_x_008@lemm.ee to Technology@lemmy.world · English · 5 months ago
ChatGPT provides false information about people, and OpenAI can't correct it (noyb.eu)
Cross-posted to: technology@lemmy.world, fuck_ai@lemmy.world, news@lemmy.world
NeoNachtwaechter@lemmy.world · English · 5 months ago

> LLMs don't actually store any of their training data

Data protection law covers all kinds of data processing. For example, input is processing, and output is processing too. See Article 4 of the GDPR, which defines "processing".

If you really want to rely on excuses, you would need way better ones.
vithigar@lemmy.ca · English · 5 months ago

Right, so keep personal data out of the training set and use it only in easily readable and editable context. The model will still "hallucinate" details about people if you ask it for details about people, but those people are fictitious.
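The approach in the last comment can be sketched roughly like this (all names here are hypothetical, not any vendor's actual API): personal data lives in an editable store outside the model and is injected into the prompt as context at query time, so a correction or erasure request is an ordinary data update rather than a retraining problem.

```python
# Hypothetical sketch: editable record store + context injection.
# The model's weights never see this data; only the prompt does.

profile_store = {
    "alice": {"name": "Alice", "role": "journalist"},
}

def build_prompt(question: str, person_id: str) -> str:
    """Prepend the editable record (if any) as grounding context."""
    record = profile_store.get(person_id)
    context = f"Known facts: {record}" if record else "Known facts: none"
    return f"{context}\nQuestion: {question}"

# A rectification (GDPR Art. 16) is a plain update -- no retraining:
profile_store["alice"]["role"] = "editor"

# An erasure request (GDPR Art. 17) is a deletion from the store:
del profile_store["alice"]
```

Whether this satisfies the GDPR in practice is exactly what the thread is arguing about, but it shows why context-based storage is correctable in a way baked-in training data is not.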