misk@sopuli.xyz to Technology@lemmy.world, English · 11 months ago
Asking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violation (www.404media.co)
Mahlzeit@feddit.de, English · 11 months ago
Oh, I see. Attempts to extract training data from ChatGPT may be criminal under the CFAA. Not a happy thought. I did say “making available” to exclude “hacking”.
JackbyDev@programming.dev, English · 11 months ago
The point I’m illustrating is that plenty of things reasonable people would assume are fine can be called hacking under the law.