corb3t@lemmy.world to Technology@lemmy.ml · edited 2 years ago
Stanford researchers find Mastodon has a massive child abuse material problem (www.theverge.com)
cross-posted to: technology@lemmy.world, fediverse@lemmy.ml, technology@beehaw.org
redcalcium@lemmy.institute · 2 years ago
If you run your instance behind Cloudflare, you can enable the CSAM scanning tool, which can automatically block and report known CSAM to the authorities if it is uploaded to your server. This should reduce your risk as the instance operator.
https://developers.cloudflare.com/cache/reference/csam-scanning/
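For readers curious how such a tool works under the hood: the general idea is to hash each upload and compare it against a database of hashes of known material, blocking on a match. The sketch below is a minimal illustration of that pattern only, not Cloudflare's implementation (real systems use perceptual hashing such as PhotoDNA and vetted hash databases, not plain SHA-256, and the hash list here is a placeholder).

```python
import hashlib

# Placeholder "known-bad" hash set; the entry below is just the
# SHA-256 of the empty byte string, used so the example is testable.
KNOWN_BAD_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def scan_upload(data: bytes) -> bool:
    """Return True if the upload matches a known-bad hash (i.e. should be blocked)."""
    digest = hashlib.sha256(data).hexdigest()
    return digest in KNOWN_BAD_HASHES

print(scan_upload(b""))         # → True  (matches the placeholder hash)
print(scan_upload(b"cat.jpg"))  # → False (no match, upload allowed)
```

Exact cryptographic hashes like SHA-256 only catch byte-identical files; production scanners use perceptual hashes so that resized or re-encoded copies still match.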