corb3t@lemmy.world to Technology@lemmy.ml · 1 year ago
Stanford researchers find Mastodon has a massive child abuse material problem (www.theverge.com)
cross-posted to: technology@lemmy.world, fediverse@lemmy.ml, technology@beehaw.org
redcalcium@lemmy.institute · 1 year ago
If you run your instance behind Cloudflare, you can enable the CSAM scanning tool, which can automatically block and report known CSAM to the authorities if it's uploaded to your server. This should reduce your risk as the instance operator.

https://developers.cloudflare.com/cache/reference/csam-scanning/
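Conceptually, tools like this match uploads against a database of hashes of known material and block hits before they're stored. The sketch below is a hypothetical, simplified illustration of that pattern using plain SHA-256 and a made-up blocklist; Cloudflare's actual tool runs at the edge and uses fuzzy/perceptual hashing (e.g. against NCMEC data), not exact file hashes, and also handles the reporting step.

```python
import hashlib

# Hypothetical blocklist of known-bad file hashes. Real systems use
# perceptual/fuzzy hashes supplied by clearinghouses, not plain SHA-256.
BLOCKED_HASHES = {
    # Example entry only (this is the SHA-256 of an empty file).
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def is_blocked(file_bytes: bytes) -> bool:
    """Return True if the upload's SHA-256 matches a blocklisted hash."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in BLOCKED_HASHES

def handle_upload(file_bytes: bytes) -> str:
    """Reject blocklisted uploads; a real deployment would also
    quarantine the file and file a report with the authorities."""
    if is_blocked(file_bytes):
        return "rejected"
    return "accepted"
```

The advantage of doing this at a proxy layer like Cloudflare is that the instance operator never has to store or review the material themselves.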