The report hints at it but doesn't really say it out loud: remove one particular server and roughly 99% of the flagged material goes with it, along with 90% or so of the overall Japanese userbase (it was the first big Japanese instance and had a mostly-trusted, locally relevant company behind it). But nearly every non-Japanese-oriented instance has already either fully defederated from it or strips media content coming from it. It's essentially its own thing, not really related to Mastodon aside from the software in use.
https://stacks.stanford.edu/file/druid:vb515nd6874/20230724-fediverse-csam-report.pdf
The report is here. It makes some good points and isn't an obvious smear job, but it's far from perfect, and it's likely to be misused by less-responsible authors interested in smearing Mastodon or other services not operated by major tech companies.
Those less-responsible authors should be shown the study the same organization published last month, which found similar problems on Twitter:
In the course of the investigation, researchers found that despite the availability of image hashes to identify and remove known CSAM, Twitter experienced an apparent regression in its mitigation of the problem. Using PhotoDNA, a common detection system for identified instances of known CSAM, matches were identified on public profiles, bypassing safeguards that should have been in place to prevent the spread of such content. This gap was disclosed to Twitter’s Trust & Safety team which responded to address the issue. However, the failure highlights the need for platforms to prioritize user safety and the importance of collaborative research efforts to mitigate and proactively counter online child abuse and exploitation.
That being said, people who code for the Fediverse should read this report and pay particular attention to passages like this one:
Current tools for addressing child sexual exploitation and abuse online—such as PhotoDNA and mechanisms for detecting abusive accounts or recidivism—were developed for centrally managed services and must be adapted for the unique architecture of the Fediverse and similar decentralized social media projects.
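For a sense of what adapting those tools might involve: PhotoDNA-style matching is, at its core, a perceptual hash plus a shared list of known-bad hashes. PhotoDNA itself is proprietary, so here's a rough sketch using the open-source imagehash library as a stand-in; the list format and distance threshold are my own assumptions, not anything from the report or from Microsoft:

```python
# Rough sketch of upload-time scanning against a shared hash list.
# imagehash's pHash is a stand-in for PhotoDNA (which is proprietary);
# the list format and threshold below are assumptions for illustration.
from PIL import Image
import imagehash

# Hypothetical shared blocklist: hex-encoded 64-bit perceptual hashes
# distributed to participating instances.
KNOWN_BAD = {imagehash.hex_to_hash(h) for h in [
    "fd81b1a3c4e2d170",  # placeholder entry, not a real hash
]}

HAMMING_THRESHOLD = 8  # assumed; lower means fewer false positives

def is_flagged(path: str) -> bool:
    """True if an uploaded image is perceptually close to a known hash."""
    h = imagehash.phash(Image.open(path))
    return any(h - bad <= HAMMING_THRESHOLD for bad in KNOWN_BAD)
```

The hashing itself isn't the hard part; the hard part is governance, i.e. who gets to add entries and how instances decide to trust each other's lists.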
I honestly don't know crap about coding, but this seems like a very solvable problem, and something I'd very much like the people who do to engage with. I would absolutely donate money to support a project like this.
edit: I guess what I meant to say is that I would absolutely donate some money to purchase API keys from Microsoft.
After a bit of reading, another option may simply be to include a “report” button that generates a hash of the image and federates the list (roughly sketched below). That being said, there may be a similarity algorithm under the hood of PhotoDNA that works better. Hard to say, since it's all proprietary and pay-for-membership; prices aren't even listed publicly unless you use a cloud API.
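To make the report-button idea concrete, the flow could look something like this. The endpoint name and JSON shape are made up for illustration, and the moderator-confirmation step matters, since an unvetted shared list would be trivially poisonable:

```python
# Hypothetical report-and-federate flow; the endpoint and JSON shape
# are assumptions, not an existing Mastodon API.
import json
import urllib.request

from PIL import Image
import imagehash

local_hashes: set[str] = set()

def handle_report(image_path: str) -> None:
    """User hit 'report': hash the image locally (the image itself never leaves)."""
    h = str(imagehash.phash(Image.open(image_path)))
    # A moderator should confirm before the hash is published, or the
    # shared list becomes an easy griefing vector.
    local_hashes.add(h)

def pull_peer_list(peer_domain: str) -> None:
    """Merge a peer instance's published hash list into ours (assumed endpoint)."""
    url = f"https://{peer_domain}/api/v1/flagged_hashes"
    with urllib.request.urlopen(url) as resp:
        local_hashes.update(json.load(resp))
```

Exact-hash federation like this would still miss re-encoded or cropped copies, which is presumably what PhotoDNA's similarity matching handles better; the Hamming-distance comparison in the earlier sketch is a crude open-source approximation of that.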
Yeah, I’m just discovering that it’s proprietary
In 2009, Microsoft partnered with Dartmouth College to develop PhotoDNA,
Good to know my tax dollars went to helping Microsoft develop another product! /s
Or… license PhotoDNA of course!
Oh, surely they don’t charge for a tool to stop child abus-
Wow, say what you will about capitalism, but it really is an engine for innovation and coming up with new ways to make me lose faith in humanity
Mastodon is the new Discord!