Twitter only reported 2% of accounts suspended for child abuse content, org says
- Irwin confirmed that most Twitter account suspensions “involved accounts that engaged with the material or were claiming to sell or distribute it, rather than those that posted it,” the Times reported. Portnoy said that the reality is that these account suspensions “very much do warrant cyber tips.”
Ashley Belanger - Feb 6, 2023 8:01 pm UTC

Last week, Twitter Safety tweeted that the platform is now “moving faster than ever” to remove child sexual abuse material (CSAM).
- Portnoy confirmed to Ars that NCMEC and Twitter remain divided over Twitter’s policy of not reporting to authorities all suspended accounts spreading CSAM. Out of 404,000 suspensions in January, Twitter reported approximately 8,000 accounts.
- So the organization created an automated computer program to detect CSAM without displaying any illegal imagery, partnering with the Canadian Center for Child Protection to cross-reference CSAM found with illegal content previously identified in the center’s database.
- Irwin told the Times that Twitter is only obligated to report suspended accounts to authorities when the company has “high confidence that the person is knowingly transmitting” CSAM.
- Lloyd Richardson, the technology director at the Canadian center, which ran its own scan for CSAM on Twitter to complement the Times' analysis, told the Times that “the volume we’re able to find with a minimal amount of effort is quite significant.”
It seems, however, that’s not entirely [+5760 chars]