Not the best news in this report. We need to find ways to do more.

  • Aesthesiaphilia@kbin.social · +14 / -2 · 1 year ago

    Because it’s another “WON’T SOMEONE THINK OF THE CHILDREN” hysteria bait post.

    They found 112 images of cp in the whole Fediverse. That’s a very small number. We’re doing pretty good.

    • etrotta@kbin.social · +5 · 1 year ago

      It is not “in the whole fediverse”; it is out of approximately 325,000 posts analyzed over a two-day period.
      And that is just for known images that matched a hash (a minimal sketch of that matching follows the quote).

      Quoting the entire paragraph:

      Out of approximately 325,000 posts analyzed over a two day period, we detected
      112 instances of known CSAM, as well as 554 instances of content identified as
      sexually explicit with highest confidence by Google SafeSearch in posts that also
      matched hashtags or keywords commonly used by child exploitation communities.
      We also found 713 uses of the top 20 CSAM-related hashtags on the Fediverse
      on posts containing media, as well as 1,217 posts containing no media (the text
      content of which primarily related to off-site CSAM trading or grooming of minors).
      From post metadata, we observed the presence of emerging content categories
      including Computer-Generated CSAM (CG-CSAM) as well as Self-Generated CSAM
      (SG-CSAM).
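
      For anyone wondering what “matched the hash” means mechanically: scanners fingerprint each media attachment and compare the digest against a database of hashes of already-known material, and separately flag posts whose text contains watch-listed hashtags or keywords. Below is a minimal sketch of that matching under stated assumptions: the hash and hashtag sets here are hypothetical placeholders, and real pipelines use vetted databases and perceptual hashes such as PhotoDNA or PDQ (which survive re-encoding) rather than exact SHA-256 matching.

        import hashlib

        # Hypothetical placeholder sets; real deployments use vetted,
        # industry-shared hash databases, not hand-written values.
        KNOWN_CONTENT_HASHES = {"<redacted digest 1>", "<redacted digest 2>"}
        FLAGGED_TAGS = {"#exampletag1", "#exampletag2"}

        def fingerprint(data: bytes) -> str:
            # Exact-match digest; a perceptual hash would also catch
            # resized or re-encoded copies of the same image.
            return hashlib.sha256(data).hexdigest()

        def scan_post(media: list[bytes], text: str) -> list[str]:
            """Return the reasons a post would be flagged, if any."""
            reasons = []
            for blob in media:
                if fingerprint(blob) in KNOWN_CONTENT_HASHES:
                    reasons.append("matched known-content hash")
            lowered = text.lower()
            if any(tag in lowered for tag in FLAGGED_TAGS):
                reasons.append("contains watch-listed hashtag")
            return reasons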

      • Rivalarrival@lemmy.today · +6 / -1 · 1 year ago

        How are the authors distinguishing between posts made by actual pedophiles and posts by law enforcement agencies known to be operating honeypots?

      • Zak@lemmy.world · +3 · 1 year ago

        In an ideal-world sense, I agree with you: nobody should abuse children, so media of people abusing children should not exist.

        In a practical sense, whether we are talking about moderation or law enforcement, a rate of zero requires very intrusive measures, such as moderators checking every post before anyone else is allowed to see it (a rough sketch of that model follows). There are contexts in which that is appropriate, but I doubt many people would like it for the Fediverse at large.
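
        To make the cost concrete, here is a hypothetical sketch of that pre-moderation model (all names are illustrative, not any real platform's API): nothing reaches the public timeline until a moderator explicitly approves it, so review latency and moderator workload scale with every single post.

          from collections import deque
          from dataclasses import dataclass

          @dataclass
          class Post:
              author: str
              body: str

          class PreModeratedFeed:
              """Zero-visibility-until-approved: the 'rate of zero' model."""

              def __init__(self) -> None:
                  self.pending: deque[Post] = deque()  # held, invisible to everyone
                  self.visible: list[Post] = []        # the public timeline

              def submit(self, post: Post) -> None:
                  # Every submission waits for human review before display.
                  self.pending.append(post)

              def review_next(self, approve: bool) -> Post:
                  # A moderator must act on each post, one at a time.
                  post = self.pending.popleft()
                  if approve:
                      self.visible.append(post)
                  return post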