Hi

I am a computer science student just starting my master's thesis. My focus will be on content moderation algorithms, so I am currently exploring how various social media applications moderate content.

If I understand the docs correctly, content moderation on Mastodon is all manual labor? I haven't read anything about automatic detection of Child Sexual Abuse Material (CSAM), for example, which most centralised platforms seem to do.

Another question in the same direction concerns reposting of already-moderated content, for example a racist meme that was removed before. Are there any measures in place to detect this? A sketch of what I mean follows below.
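
To make that second question concrete, here is a rough sketch of the kind of detection I have in mind: comparing new uploads against perceptual hashes of previously removed images. It assumes the third-party Python packages Pillow and imagehash, and the stored hash and distance threshold are purely illustrative; I don't know whether Mastodon does anything like this, which is exactly what I am asking.

```python
# Illustrative sketch: flag re-uploads of previously moderated images by
# perceptual hash. Requires the third-party Pillow and imagehash packages;
# the stored hash and threshold below are made-up example values.
from PIL import Image
import imagehash

# Perceptual hashes of images that moderators removed earlier (hypothetical).
moderated_hashes = [
    imagehash.hex_to_hash("ffd7918181c9ffff"),
]

# Hamming-distance threshold: 0 means an exact hash match; small positive
# values tolerate re-encoding, resizing, and minor edits of the same image.
THRESHOLD = 5

def is_known_bad(path: str) -> bool:
    """Return True if the upload is close to a previously moderated hash."""
    candidate = imagehash.phash(Image.open(path))
    return any(candidate - known <= THRESHOLD for known in moderated_hashes)

if __name__ == "__main__":
    print(is_known_bad("upload.png"))  # hypothetical file name
```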

Thank you for your help!

  • georgehotelling@alien.top · 11 months ago

    If you haven’t already, you’ll probably want to read this study on CSAM from Stanford, which discusses the lack of automated tooling (and how PhotoDNA isn’t really equipped to handle thousands of Mastodon servers).

    Some other Stanford researchers (including Alex Stamos, former CSO for Facebook) just put out this piece on Mastodon moderation too, which is worth a look.

    When considering moderation, it’s also worth thinking about the role of defederation. From the first report, the CSAM on Mastodon sounds like it’s mostly on Japanese servers, and most Western-oriented servers have defederated from them over it, so the content won’t travel across the network. I know that if that content appeared on my server, the admins would probably defederate until the source cleaned up their act.
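
    In case the mechanics are useful for your thesis: defederation is just a domain block applied by a server admin. Here is a rough sketch against Mastodon’s admin domain-blocks endpoint (POST /api/v1/admin/domain_blocks); the instance URL, access token, and target domain are placeholders, and in practice most admins do this through the web UI rather than the API.

    ```python
    # Sketch: suspend federation with a remote server via Mastodon's admin
    # API. The instance URL, token, and target domain are placeholders.
    import requests

    INSTANCE = "https://example.social"  # hypothetical instance
    TOKEN = "ADMIN_ACCESS_TOKEN"         # token with admin domain-block scope

    resp = requests.post(
        f"{INSTANCE}/api/v1/admin/domain_blocks",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={
            "domain": "bad-actor.example",  # server being defederated
            "severity": "suspend",          # "silence" would only limit reach
            "public_comment": "Hosting content that violates our CSAM policy",
        },
    )
    resp.raise_for_status()
    print(resp.json())
    ```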

    • never_obey@alien.top (OP) · 11 months ago

      Thank you for your response. How did I not find this study myself? I’m pretty sure it will answer a lot of the questions I have, and probably those I will have while digging deeper into the subject. This really helps me.

      I will now read the study and Mr. Stamos’s blog post. It is also good to know that server admins and moderators seem to act quickly on these issues. Moderation is always a delicate thing, but when it comes to illegal content it is really a must-have.