• happybadger [he/him]@hexbear.net

    They gave r/snackexchange to some random guy who had participated in the subreddit one time a decade ago. When he was getting weird and manipulative trying to reshape the subreddit, his top priority was instituting a new ID verification service of his choice.

    The first guy to request control of r/science was an antivaxxer who sold supplements.

    It’s so wholly arbitrary and underhanded. I just stopped interacting with my other subreddits and don’t care what happens to them.

  • cynetri (he/any)@midwest.social

Kinda funny how similar this is to scab labor, which is also typically low-quality, except Reddit is literally not paying anyone anything and decided to kill the goose that lays the golden eggs

  • AutoTL;DR@lemmings.world [bot]

    This is the best summary I could come up with:


    Reddit’s moderator purge could have real impacts on reliability and information safety as it rushes to replace mods with inexperienced, poorly vetted volunteers, according to Ars Technica.

    With testimony from both expelled former moderators and some of those who replaced them, Ars Technica’s report shows the trouble with the company’s push to quickly replace the mods who sent their subreddits dark, marked them NSFW, or turned them into jokey John Oliver fan forums earlier this year.

    Reddit began removing protesting moderators in June and said it would continue doing so unless subreddits opened back up.

    A moderator with zero 3D-printing experience joined as a “joke” to replace a mod whose expertise included identifying functional gun printing recipes.

    A new home automation moderator insists expert knowledge is unnecessary in a subreddit where bad advice can lead to electrocution or compromised cybersecurity.

    Stevie Chancellor, a computer science and engineering professor at the University of Minnesota, is quoted as saying she was concerned that mods wouldn’t be able to stop malicious users from encouraging people in mental health support forums “to hurt themselves for their own perverted desires.”


    The original article contains 381 words, the summary contains 188 words. Saved 51%. I’m a bot and I’m open source!