Moderation features that I would implement in a chat service.
These should be native to the platform, which means no bots whatsoever.
Disadvantages of moderation bots:
- Could become unavailable at any moment, for any reason
- Can be unstable or buggy, leading to a worse experience
- You have to interact with the bot through text commands, which is bad for accessibility
- You have to go through the pain of setting the mod tools up
- Feels hacky
Advantages of a native implementation:
- Always available as long as the platform is alive
- Held to the platform's standard of code quality
- Can be used from the GUI, and can provide keyboard shortcuts and navigation (depending on the implementation)
- The mod tools come out of the box; you don't have to do anything
- Feels great (allegedly)
The tools have to be easily accessible when the necessary permissions are met, and fully usable by mouse and/or keyboard.
They have to come with good defaults to minimize the setup mods have to do.
There should be a short guided setup when someone creates a space from scratch, asking about the protections they need and informing them about what they can do in future abuse situations.
There have been multiple instances where a user used dog whistles or acted suspiciously, but as a moderator you couldn't do anything until they committed a more concrete violation. A way to note that down, as a reminder for when you encounter the user again, would be good.
Unlike a warning, this would only be visible to moderators. Some places where the red flags would show up are:
- The user profile
- In a chat menu
This feature overlaps with user notes, which are more or less the same thing but general purpose. Whether to keep them separate depends on the level of importance you give this feature.
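A minimal sketch of how such mod-only notes could be stored. All names here are hypothetical, and the in-memory store is just an illustration; the key point is the visibility check that hides red flags from regular users:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ModNote:
    """A moderator-only note ('red flag') attached to a user."""
    author_id: str
    text: str
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class ModNoteStore:
    """Keeps notes per (space, user); only moderators can read them."""

    def __init__(self):
        self._notes = {}  # (space_id, user_id) -> list[ModNote]

    def add(self, space_id, user_id, author_id, text):
        self._notes.setdefault((space_id, user_id), []).append(
            ModNote(author_id=author_id, text=text)
        )

    def for_user(self, space_id, user_id, viewer_is_mod):
        # Red flags are never shown to regular users.
        if not viewer_is_mod:
            return []
        return self._notes.get((space_id, user_id), [])
```

The same store could back both the profile view and the chat menu, since both only differ in where the notes are rendered.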
I recently witnessed what the lack of restrictions on new users did to Matrix rooms during a spam wave that shared illegal pornography. At first, moderators had to resort to making the rooms invite-only; later they made them public again but disabled images and videos. Now everyone has to use third-party image/video hosts to share genuine work.
Give mods options to limit what a new user to the space can do. This can prevent spam waves or trolls, at the cost of genuine new users who feel like they can't share anything besides text:
- Disable media sharing until a certain threshold has been met
- Disable link sharing, same as above
- ...
Maybe the threshold could be a challenge, sort of like a captcha, presented when the user first joins the space. This would only stop malicious bots, not human trolls, who go to great lengths to waste space on this earth.
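The restrictions above could boil down to a simple permission check per member. A sketch, with placeholder thresholds that moderators would tune, and an optional challenge that bypasses them:

```python
from dataclasses import dataclass

@dataclass
class Member:
    messages_sent: int = 0
    passed_challenge: bool = False

# Hypothetical per-space defaults; moderators would configure these.
MEDIA_THRESHOLD = 10  # messages before media sharing unlocks
LINK_THRESHOLD = 5    # messages before link sharing unlocks

def can_share_media(m: Member) -> bool:
    return m.passed_challenge or m.messages_sent >= MEDIA_THRESHOLD

def can_share_links(m: Member) -> bool:
    return m.passed_challenge or m.messages_sent >= LINK_THRESHOLD
```

Counting messages is just one possible threshold; account age or time since joining the space would work the same way.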
Automatically detect spam and mass pings and perform an action configured by the moderators.
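One way to sketch this detection is a sliding window over each user's recent messages, counting both message rate and total mentions; the thresholds and the resulting action are assumptions here, standing in for whatever the moderators configure:

```python
from collections import deque
import time

class SpamDetector:
    """Flags users who send too many messages or mentions in a short window.

    Thresholds and the action taken are moderator-configured; the values
    below are placeholder defaults.
    """

    def __init__(self, max_messages=5, max_mentions=8, window_s=10.0, action="timeout"):
        self.max_messages = max_messages
        self.max_mentions = max_mentions
        self.window_s = window_s
        self.action = action  # e.g. "warn", "timeout", "kick"
        self._events = {}  # user_id -> deque[(timestamp, mention_count)]

    def check(self, user_id, mention_count, now=None):
        """Record a message; return the configured action if limits are exceeded."""
        now = time.monotonic() if now is None else now
        q = self._events.setdefault(user_id, deque())
        q.append((now, mention_count))
        # Drop events that fell out of the sliding window.
        while q and now - q[0][0] > self.window_s:
            q.popleft()
        total_mentions = sum(m for _, m in q)
        if len(q) > self.max_messages or total_mentions > self.max_mentions:
            return self.action
        return None
```

Returning the action instead of executing it keeps the detector decoupled from whatever punishment pipeline the platform has.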
As on Discord, it would be good to have a dashboard with space-specific data for a given user.
The user menu gets pretty big on Discord and feels super busy. Certain features could be shown only in a mod view.