Volunteer moderators triage all reports into one of five colour codes.
Green covers a "housekeeping" level of moderation: cleaning up where people have quoted moderated posts, posts in the wrong sub-forum, duplicate threads, closing invalid reports, and so on.
Blue marks low-level moderation concerns. On their own these would not result in severe repercussions, but repeated abuse is tracked: Report Abuse, Cross-Linking, Unconstructive Comments, Flaming, Bashing, Insults, Vulgarities, Spamming.
Yellow was intended to borrow from the baseline moderation policy of other games: three strikes and you are out. This tier includes discriminatory behaviour, profile/signature abuse, extreme content, and advertising.
Orange was the one-strike policy: an automatic ban is delivered after staff review of the account in question, and any ban requires sign-off from at least two staff members. This tier includes Spambotting, Phishing, Illegal Use, Account Trading, Impersonating Staff, and Releasing Pirated or Malicious Content.
Red is the emergency escalation protocol: volunteer moderators are instructed to immediately contact a team member, at any hour of the day or night. This covers any viable real-life threat to the safety of a team or community member.
Each possible infraction was given one of these colours, as well as a definition; both have since been further refined and are publicly visible in the post I made under the Primus account. That post also includes a procedure to walk a new moderator through the admittedly obtuse IPboard moderation system, as well as templates for both the internal log notes and the communication given to players.
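As a rough sketch, the triage scheme amounts to a lookup from infraction to colour tier. The infraction names and specific assignments below are illustrative examples drawn from the descriptions above, not the full published table:

```python
# Illustrative sketch of the five-tier triage lookup. The mapping is a
# sample, not the complete infraction list; tiers are ordered by severity.
SEVERITY = ["green", "blue", "yellow", "orange", "red"]

COLOUR_OF = {
    "duplicate_thread": "green",           # housekeeping
    "wrong_subforum": "green",
    "flaming": "blue",                     # low-level, tracked for repeats
    "report_abuse": "blue",
    "advertising": "yellow",               # three-strikes tier
    "discriminatory_behaviour": "yellow",
    "phishing": "orange",                  # one-strike, staff-reviewed ban
    "account_trading": "orange",
    "real_life_threat": "red",             # emergency escalation
}

def triage(infraction: str) -> str:
    """Return the colour tier for a reported infraction."""
    return COLOUR_OF[infraction]

def is_at_least(infraction: str, tier: str) -> bool:
    """True if the infraction sits at or above the given tier."""
    return SEVERITY.index(triage(infraction)) >= SEVERITY.index(tier)

def needs_immediate_staff_contact(infraction: str) -> bool:
    """Red reports trigger the round-the-clock escalation protocol."""
    return triage(infraction) == "red"
```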
Distinct procedures are in place to handle the distinct nature of each problematic behaviour. For example, we often find that poking a human and telling them we regard bumping as a form of spam is a friendlier approach than treating them the same as a spambot offering free furniture and phishing links.
A list of Do's and Don'ts is also provided to moderators. The Do's are sanity measures on the clerical side: always read notes before moderating; always add a note when moderating. The Don'ts are mostly protective measures: don't discuss moderation actions in public or with anyone besides staff or the player being moderated; don't use the term "Troll" or other ambiguous and potentially insulting labels; don't take sides in personal arguments; and so forth.
As a general rule, content is not hard deleted unless it poses a safety threat or breaches vulgarity standards. Threads or posts which appear to be entirely deleted are actually "un-approved", meaning that they remain visible to staff members and moderators, but not to other players.
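That visibility rule can be sketched as a single check (the role names here are assumptions for illustration): an "un-approved" post is hidden from ordinary players but remains readable by moderators and staff.

```python
# Sketch of the soft-delete ("un-approve") visibility rule described
# above; role names are illustrative, not taken from IPboard itself.
STAFF_ROLES = {"staff", "moderator"}

def can_view(post_approved: bool, viewer_role: str) -> bool:
    """Un-approved posts stay visible to staff and moderators only."""
    return post_approved or viewer_role in STAFF_ROLES
```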
Reports are handled by a moderator according to the procedures provided before being reviewed and closed.
We have also experimented with other escalation systems, such as the following:
- 1st Offence: Verbal Warning
- 2nd Offence: 24-72 hour Posting Disability
- 3rd Offence or Major Threat: Permanent Ban
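That experimental ladder reduces to a simple function of the offence count. The durations and "major threat" shortcut below come straight from the list above; everything else is a sketch:

```python
def penalty(offence_count: int, major_threat: bool = False) -> str:
    """Experimental three-step escalation ladder sketched from the
    list above: warning, timed posting disability, permanent ban."""
    if major_threat or offence_count >= 3:
        return "permanent ban"
    if offence_count == 2:
        return "24-72 hour posting disability"
    if offence_count == 1:
        return "verbal warning"
    return "no action"
```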
Penalties offered on the IPboard system include:
- A Warn Status meter: Players who are warned are given a visual representation of the number of times they have been warned, visible only to moderators and staff.
- Three variations of Suspension:
  - Moderation Preview: all player posts start un-approved, and only a moderator or staff member may re-approve them. Aside from the high recidivism rate wherever we have applied it, it is also not possible to distinguish un-approved posts from previewed posts.
  - Posting Disability: the player is unable to post. The benefit of this system is that it provides a timer for temporary suspensions; the problem is that it doesn't prevent use of the Private Message system.
  - Bans: the player is unable to access the forums. This is the most successful system at removing repeat abusers, though some still take it as an invitation to attempt circumvention.
The forum system is distinct from the moderation systems of other social media outlets. Staff members reserve the right to block or ignore any player who appears to harass them on Twitter or other channels. On Facebook, the automated vulgarity filter is set to strong, meaning it may occasionally hide even positive comments ("%^$# yeah that's awesome!"). Facebook profiles which appear to be unconstructive sock-puppets are reported according to Facebook's policies. Persistently aggressive or toxic commenters are hidden, so that their comments remain visible to staff but not to other followers.
The forum moderation is also somewhat distinct from the game moderation system, which focuses on chat abuses and game mechanic abuses and is exclusively managed by staff. A player who is forum banned can continue to play the game. A player who is game-banned cannot post on the forums.