OK, so a thread on how to use the moderation features of Mastodon:

This thread will focus on using Mastodon on the web UI, not on mobile or with an app.

If you see a post in a timeline that you need to report or deal with immediately, the menu (three dots) on the toot itself gives you moderation options: "Open this status..." takes you to the post itself, and "Open moderation interface..." takes you to the user who posted it.

I should make it clear that not all of these options are available to users --- this thread is to educate people with admin responsibility on an instance.

You can also deal with a user directly from their profile:

It is *highly* recommended that members of an instance report toots that they feel are problematic, that make them feel unsafe, or that they have any other issue with, whether about the content or its implications. Don't spam your moderators, of course, but do speak up.

When reporting a toot, please give information on what is problematic about the post. While it's often obvious from the toot's content, moderators are not mind readers.
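
The thread focuses on the web UI, but for completeness: a report with a comment can also be filed through Mastodon's REST API. A rough sketch in Python, where the instance URL, access token and IDs are all placeholder values:

    import requests

    INSTANCE = "https://example.social"   # placeholder instance
    TOKEN = "YOUR_ACCESS_TOKEN"           # token with the write:reports scope

    # File a report against an account, attaching the offending toot(s)
    # and a comment explaining what is problematic about them.
    resp = requests.post(
        f"{INSTANCE}/api/v1/reports",
        headers={"Authorization": f"Bearer {TOKEN}"},
        data={
            "account_id": "123456",       # placeholder: ID of the reported account
            "status_ids[]": ["654321"],   # placeholder: the toot(s) being reported
            "comment": "Explain here what is problematic about the post.",
        },
    )
    resp.raise_for_status()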

Once a toot has been reported with information on what is problematic about it, it can be viewed by moderators in the reports area of the moderation interface. This can be found by going to Preferences → Moderation → Reports.

Reports by local users show the username of the person who reported. Reports by federated users show only the instance that the reporting account is on. This protects the anonymity / security of federated users, but still lets moderators keep an eye on who reports what on the local instance.
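
For admins who prefer scripting, the same report queue is exposed through Mastodon's admin REST API (field names can vary between versions). A minimal sketch with placeholder values:

    import requests

    INSTANCE = "https://example.social"   # placeholder instance
    TOKEN = "YOUR_ADMIN_ACCESS_TOKEN"     # token with the admin:read:reports scope

    # List reports: the same queue shown under
    # Preferences → Moderation → Reports in the web UI.
    resp = requests.get(
        f"{INSTANCE}/api/v1/admin/reports",
        headers={"Authorization": f"Bearer {TOKEN}"},
    )
    resp.raise_for_status()
    for report in resp.json():
        print(report["id"], report["comment"])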

At this point, a moderator can do a number of things. The reason for reporting is particularly helpful here: in this case, my toot was reported for "Oats". The reason given will help the moderator(s) come to a decision on what should be done with the individual post, or the account that posted it.

A user that has been reported more than once will have multiple reports under their username. Each one can be clicked on, and if there are posts associated with the report, the problematic posts can be read.

Once the moderator has gone into an individual report, they will see a lot of information related to the report. The most important parts are the toots that were reported and the reason for reporting, but at this point the moderator has the option to take action.

The warn / disable / silence / suspend buttons are particularly important.

Once a report has been dealt with (or if action is not considered necessary), the report can be "Marked as resolved", which effectively closes and archives it. A report can be reopened if there is a need.
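
Resolving and reopening can also be done via the admin API, if you'd rather script it. A sketch where the instance, token and report ID are all hypothetical:

    import requests

    INSTANCE = "https://example.social"   # placeholder instance
    TOKEN = "YOUR_ADMIN_ACCESS_TOKEN"     # token with the admin:write:reports scope
    REPORT_ID = "1234"                    # placeholder report ID

    headers = {"Authorization": f"Bearer {TOKEN}"}

    # Mark the report as resolved (closes and archives it)...
    requests.post(f"{INSTANCE}/api/v1/admin/reports/{REPORT_ID}/resolve", headers=headers).raise_for_status()

    # ...and reopen it later if there is a need.
    requests.post(f"{INSTANCE}/api/v1/admin/reports/{REPORT_ID}/reopen", headers=headers).raise_for_status()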

Warnings will be sent to the user (via email if you choose) and have no other effect unless a moderator chooses to take further action when sending the warning.

They are good for when a user steps out of line, but a moderator does not consider immediate suspension necessary.
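
A warning can also be sent through the admin API's account action endpoint, using type "none" so that no disable / silence / suspend happens alongside it. A sketch with placeholder values (the warning text is just an example):

    import requests

    INSTANCE = "https://example.social"   # placeholder instance
    TOKEN = "YOUR_ADMIN_ACCESS_TOKEN"     # token with the admin:write:accounts scope
    ACCOUNT_ID = "123456"                 # placeholder account ID

    # type "none" sends the warning without any further action;
    # the text is shown to the user and can also be emailed to them.
    resp = requests.post(
        f"{INSTANCE}/api/v1/admin/accounts/{ACCOUNT_ID}/action",
        headers={"Authorization": f"Bearer {TOKEN}"},
        data={
            "type": "none",
            "text": "Please keep your replies civil.",
            "send_email_notification": "true",
        },
    )
    resp.raise_for_status()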

The menu at the top gives the full range of options available to a moderator. The options will be explained in the next few posts.

Disable login does just that: nothing else about the account changes, but the person behind it will no longer be able to log in.

As I understand it, silencing an account stops the account's posts from being seen by people that don't follow it, but leaves them visible for those that do follow the account.

"Suspend and irreversibly delete account data" is the big one. It does just that --- the user is effectively gone from the instance if they're local, or become entirely invisible if they're a federated account. Links to their account (including ones in toots) fail and their posts are deleted.

These same actions can be taken from the blue buttons on a report page, as I discussed before.

If action on the account as a whole is not required, then an individual toot can be modified by a moderator.

If there is an image attached to a toot, the image can be made sensitive / not sensitive (hide or show the image by default when someone views it, respectively). While there is currently no option to add a content warning to a toot that has already been created, a toot that needs a content warning / is unsuitable for the local instance can be deleted.

Shown in the image is a close-up of the report page for this toot, which has been reported. You can also modify an account's toots from their page in the moderation interface, which I will cover in the next toot.

If you are not viewing a report, but instead the profile of the account through the moderation interface, then the same actions are available.

To get to an account through the moderation interface, go to Preferences → Moderation → Accounts.

There are a lot of options along the top to refine your search for a particular user, or you can scroll through the list.
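
That account search can also be scripted against the admin API, though the exact filter parameters differ a little between Mastodon versions, so treat this as a sketch with placeholder values:

    import requests

    INSTANCE = "https://example.social"   # placeholder instance
    TOKEN = "YOUR_ADMIN_ACCESS_TOKEN"     # token with the admin:read:accounts scope

    # Search local accounts by (partial) username: the API equivalent of the
    # filter options along the top of Preferences → Moderation → Accounts.
    resp = requests.get(
        f"{INSTANCE}/api/v1/admin/accounts",
        headers={"Authorization": f"Bearer {TOKEN}"},
        params={"local": "true", "username": "someuser"},
    )
    resp.raise_for_status()
    for account in resp.json():
        print(account["id"], account["username"])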

Once you have found the account, clicking on it will show you the profile / bio etc., and, underneath, the options shown in the image.

One of the most useful parts here is the button that lists the number of statuses. Clicking on it will show you a timeline of posts, replies and boosts that the account has created. Any of these can be individually modified (make images visible / hidden by default; delete toot) in the same way as I have already stated.

@alyaza @DissidentKitty @RadioAngel this should help to get you up to speed. Let me know if you'd like a clarification, or more information about anything.
