Wikipedia:Moderator Tools/Automoderator

The Moderator Tools team is building an anti-vandalism 'automoderator' tool for Wikimedia projects. It will allow moderators to configure automated reversion of bad edits based on scores from a machine learning model. In simpler terms, we're building software that performs a similar function to ClueBot NG, but available to all language communities. Below you'll find a summary of this project, as well as some English Wikipedia-specific questions we have.
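
To give a concrete picture of the score-and-threshold workflow described above, here is a minimal sketch in Python. Everything in it (the function names, the threshold value, the revert call) is a hypothetical placeholder for illustration only, not Automoderator's actual code or configuration; the point is simply that a moderator-configured threshold decides whether a model's revert-risk score triggers an automatic revert.

```python
from dataclasses import dataclass


@dataclass
class Edit:
    """A newly saved revision to evaluate (hypothetical structure)."""
    rev_id: int
    page: str
    user: str


# Hypothetical community-configured threshold: a higher value means the
# tool reverts fewer edits and produces fewer false positives.
REVERT_THRESHOLD = 0.99


def score_edit(edit: Edit) -> float:
    """Placeholder for a call to a machine learning revert-risk model,
    returning a probability that the edit is bad and should be reverted."""
    raise NotImplementedError


def revert_edit(edit: Edit, reason: str) -> None:
    """Placeholder for performing the revert via the MediaWiki API."""
    raise NotImplementedError


def process(edit: Edit) -> None:
    """Revert the edit only if the model score clears the configured threshold;
    otherwise leave it for human patrollers."""
    score = score_edit(edit)
    if score >= REVERT_THRESHOLD:
        revert_edit(edit, reason=f"Automated revert: model score {score:.3f}")
```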

Further details and centralised discussion can be found on MediaWiki, but we also wanted to create a discussion venue on the English Wikipedia to discuss how Automoderator might be used here, particularly because of the existence of ClueBot NG. We recognise that ClueBot NG has been used here for a long time and has the trust of the community. If English Wikipedia editors don't want to use Automoderator, that's fine! Because ClueBot NG is trained specifically on English Wikipedia, we may find that Automoderator simply cannot be as accurate or comprehensive. But if we find that Automoderator is more effective or accurate than ClueBot NG, we want to keep the door open for the community to evaluate either transitioning to Automoderator or running it in parallel. We might also want to explore building shared features, such as false positive reporting and review, which ClueBot NG could leverage even if Automoderator isn't enabled as a full system.

Please share your thoughts on the talk page here or on MediaWiki. We also have an infrequent newsletter, which you can sign up to here.

Current status: We're looking for input on our measurement plan and invite users to test out Automoderator. We plan to pilot Automoderator on the Indonesian Wikipedia in May.