Trust and Safety Product

According to the Wikimedia Movement Strategy, the movement should be inclusive, welcoming, safe, and harassment-free. This is our shared commitment, and these issues affect everyone: all members of the movement and all users of our platforms.

In the context of online communities and platforms, this problem space is known as trust and safety. Our team works in this area and builds software to address issues of digital security, online privacy, knowledge integrity, and more. We collaborate with a range of stakeholders to approach these issues from multiple directions.

Team members

How we work

We provide the technical component of trust and safety, contributing to the wellbeing of all users of the Wikimedia platforms. We do this by:

  • Building software that allows anyone to take actions to enhance their security and privacy,
  • Working with community functionaries, and learning from community-led strategies, to tackle vandalism, disinformation, and other unwanted behaviors,
  • Working with our colleagues in Site Reliability Engineering, Security, and Legal on scaled abuse work and on keeping up with data privacy standards and regulations.

Our team reaches out to functionaries such as stewards and CheckUsers to ensure that our work is informed by the everyday experiences of our volunteers. We also work with the Research team to ensure that our strategies are supported by scientific methods, models, and insights.

Selected projects

Current projects

In the annual plan for 2024–2025, our work is documented under objective WE4, Trust & Safety, key results 1, 2, and 4. It is carried out through the following projects:

  • Temporary Accounts: this project creates temporary user accounts for unregistered users when they edit any wiki project. This enhances user privacy by reducing the amount of personally identifiable information, such as the IP address, that would otherwise be exposed when unregistered editors edit (a conceptual sketch of the idea follows this list).
  • Incident Reporting System: the aim of this project is to make it easy for users to report harmful incidents safely and privately, and to bring them to the attention of specialists, including administrators, Arbitration Committees, other community functionaries, and Wikimedia Foundation Trust and Safety staff. The goal is to have unwanted behavior addressed effectively and with appropriate urgency. The Universal Code of Conduct requires a reporting system to be in place, and the Movement Strategy recommendations also call for one.
  • Getting better at blocking bad activity on wikis: IP addresses are increasingly less useful as identifiers of individual actors, and blocking IP addresses has unintended negative effects on good-faith users. This project aims to decrease collateral damage and increase the precision of anti-abuse actions.
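
The sketch below is a minimal, hypothetical illustration in Python of the idea behind Temporary Accounts; it is not the actual MediaWiki implementation, and the names and data structures are assumptions. The point it demonstrates is that public edit attribution can be decoupled from the editor's IP address, which stays available only for authorized anti-abuse review.

    import secrets

    # Hypothetical sketch of the Temporary Accounts idea (not MediaWiki's implementation):
    # an auto-generated temporary username is shown publicly instead of the IP address.

    _temp_accounts: dict[str, str] = {}  # session token -> temporary username


    def temporary_username(session_token: str) -> str:
        """Return a stable temporary username for this session, creating one if needed."""
        if session_token not in _temp_accounts:
            _temp_accounts[session_token] = f"~temp-{secrets.token_hex(4)}"
        return _temp_accounts[session_token]


    def record_edit(page: str, text: str, session_token: str, ip: str) -> dict:
        """Attribute the edit to the temporary account; the IP address is not made public."""
        return {
            "page": page,
            "text": text,
            "author": temporary_username(session_token),  # visible in the public history
            "ip_for_authorized_review": ip,               # restricted to authorized anti-abuse work
        }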

Other projects

  • IP Info: this project aims to provide patrollers with reliable information about IP addresses, making it easier and more efficient for functionaries to protect the projects and communities from unwanted behavior (see the sketch below).
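
As a rough illustration of the kind of signal IP Info is meant to surface, the Python sketch below renders a patroller-facing summary for an IP address. The real feature is a MediaWiki extension; the fields, data source, and function names here are illustrative assumptions, not its actual API.

    from dataclasses import dataclass

    # Hypothetical sketch of an IP summary a patroller might see; the fields are assumptions.


    @dataclass
    class IPSummary:
        ip: str
        organization: str   # ISP or hosting provider
        country: str
        is_proxy: bool      # known proxy/VPN exit node?
        active_blocks: int  # blocks currently affecting this IP or its ranges


    def render_for_patroller(info: IPSummary) -> str:
        """Format the lookup result as a short summary a patroller can act on."""
        proxy_note = "known proxy/VPN" if info.is_proxy else "no proxy signal"
        return (
            f"{info.ip}: {info.organization}, {info.country}, "
            f"{proxy_note}, {info.active_blocks} active block(s)"
        )


    print(render_for_patroller(
        IPSummary("203.0.113.42", "Example Hosting Ltd", "NL", True, 2)
    ))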

Metrics and instrumentation

Decision records
