Wikipedia:Wikimedia Foundation statement on paid editing and outing

Investigation of undisclosed paid editing on Wikipedia often requires a search for information about an editor outside Wikipedia. Information such as listings on job sites, real identities listed on a company website, or a link between a username and an identity on a third-party website can all provide evidence that an editor is engaging on Wikipedia in areas where they have a conflict of interest. A transparent, public record of these investigations allows users to draw links between connected paid editors, expose repeat offenders, and ensure that investigations are being conducted with integrity.

On the other hand, the English Wikipedia's harassment policy states that personal information about an editor should not be disclosed on Wikipedia, that even discussing a person’s other pseudonyms may be controversial, and that when investigating issues such as undisclosed paid editing or conflicts of interest, the information should be privately emailed to investigators such as admins, functionaries, the Arbitration Committee, or the Wikimedia Foundation.

The combination of the outing policy and the needs of effective undisclosed paid editing investigation currently creates a conflict: dedicated users who identify and post links between a Wikipedia account and a paid editor would like guidance on how to avoid violating the outing policy.

WMF Legal’s views

WMF Legal has been asked to clarify our role in combating undisclosed paid editing on the projects and to provide our opinion on the outing vs. paid editing investigation conflict.

First, it should be understood that our position here is advisory. The privacy policy applies to data collected by the Wikimedia Foundation, by Foundation partners, by some users in special roles (such as checkusers), or by some third parties who provide data to the Foundation. It does not apply to the publicly visible postings of users on the projects or to information collected from other websites and later posted to the projects. As such, our opinion represents our view of the best way to handle these issues, but does not represent a legal requirement for the projects.

The communities' role

In our view, community members who are interested in these issues play the most important role in protecting the projects. If a user violates the Terms of Use or any other policy, the first course of action is to help educate the user, warn them about the issue, or block that user as appropriate. Our role in WMF Legal is to back up these community decisions as necessary. Our tools are able to provide support to community actions, address severe situations, and prevent misuse of the Wikimedia trademarks.

We also think that some degree of transparency in investigations helps the communities do a better job combating undisclosed paid editing. Posting and discussing information such as links to an editor’s job posting, company profile, or other information connecting that editor to editing an article subject for pay can be an effective way to identify and stop undisclosed paid editing. These kinds of transparent investigations may also help prevent abuse and ensure that people who aren’t actually connected to editing for pay can have an opportunity to explain their situation if circumstances cause a mistake to happen. It’s also important to remember that WP:OUTING can’t be used as a way to avoid the disclosure requirements in the Terms of Use: if someone is editing for a company and fails to disclose it, an admin properly posting that person’s company where it is relevant to an investigation is helping bring the account into compliance with those requirements.

NB: The Arbitration Committee posted a statement emphasizing the importance of upholding strong privacy protections for users. We strongly agree with this value, and would like to clarify that investigations do not necessarily need to be conducted transparently where personal data is at stake. We are supportive of any solution the community reaches that balances transparency where necessary with protection of personal information as decided by community consensus. Jrogers (WMF) (talk) 23:18, 3 March 2017 (UTC)

When looking at rules in the abstract, it can sometimes be difficult to draw the line between harassing someone by posting their personal information and engaging in a fair and transparent investigation. However, we’ve seen the Wikimedia communities successfully address issues that require making difficult judgments. In our opinion, it is never appropriate to hound a person, repeatedly post their personal information, or post the personal information of an innocent person even once in order to maliciously draw attention to them. But we think it is appropriate, as described above, to post some already public personal information as part of a good faith investigation. We trust that editors, administrators, and functionaries can tell the difference between harassment and appropriate flagging of public personal information in the vast majority of cases, and they can always contact the Wikimedia Foundation if a controversial borderline case arises.

Some factors that could help in distinguishing between good faith investigation and harassment include:

  • The type of information being posted and whether it’s more than necessary for the investigation
  • The source of the information (for instance, whether the source is reliable and publicly accessible)
  • How public the individual being identified, or the information being posted, already is
  • Where and how much the individual is posting the information on wiki (i.e., is someone spamming someone else’s personal info around?)
  • The scale of the problem and whether it requires a particularly thorough investigation to combat
  • Why the information was posted

WMF Legal's role

WMF Legal has a number of tools that we can use to help address issues of undisclosed paid editing, but they do not apply to every case and they can have varying levels of effectiveness depending on the details of a particular case.

First, we collect information about paid editing whenever reports are sent to us or flagged to us. This allows us to identify repeat players or problematic trends over time.

Second, we enforce the Wikimedia trademarks. People who engage in inappropriate paid editing practices and advertise their services sometimes use the puzzle globe or other Wikimedia marks, which is unfairly appropriating the goodwill of the Wikimedia projects for their own profit. When these cases are flagged to us, we respond quickly to them because trademark law has a relatively refined reporting regime on most websites. This can lead to malicious websites being taken down, individual postings being removed, or specific bad actors being banned from other sites.

Third, we can choose to send cease-and-desist letters where violations of the Terms of Use related to paid editing are found and blocking the user hasn’t been enough to solve the problem. This is reserved for severe cases because we do not want to bring this sort of legal tool to bear against someone who is complying with the Terms of Use or has simply made a mistake. Cease-and-desist letters are also reserved for severe cases because if one fails, it may lead to a lawsuit, which we see as a final option.

On privacy generally

We applaud the user communities for setting high standards of privacy for themselves beyond what the law requires. There have been rare instances where the posting of personal information on the Wikimedia projects has been used to harass people, and we encourage users to err on the side of caution and avoid posting identifying information about others when they are uncertain whether doing so is appropriate.

As indicated above, though, we don’t think there is a single hard and fast rule for when it is okay to post publicly available personal information. Good faith investigations of people who are required by the Terms of Use to disclose things like company information are examples where posting some information is helpful rather than harmful, in our opinion. And there may be other situations, such as combating vandalism, where a transparent investigation could help users identify links between different sock puppets even when paid editing is not involved. Again, though, we recommend considering factors such as the source of the information and how public it already is in making such determinations.