User:Doc James/Community health

We can do better at protecting our community from incivility and harassment. During the 2030 strategy process we came up with some specific ideas and other broad directions in which we want to work.

My question is what concrete steps can we take to improve community health?

1. Anonymous reporting of specific edits

I have some ideas on tech to address issues. In my opinion anonymous reporting of concerning edits to the community would be useful. Reports could come from anyone and would not need to come from the victim themselves. Concerning edits would end up on a wiki page where admins / others could review the concerns and hand out escalating blocks as needed. In my opinion it is essential to detect issues early and redirect users who are creating issues before the issues become entrenched. Doc James (talk · contribs · email) 04:09, 11 May 2020 (UTC)

  • We have lots of non-specific edit review through Huggle, AutoWikiBrowser, ORES, WikiLoop Battlefield, and the undo/rollback buttons. We have no system for anonymous reporting of specific edits. If we wanted to implement such a thing then the future system could follow any of those workflows. The challenging part is keeping anonymity, since for things like harassment the report itself may reveal the person flagging. That should not be a concern in most cases. The anonymous reporting issue came up in meta:Community health initiative/User reporting system consultation 2019, which is a perennial community need but which is super-problematic for the Wikimedia Foundation to host. In that system it would be possible to send in free-form complaints of any kind, not just flag edits. That project seems sunk with no reported outcomes. Blue Rasberry (talk) 15:30, 11 May 2020 (UTC)
I see this as a way in which people trying to make a point could harass an editor with complaints about relatively mild impoliteness that is basically within the community norm, and hope that some admin would take it more seriously than it deserves. This is one of the dangers of all forms of anonymous complaints. I see no way of constructing a system that is free from this. DGG ( talk ) 01:02, 17 May 2020 (UTC)
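A minimal sketch of what the reporting workflow proposed in this section could look like, assuming a hypothetical review-queue page and made-up helper names (none of this is an existing MediaWiki feature); the key design point is that a report records the edit and the concern but nothing about who filed it:

<syntaxhighlight lang="python">
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class EditReport:
    """Hypothetical anonymous report: stores the flagged revision and the
    concern, deliberately nothing about the person filing it."""
    rev_id: int     # the specific edit being flagged
    category: str   # e.g. "harassment", "incivility"
    note: str = ""  # optional free-form context
    filed_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

def to_queue_row(report: EditReport) -> str:
    """Render one wikitext table row for a hypothetical review-queue page
    where admins / others could review concerns and respond with
    escalating measures as needed."""
    diff_link = f"[[Special:Diff/{report.rev_id}]]"
    return f"|-\n| {diff_link} || {report.category} || {report.note} || {report.filed_at}"

# Anyone, not only the victim, could file a report:
print(to_queue_row(EditReport(rev_id=955500000, category="harassment")))
</syntaxhighlight>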

2. AI assisted detection of concerns

We have been looking at AI tech to detect undisclosed paid editing[1][2] and harassment[3] since at least 2015, with the ultimate hope of folding this sort of tech into ORES. Doc James (talk · contribs · email) 04:10, 11 May 2020 (UTC)

I started documentation at Wikipedia:Automated moderation. Yes, this is a great direction. Blue Rasberry (talk) 15:30, 11 May 2020 (UTC)
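For a concrete sense of what wiring this into ORES could look like: ORES already serves per-revision scores over a public HTTP API, and the damaging and goodfaith models below are real enwiki models, though the flagging thresholds here are arbitrary illustrative assumptions:

<syntaxhighlight lang="python">
import requests

ORES_URL = "https://ores.wikimedia.org/v3/scores/enwiki"

def score_revision(rev_id: int) -> dict:
    """Fetch ORES 'damaging' and 'goodfaith' probabilities for one edit."""
    resp = requests.get(ORES_URL,
                        params={"models": "damaging|goodfaith",
                                "revids": rev_id})
    resp.raise_for_status()
    scores = resp.json()["enwiki"]["scores"][str(rev_id)]
    return {name: scores[name]["score"]["probability"]["true"]
            for name in ("damaging", "goodfaith")}

# Illustrative thresholds only: surface suspect edits for human review,
# never hand out blocks automatically.
s = score_revision(955500000)
if s["damaging"] > 0.8 and s["goodfaith"] < 0.2:
    print("queue for human review:", s)
</syntaxhighlight>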

3. Allow users to remove geolocations from their own Commons images

Many phones automatically add geolocation data to the EXIF metadata of images. While one can make this less visible, one cannot actually remove it from Commons / Wikipedia unless one is an admin and downloads the image, uses a snipping tool or print screen to put it into Paint, re-uploads it, and deletes the original copy. This exposes the user's location, which in many situations is not required or appropriate. Doc James (talk · contribs · email) 04:17, 11 May 2020 (UTC)
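For illustration, removing only the GPS block from a local copy of a JPEG is a few lines with the piexif library; an on-wiki feature would need server-side support, but this is the underlying operation (the filename is a placeholder):

<syntaxhighlight lang="python">
import piexif

def strip_gps(path: str) -> None:
    """Delete only the GPS IFD from a JPEG's EXIF, leaving the rest of
    the metadata (camera model, exposure, etc.) intact."""
    exif_dict = piexif.load(path)
    exif_dict["GPS"] = {}  # drop all geolocation tags
    piexif.insert(piexif.dump(exif_dict), path)  # rewrite the file in place

strip_gps("my_upload.jpg")  # placeholder filename
</syntaxhighlight>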

4. Develop metrics to measure community health

We need to have measures to track our efforts against. How do we measure whether we are improving or not? One method is of course regular surveys of our editing communities. But what do people see as other measures? Number of socks detected? Frequency with which profanity is used to insult other editors? Number of accounts blocked for incivility? Doc James (talk · contribs · email) 04:28, 11 May 2020 (UTC)

For various reasons I believe there is conflict between what the WMF will want to measure and what the community will want to measure. When the community reports certain conflicts, the WMF feels responsible and feels an obligation to address them. If the WMF cannot address a conflict, then in various ways it suppresses the reporting system as a way of eliminating the reporting of problems. We can come up with measures of health, but the programs at meta:Community health initiative were problematic in that there is some barrier between community requests and WMF acknowledgement. I think the issue is WMF existential fear of having awareness of problems rather than technical barriers or lack of resources.
One way of going forward with this is arbitrarily choosing any health metrics which are palatable, regardless of their utility or community demand for them, just to advance the conversation and set the precedent that there should be metrics for community health and that the WMF does not need to feel shame over their existence. Blue Rasberry (talk) 15:36, 11 May 2020 (UTC)
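To make the metrics question concrete: some of the candidate measures above are already countable from public logs through the MediaWiki API. A sketch counting block-log entries over a time window (the query parameters are real API parameters; whether raw block counts are a useful health metric is exactly the open question above):

<syntaxhighlight lang="python">
import requests

API = "https://en.wikipedia.org/w/api.php"

def count_block_log_entries(newest: str, oldest: str) -> int:
    """Count block-log entries between two ISO timestamps; the log is
    returned newest-first, so 'lestart' takes the later timestamp."""
    params = {"action": "query", "list": "logevents", "letype": "block",
              "lestart": newest, "leend": oldest,
              "lelimit": "max", "format": "json"}
    total = 0
    while True:
        data = requests.get(API, params=params).json()
        total += len(data["query"]["logevents"])
        if "continue" not in data:
            return total
        params.update(data["continue"])  # follow pagination

# e.g. one week in May 2020 (includes blocks, reblocks, and unblocks)
print(count_block_log_entries("2020-05-17T00:00:00Z", "2020-05-10T00:00:00Z"))
</syntaxhighlight>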

5. Real time feedback of concerns

Twitter just launched these prompts. You can see an example here. They would give people a second chance to double-check that they truly want to publish something. Doc James (talk · contribs · email) 07:29, 14 May 2020 (UTC)

It might be simpler to have the standard Gmail 30-second pause to think before posting, for everything. DGG ( talk ) 03:15, 17 May 2020 (UTC)
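A rough sketch of what either nudge could look like on the client side, combining the prompt and the pause; the word list and the delay are placeholder assumptions, and a real deployment would use a trained classifier rather than word matching:

<syntaxhighlight lang="python">
import time

FLAG_WORDS = {"idiot", "stupid", "shut up"}  # placeholder list, not a real filter

def confirm_before_posting(text: str, pause_seconds: int = 30) -> bool:
    """Twitter-style prompt plus a Gmail-style delay: warn if the draft
    looks uncivil, then leave a window to cancel before publishing."""
    if any(word in text.lower() for word in FLAG_WORDS):
        answer = input("This may come across as uncivil. Post anyway? [y/N] ")
        if answer.strip().lower() != "y":
            return False
    print(f"Posting in {pause_seconds}s; press Ctrl-C to cancel.")
    try:
        time.sleep(pause_seconds)
    except KeyboardInterrupt:
        return False
    return True
</syntaxhighlight>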