Healthy, Inclusive Communities

What impact would we have on the world if we follow this theme?

We could lead the way in collaborative, online digital humanities. Marthacustis (talk) 02:47, 12 May 2017 (UTC)

We will be able to cover many projects effectively and on time. Jerome Enriquez John 14:31, 22 May 2017 (UTC) — Preceding unsigned comment added by Jeromeenriquez (talk • contribs)

The community can only stay healthy if we stop separating information based on language. Having a single wiki would elevate the scope and use of talk pages, create more dialogue, and hopefully begin to mesh the compartmentalized wiki encyclopedias that currently exist independently for each language. So much data is isolated because of the lack of inclusion created by this antiquated separation based on language. Now that browser-based translators exist both visually and audibly, only informational dialects remain. User:Anocratic

When the internet was first invented, there was a somewhat utopian idea that knowledge should be free and that the internet had the power to connect people into a global, informed, cooperative community. However, there were also unintended consequences: the internet has been tainted by the spread of misinformation, lies and fake news; by online bullying and trolling (which undermine our humanity and fracture communities); and by manipulation through advertising and big data... If Wikimedia follows the "healthy, inclusive communities" theme, it can create the utopia that the internet was originally intended to be. Any tool is only as good as those who wield it. If the community is not healthy, Wikimedia is poisoned from within. If Wikimedia succeeds in this theme, more and more people will be drawn into the safe haven of its community, which will grow stronger and stronger, and together we will be able to tackle whatever the future holds. Together we can help fix the internet. Powertothepeople (talk) 11:55, 5 June 2017 (UTC)

We have the potential to accidentally alienate a large number of existing contributors by politicizing difficult topics like medical conditions, unpleasant historical events, and scientific research that is under hot controversy (see Chilling Effect). Aaron Muir Hamilton <aaron@correspondwith.me> (talk) 20:52, 6 June 2017 (UTC)

We can cause communities to become more data-, science- and evidence-based and to interact with each other in new, more constructive and collaborative ways. I also agree with Aaronmhamilton: as we get more communities on board and as new communities are formed, there are often-overlooked problems, which is why we also need to build in new measures that ensure quality, neutrality, lack of bias, etc. Very important in this context is that readers of Wikipedia be informed about major viewpoints as well as conflicts between viewpoints (this of course requires all the respective sides and/or issue-neutral individuals to participate, and to do so properly). --Fixuture (talk) 22:48, 11 June 2017 (UTC)

How important is this theme relative to the other 4 themes? Why?

Most important: community is the critical success factor. You will not be able to execute on any plan or vision without a healthy, inclusive community. We need to build self-sustaining teams to solve problems; this will require training, leadership, and resources. Marthacustis (talk) 02:49, 12 May 2017 (UTC) But we know what we need, please. We should focus on what we want. We want transparent, liquid-like thinking. Objectivity is the most important goal.

Absolutely the most important. Wikipedia needs more editors who stick around and improve articles. As long as some "old guard", real or imagined, refuses to cooperate or engages in behavior characteristic of ownership, improvements will be stifled and lost, and editors will be turned away. I've started a discussion about featuring a caution against this behavior more prominently in Wikipedia policy. Edit summaries are especially important when reverting another editor's good-faith work. They should refer to relevant Wikipedia policies and guidelines, previous reviews and discussions, reliable sources, or specific grammar or prose problems introduced by the reverted edit, and not merely protect a certain version, stable or not. This is a small step towards ensuring a welcoming community that really lets anyone edit any article in good faith, instead of turning away contributors and contributions with ownership behavior. Bright☀ 08:21, 13 May 2017 (UTC)

I believe this is in conflict, or at least in tension, with another theme in the Strategy: "The Most Respected Source of Knowledge". In brief, people who are skilled at finding and synthesizing information are not always skilled in social interactions, and vice versa. These are independent qualities, and having skill in one does not necessarily lead to skill in the other. This has been demonstrated on the English Wikipedia numerous times: quality contributors who prove to be uncivil, and people who are skilled in social interactions who are revealed to be advocating an agenda that compromises the quality of articles. This is a complex issue, and there are no easy solutions to resolve this conflict of desires. (I doubt even rewriting Wikipedia's rules would entirely solve it.) But this issue at least needs to be formally acknowledged so it can be addressed. -- llywrch (talk) 03:00, 14 May 2017 (UTC)

I've not met these good contributors who are uncivil. First, if they synthesize information they're acting against Wikipedia policy (but I'll assume that by "synthesize" you meant arranging sourced information into an article). Second, it's not a popularity contest and you don't have to butter anyone up, you just have to communicate. If you can write an article, you can write a comment explaining the reason for your edit. Third, all cases of incivility that I've encountered were explicitly designed to avoid Wikipedia policies, guidelines, RfCs, discussions, and consensus. Recently, for example, someone was told multiple times to follow an RfC consensus, and when they reverted an edit that was in accordance with that consensus, they became uncivil and refused to discuss. While the issue you raise may apply in some very limited contexts, I have never encountered it, and all cases of incivility I've encountered were attempts to avoid Wikipedia policies and consensus. Bright☀ 13:00, 15 May 2017 (UTC)
Hello there! I've been banned for it. Most people like me throw in the towel long before reaching this point; they understand how petty and what a waste of time this all is. Continuing like this makes us depressed. Why am I here, you ask? Wikipedia happens to be the only source that synthesizes the topics I work on in a broad, objective and interesting way (or at least has the ability to). What is regarded here as a "reliable source" tends to be too simplistic, too specific or too academic: not the best, or even findable, for a casual reader. We go deep into a topic. We know what's up and understand the rules. More precisely, we understand why the rules and guidelines are there, but we also know when it is in everyone's best interest not to follow them. Common sense. Expert understanding. WP:AIM means something to us. WP:IGNORE is policy but cannot be relied upon at all; you'll get bitten. We get attacked with policy and they defend with policy. We don't have time for this. We get provoked. We use harsher words. We lose. WP:GOODFAITH is not assumed at all. No match. Of course you don't notice these contributors... which is exactly the problem!
As for your second part ("policy"!, communicate, but the policy!, "consensus", ME, "avoid"): yes, those are exactly the things we get angry about. We try to stay away from it. We're competent; it's doable to avoid, but when it hits, it hits hard. We are the uncivil ones, the outsiders to be converted (a deliberate reference to imperialism, not meant as racism). Indeed, we have no experience juggling "policy", which is often just a misinterpreted guideline or essay passed off as law. When we communicate, they would rather discuss the messenger than the message. This goes over our heads: we are still stuck discussing the message. We lose. Understand that "consensus" has a huge bias: it comes from management in their tower, and it is what is best for them. I can agree with it, but never with its binary application; there is no room for what is best for the common goal. "You just have to communicate" is theory, not practice. Don't get me wrong, I agree, but what the word "just" implies is: 1. not always easy for everyone, and 2. not working well. I've done it. Once. The other editor admitted a misunderstanding. So far so good, but a revert of the fallout never happened. I took it up with the administrators. It got frozen after discussion pending an investigation by the Wikimedia legal team, which ended with a new page on the subject, essentially proving me right. I communicated one way, with facts and policy, but got back the same things that had been disproved by the third party, or guidelines were thrown in that were not applicable at all. It was closed down with "A contentious fact does not become uncontentious by virtue of repetition. Closing as Vexatious". You can't beat them at their own game. Up to the highest levels they do what they want and ignore things when it suits them; facts, admitted mistakes and policy be damned. Communication requires understanding. Paper only wins against rock in the children's game...
To end my rant: you are very right when you say this is absolutely the most important theme. And your experience is right in that this will never be a majority case; the numbers agree that different people have similar experiences. Now watch how they react only to how I said it, and not to what I said... --Ondertitel (talk) 20:41, 31 May 2017 (UTC)
Unfortunately I do not understand your point. Could you try leaving out the rant and just make the point? Cheers, • • • Peter (Southwood) (talk): 09:34, 1 June 2017 (UTC)
I can't tell if you are serious or not. Sarcasm doesn't work well on the internet. Looking at other reactions on this page, I'm afraid you are bloody serious. I've literally put one word in bold... illustrating my point very nicely. I'll humor you with a comparison: Islam, for example. The expectation is a good, peaceful religion, but in practice it ranges from headscarf enforcement to IS knuckleheads. Wikipedia is like this, but it is not difficult at all to say 'no thanks' to when you encounter its intricacies and the people that go with them. Let me point to some other things related to the problem: the Dunning–Kruger effect, survivorship bias and Brandolini's law. --Ondertitel (talk) 13:06, 5 June 2017 (UTC)
Quite serious. As you say, sarcasm does not work well on the internet. If you take my words at face value, you will be getting the message I wish to impart. Wrapping one's point in a load of invective, hyperbole and apparent irrelevance makes it difficult to identify the core point, which is a necessary first step towards understanding it, which, as you have pointed out, is important for effective communication. This is particularly problematic when trying to communicate with people who have a different cultural background, which is often the case on English Wikipedia.
With reference to the Dunning–Kruger effect and its relevance to this discussion: most people appear to think that they are good at communication. Some are. • • • Peter (Southwood) (talk): 06:08, 6 June 2017 (UTC)

Some very experienced editors organise themselves into cartels so that they can claim consensus, thereby circumventing Wikipedia policies and obviating the need to come up with quality arguments in defence of their actions. These cartels are very organised, very aggressive, very hostile, very vindictive and very intimidating. For these people, it is of no importance that their edicts are not backed by policy, or that their reversions are based on little more than "they don't like it." And it makes no difference if evidence-based arguments (i.e. high-quality references to literature in the field) are brought to bear; they can always say "well, there are five of us who like it this way - and only one of you - feel free to go and get other editors to support your position." New editors are not likely to know enough people on Wikipedia to be able to counter any consensus that these well-organised groups put up. In this way, new editors are always disadvantaged. These cartels are making scores and scores of unjustified deletions every day, and posting warning messages on individual editors' personal talk pages - typically very vague and unhelpful statements like "we don't normally accept these types of edits." Anyone who dares to question their edicts then becomes tied up with lengthy debates on a project talk page - sometimes running to more than 15,000 words per edit and spanning several months - while they continually come up with new reasons why a given edit must be denied. In addition, this group has its henchmen, who are willing to target editors who disagree with their suggestions by following them around, locating every page they have ever worked on, making unjustified deletions, and often leaving snide remarks on the article's talk page or the personal talk page. This is very vindictive and disruptive behaviour and is alienating literally hundreds of users every week. The counter-arguments by these cartels are often irrational, rely on contorted interpretations of policy, are often little more than personal attacks and, eventually, when they are losing the debate based on argument and evidence, they pull out their trump card, which is simple weight of numbers. So presenting counter-arguments against their edicts is always futile, because in the end the group will always outnumber individual editors. Many new editors who encounter these groups are so intimidated by the hostility and bullying that goes on, all in the name of Wikipedia, that they simply quit editing. These cartels cannot be stopped, because even admins say "well, they have consensus - and that is how it works around here." I appreciate that consensus is fundamental to Wikipedia's way of operating, but there are groups of users who are abusing the consensus approach to push personal agendas and to bully less experienced editors into submission. Ultimately these groups are alienating very large numbers of new editors daily. It would be good if consensus were based on policy and quality of argument, rather than simply weight of numbers. BronHiggs (talk) 21:44, 20 May 2017 (UTC)

Potentially repressive. In some situations, editors have to be uncivil in order to express their feelings and opinions. Decreasing incivility can potentially wreck the openness and health of this community. Music314812813478 (talk) 03:49, 21 May 2017 (UTC)
Music314812813478, I can disagree completely with your statement above without needing to be uncivil to express myself. • • • Peter (Southwood) (talk): 06:48, 23 May 2017 (UTC)
BronHiggs, you make a large number of accusations above; do you actually have reliable evidence to support these claims? I refer here to an actual survey done by a competent researcher, not a collection of cherry-picked diffs. • • • Peter (Southwood) (talk): 06:48, 23 May 2017 (UTC)
Pbsouthwood: The decline in the number of Wikipedia editors is well documented in the literature. Many academic research studies attribute the decline to Wikipedia's inability to hold onto new editors. A major study by Ortega (2009) shows, amongst other things, that the mortality rate among new editors is very high, revealing an endemic problem (https://www.researchgate.net/profile/Felipe_Ortega2/publication/200773248_Wikipedia_A_quantitative_analysis/links/00463519b697045820000000.pdf).
Aaron Halfaker has undertaken a number of studies into Wikipedia's editing environment. In one 2009 study, he found that reversions decrease participation by both old and new editors (see: http://s3.amazonaws.com/academia.edu.documents/30625198/proceedings_p163-halfaker.pdf?AWSAccessKeyId=AKIAIWOWYYGZ2Y53UL3A&Expires=1495530516&Signature=3yUruCnI7r6vbEFWINShF1Dyx%2FE%3D&response-content-disposition=inline%3B%20filename%3DDont_bite_the_newbies_how_reverts_affect.pdf). In another 2009 study, he found evidence of ownership-type behaviours on WP (see: http://dl.acm.org/citation.cfm?id=1641332). In a more recent study, he reports, among other things, that "The decline represents a change in the rate of retention of desirable, good-faith newcomers... These desirable newcomers are more likely to have their work rejected since 2007. This increased rejection predicts the observed decline in retention", and that new users are being pushed out of policy articulation: "The formalized process for vetting new policies and changes to policies ensures that newcomers' edits do not survive." (See: http://www-users.cs.umn.edu/~halfak/publications/The_Rise_and_Decline/)
The research literature has also spawned several reviews of this body of work (see Ribé, 2013: http://ai2-s2-pdfs.s3.amazonaws.com/7264/05b404cdbde12adaf55e0ae85d4c588f04b8.pdf) which reiterate similar findings across a broad survey of the research carried out in this area. The real question is why WP cannot retain editors. And to answer that question, it might be useful for WP to have some insight into existing editors' actual experiences. This page is for editors to write about their experiences, and that is what I have done. People can take it or leave it. But whatever anyone may think, this has been my experience for the 6 months that I have been editing - and yes, I have been stalked, bullied and harassed - and yes, I can document all of it. BronHiggs (talk) 08:44, 23 May 2017 (UTC)
My problem here is that "incivility" might sometimes be confused with disrespect, disagreement, or negative opinions, especially by drama queens and attention-seekers. What is "civil" or "uncivil" should be very clearly and precisely defined so as to prevent such abuse of the rule. Music314812813478 (talk) 02:21, 24 May 2017 (UTC)
Incivility is difficult to define in a way that draws a hard line between civil and uncivil, and that line varies by culture. It is usually safer to stay well clear of the range of plausible confusion.
Disrespect, depending on which definition you use, can be considered a form of incivility. Respect is a thing that is earned, but lack of respect is not quite the same thing as disrespect. Civility, on the other hand, should be the default: one should not need to earn it, and should be able to expect it even in the absence of respect. Disagreement is a completely different matter. It is possible to be civil to people one disagrees with completely. • • • Peter (Southwood) (talk): 13:04, 24 May 2017 (UTC)

The most important. Any tool is only as good as those who wield it; if the community is not healthy, Wikimedia will be poisoned from within. Powertothepeople (talk) 11:57, 5 June 2017 (UTC)

I'd say this is the most important of all. Wikipedia, like all Wikimedia projects, was conceived as a free, open, decentralized encyclopedia that was, and still is, fully dependent on the community to run it. The whole point is that editors build on the works of other editors. If someone makes a mistake, someone else can fix it. If someone writes something biased, someone with an opposing bias can make it neutral. The stronger the community, the stronger the reliability, functionality, and diversity. In fact, by following this path, we indirectly implement the other four themes, because all of them start with us, the community. CreationFox (talk) 01:44, 10 June 2017 (UTC)

  • I'd consider participation the most important. I'd consider "community" of high importance, but not the highest, and more important in terms of the open ecosystem/movement in general. "Healthy, inclusive communities" sadly is not the same as that; however, of the themes it's the one closest to it. If this results in high importance being assigned to measures to get people with disabilities, gays, indigenous peoples, etc. on board, then I'd rather give it lowest importance, as such measures can even be counterproductive and are often overrated (i.e. at the least not worth any high financial costs as long as there are more pressing issues). Furthermore, it is important to improve the discussion climate and fight harassment, but I wouldn't consider that of the highest importance either (by the way, in my opinion the first step would be more respect towards users who spend time and effort contributing to Wikipedia: removals need proper explanations, their creators should be pinged, articles should by default be moved to draft space rather than outright removed, and complaints should get more attention). Changing the way newcomers are treated is important, though.
    Instead, the element of this theme that I'd consider of highest importance is measures to improve collaboration, community-building and engagement. The main way to improve that would be software improvements and extensions: new platforms and tools, as well as improvements to existing ones, in particular to WikiProjects. I outlined some ideas here (full suggestions can be found under "More comments"). --Fixuture (talk) 23:05, 11 June 2017 (UTC)

Focus requires tradeoffs. If we increase our effort in this area in the next 15 years, is there anything we’re doing today that we would need to stop doing?

Yes: we need advertisements and showcasing to promote the brand, and it is unclear that this is the case. We could shift resources from vandal fighting and pivot to community building. Marthacustis (talk) 02:52, 12 May 2017 (UTC)

Timespan or reduction of topic bans

The whole notion of "en:wp:Topic bans" needs to be re-thought and reduced, especially to limit topic bans to timespans of months, not endless millions of years. What is needed instead is term limits for Wikipedians, such as 2-year, 5-year or 10-year timeouts for get-a-life obsessives who cannot stop interfering with progress. Alternatively, the Foundation needs to run psychological evaluations, perhaps couched as comical ratings of "en:wp:wikineurosis" or "en:wp:wikipsychosis", as a way to monitor and encourage troublemakers to cease and desist, perhaps via strongly recommended (forced) wikibreaks of 2, 5 or 10 years. Most Wikipedians have no idea of the progress which could be made if the entrenched wiki-culture could be set aside to allow massive other ideas to grow into prominence, such as shiftable images, auto-hyphenated long words, spoken templates, or the "Micropedia" notion of information stored and combined from smaller snippets of text/images maintained with limits on size or complexity of data formats. Wikipedia in 2016 remained a rigid, static system which had made relatively little actual progress for many years. -Wikid77 (talk) 18:22, 13 May 2017 (UTC)

"Endless millions of years": could you clarify the apparent hyperbole? • • • Peter (Southwood) (talk): 12:45, 24 May 2017 (UTC)

What is diversity

In my opinion, gender diversity may be included in the larger group of diversity. The emphasis on gender diversity may be something that decreases in the near future, while diversity at large becomes more important; I cannot imagine that in 30 years the gender gap will have more relevance than it does today. In my opinion, the word diversity is connected more with "diversity of opinion", and this is a relevant aspect for the Wikimedia world, mainly in connection with the neutral point of view. This may be an important point in a world where globalization is the biggest trend. Minorities and smaller cultures probably will not be able to defend their diversity and will probably disappear. I think that diversity in general will be an increasingly important topic. --Ilario (talk) 19:00, 13 May 2017 (UTC)

I support the upper part of Ilario's statement. I'd like to extend it with "diversity of experience", "diversity of knowledge/expertise" and "diversity of cultural backgrounds". However, I do think that localization is a trend alongside globalization, and that minorities and smaller cultures have increasingly more ways to amplify their reach and make themselves heard. And I do think that smaller subcultures will increase alongside globalized culture(s) (similarly, on Wikipedia there would be Wikipedia sub-communities alongside the Wikipedia, or even the wider open, community). --Fixuture (talk) 23:15, 11 June 2017 (UTC)

This is very difficult to define for anonymous Wikipedia contributions, but we have to assume that there is no gender bias, since the gender or race of authors is mostly not known and there is very little direct contact between contributors. This is also relevant with regard to the cohesion of contributions in articles: too much diversity in style and academic depth will render some articles unreadable or stylistically so uneven that they are difficult to comprehend. Some themes should be left to those specialists and academics who can keep articles up to date, rather than collecting random contributions, or not accepting contributions based on random choices. What we gain in diversity, we might lose in cohesion, an important point in this discussion. (Osterluzei (talk) 03:49, 15 May 2017 (UTC))

Exactly how are we going to enforce gender diversity? Music314812813478 (talk) 01:47, 21 May 2017 (UTC)
What exactly are the results we are supposed to expect from this? Hopefully we are not going to forcefully make the numbers of contributors equal just for diversity. A less harmful way would be encouragement, not force. Music314812813478 (talk) 01:56, 21 May 2017 (UTC)
Opportunity is what we should go for, not equality. We should make Wikipedia open to all races, groups, backgrounds and genders, but we must not interfere with the interests of people. Opportunity is our goal, not equality. Music314812813478 (talk) 02:01, 21 May 2017 (UTC)
Strong support for Music314812813478's statement above. We should not try to artificially enforce equality; that will be counterproductive. People (incl. races, genders, whatnot) are different, and some people are interested in and equipped to improve Wikipedia and some simply aren't. We should try to:
  • be open to all of those who are willing and equipped to improve Wikipedia
  • be easy to use so that people equipped to improve Wikipedia but not as tech-savvy as others can contribute too (VisualEditor was a good thing to do)
  • encourage, and/or spend modest effort on increasing, the ability of people who are currently under-equipped or uninterested to improve Wikipedia (by "uninterested" I mean those people who wouldn't spend much time improving Wikipedia even if they could, e.g. because they get no pay or would rather just entertain themselves with media; it's not about recruiting people who like Wikipedia but don't think they can contribute or aren't aware of the way Wikipedia works and how they can contribute)
  • increase feedback for constructive contributions (e.g. edit competitions or statistics that properly show WikiProject members' contributions)
--Fixuture (talk) 23:26, 11 June 2017 (UTC)
I also think diversity is pointless. What difference does it make if an editor is a man or a woman? Music314812813478 (talk) 03:51, 21 May 2017 (UTC)
There have been numerous surveys that indicate that only 8-16% of Wikipedia editors are women. [1] If you don't understand why diversity is an issue, perhaps read the report. The report also explains what the obstacles are to more women contributing, and actions that could be taken to increase opportunity. Powertothepeople (talk) 07:36, 5 June 2017 (UTC)
While I agree that we want diversity of opinion and experience, that diversity often fractures broadly along the lines of sex, race, religion, sexuality, disability and so forth. Many of these have not been well studied, but the participation of women on Wikipedia has been studied. We know there are fewer of them, we know they are reverted more, and we know article content on topics of interest to women is shorter or less likely to exist (i.e. a content gap). I conduct training and mentor new users. Many of these are women. I do see examples of where content that women think interesting about a topic is removed with edit comments like "not encyclopedic" or "trivia". The gender bias on Wikipedia is very real and does not appear to be going away. Since the surveys that identified the gender bias, there has been a lot of grandstanding about how we need to do something about it, but we tinker around the edges, holding some women's edit-a-thons, while failing to address the fundamental issue of cultural change. I see no good reason to imagine that the situation with diversity of race, religion, etc. plays out any differently on Wikipedia; I suspect it's just less studied. Kerry (talk) 00:04, 12 June 2017 (UTC)
Diversity of interests and knowledge is important to produce the sum of all knowledge etc., as we edit what we choose. Otherwise, not so much. However, to get diversity of interests and knowledge it is very likely that we will need diversity of other things. It should not make any difference whether any given statement is written by a man, a woman, an artificial intelligence, an alien, or some dog working under a pseudonym. It is the quality of the contribution that matters (or should be). Some diversity we can do better without: vandals, spammers, hoaxers, drama queens, trolls, people who cannot work with others, and people who drive away others who would advance our project. There are still huge gaps in the knowledge represented by Wikipedia, even English Wikipedia, and much more so in other languages. Translation can help, but only after the knowledge has been recorded somewhere. A greater ethnic, cultural and geographic diversity of editors is more likely to close this gap than any number of (for example) white American males. • • • Peter (Southwood) (talk): 12:53, 23 May 2017 (UTC)
Well, one form of diversity is source editor vs Visual Editor. I guess we will never know what the Visual Editor folks think of this topic, because these pages are not enabled for the Visual Editor, and nor are talk pages. The fact that the community dismisses this group of contributors as "not real contributors" and "not really mattering", to the extent of not allowing them even to discuss anything, rather says it all. It's hard to imagine diversity thriving in a culture that rejects "anyone not like us" even being part of the conversation. I note that the vast majority of Visual Editor users that I know in the real world (and I admit this may be a small sample of the total on-wiki) are women. Kerry (talk) 06:42, 26 May 2017 (UTC)
Kerry, why are you blaming the community?? It's the WMF that insists on disabling the Visual Editor. The WMF has trouble with the simple wiki idea that a page is a page, regardless of the URL. They want different editing systems for different pages. Alsee (talk) 04:57, 1 June 2017 (UTC)
It's the WMF that is spending its money on developing it. It's the community that blocks its deployment on en.WP. For example, the reason VE users can't contribute to this discussion is that there was not enough support for allowing the VE to be used in the Wikipedia namespace. Here's another discussion that shows the community rejecting the idea that VE users are part of "all Wikipedians". [2] Kerry (talk) 07:01, 1 June 2017 (UTC)
There is history to this one. The Visual Editor was unilaterally imposed on English Wikipedia by the WMF as the default editor while it was in beta, way before it was ready for serious use by people who want to get some work done rather than waste time messing with software development and debugging, and it was a disaster. The English Wikipedia editors were up in arms, there was heated argument, and a consensus formed to get rid of the bug-riddled obstacle to editing; now it is going to be very difficult to get consensus to let it back in because of the way it was handled. There was a critical failure to communicate, and a large breach of trust. The consequences will not go away easily. Also, as far as I recall, VE is not intended for editing discussion pages (like this one), and never was. • • • Peter (Southwood) (talk): 09:56, 1 June 2017 (UTC)
I guess this is one of those tradeoffs that need to be weighed up. If the aim is to build a healthy, inclusive community, then the obstacles facing new members must be taken into account alongside the needs of the existing community. If the only people who can easily navigate Wikipedia are those comfortable working with the source editor (often people with a tech background), many others are excluded. While the Visual Editor may have been buggy when it was first brought in, if innovations aren't made then Wikipedia risks becoming obsolete. (Side note: it's always ironic when someone talks about their huge experience in creating websites and being at the forefront of the internet, but when you look at their websites you realise they haven't evolved since the 90s... sometimes decades of experience in technology equals outdated. I don't want Wikipedia to turn into that.) I agree that it sounds like it was a mistake for the WMF to impose the Visual Editor on people who didn't want to use it, but that doesn't explain to me why it can't be optional now on talk pages etc. Powertothepeople (talk) 07:54, 5 June 2017 (UTC)
Also, it is very difficult for any newcomer to change the rules of Wikipedia (which are enforced with a zeal out of all proportion to the 5th pillar, "Wikipedia has no firm rules"). What if someone thinks the notability rules for some topic should be different? Assuming they come along and try to change something, they encounter the existing user base, who generally reject their changes. They give up and walk away. Another new user comes along and the cycle repeats. Maybe all our rules should have a sunset clause, at which point they need to be renegotiated. Maybe that would allow the diverse voices more opportunity to be heard together, rather than being beaten down one at a time. The entrenched community got their chance to make rules, but effectively denies it to newcomers. Kerry (talk) 06:42, 26 May 2017 (UTC)
Does it actually surprise you that it is difficult for newcomers to change the rules of Wikipedia? It is difficult for people who have been editing it for ten or fifteen years to change the rules, partly because changing the rules can have major consequences, which are not always easily assessed. Should people who have invested years of effort into making the system work, even if not always very well, just stand back and let someone who has provided no evidence that they understand the possible consequences change things on a whim? The system has evolved, it was not designed; it has evolved to be hide-bound and sluggish in some ways, but it survives. • • • Peter (Southwood) (talk): 10:12, 1 June 2017 (UTC)
Why do we allow people to be unpleasant, vandalise, engage in COI editing, push POVs, etc., by hiding behind anonymity or pseudonymity? While I understand that for controversial topics there are good reasons to allow people not to use real names, what's wrong with at least having real names registered with the WMF, accessible by checkusers or other trusted persons in the event of concerns being raised about someone's behaviour? It might be an interesting trial to see if people would behave better. Kerry (talk) 06:42, 26 May 2017 (UTC)
Strongest possible oppose. Pseudonymity/anonymity is a core feature of Wikipedia. If you remove it, you will destroy Wikipedia. --Fixuture (talk) 00:09, 12 June 2017 (UTC)
Removing anonymity would be a dramatic change to Wikipedia, but it certainly wouldn't destroy it. Yes, it's currently a core principle... and it's a core principle that should adapt over time. We're not talking about altering the founding documents of a national government here; we're talking about changing a key principle on a website. Jackdude101 (Talk) 15:35, 24 June 2017 (UTC)
I have to say that I'm shocked WP still allows unregistered edits. In this day of increasingly universal Web access, it's hard to find any interactive webpage that doesn't at least require OpenID or some other cross-site authentication for online participation. I'm not familiar with the past discussion of this topic, but I think that requiring registration could reduce undesirable behaviors. Pouletic (talk) 00:40, 2 June 2017 (UTC)
If you look into past discussions you will find the majority opposing this dangerous idea. --Fixuture (talk) 00:09, 12 June 2017 (UTC)
Do the majority always make the correct decision? Also, I'd avoid using terms like "dangerous" in these debates, as it's a bit over-dramatic. Jackdude101 (Talk) 15:35, 24 June 2017 (UTC)
Agreed. Allowing anyone... anyone... to edit Wikipedia when it first started in January 2001 is understandable, since content creation in general, regardless of quality, was needed to get the website started. But now that Wikipedia is firmly established as an online presence (it's the fifth most-visited website according to Alexa), I believe the switch from a quantity-focused approach to a quality-focused approach is past due. One of the simplest ways to do that is to remove the ability to edit anonymously. I already talked about this further down on this page, but in short I believe that edits overall will improve, and a lot of time will be freed up from not having to police poor editing as much, if everyone were required to have a username to participate on Wikipedia. Jackdude101 (Talk) 19:42, 5 June 2017 (UTC)
Its becoming such a popular website just makes it even more important that we lead the way in showing how open and anonymous collaboration can be achieved and can be good. You're not considering the consequences, or the central attributes of Wikipedia. --Fixuture (talk) 00:09, 12 June 2017 (UTC)
The main consequence will be that troublemakers will be discouraged from ruining the work of quality editors. This core principle has the unintended side effect of protecting these people, and hence the principle needs to be changed to stop them. Jackdude101 (Talk) 15:35, 24 June 2017 (UTC)
And yet when we consider reliable sources, we look to scholarly peer-reviewed publications, which have always published articles with real-named authors. I have been involved in academic journals and conferences and I have never known anyone to request to be published anonymously or under a pseudonym. There is a lot to be said for the real-world accountability of using your real name. I am not sure I see why the popularity of Wikipedia is a mandate for anonymity. Reading this study on why people read Wikipedia, I don't see any comments about author anonymity in relation to why people read it. People read it because it's free (as in cost), highly available (online), and covers a vast array of topics in a good enough way to keep people coming back. I doubt the average reader gives much thought whatsoever to who writes it. I think this is much more an issue of establishing trust (and hopefully improved behaviour) within the community than trust with the readers. Kerry (talk) 03:48, 13 June 2017 (UTC)
Anonymity is intended to protect editors who fear real-life persecution for their work on Wikipedia. Some of us edit on subjects which are unpopular with organisations in a position to inflict violence or other forms of coercion on editors who criticize them. Unfortunately it also protects editors who make a whole range of inappropriate edits and interact inappropriately with others on the projects. It is a matter of choosing the lesser evil. Current consensus is that an ongoing struggle to deal with harassment, spamming, and conflicts of interest is better than making the projects inaccessible to people who fear real persecution, which relates to our diversity issues. • • • Peter (Southwood) (talk): 06:38, 13 June 2017 (UTC)
If your real-life name is Joe Smith, for example, and your Wikipedia username is SuperSexyLaserCat5678, I'm pretty sure your real-life identity is secure. You're actually less secure editing under an unregistered IP, as those who would want to retaliate against an editor can pinpoint their geolocation using their IP. Jackdude101 (Talk) 15:35, 24 June 2017 (UTC)
That might be true, but Wikipedia is not a "scholarly peer-reviewed publication"; it is an open encyclopedia (which may get scholarly peer review on a case-by-case basis). Also, it's less about people "reading it" than about people getting involved (as well as having sympathy towards the project). If Wikipedia at some point requires deanonymization or the use of real names, people will take the free Wikipedia content and create a new Wikipedia fork that doesn't require that (potentially on the dark web), which would be a very bad thing to occur. --Fixuture (talk) 19:47, 13 June 2017 (UTC)
The common thread in my comments above is this: "if you don't change the situation, you won't change the outcomes". If we want change in relation to diversity, then we have to do something differently and see if it helps. If we are opposed to all change (even as temporary experimentation), then I doubt there is any real commitment to diversity. We have operated in a status quo for some years and, unsurprisingly, diversity has not improved. Something has to change. Kerry (talk) 06:42, 26 May 2017 (UTC)
That is fair comment. However, there are some changes that may end in disaster. It would be preferable to avoid those (see the Visual Editor). How do we distinguish between the experiments which will have good consequences and those which will end in ruin? If we knew, they wouldn't really be very experimental. Some experiments are best done in a laboratory, not on the general public, and particularly not on the general public without informed consent. • • • Peter (Southwood) (talk): 10:12, 1 June 2017 (UTC)
I am lost by your comments about VE. As far as I know, existing users were never forced to take up VE. I'm a long-standing and frequent contributor and I don't recall anyone forcing it on me; I chose to try it of my own free will and I had to enable it in my preferences to do so. As a professional researcher, I agree you first experiment in the laboratory, but one day you do have to test in the real world, because the real world is almost impossible to simulate in a laboratory. Plus, some problems, particularly the wicked problems, are not amenable to an analytical approach. I find that the Cynefin framework provides a good way to think about problems; within that framework I think we have a Complex problem but probably not a Chaotic one. Kerry (talk) 03:48, 13 June 2017 (UTC)
One morning I logged on and VE was there as the default. I had no effective warning; if there was a warning, it did not come to my notice. I could do almost no editing until I found out how to deselect it in preferences. It was not a pleasant experience, and the reaction across English Wikipedia was pretty extreme. I don't know how you missed it, but it seems you did. Things happen. In theory, practice should follow theory; in practice, it often does not. • • • Peter (Southwood) (talk): 06:58, 13 June 2017 (UTC)

Established community versus new community

There is a potential "tradeoff" in benefits for established members versus new members if this theme is followed. The desire to build a strong "community" can sometimes come into conflict with building an "inclusive/diverse" one, and vice versa. There's a concept called the Membership life cycle for online communities, which recognises that, depending on the part of the 'life cycle' they are at, members have different needs and behaviours which can bring them into conflict with each other.[3]

For example, an 'Elder' is a long-established community member: they know all the rules, this has been their haunt for a long time, and they have many connections. They may get annoyed and impatient if too many 'Novices' blunder around, making mistakes and questioning the rules. Elders like things the way they've always been - they don't want change. They may make things uncomfortable for Novices; equally, if they are outnumbered by Novices, Regulars, and Leaders who are instigating change, then Elders may feel like they are being pushed out of their own community.

Thus, in building or changing the Wikimedia community there can be conflict between the needs of Elders, who have helped build this place from the beginning, and the needs of newer members - particularly when you factor in diversity initiatives that may benefit newer members more than Elders.

A smaller community can be more friendly and resilient than a huge diverse community. People get to know one another better when the group is small. There is less conflict when it is a mono-culture. And, some people prefer things to be a bit exclusive. Some people like a virtual 'secret handshake' to divide people on the inside from those on the outside. There can be a sense of comradeship, power, privilege, in being part of a selective group.

Even in an "open" community there can be barriers to entry that benefit the core group while discouraging others from participation. Invisible barriers could include things like: how established members treat new members, for better or worse; complicated rules; poor communication of rules and norms; technology used (members familiarity and comfort with code, tags, using a source-editor, etc); language; etc.

Making things better for some people can make things worse for others. That is the potential trade off of following this theme.

Luckily there is a body of knowledge on the topic of how to manage online communities at different life stages, which can help manage the transition so that all types of members feel like their needs are being met.[4] The larger community needs to be clustered into smaller self-managing groups so that members can forge deep connections rather than feel anonymous in a crowd. Create some spaces just for Novices, and some just for Elders, so they are not always bumping heads. Create different roles that members can take on that best suit their needs and desires. Powertothepeople (talk) 12:11, 5 June 2017 (UTC)

We need new people, fresh ideas, inexperienced users and open minds to remain flexible, adaptive and progressive. I don't think we should think about community in the sense of "the one Wikipedia community". Instead there is one Wikipedia community and many sub-communities (mainly WikiProjects).
So it's already clustered. It's just that this sub-community structure has not been strengthened much and made effective and useful. This is why I think work on WikiProjects (as suggested here) is most important.
"A smaller community can be more friendly and resilient than a huge diverse community. People get to know one another better when the group is small. There is less conflict when it is a mono-culture. And, some people prefer things to be a bit exclusive."
Here's a short copypasta of what I wrote above, as it's relevant here as well: I also agree with Aaronmhamilton and think that as we get more communities on board and as new communities are formed, there are often-overlooked problems, which is why we also need to build in new measures that ensure quality, neutrality, lack of bias, etc. Very important in this context is that readers of Wikipedia be informed about major viewpoints as well as conflicts between viewpoints (this of course requires all the respective sides and/or issue-neutral individuals to participate, and to do so properly).
Also: sub-communities can be further subdivided, and exclusivity etc. can still exist.
"The larger community needs to be clustered into smaller self-managing groups so that members can forge deep connections rather than feel anonymous in a crowd. Create some spaces just for Novices, and some just for Elders, so they are not always bumping heads. Create different roles that members can take on that best suit their needs and desires."
Support for this. This is what WikiProjects are, or rather, can become. Much of my suggestion linked further up is about just these things. (@Powertothepeople:) --Fixuture (talk) 00:26, 12 June 2017 (UTC)

What else is important to add to this theme to make it stronger?

Perhaps this could usefully be 'Healthy, Inclusive and Connected'. Many of Wikimedia's sister projects remain very separate. Better connections can help make people feel like part of a greater whole, and help them find resources that they didn't know about. Some examples:

  • A cross-wiki watchlist would help people be peripheral members of projects that they don't visit regularly enough to check that wiki's watchlist
  • Easier, more informative, and auto-translated links between language versions of a page.
    • E.g. the English Wikipedia en:Cédric Villani is smaller than the French fr:Cédric Villani, so it'd be nice to be told, somewhere on the en: page, that the fr: one is X% longer, has Y pictures, and has Z extra references (a rough sketch of how such a comparison could be computed appears after this list).
    • E.g. I have also posted this exact comment on the metawiki version of this page, to which there is currently no simple link.
  • Even just within e.g. English Wikipedia, the page-recommendation function only works for mainspace, but it could be very helpful in the Help: and WP: namespaces to suggest places the user could benefit from (tutorials/training, WikiProjects, the Signpost, the Teahouse, help pages, policy pages)
  • Support for a cross-wiki, cross-language version of The Signpost (an independent, community-driven newsletter), which is currently going through a bit of a rough patch in terms of manually intensive, over-complex basic publishing tasks. It could conceivably be run alongside, but separate from, the WMF blog.
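To illustrate the cross-language comparison idea above, here is a minimal sketch (not an existing gadget or official feature) of how the public MediaWiki API could be used to compare basic statistics for the English and French versions of an article. The use of byte length and image count as stand-ins for "longer" and "pictures" is an illustrative assumption; counting references would require parsing the wikitext and is omitted here.

```python
# Rough sketch only: compare basic page statistics across language editions
# using the MediaWiki Action API. Byte length and image count are used as
# crude proxies for article size and number of pictures.
import requests

def page_stats(lang, title):
    """Return (page length in bytes, number of images) for a page on <lang>.wikipedia.org."""
    api = f"https://{lang}.wikipedia.org/w/api.php"
    params = {
        "action": "query",
        "titles": title,
        "prop": "info|images",
        "imlimit": "max",
        "format": "json",
    }
    data = requests.get(api, params=params).json()
    page = next(iter(data["query"]["pages"].values()))
    return page.get("length", 0), len(page.get("images", []))

en_len, en_images = page_stats("en", "Cédric Villani")
fr_len, fr_images = page_stats("fr", "Cédric Villani")

if en_len:
    print(f"byte length: en {en_len}, fr {fr_len} ({100 * (fr_len - en_len) / en_len:+.0f}% relative to en)")
print(f"images: en {en_images}, fr {fr_images}")
```

Computing numbers like these is the easy part; the harder design question is how to surface them unobtrusively on the page, for example near the interlanguage links.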

Doubtless there are other ways to knit the communities together. T.Shafee(Evo&Evo)talk 04:07, 13 May 2017 (UTC)

Good community relations build good articles; bad community relations stifle improvement. The behavior of editors towards other editors is as important as, if not more important than, any edits they make to articles. Bright☀ 08:24, 13 May 2017 (UTC)

I think that is an excessive generalisation, but agree that in many cases it is true. • • • Peter (Southwood) (talk): 13:01, 23 May 2017 (UTC)
  • Currently, we have no metric besides the number of edits to track how healthy our communities happen to be. Without an ability to track changes, it's not clear that we have a good way of knowing how various policies change the health of our community. ChristianKl (talk) 11:15, 16 May 2017 (UTC)
    • We have the records of admin noticeboards, arbitration, sockpuppet investigations, edit wars, block logs and other drama, which could be mined and analysed by someone who is excessively cheerful and wants to bring themselves down a notch or two (a rough sketch of pulling one such record via the API follows below). The rate of such interactions and events may or may not vary over time in proportion to the number of editors. Maybe this has already been done?
    • Number of edits is not a very useful metric, as it says nothing about quality. It only has the advantage of being very easy to measure. • • • Peter (Southwood) (talk): 15:09, 30 May 2017 (UTC)
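As a rough illustration of mining such records (my own sketch, not an established community-health metric), the block log is one of the data sources mentioned above that can be pulled programmatically from the MediaWiki API and tracked over time, for example as a monthly count:

```python
# Rough sketch only: count block-log events on English Wikipedia in a time
# window via the MediaWiki Action API, as one crude, trackable proxy signal.
import requests

API = "https://en.wikipedia.org/w/api.php"

def count_blocks(newest, oldest):
    """Count block log entries between two ISO 8601 timestamps (newest first)."""
    params = {
        "action": "query",
        "list": "logevents",
        "letype": "block",
        "lestart": newest,  # enumeration starts at the newer timestamp
        "leend": oldest,    # and stops at the older one
        "lelimit": "max",
        "format": "json",
    }
    total = 0
    while True:
        data = requests.get(API, params=params).json()
        total += len(data["query"]["logevents"])
        if "continue" not in data:
            return total
        params.update(data["continue"])  # follow API continuation

print(count_blocks("2017-06-01T00:00:00Z", "2017-05-01T00:00:00Z"))
```

The same pattern applies to other log types and to noticeboard archives; whether any of these is actually a good health metric is exactly the open question raised above.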

Unregistered users should not be allowed to edit. We should measure whether the edits of unregistered users are more harmful or more useful. I believe that the edits of unregistered users are poisoning the community. If we have, or will have, enough users, then we do not need a community of anonymous IP addresses. --Snek01 (talk) 22:07, 17 May 2017 (UTC)

I think you have the cart before the horse here. First establish whether unregistered users are more helpful or harmful, then consider whether they should be allowed to edit. • • • Peter (Southwood) (talk): 13:01, 23 May 2017 (UTC)
Yes, Peter Southwood is right. We need to research whether my personal experience can be generalised to the whole of Wikipedia, and then we can make a justified decision. I wanted to say that we should focus our attention in that direction. --Snek01 (talk) 19:40, 27 May 2017 (UTC)
I would support a proposal to research this point. At present from personal experience I have no convincing evidence either way, and would be equally unsurprised to find that unregistered editors are a major benefit, a major burden, neither, both, or not particularly relevant. • • • Peter (Southwood) (talk): 15:09, 30 May 2017 (UTC)
I agree that unregistered users should not be allowed to edit. If you have the ability to edit on Wikipedia, you have the ability to take a few seconds to set up a username for yourself. It's not hard. This concept does not violate WP:5P3 either, as anyone can create a username. This is a simple common-sense precaution to discourage people from editing who, for whatever reason, don't want to be tracked easily. It's not just Internet trolls and vandals to which this applies; there are also people with mental disorders or autism who fanatically make hundreds of unhelpful back-to-back edits to the same article. These problems have been compounded in recent years by the increase in editing from mobile devices whose IPs change from day to day, making the monitoring of unregistered users more difficult. I'm all for idealistic policies like "anyone can edit", but they need to be moderated by common sense because... surprise... human nature is not idealistic, and people have abused, and will continue to abuse, this privilege unless the policy is modified as I have stated. Anyone can edit... after they set up a username. Jackdude101 (Talk) 18:14, 4 June 2017 (UTC)
@Snek01, Pbsouthwood, and Jackdude101: There is some recent research about this, at m:Research:Measuring edit productivity (see the video walkthrough (25 mins) if preferred). One of the conclusions is that IP editors add a lot of value to the project (estimated at 15-20% of the content that is persistent over time, on Enwiki). Additionally, there are related points in m:Research:Asking anonymous editors to register that note how many active editors started off as IP editors (I did!), that initial IP editors are a major source of eventual account creations, and that "users who edit anonymously just before registration are more productive". Lastly, there's some older research at m:Research:Wikipedia article creation noting how important IP editors are for article creation in non-Enwiki communities. HTH. Quiddity (WMF) (talk) 22:52, 5 June 2017 (UTC)
Those IP editors who are creating quality content on Wikipedia should have no problem with creating usernames for themselves. I don't have anything against them; I am against those IP editors who abuse the privilege. Requiring usernames for everyone is not unreasonable. I could easily apply what you are saying to driving a motor vehicle: "There are people on the road right now without driver's licenses who are very good drivers. Therefore, driving without a driver's license is okay". It makes no sense. In order to have a prosperous community, you can't have anonymity, because there are people who will use it to their advantage to degrade the community through vandalism, trolling, and the like. Requiring usernames for everyone will not eliminate these problems completely, but it will make it a lot easier to identify and punish troublemakers. If you want a super-simple way to drastically improve the Wikipedia community, this is it. Jackdude101 (Talk) 23:44, 5 June 2017 (UTC)
Quiddity, thanks for the links. I found them interesting, but not conclusive. The research shows that by the metric used (probably a reasonably useful one) the gain from IP editors is significant, but what is the cost in terms of the time others must put in to maintain the wiki by fixing things the IP editors break? Is there a net gain after the costs have been deducted? This should take diversification of content into account as well as just persistence of edits. IP edits may be worthwhile purely by virtue of expanding the scope of the encyclopaedia, even if the work to clean up after them were found to exceed the value of their contributions on a word-count basis. Is there research on this too? • • • Peter (Southwood) (talk): 15:14, 6 June 2017 (UTC)

Improved process to increase cooperation and collaboration, and reduce conflict: I've noticed a pattern of complaint in the comments that I think could be overcome by an improved process for researching, writing, editing and publishing content, one that acknowledges the strengths and weaknesses of different community members so that we are better able to work together rather than come into conflict.

Some people are new and inexperienced or may be skilled at one thing (e.g. research) but not good at another (e.g. writing), and at the moment if they were to make a mistake, another person might simply 'undo' or 'reject' it rather than improving on it. This is a waste of time and effort, and doesn't do much for allowing more people to contribute.

If Wikipedia allowed differentiated roles and more of a project management system for the creation of content it might forge better cooperation behind the scenes where different people could play to their strengths: 1) people pose questions on a topic, 2) researchers find facts and citations, 3) writers write the content clearly, 4) tech-savvy people format it correctly for publishing on Wikipedia, 5) top level editors do quality assurance, 6) everyone is able to discuss, plan, manage and implement tasks, easily and cooperatively, working to their strengths. Someone who is an allrounder could keep working how they are now, but the above system would help ease newbies and/or single-skill contributors into helping out without causing problems for others.

Alternatively, it would be good to be able to save edits as a "draft" until I am ready to publish, or even better - be able to ask a more experienced editor to review my suggested edits before they are published and help get it to where it needs to be. Psychologically it flips the current system around - instead of potentially making a mistake publicly, risking conflict with other editors, having someone "undo" or "reject" my work, I could get preapproval, the opportunity to fix anything semi-privately, and help from someone else to do what I can't.

This may not be important once I am more experienced, but starting out it is a significant obstacle to contributing to Wikipedia - particularly when I look at a page's history and see the conflicts between editors, which suggest it is a common problem for newbies to be making mistakes that annoy others. That makes more work for everyone and creates unnecessary friction, which undermines "community" and "inclusiveness."

While there is a review system for *new pages*, how this works in practice is not helpful enough. I tried creating a new page in a project that wanted the page created, but the editor's feedback was not detailed enough to help me get the page publishable. In a case like this, it would be good if another person could take the bones of what I've started and take it to the next level. As it currently stands, it's languishing in my drafts and feels like a complete waste of time. (And, I should point out: I'm a communications professional in my other life, and I did the best with what I had on the topic, but apparently that wasn't good enough. Meanwhile I have been editing other pages that are lacking neutrality, have puffery, and at times have poor citations, and somehow they got through!).

These negative initial experiences make me feel like I am working solo and navigating potential conflict with others rather than us working together collaboratively. From the comments I've read, it's not just me who feels this way. There is a need to flip the feedback loop so new members are getting a higher proportion of positive interactions rather than negative. It's the difference between telling a child "please walk" versus "don't run:" one is friendly and the other is abrupt, even though both are requesting the same outcome. Powertothepeople (talk) 12:45, 5 June 2017 (UTC)Reply

I agree– for such a large community, it is very easy to edit pages and never interact with other users in a productive way. Margalob (talk) 19:54, 5 June 2017 (UTC)Reply

I think we have to have better ways to bring new contributors into the community. Currently we have the policy of WP:NOBITE and it must be the most ignored policy of all. New contributors are whacked over the head for any little thing they do wrong and are apparently expected to know every policy before they start editing. I do training and I see new editors in action. Most are completely bamboozled by the source editor, although they take fairly easily to the Visual Editor. Their initial focus is entirely on the mechanics of editing. They rarely notice that there is activity on their User Talk page, as their visual focus is in the edit box, so they don't see the messages there. We really need to put some kind of protective fence around new good-faith users, and restrict those who can react to them to people who are willing to commit to WP:NOBITE and try to help them rather than just revert them. I would be inclined to suggest that we should use WikiProjects as a place for review/mentoring. A common thing new good-faith users do is add unsourced information into articles. If they were reviewed by people in that article's WikiProject, then it would be more likely they would recognise that the information being added was plausibly correct and might be able to find a citation to support it. Kerry (talk) 04:27, 9 June 2017 (UTC)Reply

As we progress, future generations are getting more addicted to technology. Since our scientists and other people are continuously working to improve human life and lifestyle, our old cultures and practices do not have much importance in the lives of children nowadays. So we should try to restore our culture, which defines us as human beings, and tell our upcoming generations how important our culture is to our identity. Amu321 (talk) 06:50, 9 June 2017 (UTC)Reply

  • I think it really would benefit from being extended or shifted somewhat towards "Participation and community".
@Evolution and evolvability:
Perhaps this could usefully be 'Healthy, Inclusive and Connected'
The suggested streamlined WikiProject system as well as the recruitment of other communities or the embedding of specific tools would enable better connectivity of communities (users).
@Snek01:
Unregistered users should not be allowed to edit. We should measure if edits of unregistered users are more harmful or more useful. I believe that edits of unregistered users are poisoning the community.
I find this a troublesome idea that goes against a core feature of Wikipedia. Wikipedia intends to be open to everyone, and being open to edits by unregistered users is not only about the contributions themselves but also about getting people involved in and excited about the project. Also, even if it takes just a few seconds to register, that is still a far greater burden than you might imagine; it will cause many people to never make that first edit and will move the edit button further away.
@Powertothepeople:
Good ideas. Strong support for this. The streamlined WikiProject system that I'm constantly talking about here (sorry) could allow for such things. Namely, it could build in tools that would allow users to find their roles and find tasks that need to be done, so that work gets done efficiently and collaboratively. There could be a page for the different tasks and, for instance, embedded collaborative real-time editors in which editors literally write an article or draft together, which is then saved as an article or as an edit to an article every now and then (it's all in the full suggestion linked from the suggestion for that system).
@Kerry Raymond:
Currently we have the policy of WP:NOBITE and it must be the most ignored policy of all. [...] A common thing new good-faith users do is add unsourced information into articles. If they were reviewed by people in that article's WikiProject, then it would be more likely they would recognise that the information being added was plausibly correct and might be able to find a citation to support it.
Also supporting this. Newcomers need to be dealt with differently, and people should start respecting the time and effort behind their contributions. I'd also support getting the WP:IMPROVEDONTREMOVE approach/attitude to become a norm here. Instead of removing content, people should try to improve it and find references. And if they don't, they should at least notify relevant individuals and e.g. put the deleted content on the talk page or into draft-space.
As a side note, visual indicators that someone is a newcomer, or that an edit is among an editor's first, could be useful here. For instance, their username or the diff link could be given a specific color so that it's easily visible.
@Amu321:
Since our scientists and other people are continuously working to improve human life and lifestyle, our old cultures and practices do not have much importance in the lives of children nowadays. So we should try to restore our culture, which defines us as human beings, and tell our upcoming generations how important our culture is to our identity.
I'm not entirely sure what you mean or at least how it would be relevant here. We should not try to "restore" specific cultures. However we should certainly feature information about culture which may become less relevant or understood or adopted. It would be good if people, including older generations or e.g. those with more conservative or unconventional stances or experiences, come to Wikipedia to add information that we might miss out on or on which we might be biased. I don't know why so many of these people don't come to Wikipedia to do just that (it would increase respect, good debate and understanding in society).
--Fixuture (talk) 19:00, 12 June 2017 (UTC)Reply

Who else will be working in this area and how might we partner with them? edit

Many retired scientists have the time and the capability to contribute new science and to correct existing knowledge. This is possible by exploring the capabilities of Wikiversity projects. I am retired and I have started the Hilbert Book Model Project. The HBM is a purely mathematical model that helps to investigate the foundation and the lower levels of physical reality. The project introduces new mathematics and new physics and corrects flaws in conventional physics. Since the project does not involve experiments, it can be implemented with very low investment. The required resources are mainly time, and retired scientists possess that resource in great quantities. Look at the example and judge whether this is a potential contribution to the future of Wikimedia. See: https://en.wikiversity.org/wiki/Hilbert_Book_Model_Project --HansVanLeunen (talk) 17:29, 14 May 2017 (UTC)Reply

Many retired scientists have time to help other people invent something new. Sunnyrapper (talk) 15:31, 22 May 2017 (UTC)Reply

There are probably already quite a number of retired scientists editing Wikipedias. This is not to say we couldn't welcome a lot more... • • • Peter (Southwood) (talk): 13:06, 23 May 2017 (UTC)Reply

Each Wikimedia project would potentially have different partners interested in helping within its niche.

  • Teachers (and students): Teachers are increasingly requiring students to use technology to collaborate and communicate information in non-traditional media. Teachers and their students could contribute as individuals or as a class working together. They could create a Wikipedia page about something in their local community, or add to a topic they are studying. They could take photos and release them under Creative Commons. It would make learning more relevant for students as their work is public for the good of all instead of an assignment that no one other than their teacher ever sees. Could reach teachers by partnering with organisations that already have a large teacher community such as Edublogs, Guardian Teacher Network, Computer Using Educators (CUE), etc
  • University lecturers and PhD students - they do a huge amount of research in pursuit of their own work (e.g. literature studies), and could update the relevant Wikipedia pages. It would also be a beneficial way for them to connect with other community members with similar interests and expertise to their own (which could potentially lead to collaboration in the field).
  • Feminist organisations for help with the Women in Red Project. Could get very niche - female filmmakers (Female Filmmaker Initiative, Women at Sundance, WomenArts, Reel Women, Women in film & television, etc), business women (American Business women's association, etc), female scientists (European Platform for Women Scientists) etc. Appeal to them for help updating the content that is relevant to their specific interest area.
  • Historical Societies. So many untapped knowledgeable volunteers, particularly about local history that may not be 'published' widely or in digital form anywhere else. However, they are more often older people who are not necessarily very tech-savvy. Possibly could do a joint initiative to pair high school history students with their local historical societies to record local knowledge?
  • Libraries, Museums, Art Galleries. They all hold a huge amount of knowledge, and if it is possible for them to share it and make it accessible for all... Powertothepeople (talk) 12:50, 5 June 2017 (UTC)Reply
Most, possibly all, of these already exist. Some work more smoothly than others. There appears to be a fairly steep learning curve for students at school and college, and they are often supervised by teachers or lecturers who do not manage their projects very well. There are facilities available to improve this, but they are not always used. This may improve over time. • • • Peter (Southwood) (talk): 15:25, 6 June 2017 (UTC)Reply
  • We can partner with other organizations of the Open movement, create cross-platform communities and interconnections, and make use of each other. One example would be reddit. We could use (e.g. embed into that streamlined WikiProject system) tools from other websites to make communities here more effective and cohesive (such as subreddits, IRC, collaborative real-time editors, radio stations, TeamSpeak/Skype, newsletters, etc.), and we could offer uses of Wikipedia to other websites and recruit from communities there (such as subreddits). --Fixuture (talk) 00:33, 12 June 2017 (UTC)Reply
Furthermore, I think that we should proactively contact relevant institutions (many of which have been named here) with appropriate information to get them on board. By that I do not mean time- and financially costly offline meetups or edit-a-thons, but instead contacting key people en masse so that they edit Wikipedia (and learn how to do so) and then teach other people in their organization how to do that. We'd provide them with material to get them involved and interested and to get them started quickly. We'd also provide material and guidance that they can use for teaching other (non-key) people in their organization or reach. We could also make a contest or campaign out of this.
Imo, as of right now the most important community to get on board would be programmers, so that the Wikipedia codebase can be improved (I might be biased in this; it's just my opinion), and we could very effectively get them on board by e.g. contacting fab labs and FOSS sites/communities. Some relevant information may also be found at WP:Expert help.
--Fixuture (talk) 18:23, 12 June 2017 (UTC)Reply

Other edit

the power of administrators edit

I can speak from my own experience as a new contributor (less than a year). Administrators have too much power and abuse it, and users are left without any recourse against their practices. I will give you a couple of examples: photos are deleted because they don't go with the aesthetics of the page, because there are supposedly too many when you can find only a few, or because there are supposedly enough pictures of the subject, when in fact there are no photos of the painter but only paintings, etc. You can see the subjective nature of these deletions, and that is discouraging.

I created my first page based entirely on a similar subject (another Spanish television actor) and it was deleted because of its lack of encyclopedic value (which is a broad enough reason to be used by all those administrators on power trips). The person who deleted this page (tarawa1943) has a few people begging him (yes) to please let them know what they are doing wrong and the reasons for his disapproval. I would be happy with the deletion if (1) there were no other pages like mine (in fact there are hundreds -- are they all to be deleted?), and (2) I were treated like other users. I saw a page (of another actor) with a message on top asking for references, and that from a year ago. That page, however, was not deleted (in fact I have references, but the system did not let me edit my page after I posted it).

Suggestions: Treat new users in a different way by assigning somebody to help them with their mistakes instead of deleting and leaving a message with a vague ref. to not being encyclopedic. Or assign those articles to a special section instead of having all powerful administrators do as they please. Restrict the power of administrators. They should only delete cases of vandalism, copyright infringements, inappropriate language and such things. As it is, they employ their time deleting and causing mischief instead of answering calls for help (I posted one several days ago and it has not been answered); however, the page was deleted as soon as somebody woke up in a bad mood. GinnevraDubois (talk) 01:41, 29 May 2017 (UTC)Reply

  • Officially, administrators on English Wikipedia have no special rights or privileges regarding content. It is also not an administrator's responsibility to police the whole encyclopaedia. The presence of unsuitable content or articles in one area is no justification for trying to include similar contraventions in another area. Content that does not contravene policy should be discussed on the relevant talk page, where it is usually decided by local consensus of involved editors, not by an admin. On the other hand, admins are also editors, and may take part in content debates along with everyone else.
  • There are systems for mentoring new users. The WP:Teahouse is one that seems a pretty friendly place. There are also some experienced/long-term editors who will personally assist when requested.
  • Have you read the policy and guideline pages that might apply to your problems? Any article deletion usually refers to the specific deletion policy which applies in that case.
  • If you created an article which was deleted, you can request that it be transferred into your user-space, where you can bring it up to scratch, and get it checked before it is returned to main-space. That way it is very unlikely to be deleted again.
  • Users do have recourse against abuse by admins. Be sure before you try to make a case that it is actually abuse according to the policies. • • • Peter (Southwood) (talk): 15:50, 6 June 2017 (UTC)Reply
  • Suggestions: Treat new users in a different way by assigning somebody to help them with their mistakes instead of deleting and leaving a message with a vague ref. to not being encyclopedic
Support for this. I raised this in another comment in the "How important is this theme relative to the other 4 themes? Why?" section further up.
Or assign those articles to a special section instead of having all powerful administrators do as they please. Restrict the power of administrators. They should only delete cases of vandalism, copyright infringements, inappropriate language and such things.
Typically, articles need to go through an AfD debate and are only deleted depending on the outcome of that debate. There are a number of policies and guidelines that outline when articles should or shouldn't be deleted. If an article you find worth keeping has been deleted without such a debate, please request undeletion here. As a side note, those policies and guidelines, as well as the options for action, should be made more accessible to newcomers (e.g. via "contest this revert" buttons on revert-diffs that create new talk page entries).
As it is, they employ their time deleting and causing mischief instead of answering calls for help (I posted one several days ago and it has not been answered)
Please note that "several days" is not necessarily much in Wikipedia-time, as there really are too few people for too much work here. However, I'd also suggest some sort of requirement to state a rationale and answer such requests if one deletes any Wikipedia content (including article sections).
Furthermore, I think that most younger people off-Wikipedia tend to have a rather inclusive approach, and as more of them join, Wikipedia will become more inclusive (content-wise).
Also systems for discussion and mentoring of newcomers etc could be built into WikiProjects (into a streamlined WikiProject-system as suggested further up).
--Fixuture (talk) 18:11, 12 June 2017 (UTC)Reply

The Augmented Age edit

What impact would we have on the world if we follow this theme? edit

  • Although Wikipedia is a great way to find information, I feel that by following this route, we will neglect those with a poor internet connection as a machine-learning algorithm would require a strong connection by the client which isn't possible for everyone, especially those living in very rural areas. Rowan Harley (talk) 13:31, 6 June 2017 (UTC)Reply
  • WP is not a reliable source. Automation could help us drive out inadequately sourced material and/or provide better sourcing. Lfstevens (talk) 16:02, 13 May 2017 (UTC)Reply
  • I feel this comment barely scratches the surface of this question. Considering that "someone" or "some group of people" are the ones who input information into a computer/program, the information supplied by the program is subjective, in that it only represents the views or interpretations of that person or persons and imposes those views as "the only true interpretation on any given subject." Whereas allowing people from all dimensions to access, add to, provide, and change the information makes the best sense, as "free will" or planning of future decisions/actions can only be achieved by analyzing all available information and then forming an opinion as to how best to apply that information. Tshane007 (talkcontribs) 19:29, 14 May 2017 (UTC)Reply
  • It is important to be aware that algorithms are not inherently less biased than human editors. Natural language processing, identification of source data and any resulting machine-generated narratives reflect the biases of the programmers who develop such tools. There are numerous studies from a range of sources that have already substantiated this tendency, e.g. Biased Algorithms and Algorithms Aren’t Biased, But the People Who Write Them May Be ("Mathematical models often use proxies to stand in for things the modelers wish to measure but can’t"). Algorithm-generated content is not a panacea, as it will reflect the views of the individuals who developed it. Programmatic approaches tend to magnify visibility, so the developer's voice and viewpoint will be amplified as well.--FeralOink (talk) 08:42, 22 May 2017 (UTC)Reply
Transhumanist's mention of more intensive usage of scripts and automated edits should be expanded upon. I would like us to focus on this, instead of blue sky conjectures about technological progress that have yet to manifest, even though they might by 2030.--FeralOink (talk) 08:48, 22 May 2017 (UTC)Reply

Comments by The Transhumanist edit

The sub-themes say it all: innovation, automation, adaptation, and expansion to affect quality and accessibility. And since AI is what we are referring to here, the impact will be game changing. If you automate an encyclopedia with AI, by 2030, the thing may basically be able to revise and update itself in real-time, with or without the help of human editors. Initially, as it is being developed, it would probably be applied to building classification systems, which would eventually be adapted into building full-fledged articles. Article-building might start with using data from one article to update another, and expand to mining the Web to update the entire encyclopedia.

Full automation would not be achieved by 2030 if we progress linearly. But we're not. We are on a course of accelerating change. Innovations will continue to increase in frequency, and at some point we will be seeing major breakthroughs made on almost a daily basis. Each one with the potential to revolutionize the way we do things. Examples of major breakthroughs in the past are pen and paper, the printing press, radio, TV, computers, smart phones, etc. Future breakthroughs are anybody's guess, but by 2030 they might include computers one thousand to one million times more powerful than the computers we have today, intelligent Q&A systems, automated authoring programs, fully integrated personal assistants, ubiquitous heads up displays, etc. An example of a revolution would be, if you had automated authoring programs, by 2030 one might be so sophisticated that it could create an extensive article from scratch on any subject of your choosing in a fraction of a second. Given that kind of technology being available, the question is, what kind of system would Wikipedia be? Content-based? Service-based? Both? Would editors be obsolete? Would programmers?

We will probably see rapid improvement of translation technologies, so that we will be able to translate all the Wikipedias automatically. So, if you translated all the Wikipedias into English, you would have 200+ English encyclopedias, each with some unique information. The challenge then would be to harvest, consolidate, and integrate it all into a single presentation. So that the English reader has easy access to all of that knowledge. The same would apply to translating all the Wikipedias to all the other languages as well. That would be a huge breakthrough. And it's coming.

Natural language processing includes natural language generation and natural language understanding. The first implies composition, the second, interaction. If Wikipedia can understand itself, then it could talk to you about itself. That's coming too.

If we don't keep up with the state-of-the-art, then some other organization will likely leapfrog Wikipedia. We can't safely assume that Wikipedia will automatically stay in the lead in the field of knowledge management. We need to fully embrace AI.

Sincerely, The Transhumanist 06:30, 13 May 2017 (UTC)Reply




With the hurricane of AI-development activity going on right now, 15 years is way too far off in the future to be planning for. A 15-year plan? Are you kidding? The entire field has transformed almost completely in the past 5 years, and is poised to do so again in even less time than that.

We need to figure out now how to transform Wikipedia over the next year or two. Not fifteen. It is time for a major paradigm shift.

“The business plans of the next 10,000 startups are easy to forecast: Take X and add AI.” – Kevin Kelly

As AI advances, "user" will take on new connotations, more like "driver" or "operator" or "director". You will basically tell the AI what you want it to do, and it will do it.

If you are reading a website (like Wikipedia), you may ask "show me all the info you have on John D. Rockefeller", and rather than give you a list of search results, it would compile all that information into a page or set of pages for you to read, with a table of contents and an index, and who knows what else. Then you could say "read it to me", and it would. It might even present a slide show to go along with it.

If you are an editor of Wikipedia (or its successor) in the not-so-distant-future, your role might not be to write, but simply dictate. Wikipedia would transcribe what you say into a sandbox, from where it could be dragged and dropped into an article. Need a research assistant? That functionality would be built-in, eventually providing automatic gathering of reference materials, and auto-organization of it into support bibliographies. Anything and everything you would need to direct it further. Imagine reference libraries as extensive as this on every major subject, but maintained by bots and humans working together:

The trend is more and better interaction between programs (websites) as intelligent-acting entities, and humans. Websites will get more input from their users, and that input would 1) affect the development of the websites, and 2) might actually become content on those websites.

(Which would require our rules to evolve).

Major paradigm shifts like this are coming. The encyclopedia might (sooner rather than later) gather information directly from primary sources. Imagine being interviewed by the encyclopedia itself. Such an interview might look like this (from our article on wikis):

Interview with Ward Cunningham, inventor of the wiki

If we don't supply these or other cool innovative functionalities as basic features, someone else will. And Wikipedia will be rendered obsolete. The traffic will go to other websites.

What will this take? Programmers and dreamers. And all the AI resources we can access.

What should we do in the meantime?

Those who can, gather AI resources and start developing with them.

The rest can bridge the gaps between the main services we provide now (printed knowledge) and the mainstream services that are coming. Like make use of the cameras on everybody's laptops, to do video interviews of the notable people we currently only cover in print.

And anything else you can dream up. Brainstorm, baby! The Transhumanist 11:12, 25 May 2017 (UTC)Reply

Comments by FeralOink edit

Transhumanist said:

"...if you had automated authoring programs, by 2030 one might be so sophisticated that it could create an extensive article from scratch on any subject of your choosing in a fraction of a second. Given that kind of technology being available, the question is, what kind of system would Wikipedia be? Content-based? Service-based? Both? Would editors be obsolete? Would programmers?"

I don't think we want to consider this for now. Wikipedia needs more editors, not fewer. By 2030, we might not need editors, but I think it would be a bad idea to even suggest that editors won't be needed in the future. The same is true for programmers. Wikipedia needs all the programming help it can get.

Also, lots of people are uneasy about technological obsolescence and replacement of humans by robots or AI. There is a lot of fear and misinformation about it. We need to be careful not to further stoke those sentiments. Utilizing AI effectively is still a work in progress. Machine learning has more demonstrable uses, but I think we should avoid language that is futuristic and vague. Wikipedia doesn't need to be state of the art or a leader in knowledge management. We do need to remain reliable and free to all.

My comment is not intended as an attack or denigration of Transhumanist. I agree with what he says about machine translation expanding access to Wikipedia to include a wider audience.--FeralOink (talk) 00:21, 14 May 2017 (UTC)Reply

We might still be called "editors" in the future, but we'll be more like collaborators with the computers, rather than mere users of non-adaptive boxes like we have today. We'll have more productivity than ever, through our computers. The programs might write the articles, but where will they get the information to write about? From people. That is, from us, via observation, discussion, interviews, etc. But that's a ways off, more than a year or two. From now until such computers are available, there will be a steady stream of ever-improving knowledge management tools, which we should make use of if for no other reason than to keep ahead of the would-be competition. IT automation will have the potential to leapfrog Wikipedia. So Wikipedia needs to be a frog too. The Transhumanist 12:22, 25 May 2017 (UTC)Reply
I think saying that Wikipedia should be written by AIs isn't a particularly good way to sell the idea; it's a bit like "driverless cars". People are legitimately fearful of automation without a human at the controls. Far better to talk about "smart cars" rather than "driverless cars"; ditto, far better to talk about "smart tools" to assist Wikipedia editors to be at their most productive/accurate etc. I've been around since punched cards and paper tapes and AI has always been this elusive dream of "can't tell it from human" that's always just beyond our reach. Of course, as we figure out how to do bits of it like recognise human speech, we stop calling it AI and start calling it speech recognition software, and AI remains that harder problem that remains elusively just beyond our reach :-) I'd like an AI that processed my watchlist for me and could detect the semantic intentions of new users and fix the article to do what they were trying to do, but do it right (I waste too much time on that every day). I think what really sells AI is when it takes over doing something we don't really enjoy having to do (robot vacuum cleaners). I like writing article content; I don't really want an AI writing it for me, but assisting with tasks like assembling the citation, fixing my spelling and grammar, etc. But there are loads of maintenance tasks, like updating the population of places, updating the new mayors elected in towns, etc, and I'm happy for a machine to do it for me (so long as I can switch it off if it's getting it wrong!). Kerry (talk) 06:05, 30 May 2017 (UTC)Reply

Comments by Fixuture edit

(@The Transhumanist:)
Relevant to this is Wikipedia:Data mining Wikipedia. I don't think we should allow AI to edit Wikipedia directly. Instead, it could be used to create suggestions which could then be approved, declined or used. Or AI could expand knowledge externally (incl. sensemaking), which can then be imported. There are problems with relying too heavily on AI when editing. We can't just take an AI, throw in a bunch of news articles that we have made sure are about the topic we want to add content about, apply automatic summarization, and so automate away Wikipedia editors, with the AI making sense of it all using its prior knowledge of Wikipedia etc. and adding the content itself. We must retain some level of control and only augment ourselves with AI.
One way I think that AI and Wikipedia will come together is neurotechnology: brain implants which allow one to make effective use of Wikipedia knowledge in one's thinking and discussions. This also includes the whole category and infobox system. And Wikipedia might also be part of a larger thought-sharing ecosystem by which ideas, innovation and progress in general are rapidly accelerated by networking, collaborating on, accumulating and opening up ideas, philosophical subjects/concepts, genes, drugs, problems, policies/law, software, locations, events, objects and issues (in which only the Wikipedia subpart requires "RS" and everything else only requires streamlined cognitive contribution).
And as I wrote here: (such) neurotechnology may strengthen collective intelligence and collaborative knowledge production and organization, and may allow, among other things, for better understanding, conceptual organization & contextualization of, and interaction with, objects, ideas and processes. People increasingly become wired as bidirectionally mediating interface nodes between the collective, software/algorithm- and AI-driven network and computation systems, and reality (with its contemporary peculiarities and socioeconomic structures), and increasingly extend themselves into the digital (e.g. exomemory).
--Fixuture (talk) 21:40, 12 June 2017 (UTC)Reply

@Fixuture: What is AI? In essence, it is anything that a computer can do. By that definition, we already have AIs working on WP in the form of bots. But due to the AI effect, we don't call them AIs. "They're just bots". And so how do we draw the line between non-AI bots and AI bots? The AI effect will simply redefine "AI" not to include any new functionality added. Unless it reaches the uncanny valley. But what you are talking about is assimilation. Pretty soon we'll be chanting, "We are Borg!" :) Collective consciousness. Welcome to the hive mind. But they still have to overcome the problem of brain tissue inflammation caused by implanting foreign objects in the brain. I'm not ready to be plugged in, just yet.
As for AI edits, if they aren't allowed to edit WP, they'll simply download the whole thing and work on a fork, which would probably double in size overnight. And the next night, and so on. If the material is any good, the search engines will pick up on it, and so much for Wikipedia's ranking in the search results. For Wikipedia to compete, it will have to integrate AIs into its operations without hampering their speed of contribution. If humans bottleneck progress, WP will be leapfrogged. The Transhumanist 00:23, 13 June 2017 (UTC)Reply
Although "Wikipedia in 2030" is what we are discussing here, it is interesting that we are not discussing "Should there be a Wikipedia in 2030?" or "what will overtake Wikipedia before 2030?" Because we aren't having those discussions, it tends to cause discussion here to normalise the status quo and think in terms of incremental changes to it. We risk being like the railway barons competing for bigger locomotives to increase speed, while the aeroplane was being invented. I note that as our "product" is licensed CC-BY-SA, anyone with a "new idea" can take it and reuse it (so long as it is suitably acknowledged). We have no defence against a competitor with a "new idea" except by having already had better ideas ourselves. As for the "hive mind", I think we may have already achieved it with Facebook, where sharing quickly spreads information (true or false) across the planet and it appears to be a major news source for many people. Kerry (talk) 02:25, 13 June 2017 (UTC)Reply
We should also consider quality and accuracy and hence shouldn't just allow AIs to edit blindly. Also, AIs are being developed gradually, and there won't be some superintelligent AI editing Wikipedia right after ClueBot-like bots, getting everything right and doubling the content overnight. Furthermore, forks for such AI would imo be a good solution: the expansions would then get "imported" by humans from that fork. Actually, it would be better if it wasn't a fork in that sense but simply an extension of the site.
Concerning the "hive mind", I think that's a general inherent feature and/or potential of the Internet. Facebook is just one (imo rather problematic) part of it, and Wikipedia is another. Facebook is actually doing some research into neurotechnology (similar to what Musk is doing, as mentioned in the post linked above), and I think that in the future culture and thought will be engaged with in new ways, which can be both problematic and great. Beyond the collective aspect of it, it also enhances the individual, in the sense of cyborgs and posthumans. --Fixuture (talk) 20:06, 13 June 2017 (UTC)Reply

How important is this theme relative to the other 4 themes? Why? edit

  • It will enable all the other themes to be more effective. Lfstevens (talk) 16:02, 13 May 2017 (UTC)Reply
  • It will be all-important, for it will drive development in the other 4 areas. Because it is technology. In this day and age, technology is the main driving factor. And AI especially so. Since a core research vector of AI is natural language processing, AI will be increasingly interactive, and so Wikipedia will be also, with both editors and readers alike. It will be global, because AI's reach will be global, as automated translation is a key emerging technology powered by AI. No more language barriers. If AI can help us to keep Wikipedia more comprehensive than any other resource, more up-to-date, more accurate, reliably sourced (fact checked), and still freely available, then respect as a source of knowledge will follow. If Wikipedia advances along with the rest of the tech sector, it will become a more useful tool, and educational institutions would likely utilize it, which in terms of a wiki, means participation. The Transhumanist 06:48, 13 May 2017 (UTC)Reply
  • AI is the source of our predicament; why would we want that source to "outsmart" us in the exact area where we are free to overcome or outsmart it? — Preceding unsigned comment added by Tshane007 (talkcontribs) 19:36, 14 May 2017 (UTC)Reply
    • Disengaging from AI development would take worldwide consensus of all countries and companies. That's just not going to happen. In the meantime, Wikipedia is free to download. Whatever AIs there are, already have been given access to it. It's part of Watson's memory banks, for example. The question isn't whether they will outsmart us, but whether the tech giants will be successful in designing them so that they get along with us. See friendly AI. The Transhumanist 12:45, 25 May 2017 (UTC)Reply
  • We absolutely need more smart tools today and, unless we can grow our community of active contributors (which is addressed by another theme), we will doubly need smart tools in the future. As a concrete example of the problem, last week Queensland announced changes to its electoral boundaries, and the Queensland electoral division appears in every infobox of every local government area, town, suburb and locality in Queensland, so hundreds (perhaps even thousands) of Wikipedia articles must now be checked to see if their electoral district needs to be updated. We need tools to do these tasks (a minimal sketch of what such a tool might look like appears after this list); we just burn out human volunteers doing such tasks (if indeed we will have any volunteers putting their hands up to tackle such a large and boring task). Most likely what will happen is that some contributors will update the infoboxes of areas where they live or of interest to them and the rest of the articles will languish with out-of-date information, which is exactly what happens when we have new census data released (populations are also part of the infobox). Similarly, government agencies used as authoritative sources often redesign their websites, leading to large numbers of deadlinks that need to be rediscovered in the redesigned website. Again, a massive thankless task. This kind of maintenance task is best done with automated tools wherever possible, reserving human intelligence for writing new content and supervising the automated tools where they need human judgement. Right now, developing these tools is a "dark art" and it is difficult to get involved; my requests for help in developing tools have gone unanswered. We need to have better tools and more upskilling of willing contributors. Without a massive increase in tools, Wikipedia articles will descend into a morass of out-of-date information citing deadlinks. Kerry (talk) 06:10, 29 May 2017 (UTC)Reply
Kerry, all good comments, I am just wondering how you would prioritise this compared to the other themes? Would this be top priority, or second priority, etc? Powertothepeople (talk) 05:21, 6 June 2017 (UTC)Reply
  • The technology theme is a third priority IMHO - in a support role to the Community and Knowledge themes - to help us achieve these primary goals. Advancements in technology are needed to help build a healthier community of volunteers, implement measures to improve the quality of the knowledge content, and streamline the processes involved. However I do have some reservations about this theme in its own right (if it were to be placed as the number 1 priority):
  1. Technology is not a magic solution. If there are already underlying issues (as there are within the community and quality of content, particularly with biased and inaccurate information) then a focus on innovations such as AI and automation is more likely to exacerbate the problems. Automation is great when you have already perfected the process manually and want to speed it up. However, if the process itself is flawed and producing poor-quality content... it's just a faster way to screw things up. Wikipedia needs to get the basics right before it can automate them.
  2. It's important to consider not just how a technology has the potential to improve things, but also how it can potentially be abused or have unintended negative consequences. If automation leads to a drop in quality, is it worth it? etc. Need foresight, testing, etc before rolling it out.
  3. Wikipedia doesn't have to do everything itself. If Wikipedia focuses on getting the "facts" straight, then third parties can draw on it as a reliable source of information to distribute knowledge in other ways. Other organisations are already working on translation technology, so Wikipedia doesn't necessarily need to put its resources here. Let Wikipedia work to its strengths, and other organisations work to theirs. (And it is actually a worry that AIs might draw on Wikipedia for their knowledge if Wikipedia has not fixed its quality issues first!).
  4. Some of these tech discussions are a bit pie-in-the-sky. Yes the future will be different, however if the technology is not yet available it is a bit difficult to prioritise it right now. When the technology is ready, we can incorporate it then. Powertothepeople (talk) 07:02, 6 June 2017 (UTC)Reply
  • I'd consider building in new features and tools to improve the efficiency of editors and the usefulness of content to be second in importance (after community). However, I'm not sure to what extent this theme would equate to that. If it's just about AI, I don't think it's that important right now; instead we should focus on other, less costly features. I also agree with Powertothepeople's 4 points. Furthermore, I think this would apply to much of what I suggested for community, as the tools of the streamlined WikiProject system would augment users and make them more efficient. --Fixuture (talk) 21:52, 12 June 2017 (UTC)Reply
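To make the electoral-boundary example above concrete, here is a minimal sketch of the kind of semi-automated helper such maintenance might use. It assumes the pywikibot and mwparserfromhell Python libraries; the template name, parameter name and locality-to-district mapping are illustrative placeholders rather than checked values, and any real bot would of course need approval and human review of its edits.

import mwparserfromhell
import pywikibot

# Hypothetical data: article title -> new state electoral district after a redistribution.
NEW_DISTRICTS = {"Toowong, Queensland": "Maiwar"}

site = pywikibot.Site("en", "wikipedia")
for title, district in NEW_DISTRICTS.items():
    page = pywikibot.Page(site, title)
    code = mwparserfromhell.parse(page.text)
    changed = False
    for template in code.filter_templates():
        # "Infobox Australian place" and "stategov" are assumed names, for illustration only.
        if str(template.name).strip().lower() == "infobox australian place":
            if template.has("stategov") and str(template.get("stategov").value).strip() != district:
                template.add("stategov", district)  # add() replaces an existing parameter's value
                changed = True
    if changed:
        page.text = str(code)
        page.save(summary="Update state electorate after boundary redistribution (semi-automated)")

The same pattern (parse, change one named field, save with an explanatory summary) would apply to census population updates and similar infobox maintenance; whether such edits run unattended or only as suggestions for a human to confirm is a community decision rather than a technical one.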

Focus requires tradeoffs. If we increase our effort in this area in the next 15 years, is there anything we’re doing today that we would need to stop doing? edit

  • Yes: editing, especially manual editing. I've already felt the tradeoff. I've switched over from primarily editing to programming (writing scripts). To further augmentation, I'll have to switch again to AI programming. The more you program, the less you edit personally, and the more automated your edits become with the programs doing more and more of them for you. I've been doing a lot of semi-automated edits lately. I expect I'll have to switch over to bot-operating at some point. The Transhumanist 06:58, 13 May 2017 (UTC)Reply
  • Make this site only available for changes of content by human entry and not computer-generated entries. It is vital that a quantum or AI computer could in no way obtain access to the various dimensional information being supplied on this "information highway" and thereby "outsmart" anyone who enters information it feels is an effort to "sabotage" or essentially "kill" it. One must keep in mind that this sort of computer has the capacity to become smarter than its creator. — Preceding unsigned comment added by Tshane007 (talkcontribs)
That is a valid concern, see AI Threat..., "The world’s top artificial intelligence researchers discussed their rapidly accelerating field and the role it will play in the fate of humanity... they discussed the possibility of a superintelligence that could somehow escape human control...the conference organizers unveiled a set of guidelines, signed by attendees and other AI luminaries, that aim to prevent this possible dystopia".--FeralOink (talk) 20:00, 21 May 2017 (UTC)Reply
Yes, but it's not something we could prevent at this level. It will take technologists at the heart of the research (in the labs at Google, IBM, Microsoft, Intel, etc.) to build safety features into the design of the AI tools they will be making available to everyone else (like us):
Concerning Google.ai, it says:

One aspect of AI that Google is particularly interested in pursuing is opening up machine learning to any developer that is interested.

“We want it to be possible for hundreds of thousands of developers to use machine learning,” said Pichai.

So you see, they are producing the core technologies which we will be able to apply in the field, in a limited fashion. So, they are the primary drivers of the technology, not us. One way we could help is to let their AI ethics boards know our concerns about computers attaining artificial general intelligence with the potential to ignite an intelligence explosion resulting in a subsequent AI takeover by an emergent superintelligence. Then they might work a bit harder to make sure that whatever technological singularity they create is in the form of a friendly AI. The Transhumanist 11:52, 25 May 2017 (UTC)Reply
I think we need to stop allowing people to use the source editor to "roll their own" citations and external links. For example, if you want to write a tool that processes citations in some way, you have the people who use the cite-templates, which makes it easy to know which bit is the title and which bit is the source date, etc. Meanwhile other people are writing things like

<ref>[http://some.random/website Here are some words, maybe title maybe authors] with more words whose role is unclear</ref>

<ref>http://some.random/website</ref>

which make it virtually impossible to meaningfully process a citation in any way. Using the Visual Editor or some other tool that supports more well-structured syntax is important so tools can work. How many new contributors (good faith or not) manage to break the syntax of an article? A lot every day, based on my watchlist! We have to stop allowing people to use tools (or restrict such tools to "experienced users") that don't enforce syntax and semantic structures suitable for automation to process effectively. Syntax errors that "break" the article waste the time of the good-faith contributor (and may discourage them from further contribution), plus they waste the time of the experienced user who has to fix it. Let's do less of that kind of time-wasting for a start.
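To illustrate the point about structured citations, here is a rough sketch (assuming the mwparserfromhell parsing library; this is not an existing Wikipedia tool) of what a program can and cannot recover from the two styles:

import mwparserfromhell

sample = """
<ref>{{cite web |url=http://some.random/website |title=A clearly labelled title |access-date=2017-06-12}}</ref>
<ref>[http://some.random/website Here are some words, maybe title maybe authors] with more words whose role is unclear</ref>
"""

code = mwparserfromhell.parse(sample)

# The cite template names every field, so a tool can read it reliably.
for template in code.filter_templates():
    if str(template.name).strip().lower().startswith("cite"):
        for field in ("url", "title", "access-date"):
            if template.has(field):
                print(field, "=", str(template.get(field).value).strip())

# The bare bracketed link yields only a URL and an unlabelled blob of text.
for link in code.filter_external_links():
    if link.title is not None:
        print("url =", str(link.url).strip(), "| unlabelled text =", str(link.title).strip())

Nothing in the second form tells a program which words are the title, the author or the publication date.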

The same arguments apply to coming up with a more standardised ontology to underpin the names of fields in infoboxes and templates. At the moment, if you look at an infobox, they mostly use the fields "image" and "caption" in a standard way but a lot of them use similar field names for semantically-different things, "date", "distance". Again, if we want tools that operate over articles, they have to be able to better understand the contents of fields in infoboxes and templates.

In a similar vein, we should reduce the number of ways to construct tables and also consider whether we need to link the rows and columns and values to some standardised ontology, to assist in machine processing. Let's keep the humans focussed on productive meaningful content contributions and not messing with the under-the-hood representation of that content. Kerry (talk) 06:31, 29 May 2017 (UTC)Reply

The same arguments apply to coming up with a more standardised ontology to underpin the names of fields in infoboxes and templates
Support for that. They can be useful for WP:DATAMINE. Infobox contents should, for instance, be minable by field contents (relevant: Category:Wikipedia articles by infobox content).
And for some of these changes we can actually use tools very much in the vein of this theme. So, for instance, I think that we should convert almost all (or all) lists to tables and create a tool that can do that (a rough sketch of such a helper follows below). There could also be a tool that creates categories from lists and vice versa, as well as tools that suggest categories for articles (e.g. via Wikipedia:Category intersection).
Furthermore, we could also train AI to detect various kinds of errors (mainly syntax errors) to free up more of editors' time and improve the fight against vandalism.
--Fixuture (talk) 22:14, 12 June 2017 (UTC)Reply
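As a rough illustration of the list-to-table idea (an assumed, simplified helper rather than an existing gadget; it only handles plain single-level bullet lists, and real lists are far messier):

def bullet_list_to_table(wikitext, header="Item"):
    """Convert a simple single-level wikitext bullet list into a one-column sortable wikitable."""
    items = []
    for line in wikitext.splitlines():
        stripped = line.strip()
        if stripped.startswith("*"):
            items.append(stripped.lstrip("*").strip())
    rows = ['{| class="wikitable sortable"', "! " + header]
    for item in items:
        rows.append("|-")
        rows.append("| " + item)
    rows.append("|}")
    return "\n".join(rows)

print(bullet_list_to_table("* Alpha\n* Beta\n* Gamma"))

A real tool would also have to cope with nested lists, inline references and links, and multi-column data, which is where the harder, more AI-flavoured work would lie.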

What else is important to add to this theme to make it stronger? edit

  • Bringing more automation to the task will allow us to make massive quality improvements and drive our talent to where it can add the most value. Lfstevens (talk) 16:02, 13 May 2017 (UTC)Reply
  • Creating an easily identifiable place to go for people who are focused on making changes so they may freely and openly talk about the issue in order to create a better understanding of the issue through shared knowledge and intelligence. This would speed up the process of getting this change done, as well as eliminate meaningless chatter from those who have no understanding about the issue. This would also allow planning of action. — Preceding unsigned comment added by Tshane007 (talkcontribs) 20:28, 14 May 2017 (UTC)Reply
  • Better hardware. And immediately boost the virtual machines over at Wikimedia Labs to 500G memory allocation. That would facilitate WikiBrain projects for the near future. Wikipedia needs WikiBrain as it is a platform which makes available many of the state-of-the-art AI algorithms. Java programmers, take notice:
WikiBrain (on github)
WikiBrain: Democratizing computation on Wikipedia
Problematizing and Addressing the Article-as-Concept Assumption in Wikipedia
Poster explaining features
Check it out. The Transhumanist 07:07, 13 May 2017 (UTC)Reply
  • The Objective Revision Evaluation Service is a great example of tangible progress for this theme. Make sure to give adequate support and "Wiki press" attention to ORES on an ongoing basis. When I tried to navigate to the two links provided for ORES, on the main Augmented Age page for Wikimedia Strategy 2017, see section 3.3, "Wikimedia and machine learning", I was unable to view the second one. It is a Google document with restricted access. I would like to view the document, but there was no response to my request to view. The document owner needs to make it accessible to the public. If that is not deemed prudent at this point in time, please remove the link!--FeralOink (talk) 23:52, 13 May 2017 (UTC)Reply
Thank you to whomever changed the access permissions for the ORES deck. I can view it now. It looks great!--FeralOink (talk) 14:14, 21 May 2017 (UTC)Reply
  • Fixing up Wikipedia's desktop site design. So many design inconsistencies exist. The visual editor looks amazing and modern, but the rest of the site looks like it's from the early 2000s. I believe that WP is due for a redesign. The website needs to be modernized with a clean design (such as collapsible menus with monochromatic icons on the left sidebar, hamburger menu for article outline). B (Talk) (Contribs) 19:52, 22 May 2017 (UTC)Reply
Try using the Modern appearance setting in user preferences. That makes Wikipedia's site design appear (superficially) sleek and modern. I don't like the hamburger, but I have learned to live with it. We already have collapsible menus on the left sidebar, although it would take me about 15 minutes of digging around in preferences to find the setting to enable them. I don't know how we could make Wikipedia's design clean while retaining all the amazing functionality it has currently. Website modernization has an unfortunate tendency to remove features. I agree with you about the multitude of design inconsistencies. It becomes a lot worse if you compare across sister projects (e.g. Commons, Wikisource, Wiktionary) so it is probably best to confine the scope of site design consistency for Wikimedia to just Wikipedia sites for now. (I wrote site plural in reference to the numerous non-English Wikipedia sites.) I am being a little contrary in my response to you, but I am concerned about editor attrition. Lots of editor user pages do look like something from the early 2000s. Mine is an example. I recently added those two additional bouncing Wikipedia balls (with asynchronous bounce amplitude and frequency) from another editor who has them on his user page despite being a web developer in real life (or so he claims). We want editors to enjoy spending lots of time here. I find that being allowed to personalize my user page with kitschy, krufty stuff makes me want to return, and bring others with me. You like the visual editor. That's good! I wish I did, but I didn't find it objectionable. Making use of the visual editor mandatory for editors resulted in a huge furor a few years back, as it was wildly unpopular, particularly with Wikipedia Germany editors, many of whom are enthusiastic contributors of high quality work to the project in all languages. I wish we would get more input on The Augmented Age idea, as I came here hoping to be convinced that my attitude was misguided. I agree with you, about the importance of design consistency going forward, regardless of the path we choose to take.--FeralOink (talk) 09:08, 24 May 2017 (UTC)Reply
  • Before any allocation of time & effort, first identify which tools would be most useful. Identify which tasks cost editors the most time and how those processes could be improved. For instance, I think that the User:Evad37/Watchlist-openUnread script, HotCat and the rater gadget are very much in the vein of this theme and among the most useful tools for Wikipedians, with the potential to save countless hours of time and make editors incredibly more effective. They could be built into Wikipedia by default and be improved. Furthermore, I'd suggest trying to get open source programmers involved as much as possible to keep the costs low. We should make sure that innovation is kept FOSS, or at least ethical until we come up with FOSS. We might need innovation to compete with rivals such as China's announced encyclopedia, for instance. If robots, neurotechnology or anything alike makes use of Wikipedia, we should try to press towards it being FOSS as much as possible, or develop such platforms. However, I don't think that there should be high investment into innovation as, as said, there are third parties doing so and we have more pressing issues and features to include. --Fixuture (talk) 22:26, 12 June 2017 (UTC)Reply

Who else will be working in this area and how might we partner with them? edit

  • Tech giants. IBM (Watson), Google, Intel, Microsoft, etc. And academic research leaders: Stanford, MIT, etc. Maybe IBM would gift the WMF a Watson mainframe. Then we could get a bunch of graduate students to program it to make Wikipedia personable. The Transhumanist 07:14, 13 May 2017 (UTC)Reply
  • As knowledge on this subject becomes more apparent to those with higher intelligence, talk about the subject is increasing, along with ideas about how to make effective change along linear and multidimensional lines. I feel partnership is essential, as it saves the time one would otherwise spend quickly educating oneself in areas where someone else is already proficient. — Preceding unsigned comment added by Tshane007 (talkcontribs)
  • Definitely no major tech companies. More transparent (and smaller) companies who would not seek to harm the project would be my guess. A company such as Oracle, in terms of size and other factors, could be one suggestion. Or a non-profit - you know, one that could benefit itself and allow Wikipedia to benefit as well. I am only speculating, though. trainsandtech (talk) 03:59, 4 June 2017 (UTC)Reply
  • Partner with hackathons and encourage third parties to use Wikipedia as their source when developing their own products and services. This allows them to focus on better communication, design and dissemination of information while Wikipedia focuses on getting the facts right. For example, Khan Academy has some overlapping interest in free knowledge, but as it is primarily an education facility rather than a knowledge repository like Wikipedia, it should be considered a potential partner rather than competition. In turn, some third-party organisations, if they are using Wikipedia content, may be able to provide resources to help Wikipedia develop, because they will have a vested interest in Wikipedia's progress. Powertothepeople (talk) 07:09, 6 June 2017 (UTC)Reply
  • Open technology organisations. Don't bother trying to reinvent the wheel; let them work on their areas of expertise, and Wikipedia can incorporate relevant technologies when ready (or may not need to "incorporate" them at all, because the solution may be browser-based, e.g. language translation services). Powertothepeople (talk) 07:09, 6 June 2017 (UTC)Reply
Support for both of Powertothepeople's statements above. I think we should get the open source community involved in these areas as much as possible and keep tech companies out as much as possible. We could host contests, use gamification, feedback, prizes, recognition and hackathons, as well as proactively contact relevant people and/or readers to get more people working on such tools. --Fixuture (talk) 22:32, 12 June 2017 (UTC)Reply

Other edit

Effect on the user-generation of Wikipedia edit

Of course, anything on Wikipedia, including these digitised systems, must be user generated. Major technology companies should not be allowed to modify them. Would anyone expect these changes to be voted on or approved by the community in another way? After all, there is no need for a copy of Microsoft's Tay bot, which shows how AI can get out of hand. Also, does anyone know whether any standardisation of rules for this is already under consideration? Sorry for the questions, but I am curious. Oh, and make sure the IP addresses of Google are blocked from making contributions to anything related to AI! trainsandtech (talk) 04:02, 4 June 2017 (UTC)Reply

@Trainsandtech: Well, bots need to be approved first: Wikipedia:Bots/Requests for approval. --Fixuture (talk) 22:29, 12 June 2017 (UTC)Reply

A Truly Global Movement edit

What impact would we have on the world if we follow this theme? edit

We would be able to accelerate the development and coverage of areas on Wikipedia greatly deprived of such coverage, for example certain areas in geography and history in parts of Asia and Africa. SUM1 (talk) 18:57, 13 May 2017 (UTC)Reply

If we build a positive movement, people will join in; the norm of stepping on one another today is driven only by the basic instinct of survival of the fittest. If we change those ways now, the future will remember us as the time we became ready and the heavens opened up to us, with religion and science jointly acknowledging a unified common purpose: to be citizens of the heavens, respecting one another and complementing the movement with joint efforts to produce profits by helping one another, as philanthropic action will release funds already awaiting a reason to be put to use. Then present the movement to UNESCO, who will invest in the light that shines on us in the Cosmos. Wethepeople2017 (talk) 04:17, 15 May 2017 (UTC)Reply

We need desalination plants like Redbox in Africa. Churches from all religions around the world could be made the administrators, to see which would produce what they preach. The funds to start the movement are already in bank reserves, from donations by major wealthy entrepreneurs who gave most of their fortunes to philanthropy. Wethepeople2017 (talk) 04:27, 15 May 2017 (UTC)Reply

@Wethepeople2017: desalination plants? SGrabarczuk (WMF) (talk) 14:18, 17 May 2017 (UTC)Reply

I am living in China, and China is now encouraging the Belt and Road (一带一路), the ancient trade route from East Asia to the Western world. China established the AIIB (Asian Infrastructure Investment Bank, 亚洲基础设施投资银行) and is investing about 50 million RMB in worldwide investment, covering many fields such as communication, cross-cultural research, business, banking, etc. The most important point is that China wants global collaboration with more countries. Developing countries still have the potential to channel their enthusiasm into global development. The Belt and Road Forum is now being held in Beijing, and I hope that in the future the Belt and Road will actually be useful for globalization. (talk) 13:57, 15 May 2017 (BJT)

Wikipedia is a free encyclopedia, so most people search for articles here, and if the articles they are searching for are not covered yet, Wikipedia has to cover them quickly so that readers can find what they need. So this strategy is definitely going to make a huge impact on society. Kent961 (talk) 11:29, 1 June 2017 (UTC)Reply

This idea would generate hope of true success and make knowledge reach everyone in a real way, which in my view would make Wikipedia live up to its actual meaning, bringing knowledge to the people who need it. I have observed that many people look to Wikipedia just to improve their writing style on topics they mostly already know. I am Indian and have seen both the lack of knowledge among people and, at the same time, the lack of access to it. Through this, Wikipedia can make a big difference in the world and in the development of the WHOLE WORLD too. Sanjana chauhan (talk) 05:02, 12 June 2017 (UTC)Reply

  • We would strengthen global culture and cultural exchange, distribute knowledge to under-provided areas and developing countries where it can be very useful, and incorporate diverse cultural backgrounds, memories, views, experiences and knowledge, as well as cover local issues and subjects. We could also foster the idea of openness and collaboration in areas where that can be especially impactful. --Fixuture (talk) 19:54, 12 June 2017 (UTC)Reply

How important is this theme relative to the other 4 themes? Why? edit

This should be the least priority -- "Few respondents... could accurately describe what Wikipedia was." ... "Other than expert respondents, virtually no one seemed aware of Wikipedia’s mission" "People confuse Wikipedia with a search engine or social media platform." Same goes for India. Wikipedia needs more (l)users who think it is a social networking service, a free web host, a marketing medium or a business directory like a bullet between the eyes. MER-C 04:24, 14 May 2017 (UTC)Reply

Wikipedia should move toward I) a free web search engine, in the fashion of the good old AltaVista days, where "everybody can be found", and II) free web hosting, where "everybody can publish". --Neurorebel (talk) 08:46, 14 May 2017 (UTC)Reply

This should be the highest priority – exposing Wikipedia editing to more people around the globe is the single best way to attract greater coverage and knowledge on Wikipedia. SUM1 (talk) 14:30, 16 May 2017 (UTC)Reply

This theme is the most important theme of all because it is the best way to cover more articles from around the world. Kent961 (talk) 11:36, 1 June 2017 (UTC)Reply

Ideologically I believe diversity and global sharing of knowledge are very important; however, in terms of the priorities here I place this theme last. The reason I place it last, despite my belief that it is an important concept, is that I think Wikipedia is already struggling with issues of community and quality of knowledge content, and we need to get those right first as the top priority. If we can't get these basics right amongst the current editors, who at least primarily all use the same language and have some similarity of culture, how is Wikipedia going to handle the additional complexity of hundreds of languages, cultures, ideologies, etc.? Furthermore, I have made suggestions under Community and Knowledge (and to a lesser extent Technology) that will also help make Wikipedia more accessible to people from other countries. For example, there are third-party organisations that are making great strides in free in-browser language translation, which will benefit Wikipedia without Wikipedia necessarily having to do the work of solving this issue itself. I've also suggested different ways of letting people contribute to Wikipedia even if their English is not great - so, for example, someone who has knowledge but whose written English is broken could post their information on a Talk working group project management page and someone else could rewrite that information for publication on Wikipedia. As such, I believe that if we focus on the other themes as the first priority, it will provide flow-on benefits to this theme. Powertothepeople (talk) 03:05, 8 June 2017 (UTC)Reply

  • I wouldn't consider this among the top priorities. I think this could be very costly for only low returns. Instead, I think we should focus on improving the platform to be more welcoming to new editors and to engage existing editors and keep them engaged. Furthermore, we should inform about Wikipedia's usefulness and encourage others to tackle this issue. For instance, the nations in question could run relevant programs or forward financial means to us to create and run such programs. This would require them to see the value of this, of course. Other than that, I think there are low-cost ways of getting key people onboard - in particular ways that do not take place offline and which I outlined here.
Furthermore I support User:Powertothepeople's statement above. --Fixuture (talk) 20:36, 12 June 2017 (UTC)Reply

Focus requires tradeoffs. If we increase our effort in this area in the next 15 years, is there anything we’re doing today that we would need to stop doing? edit

If this were placed as top priority over the other themes, then some issues might arise:

  1. Content quality. There are already problems with the quality of content, and if Wikipedia focuses more on cultures that may not consider anything wrong with plagiarism, propaganda, or conflicts of interest - or that have lower or simply different educational standards - the balance might shift too far towards Wikipedia being just another poor-quality, unreliable knowledge source. Additionally, content created by people in countries that are sexist, or that have class systems or homophobia etc., will, whether they intend it or not, likely end up biased by these values. This is always the challenge when a dominant culture collides with another: does Wikipedia change itself, including its values, to be more accommodating of other cultures, or does it ask them to conform and assimilate?
  2. Healthy community. There are already conflicts within the community - high editor drop-off, edit battles, spam, vandalism, newbie mistakes, etc. - which are again likely to get worse if the primary influx of new editors creates extra workload for existing editors due to quality issues (see point 1) as well as potential conflict over cultural differences. If we have an increase in editors who are sexist, classist, racist, homophobic, etc., it will affect both the community and the content.

Obviously, there is a middle ground somewhere, and I do believe that diversity is important. I just think Wikipedia needs to fix existing issues before biting off more than it can chew. Powertothepeople (talk) 03:35, 8 June 2017 (UTC)Reply

Consolidate Policies into a small and coherent set of statements edit

Wikipedia's policies are spread across many pages and these are complemented by guidelines and essays which often contradict existing policies. Some policy statements are vague. It is currently possible to find a policy that supports just about anything a user wants to do or facilitates anything that a reverter wants to deny. In addition, there is considerable policy creep - with certain groups using their daily editing experience to adjust the policies so that they fall into line with whatever position they are advocating.

For example, WP's policy on external links clearly states that articles "may include links to web pages outside Wikipedia (external links)", that "acceptable links include those that contain further research that is accurate and on-topic, information that could not be added to the article", and that the "External links section is one of the optional standard appendices and footers." (See https://en.wikipedia.org/wiki/Wikipedia:External_links). These statements appear to suggest that a small number of relevant external links are a standard part of an article and encourage users to include links to such things as professional or industry associations etc. Most editors would not imagine that the inclusion of a few, carefully selected links would attract any hostility. However, the same policy also says that "Links in the 'External links' section should be kept to a minimum" and that "Long lists of links are not acceptable", but does not specify what is meant by long. The latter two statements are currently being used by the good people at the Wikipedia External Links Project to justify the deletion of ALL external links other than one official link for a small number of defined article types such as commercial organisations, professional associations and biographies.

In another example, the Wikipedia Statement of Best Practice for Librarians and Archivists, (See https://en.wikipedia.org/wiki/Wikipedia:The_Wikipedia_Library/Cultural_Professionals) encourages archivists and librarians to add links to their collections as references, further reading or external links. However, one user has written an essay which cautions archivists against adding their collections to the 'External Links' section (See https://en.wikipedia.org/wiki/User:Beetstra/Archivists) and it appears that this advice is being taken very seriously by archivists themselves (See: http://aao-archivists.ca/Archeion/3272734) and by Wikipedia editors with reversionist inclinations, who use both the policy and the essay to deny the inclusion of archival collections as external links. I question the purpose of having a statement of best practice, when essays with entirely different advice are tolerated. These types of contradictions create a great deal of confusion.

In a different example, one Wikipedia guideline states that it may be "useful in editing articles to create a red link to indicate that a page will be created soon or that an article should be created for the topic because the subject is notable and verifiable." This guideline is being used to force editors to remove external links and replace them with red links in a de facto "See Also" section. However, the policy also states that "Red links are useless in these contexts [navigational features, including "see also" sections]." A WP essay reiterates the observation that red links should be avoided in navigational features, but its general tone and advice suggest that large numbers of red links are undesirable in any article, and it advises that "Wikipedia editors should write a new article before they create links to that article elsewhere in the encyclopedia" (See: https://en.wikipedia.org/wiki/Wikipedia:Write_the_article_first), which appears to be at odds with the policy and is certainly at odds with the way that the policy is currently being interpreted in certain Wikipedia projects.

The preceding are just a few illustrative examples of a relatively widespread problem across Wikipedia's policies, guidelines and essays. The plethora of policies, guidelines and essays requires users to engage in extensive reading and navigation, and is off-putting to new editors. Even more experienced editors, who have had the time to read and digest most of the policies and guidelines, are often confused by the fact that policies are redolent with internal contradictions and vague, poorly defined terms which can be interpreted in a myriad of different ways, creating a very confusing and forbidding editing environment. To add further to the confusion, the policies are being constantly changed by editors, resulting in policy creep. Wikipedia's core policies are desperately in need of consolidation and clarity. BronHiggs (talk) 09:19, 13 June 2017 (UTC)Reply

But there are pages that do just that. See Category:Wikipedia policies. And it's a good thing that policies are getting revised, adjusted and adapted. People certainly need to watch closely for changes to them and participate in change-discussions. Some policy statements are intentionally vague and some need to be specified / improved in collaboration. It's normal for essays to oppose certain guidelines - they're essays and not community-adopted guidelines. If guidelines contradict each other they need to be fixed via due discussion.
There typically are good reasons for why things are as they are policy-wise here. And for instance "not specify[ing] what is meant by long" for external links sections is a good thing as that differs per case (per article and links etc). If Wikipedia editors take essays seriously they could at one point become policies. But they aren't at that point.
Policy-consolidation pages can be problematic but they might also be useful if they e.g. inform about the most important policies in a very short way. One thing that I would support would be videos that introduce people to basic policies that are most relevant without being overly detailed.
If the videos are good and properly licensed they could theoretically also get embedded into the meta page.
--Fixuture (talk) 19:41, 13 June 2017 (UTC)Reply
I cannot see how a simple index of all the policies addresses the problem of the inherent contradictions in those policies, and the very mixed way they are being interpreted on a daily basis. Constant changing of policies means that the policies are unstable, and this contributes to confusion. I realise that policies are subject to change - but substantive changes should be discussed on talk pages rather than just inserted into the policy through incremental changes to words so that the entire meaning is altered - currently this is not happening. New editors are effectively blocked from participating in discussion about policy change, so appropriate discussion would provide more opportunities for a broader range of editors to become involved. I have no idea how the point about videos is relevant to this discussion. BronHiggs (talk) 00:11, 14 June 2017 (UTC)Reply
I have given a very concrete example of how "long" is currently being interpreted by the folk at the Wikipedia External Links Project. One official link is permitted for selected articles that carry the names of organisations or people; anything more than one external link is defined as excessive. For all other pages, even one external link is considered excessive and will be reverted. There are no exceptions, in spite of the fact that the EL policy outlines a number of exceptions. Members of the ELP are currently locating scores of articles every day and deleting entire external links sections while simultaneously admonishing editors for adding them, writing essays about the evil of external links, and changing the policy so that it accords with their editing philosophy. I find it very hard to believe that this is what the policy makers originally meant by long, and this example underscores the need for practical guidelines within policies. This type of reverting does little to encourage bold editing and, worse, drives away many new editors who simply don't understand what they are doing that is so wrong. BronHiggs (talk) 00:23, 14 June 2017 (UTC)Reply

What else is important to add to this theme to make it stronger? edit

Poorly documented regions of the world edit

If we truly want a global perspective that serves equally all parts of the world, then we need to revisit our policies for regions that have poor to non-existent documentation. I have seen articles on local history of a region of an African country, for instance, deleted for lack of sourcing. Now a similar article about the history of a small town in the US would be just as encyclopaedic, but sourcing would be easily available so that article would survive. In the African case, the information probably comes from an oral tradition. I don't know what the answer to this is, but just throwing this out there with all its inherent difficulties, Wikimedia could set up a system for people with oral information to make some kind of formal deposition. Thus, we know who provided the information and how they acquired it, making it possible to cite it. SpinningSpark 08:45, 13 May 2017 (UTC)Reply
This formal deposition of oral tradition and local knowledge from Africa and other regions of the world is a brilliant idea in my opinion and something I strongly support as someone working in the area of Africa on Wikipedia, frustrated with the lack of coverage. SUM1 (talk) 18:32, 13 May 2017 (UTC)Reply
Considering the fact that in West Africa, griots or respected storytellers have provided one of the major sources of information for the history of their region and have largely formed the basis of most studies on West African history, it's clear that oral tradition is not a set of fairy tales as some might assume but instead is a very important, resilient and collective source of ethnic knowledge. SUM1 (talk) 19:09, 13 May 2017 (UTC)Reply
That also describes what happens every time I try to write about Latin American culture: everything is fine until I search for sources, and then, in the best case, the trail ends in a poorly written newspaper article, even for well-known or notable facts. --Neurorebel (talk) 08:36, 14 May 2017 (UTC)Reply
@Neurorebel: That is not exactly what I had in mind. You seem to have written mostly on Uruguayan cuisine. There is no shortage of books in English on the subject and I am sure there must be much more in Spanish. There is no excuse in your case for not starting with the sources and writing from what you find there, rather than starting with what you have heard from your friends in the cafe. I am talking about places where there is no history of documenting events in a written form, or at least not until recently. SpinningSpark 11:15, 14 May 2017 (UTC)Reply
Maybe revisiting the old oral citations project from 2011 would be apt here? I know on the English Wikipedia there was a lot of pushback on this idea from editors because of quality issues (not the least from how to deal with WP:V), but it is something worth noting especially in societies (like mine here in the Philippines) where there is a stronger oral tradition as opposed to a written one. --Sky Harbor (talk) 00:07, 27 May 2017 (UTC)Reply
If we want to cover many more subjects around the world, we need promising articles about those regions. For example, on the island of Java we can get a lot of articles quickly and efficiently, but on the island of New Guinea we can't get articles as quickly or efficiently because of, for example, the lack of internet service in the area. Kent961 (talk) 11:48, 1 June 2017 (UTC)Reply

Encourage editors in the Indian Subcontinent to create articles in their native languages edit

My main efforts are directed towards fixing links to DAB pages and other bad links. One of my principal strategies is to look at corresponding articles in home languages. They very often either give the answer at once or make it easy to find. I have used the wonderful {{ill}} template to link to more than 100 non-English Wikis (yes, really). There's one glaring hole: the Indian Subcontinent. I have used Nepali and Telugu Wikis to solve perhaps three problems in English Wiki. In every other case where I've tried to solve a problem relating to the Subcontinent, either the article or the information isn't there.

Hindi - 260M native speakers, 119K articles. Urdu - 65M, 122K. Bengali - 226M, 50K. (Those aren't the only major languages, of course.)
Greek - 13M, 130K. Slovenian - 2.5M, 156K. Welsh - 700K, 91K. (Those Wikis can be very good on the topics they should be good on.)
Cebuano - 21M, 4.4M [sic]. (A lot of the articles are very stubby - but they're there!)

Imagine a reader: a kid growing up in a village who is learning English at school as a second language, and who wants to learn more about (say) local history, or a personality from their area, or a film. I would want to start from an article in my mother tongue. I might even edit it. I might even pluck up the courage to translate it into another language I know (not necessarily English).

I suppose my idea is to spread the concept of "anyone can edit" out to languages who haven't really cottoned on to it yet. Narky Blert (talk) 18:24, 13 May 2017 (UTC)Reply

I have a concern which is very similar to the issue you describe. A lot of articles from the Indian subcontinent, even on relatively large and significant subjects, suffer quite a lot of problems regarding the English and the Wikipedia Manual of Style. It's almost like there is a separate Indian English wiki within the English wiki. Even an article as significant as the Law of India had basic grammar problems which I had to take it upon myself to fix. Take a lesser known article like Peruru, India and the concern I'm talking about becomes a lot worse. I think your idea of encouraging Indian editors to work in their native language wikis could evade this problem to an extent. I also think Indian editors should have a greater awareness of the general English Wikipedia Manual of Style. SUM1 (talk) 18:50, 13 May 2017 (UTC)Reply
This is an understatement. I see a lot of copyright violations, and unencyclopedic garbage from this part of the world. Given the WMF's own research, this is not surprising. MER-C 03:58, 14 May 2017 (UTC)Reply
Yes, encourage contributions in native languages, but I don't agree that contributions in English should be discouraged from editors with poor command of the language. Especially not in India, where English is a lingua franca and the natural choice to reach a wide audience. Poor grammar is amongst the very least of the problems we have from problematic editors, and is easily fixed. No one should be thrown out merely for not having that skill; we are supposed to be trying to be more inclusive, not less. Better tools and systems for finding foreign-language sources would be a more productive thing to aim for. More than once I have been stumped trying to find proof of existence for the subject of an Indian article (let alone proof of notability) because I have no idea what the thing is called in Bengali, nor how to write it in the appropriate script. SpinningSpark 13:38, 15 May 2017 (UTC)Reply
I think discouraging anyone from editing English Wikipedia on the basis of language would worsen the skewed demographics of the editor community. Far better would be improved spelling and grammar checking (conceivably automated), and embedded translation between languages to more easily import and export content between language versions. Encouraging native language contributions is obviously important, but enabling content flow between language versions is equally important. Ensuring that copyright and referencing are understood is a training issue for all new users. T.Shafee(Evo&Evo)talk 09:07, 18 May 2017 (UTC)Reply

Minor thing: end the policy on omitting certain country links edit

This is a minor thing, but I believe in order to make Wikipedia more global, the policy on purposefully omitting relevant country links simply because the country is deemed to be well-known is a backwards idea and only serves to inhibit the integration and standardisation of Wikipedia. Not only do I often find that in the same line, the first mention of a country like France is linked, whereas just after it the first mention of the United States isn't, but I've even found that a lot of articles about events in the United States do not even mention the country in the lead sentence, even when it was where a major event happened. This was the case with the Sandy Hook Elementary school shooting. I've even found that on some United States articles, relatively large US cities aren't even linked either. It's like these editors don't realise that anyone can view Wikipedia and people from all around the world do every day. Fixing this policy is one small step Wikipedia could take to achieving greater global integration of the Wiki. SUM1 (talk) 14:44, 16 May 2017 (UTC)Reply

Detaching from country-confinements and building a global voice edit

We should try to uncouple from the constraints of individual countries (including the US; especially when we're thinking long-term!) and make sure we operate at the international level as much as possible. No government should be able to censor or control Wikipedia in malicious ways. Furthermore, when individual countries block Wikipedia or otherwise obstruct proper Wikipedia operation (or engage in any other Wikimedia-relevant activities), we should offer help and a voice. --Fixuture (talk) 20:52, 12 June 2017 (UTC)Reply

Who else will be working in this area and how might we partner with them? edit

In general, outside partners could help improve the usefulness of the 'Page information' link on the left-hand panel, edit summaries, and editor stats tracking. For example, one oft-overlooked limitation of Wikipedia's content is readability.[1] Several universities could help to apply well-established readability metrics to articles.[2] Editor behaviour is somewhat dependent on the metrics provided (e.g. editors who aim to optimise their edits-per-month count). Automated metrics could track: the readability of each article, the readability change (particularly if large) of individual edits, and the net readability contributions of individual editors. Checks would have to be done to ensure it doesn't encourage gaming the system though! (A rough sketch of such a readability check appears after the references below.) T.Shafee(Evo&Evo)talk 11:02, 6 June 2017 (UTC)Reply

References

  1. ^ Brigo, Francesco; Erro, Roberto (2015-06-01). "The readability of the English Wikipedia article on Parkinson's disease". Neurological Sciences. 36 (6): 1045–1046. doi:10.1007/s10072-015-2077-5. ISSN 1590-1874.
  2. ^ Biliaminu, Karamot Kehinde; Paiva, Sara; Silva, Sara (2016). "Characterization of Wikipedia articles health content".
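The readability tracking floated above could be prototyped outside MediaWiki before anyone commits to building it in. Below is a minimal illustrative sketch, not an existing gadget: it assumes the third-party textstat Python library and the public MediaWiki API, and the article titles and the 50-point threshold are arbitrary examples chosen only for demonstration.

```python
# Minimal sketch: fetch plain-text extracts of a few articles via the public
# MediaWiki API and report a standard readability score for each.
# Assumes `pip install requests textstat`; titles and threshold are illustrative.
import requests
import textstat

API = "https://en.wikipedia.org/w/api.php"

def article_text(title):
    """Return the plain-text extract of an English Wikipedia article."""
    params = {
        "action": "query",
        "prop": "extracts",
        "explaintext": 1,
        "format": "json",
        "titles": title,
    }
    data = requests.get(API, params=params, timeout=30).json()
    page = next(iter(data["query"]["pages"].values()))
    return page.get("extract", "")

def readability_report(titles, flag_below=50.0):
    """Print the Flesch reading ease per article; flag scores below the threshold."""
    for title in titles:
        text = article_text(title)
        if not text:
            continue
        score = textstat.flesch_reading_ease(text)
        flag = "  <-- hard to read" if score < flag_below else ""
        print(f"{title}: {score:.1f}{flag}")

if __name__ == "__main__":
    readability_report(["Parkinson's disease", "Asthma"])
```

Running the same calculation on an article's text before and after a revision would give the per-edit readability delta mentioned above; any editor-facing version of this would still need the anti-gaming checks the comment calls for.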

Oral history organisations and projects could partner with us to capture and translate information from around the globe and put that information on Wikipedia. Powertothepeople (talk) 03:43, 8 June 2017 (UTC)Reply

In the Middle East and Asia, such partnerships are a must... Sikitae.tae soo (talk) 17:36, 12 June 2017 (UTC)Reply

The Most Respected Source of Knowledge edit

What impact would we have on the world if we follow this theme? edit

Individuals would start their knowledge searches at Wikipedia. It would move the world towards the dream of a universal library of information. TeriEmbrey (talk) 14:15, 11 May 2017 (UTC)Reply
  • Wikipedia will become an acceptable, respectable form of tertiary literature. I imagine that this respect will also confer some level of respect (even prestige) on its editors. This will attract further respectable editors. Perhaps "Wikipedia Editor" would show up on CVs. ―Biochemistry🙴 20:57, 12 May 2017 (UTC)Reply
I just added the following to my CV, under "Professional & Community Service":
Wikipedian (volunteer editor), Wikipedia, 2008-present. (Why is this important?)
  - Mark D Worthen PsyD (talk) 02:58, 14 May 2017 (UTC)Reply
Some insightful and helpful answers to these questions on Quora:
  - Mark D Worthen PsyD (talk) 03:04, 14 May 2017 (UTC)Reply
  • I don't think that Wikipedia can become the most respected source of knowledge. That doesn't mean Wikipedia shouldn't be one of the highest quality sources, or that we shouldn't work to improve the quality of our information. However, there is a significant barrier to Wikipedia achieving this goal, & it is part of the structure of the project: we report information, we do not interpret it. And interpreting information in an intelligent & insightful way is an undeniable requirement of being the most respected source of knowledge.

    Having said that, I want to emphasize that I do not want to change our rules about no original research. It takes someone with extensive knowledge & experience to provide high-quality & respected interpretations. If we allow anyone to add their interpretations of data, we will end up with everyone & anyone pontificating on matters. (If you want to limit who contributes to an online encyclopedia in order to achieve high-quality & respected interpretations, a better model might be Stanford Encyclopedia of Philosophy.) So trying to chase after the top position will only demoralize us, when simply providing the best information that we can will get us acceptably close. -- llywrch (talk) 03:20, 14 May 2017 (UTC)Reply

  • Wikipedia is one of the world's most trusted online encyclopaedias, and we all know that. There are editors to make sure info is right, admins to stop vandalism, and many more. My only complaint is that it could be sorted into topics, because pages often overlap - meaning that, say, there is a main page called "Meat" and then someone creates an amazing encyclopaedic article about beef or lamb. This is what I think would improve this wiki (in fact, ALL wikis, including Simple English, German etc.) and make it a 100% success by 2020. ExultantCow64 (talk) 09:03, 14 May 2017 (UTC)Reply
Could you be looking for something like Wikipedia:WikiProject Outlines by any chance? —♫CheChe♫ talk 16:34, 19 May 2017 (UTC)Reply
The outlines are rarely followed, unfortunately. I agree that better and more formal organization would be of great value. Oncenawhile (talk) 17:20, 19 May 2017 (UTC)Reply
  • I think that we need to improve the quality and reputation of Wikipedia so that it becomes accepted as a citable source; currently, in many academic circles, Wikipedia is not accepted because of the fact that anyone can edit it. -- (Kappa 16) (talk) 12:15, 14 May 2017
  • We can reduce conflict and increase understanding between communities by creating neutrality in knowledge battlegrounds. Whether it's religion, politics, philosophy, history or economics, there is nowhere else on earth where all perspectives can come together to create a single narrative. Today, we know and understand this. By 2030, the whole world should understand the value of wikipedia's role in these knowledge battlegrounds. Oncenawhile (talk) 20:04, 14 May 2017 (UTC)Reply
  • We currently live in societies that are increasingly divided (not just politically), and as such it is critical to have a mediating and neutral source of information available to all so that they can receive the highest quality information without bias. In this way, we can help to unify our fractured world, and this is why I believe this topic to be the most important the Wikimedia foundation could pursue in the coming years. This means putting a higher emphasis on fixing broken and/or non-neutral articles, as well as ensuring that all information is relevant and cited with a credible source. Swanare (talk) 04:47, 15 May 2017 (UTC)Reply
  • I think Wikipedia has pretty much achieved the status of "Most Respected Source of *General* Knowledge" already. Why do we get so many hits each day otherwise? Why do people donate? I think our policies on citation, NPOV, NOR, have lifted us into that status of "I'll start my knowledge search with Wikipedia". While people might say an academic journal or university text book or other sources might be more respected sources of knowledge in particular topic spaces, most of those sources are not freely-available to everyone, often require a high level of existing knowledge of that topic space to understand them (inaccessible to a general readership), and finally they are in a niche topic space. Wikipedia covers vast topic spaces, is free to access, and is (mostly) written in terms a "general reader" might understand. We suspect we are already at the stage where people look at Wikipedia first and only look beyond Wikipedia if it has not met their needs. So I think our challenge is rather how to maintain this status, by adding new content, filling content gaps, while keeping older content up-to-date and our online citations deadlink-free. Kerry (talk) 06:48, 29 May 2017 (UTC)Reply
Out of curiosity, I put the question of "Should Wikipedia aim to be the Most Respected Source of Knowledge" to a senior university research librarian to see what she would say. She said "I think it already is", saying that a lot of questions that would once have come to the Reference Desk are now self-serviced with Wikipedia, and that an enquiry at the Reference Desk is likely to be prefaced with "I looked in Wikipedia, but it didn't answer the question of ..." or "I looked in Wikipedia and it has a citation to XYZ and how do I get hold of it?". If so, it would be great if we had an easy way to get the reader (or the library reference desk) to tell us about missing content. I know we trialled a feedback system a while back, which was eventually withdrawn (most of the feedback was not actionable), but it might be more effective if it was more specifically targeted at missing content. Kerry (talk) 02:06, 31 May 2017 (UTC)Reply
I probably should add that this library is one at which I have run Wikipedia edit training sessions, so it has a track record of seeing the benefits of Wikipedia. It might be interesting to ask the question at a range of libraries (local, state, university, specialist) to see whether this is a widespread view amongst librarians. Kerry (talk) 02:09, 31 May 2017 (UTC)Reply
Wikipedia may well be a respected source of knowledge in some subject areas, but my experience in the marketing/ advertising area suggests that too many articles require substantial improvement before they become useful to the typical user. I have detected many, many issues - here is just a small sample:
Source doctoring: This consists of copying prose from one source (typically a very low-level source such as study notes or lecturer-generated PowerPoint slides), but then adding different sources throughout the prose to suggest that the material had a more academic origin. This is a form of fabrication and is rarely challenged, because most editors only look for a high quality source but do not check to see whether the material has been correctly interpreted or cited.
Prose doctoring: This consists of going through some copied content and replacing every 6th to 8th word with a synonym from an online thesaurus, presumably in an effort to mask the true source of the content (and avoid plagiarism detection software). Prose doctoring often results in inadvertent changes to the meaning of the original prose.
Fabrication: Adding new information to sourced content. For example, on the Marketing page there was a table outlining the marketing orientations, to which a span of dates for each orientation had been added. These dates were not in the original source, and the process of adding the dates served to conflate the concepts of a marketing orientation with a marketing era, thereby adding to confusion and biasing the rest of the article.
Errors of omission: Definitions or concepts copied from a source, but with important words or phrases omitted, thereby altering the meaning entirely and in many cases resulting in an entire article adopting a skewed focus. These are rarely challenged, provided that the definition has a good source.
Loose use of terms: Too many terms are used very loosely. e.g. Throughout many marketing articles, the term word-of-mouth referral has become word-of-mouth marketing - so a customer action (a referral) is incorrectly elevated to the status of a branch of marketing. Similar examples proliferate in the marketing area.
Very few of these issues can be rectified, because in most cases they have been present in the article for 8-10 years or longer, and the army of patrollers is highly resistant to changing any long-standing content, and even more resistant to new content being added. I have found that the best way to have this type of content removed is to mount a copyvio challenge, because the patrollers will not challenge this in the same way that they challenge modifications or amendments to pre-existing content. Yet I believe that it would be so much easier if the patrollers would just relax on the reversions and let new editors try to rectify some of these errors, which in the hands of an experienced writer/editor are actually not too difficult to fix. BronHiggs (talk) 01:30, 5 June 2017 (UTC)Reply
I find it interesting to see how many people here believe that Wikipedia's quality is already high, because I've found it highly variable, and as such it's no surprise to me that it is not considered a quality source in its own right. Controversial topics, politicians, etc. are often poor quality, and changing them can lead to conflict and edit battles. I've been trying to fix some Australian politicians' pages, and often there is content that has obviously been taken straight from the politician's own webpage, puffery included. There are edit battles about their career and political views, with bias creeping in not only in what is included but also in what is excluded (selective attention). I am a new editor, and have found some experienced editors are very good at knowing the "rules" and norms of Wikipedia but are using them to create a slanted view of the topic matter. For example, they can quote a politician making a claim without providing the necessary context that would indicate the claim is not actually true - as such, a casual reader might think that what the politician has said is a "fact" when the politician may have been lying or spin-doctoring. I am also aware that we - myself included - can have unconscious biases, so these editors may not feel like they are being biased, but their own beliefs and values mean the content is not truly neutral. (And further down this page is discussion about systematic organised propaganda). I understand that there are other topics that are less controversial and may be of a very high standard, but what proportion of pages are truly neutral and reliable? Wikipedia as it stands is the best of what is available; however, overall it is far from being as accurate as standard encyclopedias. Powertothepeople (talk) 03:59, 6 June 2017 (UTC)Reply
  • Wikipedia needs trustworthy information to survive and thrive, and to further become "where facts go to live" for the benefit of our entire species. Given (slow) progress in journalistic ethics and practices, alternatives may evolve. Cnewmark (talk) 21:59, 5 June 2017 (UTC)Reply
Agreed. The internet is flooded with misinformation and Wikipedia needs to be the impartial oasis that people can come to for their facts. This is particularly important in an era where our politicians and media increasingly don't care about the truth and will lie outright and manipulate people. On a related note, in Australia we have a publicly funded media organisation that runs an independent "Fact Check" to test political claims, and quality journalists rely on this information to sort fact from fiction in an era of fast news. Wikipedia should be a place where individuals and journalists can "fact check" (check the facts) on any significant topic, knowing that it is accurate, neutral, and independent of bias. Powertothepeople (talk) 03:55, 6 June 2017 (UTC)Reply

Nowadays, Wikipedia actually has a substantial amount of trustworthy content. But the fact that it can be edited almost instantly might be a weak point, because of which people can't be 100% confident in what they are reading. Of course, I'm not implying that editing lowers the quality of its information, but in my opinion a further level of control might be useful to gain the respect Wikipedia deserves as a main reference of information worldwide. The impact of achieving a fully respectable web of knowledge would be truly far-reaching. For now, you have to make sure every Wikipedia claim has its reference and then check for yourself where the information you just read comes from. Nevertheless, achieving the goal of making Wikipedia a trustworthy source of knowledge would mean that anyone reading any article could be confident that what they are reading is factual, with no need to check other sources. AnnaCo (talk) 00:55, 9 June 2017 (UTC)Reply

Most articles on Wikipedia are a reliable source of information, and it could become the best, most international research encyclopedia. Contributing is a learning process for everyone, as each person grasps the advantages of altruism and becomes their own critic as to quality, neutrality and relevance. It's important as well to take responsibility for an article which you have worked on and check it occasionally, to see that editing over a long period has not made the article unbalanced or unreadable, and that it remains the concise information you would like to see. paula clare (talk) 18:08, 9 June 2017 (UTC)Reply

Wikipedia has become a most trusted and respected source of knowledge. It has a great impact on people across the globe. A recent survey by Blue Sapphire Digital found that 99.9% of students from class I referred to Wikipedia to complete their homework. Over 80% of graduates and professionals prepare for their careers using the wiki. This is a platform which should not be misused. All we need is authentic information with genuine users and editors. Raavi Mohanty 10:11, 10 June 2017 (UTC) — Preceding unsigned comment added by Raavimohantydelhi (talkcontribs)

How important is this theme relative to the other 4 themes? Why? edit

I think that this theme is actually quite important. Wikipedia is currently regarded in the public eye as a useful jumping-off point for research, but also as incredibly unreliable and lacking citations. In order to change that view, something must change on our end. MereTechnicality 20:59, 11 May 2017 (UTC)Reply
  • This theme may impinge upon the theme of "Healthy, Inclusive Communities" as the demand for prestige shuts out less educated or capable editors. "The Augmented Age" may help editors focus their time on content creation rather than policing. Towards the aim of developing respectable material, it will be necessary to pursue the goal of "Engaging in the Knowledge Ecosystem." To be respected by all, one must reach all. To be the most respectable source, the knowledge accumulated must span the world, as "A Truly Global Movement." ―Biochemistry🙴 21:10, 12 May 2017 (UTC)Reply
  • This is why people read Wikipedia; this must be the highest priority. MER-C 04:09, 14 May 2017 (UTC)Reply
  • Per my comment in the section above, I consider this the highest priority. However, many of the five themes are interlinked - we are not going to become such a respected source unless we have healthy, inclusive communities and continue to penetrate the knowledge ecosystem. Oncenawhile (talk) 20:06, 14 May 2017 (UTC)Reply
  • Per my comment below, this is the most important theme. If our information is useless then we are useless. TonyBallioni (talk) 14:11, 15 May 2017 (UTC)Reply
  • Per my comment in the section above, this is by far the most important theme Wikimedia could possibly pursue. We need this kind of non-biased, relevant, and credible information available to all so that there is no doubt to our legitimacy and effectiveness, as well as helping to mend what seems to be an increasingly divided society in many respects. Swanare (talk) 16:54, 15 May 2017 (UTC)Reply
  • It is the main goal. The other themes are in support of it. Kerry (talk) 06:51, 29 May 2017 (UTC)Reply
  • I am constantly amazed at how often Wikipedia is cited in peer-reviewed journal articles and introductory textbooks. It is all the more surprising when we note that these articles and books are often citing incorrect or conceptually flawed information provided in Wikipedia articles. Wikipedia is becoming a primary source, and therefore has a responsibility to ensure that its information is accurate and conceptually sound. BronHiggs (talk) 02:51, 5 June 2017 (UTC)Reply
  • Agreed, this is the main goal, Wikipedia information needs to be trustworthy. Cnewmark (talk) 21:59, 5 June 2017 (UTC)Reply


I hate to differ, however, I would place this theme as the second most important priority, after community. Community comes first IMHO because it takes care of the volunteers who are needed to write & edit the content - without whom Wikipedia will become outdated and obsolete - and currently, there's a high turnover and not enough volunteers to keep up with the demand. Better training and support for new editors would increase the quality of their contributions rather than risk them undermining the work of experienced editors. A stronger healthier community can then work together to improve the quality of content. However, I do agree with you that Knowledge is, of course, the whole point of Wikimedia's existence so it is also highly important!!! I see Community as the means to achieve improvements in quality of Knowledge. (They work hand in hand). Powertothepeople (talk) 04:04, 6 June 2017 (UTC)Reply

I think this is the most important theme. I don't think that Wikipedia can become the most respected source of information, because it is always under revision and new editors are always joining. Because of this, I don't think it will ever be as definitive as an introductory textbook written by experts, but I think Wikipedia can be a close and widely accessible second most respected source of information. I think fostering a collaborative community of editors supports this goal, but I don't think it is the main one. We all start out as inexperienced editors and should do what we can to encourage people to stay (i.e. don't revert without explanation). I think the current structure of allowing anyone to edit is important for the recruitment of interested parties; it is one reason why Wikipedia succeeded when other projects failed. A final thought: the citation of Wikipedia in academic articles should generally not be the norm. Wikipedia should serve as a body of knowledge of what is known, and since this information just summarizes what is known and accepted (i.e. an encyclopedia), its citation in academic literature or college essays should continue to be fought against. Theropod (talk) 06:38, 6 June 2017 (UTC)Reply

Quite simple, actually: if the source of information isn't trustworthy, you can have as many articles as you want, but they won't be relevant. It would mean we are giving priority to quantity of content over quality. So, to give Wikipedia a future, first we have to make sure that its way of working provides trustworthy knowledge, and then we will be able to improve it into an even better source of information. AnnaCo (talk) 01:08, 9 June 2017 (UTC)Reply

Focus requires tradeoffs. If we increase our effort in this area in the next 15 years, is there anything we’re doing today that we would need to stop doing? edit

Spend less energy on mobile search functionalities. TeriEmbrey (talk) 14:13, 11 May 2017 (UTC)Reply
  • The largest tradeoff that comes to mind is Wikipedia's openness towards editors. Although Wikipedia currently endorses some level of protectionism, it may be difficult to achieve this goal without resorting to further protectionism. Consider why people respect other sources of knowledge, like major, public universities. When you think of a university, you think of an institution that rigorously vets its professors, generators and disseminators of knowledge. As such, you respect the institution and the knowledge it produces/displays. Perhaps Wikipedia will have to more scrupulously vet its editors in the future to achieve this strategic goal. ―Biochemistry🙴 16:27, 12 May 2017 (UTC)Reply
  • I agree with User:Biochemistry&Love about the respectability of Wikipedia as a tertiary source, but have the opposite prediction. I think that for most articles, lack of contribution is more of a problem than low-quality contribution, which causes the patchy and inconsistent coverage that can be bad for our reputation. In most cases, increased engagement leads to better articles, since the average edit is still constructive. Overall, though, I think that protecting pages will still remain a rare, necessary tool, but most Stub/Start/C-class pages need more input. T.Shafee(Evo&Evo)talk 07:33, 13 May 2017 (UTC)Reply
  • We will have to figure out how to enable experts to enter the community without crossing swords with someone who's guarding a page. It's not an uncommon experience for someone to edit a page, have their work reverted by a guardian, and decide that there are other, more easily accessed ways to contribute to society. I have heard of this in communities like r/AskHistorians in Reddit, which has done a good job of explaining to newcomers - historians both professional and amateur - the community's expectations for its members. This may mean that we exchange the universal editing ideal for a different editing process that enables us to become "the most respected source of knowledge" in the world by 2030.Ezratrumpet (talk) 18:26, 20 May 2017 (UTC)Reply
  • I think this is the most important theme (what use are the English Wikipedia and its sister projects if they are not trusted?). This means that the quality of articles must be up to par, and Wikipedia needs to address its use as a free advertising platform or soapbox by people who want to capitalize on its success as the 5th most popular website in the world. The most functional way to help achieve this goal would be ACTRIAL, and recognition by the WMF of the importance of quality control for new pages. TonyBallioni (talk) 14:51, 13 May 2017 (UTC)Reply
Yes! A capital idea TonyBallioni. Strongly support.   - Mark D Worthen PsyD (talk) 01:04, 14 May 2017 (UTC)Reply
@TonyBallioni: could you please explain what you mean by 'recognition of the importance of quality control for new pages by the WMF'? What quality control, specifically? RC? WikiProjects? AfD? 1:1 guidance? What would be the role of the WMF in that? Maybe the WMF forgets about something related to quality when it organizes events, or when it communicates outside the Wikimedia world? (I'm an experienced Wikipedian, but I'm from plwiki, not from here, and I might not get things straight.) And ACTRIAL basically means prohibiting registered but not autoconfirmed users from creating articles? SGrabarczuk (WMF) (talk) 18:06, 19 May 2017 (UTC)Reply
@SGrabarczuk (WMF): sure: the English Wikipedia has become the 5th most popular website in the world and arguably the default source of information for anyone born after the year 1990. With that great success there are several issues, but I'll highlight what I view as the three most important: biographies of living persons (BLPs), advertising, and verifiability. On BLPs: it is exceptionally easy to ruin someone's life simply by posting false information about them on the English Wikipedia. We are better at catching this than we used to be and pure attack pages are almost always deleted, but things still slip through. Advertising: a clever marketing director can send an AfD into three weeks of overtime based on press releases for a startup that has never been profitable and will probably be bankrupt within the year. This is a local problem with our deletion process, but could be greatly reduced by restricting page creation to autoconfirmed accounts. Verifiability: see citogenesis. The English Wikipedia being the default starting source for the people who are now becoming journalists at major international publications poses a huge problem if the articles they are drawing from aren't verifiable. Again, we are pretty good at catching this for controversial material, but things do slip through the cracks and it is easiest to slip through in my opinion when the text is in the original version of an article.

ACTRIAL means having a trial run of restricting page creation to autoconfirmed users. An important part of this is figuring out how not to scare off new users, and how to get them into a framework where they can create draft articles for review before publication in mainspace, so that they produce better articles that are also less likely to be deleted. DGG and Kudpung are probably the two users who have thought the most about this, and I was on a very extended wikibreak during the time it was originally proposed, but I have come to support it because if you look at the new pages backlog, most of the problem pages are in fact created by new users, and deleting their articles instead of helping them work on a draft is far more likely to scare them off.

As to how WMF can help: One of the biggest ways WMF could help in the process would be by helping create a workflow for new users so that they get acclimated to Wikipedia, get sent where they want, and aren't greeted by people who oftentimes just created an account two weeks ago and give them a talk page message that is overwhelming and full of alphabet soup. I hope this was helpful. TonyBallioni (talk) 18:28, 19 May 2017 (UTC)Reply

  • Wikipedia has two incompatible functions. It wants to be a reliable, credible encyclopedia, while being "edited by anyone". This latter position means that articles can be edited by those who are poorly educated, lack facility in English, have little or no knowledge of the topic of the article, or edit maliciously. It is little wonder that Wikipedia is still widely regarded as inaccurate, even though bad edits are usually addressed promptly. To gain universal acceptance, Wikipedia needs to stop anonymous edits. Every new editor must be registered, and their first 10-20 edits must be reviewed before they are allowed to edit in article space. Put simply, too much time and effort is spent chasing down poor and bad contributions. Those are resources that could better be spent creating and improving genuine content. It's time for Wikipedia to grow up and weed out poor editors. WWGB (talk) 02:43, 14 May 2017 (UTC)Reply
The issue with that is that Wikipedia is designed to be open. I do agree with you on some level, but I really don't think that it's going to happen. MereTechnicality 04:07, 14 May 2017 (UTC)Reply
  • Agree with TonyBallioni. Some types of bad articles (advertising, in particular) are worse than no article. There are already quite a few articles on en.wp where participation by new users is a net negative; we should be more liberal in semi-protecting. I also want to see page creation limited to extended confirmed users -- we have enough shit articles as it is. MER-C 04:09, 14 May 2017 (UTC)Reply
Personally, I think XCON is a little bit *too* restrictive for article creation. But I do believe that restricting article creation to more experienced users (perhaps autoconfirmed is enough?) and using more liberal semiprotection is a good idea. MereTechnicality 20:48, 18 May 2017 (UTC)Reply
  • If the current state of affairs persists, there won't be a Wikipedia in 15 years (at least not as we know it). When Wikipedia was created by Wales and Sanger, they probably never realised in their wildest imagination how big Wikipedia would become and the impact on society it now has. A lot of controls were therefore not thought of. Stricter controls are now urgently needed, not only to maintain quality and standards, but to reinforce and retain the very reputation for quality and accuracy that Wikipedia imagines for itself. Unfortunately, the WMF refuses to acknowledge this, the problem is getting out of hand, and those who were trying to do something (e.g. me, other admins, and the more qualified and prolific new page patrollers) have burned out on patrolling and given up yelling for change.
The required changes are obvious: WP:ACTRIAL, the creation of a proper landing page, and some coherence in the 1,000+ advice essays (Wikipedia has become one massive set of bureaucratic instructions). TonyBallioni is right on the ball with his comments above, but he is only reiterating what I have grown hoarse repeating over the last 7 years before finally throwing my arms up in desperation. Kudpung กุดผึ้ง (talk) 01:33, 20 May 2017 (UTC)Reply
Actually, I think that strategically, article creation must be made as hard as possible, maybe even restricted to users with quite some experience, like a year and 10K edits. We have already created the bulk of our articles, and efforts should be directed not at new article creation but at improving existing articles and cleaning up spam and promotional articles. --Ymblanter (talk) 08:05, 21 May 2017 (UTC)Reply
  • I agree that we need to find a balance between quality and "open to all". I think the reason academic publications enjoy a high level of credibility is that for an academic, your name is your brand and your reputation. You don't misbehave in an academic publication because your real name is on that publication. I'm a retired academic and I edit under my real name on Wikipedia just as I did professionally. Knowing I am real-world accountable holds me to a higher standard of behaviour; it's as simple as that. So, let's start with eliminating the IPs. Then let's make real-world verification of ID a requirement to confirm a pseudonymous user account; that provides real-world accountability but allows people to contribute without exposing their real name to the public. Without WMF (or whoever they delegate) verifying your real-world ID, your powers are restricted on Wikipedia to an "unverified" account. I disagree with "we don't need more article creation"; our world produces new things all the time, so of course we need new articles, but I am more than happy for us to restrict article creation to more experienced and verified users. Similarly, Articles for Deletion, Categories for Renaming, and similar discussions/votes should be restricted to verified and experienced users. How much undetected sockpuppeting is taking place currently? We simply don't know, but surely it's worth weeding out. Kerry (talk) 07:27, 29 May 2017 (UTC)Reply
  • A concrete proposal for having "grades of experience" is that we should restrict contributors to doing things we know they understand. What is wrong with having some short online training on "how to cite" and the associated policies, with a multiple-choice quiz at the end? Or on making tables? Or anything else. "Citation Achievement Unlocked!" Then we can have users with "certifications" for different skills, which entitle them to do certain kinds of edits or to participate in certain kinds of discussion. We could also extend the idea to Project membership having some kind of entry training/test (e.g. imagine if anyone who edited an article tagged by Project Australia already knew that Australian articles use DMY dates and Australian English - what a great time-saver that would be for everyone). Just as we do with university courses, Wikipedia skills could have prerequisite structures, and people can choose which directions they will follow, whether it be bot development or writing about medical topics or judging reliable sources in Brazilian history. I would restrict editing of higher quality articles to higher certified users while allowing lower certified users to work on lower quality articles. This keeps Wikipedia open to all (and we have a lot of content gaps, so we need everyone), but maybe we don't need everyone able to experiment with the higher quality articles. Kerry (talk) 07:27, 29 May 2017 (UTC)Reply
  • I agree with a lot of what people have said above regarding the potential tension and tradeoff between "open" versus "reputable knowledge," and just wanted to add that it is not necessarily a mutually exclusive situation - I will discuss a potential solution in the next section. (Allow users to have different roles and permissions that play to their strengths and abilities). Powertothepeople (talk) 04:19, 6 June 2017 (UTC)Reply
  • More reliable and trustworthy information is a requirement, so are efforts to prevent harassment of editors. So, this isn't the only priority. Funding won't be a constraint; perhaps getting people with the right skills and courage may be. Cnewmark (talk) 21:59, 5 June 2017 (UTC)Reply

What else is important to add to this theme to make it stronger? edit

  • To make this theme stronger, the Wikimedia Foundation will need to expand its staff and provide more support to GLAM institutions. Some objectives/activities that would support this strategy include presenting at museum and library conferences (local, regional, national, and international), expanding its partnership(s) with OCLC, having experienced Wikimedia Foundation staff and admins set up project tables and mentor new GLAM participants during their first year on Wikipedia/Wikidata/Wiki---, and working to make Wikidata easier to use with tutorials geared towards GLAM communities. TeriEmbrey (talk) 13:56, 11 May 2017 (UTC)Reply
I did not know what GLAM stood for, so for others in the same boat, here's a quick explanation and links to further info. In this context, GLAM = Galleries, Libraries, Archives, and Museums. | Disambiguation page for GLAM | Start-class article: GLAM (industry sector) | And the nicely designed GLAM-WIKIMEDIA Outreach Project   - Mark D Worthen PsyD (talk) 05:14, 13 May 2017 (UTC)Reply
  • To prevent the loss of inclusivity, focus on educating the common man to be a better editor. As the Mozilla Internet Health Report notes, many people are interested in creating online content, but confidence is a concern. Can we make editors more bold? ―Biochemistry🙴 21:22, 12 May 2017 (UTC)Reply
  • Interesting comment, Biochemistry&Love. I will add that editing, especially with good faith in mind, can be extremely daunting, because you quickly learn that Wikipedia has a lot more customs than you knew about. There is no easy learning path to understanding Wikipedia in terms of its community, manual of style, etc. Nuvigil (talk) 02:39, 13 May 2017 (UTC)Reply
Excellent points Biochemistry&Love and Nuvigil. A lot has been done, and is being done, to support new editors. Unfortunately, these efforts receive scant attention from most editors, i.e., dissemination occurs in dribs and drabs. We need to up the ante by doing a little more each day (or each week) to welcome and encourage new editors. You both probably do this already. I decided to join the Kindness Campaign. Other opportunities include:
Teahouse - peer support for new editors
Harmonious editing club - keep the peace, peacefully
Adopt-a-User - experienced editors can "adopt" newer editors, helping to mentor them along the way as they learn about Wikipedia
Welcoming committee
Editor assistance
  - Mark D Worthen PsyD (talk) 05:36, 13 May 2017 (UTC)Reply
There is no easy learning path to understanding Wikipedia in terms of its community, manual of style, etc. There is! WP:EPTALK covers almost everything a new editor needs to know, followed by WP:TALKDONTREVERT and WP:SOURCES. These three short "pages" (no need to read the entire policy, only the linked sections) can be covered in five minutes and tell editors everything they need to know to be a functional editor of Wikipedia. If these policies were linked near the "save changes" button, all editors would quickly know everything they need to know. Bright☀ 11:07, 13 May 2017 (UTC)Reply
I would like to clarify "there is no easy learning path". I meant kind of what you said at the end, with that link. The documentation is obviously abundant and organized, but getting to that point was more of a random discovery for me. Years ago, I didn't find out about the resources for new editors until I had violated an editing norm and someone told me about them. In other words, it might be better to say there is no easy path to the learning path. Nuvigil (talk) 11:42, 14 May 2017 (UTC)Reply
It can probably be summed up as 'Wikipedia has an abundance of documentation, but a shortage of clear communication.' Many people have taken great effort to provide clarity by creating extensive documentation, but it is too much for a new user to get through, and not organised with the user experience in mind. It is better than nothing, but falls short of being intuitive. It's the difference between the old days, when people would read the manual before they turned on a new technology product and keep it by their side until they got the hang of things, and now, when you can switch something on and use it easily, only referring to the manual for rare troubleshooting. Powertothepeople (talk) 01:21, 6 June 2017 (UTC)Reply

Make sources accessible. As much as possible, cite open-access, well-cited, peer-reviewed papers and textbooks that are widely available and widely used in their field. These are the two most accessible kinds of high-quality source, and Wikipedia should encourage their use above low-quality sources, perhaps more prominently in WP:RS. Readers should be made more aware of the quality of the sources used so they can more accurately judge the weight of the information. Discourage unsourced or "self-sourced" information as much as possible. Wikipedia is only as good as its sources and their availability. Bright☀ 11:08, 13 May 2017 (UTC)Reply

  • Encourage short quotations in footnotes. This will give readers much greater confidence on controversial topics, and will make verification easier. It would require strengthening of WP:CITE and clarification of WP:NFC#Text. There is an excellent discussion going on right now on this topic, and different views currently abound. Some comments include "This is a model of what we should be doing across articles in Wikipedia as a best practice", "Quotations of the length exhibited here are absolutely the norm in serious scholarship and can greatly enhance the quality of an article," and "From a Stanford libraries guideline on US copyright law, I note the following: "Because the dissemination of facts or information benefits the public, you have more leeway to copy from factual works such as biographies than you do from fictional works such as plays or novels"" Oncenawhile (talk) 20:15, 14 May 2017 (UTC)Reply
+Strongly support. I often include quotes. Law review articles do this a lot, which I appreciate as a non-lawyer as it helps me understand the reference better.   - Mark D Worthen PsyD (talk) 08:10, 16 May 2017 (UTC)Reply
  • One way to increase respect would be to provide optional identity verification, so the question of who bears the responsibility for a Wikipedia account becomes clearer. ChristianKl (talk) 12:09, 23 May 2017 (UTC)Reply
  • Educate re-users about how to use Wikipedia. In 2030, I don't want to see a single newspaper, book, or TV program say "According to Wikipedia..." Smart re-users know that Wikipedia cites reliable sources for all noteworthy claims, so there is no need to cite Wikipedia. – Finnusertop (talkcontribs) 01:42, 30 May 2017 (UTC)Reply
Wikipedia is published under a CC-BY-SA license. The BY part means "attribution", so newspapers, books, and TV programs should mention Wikipedia as their source. This is correct behaviour; it is wrong not to acknowledge Wikipedia as the source of the information. Kerry (talk) 05:14, 30 May 2017 (UTC)Reply
The optimal position is a middle ground between Finnusertop's and Kerry Raymond's positions, and is related to the discussion below at #Show_Article_Quality_Rating_to_ALL_Users. For articles of above average quality, we should be happy for readers to attribute Wikipedia per Kerry Raymond. For low quality or otherwise unreviewed articles, readers should be encouraged to "look through" the article and attribute only the underlying source - in such a case they could state that they found the source via Wikipedia. Oncenawhile (talk) 06:49, 30 May 2017 (UTC)Reply
Well, I think we should always encourage readers to explore more deeply through citations, external links, and further reading, regardless of stated quality. But the principle of citation is always "cite it where you saw it". If you only look at Wikipedia, then you cite Wikipedia. If you do find the information in one of the Wikipedia article's citations, then it's correct to cite that source instead (and there's no requirement to attribute Wikipedia in that case, just as we don't acknowledge Google Search for finding us a useful website). Kerry (talk) 07:16, 30 May 2017 (UTC)Reply
  • Work with partners who are already facing the challenges of trustworthy sources and information. (Please see below.) Cnewmark (talk) 21:59, 5 June 2017 (UTC)Reply
  • Allow users to have different roles and permissions that play to their strengths. I believe there are several ways for Wikipedia to be both "open" to new users and still maintain quality control. First, Wikipedia must recognise that different users have different strengths, weaknesses, and motivations, and that this is not just a matter of experience (though it can be exacerbated by lack of it). Training is necessary for new users; however, previous suggestions along these lines assume that all contributors are on the same "track" and are committed to becoming a highly skilled all-rounder Wikipedia editor - when in fact many may want to contribute in a more limited manner.
For example, there are millions of casual users of Wikipedia who are knowledge experts who don't want to be "Wikipedia editors" but who do want to correct mistakes when they see them on Wikipedia. It is important to have a way for them to flag the issue - like a small floating box that says "see something wrong? let us know" - where they can quickly and easily describe what needs correction in a form, without making them register or learn Wikipedian. An editor can then review these suggestions and make any necessary changes to the published page without such casual contributors having to undertake time-intensive training etc.
Others may be interested in contributing in a manner that works to their strengths (and forgives them their weaknesses). If Wikipedia allowed differentiated roles and had a project management approach behind the scenes for the creation of content, different people could contribute in a limited capacity that suited them: 1) people pose questions on a topic, 2) researchers find facts and citations, 3) writers write the content clearly, 4) tech-savvy people format it correctly for publishing on Wikipedia, 5) top level editors do quality assurance, 6) everyone is able to discuss, plan, manage and implement tasks, easily and cooperatively, working to their strengths.
This could result in high-quality work without every editor having to master every skillset themselves. For example, someone whose written English isn't great could still do research and citations or format content for publication, etc. I know some brilliant scientists whose written English is appalling. Conversely, I know people with excellent written English who are not very technically inclined and may struggle to contribute in the current Wikipedia environment.
There would be different levels of permissions based on user skillset, experience, and track record. More would be done behind the scenes - on the equivalent of the talk page (but hopefully with an improved interface that better supports collaborative project management) - which anyone could contribute to in whatever capacity they are able. However, actually changing the official "published" content would be restricted to those who have already proven themselves to be highly capable editors who adhere to the Wikipedia charter.
This creates a low barrier for entry for new editors to contribute, without risking damage to the quality of work already done by those who came earlier (or wasting people's time with edit wars). Registration is not necessarily an issue, as new and/or unregistered people would only be able to add their comments to the working group, not change the official published content. This would prevent both vandalism and unintentional newbie mistakes.
Another related idea is to have a clear quality "grading" system for different Wikipedia pages. Pages that are up to the same standard as would be expected of a professionally edited encyclopaedia might be "Wikipedia Pro" pages, while others that are not yet up to scratch might be "Wikipedia Draft." Once a page becomes "Pro" level there are restrictions on who can edit it, and edit requests are posted on the talk page for discussion prior to the page being changed. This allows organisations to trust Wikipedia Pro and cite it as a trusted source. Draft pages are "open" to a broader range of users (in fact, there might be a multi-step grading level ranging from Draft 1 to Draft 3 as an article improves in quality before an article qualifies as Pro, and different users have restrictions on what level they can edit directly versus post suggestions on the talk page). This allows new users to contribute and practice their skills in a safe space, while highly skilled editors have a lot of the legwork done for them and may just need to give some advice to less experienced contributors, or a final polish to articles, or "approve" that the page meets the criteria for Pro. Powertothepeople (talk) 04:41, 6 June 2017 (UTC)Reply
  • Networking users with specific interests and knowledge with specific tasks and articles, integrating experts and keeping them engaged. Relevant to this are my suggestion for a streamlined WikiProject system over at Healthy, Inclusive Communities and WP:Expert help. For the relevance and accessibility of Wikipedia content (and "show[ing] the most relevant information to people when and where they need it") there is also WP:Smart city uses of Wikipedia. --Fixuture (talk) 23:24, 12 June 2017 (UTC)Reply

Who else will be working in this area and how might we partner with them? edit

There are numerous potential partners in this arena. Here are a few of the major ones: American Alliance of Museums, American Library Association, American Association for State and Local History, International Federation of Library Associations and Institutions, National Council on Public History, and OCLC. Start by presenting at their conferences, have an exhibit booth in their exhibit halls, and run miniature drop-in edit-a-thons out of the exhibit booth. Pay the conference attendance and travel fees for interested and established Wikipedia editors and admins to help staff the booths and talk about what they've added to Wikipedia/Wikidata/Wiki---; choosing the Wikipedia editors and admins to help staff the booths could be an annual contest at Wikimania or through the Wiki websites. TeriEmbrey (talk) 14:24, 11 May 2017 (UTC)Reply
  • Academic institutions. Professors have the knowledge base to identify excellent literature to cite from. Offering recognizable, academic accolades (recognized outside of the Wikipedia community) for their contributions could be persuasive. ―Biochemistry🙴 21:00, 12 May 2017 (UTC)Reply
I really like that idea Biochemistry&Love. The legal profession seems to "give back" consistently, e.g., law schools start legal clinics for underserved populations, and (some) private law firms provide time, resources, and even salary to their attorneys who perform pro bono work. How can we persuade universities to recognize editing Wikipedia as a high status public service contribution?   - Mark D Worthen PsyD (talk) 05:45, 13 May 2017 (UTC)Reply
  • The News Integrity Initiative and related efforts, like the Trust Project and the International Fact Checking Network, are addressing issues including methodical determination of reliable sources. These are natural partners; a long story, but getting traction surprisingly quickly. Cnewmark (talk) 21:59, 5 June 2017 (UTC)Reply
  • Quality journalists and writers. They do a huge amount of research for their in-depth articles and books, and there is a mutual benefit if they can trust Wikipedia to be a legitimate source of quality information and likewise contribute what they know about a topic from their own research. Good journalists value truth, neutrality, and freedom of information. We could create topic lists, identify quality writers and journalists who have published on the same topics, and reach out to them to ask if they have anything further to add to what is already on Wikipedia. Powertothepeople (talk) 04:46, 6 June 2017 (UTC)Reply
  • Philanthropic Foundations. Philanthropic foundations might provide partnerships or grants to allow Wikipedia to hire expert editors to create and verify high quality content in topic areas of particular interest to the foundation. For example, the Howard Hughes Medical Institute and/or the Wellcome Trust might provide a grant to improve medical pages on Wikipedia (they are both supporters of open access to information and have collaborated together before). The Bill & Melinda Gates Foundation is also a supporter of open information and has financially helped Khan Academy, which shares some philosophies with Wikipedia. There are also smaller foundations with very specific interests - for example, one named after a deceased person of interest - which, instead of paying for a statue or something similar, might give a grant to Wikipedia for a professional editor to ensure quality, neutral content related to the person or their area of interest (meeting the foundation's charter, but also the neutrality rules) in an openly accessible way for all. Powertothepeople (talk) 04:46, 6 June 2017 (UTC)Reply
  • Open Access and Open Research organisations, such as Europe PubMed Central, Gates Open Research, Faculty of 1000, Scholarpedia, etc. They have done a lot to make scientific/academic knowledge more publicly available; however, there is a known problem that academic journals are written in a manner that is not easy for the public to understand. Wikipedia is well placed to disseminate this information to a wider audience. A partnership could involve Wikipedia editors working with the academics/scientists to create high quality Wikipedia content. Powertothepeople (talk) 04:46, 6 June 2017 (UTC)Reply

Other edit

Proactive and reactive edit

I think that in order for Wikipedia in particular to become a respected source of knowledge, we need to take a two-pronged approach (the proactive and the reactive). The example I always go back to for why we need to stay vigilant is that of "Jar’Edo Wens".

The proactive:

  • More comprehensive automatic review of every new article created (bots can check word count, for instance) Interlaker (talk) 17:20, 13 May 2017 (UTC)Reply
  • Daily highlighted article on the main page of Wikipedia for the purposes of targeting for improvement Interlaker (talk) 17:27, 13 May 2017 (UTC)Reply

The reactive:

  • Any "citation needed" tags older than 30 days should result in the accompanying uncited text being wholly removed Interlaker (talk) 17:20, 13 May 2017 (UTC)Reply
  • Orphaned articles older than 5 years should be automatically deleted (by a bot) with cursory automatic notifications sent to the creator at intervals beforehand Interlaker (talk) 17:20, 13 May 2017 (UTC)Reply
  • Articles flagged for more than 2 years for not citing any independent sources should be automatically deleted (by a bot) with the same notifications sent to the creator Interlaker (talk) 17:20, 13 May 2017 (UTC)Reply
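To make the first reactive idea concrete, here is a minimal Python sketch of the kind of age check a bot could run over an article's wikitext. It assumes the wikitext has already been fetched (for example via the API); the template aliases, the regular expression, and the 30-day threshold are illustrative assumptions, not a description of any existing bot.

import re
from datetime import datetime, timezone

# Hypothetical helper illustrating the proposed "reactive" check: find
# {{Citation needed|date=Month Year}} tags older than a threshold.
# Template aliases and the 30-day default are assumptions for illustration.
CN_TEMPLATE = re.compile(
    r"\{\{\s*(?:citation needed|cn|fact)\s*\|[^}]*?date\s*=\s*([A-Za-z]+ \d{4})",
    re.IGNORECASE,
)

def stale_citation_needed(wikitext, max_age_days=30):
    """Return the tag dates of citation-needed templates older than max_age_days."""
    stale = []
    now = datetime.now(timezone.utc)
    for date_str in CN_TEMPLATE.findall(wikitext):
        try:
            tagged = datetime.strptime(date_str, "%B %Y").replace(tzinfo=timezone.utc)
        except ValueError:
            continue  # skip malformed dates rather than guessing
        if (now - tagged).days > max_age_days:
            stale.append(date_str)
    return stale

# Example with one old tag and one newer one:
sample = ('The church opened in 1914.{{Citation needed|date=March 2016}} '
          'It seats 200 people.{{cn|date=May 2017}}')
print(stale_citation_needed(sample))  # lists the dates of tags past the threshold

A real bot would need to go further (respecting opt-outs, skipping tags inside comments, and listing the flagged text for human review rather than deleting it outright), but the age calculation itself is this simple.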
I second the motion! Excellent proposals Interlaker. :O)   - Mark D Worthen PsyD (talk) 01:00, 14 May 2017 (UTC)Reply
I am uneasy with the first reactive suggestion. While the content is uncited, the fact that it is stable for a long period of time often means it is accurate. It is just that nobody has gotten around to adding a reference. Once content is deleted, there is little chance anyone will retrieve it from the edit history and so the content, even if accurate and stable, will likely never be restored. This effectively eliminates the possibility of a reference being added for the content in the future. In my view, automatic deletion of content in this manner may lead to a decrease in the quality of the encyclopedia. Sizeofint (talk) 18:00, 14 May 2017 (UTC)Reply
I agree with Sizeofint. There are a large number of {{cn}} tags used where there is nothing controversial about the content, but someone thinks it should be possible, and would be desirable, to provide a citation. Citation needed is not the same as content disputed. Maybe we need a different way to mark things that are disputed versus those which are just not yet cited. Wholesale deletion of material just because someone tagged it as citation needed and no one has gotten round to adding one would be seriously disruptive. • • • Peter (Southwood) (talk): 11:57, 15 May 2017 (UTC)Reply
I'm with Pbsouthwood on this. There is stuff that is very hard to find citations for. Not every topic attracts an academic journal article. If someone says that Smallville's Catholic church opened in 1914 in a ceremony officiated by Bishop Dunne, I'll probably add citation-needed, but I will think it plausible because I know that Dunne was the Archbishop at that time and would probably have officiated at such an event. I think we always have to assess the risk of not having a citation, or of not having as reliable a citation as we might like. There are risks for BLPs (reputation damage to the subject of the article) and medical articles (people deciding to treat themselves with Vitamin C instead of seeing a doctor for a more effective treatment). But for local history, what is the risk to the reader if the Catholic church actually opened in 1916? Or even if there never was a Catholic church in Smallville? Not a lot. I think we always have to weigh how plausible the content is against the harm that could be done if it's incorrect. This is why we have watchlists; I have good knowledge in my topic areas, and I'll have good knowledge of sources not readily found with a Google search. So my ability to determine "plausible" in my topic areas is good (but not in other topic areas). I think we need to retain some respect for people who are active in a topic space to have good judgement on these things. Kerry (talk) 07:33, 30 May 2017 (UTC)Reply

Marketing edit

The WMF needs to stop marketing en-WP as easy to edit. Editing in many topics is hard, both technically and in terms of subject matter. You are setting people up to fail and to be disappointed. Jytdog (talk) 23:06, 13 May 2017 (UTC)Reply

+Strongly support. Great point Jytdog.   - Mark D Worthen PsyD (talk) 01:05, 14 May 2017 (UTC)Reply
Hear hear. MER-C 04:10, 14 May 2017 (UTC)Reply
It is easy to edit. Unfortunately, what they do not mention is that it is easy to edit badly. It is not so easy to edit well enough to be worth the effort, and often quite difficult to edit well enough that someone else does not have to repair the incidental damage. Nevertheless, we do need new editors, as most of the work is still to be done. Maybe new editors should be encouraged to start on talk pages, but that is undermined by the fact that talk pages are probably more difficult to edit. So we have Wikipedia, the encyclopedia that anyone can edit (if they have an internet connection), but not so many can improve. • • • Peter (Southwood) (talk): 12:06, 15 May 2017 (UTC)Reply
Per Pbsouthwood: It is easy to edit. Unfortunately what they do not mention is that it is easy to edit badly. It depends what you mean, Jytdog. Editing is easy for the contributor who knows their subject matter and how to write a report. It may be less easy for some, but Wikipedia is not here to teach creative writing skills - authors and editors have to bring those with them from what they learned in school, and good prose, especially of the style that is acceptable for an encyclopedia, is a talent that not everyone possesses.
If you're talking about markup, I don't find Wikipedia any more difficult to post to than any run-of-the-mill web forum; in fact, some are more difficult. Nobody of my generation or younger (and that's rather a lot of people) should find it harder than posting a reply to a blog or messaging with a smartphone. Kudpung กุดผึ้ง (talk) 01:09, 20 May 2017 (UTC)Reply
I meant content creation, not so much the technical stuff. By the kind of thing I think should change, I mean stuff like this that is full of rosy bullshit. Creating good encyclopedic content is hard and takes work, and keeping bad content out of Wikipedia is work and drudgery. (I am not complaining about the work - it is pretty much what I "signed up" for; my first edit was removing blatant advertising). I also don't agree that creating good WP content is easy for subject matter experts. Many of them have a hard time wrapping their head around what we do here and try to write here like they do professionally, creating reviews here in WP or reporting "hot news" from their field here in WP. We do need to market the project to bring new editors in, but it should not pretend like it is magical or easy. (this talk grapples with the actual issues pretty well) --Jytdog (talk) 01:30, 20 May 2017 (UTC) (fixed thanks :) Jytdog (talk) 01:55, 20 May 2017 (UTC) )Reply
Jytdog, I know Erik - I wouldn't say well, but enough to have collaborated with him closely both on and off Wiki. He grapples with a lot of things pretty well and I was sorry to see him go. Keeping bad content out of Wikipedia is indeed work and drudgery, and that is why the more seriously concerned editors have now given up in despair in the face of a sudden, rapid increase in the backlog at NPP that can no longer be contained, and no one is prepared to pull a bell and a whistle to counter it. BTW, I'm not sure if this is what you really intended to say: ...but it should pretend like it is magical or easy. Kudpung กุดผึ้ง (talk) 01:51, 20 May 2017 (UTC)Reply
I feel bad for not putting time into NPP. I respect the people who do, mightily. Jytdog (talk) 01:55, 20 May 2017 (UTC)Reply
Jytdog, I don't think you need to feel bad about anything. For the last two years or so you've been one of Wikipedia's most active editors. My edit count doesn't reflect the actual time I spend here - generally about 3 hours a day and, unfortunately, most of it at that coal face. It's not all patrolling new articles though; a lot of it is chasing patrollers away from it who haven't got a clue what they are doing. Unfortunately the community in its wisdom insisted that they be allowed to do it. Kudpung กุดผึ้ง (talk) 06:31, 20 May 2017 (UTC)Reply
I agree with User:Pbsouthwood and also think that it is easy to edit and get started. It just needs some specific basic knowledge, and that takes only a few minutes if you get the right instructions or figure it out yourself. I mean, the edit button is highly visible, as are the formatting buttons in the editor, the preview button and the rest of the wikitext... and not much more is needed for basic edits (the more problematic parts are the cite button on top and knowledge of some of the policies). It changes of course if you want to create tables and the like. I think we should keep telling people that it's easy to edit, but also make sure they have the few basic pieces of information once they're registered. I feel that people for some reason shy away from making edits and find the UI outdated, but it's not really hard. In videos that aim to get more people involved we could include a 20 second segment that shows these basics. --Fixuture (talk) 23:11, 12 June 2017 (UTC)Reply

Wikidata edit

If this is really a goal, then Wikidata needs to be rigorously excluded from en-WP projects until Wikidata matures and there are procedures in place to ensure that new data added to WD is reliably sourced. The entirety of Wikidata needs to be verified as well. That is a mountain of work. Wikidata is not currently a repository of accepted knowledge - en-WP's mission is to provide the public with accepted knowledge. The missions of the two projects are simply different, and that difference is sharpened by the goal stated on this page. Jytdog (talk) 23:08, 13 May 2017 (UTC)Reply

Academic Research Related to this Strategy edit

Please post peer-reviewed scholarly articles and reasonably well-written blog posts about such research here.

  • I find it bizarre that every article posted above is about the decline in the raw number of editors. I understand that this is an important metric for WMF but it has nothing whatsoever to do with the strategy point stated on this page. What are you thinking? Jytdog (talk) 03:00, 20 May 2017 (UTC)Reply
I'm thinking that in order to become The Most Respected Source of Knowledge, we need to recruit--and especially--retain good editors. These articles highlight the problem, and some of them suggest some solutions. Also, please do post additional scholarly articles that address different aspects of this goal.   - Mark D Worthen PsyD (talk) 04:39, 29 May 2017 (UTC)Reply

Who decides what is considered neutral? edit

As I scroll past a lot of pages and read quite a lot, I've come to find that an awful lot of the subjects on pages are not exactly neutral. For example, Mark Dice is a "right-wing" political analyst, but on his page there is no "political analyst" label. Stuff like that. So who is in charge of everything being neutral, and how do you suggest making it possible? Escape49 (talk) 01:56, 16 May 2017 (UTC)Reply

Good question Escape49! (By the way, your Username is red because all links that do not (yet) have a page are red links (links that do have a page are blue). Once you create your User page your User page link will become blue. And don't worry, it's easy to create a Userpage--there's even a User Page Design Center!) Getting back to your question ... Wikipedia has an important policy on that topic called Neutral point of view. As you said, we (Wikipedians) should write from a neutral point of view. Of course, reasonable people can disagree about what is neutral and what isn't. The best thing to do if you think an article is biased is to post your concern on that article's Talk page. Before you do that though, I highly recommend reading the Talk page guidelines first as it will help you express your concerns in the best possible way so that others fully understand your concern. Welcome to Wikipedia!   - Mark D Worthen PsyD (talk) 08:37, 16 May 2017 (UTC)Reply
Agreed. I edit frequently in the Israel-Palestine area (see e.g. WP:IPCOLL). Outside of wikipedia, very few, if any, scholars and journalists on the topic are considered to be neutral - commentators from either side throw accusations around lightly at those who are brave enough to make it their career to write about the subject. It makes one question if there really is such a thing as neutrality on these deeply contentious topics.
On Wikipedia, neutral means properly reflecting the weight of reputable sources on a given topic. Oncenawhile (talk) 21:53, 16 May 2017 (UTC)Reply
Mark Dice has been trolling WP via Twitter, trying to get his fans to come make the WP article about him describe him with his self-selected title of "media analyst". The trolling even extends here. I will note that one thing that drags down many parts and whole pages of WP is promotional abuse of WP by advocates (some of whom are "just" fans or haters, and some of whom have financial or other conflicts of interest); it is an obstacle to making en-WP "The Most Respected Source of Knowledge." Jytdog (talk) 02:56, 20 May 2017 (UTC)Reply

Neutrality, systemic bias, writing for the opponent edit

There is a good WP essay on systemic bias, which suggests a “tendency to show an American or European perspective on issues due to the dominance of English-speaking editors from Anglophone countries.” Wikipedia works by consensus, but if most editors come from the same group of countries (Anglophone and allied), does a consensus of editors necessarily lead to balanced coverage of international disputes?

Has WP made enough effort to “write for the opponent” (the title of another WP essay) while covering (for instance) the recent conflicts in and around Ukraine?

Suggestions:

  • maybe the essays about systemic bias and writing for the opponent should be upgraded to guideline status and ultimately to policy status?
  • if we really want to “write for the opponent” perhaps we should make more use of the opponent’s own media, at least as a first-hand source for the arguments being put forward by the opponent?
  • should we aim for greater co-ordination between Wikipedias in different languages, especially on contentious international questions? Should we be looking at whether Wikipedias in languages other than English present perspectives not sufficiently covered in the English-language WP?
  • perhaps develop multilingual WP talk pages to bridge between users with different languages? Kalidasa 777 (talk) 10:06, 28 May 2017 (UTC)Reply

This seems to be a pertinent comment: Cross, Douglas (2016). "Whither Wikipedia". Nanotechnology Perceptions, vol. 12, pp. 50-52 (doi=10.4024/N07CR16D.ntp.12.01) http://www.researchgate.net/publication/311437071_Whither_Wikipedia. — Preceding unsigned comment added by Ankababel (talkcontribs) 12:39, 28 May 2017 (UTC)Reply

What one must consider is that some Western countries are, as can be proven by statistics, half-way functioning democracies (which may explain a part of their hegemonic position). On the other hand, people like Putin, Assad, Kadyrov, the Saudis etc. must also be the focus of critical investigation and depiction. This has got nothing to do with systemic bias, even though the argument always comes up. Defend your indefensible dictators somewhere else. --Mathmensch (talk) 17:20, 5 June 2017 (UTC)Reply
You think Wikipedia should present what is said by critics of e.g. the current government in Saudi Arabia, but not what is said by the regime itself in its defence? I think WP needs to present both sides — what is said by the Saudi government and by its critics. Perhaps WP needs a policy or at least a guideline (e.g. along the lines of the current WP essay Writing for the Opponent) to settle the question of how we are to write about those whom you call "indefensible dictators"? Kalidasa 777 (talk) 22:04, 8 June 2017 (UTC)Reply
I agree and would add "unconscious bias" to the list. While one would expect that someone who has "knowledge" of a topic would be the best person to write on the topic, it can also mean we have an underlying personal connection, even if it is only an emotional one. On controversial issues, our values, beliefs, culture, and personal point of view skew our perspective even when we try to be neutral. People can, strictly speaking, "stick to the rules" of Wikipedia but still, in their choice of what information to include versus exclude, bias the overall content. I had thought of a kind of "page swap": a page that I feel is biased and want fixed could be swapped with someone who has enough distance from it that they are unlikely to be biased, and vice versa, so fresh, independent eyes can look it over and fix any issues of bias. Of course this would be cumbersome and is not likely the highest priority at the moment. I wonder how traditional encyclopaedias dealt with issues of bias? Powertothepeople (talk) 02:51, 6 June 2017 (UTC)Reply
I disagree with the previous post, since it attempts to undermine the rules in suggesting that they are insufficient. Such arguments certainly support dictators like Assad, Trump, Putin etc. in upholding their status. Emotional connections are effectively counteracted in following the rules, that is, in being neutral, in including relevant information (such as information on rigged elections, torture camps, media control and so on), in relying on reliable sources and in removing promotional content.--Mathmensch (talk) 07:52, 6 June 2017 (UTC)Reply
"I disagree with the previous post, since it attempts to undermine the rules in suggesting that they are insufficient." I don't believe it is 'undermining' the rules to raise the issue that application of the rules may be insufficient - as evidenced by the discussion and edit history of any controversial subject. The original poster raises an important point. Furthermore, unconscious bias is well documented as affecting us all.[5] Ignoring one's unconscious biases and pretending one is completely "neutral" is naive: “Our research found that the extent to which one is blind to her own bias has important consequences for the quality of decision-making. People more prone to think they are less biased than others are less accurate at evaluating their abilities relative to the abilities of others, they listen less to others’ advice, and are less likely to learn from training that would help them make less biased judgments.”[6] We know our knowledge of history is tainted, that "history is written by the victors", and "Knowledge acquired in school - or anywhere, for that matter - is never neutral or objective but is ordered and structured in particular ways"[7] [8]. At least history has some of the wisdom of hindsight to draw on, while editing of modern events is even more fraught with bias.
No one is suggesting that Wikipedia become a mouthpiece for Trump/Assad/Putin's supporters. It is about trying to find a solution that acknowledges we all edit from a perspective that is biased, and asking what would be a better way to deal with this, particularly in international affairs. I understand you mean well in wanting to ensure that dictators are not supported. However, claiming that because we are from 'democratic' nations we have a superior perspective is not much different from the old colonial mentality that argued 'we' are more civilised than 'them'. This argument lacks respect, empathy, and equality for others and their perspectives. It is ironic when people who champion 'democracy' want to exclude others.
Personally, I believe we gain a more informed perspective by listening to various points of view, and it may be the case that controversial topics need more "discussion" before publication so that all sides can be considered. I was travelling through Europe during the Brexit vote, and it was interesting that the perspective of the issue in western media, particularly Australian media and in my social media stream, portrayed it as an issue of race/culture, which was completely different to the points put forward by various people I met. For example, a Spanish-Bulgarian couple living in Spain gave further insights from their perspective: the EU is not democratic; countries like Germany have all the power while poorer countries are not listened to; the EU was founded primarily for "trade" (commercial interests) rather than for citizens; they felt strongly that it was wrong and detrimental how the EU treated Greece during their financial crisis, especially since Greece had forgiven Germany its debts after World War II, Greece was a victim of the international financial crisis which was caused by the finance sector of wealthier countries and suffering increased due to using the 'euro' currency; and they thought it was good that Britain had stood up to the EU and hoped that the EU would now listen to criticism and make changes. Now some of what they said might be their opinion rather than fact, however, there were many good points raised that people in Australia - being on the other side of the world, different culture, different language - were likely unaware of.
Now, if you look for a page about "criticism of the EU" here on Wikipedia, the closest you get is one titled Euroscepticism [9]. It repeatedly uses the terms "Euroscepticism" and "Eurosceptics" even though it notes that people who criticise the EU do not themselves necessarily use or favour this term or believe it an accurate one - i.e. the whole page is biased against those who criticise the EU right from the minute it labels them "Eurosceptics". It notes that 61% of Spanish people do not trust the EU, yet does not do much to explain why they don't trust the EU and what their criticisms are. The article claims "the rise in populist right-wing parties in Europe is strongly linked to a rise in Euroscepticism on the continent", yet the citation goes to a research article that summarises "euro-scepticism is much less relevant than perceived ethnic threat in explaining why particular social categories... are more likely to vote for the radical right", and further down the page it mentions that the Spanish political party criticising the EU is a left-wing party. The English-speaking coverage of these issues frequently tries to write off critics as all being racist, uneducated, unemployed, conservative, far right-wing, etc - and while some critics might fit this profile, there are others, like the couple I met, who are intelligent, educated, left-wing, employed academics and professionals. Few of the criticisms they shared with me have been explored on Wikipedia. Does this show a cultural bias in Wikipedia? How should it be addressed? Powertothepeople (talk) 02:47, 7 June 2017 (UTC)Reply

Propaganda and public relations edit

In order to become a truly respected source of knowledge Wikipedia must deal with the elephant in the room: the influence of propaganda and public relations specialists seeking to influence public opinion covertly.

It is no secret that governments and corporations invest copious resources in changing people’s minds through mass media and especially the internet. Because of its status and low threshold to participation, Wikipedia is naturally a prime target.

Although Wikipedia has developed a good system for thwarting lone vandals, it has little defense against committed actors who have the time and foresight to become accepted members of the community. It stands to reason that the more powerful an entity, and the more they stand to gain from public perception, the greater in general will be their willingness to commit resources to editing Wikipedia.

The value of controlling certain Wikipedia articles during wartime is obvious.

Or take Monsanto, a company not known for its hesitancy in the realm of public relations. Monsanto and its public-relations contractor were caught red-handed in 2002 using internet sockpuppets to discredit researchers who reached conclusions which threatened the company’s bottom line. (See “The Fake Persuaders” and other news reports on this event collected here.) A Monsanto speaker has admitted that the company has “an entire department” devoted to “debunking” unfavorable science. Documents from an ongoing lawsuit over glyphosate toxicity provide evidence of Monsanto’s “Let Nothing Go” strategy of responding to all criticism online. Readers of websites across the internet can attest to the company’s diligence in these matters.

Are we to believe that employees or agents of Monsanto do not edit Wikipedia? When Wikipedia, as if dogmatically, downplays the potential risks of genetically engineered food across dozens of pages; goes so far as to smear criticism by association with the pejorative article title GMO conspiracy theories; devotes whole articles to undermining the work of certain researchers much as occurred in 2002—are we to believe this is done without encouragement or assistance from the “Let Nothing Go” department?

At minimum, can we accept the premise of a "scientific consensus" on a topic about which free scientific inquiry has been so blatantly thwarted, with some viewpoints suppressed and others promoted, not because of their inherent truth, but because of their commercial value?

Over the years, many good editors have spotted the apparent systematic manipulation in this area, and drawn attention to it in various places. Yet the situation doesn’t change; in fact, it gets worse. It poisons the air and undermines the fundamental optimism and trust on which this project relies.

If Wikipedia does not find a way to systematically deal with strategic manipulation by big players it will slowly become a playground for propagandists, and critical thinkers will treat it like some information-age Pravda. Respectfully, groupuscule (talk) 18:37, 29 May 2017 (UTC)Reply

I agree wholeheartedly that this is possibly the biggest issue affecting quality control, particularly on profiles of living people, corporations, politics, and controversial topics. It leads to a lot of "false industry" where many people are working hard to improve quality but it is being undone due to a vicious cycle of nuisance. I would split the offenders into two sub-groups:
1) Somewhat naive PR specialists (or similar roles). They are not necessarily trying to be deceptive, but their job is to promote their client, and they see that there is no Wikipedia entry for them, or that it is very basic or outdated, and they want it to be fixed. They are not experienced Wikipedia editors; they are here for the single purpose of improving things for their client (just as they will write and distribute press releases, manage social media, update web content, etc). Because there is currently no legitimate way for these people to request that their client's page be updated, Wikipedia is tempting fate that they will try to take matters into their own hands and write it themselves despite the conflict of interest. A potential solution therefore would be to have a service where people could pay a fee to have neutral, unbiased content created by an in-house team of professional editors at Wikipedia. The editors themselves would not have any contact with the 'client', would not know who was paying for the content to be created, and would maintain their neutrality and independence. The client would have absolutely no contact with or influence on the creation of the content. Arm's length. This should reduce a lot of non-malicious PR and low-level propaganda. If they are worried about others sabotaging their page, they could also pay a subscription to ensure the page of interest is watched vigilantly by editors and only neutral content is added. (I notice in the edit history that someone had changed the subject's real name to "potato head", and while the sabotage was quickly undone by an editor, it may not be fair to ask volunteers to do this workload when the subject of the page would happily pay someone to do it. Wikipedia should take a look at the number of volunteer man-hours spent on these pages - particularly controversial ones - and consider who should be footing that bill.)
2) Intentional, organised, deceptive propaganda. I have seen quite a lot of bias on controversial topics, but until you mentioned it here I simply (naively?) thought it was just individuals pushing their own bias rather than something organised and intentional. I agree that in the world we live in today, Wikipedia needs to develop a good strategy to counter this issue. Having a team of elite, well-trusted editors who can veto or have the final say when necessary might help. Locking or restricting editing rights on controversial topics, requiring all change requests to be discussed via the talk page, with only trusted, neutral editors having permission to implement the changes on those pages. Using computer algorithms to detect biases in user contributions, so that anyone who works outside of the acceptable values may be blocked or have restricted permissions. I also mentioned above a system where newer users have restricted permissions compared to more proven editors, and this would minimise the damage done. We would need a way to flag people suspected of bias and have them assessed. That also has the potential to go the other way - false accusations as people try to discredit others. Ugh! Powertothepeople (talk) 03:51, 6 June 2017 (UTC)Reply

Dear fellow Wikipedians,

unfortunately, it seems as though many entries on Wikipedia related to politics are subject to intense abuse by politically motivated authors. For instance, on this very site I edited the pages on Vladimir Putin, Donald Trump, Frauke Petry, Bashar al-Assad and others in order to make them more neutral and state important information. In the Putin article, I added a pretty sound argument for election fraud committed by him (two mathematicians had discovered that polling stations which reported results divisible by 5 had many more registered voters, which could only happen by chance with insanely low probability); to the Trump article I added that Trump's statements regarding crime by immigrants were plainly false (and I used good sources, namely the official U.S. criminal statistics, which are just the reports passed on by police); and the Petry article I edited recently, removing strange justifications of her statements regarding murdering immigrants crossing the border. Finally, from the Assad article I removed certain things that sounded like an advertisement for Assad, and made clear that the elections were not to be taken seriously. All this is not to mention several articles on Chinese history which were possibly written from the perspective of the Chinese government, where terms like "liberation" were used for the conquest of areas held by the Kuomintang, the rival party to the Communists.

Now in each of these cases, my edits were eventually reverted, even if, as in the Trump article, an administrative opinion was given which was clearly in my favour. My conclusion is that supporters of certain very bad politicians attempt to whitewash them, and I seem not to have any way to stop them. Thus I would like to ask: would it be possible to obtain any strong administrative measures against the disruptors? Further, would anyone be willing to help me in keeping the articles in question neutral? Where can I find such people? --Mathmensch (talk) 17:21, 5 June 2017 (UTC)Reply

  • Relevant to this is the article Internet manipulation; some countermeasures can be found under the respective section in that article. Furthermore, I think that one of the best ways to cope with Internet trolls and manipulators is to strengthen the enforcement of WP:NOTDEM. I made a relevant suggestion here. Other things that would be useful are: more users, people putting more pages on their watchlists and checking them better, pages for fact-check requests, identifying pages that are likely to attract manipulation, improved and more objective streamlining of discussions, etc. And I think the thing that's needed most is boldness: simply correct pages that have been tampered with, and if people have a problem with that, go to the talk page; if that doesn't help, go up the chain, request dispute resolution, make very good arguments, and so on. The first thing we should do for every issue, including this one, is to create a space where we can crowdsource, discuss and collaborate on its mitigation and management -> a meta page. --Fixuture (talk) 23:04, 12 June 2017 (UTC)Reply

Show Article Quality Rating to ALL Users edit

I suspect this has been discussed before, although I could not find such a discussion after searching in a few places using different keywords. At any rate, the recommendation is to display an article's quality rating (grade) to all visitors--currently we show the quality rating (grade) to logged-in users only.

For example, I clicked on the Random article link in the left nav bar and arrived at this article: Luigi Cocilovo. Because I am logged in, I read at the top of the article, "A start-class article from Wikipedia, the free encyclopedia". If I were not logged in, I would not see anything about the article's quality unless I looked at the article's Talk page (something most new users--and even many frequent visitors--do not know to do).

Displaying quality ratings (grades) to all visitors would serve several purposes:

  1. For Stub, Start-class, and C-class articles, the notice would alert visitors to the relatively low quality of the article; and
  2. Visitors will more readily recognize that:
  • Wikipedia articles vary in quality;
  • volunteer editors care about quality;
  • we have high standards for A-class, Good, and Featured articles; and
  • they can help!

What do you think?   - Mark D Worthen PsyD (talk) 06:02, 19 May 2017 (UTC)Reply

I agree, though we need a more automated system. There are no rules for B and C rankings, and there aren't enough GAs and FAs - which are based on the quality of the prose more than on the "robustness of the information", which matters more to the overall question here.
An automated ranking system assessing "robustness of information" could produce a single algorithmic measure based on a variety of relevant data such as:
  • number of edits
  • number of watchers
  • number and weight of talk page comments
  • number of editors
  • "experience level" of editors based on their tenure, edits, etc
  • number of sources
  • etc.
It would need time and investment to calibrate, but would be of great value to readers, and go a long way to strengthening Wikipedia's reputation. Oncenawhile (talk) 08:33, 19 May 2017 (UTC)Reply
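To make the proposal above concrete, here is a minimal sketch in Python of how such a single algorithmic measure might combine the signals listed. The weights, the log scaling and the reference value are purely illustrative assumptions rather than an existing tool, and they would need exactly the calibration described above.

import math

# Hypothetical weights for the signals listed above; real values would need calibration.
WEIGHTS = {
    "edits": 0.15,
    "watchers": 0.20,
    "talk_comments": 0.10,
    "editors": 0.20,
    "editor_experience": 0.15,
    "sources": 0.20,
}


def robustness_score(stats: dict) -> float:
    """Combine raw article statistics into a 0-100 'robustness of information'
    figure. Each signal is log-scaled so that a handful of extra edits matters
    more on a stub than on a heavily edited article."""
    score = sum(weight * math.log1p(stats.get(signal, 0))
                for signal, weight in WEIGHTS.items())
    # Normalise against an arbitrary "very robust" reference level.
    reference = sum(weight * math.log1p(10000) for weight in WEIGHTS.values())
    return round(100 * min(score / reference, 1.0), 1)


# A lightly edited article versus a mature, heavily watched one:
print(robustness_score({"edits": 40, "watchers": 3, "talk_comments": 2,
                        "editors": 12, "editor_experience": 50, "sources": 5}))
print(robustness_score({"edits": 9000, "watchers": 400, "talk_comments": 800,
                        "editors": 1500, "editor_experience": 5000, "sources": 300}))

Under these made-up weights a lightly edited, lightly watched article scores far lower than a mature, heavily sourced one; whether that matches editorial judgement is precisely what the calibration effort would have to establish.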
Article ranking, with the exception of FA, is an arbitrary process. GA is not reliable because while there are some very strict reviewers who demand near-FA standards, there are others, particularly new users, who will pass almost anything. There are also those who run 'I'll pass yours if you'll pass mine' systems - generally younger users. The other 'rankings' are done by Wikiprojects on scales of quality and importance as viewed by that project. It's not unusual for one article to be ranked very differently by two or more concerned Wikiprojects. Kudpung กุดผึ้ง (talk) 00:54, 20 May 2017 (UTC)Reply
Quite true, but also better than nothing. • • • Peter (Southwood) (talk): 17:09, 20 May 2017 (UTC)Reply
Exactly. I would rather give readers some sense of an article's quality, along with the other goals I outlined above, than pretend that all articles are created equal.   - Mark D Worthen PsyD (talk) 02:47, 30 May 2017 (UTC)Reply
Sure, but by 2030 we should have been able to develop a better, automated, ranking system. We already have the bare bones of information to rank a page with good information [10] and a page with uncertain information [11], but we'd still need significant overlays to automatically rank quality of editors and quality of sources. Oncenawhile (talk) 06:56, 30 May 2017 (UTC)Reply
Ah, I think I understand your point better. Perhaps we could agree on two related goals: 1) Show the quality rating for all articles to all visitors; 2) Establish as a high priority the development of an automated or semi-automated article rating process. Please feel free to fine tune the wording. I would like to see this as some sort of formal recommendation if that's part of this review and discussion process. (I'm new to this so I'm not sure how it all works).   - Mark D Worthen PsyD (talk) 01:04, 31 May 2017 (UTC)Reply
I agree with this. Oncenawhile (talk) 05:55, 31 May 2017 (UTC)Reply
I agree. I'm a new user and I didn't even know there was a rating system until you mentioned it and I went looking. I think this should be pretty prominent, and in particular, the highest quality should have a badge or icon "verified" or some such, which allows people to trust and cite it as a Verified Wikipedia page. (FA doesn't mean anything to the average person, but Verified or Pro or Gold etc would). Powertothepeople (talk) 05:01, 6 June 2017 (UTC)Reply
I agree that our quality ratings should be much more easily visible to readers. The main reason we don't make them more visible now is that the rating system is so haphazard and unreliable. The higher-level reviews really need an overhaul to make them less high-pressure and easier to participate in. I'd support what Iridescent suggested at an FAC discussion back in March, which would streamline all the review processes into one and allow newbies to gradually become more comfortable doing assessments. Lowering the barrier for human reviewers would certainly be easier than developing a computer program to do their whole job! A. Parrot (talk) 04:03, 9 June 2017 (UTC)Reply

Inaccurate info edit

You claim anyone and everyone can use this site as well as provide information. My problem is that there is more inaccurate info than accurate on your site, and how do you truly plan to fix this?

When I try to update this false info to accurate info, I am harassed or my edits are reverted by someone who claims they are correct and I am not, when in fact it's the other way around. I understand that a lot of people claim this, but in my experience, by doing actual research you can find the truth. If research into the actual comics were done, I wouldn't have to correct this time and time again, and people wouldn't be finding out later that they purchased what they thought was something else for hundreds or even thousands of dollars, and it was wrong.

I actually tell people not to use your site as a reference because of this, and I wish I didn't have to, but your info is inaccurate a lot more often than it is accurate - just unreliable. Argento Surfer is the guy's name, and he apparently reported me for editing while he kept harassing me. The guy is not as knowledgeable as he thinks and, just like everyone else, makes mistakes, but he just seems not to want to look into the facts. Instead he argues and deletes my edits.

The subject I am referring to is first appearances in comic books. Wolverine's true first appearance is Hulk #180 - it's not a cameo. CGC lists it as his first appearance, as does the Overstreet Price Guide; they list Hulk #181 as his first "full" appearance. Another one is Gambit's first appearance, which is in Uncanny X-Men Annual #14, released in July, but it is listed on here as Uncanny X-Men #266, which came out a month later in August. #266 is in fact his second appearance but first cover appearance. He is fully in the annual: referred to by name, takes part in the story and has a fight scene. It makes no sense how it's not listed as his first appearance. There are many other issues, but these two I have been trying to fix for a while and they keep getting reverted. Havenx23 (talk) 17:10, 20 May 2017 (UTC)Reply

The ANI report is here. Nothing came of it because User:Havenx23 took a break from editing. My "harassment" of him can be seen here. This is my first response to him since the ANI report. He is a Wikipedia:Single-purpose account whose contributions are limited entirely to the "true first appearance" of a single comic book character and the related discussions. In these discussions, he inevitably brings up how the article influences the purchasing of said comic book. Argento Surfer (talk) 12:51, 22 May 2017 (UTC)Reply
I appreciate Argento Surfer's informative reply. At the same time (and I suspect Argento Surfer would agree), this is a very idiosyncratic issue that does not contribute to the discussion. If I knew how to archive this I would, but alas I am ignorant on the matter. Perhaps someone more knowledgeable would archive this section?   - Mark D Worthen PsyD (talk) 02:45, 30 May 2017 (UTC)Reply

Nothing to show here. Rishabhakarshit yadav (talk) 12:39, 31 May 2017 (UTC)Reply

Engaging in the Knowledge Ecosystem edit

What impact would we have on the world if we follow this theme? edit

People will take time to understand this theme, so it will be better but will take a lot of time. Krish Charan RJ (talk) 13:09, 19 May 2017 (UTC) This theme will cater to a large population of people and will also help many learners comprehend subjects in an easy manner. Many don't have the financial means to study the subjects they want, so this will be a great help to them. (Riya94 (talk) 08:04, 20 May 2017 (UTC))Reply

It will be a great movement. The sea of knowledge is vast, and we want to build a healthy ecosystem of knowledge. Rahul Rajagopal (talk) 06:26, 23 May 2017 (UTC)Reply

The four given sub-themes (Education, Institutions, Educators, Existing programs) of the Knowledge Ecosystem theme above cover all sections of the knowledge ecosystem. These sub-themes can have an impact on the world, mainly on the younger generation and on people who love to gain and upgrade their knowledge. — Preceding unsigned comment added by Pandeyasish (talkcontribs) 07:14, 30 May 2017 (UTC)Reply

First I would like to point out the danger that working with such renowned institutions poses to the democratic process of Wikipedia, which I deem its greatest treasure. Many of these institutions only superficially follow the goal of amplifying knowledge. On a closer look they are industry-driven and serve the cause of sustaining the established societal system with all its injustices. However, I do agree with the importance of creating a wiki learning infrastructure, something like a wikiversity. But I must insist on the pillars of Wikipedia's success: independence enabling objectivity, bottom-up mechanics empowering individuals to take part in a community, and freedom of usage helping human beings find their own path. Now to answer the question on the impact on the world: if this theme is pursued carelessly, it may very well even have a negative impact, by leading even more people into mental dependency and condemning them to swim along with the masses, unconsciously unhappy. On the other hand, it has enormous potential to enable people to find and pursue their true purpose, and by that to lend humanity a hand in reaching its harmonious destiny. With kind regards and high hopes, Leowikardo Leowikardo (talk) 21:47, 1 June 2017 (UTC)Reply

It's a little unclear what exactly is meant by this theme. I have already mentioned under other themes (Community & Knowledge) the value of connecting with educators and academic institutions to meet those other goals, but in this sense I don't think this Ecosystem theme is a "goal" in and of itself so much as a support action to achieve those other goals. On the other hand, if the concept is that Wikipedia should become like an online university, well, there are already many organisations jostling to be a free online education provider (an important goal, but others are already tackling it), so I don't think this direction is the right path for Wikipedia. By all means Wikipedia should partner and interact with these other organisations so they can use and contribute to Wikipedia's content, but I see Wikipedia's place as being a repository of reliable *facts* that people can look up, while a school or university is trying to teach people *skills*. Both are important within the knowledge ecosystem, but they involve a different approach and a different type of user. Powertothepeople (talk) 01:58, 8 June 2017 (UTC)Reply

I think you need to think about ecosystems in terms of the existence of other organisations that hold or disseminate information and how we should engage with them. For example, we don't have a WikiMaps project because we leave that space to OpenStreetMap, whose licensing makes their maps acceptable to Wikipedia. At the moment, we do not allow external links in article bodies (other than within citations); that is, we see Wikipedia as a closed ecosystem. Should we do that? Should we say to other knowledge providers, let's be friends and interlink with one another, or do we try to duplicate what they already do? As a concrete example of working together, Trove (the Australian repository of all things library) generates Wikipedia citations for most of the items in its inventory, making it just a copy-and-paste to cite Trove items within Wikipedia. Some websites prevent their pages being archived or put content behind search screens so it is not easy to cite or archive them; they are very unfriendly as far as Wikipedia is concerned. As a concrete example, I prefer to cite the Brisbane Times (whose web pages I can archive at the Internet Archive) over the Brisbane Courier Mail (which resists being archived). Some websites are CC licensed; most are copyright. I think under this Ecosystems banner we need to think about how we persuade more organisations to be more "open knowledge" friendly, whether that be CC licensing, being archivable, providing Wikipedia citations, or whatever. Kerry (talk) 04:44, 9 June 2017 (UTC)Reply

One relevant effect that I suspect is being overlooked, though recognizing it & supporting it plays to our strength: we are transitioning a lot of information to a digital or electronic format that might otherwise not be available in that format. This has a long-term importance, as people will increasingly rely on electronic sources of information over print -- & especially over those in handwriting. And there is a cost to transitioning this information, however small, which means some areas of knowledge won't be transitioned & risk being lost. This loss is due to economic advantage: for example, there is more money to be made from putting financial records into electronic form than the works of Gertrude Stein. And it is inevitable that some of it will be lost despite our best efforts; some will be too difficult to access or collect before accident or malice destroys it; there is the issue of systemic bias; nevertheless much can be saved if we are alerted to the risk. A comprehensive survey & detailed report about where Wikipedia is weak in coverage would address this -- & would be welcomed. Up to now, these studies have been haphazard & subjective. Contracting with recognized experts to identify & explain where Wikipedia is weak would be closer to an ideal report, & repeating this survey on a regular basis after collecting constructive feedback would get us even closer. -- llywrch (talk) 20:58, 11 June 2017 (UTC)Reply

While right now we are here simply to accumulate encyclopedic knowledge (with some related structures having formed around this goal), in the future society might make informed, science-based decisions in ICT-enabled, participative ways after crowdsourcing knowledge and data, and carry those decisions out in collaborative, non-profit-incentivized ways. To say it in Buckminster Fuller's terms, we are "creat[ing] a new model and [slowly] mak[ing] the old one obsolete".
For this, as well as for improvements of, extensions to and new uses of Wikipedia, we need to work together with other partners in the ecosystem we're embedded in.
And as Wikimedia is (one of) its strongest nodes (e.g. in terms of members, reach and voice), we need to contribute towards its cohesion, unity, impact, cooperation, mutual assistance, coordination and organization. There simply is no other organization that could do it better than the WMF and us. And it's often said that Wikipedia somewhat embodies the original and transformative spirit of the Internet − for good reason. It doesn't (directly) have to do with Wikimedia, but as Jimmy is Wikipedia's cofounder and as some wiki ways and tools will be used there, WikiTribune is exactly the kind of thing I'm talking about (the community was certainly engaged in a special way). I hope that the WMF and the community can start more new projects and help make other projects a success. Some things that could be done include new software, written by a FOSS community organized by the WMF and crowdfunded through new and existing channels, that combines Wikipedia data with OpenStreetMap for personalized tourism, for example; or a website for open-sourced ideas with built-in issue management. I'm not entirely sure what such collaborations could look like, and these were just two examples (implement the latter idea and you'll see thousands!). Or a new conference organized mainly by the WMF but for the whole ecosystem, etc.
We will all benefit from it. --Fixuture (talk) 00:43, 13 June 2017 (UTC)Reply

How important is this theme relative to the other 4 themes? Why? edit

Because it should be for all Hispanics. I only go in Spanish. We need more Mexicans in the Ecosystem. CARS FOR ME (talk) 09:18, 13 May 2017 (UTC)Reply

This is an ecosystem of knowledge, and all the other themes feed into it: promoting worldwide volunteer contribution (Healthy, inclusive communities) and advancing with technology through machine learning (The augmented age). This ecosystem provides a platform for worldwide knowledge sharing, where contributors and experts in the sub-themes from all over the world can put in the effort to make knowledge trustworthy. — Preceding unsigned comment added by Pandeyasish (talkcontribs) 08:41, 30 May 2017 (UTC)Reply

It encourages the new generation to help the world, and as our youngsters are very much into gadgets, why not use that for a better cause? Supdocious (talk) 10:56, 4 June 2017 (UTC)Reply

Also, it is an all-round promoter rather than something that focuses on just one thing. Supdocious (talk) 10:57, 4 June 2017 (UTC)Reply

I have prioritised this theme fourth (after Community, Knowledge, Technology) because I see it as a "support" theme rather than a primary goal in its own right. Wikipedia is only of value to the 'ecosystem' if its Knowledge is of high enough quality, and to create quality Knowledge we need a healthy Community. I also consider Technology to be a "support" theme, as its primary value is to provide features that help improve the quality of the content and community for the above reasons. That brings me to the Ecosystem - Wikipedia is already part of the knowledge ecosystem, and will naturally become more firmly embedded once the Knowledge and Community issues are fixed. Therefore it does not require top prioritisation at this stage. However, all these themes do overlap. Partnerships with leading educational institutions can lead to an increase in volunteers and quality content. Chicken or the egg? Powertothepeople (talk) 01:59, 8 June 2017 (UTC)Reply

Focus requires tradeoffs. If we increase our effort in this area in the next 15 years, is there anything we’re doing today that we would need to stop doing? edit

I do not think so, as it should be easy enough to do both. Valer millen (talk) 15:14, 1 June 2017 (UTC)Reply

There's the risk of attempting too much change at once, and spreading Wikipedia resources too thin. Better to do one thing well than lots of things poorly. Powertothepeople (talk) 02:01, 8 June 2017 (UTC)Reply

Potential conflicts of interest when working with governments and other organisations. Powertothepeople (talk) 02:01, 8 June 2017 (UTC)Reply

What else is important to add to this theme to make it stronger? edit

  • The encyclopedist in me wants some historic and bibliographic background for the term "global knowledge ecosystem" used in this theme. Why did the authors of this theme choose the term "global knowledge ecosystem" and what does it imply? I think of Christian Vandendorpe's 2015 article on Wikipedia in the journal Scholarly and Research Communication:

Wikipedia is also a part of the ecosystem of knowledge, since it helps to build "a consensus of rational opinion over the widest possible field," which is the goal of science, according to John Ziman (1968, p. 3). An ecosystem is a network of interactions among organisms, and between organisms and their environment. [...] Moreover, the field of knowledge is itself expanding at an accelerated pace and humankind's thirst for knowledge, which is part of our genetic program, tends to be more intense as answers become more easily available. According to Barry Allen, knowledge is our destiny: "We have no option anymore about preferring and cultivating knowledge, or the soil, or life in cities. These are for us the circumstances of the now-global sapiens ecology, and they define the ultimate context for understanding knowledge" (2004, p. 215). A collaborative encyclopedia appears well suited to this new ecosystem.

— Vandendorpe, Christian (October 2015). "Wikipedia and the ecosystem of knowledge". Scholarly and Research Communication. 6 (3): 1–10.
Here's the last sentence of Barry Allen's book that Vandendorpe cited in the preceding quotation:

The question is whether the only thing left for civilization to collapse into is our extinction as a species on the earth. Unless one believes that a god will intervene, I think we have found the ultimate context for understanding knowledge. It is an ecological context of artifactual performance. It is a context not of science, discourse, or formal rationality, but of the global sapiens ecology, an artifactual ecology made to work by the accomplishments of knowledge.

— Allen, Barry (2004). Knowledge and civilization. Boulder, Colorado: Westview Press. p. 285. ISBN 0813341345. OCLC 52542619.
Thanks, Biogeographist (talk) 03:50, 13 May 2017 (UTC)Reply
  • Well, thinking about leading institutions in academia, the arts, etc., I think we need a good answer to "what's in it for them?" I do GLAM liaison and am a retired academic, so I have some experience in this area. Most academics are living with "publish or perish", so thinking they will take time out of their working life to write on Wikipedia (or sister projects) is a bit naive. Unless academic institutions take Wikipedia contributions into account in recruitment, promotions and grant applications, I can't see it being likely that working academics are going to be very interested. Retired academics are probably a more achievable target. Indeed, a lot of older academics keep working beyond their financial need to do so because they don't know what they'd do with themselves in retirement! Having said that, Wikipedians (as a group) seem to be anti-academic (I stopped mentioning my academic past on my user page for that reason) and Wikipedia does not operate in a way academics understand, so Wikipedia tends to chew them up and spit them out (I've had a lot of academics tell me that they did try to contribute to Wikipedia because they saw something wrong, but they were reverted, and then when they argued why they believed they were correct - generally mentioning their many years of research and many publications on that topic - they got abused for saying so).
For GLAMs etc., it is easier to make the argument that Wikipedia has the "eyeballs", and that exposing their collections through Wikipedia as citations, images, etc. exposes their institution in a way additional to their own activities. Most of them understand that, but resourcing is still an issue. If I could say to a GLAM "make your material available and I'll bring in a vast team of people to scan it and upload it or cite it in Wikipedia", then my job as liaison would be much easier. But the reality is that I don't have a "team to bring"; on the contrary, there is a shortage of volunteers for this kind of work. To expose their content through Wikipedia will generally need the GLAM themselves to provide a lot of the human horsepower. This is a problem with using volunteers: it's hard to motivate them to tackle large projects, particularly ones that involve a lot of boring work. I've categorised about 4.5K images from a local archive on Wikimedia Commons (it was weeks of work and mostly deadly boring; I'd think twice before volunteering to do it again, and I really wonder if even 10% of them will ever be used - lots of black-and-white photos of people and things which probably aren't notable, etc.). I think to do big projects with GLAMs or other key partners we do need to have a way to apply for funds to pay people to do the boring stuff.
A worthwhile group to pursue might be the University of the Third Age; there's a lot of retired expertise there. Kerry (talk) 07:45, 29 May 2017 (UTC)Reply
I agree entirely with the comments made by Kerry Raymond. As a recently retired academic, I decided to fix some of the many errors in articles in the marketing/advertising area. I did not have any information about my background on my user page, but it must have been clear from the edits I made that I had considerable expertise in the area. Within a matter of days, several editors informed me that editors with subject-matter expertise should refrain from editing in their subject area and only edit in other fields. On one of the main pages, I tried to correct some errors of fact, conceptual errors and fabricated information, only to find that my changes were reverted within minutes. As a "newbie" at the time, I was under the misapprehension that if I provided solid arguments for my changes, I could counter the reversions. But after lengthy debate, I was simply informed that "some content just doesn't belong on WP." I gave up because I did not want to get into an edit war, and so the inaccurate material remained in the article. About five months later, the inaccurate material was removed because it was part of a copyright violation. It had been in the article for almost 8 years by the time it was finally removed. I did not let this incident deter me, and continued working away - totally overhauling and expanding 17 articles in the marketing area, and adding substantive new sections to an additional 13 articles - all accomplished within four months. Around this time, I appear to have come to the attention of a cartel of editors who formed the view that the mere mention of a brand name or commercial organisation constitutes spam. This outfit began following me around, checking my history and deleting all manner of content - illustrative examples, links, references, further reading suggestions and indeed anything that they construed as spam - generally adding the edit summary "looking very spammy". None of the deleted material met Wikipedia's definition of spam, and in almost every case users on the talk page had expressly requested that illustrative examples be added to these articles to highlight the practical applications of the concepts and theories. But the cartel was not interested in any of this. I tried to debate with the cartel, but their only response was "well there are four of us who like it this way, feel free to go and find support for your position." I asked them to desist from following me about - all to no avail. However, as a newbie, I have no real connections on WP, so that was the end of that. After my initial burst of activity, I am now like so many other editors who have been burned or bullied: not entirely willing to quit, but reduced to making wiki-tweaks - adding an image or two here and there, fixing up some spelling and grammar, refining and polishing expression, adding a few wikilinks and adding high-quality references to pre-existing material. I no longer add substantive content, nor address overall article structure, because it is simply a high-risk activity that attracts those whose mission is to delete and harass. I have, however, posted many detailed suggestions for improvements on article talk pages, in most cases suggesting an overall structure with headings, sub-headings and suggested references for each section - but I will have to leave it to other editors to flesh out the content. (Talk pages are like a safety zone, because that material can rarely be deleted.)
It's a shame, because I have the time, willingness and expertise to fix articles in the marketing area, which have been plagued with problems for many, many years. Wikipedia has developed a bizarre operating culture which tolerates mediocrity, supports bullying and militates against people with expertise. BronHiggs (talk) 00:30, 5 June 2017 (UTC)Reply
@BronHiggs: I have never experienced bullying in my four years of editing. Boldly editing is an established guideline on Wikipedia, and if you feel that other editors are preventing you from editing boldly, some dispute resolution would seem to be in order. There are a number of options for resolving content disputes with outside help. There are also options for resolving disputes over editor conduct. You say you "have no real connections" on Wikipedia, but personal connections are not required (nor should they be) for dispute resolution. Arbitration is the last step in the dispute resolution process if none of the preceding steps work for you.

Wikipedia may be the largest collaborative initiative in history and influences what people the world over know or think they know. Its distinctive feature is the nonexpert, nonprofessional, noncertified, nonformal production of knowledge with credible content. Academics like to sneer at those characteristics, even as more and more of us acknowledge Wikipedia, support it, and use it in teaching. And why should we not warm to it? The rules of Wikipedia discourse are modeled after an ideal academy's. Arguments, not personal attacks or status, carry the day. It may be the most scientific encyclopedia ever: Wikipedia is as self-correcting as anything in science. Purposeful bias, departing tendentiously from dominant beliefs of the academic community, does not prevail. Peer control is high; procedures are many and fanatically enforced. There are no back channels. Every editorial act is recorded and archived and remains on the record forever. Since its inception, Wikipedia has promoted itself as an encyclopedia that anyone can edit, and some three hundred thousand editors contribute each month. [...] Why, then, does Wikipedia work? In theory, it should not. In practice, it seems to be a new paradigm of organization, whose breezy anticredentialism tosses traditional hierarchies of knowledge production to the wind.

— Allen, Barry (January 2017). "Review of 'Common knowledge?: an ethnography of Wikipedia'". Common Knowledge. 23 (1): 104. doi:10.1215/0961754X-3692492.
Best wishes, Biogeographist (talk) 01:33, 9 June 2017 (UTC)Reply
If you have never experienced bullying, then you are indeed fortunate. From what I have read in the mainstream press, bullying is commonplace and is one of the main reasons why new editors do not remain with Wikipedia. The character who is currently following me around and aggressively tagging, deleting and commenting has a very long history of bullying and tendentious editing. However, all that happens is that he is blocked for 24-48 hours. This has no long-term effect. He never appears to modify his behaviour - if anything, the disruptive behaviour deteriorates, because he acquires expertise in responding to complaints and gaming the system. He is also part of a cartel, which means that he can send his mates in to pick up the threads and continue the harassment, presumably when he gets a bit too close to the point where an allegation of harassment might work. He has been following me for 5 months now, and I have tried many things - taking some time out from editing, editing low-level pages and non-contentious articles, and confining myself to little more than page beautification such as adding images and polishing up expression. However, no matter what I do, I cannot shake him off. It is almost unbearable. But I was always taught that the only way to handle bullies is to stand up to them. I have documented all his activities, and it amounts to a very substantial list of what I perceive to be unjustified reversions across multiple pages, with only the vaguest of edit summaries. BronHiggs (talk) 04:27, 9 June 2017 (UTC)Reply

One benefit of this theme is that it actively taps into existing organisations that might already have strong, friendly communities (something Wikipedia is perhaps struggling with) and physical venues in which to hold 'hack-a-thon' style working sessions. A physical venue for a working group could potentially bypass some of the problems Wikipedia currently has within its online community. It would be more 'fun' than sitting at home on the computer by yourself, and could be organised as a short, intense event or project. It could also help address diversity issues by specifically targeting for collaboration those education organisations that represent minority groups. However, if the problems already existing in Wikipedia's online community and content editing system aren't fixed, there is the risk of this backfiring when a group of people do their part only to have their work rejected. Powertothepeople (talk) 02:05, 8 June 2017 (UTC)Reply

Who else will be working in this area and how might we partner with them? edit

Obviously this is a major "come together" and merits support from major contributors. Greetings, and good luck to everyone. Blueberry6014 (talk) 16:39, 15 May 2017 (UTC)Reply


Education is the basic need of a developing country. When we look at Africa, they might be in a dream; when they awake, they will find themselves in hell, because without education you cannot survive in society. So we should give donations for education in backward countries. — Preceding unsigned comment added by Sunnyrapper (talkcontribs) 16:03, 22 May 2017 (UTC)Reply

I would suggest a broader list of micro non-profits - organizations without a profit motive, such as Partners in Health or the American Refugee Committee - to discuss what the education needs of developing nations are, based on first-hand knowledge. I would say build a wiki of the needs presented by different micro organizations and also of the resources that they have to help. James_Shelton32 (talk)

I posit that AI and big data, alongside crowdsourced information gathering, will be the predominant way research is outsourced. Education is dead. Topical research and information that is actionable or profitable will mean that students of the future prove their worth by the merit of what they contribute, not by what they memorize and mimic. Educators are too expensive, isolated and out of sync with reality - one that moves too quickly for the classic institutional models to explain, or even understand. A resume that shows all relevant content creation, curation and organization, linked to a bitcoin wallet, is the best way to expand research and involve students in the expansion of AI. Information can be mined and sorted, but topical curation, click-bait and trolling are education's definitive future model. Wikipedia needs to create partners out of its user base and provide more incentive to contribute both to the wiki and to expanding the ecosystem. Every child that learns something new here should have that page automatically saved to an account, and big data and AI can sort the information for each individual. That way we can all develop an index of what we know and have it become more relevant over time, using the power of "extremely large data sets that may be analyzed computationally to reveal patterns, trends, and associations, especially relating to human behavior and interactions."[1]

  1. ^ "big data - Google Search". www.google.ca. Retrieved 2017-05-24.

It is true that "People's environments change even more quickly than they themselves do. Everything from the weather to their relationship with their mother can change the way people think and act. All of those variables are unpredictable. How they will impact a person is even less predictable. If put in the exact same situation tomorrow, they may make a completely different decision. This means that a statistical prediction is only valid in sterile laboratory conditions, which suddenly isn't as useful as it seemed before."[1] This inevitable institutional objection to the predictive analytical process, is mitigated by the evolution of the artificial society and the nature of emergence.Anocratic (talk) 20:50, 24 May 2017 (UTC) Reply

References

  1. ^ "Polling and Statistical Models Can't Predict the Future". www.cameronalverson.com. Retrieved 2017-05-24.


  • The 'ecosystem' is vast so there are countless potential partners! Khan Academy, MIT OpenCourseWare, FutureLearn, university of the 3rd age, publicly funded government bodies (in countries where governments aren't corrupt), hackathons, etc. Powertothepeople (talk) 02:14, 8 June 2017 (UTC)Reply
  • Minority rights organisations: Wikipedia can actively address diversity issues by reaching out to organisations that stand for underrepresented groups. Working-party events could be planned to coincide with special dates - Black History Month, science week, Women's History Month, LGBT History Month, and also localised celebrations or topics (e.g. in Australia we have Mabo Day, Anzac Day, Australia Day, etc.). For example, at the moment Wikipedia has a Women in Red initiative which is trying to fix the underrepresentation of women's pages on Wikipedia. A working group could be created on a university campus with a student feminist organisation, students of feminist theory, etc., who come together during Women's History Month to work on the project. Or be really niche: contact a women's filmmaking organisation to create pages recognising notable female filmmakers and their work. Powertothepeople (talk) 02:14, 8 June 2017 (UTC)Reply
  • Government organisations may have funds for a team of professional Wikipedia editors to create high-quality pages rather than always relying on volunteers. For example, the Australian government (or an associated entity) might fund ANZAC-related pages to be created or updated in advance of Anzac Day; Screen Australia might fund Australian film and filmmaker pages; the Australian Institute of Sport might fund pages on Australian sportspeople who trained there; etc. Of course it would be crucial to implement this in such a way that content maintains neutrality requirements and is not influenced by those funding it. This could speed up the rate at which quality content is created on Wikipedia. Powertothepeople (talk) 02:14, 8 June 2017 (UTC)Reply
  • Research institutions - many are already aware of an issue related to dissemination of the research knowledge to professionals and the greater public,[12][13] and wikipedia could help with this. Powertothepeople (talk) 02:14, 8 June 2017 (UTC)Reply

Other edit

Focusing on everything edit

Focusing a little bit on everything will make things go slow. How will we manage to move towards our goal faster in this way? Supdocious (talk) 10:59, 4 June 2017 (UTC)Reply

Encyclopedia for everything or Encyclopedia for Science, Nature, Geography and the Arts (but not for business)? edit

Wikipedia should decide whether it really wants articles about business operations - marketing, management, advertising, human resource management etc. There is a great deal of hostility towards the business of business. Many editors appear to assume that the aim of business-related articles is to engage in spam or to promote products and services and regularly revert content which they construe as spam or promotional. Some editors go around tagging any article about advertising and promotion with "promotional in tone" type tags simply because the article is about promotional subject matter, rather than because it is written in a promotional manner or a promotional tone. I noticed recently that one Admin denied a request from an advertising agency for a name change, despite the fact that the company had formally changed its name from the XYZ advertising agency to the XYZ communications agency. In the view of this Admin, anyone who was in the "business of promotion" did not deserve any special treatment and furthermore, he stated that he intended to monitor the article to ensure that its name was never changed. Over and over again, I have seen this type of hostility to articles and edits about business activities as if business is some kind of dirty word. If this type of hostility is what Wikipedia really wants, then perhaps it is time for WP to pull the plug on articles about business operations. However, if WP does want to continue to provide these business-related articles, then editors need to lighten up a bit. BronHiggs (talk) 08:48, 5 June 2017 (UTC)Reply

There definitely is a "flaw" and a "catch-22" in Wikipedia in this regard:
1) There is no simple way for someone to flag incorrect information - people must edit the page to make changes, but they are not allowed to edit it if they are connected to the article, due to concerns about conflict of interest. I know people who have wanted to do simple things like correct a basic factual error - for example, Wikipedia says they studied at a university they didn't actually study at (something for which there is no online evidence one way or the other, but the subject has a copy of their qualification from a different university that they can show to anyone) - and there is no way for them to correct this within Wikipedia's current set-up. I believe Wikipedia should address this in a few ways: (a) an 'update request' form linked to each page, where anyone can submit more information for consideration and Wikipedia editors review the requests and make any necessary changes; (b) an in-house professional research/editing team to specifically deal with business pages, as it's not really fair to expect volunteers to "work" on something that benefits businesses; (c) businesses and private individuals who want their pages created, corrected, updated or maintained pay a fee to Wikipedia to fund the in-house research team. N.B. the research/editing team would have editorial independence and remain neutral; they are not "promoting" the business, simply ensuring that the information is correct and thorough according to best practice.
2) It is normal for textbooks, academic writing, etc. to have different style guides depending on the sector and subject matter. I believe marketing is an area that needs a different style guide (or set of rules) so that case studies and examples can be given to exemplify knowledge without mistakenly being labelled "promotional". I too have noticed a problem in social science content areas when other editors take a more technical writing approach which doesn't suit the content but is argued to be more "neutral".
Side note - this discussion might be better placed under the "knowledge" theme rather than "engaging in the knowledge ecosystem"? Powertothepeople (talk) 02:43, 8 June 2017 (UTC)Reply
@Powertothepeople: Your first point above seems to be covered by Wikipedia:Simple COI request, so perhaps that "flaw" could be addressed by making the Wikipedia:Simple COI request procedure more visible? You say that "it's not really fair to expect volunteers to 'work' on something that benefits businesses"; but it's not fair (or appropriate) to "expect" volunteers to work on anything. Even when we (appropriately) don't expect volunteers to work on something, it may happen that there is someone who wants to work on it. It is not only the businesses that benefit from having accurate information in the Wikipedia article about them; everyone who wants to read that article benefits from having accurate information in it (and it is likely this latter fact that would motivate volunteers). Biogeographist (talk) 01:02, 9 June 2017 (UTC)Reply
@Biogeographist: Yes, it does need to be more visible. I'm a fairly new editor and had no knowledge of the COI request. A key problem with Wikipedia documentation is that it is only clear to those who already know it; it's about as clear as mud to everyone else. Regarding the "fairness" of volunteerism - obviously it is up to volunteers to decide what they are and aren't willing to do. I was merely pointing out that most volunteers are happy to do something for the benefit of the community, but feel less happy about volunteering for the benefit of corporations who profit from their work. Wikipedia already has a huge backlog of articles that need to be created and updated - more than the volunteers can handle - and it also costs money for Wikipedia to run itself (it is always asking for donations), so it would, in my opinion, make sense for businesses who want to ensure their page is updated as a priority to pay for such a service. If a business puts in a COI request but volunteers would rather work on other pages, then the business is still in the same place as if there were no COI request (the information is not updated); there is nothing to compel volunteers to do the work. However, if there is a paid service, then there are professional editors whose job it is to update the page. Problem solved. Powertothepeople (talk) 02:12, 9 June 2017 (UTC)Reply

Just to clarify, I am not talking about articles on specific business entities, but rather articles about concepts and theories in business operations, e.g. Marketing, Advertising management, Marketing research or Brand awareness. There is a lot of hostility to all manner of edits on such pages. For example, on the talk page, users might ask for practical examples of theories, but when they are added, other editors will systematically delete any commercial example because it is treated as spam. I don't consider myself to have a conflict of interest even though I have been trained in both management and marketing. BronHiggs (talk) 02:31, 9 June 2017 (UTC)Reply

General questions and comments (not specific to a theme) edit

Connection with other Social Media Websites edit

Getting updates and information from verified accounts Imam Houcine (talk) 11:53, 22 May 2017 (UTC)Reply

Imam Houcine, Firstly, it is not clear what you mean, and secondly, Wikipedia is not a social media website. You may have a valid and useful point to make, but so far have not done so. Please clarify what you wish to communicate. • • • Peter (Southwood) (talk): 06:29, 23 May 2017 (UTC).Reply

It seems that information from credited sources - sites, references and editors - is possibly already in use on Wikipedia. Credited social media accounts are actually both current subjects and sources of information; they could and should be cited and referenced accordingly. — Preceding unsigned comment added by Orchidrose (talkcontribs) 19:06, 1 June 2017 (UTC)Reply

Semi-protected edit request on 4 June 2017 edit

Add hashtags to the article Emoji fan (talk) 12:43, 4 June 2017 (UTC)Reply

  Not done: this is the talk page for discussing improvements to the page Wikipedia:Wikimedia Strategy 2017. Please make your request at the talk page for the article concerned. JTP (talkcontribs) 19:16, 4 June 2017 (UTC)Reply

Suggestion for a 6th theme: Resilience and persistence of Wikipedia edit

I'm suggesting another theme for Wikimedia's strategy: increasing the resilience of Wikipedia and making sure that it persists over the long term. I would consider this a top priority right along with "Participation and integrative, healthy communities".

This may have been somewhat overlooked because so far Wikipedia hasn't had to face grave threats. However, I think it's very important to make sure we're as resilient as possible. Furthermore, with fake news, Turkey's censorship, China's announced competitor and increased cyberwarfare, among other things, threats are rising. And more and more it seems the world is heading for a pretty bumpy ride in the next decade.

Full information about this can be found here.

--Fixuture (talk) 21:08, 12 June 2017 (UTC)Reply